Project Details
ExplainableMine: Anomaly Quantification for Explainable and Privacy-aware Process Mining
Applicant
Professor Dr. Agnes Koschmider
Subject Area
Data Management, Data-Intensive Systems, Computer Science Methods in Business Informatics
Term
since 2023
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 496119880
This project will develop an outlier quantification framework that makes the analysis results of SOURCED process mining explainable. For this purpose, we will classify outliers, define patterns of outlierness, and quantify them, in order to finally integrate them into a framework that will be publicly available. The outlier model will rely on a taxonomy for outlier quantification, user-centric filtering, and outlier detection. As a result, the framework will allow process attributes to be ranked according to their noise sensitivity, from most to least noisy, and polishing techniques to be applied instead of outlier removal for highly noise-sensitive attributes. In addition, data annotations will be provided that characterize outliers, for example as interesting vs. unwanted behavior, or as random. The synergies of SOURCED make it possible to understand the topic of outliers more accurately. For example, distributed sensors enable a more robust and accurate classification (outlier vs. no outlier). The combination of new visualizations with outlier methods also gives users better transparency and a clearer understanding of what has been filtered, repaired, or deleted, what should be, and why. In turn, outlier quantification allows a more accurate estimation of how much new noise is needed to guarantee differential privacy with minimal loss of utility. Based on these measures, models can be developed that control the pro-active perturbation of data for privacy-awareness, taking into account the extent of anomalies already present.
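To illustrate the ranking and polishing idea described above, the following Python sketch is a minimal, hypothetical example and not the project's actual framework: it scores each event-log attribute by the fraction of values flagged by a simple interquartile-range rule, ranks the attributes from most to least noisy, and replaces the flagged values of the noisiest attribute with that attribute's median ("polishing") instead of removing the affected events. The toy event log, the noise_score and polish functions, and the 1.5*IQR threshold are all illustrative assumptions.

# Hypothetical sketch of attribute ranking and polishing; not the project's framework.
from statistics import median

# Toy event log: each event carries numeric attributes (assumed structure).
events = [
    {"duration": 12.0, "cost": 100.0},
    {"duration": 11.5, "cost": 105.0},
    {"duration": 300.0, "cost": 98.0},   # likely-noisy duration value
    {"duration": 12.5, "cost": 4000.0},  # likely-noisy cost value
    {"duration": 13.0, "cost": 102.0},
]

def iqr_bounds(values):
    """Lower/upper bounds of a simple 1.5*IQR rule (illustrative threshold)."""
    s = sorted(values)
    q1, q3 = s[len(s) // 4], s[(3 * len(s)) // 4]
    iqr = q3 - q1
    return q1 - 1.5 * iqr, q3 + 1.5 * iqr

def noise_score(values):
    """Fraction of values outside the IQR bounds (stand-in for 'noisiness')."""
    lo, hi = iqr_bounds(values)
    return sum(1 for v in values if v < lo or v > hi) / len(values)

def polish(values):
    """Replace out-of-range values with the median instead of dropping events."""
    lo, hi = iqr_bounds(values)
    m = median(values)
    return [v if lo <= v <= hi else m for v in values]

# Rank attributes from most to least noisy.
attrs = {a: [e[a] for e in events] for a in events[0]}
ranking = sorted(attrs, key=lambda a: noise_score(attrs[a]), reverse=True)
print("noise ranking:", ranking)

# Polish (rather than delete) the most noise-sensitive attribute.
noisiest = ranking[0]
for event, value in zip(events, polish(attrs[noisiest])):
    event[noisiest] = value

Running the sketch prints the noise ranking of the attributes and leaves the number of events unchanged, which is the point of polishing over removal: the log stays complete for analysis while highly noise-sensitive attribute values are repaired.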
DFG Programme
Research Units
Major Instrumentation
Mobile tiny house
Instrumentation Group
9060 Barracks, greenhouses, machine halls