Name of participant: Alexander H. Berger
Project’s name: Interpretable Machine Learning for Clinical Decision Support (IMKE)
Project description:
In recent years, significant advancements have been made in the field of medical image analysis, with many diagnostic tasks achieving near-perfect accuracy in academic settings. However, the clinical adoption of these AI models remains limited. A major barrier to widespread clinical use is the lack of trust in the often opaque decision-making processes of these models. Clinicians and medical practitioners need to understand how and why a model arrives at a particular diagnosis or treatment recommendation. This transparency is crucial for identifying and correcting errors, especially when small hardware changes in imaging devices can lead to unexpected variations in diagnostic outcomes. Moreover, in complex cases where multiple diseases present similar symptoms, a transparent AI model can provide valuable insights, aiding clinicians in making accurate differential diagnoses. Therefore, enhancing the interpretability of AI models is essential for building trust and ensuring their effective integration into clinical practice.
Our project aims to bridge the gap between high diagnostic accuracy and interpretability in AI models for medical image analysis. We propose to develop novel AI models that leverage new data representations, such as graph-based structures, to enhance both interpretability and diagnostic performance. These representations will make the decision-making process more transparent, making it easier for clinicians to understand and trust the AI’s recommendations. Specifically, we will explore methods to represent medical image data in forms that remain interpretable without sacrificing diagnostic accuracy. Building on these representations, we will develop and refine explainability techniques tailored to them. The project will also focus on integrating these explainable models into clinical workflows so that they provide practical support to healthcare professionals. This includes developing a prototype diagnostic support tool that incorporates our models and explainability methods, and validating its performance in a clinical test environment.
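To make the idea of a graph-based image representation concrete, the following is a minimal, purely illustrative sketch rather than the project’s actual method: it assumes PyTorch Geometric and invents a tiny graph whose nodes might stand for image-derived structures (e.g. vessel branch points) with hand-picked feature values, then passes it through a small graph neural network to obtain a graph-level prediction.

import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv, global_mean_pool

# Hypothetical graph extracted from a segmented image: nodes could be
# anatomical structures (e.g. vessel branch points), edges their connections.
# Node features (e.g. radius, curvature) are invented for illustration.
x = torch.tensor([[0.8, 1.2], [0.5, 0.9], [0.7, 1.1]], dtype=torch.float)
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)  # undirected edges
graph = Data(x=x, edge_index=edge_index)

class GraphClassifier(torch.nn.Module):
    def __init__(self, in_dim=2, hidden=16, num_classes=2):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = torch.nn.Linear(hidden, num_classes)

    def forward(self, x, edge_index, batch):
        h = self.conv1(x, edge_index).relu()
        h = self.conv2(h, edge_index).relu()
        # Pool node embeddings into a single graph-level embedding;
        # the per-node structure stays available for inspection.
        return self.head(global_mean_pool(h, batch))

model = GraphClassifier()
batch = torch.zeros(graph.num_nodes, dtype=torch.long)  # all nodes belong to one graph
logits = model(graph.x, graph.edge_index, batch)  # graph-level diagnosis logits

Because such a model operates on explicitly named nodes and edges rather than raw pixels, node-level attributions (e.g. which branch points drove a prediction) can in principle be surfaced directly to clinicians.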
Software Campus Partner: Technische Universität München and Merck KGaA
Implementation period: 01.01.2025 – 31.12.2026