Iwo Kurzidem
Software and Computational Systems Lab
Department of Computer Science
Ludwig-Maximilians-Universität München (LMU Munich)
Oettingenstraße 67
80538 Munich
Germany
Fraunhofer IKS (Institute for Cognitive Systems)
Hansastrasse 32
80686 Munich

Office Hours
- By appointment; just write me an e-mail. We can also schedule a virtual meeting.
- firstname . lastname @ iks.fraunhofer.de
Thesis Mentoring
Currently no topics available.
Currently assigned topics
Formal Verification of Different Properties of an Explainable Random Forest Model
This topic is available as a BSc thesis and as an MSc research training course with 6/12 ECTS. Mentoring is available in English and German.
Goal: Use formal methods to verify different properties of an explainable ML model (a random forest). The aim is to identify which properties of the model can be verified, with which approach, and to what degree. If an approach or method is not suitable, we want to find out why, and whether modifications or adaptations can make it work. Properties of interest include contradictions, identification of sets, and logic simplification.
Background: Current autonomous systems use Machine Learning (ML) based algorithms for object detection, classification, trajectory planning, and more. However, most of these ML models are trained end-to-end and are therefore "black boxes". For safety-relevant applications, this calls for additional verification at different levels of abstraction. Explainable ML in particular promises a human-understandable model, but it is not without its own uncertainties. To address this, we use formal verification to investigate the properties of a trained random forest (RF), provided as a JSON file.
Requirements:
- Practical knowledge of decision trees and random forests
- Successfully completed the following courses: Formale Spezifikation und Verifikation (FSV) and Formale Sprachen und Komplexität (FSK)
- (optional) Ability to program in Python
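To give a first impression of the kind of analysis involved, the sketch below checks one simple property, contradictory (i.e. unreachable) paths, on a toy decision tree. The JSON schema, feature names, and class labels are illustrative assumptions, not the format of the actual model file handed out with the topic.

```python
import json

# Toy decision tree in a hypothetical JSON schema (an assumption for
# illustration only). Internal nodes split on "feature < threshold".
tree_json = """
{"feature": "x0", "threshold": 0.5,
 "left":  {"feature": "x0", "threshold": 0.2,
           "left":  {"leaf": "A"},
           "right": {"leaf": "B"}},
 "right": {"feature": "x1", "threshold": 1.0,
           "left":  {"leaf": "B"},
           "right": {"feature": "x1", "threshold": 0.8,
                     "left":  {"leaf": "A"},
                     "right": {"leaf": "B"}}}}
"""

def dead_paths(node, bounds):
    """Collect leaf labels whose path constraints are contradictory.

    Each split "feature < threshold" (left branch) or
    "feature >= threshold" (right branch) shrinks a per-feature
    interval [lo, hi); an empty interval means no input can ever
    reach that leaf.
    """
    if "leaf" in node:
        return [node["leaf"]] if any(lo >= hi for lo, hi in bounds.values()) else []
    f, t = node["feature"], node["threshold"]
    lo, hi = bounds.get(f, (float("-inf"), float("inf")))
    return (dead_paths(node["left"],  {**bounds, f: (lo, min(hi, t))}) +
            dead_paths(node["right"], {**bounds, f: (max(lo, t), hi)}))

tree = json.loads(tree_json)
# The inner "A" leaf requires both x1 >= 1.0 and x1 < 0.8.
print(dead_paths(tree, {}))  # -> ['A']
```

In the thesis itself, this naive interval bookkeeping would be replaced by a proper formal back end (e.g. an SMT solver) and extended from a single tree to the whole forest.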
Formal Verification of Different Properties of a Random Forest Model
This topic is available as a BSc thesis and as an MSc research training course with 6/12 ECTS. Mentoring is available in English and German.
Goal: Use formal methods to verify different properties of an explainable ML model (a random forest). The aim is to identify which properties of the model can be verified, with which approach, and to what degree. If an approach or method is not suitable, we want to find out why, and whether modifications or adaptations can make it work. Properties of interest include contradictions, identification of sets, and logic simplification.
Background: Current autonomous systems use Machine Learning (ML) based algorithms for object detection, classification, trajectory planning, and more. However, most of these ML models are trained end-to-end and are therefore "black boxes". For safety-relevant applications, this calls for additional verification at different levels of abstraction. Explainable ML in particular promises a human-understandable model, but it is not without its own uncertainties. To address this, we use formal verification to investigate the properties of a trained random forest (RF), provided as a JSON file.
Requirements:
- Practical knowledge of decision trees and random forests
- Successfully completed the following courses: Formale Spezifikation und Verifikation (FSV) and Formale Sprachen und Komplexität (FSK)
- (optional) Ability to program in Python