Name of the participant: Ahmed Elnaggar
Description of the IT research project: Humans can transfer the knowledge they gain from one task to other tasks. Someone who has gained knowledge from a specific task can apply it to similar problems, or to everyday problems, and solve them better and faster. In this project, we use analogous AI concepts, namely transfer learning, multi-task learning and lifelong learning, to build deep learning models that perform better than current single-task models. This will allow companies to accelerate machine learning training and overcome data scarcity.
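As a rough illustration of the multi-task idea, the sketch below shows a shared encoder feeding two task-specific heads so that both tasks learn from the same representation. It is a minimal, self-contained example on random data; the layer sizes, class counts and model structure are illustrative assumptions and not the project's actual architecture.

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """Shared encoder with two task-specific heads (illustrative only)."""

    def __init__(self, input_dim: int, hidden_dim: int, n_classes_a: int, n_classes_b: int):
        super().__init__()
        # The encoder parameters are shared across both tasks.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
        )
        # Each task keeps its own small output head.
        self.head_a = nn.Linear(hidden_dim, n_classes_a)
        self.head_b = nn.Linear(hidden_dim, n_classes_b)

    def forward(self, x: torch.Tensor):
        shared = self.encoder(x)
        return self.head_a(shared), self.head_b(shared)

model = MultiTaskModel(input_dim=32, hidden_dim=64, n_classes_a=3, n_classes_b=5)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on random stand-in data: the joint loss is a simple
# sum of the per-task losses, so a single backward pass updates the
# shared encoder with signal from both tasks.
x = torch.randn(8, 32)
y_a = torch.randint(0, 3, (8,))
y_b = torch.randint(0, 5, (8,))
logits_a, logits_b = model(x)
loss = criterion(logits_a, y_a) + criterion(logits_b, y_b)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```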
This project focuses on analyzing, training and testing different deep learning models using transfer learning and multi-task learning approaches under different use cases and scenarios. This will lead to:
- Solving the data scarcity and poor performance problems that most companies in industry face when working with deep learning algorithms.
- Sharing knowledge between interested parties while preserving data privacy, since models can be shared without sharing the underlying data.
- Boosting the results of machine learning models.
- Minimizing the training time required to solve data-related tasks.
- Minimizing the computational resources required to train models, which reduces costs and allows startups and small companies to benefit from the power of AI.
The aim of the “TFMT” project is to provide the public with a complete guide to harnessing the power of transfer learning together with pre-trained models.
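To give a flavor of the workflow such a guide would cover, the minimal sketch below fine-tunes a publicly available pre-trained image model on a new task by freezing the pre-trained backbone and training only a replacement output layer. The choice of model, the 10-class target task and the random stand-in data are illustrative assumptions, not the project's actual setup.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a model pre-trained on ImageNet; its early layers already encode
# generic visual features learned from a large dataset.
backbone = models.resnet18(weights="IMAGENET1K_V1")

# Freeze the pre-trained weights so only the new head is trained.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final classification layer with one sized for the new task
# (10 classes is an arbitrary, illustrative choice).
backbone.fc = nn.Linear(backbone.fc.in_features, 10)

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One training step on random data standing in for a small target dataset.
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 10, (4,))
loss = criterion(backbone(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because only the small replacement head is updated, far less labeled data and compute are needed than when training a model from scratch, which is exactly the benefit listed above for startups and small companies.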
Software Campus partners: TU München, Merck
Implementation period: 01.03.2020 – 31.05.2021