(Un)fair AI by ZOLLHOF
Machine learning has become an integral part of our society: whether in the selection of job applicants, medical treatment recommendations, or credit-rating checks, such systems often support or even replace human decision-makers. While the benefits of this automation are undeniable, there are increasing reports of algorithms discriminating against certain groups of people because of so-called sensitive characteristics, such as gender or skin color. Women in particular are often affected.
Why do algorithms discriminate? And how can this happen?
At the 7th Know-How Event by ZOLLHOF in July, speaker Klara Krieg will give you an insight into the research field of fair machine learning, which deals with these questions. For non-techies and computer-science beginners, she explains how algorithms learn, what role data plays, and why discrimination can occur.
Klara Krieg holds a bachelor's degree in industrial engineering and is currently completing her master's degree in business informatics. She is actively involved in various women's networks and advocates for more diversity in tech and IT. Her current research focuses on gender bias in neural networks, which also falls within the area of discriminatory algorithms. Klara Krieg has already been on stage at FEMTEC and PANDA and wants to create more attention and visibility for the topics of algorithmic fairness and discrimination.
The event will take place from 4:30 pm to 5:45 pm and will be held in German. Registration for the event via Eventbrite.
Tel.: +49 89 24210 7508