Simone Scholz, Analytics & AI Enthusiast and MINTchanger:in at A1 Telekom, delivered a talk at WiDS Villach 2024 on the importance of fair AI and how it is being implemented in her organization.
Highlights:
💛 Simone shared her long-standing passion for fairness and why ensuring it in AI matters. She pointed to real-life examples of bias in voice recognition, job recommendations, diagnostics, and credit card limits, underscoring the need for fairness in AI systems.
💡 The initiative to build fairness and trustworthiness into AI systems at A1 Telekom began in autumn 2022, with a small team working in a garage setting, fostering open discussions and exploring bias metrics to develop a comprehensive understanding of them.
🛝 The team built a synthetic data playground, intentionally injecting biases so they could study and measure their impact. They tested bias metrics extensively, stressing how important it is to understand where bias originates and to handle sensitive attributes carefully to minimize risk.
✅ The insights gained from the playground were then applied to real machine learning models. The team introduced a Fair AI check built on standard bias metrics, including the flip test, which checks whether a model's prediction changes when only a sensitive attribute is flipped and thus reveals positive or negative discrimination (a minimal sketch follows these highlights). Findings from the check led to corrective actions and discussions with the business and legal departments.
⚠️ Simone stressed the importance of transparency and assessment. The team implemented AI impact and sensitivity metrics in Jira for documentation and assessment, ensuring that high-risk AI use cases are promptly addressed to minimize potential harm.
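The talk described the playground and the Fair AI check at a conceptual level, without implementation details. As a hedged illustration of what such checks can look like in practice, here is a minimal Python sketch that injects a bias into synthetic data, trains a simple model, and computes a demographic parity gap plus a flip test. The column names, data-generating process, model, and metric choices are all assumptions made for this example, not A1 Telekom's actual implementation.

```python
# Illustrative sketch only -- the injected bias, the model, and the metrics
# below are assumptions for demonstration, not A1 Telekom's actual setup.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 10_000

# Synthetic "playground" data with an intentionally injected bias:
# the label depends on income, but also (unfairly) on the sensitive attribute.
sensitive = rng.integers(0, 2, size=n)           # e.g. 0 = group A, 1 = group B
income = rng.normal(50_000, 15_000, size=n)
noise = rng.normal(0, 1, size=n)
label = ((income / 15_000) + 1.5 * sensitive + noise > 4.5).astype(int)

X = np.column_stack([income, sensitive])
model = LogisticRegression().fit(X, label)
pred = model.predict(X)

# Bias metric 1: demographic parity difference
# (gap in positive-prediction rates between the two groups).
rate_a = pred[sensitive == 0].mean()
rate_b = pred[sensitive == 1].mean()
print(f"Demographic parity difference: {abs(rate_a - rate_b):.3f}")

# Bias metric 2: flip test -- flip only the sensitive attribute and count
# how many predictions change, split into positive and negative flips.
X_flipped = X.copy()
X_flipped[:, 1] = 1 - X_flipped[:, 1]
pred_flipped = model.predict(X_flipped)
positive_flips = int(((pred_flipped == 1) & (pred == 0)).sum())
negative_flips = int(((pred_flipped == 0) & (pred == 1)).sum())
print(f"Flip test: {positive_flips} flipped to positive, "
      f"{negative_flips} flipped to negative out of {n}")
```

In this toy setup the flip test surfaces the injected bias directly: because the model can use the sensitive attribute, flipping it changes many predictions. A real Fair AI check would report such figures and, as Simone described, trigger corrective actions and discussions with the business and legal teams.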
Simone's presentation showcased the company's commitment to embedding fairness in AI and the proactive measures taken to address biases and ensure ethical AI practices. If you have any specific questions or need further insights from the talk, feel free to ask for additional details!
Video credits: MC Digitalproduktions GmbH