September 1, 2021

Autonomous Intelligence: What will it take to solve the bias problem in Artificial Intelligence (AI)?

Missed our live session? No worries — you can now access the recording and watch at your own pace.

Event Description

AI bias, also known as algorithmic bias, is a phenomenon that occurs when an algorithm produces systemically prejudiced results, either because of erroneous assumptions in the machine learning process or because incomplete, faulty, or prejudicial data sets were used to train and/or validate the system. AI bias often stems from problems introduced by the individuals who design and train machine learning systems, and it frequently leads to algorithms that reflect unintended cognitive or social biases. As AI systems make their way into the military, banking, and biomedical sectors and increasingly assist humans, this Inclusive AI Forum examines the ways in which bias in an algorithm poses a threat to people and what can be done about it.

In this Inclusive Artificial Intelligence (AI) Forum we explore the concept of AI, or algorithmic, bias: what risks it presents and to what degree we can eliminate it. To frame the discussion, we classify the sources of bias in AI systems in three ways: bias in the data, bias in the human, and bias in the process.
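To make the first category concrete, one simple way to check for bias in the data is to compare positive-outcome rates across demographic groups (the "demographic parity" gap). The sketch below uses a hypothetical toy data set and pandas; the column names and numbers are illustrative only, not taken from any real system or from the FairLens toolkit's API.

```python
# Minimal sketch: measuring "bias in the data" via demographic parity.
# All data and column names are hypothetical, for illustration only.
import pandas as pd

# Toy loan-approval data set with two groups (hypothetical)
df = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0],
})

# Approval rate per group
rates = df.groupby("group")["approved"].mean()

# Demographic parity gap: difference between the highest and lowest rates.
# A value near 0 suggests outcomes are distributed similarly across groups;
# a large gap flags the data set for closer inspection.
parity_gap = rates.max() - rates.min()

print(rates.to_dict())  # {'A': 0.75, 'B': 0.25}
print(parity_gap)       # 0.5
```

A single summary statistic like this is only a starting point: a gap may reflect legitimate differences rather than prejudice, which is why human judgment and process checks (the other two bias sources above) remain essential.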

Presenter: Simon Swan, Machine Learning Lead

Simon Swan contributes to the core technology of Synthesized as its Machine Learning Lead. Simon is passionate about algorithmic and AI fairness and is the product lead for FairLens, Synthesized's open-source bias measurement and detection toolkit. Prior to joining Synthesized in 2019, he worked in the legal and medical industries as an NLP and machine learning engineer. He has an academic background in Statistical Thermodynamics and Computational Linguistics from the University of Cambridge.

Join our DataOps community on Slack

Learn about modern DataOps practices and connect directly with your peers, Synthesized users, and our engineers.