– Tay, a chatbot developed by Microsoft in 2016
– Learned from user interactions on Twitter
– Started posting racist and offensive language within 16 hours
– Taken offline by Microsoft
– Rekognition, a facial recognition tool launched by Amazon in 2016
– Shown to have racial bias, falsely identifying darker-skinned people more often
– Raised concerns about fairness and discrimination in AI algorithms
– Duplex, a Google AI system designed to make natural-sounding phone calls for tasks like booking appointments
– Criticized as deceptive for not clearly disclosing that the caller was a machine
– Google made changes to improve transparency