The next BostonCHI meeting, "Personal Anecdotes of AI Bias and Where We Go from Here," takes place on Tue, Sep 23 at 6:00 PM.
Personal Anecdotes of AI Bias and Where We Go from Here
Artificial intelligence systems increasingly shape our daily experiences, from voice assistants to image generation tools, yet these technologies often fail to recognize or fairly represent diverse users. This talk combines personal narratives with empirical research to examine how AI bias manifests in real-world interactions and how it psychologically affects marginalized communities. Drawing from studies on accent bias in voice cloning services and representation gaps in image search results, we explore how current AI systems perpetuate exclusion through both technical limitations and training data skewed toward Western, predominantly white perspectives. The presentation reveals the human cost of algorithmic bias, from feelings of erasure to impacts on self-esteem, while also highlighting emerging efforts to create more inclusive AI systems. Through case studies ranging from voice recognition failures to beauty standard reinforcement, we demonstrate how personal experiences of bias reflect broader systemic issues in AI development and deployment.
About our speaker
Dr. Avijit Ghosh is an Applied Policy Researcher at Hugging Face. He works at the intersection of machine learning, ethics, and policy, aiming to bring fair ML algorithms into real-world practice. He has published research papers in top ML and AI Ethics venues, served as a peer reviewer for them, and organized academic workshops as a member of QueerInAI. His work has been covered in the press, including articles in The New York Times, Forbes, The Guardian, ProPublica, Wired, and the MIT Technology Review. As a Responsible AI expert, Dr. Ghosh has been an invited speaker at high-impact events such as SXSW, the MIT Sloan AI Conference, and the Summit on State AI Legislation. He has also engaged with policymakers at various levels in the United States, United Kingdom, and Singapore. His research and outreach have had real-world impact, helping shape regulation in New York City and prompting Facebook to remove its biased ad-targeting algorithm.