How data bias is making being a woman more dangerous with Caroline Criado Perez

Image credit: Rachel Louise Brown

“A lot of these tech solutions are driven by algorithms that have been trained on data that is hopelessly male biased and is severely lacking when it comes to female data. And the result of that is that a whole load of tech solutions for all sorts of things just don’t work very well for women.”
– Caroline Criado Perez, Author of Invisible Women: Exposing Data Bias in a World Designed for Men

Caroline Criado Perez is a writer, journalist and feminist campaigner. She has written two books: Do It Like A Woman and Invisible Women. In her most recent book, Invisible Women: Exposing Data Bias in a World Designed for Men, she describes how long-standing data bias still affects women today. In this episode, Sheana learns about the many forms that bias takes, from trivial matters such as phone size to not-so-trivial ones such as seat belt safety. Caroline shares all this and more in this episode of the Innovation For All Podcast.

In this episode you will learn:

  • What is male default thinking?
  • What are the consequences of male default thinking, particularly in tech?
  • Why is the market so bad at providing for women?
  • What is the low-hanging fruit for those of us who want to make money?
  • A stove example of male default thinking
  • What can entrepreneurs and consumers do about these issues?

Links and mentions:

Connect With Caroline:


When Are “Fair” Algorithms Better Than Accurate Ones? with Osonde Osoba

Artificial Intelligence continues to penetrate our lives. As it does so, we should be wary of its ethical and social implications.


Osonde Osoba, an engineer at the RAND Corporation and a professor at the Pardee RAND Graduate School, joins Sheana Ahlqvist in today’s episode of Innovation For All Podcast to talk about fairness in Artificial Intelligence and Machine Learning. AI has the ability to seriously impact our lives, which is why Osonde is pushing for systems that are accurate, unbiased, and flexible.

Discover which areas we should be wary of when handing decision-making over to AI, why this isn’t just a technical issue but also a political one, and whom we should put in charge of these systems. Learn about the importance of accountability, ethics, privacy, and regulation in AI systems.

IN THIS EPISODE YOU’LL LEARN:

  • The difference between Machine Learning and Artificial Intelligence
  • Should AI systems intentionally be made to ‘align with our comfort’?
  • What roles do legislators, policy makers, and others play?
  • Strategies to protect Data Privacy in AI and ML models
  • How regulation mediates between developers and users
  • If technology changes so rapidly, how can regulators keep up?
  • How can we build accountability into AI & ML?

LINKS

Others Mentioned

CONNECT WITH OSONDE

The 80/20 rule hurts everything from education to self-driving cars featuring Dr. Jutta Treviranus

In this episode of Innovation for All, host Sheana Ahlqvist talks to Dr. Jutta Treviranus, Director of the Inclusive Design Research Centre at OCAD University in Toronto. Dr. Treviranus explains how traditional approaches to business, design practices, and research can result in suboptimal or unfair systems. They discuss what inclusive design is, why it is so important, and how we can design systems that accommodate everyone.

Dr. Treviranus also reimagines the future of education. They cover her efforts to incorporate inclusivity into the current change-resistant educational system, her “Unlearning and Questioning” course, and her most recent project: developing a lab school for inclusive life-long learning.

YOU’LL LEARN:

• How to use AR to combine real data with simulated data to create and experience new, imagined futures
• What is the Pareto Principle (80/20 Principle)?
• What are the real world consequences of ignoring the “20%”?
• What is the Cobra Effect?

Find Jutta on Twitter as @juttatervira