Revenge Porn is Really a Tech Problem with Darieth Chisolm

“Whatever they do in the bedroom is their business. But once someone takes that content and makes it available for other people to see and they’re doing it with the intent to do harm, the game changes.” – Darieth Chisolm, founder of 50 Shades of Silence

Overview:

In this episode, Darieth Chisolm, Emmy Award-winning television personality, NBC news anchor, and activist against cyber sexual crimes, discusses her personal experience with revenge porn, the obstacles victims face today, and the complexities of free speech as it relates to sharing nude photos online.

In this episode you will learn:

  • About Darieth’s personal experience with revenge porn
  • Her challenges in taking legal action outside of the U.S.
  • The impact of the Digital Millennium Copyright Act (DMCA)
  • First steps victims can take toward legal action
  • Obstacles to taking down nude content that is published online
  • A brief history of policies like the SHIELD Act and Enough Act
  • Why freedom of speech protections should not apply to speech made with the intent to do harm
  • The pervasiveness of victim shaming and victim blaming
  • The importance of parents having conversations about nude photos with their children
  • Resources for victims of revenge porn (linked below)
  • How Darieth is supporting victims today

Links and mentions:

  • 50 Shades of Silence Documentary
  • SHIELD Act
  • Enough Act
  • SpeakServeSoar.com membership
  • Cyber Crimes Act of 2015 in Jamaica
  • www.50shadesofsilence.com/
  • DMCA.com

How data bias is making being a woman more dangerous with Caroline Criado Perez

Image credit: Rachel Louise Brown

“A lot of these tech solutions are driven by algorithms that have been trained on data that is hopelessly male biased and is severely lacking when it comes to female data. And the result of that is that a whole load of tech solutions for all sorts of things just don’t work very well for women.”
– Caroline Criado Perez, Author of Invisible Women: Exposing Data Bias in a World Designed for Men

Caroline Criado Perez is a writer, journalist, and feminist campaigner. She has written two books: Do It Like A Woman and Invisible Women. In her most recent book, Invisible Women: Exposing Data Bias in a World Designed for Men, she describes how long-standing data bias affects women today. In this episode, Sheana learns about the different ways data bias is affecting women, from trivial matters such as phone size to serious ones such as seat belt safety. Caroline shares all this and more in this episode of the Innovation For All Podcast.

In this episode you will learn:

  • What is male default thinking?
  • What are the consequences of male default thinking?
  • What are the consequences in tech?
  • Why is the market so bad at providing for women?
  • What is low-hanging fruit for those of us who want to make money?
  • A stove example of male default thinking
  • What can entrepreneurs and consumers do about these issues?


When Are “Fair” Algorithms Better Than Accurate Ones? with Osonde Osoba

Artificial Intelligence continues to penetrate our lives. As it does so, we should be wary of its ethical and social implications.

Osonde Osoba, an engineer at the RAND Corporation and a professor at the Pardee RAND Graduate School, joins Sheana Ahlqvist in today’s episode of Innovation For All Podcast to talk about fairness in Artificial Intelligence and Machine Learning. AI has the ability to seriously impact our lives, which is why Osonde is pushing for systems that are accurate, unbiased, and flexible.

Discover which areas we should be wary of when handing decision-making over to AIs, why this isn't just a technical issue but also a political one, and who we should put in charge of these systems. Also learn about the importance of accountability, ethics, privacy, and regulation in AI systems.

IN THIS EPISODE YOU’LL LEARN:

  • The difference between Machine Learning and Artificial Intelligence
  • Should AI systems intentionally be made to ‘align with our comfort’?
  • What roles do legislators, policy makers, and other stakeholders play?
  • Strategies to protect Data Privacy in AI and ML models
  • Regulatory rules between the developers and the users
  • If technology changes so rapidly, how can regulators keep up?
  • How can we build accountability into AI & ML?

LINKS

  • RAND Corporation
  • GDPR
  • Fairness, Accountability, and Transparency in Machine Learning (FAT/ML)
  • Fairness and Machine Learning by Solon Barocas, Moritz Hardt & Arvind Narayanan


The 80/20 rule hurts everything from education to self-driving cars featuring Dr. Jutta Treviranus

In this episode of Innovation for All, host Sheana Ahlqvist talks to Dr. Jutta Treviranus, Director of the Inclusive Design Research Centre at OCAD University in Toronto. Dr. Treviranus explains how traditional approaches to business, design practices, and research can result in suboptimal or unfair systems. They discuss what inclusive design is, why it is so important, and how we can design systems that accommodate everyone.

Dr. Treviranus also reimagines the future of education. They cover her efforts to incorporate inclusivity into the current change-resistant educational system, her “Unlearning and Questioning” course, and her most recent project: developing a lab school for inclusive life-long learning.

YOU’LL LEARN:

  • How to use AR to combine real data with simulated data to create and experience new, imagined futures
  • What is the Pareto Principle (80/20 Principle)?
  • What are the real-world consequences of ignoring the “20%”?
  • What is the Cobra Effect?

Find Jutta on Twitter as