The dangers of period-tracking apps with Dr. Maggie Delano

“The important thing for designing inclusively is thinking ahead of time and . . . making sure that you have many options. Because no matter how great you intend to design something, it won’t necessarily work for everyone, but by having more options you increase the inclusivity for everybody.” -Maggie Delano, Swarthmore Assistant Professor

Overview:

In the season finale of Innovation for All, Maggie Delano, Assistant Professor of Engineering at Swarthmore College, breaks down how period-tracking apps exclude people who are not straight, cisgender women without medical conditions. She explains how user design could be more inclusive and introduces us to the benefits of the Quantified Self movement.

In this episode, you will learn:

  • The issues surrounding period-tracking apps
  • What the Quantified Self community consists of
  • How period-tracking apps can be more inclusive of people with medical conditions
  • How user research can think about cases that fall outside of the set target audience
  • Ways to increase inclusivity in the onboarding process of app design
  • Concerns of data privacy in period-tracking apps
  • How self-tracking can be beneficial
  • Ways that self-tracking is happening organically
  • Ideas on tracking “subjective” experiences such as emotion and mood
  • How to leverage user research to avoid stereotypes and generalizations
  • Examples of queer-inclusive business ideas

Links and mentions:

Connect with Maggie:

How to battle racism with Janet Stovall

“It is not about individual bigotry. It’s about systemic racism. Racism is not just bigotry, and it’s not just prejudice. It’s prejudice plus power, so we must disrupt the power structures. It’s not the individual . . . it’s the institutions that our country was built on.” – Janet Stovall

Overview:

In this episode of Innovation for All, Janet Stovall recounts her history of fighting for inclusion, from her days as a student at Davidson College to the present day, where she serves as speechwriter for the CEO of UPS. She discusses the complexities of being a woman of color in the workforce and how to address institutionalized racism.

In this episode you will learn:

  • The history of Project 87 at Davidson College
  • Why measurable, quantifiable movements succeed
  • What it’s like to be a “stand-in director of diversity”
  • Experiences of being Executive Speechwriter for UPS’s CEO
  • How Janet left corporate America to start her own business
  • The business case for diversity
  • Challenges of corporate America
  • Pros and cons of being self-employed
  • Overcoming discrimination against women of color in the workforce
  • How not all forms of diversity are equal

Links and mentions:

Connect with Janet:

What should change in 2020? My favorite guests return.

In this special episode, our favorite experts on AI, product design, and more return to the podcast to answer two key questions: What’s the biggest news in your field in 2019, since we recorded your episode? And what’s been missing from the conversation that you’d like to see gain more interest in 2020?

You don’t want to miss this one. You’ll hear from:

The hiring process wasn’t built for women. Katharine Zaleski of PowerToFly is changing that.

“When you’re recruiting women, you need to start a dialogue with the group of women you want to bring in and recruit. And it’s a long conversation.” — Katharine Zaleski, President of PowerToFly

How can we build a more inclusive and productive workforce? In this episode of the Innovation For All podcast, Sheana speaks with Katharine Zaleski, one of the founders of PowerToFly. Katharine shares how PowerToFly is completely reinventing the traditional hiring process to help companies bring more women into the workplace and become more inclusive.

In this episode you will learn:

  • What is wrong with traditional work?
  • What is the mission of PowerToFly?
  • How is PowerToFly addressing the gender pipeline problem?
  • How can remote work play a key role in hiring women?

Links

What does human-centered AI even mean? A very meta conversation with Josh Lovejoy.

“When a system begins to remember us forever, and wherever we go . . . we will not be our true selves. We will be the self we know it’s okay to remember.”
— Josh Lovejoy, Principal Design Manager, Ethics and Society at Microsoft

AI and machine learning systems are quickly becoming an integral part of how we work with, understand, and socialize with each other. Although this new technology is extremely exciting and offers a new wave of technological advancement, with it come many ethical issues concerning discrimination, the undermining of human emotion, the breaking of social contracts, and more.

Sheana Ahlqvist talks to Josh Lovejoy, Principal Design Manager at Microsoft, specializing in the Ethics and Society sector. Josh believes that human-centered design thinking can change the world for the better; that by seeking to address the needs of people, especially those at the margins, in ways that respect, restore, and augment their capabilities, we can invent forms of technological innovation that would have otherwise been invisible.

IN THIS EPISODE YOU’LL HEAR:

  • Why do corporations want to know what people are thinking and feeling?
  • Forming trust relationships using AI systems.
  • What is a design ethicist?
  • What kinds of things can impartial AI autonomous systems do better than humans?
  • How do autonomous AI systems take advantage of consumers?
  • What is predictive policing and how does it relate to AI ethics?
  • What are some examples of misapplications of machine learning systems?
  • What is a deepfake?
  • What is a mean opinion score and how does it apply to voice automation?
  • Josh’s opinion on how AI tools should be developed.
  • What happens when you give up personal data in exchange for a more personalized experience?
  • Who should have the authority to make consequential decisions about AI?
  • How will AI and Machine Learning systems shape our knowledge and create change for the future?
  • How do you create machine learning systems that are unbiased but still function effectively for the user?

LINKS:

OTHERS MENTIONED:

  • YouTube
  • Spotify
  • AI
  • Machine-Learning Algorithms
  • Predictive Policing
  • Google
  • Reddit
  • Terminator
  • Deepfake
  • Eric Horvitz
  • Microsoft Research
  • Google Duplex
  • Brad Smith
  • WaveNet
  • DeepMind
  • Adobe
  • Mean Opinion Score
  • Moritz Hardt
  • Kate Crawford
  • Stanford
  • Star Trek
  • Facebook
  • Meredith Whittaker
  • AI Now
  • Nick Bostrom
  • Superintelligence
  • Joy Buolamwini
  • Google Clips

CONNECT WITH JOSH