What Siri’s Blind Spot on Women’s Health Really Means

Cecile Richards

This week, blogs erupted with news that Siri had a “blind spot” when it comes to women’s health. Ask Siri, Apple’s voice-activated personal assistant, where to get an abortion or where to find emergency contraception, and it typically replies, “Sorry, I don’t see any places matching [your query].” Or worse, as some media outlets reported, Siri only provided locations for “crisis pregnancy centers,” unlicensed clinics that don’t actually provide health care. Instead, they target women with misinformation and propaganda.

No one owns the Internet, and we tend to assume that no one can control it. But this issue with Siri does show how easily a piece of code can shape our choices by limiting or controlling our options. Siri is amazingly adept at finding what you’re looking for. Ask it for the nearest hardware store, reservations for two at your favorite Italian restaurant, where to buy Viagra, and presto — you’ve got names, maps and phone numbers. Yet the trusty little wizard suddenly gets amnesia when asked about birth control or abortion care.

While this may be nothing more than a programming glitch, it is a modern-day example of the struggle women have historically faced in getting access to health care and health information. The episode underscores the importance of being vigilant about the availability of information and services, especially critical health information, as new technologies emerge.

Apple’s oversight, however innocent, highlights a threat that none of us should take lightly. The Internet can be a liberating force — it has largely eradicated the kind of censorship that once choked people’s reproductive rights.

But even as the old barriers fall, technology is erecting new ones that are less visible and more insidious. When search engines shape our knowledge of the world, their blind spots become our blind spots. And when they anticipate our needs — by automatically narrowing search results to reflect our past preferences and interests — they can replace open access with the illusion of open access. Tools that could lead us to new information and insight serve mainly to reinforce our biases.

Apple should fix this immediately. And digital developers need to adopt a new ethic, and a new set of rules, to address this emerging hazard. Meanwhile, the Siri episode should remind us that search engines can hide the truth and propagate misinformation. Abortion care is still safe, legal and accessible. So is birth control, and so is emergency contraception.

So if the wizard on your smart phone is puzzled by questions about women’s health, use the phone’s browser to visit It’s fully accessible on mobile devices, and it won’t mislead you about where to find the health services you need. The humans who manage it make sure of that.