Apple: Siri Abortion Answers “Not Intentional Omissions”

    December 1, 2011
    Josh Wolford

Earlier this week, we told you about the growing number of people discovering that Apple’s iPhone 4S voice assistant, Siri, might not be the best place to turn if you’re looking for help with pregnancy issues.

Specifically, Siri appears to be unwilling or unable to retrieve useful information for queries regarding abortions and emergency contraception. Some of the problems that have been reported:

  • Siri stating that she cannot find any abortion clinics in areas where users know there are plenty
  • Siri pointing users to clinics miles away – even when closer options exist
  • Siri being unable to help when asked where to find birth control
  • Siri repeatedly replying that she can’t help when asked about Plan B
  • Siri directing users to an “abortion clinic” that is actually an anti-abortion crisis center

I tested many of these questions on my own iPhone 4S and received similar responses. It sure feels like Siri is giving you the runaround when you ask about these basic women’s health issues. It’s not that Siri is stupid – she can understand fairly complex queries and deliver appropriate responses. For instance, my “I’ve had a 5 hour erection” query was met with listings for local hospitals.

Some in the blogosphere suggested that something was rotten in the state of Apple, and that Siri’s uselessness in this area was part of some sort of social agenda being pushed by the company. I felt that the evidence was suggestive, but far from conclusive enough to say that Apple was building pro-life leanings into its software. It is obvious, though, that Siri has trouble with these types of questions.

Apple has finally responded to the Siri abortion problem, saying simply that Siri is not yet ready for prime time. Here’s the company’s statement, obtained by the New York Times:

Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want. These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.

Let’s hope that’s the case. If Apple were in some way consciously denying women information like this, it would be an unforgivable screwup.

Have you tried this out for yourself? Can you get useful information from Siri on these topics? Does Apple’s response put you at ease? Let us know in the comments.


Josh Wolford
Josh Wolford is a writer for WebProNews. He likes beer, Japanese food, and movies that make him feel weird afterward. Mostly beer. Follow him on Twitter: @joshgwolf Instagram: @joshgwolf Google+: Joshua Wolford StumbleUpon: joshgwolf