
Apple: Siri Abortion Answers “Not Intentional Omissions”

Apple addresses Siri's seemingly selective uselessness


Earlier this week, we told you about the growing number of people who were discovering that Apple’s iPhone 4S voice assistant Siri might not be the best place to turn if you’re looking for help with pregnancy issues.

Specifically, Siri appears to be unwilling or unable to retrieve useful information for queries regarding abortions and emergency contraception. Some of the problems that have been reported:

  • Siri states that she cannot find any abortion clinics in areas where users know there are plenty
  • Siri points users to clinics miles away, even when closer options exist
  • Siri is unable to help when asked where to find birth control
  • When asked about Plan B, Siri repeatedly replies that she can’t help with that
  • When Siri does provide an “abortion clinic” nearby, it is often actually an anti-abortion crisis center

I tested many of these questions on my own iPhone 4S and received similar responses. It sure feels like Siri is giving you the runaround when you ask about these basic women’s health issues. It’s not that Siri is stupid – she can understand fairly complex problems and come up with appropriate responses. For instance, my “I’ve had a 5 hour erection” query was met with a list of local hospitals.

Some in the blogosphere suggested that something was rotten in the state of Apple, and that Siri’s uselessness in this area was part of some sort of social agenda being pushed by the company. I felt the evidence was suggestive, but far from conclusive enough to say that Apple was building pro-life leanings into its software. What is obvious, though, is that Siri has trouble with these types of questions.

Apple has finally responded to the Siri abortion problem, and they simply say that Siri is clearly not ready for prime time. Here’s their statement, obtained by the New York Times:

Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want. These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.

Let’s hope that’s the case. If Apple were in some way consciously denying women this kind of information, that would be an unforgivable screwup.

Have you tried this out for yourself? Can you get useful information from Siri on these topics? Does Apple’s response put you at ease? Let us know in the comments.

  • jeffrey fina

    Regardless of their stated lack of intention toward this, we have to realize that Siri is a tool for directing you to an answer it feels is correct. It takes choice out of the equation. You may Google something but may find a link toward the bottom more useful. Nothing can step on human choice, and when you have software telling you things, eventually it will show its ugly face. I trust Apple giving me answers like I trust the government will balance their budget.
