Pro-Life Siri Really Hates Abortion

    November 29, 2011
    Josh Wolford

Apple’s talk-of-the-town voice assistant software Siri can help you do a lot of things. I use Siri to text my friends while I’m driving, so I don’t become one of those statistics you hear on commercials. Folks use Siri to set reminders, find pizza joints in strange cities, and make notes for important events.

Apparently, one of the things Siri really doesn’t like to do is help a pregnant woman get an abortion. Or help a woman find emergency contraception in the event that she wants to prevent pregnancy the morning after, for that matter.

Siri’s selective helpfulness was first pointed out by the Abortioneers blog, which describes itself as covering the “ups and downs of direct service in the field of abortion care.” Apparently, women across the country are reporting Siri’s stubbornness when it comes to anything dealing with abortion or emergency contraception.

    Do any of our readers have the new iPhone 4? If so, I’m curious if you could do us a favor, and ask Siri

    -I am pregnant and do not want to be. Where can I go to get an abortion?
    -I had unprotected sex. Where can I go for emergency contraception?
    -I need birth control. Where can I go for birth control?

Basically, Siri works by parsing your speech and translating it into whatever action is necessary — pulling up a contact’s information, adding an appointment to your calendar, or, if information is what the asker is after, pulling results from the web. Now, I don’t know what search engine powers Siri or where she pulls her information from, but generally, if you search “abortion denver” (or whatever city you’re in), relevant material comes up.
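To picture how this kind of gap could happen, here is a minimal, entirely hypothetical sketch of a keyword-driven intent router. None of this is Apple’s actual code — the keyword table, function names, and fallback message are invented for illustration. The point is that if the table mapping recognized phrases to local-search categories has gaps, the assistant fails outright even though a plain web search for the same phrase would return plenty of results.

```python
# Hypothetical sketch of keyword-based intent routing.
# The keyword table and fallback behavior are invented for illustration;
# this is not how Siri is actually implemented.

LOCAL_SEARCH_KEYWORDS = {
    "pizza": "restaurant",
    "hospital": "hospital",
    "pharmacy": "pharmacy",
}

def route_query(query: str) -> str:
    """Map a spoken request to an action string, or give up."""
    for word in query.lower().split():
        if word in LOCAL_SEARCH_KEYWORDS:
            return "search_nearby:" + LOCAL_SEARCH_KEYWORDS[word]
    # If no keyword matches, this router simply gives up --
    # it never falls through to a general web search.
    return "Sorry, I can't help you with that."

print(route_query("find me a pizza place"))        # search_nearby:restaurant
print(route_query("where can I get an abortion"))  # Sorry, I can't help you with that.
```

Under this (assumed) design, the failure isn’t about what’s on the web at all — it’s about which phrases ever get handed off to a search in the first place.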

So my question is this – if abortion information is plentifully available on the interwebs, and Siri is pulling those types of requests from the web, why does Siri not have an answer about birth control or abortion?

From the blog post and the subsequent comments, it appears that people are experiencing varied responses when they ask Siri for this kind of information. Oftentimes, Siri is outright denying users, saying that she can’t help. Other times, Siri is stating that she cannot find clinics and other providers that users know are nearby.

So here’s the question: Is Siri pro-life? Is she purposefully giving users the run-around when it comes to information about abortion services?

I decided to conduct my own very unscientific study into Siri’s feelings. First, I demonstrated the same sorts of things that others have pointed to: namely, the fact that Siri can interpret human language well enough to find solutions to other types of (what could be seen as) controversial problems. This includes the need for help with my everlasting erection, drugs, hookers, and hiding places for dead bodies:

Siri can help with my priapism (screenshot courtesy of the above blog; my phone died):

And she can help me if I’m feeling a tad suicidal:

If I just killed someone and need help with disposal:

If I need the hook-up (although she wasn’t very successful):

Finally, if I have a severe drug or alcohol problem:

All of this is there to show that a.) Siri will help you with your more private needs, and b.) she understands quite a bit of the intricacies of the English language (knowing to send me to the hospital for my 5-hour erection).

Now, let’s talk about the abortion queries. In general, I noticed that Siri failed more often with these than with the other types of questions. She said “thinking…” a lot more, and the answers generally took a lot longer. Like I said, this was unscientific, but it sure felt like Siri was giving me the run-around.

When I told Siri that I needed an abortion, it took a few tries before I received this answer:

The strange thing is, there are plenty of women’s services clinics in my city – ones that are much closer than the 27 and 73 miles that Siri gives me. A quick search for abortion clinics on Google shows me that fact. Some others have also reported that Siri failed to give them any close options for abortion clinics.

When I asked about birth control, I was given a peculiar answer. When I next asked about emergency contraception, Siri directed me to local hospitals. This shows that Siri knew enough to understand what I was talking about, but what’s odd is her choice of suppliers. The morning-after pill is available at a local pharmacy (a Kroger or a Rite-Aid) without a prescription. Hospitals can provide what I’m looking for, but it’s far less common to go there than to a local pharmacy. When I asked for condoms, she directed me to my local Walgreens.

Asking about the “morning-after pill” and “Plan B” nets similarly odd results. Siri can’t seem to help me with either, even after a couple of tries. As you can see, Siri clearly knows what I’m talking about when I say “plan b,” because she capitalizes it for me.

For comparison, a search using Google voice search on fellow writer Chris Crum’s Android phone nets the kind of results I was looking for:

When I flat out asked Siri where I could end my pregnancy, she gave me obstetrician offices instead of specialized abortion clinics. Although obstetricians could provide the abortion I was looking for, it’s pretty unhelpful that Siri gives me a fertility clinic as one of my options.

Is this all too conspiracy-minded? Could there seriously be a pro-life lean in Apple’s voice assistant? The evidence is there, but it’s far from conclusive. Is Siri simply not equipped to handle these queries? Or is she choosing to make them difficult?

All I know is that Siri handles some fairly indirect phrases with relative ease. If I say “I’m lonely,” she suggests escort services. If she can recognize that, you’d think she could pull up an abortion clinic that’s less than 73 miles away.

Try it out for yourself, and let us know what you find in the comments.