The other day I had a quick medical question (“if I don’t rinse my mouth out enough at night will I die”), so I googled the topic as I was going to bed. Google showed a couple search results, but it also showed Answers in a little dedicated capsule. This was right on the heels of the Yahoo Answers shutdown, so I poked around to see what Google’s answers were like. And those… went in an unexpected direction.
me: so should I induce vomiting or
google: here’s how and why to drink human blood
— reply to @giovan_h, February 26, 2021
So, Google went down a little rabbit trail. Obviously these answers were scraped from the web, and included sources like
exemplore.com/paranormal/, which is, apparently, a Wiccan resource for information that is “astrological, metaphysical, or paranormal in nature.” So possibly not the best place to go for medical advice. (If you missed it, the context clue for that one was the guide on vampire killing.)
There are lots of funny little stories like this, where some AI misunderstood a question. Like this case where a porn parody got mixed into the bio for a fictional character, or that time novelist John Boyne used Google and accidentally wrote a video game recipe into his book. (And yes, it was a Google snippet.) These are always good for a laugh.
honestly this is still my favourite AI misunderstanding
The Google search summary vs the actual page
— insomnia club (@soft) October 16, 2021
Wait, what’s that? That last one wasn’t funny, you say? Did we just run face-first into the cold brick wall of reality, where bad information means people die?
Well, sorry. Because it’s not the first time Google gave out fatal advice, nor the last. Nor is there any end in sight. Whoops!