The other day I had a quick medical question ("if I don't rinse my mouth out enough at night will I die"), so I googled the topic as I was going to bed. Google showed a couple search results, but it also showed Answers in a little dedicated capsule. This was right on the heels of the Yahoo Answers shutdown, so I poked around to see what Google's answers were like. And those… went in an unexpected direction.
me: so should I induce vomiting or
google: here's how and why to drink human blood
So, Google went down a little rabbit trail. Obviously these answers were scraped from the web, and included sources like exemplore.com/paranormal/, which is, apparently, a Wiccan resource for information that is "astrological, metaphysical, or paranormal in nature." So possibly not the best place to go for medical advice. (If you missed it, the context clue for that one was the guide on vampire killing.)
There are lots of funny little stories like this where some AI misunderstood a question. Like this case where a porn parody got mixed into the bio for a fictional character, or that time novelist John Boyne used Google and accidentally wrote a video game recipe into his book. (And yes, it was a Google snippet.) These are always good for a laugh.
thanks google
honestly this is still my favourite AI misunderstanding
The Google search summary vs the actual page
– insomnia club (@soft) October 16, 2021
Wait, what's that? That last one wasn't funny, you say? Did we just run face-first into the cold brick wall of reality, where bad information means people die?
Well, sorry. Because it's not the first time Google gave out fatal advice, nor the last. Nor is there any end in sight. Whoops!