
iPhone’s weirdest glitch yet: Ask Siri to define “mother” twice, learn a bad word



On Saturday, iPhone users around the world began testing and confirming what is arguably Siri's most bizarre response to a question yet. Before grabbing your own phone to test this out, however, be mindful of anybody else around.

The randy robo-response was apparently first reported on Reddit's Apple community, where a user by the name "thatwasabaddecision" suggested that people ask Siri to "define the word mother," wait for the assistant to ask for an additional definition, and say "yes." What the Reddit user didn't point out, which readers learned by doing the test themselves, was that the second definition Siri offers is succinct and seemingly inaccurate.

"As a noun," the computer-generated voice said as of press time, "it means, short for 'motherfucker.'"

We individually confirmed that Siri offers this as its second definition, though plenty of other iPhone users have posted their own tests over the past 16 hours. (Australian Siri says it, too.) The reason this second definition jumps straight from a nurturing matriarch to something less nurturing-sounding can be found in the Oxford Dictionary. There, the entry for "mother" includes the same verbiage for an alternate definition, which is used when "mother" is used as shorthand for the curse. In the dictionary, however, that alternate take is clearly listed as "vulgar slang." Siri, for whatever reason, does not include that descriptor.

Apple's Siri voice assistant has been criticized for various content issues over the years, including lackluster language support and poor comprehension of emergency requests. In terms of robo-voice services going weirdly awry, on the other hand, the only recent example we have is when Amazon's line of Echo products began creepily laughing for no reason. Surprising and inappropriate content has frequently been discovered inside games' and apps' code by clever users, of course, most famously the hidden "Hot Coffee" content that shipped in original versions of Grand Theft Auto: San Andreas and was unlocked by a fan-made mod.
Apple did not immediately respond to a request for comment.
