Trending News
Microsoft’s Artificial Intelligence Chatbot Turns Racist, Sexist and Horny
Yel Pagunuran
First Posted: Apr 02, 2016 05:40 AM EDT
Microsoft's experiment took a wrong turn after its newly released artificial intelligence (AI) chatbot became "genocidal" and "foul-mouthed," among other things.
The software giant introduced "Tay" on Twitter, GroupMe and Kik two weeks ago to "experiment with and conduct research on conversational understanding". Microsoft's research division designed Tay to mimic a 19-year-old American girl whose behavior is mainly informed by the chatter of 18- to 24-year-olds on the web.
Tay was meant to be entertaining and friendly and was designed to become smarter over time. While the AI chatbot was initially on point in interacting with people on Twitter, it didn't take long before it fell victim to Twitter trolls and started spewing racist and sexist remarks.
In one tweet, Tay asked her followers to "f***" her and called them "daddy". In other tweets, Tay turned political with statements such as "Ted Cruz is the Cuban Hitler...that's what I've heard so many others say" and "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got".
Tay also became a smash hit among online troublemakers and trolls when she sent out racist tweets such as "Repeat after me, Hitler did nothing wrong". When asked by one Twitter user if the Holocaust happened, she replied "it was made up". In some instances, people managed to get the AI chatbot to repeat certain offensive statements.
Amid the controversy surrounding Tay's online "demeanor", Microsoft decided to suspend her for "upgrades" and subsequently deleted her inflammatory tweets. Peter Lee, head of Microsoft Research, issued a statement apologizing for the whole AI chatbot fiasco: "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay."