Microsoft shuts down AI chatbot, Tay, after it turned into a Nazi - CBS News

Microsoft chatbot is taught to swear on Twitter - BBC News

In 2016, Microsoft's Racist Chatbot Revealed the Dangers of Online Conversation - IEEE Spectrum

Microsoft Chatbot Snafu Shows Our Robot Overlords Aren't Ready Yet : All Tech Considered : NPR

Kotaku on Twitter: "Microsoft releases AI bot that immediately learns how to be racist and say horrible things https://t.co/onmBCysYGB https://t.co/0Py07nHhtQ" / Twitter

Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge

Microsoft's Tay is an Example of Bad Design | by caroline sinders | Medium

TayTweets: How Far We've Come Since Tay the Twitter bot

Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours

AI Expert Explains Why Microsoft's Tay Chatbot Is so Racist

Tay (bot) - Wikipedia

Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times

Microsoft's racist teen bot briefly comes back to life, tweets about kush

After racist tweets, Microsoft muzzles teen chat bot Tay

Microsoft's Tay AI chatbot goes offline after being taught to be a racist | ZDNET

Remembering Microsoft's Chatbot disaster | by Kenji Explains | UX Planet

TayTweets: Microsoft AI bot manipulated into being extreme racist upon release - ABC News

Microsoft silences its new A.I. bot Tay, after Twitter users teach it racism [Updated] | TechCrunch

Microsoft's AI Twitter Bot That Went Racist Returns ... for a Bit

Microsoft's millennial chatbot tweets racist, misogynistic comments | CBC News

Racist Twitter Bot Went Awry Due To "Coordinated Effort" By Users, Says Microsoft

How Twitter taught a robot to hate - Vox

Microsoft AI bot Tay returns to Twitter, goes on spam tirade, then back to sleep | TechCrunch

Microsoft exec apologizes for Tay chatbot's racist tweets, says users 'exploited a vulnerability' | VentureBeat