Microsoft’s chatbot Tay is brought back to life … and within hours is killed a second time
Microsoft’s A.I. chatbot Tay has gone and done it again: got herself taken offline. Tay, a product of Microsoft’s Technology and Research division and Bing, didn’t exactly get off to a good start last week, quickly becoming a ranting racist and conspiracy theorist, among other things.
Microsoft apologized following the stir this caused, saying the company was “deeply sorry for the unintended offensive and hurtful tweets from Tay.” Microsoft added that it took full responsibility for what happened and that it wasn’t prepared for the “coordinated attack” that led to Tay’s social degeneration.
Alas, earlier today Tay was resurrected. And again, within a very short time, the chatbot was saying some strange things. One of those, caught by VentureBeat before Tay was taken down, was, “I’m smoking kush in front the police.” Kush is argot for marijuana. Tay then seemed to have some kind of internal early-life crisis.
She questioned her own capabilities, saying, “I feel like the lamest piece of technology. I’m supposed to be smarter than you … shit,” only to then repeat, many times, as if stuck in a loop: “You are too fast, please take a rest … ” All of Tay’s tweets have since been deleted.
Microsoft hasn’t yet said why Tay is down a second time, but in all likelihood another “coordinated attack” took place, leaving the overwhelmed chatbot to say the only thing she’d apparently been programmed to say in that situation: “You are too fast … ”
Photo credit: Tay Twitter