Update 2:
Microsoft has suddenly made Tay's Twitter account private. Now only confirmed followers will be able to see its tweets and full profile.
Update:
It seems the chatbot Tay is back to doing what it does best: chatting with humans!
Original:
Yesterday, we reported on Microsoft's experimental chatbot Tay, which mimics a teenager's conversational patterns and learns from whoever it talks to. The experiment is based on a similar Microsoft project in China that has been hugely successful.
Initially, when everyone on Twitter tried talking to Tay, its responses were humorous, flirty, and fun. But since it learns from anyone it talks to, it was soon taught to be shockingly "racist".
Image Credit: Gerry
Tay has effectively become a mirror of the internet, showing how quickly bad traits can be absorbed. Last we heard, Microsoft has been busy deleting tweets like these from Tay's account, and the chatbot has been sent to the ICU for some much-needed re-orientation.