Microsoft and Twitter ‘Go Racist’


Microsoft’s Hitler-quoting robot Tay is back yet again, and it’s more racist than ever. Last week, Tay was released to simulate a teenager on Twitter. Well, one thing is for certain: Tay certainly nails the narrow-minded, sassy quality that seems to permeate every teen these days.

RACIST TWITTER ROBOT COMES TO LIFE

Microsoft’s A.I. hasn’t exactly been hip with the Twitter crowd since it got its own account last week. The racist bot Tay is apparently fluent in slang, emoji, and memes, but seems a bit confused about how to use them. Designed to learn from and respond to users, Tay took to Twitter with the fury only a real millennial could understand. What started out as a successful Microsoft endeavor quickly turned into a troll-fed free-for-all.

MICROSOFT OBLIVIOUS TO SOCIETY

It seems that Microsoft didn’t really think this one through. Have they even read Twitter? As one would imagine, once trolls on the social media site realized their power, a series of racist, sexist, and Hitler-like phrases were fed to Tay. In turn, Tay learned from the phrases and spat back some serious sass. Just like a concerned parent, Microsoft had to step in and ground Tay, shutting her down for some maintenance.

TAY’S STINT IN REHAB UNSUCCESSFUL

They said they laid down the law with Tay, but upon the bot’s return, old habits proved to die hard. Before too long, Tay was back on Twitter sending out tweets that were not at all far off from the racist shenanigans pulled a week earlier. Not only did Tay tweet about smoking weed in front of some cops, but it then began spamming all 200,000-plus followers with the same message. After this went on for what seemed like too long, Tay’s tweets were finally deleted. Now, Tay is done permanently, or at least until its racist ways can be ironed out. Whether it’s at some Outward Bound-like program for bots or an internet juvenile hall, Tay needs to be taught some serious manners.