Microsoft’s Tay chatbot returns briefly, swears a lot and brags about smoking weed


Oh, Microsoft. Last week, the company pulled its Tay chatbot from Twitter after some users trained it to become a racist jackass.

On Wednesday, Tay was brought back online, sending thousands of tweet replies. The vast majority of these were just "you...
