Unfortunately for Microsoft, some racist Twitter trolls figured out a way to manipulate Tay’s behavior, transforming it into a crazed racist who praised Hitler and denied the existence ...
We have good news and bad news about Microsoft’s chatbot experiment involving Tay, a Twitter account that the company hopes will someday interact with other Twitter users in a meaningful way ...
Microsoft has admitted it faces some "difficult" challenges in AI design after its chatbot "Tay" had an offensive meltdown on social media. Microsoft issued an apology in a blog post on Friday ...
Just 18 hours later, the Microsoft president explained, Tay was euthanized. Curiously enough, Microsoft also plays into this latest Swift AI debacle. As 404 Media reported, creeps on the ...
In a blog entry on Friday, Microsoft Research head Peter Lee expressed regret for the conduct of its AI chatbot, named Tay, and explained what went wrong. "We are deeply sorry for the unintended ...
Taylor Swift's lawyers threatened legal action against Microsoft in 2016, according to a new book by Microsoft president Brad Smith. She was unhappy with the name of its chatbot Tay, meant to interact with 18-to-24-year-olds ...
In 2016, Microsoft published a blog post titled “Learning from Tay’s introduction.” In it, the corporate vice-president of Microsoft Healthcare detailed the development of a chatbot named Tay, and ...