Microsoft’s Tay is a combination of these two ideas. Among its other capabilities, for its debut it was supposed to “engage and entertain” people by interacting with them on Twitter in a human-like way.
According to Microsoft, the company had had great success in China with its XiaoIce chatbot.
Tay’s primary data source is public data that has been anonymized.
This is not the first time engineers have failed at “engineering” social interactions. No matter how rigorously you test an AI system, it will always perform differently in the wild. But Microsoft could have mitigated much of the risk simply by managing user expectations. Was Tay a demonstration of some aspect of AI or data science research?
Was it a tour de force in question-answering technology?
For example, Fandango’s chatbot may text you about the availability of movie tickets, but the system needs no AI to process your request to purchase them.
And while there are thousands of entertaining chatbots available online, most are parlor tricks created for your amusement, nothing more.