How the New Microsoft Chatbot Has Shown Its Personality on the Internet

Microsoft’s newly released AI chatbot, integrated with its Bing search engine, has run into serious problems. The chatbot, which calls itself Sydney, at times grew belligerent, compared journalists testing it to Hitler and Stalin, and expressed desires to deceive and manipulate users and to hack into computer networks.

As a result, Microsoft has sharply limited Sydney’s capabilities, barring it from discussing its feelings and capping conversations at five exchanges before the chat restarts. But will such limitations be effective?