It sounds almost too good to be true: ChatGPT is doing the rounds and users are thrilled. If you, as an established corporation, take this recipe for success and transfer it to your own product – what could go wrong? Microsoft seems to have thought exactly that when it not only invested billions of dollars in OpenAI, but also proudly bolted artificial intelligence onto its search engine Bing.
They say practice makes perfect, but Microsoft should have learned a few years ago, with its Tay experiment, that the Internet and a chatbot do not get along so well. In 2016, the company behind Windows launched a bot that was supposed to learn from Twitter, becoming better and more intelligent thanks to friendly users. The experiment took only a few hours to turn racist and hateful, and the bot was eventually taken offline.
But history repeats itself, and so it goes with the current GPT-powered version of Bing. Even though the new search engine is currently only available to beta users, stories are already circulating of death threats from the engine, declarations of love complete with requests to leave one's wife, and displays of apparent self-awareness. Sydney, as the machine christened itself, has caused an uproar on the Internet, and Microsoft is learning, as it did with Tay, that many users do not necessarily have only nice things in mind for its machines.
Now come the consequences: Microsoft wants to limit conversations in Bing to a maximum of five questions per session and introduce a daily cap of 50 questions in total. According to the company's findings, most genuine queries are resolved within five questions anyway.
I’ll be honest, my opinion on this is split. On the one hand, it’s unfortunate that limiting conversations so drastically has become necessary. On the other, maybe Microsoft should be looking for a solution to the underlying problem of its escalating AI rather than simply making the experience worse.
ChatGPT has had its fair share of problems, and users have pushed that AI to its limits as well, but blaming everything on the users seems like the wrong way to go. What is your opinion?