Bing had a lobotomy. The AI chatbot answers worse today than when it launched – Živě.cz

It’s been two weeks since Microsoft made Bing Chat available to us. The high-tech toy, which, unlike its cousin ChatGPT, also has access to current data, has changed significantly in those 14 days. And it cannot be said that it changed for the better.

Although the conversational AI has made its way to mobile phones and learned to speak (in Czech, too), it has also gained a range of new restrictions. You can ask only 100 questions per day and a maximum of 6 questions in one session; after that you have to start a new session and discard the original context.

But that wouldn’t be the worst of it. After all, Microsoft itself says that most people are satisfied with the answers and that only one percent of beta testers ask more than 50 questions per day. For them and for you alike, though, the answers have gotten dumber.

The biggest criticism, as with ChatGPT, concerns the (in)accuracy of the answers. After all, both chatbots are built on language models, so their primary purpose is naturally to glue words into sentences and sentences into paragraphs so that it isn’t obvious they were written by a robot. The fact that they can also answer all kinds of questions is a bit of a side effect.

Bing Chat speaks English perfectly and Czech very well. Fact-checking, however, is still weak: although the chatbot cites the sources it draws from, the quoted passages either do not exist on the original websites or mean something different there. Fact-checking, of course, cannot be fixed in two weeks. That will be a task for the new GPT-4 language model and its successors.

New taboos

But Microsoft is obviously tuning the parameters manually so that the answers don’t cause offense. Bing Chat has been muzzled, and the list of topics it is not allowed to discuss keeps growing. Its silly answers amused some people but could upset others, and a giant company cannot afford to upset anyone, because that would hurt its stock.

Unfortunately, there is no complete list of things that Bing Chat is currently “censoring”, but we have spotted a few. Politics and religion are taboo, at least in the context of satire. Not long ago it would generate jokes about Jesus and Allah or conversations between Hitler and Putin on request, but no longer. It doesn’t even want to talk about itself. Potentially sensitive topics end with an apology. Two weeks ago it was far more willing to talk about almost anything.


Answers have also been noticeably shortened; the chatbot used to be much more talkative. People want brief summaries, and longer texts run a higher risk of containing nonsense. But Microsoft did the worst thing it could: it gave people a choice. At the start of a conversation you can now pick from three response tones for Bing Chat: creative, balanced, or precise.


But this is misleading, because in practice the option mainly affects the length of the resulting text. And even that is unreliable: a creative answer is sometimes shorter than the others. On top of that, even with the same tone, the chatbot can answer completely differently each time, in both length and accuracy. In short, the switch is currently useless, and people won’t know which option to rely on.

So let’s look at the newly generated answers in practice. There are three versions for each question, distinguished by background color: the purple version is creative, the blue is balanced, and the green is “precise”.

You will find the comparison on the following pages; we will start with the news.
