
Bing doesn’t always respond well, and that’s normal

February is turning out to be a momentous month for Bing, the Microsoft search engine that until a few weeks ago was all but unknown to most people (or rather, known but very little used). Thanks to the joint efforts of Microsoft and OpenAI to build an AI-based chatbot into it, Bing has climbed to the top of search trends and captured the interest of a huge number of users.

Since Microsoft officially announced the new Bing with Prometheus, along with the new Microsoft Edge and its AI-based “copilot” mode, a legion of users have wanted to try the beta, for which the Redmond company opened a waitlist. We were able to test it a few days ago and, as I mentioned at the time, the first contact was quite positive, a feeling that has only grown as I have used it more.

This does not mean it is perfect, of course. I noted as much after testing it, and we have seen it in recent days on social networks. The new Bing can sometimes give erroneous answers, although here there is a fundamental difference from ChatGPT: the Microsoft search engine always shows the sources on which it based its answers, so if we pull the thread a little, we can confirm or refute them.

Stranger and more surprising is that, as some users on social networks have shown, Bing can also give answers that are “out of tune”, specifically out of the tone Microsoft intended for its chatbot. We have seen cases in which it rambled, others in which it told a person that they were a bad user while it was a good chatbot… and there are even those who claim the search engine has tried to emotionally blackmail them.


Microsoft has published an entry on the search engine’s official blog acknowledging the failures and sharing some lessons learned from this first week of the new Bing’s deployment. In it, the company readily admits the chatbot’s errors, says it is working to fix them (a good sign, since it indicates they have already been identified and reproduced) and, naturally, also celebrates that the initial reception and collaboration of users have been very positive.

Regarding the flaws in the chatbot’s behavior, Microsoft claims to have discovered, to its surprise (here I think the company was somewhat naive), that it had not imagined Bing would be used for “general discovery of the world” or for social entertainment, two types of use in which conversations can run quite long. Does this matter? It does: according to the post, the company found that in extended sessions of 15 or more questions, Bing can become repetitive or be provoked into giving answers that are not necessarily helpful or in line with the designated tone.

Thus, the company believes that long chat sessions can confuse the model about which questions it is answering and, consequently, that it may need to add a tool so users can more easily refresh the context or start from scratch. We can also read that “the model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend. This is a non-trivial scenario that requires a lot of prompting so most of you won’t run into it, but we are looking at how to give you more fine-tuned control.”

In addition, and I must say this reminded me a bit of Interstellar, Christopher Nolan’s film, another measure under consideration is adding a toggle that would give users more control over how creative they want Bing to be when responding to their queries. In theory, this control could prevent Bing from making weird comments; in practice, I’m sure many will set it to maximum creativity if they can, because let’s not forget that this can be fun for many users.


As for the rest, we can read that the reception has generally been very good: 71% of the responses given by the new Bing chatbot received a “thumbs up”, that is, a positive rating, and the participants in this first beta phase are providing a great deal of valuable feedback for polishing its operation. The company also notes that the new Bing receives daily updates, so its pace of evolution is quite fast.

And why did I say at the beginning that it is normal for it to fail? Because we are talking about a terribly complex technology, subject to the particular criteria of each of the many users who have already obtained access, and which, let’s not forget, has been in beta for just a week. When we saw ChatGPT’s failures a few months ago, I did not hesitate to say that, despite them, the chatbot was great, and Bing takes the best of it and adds speed, an Internet connection, transparency about sources… in short, many improvements on something that was already good.

We will continue to see strange behavior in the new Bing, and Microsoft and OpenAI engineers will have to deal with some particularly complex challenges, but it is increasingly clear, and ever less open to debate, that we are at the dawn of a new paradigm in how we interact with Internet search tools and, in fact, in how we access information itself. Even so, we must not forget the underlying complexity, and we must understand that there is still a long road of evolution ahead.

For it not to fail at all is, at this early stage, almost impossible. What really matters is how Microsoft and OpenAI respond to those failures and, on that front, it seems both are doing the best they can.
