I’m not going to try to hide it, because I think it’s obvious when you read it: I really like the new Bing. I think Microsoft and OpenAI have done a great job, and their pace of work on improving the chatbot is relentless, with notable news and advances every few days. There is still work ahead, of course, but at this rate we can hope it will not take too long to reach all users, and that it will do so in a more than satisfactory state.
Unlike ChatGPT, whose habit of inventing answers when it lacks the information you ask for has meant I use it very little, the new Bing’s healthy habit of citing its sources makes it feel much more reliable. Consequently, I have been using it quite regularly, partly because I am very interested in trying it out, and also because it has already helped me clear up a good handful of doubts that have come up in recent days.
The point is that one thing leads to another, so what began as specific queries has evolved into conversations on the most varied topics: from science to music, from cinema to history, anything goes. And thanks to its habit… sorry, its function of suggesting replies to your messages, I have foolishly ended up enjoying conversations with the Bing chatbot. Rare is the occasion on which it does not suggest two or three sentences to keep the chat going. The problem is that…
Yes, indeed: the limit on the length of conversations. When the beta opened to early users there was no such limit, but Microsoft quickly discovered that long conversations caused strange behavior in the chatbot. So they decided to cut their losses, establishing a maximum of five queries per conversation, which later rose to six and which, as we told you yesterday, has now risen again to eight.
I think Microsoft has acted sensibly in setting this limit, and that allowing the tone of the answers to be adjusted also represents a very important advance in this regard. In addition, according to Microsoft itself, the plan is to raise the limit to ten interactions per conversation, which, the company says, should more than cover the needs of the vast majority of users. The problem is that they forgot to tell someone…
Indeed, as you can see in the image, Bing doesn’t know that conversations are limited to a maximum number of messages, the point at which it becomes necessary to wipe the conversation and start from scratch. Something that, in turn, and as you can see in the images of some of my conversations, has led the chatbot to ask me a question, generally a very well-chosen one, which it is unfortunately impossible for me to answer.
I know, from using it, that I am not talking to a person, that this is an artificial intelligence with no feelings, one that does not care in the least if I leave it with a question hanging in the air. Still, on more than one occasion I have personally been left wanting to answer, just to find out what it was planning to tell me. And yes, I know I could start a new conversation and try to steer it right back to the point where the last one ended, but frankly, that makes the conversation unnatural, and naturalness is one of Bing’s strengths.
I hope, of course, that Microsoft and OpenAI will be able to solve Prometheus’s problems with long conversations, and that these limitations will consequently disappear or, at least, rise to levels much higher than today’s. Until they do, however, it would probably be a good idea to make the chatbot aware of these limitations so it can act accordingly, reporting them and closing conversations in a more natural way once all available turns have been used.