Google won’t offer its own version of ChatGPT, here’s why

The popularity of ChatGPT, the AI-powered chatbot, is growing by the day. Some observers wonder whether it could simply replace search engines. Google, the undisputed giant in that field, is also experimenting with machine learning and AI.

A robot chats with a woman / Credit: 123rf

During a meeting of Alphabet executives, the discussion turned to ChatGPT. According to Sundar Pichai, the technology behind ChatGPT is not yet mature. Beyond its reliability problems, the conversational agent still frequently gives erroneous answers or simply invents them. Worse, its answers can be biased, which poses a real risk to the company's reputation.

According to Pichai, Google’s language models are as good as OpenAI’s ChatGPT. The difference between the two technologies comes down mainly to the size and objectives of the companies operating them. Whereas OpenAI is in experimentation mode and allows itself the right to make mistakes, Google must be more careful: it is no longer a startup and has to protect its brand image.

Google says AI is powerful but dangerous, so be careful

He adds: “Google has a lot of plans for AI in 2023, and it’s an area where we have to be bold while being responsible; we have to find a balance”. BERT, LaMDA and MUM are among the language models Google already uses across various platforms. While their performance may not be flashy, they can, for example, detect from language alone whether an Internet user is in emotional distress, making it possible, if necessary, to direct them to an appropriate support service.

The multinational is therefore well aware of its social responsibility and cannot afford to act carelessly. Sam Altman, CEO of OpenAI, agrees: “ChatGPT is incredibly limited, but good enough at some things to create a misleading impression of greatness. It’s a mistake to be relying on it for anything important right now. […] we have lots of work to do on robustness and truthfulness”.
