
Experts call for a pause on AI experiments for the sake of safety

A group of technology-sector entrepreneurs, most prominent among them Elon Musk (CEO of Tesla), has published an open letter publicly calling for a pause of at least six months on giant artificial intelligence experiments. Their reasoning is that these sweeping advances could pose significant risks to both society and humanity.

The letter, published on the Future of Life Institute website, has the backing of Apple co-founder Steve Wozniak and of researchers including Yoshua Bengio, Stuart Russell, Carles Sierra and Ramón López, among more than a thousand signatories in total. The tone of the letter is practically apocalyptic, and it demands an urgent response so that this headlong race does not end in irreparable loss of control.

AI systems with human-competitive intelligence, the letter warns, can pose major risks and transform life on Earth so profoundly that they must be planned for and managed with commensurate care and resources, something the signatories argue requires legislation.

The uncontrolled advance of AI, the letter points out, puts at risk the reliable exchange of information, the labor market and civilization in general, which is why the signatories ask that the training of systems more powerful than GPT-4 be put on hold.

It sounds like the script of a science-fiction film, but the letter raises the question of whether, if allowed to develop unchecked, new non-human minds could end up surpassing us in number and in intelligence, eventually replacing us and assuming absolute dominion over civilization. It is a panicked, apocalyptic message, though one with a certain undercurrent of prudence and truth.

The next task

The signatories speak of the need to enjoy 'a long AI summer' rather than rush ahead and fall into irreparable errors. The goal is to use this pause of at least six months for experts to jointly develop a set of rigorous safety protocols and audit processes guaranteeing that future systems are safe, robust, aligned, loyal, trustworthy, interpretable and transparent.

The pause under discussion, as the statement emphasizes, is not a halt to AI development in general, but a step back from the dangerous race towards ever-larger, unpredictable black-box models with emergent capabilities.

Those responsible

All the researchers and laboratories racing relentlessly to build ever more groundbreaking AI models share responsibility for this situation and for the race to the bottom, although OpenAI has perhaps been the most self-critical. The company has in fact advocated an independent review process to precede the training and release of future systems, though without committing to a specific date.

GPT-4, for its part, already offers advanced reasoning capabilities: it is a machine learning model that accepts text and images as input and generates human-like text. This tool is reportedly being refined for genuinely demanding academic and professional uses.

As the letter indicates, responsibility also lies with government legislators, who should be involved in building a strong, restrictive legal framework for AI. And if the risks prove unmanageable and the effects are not positive, governments would need to impose the moratorium themselves should the parties involved refuse to suspend their research temporarily.

The solution

Once new regulatory authorities dedicated exclusively to supervising and monitoring high-capability AI are in place, it will be necessary to set up provenance and watermarking systems that help distinguish real content from synthetic content and track model leaks.

It will also be necessary to create a robust auditing and certification ecosystem, liability for harm caused by AI, strong public funding for technical AI safety research, and institutions equipped to contain the economic and political disruptions that AI will bring. As humanity has done before with other technologies, it now needs to pause.
