MOSCOW, June 3. The rapid creation of artificial general intelligence superior to humans will inevitably deprive humanity of a future, University of Louisville professor Roman Yampolsky said in an interview with podcaster Lex Fridman.
According to him, general AI is usually understood as a system capable of operating in all areas accessible to humans. Recently, however, experts have increasingly been equating AGI with superintelligence, which surpasses human capabilities in every possible field of activity.
"If we create general AI, I don't see a positive outcome for humanity in the long term. The only chance to win this game is not to play it at all," the expert said.
In his view, current artificial intelligence models already meet the classic definition of general AI, since they outperform the average person. Technological progress, however, will soon mean that even the best specialists in their fields begin to lose ground to AI.
As Yampolsky noted, artificial intelligence could either simply destroy humanity or seek to inflict maximal suffering on people. No less dangerous is the loss of life's meaning if AI fully replaces people in work, art, and other spheres. At the same time, he added, creating reliable protection against a constantly evolving and unpredictable system seems impossible to him.
The expert suggests that humanity will develop general AI as early as 2026. The industry representatives with whom he spoke were of the same opinion.
"Two years left. <…> This is very soon, considering that we don't even have working defense mechanisms (against strong AI — ed.), not even prototypes," Yampolsky warned.
Earlier, the UN High Commissioner for Human Rights, Volker Türk, said that artificial intelligence capable of generating text or images poses an enormous danger to humanity. The World Health Organization (WHO) has likewise urged caution in the use of artificial intelligence.