MOSCOW, May 19. The UK is working on measures to increase transparency in how technology companies train artificial intelligence models, the Financial Times reports, citing British Culture, Media and Sport Minister Lucy Frazer.
In April, the New York Times reported that artificial intelligence developers such as OpenAI, Google, and Meta* had been illegally collecting information to train their systems from across the Internet, as well as from millions of hours of YouTube videos, even though the platform's rules prohibit such actions.
"UK ministers are working on plans to increase transparency in how tech companies train their artificial intelligence models after creative industries raised concerns that their work was being copied and used without permission or remuneration," the newspaper said.
Frazer told the newspaper that the government intends to create rules on how companies developing AI may use materials such as television programs, books, and music. Initially, the government will focus on providing greater transparency about what content AI developers use to train their models.
The publication explains that increased transparency will make it easier for copyright holders to track intellectual property violations.
In April, Bloomberg reported that UK authorities were beginning to develop draft laws regulating artificial intelligence, in particular the language models underlying OpenAI's ChatGPT chatbot.
Earlier, the Financial Times reported that the UK and the US had agreed to cooperate on ensuring the safety of artificial intelligence (AI) technology; the bilateral agreement they signed was the first of its kind in the world. In October 2023, British Prime Minister Rishi Sunak announced that the UK would create the world's first institute dedicated to studying AI safety. Later, the United States, following Britain, announced the creation of a similar scientific institute.
* The activities of Meta (social networks Facebook and Instagram) are prohibited in Russia as extremist.