Voice clones of artists will soon no longer be welcome on YouTube | Tech

From 2024 onwards, YouTube will give record companies more tools to remove videos containing voice clones of artists. But not everything is simply removed.

Record companies will soon be able to contact YouTube if they believe the voices of their artists and bands have been cloned without permission using artificial intelligence. The video service will investigate the complaints and then make a decision. “Not all content will be removed from YouTube,” the service warns in an announcement.

According to YouTube, the policy specifically covers “AI-generated music in which the singing or rapping voice of an artist is imitated”. When deciding whether to remove a video, the company considers whether it is part of news reporting, analysis or criticism of the cloned voices. If an AI song does not fall into one of those categories, a record label will in practice probably be able to have it removed quite quickly.

You can already find a lot of fake music on YouTube

There are many videos on YouTube that imitate the style of famous artists, and they have sparked a lot of discussion. The debate took off when an anonymous TikTok account called Ghostwriter posted a song online that was supposedly made by Drake and The Weeknd.

That track, Heart On My Sleeve, uses the artists’ voices, making it seem as if they are rapping and singing new lyrics that, in reality, they never performed.

This is all done with artificial intelligence. Software analyzes the artists’ voices and intonation and can use this to create new songs in the same style.

No clear rules on AI cloning in the law yet

With the new measures, YouTube is getting ahead of regulation and meeting the wishes of record companies: the law does not yet define exactly where the boundaries lie for what is and is not allowed with voice clones of artists.

In addition, YouTube wants more transparency about AI in videos from next year. The company writes that creators will then have to use labels if their videos contain realistic AI-generated images.

It is not yet clear what exactly counts as a realistic AI image. The service does state that fake images can mislead viewers, especially when it is not disclosed that the footage is not real.

Image: GettyImages

