On one hand, there’s Grimes, who wants to be ‘less famous’ and has even created a platform, Elf.Tech, that lets her fans make songs using AI-generated stems of her voice.
On the other, there’s the case of Drake and The Weeknd: a song seemingly performed by the two artists was uploaded to all major streaming platforms, only to be removed once it became obvious that the track had been made with AI.
AI-generated music isn’t a phenomenon of this year, though it gained real momentum in 2023. Back in 2020, for instance, a composition generated by AI was performed at a London Symphony Orchestra concert. Attitudes towards AI-made music, in other words, are very diverse.
From the example of Grimes’s Elf.Tech, we can see that not all musicians are worried about AI taking over music. Some embrace it, while others certainly don’t. But to figure out whether AI-generated music is actually a threat, let’s briefly define what music made by artificial intelligence means.
What is AI-generated music?
Music is AI-generated when it’s composed or produced by artificial intelligence algorithms. Essentially, it’s music that has been created by a machine rather than a human.
AI algorithms are typically trained on large datasets of music (often protected by copyright) that span a broad range of musical styles, genres, and compositions. The AI is programmed to analyse these datasets to identify patterns, structures, rhythms, melodies, harmonies, and other musical elements. Some AI systems are even taught to understand music theory.
💡Interesting: Universal Music Group, a recording label that controls a third of the global music market and is home to Drake, The Weeknd, Elton John, Taylor Swift, and other world-renowned artists, has told streaming services to block AI services from scraping tunes and lyrics from UMG-copyrighted music. It’s only a matter of time before other labels do the same to protect their rights and those of their artists.
Once the AI has been sufficiently trained, it can generate new music: it creates, or predicts, a sequence of musical notes that follows the patterns and styles it learned during training.
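To make the train-then-generate loop concrete, here’s a deliberately tiny sketch: a first-order Markov chain that counts note-to-note transitions in a handful of made-up melodies, then samples a new sequence from those learned probabilities. This is only an illustration of the principle; the melodies and function names are invented, and real systems use deep neural networks trained on enormous catalogues of audio or MIDI.

```python
import random
from collections import defaultdict

# Toy "training set" of melodies (note names) — invented for illustration;
# real systems train on large, often copyrighted, music corpora.
MELODIES = [
    ["C", "D", "E", "C", "D", "E", "G", "E"],
    ["E", "G", "A", "G", "E", "D", "C", "D"],
    ["C", "E", "G", "E", "C", "D", "E", "C"],
]

def train(melodies):
    """Learn 'patterns': for each note, collect every note observed after it."""
    transitions = defaultdict(list)
    for melody in melodies:
        for current, following in zip(melody, melody[1:]):
            transitions[current].append(following)
    return transitions

def generate(transitions, start="C", length=8, seed=42):
    """Sample a new melody by repeatedly following learned transitions."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        options = transitions.get(melody[-1])
        if not options:  # dead end: this note was never followed by anything
            break
        melody.append(rng.choice(options))
    return melody

if __name__ == "__main__":
    model = train(MELODIES)
    print(" ".join(generate(model)))
```

Every note the generator emits was seen somewhere in the training data, which is the crux of the copyright debate below: the output is new, but it is entirely derived from the input corpus.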
Does AI-generated music have copyright?
At the time of writing, the legal side of AI-generated music is still murky: if you create music with AI, you may not be the owner of the generated track. But who is? The AI’s developers, the artists whose songs were used to train it, or the person who typed the prompt? That remains unclear. And while AI-generated visual art has been ruled uncopyrightable under U.S. law, whether the same applies to music is not yet known. Another open question is whether AI-generated music even requires a licence to use.
Musicians whose work is used for AI training may not even know their music has been used for this purpose, and so are unable to sue AI developers for copyright infringement. The artists whose music serves as input aren’t compensated for the use of their copyrighted tracks and sometimes can’t even prove their music was actually used to train the AI or generate its output.
In response to these threats and controversies, Google and Universal Music are moving to embrace AI: with the YouTube AI Incubator, content creators will be able to legally generate vocals with AI based on artists’ voices, as long as the rightful copyright owners are paid.
As YouTube describes it: “The incubator will help inform YouTube’s approach as we work with some of music’s most innovative artists, songwriters, and producers across the industry, across a diverse range of culture, genres, and experience.”