
The music industry is fighting back on streaming platforms, in the courts, and alongside lawmakers in a bid to stop the theft and misappropriation of artists' work by artificial intelligence, but it remains an uphill battle.
Sony Music said recently that it has already demanded the removal of 75,000 deepfakes: fabricated images, audio, or video that can easily pass as authentic, a figure that underscores the scale of the problem.
According to the cybersecurity firm Pindrop, AI-generated music carries telltale characteristics and is easy to detect, yet it seems to be everywhere.
Even when AI-generated songs sound realistic, they often contain subtle irregularities in pitch, rhythm, and digital patterns that are not found in human performances, according to the company, which specializes in voice analysis.
However, it only takes a few minutes on YouTube or Spotify — two leading music streaming services — to come across a bogus 2Pac rap about pizza or an Ariana Grande cover of a K-pop song she didn’t actually perform.
Sam Duboff, Spotify's policy lead, said the company takes the issue very seriously and is working on new tools to address it.
YouTube said it is "improving" its ability to detect AI-generated content and may announce updates in the coming weeks.
"As the malicious players became somewhat more cognizant earlier on," musicians, record labels, and other entities within the music industry found themselves "responding reactively," according to Jeremy Goldman, an analyst at EMarketer.
"Given the annual revenue in the multibillion-dollar range, YouTube has a significant stake in resolving this issue," Goldman stated, expressing his belief that they are earnestly addressing the problem.
"If you're YouTube, you don't want the platform to turn into an AI horror story," he said.
Litigation
Beyond deepfakes, however, the music industry is particularly concerned about the unauthorized use of its content to train generative AI models such as Suno, Udio, or Mubert.
Last year, several major record labels filed suit in a federal court in New York against Udio's parent company, accusing it of building its technology on copyrighted sound recordings with the ultimate aim of luring away the listeners, fans, and potential licensees of the very recordings it copied.
More than nine months later, the proceedings have yet to begin in earnest, and the same goes for a similar case against Suno filed in Massachusetts.
At the heart of the cases is the doctrine of fair use, which allows limited use of copyrighted material without prior permission for certain purposes and which could limit how far intellectual property rights can be enforced.
"Uncertainty truly prevails here," stated Joseph Fishman, who is a law professor at Vanderbilt University.
Any initial rulings may not prove decisive, since diverging opinions from different courts could ultimately send the question to the Supreme Court.
Meanwhile, the major players in AI-generated music continue to train their models on copyrighted works, raising the question of whether the battle is already lost before it has truly begun.
Fishman said it may be too soon to conclude that: although many models are already being trained on copyrighted material, new versions keep emerging, and it remains unclear whether future court rulings could create licensing problems for those models going forward.
Deregulation
In the legislative sphere, labels, artists, and producers have achieved minimal success.
Several bills have been introduced in the US Congress, but nothing concrete has come of them.
Several states — most notably Tennessee, which is a hub for the influential country music sector — have implemented protective laws, particularly concerning deepfakes.
Donald Trump poses another potential obstacle: the Republican president has cast himself as a champion of deregulation, particularly of AI.
Several major AI players have weighed in, most notably Meta, which has urged the administration to clarify that using publicly available data to train models is unequivocally fair use.
If the Trump administration heeds that advice, it could tip the balance against music professionals, even though the courts would technically have the final say.
The situation is hardly better in Britain, where the Labour government is considering overhauling the law to allow AI companies to use creators' online content to develop their models unless rights holders actively opt out.
More than a thousand musicians, including Kate Bush and Annie Lennox, released an album in February titled "Is This What We Want?", featuring the sound of silence recorded in several studios, as a musical protest against those plans.
According to Goldman, AI is likely to keep plaguing the music industry as long as the industry remains disorganized.
"The music business is highly fragmented," he said. "That ends up working against it when it comes to solving this problem."