The battle between big media and AI is heating up, but the real story is far from a simple underdog tale.
The music industry's giants are now in the AI game. Universal Music Group (UMG), along with the other major labels, sued AI music startups for allegedly using their recordings to train AI models without permission. But in a surprising twist, UMG recently partnered with one of the accused, Udio, to develop an AI music platform. Despite UMG's assurances that artists will be protected, advocacy groups remain skeptical, recalling past deals in which artists were left with little control or compensation.
The legal landscape is complex. Numerous lawsuits across the US involve artists, publishers, and studios claiming copyright infringement by AI training. Judges are grappling with how to apply copyright law to a technology that challenges the very idea of authorship. Andersen v. Stability AI, a class-action lawsuit over an AI image generator, highlights artists' concerns about the unauthorized use of their work.
Creative workers are feeling the impact of AI. Generative AI is displacing human labor, with illustrators and audiovisual creators already experiencing income loss. This has sparked a wave of activism, with artists and executives uniting to confront the tech industry through social media campaigns, crowdfunding, and legal action. The Human Artistry Campaign advocates for legislation to protect artists from AI and big tech.
But is big media the solution? Some artists and civil liberties groups warn of the risks of aligning with large media conglomerates known for exploiting labor and aggressively expanding copyright. The 'enemy of my enemy' strategy may backfire as big media and big tech seem to be growing closer. Copyright lawyer Dave Hansen argues that copyright lawsuits won't protect artists; instead, they may lead to exclusive deals between media and tech giants, leaving independent artists out in the cold.
History repeats itself. Past licensing negotiations during the rise of streaming saw labels and studios profit while artists were left behind. AI licensing deals may follow a similar pattern. When AI companies partner with media giants, artists may not receive compensation or control over their work. Mandatory licensing may not curb big tech's power, as only large companies can afford the costs, further centralizing control.
Proposed solutions may cause more harm than good. The NO FAKES Act, aimed at regulating deepfakes, has been criticized for its vague language and potential for abuse. It could allow studios to pressure artists, including children, into giving up control of their digital identities. Many copyright lawsuits and licensing solutions are seen as Trojan horses, benefiting big media companies more than artists.
The real solution, some argue, lies in organized labor. Unionized creative workers have already secured protections against AI through strikes and collective bargaining. Copyright law, with its limitations, cannot adequately address the future of an already vulnerable creative workforce. If big media genuinely wants to protect artists, it should engage in real dialogue and collaboration with them, rather than treating their work as a commodity.