Landmark New Report Warns That a Flawed AI Content Market Is Accelerating 'Content Cannibalization'
AI systems are cannibalizing the content they depend on to function, and the fledgling AI content market designed to solve the problem is doomed to fail unless it fixes three major structural flaws. That's the warning of a first-of-its-kind report from the Open Markets Institute's Center for Journalism and Liberty.
"Same Gatekeepers, New Tollbooths: Mapping the AI Content Licensing Market" by Dr. Courtney Radsch and Karina Montoya is the first comprehensive analysis of how AI companies source, value, and compensate the news and creative content their systems rely on. As use of AI is leading to a collapse in traffic and revenue to original sources, the report finds that the current system of copyright claims, ad hoc deals, and voluntary commitments to address this is repeating the same mistakes as the social media and search era. Unless the industry acts now, the result will be ever-worsening "sloppification" – a slow but accelerating degradation of AI's own outputs as the supply of quality, human-created content dries up.
Drawing on more than 35 interviews and consultations with publishers, content creators, and AI licensing startups, the report maps the emerging licensing market in granular detail – and delivers an innovative, actionable agenda for putting AI and content creators on a sustainable, transparent, symbiotic footing before it's too late.
"The quality of AI outputs depends on an ongoing supply of quality human content. Destroy the economic foundation of that content, and you degrade the intelligence of AI itself. It's a house of cards built upon quality content as the base,” said report author Dr. Courtney Radsch. “We found major problems with the current licensing market. None are self-correcting. Either we fix this, or we’ll see yet another crisis in journalism and media. We need to build a symbiotic input output relationship, or AI risks becoming a parasite which kills the content streams it relies on."
The Double Bind
News and content creators are being trapped in a double bind, the report finds. The same Big Tech firms whose AI products are eroding website traffic are now building and controlling the licensing infrastructure for alternative revenue – crowding out innovative startups and occupying both sides of the value chain simultaneously.
The collapse in human website traffic is costing publishers billions in lost revenue and driving ongoing journalism layoffs. Rather than compensating for that loss, AI companies are accelerating it: the rate of AI bots bypassing voluntary access restrictions has nearly quadrupled in the past six months, from 3.3% to 12.9%.
Mapping the Market
The report maps out a three-tiered market: a small number of large bilateral deals between major AI companies and major publishers; an emerging intermediary layer of licensing marketplaces that includes startups and Big Tech firms; and an uncompensated majority – the bulk of publishers and creators – that sits outside of any licensing or compensation framework.
"AI firms will lobby for voluntary fixes, which is how Big Tech has always promised to fix the problems it's created. It's never worked,” Dr. Radsch said. “The window for intervention is narrowing as market structures are locking in – if policymakers don't act soon, the terms will be set for them."
Innovative Solutions to Avoid Systemic Failure
The report pairs its diagnosis with a concrete, innovative policy agenda designed to equalize the power dynamic between dominant AI firms and content creators, enhance transparency, and restructure digital incentives so that quality content remains economically viable:
Statutory Licensing Frameworks: Set enforceable, market-wide payment obligations, ensuring all publishers are compensated when AI systems use their content, replacing one-off deals with reliable standards.
Collective Licensing and Sectoral Bargaining: Explore extended collective licensing and sectoral bargaining frameworks that allow publishers to negotiate as a group with AI firms, enabling smaller and local outlets to secure fair terms alongside larger players. Collective bargaining has already proven effective in Canada and Australia.
Transparency Requirements: Mandate disclosure of deal terms, pricing models, and data usage to correct information asymmetries and support fair market valuations.
Attribution & Compensation at Model Level: Require technical systems that identify bots, detect when AI outputs rely on specific sources, and distribute compensation accordingly, akin to rights-management systems in music.
Inclusion of Local & Independent Media: Require eligibility and distribution mechanisms to include smaller, regional, and non-English publishers to ensure diversity of data inputs, prevent further concentration of media power, and limit systemic risks.
Also see:
“AI Needs Us More Than We Need It: Why journalists and other content creators have more leverage over the future than they might know.” (Washington Monthly, October 2024)
“The value of journalism must be established in the AI era” (Seattle Times, June 2024)
###