Hollywood Confronts AI Copyright Chaos in Washington, Courts -- WSJ

Dow Jones
07/01

By Amrith Ramkumar and Jessica Toonkel

Natasha Lyonne stayed up all night furiously texting and calling Cate Blanchett, Ron Howard and everyone else she knows in Hollywood, asking them to sign her letter to the Trump administration.

The White House is about to issue its artificial-intelligence action plan, a document that could influence how U.S. copyright rules are applied to training large language models. Tech companies say they need the material to train their models and keep up with China in an AI race with grave national-security implications.

Lyonne and the Hollywood set see it differently: Unfettered AI access to films, TV shows, and acting performances would destroy their value. "At one point my phone started smoking," the actress and director said in an interview about that March night. She gathered more than 400 signatures for her letter.

"My primary interest is that people get paid for their life's work," said Lyonne. She is a partner in a new film and TV studio called Asteria that uses generative AI that trains only on models underpinned by data and images used with permission, practices she wants to be the norm.

America's creators are mounting a campaign to push back on any use of their work without permission or compensation, seeking to head off potential abuses of their intellectual property.

Disney Chief Executive Bob Iger, along with legal chief Horacio Gutierrez, met with White House officials recently to discuss worries about AI models infringing on the company's intellectual property and using the studio's characters in inappropriate ways, according to people familiar with the talks.

Disney and Comcast's Universal recently sued AI company Midjourney for allegedly stealing their copyrighted work to train its AI image generator. Midjourney didn't respond to requests for comment.

The battle between technology heavyweights and many of the country's most famous creative companies and artists is playing out in Washington and in court. At stake are billions of dollars and precedents that could shape the future of AI and U.S. copyright law. While the fight is far from over, some in creative industries fear it might be too late to stop the advance of AI as it roils their professions.

"They're fighting over who's going to control and dictate the next generation of technological development," said Joshua Levine, a research fellow at the Foundation for American Innovation, a tech-focused think tank.

Administration officials aren't sure whether they will take any action, because of the legal complexities and the political downside of favoring one side over the other, people familiar with the matter said. The action plan is expected to come out this month.

Judges sided with Meta Platforms and Anthropic in parts of two separate cases last week, finding that using copyrighted material to train AI models is fair use in some cases when the material is transformed into something dramatically different.

"It's very important that we end up with a sensible fair use definition like the one the judge has come up with in this Anthropic case because otherwise we will lose the AI race to China," White House AI Czar David Sacks said on the podcast he co-hosts.

Other aspects of the rulings favored the copyright holders. Anthropic must face a separate trial about whether training on pirated content is legal. The judge in the Meta case said similar lawsuits could get a different result, particularly if creators show their sector was harmed by the AI training.

There are dozens more cases, and appeals are expected. So far, there hasn't been a ruling on how artists and creators should be compensated for the outputs of AI systems.

Keith Kupferschmid, CEO of the Copyright Alliance, a nonprofit that advocates for creative industries and individuals, said the cases represented a mixed bag and "may create this copyright chaos for AI companies and copyright owners."

AI Progress, a group of tech companies including Meta, Google and Microsoft, is expected to meet with administration officials on the topic in the coming weeks, people familiar with the meeting said.

During a recent meeting with the Motion Picture Association, Sriram Krishnan, the senior White House policy adviser on AI, characterized a big part of his role as making sure the U.S. beats China in the AI race, according to people familiar with the meeting.

U.S. security officials met with tech companies including Meta this spring and convened discussions with the White House. The security officials see copyright uncertainty as an issue that could slow the advancement of models or the deployment of the technology in the intelligence community, some administration officials said.

In May, President Trump fired the librarian of Congress, who oversees the Copyright Office, in part because of worries that a report on AI and copyright set to be published soon would favor copyright holders, people familiar with the matter said. The following day, a draft of the report was published. Such reports aren't legally binding but are often cited by judges.

The report said the legality of AI training depends on whether the models generate outputs substantially similar to the copyrighted material and whether companies used pirated material.

The office angered some tech executives by saying courts should also consider whether content produced by AI competes with the original training material and dilutes that industry's market. The judge cited that argument in the Meta decision, noting that plaintiffs in similar cases could win by showing AI-generated products hurt their sector.

Write to Amrith Ramkumar at amrith.ramkumar@wsj.com and Jessica Toonkel at jessica.toonkel@wsj.com


(END) Dow Jones Newswires

July 01, 2025 05:30 ET (09:30 GMT)

Copyright (c) 2025 Dow Jones & Company, Inc.
