Meta has revenue sharing agreements with Llama AI model hosts, filing reveals

by Jacob Langdon


In a blog post last July, Meta CEO Mark Zuckerberg said that “selling access” to Meta’s openly available Llama AI models “isn’t [Meta’s] business model.” Yet Meta does make at least some money from Llama through revenue-sharing agreements, according to a newly unredacted court filing.

The filing, submitted by attorneys for the plaintiffs in the copyright lawsuit Kadrey v. Meta, in which Meta stands accused of training its Llama models on hundreds of terabytes of pirated e-books, reveals that Meta “shares a percentage of the revenue” that companies hosting its Llama models generate from users of those models.

The filing doesn’t indicate which specific hosts pay Meta. But Meta lists a number of Llama host partners in various blog posts, including AWS, Nvidia, Databricks, Groq, Dell, Azure, Google Cloud, and Snowflake.

Developers aren’t required to use a Llama model through a host partner. The models can be downloaded, fine-tuned, and run on a range of different hardware. But many hosts provide additional services and tooling that make getting Llama models up and running easier.

Zuckerberg mentioned the possibility of licensing access to Llama models during an earnings call last April, when he also floated monetizing Llama in other ways, like through business messaging services and ads in “AI interactions.” But he didn’t outline specifics.

“[I]f you’re someone like Microsoft or Amazon or Google and you’re going to basically be reselling these services, that’s something that we think we should get some portion of the revenue for,” Zuckerberg said. “So those are the deals that we intend to be making, and we’ve started doing that a little bit.”

More recently, Zuckerberg asserted that most of the value Meta derives from Llama comes in the form of improvements to the models from the AI research community. Meta uses Llama models to power a number of products across its platforms and properties, including Meta’s AI assistant, Meta AI.

“I think it’s good business for us to do this in an open way,” Zuckerberg said during Meta’s Q3 2024 earnings call. “[I]t makes our products better rather than if we were just on an island building a model that no one was kind of standardizing around in the industry.”

The fact that Meta may generate revenue in a rather direct way from Llama is significant because plaintiffs in Kadrey v. Meta claim that Meta not only used pirated works to develop Llama, but facilitated infringement by “seeding,” or uploading, these works. Plaintiffs allege that Meta used surreptitious torrenting methods to obtain e-books for training, and in the process — due to the way torrenting works — shared the e-books with other torrenters.

Meta plans to significantly up its capital expenditures this year, largely thanks to its increasing investments in AI. In January, the company said it would spend $60 billion to $80 billion on CapEx in 2025 — roughly double Meta’s CapEx in 2024 — primarily on data centers and growing the company’s AI development teams.

Likely to offset a portion of the costs, Meta is reportedly considering launching a subscription service for Meta AI that’ll add unspecified capabilities to the assistant.

Updated 3/21 at 1:54 p.m.: A Meta spokesperson pointed TechCrunch to this earnings call transcript for additional context. We’ve added a Zuckerberg quote from it — specifically a quote about Meta’s intent to revenue share with large hosts of Llama models.


