India has the right idea on AI licensing

According to TechCrunch India:

India has proposed a mandatory royalty system for AI companies that train their models on copyrighted content — a move that could reshape how OpenAI and Google operate in what has already become one of their most important and fastest-growing markets globally.

On Tuesday, India’s Department for Promotion of Industry and Internal Trade released a proposed framework that would give AI companies access to all copyrighted works for training in exchange for paying royalties to a new collecting body composed of rights-holding organizations, with payments then distributed to creators. The proposal argues that this “mandatory blanket license” would lower compliance costs for AI firms while ensuring that writers, musicians, artists, and other rights holders are compensated when their work is scraped to train commercial models.

A better way

Piecemeal licensing is too cumbersome. Blocking training on all copyrighted content — as enforced by lawsuits — would kill AI or create crippling inefficiencies. But owners of copyrighted content deserve compensation.

Any scheme for this should:

  • Allow any content owner to opt out.
  • Include a cost model that’s fair to copyright owners but doesn’t make AI prohibitively expensive.
  • Be lightweight and easy to administer, so every copyright owner who doesn’t opt out can easily register to get paid.
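The three requirements above can be sketched as a toy registry. Everything here is an illustrative assumption, not part of the DPIIT proposal: the class names, the idea of logging per-owner usage, and the simple pro-rata payout rule are all invented for the sketch.

```python
class RoyaltyRegistry:
    """Toy model of a blanket-license collecting body:
    owners register (or opt out), usage is logged per registered owner,
    and a royalty pool is split pro rata by logged usage."""

    def __init__(self):
        self.registered = set()
        self.opted_out = set()
        self.usage = {}  # owner -> count of works used in training

    def register(self, owner):
        # Lightweight registration: any owner who hasn't opted out can sign up.
        if owner in self.opted_out:
            raise ValueError(f"{owner} has opted out")
        self.registered.add(owner)

    def opt_out(self, owner):
        # Opting out removes the owner and any accrued usage.
        self.opted_out.add(owner)
        self.registered.discard(owner)
        self.usage.pop(owner, None)

    def log_use(self, owner, works=1):
        # Training on an opted-out owner's work is disallowed outright.
        if owner in self.opted_out:
            raise PermissionError(f"{owner} opted out; work may not be used")
        if owner in self.registered:
            self.usage[owner] = self.usage.get(owner, 0) + works

    def distribute(self, pool):
        # Split the pool proportionally to logged usage (one of many
        # possible cost models; chosen here only for simplicity).
        total = sum(self.usage.values())
        if total == 0:
            return {}
        return {o: pool * n / total for o, n in self.usage.items()}
```

For example, two registered owners whose works were used 3 times and 1 time would split a 100-unit pool 75/25, while any attempt to train on an opted-out owner's work fails loudly.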

We already have a similar model for music licensing in the U.S., with organizations like ASCAP and BMI managing rights payments.

And Cloudflare has proposed a mechanism for doing this with websites.

Until a system like this is in place, lawyers will make more money from AI licensing than content owners.

It’s a pretty common scenario: a tech company breaks the rules and calls it “disruption,” then regulators backfill to make it fair and legal.

It’s inevitable. So let’s get on with it.

