Why must musicians get the short end of OpenAI’s Bad Business Model?
Tech companies claim AI needs free access to copyrighted materials for proper and accurate training, leaving artists uncompensated for the use of their property and with more AI-created competition.
by Stephen Carlisle of Nova Southeastern University
Earlier this year, artificial intelligence company OpenAI made this statement to the UK Parliament:
“Because copyright today covers virtually every sort of human expression–including blog posts, photographs, forum posts, scraps of software code, and government documents–it would be impossible to train today’s leading AI models without using copyrighted materials. Limiting training data to public domain books and drawings created more than a century ago might yield an interesting experiment, but would not provide AI systems that meet the needs of today’s citizens.” 1
Then, as reported by Isaiah Poritz of Bloomberg Law, last week at the Vanderbilt University Music Law Summit, OpenAI attorney Sy Damle had this to say:
“If you were to do private negotiations for every piece of content that you need to train one of these models,” they wouldn’t exist… “We just have to accept that as a fact in the world, and then we can start having a conversation about what are the right policies that we build, given that we want these tools to exist.” 2
Two problems are immediately obvious. First, OpenAI has identified a crucial component of the functioning of its product (copyrighted materials) that it is unwilling to pay for. Second, notice the underlying assumption that there is somehow a NEED for these programs to exist, that we WANT these programs to exist, and that copyright is somehow standing in the way. To which I ask: where is the glaring need?
There are over 100 million music tracks available on Spotify. So many, in fact, that Spotify is changing its payout policy and will only pay royalties on tracks that had 1,000 or more streams in the previous 12-month period. 3
So, is there really a burning need for more songs? This company seems to think so:
“[GenericAI company] is your copilot for music creation. Use AI to make music by simply typing in your ideas and [AI Product] will generate copyright and royalty free music for you instantly. …[AI Product] is for all levels of musicians, content creators and everyone looking for beats audio effects…and music!” 4
This also reveals the hubris of the AI companies. They tell you that by typing a few commands into a computer program, you are now some kind of “music creator.” And, by the way, I am not sure whether the reference to “copyright free” means that the output is not infringing (hopefully) or is a recognition that, as an AI-created work, it is not capable of copyright protection.
But I digress.
There are 3,600 movies on Netflix, plus over 1,800 television shows. 5 There are 1,200 movies on Hulu along with 1,300 TV shows. 6 Over at Disney+ there are 500 movies and 15,000 TV episodes. 7 So again I ask, where is the great burning need for more content?
There isn’t. For the real purpose of AI is to put creative artists out of work.
Lately, my Facebook feed has been awash in ads for AI products. What is their pitch? Don’t pay for creative works.
“Never Pay for Voiceovers!” says one. OK, let’s put voice actors out of work.
“Save hundreds of dollars and hours of your time with [Company] AI Headshot Generator” and “No Photoshoot/Photographer Needed!” says another. OK, let’s put headshot photographers out of work too.
“Because with it you can create training videos in 60+ languages in minutes. Without actors, cameras, microphones & voiceovers.” Yep. Let’s put them out of work as well.
“Replace expensive spokespersons, voice artists and multiple videos apps.” OK, goners.
“Hey, with [Company] AI, you can generate realistic looking images, even if you’ve never drawn more than a crooked line before.” Again, type a command and you’re an instant creator. Except now there is an AI program that, if you upload a photo or a piece of art, will generate the command prompt for you.
So, what AI represents is a massive campaign to drive revenue away from human artistic creators to the makers of AI programs, which were all trained on the work of the very artists they now want to put out of business.
All without compensation.
Because that would be “impossible.”
So what you have is a business model that is 100% dependent on obtaining your raw materials for free. And obtaining these raw materials for free is probably against the law. Is this a good business model? As my colleague David Newhoff over at the Illusion of More website wrote:
“If I were an AI investor asking about potential liability, and the founders told me, ‘Don’t worry, what we’re doing is fair use,’ my immediate response would be to ask whether there is sufficient funding for major litigation, to say nothing of predicting the outcome of that litigation. Because simply put, the party who conjures the term ‘fair use’ has effectively assumed that a potential liability for copyright infringement exists. And if that assumption is a bad business decision, then that’s the founders’ problem, not a flaw in copyright law.
No matter what the critics say, or how hard certain academics try to alter its meaning, the courts are clear that fair use is an affirmative defense to a claim of copyright infringement, which means that building a business venture on an assumption of fair use is tantamount to assuming that lawsuits are coming. And if it’s a multi-billion-dollar venture that potentially infringes millions of works owned by major corporations, then the lawsuits are going to be big—perhaps even existential.” 8
These lawsuits, many of them filed as class actions, have already started. But we can look beyond the sheer amount of money that might be owed. One of the remedies available to copyright owners is:
“(b) As part of a final judgment or decree, the court may order the destruction or other reasonable disposition of all copies or phonorecords found to have been made or used in violation of the copyright owner’s exclusive rights, and of all plates, molds, matrices, masters, tapes, film negatives, or other articles by means of which such copies or phonorecords may be reproduced.” 9
So, the Plaintiffs could request not only that all copies of the AI program be recalled, but also that the AI companies remove all copyrighted works from their training data sets. And if we take the statement of OpenAI above to be true, this would render their product an “interesting experiment” that is not commercially viable.
In other words, a bad business model is being rammed down the throats of the artistic community so that it can be made profitable.
This follows in the footsteps of Spotify, which gained its market dominance by streaming songs it did not have licenses for. To date, Spotify has never, ever, turned an annual profit. 10 Another tech “innovation” with a bad business model.
It also follows the long line of tech “innovations” created by ripping off musicians and artists. The logic is “infringe first, duke it out in court later.” It worked for Google and YouTube. Not so well for Napster, Grokster, ReDigi, Grooveshark, TVEyes, VidAngel and others.
So, in the words of Elton John and Bernie Taupin, artists can say “I’ve Seen That Movie Too.” The days of artistic giveaways to big tech companies in the name of “innovation” can be, and should be, over.
Notes:
1. OpenAI written evidence to the UK Parliament
2. OpenAI Attorneys Say Licensing All Training Data Is Impossible
3. Digital Music News
4. Please note I am deliberately not naming the companies in question so as not to give them free publicity.
5. Top 10 Movies & TV shows on Netflix
6. Top 10 Movies & TV shows on Hulu
7. Getting started with Disney+
8. “Fair Use” is Not a Great Business Plan
9. 17 USC 503(b)
10. Spotify Passes 600 Million Users, Expects Profitable 2024 Start