That’s a good litmus test. If asking/paying artists to train your AI destroys your business model, maybe you’re the arsehole. ;)
Not only that, but their business model doesn’t hold up if they were required to provide their model weights for free because the material that went into it was “free”.
There’s also an argument that if the business was that reliant on free things to start with, then it shouldn’t be a business.
No-one would bat an eye if the CEO of a real estate company was sobbing that it's the end of the rental market because the company is no longer allowed to get houses for free.
Businesses relying on free things. Logging, mining, ranching, and oil come to mind. Extracting free resources of the land belonging to the public, destroying those public lands and selling those resources back to the public at an exorbitant markup.
Extracting free resources of the land
Not to be contrarian, but there is a cost to extract those “free” resources; like labor, equipment, transportation, lobbying (AKA: bribes for the non-Americans), processing raw material into something useful, research and development, et cetera.
I’m fine with this. “We can’t succeed without breaking the law” isn’t much of an argument.
Do I think the current copyright laws around the world are fine? No, far from it.
But why do they merit an exception to the rules, one that will make them billions, while the rest of us can be prosecuted in severe and dramatic fashion for much less? Try letting the RIAA know you have a song on your PC that you downloaded and didn't pay for - tell them it's for "research and training purposes", just like AI uses stuff it didn't pay for - and see what I mean by severe and dramatic.
It should not be one rule for the rich guys to get even richer and another where the rest of us eat dirt.
Figure out how to fix the laws in a way that they’re fair for everyone, including figuring out a way to compensate the people whose IP you’ve been stealing.
Until then, deal with the same legal landscape as everyone else. Boo hoo
I also think it’s really rich that at the same time they’re whining about copyright they’re trying to go private. I feel like the ‘Open’ part of OpenAI is the only thing that could possibly begin to offset their rampant theft and even then they’re not nearly open enough.
They haven't released anything of value as open source recently.
Sam Altman said they were on the wrong side of history about this when DeepSeek released.
They are not open anymore, I want that to be clear. They decided to stop releasing open source because 💵💵💵💵💵💵💵💵.
So yeah I can have huge fines for downloading copyrighted material where I live, and they get to make money out of that same material without even releasing anything open source? Fuck no.
Good. I hope this is what happens.
- LLM algorithms can be maintained and sold to corpos to scrape their own data so they can use them for in house tools, or re-sell them to their own clients.
- Open Source LLMs can be made available for end users to do the same with their own data, or scrape what's available in the public domain for whatever they want, so long as they don't re-sell.
- Altman can go fuck himself
Training that AI is absolutely fair use.
Selling that AI service that was trained on copyrighted material is absolutely not fair use.
This is basically a veiled admission that OpenAI are falling behind in the very arms race they started. Good, fuck Altman. We need less ultra-corpo tech bro bullshit in prevailing technology.
But I can’t pirate copyrighted materials to “train” my own real intelligence.
Now you get why we were all told to hate AI. It's a Patriot Act for copyright and IP laws. We should be able to. But that isn't where our discussions were steered, was it?
Man, what if we abolished copyright, but also banned gen AI completely. I think that would be the funniest answer.
Only answer that would make me happy
you can, however, go to your local library and read any book ever written for free
Unless it’s deemed a “bad” one by your local klanned karenhood and removed from the library for being tOo WoKe
i almost wrote that caveat, but decided to leave it low hanging….
as far as i know, though, that only applies to children’s books at this point…
So can the AI
any book ever written
Damn! Which library are you going to?!
if the library doesn’t have a book, they will order it from another library….
every american library

Interlibrary Loan isn't available everywhere (at least back when I used to work at a library ~10 years ago it wasn't). If it is, it often has an associated fee (usually at least shipping fees, sometimes an additional service fee). I think the common exception to that is public university libraries.
What if it’s out of print?
i am guilty of hyperbole… i should've qualified my absolutes with "just about" and such….
i am more sorry about my inaccuracy than anyone has ever felt sorry about anything
Mine doesn’t…
are you sure? have you actually tried? or maybe ask a librarian?
most public libraries are part of a network of libraries… and a lot of their services aren’t immediately obvious….
also, all libraries have computers and free internet access…
i’d like to ask what library in particular, but you probably don’t want to dox yourself like that….
Then let it be over.
I have conflicting feelings about this whole thing. If you are selling the result of training like OpenAI does (and every other company), then I feel like it’s absolutely and clearly not fair use. It’s just theft with extra steps.
On the other hand, what about open source projects and individuals who aren’t selling or competing with the owners of the training material? I feel like that would be fair use.
What keeps me up at night is that if training is never fair use, the natural result is that AI becomes monopolized by big companies with deep pockets who can pay to license endless content, and then we are all forever at their mercy for this entire branch of technology.
The practical, socioeconomic, and ethical considerations are really complex, but all I ever see discussed are these hard-line binary stances that would only have awful corporate-empowering consequences, either because they can steal content freely or because they are the only ones that will have the resources to control the technology.
At the end of the day, the fact that OpenAI lost their collective shit when a Chinese company used their data and model to make their own, more efficient model is all the proof I need that they don't care about being fair or equitable. They get mad at people doing the exact thing they did, and would aggressively oppose others using their own work to advance their own.
If everyone can "train" themselves on copyrighted works, then I say "fair game."
Otherwise, get fucked.
For Sam:
I mean, if they are allowed to go forward then we should be allowed to freely pirate as well.
In the end, we're just training some non-artificial intelligence.
Yeah, you can train your own neural network on pirated content, all right, but you better not enjoy that content at the same time or have any feelings while watching it, because that’s not covered by “training”.
If giant megacorporations can benefit by ignoring copyright, us mortals should be able to as well.
Until then, you have the public domain to train on. If you don’t want AI to talk like the 1920s, you shouldn’t have extended copyright and robbed society of a robust public domain.
Either we now have full authority to do anything we want with copyright, or the companies have to abide by the same rules the plebs and serfs do and only take from media a century old, or stuff that fell through the cracks like Night of the Living Dead.
Copyright has always been a farce and a lie for the corporations, so it's nothing new that it's "Do as I say, not as I do."
I’m somewhat ok with AI talking like the 1920s.
“Babe, I’m on the nut. I’m behind the eight ball. I’m one of the hatchetmen on this box job, and it’s giving me the heebie-jeebies. These mugs are saying my cut is twenty large. But if we end up squirting metal, this ain’t gonna be no three-spot. The tin men are gonna throw me in the big house until the big sleep.”
Good, end this AI bullshit, it has little upsides and a metric fuckton of downsides for the common man