Changer said:
LordVrane said:
Ironic, considering that we, as a community, decided to listen to your requests and gave you a way to stop us from using your images to train the new models (and let me note: StabilityAI is still under no legal obligation to do any such thing).
Did you actually read the discussion? It's not about the web scraper. It's about people *manually* taking and using art from the hub. Edge said you should ask the artist's permission before doing that, and Jigiyak threw a temper tantrum about it because they, by their own words, believe artists shouldn't have rights.
It seems to me that your argument there is incompatible with Jigiyak's stance that it's unreasonable to ask people not to steal your art and use it for AI training without permission. Or is your position that, because the web scraper added an opt-out, letting individuals steal from us is a "reasonable compromise"?
If that is your position, there's still nothing reasonable about it. Artists get nothing out of this compromise. AI art proponents have offered NOTHING to artists in return. The only "compromise" is "Let us take what we want, or at least let us take a bit of what we want," with nothing given back. That's not how compromise works.
Scraping images from the web and taking them manually from a website are the same thing; the first is just done more efficiently. What do you think a program does when I specifically ask for images tagged "Mind control"? It goes to Google Images, opens every website that fits the criteria, and collects the images and their tags for the dataset. Anyone doing untargeted "scraping" these days is... well, I would dare say nobody, because it's simply not a viable way to do things.
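To make the point concrete, here is a minimal sketch of what such a tag-targeted scrape looks like in Python. The endpoint shape (`/posts.json`), the field names (`file_url`, `tags`), and the base URL are all invented for illustration; they are not any real site's API.

```python
# Sketch of a tag-targeted image scrape against a hypothetical
# booru-style JSON endpoint. All URL shapes and field names here
# are assumptions for illustration only.
import json
import urllib.parse
import urllib.request


def build_query_url(base: str, tags: list[str], page: int = 1) -> str:
    """Build a search URL for posts matching all of the given tags."""
    query = urllib.parse.urlencode({"tags": " ".join(tags), "page": page})
    return f"{base}/posts.json?{query}"


def fetch_tagged_images(base: str, tags: list[str], pages: int = 1) -> list[dict]:
    """Collect (image_url, tags) pairs for every post matching the tags."""
    results = []
    for page in range(1, pages + 1):
        url = build_query_url(base, tags, page)
        with urllib.request.urlopen(url) as resp:  # same HTTP GET a browser sends
            posts = json.load(resp)
        for post in posts:
            results.append({"image_url": post["file_url"], "tags": post["tags"]})
    return results
```

The script issues the same HTTP requests a person clicking through the gallery would; it just iterates over pages instead of clicking them, which is the "more efficiently" part of the argument above.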
Also, once again, I find this whole process 100% reasonable because it's not stealing (not under the law), and anyone who uploads an image to the Internet agrees to terms that allow anyone to use their work for specific purposes without credit: AI training, like scientific research or style copying/inspiration, is included. You also got things out of it, just nonmonetary things:
- Google Images is a tool used by billions every day, and artists get quite a nice deal out of a free tool that can help them find inspiration, tutorials, and the like.
- Stable Diffusion, by being free, can be used by anyone, and it has lots of tools that can speed up processes that were usually done by hand (like adding a generic background or base colors, or fine-tuning details after an elaborate sketch has been laid down).
To be quite honest, I find even discussing this utterly hypocritical. Plenty of artists in digital media today use programs like Photoshop and the like. Adobe has made NO secret that they used their users' work to improve their systems, allowing those same (uncredited) artists to work more efficiently while also improving their product (and raising Adobe's own profit in the process). The artist community NEVER sparked outrage over that, despite Adobe making money off their uncredited work while SD makes none by comparison, and I have a suspicion why: it didn't democratize art. It actually made it harder for newcomers to reach the new, higher level of mastery of the "veterans."

SD destroys that barrier, allowing me, a mere UX Researcher, to make the lewds of my dreams in my free time (while the AI also takes over part of my own job in the quantitative field; I don't feel at risk of losing my job, I just need to adapt). Does it suddenly become "theft" only when you risk losing money because you refuse to learn to adapt to this progress in tech? We did literally nothing different from what Google, Adobe, and so many others have done for a decade already, under the same laws you agreed to abide by, but NOW you decide to wake up?
Also, where was the "justice police" when AI took out plenty of jobs at warehouses, where humans were pushed out because a tiny piece of plastic on wheels could do their job... I wouldn't say better, but certainly at lower cost. Where was the outrage over AI then?
However, as a final note: since many have asked people NOT to do that, I would say it would be quite the dick move not to respect their wishes; exactly as SD allows opting out of their next model, this request should not be ignored. Perhaps a statement from the site would make this much clearer to anyone thinking about training a new hypnohub model: it's not illegal (yet), but respecting it would be common courtesy.
I believe I said all I had to.