My favorite part of this is the complete (and predictable) inability of AI bros to take basic critique. Every professional artist does this on a DAILY BASIS. Accepting valid criticism is the art school litmus test to see who can hack it in the industry. If you can't take it, you won't make it.
Once upon a time I worked with artists setting up web pages for A Major Studio - I did coding, interactivity, cross-platform support; they made it look good.
I remember them discussing tweaks like the ones mentioned here, "minor" adjustments that I couldn't comprehend, much less fix. A whole other skill set.
I think it's because what they imagine the life of an artist is like and the actual life of an artist are so dissimilar that they don't even know how to handle it.
"I would rather shoot myself than see my art on a commercially available product!" was the last take from a "professional artist" that really pissed me off
My personal favorite part is that the tools to do what was being asked already exist in any actual local instance of Stable Diffusion with an ethically trained inpainting model; these people are just too lazy to even begin to understand the tools they're working with beyond typing crap into Bing.
"Removing objects or figures seamlessly from an already finished picture" Is probably the most legitimate use case where AI can make an artist's work easier, even! But actually learning anything about anything is categorically beyond this kind of lazy tech bro type of person
Stable Diffusion is effectively just the core program that does a certain kind of complicated matrix math. The program by itself can't actually generate anything. It needs a trained model to do that, and those models are most (in)famously created by mass-downloading art from ArtStation without consent.
But the same program can also train new models, or compact fine-tunes. People can use it to train models on their own art; the Spider-Verse movies' AI elements, for example, were created ethically from the studio's own work, as a tool for interpolating. Idk what software they used.
So there's a whole gamut of ethics here, ranging from really blatant intentional theft, like a soul-stealer beam aimed at a specific artist, to ethically self-trained models, to more grey-area stuff like the furry models that use E621's Do Not Post list to let artists opt out and keep paid work out.
Why do I suspect that these people were suckered into getting some kind of bogus training/credential in "prompting" by people who don't know any more about creating art than they do, and are tetchy because at some level they know they don't know what they're doing and are anxious about it?
This could be purely my imagination, it just reminds me so much of people who've applied for office jobs I was hiring for in the past and were lying about their qualifications.
I hire and work with artists, and there are always revisions, color changes, small tweaks. These people are in way over their heads; they don't even understand the subject.
Adding to this: I’m a digital copywriter, and was just hired by a fairly well-known beverage brand to rewrite their websites specifically because they were unhappy with the job ChatGPT was doing.
The hype will fade, talent won’t.
This was one of the things game devs in my circles kept repeating. “These plagiarism boxes and their operators cannot refine or iterate.” Iteration is such a key part of production that those boxes shouldn’t even be considered.
Fighting AI bullshit was what got me banished from Twitter, and this was one of my favorite things to talk about. Like, boarding with AI would be a whole circle in Dante's art-production hell.
I have to admit that I tried using AI to get backgrounds for my comic, but the inability to refine it was what made it too frustrating to use. I know what it did wrong. I know how to fix it. I can even fix it, but if I pass the fix through the AI to unify the style, it will break something else.