So, look.
When I was back at Microsoft, an issue that came up was the device owner's need for privacy. There might be things on your device you don't want people to see! The semi-official code for these was "family photos".
Generative AI isn't just for predatory billionaires, it's also for hack directors who want to be able to exploit women from beyond the grave
variety.com/2023/film/ne...
As in, we'd be making a feature like "the device automatically makes a slideshow out of images you put on it and uses it as a screensaver." And someone would say, "well, but you might not want your family photos to come up in a work context."
Pretty sure everyone here is engaged in the same exercise, because I guarantee nobody gives a shit about custom rom-coms. The use case is "Marilyn Monroe" (read: "any female celebrity, probably not a dead one") "makes a rom-com" (read: "engages in your extremely specific fetish".)
And if they claim not to be doing that, they're "not doing it" in the sense that OBVIOUSLY it's against the terms of service, and of course they're happy to respond to takedowns, but the volume of user content is so high, you see...
One thing AI has in common with crypto is that boosters are well aware the primary use cases are both illegal and immoral; they don't care but can't say so in public.
Since this has gone around a bit let me clarify some things. I use the terms as they're commonly used: crypto to mean cryptocurrency (cryptography is fine) and AI to mean the latest generation of generative models, as Russo meant in OP.
To be fair, the main VC firm funding this shit (Andreessen Horowitz) did, in fact, say something to the effect of "we know the only way for this to work is stealing everything we possibly can, and making us do it legally or morally will bankrupt us," so at least there's that.
The skeleton key to understanding today’s tech elite is still Zuck scraping women’s photos off the Harvard student website and inviting people to judge who was hotter
Everything you’re saying is true, Django, but I also want to note that I would curl up in a ball and die if I was forced to watch even 30 seconds of an artificially generated romcom in which I was starring
There are absolutely some legal uses where AI can be helpful, even a boon.
But most internet tech gained traction because of porn, and too often the safety team is overruled on potential future scope. Even when a faster release can do harm, companies don’t spend enough time and money on the safeguards.
Please be more specific than “AI”
It feels like using "metal" to mean something specific like bronze or steel.
“Metal is going to cause humans a lot of trouble”.
“We need to be more careful with metal”
The logical endpoint of this is that once they've perfected "[insert my] face into porn starring [insert my 3rd grade teacher] at [insert my office]" the technology's public-facing uses will flounder.
(because the vast majority of people consuming porn do not care much beyond the insertion aspects)
Sure, but part of the grift is winking at it in public. Crypto marks fell for it because they thought they were gonna get away with something sketchy. Almost any con is pretty clear in saying: yes, this is a con, but YOU, special smart person that you are, can get in on the ground floor and fleece the rubes.
As dirty as it feels to say a single solitary thing in defense of cryptocurrency, not all illicit online transactions were immoral, nor even necessarily illegal. There's a pretty decent chunk of dark web transactions where the immorality is that the extreme privacy is necessary in the first place.
I feel the reality of whatever future we have is that the end-user - the person at home - wants what they want when they want it. Nothing else matters to them. If a tool gives them that then they're going to use it whether morality or laws or ethics say different.
Stop it.
Some set of VC-funded AI companies, yes, but not AI in general.
Self-driving, cancer diagnosis, aircraft routing, autonomous agents, etc., all have valid use cases.
These morons are just going after “low-hanging fruit” enabled by their lack of ethics.
Not AI generally, but the big, splashy, VC ones, yeah.
The real work of AI is incremental improvement. Moral actors are maintaining transparency and human agency.
For the record, I do care about someone's likeness being used without their consent - especially after death. I'm *more* concerned about it being used for sexual fetishes
a work doesn't have to be pornographic to be leering, creepy, and exploitative, though
reactionary guys in particular have this insatiable urge to approach everything with the mindset "this is wank material for me!!" even when they're not asking for explicit sex and nudity
Overwhelmingly, the primary use for generative AI is to make porn, but every generative AI company desperately wants seed money and so tries to make sexually explicit creation impossible. This results in a hilarious escalation where they are constantly fighting a user base that keeps poking to find the holes.
Also, if you play around with them enough, you can tell exactly what people are training them on, because they have some genuinely fascinating problems with depicting men.
Considering Marilyn Monroe's tragic history of being sexualized against her wishes and exploited, they're saying a lot more than they mean to when they use her name
Similarly, while I was at MSFT we used "buying a gift for your spouse" as the reason for not wanting to automatically share info about your purchases, when it was clearly "people buy some sketchy stuff online."