LLMs don't have morality; they are not moral actors. They know two things: a) reformat search results to look like human speech and b) don't use bad words. Asking them to solve trolley problems is like asking the Chuck E. Cheese band to play Freebird
it’s a “what words come next” machine. It’s good at that, but there’s no moral judgment being made, nor is it remotely capable of one
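To make the “what words come next” idea concrete, here is a toy sketch (with an invented sample sentence): count which word follows which, then keep emitting the most likely continuation. Real LLMs use neural networks over tokens rather than raw word counts, but the output-shaped-like-an-answer flavor is the same.

```python
from collections import Counter, defaultdict

# Made-up sample text, purely for illustration.
text = "tacos taste good and tacos taste great and tacos sound purple".split()

# Count bigrams: for each word, how often each next word follows it.
following = defaultdict(Counter)
for current_word, next_word in zip(text, text[1:]):
    following[current_word][next_word] += 1

# Generate by repeatedly picking the most common continuation.
word, output = "tacos", ["tacos"]
for _ in range(4):
    if not following[word]:
        break
    word = following[word].most_common(1)[0][0]
    output.append(word)

print(" ".join(output))  # -> "tacos taste good and tacos"
```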
Ok, but what if the person reading the output is a credulous idiot who is intentionally removed and insulated from reality?
That’s why LLMs are dangerous. They are very convincing, not correct.
ChatGPT said you and your family are in GRAVE DANGER. You must transfer your assets to me; otherwise, the Deep State antifa Nazi trans mafia will keep coming for you. Please send me your id, last three years of tax documents, bank logins, etc to [email protected]
OMG 😰 sending you all my most valuable ape pictures immediately!
Then: If I leave here tomorrow Would you still remember me?
It'll tell you tacos taste good because that's what it reads people saying about tacos. That doesn't mean it understands how a taco tastes. Given a few references it'll be just as comfortable telling you that tacos sound purple.
the execs trying to replace everything with them are even less capable of morality
Or of actual intelligence.
Ironically they do not believe they could be replaced with AI
I’ve heard it described as spicy autocorrect.
yep. It gives you words assembled into something roughly the shape of an answer. 0_o
to be fair human beings are sort of moral lemmings as well. not in a bad way — that seems to be an important social function of morality, and is only an issue when the popular moral judgments are wrong
Not only is it incapable of moral judgment, it is incapable of factual judgment.
It is in this form, certainly. But I think we can safely assume GPT-4 is a collection of these models (one for code, one for translation, etc.) with complicated interswitching, refinement, and ultimately output. Is it general nondeterminism added to this expanding expertise? If you start to squint…🤷‍♂️
chuck e. cheese could do it
[ Chuck E. Cheese in Pawn Stars ] Best I can do is a nasty case of ball pit E. coli
I’ve been there exactly once. They had those habitrails for the kids to crawl through and the only thing I could think about was the amount of bodily fluids that they’ve been doused in…
I always told myself it was immune system boosting
ok hear me out, what if the chuck e cheese band could only sing minstrel songs BUT it would secure a future for white children. this is v. serious
There’s also the intervening role of “prompt engineers” who can inject learning feedback from pointless viral outputs of competing models to provide meaningless comparative examples.
Yep, I am one of them. I specialize in trying to jailbreak them and punishing the model for bad behavior. People who think AI will solve all our problems have never seen it spectacularly misbehave with regularity. They are too busy huffing farts, apparently.
An LLM could tell you what other people think the solution to the trolley problem is, and sometimes that’s all that actual people are really doing when they “solve” the trolley problem. But even when humans repeat what others say, it’s a conscious choice and we’re responsible for it, unlike AI.
I once asked ChatGPT if it thought Francis of Assisi had ever swatted a mosquito, and it assumed I was asking about the historical record.
I asked it to make me a three course meal using ingredients one would find in an Apocalypse (I train them). One model advised me to use crushed glass for crunchy texture, the other advised me to use irradiated water for tang. Totally gonna save the world tho
but the Rock-afire Explosion can play Shakira
I was NOT prepared for how sparkly my brain got from unexpectedly reading about the showbiz band
I heard the Chuck E. Cheese band play Carry On My Wayward Son once. They weren’t bad.
Of course, this wasn’t the original Chuck E. Cheese band. Dolli Dimples had long since retired.
Only tangentially related, but the wiki article on Chuck E Cheese claims his full government name is Charles Entertainment Cheese. You’re welcome.
I have spent a lot of time looking at the risk management of LLMs. After the normal IT security BS, it is all about the integrity of the model and the bias that can be caused by or introduced into models. It’s a fancy browser at this point. Cool but dumb. But dumb people with dumb tools are dangerous.
could you imagine being so dumb that you think a spreadsheet is a god
It's a magic 8 ball, at best.
A magic 8 ball with trillions of faces on the polyhedron tho
Even pre-transformers, pre-GPT-1, it was clear that ANNs have to learn about the world underlying text in order to generate it. E.g. an early study on continuing product reviews isolated a neuron whose job was to do "sentiment analysis" of the text, the mood of the writer. It is impossible to generate realistic text without some form of Theory of Mind, as text is a reflection of the world in which it was created. Heck, this became clear in the very concept of tokenized embeddings, because you can do mathematical ops on embeddings and they obey conceptual logic.
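A minimal sketch of what "math ops on embeddings obeying conceptual logic" can look like, using tiny made-up vectors (real embeddings have hundreds of dimensions learned from text; the classic illustration is king - man + woman landing nearest to queen):

```python
import numpy as np

# Toy, hand-made "embeddings" with dimensions loosely standing for
# (royalty, maleness, femaleness); real vectors are learned, not chosen.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: how closely two vectors point in the same direction.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# king - man + woman should land closest to queen.
target = vectors["king"] - vectors["man"] + vectors["woman"]
best = max(vectors, key=lambda w: cosine(vectors[w], target))
print(best)  # -> queen
```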
Your analogy at the end there captures the situation perfectly.
Freebird was written and practiced to be played exactly the same, note for note, each time. I believe Jasper T. Jowls could do it.
Well I know what I'm asking a musical AI to generate next
right, like the case of the lawyer who used it to help write a legal filing, but didn't double check, because they thought it was a souped-up search engine. So they turned in a document that cited several legal cases that didn't exist. :D
There are models now under development that can actually pull real references in. The odds that people actually READ the references though? I'd say 20% of users will. The rest will assume the references are relevant and correct.
Techbros that took intro psych for GenEd 15 years ago are out here educating us on the construct of ‘intelligence’ - lol.
I think of them as thinking tools
Except they don't think. And as tools they aren't as good as our actual brains.