Avatar
When I say that the proliferation of "AI" bullshit being sold as fact-finding or truth-certifying machines is going to get people killed, "Google's GPT System Tells Someone To Mix Vinegar And Bleach (I.e. Make Chlorine Gas) To Clean Their Washing Machine" is exactly the kind of thing I mean
Avatar
To people noting that this doesn't say "mix": You're right, my bad. It says "*use* bleach and vinegar." Totally different. My point is a lot of people aren't likely to read past that, which will harm them, & the way this system presents "Summaries" increases, not decreases, the likelihood of that harm
Avatar
So: Let's Clarify: When I say that … "AI" bullshit being sold as fact-finding or truth-certifying machines [will] get people killed, "Google's ['AI' Says 'Use Bleach & Vinegar' (I.e. 2 Things Which In Mixture Make Chlorine Gas)] To Clean Their Washing Machine" is exactly the kind of thing I mean.
Avatar
Someone (can't find it now) mentioned that this was scraped from the Gain website & repackaged as part of Google's "AI" "summary," & you know what? It was! But you know what's RIGHT NEXT TO IT on the same webpage & NOWHERE in the "summary"? This. (And even THAT'S not an ideal design. But it's There)
Avatar
As I've said before, "just don't talk about it" is not a viable strategy for the kinds of fucked up answers LLM/GPT-type "AI" is likely to give, because that strategy doesn't account for the structural reasons it gives those answers. Meaning it can't account for the question/topic you can't think of
Google Caught Manually Taking Down Bizarre AI Answers — futurism.com: Google manually removed AI-powered search results from pages where it was caught giving bizarre and incorrect information to search users.
Avatar
Given all the evidence of LLMs - better known as plagiaristic drivel-creators - creating life-threatening results, what will be the legal liability for *not* recalling these products (i.e. removing them from updated versions of browsers)? Should we worry about dining outside (or even in) our homes?
Avatar
These sorts of things are far more worrisome than the "add glue to your pizza" type issues. Obviously bad/wrong is one thing. But the non-obvious misinformation is what's going to do real harm
Avatar
So if AI posts something that breaks terms of agreement, is vile, violent, etc., it is banned from social networking platforms? 🤗
Avatar
The full instructions are also exactly as long as they need to be, to be clear about what goes in which cycle when
Avatar
chemistry labs assume less perfection from their users
Avatar
I was going to say "I wonder what malicious inputs caused it to produce such dangerous output." I suppose I shouldn't be surprised that it could synthesize such output from benign inputs. There's perhaps a morbid analogy to naively mixing common household chemicals to produce something deadly.
Avatar
I see you, hear you and agree with you !!
Avatar
You had me at "Use chlorine bleach and vinegar" lol
Avatar
This statement of clarification is an example of greater exercise of clarity in communication and responsibility of instruction than anything being applied to AI-generated information searches. That disparity itself seems to be an example of the larger problem.
Avatar
Yeah I wouldn't roll the dice that all the bleach or vinegar gets rinsed out after one cycle. Accidentally gassed myself a few times in the lab back in the day trying to clean things.
Avatar
I have a clear memory of a student at my high school doing this during some cleanup; we had to evacuate a whole building
Avatar
I can believe it. I got lucky and the fume hoods near me sucked it up. Only a bit of residue created a little cloud; I'll never forget the POOF sound.
Avatar
we had an explosion at the community college i went to. it was one of the better ones, particularly at science ed. chems happen!
Avatar
Sincere question: I don’t use them together but I have a spray bottle of diluted vinegar and one with diluted bleach stored next to each other under the sink. Problem?
Avatar
As long as they're tightly sealed and don't drip near each other they should be fine
Avatar
thank you! they are in spray bottles and I live in an earthquake zone so I think from now on I will store them in separate locations. #themoreyouknow
Avatar
I feel like this is going to be a go-to example now for what the social sciences are, do, and are useful for: being able to identify that the instructions themselves aren't unsafe AND that a process giving these instructions is unsafe
Avatar
Tell every tech company and legislator that, because they for sure still seem not to get it
Avatar
They don't care. SV is a branch of Wall Street now, they don't actually care what they are making, nor who it may harm.
Avatar
I did my doctoral thesis here...on here...
Avatar
I wouldn't assume the chlorine bleach would be entirely gone after the cycle. I mean it would be diluted enough that it wouldn't hurt you, but small amounts of chlorine gas are still bad. I think that's super bad advice.
Avatar
It doesn’t matter at all that it doesn’t instruct ppl to mix, tbh; you were still correct. It’s encouraging people to open these containers at around the same time in a probably underventilated space that already has 1000+ ppm CO2 and god knows what fungal life
Avatar
I was in a mood to show people that even while nitpicking, they were still wrong 🤷🏾‍♂️
Avatar
I don’t mean to question that, just nitpick For Good tbc 🫡
Avatar
Yeah, you should always tell people explicitly that they must never mix chlorine bleach with acids, when you recommend using it.
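[For anyone curious why that rule exists: a simplified sketch of the chemistry. The direct HCl reaction is the textbook net equation; the vinegar (acetic acid) route goes through hypochlorous acid first, but still ends up releasing chlorine gas.]

```latex
% Hypochlorite bleach + strong acid (net reaction):
\mathrm{NaOCl + 2\,HCl \;\rightarrow\; NaCl + Cl_2\!\uparrow + H_2O}
% With a weak acid such as acetic acid (vinegar), hypochlorous acid forms first:
\mathrm{NaOCl + CH_3COOH \;\rightarrow\; HOCl + CH_3COONa}
% which can then release chlorine in the presence of chloride:
\mathrm{HOCl + H^+ + Cl^- \;\rightarrow\; Cl_2\!\uparrow + H_2O}
```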
Avatar
I worry that the entire idea of summaries works in opposition to people reading and understanding, by promoting viewing and following without ever thinking.
Avatar
Exactly correct. It's an offloading of responsibility for knowledge and expertise onto supposedly "objective" computer systems, without any understanding of why that objectivity is an illusion and most trust in these systems is Woefully misplaced
Avatar
I honestly think many people in many situations will offload onto *anything*. Quora threads. Stackoverflow without checking that the problem matches theirs. Marginalia from the previous student who owned the book, without checking if they passed the course. This just makes it easier and worse.
Avatar
Vicious cycle even more than most are imagining bc the society of the synopsis will encourage, somehow, even more epistemologies embracing illiteracy to invade the K12 curriculum (and yet another reason to teach Ling 1 in K12)
Avatar
The people arguing with you that you didn’t use the word “mix” sound like the exact kind of people who would read an AI answer and end up with a lung full of chlorine gas (or eat rocks, glue, etc)
Avatar
Seems like nitpicking on many people's part here tbh. A whole lot of "Well actually"s.
Avatar
I get caught out at least a few times a year by not reading an unfamiliar recipe all the way through before starting to cook so I can definitely see how this would lead to some folks mixing those chemicals together if they just did a quick skim.
Avatar
I think this is the point - we're racing past our typical "lowest common denominator" safety nets. Like labeling rat poison "do not eat" and such. AI is assuming the masses are reasonably thoughtful.
Avatar
At a time in history when people have been and are being explicitly trained to be less critically and carefully thoughtful.
Avatar
A fucking quart of bleach too?? That is literally 8x as much as Good Housekeeping recommends.
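[The arithmetic behind the "8x" claim, assuming US customary units; the half-cup figure is inferred from the poster's own ratio, not independently verified against Good Housekeeping:]

```latex
1\ \text{quart} = 32\ \text{fl oz}, \qquad
\frac{32\ \text{fl oz}}{8} = 4\ \text{fl oz} \approx \tfrac{1}{2}\ \text{cup}
```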
Avatar
“Whole house is now cleansed.”