
Blocked.
This is also a very quick hypothetical that I wrote up just to show a point, not to argue a fucking legal case.
You in fact demonstrated the point, just not the one you meant to, because of the nature of legal cases. There is an opposing counsel that is extremely motivated to find any possible flaw with anything that you say, do, or most importantly file.
Because you used ChatGPT as a "first step" and it drafted a brief-shaped document, you've now introduced an unbounded set of legal landmines for both the litigant and their attorneys.
Wait let me get this straight. Because my HYPOTHETICAL BRIEF involving A PERSON WHO IS NOT REAL is not factually accurate, I have put my fake person at risk?
Buddy, I didn't say that it should be used to write a fucking brief, and certainly not that a fucking brief should be written in a second as a throwaway hypothetical. Jesus fucking Christ.
Do you put the same level of validation work into constructing a hypothetical for a discussion on social media that you do for your fucking cases? I don't think so. Why do you expect me to? Since the facts of a FAKE CASE are not relevant to the matter at hand, get over it.
This is from your medium post.
bsky.app/profile/dgol...
You're not getting it. The process that I used to create a hypothetical case is not the same process that I outlined in the hypothetical case.
Okay, fine, so you play law professor and make up a hypo. It contains problems, but we'll pretend it doesn't, and now you want ChatGPT to help. Let's start here. Would ChatGPT produce documents if asked to research this wholly fake legal matter?
When you use a hallucination machine to produce a hallucination in a system that punishes hallucinations you get punished, yeah.
No one is saying that this holds the same real-world ramifications as if you did it in actual litigation. But exactly these problems are introduced when you use these tools, and your example shows them being introduced!
Daniel, your hypothetical case *isn't*. The 'linchpin' here is that the brief fails to state a claim. Even if *your* 'linchpin' was legally relevant, and it isn't, your hypothetical fails to even reach it.
Right, but (1) this is not a legal case, and (2) even if it was, it is actually not relevant to the case. That GPT output can contain errors when the tool is used alone, and without expertise, does not in any way show that these tools cannot be used to search in foreign languages.