
Blocked.
This is also a very quick hypothetical that I wrote up just to show a point, not to argue a fucking legal case.
You in fact demonstrated the point, just not the one you meant to, because of the nature of legal cases. There is an opposing counsel that is extremely motivated to find any possible flaw with anything that you say, do, or most importantly file.
Because you used ChatGPT as a "first step" and it drafted a brief-shaped document, you've now introduced an unbounded set of legal landmines for both the litigant and their attorneys.
Wait let me get this straight. Because my HYPOTHETICAL BRIEF involving A PERSON WHO IS NOT REAL is not factually accurate, I have put my fake person at risk?
Buddy, I didn't say that it should be used to write a fucking brief, and certainly not that a fucking brief should be written in a second as a throwaway hypothetical. Jesus fucking Christ.
Do you put the same level of validation work into constructing a hypothetical for a discussion on social media that you do for your fucking cases? I don't think so. Why do you expect me to? Since the facts of a FAKE CASE are not relevant to the matter at hand, get over it.
This is from your medium post.
bsky.app/profile/dgol...
You're not getting it. The process that I used to create a hypothetical case is not the same process that I outlined in the hypothetical case.
Okay, fine, so you play law professor and make up a hypo. It contains problems, but we'll pretend it doesn't, and now you want ChatGPT to help. Let's start here. Would ChatGPT produce documents if asked to research this wholly fake legal matter?
Follow-up: if I asked it very specifically for a document demonstrating Elias Martinez holds title to the property described, do you think it might come up with something?
So your position is that ChatGPT, the LLM, will scrape the web and "additional resources the GPT is given access to" and accurately find which sources are relevant and summarize those sources?
That's a huge ask. Probably not. That's not a use case I suggested, though, so it's irrelevant.
I'm not playing law professor. I didn't say anything about the law. That's the point. I am not arguing about law. I am arguing about what these tools can and cannot do.
And I am asking you a question about what the tool will do. Please answer it at your earliest convenience.
No, they will return summaries of websites and other sources that they find, depending on what additional resources the GPT is given access to.
When you use a hallucination machine to produce a hallucination in a system that punishes hallucinations, you get punished, yeah.
No one is saying that this holds the same real-world ramifications as if you did it in actual litigation. But these are exactly the problems introduced when you use these tools, and your example shows them being introduced!
Daniel, your hypothetical case *isn't*. The 'linchpin' here is that the brief fails to state a claim. Even if *your* 'linchpin' was legally relevant, and it isn't, your hypothetical fails to even reach it.