
Your hypothetical case conflated Pine Creek Montana with the Pine Creek First Nation of Manitoba, and your lack of knowledge on the subject you were basing your hypothetical on means this wholly hallucinated association flew under your radar. Do you see where this would be a problem in litigation?
To underline this a bit, the LLM you used is not in the business of giving accurate results, it's in the business of giving results that pass casual inspection. That is not the standard which is needed in legal cases.
This is also a very quick hypothetical that I wrote up just to show a point, not to argue a fucking legal case.
You in fact demonstrated the point, just not the one you meant to, because of the nature of legal cases. There is an opposing counsel that is extremely motivated to find any possible flaw with anything that you say, do, or most importantly file.
Because you used Chat-GPT as a "first step" and it drafted a brief-shaped document, you've now introduced an unbounded set of legal landmines for both the litigant and their attorneys.
Wait let me get this straight. Because my HYPOTHETICAL BRIEF involving A PERSON WHO IS NOT REAL is not factually accurate, I have put my fake person at risk?
Buddy, I didn't say that it should be used to write a fucking brief, and certainly not that a fucking brief should be written in a second as a throwaway hypothetical. Jesus fucking Christ.
This is from your medium post.
bsky.app/profile/dgol...
You're not getting it. The process that I used to create a hypothetical case is not the same process that I outlined in the hypothetical case.
Okay, fine, so you play law professor and make up a hypo, it contains problems, but we'll pretend it doesn't and now you want chat GPT to help. Let's start here. Would chatGPT produce documents if asked to research this wholly fake legal matter?
Follow up, if I asked it very specifically for a document demonstrating Elias Martinez holds title to the property described, do you think it might come up with something?
So your position is that Chat GPT, the LLM, will scrape the web and "additional resources the GPT is given access to" and accurately find which sources are relevant and summarize those sources?
(Quoted post:) No, they will return summaries of websites and other sources that they find, depending on what additional resources the GPT is given access to.
I promise you proof of title is in fact relevant, and is an example of the kind of document search that might be necessary. bsky.app/profile/dgol...
(Quoted post:) That's a huge ask. Probably not. That's not a use case I suggested, though, so it's irrelevant.
Also, depending on how you ask it, the chat GPT will spit out that title for you. Do you understand why that is a problem?
Positing a validation stage does not in fact cure all of the problems with hallucinations, nor does it confer an actual advantage in identifying relevant documents. The AI tool actually has to be better at finding the documents.