
Multiple lawyer friends of mine are using ChatGPT (and custom GPTs) for contract reviews. They upload some guidelines as knowledge, then upload any new contract for validation. Allegedly it replaces hours of reading, which in some cases is a large portion of the work. Some of them also use it to debate a contract, to see if there's anything they overlooked or to find loopholes. LLMs are extremely good at that kind of constrained creativity mode where they _have_ to produce something (they suck at saying "I don't know" or "no"), so I guess it works as a sort of "second brain" for those tasks too.
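For anyone curious what that "guidelines as knowledge, contract for validation" workflow looks like outside the ChatGPT UI, here's a minimal sketch using the OpenAI Python client. The file names, model choice, and prompt wording are my own illustrative assumptions, not necessarily what a custom GPT does under the hood:

    # Sketch of a contract-review pass: guidelines go in the system prompt,
    # the contract goes in as the user message. Assumes the `openai` package
    # (v1+) and an OPENAI_API_KEY in the environment.
    from pathlib import Path
    from openai import OpenAI

    client = OpenAI()

    guidelines = Path("review_guidelines.txt").read_text()  # the firm's checklist (hypothetical file)
    contract = Path("new_contract.txt").read_text()          # the document under review (hypothetical file)

    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "You are reviewing a contract against the guidelines below. "
                    "Flag every clause that conflicts with them, quote the clause, "
                    "explain the conflict, and say explicitly when you are unsure.\n\n"
                    + guidelines
                ),
            },
            {"role": "user", "content": contract},
        ],
    )

    print(response.choices[0].message.content)

Whether that output is trustworthy enough to replace an actual read-through is, of course, the whole debate below.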

There are even reported cases of entire pieces of legislation being written with LLMs already [1]. I'm sure there are thousands more we haven't heard about - the same way researchers are writing papers with LLMs without disclosing it.

[1] https://olhardigital.com.br/2023/12/05/pro/lei-escrita-pelo-...




Five years later, when the contract turns out to be defective, I doubt the clients are going to be _thrilled_ with “well, no, I didn’t read it, but I did feed it to a magic robot”.

Like, this is malpractice, surely?


It only has to be less likely to cause that issue than a paralegal to be a net positive.

Some people expect AI to never make mistakes when doing jobs where people routinely make all kinds of mistakes of varying severity.

It’s the same as how people expect self-driving cars to be flawless when they think nothing of a pileup caused by a human watching a reel while behind the wheel.


In the pileup example, the human driver is legally at fault. If a self driving car causes the pileup, who is at fault?


My understanding is that the firm operating the car is liable in the full self-driving case of commercial vehicles (Waymo), and the driver is liable in supervised self-driving cases (privately owned Tesla).


Well, maybe its wheel fell off.

So, the mechanic who serviced it last?

...

We don't fault our tools, legally. We usually also don't fault the manufacturer or the maintenance guy. We fault the people using them.


Any evidence it's actually better than a paralegal? I doubt it is.


This is malpractice the same way that a coder using Copilot is malpractice.





