If lawyers can’t be responsible enough to check their citations, imagine how easy it is for the general population to fall for misinformation. Just seeing a post on Facebook is enough for some people to consider it a fact.
Before FB, just getting an email that someone forwarded from someone else was enough for some people to accept whatever it said as absolute fact and forward it on to everyone they knew.
Honestly, bold move on his part thinking this would work for him.
It’s amazing how many people think ChatGPT can’t be wrong.
Don’t get me wrong, I use ChatGPT more days than not for my job; but I use it to help me find things I might not have known about… and then I go and actually look up those things to get correct information. It’s really good at helping you find words to google to get at what you want. It’s really BAD at always giving you good information.
It really goes to show how good ChatGPT is at creating realistic-sounding responses. If you don’t know how it works, it’s easy to trust it more than you should.
I’m not a lawyer or anything, but I have to assume there’s some kind of process for checking citations you’re taught along the way that goes beyond trusting what a computer writes.
I know he said he assumed that ChatGPT had some kind of access to data other sites didn’t, but not finding those cases anywhere else should’ve raised red flags. People need some refreshers on computer literacy.
His mistake was not plugging the blockchain into the mainframe using AI