I attended the hearing for the hapless lawyer who used ChatGPT in his filings and here's my report. A cautionary tale not just for lawyers but for journalists: not about the machine but about our responsibility.

https://medium.com/whither-news/chatgpt-goes-to-court-7e4a0261114f

@jeffjarvis i've been enjoying the ongoing saga of LLMs revealing that nobody understands anything about anything. great write up
@jeffjarvis
I need to hear what author and former appellate defense attorney @Teri_Kanefield has to say about this situation.
@jeffjarvis The part that bothers me the most is... what exactly did these lawyers think their clients were paying them for? If they really thought that just letting ChatGPT do their job for them was legitimate, why wouldn't their clients just do that themselves instead of paying a lawyer an absurd amount of money to do it for them? That alone seems like it should be enough to prove dereliction of duty.
@jeffjarvis
"Judge Castel’s point stands: It was the lawyer’s responsibility — to themselves, their client, the court, and truth itself — to check the machine’s work. This is not a tale of technology’s failures but of humans’, as most are."
@jeffjarvis THIS - good journalism is not the ability to write a story (though writing it well will help get the story consumed in the end); good journalism is the ability to find, research, and explain what is going on: in the world, in a situation, with people.
@jeffjarvis Well written! I'm glad to see some emphasis being put on the lawyers' responsibility to fact-check their research and their negligence in not doing anything other than going through ChatGPT to verify it. A lot of the coverage I've seen (not that it's very much) seems to want to blame the technology and ignore the blatant misuse of it by humans who should know better, or at least should have learned better at some point in the process.
@jeffjarvis "ChatGPT, as it is a wronged party in this case: wronged by the lawyers and their blame, wronged by the media and their misrepresentations, wronged by the companies — Microsoft especially — that are trying to tell users just what Schwartz wrongly assumed: that ChatGPT is a search engine that can supply facts. It can’t. It supplies credible-sounding — but not credible — language. That is what it is designed to do. That is what it does, quite amazingly."
@clarinette @jeffjarvis
Someone put "intactivism" into a chatbot and it spewed out several statements attributed to me that are adjacent to what I would say, but are really just generic statements about #intactivism with my name tacked on. None quotes anything I have actually said. One is so far from what I actually say that I'd call it a lie.
@jeffjarvis it’s nothing more than a super Grammarly, but is it worth the damage it causes to the environment?

@jeffjarvis

Thanks Jeff for the report. Indeed the ignorance of tech in the professions is mind-boggling.

🤓

@jeffjarvis you're right that the next hapless lawyer to fall into the same trap will deserve far more ridicule than the first one did. That is not a sufficient excuse, though: the difference between a professional and someone who is merely paid to do some semblance of a job is that a professional takes responsibility for their work. This Schwartz character obviously did not.

https://medium.com/whither-news/chatgpt-goes-to-court-7e4a0261114f

@osma @jeffjarvis It’s all bs. I cannot imagine citing a case I did not first read and “Shepardize” to make sure it was good law. And, especially, if I were submitting it under a colleague’s name in a federal jurisdiction to which I was not admitted to practice.

@jeffjarvis

Ultimately it is very simple: the human user is responsible for validating any information they use. Ask yourself every time: is it true?

Tools are just what they are. Tools.

And for now AI is the Wild West, a technological gold rush. Rules and regulations are needed. The EU's Digital Services Act is a start.

@xs4me2 @jeffjarvis
Of course anyone using chatGPT for information should validate everything.

But more generally: tool makers can be held liable if they were negligent. I’m sure there are all sorts of ways AI models could be harmful or dangerous.

@IanStuart @xs4me2
A typesetter and printing press can produce truth or lies. In the early days of print, they were the ones held liable--often beheaded, behanded, or burned. Later, the author became the responsible party; indeed, Foucault argues, that is the birth of the concept of the author. So who is responsible today for a machine that will do what it is told?

@jeffjarvis @IanStuart

An intricate problem indeed, and in essence a timeless one. It is about the nature of truth, how we find it, and how we validate it.

Designers (in their flawed and human nature) are responsible and should be held accountable for the validation of their tools and algorithms.

Society needs to provide a framework (rules and regulations).

Current AI developments provide a huge challenge for the concept of truth, perception and manipulation of it... dangerous times...

@jeffjarvis @xs4me2 No doubt this is a question we, as a society, have to answer.

Who enabled it? Who wrote the code, and (if applicable) how was it trained? Who financed it? What safeguards were put into place? Was the AI model sold (if so, what warnings and expectations were communicated)? On whose hardware is it running? Is there a physical robot? What was asked of it? #AIethics #moralsAndArtificialIntelligence #morals #ArtificialIntelligenceEthics #ArtificialIntelligenceMorals

@IanStuart @jeffjarvis

Validation of truth is independent of tools even. It is a basic principle of science, law and journalism.

And yes, uncontrolled growth in the hands of commercial toolmakers can be harmful or dangerous.

They should be held accountable, and society should provide the rules and guidance that define where toolmakers are allowed to go. The EU's DSA is a start, as were the hearings in the US Senate...

@jeffjarvis

As ever, the bad workman blames their tools!

In this case, the tools were ChatGPT and MS Word.

It sounds very much like this lawyer would happily have blamed anything and anybody other than themselves for the poor outcome.

@jeffjarvis We need more hapful lawyers and journalists.