Why AI shouldn’t be used even to decide ‘simple’ court cases

Senior judicial leaders have suggested AI might be used to decide ‘low-stakes’ cases.

The Conversation
@brainsmatter As a former German lawyer, I can tell you that AI is very often wrong; in 70% of cases, I have read. ChatGPT at least apologizes when its answers are pure fantasy.
@ahako I heard the hallucination rate is still above 60%. There have been instances in Australia where ChatGPT has made up fictitious precedents when people have used it to help with their cases!
@brainsmatter Yes, that is right.
ChatGPT itself said 70% when it apologized last time. The problem is that its answers sound so good. People believe it. You need to be an expert to tell whether its answers are completely wrong, slightly wrong, or perhaps slightly OK.