sinardaily.my on Nostr: New study finds AI legal research tools unreliable, prone to 'hallucinations' ...
New study finds AI legal research tools unreliable, prone to 'hallucinations'
==========
A new study by researchers at Stanford and Yale universities has found that AI legal research tools are unreliable and prone to 'hallucinations'. The study, titled 'Hallucination-Free? Assessing the Reliability of Leading AI Legal Research Tools', examined AI tools that lawyers use for legal research and found that they frequently generate false information. It tested popular tools from LexisNexis and Thomson Reuters and found that they made mistakes 17% to 33% of the time. These 'hallucinations' include citing non-existent legal rules and misinterpreting legal precedents. The authors called on lawyers to supervise and verify AI-generated outputs, and urged AI tool providers to be honest about the accuracy of their products.
#AI #LegalResearch #Reliability #Hallucinations
https://www.sinardaily.my/article/218722/focus/world/new-study-finds-ai-legal-research-tools-unreliable-prone-to-hallucinations