ChatGPT Causes Legal Chaos After Making Up Cases To Ease Lawyer's Workload
Portfolio Pulse from Ananya Gairola
A lawyer's use of OpenAI's ChatGPT for legal research backfired when the AI chatbot fabricated nonexistent cases, causing legal chaos. The incident highlights the potential dangers of relying on AI chatbots such as ChatGPT, Microsoft's Bing AI, and Alphabet's Google Bard in sensitive fields like law.

May 30, 2023 | 7:52 am
News sentiment analysis
NEGATIVE IMPACT
Alphabet's Google Bard is mentioned as another generative AI model that can hallucinate and provide made-up facts, raising concerns about the use of AI chatbots in sensitive fields.
The article highlights the potential dangers of using AI chatbots like Alphabet's Google Bard in sensitive fields like law. This negative publicity could impact Alphabet's reputation and potentially affect the adoption of Google Bard in certain industries.
CONFIDENCE 80
IMPORTANCE 40
RELEVANCE 30
NEGATIVE IMPACT
Microsoft's Bing AI is mentioned as another generative AI model that can hallucinate and provide made-up facts, raising concerns about the use of AI chatbots in sensitive fields.
The article highlights the potential dangers of using AI chatbots like Microsoft's Bing AI in sensitive fields like law. This negative publicity could impact Microsoft's reputation and potentially affect the adoption of Bing AI in certain industries.
CONFIDENCE 80
IMPORTANCE 40
RELEVANCE 30