From the early days of ChatGPT, I've told you that generative AI products like OpenAI's viral chatbot can hallucinate information. They make up things that aren't true, so you should always verify their claims when looking for information, especially now that ChatGPT also doubles as an online search engine.
Even worse, ChatGPT can hallucinate information about people and hurt their reputations. We've already seen a few complaints from affected individuals who found the AI spitting out false claims about them. But the latest such case is the most serious yet, and it deserves action from both regulators and OpenAI itself.
When a Norwegian man asked ChatGPT what information it had on him, the AI said he had murdered two of his children and spent two decades in prison. None of that was true. Some of the details the AI presented about the man were accurate, but not the gruesome parts. Whatever the case, a privacy rights advocacy group has filed a complaint against OpenAI in Norway that shouldn't be ignored.