AI model collapse might make current hallucinations seem like a walk in the park

We’ve been worried about ChatGPT and other AI models hallucinating information since the day the chatbot went viral. The infamous glue-on-pizza AI Overviews answer is the best-known example, but it’s hardly the only one.

While all AI firms working on frontier models have tried to improve the accuracy of their chatbots, those chatbots still hallucinate information. A new study looking at o3 and o4-mini, OpenAI’s newest reasoning models, showed they tend to hallucinate even more than their predecessors.

That’s why I always advise people to ask for sources if the chatbot they use doesn’t provide them by default; that way, you can verify the information the AI gives you on the spot. It’s also why I’ve found myself fighting with ChatGPT more often lately, as the AI sometimes fails to provide links or sources for its claims.

Now, if the sources the AI uses contain hallucinations themselves, that’s a problem.

It turns out hallucinations might get worse rather than disappear. As AI-generated text floods the web and gets scooped up as training data and source material, models end up learning from the flawed output of other models. Researchers call the result AI model collapse: with each generation trained on synthetic data, a model drifts further from reality and loses the diversity of genuine human-written information. It’s a development risk we need to be aware of. Some AI models may get worse rather than better in the near future, and the consequences could be disastrous.
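
To make the risk concrete, here’s a minimal toy sketch (my illustration, not from the article or any specific study) of recursive training in Python. A simple Gaussian “model” is fit to a dataset, and each new generation is trained only on samples produced by the previous one; the sample size and generation count are arbitrary assumptions chosen to keep the demo small.

```python
import numpy as np

rng = np.random.default_rng(42)

# Generation 0: a small "real" dataset drawn from the true distribution.
data = rng.normal(loc=0.0, scale=1.0, size=20)

for generation in range(31):
    # "Train" a toy model on whatever data this generation sees:
    # here the model is just a Gaussian with the fitted mean and spread.
    mu, sigma = data.mean(), data.std()
    if generation % 5 == 0:
        print(f"generation {generation:2d}: mean={mu:+.3f}, std={sigma:.3f}")

    # The next generation learns only from this model's own samples,
    # i.e. synthetic (AI-generated) data instead of fresh real-world data.
    data = rng.normal(loc=mu, scale=sigma, size=20)
```

Run it a few times and the fitted spread tends to drift and, on average, shrink across generations: the toy equivalent of a model gradually forgetting the rare but real cases in its original data.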
