AI Hallucinations Still Pose Problems for Academics

  • Writer: News
  • Sep 18
  • 1 min read

Hallucinations aren't going anywhere — for now.


ree

Jonathan Jarry of McGill University tested Consensus, an AI tool marketed as “hallucination-free” for academic searches. While it avoided inventing fake papers, it often produced misleading graphs, struggled to weigh study quality, and was overly generous toward pseudoscience.


The findings highlight how AI’s subtler mistakes, such as skewed summaries, can mislead students, researchers, and clinicians who rely too heavily on automated outputs.


“The bottom line is that what we are calling artificial intelligence has gotten really good… but smaller, significant problems emerged during my test drive which the average user might miss if they are enthralled by the output.”

— Jonathan Jarry
