Artificial Intelligence

Artificial Intelligence (AI) impacts all fields of study and is not subject specific. This guide is here to support research and learning involving Artificial Intelligence.

What are AI Hallucinations?

AI hallucinations occur when Generative AI tools produce incorrect, misleading, or nonexistent content. Fabricated content may include facts, citations to sources, code, accounts of historical events, and other real-world information. Remember that large language models, or LLMs, are trained on massive amounts of data to find patterns; they, in turn, use these patterns to predict words and generate new content. The fabricated content is presented as though it were factual, which can make AI hallucinations difficult to identify. A common AI hallucination in higher education happens when users prompt text tools like ChatGPT or Gemini to cite references or peer-reviewed sources. Rather than retrieving real publications, these tools draw on patterns in their training data to assemble plausible-sounding titles, authors, and journal names for sources that do not actually exist.
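The toy example below is a minimal sketch of the idea that pattern-based word prediction can produce fluent but fabricated output. It is not how ChatGPT or Gemini actually work internally; it simply trains a tiny word-to-next-word model on a few invented citation strings (all names and journal titles here are made up for illustration) and then generates a new "citation" that looks plausible but refers to nothing real.

import random
from collections import defaultdict

# Invented, illustrative "citations" -- none of these are real references.
training_citations = [
    "Smith J. (2021). Machine learning in higher education. Journal of Educational Technology.",
    "Lee K. (2020). Generative models and academic writing. Journal of Digital Pedagogy.",
    "Garcia M. (2022). Large language models in the classroom. Review of Educational Research.",
]

# Count which word tends to follow which word across the training text.
transitions = defaultdict(list)
for citation in training_citations:
    words = citation.split()
    for current_word, next_word in zip(words, words[1:]):
        transitions[current_word].append(next_word)

def generate_citation(start_word="Smith", max_words=15):
    """Generate a fluent-looking 'citation' by repeatedly predicting the next word."""
    word = start_word
    output = [word]
    for _ in range(max_words):
        followers = transitions.get(word)
        if not followers:
            break
        word = random.choice(followers)  # pick a statistically plausible next word
        output.append(word)
    return " ".join(output)

# The result reads like a citation but corresponds to no real source -- the same
# failure mode, at a vastly larger scale, that produces AI hallucinations.
print(generate_citation())

Running the sketch prints something that is formatted like a reference yet cites a work that was never written, which is why any citation produced by a generative AI tool should be verified in a library database before it is used.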
