Generative AI for NorQuest learners

Support for NorQuest learners to start using generative AI tools.

Why do you need to evaluate content from generative AI?

Content created by generative AI can be partially or entirely incorrect. These incorrect outputs are called hallucinations, meaning the generative AI tool has fabricated facts and details that sound plausible but are not true.

TechRound. (2025, June 6). What are AI hallucinations and how do they work? [Video]. YouTube. https://youtu.be/WowYXubbMtA?si=5gN-EFbrPX26t23b

IMPORTANT! Because generative AI often hallucinates, always evaluate the information it creates to ensure it is accurate.

Strategies to evaluate content from generative AI