
AI (Artificial Intelligence): Evaluating AI Outputs

Responsible and Effective Use of Artificial Intelligence in Academic Research and Writing

Why should you evaluate AI outputs?

AI tools, such as ChatGPT, Copilot, AI assistants built into databases, and AI writing assistants, are useful for brainstorming and drafting ideas, but they lack human-level understanding. They can make errors, overlook context, and fabricate convincing yet false information.

That’s why critical evaluation is essential, especially in academic work.

Core Critical Evaluation Skills

Check accuracy:

  • Double-check facts, dates, names, and statistics.
  • Ask yourself: Does this match what I already know?
  • Example: AI might give you a statistic with no source; look it up in a trusted database or website.

Watch for bias:

  • AI systems learn from human data, so they can repeat cultural or social biases.
  • A general AI tool might overrepresent certain perspectives or overlook important voices.
  • Tip: Compare AI content with peer-reviewed sources.

Verification Techniques:

  • Cross-check key facts in trusted sources: Library academic databases, Google Scholar, reputable websites.
  • Lateral reading: open multiple tabs to see how other credible sources cover the same topic.
  • Triangulate: find at least two other trusted sources that confirm the information.

 

Handy fact-checking tools: 

  • The Library’s subject-specific databases (e.g., JSTOR, Scopus, ProQuest)
  • Google Scholar
  • Fact-checking websites (Snopes, Africa Check, Full Fact)
  • Wikipedia for quick context, but always confirm with scholarly sources

Use this quick checklist to assess the quality and reliability of AI-generated content before using it in your research. When reading AI-generated content, ask:

  1. Is it factually correct? Look up key claims.
  2. Is there bias? Does it present more than one side?
  3. Is it relevant? Does it actually answer your question?
  4. Is it complete? What’s missing?
  5. Is it logical? Are there contradictions?
  6. Is the source reliable? Remember, a general AI is not a subject expert; check the Library’s specialist resources for depth.

When in doubt, consult your academic sources!

Extra reading: Thinking Critically about AI | Academic Skills Kit | Newcastle University

Common issues and how to spot them:

Example: AI generates fake studies, citations, or facts that sound true but do not exist.

Real Life Example: Another South African lawyer caught using AI has landed in big trouble – MyBroadband

Impact on your Research: Using fake sources undermines your academic integrity.

What to Do: Search for the reference in an academic database or Google Scholar. If it does not exist, do not use it!

 

Example: AI models are trained on older data and may miss recent developments.

Real Life Example: AI Agents Are Here, But They're Only as Smart as the Data They're Built On

Impact on your Research: You risk citing out-of-date sources or missing current debates.

What to Do: Check publication dates and confirm information against recent journal articles or the Library databases.

 

Example: AI tools may oversimplify complex issues, or omit important context or background information.

Real Life Example: New AI models are more likely to give a wrong answer than admit they don't know

Impact on your Research: Leads to shallow analysis and weak arguments.

What to Do: Look for detailed explanations in academic sources. Use subject-specific databases for depth.

 

Example: AI may repeat cultural or social biases from its training data, overrepresenting some perspectives and overlooking others.

Real Life Example: Exploring the Disadvantages of AI in Education: A Critical Look | TechAnnouncer

Impact on your Research: Can skew your interpretation of the topic and exclude diverse perspectives.

What to Do: Compare multiple sources and use peer-reviewed literature to balance viewpoints.

 

Example: AI may present speculative or uncertain claims as factual.

Real Life Example: AI chatbots remain overconfident—even when they're wrong, study finds

Impact on your Research: Misrepresents the strength of evidence and may mislead readers.

What to Do: Ask: “Where’s the source?” If none is given, verify the claim independently.

 

Example: AI may contradict itself or provide conflicting information across responses.

Real Life Example: Inconsistency of AI in intracranial aneurysm detection with varying dose and image reconstruction | Scientific Reports

Impact on your Research: Confuses your analysis and weakens coherence.

What to Do: Cross-check claims and ensure consistency with scholarly sources.

 

Key Takeaways

  • AI tools are helpful starting points, not final authorities.
  • Always verify, cross-check, and fill in gaps with real sources.
  • Practice lateral reading: open trusted sites to confirm facts.
  • Use the Library’s databases and ask subject experts for deeper research.
  • For responsible use, see the AI & Ethics section of this guide.
  • Cite responsibly: If you use AI tools, acknowledge them appropriately and verify all sources before including them in your work.