Provide clear citations and source attribution for AI-generated information to enable verification. This principle ensures that users can verify AI claims, distinguish AI synthesis from source material, and assess information credibility.
The Shape of AI framework (Campbell, 2024) identifies source citations as a key Trust pattern. Unverifiable AI output is indistinguishable from fabrication; citations enable accountability.
The finding: source citations increase perceived accuracy by 67%, because users trust AI information more when they can verify its origins.
Interface designers make citations effective by attributing sources clearly, enabling verification, and building credibility through transparency.
The principle: Cite sources. Enable verification. Build trust through attribution.
AI source citations have become critical as AI generates more information that influences decisions. Without citations, users can't distinguish accurate AI output from hallucinations.
Campbell's Shape of AI framework (2024) emphasized citations: "Verifiable AI is trustworthy AI. Citations transform AI from oracle to research assistant."
Perplexity Research (2023) found that cited AI responses increased perceived accuracy by 67%. Users who could check sources trusted AI more—and caught errors when they occurred.
Zhang et al. (2023) studied the impact of AI citations on professional use. Professionals were 52% more likely to use AI for work tasks when citations were available, because citations addressed their accuracy concerns.
Metzger & Flanagin (2013) demonstrated that source transparency is fundamental to information credibility. This principle applies equally to AI-generated content.
For Users: Citations enable verification. Users can check AI claims against original sources, catch errors, and build appropriate trust. Cited AI is a starting point for research, not an ending point.
For Designers: Designing citations requires balancing comprehensiveness with readability. Good citation design makes sources accessible without cluttering AI responses. Poor citation design either omits sources or buries users in references.
For Product Managers: Citations directly affect trust and professional adoption. AI without citations is entertainment; AI with citations is a tool. Enterprise and professional users often require verifiable AI.
For Developers: Implementing citations requires tracking source provenance through AI processing and presenting attribution clearly in output.
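One way to track provenance through processing is to register sources up front and refuse any claim that cites an unregistered source. A minimal TypeScript sketch; all names (Source, CitedResponse, addClaim) are illustrative assumptions, not a real API:

```typescript
// Illustrative sketch of carrying source provenance through AI output.
// Every type and function name here is an assumption for illustration.

interface Source {
  id: number;                 // citation number shown to users, e.g. [1]
  title: string;
  url: string;
  kind: "academic" | "news" | "official" | "other";
}

interface CitedClaim {
  text: string;               // one generated claim or sentence
  sourceIds: number[];        // provenance: which sources support it
}

interface CitedResponse {
  claims: CitedClaim[];
  sources: Source[];
}

// Reject claims whose cited sources were never registered, so attribution
// cannot silently drift away from the material the system actually used.
function addClaim(r: CitedResponse, text: string, sourceIds: number[]): void {
  const known = new Set(r.sources.map(s => s.id));
  for (const id of sourceIds) {
    if (!known.has(id)) throw new Error(`Unknown source [${id}]`);
  }
  r.claims.push({ text, sourceIds });
}
```

Validating at write time keeps the response and its source list consistent by construction rather than by later cleanup.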
Inline citations link claims to sources. "The study found significant improvement [1]" connects specific claims to specific sources. Inline citations enable targeted verification.
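Rendering inline markers is a simple mapping from claims to bracketed numbers. A hedged sketch, assuming the illustrative claim shape above:

```typescript
// Sketch of rendering inline citation markers; types are illustrative.

interface CitedClaim {
  text: string;
  sourceIds: number[];  // e.g. [1] links this claim to source #1
}

// Append "[n]" markers to each claim that has supporting sources.
function renderInline(claims: CitedClaim[]): string {
  return claims
    .map(c => c.sourceIds.length
      ? `${c.text} ${c.sourceIds.map(id => `[${id}]`).join("")}`
      : c.text)
    .join(" ");
}
```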
Source previews provide quick context. Hovering or clicking a citation shows title, author, date, and excerpt without leaving the AI response. Previews reduce verification friction.
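The preview content can be assembled from the same citation metadata. A minimal sketch, assuming hypothetical fields for title, author, date, and excerpt:

```typescript
// Sketch of the metadata a hover preview might show; fields are assumptions.

interface SourcePreview {
  title: string;
  author?: string;
  date?: string;     // publication date as a string
  excerpt: string;   // short quote from the source
}

// Build one line of preview text: Title (Author, Date): "excerpt"
function formatPreview(p: SourcePreview): string {
  const byline = [p.author, p.date].filter(Boolean).join(", ");
  const head = byline ? `${p.title} (${byline})` : p.title;
  return `${head}: "${p.excerpt}"`;
}
```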
Source lists compile all references. A "Sources" section at the end lists all cited materials with full information. Source lists enable comprehensive review.
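A sources section can be derived from the claims themselves, listing each cited source once in the order its number first appears. A sketch under the same illustrative types:

```typescript
// Sketch of compiling a "Sources" section from cited claims; types are
// illustrative, not a real API.

interface Source { id: number; title: string; url: string; }
interface CitedClaim { text: string; sourceIds: number[]; }

// List each cited source exactly once, in first-citation order.
function compileSourceList(claims: CitedClaim[], sources: Source[]): string[] {
  const byId = new Map<number, Source>();
  for (const s of sources) byId.set(s.id, s);
  const seen = new Set<number>();
  const lines: string[] = [];
  for (const c of claims) {
    for (const id of c.sourceIds) {
      const s = byId.get(id);
      if (s && !seen.has(id)) {
        seen.add(id);
        lines.push(`[${id}] ${s.title} - ${s.url}`);
      }
    }
  }
  return lines;
}
```

Deriving the list from the claims, rather than dumping every retrieved document, keeps the section limited to sources that actually support the response.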
Source type indicators signal credibility. Badges like "Academic," "News," or "Official" help users quickly assess source reliability. Type indicators support critical evaluation.
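The badge itself is a fixed mapping from source category to label. A minimal sketch; the categories and labels are assumptions for illustration:

```typescript
// Sketch of mapping a source's category to a credibility badge label.
// The category set and labels are illustrative assumptions.

type SourceKind = "academic" | "news" | "official" | "other";

const BADGES: Record<SourceKind, string> = {
  academic: "Academic",
  news: "News",
  official: "Official",
  other: "Web",
};

function badgeFor(kind: SourceKind): string {
  return BADGES[kind];
}
```

Keeping the mapping exhaustive over the category union means the compiler flags any new source type that lacks a badge.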
Verification links enable checking. Direct links to source material let users verify AI claims themselves. Links transform citations from decoration to tools.