Authenticity as Currency

My core thesis was that AI-generated content would create a saturation problem — when everything looks polished, nothing feels meaningful. In healthcare, where connection drives belief and belief drives behaviour, I argued this was a strategic opportunity: shift from brand voice to human voice, from polish to truth, and the agency that elevates real patient, caregiver and clinician voices wins.


What actually happened

The instinct was right. But the mechanism changed in ways I didn’t fully anticipate. In 2025, “authenticity” was still largely a creative positioning — a tone, a production style, an editorial philosophy. By 2026, it’s become a structural requirement. Three converging forces made this happen:

  • Regulation got specific. The EU AI Act’s transparency rules (Article 50) apply from 2 August 2026. This is a timeline with enforcement. If AI is involved in creating or personalising content that could materially affect how people perceive authenticity, identity or representation, disclosure may be required. For healthcare brands operating across European markets, this is not optional.

  • Platforms hardened their policies. YouTube now requires creators to disclose altered or synthetic content. TikTok requires labelling of realistic AI-generated media and has introduced user controls to adjust AI content exposure. Meta has expanded “Made with AI” labels using detection and disclosure mechanisms. Disclosure is now a platform-enforced operating condition, not a voluntary best practice.

  • Industry bodies filled the operational gap. The IAB published its AI Transparency and Disclosure Framework in January 2026, pushing a risk-based approach: disclose when AI materially affects authenticity in a way that could mislead, rather than labelling everything. In the UK, the Advertising Association published a Best Practice Guide for responsible GenAI use, covering transparency, human oversight, harm prevention and continuous monitoring.

The net effect is that “be authentic” has evolved into “prove your authenticity.” You can still lead with human stories, real voices and emotionally honest creative — in fact, you should. But the surrounding infrastructure now demands that you verify how content was made, who reviewed it, and whether AI was involved in material ways.


What this means now

The creative argument I made in 2025 still holds: authentic, human storytelling outperforms synthetic polish, especially in healthcare where trust is the precondition for behaviour change.

But the 2026 addition is that authenticity needs a supply chain. Provenance standards (C2PA/Content Credentials), disclosure design, and governance processes are now part of what makes authentic work credible — not just how it looks and sounds.

For healthcare and pharma teams specifically: this intersects with existing MLR (medical-legal-regulatory) review processes. If AI is generating or iterating materials that go through review, the review checklist needs to evolve to catch AI-specific risks: hallucinated claims, unlicensed references, off-label implication through generated language.


Verdict: Evolved

The prediction was directionally right but framed too narrowly. Authenticity as a creative currency is still valid. But authenticity as a structural requirement — verified, disclosed, governed — is the 2026 reality. The creative opportunity hasn’t diminished; it’s simply acquired an operational layer that makes it harder to fake and more rewarding to get right.
