The Convergence

The most useful thing about revisiting the 2025 predictions is seeing how four separate forces have converged into a single, coherent challenge.


Where the Threads Meet

In 2025, I was looking at these as distinct trends, each with its own implications and opportunity spaces. Twelve months of evidence shows they’re not distinct, but facets of the same structural shift:

  • Authenticity as currency + platform/regulatory enforcement = trust must become structural, not narrative.

  • Atomisation + active self-protection = emotional bandwidth is the new constraint, not attention.

  • Agency-to-engine + AI commoditisation + homogenisation risk = editorial judgment is the differentiator, not production capacity.

The Gen Z thesis dissolves into all three: the generational signal was an early indicator of market-wide behaviour change that now manifests as bandwidth collapse, structural mistrust, and demand for substance over volume.

I’m calling this convergence the Age of Consequence. Not as a warning — though it is one — but as a reframe. The market has moved from experimentation to accountability. The question is no longer “what can we do with AI?” but “does any of it hold up?”

Hold up under regulatory scrutiny, as the EU AI Act moves from principles to enforcement. Hold up under audience scrutiny, as people actively throttle intensity and walk away from brands that waste their time. Hold up under commercial scrutiny, as organisations discover that cheap production without editorial judgment creates expensive problems.


What this means if you commission healthcare creative

If you’re a CMO, digital director, or head of marketing — particularly in healthcare or pharma — the Age of Consequence changes what “good” looks like when you brief, evaluate and buy creative and digital work.

  • Ask about bandwidth, not just engagement. When you review creative, ask your team or agency to show you the cognitive load of the experience. How many decisions does it force? How many steps? Can someone understand the core message in five seconds or less? If the answer is “it’s a rich, layered experience,” that might be a problem, not a feature.

  • Ask about disclosure, not just compliance. If AI is involved in creation, personalisation or delivery, do you know when disclosure is required? Do you have a standard for how it appears? Is it in the brief, or is it something you discover at the review stage? The second option is a 2026 risk.

  • Ask about judgment, not just output. When your agency presents work, ask what they chose not to ship. Ask to see the editorial gate. Ask how they define “distinctive” when AI is generating the first drafts. A team that can answer these questions has editorial authority. A team that can’t is relying on luck.

  • Ask about governance, not just governance decks. Who reviews AI-generated content? What’s the escalation path if something goes wrong? Is there an audit trail? For pharma: has the MLR checklist been updated for AI-specific risks? These questions belong in scoping conversations, not post-mortems.


What comes next

Over the coming weeks I’ll publish the full Age of Consequence analysis — detailed, evidence-based examinations of each 2026 force, with practical actions, case studies, and dedicated healthcare and pharma implications. The series will cover:

  • Emotional bandwidth is collapsing — and why your creative needs to design for relief, not reach

  • Trust moves from story to structure — and what disclosure, provenance and governance look like in practice

  • Production is cheap; judgment is rare — and how to install editorial authority as your competitive advantage

  • The healthcare and pharma lens — mapping all three forces to the specific realities of regulated creative

If any of this resonates with your experience — or contradicts it — I’d welcome the conversation. The best thinking happens in dialogue, not in reports.
