
Marketing Strategy Problems Design: Polishing Broken Systems

When marketing strategy design fails, the process is rarely the problem. The system was built around proxy metrics from the start. That's the real failure.

Scott Roy
[Image: precision drafting tools arranged over a blueprint with visible structural fault lines, representing the futility of design refinement on a broken strategic foundation]


Your content is performing. Your team is executing on schedule. Every channel has owners, every campaign has metrics, every quarter has a plan. CAC is rising. Sales cycles aren't shortening. The CEO wants proof of commercial value that your current reporting can't produce.

That gap isn't a competence problem. It isn't a team problem.

The problem most marketing leaders never diagnose is that they designed their system correctly: to produce the wrong outputs. Proxy metrics such as MQL volume, click-through rates, and impression counts weren't what you defaulted to when the system started failing. They were the design requirements. The brief said: build a system that generates measurable lead flow with channel attribution. You built exactly that. The system works as designed. That's what makes it hard to fix.

The Specific Trap of Having Built It Yourself

Inheriting a broken marketing architecture is disorienting, but it's diagnosable. You can map the gaps with reasonable detachment: identify what was missing from the original design, propose structural changes, build a case for change.

Building the broken system yourself is something different. You approved the metrics. You selected the channels. You made the hiring decisions that created the current team structure. When the system underperforms, the professional reflex is to treat it as a design failure: run a better workshop this time, engage stakeholders more thoroughly, bring in design thinking methodology.

Jeanne Liedtka's research in Harvard Business Review frames design thinking explicitly as a social technology for innovation — analogous to how total quality management improved manufacturing processes. Real benefits. Real research base. But design thinking improves how you execute against a brief. It can't evaluate whether the brief itself was structurally sound. A more rigorous design process applied to the wrong specifications produces a better-executed version of the wrong system.

This is architectural blindness in its most personal form. You're not operating within a framework someone else built and can be blamed for. You authored it.

McKinsey research found that only 21% of executives report their strategies pass four or more of McKinsey's Ten Tests of Strategy. That means roughly 79% of organizations are running strategies that fail basic quality thresholds, before any design methodology is even applied. A better design process doesn't close that gap. The gap exists in the strategic foundation the design process executes against.

When Marketing Strategy Problems Are Encoded at Design Time

Here's where most marketing diagnostics go wrong: they treat proxy metrics as something you fell back on after the system started failing. That's the wrong chronology.

At the moment you designed the system — chose your channels, defined what a "qualified lead" meant, built your reporting cadence, hired for channel-specific expertise — you encoded proxy metrics as the design requirements. The implicit brief specified:

  • Optimize for MQL volume with measurable channel attribution
  • Hire channel specialists, not integration architects
  • Define success as top-of-funnel throughput
  • Report on engagement metrics: pipeline influence, content performance, email open rates

The system you built is correct. It generates what it was designed to generate.

Forrester's 2025 marketing predictions research found that only 12% of marketing leaders believe their team's current organizational design will help them meet revenue targets over the next year. The remaining 88% isn't a performance gap. It's structural: the organizational design chosen at inception doesn't connect to revenue outcomes. That was an architectural decision, made early, when the brief was written.

The 73% of MQLs that are never engaged by sales aren't evidence of a broken system. They're evidence of a system performing exactly as designed — producing outputs that were never connected to actual sales motion in the original brief. Tactical excellence within a broken framework looks like this from the inside: your channels hit their numbers, your dashboards show green, and nothing converts at the rate the revenue model requires. The system is doing its job. Its job was wrong.

The Brief Needs a Different Diagnosis

Redesigning your design process won't fix this. The problem isn't design execution — it's what the design was specified to produce.

The right diagnostic question isn't "how do we run a better discovery process?" It's "what was this system actually built to do, and does that connect to how B2B buying decisions actually happen?" A buying process involving more than a dozen internal stakeholders across multiple stages doesn't resolve against your MQL definition. It doesn't follow your funnel stages. The system you built for channel-attributed lead generation is operating against a purchasing reality it was never designed to address.

That's not a design thinking problem. It's a strategic architecture problem — one encoded at the brief stage, before the first campaign launched.

The Illusion of Proxy Command examines the mechanism by which proxy metrics specified at design time produce campaigns that appear to work until they demonstrably don't. It maps what true command of a marketing system requires at the architectural level, not just the execution level. Start there before you redesign anything.