
The Green-Light Problem

A strategy document lands on a client’s desk. It recommends a major platform migration. The tone is confident. The structure is logical. The recommendation is clear: move forward.

Buried on page three, in qualified language, are a handful of caveats. The primary integration hasn’t been validated with the client’s actual data. The connector vendor hasn’t been vetted beyond a cursory website scan, and a core data system that powers the existing workflow isn’t mentioned at all. The timeline assumes everything works on the first try.

The client’s leadership reads the document and sees a green light. The technical team reads the same document and sees open questions.

The gap between those two readings is where six-figure mistakes live.

The anatomy of a premature recommendation

It’s not malicious. It’s structural.

Someone does the real technical analysis. They flag risks, identify dependencies, note unresolved questions, and recommend a staged approach: validate the critical assumptions before committing to a full build. The analysis is honest about what’s known and what isn’t.

Then the analysis gets polished for client consumption, and the risks are softened. “This is an unvalidated dependency that could change the entire architecture” becomes “this will require thoughtful implementation.” The staged approach gets flattened into a single “we recommend moving forward.” The hard questions get cut because they might make the recommendation look uncertain.

The polished version isn’t wrong, exactly. Everything in it is technically true. But it’s selectively true in a way that systematically favors proceeding. The caveats might be present but soft; the confidence is high but unearned. And the client, who is paying for expert guidance on a decision they can’t evaluate themselves, reads the document at face value.

That’s The Green-Light Problem. Not a bad recommendation, but a premature one: a conclusion delivered before the evidence supports it.

The “not a blocker” trap

There’s a specific phrase that shows up in these documents. It sounds reasonable and is often catastrophic:

“Not a blocker.”

An integration with an unvetted vendor, connecting a legacy ERP system to a modern platform, handling thousands of customer-specific pricing matrices and complex approval workflows. The vendor’s website has a case study. A sales engineer said it works. Nobody has tested it against the client’s actual data, actual edge cases, or actual transaction volume.

“Not a blocker.”

If that integration fails, the entire architecture changes. The timeline doubles, the budget triples, and the client is three months into a build when they discover that the foundation, around which the whole project was designed, doesn’t hold weight.

Calling an unvalidated dependency “not a blocker” before vetting it is optimism bias dressed up as a technical assessment. It’s the kind of language that makes strategy documents read well, and post-mortems read badly.

What “validated” actually means

There’s a meaningful difference between “we believe this will work” and “we’ve proven this works.” Strategy documents routinely conflate the two.

Validation is not:

  • The vendor says it works
  • We found a case study on their website
  • It works in a demo environment with sample data

Validation is:

  • We tested the specific integration with the client’s actual data and edge cases, in conditions that resemble the production environment
  • We documented what happened

Validation costs money and takes time. It delays the exciting part of the project (the build) in favor of the boring part (the proof). And it is the single most valuable thing a technical advisor can recommend before a six-figure commitment.

The timeline cascade

A premature green light doesn’t just risk a bad outcome. It creates a compounding timeline problem.

When the project starts with unvalidated assumptions, the team builds on those assumptions for weeks or months. When one of them turns out to be wrong (and they do, regularly, because that’s what “unvalidated” means), the timeline doesn’t shift by the time it takes to fix the problem. It shifts by that time, plus the time spent building on the assumption that turned out to be wrong, plus the time spent unwinding the work that depended on it.

A two-week validation phase at the beginning can prevent a three-month correction in the middle. The math is simple; the psychology is not: the two-week validation phase feels like a delay, and the three-month correction feels like bad luck.

It’s not bad luck. It’s the entirely predictable consequence of skipping validation.
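The compounding above can be sketched in a few lines. All numbers here are hypothetical, chosen only to show the shape of the math: the cost of a failed assumption depends on when you discover it.

```python
def timeline_slip(fix_weeks, weeks_built_on_assumption, unwind_fraction):
    """Total schedule slip when an assumption fails mid-build.

    The slip is not just the fix. It also includes the weeks already
    spent building on the wrong assumption, plus the rework needed to
    unwind the work that depended on it (modeled here as a fraction
    of that build time).
    """
    unwind_weeks = weeks_built_on_assumption * unwind_fraction
    return fix_weeks + weeks_built_on_assumption + unwind_weeks

# Discovered late: a 2-week fix surfaces after 8 weeks of build,
# and half of that build has to be redone.
late = timeline_slip(fix_weeks=2, weeks_built_on_assumption=8,
                     unwind_fraction=0.5)   # 2 + 8 + 4 = 14 weeks

# The same failure caught in an up-front validation phase: nothing
# was built on the assumption yet, so the slip is just the fix.
early = timeline_slip(fix_weeks=2, weeks_built_on_assumption=0,
                      unwind_fraction=0.5)  # 2 weeks

print(late, early)
```

Same defect, seven times the cost, purely because of when it was found.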

The document problem

A well-crafted strategy document can make an unvalidated recommendation look validated. The formatting is professional. The sections follow a logical structure. The language is measured and confident. If you didn’t have the technical context to evaluate the claims, you’d read it and feel reassured.

The people making the platform decision often don’t have the technical context. That’s why they hired advisors. And when the advisory document systematically smooths over the rough edges, the client loses access to the information they need to make an informed decision.

This isn’t about incompetence. It’s about incentives. PMI research on optimism bias in project delivery shows that the dilution of risk reporting is one of the most common failure modes in status communication. The path of least resistance is always “we recommend proceeding.” Clients want to hear yes. Teams want to move forward. Leadership wants progress. The person who adds a gate and says, “Wait, we haven’t validated this yet,” is often treated as an obstacle to progress rather than as someone protecting the investment.

The fix is boring, and it works

Stage-gated recommendations.

That’s it.

“We believe this platform is a viable path for your requirements. Before committing to a full build, we recommend a validation phase. Here’s what we’ll test, here’s what it costs, and here are the criteria we’ll use to decide whether to proceed.”

That’s not hedging. That’s risk management. And the client who hears “we want to prove this works before you spend six figures on it” will trust you more, not less, because you’re clearly prioritizing their outcome over your timeline.
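One reason stage-gating works is that it makes the go/no-go decision mechanical: the criteria are agreed before the validation phase starts, so nobody has to argue about what the results mean afterward. A minimal sketch, with hypothetical criteria and results:

```python
# Gate criteria are written down before validation begins; the results
# below are hypothetical examples of what a validation phase might find.
GATE_RESULTS = {
    "integration handles client's real pricing matrices": True,
    "approval workflows round-trip without data loss": True,
    "connector sustains production transaction volume": False,
}

def gate_decision(results):
    """Proceed only if every pre-agreed criterion passed; otherwise
    report exactly which assumptions failed validation."""
    failed = [name for name, passed in results.items() if not passed]
    if failed:
        return "halt", failed
    return "proceed", []

decision, failures = gate_decision(GATE_RESULTS)
print(decision)   # halt
print(failures)   # the one unvalidated assumption that didn't hold
```

The output isn’t a verdict on the platform. It’s a precise list of what still needs to be proven before six figures get committed.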


The most expensive sentence in any strategy document is “we recommend moving forward” written before the analysis behind it is finished.

Sources: Optimism Bias — The Decision Lab · Optimism Bias and Failure to Terminate Failing Projects — PMI · Phase-Gate Process — Smartsheet
