Blind Spots & Buried Truths: The Ethics of ‘Undiscovered’ Risks

“Just check the accessible sections for this year’s report,” Mark had said, the words a low hum in the humid air of the operations control room. The young engineer, fresh out of her second-year rotation, nodded, her hand already reaching for the schematic that outlined the pipe runs. Both knew, with a certainty that felt like a physical weight, that the real, terrifying truth wasn’t in those readily available segments. It was deep below, in the submerged, inaccessible intake tunnel – a forgotten artery, silently decaying, waiting for its moment to fail. A moment that, if discovered, would unleash a torrent of unfunded liabilities, a crisis nobody wanted to own.

Potential remediation cost: $676 million

This is the quiet corruption of willful blindness. It’s not about malevolent intent or grand conspiracies, but rather a subtle, systemic incentive against genuine discovery. Organizations, particularly those managing sprawling public or critical assets, often create powerful, unspoken directives: _don’t look too closely_. Because if you find something, then they – the management, the board, the public – have to fix it. And fixing things costs money, disrupts budgets, and forces uncomfortable conversations about past oversights. It’s far easier to manage what isn’t yet ‘known’ than to confront a problem that has suddenly become a glaring, unfunded liability.

I remember trying, one particularly frustrating evening, to open a pickle jar. It wasn’t just stuck; it felt cemented shut. Every twist, every grunt, only highlighted the futility. It felt similar to how many professionals feel when they encounter these ‘undiscovered’ problems. The solution is there, the truth is evident, but the system itself acts as a stubborn, unyielding lid, preventing access to what needs to be revealed. The frustration isn’t with the problem itself, but with the institutional resistance to even acknowledging its existence.

Known vs. Undiscovered Risks

This isn’t to say that all risks are ignored. The known unknowns, the things we budget for and plan around, get their due. But what about the ‘undiscovered’ risks? These aren’t genuinely unknown; they’re often the ones we’ve chosen, through a collective, unspoken agreement, not to investigate thoroughly. The kind that manifest as a faint tremor in the data stream, a recurrent anomaly in a sensor reading that gets rationalized away, or an inspection scope that conveniently skirts around the most problematic areas. They linger, growing like a slow-burning ember, until they ignite into a full-blown inferno.

Undiscovered (chosen not to investigate) → Tremor (faint signal in the data) → Ignited (full-blown inferno)

The Artist and the Engineer

Consider Isla P.-A., a sand sculptor I met years ago on a blustery beach in northern California. Her creations were breathtaking, intricate castles with delicate spires and swirling moats. She knew, perhaps better than anyone, the transient nature of her art. “I build beautiful lies,” she’d once told me, her fingers still shaping a miniature turret. “They’ll be gone with the next high tide. But at least I’m honest about their impermanence.” There was a profound wisdom in her work: an acceptance of decay, a direct confrontation with the forces of nature.

Our engineers, conversely, are often tasked with building and maintaining structures that aspire to permanence, yet are simultaneously asked to adopt a willful blindness towards the very forces that undermine that permanence. They are asked to pretend that the high tide isn’t coming.

The Cost of “Being Right”

My own career has seen its share of these paradoxes. Early on, driven by a rigid sense of professional duty, I once pushed hard to expose a critical flaw in a legacy wastewater treatment system – a 6-inch crack in a major effluent pipe that had been ignored for 6 months. I was right. The flaw existed. It cost the city millions to fix, causing significant budget overruns and delaying several other planned infrastructure projects. While the system was ultimately saved from catastrophic failure, my relentless pursuit of the truth, unbuffered by political realities, earned me a reputation that was, shall we say, _complicated_.

I learned that sometimes, simply being right isn’t enough; you also have to be smart about _when_ and _how_ you’re right, a difficult truth that subtly clashes with the idealism that drew me to engineering in the first place. This doesn’t excuse willful blindness, but it does illuminate the powerful, human-shaped incentives behind it. The system rewards quiet compliance, not inconvenient revelations.

Initial estimate: $236K (fixing the crack) vs. actual cost: $676M (remediation & PR)

The Essential Eye in the Murk

When reality hits – when the water main bursts, the bridge cracks, or the intake tunnel fails – that’s when the true cost is tallied. And that’s when organizations, often in crisis, finally seek out the experts who can actually see what’s there, who aren’t afraid to look into the dark, murky depths.

This is precisely where the value of partners like Ven-Tech Subsea becomes not just evident, but essential. They represent the specialized eye, the unblinking gaze that can peer through the murk and provide an honest assessment, without the internal pressure to keep inconvenient truths buried.

The Courage to Discover

Navigating this landscape requires more than just technical skill; it demands a particular kind of courage. The courage to document the corrosion, to map the unseen structural fatigue, to dive into the truly inaccessible. Because these ‘undiscovered’ risks don’t just disappear. They merely wait. They accumulate. Every deferred inspection, every overlooked tremor, every structural report conveniently filed away, adds another layer to the eventual reckoning.

Every deferred inspection adds another layer, pushing cumulative risk toward 90%.
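The way deferred inspections compound can be sketched with a toy calculation. This is purely illustrative: the per-cycle failure probability and the number of deferred cycles below are hypothetical numbers chosen to show the arithmetic, not figures from any real asset.

```python
# Illustrative only: a toy model of how deferred inspections compound risk.
# The per-cycle probability (0.2) and cycle count (10) are hypothetical.

def cumulative_risk(p_per_cycle: float, n_cycles: int) -> float:
    """Probability of at least one failure after n deferred inspection
    cycles, assuming an independent failure probability per cycle."""
    return 1 - (1 - p_per_cycle) ** n_cycles

# Even a modest per-cycle risk approaches 90% after about ten deferrals:
risk = cumulative_risk(0.2, 10)
print(f"{risk:.0%}")  # roughly 89%
```

The point of the sketch is that the risk is not additive: each skipped inspection multiplies the chance that the eventual reckoning arrives unannounced.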

And when that reckoning arrives, it almost always demands a far higher price than preventative action ever would have. The irony, of course, is that the very act of not looking, intended to save immediate costs, often ensures a catastrophic future expense.

Beyond Information: Ethical Imperatives

We often assume that organizational failures stem from a lack of information. More often, it’s a surplus of the _wrong_ kind of information, or a deliberate _avoidance_ of the right kind. The incentive structures are perverse: individual careers flourish when problems are managed quietly or, better yet, not found at all, even if the organization as a whole sails towards an iceberg.

This isn’t just about financial prudence; it’s a profound ethical dilemma. We are entrusted with public safety, with critical infrastructure that underpins our daily lives. To willfully ignore its decay is not just negligent; it’s a betrayal of that trust. It’s a collective agreement to value blissful ignorance over painful, necessary truth. The responsibility, therefore, extends beyond the balance sheet to the very fabric of societal well-being.

Moving Forward: A Shift in Perspective

So, what does it truly take to move beyond this? It begins with a fundamental shift in perspective – acknowledging that finding a problem isn’t a failure; it’s an opportunity. It’s a chance to exercise foresight, to be proactive rather than perpetually reactive. It requires creating cultures where reporting inconvenient truths is rewarded, not penalized.

It demands a leadership that understands that the biggest risks aren’t those we don’t know exist, but those we actively choose not to discover. Because only when we decide to look, truly look, can we begin to build something that might actually stand the test of time.