(Image: screenshot from Channel 4 News video)

In the aftermath of the disastrous Grenfell Tower fire, people are asking how: how can this happen? The answer, writes Dr Samuel Douglas, is that engineering disasters and moral failures often go hand in hand.

IT'S A BIT SOON to speculate too much about the causes of the Grenfell Tower fire — and it would be doubly premature to talk about who knew what and when they might have known it. But in light of issues with the flammability of external cladding on apartments elsewhere, I think everyone has some idea of where this story is going.

Even before we have definitive answers, there’s a question already on everyone’s lips: How can something like this happen?

The world we live in is complex. The technological artefacts that are the defining icons of our age are no exception. Our phones, cars, apps, appliances, and dwellings are the culmination of engineering, design and manufacturing processes that involve thousands of decisions made by thousands of people. And usually, these wonders of the modern era work reasonably well for the bulk of people using them. Technological know-how is only part of success here — the values held by these decision-makers play an equally important role. Our safety and security rests, in no small part, on the skill, diligence and integrity of engineering and design professionals.

Failure, sadly, is always an option.

Modern engineering doesn’t usually go catastrophically wrong, but it does happen often enough that patterns emerge. These failures do not usually sit with one person and they often have a moral component. People don’t just fail to do the calculations correctly, they fail to do the right thing.

Before the Challenger shuttle explosion, concerns over potential component failures were raised by engineers, but political and financial pressure led to a management decision that overrode these worries, even though it was the astronauts, not the managers, risking their lives.

Sometimes, as with the Deepwater Horizon oil spill, specific safety recommendations and instrument readings were ignored — again because of cost pressures. This was compounded by a corporate culture that failed to value the lives of workers, the environment or anyone other than shareholders. The organisational complexity of BP and its partners on the rig project – Halliburton and Transocean – meant that no single individual felt overall responsibility for these neglected considerations.

In the case of the Sampoong department store in South Korea, engineers informed management that the building was profoundly unsafe and were dismissed for their troubles. Sadly, they didn’t take their concerns to external authorities in time to prevent the 502 deaths that occurred when the building collapsed. The construction of this building was later found to be profoundly inadequate, yet it had passed annual safety inspections for five years due to the bribery of local officials.

There are plenty more examples, such as the Eschede derailment, Fukushima Daiichi nuclear disaster, West Gate Bridge construction accident, salinity of the Salton Sea, disappearance of the Aral Sea, Therac-25 irradiation incidents, the exploding Ford Pinto and the Williamtown RAAF base contamination. The list goes on ... and on. Often, there are similar recurring factors: lack of individual responsibility (the "many hands" problem), lack of communication between stakeholders, unfair distributions of risk and (not least) valuing money more highly than human life.

But a serious engineering mistake need not end in disaster if the problem is detected early and properly acknowledged.

Citicorp Center Manhattan (or as some of my students call it, "Man-flatten") is an example of how things can be turned around. This highrise building had a novel design to accommodate a planning requirement involving a nearby church. This design passed all relevant regulations and required tests. The problem, as engineering student Diane Hartley realised, was that it wouldn’t resist wind hitting it on the diagonal. By the time she contacted the structural engineer in charge of the project, William LeMessurier, the 279-metre-tall tower had already been built. LeMessurier thought it shouldn’t be a problem, as the design was strong enough due to a specific series of welded joints in the superstructure. But, to reduce costs, the contractor had been approved to bolt the joints instead. The result? The bolts might shear; a storm of a severity expected once every 16 years could topple the building, leading to a domino effect that could result in tens of thousands of deaths and billions of dollars’ worth of damage.

Battling anxiety, depression and thoughts of suicide, LeMessurier worked with insurers and city authorities to have repairs undertaken that would render the building safe. He wasn’t a complete saint, though. These repairs, which took eight weeks, were conducted after hours and the whole crisis was kept secret from people who worked in the building and nearby residents. Even the student who first saw the problem didn’t find out what had happened until years later. The moral gamble of hiding the risk from people, denying them an informed choice on whether to go to work in an unsafe building, paid off. But it was just dumb luck. Had the weather been different, this could have been the worst building collapse of the 20th century.

Sometimes disasters come out of nowhere, but when it comes to human creations, most of the time, someone knows that there might be a problem. And sometimes, this someone is an engineer. When this is the case, we rely on them having the ethical knowledge, moral courage and professional autonomy to make difficult decisions — to give a client bad, but necessary news. To resist a move to treat human life (and death) as just another dollar-value in a cost-benefit analysis. They may even need to consider the career sacrifice of being a whistleblower.

And a considerable sacrifice it is. Roger Boisjoly testified at the presidential commission into the Challenger disaster about his concerns leading up to the launch. He tried to prevent the disaster beforehand and told the truth about it afterwards. His reward? To be cut off from the space work that he loved and bullied into quitting. His experience was not atypical. Whistleblowing, even when it does not break the law, usually involves loss of employment and broken friendships.

In Australia, the weight of the responsibility carried by engineers is reflected in their code of ethics – and rightly so.

This is why my engineering students and I have spent the past 13 weeks looking at these events and asking: Who knew what, and when did they know it? Were their actions (or lack thereof) ethically justified? Was there anything else they could have done to prevent the problems?

Over the coming weeks, these are the sorts of questions that will be directed at those responsible for management and renovation of Grenfell Tower. Right now, it’s too soon to know the full extent of technical and moral failures involved in this latest tragedy. But if other engineering disasters are anything to go by, they’ll be there.

Hopefully, the authorities in London, as well as architects and engineers around the world, can learn something from this horrifying and tragic disaster.

You can follow Dr Samuel Douglas on Twitter @BeachPhilsophy.

Creative Commons Licence
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Australia License
