The Citigroup Center and our Ethical Obligation to Good Work

This week my eye was caught by an interesting case study of ethics in the design and building of something other than software. In 1977, the Citigroup Center – a tower block in Midtown Manhattan – was completed. It used an unusual design, resting on four columns positioned at the centre of each side rather than at the corners. During its construction, structural engineer William LeMessurier was tasked with calculating the effect of winds on the building, to ensure it was structurally sound and wouldn’t topple over in a strong wind.

Unfortunately LeMessurier used the NYC building code as his benchmark – a set of “minimum requirements” which did not and could not take into account the very specific risks associated with the Citigroup Center’s unique design…

“Due to a design oversight and changes during construction, the building as initially completed was structurally unsound. For his original design, LeMessurier calculated wind load on the building when wind blew perpendicularly against the side of the building—wind from due north, east, south, or west—all that was required by New York building code. Such winds are normally the worst case, and a structural system capable of handling them can easily cope with wind from any other angle. Thus, the engineer did not specifically calculate the effects of diagonally-oriented “quartering winds” (northeast, northwest, southeast, or southwest). In June 1978, prompted by discussion between a civil engineering student at Princeton University, Diane Hartley, and design engineer Joel Weinstein, LeMessurier recalculated the wind loads on the building, this time including quartering winds. This recalculation revealed that with a quartering wind, there was a 40% increase in wind loads, resulting in a 160% increase in the load at the chevron brace connection joints.

LeMessurier’s original design for the chevron load braces used welded joints. However, during construction, builder Bethlehem Steel was approved to use bolted joints to save labor and material costs. LeMessurier’s firm approved the change, although this was not known to LeMessurier himself. The original welded-joint design had ample strength to withstand the load from straight-on wind, with enough safety margin to withstand the higher loads from quartering wind; however, the load from a 70 miles per hour (110 km/h) hurricane force quartering wind would exceed the strength of the bolted-joint chevrons. The bolts could shear and the building could collapse. Wind tunnel tests with models of Citigroup Center revealed that the wind speed required to bring down the building would occur every 55 years on average.

[…]

LeMessurier reportedly agonized over how to deal with the problem. If the issues were made known to the public, he risked ruining his professional reputation. He approached the architect (Hugh Stubbins) first, and then Citicorp. He advised them to take swift remedial action. Ultimately, he persuaded Citicorp to repair the building without informing the public, a task made easier by a then-ongoing press strike.

For the next three months, construction crews working at night welded 2″ steel plates over each of the skyscraper’s 200 bolted joints. They worked during the night, after each work day, almost unknown to the general public.”

From https://en.wikipedia.org/wiki/Citigroup_Center#Engineering_crisis_of_1978
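As an aside, it’s worth unpacking what “every 55 years on average” actually means over a building’s lifetime. Here’s a quick back-of-envelope calculation – treating storm years as independent, which real return-period statistics only approximate:

```python
# Back-of-envelope: a "once every 55 years on average" storm implies an
# annual exceedance probability of 1/55. Assuming independent years, the
# chance of at least one such storm in n years is 1 - (1 - 1/55)**n.
annual_p = 1 / 55

for years in (1, 10, 50):
    at_least_once = 1 - (1 - annual_p) ** years
    print(f"{years:>2} years: {at_least_once:.0%} chance of at least one critical storm")
```

Roughly a 2% chance in any single year compounds to about 17% over a decade and around 60% over fifty years – hardly a remote risk for a building expected to stand for generations.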

To cut a long story short, because of the unique design, checking that the building’s wind-bearing ability was “up to code” was not sufficient; the oversight created significant risk and costly remedial work for the building’s owners. There was a very real risk the building could collapse in strong winds, and only limited thought was given to the safety of the people working there (and in the surrounding buildings). Those responsible may have taken personal lessons from the situation, but they kept it quiet – in fact, full details only emerged almost 20 years later.

Cool Story Bro. What’s That Got To Do With Me?

This case study demonstrates the significant impact of pre-release testing and analysis, and highlights the risk of testing (really, just checking) that the acceptance criteria pass and then asserting the code is ready to ship. Passing acceptance criteria is almost never enough on its own; it is the job of a tester – and indeed of anyone involved in changing software – to consider more than just the basics when assessing the quality of a solution.

Acceptance criteria are by nature incomplete. We do not usually write a specific acceptance criterion stating that “other system actions are not negatively impacted”, but we certainly expect it. We do not necessarily cover the performance of the solution with AC (although sometimes we should). The number of AC every story would need to carry in order to cover every implication of every change would be antithetical to the agile maxim of “working software over comprehensive documentation” – instead, we hire competent, trustworthy and ethically sound developers and testers to ensure we are not simply “ticking boxes”, and that we are interested in more than just hitting the AC as written.
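To make this concrete, here’s a minimal sketch in Python using an entirely hypothetical basket-and-voucher feature (the tiny implementation exists only so the tests run): the first test checks the AC as written, while the second probes the implicit expectations no AC ever states.

```python
from dataclasses import dataclass

# A deliberately tiny, hypothetical implementation so the tests below run.
@dataclass(frozen=True)
class Basket:
    total: float

def apply_voucher(basket: Basket, code: str) -> Basket:
    discount = 10.0 if code == "SAVE10" else 0.0
    return Basket(total=max(basket.total - discount, 0.0))

# The AC as written: "applying a valid voucher reduces the total".
def test_acceptance_criterion_as_written():
    assert apply_voucher(Basket(total=100.0), "SAVE10").total == 90.0

# The requirements the AC never states, but we certainly expect:
def test_implicit_requirements():
    basket = Basket(total=100.0)
    apply_voucher(basket, "SAVE10")
    # "Other system actions are not negatively impacted" -- the original
    # basket must not be mutated behind the caller's back.
    assert basket.total == 100.0
    # A voucher should never push the total below zero.
    assert apply_voucher(Basket(total=5.0), "SAVE10").total >= 0.0
```

Neither of the implicit checks appears in the written AC – yet shipping without them would be exactly the “up to code, but not sound” trap.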

Circling back to the Citigroup Center, architect Eugene Kremer analysed the case and came away with six key points, which I think are of value to us as software testers:

1. Analysis of wind loads. Check all calculations and not rely just on building codes; these set minimum requirements and not the state of the art.
2. Design changes. In this case change from welded to bolted connections. Changes are considered in the overall design context and by everyone involved and not a spur of the moment decision.
3. Professional responsibility. To follow the codes of conduct for every chartered institution. LeMessurier did not consider the public safety first.
4. Public statements. In this case the public statements issued by LeMessurier and Citigroup set out to mislead the public deliberately.
5. Public safety. The public statement denied the public the right to ensure their own safety and to make their own critical decisions.
6. Advancement of professional knowledge. Concealing this problem for almost 20 years prevented ethical and engineering learning that could have taken place.

From Beyond Failure: Forensic Case Studies for Civil Engineers by Norbert J. Delatte

Each one of these has a bearing on how we develop and test software. Let’s break them down:

1. Analysis
“Check all calculations and not rely just on building codes; these set minimum requirements and not the state of the art.”

We can – and should, and must – consider more than the AC from the outset of planning and sizing (where we do this). We need to understand that the minimum requirements are never the only requirements. We can help this along by testing that our designs are fit for purpose before committing to writing code. Don’t rely on “just” the AC; these set minimum requirements and cannot take into account the value we add as experienced professionals familiar with our own software, code and best practice.
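One way to hunt for the software equivalent of quartering winds is property-based testing. A minimal sketch using the hypothesis library, with an invented calculate_discount function standing in for the code under test: the example-based test covers the documented case, while the property sweeps the oblique inputs nobody wrote an AC for.

```python
from hypothesis import given, strategies as st

# A hypothetical pricing function standing in for the code under test.
def calculate_discount(price: float, code: str) -> float:
    rate = 0.10 if code == "SAVE10" else 0.0
    return round(price * rate, 2)

# The AC's handful of examples are the "perpendicular winds":
def test_documented_example():
    assert calculate_discount(100.0, "SAVE10") == 10.0

# Property-based testing probes the "quartering winds" -- oblique inputs
# no one specified -- by asserting an invariant across many generated cases.
@given(st.floats(min_value=0, max_value=1_000_000, allow_nan=False))
def test_discount_never_exceeds_price(price):
    discount = calculate_discount(price, "SAVE10")
    assert 0.0 <= discount <= price
```

The invariant (“a discount never exceeds the price”) is the kind of requirement no story card spells out, yet every stakeholder expects.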

2. Design changes
“In this case change from welded to bolted connections. Changes are considered in the overall design context and by everyone involved and not a spur of the moment decision.”

Requirements can and do change, and it’s frequently the case that designs also shift as the realities of a solution come to light during development. It is essential that changes are a group decision, and that everyone working on the software is informed of them. In the Citigroup case, had LeMessurier been informed of the change to bolts, he would have had an opportunity to stop it and avert considerable risk and embarrassment. If designs change, so should the approach to testing that they’re fit for requirements (and for actual use).
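One lightweight defence is to encode agreed design decisions as executable checks. A hypothetical sketch – the StorageBackend wiring here is invented purely for illustration – in which a quietly substituted backend (the software equivalent of swapping welds for bolts) fails a build rather than shipping unnoticed:

```python
# A hypothetical "executable design assumption": the team agreed that
# user uploads are stored encrypted at rest.

class StorageBackend:
    def __init__(self, name: str, encryption_at_rest: bool):
        self.name = name
        self.encryption_at_rest = encryption_at_rest

# Stand-in for however the real system wires up its storage.
def get_backend(bucket: str) -> StorageBackend:
    return StorageBackend(name=bucket, encryption_at_rest=True)

def test_uploads_still_match_the_agreed_design():
    backend = get_backend("user-uploads")
    assert backend.encryption_at_rest, (
        "The design review agreed uploads are encrypted at rest; if that "
        "has changed, re-run the review -- don't just delete this assertion."
    )
```

The assertion message matters as much as the assertion: it tells whoever hits the failure that the change needs a conversation, not a workaround.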

3. Professional responsibility
“To follow the codes of conduct for every chartered institution. LeMessurier did not consider the public safety first.”

As software testers, our responsibilities are manifold. We have a responsibility to our users, but also to our employers. We protect both from the impact of poorly written, badly behaving code, and from the psychological and reputational damage poorly performing software can cause. Whilst testers do not necessarily have an ethical charter, we can each agree to do only good work: never to sign off on untested code, never to “half-arse” the work we do. Understanding the importance of our work, and the detrimental impact of doing it badly, we can strive to be better testers at the heart of informed, collaborative teams.

4. Public statements
“In this case the public statements issued by LeMessurier and Citigroup set out to mislead the public deliberately.”

When we do find bugs in our solution, it’s embarrassing. It’s also tempting to sweep them under the carpet. This is also unethical. If we wait until one of our customers’ days is ruined before we acknowledge a bug, we’re not doing our job properly. If we believe something we’ve missed or done wrong is likely to have this detrimental impact on a user, we have an ethical obligation to speak up and get something in place to handle this. That could be an email to affected users, revocation of a feature until such time as it can be firmed up, or prioritising a simple quick fix to deal with the problem immediately. Whatever we do, once we know there’s a problem, we cannot do nothing.
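A feature kill switch is one practical way to make “revocation of a feature” achievable at short notice. A minimal sketch, assuming a hypothetical in-memory flag store standing in for a real feature-flag service:

```python
class FeatureDisabled(Exception):
    """Raised when a feature has been revoked pending a fix."""

# In-memory stand-in for a real feature-flag service. The flag is flipped
# off the moment a serious defect is confirmed, revoking the feature for
# everyone while a proper fix is prioritised.
FLAGS = {"report-export": False}

def export_report(rows):
    if not FLAGS.get("report-export", False):
        raise FeatureDisabled(
            "Report export is temporarily unavailable while a known "
            "data-accuracy bug is fixed."
        )
    return "\n".join(str(row) for row in rows)
```

The honest error message is the opposite of the Citigroup approach: users are told there is a problem, rather than being left to discover it for themselves.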

5. Public safety
“The public statement denied the public the right to ensure their own safety and to make their own critical decisions.”

In our industry this is frequently an extension of point 4 – if we know something is unsafe, or likely to be very problematic for our users, other systems or services, or our employers, we need to speak up and stress the urgency of the problem. Few of us work on features which pose genuine risks to human life, but all software carries some risk of detrimental impact on the people who depend on it. We have an obligation to stand in the way of that risk and ensure our software is as solid and fit for purpose as it can be.

6. Advancement of professional knowledge
“Concealing this problem for almost 20 years prevented ethical and engineering learning that could have taken place.”

When we learn something new, it can be tempting to keep that information to ourselves. We become “special” by virtue of knowing how to do things others don’t. Again, this is unethical: it is selfish. We are not using our strengths for the benefit of the greatest number of people if we keep them under our own hats. By sharing best practice, sharing our learning, and giving others the opportunity to benefit from our mistakes (or near misses), we maximise the growth possible. In the Citigroup case, other people’s lives may have been placed at risk by the hush-hush approach to a very dangerous design fault. Had other skyscrapers built similarly gone through the same process and collapsed, many people might have died – and (somewhat less importantly, but more pressingly for them) Citigroup’s reputation would have been left in tatters.

In Conclusion

As per point 6 above, it’s only responsible to share relevant learning and studies with our industry, and this cautionary tale with its handy checkpoints for responsible development was certainly new to me. We are ethically obliged to do good work. We are ethically called upon to ensure we take what we know and do the best we can with it. We are ethically required to make choices (choices which may involve significant hard work or personal effort) in order to do things as well as we know how to. Not to turn a blind eye, not to hope someone else picks something up. It’s on us to do the best we can.

Ethics is an infrequently discussed topic in software development, but perhaps it shouldn’t be: What we do is a thinking craft; how we think shapes how we work. Let’s make sure that whether or not we are working responsibly is at least a consideration.

Do you have an ethical code which drives how you work? What are some other examples from parallel fields of work which impact how you test? Let me know your thoughts in the comments!
