Consider these cases:
Behind these public, large-scale examples of the unforeseen negative consequences of well-meaning actions on society, similar stories routinely occur privately within organizations. For example:
These are recent examples of a persistent, common flaw in the design of corporate initiatives.
In the early 1990s, Harvard students studying under Prof. Chris Argyris interviewed Fortune 500 leaders about the performance of recent strategic initiatives. The students observed that few initiatives were claimed to be fully successful. The leaders’ rationale followed a pattern: attributing any positive results to themselves and their teams, and placing the blame for delays, cost overruns, missed benefits, and outright failure on a distant third party.
The explanations took the form of “They did x, which hurt us” and “They did y, which caused bad thing z for our strategy,” followed by the confident claim that “No one could have predicted that it would happen.” The message was “It’s not my fault we didn’t consider the potential negative consequences our actions would trigger and plan to prevent them.”
This is a common management bias, certainly not restricted to Fortune 500 managers. For all of them, the excuse has been valid: few were taught how to prevent such issues, and the methodologies they used did not account for them.
(There are multiple roles where the prediction and prevention of unintended negative consequences are an explicit part of the work, from military mission planning and engineered-systems design to pharmaceutical and medical-device research. However, in our experience, it is the exception for people to take this thinking with them as they move up from tactical, functional roles into strategic, managerial roles.)
Prof. Thomas Schelling, a 2005 recipient of the Nobel Prize in Economics, is credited with the first known use of the phrase “collateral damage,” in his May 1961 article Dispersal, Deterrence, and Damage. The CIA describes “blowback” as the unintended, undesirable consequences of covert operations, such as the funding of the Afghan Mujahideen and the destabilization of Afghanistan contributing to the rise of the Taliban and Al-Qaeda. Each is a negative unintended consequence, a term popularized by American sociologist Robert K. Merton, whose focus was on the societal examples cited at the start of this article.
To improve something, we cause change within the greater system of which it is a part. Systems comprise interconnected, interdependent elements linked by positive and negative causal feedback loops (known as reinforcing and balancing loops, respectively). Most managers, though, are trained in process thinking, not systems thinking. They see a linear series of steps: A, B, C, and D, producing result F. They see Cause and Effect.
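To make the distinction concrete, here is a minimal sketch, in Python, of the same improvement effort viewed two ways: as a linear process and as an element inside a system with one reinforcing loop and one balancing loop. The numbers, loop strengths, and threshold are invented purely for illustration; they are ours, not the article’s.

```python
# A minimal illustration (ours, not part of the article's methodology): the
# same improvement effort viewed as a linear process versus as an element of
# a system with feedback loops. All numbers and loop strengths are invented.

def linear_process(effort: float) -> float:
    """Process thinking: steps A -> B -> C -> D produce one predictable result."""
    a = effort
    b = a * 1.2                 # step B amplifies the work done in A
    c = b + 5.0                 # step C adds a fixed contribution
    d = c * 0.9                 # step D loses a little in the hand-off
    return d                    # same answer every time it is run

def system_with_feedback(effort: float, periods: int = 12) -> list:
    """Systems thinking: the same effort sits inside a reinforcing loop and a
    balancing loop, so the outcome unfolds over time."""
    result = effort
    history = []
    for _ in range(periods):
        reinforcing = 0.15 * result                    # success attracts more resources
        balancing = -0.25 * max(result - 20.0, 0.0)    # side effects push back past a threshold
        result += reinforcing + balancing
        history.append(round(result, 2))
    return history

if __name__ == "__main__":
    print("Linear view: ", linear_process(10.0))        # one number
    print("Systems view:", system_with_feedback(10.0))  # a trajectory shaped by the loops
```

The linear view returns one predictable number; the systems view returns a trajectory whose early growth from the reinforcing loop is progressively checked by the balancing loop once side effects appear.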
Many managers and consultants still use the 1960s-era SWOT Analysis in goal setting and problem solving. However, unintended negative consequences, a form of ‘T’hreat, are not explicitly asked about in the SWOT methodology. Nor are they, in our experience, in conventional Risk Assessments, which focus on barriers to success rather than negative consequences for others.
Over the last decade, we have trained business personnel and management consultants to ask themselves and stakeholders how the goals for which they advocate, and the actions required to attain them, could trigger negative consequences.
The Question: How could the actions you propose, or attainment of the goals they will enable, trigger negative side effects for yourself or for others inside or outside this initiative?
Note how it differs from the question used to surface the issues and problems a group will face: “What barriers do you experience, or foresee arising, that will prevent attainment of the goals?”
Project by project, unintended negative consequences are shifting from being referenced in the past tense to the future tense: from “Can you believe that happened?” to “If we do x, bad thing y is likely to occur.”
We will look at whom to ask in a future post. I can say that, initially, until the concept is explained, fewer than 10% of those responsible for an initiative can express a single negative consequence. Most will say “There aren’t any.” Those who do will cite one, on average. The most common example is “We will lose some good people who can’t make the change.”
Almost all say they understand the concept of an unintended negative consequence, but few do; most instead describe barriers to success: “We can’t do x.” “They don’t have enough y.”
The syntax of an unintended negative consequence is: IF (action is taken/goal is attained) THEN (bad thing will occur).
On average, across several hundred corporate initiatives, when asked the above question, groups have foreseen 13 potential negative consequences of their proposed actions, including these examples expressed by board members to operating personnel:
After such concerns are surfaced, five factors determine which warrant validation and mitigation:
Before the First Fleet set sail for Australia, how likely is it that, if asked, any of the planning committee in London would have predicted that taking European rabbits as a food source would eventually decimate many native species, with the rabbits eating so much vegetation, and grazing so close to the roots, that they caused devastating land erosion? Unlikely.
Could the British have thought through the impact of fewer road accidents, and the consequences beyond the reduction in road fatalities, if asked to do so? I would like to think so.
A consequence that compromises one of the topic’s primary goals, or makes a critical-path action infeasible, is considered high impact. I am confident Leonid Brezhnev would have felt that way if he could have envisioned how Article 7 would lead to free expression and enable figures such as Lech Walesa, and movements such as Solidarity, to emerge.
Something that slightly reduces a goal’s value, or delays a non-critical-path action, would be considered low impact on the topic. However, note that what might be low impact on the topic might be high impact on a participant: “After we do this, they won’t need me anymore!”
If it could have been predicted, would the members of the First Fleet have cared that the food source they chose to support their mission to settle Australia might one day lead to soil erosion? Probably not.
Should Brezhnev have paid more attention to Article 7, and to what removing the Communist blanket over public expression might do to his control of occupied countries such as Poland and Hungary? Yes. (I am glad he did not, a perspective on which I expect he and I would have been misaligned.)
Several of the examples illustrate a common flaw in organizational initiatives – confining the boundary of the project’s thinking to the area defined by its goals and plans.
Even if some of the earliest British settlers in Australia had been able to foresee the problems their dinner entrées would give their descendants decades later, should they have acted differently at a time when they needed a sustainable food source, one transportable in small wooden ships over a months-long voyage and vital to getting established when they arrived on the other side of the world? Perhaps not. Today, we would put the concern on the ‘Parking Lot’ flipchart for after the settlement is built.
But if the US government in 1920 had predicted that Prohibition would trigger large-scale organized crime (great clarity), with its direct impact on deaths, tax losses, and policing costs (short distance), and that it would start as soon as 1921 (short time), it would probably have taken a different approach.
And, finally, the greatest ‘Get Out of Jail Free’ card for those who don’t want to spend time thinking about such things and prefer to get on with taking action: How likely is it to happen anyway?
There is a spectrum of likelihoods, from 1% (“It could happen if ten other things also occur”) to 99% (“We are aware that when others have done x, as we propose to do, bad thing y was triggered”).
There is great value in the dialogue among the stakeholders regarding the likelihood of the negative consequence occurring.
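For teams that want to capture these judgments consistently, below is a minimal sketch, in Python, of how a surfaced concern, expressed in the IF/THEN syntax above, might be recorded and screened against the factors just discussed. The factor names, the 1-to-5 scale, the threshold, and the example ratings are our own assumptions for illustration; the approach described in this article relies on stakeholder dialogue, not a formula.

```python
# A minimal sketch (our illustration, not the article's prescribed tool) for
# recording concerns in the IF/THEN syntax and screening them against the
# factors discussed above. Factor names, the 1-5 scale, the threshold, and
# the example ratings are assumptions made for this example.

from dataclasses import dataclass

@dataclass
class PotentialConsequence:
    action_or_goal: str   # the IF: the action taken or the goal attained
    bad_outcome: str      # the THEN: the bad thing that could occur
    impact: int           # 1 = slightly reduces a goal's value, 5 = compromises a primary goal
    clarity: int          # 1 = vague hunch, 5 = clearly describable chain of events
    distance: int         # 1 = far removed from the initiative, 5 = hits it directly
    time: int             # 1 = decades away, 5 = starts almost immediately
    likelihood: int       # 1 = "needs ten other things to occur", 5 = "seen when others did x"

    def warrants_mitigation(self, threshold: int = 15) -> bool:
        """A crude screen: high combined ratings suggest the concern deserves
        validation and a mitigation plan. The real decision comes from
        stakeholder dialogue, not arithmetic."""
        score = self.impact + self.clarity + self.distance + self.time + self.likelihood
        return score >= threshold

# Illustrative ratings for the Prohibition example discussed above
prohibition = PotentialConsequence(
    action_or_goal="Prohibit the sale of alcohol",
    bad_outcome="Large-scale organized crime: deaths, tax losses, policing costs",
    impact=5, clarity=5, distance=5, time=5, likelihood=4,
)
print(prohibition.warrants_mitigation())  # True
```

The value of such a record is not the arithmetic but the conversation it forces: each rating has to be explained and defended among the stakeholders.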
Surfacing, clarifying, and mitigating the unintended negative consequences your goals and plans could trigger benefits those goals and plans in two ways:
By surfacing, validating, and mitigating your stakeholders’ concerns, you kill two birds with one stone.
Spend a few minutes and ask yourself the question about your own projects and initiatives, goals and plans. If you cannot come up with any unintended negative consequences yourself, or only one or two, bring in colleagues involved in the initiatives and use this approach to improve their strategic thinking and the initiatives’ feasibility.
If you want to learn more about how to do this, and similar how-tos for improving organizational performance, learn the Advanced Management Consulting Core Skills.
(PS A large and sincere Thank You to the executives and internal and external consultants who directly contributed to the material in this paper during one of our SchellingPoint Wednesday Webinars. Brilliant collaboration and thinking.)