Sten Vesterli's Blog

Are You Too Cautious?

Do you always play it safe? We all have our personal risk profiles. Some people climb mountains without safety ropes, while others won’t climb more than two steps up a ladder. Being very careful to follow all the recommendations might be a good strategy in a pandemic, but being over-cautious also means you miss out on opportunities.

Researchers in the UK have found that teaching children chess made them more willing to take prudent risks. In chess, you sometimes have to sacrifice a piece to gain a decisive advantage. Chess was a safe environment for the children to experiment with risk – the worst thing that could happen was that they lost the game.

If you are being over-cautious in your life, find some place where you can practice taking small risks. You might even take up chess.

Risk Aversion

The U.S. has stopped distributing the Johnson & Johnson vaccine. It has been given to more than 7 million people, and there have been six reported cases of blood clotting – fewer than one case per million doses. Here in Denmark, we have stopped giving the AstraZeneca vaccine because of one similar case. That is not risk management, that is risk aversion.

Risk management is one of the basic leadership tasks. The leader has to decide if the benefit of a certain decision is worth the risk of something bad happening. If we could calculate the exact probability and the exact impact, risk management would be a purely mathematical exercise. But since both probability and impact are only vaguely known, the leader has to use his or her experience, evaluate contrasting opinions, and make the call.
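If we actually could quantify both, the arithmetic would be trivial. Here is a minimal sketch in Python of what that purely mathematical exercise would look like – all the numbers are hypothetical, purely for illustration:

```python
# A minimal sketch of risk management as expected value, assuming we
# could quantify everything. In practice, we can't, which is the point.
# All numbers below are hypothetical and for illustration only.

def expected_loss(probability: float, impact: float) -> float:
    """Expected loss of an adverse event: probability times impact."""
    return probability * impact

benefit = 1_000_000      # hypothetical value gained if we proceed
p_failure = 0.001        # hypothetical probability of the bad event
impact = 200_000_000     # hypothetical cost if the bad event occurs

risk = expected_loss(p_failure, impact)
print(f"Expected loss:      {risk:,.0f}")            # 200,000
print(f"Net expected value: {benefit - risk:,.0f}")  # 800,000
```

The hard part is that neither the probability nor the impact is ever known with that precision, which is why the call falls to a leader rather than to a spreadsheet.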

There is a classic short story by Stephen Leacock called “The Man in Asbestos.” It is from a time when fire-resistant asbestos was considered one of the miracle materials of the future. The narrator travels to the future to find a drab and risk-averse society where aging has been eliminated together with all disease. Machines produce everything anybody needs. Since everybody will live forever, barring accidents, railroads and cars are outlawed as too dangerous. Nobody needs to go anywhere, and nobody does. In this future, everybody has everything they need and lives forever, but the narrator is appalled at the consequent stagnation.

That story was written in 1911 but was very prescient. We have since eliminated many risks and have increased our standard of living immeasurably. And we are less and less willing to accept any risk.

A leader accepts the risk and reaps the benefit. But our decisions are increasingly influenced by experts who point out the dangers. If you have dedicated your life to immunology, you know what the risks are. From the viewpoint of the immunologist, it is safest to lock everybody down until everyone is vaccinated. A political leader takes that input together with input from economists and other experts about the costs of lockdown and makes a leadership decision.

In organizations, the equivalent of the immunologist resides in legal, compliance, QA, risk management, or validation departments. They point out all the risks – children might swallow our product, we might get sued, we might have our operating license revoked. The larger the organization, the more of these departments of innovation prevention you will have. It takes courageous leadership to overrule the objections of the naysayers. The reason smaller organizations are able to out-innovate larger ones is that they can spend their leadership time on innovation and growth instead of on fighting organizational units dedicated to preserving the status quo.

As an IT leader, it is your job to make sure your organization doesn’t get paralyzed by risk aversion.

Risk Aversion

In this episode of Beneficial Intelligence, I discuss risk aversion. The U.S. has stopped distributing the Johnson & Johnson vaccine. It has been given to more than 7 million people, and there have been six reported cases of blood clotting. That is not risk management, that is risk aversion.

There is a classic short story from 1911 by Stephen Leacock called “The Man in Asbestos.” In it, the narrator travels to the future to find a drab and risk-averse society where aging has been eliminated together with all disease. People can only die from accidents, which is why everybody wears fire-resistant asbestos clothes, railroads and cars are outlawed, and society becomes completely stagnant.

We are moving in that direction. Large organizations have departments of innovation prevention, often called compliance, risk management, or QA. It takes leadership to look at the larger benefit and overrule their objections. Smaller organizations can instead spend their leadership time on innovation and growth.

As an IT leader, it is your job to make sure your organization doesn’t get paralyzed by risk aversion.

Contingency Plans

Last week’s episode of my podcast Beneficial Intelligence was about contingency plans. Texas was not prepared for the cold, and millions lost power. The disaster could have been avoided, had the suggestions from previous outages been implemented. But because it rarely gets very cold in Texas, everybody decided to save money by not preparing their gear for winter. At the same time, Texans have decided to go it alone and not connect their grid to any neighbors.

In all systems, including your IT systems, you can handle risks in two ways: You can reduce the probability of the event occurring, or you can reduce the impact when it occurs. For IT systems, we reduce the probability with redundancy, but we run into Texas-style problems when we believe the claims of vendors and fail to prepare for the scenario when our redundant systems do fail. 

Texas did not reduce the probability, and was not prepared for the impact. Don’t be like Texas.

Contingency Plans

This week’s episode of my podcast Beneficial Intelligence is about contingency plans. Texas was not prepared for the cold, and millions lost power. Amid furious finger-pointing, it turns out that none of the recommendations from the report after the last power outage were implemented, and neither were the suggestions from the report after the 1989 outage.

As millions of Texans turned up the heat in their uninsulated homes, demand surged. At the same time, wind turbines froze. Then the natural gas wells and pipelines froze. Then the rivers from which the nuclear power plants draw their cooling water froze. And finally the generators at the coal-fired plants froze. They could burn coal, but not generate electricity. You can build wind turbines that will run in the cold, and you can winterize other equipment with insulation and special winter-capable lubricants. But that is more expensive, and Texas decided to save that money.

The problem could have been solved if Texas could get energy from its neighbors, but it can’t. The US power grid is divided into three parts: Eastern, Western, and Texas. Texas decided to go it alone but apparently ignored the risk that comes with standing alone.

In all systems, including your IT systems, you can handle risks in two ways: You can reduce the probability of the event occurring, or you can reduce the impact when it occurs. For IT systems, we reduce the probability with redundancy. We have multiple power supplies, multiple internet connections, multiple servers, replicated databases, and mirrored disk drives. But we run into Texas-style problems when we believe the claims of vendors that their ingenious solutions have completely eliminated the risk. That leads to complacency where we do not create contingency plans for what to do if the event does happen.
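As an illustration of the two levers, here is a minimal Python sketch: redundant endpoints lower the probability of a failure, and an explicit contingency path lowers the impact when the redundant systems fail anyway. The endpoints and the cache file are hypothetical, not a real API.

```python
# A minimal sketch of both risk levers for a hypothetical service.
# The URLs and the cache file are illustrative assumptions.
import urllib.request

PRIMARY = "https://primary.example.com/data"
STANDBY = "https://standby.example.com/data"  # redundancy: lowers probability

def fetch(url: str, timeout: float = 2.0) -> bytes:
    """Fetch data from one endpoint, failing fast on trouble."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read()

def read_from_local_cache() -> bytes:
    """Hypothetical stand-in for whatever 'degraded mode' means for you."""
    with open("last_known_good.json", "rb") as f:
        return f.read()

def fetch_with_contingency() -> bytes:
    for url in (PRIMARY, STANDBY):   # try the redundant systems first
        try:
            return fetch(url)
        except OSError:
            continue
    # Contingency plan: lowers the impact when all redundant systems
    # fail at once. Degrade gracefully instead of going dark.
    return read_from_local_cache()
```

The point is not the specific mechanism but that the degraded mode is designed and written down before the event, not improvised during it.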

Texas did not reduce the probability, and was not prepared for the impact. Don’t be like Texas.

Listen here or find “Beneficial Intelligence” wherever you get your podcasts.

Risk and Reward

Last week’s episode of my podcast Beneficial Intelligence was about risk and reward. Humans are very good at calculating risk and reward. That means we will do what is best for us, even if it is not the best for the company.

It is easy to create incentives for being fast and cheap, but hard to create good incentives for quality. That’s why we use incentives for speed and cost but rely on QA procedures to ensure quality.

Incentives almost always win over procedures. As CIO, you need to make sure there are also incentives for quality. If not, you can be sure that your procedures will be circumvented, and corners will be cut.

Risk and Reward

This week’s episode of my podcast Beneficial Intelligence is about risks and rewards. Humans are a successful species because we are good at calculating risks and rewards. Similarly, organizations are successful if they are good at calculating the risks they face and the rewards they can gain.

Different people have different risk profiles, and companies also have different appetites for risk. Industries like aerospace and pharmaceuticals face large consequences if something goes wrong and have a low risk tolerance. Hedge funds, on the other hand, take big risks to reap large rewards.

It is easy to create incentives for building things fast and cheap, but it is harder to create incentives that reward quality. Most organizations don’t bother with quality incentives and try to ensure quality through QA processes instead. As Boeing found out, even a strong safety culture does not protect against misaligned incentives.

As an IT leader at any level, it is your job to consider the impact of your incentive structure. If you can figure out a way to incentivize user friendliness, robustness, and other quality metrics, you can create a successful IT organization. If you depend on QA processes to counterbalance powerful incentives to ship software, corners will be cut.

Listen here or find “Beneficial Intelligence” wherever you get your podcasts.