Cybersecurity must be risk-based

Good cybersecurity is based on risk analysis. It is not based on locking down everything as tightly as you can.

I’ve been discussing the consequences of the war in Ukraine with several cybersecurity experts. Some argue that if you have to strengthen your defense now, it means it was too weak before. That is a fundamental misunderstanding of security. Security, like availability, reliability, and many other aspects of your technology, is a trade-off. Higher security costs more money and slows your organization down. You don’t always need maximum security. You need a security level that is appropriate to your risk.

Right now, cyber-warriors and vigilantes are firing indiscriminately in all directions. You might get caught in the crossfire even if you have nothing to do with either side in the war. That’s why your risk has increased and you need to strengthen your cybersecurity posture. When the war is over, you can reassess your risk.

People and Material

“In war, three-quarters turns on personal character and relations; the balance of manpower and materials counts only for the remaining quarter.” Napoleon said that in 1808, and it applies equally in Ukraine today.

It also applies in other human endeavors. You can see organizations performing well with antiquated IT systems, and organizations making a mess of their customer service even though they have the latest and greatest cloud services. Simply rolling out new technology without considering people, organization, and processes will not improve your organization.

Do People Believe You?

How is your credibility balance? Will your employees, partners, and customers believe you in a crisis?

The information war accompanying the kinetic war has been resoundingly won by Ukraine. Many of the stories coming out of the conflict zone are false, but Ukrainian stories are given the benefit of the doubt while Russian stories are immediately disbelieved.

Honest communication adds to your credibility balance. Trying to sweep your failures under the carpet and hitting your critics with spurious DMCA takedowns and questionable lawsuits detracts from it. If you are in a credibility deficit when the next crisis hits, it will become orders of magnitude worse.

Don’t Ask Half Questions

Asking half questions leads to dangerous outcomes. We just saw an example when irresponsible Reuters pollsters looking for a scoop simply asked Americans, “Should NATO establish a no-fly zone over Ukraine?” They got a resounding 74% approval.

Another pollster asked the question with the qualifier “knowing that this will lead to direct war with Russia” and support dropped to 34%.

A complete question asks “are you willing to accept this downside to gain this upside?” Organizations get an idea, focus on the upside, take a cursory glance at the downside, and then make erroneous or even disastrous decisions. Who has the job of ensuring the downside is examined as thoroughly as the upside? You might need someone external to provide this.

There is Always an Alternative

There is always an alternative. Not looking for it is either intellectual laziness or willful manipulation. Margaret Thatcher, Prime Minister of the UK for a decade, was known among friends and enemies alike as “TINA” due to her usual insistence that “There Is No Alternative.”

As an IT leader, you are bombarded with requests to make specific technical decisions. Many of these are attempts to railroad you into choosing a technology that the team would like to play with and put on their CVs. When presented with a single option, ask for more. When one of the options is the obvious slam dunk, examine what has been left out of the presentation of the others. Binary selections are common in computer programming. In the real world, there are always many choices.

Are you Monitoring Important Systems?

New York is replacing its payphones with LinkNYC access points providing free calls, 911 calls, free WiFi, charging, and more. You would think such a system would warrant professional monitoring. Nevertheless, some of these devices just show a blue screen of error messages followed by a Linux login prompt.

  • Monitoring of crucial systems must include an automated mitigation action and reporting to a 24/7 operations center.
  • Monitoring of important systems needs immediate alerting to staff on call.
  • Monitoring of normal systems only needs to log a trouble ticket to be addressed by regular staff during working hours.
  • Low-priority systems do not need active monitoring.
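The tiers above can be sketched as a simple monitoring policy. This is an illustrative sketch in Python; the tier names and action strings are my own assumptions, not any particular monitoring product’s API:

```python
from enum import Enum

class Tier(Enum):
    CRUCIAL = 1    # automated mitigation + report to 24/7 ops center
    IMPORTANT = 2  # immediate alert to on-call staff
    NORMAL = 3     # trouble ticket for regular working hours
    LOW = 4        # no active monitoring

def handle_failure(system: str, tier: Tier) -> list[str]:
    """Return the actions a failure on this system should trigger."""
    actions = {
        Tier.CRUCIAL: ["run_automated_mitigation", "report_to_ops_center"],
        Tier.IMPORTANT: ["page_on_call_staff"],
        Tier.NORMAL: ["open_trouble_ticket"],
        Tier.LOW: [],
    }
    return actions[tier]

# A crucial system failing triggers both mitigation and ops-center reporting.
print(handle_failure("payment-gateway", Tier.CRUCIAL))
```

The point of writing the policy down, even this crudely, is that every system must be assigned a tier explicitly; a system with no tier is a decision nobody made.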

It seems these kiosks are not as important to the company running the system as they were to the Mayor who promised them.

Does every system on your central system list have a monitoring priority? When was the last time you checked with the person who has technical responsibility about what monitoring is actually in place?

Fancy or Usable?

Do you want something that works or something that looks fancy? Sometimes, these two objectives come into conflict. Too often, IT professionals can’t imagine a solution that does not involve touchscreens and mobile apps.

I’m staying in an upscale hotel in New York this week, and the control panel for heating and lighting is definitely old-school. But it works. And it can be understood and operated by every age group likely to frequent the hotel.

Meanwhile, back in Denmark, we are currently rolling out a new central authentication system. You will have to figure it out in order to do online banking or access public services. It was designed by tech-savvy young people and is very fancy. Too bad it has left hundreds of thousands of non-computer-literate citizens desperately calling the understaffed phone helpline.

Are you sure the solutions you roll out have been tested by the entire target audience?

Check your defenses

Your risk profile just changed dramatically. You might think the war in Ukraine will not affect you, but your risk is higher than you think.

Do you know who ultimately writes the code your vendor delivers? Your contract is with a large system integrator in your own country. They outsource actual coding to several subcontractors, who sub-subcontract until the actual code is written by a team of three people in a basement in Kyiv. And right now, an adversary with nation-state resources is out to destroy the Ukrainian software industry along with the rest of the country.

Remember the attack that hit Maersk Line a few years ago? They are the world’s largest container shipping company and have strong cyber defenses. Nevertheless, they suffered a two-week outage and lost $300 million because an attack on their Ukrainian subsidiary got through their defenses.

Revisit your risk management plan. You need stronger network security toward all your suppliers.

Shooting the messenger

Even though the clueless Governor of Missouri tried to shoot the messenger, he missed. Last year, a reporter published his findings that private data on more than 100,000 teachers was available to anyone who knew how to click “View Source” on a web page. The Governor held a widely ridiculed press conference where he vowed to prosecute the “hackers” who had told the world about the incompetence of the state IT department.

A thorough report by law enforcement now roundly exonerates the journalist. It also exposes that personal information on more than half a million people had been available for a decade to anyone who cared to look.

Even professional IT organizations occasionally fail like the state of Missouri did here. You have a simple little system, you are under schedule pressure, and you forget to book time with the security team. So you roll it out without a security review. The antidote is to maintain a complete systems inventory with a field for the name and email of the person who did the security review. That will show you whether this step was skipped, and allow you to quickly investigate any alleged security issues before you start shooting at the messenger.
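Such an inventory check can be automated in a few lines. A minimal sketch in Python, assuming each inventory entry is a record with a `security_reviewer` field (the field name and example systems are hypothetical):

```python
# Hypothetical systems inventory: every entry should name its security reviewer.
inventory = [
    {"system": "teacher-portal", "security_reviewer": ""},
    {"system": "payroll", "security_reviewer": "jane.doe@example.com"},
]

def unreviewed(systems: list[dict]) -> list[str]:
    """List systems whose security-review field is missing or empty."""
    return [s["system"] for s in systems if not s.get("security_reviewer")]

# Any system this prints went live without a recorded security review.
print(unreviewed(inventory))
```

Run something like this on a schedule and the gap surfaces as a routine report, not as a press conference.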

User Blaming

The IT industry has its own version of victim blaming. I call it user blaming. That is what happens when you build an IT system without proper regard for the users’ reality. When the purported benefits do not materialize, the vendor points to the convoluted and impractical instructions given and claims that if only the users would follow the instructions, the system would work as advertised.

I was reminded of user blaming this weekend. I had worn out the burrs on my coffee grinder, and as is sadly often the case, a replacement part was more expensive than a new machine. Being a professional, I always read the instructions. They told me to clean the machine after each use. Since I only grind what I need, that would mean several cleanings a day. And the cleaning involved six steps: emptying out the beans, disassembling the grinder, washing everything in lukewarm water, cleaning the burrs with the supplied brush, and much more.

That is an abdication of responsibility. Just like when an IT vendor provides unrealistic and impossible-to-follow CYA instructions. Take responsibility. Build a quality product that works in real life.