Somebody Else’s Problem

Things that are Somebody Else’s Problem (SEP) are invisible. Douglas Adams famously joked about this in “The Hitchhiker’s Guide to the Galaxy,” but the effect is serious and real.

For example, local British politicians were falling over each other trying to attract data centers. They were focusing on the cachet of having Google or Facebook in their town, and the half-dozen jobs for the electricians and plumbers maintaining them. Supplying these energy-hungry behemoths with power was Somebody Else’s Problem.

Now they have so many data centers in West London that their electrical grid is overloaded, and they won’t be able to build more housing until they have upgraded their main cables. That’ll be sometime in the 2030s.

As an IT leader, it is your job to ensure that each team knows the problems they might cause for other parts of the organization.

Cloud Services Leak Your Data

Big Brother is watching what you write. Chinese users working on the local equivalent of Google Docs discovered that there are some things you can’t write. An author was locked out of the novel she was writing, with the system telling her that she was trying to access “sensitive content.” It didn’t matter that she had written it herself.

Of course, Google would never lock you out of your Docs or Sheets. And they claim they don’t look at your documents to sell you ads, though plenty of users report spooky coincidences. The default setting in Microsoft products is to enable “Connected Experiences.” That means your content is being sent to Microsoft servers for analysis. Microsoft claims no human looks at it.

Do you have guidelines and technical measures in place to prevent sensitive data leaking out of your organization through cloud services?

The Tolstoy Principle in Action

This is what failure looks like: 50% one-star reviews. The other half are five-star reviews. Assuming these are not all from the app developers themselves, the app apparently can work. It just didn’t work for me, or for many others.

I call this the Tolstoy principle: All successful apps are alike; each unsuccessful app is unsuccessful in its own way. The end-user does not care that 98% of your back-end infrastructure is running. They care that they can complete their task. And if one critical component fails, your app is a failure. Like this one from my local supermarket chain.
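The arithmetic behind this is unforgiving: if completing a task touches many components in series, the chance of success is the product of each component’s availability. A minimal sketch (the numbers are illustrative, not measurements of any real system):

```python
def composite_availability(per_component: float, n_components: int) -> float:
    """Probability a task succeeds when it must pass through
    n_components, each independently available per_component of the time."""
    return per_component ** n_components

# Thirty components at "three nines" each still fail the user
# roughly 3% of the time end to end.
print(f"{composite_availability(0.999, 30):.3f}")  # prints 0.970

# And one hard-down critical component takes the whole task to zero,
# no matter how healthy the other 98% of the infrastructure is.
print(composite_availability(0.999, 29) * 0.0)  # prints 0.0
```

The point of the sketch: per-component health numbers can all look fine while the end-to-end number the user experiences does not.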

When you build systems, is all the attention lavished on a cool front-end app? Unsexy back-end services are equally important.

Are You Making a Fool of Yourself?

You’d think that an official digital ID project would be subject to a careful security review. Not in Australia. The government of New South Wales has rolled out a digital driver’s license that contains no fewer than five different security issues. Together, these make it trivially easy to alter any data on your ID, effectively creating a fake ID. That is good news for Australian identity thieves and underage would-be drinkers. The official response is “it’s illegal to make changes to your ID.”

Are there any embarrassing security oversights in the products you roll out? How would you know?

Don’t Use Illegal Defaults

You would never implement a system programmed to break the law, would you? The municipalities in Denmark did. If you get social security in Denmark, you are supposed to work at least 225 hours per year if you can. Those who can, and don’t, get less money. Those who cannot work are exempt from this deduction rule. The IT system has been programmed to automatically start reducing benefits unless a caseworker remembers to manually keep pushing the deduction date into the future. This means the municipalities save money by illegally reducing benefits for those citizens who do not have the energy to complain.

When you automate a process, your users will quickly come to accept the decision of the system. Make sure you have good defaults. At the very least, make sure they are in accordance with the law.
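To make the principle concrete, here is a sketch of a lawful default (the rules, amounts, and field names are hypothetical, not the actual Danish system): a deduction happens only after a caseworker explicitly records that the citizen is able to work. “Not yet assessed” must never trigger a reduction.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BenefitCase:
    # None means "not yet assessed" -- the lawful default is no deduction.
    can_work: Optional[bool] = None

def monthly_benefit(case: BenefitCase, base: float, deduction: float) -> float:
    # Deduct only on an explicit, recorded assessment. An absent or
    # negative assessment leaves the benefit untouched, so a forgetful
    # caseworker cannot cause an unlawful reduction.
    if case.can_work is True:
        return base - deduction
    return base

print(monthly_benefit(BenefitCase(), 1000.0, 225.0))               # prints 1000.0
print(monthly_benefit(BenefitCase(can_work=True), 1000.0, 225.0))  # prints 775.0
```

The design choice is simply inverted from the failing system: the automated action requires a positive human decision, rather than requiring a human to keep postponing it.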

Control Your Tools

Do you know which tools your developers are using? Many of them are using low-code/no-code (LCNC) tools, whether officially sanctioned or not. The latest State of the Developer Nation report from SlashData delves into LCNC tool usage and finds that 46% of developers are using them. 12% of professional developers use them for more than half of their work, but developers with 10+ years of experience shun them.

Developers can pick up cloud-based low-code/no-code tools without anybody noticing and deploy production applications using free-tier functionality. By the time IT management figures out what is happening, you might have dozens of small and medium-sized applications running.

You cannot prevent these tools from being used, but you can get your developers to agree on one tool and make that the officially sanctioned low-code/no-code platform. That means you can manage all the applications on one platform, and developers can help each other use the tool. Trying to ignore these tools does not make them go away.

(image source: SlashData State of the Developer Nation, 22nd edition)

Are You Still Building Things That Don’t Scale Automatically?

There is no excuse for a modern system to be slow. I’m at a 5,000-person conference this week, and their official networking app is totally overloaded and almost unresponsive.

You might still have legacy systems with scalability issues, but everything you build today should be cloud-native. As a first-class citizen of the cloud, a modern app has access to automatic scaling, monitoring, robustness, and many other features.

Ask the architects building new systems in your organization how the application will scale. If the answer is that it will scale automatically, good. If the answer is that somebody has to notice response times increasing and intervene manually, you are still building to the old paradigm.

Do You Understand What You Are Running?

Don’t run systems you don’t understand. Some people had placed billions of dollars into a cryptocurrency called TerraUSD. They were told this was a “stablecoin” that would keep a value of $1. Underlying this claim was a clever algorithm that interacted with investors and another cryptocurrency in complex ways. Until its magic no longer worked and the supposedly stable TerraUSD dropped 80%. Trading in it is now halted.

In the global financial crisis of 2008, people had invested in complex financial instruments that they didn’t understand. Many billions were lost and large institutions went bankrupt. The banks that came out of the crisis unscathed were those that had stuck to simple banking products everyone could understand.

Take a look at your IT landscape. Can you find somebody who understands your operating infrastructure? Or have generations of DevOps engineers just googled problems and tweaked your Kafka and Kubernetes configuration until it somehow seemed to work?

Why Employee Surveillance Doesn’t Work

Do you know what a “mouse jiggler” is? Your most innovative employees do. It is not a device to shake a rodent in a cage. It is a small USB device that sends random mouse movements to a computer.

Who would want such a thing? Employees subjected to tracking software, that’s who. With the mouse moving, the software will record “productivity.” The pandemic led to a boom in surveillance tech, euphemistically called “employee productivity software.” As workers return to the office, that tech has not been removed from corporate laptops. But workers are pushing back, in accordance with Newton’s Third Law of IT systems: Whenever the organization implements a policy, the employees will implement an equal and opposite workaround.
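It is easy to see why the workaround wins. A sketch of a naive activity tracker (hypothetical, but representative of tools that equate input events with work): a minute counts as “active” if any input event falls inside it, so one synthetic event per minute scores a perfect hour with nobody at the keyboard.

```python
def active_minutes(event_timestamps, start, end):
    # Naive "productivity" metric: a minute is active if at least one
    # input event (timestamp in seconds) lands inside it.
    active = set()
    for t in event_timestamps:
        if start <= t < end:
            active.add(int((t - start) // 60))
    return len(active)

# A mouse jiggler emitting one event per minute for an hour:
jiggle = [minute * 60 + 1 for minute in range(60)]
print(active_minutes(jiggle, start=0, end=3600))  # prints 60
```

The metric measures event frequency, not work, and event frequency is trivially forgeable, which is exactly why the devices sell.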

Techno-optimists keep trying to replace humans with technology. There are some places where that works. Replacing human leadership with surveillance technology is one of the places where this strategy doesn’t work.

Do You Trust Amazon?

The default is no trust. You shouldn’t trust a random USB stick you pick up in the parking lot, and your customers and users don’t trust you. If you want trust, you have to be transparent in a way your users understand and appreciate.

Somewhere in the Amazon terms & conditions it probably says in illegible legalese that everything you say to your Alexa smart speaker can and will be used against you. Researchers have shown that your interactions with Alexa are reported to dozens of advertisers, and Amazon says the research is flawed. Who do you believe?

Amazon have hundreds of lawyers and are probably within the law. The problem is that they are not complying with users’ expectations. If you want any kind of goodwill from your users and customers, you have to meet their actual expectations. Hiding behind reams of legalese doesn’t cut it.