The Relative Sophistication Of A Tool Scales With The Skills Required To Wield It

I’m thinking out loud here as a means to explore some stuff that isn’t yet entirely coherent in my head.. so I reserve the right to meander and/or be entirely wrong.

Since Enterprise IT became a thing, organisations have been purchasing software to improve collective productivity. As those tools have become more sophisticated, so too have the skills required to use them effectively.

I’ve been on training recently to use a business intelligence tool. I’ve got some previous experience of working with data and writing SQL queries. Even so, I found the tool largely unintuitive.

Is it down to poor design? Possibly. Enterprise IT does not have a great track record on this front. In some circles, it’s almost a badge of honour or a rite of passage if you can wrestle with the interface and win.

I do wonder whether it’s also down to the scale that the tool is designed to work at. There are layers of things between me grabbing the data and me seeing a report. Those layers are there to help impose order on chaos. Except, the chaos only really appears if you’re working with this thing at a large scale (hundreds or thousands of data sources, queries, reports, users). So at a smaller scale these additional layers feel confusing or arbitrary in relation to my actual needs.


  • Do we match the complexity of the problem to the sophistication of the solution? (*1)
  • Do we match the investment in the software with the investment in skills required to use it effectively?

Example.. Microsoft Dynamics. How many organisations invest in Dynamics and then find that adoption falters because they’ve been unable to use it effectively to solve meaningful problems, and/or because the skills aren’t readily available to do so?

I’m sure this example is interchangeable with many other things (SharePoint, Salesforce, SAP etc. etc.)


  1. The sophistication of the tool should match the complexity of the problem(*2) being addressed.
  2. The sophistication of the tool should align with the investment in the skills required to use it.


The core problem with some (a lot of?) Enterprise IT is that it’s often sold or purchased as a magic wand. What’s your problem? It can do whatever you need! It’s agile! A sure-fire route to productivity gains and cost savings.

Financial incentives result in tools that are designed to accommodate ever larger customers with demanding needs. For smaller customers this results in ‘bloat’: a steady march of features and tweaks that are disconnected from the context in which those customers actually use the tool.

Here’s a good real world example.. an anguished wail of despair from an Adobe Creative Cloud user.

I also feel that requirements are sometimes deliberately over-specified as a means of future-proofing the investment. I can see a certain amount of logic in this, but at some point rampant optimism about the breadth and speed of progress could actually work against you in doing the basics well. Perhaps everything should have a very fixed shelf life and realistic expectations in relation to where you’re starting from?

Is there something to be said for choosing a tool that is just sophisticated enough for your current needs.. and no more! For me, this makes sense in the context of rapid technological change where plans should be flexible and horizons should be short.


What are your capabilities as an organisation? If you’ve just procured a whizzy all-singing, all-dancing system, do you have the skills to make it succeed past the initial implementation period?

That means either having enough cash to buy the required skills in from outside for as often as you need them or hiring/upskilling your own employees.

Ongoing consultancy is often viewed as expensive because it’s OPEX not CAPEX. Typically the responsibility for making things work is given over to existing employees. The more sophisticated the tool, the steeper the learning curve for those employees.

  • Is this someone’s whole job, or are they expected to learn it on the side of something else?
  • Is knowledge about this tool widely available or is it niche/proprietary?

Adjust the learning curve accordingly based on your answers.

What happens when someone leaves? How easy is it to replace those skills? Depending upon how niche/proprietary the tool is, finding a person to drop into the role and pick up where the previous incumbent left off can be orders of magnitude more difficult.

Perhaps it’s fair to say that any tool(*3) has a hidden associated cost in terms of the skills required to operate and make the best use of it. Failure to meet that cost hastens the conversion of an asset into technical debt.

So, perhaps there’s something here around..

  • Choosing tools with open, widely available knowledge.
  • Having the sustained financial backing to fund the required skills through consultancy or training.
  • Giving people the time and space to learn and succeed.

*1 Always worth caveating that there’s a long, storied history of employing technical solutions in search of an actual problem.

*2 Is the problem really complicated or complex? Does it just need untangling first for a much simpler solution?

*3 I’m not entirely happy with the word ‘tool’ — is ‘software’ better? Is it just software we’re talking about?
