Vitaly Sharovatov

This is a response to Bas Dijkstra's post on shortcuts in achieving quality.

I wholeheartedly agree with Bas Dijkstra's perspective, and I'd argue that low-quality software is fundamentally a social issue rather than a purely technical one.

Here’s what I mean by that:

1️⃣ We need continuous work on improving quality. This means dedicating time and effort to keep the cost of changing and maintaining our systems growing linearly rather than exponentially. Without that continuous investment, the cost of change and maintenance compounds with every release, and we end up in tech-debt hell.
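To make the difference concrete, here's a minimal back-of-the-envelope sketch in Python. All the numbers (a 2% linear overhead versus 7% compounding growth per release) are invented purely for illustration; the point is the shape of the curves, not the specific values.

```python
# Back-of-the-envelope illustration (all numbers are invented for this sketch):
# compare the cost of a typical change over successive releases when quality work
# keeps that cost roughly flat versus when neglected debt compounds it each release.

BASE_COST = 1.0          # relative cost of a typical change in release 1
LINEAR_INCREMENT = 0.02  # small, steady overhead when quality is maintained
DEBT_GROWTH_RATE = 0.07  # per-release compounding when quality work is skipped
RELEASES = 40

with_quality_work = [BASE_COST + LINEAR_INCREMENT * n for n in range(RELEASES)]
without_quality_work = [BASE_COST * (1 + DEBT_GROWTH_RATE) ** n for n in range(RELEASES)]

print(f"Cost of a single change in release {RELEASES}:")
print(f"  with continuous quality work:    {with_quality_work[-1]:.1f}x")
print(f"  without (compounding tech debt): {without_quality_work[-1]:.1f}x")
print(f"Total effort spent on changes over {RELEASES} releases:")
print(f"  with continuous quality work:    {sum(with_quality_work):.0f} units")
print(f"  without:                         {sum(without_quality_work):.0f} units")
```

With these made-up numbers, the compounding case costs roughly three to four times more total effort over 40 releases, and the gap only widens from there.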

There are studies (#1, #2) showing that companies which failed to work on improving quality when it was needed now spend from 40% to 90% of their entire IT budget on simple maintenance.

I believe that one of the main factors which led to this state of affairs is what Deming deemed "one of the deadly diseases of Western management": a focus on short-term profits.

I see three reasons why managers have this focus:

I consider all of these reasons to be social: they shape how people behave, and the result of those behaviors is a significant lack of continuous work on quality in many companies.

2️⃣ We need continuous learning. This is the practice of getting better at the work itself, so that our continuous work on improving quality becomes cheaper.

This is grounded in simple economics: as we enhance our proficiency in continuous quality improvement, the associated costs decrease.
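To illustrate the idea, here is a small sketch assuming a classic experience-curve effect: each time a team doubles the number of times it has practised an improvement activity, the effort for the next round drops by a fixed percentage. The 15% learning rate is an assumption for illustration, not a measured value.

```python
import math

# Illustrative sketch only: assume an experience-curve effect where each doubling
# of accumulated practice (say, rounds of refactoring or test-writing) reduces the
# effort of the next round by a fixed percentage. The figures are assumptions.

INITIAL_EFFORT = 10.0   # effort units for the first round of improvement work
LEARNING_RATE = 0.15    # assumed reduction per doubling of accumulated practice

def effort(nth_round: int) -> float:
    """Effort for the nth round under the assumed experience curve."""
    exponent = math.log(1 - LEARNING_RATE, 2)   # negative exponent
    return INITIAL_EFFORT * nth_round ** exponent

for n in (1, 2, 4, 8, 16, 32):
    print(f"round {n:>2}: {effort(n):.1f} effort units")
```

Under these assumptions, the 32nd round of improvement work takes less than half the effort of the first one.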

As the world changes at an ever-growing speed, learning can't be a one-off project; it must be continuous, or even lifelong. (#3)

However, there are still plenty of companies that inhibit continuous learning in many ways. (#4) The factors that undermine continuous learning are also social: constant stress from deadlines and sprints, blame culture, fear of making mistakes, or a managerial focus on constant 100% "utilization".

3️⃣ We must not rely on tools promising to do the work for us.

Social problems are deeply embedded in cultural, economic, and political contexts that tools alone simply cannot reshape or address.

Most tools are designed to be operated by a human: a hammer can't drive a nail into a surface on its own, and a test automation framework can't write the tests that cover your business logic for you.

Tools are evolving, sure, but more sophisticated tools can also fail in more sophisticated ways, as in the famous COMPAS scandal, where an "AI" risk-assessment algorithm used in sentencing turned out to be racially biased (#6). Still, the current scientific consensus is that even the most advanced AIs can't solve social problems on their own; they need to be USED by professionals as part of solving them. (#7)

Addressing the social aspects is what enables the technological improvements to follow.

References:

  1. https://www.sciencedirect.com/science/article/pii/S2210832716301260
  2. https://www.altexsoft.com/whitepapers/legacy-system-modernization-how-to-transform-the-enterprise-for-digital-future/
  3. https://www.sciencedirect.com/science/article/abs/pii/B9780080448947000166
  4. https://qase.io/blog/quality-knowledge-overlap/
  5. https://www.sciencedirect.com/topics/social-sciences/agency-theory
  6. https://www.luc.edu/digitalethics/researchinitiatives/essays/archive/2018/sentencebynumbersthescarytruthbehindriskassessmentalgorithms/