Against Agile

The most widespread methodology for software development has backfired, swallowed by the bad management it tried to eradicate.


One of the most difficult things in software is naming things. It's a process that relies fundamentally on extending our language in creative ways. Astronaut, for example, comes from the Greek "astron" (ἄστρον), "star", and "nautes" (ναύτης), "sailor". Before humans could travel beyond the Earth, there was no compulsion to invent a word for it, and the one we came up with to talk about the Gagarins and the Armstrongs is a pretentious way to say that more humans have been on a ship than on the Moon.

Our most pressing problems belong to this category, what sociologists call cultural lag: being forced to slog through obsolete institutional habits to achieve our goals. They arise from trying to organise a supersonic, computer-driven, and nuclear world with horse-and-buggy ways of life, and they are the result of the tremendous acceleration of technological progress.

These strains are usually met in the only sensible way possible: people are constantly striving to bring institutions into closer relationship with reality.

Let me therefore warn you that it is not my intention to inform, or to establish some truth. What I want to do is change your attitude. I want you to sense chaos where at first you noticed an orderly arrangement of well behaved things and processes.

Paul Feyerabend, "Against Method"

Reacting to these forceful changes, most people furiously demand cures that, sometimes, are worse than the illness. But some of us are making efforts to achieve a relation of mutual acceptability between our institutions and anything that puts requirements on them.

Unfortunately, the task isn't simple. In practice, we are groping for some sort of harmony between an unfinished creation and a nebulous context: understanding the world we live in and designing appropriate institutions for it are two sides of the same coin. And, weirdly enough, as everyone who designs for a living has experienced, criticising is easy; what's difficult is saying it's done.


Like many professions, software engineering was started by a bunch of trailblazers who, independently of each other, began leveraging the newly invented computer to do their work faster and better. But with the professionalisation of writing code, our industrial society concentrated intensely on finding ways to develop software more efficiently. With the first software engineers came the first software development methodologies.

In 1970, Winston Royce published his influential article Managing the development of large software systems: concepts and techniques, in which he presented several project management models, including what we know now as waterfall, iterative, and agile.

Yes, Agile was invented in 1970. We simply didn't have a word for it.

In this article, Royce defines two essential steps in every software development project: analysis, and coding. The kicker is that you can only get away with doing just that for brick-and-mortar projects. An implementation plan to manufacture larger software systems, and keyed only to these steps, however, is doomed to failure, he said. Immediately after, he presented a more grandiose approach, read "Waterfall", where analysis and coding are preceded by two levels of requirements analysis, separated by a program design step, and followed by a testing step.

Royce, nevertheless, noticed something crucial: that as the project proceeds, the design becomes more detailed:

There is an iteration with the preceding and succeeding steps [...]. The virtue of all of this is that as the design proceeds the change process is scoped down to manageable limits.

Winston Royce

The riskiest thing for Royce was the lag between implementation and testing: if the testing phase is unsuccessful, a major redesign is required, effectively bringing the project back to square one.

To summarise: any Agile adherent will tell you that Waterfall is bad, and that if you aren't doing Agile, it follows that you are doing Waterfall. But from the very beginning of software development history, projects that break activities down into linear, sequential phases à la Waterfall, putting off testing to the later stages of the process, have been known to be risky. That's because the testing phase is where, for the first time, software is run rather than analysed, and many things in a project cannot be analysed. In other words, the paper that introduced the Waterfall model did so as a straw man, and not as a feasible way to develop software.


How do we come to terms with the fact that the story that "things were slow and now they're fast" is fictional? That preeminent figures in software development have been advocating incremental and iterative development all along, even in the very 1970 Waterfall paper?

I want you to consider this: that Waterfall was a misrepresentation of the ideas laid down by expert software engineers, made for the convenience of managers. People who think of software development as an assembly line, and expect something specific, in a specific timeframe. If that isn't the case, then why have we all been asked, in an Agile project, to produce or follow a detailed roadmap with milestones?

This alternative theory may explain why Agile projects are the way they are, as opposed to what a Professional Scrum Master™ will tell you.

A couple of years ago, I was interviewed for a job in which Agile wasn't just a way of doing things: it was an obsession. Tasks were moved deliberately on a huge whiteboard in the CEO's office, and people's workday was structured with clockwork precision. They had to work in pairs in 20-minute stints, forcing them to short-circuit their instincts to do anything but work. The office was designed in a way that would have made Jeremy Bentham proud, a sort of Panopticon where all the inmates, I mean software engineers, were under the careful vigilance of the CEO. It had all the Orwellian attributes that people despise about school, and no one said anything about "individuals over processes". That's when I came to terms with the idea that the reason why Agile has spread so wide and so fast is, again, the convenience of managers, adapted to a software industry where people are younger and less disciplined than they were in the 70s, and crave direction and pats on the shoulder. Agile is high school.


I concur with Whorf's view that language is not merely an instrument for describing events, but also a shaper of them. That grammar contains an ontology, a view of the world and the speaker's society and their role in it, which influences their perception, their reason, their memory of events and their testimony of them. I came to understand that Agile methodologies, such as Scrum or Kanban, are sufficiently accepted and have grown into sufficiently complex entities to be considered along the same lines as languages.

In the influential The Structure of Scientific Revolutions, Thomas Kuhn claimed that the history of science reveals proponents of competing paradigms failing to make complete contact with each other's views. Think, for example, of Michelson and Morley's experiment revealing cracks in the theory behind the existence of ether, and being swept under the rug as a result. These competing paradigms use different languages to address different problems, and communication across the divide is limited, if not impossible. Scientific progress, then, doesn't happen because of a healthy debate where the best ideas win, but rather because of evolutionary forces: the people who have 'good' ideas and accept them beyond question survive, rewriting the history of how their ideas eventually became dominant¹.

Indeed, even though "Waterfall" was simply a straw man methodology to be mocked and ridiculed, the development of Agile methodologies followed a similar pattern to that described by Thomas Kuhn: there is a foundational set of precepts (the proverbial "individuals and interactions over processes and tools...", and so on), which turns into a firmly set tradition that strongly resists change. This set of ideas emerged under the intolerant umbrella of an us-against-them mentality that aids widespread adoption². In the end, Agile was bound to dominate the software industry, regardless of its effectiveness.

Agile methodologies say more about their proponents and their worldview than they uncover 'better ways of developing software'. When someone uses the word 'agile', they are not simply invoking the technical definition, but also the whole infrastructure around it. An infrastructure that is only useful if it addresses the idiosyncrasies that are bound to be repeated in software projects even as technology changes. "Agile tools" is an oxymoron, and the consequence of the widespread adoption of tools like JIRA is that software engineers work in a place that is built as the best worst-case divergence from expectations: the only option that no one reacts strongly against, even if no one has any reason to like it either.

What I want is for you to realise that Agile has been preyed upon, repackaged and weaponised by the very ghosts that it was meant to counter. It has evolved to fulfil the worldview of managers, who cherry-pick the parts of Agile that align with their lowest managerial instincts, while leaving out the rest as TBD. That "working software over documentation" effectively means vague requirements, "customer collaboration" means shifting, unilaterally decided priorities, and "responding to change" means engineers have no say in the tasks that they work on. And, as a result, software development projects are prone to nonstop supervision, employee alienation, technical debt and scope creep.


The idea of a method that contains fixed and invariant principles for conducting the business of software development doesn't hold when confronted with the realities of human nature. There is not a single idea in the Agile manifesto, however vague, that is not violated at some point in an Agile project, and such violations are not accidental, the result of ignorance, or collateral effects of inattention. That is not necessarily bad news: these violations are sometimes necessary for the completion of the project³.

Implicit in Royce's argument about the differences between brick-and-mortar and enterprise projects is that scale matters when outlining how software development happens, and the obvious fact that scale changes with time implies that software development processes must change along with it.

It is clear then that the idea of a fixed methodology rests on naive views of software engineering and its social aspects. To those who look at the rich material provided by history, and who don't play games with it in order to please their lower instincts or their craving for intellectual security in the form of 'best practices', it will become clear that there is only one principle that can be defended under all circumstances and in all stages of software development.

It is the principle of anything goes.


  1. Once you twig this, it's hard not to see that political debates aren't about convincing anyone, but about prompting those who believe the same things to vote. It's all about survival.
  2. The original spread of Christianity in the Roman empire is likely to be the consequence of their intolerance. When you believe in several gods, one more isn't that big of a deal. When there is just one, believing in anything else is heresy. See Religious Toleration in Classical Antiquity, by Peter Garnsey.
  3. For example, it is both reasonable and absolutely necessary for a project to have the most critical parts of the organisation of the codebase designed upfront, which contradicts well-established Agile frameworks.