ÁD Studio

Blurring the lines between business strategy
and software engineering.

I, engineer

On what it means to engineer new things, and do science.
(1264 words)


I work in a nascent industry. My bachelor's degree had nothing to do with it, and I never earned a certification that qualified me for it. The only real obstacle between me and writing software that real people would use was a whiteboard interview. I am, though, in some sense of the word, an engineer.

Ironically, I recall looking at engineers with contempt in university. We were physicists, and there was this notion that physicists developed the equations that engineers used to design their edifices. We considered ourselves the secret of humanity's success, the triumph of thought, and tomorrow's unsung heroes. It has taken me several years to realise that this notion is simply false. It's just that engineers get sandbagged by historians.

In a research paper called Urgency, uncertainty, and innovation: Building jet engines in postwar America, Phil Scranton showed that jet engines were designed in a completely trial-and-error manner, without anyone truly understanding the underlying theory. The builders were, in fact, the first engineers: they knew how to tweak things and make the engine work. Theory is a byproduct1. Scranton was polite enough to point out that this arises when innovation is messy, and drew a distinction between that and "more familiar analytic and synthetic innovation approaches", almost insinuating that the latter are the norm. They're not.

In The Knowledge Machine, author Michael Strevens argues that there's something that didn't exist in Ancient Greece and does now: "the knowledge machine that we call modern science". It's hard to object to the fact that Socrates had no MRIs, nor any means of transport faster than horses. But the idea that we have modern science and they didn't seems like a false dichotomy to me. The Romans were able to make the Lycurgus cup, and anybody who builds things for a living, as opposed to just writing about them, knows that you do not get a cup that shows different colours depending on whether light is passing through it unless you go through an intentional process of iteration and refinement. There may have been some initial accident that ignited the whole process, but engineering a result this good takes a long time. That is something the Romans of 400 AD had; and it's not conceptually different from the science that brought you the jet engine.

I'm not denying that academic science is behind some practical technologies. But there is a body of know-how that is transmitted from master to apprentice, and only in such a manner, and the role of formal knowledge is overvalued precisely because it is highly visible.


I have a long-standing issue with the word "tech". It arose a couple of years ago, when the then-CEO of Goldman Sachs, Lloyd Blankfein, repeated emphatically that the investment bank he headed was a tech company. Now, I don't have an issue with a CEO trying to persuade his employees that they already work at a FAANG-like company. But Goldman Sachs isn't a tech company, and neither is any of the FAANGs.

Facebook isn't a tech company; it's a media company, like The New York Times. And so is Google. Apple is a manufacturer, and Netflix is no different from Disney or Warner Bros2. And Amazon is a big conglomerate, like General Electric used to be. Technology has no more influence on their core businesses than it has on those of any other company in their sectors. "Tech" is not an industry, but rather an aspect of every industry.

How computers have fit into our lives is peculiar. Compared to other technological advances, it is about time they sank into society's substrate. We don't have office companies, or human companies. We just have companies. And yet we still describe Facebook, Apple, Amazon, Netflix or Google as tech companies, and this, I believe, is because of how language evolves. We are still grappling with the purpose of computers in our lives. It's no wonder that words describing behaviours on the Internet keep appearing as Word of the Year3.

A computer is special in that it has a recursive purpose: it's a machine that builds machines. It takes a description, and transforms itself into the machine described. And on top of that, it has global reach and unlimited versatility. It's the lever long enough, and the fulcrum on which to place it, that Archimedes asked for to move the world.
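That recursive purpose can be pictured in a few lines of Python. This is only an illustrative sketch; the `build_machine` helper and the description format are invented here, not any real API:

```python
def build_machine(description: str):
    """Turn a textual description into a running machine (a function)."""
    namespace = {}
    # The computer reads the description and reconfigures itself:
    # the same hardware now behaves as the machine described.
    exec(description, namespace)
    return namespace["machine"]

# Two different descriptions, two different machines, one computer.
doubler = build_machine("def machine(x):\n    return 2 * x")
greeter = build_machine("def machine(name):\n    return f'Hello, {name}!'")

print(doubler(21))     # 42
print(greeter("Ada"))  # Hello, Ada!
```

The point isn't the mechanism (here, Python's `exec`) but the shape of it: feed the machine a description, and it becomes the thing described.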

Its chameleonic capacity is also what makes the cost of building new tools effectively zero. Marc Andreessen announced as early as 2011 that "we are in the middle of a dramatic and broad technological and economic shift in which software companies are poised to take over large swathes of the economy"4. He predicted that "software companies" would replace the incumbents by leveraging code. But, in reality, software won in the same way the steam engine won in the Industrial Revolution: by eventually displacing, through superior performance, every means of production that came before.

Companies that refused to adapt, such as Blockbuster, were displaced. And those that adapted survived. Such as Goldman Sachs.


The kind of work a software engineer does at Facebook isn't the same as the kind a journalist does at The New York Times. But its ultimate purpose is the same: selling ads. The reason the social network is worth more than the newspaper, and the engineer earns more than the journalist, is the leveraging capacity of the former: Facebook attracts a bigger audience, of higher value to advertisers, than The New York Times ever will. And the reason they aren't in the same league is that computers are natural copy-pasters, tireless replicators. Facebook is able to regurgitate an endless feed of engaging content to its users. At some point, on the other hand, you reach The New York Times' footer, and that's it.

The moral, philosophical and ethical consequences of that aren't clear yet. But the mechanics are straightforward: an algorithm is more powerful than all the people in the world. And just as the Industrial Revolution shifted our culture's notion of what the fruits of someone's labour really are, computers have shifted it again. Then, people moved from effort to efficiency; now, people are moving from efficiency to ability. The focus isn't so much on how much you can do today (at sunrise, the computer is already several hours ahead of you), but on your capacity to describe the job accurately to your laptop. The cloud has simply floored the pedal: you're not even required to have the machine anymore; a description of the computer that you need is enough.

The new job requirement is this: a willingness to recognise failure, to not paper over the cracks, and to change. To reflect deliberately, even obsessively, on failure, and to be constantly on the lookout for new solutions5. A long, but beautiful, way to say ingenuity. The trait of engineers.

  1. I became aware of this story while reading Nassim Taleb's Antifragile.
  2. And that's pretty much the reason why it was so easy for Disney to catch up on Netflix's strategy.
  3. From Oxford Dictionaries' Word of the Year: unfriend (2009), selfie (2013), and my personal favourite: 😂 (2015).
  4. From Why Software Is Eating the World. The whole essay deserves a careful read; it has aged really well.
  5. Paraphrased from Atul Gawande's Better: A Surgeon's Notes on Performance.