ÁD Studio

Blurring the lines between business strategy
and software engineering.

Archive

9076 words and counting

  • Humane Web


    In 1910, the Austrian architect Adolf Loos delivered a lecture titled "Ornament and Crime", in which he argued against any form of decoration. A truly civilised person, he said, would have "outgrown ornament". Loos advocated a spare and austere form of design, where function was predominant rather than concealed, and where monochromatism would highlight the natural properties of the materials.

    Adolf Loos, Villa Muller, 1928-1930, Prague. © Rachael McCall

    "Ornament and Crime" was a response to a time (early twentieth century) and a place (Vienna) in which Art Noveau was at its peak. But it's remarkable how much of it is present in Kim Kardashian's mansion. A white, diaphanous place, it's an ode to unused space. Like in the middle of the ocean, everything feels miles apart, and nothing steers you away from realising that it isn't your typical urban loft, which is precisely the point. "The proportions are the decoration", said Kayne West in an interview with Architectural Digest. The place is the architectural epitome of how much it costs to maintain an immaculate emptiness.

    Kim Kardashian's gargantuan sofa, with herself for scale. © Kim Kardashian

    Some people, especially today's wealthiest individuals, seem to be on the lookout for space to clear. Mark Zuckerberg was accused of "colonising" the Hawaiian island of Kauai because he allegedly forced his neighbours to sell the land near his property. Jeff Bezos recently bought the mansion across the street from the two mansions he already owned in Washington, D.C., and combined them into the single largest man cave ever.

    Average young adults, meanwhile, are left with far less ambitious spaces, but the mindset is by all means the same. What is the point, then, of tidying up with Marie Kondo if not to transform your Diogenes-like compulsive hoarding of stuff into spaces of unsoiled nothingness? And yet, the end result is just a bland imitation: our attempt at Keeping Up With The Minimalists ends up being a Swedish aesthetic of sameness. We are nudged into consuming what's quick and easy, delivered to our front doors, rather than taking the time to cultivate our own idiosyncratic taste.

    Why reflect, when an enthusiastic blogger's opinion is one search away? Who has time for making a couple of mistakes? What do I care? Rather than facing the stakes, we end up ordering the product from whichever ad our favourite social network puts in front of us most often. Safe from the difficulty of doubt, we part ways with the responsibility of choice.

    Adolf Loos considered futile decoration a form of savagery, like facial tattoos, and posed the reductive modernism of Europeans (something another Austrian Adolf might have sympathised with) as the final solution to all aesthetic problems. Be that as it may, it's possible that what Loos had in mind actually resembles the Kardashians, the Zuckerbergs and the Bezoses more than your ordinary Färgglad chair or any pre-fabricated, drop-shipped, Amazon-delivered paraphernalia. Loos wanted customised, thoughtful environments that provided everything their dwellers needed. Instead, your everyday aesthetic is inspired by Le Corbusier's idea of a house: "a machine for living in". Inhumane.

    Drake's Toronto home hall. © Architectural Digest

    Not all is lost, however. Coming in like a wrecking ball, the new generations' sensibility is nothing but baroque. At the highest end, Drake's hall displaying his collection of basketball jerseys, like African taxidermy, gives the impression that today's rappers are contemporary adaptations of the Great Gatsby; and Gigi Hadid's maximalist kitchen counter filled with billiard balls begs the question: what?

    Gigi Hadid's kitchen. © Hello!

    The fact is that this unapologetic aesthetic is the fulfilment not of the vision of a cantankerous architect, but of a personal identity. Rather than consulting a committee on best practices, the post-minimalists' approach is "this is our house, these are our rules". There is sheer intention, nothing is frugal, and the thinking happens inside the dweller's head, tired of being told what to do. The result has nothing to do with what an irresponsible less-is-more conformist would be able to come up with. Something humane. When your domestic space is all you have, maximalism is more appealing than über-efficient, played-out minimalism. A true assertion of individualism is, perhaps, this new era's signal of wealth.

    Surprisingly, individual assertion is nowhere to be seen in Web design. What's even worse, those who pioneered a new approach to site layouts, such as Jason Santa Maria, Dustin Curtis, or Greg Wood, have now reverted to simple, minimalistic websites, designed under the driving force of a played-out philosophy.

    Why aren't there more e-commerce sites that are visually appealing? Why, in a world where Amazon has defined the boring thumbnail grid as the lowest common denominator, do most indie online stores choose to zig rather than zag?

    Perhaps the Web being driven by a tendency to follow best practices and accommodate Google's warped sense of what's evil reins in unconventional layouts. Perhaps the sense of nonstop supervision and criticism hypertrophies our self-consciousness and instinctive conformity. Perhaps it's just that simple is easier than complex, and less error-prone.

    Advances in technology have made it possible to unstick ourselves from Adolf Loos's view, but we haven't advanced emotionally online. We're stuck in a rut! It's time to step up, to look ahead instead of around, and build sites where you can clearly see the people who made them, rather than the frameworks they used. It's time to move beyond the gravitational pull of the social networks' design of rigid likeness, algorithmic optimisation of pictures and captions, and orthodoxy.

    Let us build humane sites, for a humane web.

  • Art Direction for Gatsby sites


    I have been mulling over Jen Simmons's Getting Out of Our Ruts for a while. I've noted before that it appears to me that everything online is the same, and her talk just confirms that there have been people out there saying just that for quite a while now. Even with a massive theme market, creators mimic the best sellers, because they're not going to earn a living by going out on a limb and creating something truly original. Generic wins out every time.

    Clients have of course noticed, and are even asking for these rudimentary designs. They see lots of sites with those styles, and it's causing them to ask for them too. You have to be very brave as a business owner to risk looking different. The mouth says "unique", but the wallet shouts "just like everyone else".

    In an effort to do something different with my site, I've been tinkering away at ways to be less predictable with it, and I ended up realising something very interesting: creating pages with different styles is hard.

    That in itself isn't bad. I have to remind myself constantly that what I'm trying to do is a path followed by very few people. But by making many decisions for you, frameworks make the work needed to overrule those decisions more difficult with every level of abstraction. One-off things become hard to do, because Web development frameworks dial up convenience at the expense of flexibility and variety.

    If this site is truly going to become my artistic playground, then I thought it might be helpful to share some of my crazy ideas, to sharpen my skills and set the tone, feel, and artistic style for my projects. But at the same time, I don't want these experiments clashing with what I've previously posted.

    Perhaps I'm alone in this. But in any case, it could be helpful to document what I found out about tweaking Gatsby for my unconventional desire to do away with consistency.

    Templating

    One of the parameters of Gatsby's createPages Node API is component, and it's usually filled with a layout component shared across all essays. I wanted to keep that behaviour by default, but also wanted it to be flexible enough to add more stuff if I wanted to.

    So I did the following: I created a frontmatter variable called template.

    ---
    title: "Art Direction for Gatsby sites"
    template: artDirectionForGatsbySites
    ---

    This variable will be used by createPages if it's there, like so:

    // In gatsby-node.js
    const path = require("path");
    // ...then, inside createPages, you'll call `createPage` for each result
    posts.forEach(({ node }) => {
      // Fall back to the default template when the frontmatter doesn't set one
      const template = node.frontmatter.template
        ? node.frontmatter.template
        : "defaultEssay";
      const component = path.resolve(`./src/templates/${template}.js`);
      createPage({
        // This is the slug you created before
        // (or `node.frontmatter.slug`)
        path: node.fields.slug,
        // This component will wrap our MDX content
        component,
        // You can use the values in this context in
        // our page layout component
        context: { id: node.id },
      });
    });

    By using this new frontmatter variable, all the old posts kept being generated using the default template (in my case, it's stored in defaultEssay.js). And this new post is generated using the new artDirectionForGatsbySites.js1.
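
    For reference, a template such as defaultEssay.js is just a regular Gatsby page component. The following is a minimal sketch of what one might look like, assuming gatsby-plugin-mdx's MDXRenderer and a page query keyed on the id passed through context; your own markup and styling will differ:

    import React from "react";
    import { graphql } from "gatsby";
    import { MDXRenderer } from "gatsby-plugin-mdx";

    // Receives the result of the page query below as props.data
    const DefaultEssay = ({ data: { mdx } }) => (
      <article>
        <h1>{mdx.frontmatter.title}</h1>
        <MDXRenderer>{mdx.body}</MDXRenderer>
      </article>
    );

    export const query = graphql`
      query EssayById($id: String) {
        mdx(id: { eq: $id }) {
          body
          frontmatter {
            title
          }
        }
      }
    `;

    export default DefaultEssay;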

    Isolated Styling

    The simplest way to isolate CSS and avoid conflicting styles across your site is to use CSS modules. It solves the problem of isolation not by restricting the scope of the styles, but by making sure that name collisions between styling classes never happen.

    I had mixed feelings about it, since I'm not actually solving the issue, just papering over the cracks, but it does the job. Now, every time I create a new template, there's often a .module.css file associated with it, and I can tinker as much as I want with the end result.
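
    To make the mechanics concrete, here's a rough sketch of how a template pairs with its module. The file and class names are illustrative, and the generated name shown in the comments will look different in practice:

    import React from "react";
    // artDirectionForGatsbySites.module.css might contain, say:
    //
    //   .essay { background: papayawhip; font-family: Georgia, serif; }
    //
    // Importing it yields an object of generated class names
    // (Gatsby v3+ syntax; older versions use a default import instead):
    import * as styles from "./artDirectionForGatsbySites.module.css";

    // styles.essay compiles to something like
    // "artDirectionForGatsbySites-module--essay--2xb1q", so a `.essay`
    // class declared in another template's module can never clash with it.
    const Essay = ({ children }) => (
      <article className={styles.essay}>{children}</article>
    );

    export default Essay;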

    Wrapping up

    That's all the tricks I have. Hope they are helpful. If you follow this route, by all means, do let me know, and I'll do my best to help you. And welcome to the IndieWeb!


    1. I did a similar thing with the background colour, in case I wanted to modify only that instead of creating a whole new template. It's similar enough to what I just did with templates that I won't discuss it here, but if you get stuck, have a look at the source code here.
  • Really Simple Syndication


    The arrangement of pieces of content following one after another in time. Free from anyone's selection and care but the reader's. Unaltered by social media circuit breakers. Straight from the source.

    RSS is a means of allowing basically anything online to be collated into a single feed. It puts the reader behind the wheel, and removes any form of intermediation between the source of content and its consumer. You follow, like on Twitter, but they don't really know; you like with the "Add to Feed" button, and dislike with "Delete". You are not on a platform, but on a protocol. The difference is subtle, but substantial.

    In Facebook Is a Doomsday Machine, author Adrienne LaFrance said that "We need people who dismantle [the notion that social platforms are free in exchange for a feast of user data] by building alternatives." Apparently, there's a good alternative already, one that had its heyday at the beginning of the twenty-first century, when the zeitgeist online was "do whatever you want with the information present". Before social media as we know it today. Before they siloed us off into their walled gardens.

    The whole point of social media is monetization of awareness, and RSS offers none of it. It was disintermediated by design, protecting us from virality and addiction. Like email, it's a protocol, and not a platform. To have only what I want to see, shown only when I want to see it.

    Its popularity peaked long ago, but RSS is still very much alive in Feedly, Newsboat or Fraidycat (the one I use). If you feel like trying out RSS for yourself, why not get started by adding this blog to your reader?

  • Reading is too easy


    It's not like we don't read; we do it all the time. Social networks, newspapers, flyers, billboards. Everything is either words, or people reading words to us. The scarcity of the old ages has simply been obliterated. It used to cost a small fortune to put a book out there; now we have KDP. It used to be prohibitively expensive to buy books every day; now we have subscription platforms and free books online. Technological advancements have made reading beyond cheap to sell, and beyond cheap to buy.

    The old constraints on the seller and the buyer of books were arguably what made the experience of reading meaningful, what sparked the curiosity of readers and kept the candlelight readers awake. Books made a difference to them, whether or not they read them. The nuisance and the complexity of reading worthwhile books is precisely what makes them so rewarding. Like a good hobby or a challenging workout, it's not in the mindless repetition but in the engagement that we find the meaning and reap the rewards of the challenge.

    Nobody who ever gave his best regretted it.

    George Halas

    Maybe what we need now is to impose constraints on ourselves, create artificial walls, build difficulties where there are none. But I have a better alternative. Look for hard books, and read them. Skip the best-sellers: most of them are junk food for the brain. Go for those that were printed decades ago, even centuries ago, and are still around. Treat them like a medieval monk treated the Bible: with care, respect, and patience. Try to memorise the key passages1. Summarise the main thrust of them, to yourself or to others. And then challenge and criticise them.

    It will take a while, but what's the alternative? Reading a myriad of pointless books, one week at a time? That's not reading: it's virtue signalling. In the end, reading one New York Times bestseller after another is a recipe for existential dread, because most of what gets onto that list is simply a waste of your time2: sophisticated magazines whose sole purpose is to be sold, not enjoyed.

    Look for hard-to-find books, because good books need no marketing. Time is on their side.


    1. When was the last time you memorised anything from a book you read?
    2. Have a look at The New York Times Non-Fiction Best Sellers of 2017 and be honest with yourself: how many of these books are still around?
  • The Future according to Ted Kaczynski


    Jules Verne, author of Twenty Thousand Leagues Under the Sea, Around the World in Eighty Days, and From the Earth to the Moon, is the embodiment of nineteenth-century society's hope, awe and fascination with modern science. People acted as if the future was happening too fast, rapidly becoming the distant past in the blink of an eye. This new steam-powered machinery was exactly what we needed to get rid of poverty and suffering. No goal was ambitious enough; everything was a matter of time.

    "The exit of the opera in the year 2000", a 1902 painting by Albert Robida
    "The exit of the opera in the year 2000", a 1902 painting by Albert Robida

    Wherever you looked, you saw the Carnegies and the Rockefellers, the Teslas and the Edisons, "benefactors of the age"1 making life easier and easier. All of humanity, relentlessly building an unsinkable ship in order to cruise to paradise.

    Needless to say, this naiveté broke down in a very Titanic-esque fashion with World War I. But it could have been worse: it could have happened.


    The main thing you have to understand about Theodore Kaczynski is that he is a terrorist in the purest sense of the word: he used terror to propagate his message. As a side effect, most people have focused on what's terrifying about his actions, rather than on the content of the message. Reading Industrial Society and Its Future, you are constantly juggling what you're reading with the thought that the person who wrote it killed people in order for you to read it.

    In order to get our message before the public with some chance of making a lasting impression, we've had to kill people.

    Theodore Kaczynski

    One of the most interesting parts of the essay is, of course, a section titled "The Future". The essay has been, up to that point, a historical analysis of how technology has been "a disaster for the human race". Among the consequences of the increasing dependence on technology, our society is more unstable, our lives are unfulfilling, and most people spend their time engaged in "surrogate activities", artificial goals such as consumption of entertainment, political activism and following sports teams. He wrote that in 1995, but he might as well be talking about yesterday.

    On top of that, Kaczynski has a pessimistic view of what the future holds for us. Unless we do something, quickly, technology is going to become so essential to our lives that there won't be a chance for human freedom, or dignity.

    The human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machine's decisions.

    Theodore Kaczynski

    For many people, he argued, their future is unemployment. Nowadays, thinking we are in an era of technological unemployment that is increasingly making skilled workers obsolete is not lunacy; it's a prevailing opinion. And the most popular solution suggested has been some form of basic income by which virtually everyone is to some degree dependent on the state: the ultimate loss of freedom.

    Of those who are employed, the system will demand increasing specialisation, and their work will also be increasingly out of touch with the real world. That is a form of freedom loss: like an ant, a specialised worker is fragile, dependent on the rest of the cogs in the machine to survive. Machines will take over, and human beings will be kept busy by being given relatively unimportant work. Read that as the gig economy.

    Individuals [...] will be more dependent than ever on large organizations; they will be more "socialized" than ever and their physical and mental qualities to a significant extent [...] will be those that are engineered into them rather than being the results of chance.

    Theodore Kaczynski

    Kaczynski took what he saw to its ultimate consequences: technology is creating a new environment for human beings, one that is very different from the one nature has adapted us to. If we don't adjust, we will be forced to.

    You can see method in his madness. It's not very far from what environmental activists have been saying for quite a while: that we are close to disaster, and we need to dump this system and face the consequences, or something even worse will happen.

    So let me end with an idea of why he is wrong.


    Along with the idea that human design is counter to our nature runs the notion that human beings have evolved in a similar manner to any other species on the planet. That is, to a great extent, true. But there is something that humans have that pretty much any other species doesn't, which is culture. And what some anthropologists2 have been arguing lately is that relatively early in our evolutionary history, we crossed an evolutionary Rubicon, at which point cultural evolution became the primary driver of our fate as a species.

    For example, as our cooking techniques developed, we no longer had to rely on large teeth: our bodies coevolved with culturally transmitted knowledge about cooking. Chopping, scraping, and pounding meat replaced some of the functions of teeth, mouths and jaws. My parents are fond of saying that digestion starts in the mouth, but perhaps it's more accurate to say that digestion starts at the butcher's shop.

    As a result, humans can no longer survive without cooking. Our ability to process food acted as a force of natural selection, and our bodies save a lot of energy by doing away with excessively long intestines. Those energy savings became one of a confluence of adjustments that allowed our species to build and run bigger brains. Technology may even be the key factor by which we are now Homo sapiens.

    Kaczynski is wrong, not because technology isn't changing our environment; it is. He's wrong because he didn't fully grasp the reach of the technological changes. He offered a pessimistic version of our future without taking into consideration that it's exactly what culture has been doing to us since the beginning. We are more socialized than we were millions of years ago, and that isn't simply undone by going back to a pre-industrial era. The future according to Theodore Kaczynski happened a very long time ago, and it made us humans.


    1. As Kierkegaard described them.
    2. Joseph Henrich, among others.
  • The first beer is free


    It just feels natural because it is: you get out of your bed in the morning, have your breakfast, put on your clothes. We live surrounded by things we own. But ownership is nothing more than a form of restricted freedom: everyone else acts as if they believed that it's not OK to just crash through your door and pour themselves some of your coffee, even though they could, if they tried. Societies are functional because, among other things, we have come to terms with the idea of someone "owning" stuff, and it has been working fine for a long time.

    Until the invention of means to sell intellectual property, that is.

    In 1980, Richard Stallman and some other software engineers at MIT were frustrated by the fact that they just could not tinker with the software that ran the Xerox 9700 laser printer they had. They owned it (or at least MIT had paid for it), and yet they could not do as they pleased with it. It's as if you were sold a bottle of wine that could only be poured into crystal glasses. For physical objects, the fact that something is in your possession, under your control, means complete control: the seller might refuse to help you if you break it, but as far as you're concerned, there's nothing stopping you from looking under the hood.

    This experience was what convinced Stallman of people's need to be free to modify the software they use. That is, he believed that society should treat intellectual property as if it were physical: control is either complete, once purchased, or it isn't control at all.

    Ah, but there's an obvious point to be made: software isn't physical. It's not stuff, in the sense that there are no limits to your ability to copy-paste it endlessly1. The economics of redistribution is the first thing that goes out the window, because it used to be prohibitively expensive to buy something, figure out how to make it, and turn around and find people to sell it to. Non-physical property does not operate under that paradigm. But, then, under what paradigm does it operate?


    There's a remarkable moment in The Mentalist where the star character offers advice on how to seduce a woman. "Seduction is easy once you know the basic principles." And then he says: "It'll cost you a dollar, so you pay attention." It's the perfect metaphor for how informational products are sold online: there's the platitude, followed by the sale. The platitude is free; it's a form of marketing and proof of credentials: it's as important to say something true as it is for it to be just actionable enough, and said in a confident way, leaving you curious.

    Again and again, we keep on mistaking the platitude for the product. Google isn't a search engine. It sells ads. Search results are the bait. The Internet is what transmits the platitude at lightning speed to global audiences. What you do with that is what cannot be replicated. What's free on the Internet is nothing like 'free beer'; it's more like "the first beer is free".


    In 1995, CD sales peaked. As soon as a cheaper means to transmit music became widespread, forcing people to buy circular plastic to listen to music wasn't a viable business anymore. And that cheaper means was the relentless copy machine we call The Internet.

    At its most foundational level, [the Internet] copies every action, every character, every thought we make while we ride upon it.

    Kevin Kelly, Better than Free

    I know it's sort of an obvious thing to say, but humour me: the Internet changed everything. For those of us who create intellectual property for a living (writers of words meant to be read either by humans or machines), there is simply no way to approach a market just by showing up: everyone is just one click away. Copies of what you sell are not Made-in-China cheap; they are Designed-in-California free.

    And yet, we still think that what we sell is unique in some way. As if that mattered. As if there weren't a kid who speaks just-good-enough broken English to do what you do better, faster, and for peanuts. As if the guys who invented the Xerox 9700 that Stallman wanted to tinker with were still around. As if Google or Facebook, or even Microsoft2, weren't furiously churning out tools to accelerate this everything-is-free trend even more.

    Thinking of selling music always brings me back to a younger version of me, paying for alcohol to get into a club. Music was free to dance to, as long as you paid for a minimum number of drinks.

    Everything is free. Free as in "you don't pay anything upfront". But not as in "free beer". What I want you to consider is that the companies in the business of giving you things for free so that you click ads are the ones most eager to share open source tools, contribute to projects that are meant to be free, and commoditize products that used to have more straightforward, single-purchase business models. Everything is free on the Internet because we have been taught that the only appropriate business model online is free consumption under commercial barrage.

    Stallman didn't win; information was never meant to be free on its own. We're just paying for it somewhere else.


    1. Thanks Larry Tesler.
    2. Of all places!
  • Beyond Minimalism


    Minimalists haven't got the memo yet: every website looks exactly the same. Once you casually search for themes on jamstackthemes.dev, you start to realise two things. Primo, that virtually all the descriptions contain the word "minimalist". Secondo, that every theme has that Medium-like flavour, the containerised structures, the white backgrounds (or black, if they provide a dark theme). The same sans-serif typography, hero titles and clean (too clean?) ambience. It's as if people around the world agreed to design Just One™ Website Layout, and we are simply looking at different flavours of their masterpiece.

    It's time to say it: Minimalism's design philosophy has reached a dead end. There's nothing novel about it anymore.

    Minimal Mistakes, the most popular template on JamstackThemes

    Minimalism has been the last twenty years' defining aesthetic. It represents what my generation craves the most: efficiency and simplicity. Minimalism was a reaction against the Diogenes society that was instructed by marketing to amass and possess. The world is impossibly complex, difficult to navigate, and full of clutter from a distant past, so we need to "start from scratch" and remove everything, keeping only what we really need. The climactic moment for minimalism was Steve Jobs introducing the iPhone1.

    Steve Jobs Introducing The iPhone At MacWorld 2007

    We have gone downhill since then. Everything is the same. Websites have become so similar that someone trained a neural network to churn out websites for startups that do not exist.

    Why? Because minimalism is another word for playing it safe. It's shooting for mediocrity instead of attempting to become competent. Everyone buys the same products, wears the same clothes, thinks the same way. There's no need for alternatives. The logical consequence of minimalism is precisely the digital hub: Jobs's cohesive plan to take all of your devices and make them one harmonious product.

    It's all over now. Minimalism doesn't work because having Just One™ of anything is not enough. It stagnates, then decays, and finally perishes, leaving us with just the ashes.

    Most people are cucumbers; boring, predictable, almost tasteless, okay for a salad but keep them away from my spicy tacos. Pickles, however, are tangy, crunchy, delicious, unpredictable, and perhaps most importantly, pickled.

    Dan Piraro

    Minimalism is for cucumbers. Let's be crunchy pickles.


    1. "An iPod, a phone, and an Internet communicator. Are you getting it? These are not three separate devices. This is one device. And we are calling it iPhone."
  • Beware of the Man of a Single Book


    Ben Evans once joked that Jeff Bezos was probably obsessed with an old encyclopedia of retail.

    If you could look in the safe behind Jeff Bezos’s desk, instead of the sports almanac from Back to the Future, you’d find an Encyclopedia of Retail, written in maybe 1985. There would be Post-It notes on every page, and every one of those notes has been turned into a team or maybe a product.

    Benedict Evans, Amazon is a boring retailer.

    Evans is making fun of the fact that most of what Amazon has done isn't innovative, but an extreme, technologically adapted consequence of what could be learned from a retail textbook. Thomas Aquinas's "beware of the man of one book" is the quote that usually comes to mind when people are said to be obsessed with one particular idea, and its most common interpretation is "you should read more".

    But my take puts more emphasis on the word "beware": those who read few great books, but with focus, intensity and curiosity, are enlightened creatures that you do not want to oppose1. Jeff Bezos is a man of a single book. My interpretation is that you must be way more demanding about the things you read, so that you can be indulgent and patient with those that are worthy of your attention.

    If Thomas Aquinas were alive now, I believe he would have said beware of the man of the social network.


    1. In San Isidro, Labrador de Madrid, writer Lope de Vega said "A noteworthy student is he, the man of a single book. For when they were not filled up with so many extraneous books, as they were leaving behind, men knew more because they studied less."
  • Against Agile


    One of the most difficult things in software is naming things. It's a process that relies fundamentally on extending our language in creative ways. Astronaut, for example, comes from the Greek "astron" (ἄστρον), "star", and "nautes" (ναύτης), "sailor". Before humans could travel beyond the Earth, there was no compulsion to invent a word for it, and the one we came up with to talk about the Gagarins and the Armstrongs is a pretentious way of saying "sailor", because far more humans have been on a ship than on the Moon.

    Our most pressing problems belong to this category, what sociologists call cultural lag: being forced to slog through obsolete institutional habits to achieve our goals. They arise from trying to organise a supersonic, computer-driven, and nuclear world with horse-and-buggy ways of life: the result of the tremendous acceleration of technological progress.

    These strains are usually met in the only sensible way possible: people are constantly striving to bring institutions into closer relationship with reality.

    Let me therefore warn you that it is not my intention to inform, or to establish some truth. What I want to do is change your attitude. I want you to sense chaos where at first you noticed an orderly arrangement of well behaved things and processes.

    Paul Feyerabend, "Against Method"

    Reacting to these forceful changes, most people furiously demand cures that, sometimes, are worse than the illness. But some of us are making efforts to achieve a relation of mutual acceptability between our institutions and everything that puts requirements on them.

    Unfortunately, the task isn't simple. In practice, we are groping for some sort of harmony between an unfinished creation and a nebulous context: understanding the world we live in and designing appropriate institutions for it are two faces of the same coin. And weirdly enough, and this is something that everyone who designs for a living has experienced, criticising is easy; what's difficult is saying it's done.


    Like many professions, software engineering was started by a bunch of trailblazers who, independently of each other, started leveraging the newly invented computer to do their work faster and better. But with the professionalisation of writing code, our industrial society concentrated intensely on finding ways to develop software more efficiently. With the first software engineers came the first software development methodologies.

    In 1970, Winston Royce published his influential article Managing the development of large software systems: concepts and techniques, in which he presented several project management models, including what we know now as waterfall, iterative, and agile.

    Yes, Agile was invented in 1970. We simply didn't have a word for it.

    In this article, Royce defines two essential steps in every software development project: analysis, and coding. The kicker is that you can only get away with doing just that for brick-and-mortar projects. An implementation plan to manufacture larger software systems, keyed only to these steps, however, is doomed to failure, he said. Immediately after, he presented a more grandiose approach, read "Waterfall", where analysis and coding are preceded by two levels of requirements analysis, separated by a program design step, and followed by a testing step.

    Royce, nevertheless, noticed something crucial: that as the project advances, the design becomes more detailed:

    There is an iteration with the preceding and succeeding steps [...]. The virtue of all of this is that as the design proceeds the change process is scoped down to manageable limits.

    Winston Royce

    The riskiest thing for Royce was the lag between implementation and testing: if the testing phase was unsuccessful, a major redesign would be required, effectively bringing the project back to square one.

    To summarise: any Agile adherent will tell you that Waterfall is bad, and that if you aren't doing Agile, it follows that you are doing Waterfall. But from the very beginning of software development history, projects that break down activities into linear, sequential phases à la Waterfall, putting off testing to the later stages of the process, have been known to be risky. That's because it's in the testing phase where, for the first time, software is run rather than analysed, and many things in a project cannot be analysed. In other words, the paper that introduced the Waterfall model did so as a straw man, not as a feasible way to develop software.


    How do we come to terms with the fact that the story that things used to be slow and now they're fast is fictional? That preeminent figures in software development had been advocating incremental and iterative development all along, even in the very 1970 paper that gave us Waterfall?

    I want you to consider this: that Waterfall was a misrepresentation of the ideas laid down by expert software engineers, made for the convenience of managers. People who think of software development as an assembly line, and expect something specific, in a specific timeframe. If that isn't the case, then why have we all been asked, in an Agile project, to produce or follow a detailed roadmap with milestones?

    This alternative theory may explain why Agile projects are the way they are, as opposed to what a Professional Scrum Master™ will tell you.

    A couple of years ago, I was interviewed for a job in which Agile wasn't just a way of doing things: it was an obsession. Tasks were moved deliberately across a huge whiteboard in the CEO's office, and people's workday was structured with clockwork precision. They had to work in pairs in 20-minute stints, forcing them to short-circuit their instincts to do anything but work. The office was designed in a way that would have made Jeremy Bentham proud, a sort of Panopticon where all the inmates (sorry, software engineers) were under careful vigilance by the CEO. It had all the Orwellian attributes that people despise about school, and no one said anything about "individuals over processes". That's when I came to terms with the idea that the reason Agile has spread so wide and so fast is, too, the convenience of managers, adapted to a software industry where people are younger and less disciplined than they were in the 70s, and crave direction and pats on the back. Agile is high school.


    I concur with Whorf's view that language is not merely an instrument for describing events, but also a shaper of them. That grammar contains an ontology, a view of the world and the speaker's society and their role in it, which influences their perception, their reason, their memory of events and their testimony of them. I came to understand that Agile methodologies, such as Scrum or Kanban, are sufficiently accepted and have grown into sufficiently complex entities to be considered along the same lines as languages.

    In the influential The Structure of Scientific Revolutions, Thomas Kuhn claimed that the history of science reveals proponents of competing paradigms failing to make complete contact with each other's views. Think, for example, of Michelson and Morley's experiment revealing cracks in the theory behind the existence of ether, and being swept under the rug as a result. These competing paradigms use different languages to address different problems, and communication across the divide is limited, if not impossible. Scientific progress, then, doesn't happen because of a healthy debate where the best ideas win, but rather because of evolutionary forces: the people who have 'good' ideas and accept them beyond question survive, rewriting the history of how their ideas eventually came to be dominant1.

    Indeed, even though "Waterfall" was simply a straw man methodology to be mocked and ridiculed, the development of Agile methodologies followed a pattern similar to the one described by Thomas Kuhn: there is a foundational set of precepts (the proverbial "individuals and interactions over processes and tools...", and so on), which turns into a firmly set tradition that strongly resists change. This set of ideas emerged under the intolerant umbrella of an us-against-them mentality that aids widespread adoption2. In the end, Agile was bound to dominate the software industry, regardless of its effectiveness.

    Agile methodologies say more about their proponents and their worldview than they uncover 'better ways of developing software'. When someone uses the word 'agile', they are not simply getting the technical definition, but also the whole infrastructure around it. An infrastructure that is only useful if it addresses the idiosyncrasies that are bound to be repeated in software projects even as technology changes. "Agile tools" is an oxymoron, and the consequence of the widespread adoption of tools like JIRA is that software engineers work in a place built to minimise the worst-case divergence from expectations: the only option that no one reacts strongly against, even if no one has any reason to like it either.

    What I want is for you to realise that Agile has been preyed upon, repackaged and weaponised by the very ghosts it was meant to confront. It has evolved to fulfil the worldview of managers, who cherry-pick the parts of Agile that align with their lowest managerial instincts, while leaving out the rest as TBD. "Working software over documentation" effectively means vague requirements, "customer collaboration" means shifting, unilaterally decided priorities, and "responding to change" means engineers have no say in the tasks they work on. And, as a result, software development projects are prone to nonstop supervision, employee alienation, technical debt and scope creep.


    The idea of a method that contains fixed and invariant principles for conducting the business of software development doesn't hold when confronted with the realities of human nature. There is not a single idea in the Agile manifesto, however vague, that is not violated at some point in an Agile project, and such violations are not accidental, the result of ignorance, or collateral effects of inattention. That is not necessarily bad news: these violations are sometimes necessary for the completion of the project3.

    Implicit in Royce's argument about the differences between brick-and-mortar and enterprise projects is that scale matters when outlining how software development happens, and the obvious fact that scale changes with time implies that software development processes must change along with it.

    It is clear then that the idea of a fixed methodology rests on naive views of software engineering and its social aspects. To those who look at the rich material provided by history, and who don't play games with it in order to please their lower instincts or their craving for intellectual security in the form of 'best practices', it will become clear that there is only one principle that can be defended under all circumstances and in all stages of software development.

    It is the principle of anything goes.


    1. Once you twig this, it's hard not to see that political debates aren't about convincing anyone, but about prompting those who believe the same things to vote. It's all about survival.
    2. The original spread of Christianity in the Roman empire is likely to be the consequence of their intolerance. When you believe in several gods, one more isn't that big of a deal. When there is just one, believing in anything else is heresy. See Religious Toleration in Classical Antiquity, by Peter Garnsey.
    3. For example, it is both reasonable and absolutely necessary for a project to have the most critical parts of the organisation of the codebase designed upfront, which contradicts well-established Agile frameworks.
  • Originals


    In 1927, Georges Lemaître, a Belgian Catholic priest, noted that one of the solutions to Einstein's equations in the general theory of relativity allowed for an expanding cosmos. That led him to postulate that the Universe must have begun at some point, a beginning he called the "primeval atom".

    Einstein, when first confronted with this solution, informed Lemaître that he found the idea "abominable", and forced his theory back into a static configuration with a completely made-up change. Even as tensions with new observations increased, the idea of an unchanging universe remained popular for a time, while Lemaître's idea of a primordial point received the picturesque name of the "Big Bang theory".

    Millikan, Lemaître and Einstein after Lemaître's lecture at the California Institute of Technology in January 1933.

    What is an original thought? Many people would say one that no one has come up with before. But I don't think that's exactly true. I believe that original thoughts are like the Big Bang theory: surprising consequences of simply rearranging what came before.

    That's why originality seems paradoxical: a common critique of an original idea is "I thought of that before".

    Originality means finding new ways of describing the world. And if the world were static, human beings wouldn't need originality. All the good ideas would already have been recorded, and human beings would have had to find something else to do.

    But we don't live in a static world; nor do we live in one that we fully understand. In a sense, we are no different from a Palaeolithic tribe sitting around the campfire at night: feeling awe when looking up at the stars, and terror when confronted with what's beyond the light. An original thought is like a torch, born from that shared fire, with the purpose of illuminating the way into the darkness.

    Originality is not simply being surprising. It has to have structure, whereas a surprise can simply be something that makes you flinch. On the other hand, just rearranging information lacks the power of awe that originality entails.

    So it seems to me that originality is not in saying that God created the world, but in remarking that Einstein's relativity predicts something in accordance with that. And that is precisely why it is so useful: our knowledge is not a tight-knit network, but a somewhat structured bowl of spaghetti, and any help in the process of disentanglement is very much welcome.

    Notes on Nassim Taleb's Antifragile (black), Jonathan Blow's talk "Preventing the Collapse of Civilization" (blue), and my own input (red and pencil)

    We surely value it, rationally and biologically. An independent mind can be magnetic, be it a comedian's or a politician's, as long as it thinks the right thoughts.

    In an interview with Jordan Peterson, comic-book writer Gregg Hurwitz said that once characters like Batman or any of the Avengers take off and establish a life of their own, they have a backstory collectively held by the readers. You may come up with an alternative universe where you can muck around as much as you want, but otherwise you'd better not venture too far from the backstory or you're going to get letters from readers telling you that you got it wrong. And so, in order to be original and successful, there must be a collaboration between the writer and the readers, between what's surprising and what's already part of the structure. No wonder most comic books gravitate towards mythological themes.


    If originality is so valuable, why is it so rare? Perhaps because thinking deliberately is something that doesn't come naturally to human beings. Our animal brain1 favours prompt action and false positives (not reacting promptly millions of years ago meant you would soon be a predator's dinner), and thinking deeply is only a very late development, something we need to do consciously and on purpose, drawing from a pool of resources that the rest of our mental system resists sharing. Originality also makes whoever has it salient, and that is something we avoid instinctively as part of our efforts to conform to the group.

    A good heuristic for achieving originality is bringing a common idea to its ultimate consequences. Most dystopian novels exemplify that framework: Brave New World is Aldous Huxley's novel on the catastrophe of hedonism, Orwell's 1984 is a study of what a world that is too safe looks like, and Asimov's Foundation series is just a portrayal of the decline and fall of the Roman Empire in a futuristic fashion2. In all these cases, the author is trying to answer the question "How would the world be if this condition of human experience tilted to its maximum value?". In many cases, the answer takes the form of "Not Good", but it doesn't always have to. Just imagine a world without lawyers.

    Another good heuristic is taking things out of their common context, as is the case with idioms like "papering over the cracks", or "a new broom sweeps clean". Most memes on Reddit fit this category nicely3. It also pervades pretty much all political and business discourse, and forms the building blocks of virtually all funny jokes.

    Politicians and diapers must be changed often, and for the same reason.

    Mark Twain

    Originality seems to be more a consequence of curiosity than of mechanical thought, though. You look into what the world is, and how it is currently described, and compare the differences. I would say that's 90% of what it takes: the rest is probably finding the right words for it.

    To some degree, people are curious, and articulate, but usually not both; and, as luck may have it, curiosity can lead you down unfruitful paths. Despite that, I do believe there hasn't been a period of time when being curious didn't end in novel discoveries, so my piece of advice is this: be curious, and originality will take care of itself.


    1. The one that Kahneman and Tversky called System 1.
    2. That, at least, according to the afterword for "Legal Rites", one of the short stories collectively known as The Early Asimov.
    3. For instance, on Joe Biden's inauguration day, thousands of people photoshopped senator Bernie Sanders into a myriad of different contexts.
  • I, engineer


    I.

    I work in a nascent industry. My bachelor's degree had nothing to do with it, and I never earned a certification that made me qualified for it. The only real obstacle between me and writing software that would be used by real-world people was a whiteboard interview. I am, though, in some sense of the word, an engineer.

    Ironically, I recall glaring at engineers with contempt at university. We were physicists, and there was this notion that physicists developed the equations that the engineers used to design the edifices. We considered ourselves the secret of humanity's success, the triumph of thought, and tomorrow's unsung heroes. It has taken me several years to realise that this notion is simply false. It's just that engineers get sandbagged by historians.

    In a research paper called Urgency, uncertainty, and innovation: Building jet engines in postwar America, Phil Scranton showed that jet engines were designed in a completely trial-and-error manner, without anyone truly understanding the theory underneath. Builders, in fact, needed the first engineers to know how to twist things and make the engine work. Theory is a byproduct1. Scranton was polite enough to point out that this arises when innovation is messy, and made a distinction between that and "more familiar analytic and synthetic innovation approaches", almost insinuating that the latter are the norm. They're not.

    In The Knowledge Machine, author Michael Strevens argues that there's something that didn't exist in Ancient Greece and now does, and that is "the knowledge machine that we call modern science". It's hard to object to the fact that Socrates did not have MRIs or means of transport faster than horses, but the idea that we have modern science and they didn't seems like a false dichotomy to me. The Romans were able to make the Lycurgus Cup, and anybody who builds things for a living, as opposed to just writing about them, knows that you do not get a cup that shows different colours depending on whether or not light is passing through it unless you go through an intentional process of iteration and refinement. There may have been some initial accident that ignited the whole process, but engineering a result this good takes a long time. That is something that the Romans of 400 AD had, and it's not conceptually different from the science that brought you the jet engine.

    I'm not denying that academic science is behind some practical technologies. But there is a body of know-how that is transmitted from master to apprentice, and only in such a manner, and the role of formal knowledge is overappreciated precisely because it is highly visible.


    II.

    I have a long-standing issue with the word "tech". It arose a couple of years ago, when the then-CEO of Goldman Sachs, Lloyd Blankfein, repeated emphatically that the investment bank he headed was a tech company. Now, I don't have an issue with a CEO trying to persuade his employees that they are already working at a FAANG-like company. But Goldman Sachs isn't a tech company, and neither is any of the FAANGs.

    Facebook isn't a tech company; it's a media company like The New York Times. And so is Google. Apple is a manufacturer, and Netflix is no different from Disney or Warner Bros2. And Amazon is a big conglomerate, like General Electric used to be. Technology has no more influence on their core businesses than it has on those of literally any other company in their sector. "Tech" is not an industry, but rather an aspect of every industry.

    How computers have fit into our lives is peculiar. Compared to other technological advances, it is about time they sank into society's substrate. We don't have office companies, or human companies. We just have companies. And yet, we still describe Facebook, Apple, Amazon, Netflix or Google as tech companies and this, I believe, is because of how language evolves. We are still grappling, confronting and contending with the purpose of computers in our lives. It's no wonder that words describing behaviours on the Internet keep appearing as Word of the Year3.

    A computer is special in that it has a recursive purpose: it's a machine that builds machines. It takes descriptions, and transforms itself into the machine described, producing the same outputs. And on top of that, it has global reach and unlimited versatility. It's the lever long enough, and the fulcrum on which to place it, that Archimedes asked for to move the world.

    Its chameleonic capacity is also what makes the cost of building new tools effectively zero. Marc Andreessen announced as early as 2011 that "we are in the middle of a dramatic and broad technological and economic shift in which software companies are poised to take over large swathes of the economy"4. He predicted that the "software companies" would replace the incumbents by leveraging code. But, in reality, software won in the same way that the steam engine won in the Industrial Revolution: by eventually replacing any other means of production that came before with its superior performance.

    Companies that refused to adapt were displaced, such as Blockbuster. And those that adapted, survived. Such as Goldman Sachs.


    III.

    The kind of work that a software engineer does at Facebook isn't the same as the work a journalist does at The New York Times. But its ultimate purpose is the same: selling ads. The reason the social network is worth more than the newspaper (and the engineer earns more than the journalist) is the leveraging capacity of the former: Facebook attracts a bigger audience, of higher quality for advertisers, than The New York Times ever will. And the reason they aren't in the same league is that computers are natural copy-pasters, tireless replicators. Facebook is able to regurgitate an endless feed of engaging content to its users. On the other hand, at some point, you get to The New York Times' footer, and that's it.

    The moral, the philosophical, the ethical consequences of that aren't clear yet. But the mechanics are straightforward: an algorithm is more powerful than all the people in the world. And just as the Industrial Revolution was a cultural shift in terms of what really counts as the fruits of someone's labour, that shift has happened again with computers. Then, people moved from effort to efficiency; now, people are moving from efficiency to ability. The focus isn't so much on how much you can do today (at sunrise, the computer is already several hours ahead of you), but on your capacity to describe the job accurately to your laptop. The cloud has simply floored the pedal; you're not even required to have the machine anymore: a description of the computer that you need is enough.

    The new job requirement is this: a willingness to recognise failure, to not paper over the cracks, and to change. To reflect deliberately, even obsessively, on failure, and to be constantly on the lookout for new solutions5. A long, but beautiful, way to say ingenuity. The trait of engineers.


    1. I became aware of this story while reading Nassim Taleb's Antifragile.
    2. And that's pretty much the reason why it was so easy for Disney to catch up with Netflix's strategy.
    3. From the Oxford English Dictionary: unfriend (2009), selfie (2013), and my personal favourite: 😂 (2015).
    4. From Why Software is Eating the World. The whole essay deserves a careful read; it has aged really well.
    5. Paraphrased from Atul Gawande's Better: A Surgeon's Notes on Performance.