Recently I’ve been thinking about the technological ecosystem. It has evolved over the decades from a scrappy, loosely connected community of enthusiasts to the scene of some of the world’s largest companies. This shift was largely economically driven. The advent of e-commerce showed businesses that they had much to gain by “going digital.” Traditional marketing formulas centered on locale were upended (the “place” member of the 4 Ps of marketing took on a different character altogether), and logistical innovations unlocked seemingly endless scale, as the rise of giants like Amazon can attest. There are of course many factors and fierce competition, but it’s clear that the character of technology in all of our lives is nearly unrecognizable from a pre-2000s standpoint.
The move of information technology from the margins to the center of business demanded much greater scale, elasticity, and service levels than were available before. Accordingly, technologists and businesses have delivered. Today, most people considering a digital business would never think of self-hosting their infrastructure (there are, of course, exceptions) and will turn to a cloud provider for the provisioning of servers, the largest of which are AWS, Google, and Microsoft. Although there is clear competition and innovation among these providers, and one receives a whole “ecosystem” of services from each, one is in effect choosing one feudal lord over another as one’s liege.
I first heard of this idea from a blog post by Bruce Schneier a while back. Bruce discusses it mainly through the lens of user security, and the ramifications are manifold. The choices available to a user (or business) will be circumscribed by the choices of the provider. Control over many aspects of service operation, including security, is also shifted to the provider, which can be a good thing, as he says. And shifting providers is costly (though not impossible).
It is worth thinking about what is given up, though. In a previous post I talked about how believing in cloud abstractions limits the ability of a technologist to understand and influence their operation; it takes away some of their power. Obviously, this can be good and useful, so long as the technology behaves as expected. But abstraction also removes humans and materials from consideration, including their suffering and their impacts on the Earth, and I think this is an unalloyed bad thing.
Today, much of our nutrition is provided by enormous agricultural and food processing businesses. Many people have little conception of where their food comes from or of the processes (largely fossil fuel driven) that produce it. They give little thought to the lives taken to produce meat. Other than during extraordinary circumstances such as a disaster or logistical disruption, we don’t really have to worry about the availability of food, either. If we go to the grocery store, we will usually find what we want. When we visit a restaurant, whatever food we desire appears before us in a matter of minutes.
This situation is pretty similar to the one we find ourselves in with the technology ecosystem. While the need for data does not rise to the same level as the need for food, in many ways our modern lives do depend on its provisioning. There are differences, though. In particular, the food industry is regulated to a degree that the technology industry does not even begin to approach. We depend on the proper functioning of systems and regulations to detect unsafe food products and pull them from our grocery shelves, to ensure that they are accurately labeled for ingredients and nutrients, and to see that they are safely and (to a perhaps insufficient degree) ethically sourced. As the digital domain grows to affect more of our lives, I think we should expect and demand that technological analogues of these regulations be put into practice and enforced.
With regard to mindfulness, we can learn from food systems as we consider our technological options. Many people keep a garden or raise backyard chickens. While it is impossible for most people to subsist solely on the food they produce themselves, they derive considerable pleasure from being involved in producing some of it. Growing a garden and getting one’s hands dirty puts one in the place where it happens and teaches one about the processes, challenges, and pleasures of growing food.
Many communities have “community gardens,” in which a group of people cooperates to grow and tend a food garden together, and everyone reaps the proceeds at harvest time. It is a communal activity, bringing people together, and it can produce a surprising quantity of food (though, again, typically not enough for the community to survive on). I think this is a model technologists can emulate in what could be called a “community computer club.”
Purchasing, configuring, and running physical servers is something of a lost art these days, though it is entirely possible. Modest servers can be purchased relatively cheaply (especially used or refurbished) and can be run on free and open source software (FOSS); indeed, nearly all commercial cloud infrastructure runs on FOSS. There are many skills that running servers oneself can teach: security (physical and otherwise), reliability and disaster recovery, system administration, governance, and many more.
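To make the system-administration point concrete, here is a minimal sketch of the sort of routine health check a club might run on its own machine. It uses only the Python standard library; the host name, port, and thresholds are hypothetical placeholders rather than a recommendation for any particular setup.

```python
#!/usr/bin/env python3
"""A minimal health-check sketch for a self-hosted server (Unix-like systems)."""
import os
import shutil
import socket


def disk_ok(path: str = "/", max_used_fraction: float = 0.9) -> bool:
    """Return True if the filesystem at `path` is below the usage threshold."""
    usage = shutil.disk_usage(path)
    return (usage.used / usage.total) < max_used_fraction


def load_ok(max_load_per_cpu: float = 1.5) -> bool:
    """Return True if the 1-minute load average is reasonable for this machine."""
    one_minute_load, _, _ = os.getloadavg()  # Unix-only
    return one_minute_load < max_load_per_cpu * (os.cpu_count() or 1)


def service_ok(host: str = "club-server.local", port: int = 443,
               timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the service can be opened.

    The host and port here are hypothetical; point them at whatever
    the club actually hosts.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    checks = {"disk": disk_ok(), "load": load_ok(), "service": service_ok()}
    for name, ok in checks.items():
        print(f"{name}: {'ok' if ok else 'PROBLEM'}")
```

Even a small script like this forces its operators to decide what “healthy” means for their own hardware, which is exactly the kind of understanding that cloud abstractions tend to take away.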
The infrastructure thus provided may be used to serve applications of use and interest to the community, meaning the club members themselves, or to others as the club grants access. Governance could take many forms, but I envision the club I would like to be part of having the following characteristics:
Like a community or personal garden, such infrastructure is unlikely to be able to meet all the information technology needs of its members, but also like a garden it can provide them with community, mindfulness, learning, and fun. It doesn’t break the stranglehold and power of our technological feudal overlords, but it can expose its members to another way of doing things.