When I got into the technology business almost two decades ago, a few things seemed odd. First, that businesses were paying a lot of money for technology that was so hard to use. Second, that businesses would spend weeks or months selecting a high cost technology product and then not deploy it, turning it into what became known as “shelfware.” By far the strangest idea to me, however, was that the people selecting the technology generally were not the people who had to use the technology. This seemed not only a fundamentally flawed idea, but one that didn’t match the realities of technology adoption.
Most enterprise technology organizations have been constructed as command and control structures, with decisions made at the top radiating from there down to the rank and file. Or at least that was the theory. Senior technology executives, for example, would typically sanction a small number of programming languages for official use — historically, Java and .NET were the most commonly approved stacks. Never mind that these, in many cases, were not the best tools for the job — they were what was approved.
This type of top-down organizational control is possible, even today, as long as the cost of a given technology exceeds a certain threshold. If a developer’s only path to procurement is a CIO writing a check, it stands to reason that the CIO will have a say — and usually the deciding say — in a given decision. But the fact is that this control hasn’t been possible for years, because the cost of so many technologies has been zero, or close enough to zero not to matter.
In Linux, developers found a freely available, highly serviceable and always improving Unix-like operating system upon which to build. In MySQL, a lightweight, stripped-down database that likewise was available for free. In Apache, a free web server that was credible enough that even IBM abandoned its own commercial alternative. In Perl, PHP or Python, to name just the Ps, free languages that reduced the time it took to create applications. In Eclipse, a free alternative to expensive Integrated Development Environments (IDEs). In AWS, a very low cost, on-demand source of provisioned hardware. And so on.
There are, of course, dozens if not hundreds of other important projects and products that deserve a mention here. But their ultimate importance is in their aggregate impact. For the first time a developer, be they independent or operating within a larger enterprise, needed no one’s permission to acquire the technology they needed to be productive. Whether they required hardware, storage, an operating system, runtime, database, or development tooling, it was all available to them at no cost. With access to the underlying source code, in most cases.
What this meant was that the balance of power began to tilt in favor of developers. Developers, not their bosses, became the kingmakers. Technology selection increasingly wasn’t determined by committee or bake-offs or who played golf with the CIO, but by what developers decided, on their own, to use.
MySQL salespeople used to walk into businesses, for example, only to be told that they were wasting their time because the business wasn’t using any MySQL. At which point the MySQL salesperson would reply, “That’s interesting, because your organization has downloaded the package 5,000 times in the last two years.” This was and is the new balance of power. Not in every technology sector, of course, but in more of them every year.
Every day we at RedMonk talk to businesses in the technology sector. What an increasing number of them are asking is the simple question: “If developers are the most important constituency on the planet, what should I be doing?” This is the question that The New Kingmakers (a free ebook via New Relic) is designed to answer. If your business involves software — and there are vanishingly few today that don’t — understanding the constituency most likely to separate success from failure is imperative. Those who would be king, after all, need the help of kingmakers.