Open source and low-cost tools shifted the balance of power to developers.
When I got into the technology business almost two decades ago, a few things seemed odd. First, that businesses were paying a lot of money for technology that was so hard to use. Second, that businesses would spend weeks or months selecting a high-cost technology product and then not deploy it, turning it into what became known as "shelfware." By far the strangest idea to me, however, was that the people selecting the technology generally were not the people who had to use it. This seemed not only fundamentally flawed, but also out of step with the realities of technology adoption.
Most enterprise technology organizations have been constructed as command-and-control structures, with decisions made at the top and radiating down to the rank and file. Or at least that was the theory. Senior technology executives, for example, would typically sanction a small number of programming languages for official use — historically, Java and .NET were the most commonly approved stacks. Never mind that these, in many cases, were not the best tools for the job — they were what was approved.
This type of top-down organizational control is possible, even today, as long as the cost of a given technology exceeds a certain threshold. If a developer's only path to procurement is a CIO writing a check, it stands to reason that the CIO will have a say — and usually the deciding say — in a given decision. But the fact is that this control hasn't been possible for years, because the costs of so many technologies have been zero, or close enough to zero not to matter.