Demand for computing power

I'm putting my prediction hat on again; let's come back in a year and see how well I've done.

It's well known that Intel has fallen behind TSMC's fab technology. I think this is the beginning of a trend that will drive up the cost of computing power across the market.

Big buyers of semiconductors carefully forecast their own growth in demand for computing power, and they also estimate the lower costs and improved performance that come with each chip generation, using both to guide their budgets. When Intel stumbles and ships chips that deliver less compute or draw more power than expected, those buyers have to buy more chips to hit the same capacity targets, even though each chip is less cost-effective than they planned for. That means a big capital demand up front and a higher total cost over the lifetime of the hardware.

Increased demand from the biggest volume buyers then means increased prices for the smaller buyers, who are already suffering from the same missed performance and higher running costs. Old hardware doesn't go away, and the new supply of chips doesn't disappear, but this is a big enough supply chain disruption to drive up the cost of computing power for a decade.
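To put rough numbers on that, here's a back-of-the-envelope sketch in Python. Every figure in it is a made-up assumption for illustration (the compute target, the expected and delivered performance per chip, the chip price, power draw, and lifetime electricity cost), not real market data. The point is only the shape of the math: a shortfall in delivered performance per chip forces a larger fleet, which raises both the up-front capital and the lifetime power bill.

```python
# Back-of-the-envelope: what a missed generational gain costs a volume buyer.
# All numbers below are hypothetical, chosen only to illustrate the mechanism.

target_compute = 1_000_000       # units of compute the buyer forecast needing
perf_per_chip_expected = 120     # compute per chip the new generation promised
perf_per_chip_actual = 105       # compute per chip actually delivered
price_per_chip = 2_000           # dollars per chip
watts_per_chip = 350             # power draw per chip
dollars_per_watt_lifetime = 8    # assumed lifetime electricity + cooling cost per watt

def fleet_cost(perf_per_chip):
    chips = -(-target_compute // perf_per_chip)  # ceiling division: whole chips only
    capex = chips * price_per_chip
    opex = chips * watts_per_chip * dollars_per_watt_lifetime
    return chips, capex + opex

chips_planned, cost_planned = fleet_cost(perf_per_chip_expected)
chips_actual, cost_actual = fleet_cost(perf_per_chip_actual)

print(f"planned: {chips_planned} chips, lifetime cost ${cost_planned:,}")
print(f"actual:  {chips_actual} chips, lifetime cost ${cost_actual:,}")
print(f"overrun: {cost_actual / cost_planned - 1:.1%}")
```

With these particular numbers, a 12.5% shortfall in per-chip performance becomes roughly a 14% overrun in total lifetime cost, and the buyer eats an overrun like that on every refresh cycle they had budgeted around the expected gains.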

So what does this mean for the software industry? The cost of self-hosting is going up, but so is the cost of cloud computing. If capital costs are high, expect businesses to lower headcounts and hire fewer engineers to do more. I expect we'll see a move away from productive but computationally inefficient ecosystems like Python and Ruby and toward more computationally efficient ones like Rust, Swift and C++. I don't know whether Java, C# and JavaScript (the languages with top-tier JITs) fall into one category or the other, but they're probably more efficient than not; they might suffer more from the decrease in headcounts than from their increased capital costs.