Compute. Energy. Information. The Levers of Advantage in This Decade
Why compute, energy, and information now behave as a single compounding strategic system for institutions and companies.

Every decade has its shortcuts to power. In the industrial age, it was steel and logistics. In the broadcast age, it was attention and distribution. In the software boom, it was networks and scale. This decade feels different, not because the old levers disappeared, but because three fundamentals are reasserting themselves with unusual force.
Compute. Energy. Information.
They are not buzzwords. They are the operating system of competitiveness for companies and countries, and increasingly for institutions that sit somewhere in between. If you want to understand why some organizations seem to move with impossible speed, why some products arrive earlier than expected, why certain ideas spread while others vanish, follow these three.
They are also tightly entangled. Compute turns energy into capability. Information turns capability into direction. And direction determines where energy and compute are spent. The organizations that treat them as a single system will outpace those that manage them as separate budgets.
Compute: the new industrial capacity
Compute is not just servers and GPUs. It is the ability to turn ambition into iteration. It is a production line for experimentation.
In practice, compute is three things:
1) Infrastructure. Chips, data centers, networks, storage, and the plumbing that makes modern services reliable. This is the visible part of compute, the part that shows up in capital expenditure, cloud bills, and hardware roadmaps.
2) Software and talent. Compilers, kernels, distributed systems, model training pipelines, evaluation harnesses, and the people who can keep them humming. Two companies can buy the same hardware and still live in different centuries depending on how well they can use it.
3) Organizational rhythm. The most overlooked dimension. Compute is only an advantage if it compresses the loop between idea, test, and deployment. The winners are not simply those with the most processors, but those who can ask better questions and answer them faster.
This decade’s defining pattern is that capability is becoming more continuous. In the past, progress often arrived in punctuated leaps: a major product release, a big acquisition, a new factory. Now capability can be trained, fine-tuned, evaluated, deployed, and improved on a rolling basis. That turns compute into a kind of industrial capacity: not for manufacturing cars or phones, but for manufacturing decision quality, personalization, prediction, and increasingly, automation.
It also changes what “scale” means. Scale is not only market reach. It is the ability to run more experiments, more simulations, more evaluations, and more feedback cycles than your competitors. It is the ability to explore a larger space of possibilities, and then compress that exploration into something that feels like taste, timing, and intuition.
Of course, compute is not evenly distributed. Advanced chips are complex to design, expensive to fabricate, and constrained by supply chains. Cloud access makes compute feel abundant, but availability under stress, pricing, and export controls remind everyone that compute is physical. The cloud is a convenience layer over a world of factories, electricity, and logistics.
So compute is becoming a strategic resource again. Not in a cinematic way, but in a practical way: who has priority access, who can afford sustained training runs, who can recruit the engineers who know how to wring performance out of a system, who can keep inference costs low enough to ship features at scale.
In the background, another truth is emerging: compute is no longer a side function of IT. It is increasingly the core capacity of the organization, as central as manufacturing was to the 20th century firm.
Energy: the constraint that bites back
For a while, digital progress felt like it floated above the physical world. We talked about “the cloud” as if it lived somewhere weightless. We built products whose marginal cost approached zero, and we came to expect that everything important could be copied infinitely.
Then the bills arrived, and they were measured in megawatts.
Energy is the quiet governor of the compute era. Every model training run is a conversion of electricity into structure. Every inference request is a tiny draw on a grid. Every data center is a physical footprint: land, cooling, backup systems, fiber routes, and human operators.
Energy matters for three reasons this decade:
1) Scale is colliding with the grid. As demand rises, energy becomes a gating factor, not just a cost line. You cannot deploy your way around an overloaded substation. You cannot optimize your way out of permitting delays. You cannot hire your way out of transformer shortages. At some point, reality is a queue.
2) Energy is becoming more political. Energy security, fuel prices, mineral supply chains, and geopolitical shocks spill directly into industrial strategy. When energy is stable and cheap, it fades into the background. When it is volatile, it shapes what gets built and where.
3) Energy is becoming a competitive differentiator. Organizations that can secure long-term power, improve efficiency, locate intelligently, and design systems that are resilient under constraints will have a persistent edge. The most valuable engineering work may not be another feature, but a 15 percent reduction in inference cost, or a cooling design that unlocks a new site, or a scheduling strategy that follows renewable availability without breaking product latency.
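The scheduling idea above can be shown in a few lines. This is a minimal illustrative sketch, not a production policy: the job names, the price threshold, and the assumption that flexibility is a simple boolean are all hypothetical.

```python
def schedule(jobs, hourly_price, budget_price):
    """Split jobs into (run_now, deferred) based on current power price.

    jobs: list of (name, flexible) pairs; flexible jobs can wait for
    cheaper or greener hours, latency-sensitive jobs cannot.
    hourly_price: current electricity price for this hour.
    budget_price: threshold above which flexible work is deferred.
    """
    run_now, deferred = [], []
    for name, flexible in jobs:
        if flexible and hourly_price > budget_price:
            deferred.append(name)   # batch work waits for a cheap/renewable window
        else:
            run_now.append(name)    # user-facing work never waits
    return run_now, deferred

jobs = [("chat-inference", False), ("nightly-retrain", True), ("eval-sweep", True)]
print(schedule(jobs, hourly_price=140.0, budget_price=90.0))
# at a high price, only the latency-sensitive job runs now
```

A real scheduler would forecast prices and renewable availability rather than react to a single hour, but even this toy version captures the trade: follow the grid where you can, protect latency where you must.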
There is a cultural shift hiding here. The next wave of digital winners will not be purely “software companies” in the old sense. They will be hybrids. They will know the physics of power and heat. They will negotiate with utilities. They will treat energy contracts as product strategy. They will design algorithms with electricity in mind, because electricity is no longer an externality. It is an input.
This is also where the decade’s anxiety gathers. The world is trying to electrify more of its economy while also feeding a growing appetite for computation. If energy supply lags, or if an energy crisis returns in a severe form, the effect will not be confined to household bills. It will ripple through data center construction, cloud pricing, industrial policy, and the pace of AI deployment. The bottleneck will not be ideas. It will be power.
Information: the scarce resource in a world of abundance
We live in an age of infinite content and limited clarity. The result is that information, properly understood, has become both more valuable and more contested.
When people say “information,” they often mean “data.” Data matters, but it is only the raw ingredient. The real advantage comes from turning data into a living system of understanding: what is happening, what is changing, what matters, and what to do next.
Information as a lever of advantage has several layers:
1) Data access and data rights. Who can collect, store, and use which data, under which rules. Regulation, privacy expectations, and platform dynamics shape this layer. Competitive moats are increasingly built not only on data volume, but on legitimate, durable access.
2) Intelligence and interpretation. The ability to translate signals into decisions. This includes analytics, forecasting, and machine learning, but also includes organizational cognition: how teams share knowledge, how they resolve disagreement, how they update beliefs when reality changes.
3) Trust and provenance. In a world where synthetic media is cheap and persuasion is scalable, trust becomes a core asset. Provenance, auditability, and verification are not merely compliance issues. They become the basis of brand value and institutional legitimacy.
4) Narrative and coordination. Information shapes what people believe is possible and what they choose to do together. The organizations that can communicate clearly, build shared context, and align action will outcompete those that drown in their own noise.
The strange paradox of this decade is that tools are making it easier to generate content while making it harder to know what to believe. That increases the premium on high-quality information. It also changes what it means to have an advantage.
A company can have excellent models and a powerful compute stack, and still lose if its information system is polluted: poor data hygiene, misaligned metrics, incentives that reward storytelling over truth, leadership that confuses confidence with accuracy. In that environment, the organization spends compute and energy to reinforce the wrong conclusions.
In other words, information is the steering wheel. Compute is the engine. Energy is the fuel. Plenty of engines burn fuel without going anywhere useful.
The flywheel: how the levers compound
The most important thing about compute, energy, and information is that they compound.
Better information helps you deploy compute where it will actually create value. It also helps you detect waste and reduce energy spend. Better compute lets you process information faster and build smarter systems. Better energy positioning makes compute cheaper, more reliable, and more scalable, which gives you more opportunities to learn from information.
This creates a strategic flywheel. Once it spins, it is hard to stop.
That flywheel explains why certain organizations seem to widen the gap year after year. They do not merely “invest in AI.” They invest in the whole stack: instrumentation, evaluation, infrastructure, energy strategy, governance, and talent. They build internal markets for truth, where ideas compete on evidence. They turn learning into a factory.
It also explains why others feel stuck. They might buy tools, hire consultants, or run pilots, but the system around those efforts is not designed to learn. The data is fragmented, the incentives are misaligned, the infrastructure is brittle, the energy costs are unpredictable, and the organization cannot separate signal from noise. The result is that compute becomes expensive and information becomes political, and the flywheel never starts.
What this means for advantage
For leaders, the implication is not that everyone must become a chip company or an energy trader. It is that advantage will increasingly belong to those who can do three things well:
1) Build capacity, not just projects. Treat compute as a capability that improves over time, not a one-off procurement. Invest in the teams and systems that make iteration cheaper.
2) Treat energy as strategy, not overhead. Efficiency is product design. Power reliability is operational resilience. Location is competitive positioning. The best energy decisions look boring on a slide, and decisive on a balance sheet.
3) Make information trustworthy and actionable. Data quality, governance, evaluation, and transparency are not bureaucratic chores. They are the foundations of speed. In the coming years, the ability to prove what is true, what was generated, and why a decision was made will matter as much as the decision itself.
And for societies, the implication is even larger. These levers shape national competitiveness, but also the distribution of opportunity. Access to compute influences which researchers can contribute. Energy prices influence which regions can host industry. Information integrity influences whether democracies can coordinate and whether institutions can maintain legitimacy.
These are not separate debates. They are one debate, about the foundations of progress.
The gap is moving, not standing still
It is tempting to tell a simple story: the rich will get richer, the powerful will accumulate more compute, more energy, more information, and the gap will widen without limit. There is truth in that, but it is not the whole truth.
In some ways, the gap really is closing. Tools are becoming more accessible. Techniques spread quickly. Open ecosystems let smaller teams build things that once required armies. Education is more available than ever, and a single skilled person can produce work that would have taken a department a decade ago. The floor is rising.
In other ways, the gap may widen sharply. The physical constraints are real. The best chips are scarce. The best infrastructure clusters where energy is abundant and politics are favorable. If an energy crisis returns, or if grids struggle to expand fast enough, the winners will be those who can secure power and ride volatility, while others are forced to ration ambition.
So the decade’s story is not one of uniform divergence or uniform democratization. It is a story of shifting terrain. Advantage will not belong automatically to the biggest, nor reliably to the smartest, but to those who can connect these fundamentals into a coherent system: compute that converts energy efficiently, information that directs compute wisely, and institutions that keep trust intact when the world is loud.
That is the real contest. Not who has the most of any single lever, but who can pull all three, in rhythm, while the ground keeps moving.