From Power to Chip: How the Average Person Can Participate in the Wealth Opportunities of the AI Era

Bitsfull · 2026/03/16 16:09



Editor's Note: When people talk about AI, the focus is often on the most visible aspects: chatbots, AI assistants, and various new applications. However, behind these products, a deeper industrial restructuring is taking place. From power to chips to data centers, and from models to applications, AI is actually a technology stack consisting of multiple layers of infrastructure, and the flow of capital and profit is far more complex than meets the eye.


This article, from the perspective of the "AI Five-Layer Structure," systematically analyzes this value chain: why billions of dollars are flowing into energy, chips, and cloud infrastructure; why model companies are burning through cash at a rapid pace; and where the true value may be concentrated in this technological revolution.


By comparing AI to historical cycles such as the power revolution and the construction of internet infrastructure, the author attempts to answer a key question: in this technological wave that may reshape the global industrial structure, where is capital flowing, and how can the average person participate in this AI wealth opportunity.


The following is the original text:


Most people think AI is just a chatbot.


I can understand why you'd think that. You open ChatGPT, ask it to help you edit an email, and it does so immediately. It feels like magic. So, you close the page, thinking you now understand what AI is all about. But that's like swiping a Visa credit card at a restaurant and then thinking you understand how Visa makes money. You've only used the product without seeing the underlying system.


For most of the past year, I've been trying to figure out where AI's real profits actually go. The embarrassing part is that it took me a long time to realize I was looking at the wrong layer. I was focused on ChatGPT, Claude, and Gemini, the things you can interact with directly.


Meanwhile, $700 billion has quietly flowed into a set of infrastructure I can barely name: chips I had never heard of, packaging technologies that sound made up, cooling systems, power plants. In Texas, Iowa, and Hyderabad, vast amounts of concrete are being poured for data centers.


A year ago, almost no one around me was talking about these things. Now everyone is.


This article is going to be quite long. If you don't have time to finish reading it now, you can bookmark it and come back to it later.


I want to take you through the complete AI value chain: starting from the electricity powering the data centers, all the way to the applications on your phone.


And I will explain it so that you can follow along even if you have never read a public company's annual report in your life. I will define every term; I will back every judgment with real data; and where I am still uncertain, I will say so honestly, because there are indeed such areas.


So, let's get started.


I. The Five-Layer Cake (Why No One Discusses the Bottom Four Layers)


AI is infrastructure. Just like the internet, just like electricity, it requires factories. — Jensen Huang


Most people understand AI in this way: a smart computer answering questions.


This is like saying the internet is "a place to watch videos." Technically correct, but completely missing the point.


In January 2026, at the World Economic Forum, Jensen Huang described AI as a five-layer system:


Energy


Chips


Cloud


Models


Applications


He referred to this whole system as the "largest infrastructure construction in human history."


First, think about this term: Infrastructure.


Roads. Power grids. Water supply systems. These things keep modern civilization running, but people usually only notice them when they go wrong.


AI is becoming the same kind of thing—invisible, indispensable, with an extremely high construction cost. I refer to this entire structure as the AI Stack. It consists of five layers, one stacked upon the other, with each layer supporting the one above it, and money flowing bidirectionally between these layers.


The simplest version I can give is this:


Energy: you need electricity to power the computers, and a lot of it.


Chips: you need processors dedicated to this kind of computation. Not the CPU in your laptop.


Cloud: you need massive warehouse-style data centers filled with these chips and interconnected by ultra-high-speed networks.


Models: you need the actual AI software, an "intelligent brain" that learns patterns from data.


Applications: you need products people actually use, like ChatGPT, Google Search, or a bank's anti-fraud system.
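As a mental model, the stack can be sketched as an ordered structure, each layer resting on the one below. A toy illustration in Python (the example players listed are my own shorthand, not an exhaustive list from the article):

```python
# A minimal sketch of the AI stack: five layers, bottom to top,
# each layer depending on the one below it.
# Layer names follow the article; example players are illustrative.
AI_STACK = [
    ("Energy", "electricity to power everything", ["utilities", "nuclear", "gas"]),
    ("Chips", "specialized processors", ["Nvidia", "TSMC", "ASML"]),
    ("Cloud", "data centers full of chips", ["AWS", "Azure", "Google Cloud"]),
    ("Models", "trained 'intelligent brains'", ["GPT", "Claude", "Gemini"]),
    ("Applications", "products people actually use", ["ChatGPT", "Copilot"]),
]

# Walk from the bottom up: each layer supports the one above it.
for depth, (name, role, players) in enumerate(AI_STACK, start=1):
    print(f"Layer {depth}: {name}: {role} (e.g. {', '.join(players)})")
```

Any discussion that mentions only the last entry in this list is ignoring the four layers that make it possible.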


Any AI discussion that only talks about the fifth layer (application layer) ignores a full 80% of reality. And if you're an investor, entrepreneur, or just someone trying to understand where the world is headed, the truly important point is that money does not flow evenly across these five layers. It consolidates, compounds, and flows to a very few key nodes.


And today, that money is flowing to places most people are not paying attention to at all.



II. Tracking the Flow of Funds (The Answer Is Not Where You Think)


People's attention is almost entirely focused on the application layer. ChatGPT, GitHub Copilot, Claude, Perplexity.


These are products you can use directly, so it's easy to assume that these applications are the whole story of AI.


But most people overlook one thing. By 2026, the world's top four cloud computing companies (Amazon, Microsoft, Google, Meta) are expected to collectively have a yearly capital expenditure (CapEx) of $650 billion to $700 billion.


That's in one year, for the four companies combined.


This figure is roughly equivalent to the entire annual GDP of Switzerland. And roughly $450 billion of it, about two-thirds, will go directly into AI infrastructure.


It's not chatbots, not applications. It's buildings, chips, fiber optics and networks, cooling systems, things that hardly anyone talks about at cocktail parties. And that's exactly where the money is.


When you really stop to think about it, before anyone can use ChatGPT, someone had to first do one thing: build a shopping-mall-sized data center, fill it with tens of thousands of specialized processors, connect them with networking gear worth more than most companies, and supply the whole thing with enough electricity to run a small city. And keep doing this, every single day.


That's Layers 1 through 3: Energy, Chips, Cloud Infra—these are the invisible layers, the places where the real massive capital deployment goes.


Someone might ask: "But what about OpenAI? Haven't they made billions already?"


Indeed they have.


By the end of 2025, OpenAI's Annual Recurring Revenue (ARR) had reached $20 billion, up from $6 billion a year ago and $2 billion the year before that.


Growing tenfold in two years at this scale is a rarity in commercial history.
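The arithmetic behind the "10x in two years" claim is worth making explicit; a quick sanity check using the article's figures ($2B to $6B to $20B of ARR):

```python
# Sanity-check the "10x in two years" claim using the article's ARR figures:
# $2B -> $6B -> $20B over two years.
start, end, years = 2.0, 20.0, 2          # billions of USD

multiple = end / start                     # total growth multiple
cagr = multiple ** (1 / years) - 1         # compound annual growth rate

print(f"{multiple:.0f}x over {years} years = {cagr:.0%} annualized")
```

That annualized rate, north of 200% per year, is the context for why the equally eye-watering cost figures below still attract capital.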


But the catch is, the costs are just as staggering.


2025: OpenAI burns through about $9 billion in cash


2026: Expected burn is $17 billion


Just the inference costs, i.e., the cost to actually run the model when you ask AI a question:


2025: $8.4 billion


Expected 2026: $14.1 billion


Based on current projections, OpenAI might not achieve cash flow positivity until 2029 or 2030.


So the question is: Where is all this money going?


The answer: Flowing down the AI tech stack.


Flowing to:


Microsoft Azure (under their agreement, OpenAI pays Microsoft 20% of its revenue through 2032)


Nvidia's GPUs


Engineering firms building the data centers


And energy providers powering it all


If you stare at this system for a little while, you'll notice an almost cyclical structure:


Microsoft invests in OpenAI


OpenAI uses this money to purchase Azure cloud services


Azure uses revenue to purchase Nvidia chips


Nvidia announces record profits


Everyone claps


And then the funds keep flowing down.
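The near-circular flow described above can be traced with a toy ledger. All the amounts below are invented, purely to illustrate the shape of the flow, not actual contract values:

```python
# Toy ledger tracing the circular flow described above.
# All dollar amounts are invented, for illustration only.
balances = {"Microsoft": 0.0, "OpenAI": 0.0, "Azure": 0.0, "Nvidia": 0.0}

def pay(frm, to, amount):
    """Move money between participants and record the hop."""
    balances[frm] -= amount
    balances[to] += amount
    print(f"{frm} -> {to}: ${amount}B")

pay("Microsoft", "OpenAI", 10)   # equity investment
pay("OpenAI", "Azure", 8)        # cloud compute purchases
pay("Azure", "Nvidia", 6)        # chip orders

# Every hop moves money one layer down the stack; Nvidia,
# at the bottom of this loop, ends up with the largest inflow.
print(balances)
```

The point of the toy model: each participant in the loop keeps only a slice, and the residual accumulates at the infrastructure end.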


There is a crucial structural fact in the AI tech stack:


The vast majority of users are at the top layer (application layer)


The vast majority of profits are at the bottom layer (infrastructure layer)


And this misalignment between user position and profit position is at the core of the entire AI investment logic.


This is the first law of the AI value chain: revenue flows up, capital settles down.



III. You've Actually Seen This Scene Before


All human problems are essentially engineering problems, and engineering problems can eventually be solved. —Buckminster Fuller


If you want to truly understand what AI is undergoing, you can look back at the history of the electricity revolution from 1880 to 1920.


In 1882, Thomas Edison built the first commercial power station on Pearl Street in Manhattan, New York. At that time, most people thought electricity was just a novelty, a more "sophisticated" way of lighting. After all, gas lamps worked just fine. Who really needs this thing?


But in just 40 years, electricity completely reshaped almost every industry: manufacturing, transportation, communication, healthcare, entertainment.


The real winners of this revolution were not the inventors of the light bulb, but those who built the infrastructure: General Electric, Westinghouse Electric, power companies, copper mining companies, engineering firms.


Today, AI is repeating the same pattern, only the pace has been compressed to a few years instead of decades.


Contrast the two chains:


AI Stack: AI → Data Center → Chip → Raw Materials → Energy


Power Stack: Power → Factory → Machinery → Raw Materials → Coal / Hydro


The two paths are almost identical. And once again, the winner is not primarily at the application layer, but at the infrastructure layer.


I call this phenomenon Infrastructure Gravity. Whenever a new computing platform emerges, those who first create wealth are always the "ones selling the shovels."


Applications may eventually rise to the top and receive all the media attention. But infrastructure captures most of the profit.


For example, in fiscal year 2026 (ending January 2026), Nvidia posted annual revenue of $215.9 billion, up 65% year on year. In the most recent quarter alone, the data center business generated $62.3 billion in revenue, up 75% year on year, and now accounts for 91% of Nvidia's total revenue.


In other words, this is a company with roughly $68 billion in quarterly revenue, over 90% of it from a single business line.


Now look at chip manufacturing. In 2025, TSMC held about 70% of the global semiconductor foundry market, with sales of $122.5 billion. Samsung Electronics, in second place, had just 7.2%. That level of dominance makes even Standard Oil in its day look tame.


Infrastructure always wins first. The real question is how long this window of opportunity will last.


Ask anyone what the internet revolution was, and they will say Google, Amazon, Facebook.


But if you ask where the earliest money was made, the answer is actually Cisco Systems, Corning, the companies laying fiber optic networks.


It's the same story, just in a different era.


IV. The Unpopular Part


The stock market is a machine to transfer money from the impatient to the patient. —Charlie Munger


I have to admit something. When I first started looking at AI from an investor's perspective, I also made the same mistake as most people; I looked at the application layer. I saw the growth of ChatGPT. I saw Anthropic raising billions of dollars. So I thought, AI companies will win, and that's where I should invest.


Later, three things changed my perspective, and they happened in sequence.


First Thing: Hottest Companies Are Burning Cash


I found that almost every "AI company" is burning cash like crazy. OpenAI, Anthropic, Mistral AI, xAI. All of them are spending money at a rate much higher than they are earning. The reason is not a poor business model, but rather that the cost of computation is structural.


Every time you ask AI a question, the system must perform real computation. Computation requires GPUs, and GPUs require electricity. And the stronger the model, the higher the computational demand, so operating costs only keep climbing.


In other words: The perceived winners in AI are actually the biggest spenders.


Second Thing: Biggest Earners Are Foundational


I noticed that infrastructure companies are printing money. Nvidia's gross margin is close to 75%, and TSMC is ramping up production while raising prices, because demand far exceeds supply.


These companies do not have a "when will we profit" problem. Their problem is that they can't build fast enough. Those are entirely different problems.


Third Thing: Stop Thinking Like a 'Consumer' (Also the Most Uncomfortable)


I realized that I had been thinking about AI like a consumer.


What consumers see are applications. What engineers see is the tech stack. Once you see the entire tech stack, you can no longer ignore it.


Every AI release becomes a Capital Expenditure (CapEx) announcement. Every model upgrade becomes a new chip order. Every new feature becomes a new data center lease.


The whole industry is starting to resemble concentric circles: the closer to the center, the more concentrated the profits.


Perhaps you are: a software engineer focused on AI models, a retail investor who bought Nvidia at $300, or someone in India observing this revolution from afar (maybe you're all three at once – that's the most interesting position).


Regardless of where you stand, the principle is the same. Consumers see the product, investors see the supply chain. And the best investors see what has been formed in the supply chain before the product even launches.


V. An Investor's Map: Breaking Down the AI Stack Layer by Layer


The article is already quite long, so I will pick up the pace.


Below is the structure of each layer of the AI Stack, key players, and potential opportunities.


Layer 1: Energy


AI data centers are extremely power-hungry. A single large model training run could consume as much electricity as a small town does in a year. By 2026, global AI data centers are expected to consume around 90 terawatt-hours of electricity annually, roughly a tenfold increase from 2022.
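The article's "roughly tenfold since 2022" figure implies a 2022 baseline of about 9 TWh; the implied annual growth rate is worth computing explicitly (the baseline and the arithmetic here are my own, derived from the article's tenfold claim):

```python
# Implied growth rate of AI data center electricity demand,
# derived from the article's "10x from 2022 to 2026" claim.
start_twh, end_twh, years = 9.0, 90.0, 4   # 2022 baseline implied by the 10x figure

cagr = (end_twh / start_twh) ** (1 / years) - 1

print(f"Implied growth: {cagr:.0%} per year")
```

A demand curve compounding at nearly 80% per year is why the grid itself becomes the bottleneck discussed below.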


This presents a very straightforward investment thesis: those who can provide stable power to data centers stand to benefit. This includes nuclear power companies, natural gas companies, renewable energy companies, grid operators, especially energy companies near data center clusters.


In October 2025, Jensen Huang said that data centers may be able to generate their own power faster than they can get it from the grid. Indeed, many tech companies are already building generation facilities directly next to their data centers, bypassing the grid entirely.


This point is quite shocking to me. These tech companies are becoming their own power companies.


Beneficiaries include utilities, independent power producers, and power equipment manufacturers (transformers, switchgear, and the like). In Asia, for example in India, power equipment and transmission companies will also benefit as hyperscaler data centers expand.


Layer 2: Chips


This is the most familiar layer to the general public because of Nvidia. However, this layer is far more complex than just one company.


The chip layer can be further subdivided into several sub-layers:


Design Companies


Nvidia (GPU), AMD, Broadcom, Qualcomm


And an increasing number of cloud giants' in-house chips: Google TPU, Amazon Trainium, Microsoft Maia


Manufacturing Companies


Almost monopolized by TSMC with a market share of around 70%, followed by Samsung Electronics (7.2%). Intel is attempting to rebuild its foundry business, but this will take years.


Equipment Companies


The machines that make chips come from ASML (the only company producing EUV lithography machines), along with Applied Materials, Lam Research, and Tokyo Electron.


Memory Companies


AI models require a large amount of High Bandwidth Memory (HBM). Key players: SK Hynix, Samsung, Micron Technology


Packaging Technology


Advanced packaging technologies (such as TSMC's CoWoS) have become a new bottleneck.


The most surprising aspect of this layer is actually the concentration:


Nvidia: about 92% AI GPU market share


TSMC: manufactures nearly all AI chips


ASML: the sole EUV equipment supplier


One company designs. One company manufactures. One company produces manufacturing machines. This concentration is both an investment opportunity and a geopolitical risk.
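One standard way to quantify this concentration is the Herfindahl-Hirschman Index (HHI), the measure used in US antitrust analysis: the sum of squared market shares, where anything above 2,500 counts as "highly concentrated." The leading shares below come from the article; the small trailing shares are my own assumptions to fill out each market:

```python
# Herfindahl-Hirschman Index: sum of squared market shares (in percent).
# US antitrust guidelines treat an HHI above 2,500 as "highly concentrated."
def hhi(shares_pct):
    return sum(s ** 2 for s in shares_pct)

# Foundry market, per the article: TSMC ~70%, Samsung 7.2%.
# The trailing shares are assumed, to represent a fragmented remainder.
foundry = [70, 7.2, 5, 4, 3, 2]
print(f"Foundry HHI: {hhi(foundry):.0f}")

# AI GPU design, per the article: Nvidia ~92%. Trailing shares assumed.
gpu = [92, 4, 2]
print(f"AI GPU HHI: {hhi(gpu):.0f}")
```

Both markets land several times above the "highly concentrated" threshold, which is exactly why a single disruption at any of these nodes is a systemic risk.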


Layer 3: Cloud & Data Centers


This is where the chips truly come alive.


Huge warehouse-like facilities:


Thousands of servers


High-speed network connections


Liquid cooling systems (moved from optional to standard)


The market is dominated by three major cloud players:


Amazon Web Services (31%)


Microsoft Azure (24%)


Google Cloud (11%)


Oracle is also rapidly expanding, planning $50 billion in capital expenditures by 2026. But this layer is much more than just hyperscalers.


For example:


Foxconn assembles 40% of AI servers


Arista Networks provides network equipment


Credo Technology (stock price up 117% in 2025)


Vertiv provides liquid cooling


Data Center Real Estate Companies:


Equinix


Digital Realty


Even concrete suppliers are involved; every layer has a complete supply chain of its own.


According to Bank of America's estimate, by 2026, hyperscalers will allocate 90% of their operating cash flow to capital expenditures. In 2025, this figure was 65%.


Morgan Stanley forecasts that these companies will issue over $400 billion in debt this year to build data centers. In 2025, this number was $165 billion.


When I first read that number, I paused. $400 billion in debt in a year just to build more warehouses full of computers.



Layer 4: Model


This layer is the "brain layer," responsible for training and building actual AI models.


Main players include:


OpenAI (GPT series, with annual revenue of over $20 billion)


Anthropic (Claude, reportedly at an annualized revenue run rate of around $19 billion in early 2026)


Google DeepMind (Gemini)


Meta AI (Llama, open-source model)


Mistral AI


xAI (developing Grok)


This layer fascinates me because it is both the most hyped and the least profitable.


For example:


OpenAI's revenue growth is unprecedented, but it is still expected to burn $17 billion in cash in 2026.


Anthropic is growing equally fast but relies heavily on outside financing: a $5 billion funding round in early 2026 valued the company at around $170 billion.


The issue is that this layer's business model has a structural contradiction. As models get stronger, they need more compute, and compute costs tend to grow faster than revenue.


This is somewhat akin to running a restaurant, where each new dish requires more expensive ingredients, but customers expect prices to remain the same.


The result is that profit margins are constantly squeezed.


When will this change? I'm not sure, perhaps not in the short term.


For investors, this layer represents high risk and high reward. The problem is that most companies are still private.


Thus, investment exposure on the public market mainly comes from two channels:


Cloud Computing Companies


For example, Microsoft owns a significant stake in OpenAI and provides computing power to it through Microsoft Azure.


Chip Companies


Because their hardware is heavily consumed during the model training process.


Layer 5: Applications


This is the layer you see every day: ChatGPT, Google Search powered by Gemini, the Copilot features in Microsoft Office, banks' AI anti-fraud systems, Netflix's recommendation algorithm, or the AI image enhancement on your phone.


The application layer is the broadest and most crowded layer. Thousands of startups and large enterprises are competing here. In the long run, it may become the largest layer in market size. Some forecasts suggest that by the early 2030s, the market size of the application layer could exceed $2 trillion.


But at the current stage, this layer also has the thinnest margins and the most uncertain competition.


In this layer, true differentiation comes from data. Companies with unique, proprietary data will establish lasting advantages.


For example:


Salesforce: enterprise CRM data


Bloomberg: financial market data


Epic Systems: medical records data


A company that masters this data moat can perform deep fine-tuning on AI models, something a generic chatbot cannot achieve.


For investors, the application layer may ultimately offer the most significant upside, but it also carries the highest risk of capital destruction.


Most AI startups will fail, with only a few survivors achieving exponential compounding growth.


The most likely investment thesis in the next 3 to 5 years is to bet on infrastructure now and applications later. The smartest capital has already positioned itself this way.


Companies that will truly win at Layer 5 are likely those that possess data that others cannot access.


Interestingly, many of these companies don't even consider themselves AI companies.




VI. AI Risk: "Isn't This Just Another Bubble?"


An investor's greatest enemy is likely themselves. —Benjamin Graham


Let's address the most common question head-on. "What about the internet bubble? Isn't this the same thing? Massive infrastructure investments, no profits, everyone caught up in hype."


This is a good question and deserves a thoughtful answer.


The key difference is that during the internet bubble era, companies were building infrastructure before the demand had truly materialized. Companies were laying fiber optics, constructing server farms, while most internet users were still on dial-up.


The result was that the infrastructure was in place, but the real demand only emerged 5 to 7 years later. During that interim period, a large number of companies went bankrupt.


By 2026, the demand for AI already exists. Nvidia's chip supply is unable to meet demand, TSMC's advanced packaging capacity is fully booked, and cloud computing lease prices are rising instead of falling. Meanwhile, from March to October 2025, OpenAI added 400 million weekly active users. Models are being utilized.


Compute is being consumed. Customers are paying. This does not mean there is no risk. In fact, the risk is substantial, and I likely think about this issue more frequently than I'm willing to admit.


Three points are particularly worth noting.


Capital Misallocation Risk


By 2026, tech companies will spend over $650 billion on data centers.


If the growth rate of AI services revenue is not sufficient to support these investments, many companies will face significant margin compression. Even Amazon's free cash flow could turn negative this year.


This is Amazon we're talking about, a company that virtually invented the cloud computing business model.


Supply Chain Concentration Risk


The AI supply chain is highly concentrated.


TSMC holds about 70% of the global chip foundry market.


ASML is the sole EUV lithography machine supplier.


Nvidia designs 92% of AI data center GPUs.


Any major disruption, whether geopolitical, a natural disaster, or a shift in the competitive landscape, could ripple through the entire AI supply chain.


For example, a major earthquake in Hsinchu, Taiwan, could set global AI development back by years. This should be deeply concerning.


The DeepSeek Variable


In January 2025, the Chinese AI lab DeepSeek released a model whose performance was on par with state-of-the-art models, but whose training cost was reportedly a tiny fraction of theirs.


This challenges a core assumption that more computing power input always leads to better AI.


If future open-source and high-efficiency models continue to narrow the gap, the infrastructure investment logic will be undermined.


I don't believe DeepSeek overturns the entire AI investment logic. But it does introduce a previously nonexistent variable. And once this variable emerges, it won't disappear.


But I always come back to a larger framework.


The long-term forecasts from consulting firms are as follows: McKinsey & Company expects global data center investment to reach $6.7 trillion cumulatively by 2030; PwC estimates AI to contribute $15.7 trillion to global GDP by 2030; the International Data Corporation (IDC) forecasts that AI-related solutions will have a cumulative economic impact of $22.3 trillion.


Even if these numbers are overestimated by 50%, we are still facing the largest technology-driven economic transformation since the Internet. The issue is not the direction but the scale.


I often hear people say, "I'm skeptical about AI."


Of course, you can be.


You can be skeptical of the model's capabilities, skeptical of the development timeline, but do not overlook the supply chain structure.


These are two entirely different things. One is healthy, rational skepticism; the other will make you miss the opportunity.


Five years from now, the winners of this cycle will look obvious in hindsight.


History is always like that. And the key to this game right now is: to understand the structure before others see it.


VII. Engage in This Game at the Right Level


Imagine AI as a five-level video game. Each level is a different checkpoint.


Level 1: Energy


This is the beginner tutorial level. Important, simple, and as long as you play it straight, you can hardly lose. Low risk, stable returns.


Like a quest-giving NPC: it won't die, and it always hands out rewards.


Level 2: Chip


This is the Boss battle. Power is most concentrated, profits are highest. But at the same time, there is the highest technical risk and geopolitical risk.


The rewards are huge, but it's Hard mode.


Level 3: Cloud Computing


This is the multiplayer server, where all players are active. Hyperscalers are like server administrators, taking a cut from all transactions.


Level 4: Model


This is the PvP arena. The competition is extremely fierce, and the pace of innovation is very fast.


Most players will be eliminated; only those with the best equipment survive.


Level 5: Application


This is an open-world map. The possibilities are endless, but there are no fixed rewards. You have to find tasks yourself.


The true Meta Strategy is simple. You don't need to complete all levels.


Most people go play Level 5 because it's the most eye-catching. But the smartest money right now is grinding Levels 2 and 3 because that's where the highest returns are at this stage.


Your position in the tech stack determines what you should focus on.


For Non-Techies


You don't need to understand how a GPU works. You just need to know that someone has to manufacture GPUs, someone has to build data centers for them, someone has to power them. And these companies are all publicly listed, so you can read their financial reports.


For Techies


You already know models are getting stronger. But you might be underestimating one thing: the real bottleneck is shifting to the physical world: power, cooling, chip packaging. The AI competition of the next decade may be less about model architectures in papers and more about engineering.


For Investors


The AI value chain is actually five different trades. Different risks, different time horizons, different winners. Treating AI as an industry is akin to treating "tech" as an industry in 1998. Massive internal differences.


This situation won't last forever. One day, infrastructure will mature, the application layer will consolidate, and value will shift back up.


The internet age was the same. Ultimately, the real money was made by Amazon, Google, Facebook, not the fiber optic companies and server manufacturers.


AI hasn't reached that stage yet. It's still the infrastructure stage, the selling-shovels stage.


And right now, the shovels are making crazy money. Those who understand the full tech stack will see the signals before the inflection point.


Others will be surprised time and time again, wondering where the money is flowing.


Ten years from now, understanding the AI tech stack will be as foundational as understanding a balance sheet.


Remember three things: Understand the tech stack. Map out the hierarchy. Track the flow of capital.


That's the game.

