Despite the news catching up, critical investment insights remain underreported, emphasizing the need for proactive, informed decision-making in evolving markets.
You know how, while you're reading, an article pops up and it's about something you already know? The "Ok, let's see where this is going" feeling? I got a lot of that over the past few months, for the exceedingly unusual reason that there suddenly seemed to be a bunch of news about topics we already wrote about. Not last week or a month ago, but years ago.
Topics that once seemed so lone-voice or out-there that our "R" (our reputational-risk sensitivity) called for carefully phrased language and detailed exposition. It's one thing to be an outlier; it's another to be labelled extreme. Explain it thoroughly enough, and you might be ok.
It's not as if anything we wrote about, or variations of it, hadn't happened many times before in history. These observations appeared extreme only because there was no resonance whatever with the institutional investor community or financial news media, which seemed unaware of-or entirely unconcerned about-any of these issues. Issues that were part of a plexus of an incipient, likely historically inflationary environment. (Issues like record national debt leverage; strategic hard commodity supply limitations; the ageing out of disinflationary forces like the manufacturing labor arbitrage between the U.S. and emerging economy nations; and the remarkable appearance of non-debaseable currency (!!), aka Bitcoin).
Perhaps it's because the institutional investor focus is on next month and next year, not what happened 20 years or a century ago or in another hemisphere. Or because of the "I'll believe it when I see it" factor. People don't like a "new normal." We were early observers in a market that likes a year early, just not years early.
But here they were, in recent weeks, articles about the record level of government debt and about the impact of Federal interest expense on discretionary spending programs. Even related pieces about FEMA running out of disaster relief funding.
Articles about Chinese semiconductor manufacturers maybe beginning to challenge Apple (AAPL) and Nvidia (NVDA).
And even about data centers. (Boy, that one was quick! People like this topic!) About how much electric power they require. (No more stories about bitcoin, which pales in comparison; now it's about data centers consuming more power than small European countries.) Articles about how much the major IT companies are spending on those data centers. Even down to details like their rampant water use-the market can be efficient about a subject if investors are looking at it.
Articles about large-scale merger activity and oil reserves purchases in the Permian Basin, even though there is not the slightest whiff or expectation of higher oil prices. What's all that activity about?
So, even through the traditional information sources, the investing public is beginning to see elements of some of the pressures and changes that have been regular fare over here. Eventually, if something secular and significant keeps happening, you can't keep not seeing it.
So, for this Quarterly Commentary, it seemed like a fine idea to let the news lead the way and save us some work.
We did not seek out these articles in order to fit a narrative. They were collected sporadically and haphazardly as they "popped up." No doubt, selection bias and confirmation bias played their unsubtle parts. The product is impressionistic, not comprehensive.
Also, these news excerpts were selected for their information content, not just as visual props to emphasize a point. They're intended to be read as an integral part of the Commentary [even though they can't be read aloud during the webinar].
Editor's Late Note:
Despite high hopes for the experiment of a Commentary told side-by-side with general-circulation newspaper articles, it reached a failure point about two-thirds of the way through, despite best late-stage efforts to locate the relevant content.
Although elements of the larger picture are being told, the most important ones are still absent. The ones that lead to actual conclusions and how they might inform investment decisions. But, at least one needn't be embarrassed to say them aloud, now.
It was the June release of the Congressional Budget Office's 10-year projections for the federal budget and the economy, 2024 to 2034, that sparked articles like this one (prior page) from the Washington Post.
At the beginning of the Covid-19 pandemic, we sketched out how the continued expansion of the government's already excessive balance sheet leverage - mostly by piling on annual budget deficit borrowing - would put it on a course to exceed the record level reached at the end of World War II in 1946. Newspapers are now willing to note the same.
Successive Commentaries described that these figures are not mere abstractions, but would eventually have practical, lived consequences, such as when net interest expense becomes so large as to squeeze out discretionary spending. Discretionary spending programs (as opposed to Social Security and Medicare, which are the largest budget items by far) include, among over 800 line items, the Veterans Health Administration; Operations and Maintenance for the several military branches; salaries for the FBI; separate line items for various National Institutes of Health, like the National Heart, Lung, and Blood Institute, and the National Cancer Institute; and the Disaster Relief Fund of FEMA.
The CBO report included a chart showing the crowding-out dynamic visually. Even under its remarkably optimistic assumptions, the Net Interest line will cross over the Discretionary spending line. An ominous pictorial.
Optimistic? They presume a decline in inflation to 2% from 2026 through 2034. Included in that number is a lower rate of energy inflation. We're personally in touch with some very experienced analysts who have some familiarity with the energy sector (our desks are within shouting distance), who would suggest that energy price inflation is more likely to increase dramatically than decrease modestly in the coming decade.
The CBO also projects that the interest rate on 10-Year Treasury notes will still be at today's 4.1% in 2034. This despite forecasting that debt leverage and annual new borrowings will exceed all historical experience since 1946 - which is to presume that more debt leverage, and more money printing to support it, will not affect the perceived credit quality or the yield required by potential lenders.
But the CBO ignores present-time reality less readily than it does far-future reality. In response to the proposed fiscal policies of the presidential candidates, the CBO raised its estimate of what Net Interest expense as a proportion of GDP would be in 2031. It raised it very bigly. The 2021 10-Year Outlook projected that interest would account for 2.4% of U.S. GDP in 2031; this year's report raises that figure to 3.7% of GDP. That's a 50-plus percent increase in the share of the entirety of U.S. economic output that interest expense will consume - a diminution of 1.3 percentage points of what GDP would otherwise have provided. That massive estimate change occurred in just three years.
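For those who like to check the arithmetic, here is a minimal sketch of that revision. The two percentage figures are the CBO's, as quoted above; nothing else is assumed:

```python
# Sanity check of the CBO revision arithmetic described above.
interest_share_2021 = 0.024  # net interest as a share of 2031 GDP, per the 2021 outlook
interest_share_2024 = 0.037  # same metric, per this year's report

relative_increase = interest_share_2024 / interest_share_2021 - 1
added_drag = interest_share_2024 - interest_share_2021

print(f"Relative increase in the projection: {relative_increase:.0%}")  # ~54%
print(f"Added drag on GDP: {added_drag:.1%} points")                    # 1.3 points
```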
All sorts of assumptions go into a 10-year economic projection. Another non-partisan, well respected organization that examines such public policy assumptions is the University of Pennsylvania's Penn Wharton Budget Model, PWBM. Its 10-year economic projections incorporate both the Harris and Trump campaign tax and spending proposals (like the Harris expanded Child Tax Credit, or Trump's elimination of taxes on Social Security benefits), along with the expected impact on things like average wages and income, GDP, and the government debt.
As to budget deficits, relative to the baseline model of what is already projected:
Those operating losses are before including interest expense (see the accompanying OMB chart).
Debt: the PWBM projects that total debt held by the public would increase - above the base case - by:
That's the way it's going, even with benign assumptions about interest rates, economic growth, and so on.
What neither model covers is the ultimate or end-game implications of the crowding out, by rising interest expense, of critical discretionary budget categories.
In one sense, it's not a problem, because critical budget items will ultimately be paid; they must be. That simply requires yet more borrowing and money creation, which is politically determined.
But from that deficit spending-given the existing debt load and scale of the annual additions and build-up of the interest burden-there eventually comes a tipping point beyond which the economy can no longer grow as fast as the debt and interest expense do.
One can look at the tipping point:
If the current Federal debt held by the public is $34.8 trillion (which it is), and if the current annual deficit is $1.9 trillion, then the debt is increasing at a 5.5% rate even before refinancing below-market-coupon Treasuries at market rates as they mature.
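A minimal sketch of where that arithmetic leads, taking the debt and deficit figures above as given; the nominal GDP level and growth rate are illustrative assumptions of ours, not CBO projections:

```python
# Illustrative projection: debt compounding at ~5.5% vs. a more slowly growing economy.
debt = 34.8e12     # federal debt held by the public, from the text
deficit = 1.9e12   # current annual deficit, from the text
gdp = 28.0e12      # assumed nominal GDP (illustrative)
gdp_growth = 0.04  # assumed nominal GDP growth rate (illustrative)

debt_growth = deficit / debt
print(f"Debt growth rate: {debt_growth:.1%}")  # ~5.5%, as in the text

# If debt compounds faster than GDP, the debt/GDP ratio rises without bound.
for year in range(0, 11, 5):
    ratio = debt * (1 + debt_growth) ** year / (gdp * (1 + gdp_growth) ** year)
    print(f"Year {year:2d}: debt/GDP = {ratio:.0%}")
```

Under these assumptions, the ratio ratchets up by one to two percentage points a year before any refinancing of below-market coupons, which is the essence of the tipping-point question.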
So, when exactly does the nation's debt-and-interest end of the see-saw tip decisively over the revenue side in the playground of economic life?
One might think the silence on this scenario surprising, since neither the CBO nor PWBM graphs suggest that interest expense's share of the budget will cease expanding, and since econometric modelling is designed precisely to tease out such iterations. But that level of commentary might not be within their mandate.
Computer models aside, history is pretty clear about the unpleasantness after passing that tipping point.
Another benign assumption in the federal budget projections is the $20 billion in annual disaster relief appropriations that have been the norm for the past five years. For most of the 2000s, the annual figure was about $9 billion, and before then, mostly $2 to $3 billion. These are inflation-adjusted dollars.
When a given year is worse than budgeted for, supplemental appropriations are requested. But it is the $20-odd billion that's budgeted, even if multiples of that amount will be required.
As of the September 2024 fiscal year end, the combined appropriations were $42 billion, twice the budgeted amount. Events after those appropriations make that figure way off the mark, more on which below.
Measured by the number of billion-dollar weather and climate disasters (which account for the bulk of such damage), 2024 set a record. Disaster categories included wildfires, drought and heat wave events, flooding, and tornado outbreaks.
The instances of billion-dollar events have been increasing very obviously in recent years:
For fiscal 2025, the budgeted appropriations of about $23 billion don't include the major hurricane-related disasters that occurred this past September 24th and October 9th, which couldn't make it into the 2024 or 2025 budgets. One week into the 2025 fiscal year, FEMA's 2025 budget was already half expended.
The inflationary impact of this trend is transmitted along at least two paths.
One path is through the government, since appropriations must get larger, which means additional deficit spending and borrowing, which means larger-scale money printing, which dilutes the value of each unit of the currency and savings. These tens or scores of billions of disaster relief dollars might seem trivial by comparison with a multi-trillion-dollar budget, but they are not.
To illustrate, let's say that the ultimate FEMA spending for 2024 or 2025 is $75 billion. The figure for 2018, which included hurricanes Harvey, Irma and Maria, was $60 billion. And 2005 - which included hurricanes Katrina, Rita, and Wilma - was a $103 billion year. These are the same size as the budgets for discretionary programs like the Cybersecurity and Infrastructure Security Agency, the Federal Election Commission, the National Park Service, the Capitol Police, and research activities for the National Science Foundation. Among Mandatory Outlays, Unemployment Compensation is $40 billion and Military Retirement Spending is $80 billion.
The vulnerability of Discretionary Outlays is made clearer by looking at just how big the swings in available funding now are in response to the most modest changes in the balance between tax revenues and nondiscretionary expenses. The following table shows the changes between the 2023 Federal budget and what, in June 2024, the CBO projected the 2025 tax revenues and outlays to be. Mandatory Outlays are mostly Medicare, Medicaid, and Social Security. Interest expense must be paid, of course, but is listed separately as part of Total Outlays.
Revenues were actually projected to rise a lot more than Mandatory Outlays, which is a very good direction to be going in. That would have provided over $225 billion in additional funding for Discretionary budget items, a 33% increase. Except that interest expense increased by much more than that. So much so that what was left over for Discretionary Outlays slipped from positive to negative.
As the rump of the budget, Discretionary spending is now like the rise/drop vector on the distal end of a playground see-saw. Sitting at one end of the see-saw is the great weight of the mandatory spending and interest expense, in increasingly unstable equipoise with the tax revenue figure weighing in at the other end. Small changes in their relative balance, along with a touch of momentum, can result in great swings in what is available:
The total economic cost of the billion-dollar class of disasters (not just the number of such incidents) has likewise been rising markedly. Three of the highest six annual damage counts occurred in the past three years: 2021, 2022, and 2023, and they averaged $144 billion. Even adjusted for inflation, damages have been rising at two or three multiples of GDP growth.
The second pathway for transmission of larger weather-based disasters is through private markets, one of which is insurance. FEMA and other agencies, like the Army Corps of Engineers, assist with only a modest fraction of the $100 billion-plus class of damages.
Moody's estimates, earlier this month, of the combined insured damage from Hurricanes Helene and Milton were between $35 billion and $50 billion. Fitch Ratings estimated that Milton's insured losses alone will range from $30 billion to $50 billion. In the four years through 2023, insured property losses from natural disasters ran between $74 billion and $99 billion annually. In the 15 years prior to that, the mode was in the $10-$15 billion range, interrupted by periodic outliers.
Higher insured losses mean higher insurance rates, whether as a cost of living in a home or of operating a business.
There are costs that are not recoverable from the federal government or insurers. As of October 4th, AccuWeather's initial estimate of the economic cost of Hurricane Helene - from infrastructure damage, business disruptions and other impacts - was $225 to $250 billion. It placed a further estimate of $160-$180 billion on Hurricane Milton damage.
Ultimately, the costs borne by municipalities get transmitted to taxpayers, which is a way of saying higher prices (for living).
AccuWeather and the National Oceanic and Atmospheric Administration are different sources with, doubtlessly, different means and methods. Even so, AccuWeather's estimates for just these two hurricane events would place 2024 atop the accompanying line chart of the greatest annual disaster years, a series already on an upward trajectory.
The accompanying articles about China's smart-phone and AI chip prowess came up recently. Preceding them, as reviewed in a prior Quarterly Commentary, was China's introduction of an electric vehicle competitive with Tesla's (TSLA), to which local market prices adjusted promptly. Downward, that is. This was followed by a China-developed smart phone chip that was competitive with the Apple iPhone. Market clearing prices for iPhones in China promptly adjusted.
Now, Huawei is first to market-as opposed to catching up-with a triple-fold smartphone that opens to tablet size.
The next logical step for these China-proprietary products would be a concerted global sales effort, once sufficient local market take-up and production scale have been achieved. As of yet, Tesla and Apple have not had to contend much with these outside of China.
There will be other such products, as the accompanying stories show, most recently a Huawei AI chip it says is competitive with the NVIDIA H100-that's THE chip, the one that all the excitement is about.
The first important point about these China-proprietary products is not any specific one of them, but that China's technological capabilities - in government-selected and supported strategic markets - rival some of the highest-level U.S. capabilities. In one sense, it is only logical, since China has been manufacturing high-tech equipment like the iPhone for decades. U.S. sanctions several years ago on critical semiconductor manufacturing technology merely served to motivate China to develop its own proprietary versions. Which it has been doing successfully.
A second important point is that the sales-and-pricing impact of competitive products has now been market tested in China. It shows that, despite the experience to date of undisrupted, unchallenged growth and profitability, the U.S. IT giants are subject to ordinary business threats and challenges, like pricing and substitution risk.
A third point, as often happens, is that because it hasn't happened before, "the market" doesn't expect it to.
Question: What would the ready availability, outside of China, of price-discounted global-scale consumer and commercial products-like EVs, smartphones, and AI chips-mean for S&P 500 (SP500, SPX) IT Sector revenue growth, profit margins, and valuation multiples? It hasn't happened yet. Doesn't mean that it will. If you were China, what would you do?
Among the many ways today's IT sector companies are unique is that it's been at least a decade since they've faced disruptions to their growth and/or profitability and/or valuations. It's not unusual, during a bull market, for all three to work in conjunction. This generation of investors has experienced a uniquely long version of it-and, therefore, expect to continue experiencing it. But these news articles are not esoteric, technical arguments. They're about pedestrian, unglamorous business issues like technological competition, competitive pricing threats, and being closed out of important markets.
The next set of articles introduces, for the first time in the IT sector, cost-of-goods-sold resource shortages, along with new capital spending and operating-cost challenges. Valuations convey no concern about the deteriorating returns on capital that could follow.
Data centers require a lot of electric power. Their rapid multi-year expansion, on a larger and larger base, means large-scale power demand expansion.
The number of hyperscale datacenters in the eight years to 2023 increased at almost a 20% annual rate. That measure, though, doesn't differentiate between the sizes of these facilities, so it understates power demand growth. The lower bound in size, to qualify as a datacenter, is 5,000 square feet. There are over 5,300 datacenters in the U.S., of which more than 1,000 are hyperscale facilities. Hyperscale has, until recently, been defined as 10,000 to 100,000 sq. ft. in size. But the designation now includes facilities far larger than that, a scale that few people not in that industry have seen or could even imagine without visiting in person. Those consume vastly more power.
For the Generative AI phase of computing, even hyperscale installations are insufficient. Far larger exascale versions are now under construction. The bar chart on the last page essentially depicted only the Cloud Computing growth phase of hyperscale, because AI has only begun to make a meaningful impression upon this sector in the past 12 months or so.
OpenAI's public debut of ChatGPT was Nov. 2022. By Feb. 2023, at 100 million monthly users, it was the most rapid adoption of any application ever.
Microsoft (MSFT) debuted an early version of Bing Chat in February 2023, and Google's (GOOG, GOOGL) general release of the Bard chatbot was in May 2023.
Between May and September 2023, OpenAI released ChatGPT for Apple's (AAPL) operating system and for Android phones, along with a business enterprise version and an AI image generator called DALL-E 3. In November 2023, Amazon (AMZN) announced a corporate version of an AI chatbot, for $20/month per user. Apple's large language model product for its phones is pending.
It is all that recent.
It's now recognized how daunting the incremental power demand is. Even at the chip level, no single figure measures it. Computational needs differ between the AI model training phase and the inference phase when it is tasked with output.
A comprehensive study published this past June found that image generation can consume 45x the energy of image captioning. Image captioning, in turn, uses more than 30x the energy of simple text classification, as when an AI model is used to automatically label and classify documents and emails. That's a 1,350x energy difference.
How to relate to these figures? The study described, for 1,000 inferences, a low-computation text generation AI model as using as much energy as 9% of a full smartphone charge, while image generation would use the energy of 130 smartphone charges, or over 10% of a smartphone charge for just one image.
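Since the multiples stack quickly, here is a small sketch to keep them straight. The per-task ratios are the study's, as quoted above; the energy content of a full smartphone charge is an assumed figure for illustration:

```python
# Relative energy cost of AI inference tasks, per the study figures quoted above.
PHONE_CHARGE_WH = 15.0  # assumed energy in a full smartphone charge (illustrative)

text_gen_wh_per_1000 = 0.09 * PHONE_CHARGE_WH  # 1,000 text generations ~ 9% of a charge
image_gen_wh_per_1000 = 130 * PHONE_CHARGE_WH  # 1,000 images ~ 130 charges

print(f"Text generation: {text_gen_wh_per_1000:.2f} Wh per 1,000 inferences")
print(f"Image generation: {image_gen_wh_per_1000 / 1000:.2f} Wh per image "
      f"(~{130 / 1000:.0%} of a charge each)")

# The stacked multiples: image generation vs. captioning vs. text classification.
print(f"45 x 30 = {45 * 30}x energy difference, as noted above")
```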
But even low-computationally-intensive tasks can ultimately consume vast amounts of energy. A Google information page was referenced as indicating that, even back in 2016, Google Translate was being used over a billion times a day.
Hyperscale and exascale data center power use is not readily available from the companies themselves. The article at right, part of a series on AI by the New York Times, referenced a study in Joule, the peer-reviewed science journal that focuses on the challenges of global energy needs.
Data center operators are now experiencing some of the same limitations to unfettered expansion that are common in other industries.
One such limitation is community opposition over competing claims on resources, from quietude (see article at right) to tangible resources like water consumption and power, which has resulted in higher household electric bills.
This discommoding can rebound against the data center operators as political costs, often in the form of zoning restrictions.
In addition to the 1,000 hyperscale data centers now in place, twice the number of four years ago, there are another 440 in the planning or development phase. Those two sets of numbers, though, bear no useful comparison to one another, because-again-the recent generation of AI hyperscale data centers are a whole other animal. They are vastly greater in size and consume vastly more power than conventional data centers, and have a host of other unique requirements.
Staying with electricity for the moment: even using traditional CPU chips designed for serial processing of instructions, hyperscale data centers have up to about 10x the power density of older data centers - about 30kW, that is, 30,000 watts, per server rack or cabinet. Then, at the chip level, NVIDIA's GPU parallel-processing chips use multiples more power than CPUs. That easily moves GPU-based data centers beyond 40kW power densities. That is around where air cooling of server racks becomes insufficient, and water cooling becomes necessary.
And each generation of GPU chip draws yet more power. NVIDIA's V100 chip drew 300 watts, its successor A100 chip drew 400W, and the current H100 chip, which ignited the current frenzied expansion activity in AI usage, draws up to 700 watts.
The reason the H100 is in such demand, at $30,000 to $40,000 apiece, is that its processing performance, in trillions of operations per second, is over 30x greater than the V100. NVIDIA's next chip, the B200, to be available next year at this time, will draw up to 1,200 watts.
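Stringing those wattages together at the rack level shows why the density thresholds above matter. A sketch under stated assumptions: the GPU-per-server and server-per-rack counts below are illustrative, not vendor specifications:

```python
# Rough rack-power arithmetic. Chip wattages are from the text; the density
# assumptions are illustrative, since configurations vary widely.
gpu_watts = {"V100": 300, "A100": 400, "H100": 700, "B200": 1200}
gpus_per_server = 8   # assumed, typical of dense GPU servers
servers_per_rack = 8  # assumed

for chip, watts in gpu_watts.items():
    rack_kw = watts * gpus_per_server * servers_per_rack / 1000
    print(f"{chip}: ~{rack_kw:.0f} kW per rack (GPUs alone, before CPUs and cooling)")
```

On those assumptions, H100-class racks land in the mid-40kW range - past the air-cooling threshold - from GPU draw alone.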
Translated into total power requirements in a relatable way, the H100's 700W power consumption compares to a mid-size microwave oven. Mind-boggling as it might seem, an exascale datacenter could house over 1,000,000 servers. Multiply 1 million x 700 watts, and you have 700MW of power consumption. As a reality check, Switch, a company that specializes in building exascale data centers, is constructing one in Nevada that will provide customers up to 650 MW of power.
Based on how much electricity the average electric utility customer uses, that Nevada plant's consumption could support about 500,000 households. Since there are over 12,600 utility-scale power plants in the U.S., the Nevada exascale data center, if it were an electric utility, would qualify as the 476th largest, or in the 96th percentile.
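As a back-of-envelope reconstruction of those two figures (the server count and chip wattage are from the text; the per-household usage is an assumption, in the vicinity of the published U.S. residential average):

```python
# Back-of-envelope check of the exascale power figures above.
servers = 1_000_000
watts_per_server = 700  # H100-class draw, per the text
print(f"Facility demand: {servers * watts_per_server / 1e6:.0f} MW")  # 700 MW

household_kwh_per_year = 10_500                   # assumed average annual usage
household_avg_kw = household_kwh_per_year / 8760  # ~1.2 kW of continuous draw

nevada_mw = 650  # Switch's Nevada build, per the text
households = nevada_mw * 1000 / household_avg_kw
print(f"Households supportable by {nevada_mw} MW: ~{households:,.0f}")  # ~540,000
```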
This is why the AI-era data center can no longer take its power from the local electric utility in any ordinary way. It's one thing to require a meaningful additional share of a utility's capacity; it's another to displace the entire customer base.
Which explains why Microsoft undertook the seemingly extreme step of helping to fund the $1.6 billion re-start of a unit of the Three Mile Island nuclear plant, which was the site of the catastrophic 1979 partial meltdown of its other unit. Microsoft entered into a 20-year energy purchase agreement for the 835MW plant. The regulatory review process is not expected to be completed until 2027.
Microsoft's direct cost wasn't disclosed, but that astounding figure appears to be well within reason, judging by new-build exascale datacenters. Meta is reported to be building an $800 million facility in Jeffersonville, Indiana, and a $1 billion facility in Mesa, Arizona. Google is reportedly planning a $600 million data center in Wasco County, Oregon. These are but a few of many.
Water is another limiting resource. Recalling the upper limit of 40kW per server cabinet for air cooling, Switch's Nevada facility is engineered for up to 55kW per cabinet. To translate, again, that's 55,000 watts. Imagine a 1-story building the size of 10 football fields, with aisle after aisle filled with 6-foot cabinets holding 500,000 or a million microwave ovens that are running every second of every day. Ergo, water cooling. This has several implementation implications.
One has to do with the high density of water - at 62 lbs. per cubic foot, it's not quite as heavy as rock, but pretty close. That precludes multi-story buildings, because supporting the weight of the water would require prohibitively expensive engineering and construction. New large-scale data centers are therefore one-story construction. This, in turn, requires substantial land area, ranging from hundreds of acres to over 1,000 - because, aside from other infrastructure like power generators, solar and wind power installations are used to power much of the support operations.
Another implication has to do with where such large tracts of land - say, 1,000 acres - can be found that are not proximate to a population center that might deny zoning permissions. Paradoxically, many areas that offer plentiful and inexpensive land are also arid, with aquifers that might already be depleted by overfarming, or that are subject to subsidence or fissure creation that could damage local roads and buildings.
So, local population, land, water, and magnificent quantities of 24/7 electric power are or can be limiting factors.
Collateral but very relevant implications of AI data center expansion.
Exascale data centers require massive quantities of other physical resources. Facebook's $1 billion, 2.5 million sq. ft. data center in Mesa, Arizona, to be completed in 2026, will use 12,000 tons of steel. It sounds bigger as 24 million pounds (roughly the weight of 30 fully loaded Boeing 747s). I haven't found any figures for copper yet, but they are doubtlessly also most impressive.
For comparison, the iron requirements for the turbine that sits atop an 8MW offshore wind tower are on the order of 30-plus tons of cast iron, plus 40-50 tons for the rotor hub. Separately, the hollow pipes, or monopiles, that form the foundation of most offshore wind power installations have up to an 18-foot diameter and 150-foot height. That can be over 1,000 tonnes of steel.
Doesn't quite measure up to the Facebook data center's needs. On the other hand, there's never just one offshore wind turbine.
Both data centers and electrification projects like battery plants require more steel than nonresidential construction like office buildings or hotels. The return-or onshoring-of many manufactured products is likewise adding to the demand for raw materials.
Will these challenges dissuade the rapid planning and construction of AI data centers? History, behavioral finance, and just plain street smarts are clear on the matter: Never ignore the profit motive and incentive systems. A sufficiency of potential profit magically conjures much capital and political influence around itself. Enough incentive has, quite literally, moved mountains. In the past five years, three companies, just for sampling purposes - Microsoft, Meta (Facebook), and Alphabet (Google) - have spent over $370 billion on new property, plant, and equipment. Those expenditures have been accelerating. Five years ago, they jointly spent $52 billion. The trailing four-quarter figure is $117 billion. The most recent quarter's run-rate is over $140 billion.
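The growth arithmetic behind those figures, sketched minimally (all inputs are the figures just cited):

```python
# Compound growth of the three companies' combined capital spending.
capex_then = 52e9   # joint spending five years ago, from the text
capex_now = 117e9   # trailing four-quarter figure, from the text
years = 5

cagr = (capex_now / capex_then) ** (1 / years) - 1
print(f"Implied compound growth: {cagr:.0%} a year")  # ~18%

run_rate = 140e9    # most recent quarter's annualized run-rate, from the text
print(f"Run-rate vs. five years ago: {run_rate / capex_then:.1f}x")  # ~2.7x
```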
One could ask whether the torrid demand growth for AI is just a temporary rush for first-mover advantage or to protect market share and shareholder-required growth rates. Will it soon expend itself?
Profit motive and incentive systems say it's not stopping. The online commerce and advertising business model is probably the largest, most successful business in human history. That's quantifiable in the revenues, earnings, and market values.
In basic online consumer commerce, an enormous amount of money is paid by goods and service providers to direct traffic toward just a select few among the millions of web pages. AI can deepen and broaden the usefulness of that search, and also create new transaction possibilities. So, why would it stop? Think of the transaction and service density possibilities.
What manner of behind-the-scenes processing is required to track the precise geographical movements of billions of consumers and instantaneously integrate that data with their individually digitized and databased predilections for, on a given day, a single person's choice of local petting zoos, children's clothing stores, family-friendly-but-health-conscious restaurants, and GPS-based driving instructions for the ride home, along with all the on-screen notifications about speed traps, stalled vehicles, and roadside McDonald's? And those predilections change over time, so they must be constantly updated. Notice how even a simple Google search now comes with a few lines of synthesized summary of each suggested web page or article? That's AI. Within the article, or perhaps a book or a movie you're contemplating, wouldn't your average transactional turnover be more rapid if the content were nicely summarized and characterized for you? Transaction frequency means incremental profits for someone.
And those are just the most pedestrian applications. Based on varied estimates of 40 million to 70 million installed video surveillance cameras in the U.S.-as of 2019, not today-the very rapid growth rate thereof, and the facial recognition technology applied thereto, perhaps this AI application should also be considered fairly pedestrian.
Video is very data dense, and video surveillance data gets saved, because the authorities might need it retrospectively to sentence you, or your attorney might need it to get you out. The servers that store that data-which are added to constantly, which require more electric power to store as well-must always be on, every second of every day.
The use cases are endless, and most probably can't be conceived of at the moment. In every industry, someone can think of an AI-enabled application that will make them more money, save them more money, make a better product, what have you.
There might never be enough data centers.
Even in 2020, years before NVIDIA's H100 chip was introduced, the amount of data stored globally was reported at 6.7 zettabytes. A zettabyte is a trillion gigabytes, so a trillion billion bytes. And it was forecast to expand by nearly 20% a year. Those figures are from four years ago, before the AI era. The annual expansion rate of data storage must now be very much higher than 20%.
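For scale, a short sketch of what that compounding implies, using the 2020 base and the pre-AI-era 20% growth forecast cited above (the projection years are illustrative, and the true current rate is presumably higher):

```python
# Scale check on the stored-data figures above.
ZETTABYTE = 1e21   # bytes: a trillion gigabytes, as the text says
stored_2020 = 6.7  # zettabytes reported for 2020
growth = 0.20      # the ~20%/year expansion forecast cited

for year in (2020, 2024, 2030):
    zb = stored_2020 * (1 + growth) ** (year - 2020)
    print(f"{year}: ~{zb:.0f} ZB ({zb * ZETTABYTE:.1e} bytes)")
```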
Not even the trillion-dollar market cap IT companies have the financial capacity to continually spend at these magnitudes on such a sharply rising trajectory. This is beginning to impact their financial profiles.
One reason for their incomparable profit margins (royalty companies and securities exchanges aside) is that the asset upon which their businesses were built, the internet, was already in place. Unlike the cable television companies - which, much to their chagrin, had to construct and maintain their own capital-intensive networks - the internet-based IT companies got something of a free ride.
But now the IT companies are building their own asset-intensive infrastructure.
In the past five years, Microsoft's net income rose by 2.3x, but the PP&E on its balance sheet rose 3.8x. Its revenue generated per dollar of PP&E has declined steadily from $3.88 down to $2.11. Alphabet and Meta exhibit similar patterns.
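Those two ratios, taken together, imply the revenue side of the story. A minimal reconstruction from the figures just cited, with nothing else assumed:

```python
# The asset-intensity shift described above, reconstructed from the text's figures.
rev_per_ppe_then = 3.88  # Microsoft revenue per $ of PP&E, five years ago
rev_per_ppe_now = 2.11   # the same ratio today
ppe_multiple = 3.8       # growth in balance-sheet PP&E over five years

# Revenue multiple consistent with those ratios: revenue = ratio x PP&E.
implied_revenue_multiple = ppe_multiple * rev_per_ppe_now / rev_per_ppe_then
print(f"Implied revenue multiple: {implied_revenue_multiple:.1f}x")  # ~2.1x
# Revenue roughly doubled while the plant base nearly quadrupled: each new
# dollar of PP&E now carries about half the revenue it once did.
```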
...But the vast amount of resources required makes expanding further more complicated...Blackstone has been working to help QTS find sites outside the most power-constrained markets. It's also facilitated conversations between QTS and energy providers the firm backs.
This is why private equity capital, at scale, is already engaged. There's more to say about private equity involvement as it might impact asset allocation and fund flows in the indexation world. But we should first discuss the turning point in equity sector and asset class returns that the AI era is bringing to the fore or accelerating. And the bright-shining question: Which are the best companies and sectors to own for this AI boom?
Those who want to partake in the breathtaking possibilities of the AI era start, even if unconsciously, with a deterministic choice. A choice like this.
Haven't tiled a kitchen floor before? Some people dive right in, starting with the obvious tools and supplies, and the demo, cutting, and gluing. The project starts fast. Other people start by reading instruction manuals, measuring, sketching, and figuring. They might still be planning while hammering can already be heard from other kitchens.
There are those who have the knack for getting the first approach right but, normal distribution-wise, most can't. Some people in the second group still can't quite get it right, despite the preparation, but a lot more of them will.
That sloppy analogy is supposed to describe what, in the investment world, might be thought of as the cleavage between two of the basic equity investment approaches:
That's a long way of saying that in technology investing, you can hold your breath for a couple of very divergent reasons.
On the one hand, if you do the obvious and buy the technology companies directly, you hold your breath during the race between:
The post-race analysis of new-tech bull markets suggests that this is a low probability approach to success. It always works for a while, but generally doesn't satisfy in the end.
On the other hand, you can do the less-obvious and buy established and stable businesses that are direct beneficiaries of the mass adoption of the technology itself. Businesses that supply limiting-factor goods and services required for the expansion of the tech companies, irrespective of whether the tech companies prosper or not.
Better still is if those goods and services also happen to have limited availability. Of course, asset-intensive suppliers to leading companies like Apple and Amazon don't usually fare so well, since there's a cost to scale up production capacity, and they're in a poor bargaining position.
Best of all, then-like a royal flush-is if those providers of limited supply, limiting factor resources don't even have to make any additional investments or do anything at all, just be there and collect new revenue.
The holding one's breath part, in this case, is waiting for the microeconomic demand to make an obviously visible impact upon the beneficiary businesses' revenues while-in the other kitchen-the public market demand is noisily swirling around the technology shares.
Which brings us back to the Permian Basin and to Texas Pacific Land Corp. (TPL) and LandBridge Co. (LB). Write down the few critical limiting factors for exascale data center builders, and then list the unique physical resources of these two companies. They are uncannily aligned. Here, we did it already:
What AI-Era Data Centers Need:
Land is probably the ultimate hard asset investment. For TPL and LandBridge, the land-aside from being a perpetuity that outlasts almost every operating business-embodies a portfolio of resources and revenue streams.
There's surface area for buildings and rights of way for rents; water rights; mineral rights for royalties; subsurface rights such as for wastewater or carbon storage; land for industrial activities like water recycling and renewable energy projects. On a longer timeline, there's land for development, which then becomes real estate; that's when pricing changes from dollars per-acre to dollars per square foot.
At this moment, exceedingly well-financed parties that are highly motivated and very capable need the specific combination of resources that resides in the western Permian Basin. If they can think about restarting 50-year-old nuclear reactors, they cannot but be aware of the Permian. The remuneration to the landowner who can provide the requisite resources for a half-billion-dollar or billion-dollar project is likely to be significant indeed.
Whether as lease income, physical product sales or participation arrangements, the landlord in this case simply receives incremental revenues with virtually no associated costs, just pre-tax income. The economics would be remarkable.
Some clients have asked for detailed information about whether any data center has agreed to build such a campus on TPL or LandBridge property, what size of facility, and precisely what the rent will be, and so forth. It's common to think that such data is necessary to complete an analysis to make a responsible investment decision. That's one style of decision-making. But how do you make decisions if data is not available? Do you make any decision?
Here's another style of decision making. Why does anyone think it's good that all the information is available to everyone to analyze? The efficient market might be good for the functioning of markets generally, but that simply means that you, as an individual, can't get a better price or valuation than anyone else. That's what indexation does: You're all together in the same securities and assets at the same weightings. It means you can't own a really alluring investment before everyone else does.
For the investments we're discussing here, it's fortunate that such information is lacking. Take it from the data center companies: They share very little about their cost structure or building designs or site evaluations. They don't want competitors stealing a step on them. That sort of info is not on a database.
But the lay of the land, so to speak, is pretty clear, if you're willing to accept a general outline of the important variables and apply informed reasoning. When and if the first such data center contract is signed, much more information will no doubt be released, which will immediately ramify through various byways of the efficient market. And then everyone can know together.
But so far, the market hasn't been very efficient about a pending AI-Permian Basin meet-up. Herewith, a demonstration:
While trying to maintain, in this Commentary, the side-by-side narrative of general circulation newspaper and magazine articles, I realized a few days ago that I didn't have any saved articles about the Permian Basin and its connection with the data center revolution. Thinking it was an oversight, I set out to search for an article or two.
That was a fail. Among the five largest-circulation newspapers in the country (excepting the Wall Street Journal, which has a paywall, so Bloomberg was substituted), a search for "Permian Basin" turned up hundreds of articles. But none of them - not in the New York Times, the Washington Post, USA Today, the Los Angeles Times, the Chicago Tribune, or Bloomberg - even suggested this news.
If you Google hard enough, you'll find elements of it, but how would you even know to do that, since there's nothing in the news to suggest that you should? The news topics were social and political and about daily events: about Berkshire Hathaway (BRK.A, BRK.B) buying 50% of Occidental Petroleum (OXY), and ExxonMobil's (XOM) $60 billion purchase of Pioneer; about Big Oil's net-zero emissions goals for their Permian operations; about the migration of mid-westerners to Texas.
So, the news still hasn't caught up to the important things to know. Nothing about what is really developing in the Permian Basin, or what the practical implications of the Federal Debt are, what role other hard commodities will play amidst global electrification and China's increasingly direct competitive stance versus U.S. industry, or how and why fixed-supply cryptocurrency like Bitcoin might possibly be an existentially important asset in the contingency of an era of rising demands and limited physical supply.
In the past couple of decades, it was thought that all manner of business and investing could be done through financial instruments and ever more novel and sophisticated securities, while hard assets, commodities, and inflation beneficiaries were essentially crowded out of the indexes. Information Technology, aside from being the crowder-outer, had an airy, intangible feel: virtuality, the "cloud"; commuter trains out, work-from-home in. But the world, maybe especially the I.T. sector, seems to be returning to those hard assets and basic physical resources. They're not much owned in asset allocations, and not much understood.
Land Availability, and the Grand Central Station-ness of the TPL and LB Assets
There are 254 counties in Texas. Three of the 10 counties that produce the most oil in the entire state are clustered around the southeastern corner of New Mexico, where it intrudes into Texas. In descending order, the counties are Loving, Reeves, and Andrews. Numbers 21 and 22 on the list, also on that same corner, are Winkler and Ector. A map of the TPL and LandBridge surface and royalty acreage shows them to have dense ownership in the center of this strategic hotbed of the U.S. oil reserves portfolio.
The U.S. Census estimates the populations of those five counties in the western Permian as follows:
The Permian is unique in the U.S., in that the volume of associated gas that comes up with the oil from the region's wells exceeds the amount of gas produced from dedicated gas wells.
There is so much excess gas-and, as of yet, insufficient pipeline takeaway capacity-that the price of natural gas at the Waha Hub in Pecos County has actually been negative for much of the past year.
Waha is a lesser storage and connection point from which gathered gas from the Permian is transported elsewhere, such as to Gulf Coast refineries and export facilities.
There's plenty of gas for exascale data centers.
Here is a table showing TPL's share of production volumes from operators that pay royalties to TPL. One will see that both oil and natural gas volumes rose in 2023, and they rose this year, too.
One will also notice that gas volumes, when converted on an oil-equivalent basis - that is, roughly equalized for heat content - are more than a quarter of total BOE volumes. Meaning that, all else equal, natural gas should be a significant contributor to TPL's oil and gas revenues. And, with oil prices and volumes up this year, revenues through June are up roughly 10%.
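For readers unfamiliar with the oil-equivalent conversion, a minimal sketch: the 6:1 Mcf-to-barrel factor is the standard heat-content convention, and the volumes below are hypothetical placeholders, not TPL's actual royalty figures:

```python
# Oil-equivalent (BOE) conversion sketch. The 6 Mcf = 1 barrel factor is the
# standard heat-content convention; the volumes are hypothetical placeholders.
MCF_PER_BOE = 6.0

oil_bbl = 20_000  # hypothetical daily oil royalty volume, barrels
gas_mcf = 45_000  # hypothetical daily gas royalty volume, thousand cubic feet

gas_boe = gas_mcf / MCF_PER_BOE
total_boe = oil_bbl + gas_boe
print(f"Gas as a share of total BOE: {gas_boe / total_boe:.0%}")  # ~27% here
```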
Here's a second table. Embedded in the pleasing figures above is this: the natural gas prices that TPL realized in the first six months of this year were 40% lower than in calendar 2023. And the average gas prices it received in the last three months were 50% lower than the six-month figure. TPL could not have received very much in the way of natural gas royalties in recent months.
In the coming months, a new 490-mile pipeline, the Matterhorn Express, should be operating at capacity. It began taking small volumes from Waha earlier this month and will add almost 15% to the regional takeaway capacity. In mid-2026, an expansion of an existing Kinder Morgan pipeline would add another few percent to takeaway volumes. Another pipeline of significance, anticipated for late 2026, has yet to receive a final investment decision from Energy Transfer Partners.
Therefore, the distorted localized pricing should revert toward market prices in the foreseeable future, with a positive impact on TPL's royalty revenues, to which natural gas has lately made hardly any contribution.
Oil production should also rise, because many oil producers had curtailed potential drilling activity during this period, since paying to have gas taken away was not to their liking.
AI datacenter growth is a global phenomenon. The largest such installation is in China. Various estimates exist for how much additional electric power will be consumed, how much additional upgrading and construction of the electrical grids will be required (more steel, more copper, maybe more aluminum), and how much more natural gas.
Natural gas appears to be the default. Although coal-burning electric power plants that were scheduled to close are being kept running to power data centers, no more coal plants will be built here.
Nuclear plants might be a first choice in principle, for their always-on, low-carbon-emission electricity (ignoring the enormous quantities of steel and cement required in the construction phase). But that will take a very long time, even with cooperative regulation, for a growth industry in a hurry.
There will no doubt be opportunistic deals with existing nuclear plants, but that is a limited opportunity set. At one time, there were 112 U.S. nuclear plants. There are now 93, of which 22 are in the process of decommissioning.
Natural gas wins out for the foreseeable future. In the 12 months ended July, U.S. natural gas consumption set a fourth consecutive annual high, a pattern broken only by the Covid-19 pandemic. Nevertheless, its price relative to oil is at a historic low.
TPL, again, appears to be a natural beneficiary of the optionality in this commodity.
That was a U.S.-centric gas demand discussion. Gas is an ascendant commodity globally, too. Liquified natural gas exports from the Gulf Coast to Asia and Southeast Asia have been constrained by the limited takeaway capacity from the Permian Basin. At some point, that bottleneck will be relieved.
Add growing demand from China, India, and other Asian nations due to rising industrial activity and standards of living. A subset of that, somewhat distinct from rising per-capita incomes, is a self-reinforcing climate-change/air-conditioning feedback cycle that keeps raising demand for air conditioners.
These charts illustrate that it will be difficult to continue ignoring commodities, as was a privilege of post-2014 commodity consumers.
Here is India and Southeast Asia air-conditioner demand through 2022, and projections from 2023 onward:
General circulation publications and mass media do cover stock prices well, if they're large enough and liquid enough to be relevant in a mass-market investing world. It helps if their stock prices describe the shape of an up escalator.
And these outlets do cover large mergers, although they might not keep track of them too often or ponder what a comprehensive multiyear consolidation might mean.
Larger acquisitions of Permian Basin reserves within the last year:
This follows a near-record year in 2023, when $234 billion of transactions were announced, more or less 2x to 3x the typical year since oil prices peaked in 2014. A question is, if these best-informed strategic investors think buying more reserves at the current oil price is a good idea, why don't institutional investors think so? The S&P 500 energy sector weight is still only 3.3% of the index. Not a single Energy sector ETF has as much as $1 billion of AUM; the largest one is a third of that.
One of the things energy production companies know is that they can be profitable in the Permian Basin at $50/barrel oil prices. They are constant beneficiaries of improved drilling technology and techniques, including AI to better define drilling opportunities and strategies. These improvements even make older drilling sites viable for renewed investment, which has instigated much of the acquisition activity. So-called lateral wells now extend up to three miles; a few years ago, one operator even employed a novel horseshoe-shaped two-mile lateral in Loving County that turned back toward its origin, accessing more oil with one hole.
Even if the newspapers aren't saying it yet, there is more to be said about investment possibilities in both AI and, as a newly critical key global commodity, natural gas. Maybe the corollary to intelligent inactivity (or active inactivity?) as a way to participate in long-term compounding investments like land companies, is indirect technology beneficiary investing. And, more topically, about some company-specific recent events in our portfolios. Fortunately, we write about these issues constantly, and there's always another go-round not too long away, here at the Quarterly Commentary.
Editor's Note: The summary bullets for this article were chosen by Seeking Alpha editors.