Technology

The $200bn man: Larry Ellison’s wealth rebounds as Oracle joins AI boom



Until recently, it looked as though the cloud computing market had been locked up. Just three American companies — Amazon, Microsoft and Google — account for nearly two-thirds of the cloud infrastructure business, selling computing to customers who no longer want to run their own data centres.

As with much else in the tech industry, generative AI has caused a rethink. A new generation of cloud competitors has found an opening in the sudden jump in demand for the massive computing power needed to train and run the latest AI models. The question now is whether they can ride this wave of demand to reach meaningful scale — or whether this is just a short-term opening caused by shortages of AI chips, along with the stock market’s insatiable need to find more AI winners.

Take the unlikely Wall Street renaissance of database software maker Oracle as a cloud infrastructure company. Its shares have jumped 15 per cent after it reported earnings this week and are up 90 per cent since the start of last year. That has put co-founder Larry Ellison, a pioneer from an earlier tech era, within a whisker of overtaking Jeff Bezos to become the world’s second-richest person with net wealth of $196bn, according to Forbes.

For years, Oracle seemed happy to take a back seat in the IT world’s most important platform shift. It moved its business applications to the cloud, but rather than go all in, it poured its cash into buying back stock instead (and as its share count collapsed in the decade to 2021, Ellison’s personal stake in the company leapt from 22 per cent to 42 per cent).

Finally, Oracle seems to have got cloud religion. But is it really possible to join a computing revolution a decade and a half late and expect to be a serious contender? In a business where economies of scale really count, it is going up against companies with unrivalled resources and well-developed expertise in running large fleets of data centres.

Part of Oracle’s new cloud push has involved planting its own servers, running its database software, inside the data centres of other cloud companies. That makes it easier for customers to connect their data and applications across different clouds, a logical move that was capped this week by an alliance with arch rival Amazon Web Services.

But there’s no question where the really big opportunity lies. Ellison this week predicted that training the most advanced AI models would soon cost $100bn apiece, leading to a huge market that would be dominated by a handful of companies for the next decade.

He is not alone in believing that becoming a strategic supplier to these AI giants represents a once-in-a-generation opening in the cloud market. Nvidia chief executive Jensen Huang likes to talk about the “AI factories” needed to power the generative AI boom — companies like Coreweave and Lambda Labs, which have bought his company’s graphics processing units in bulk.

In a sign of its early success, virtually all of Oracle’s growth this quarter came from cloud infrastructure, even though this still makes up only 17 per cent of total revenues.

Yet a huge gulf still separates it from the cloud giants. Its $2.2bn in revenue from this business was dwarfed by the $26bn reported by Amazon Web Services. Its capital spending also pales in comparison to the soaring investments at the biggest tech companies: the $2.3bn it spent on building data centres this quarter was a fraction of the $19bn Microsoft ploughed into new facilities and equipment.

For now, shortages of Nvidia’s GPUs have created an opening, as even the biggest players look to offload part of their AI computing needs on to others with spare capacity. Some of the AI features in Microsoft’s Bing search engine now run in Oracle’s cloud, while close Microsoft partner OpenAI has also shifted part of the work of training its AI models to Oracle.

What will happen when the supply of AI chips catches up with demand is an open question. Also, some of the biggest AI companies see vertical integration — running their models on their own hardware — as an important strategic advantage.

Elon Musk recently turned away from Oracle when it came to building the next giant GPU cluster for X.AI, saying instead that expertise in hardware was a core skill his AI company needed to develop for itself.

Companies like Oracle still claim that coming later to the business and designing specialist data centres has given them an advantage over the older facilities of the cloud giants. If AI euphoria recedes, that claim could be put to the test.

richard.waters@ft.com
