Goldman Sachs says AI is too expensive and unreliable — firm asks if 'overhyped' AI processing will ever pay off massive investments
Goldman Sachs is asking whether we will ever recoup the billions of dollars invested in AI.
Corporations and investors have been spending billions of dollars on building AI. Current LLMs like GPT-4o already cost hundreds of millions of dollars to train, and next-generation models, already in development, are expected to cost as much as a billion dollars. However, Goldman Sachs, one of the leading global financial institutions, is asking whether these investments will ever pay off.
Sequoia Capital, a venture capital firm, recently examined AI investments and estimated that the entire industry needs to generate $600 billion annually just to break even on its initial expenditure. So, as massive corporations like Nvidia, Microsoft, and Amazon spend huge amounts of money to get a leg up in the AI race, Goldman Sachs interviewed several experts to ask whether investments in AI will actually pay off.
The expert opinions in the Goldman Sachs report are divided into two groups: one group is skeptical, saying that AI will deliver only limited returns to the American economy and that it won't solve complex problems more economically than current technologies. The opposing view says that the capital expenditure cycle on AI technologies seems promising and is similar to what prior technologies went through.
MIT Professor Daron Acemoglu estimates that generative AI's impact on the economy will be limited, contributing only around a 0.5% increase in productivity and a 1% addition to GDP output. This sharply contrasts with estimates by Goldman Sachs's economists, who suggested a 9% jump in productivity and a 6.1% increase in GDP. He also said that even though AI technologies will eventually evolve and become less costly, he isn't convinced that the current trend of throwing more data and computing power at AI models will allow us to reach artificial general intelligence more quickly.
“Human cognition involves many types of cognitive processes, sensory inputs, and reasoning capabilities. Large language models (LLMs) today have proven more impressive than many people would have predicted, but a big leap of faith is still required to believe that the architecture of predicting the next word in a sentence will achieve capabilities as smart as HAL 9000 in 2001: A Space Odyssey,” said Acemoglu. “It’s all but certain that current AI models won’t achieve anything close to such a feat within the next ten years.”
The contrarian view on the report comes from Kash Rangan and Eric Sheridan, both Senior Equity Research Analysts at Goldman Sachs. They say that even though returns on AI investments are taking longer than expected, they should eventually pay off. Rangan says, “Every computing cycle follows a progression known as IPA — infrastructure first, platforms next, and applications last. The AI cycle is still in the infrastructure buildout phase, so finding the killer application will take more time, but I believe we’ll get there.”
“This capex (capital expenditure) cycle seems more promising than even previous capex cycles because incumbents — rather than upstarts — are leading it, which lowers the risk that technology doesn’t become mainstream,” Sheridan added. “Incumbents [like Microsoft and Google] have access to deep pools of capital, an extremely low cost of capital, and massive distribution networks and customer bases, which allows them to experiment with how the capital dollars could eventually earn a return.”
Whichever view proves correct, Goldman Sachs identified two challenges facing AI: the availability of chips and of power. The AI GPU crunch seems to be over, primarily because Nvidia can now deliver chips with a lead time of two to three months instead of the 11 months it used to take.
However, data center power consumption is now the primary limiting factor, especially as AI GPUs grow increasingly power-hungry. A single modern AI GPU can consume up to 3.7 MWh of electricity annually, and all the GPUs sold just last year will consume enough electricity to power more than 1.3 million average American households. Major corporations have even started looking at modular nuclear power plants just to ensure that their massive AI data centers can get the power they require.
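As a rough sanity check on those figures, here is a minimal back-of-the-envelope sketch in Python. The ~10.5 MWh/year average US household consumption is an assumption for illustration, and the implied GPU count is derived from the article's numbers rather than stated in it:

# Back-of-envelope check of the article's power figures.
GPU_ANNUAL_MWH = 3.7          # per-GPU consumption cited in the article
HOUSEHOLD_ANNUAL_MWH = 10.5   # assumed average US household usage (roughly 10-11 MWh/yr)
HOUSEHOLDS_POWERED = 1.3e6    # household-equivalence claim from the article

total_mwh = HOUSEHOLDS_POWERED * HOUSEHOLD_ANNUAL_MWH   # ~13.7 million MWh, i.e. ~13.7 TWh
implied_gpus = total_mwh / GPU_ANNUAL_MWH               # ~3.7 million GPUs

print(f"Implied total consumption: {total_mwh / 1e6:.1f} TWh/year")
print(f"Implied GPU count at 3.7 MWh each: {implied_gpus / 1e6:.1f} million")

On those assumptions, the 1.3-million-household figure corresponds to roughly 3.7 million high-end GPUs' worth of annual consumption.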
Only history can tell us whether AI will boom like the internet and e-commerce or bust like 3D TVs, virtual reality, and the metaverse. But whatever the case, we expect AI development to continue. Goldman Sachs says, "We still see room for the AI theme to run, either because AI starts to deliver on its promise, or because bubbles take a long time to burst."
Jowi Morales is a tech enthusiast with years of experience working in the industry. He has been writing for several tech publications since 2021, covering tech hardware and consumer electronics.
Notton: It's good to be skeptical.
If anyone knows how to milk profits by doing all sorts of how-is-this-not-illegal things, it's Goldman Sachs.
And if they can't figure it out, it's either a very poor investment, or they were too slow to the punch and want to short their competition's stock.
Their role in the 2007-2008 financial crisis is exactly why crypto Ponzi schemes and NFT money laundering became so popular in recent years.
kjfatl: Too expensive, unreliable, but a necessary business expense.
AI is a tool that can do some things cheaper and better. The obvious uses include things like scanning through billions of medical images looking for clues to disease markers. No human can do this as well as AI.
It can also be used for simple things like scheduling employees with the necessary training at a chain restaurant, a task that can be done for a large chain using a simple PC.
A few investments in AI will pay off handsomely. Most will be written off. All I need now is a reliable advisor who can tell me which ones to invest in and when to get out.
drea.drechsler: Too expensive it may be, but investment will likely continue apace to develop it into a useful tool.
That is, until it either proves reliable and effective enough to justify the expense or costs come in line...or both. At which point it will take off.
hotaru251 "AI" isn't profitable becasue its a niche thing that has no real value itself.Reply
Modern AI isn't what people want...they want actual AI, not just LLMs.
bit_user: I keep thinking of the Internet revolution, as a point of comparison. There were waves of companies that died before the Internet became essential infrastructure and so deeply integrated into our daily lives - even being right about the winds of change doesn't guarantee success. Many of the survivors of that era are now some of the biggest companies.
I think many companies are primarily focused on not being left behind, first and foremost, and focusing on ROI second. That's going to shift as we move forward. As that focus shifts, the tech will continue to improve, as will people's experience with it. There might be a short-term cooling off for AI tech, but the tech isn't going away, and I don't expect quite the same magnitude of crash in AI that we had with dot-com companies.
bit_user:
hotaru251 said: Modern AI isn't what people want...they want actual AI, not just LLMs.
Well, that's quite a broad statement. Modern AI includes things like image generators, which companies like Adobe have integrated into their bread-and-butter products with impressive results.
As for LLMs, I agree that there are some mismatched expectations. People are initially amazed to see a computer do and know things they never thought possible. Once that amazement wears off, they quickly start to find its limitations and there's some degree of disenchantment.
Ideally, they stick with it long enough to learn how to optimize their prompts and find better ways of harnessing it. But, even if they don't, a whole generation of kids are growing up with it and have embraced it quite decisively. They're bringing it with them almost everywhere they go.
shady28:
bit_user said: I keep thinking of the Internet revolution, as a point of comparison. There were waves of companies that died before the Internet became essential infrastructure and so deeply integrated into our daily lives - even being right about the winds of change doesn't guarantee success. Many of the survivors of that era are now some of the biggest companies. I think many companies are primarily focused on not being left behind, first and foremost, and focusing on ROI second. That's going to shift as we move forward. As that focus shifts, the tech will continue to improve, as will people's experience with it. There might be a short-term cooling off for AI tech, but the tech isn't going away, and I don't expect quite the same magnitude of crash in AI that we had with dot-com companies.
That's basically what Goldman is saying. AI might be up and coming, but it's not this year or next year. It's probably 10+ years out.
And yes, the same thing happened in the late 1990s. Cisco's stock, for example, topped out around $77/share, plummeted to $10/share, and today it sits at $47. Certainly, Cisco itself has benefited greatly from the widespread use of the internet as well as the spread of mobile devices (and hence cellular infrastructure), but that doesn't mean someone who bought their stock in 1999 is doing well.
Of course, then there's 3Com.
hotaru251:
bit_user said: Modern AI includes things like image generators, which companies like Adobe have integrated into their bread-and-butter products with impressive results.
You are paying for the service, not the specific feature itself. Nobody would pay Adobe's price just to add AI help; the core product isn't AI.
bit_user:
shady28 said: That's basically what Goldman is saying. AI might be up and coming, but it's not this year or next year. It's probably 10+ years out.
No, that's way too far. The internet didn't take that long to "happen", even if it wasn't fully transformative after just 5 years.
AI won't take 10 years, either. The impact and extent to which it permeates everything will certainly be greater, after 10 years, but it will be a matter of course well before then.
shady28 said: And yes, the same thing happened in the late 1990s. Cisco's stock, for example, topped out around $77/share, plummeted to $10/share, and today it sits at $47.
Back then, Cisco had the game pretty much to itself. If you look at it now, its market share has been nibbled away in various niches, and it has been facing big competition & price erosion from Huawei. So, it no longer represents the market like it used to. If you look at the TAM where it plays, that has grown hugely over the same time.
Cisco simply grew fat, mainly because it could. Much like I expect Nvidia will lose its complete dominance of the AI market, in the long run.
shady28 said: Of course, then there's 3Com.
Yup. Back in 1998 or so, I paid a couple hundred dollars for a 100 Mbps 3Com PCI NIC. Within 5 years, you could buy little no-name 100M cards for like $25. In 2004 or so, I got a motherboard with integrated 1 GigE. Perhaps where 3Com failed was making the transition to higher-speed enterprise NICs, because at least a few companies seemed to do alright in the 10 Gig market.
However, if you look at folks making cable modems, that market seems to have done pretty well.
bit_user:
hotaru251 said: Nobody would pay Adobe's price just to add AI help; the core product isn't AI.
Not sure about that. It depends on which product we're talking about.