NVDA, from Big Al

June 17, 2024

Yes, the chart looks like a moonshot. Two things:

1. NVDA is selling for about 37x revenues, which looks very expensive. But at that P/S ratio it's selling for only about 70x earnings. The reason is that it's a profit monster: ttm operating income is almost 60% of ttm revenues.
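The P/S-to-P/E arithmetic can be checked directly. A minimal sketch using the approximate figures from the post; the tax rate is an assumption added for illustration, not a figure from the post:

```python
# Back-of-envelope: convert a price/sales multiple to a price/earnings
# multiple via the implied net margin. Inputs are the post's rough
# figures; the effective tax rate is an illustrative assumption.
price_to_sales = 37.0
operating_margin = 0.60            # ttm operating income / ttm revenue
tax_rate = 0.21                    # assumed effective rate

net_margin = operating_margin * (1 - tax_rate)
price_to_earnings = price_to_sales / net_margin

print(round(net_margin, 3))        # ~0.474
print(round(price_to_earnings, 1)) # ~78.1
```

This lands near, though not exactly at, the 70x in the post; the residual comes from items the operating margin ignores (other income, taxes, share count), which is fine for a back-of-envelope check.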

2. Maybe tech gurus can comment on this: Looking into GPUs, I find they have a limited lifespan, especially if run 24/7 at high workload, which AI seems to demand. Even ignoring upgrades, I'm thinking the chips NVDA is selling today may need to be replaced in as little as 3-4 years. Maybe I'm wrong, but I don't read about this aspect of their business model.

Humbert H. writes:

What matters is whether anyone will catch up, and that is truly an open question. They're the leader, but more because of the inertia of their customers than because nobody can replicate their technology. That's the thing about super-highly valued tech growth stocks: nobody can predict their situation even two years from now. Starting before the dotcom era, Microsoft has been growing forever, Cisco not as much, Nortel even less so, to put it mildly.

High-end AI chips are replaced due to obsolescence much more than due to "wearing out." Some will certainly fail, but less often than graphics-card-type GPUs, and failure is not the driving force of the replacement cycle. Trying to decide whether Nvidia is properly valued is a pointless exercise. There are always people who know 100x more about the situation, and if they could truly value it properly, they'd find a way to act on it.

H. Humbert comments:

I would think that as the computing power of the GPU or TPU (tensor processing unit) increases, the communication bandwidth among the chips, server chassis, and server racks becomes the limiting factor for the overall speed of the AI high-performance computing center. The latter is a complicated issue, depending on the data center's server connection topology and so forth. I am sure NVDA knows about the issue and will come up with solutions to some of these speed bottlenecks, either organically or by M&A; as a result, they will undoubtedly produce next-gen solutions and products. There is designed obsolescence built into the products.

There is another issue. The training of LLMs requires an exorbitant amount of energy, and the trajectory, which is almost exponential, can't be sustained. Somehow these issues need to be mitigated. The increasing energy expended also translates into a huge cooling burden for the data center. Either NVDA or other companies may come up with solutions to address some or all of these issues. Long story short, the product lifecycle remains relatively short.
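The scale of the energy burden can be made concrete with a back-of-envelope estimate; every number below is a hypothetical placeholder for illustration, not actual NVDA or LLM data:

```python
# Hypothetical training-run energy: GPUs x power draw x overhead x hours.
# All inputs are illustrative assumptions, not real figures.
num_gpus = 10_000        # assumed cluster size
watts_per_gpu = 700      # assumed board power under sustained load
pue = 1.3                # assumed data-center overhead (cooling, power delivery)
hours = 24 * 90          # assumed 90-day training run

megawatt_hours = num_gpus * watts_per_gpu * pue * hours / 1e6
print(round(megawatt_hours))   # total MWh including cooling overhead
```

Note that the cooling burden shows up directly as the PUE multiplier: every watt of extra data-center overhead scales the whole bill.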

The culmination of these issues, and hence the potential solutions, is of course good business for those who sell the gadgets. Both the private sector and the brain trusts of governments and defense departments worldwide are well aware of these issues and have been working hard to come up with viable solutions.

Asindu Drileba writes:

There are other areas, like gaming, molecular dynamics simulation, 3D rendering/computer graphics, video editing, and crypto mining, that are GPU-heavy and expected to grow in the future. As for AI, I think Nvidia riding on the AI hype is a bit precarious. Yes, they have mostly "locked in" AI tooling such that it makes no sense to compete with them.

What makes it precarious is that a single paper finally describing how to run current AI applications on very cheap compute, i.e. CPU compute, would destroy a lot of the stock's value. This is more of a software problem than a hardware problem, so I expect such a solution to spread and get adopted very fast if it is actually found. Several companies, like Symbolica, are working on one.

Dylan Distasio adds:

No comment on the investing angle or the future of the industry, but modern-day chips are capable of handling pretty high temperatures for a very long time. Running 24/7 at high workload at a stable temperature within the safe zone would probably actually result in a longer life than a gamer situation where the chip is stressed/heated and cooled down repeatedly. In any case, I don't think lifespan and failure due to thermal issues (if the hardware is maintained properly) are a significant concern with GPUs. They'd be replaced due to obsolescence first.

Humbert Humbert writes:

This is a problem of electrical interconnect. The current NVDA Blackwell chiplets are connected with very short electrical interconnects which are reaching their speed limits. The solution is to bring the optical connections closer to the edges of the chips. A few years ago DARPA had a program called PIPES to do just that using optical fibers. I don't recall the spec, but I believe it included an energy spec in femtojoules per bit, as it was recognized a number of years ago that digital switching energy would become an energy burden. But this solution may eventually run into a chip-edge real-estate problem because of the size of the optical fiber core.
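The femtojoules-per-bit spec matters because link power is simply energy per bit times bit rate. A minimal sketch with illustrative numbers (the actual PIPES targets aren't quoted here, so both inputs are assumptions):

```python
# Link power = energy per bit x bit rate. Both numbers are illustrative
# assumptions, not actual PIPES program specs.
energy_per_bit_fj = 100.0    # assumed 100 fJ/bit
bit_rate_tbps = 10.0         # assumed 10 Tb/s aggregate link

watts = (energy_per_bit_fj * 1e-15) * (bit_rate_tbps * 1e12)
print(watts)   # 1.0 W for this single link
```

Multiplied across the thousands of links in a rack-scale system, even single-digit femtojoule improvements per bit translate into a meaningful chunk of the power budget.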

There are limits to the current state-of-the-art digital neural networks, even though they are the hottest subject in town. Analog neural networks may have niche applications that could compute at higher speed and with lower power consumption. The following is one of many example programs that the DoD is investing in.

