Happy Sunday and welcome to Investing in AI. I’m Rob May, CEO at Brandguard. I also co-founded the AI Innovator’s Community that does events in New York and Boston, so please check it out.
Today I want to talk about NVIDIA’s run-up and if/when it might stop. NVIDIA’s growth and profits are determined by two things - general growth in demand for AI, and competition from companies that aren’t NVIDIA.
I started my career in the chip industry as an FPGA and ASIC designer, so I know a bit about the industry first-hand. It has some very unusual characteristics that make it hard to determine if and when NVIDIA might be unseated as the major hardware producer for AI. Let’s review these characteristics in two parts. First, the criteria people look at when choosing a chip to use in their product. Second, the way design cycles, market demand, and competition shape how chip companies grow and develop.
Before we dig into it, there is something you should understand about AI and computer chips. There are many different types of AI chips beyond GPUs. There are new ideas like neuromorphic chips, analog chips, and chips that merge semiconductors and biology to do intelligent computing. They are very different from GPUs but can apply to some of the same AI workloads. You can also run AI workloads on CPUs; they are just slow. So beating NVIDIA isn’t necessarily about building a better GPU. It’s about disrupting GPUs for certain AI workloads.
When we hear news about AI chips, it typically focuses on benchmarks for performance on certain popular AI models. You can see one set of rankings for that here. Performance dominates the decision-making on AI chips at this point in the market because these are mostly large chips for server farms training large models, and AI is still in the nascent stages of being deployed inside many products, particularly at the edge.
As the applications of AI grow, managers looking to build AI products will evaluate chips on many different things beyond performance, including:
Footprint (how large is the physical chip and can it fit in my device or use case?)
Power consumption (really important for battery or low power devices)
Price (always)
Tolerances (heat, vibration, exposure to radiation, space, military, etc)
History (how long has the company been around, is this a brand new chip unproven in the market?)
Roadmap (this matters A LOT in the chip industry)
Tool chain (how does the chip get programmed and integrated and does it fit existing design workflows)
Use case generality (do you need a more general or more specific targeted chip?)
These variables make it really difficult for a newcomer to break into the market: given the long chip design cycles, you have to take a bet on what demand is going to look like down the road for what you are building. NVIDIA probably won’t dominate across all of these vectors, although it is very strong on the most important ones.
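To make the evaluation concrete, here is a minimal sketch of how a product team might weight the criteria above when comparing candidate chips. This is purely illustrative: the chip names, scores, and weights are hypothetical and not drawn from any real benchmark.

```python
# Hypothetical weighted-scoring sketch for comparing AI chips across the
# criteria listed above. All names, scores (1-10), and weights are made up.

CRITERIA_WEIGHTS = {
    "performance": 0.30,   # still the dominant factor in today's market
    "power": 0.20,         # critical for battery-powered and edge devices
    "price": 0.15,
    "footprint": 0.10,
    "tolerances": 0.05,
    "history": 0.05,
    "roadmap": 0.10,       # matters a lot in the chip industry
    "tool_chain": 0.05,
}

# Made-up scores for two imaginary candidate chips.
candidates = {
    "established_gpu": {"performance": 9, "power": 5, "price": 4, "footprint": 5,
                        "tolerances": 6, "history": 9, "roadmap": 9, "tool_chain": 9},
    "new_edge_asic":   {"performance": 7, "power": 9, "price": 7, "footprint": 9,
                        "tolerances": 7, "history": 3, "roadmap": 5, "tool_chain": 4},
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores into a single weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

for name, scores in candidates.items():
    print(f"{name}: {weighted_score(scores):.2f}")
```

The point of the sketch is simply that the weights shift with the use case: a data-center buyer weights performance and roadmap heavily, while an edge-device designer weights power and footprint, which is why no single chip wins everywhere.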
That leads to my second point: design cycles are long, and that matters because it’s difficult to see far enough in advance to get things right, and it can take a while to recover if you get things wrong. Just consider that Intel passed on providing chips for the iPhone launch because they didn’t expect it to be a large business.
That’s a funny thing about how chip markets work. The big growth areas usually happen because new markets emerge for existing chips - not because chips are designed for market use cases as they evolve. Consider that GPUs were not developed for AI, and AI was not a use case for NVIDIA’s customers for most of the company’s history. But when AI arrived, GPUs were the best bet.
As another example, ARM specialized in low-power chips for decades before the mobile phone revolution. And for those of you around in the early days of mobile phones, you probably remember that their eventual ubiquity was not obvious. I worked at Radio Shack in college in the late 1990s, and we earned high commissions for selling phones, so I pushed them any chance I could. But most people would say “what the hell would I ever do with a cellular telephone? What’s so important that I can’t wait until I get home, or to a payphone?”
So getting back to NVIDIA, I don’t see any force stopping the juggernaut in the next few years, other than a huge decline in the need for AI compute, and that seems unlikely. Any startup competitor could easily be acquired by NVIDIA for tens of billions of dollars, and it would barely be material given NVIDIA’s market cap. And while the other major chip companies are mounting a challenge, particularly AMD, the demand for GPUs is so great, and will be for the next few years, that there is room for massive growth for multiple players.
When NVIDIA eventually slows down, I suspect it will come from one of two areas. The first is a surprising shift in market use cases. Given the long development cycles of computer chips, it’s hard to be as responsive to market demands as software companies can be. And product designers have to choose some level of predictability and reliability in their components and their partners, which is why they favor proven tech over innovative new designs most of the time. But just as NVIDIA benefited because its graphics chips happened to be a good match for new AI workloads, I think something similar could slow it down: some new popular AI workload will be a better fit for a chip other than a GPU, one that is already established in the market in other ways.
The second is a company that is already building its own chips and decides to get into the broader semiconductor business. This means Apple, Amazon, Facebook, or most likely - Google. But all of those companies, even if they can compete technically with NVIDIA, lack the infrastructure to sell and support chips and the people who design them into products. Building those capabilities will be slow.
Overall I think NVIDIA will remain dominant for quite some time and I don’t see any impending forces that could slow that down. But we know it won’t go on forever, so it’s fun to speculate where the eventual new king of AI will come from, and how far into the future that might be.
Thanks for reading.