Happy Sunday and welcome to Investing in AI. This newsletter is 100% human, and I never use AI to write any parts of it. If you haven’t checked out our podcast recently, we had Alex Schmelkin from Sixfold.ai, and Pawel Zimoch from Featrix.ai. Both have great insights and are worth a listen.
I’ve spent a lot of time recently going deeper on new AI hardware because I think it will be increasingly important in the near term and central to the next wave of AI. Sam Altman has reportedly indicated that GPT-5 is showing diminishing returns compared to the gains made between GPT-3 and GPT-4. That raises questions about where we take AI next as we move from basic generative AI to better reasoning, logic, and causal analysis.
There are two things I want to point out that I think are important when it comes to creating smarter AI. The first is that we know our brains don’t use backpropagation, the primary algorithm for training AI systems today. Backprop takes too long and consumes too much power, and our brains are far more efficient than that. The second is that human intelligence evolved before written text. Somehow we were “intelligent” before we had thousands of documents to train on, which suggests that intelligence isn’t necessarily a function of more data.
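To make the backprop point concrete, here’s a minimal, illustrative sketch (my own toy example, not anything from a real training system) of gradient-descent learning on a single linear neuron. Even in this trivial case, every update requires a forward pass plus a backward gradient computation; scale that to billions of parameters repeated over trillions of tokens, and you can see where the time and power costs come from.

```python
# Toy illustration of backpropagation: learn y = 2x with one linear neuron.
# Each training step needs a forward pass (prediction) and a backward pass
# (gradient of the squared-error loss with respect to the weight w).
# Modern models repeat this loop across billions of weights, which is why
# training is so compute- and power-hungry.

def train(xs, ys, lr=0.1, steps=100):
    w = 0.0  # start from an uninformed weight
    for _ in range(steps):
        for x, y in zip(xs, ys):
            pred = w * x                # forward pass
            grad = 2 * (pred - y) * x   # backward pass: dLoss/dw
            w -= lr * grad              # gradient-descent update
    return w

if __name__ == "__main__":
    # Learn the mapping y = 2x from a handful of examples.
    w = train([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
    print(round(w, 3))
```

The brain, by contrast, appears to learn from far fewer examples with local, low-power update rules rather than this global error-propagation loop, which is part of what motivates the neuromorphic approaches discussed below.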
It raises the question: is intelligence a function of structure? Is compute > data?
I’m focusing more on the compute layer at the moment because I believe this could be true. Intelligence may be a structural computation problem as much as a data problem. Different types of compute, particularly analog and neuromorphic architectures, may enable breakthroughs in intelligence that aren’t possible with GPUs. The way I think about it: computational structure may determine the upside potential of intelligence, and data access may determine where in that band of potential a given system falls.
Additionally, wherever AI goes next, we need cheaper, more efficient compute, so focusing on the price, power, and performance of compute for a given AI model or class of models is going to be a great place to invest.
I’ll be going deeper on AI hardware in some future newsletters, but this should set the stage for why I think it’s important. Thanks for reading.