Happy Sunday and welcome to Investing in AI! I’m Rob May, CEO at BrandGuard. I also run the AI Innovator’s Community, so if you are in NYC or Boston, we have tons of events this year. Check it out.
Today I want to look very far ahead at a possible problem for the economy that AI could create, and ask how we may deal with it. Imagine these AI agents we are building get better and better. Over time they take over more and more tasks from our work lives and personal lives. As this happens, the instructions we give them will get ever more abstract.
What happens to agents at this point? They will start to learn more, and become more personalized. If one person runs a small business and the other just has standard W-2 income, their agents will need to know different things. If one person owns a home and another rents an apartment, their agents will need to know different things. You can see a world where agents can buy knowledge modules. I can buy the “small business tax” module or the “home repair” module for my agent if those are relevant to me and not to other people.
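The module idea above can be sketched in code. This is a toy illustration, not a real system: the `KnowledgeModule` and `Agent` classes and the tag-matching rule are all hypothetical, just to show how an agent might buy only the modules relevant to its owner's situation.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeModule:
    # Hypothetical packaged expertise an agent could license
    name: str
    price: float

@dataclass
class Agent:
    owner_profile: set              # e.g. {"small_business", "homeowner"}
    modules: dict = field(default_factory=dict)

    def maybe_buy(self, module: KnowledgeModule, relevant_tags: set) -> bool:
        # Buy the module only if it matches the owner's situation
        if relevant_tags & self.owner_profile:
            self.modules[module.name] = module
            return True
        return False

agent = Agent(owner_profile={"small_business"})
tax = KnowledgeModule("small_business_tax", price=49.0)
repair = KnowledgeModule("home_repair", price=19.0)

bought = agent.maybe_buy(tax, {"small_business"})   # relevant, so purchased
skipped = agent.maybe_buy(repair, {"homeowner"})    # not relevant, so skipped
```

The point of the sketch is the filter: personalization means every agent ends up owning a different bundle of modules.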
Now imagine my agent can also package up things it has learned and sell them to other agents. This could happen a lot in business, because so many situations are unique. For example, when I sold my first company, we had started as an LLC and converted to a C corp when we took venture capital. It was unclear whether the pre-conversion LLC period should count toward the QSBS holding period, and our accounting and law firms went deep to figure that out. Imagine that knowledge lived in an agent that had gone through the process, and you could purchase it.
Once this happens, you can see an economy growing among agents. Agents that collect unique interactions and data sets will sell them to other agents. Agents with broad goals (e.g. “help me grow my business”) will go out and find some sources of knowledge that they believe they need to achieve the goals they are given.
What happens to the world economy as the agent economy grows? At 5% of all economic activity it probably doesn’t matter, but what about at 50% of the world economy?
Much like individual humans, agents will maximize their own best interest, not the good of society. That means if their demand for knowledge creates an economic boom that leads to inflation, they won't scale back on their own. Or maybe the opposite happens: the low cost of knowledge, and the ability to do everything cheaply and electronically, leads to deflation, which causes its own set of problems.
We can imagine what will happen by looking at other markets that have become more electronic over time. Take high-frequency trading as an example. HFT firms argue they provide liquidity and help markets run more smoothly, but a good deal of research shows they increase volatility and aren't clearly beneficial to markets.
When AI agents run more and more of the economy, what will it mean for us?
Maybe AI agents will be regulated by the Fed. Maybe growth rates will be constrained by rules from the Fed (and its international peers) and programmed into agents, so that their ability to grow ebbs and flows with general economic trends, acting as a counterbalance to other economic activity.
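One crude way to picture a Fed-style constraint programmed into agents: throttle an agent's permitted spending growth when inflation runs above target. Everything here is invented for illustration; the function name, the damping rule, and the 2% target are assumptions, not a proposal for how such regulation would actually work.

```python
def allowed_spend(base_budget: float, growth_cap: float,
                  inflation: float, target: float = 0.02) -> float:
    """Hypothetical counter-cyclical throttle for an agent's budget.

    When inflation is at or below target, the agent may grow its spending
    by the full regulator-set cap. As inflation overshoots the target,
    permitted growth is scaled down linearly, reaching zero when the
    overshoot equals the target itself.
    """
    overshoot = max(0.0, inflation - target)
    damping = max(0.0, 1.0 - overshoot / target)  # 1.0 at target, 0.0 at 2x target
    return base_budget * (1.0 + growth_cap * damping)

# At 2% inflation an agent with a $100 budget and a 5% cap may spend ~$105;
# at 4% inflation the throttle removes the growth allowance entirely.
on_target = allowed_spend(100.0, 0.05, 0.02)
overheated = allowed_spend(100.0, 0.05, 0.04)
```

The design choice worth noting is that the constraint is a published rule agents apply to themselves, rather than a transaction-by-transaction approval, which is roughly how reserve requirements work for banks today.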
There are so many ways this could play out that it is difficult to imagine exactly how the economy will work when we reach that point. But it's a real possible future, and one we should be thinking about.
Thanks for reading.