Happy Sunday (and Mother’s Day in the U.S.) and welcome to Investing in AI! I’m Rob May, CEO at Nova, and we make Brandguard, a brand safety tool for generative AI. I’m also a very active angel investor in the AI space, and run the AI Innovators Podcast.
I was in a meeting with some AI folks recently and heard about a study that will be released soon. I won’t go into details until then, but the gist is that when people are given the chance to use AI tools for many day-to-day work tasks, they don’t always use them, even when they can. The category where users most often skipped the AI version in favor of doing it the real way: personal interactions that build intimacy. (All kinds of intimacy, not just romantic relationships.)
As an example, say I give you a tool that automatically praises all your co-workers once a day. In this study, people given a tool that does something like that decided NOT to use it. You can imagine why. These human-to-human interactions evolved because they build trust. If you fake the interaction, you fake the trust. Perhaps for a one-off interaction, where you don’t know if you’re dealing with a bot or not, it doesn’t matter. But for ongoing relationships, where the intimacy deepens over time, people chose (at least for now, in this study) to keep the human touch.
There is a concept in biology called a limen: a threshold below which you don’t register a stimulus, because the stimulus is too small or faint. I’m starting to wonder if there is an intimacy limen in AI. In other words, if you list out all the types of interactions we humans have with each other, and rank them by intimacy, is there a threshold above which we don’t want to use AI? The intimacy limen is the point at which replacing a task with AI isn’t worth it, because we humans actually want to feel ourselves doing that task. AI for store checkout or drive-thru ordering? Yes. AI for consoling our best friends after a breakup? No.
When you are building AI tools, or investing in them, you sometimes see an argument like this: people do X millions of Y type of action every day; if AI can automate that away, we can save Z hours a year, and that’s worth billions of dollars. Yet sometimes we humans really like to interact with each other. And other times we don’t, but are programmed to feel guilty if we shirk that interaction. You can’t build what humans don’t want automated away, even if it would make our lives easier. Maybe future generations that grow up with more native AI tools will feel differently, but for now, the intimacy limen is something to consider when you are evaluating opportunities for AI.
Thanks for reading.
One of the ways we overcame this when using AI in winemaking was to give the winemaker six final recipes/blends, and let them choose which of the six went to bottling - knowing that any of the choices would achieve their goals. We didn't remove that part of the "ritual," so they still felt included in the process, even though the hard work was already done. Maybe if you had an AI that gave a human a set of choices on how to console a grieving friend, it would seem more palatable.