Why The Racial and Gender Gap In AI Will Take Time To Solve
You can't become an AI expert overnight.
Update: This post caused a lot of consternation on Twitter, with people misinterpreting it. So for clarification: this post is not saying there is no bias in AI, and it is not saying that the lack of diversity in comp sci is a pipeline problem. This post says nothing about comp sci in general. It says that when any field is new and small, you can’t hold it to the same diversity standards as something big and stable, because you can’t expect a nascent field to generate enough awareness to attract a diverse pipeline until it hits a critical mass of awareness. If you would like to criticize the arguments in this post, there are two ways to do so. You can dispute the claim that a nascent pipeline can’t be expected to be as diverse as a more mature pipeline; I’m happy to discuss that. Or you can argue that the last AI winter didn’t qualify as a nascent pipeline. But I don’t make any broader assertions about general tech pipelines here, and I make no assertions about bias in the hiring process, so if you want to talk about those, that’s fine, but they don’t have anything to do with this post.
Happy Sunday and welcome to Technically Sentient. I’m Rob May, a Partner at PJC, specializing in AI and Robotics. In this issue I want to discuss something that not a lot of people seem to understand - that fixing gender and racial gaps in specific functional areas, like AI, takes time.
To understand why, we have to roll back the clock several decades. AI was invented in the 1950s, amid a lot of excitement about the promise of the field. But since then, we have seen multiple AI winters - periods with little innovation or progress. In the mid-2000s, we were in one of these AI winters. It started to end in 2012.
The reason AI winters matter here is that they influenced how many people studied AI. Take my own path as an example. I graduated with an Electrical Engineering degree in May of 2000 and took a job designing FPGAs and ASICs. In 2002 I decided to start a part-time Master’s degree in Computer Science, focused on AI. At the time, that meant doing mostly symbolic logic programming in LISP. I was disillusioned with the prospects of doing anything real with AI, so I quit the program halfway through.
For most of the period from 2000 to 2015, if you had talked to academic advisors or industry leaders about AI, the advice you would have been given was “stay away” or “don’t waste your time.” That was the general consensus. I remember visiting San Francisco in the summer of 2015 to give a talk at Jason Calacanis’ incubator about my experience building and selling my first company. That night I crashed at Jason’s house and told him my next company was going to be an AI company. “Really? Is there anything there?” he asked. That’s what the world’s greatest angel investor thought about AI in 2015. My point is, no one was paying attention to AI.
That’s an important point because it means there was no pipeline of AI talent. There were people working on AI who were passionate enough about it to ignore the naysayers, and there were people with tangential skills who could easily have been converted into AI engineers. But both were small groups of people.
Why wasn’t that small group of people evenly distributed across gender and race? Because when an area of study is small, the way you find out about it is more random and somewhat path dependent. So you are at the mercy of that randomness. It is what it is. And anyway, you were discouraged from studying AI no matter what your race or gender, because it was considered a waste of time.
Now. In 2012 that started to change with AlexNet. It was the first neural network to make such a dramatic improvement in image classification that it woke up the AI industry again - roughly 10 percentage points better than the nearest competitor on the ImageNet challenge. But even with that performance, it wasn’t as if everyone started studying AI the next day. These things take time.
A few things happened next. First, the other people working in AI paid attention to see whether AlexNet was a fluke or real progress. That led to huge interest in CNNs and image classification, and by 2015 the error rate on ImageNet was similar to human levels. That was the moment the broader tech community began paying more attention.
So let’s recap. From 2000 to 2012 you have very few people in AI at all, because it was considered a waste of time. From 2012 to 2015 you have a slight uptick, driven by people who knew people in AI and had seen the results of AlexNet and other image classification successes. Forward-thinking colleges began offering more AI classes, but not a ton. Then from 2015 to 2017, that exploded.
Now, finally, we got the very beginnings of a broader group of people studying AI, and being encouraged to pursue it. Then Carnegie Mellon offered the first AI bachelor’s degree - in 2017, I think, although I can’t pin down the exact date. That got the attention of other schools, which followed suit in the following years.
On top of that, more people in tangential fields rebranded themselves as AI people. Now, finally, we began to see the first small signs of diversity in the field.
But think about this - the very first people to enter dedicated AI degree programs haven’t graduated yet. And the people who got CS degrees with an AI focus from the schools that adopted AI classes early are just a year or two into the workforce. Getting a PhD in AI is probably 7-8 years from high school graduation, and getting the education and experience to have an impact in industry takes a similar amount of time - 4 years of school and 3-4 years of work experience.
My broader point here is this - evaluating whether there are gender and race issues in AI takes 10-12 years from the time the field finally got hot. Before 2015, women were discouraged from going into AI because EVERYONE was discouraged from going into AI. If we consider 2015 the industry breakout year, then I’d expect it to be 2025 before we see some semblance of a gender parity trend in AI, simply because you need time for people to see AI as an opportunity, study it, and get into a position to contribute.
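To make that timeline arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The path lengths are the rough estimates from above and the breakout year is the 2015 figure used in this post; everything else is an illustrative assumption, not data.

```python
# Back-of-the-envelope model of the AI talent pipeline lag.
# The figures below are rough assumptions from this post, not data.

BREAKOUT_YEAR = 2015  # the year this post treats as AI's industry breakout

# Hypothetical paths into an impactful AI role: (years of education, years of work experience)
paths = {
    "BS with AI focus": (4, 3),  # ~4 years of school + ~3 years in industry
    "PhD in AI": (8, 0),         # ~7-8 years from high school graduation
}

for name, (education, experience) in paths.items():
    earliest_impact = BREAKOUT_YEAR + education + experience
    print(f"{name}: earliest broad industry impact around {earliest_impact}")
```

Even for the very first cohort that chose AI after the breakout, this puts meaningful industry impact in the early 2020s, and the bulk of the cohort lands later still - which is why judging the field before roughly 2025 is premature.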
So when people ask me about gender and race in AI, I encourage them to take a full pipeline view. Don’t just look at the change in researchers and academics year to year. Look at the growth in incoming college students interested in the field, and use that as a projection of where the field will be in the future.
Changing these things takes time. Progress can be slow. But, as the Buddhist saying goes, “little by little one walks far.” We have far to walk, but demanding rapid, major progress ignores the reality of how long these things take, and ignores the progress we are making little by little.
Thanks for reading.