Gender and cultural bias in the programming world


Almost as soon as it happened, my Twitter feed was abuzz with the keynote talk Jacob Kaplan-Moss gave on how the internal mythology of programmer skill distribution excludes and discourages what must be the vast majority of software engineers. You should watch it if you haven't.

His talk has a lot of us talking and thinking about this problem.

Thankfully at this point in the evolution of our industry, the notion that gender, ethnic, and other forms of discrimination don't plague programmer culture and hiring practices has lost all credibility, and the most influential voices like Jacob's are ensuring that awareness of it is thoroughly distributed in our community. The prevailing approach we've taken has been a cultural brute-force attack - advocacy groups like PyLadies and Django Girls, educational materials and courses to encourage women in tech, and thought leaders preaching the gospel of empowerment and diversity. If we make the message pervasive enough and make sure that women and other minorities have access to learning and encouragement, we can overcome the institutionalized discrimination in our industry.

But I don't think this is the most effective approach.

That's not to say it's not a valid, valuable, and respectable approach. It's to say that I think it attacks a complicated, multidimensional problem with numerous causes, confounders, and antagonists with the equivalent of the Care Bear Stare: we don't know how or why it's going to work, but we know what we want, and we're going to want it with all of our hearts until we get it.

What this approach underestimates is that we're still human. I have biases. You have biases. She has biases. They have biases. And we're not going to change that.

Why this is especially a problem for the programming industry is something Jacob outlined extremely well in distinguishing the assessment of software engineers from the quantifying and ranking of ultra-marathoners: we just don't have usable metrics. And we know this. An employer will pay a staffing firm anywhere from $15k to $45k for a new hire, which is why they'll eagerly pay their own employees a bonus of thousands of dollars for a referral. Countless startups are attracting huge VC investments to find technical solutions that make hiring less expensive and increase the candidate signal-to-noise ratio. There still isn't a good solution, because there still isn't a clear idea of what good data to measure by. Solve that, and you're a billionaire, several times over.

Jacob correctly highlighted the propensity of human beings, in the absence of real data, to use stories and anecdotes as small-batch, artisanal data, and that this propensity leads us to create archetypes and stereotypes. That's why, in the absence of actionable data, we're so prone to giving in to our biases.

Some of these biases are consciously constructed. The myth of the "brilliant asshole" that Jacob mentions is one of the oldest. Paul Graham, philosopher-entrepreneur founder of Y Combinator, wrote over a decade ago:

"Those in authority tend to be annoyed by hackers' general attitude of disobedience. But that disobedience is a byproduct of the qualities that make them good programmers. They may laugh at the CEO when he talks in generic corporate newspeech, but they also laugh at someone who tells them a certain problem can't be solved."

Most of these archetypes survive because we have nothing better. Executives and managers read an article in Harvard Business Review or Fast Company about what Google is doing or what Facebook is doing, and they accept it as the causal catalyst that fuels those companies' success. Unconventional hiring wisdom becomes wisdom precisely because it's unconventional.

If we don't have data to tell us what a good programmer looks like, we often use ourselves as models. Much as 90% of drivers self-report as "above-average" drivers, I'd wager most professional software engineers place themselves on the right half of the bell curve. And if I'm a white, straight, cis male who is educated but mostly a self-taught programmer, I'm going to overvalue the attributes that most commonly coincide with those demographics.

Believing in equal opportunity for all in programming and wanting to be part of a diverse developer team -- the by-products of our brute-force awareness campaign -- will have a hard time producing actual equality or diversity at hiring time or review time, because we still have no metrics and so have only our subjective analyses to act upon. Which are necessarily impacted by our biases, because we're still human and we will always have bias.

In fact, even acute awareness of our biases can become counterproductive: we overvalue a candidate's minority status in hiring decisions precisely because we don't want to be biased, which does minority developers a disservice by identifying them not as "the programmer who banged out a fault-tolerant, distributed key-value store for the client in a week" but as "the female programmer." Nobody wants to be known as that.

Being in this world where we don't have metrics is astonishingly disorienting for programmers who are hiring managers and (I love what Heroku calls them) "vibe managers." We know there are more coding jobs than qualified programmers to fill them. We know members of our team could get poached at any time by another firm, and we know how hard it is to convince a programmer with multiple job offers that ours is the one he or she should accept. And so we focus on making sure we have a company culture that includes value-adds for belonging to it: ping pong tables, happy hours, catered lunches, and social outings.

Most of the time, these only reinforce the assumption that other programmers are exactly like us. Do your happy hours center around alcoholic beverages? Have you considered that some of your engineers might not drink, or might themselves be in recovery or close to someone who is? Do your social outings run into the evening? Have you considered that some of your engineers have small children at home, and that responsibility excludes them from the plans? Do your catered lunches take into account engineers who are vegetarian? Vegan? Kosher or halal? Diabetic or celiac?

Do we adjust and edit our events and offerings to fit the team we have at any given time? Do we do none of it so that nobody feels excluded? Do we do what we've always done and decide that people who feel excluded really just don't belong in our company?

I honestly don't have a good answer. This is hard stuff to solve.

But we still have to solve it. I worked at one shop many years back where my group and its leaders were exclusively white, straight, cis males, and even belonging to that same demographic, I was really uncomfortable. We didn't have an outage; we "got raped." Song lyrics got sung repeatedly with single words replaced by "semen." I expressed discomfort to my team lead; as a result, half the group was uncomfortable that I was "censoring" them, and the other half complied with the letter of my request without the spirit, and indulged in doing so (instead of "raped," we got "graped"). I left that company in under a year. As its revenues were plummeting, it got bought out by a larger competitor that laid off most of the staff.

So to solve it, to develop an effective strategy toward achieving the ends we want of something closer to a world in which diversity, opportunity, and inclusiveness is hard-wired into our industry, I think we must accept our humanity and grant that we'll always be biased. If we accept that, if we accept that our hiring decisions will always be subjective and prone to bias, and if we want diversity and inclusiveness anyway, where would that take us?

The first place is your recruiting team and your recruiting practices. Diversity starts there. If your recruiting team is stocked with white, straight, cis men, there's bias in your very first filter - that's not a promising start. The most common complaint I hear among my peers who are leading and building engineering teams is that they're not seeing enough applications from female engineers - the internal culture of the recruiting team may be one of the biggest reasons why. Nothing reinforces a monoculture like monoculture.

But let's say you've got a recruiting team that has internally adopted a more diverse composition and awareness. Since they themselves are probably not terribly technical people, I'll wager we've given them a checklist of qualifications to search LinkedIn or Dice for: N years experience in Python or Django; experience with one or more of the following: Backbone.js, bootstrap.js, AngularJS, or ReactJS; experience in agile methodologies like Scrum or Kanban; etc.

I think this fuels the barrier to entry that Jacob highlights.

All of those enumerated qualifications will only allow through the filter candidates that have passed through other recruiters' filters before - which means they've already been subject to discrimination. There's no room in that checklist for qualifications we may very well want besides "N years experience in the industry" - motivation to grow, self-learning capability, communication and community skills, engineering discipline for procedures and policies. Your recruiting team needs to look for traits other than checkboxes for technology stacks. And they need to be aware internally of the biases that get invited to the table when subjectively evaluated traits like that enter the picture.

So the recruiter is satisfied and sends the candidate along for consideration, putting together a packet of a resumé and some code samples and shipping it off to the hiring manager - somebody like me. I've been hiring Python engineers for 7 years now, and I will tell you right now that I have my own biases that can come into play in my decision making, even if I'm entirely aware of them and entirely determined not to let them influence my calculus. I will tell you right now: I fail at that, and I always will. So what do I do?

First, I don't want to ever see a resumé with a name or address on it. I don't want to know if they're male or female. I don't want to guess from their name that they're Chinese or Indian. I don't want to wonder if, because of the hyphenated last name, she's married or has a family. I don't want to infer from their address in Anacostia that they're black or from their address in Potomac, Maryland, that they're white, middle-upper class. None of the inferences I could possibly make from that information are at all relevant to whether or not this candidate would make a good hire, and yet they're the first thing we list on a resumé. Call them Candidate #421. Leave it at that.

Second, I want to see code samples before I ever see the resumé or hear anything about the candidate. If it's code that solves a stock programming problem that we give every candidate, even better - apples to apples. In my role as a team leader and principal engineer, the most important thing for me about a particular candidate is: do they write code that I want to read and I want to approve a pull request for? I want the code to be my first impression about the candidate. Res ipsa loquitur.

I can only consciously override my own biases so much. I'd rather build structure into my work life so that my biases are not even given a chance to act.
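That structure can be as simple as a script that sits between the recruiter and the hiring manager, stripping the identifying fields from a candidate packet and putting the code sample first. A minimal sketch of the idea - the `Submission` record and its field names are my own assumptions for illustration, not any real applicant-tracking schema:

```python
from dataclasses import dataclass


@dataclass
class Submission:
    """What the recruiter collects (hypothetical fields)."""
    name: str
    address: str
    resume_text: str
    code_sample: str


def anonymize(submission: Submission, candidate_number: int) -> dict:
    """Build the packet the hiring manager actually sees.

    Name and address are dropped entirely; the candidate is known only
    by a number, and the code sample is ordered first so it forms the
    first impression.
    """
    return {
        "label": f"Candidate #{candidate_number}",
        "code_sample": submission.code_sample,  # first impression: the code
        "resume_text": submission.resume_text,
    }


packet = anonymize(
    Submission(
        name="Jane Q. Hacker",
        address="123 Main St",
        resume_text="7 years of Python...",
        code_sample="def solve(): ...",
    ),
    candidate_number=421,
)
assert "Jane" not in str(packet)  # no name or address survives
print(packet["label"])  # Candidate #421
```

The point isn't the code itself but the design choice it encodes: the redaction happens before the packet ever reaches a human whose biases could act on it, rather than relying on that human to consciously ignore what they've already seen.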

I think it's only once we've gotten to this point that the brute-force attacks on prevailing cultural expectations in the programming world can really take hold. That's the point where we can actually see the cultural transformations: where sexual harassment training for managers is no longer an empty ritual for a cover-your-ass HR policy, but something managers actually want to learn, to ensure they're not fostering an exclusive or oppressive work environment. Or where we're actively not okay with our colleagues and employers being discriminatory or frat-ish or offensive.

Jacob's absolutely right that there's work to be done. I can't imagine thinking that success for your company involves systematically disqualifying over half of the potential applicant pool. I think the place to get started is in accepting our biases and engineering mechanisms to ensure they don't even have a chance to act, because in the absence of usable hiring metrics, it's otherwise inevitable that they'll come into play.
