We built voice modulation to mask gender in technical interviews. Here's what happened.

Since we started, we've amassed data from thousands of technical interviews, and on this blog we routinely share some of the surprising things we've learned. In this post, I'll talk about what happened when we built real-time voice masking to investigate the magnitude of bias against women in technical interviews. In short, we made men sound like women and women sound like men and looked at how that affected their interview performance. We also looked at what happened when women did poorly in interviews, how drastically that differed from men's behavior, and why that difference matters for the thorny issue of the gender gap in tech.

The setup.

When an interviewer and an interviewee match on our platform, they meet in a collaborative coding environment with voice, text chat, and a whiteboard and jump right into a technical question. Interview questions on the platform tend to fall into the category of what you'd encounter at a phone screen for a back-end software engineering role, and interviewers typically come from a mix of large companies like Google, Facebook, Twitch, and Yelp, as well as engineering-focused startups like Asana, Mattermark, and others.

After every interview, interviewers rate interviewees on a few different dimensions.

[Image: feedback form for interviewers.]

As you can see, we ask the interviewer whether they would advance their interviewee to the next round. We also ask about a few different aspects of interview performance on a 1-4 scale. On our platform, a score of 3 or above is generally considered good.

Women historically haven't performed as well as men.

One of the big motivators to think about voice masking was the increasingly uncomfortable disparity in interview performance on the platform between men and women.
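To make "disparity" concrete, here is a minimal sketch of how advancement rates and average technical scores might be compared across groups. The records, field names, and values below are invented for illustration; they are not interviewing.io's actual data or schema.

```python
from statistics import mean

# Hypothetical feedback records (illustrative only).
feedback = [
    {"gender": "M", "advance": True,  "technical": 4},
    {"gender": "M", "advance": True,  "technical": 3},
    {"gender": "M", "advance": False, "technical": 2},
    {"gender": "F", "advance": True,  "technical": 3},
    {"gender": "F", "advance": False, "technical": 2},
    {"gender": "F", "advance": False, "technical": 2},
]

def summarize(records, gender):
    """Return (advance rate, mean technical score) for one group."""
    group = [r for r in records if r["gender"] == gender]
    advance_rate = sum(r["advance"] for r in group) / len(group)
    avg_technical = mean(r["technical"] for r in group)
    return advance_rate, avg_technical

m_rate, m_score = summarize(feedback, "M")
f_rate, f_score = summarize(feedback, "F")
print(f"advance-rate ratio (M/F): {m_rate / f_rate:.2f}")
print(f"avg technical: M={m_score:.2f}, F={f_score:.2f}")
```

The interesting quantity is the ratio of advancement rates, since it is comparable across platforms with different overall pass rates.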
At that time, we had amassed over a thousand interviews with enough data to do some comparisons, and we were surprised to discover that women really were doing worse. Specifically, men were getting advanced to the next round noticeably more often than women. Interviewee technical score wasn't faring much better: men on the platform had an average technical score of 3 out of 4, compared to a lower average for women. Despite these numbers, it was really difficult for me to believe that women were just somehow worse at computers, so when some of our customers asked us to build voice masking to see if that would make a difference in the conversion rates of female candidates, we didn't need much convincing.

So we built voice masking.

Since we started working on interviewing.io, we'd known we would eventually have to tackle hiding gender during interviews. Some early ideas included sending female users a Bane mask. When the Bane mask thing didn't work out, we decided we ought to build something within the app, and if you play the videos below, you can get an idea of what voice masking on interviewing.io sounds like. In the first one, I'm talking in my normal voice. And in the second one, I'm modulated to sound like a man.

Armed with the ability to hide gender during technical interviews, we were eager to see what the hell was going on and get some insight into why women were consistently underperforming.

The experiment.

The setup for our experiment was simple. Every Tuesday evening at 7 PM Pacific, interviewing.io hosts practice rounds. In these practice rounds, anyone with an account can show up, get matched with an interviewer, and go to town. And during a few of these rounds, we decided to see what would happen to interviewees' performance when we started messing with their perceived genders. In the spirit of not giving away what we were doing and potentially compromising the experiment, we told both interviewees and interviewers that we were slowly rolling out our new voice masking feature and that they could opt in or out of helping us test it out.
Most people opted in, and we informed interviewees that their voice might be masked during a given round and asked them to refrain from sharing their gender with their interviewers.
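As an aside, the core of a voice modulator like the one described above is a pitch shifter. Below is a deliberately naive sketch using only NumPy; it shifts pitch by resampling, which also changes duration. A production masking system like the one in the post additionally preserves timing and formants so the voice still sounds natural, which this toy approach does not attempt.

```python
import numpy as np

SR = 16000  # sample rate in Hz (an assumption for this sketch)

def pitch_shift(signal, semitones):
    """Naive pitch shift: resample the signal by a factor of
    2**(semitones/12). Raises pitch but also shortens duration."""
    factor = 2 ** (semitones / 12)
    idx = np.arange(0, len(signal) - 1, factor)
    return np.interp(idx, np.arange(len(signal)), signal)

def dominant_freq(signal):
    """Frequency of the strongest FFT bin, in Hz."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / SR)
    return freqs[np.argmax(spectrum)]

t = np.arange(SR) / SR                 # one second of samples
voice = np.sin(2 * np.pi * 220 * t)    # a 220 Hz tone stands in for a voice
shifted = pitch_shift(voice, 5)        # up five semitones

print(dominant_freq(voice), dominant_freq(shifted))
```

The dominant frequency of the shifted signal comes out roughly 2**(5/12) ≈ 1.33 times higher than the original, which is what "sounding like a different gender" boils down to at the crudest level.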