There Are Many Nuances to Bias and Injustice
In my last article, I looked at the issue of gender bias in the tech industry. My day job is organizational decision behavior, and I’m always wary of the web of biases and prejudices we all carry with us. I knew I couldn’t explore this topic without getting out of the building and talking to women in tech. And I thought it would be interesting to get a sense of what the landscape looks like from successful New Zealanders.
My first interview was with Pieta Brown, Chief Analytics Officer for Lab360, a New Zealand analytics services business. Pieta is responsible for driving the Lab360 analytics vision, strategy and thought leadership. Her career has spanned the legal, consulting and statistics professions. She’s passionate about using data analytics to drive value for clients and New Zealand as a whole. She was one of the first two graduates of the Master of Professional Studies in Data Science from the University of Auckland.
Rohan: What does the phrase ‘gender bias’ mean to you?
Brown: It’s a skew in our decision making and action taking based on a person’s gender. It’s about making decisions you otherwise wouldn’t, creating or preventing opportunities based on your assumptions of what a person brings to the table. It’s about stereotypes and whether we let them frame our thinking.
Gender bias in the technology industry is well-documented in the media. Women report being excluded from groups and teams within the workplace as well as social circles outside of work. They face an unconscious bias based on their gender and are overlooked for promotions. They experience inflexible working environments around maternity leave and child care. In the worst cases, they are victims of sexual harassment by their peers and superiors.
One unfortunate part of being human is we tend to accept the way something is framed when asked for our opinion. The posing of the question determines the answer in a significant way. This fundamental part of the way our brains are wired raises a lot of questions about how we navigate the information age. We can come up with evidence to prove pretty much anything. And for as long as we keep searching for ‘the answer’ without reflecting on the question, we’ll struggle to gain the initiative over the events in our lives.
The lesson is that we should get in the habit of looking at something in several different ways to give us a chance to actually understand what is at play. Manipulating the discussion by using a framing effect was the intent behind my next question, which Pieta, to her credit, stopped in its tracks.
Rohan: What sort of framing effect is present in the phrase ‘gender bias in the tech industry’?
Brown: The trouble is that the entire industry as a whole has been tarred with the same brush; that this assumed gender discrimination is the experience of women in tech everywhere. Here in New Zealand our tech sector is influenced and impacted by Silicon Valley’s sexism. This means if you’re a woman in tech, you’re framed as suffering the effects of gender bias, which in turn makes it real. It is a 21st century self-fulfilling prophecy.
Gender bias is an example, albeit a widespread one, of the meaner side of human nature. If we weren’t capable of feats of greatness, beauty and inspiration we’d be rather a mean-spirited bunch. One of the more sinister aspects of bias is that much of it is unconscious. We often don’t really know when we’re being assholes.
In other situations, we can be quite well-meaning but act in a way that reveals some of our deeper prejudices. The good thing is these prejudices are visible to others and, hopefully with a bit of patient communication, we can help each other see the implications of our worst behavior.
Rohan: What sort of unconscious social bias bothers you the most in human society?
Brown: That ‘different’ is worse or, in some way, is not as good as ‘us’ and who we are.
One of the conceits about organizations is that we can mandate the behavior we want with rules and decrees from the throne. In the domain of injustice, the influence of John Rawls led people to seek to design the just institution.
In what Amartya Sen calls the ‘arrangement-focused’ approach to justice, elements of individual behavior are assumed. As Rawls notes, ‘… for the most part, I examine the principles of justice that would regulate a well-ordered society. Everyone is presumed to act justly and to do their part in upholding just institutions’. The idea that we can design our way out of gender bias is attractive but elusive.
Rohan: If we hired 50:50 men and women, but only from Ivy League universities, would we be managing the gender bias?
Brown: That wouldn’t fix the gender bias as female Yale graduates aren’t representative of women in a general sense, and would, in turn, create a new bias; an education bias. The answer is not imposing rules to create a 50:50 gender split.
Rohan: Which will be more effective in encouraging the outcome of more intelligent, motivated and empowered women in tech companies? Writing policies to encourage their selection, or through our behavior creating workplaces that naturally attract them?
Brown: We need to focus on workplaces where women naturally choose to work. Because if this doesn’t happen, if Silicon Valley maintains its sexist culture, these women will end up going elsewhere, which is a loss for the industry as a whole.
As a writer, it is important for me to remain aware of my language, how my inner voice finds its way onto the page, and to strike a balance between how the story writes itself and what I’m setting out to say. I’m fond of frameworks that help assess the relative fitness of writing. One such framework is the Finkbeiner test, which helps journalists avoid gender bias in media articles. To pass the test, the article must not mention:
- That the person is a woman
- Her husband’s job
- Her child care arrangements
- How she nurtures her underlings
- How she was taken aback by competitiveness in her field
- How she’s a role model for other women
- How she’s the ‘first woman to…’
Rohan: The Finkbeiner test is useful because it explores consequent questions of bias. In what ways does gender bias reveal itself in your world?
Brown: I believe we associate different characteristics and strengths with gender stereotypes. None of this is unique to technology, it’s about our use of language generally and whether we do or don’t accept certain behaviors. I’d add to the Finkbeiner test ‘any reference to her appearance’.
Sometimes we just say dumb stuff. We’ve all done it and it’s contextual. Sometimes thinking back, we can’t believe what we said. But we did say it and in some way it reflects who we are, our experiences and our expectations of the world. One thing that comes with our connected world is that we can’t be sure what other people will make of what we say and how far things will go. Given how all of us lapse in terms of self-awareness, perhaps the thing to be surprised about is that we don’t get dragged over the coals more often.
Rohan: Nobel prize winner Sir Richard ‘Tim’ Hunt found himself in a world of pain last year when he said the trouble with girls working in science is that “three things happen when they are in the lab… you fall in love with them, they fall in love with you and when you criticize them, they cry”. He later went on to apologize wholeheartedly: “I mean, I’m really, really sorry that I caused any offense. That’s awful. I certainly didn’t mean… I just meant to be honest actually”. How might his statement have been true for him? How might we be compassionate toward his perspective?
Brown: This might well have been his own experience. He may have seen it and then generalized from that. It’s entirely possible he did not see how what he said would be characterized. To be compassionate toward what he said means we would need to reframe it.
Rohan: I grew up in a family of powerful women who belonged to a Germaine Greer world. One thing she wrote was “Every woman knows that, regardless of all her other achievements, she is a failure if she is not beautiful”. What does this mean to you?
Brown: Well, what do we mean by ‘beautiful’? If we’re talking about physical appearance, then that’s a very sad statement. Beauty can mean so many things and has many subtle contexts to it. There can be so much judgment in the language we use.
Rohan: She also wrote “Women’s issues are often disguised as people issues, unless they are relegated to the women’s pages which amazingly still survive. Senior figures are all male, even the few women who are deemed worthy of obituaries are shown in images of their youth”. I’ve taken the line that gender bias in tech is an issue of human injustice. Have I simply disguised a women’s issue as a people issue?
Brown: The recently released Elephant in the Valley report investigated claims by surveying more than 200 women working in the Silicon Valley/San Francisco Bay Area, and the results were far from encouraging:
- 60 percent received unwanted sexual advances
- 65 percent of those women had received those advances from a superior at work
- 59 percent felt they were not receiving the same opportunities as their male colleagues
- 84 percent had been told they were too aggressive
- 53 percent were told they were too quiet
It’s hard to argue with these results, and not to feel camaraderie and empathy toward these women who are dealing with this unwarranted gender bias while going about their daily working lives. There are so many nuances to bias, and to injustice, and it’s easy to focus on traits such as race and gender. But ultimately we are talking about human experiences: human experiences that might need different and nuanced strategies.