Tech Has Become Another Way For Men To Oppress Women

We act as if technology were neutral, but it's not. The challenge now is to remove the gender bias, says human rights lawyer and writer Lizzie O'Shea

"Most women in the Bay Area are soft and weak, cosseted and naive, despite their claims of worldliness, and generally full of shit," wrote former Facebook product manager Antonio García Martínez in 2016. "They have their self-regarding entitlement feminism, and ceaselessly vaunt their independence. But the reality is, come the epidemic plague or foreign invasion, they'd become precisely the sort of useless baggage you'd trade for a box of shotgun shells or a jerry can of diesel." This is from his insider account of Silicon Valley, Chaos Monkeys. The book was a bestseller. The New York Times called it "an irresistible and indispensable 360-degree guide to the new technology establishment". Anyone who is surprised by the recent revelations of sexism spreading like wildfire through the technology industry has not been paying attention.

When Susan Fowler wrote about her experience of being sexually harassed at Uber, it prompted a chain of events that seemed unimaginable only months earlier, including an investigation led by former attorney general Eric Holder and the departure of a number of key members of the company's leadership team. Venture capitalist Justin Caldbeck faced allegations of harassing behaviour, and when he offered an unimpressive denial, companies funded by his firm banded together to condemn his tepidity. He subsequently resigned, and the future of his former firm is unclear. Since then, dozens of women have come forward to reveal the sexist culture in numerous Silicon Valley technology and venture capital firms. It is increasingly clear from these accounts that the problem for women in the tech industry is not a failure to "lean in"; it is a culture of harassment and discrimination that makes many of their workplaces unsafe and unpleasant.

At least this issue is being discussed in ways that open up the possibility that it will be addressed. But the problem of sexism in the tech industry goes much deeper and wider. Technological development is undermining the cause of women's equality in other ways.

American academic Melvin Kranzberg's first law of technology tells us that technology is neither inherently good nor bad, nor is it neutral. As a black mirror, it reflects the problems that exist in society, including the oppression of women. Millions of people bark orders at Alexa every day, but rarely are we encouraged to wonder why the domestic organiser is voiced by a woman. The entry system for a women's locker room at a gym recently refused entry to a female member because her title was "Dr", and it categorised her as male.

But the issue is not only that technology products reflect a backward view of the role of women. They often also appear ignorant of, or indifferent to, women's lived experience. As the internet of things expands, more devices in our homes and on our bodies are collecting data about us and sending it to networks, a process over which we often have little control. This presents profound problems for vulnerable members of society, including survivors of domestic violence. Wearable technology can be hacked, cars and phones can be tracked, and data from a thermostat can reveal whether someone is at home. This potential is frightening for people who have experienced rape, violence or stalking.

Unsurprisingly, technology is used by abusers: in one survey of domestic violence service organisations, 97% reported that the survivors they work with have experienced harassment, monitoring and threats from abusers through the misuse of technology. This often happens on phones, but 60% of those surveyed also reported that abusers have spied or eavesdropped on the survivor or children using other forms of technology, including toys and other gifts. Many shelters have resorted to banning the use of Facebook for fear of revealing their location to stalkers. There are ways to design devices that give control to users and limit the capacity for abuse, but there is little evidence that this has been a priority for the technology industry.

Products that are more responsive to the needs of women would be a great start. But we should also be thinking bigger: we must avoid reproducing sexism in system design. The word-embedding models used in things like conversation bots and word searches provide an instructive example. These models work by feeding huge amounts of text into a computer so that it learns how words relate to each other in space, on the premise that words which appear near each other in texts share meaning. These spatial relationships are used in natural language processing so that computers can engage with us conversationally. By reading a lot of text, a computer can learn that Paris is to France as Tokyo is to Japan. It develops a dictionary by association.
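
To make the mechanics concrete, here is a minimal sketch of how such a model is trained and queried using the open-source gensim library. The library choice, the toy corpus and the parameters are illustrative assumptions rather than a description of any production system; meaningful analogies only emerge once the model has read a very large corpus.

    # A minimal word-embedding sketch, assuming gensim 4.x.
    # The tiny corpus is purely illustrative; real models read millions of sentences.
    from gensim.models import Word2Vec

    sentences = [
        ["paris", "is", "the", "capital", "of", "france"],
        ["tokyo", "is", "the", "capital", "of", "japan"],
        ["berlin", "is", "the", "capital", "of", "germany"],
    ]

    model = Word2Vec(sentences, vector_size=50, window=5, min_count=1, epochs=50)

    # Analogy by vector arithmetic: france - paris + tokyo should land near "japan",
    # once the model has seen enough text to learn the relationship.
    print(model.wv.most_similar(positive=["france", "tokyo"], negative=["paris"], topn=1))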

But this can create problems when the world is not exactly as it ought to be. For instance, researchers have experimented with one of these word-embedding models, Word2vec, a popular and freely available model trained on Google News text and covering some three million words and phrases. They found that it produces highly gendered analogies. When asked "Man is to woman as computer programmer is to ...?", the model answers "homemaker". For "father is to mother as doctor is to ...?", the answer is "nurse". Of course the model reflects a certain reality: it is true that there are more male computer programmers, and nurses are more often women. But this bias, reflecting social discrimination, will now be reproduced and reinforced when we engage with computers using natural language that relies on Word2vec. It is not hard to imagine how this model could also be racially biased, or biased against other groups.
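
The queries behind findings like these are short. Below is a sketch of how the experiment can be reproduced against the widely shared pretrained Google News vectors; the file name and the underscore-joined phrase token are assumptions about that particular distribution, not details given in the text above.

    # Reproducing the gendered-analogy queries, assuming gensim 4.x and the
    # commonly distributed GoogleNews-vectors-negative300.bin file.
    from gensim.models import KeyedVectors

    vectors = KeyedVectors.load_word2vec_format(
        "GoogleNews-vectors-negative300.bin", binary=True
    )

    # "Man is to woman as computer programmer is to ...?"
    print(vectors.most_similar(positive=["woman", "computer_programmer"],
                               negative=["man"], topn=3))

    # "Father is to mother as doctor is to ...?"
    print(vectors.most_similar(positive=["mother", "doctor"],
                               negative=["father"], topn=3))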

These biases can be amplified during the process of language learning. As the MIT Technology Review points out: "If the phrase 'computer programmer' is more closely associated with men than women, then a search for the term 'computer programmer CVs' might rank men more highly than women." When this kind of language learning has applications across fields including medicine, education, employment, policymaking and criminal justice, it is not hard to see how much damage such biases can cause.
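
How that amplification might happen can be sketched in a few lines: if documents are scored by how close their averaged word vectors sit to a query, gendered words alone can nudge one otherwise identical CV above another. The scoring scheme and the example CVs below are illustrative assumptions, not a description of any real search engine, and the snippet reuses the vectors loaded in the previous sketch.

    # Illustrative ranking-by-similarity sketch; "vectors" is the KeyedVectors
    # object loaded above, and the CVs are hypothetical.
    import numpy as np

    def doc_vector(tokens, vectors):
        # Represent a document as the average of its in-vocabulary word vectors.
        vecs = [vectors[t] for t in tokens if t in vectors]
        return np.mean(vecs, axis=0)

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Two hypothetical CVs with identical skills, differing only in gendered words.
    cvs = {
        "cv_a": ["he", "built", "software", "in", "python", "and", "java"],
        "cv_b": ["she", "built", "software", "in", "python", "and", "java"],
    }

    query = doc_vector(["computer", "programmer"], vectors)
    for name, tokens in cvs.items():
        print(name, cosine(query, doc_vector(tokens, vectors)))
    # If "computer programmer" sits closer to male-associated words in the
    # embedding space, cv_a will score slightly higher than cv_b.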

Removing such gender bias is a challenge, in part because the problem is inherently political: Word2vec entrenches the world as it is, rather than what it could or should be. But if we are to alter the models to reflect aspirations, how do we decide what kind of world we want to see?
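
One technical response that researchers have proposed is to identify a "gender direction" in the vector space, for instance the difference between the vectors for "she" and "he", and strip that component out of words that ought to be gender-neutral. The sketch below shows only the core projection step; the word choices are illustrative, it reuses the vectors loaded earlier, and real debiasing methods are considerably more involved, which is exactly where the political question above comes in.

    # Minimal sketch of "neutralising" a word along a gender direction,
    # reusing the pretrained vectors loaded above; illustrative only.
    import numpy as np

    def neutralise(vec, direction):
        # Remove the component of vec that lies along the normalised direction,
        # so the word is no closer to one end of that axis than the other.
        g = direction / np.linalg.norm(direction)
        return vec - np.dot(vec, g) * g

    gender_direction = vectors["she"] - vectors["he"]
    debiased_programmer = neutralise(vectors["computer_programmer"], gender_direction)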

Digital technology offers myriad ways to put these understandings to work. It is not bad, but we have to challenge the presumption that it is neutral. Its potential is being explored in ways that are sometimes promising, often frightening and amazing. To make the most of this moment, we need to imagine a future without the oppressions of the past. We need to allow women to reach their potential in workplaces where they feel safe and respected. But we also need to look into the black mirror of technology and find the cracks of light shining through.

This article was originally published in The Guardian. Image courtesy of Ben Jennings.
