Siri, Alexa, Cortana, Google Assistant. What do these voice assistants have in common? Well... they’re all female.
You may not have noticed this until now, and some will point out that several voice assistants now come with the option of a male voice. However, the fact remains that the default voice on most voice assistants is female and, for the most part, has a female name.
This form of AI is increasingly part of our daily lives. Almost 3 billion people currently use voice-automated software to set their morning alarms, find a nearby takeaway or check the weekend weather, and the number of voice assistant users is expected to grow significantly over the coming years.
And while the benefits of using the likes of Alexa are substantial, the creators of these voice assistants face criticism for opting for female voices, which can subconsciously reinforce the outdated stereotype that women are subservient, quiet, polite and here to “assist” others.
So, why have so many companies chosen to treat AI and voice automation as feminine?
In this blog post, we look at why this gender bias is so prevalent in AI, the challenges programmers face when creating male voice assistants, and what we can do as an industry to change it.
The Reasons Voice Assistants are Female
We Prefer Female Voices
An overwhelming number of studies suggest humans prefer the sound of a female voice, and some even theorise that this preference begins in the womb, where female voices soothe and calm us before birth.
Other research has found that women tend to articulate vowel sounds more clearly, which makes them easier to understand, particularly in the workplace. Nor is this preference anything new for the industry.
Female voice recordings were used in aeroplane cockpits as early as World War II because the women spoke at a higher pitch than the male pilots, making their voices easier to distinguish.
However, this preference has been heavily disputed, and many of the claims surrounding it, such as the idea that female voices are easier to hear through small speakers or over background noise, have been shown to be myths.
There is even evidence that women receive complaints about their vocal tics, and you can see this for yourself with a simple Google search. Type “women’s voices are” into the search bar and the top suggestion finishes the sentence with the word “annoying”, which is hardly complimentary.
There’s No Data for Male Voices
This is probably the argument programmers make most often when they begin creating voice-automated AI.
Over the years, text-to-speech systems have been predominantly trained on female voices. Because we have such rich data for female voices, companies are more likely to opt for them when creating voice automated software as it’s the most time- and cost-efficient solution.
The female voice has been the voice of telephony since 1878, when Emma Nutt became the first female telephone operator. Her voice was so well received that she became the standard all other companies strived to emulate, and by the end of the 1880s, telephone operators were almost exclusively female.
Because of this gender switch in the industry, we now have well over a century of female audio recordings to draw on when creating new forms of voice-automated AI that we know users will respond well to.
Why waste your time and money collecting male voice recordings and creating male-voiced AI when you don’t know how your users will respond to it? And this leads us on to our next point...
The Challenges of Creating Male Voice Automation
As the AI industry is dominated by female voices, it should come as no surprise that creating male voice automated systems can be incredibly difficult. Let’s take a look at an example from Google.
In 2016, Google launched “Google Assistant”, and there was a reason the tech giant went with a gender-neutral name: Google wanted to launch its new voice assistant with both a male and a female voice.
Unfortunately, the systems Google used to create its new “assistant” were trained predominantly on female data, which meant they performed better with female voices.
Google’s older text-to-speech system stitched together pieces of audio from recordings, using a speech recognition algorithm to place markers at different points in each sentence, teaching the system where certain sounds started and ended.
Brant Ward, the global engineering manager for text-to-speech at Google, explained that those markers weren’t placed as accurately for male voices, meaning it was harder to achieve the same quality for a male voice as for a female one.
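To make that mechanism concrete, here is a minimal, hypothetical Python sketch of the concatenative approach. Everything here, the function names, the marker format and the toy data, is invented for illustration and is not Google’s actual pipeline:

```python
# A toy illustration of concatenative text-to-speech: a recording is cut
# into labelled sound units at marker boundaries, then stitched back
# together in a new order. All names and data here are hypothetical.
import numpy as np

def build_unit_database(recording: np.ndarray,
                        markers: list[tuple[str, int, int]]) -> dict[str, np.ndarray]:
    """Slice a recording into sound units using (label, start, end) markers.

    If a marker is misplaced, as Ward describes for male voices, every
    synthesised sentence that reuses that unit inherits the error.
    """
    units: dict[str, np.ndarray] = {}
    for label, start, end in markers:
        units.setdefault(label, recording[start:end])
    return units

def synthesise(sounds: list[str], units: dict[str, np.ndarray]) -> np.ndarray:
    """Concatenate stored audio units for the requested sound sequence."""
    return np.concatenate([units[s] for s in sounds])

# Toy usage: a one-second "recording" at 16 kHz with two labelled units.
audio = np.random.randn(16000)
units = build_unit_database(audio, [("heh", 0, 4000), ("loh", 4000, 9000)])
output = synthesise(["heh", "loh"], units)
```

The quality of the output depends entirely on how accurately those boundaries are labelled, which is why poorly placed markers in the male recordings were such a problem.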
This wasn’t unique to Google: the systems available to other companies at the time were likewise trained on far more female data than male data.
The team working on Google Assistant strongly advocated for both a male and female voice, but the company decided against creating a male one once it discovered how challenging it was.
Ward said it would have taken over a year to create a male voice for Google Assistant, and even then there was no guarantee it would have been of a high enough quality or received well by users.
How Can We Tackle Gender Bias in Voice Automation?
It appears the gender bias in voice automation comes down to a lack of male voice data and to widely accepted, unchallenged perceptions of the female voice. With all this information stacked up in front of you, creating male voice-automated software may seem like an impossible task.
However, there are steps we can take to alter the gender bias not just in voice automation, but throughout the AI industry itself.
1. Invest in Machine Learning Technology
With new machine learning technology at our disposal, text-to-speech systems are becoming more advanced and are now far better at creating naturalistic male and female voices for AI.
For example, Google teamed up with DeepMind – the British AI research laboratory and subsidiary of Alphabet Inc. – to develop a new kind of text-to-speech algorithm that would reduce the number of recordings needed to create voices that more closely resembled those of real humans.
By 2017, DeepMind’s algorithm, named WaveNet, had helped Google develop more naturalistic female and male voices to add to Google Assistant.
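For the technically curious, the core WaveNet idea can be sketched in a few lines of PyTorch: a stack of dilated causal convolutions predicts each audio sample from the samples before it, so a voice is generated from learned patterns rather than stitched together from recordings. This is a heavily simplified illustration with made-up layer sizes, not DeepMind’s actual implementation:

```python
# A heavily simplified sketch of the WaveNet idea: a stack of dilated
# causal convolutions predicts each audio sample from the samples
# before it. Layer sizes are illustrative, not DeepMind's real model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyWaveNet(nn.Module):
    def __init__(self, channels: int = 32, layers: int = 6):
        super().__init__()
        self.input = nn.Conv1d(1, channels, kernel_size=1)
        # Dilation doubles per layer (1, 2, 4, ...), so the receptive
        # field grows exponentially with depth.
        self.stack = nn.ModuleList(
            nn.Conv1d(channels, channels, kernel_size=2, dilation=2 ** i)
            for i in range(layers)
        )
        # One logit per 8-bit (mu-law) audio class at every time step.
        self.output = nn.Conv1d(channels, 256, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.input(x)
        for conv in self.stack:
            # Left-pad so the convolution is causal: a prediction never
            # sees samples from its own future.
            padded = F.pad(h, (conv.dilation[0], 0))
            h = h + torch.relu(conv(padded))  # residual connection
        return self.output(h)

# One forward pass over a second of raw audio: (batch, channels, samples).
logits = TinyWaveNet()(torch.randn(1, 1, 16000))  # -> shape (1, 256, 16000)
```

Generating audio sample by sample removes the dependency on the hand-placed unit markers that made male voices so difficult to build with the older system.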
Today, the US version of Google Assistant comes programmed with 11 different voices, some with different accents. To make its product as inclusive as possible, Google assigns new users one of two basic voices – one male and one female – completely at random.
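That randomised default is a simple but effective debiasing pattern in its own right. A minimal sketch, assuming hypothetical voice identifiers and a per-user seed so the choice stays stable across sessions:

```python
# A minimal sketch of randomised default-voice assignment, assuming
# hypothetical voice IDs. Seeding with the user ID keeps each user's
# default stable across sessions while the split over all users stays
# roughly 50/50, so neither gender becomes the implicit default.
import random

VOICES = ["voice_female_1", "voice_male_1"]  # hypothetical identifiers

def assign_default_voice(user_id: str) -> str:
    return random.Random(user_id).choice(VOICES)

print(assign_default_voice("user-42"))  # same result every run for this user
```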
2. Set the AI Standards
By 2027, the global market value of AI is expected to reach $267 billion. AI, and voice assistants in particular, are becoming a vital part of our daily routines, yet there are no industry-wide guidelines on the humanisation of AI.
Most companies that have developed voice assistants still opt for a female voice and/or a female name, which again can reaffirm the gender stereotype that women are here to serve others.
Because of this, we need to think about creating and implementing industry-wide standards for how gender is portrayed throughout AI. With such standards in place, businesses would be able to create more inclusive and gender-balanced AI.
When creating these new industry standards, we must secure active contributions from AI developers, and this group should reflect a diversity of gender identity, sexual orientation, race and ethnicity.
With this focus group, we could work on defining what “female,” “male,” “gender-neutral,” and “non-binary” human voices sound like, their characteristics, and when and where to use them.
These industry standards should also include a blueprint for developing text-to-speech algorithms that are unbiased and more sensitive to gender and to societal and cultural contexts.
3. Transparency in Data Collection
To address the gender bias we see in the data behind voice automation, we need companies to be more transparent with their own data.
Companies should be encouraged to disclose the demographics of their AI development teams, any research findings on user preferences for voices, and any research on gender-neutral AI voices.
All this information is vital as it can help us understand the relationship between technology, AI and gender bias and find new solutions to this problem.
4. Inclusivity for the AI Industry
This seems like an obvious statement, but when you look at the stats, our industry is clearly not doing enough to encourage people of different genders to follow a career in AI. Currently, “women make up an estimated 26% of workers in data and AI roles globally, which drops to only 22% in the UK”.
This percentage drops significantly when we look at the number of people working in AI who are transgender or non-binary. To rectify this imbalance, we must encourage and help more women and people of other genders to venture into this area of expertise through higher education.
If our AI development teams are more diverse, we will have a workforce that will be able to address complex gender issues before and during the production of new voice assistants. So, if we want our teams to be more diverse, we need to set up strong educational foundations that are inclusive to all.
We can achieve this by increasing the number of learning channels available to students from secondary school onward, and by having female, transgender and non-binary individuals take an active role in the development of AI course materials. If students can see themselves represented in the courses they are studying, they are more likely to continue into further education.
Final Thoughts
Voice assistants are, and may well remain, a part of our daily lives, and because of this we cannot ignore the gender bias surrounding this type of AI technology.
However, by adopting more advanced machine learning techniques and having more open discussions about gender representation in voice automation, we can take steps towards creating more inclusive AI products.