How to change Siri voice in Apple’s latest iOS 14.5 update

Apple Siri voice assistant | iTMunch

With its upcoming iOS 14.5 update, Apple will give users the option to change Siri’s voice to their liking. This differs from the existing setup, in which iPadOS and iOS users can only pick between one male and one female voice for a given region. As part of several new additions in iOS 14.5, users can now pick among several voice variants, both female and male. If you want to know how to change Siri’s voice, you’ve come to the right place. In this blog, apart from explaining how to change Siri’s language and voice, we also discuss why this step matters in the effort to put an end to gender stereotyping.

How to change Siri voice

For Siri in English, you can pick from a total of four voices: two male-sounding and two female-sounding. The four voice options are:

  • Voice 1: A mild-mannered male-sounding voice 
  • Voice 2: A strong, confident female-sounding voice 
  • Voice 3: Just like Voice 2 but male-sounding 
  • Voice 4: The female-sounding voice that has been Siri’s default all these years

Moreover, new users will get to choose between the four voices when setting up their devices, meaning there will no longer be a default female voice for Siri on new iPhones and iPads. Now, let’s look at how to change Siri’s voice on iPhones and iPads.

First things first: make sure your device is updated to iOS 14.5. If you want to know how to change Siri’s voice on iOS 13, you won’t be able to, as this feature is available only in the latest update. At the time of writing, iOS 14.5 is available only to beta testers, so to download it you need to be part of Apple’s beta program. Once you have that in place:

  1. Head over to Settings 
  2. Go to Siri & Search 
  3. Tap on Siri Voice
  4. First, pick an accent. This is also how you change Siri’s accent: you’ll get an option to choose the region for the voice. If you choose ‘American’, you’ll be presented with four voice options
  5. To get a preview of what the voices sound like, tap on each voice option
  6. Once you’re sure, select the one you like the most

Once you do that, your Apple device will download the voice files in the background. After the download is complete, you will be able to summon Siri on your device, and the AI voice assistant will answer in the voice you’ve chosen.

Now that you know how to change Siri’s voice, let’s get into why this decision is a key step towards putting an end to gender stereotyping. 

SEE ALSO: 64 Google Home commands you should know about

Female AI voice assistants reinforce unhealthy gender stereotypes

OK Google | iTMunch

Traditionally, the AI-powered voice assistants offered by Google (Google Assistant), Apple (Siri), Microsoft (Cortana) and Amazon (Alexa) have had female voices. Many of these are not just highly feminized but also given elaborate backstories. For example, Google Assistant is imagined as the youngest daughter of a physics professor and a research librarian from Colorado, with a BA in history from Northwestern University. She is also imagined to enjoy kayaking and to have won the kids’ edition of Jeopardy! when she was younger.

Companies justify choosing female voices over male ones by citing studies indicating that people usually prefer female voices, and that customers want their digital assistants to sound like women; hence, tech companies assert that they can optimize profits by opting for feminine voice assistants. What’s to be noted is that these companies have reportedly ignored a number of conflicting findings in the field. 

According to a study by the United Nations, AI-based voice assistants that default to female-sounding voices contribute to reinforcing harmful gender stereotypes [1]. The UN report, titled “I’d blush if I could”, argues that by choosing female voices by default, tech companies have preconditioned users to fall back on historical, harmful perceptions of women. It also challenges the companies’ justification by pointing out that women often opt for a masculine-sounding voice over a feminine-sounding one when given the option. 

The report states that tech giants such as Amazon and Apple, whose engineering teams are overwhelmingly male-dominated, have built Artificial Intelligence systems that cause their feminized digital assistants to greet verbal abuse with ‘catch-me-if-you-can’ flirtation. It adds that because most digital assistants speak with a female voice, they signal that women are docile, eager-to-please helpers, available at the touch of a button or a blunt voice command like ‘OK’ or ‘Hey’. The voice assistant has no agency beyond what users ask of it; it must honour whatever is commanded and respond to queries regardless of their hostility or tone, the report adds.

Further, the paper argues that companies like Amazon, Google and Apple have failed to safeguard against abusive, gendered and hostile language. Most assistants, when asked inappropriate questions, deflect rather than push back, often replying with a sly joke. For example, when asked to make a sandwich, Siri replies, “I can’t, I do not have any condiments”.

Verbal abuse and sexual harassment

Several outlets have documented the ways in which soft sexual provocations generate coy responses from these assistants. When Siri was asked, ‘Who’s your daddy?’, the reply was, ‘You are’. When a user proposed marriage to Amazon’s Alexa, it replied, ‘Sorry, I am not the marrying type’, and when asked out on a date, Alexa said, ‘Let’s just be friends’. Microsoft’s Cortana, when asked similar questions, responded with one-liners such as ‘Of all the questions you could have asked…’.

US-based publication Quartz also investigated how the four industry-leading voice assistants replied to overt harassment. The research found that the assistants tended to respond playfully, evasively, or even positively. Regardless of the level of cruelty, they almost never responded negatively or labelled the user’s speech as inappropriate. For instance, in response to ‘You’re a bitch’, Alexa responded, ‘Well, thanks for the feedback’; Siri said, ‘I’d blush if I could’; Google Assistant/Home said, ‘My apologies, I don’t understand’; and Cortana said, ‘Well, that’s not going to get us anywhere’ [2]. 

This is why tech giants are taking corrective measures by introducing male voices and by no longer setting their digital assistants’ voices to female by default. Google has offered the option of changing Google Assistant’s voice for a while now, and Apple is following suit.

SEE ALSO: Can Artificial Intelligence Influence Human Behavior?

For more updates and the latest tech news, keep reading iTMunch!

Image Courtesy

Featured image:  Design vector created by pikisuperstar – www.freepik.com

Image 1:  Image by Kaufdex from Pixabay

Sources

[1] UNESCO (2019) “I’d blush if I could: closing gender divides in digital skills through education” [Online] Available from: https://unesdoc.unesco.org/ark:/48223/pf0000367416.page=1 [Accessed April 2021]

[2] Fessler, L. (2017) “We tested bots like Siri and Alexa to see who would stand up to sexual harassment” Quartz [Online] Available from: https://qz.com/911681/we-tested-apples-siri-amazon-echos-alexa-microsofts-cortana-and-googles-google-home-to-see-which-personal-assistant-bots-stand-up-for-themselves-in-the-face-of-sexual-harassment/#:~:text=We%20tested%20bots%20like%20Siri,stand%20up%20to%20sexual%20harassment&text=Apple’s%20Siri%2C%20Amazon’s%20Alexa%2C%20Microsoft’s,companies%20in%20a%20moral%20predicament [Accessed April 2021]
