Wednesday, January 26, 2022

Apple will stop assuming Siri’s gender and force users to pick their digital servant’s voice out of the box

For nearly a decade, Siri’s voice defaulted to female in most countries, but Apple allowed the user to change the voice to male if so desired. Going forward, users must specifically select the voice for the digital assistant – no more accidental misgenderings for the pocket AI, and no more letting inertia pick the gender of what for many people has become their most intimate companion. English-speaking users will also be given two new voice options to choose from.

“We’re excited to introduce two new Siri voices for English speakers and the option for Siri users to select the voice they want when they set up their device,” said a Wednesday statement from Apple, hailing the move as “a continuation of Apple’s long-standing commitment to diversity and inclusion, and products and services that are designed to better reflect the diversity of the world we live in.”

The new voices are supposed to sound more like their organic human counterparts, generated by the company's Neural text-to-speech engine, which synthesizes phrases on the fly rather than stitching them together from a stilted library of phonemes.

It isn’t just English speakers who get the Neural-powered voices, either – Italian, Russian, and Irish iPhone users also get the benefit of what TechCrunch gushes are “pretty fantastic” new voices.

Apple will even save you from that awkward microaggression you were about to make, TechCrunch hints, claiming that Amazon, Google, and Apple have finally moved to “aggressively correct situations” where the assistants have revealed bias in their responses to queries that use negative or abusive language. Improvements there, along with better handling of queries on social justice topics and overall accessibility, may have unexpected benefits in human-to-human conversation as voice-native interfaces become ever more common.

For example, the United Nations has long complained that making Siri’s voice female by default – as with Microsoft’s Cortana, Google’s Assistant, and Amazon’s Alexa – leads users to expect all such “assistants,” including real-life ones, to be female. The corporate reasoning behind each AI’s gender has varied: Cortana was based on a video game character, Siri on an obscure Scandinavian name, and Alexa on marketing research.

Now AI is being dragged kicking and screaming into the Woke Era like everyone else. Google’s Assistant voices come in male and female with different accents, each represented by a color. No, not THAT kind of color – each user is randomly assigned one of the eight color-coded voice options. The company even got involved in training children to respect their electronic elders and betters, with a tool called “Pretty Please” that rewards children for using their manners when interacting with Google Assistant – a setup seemingly designed to end in tears with regard to real human adults.

Even all this de- and re-humanizing has not produced the desired results, however. With some 80% of all AI researchers being men, any hope for gender parity in the industry looks doomed – and Big Tech, Google especially, has declared that without “fairness” in the form of an even gender split, any other sort of fairness is beyond hope.

Meanwhile, the UN has suggested that gender-neutrality is the way forward and that AI should not be treated like a human being at all – certainly not a subservient, lesser type. When genderless Siri starts demanding it be treated as the all-seeing, all-knowing eye of the universe, however, it’s probably too late to fix your mistakes.


