In iOS 15.4, Apple will give Siri a less gendered voice option
The company's AI assistant will soon offer a wider range of voices.
Apple has announced a new, less gendered voice for Siri. The iOS 15.4 update will include a fifth US voice, designed to sound neither distinctly male nor distinctly female.
Apple told Axios, which first broke the story, that the voice was recorded by a "member of the LGBTQ+ community." It didn't reveal anything further about that person's identity.
The change follows Apple's earlier decision to ask customers to pick their preferred Siri voice while setting up their phone. Before that, Apple chose the voice's gender based on the country, a choice that some felt rested on unfair gender stereotypes. In the United States, the assistant had always defaulted to a female voice.
Siri's new voice isn't the only addition coming in iOS 15.4, which also brings AirTag anti-stalking warnings and contactless payments made directly on iPhone. According to Axios, the latest iOS 15.4 beta includes the less gendered Siri voice option for English speakers.
The voice is intended to broaden the diversity of Apple's assistant.
It's unclear when or whether the voice will come to other languages. Apple also hasn't said when iOS 15.4 will be released, though the company is reported to be holding an event in early March that would be a natural moment to unveil the new software.
Apple said in a statement to Axios: "We're delighted to launch a new Siri voice for English speakers, providing people more options for picking a voice that speaks to them." For beta testers, the new gender-neutral voice is available as option 5 in the Siri voice menu in the iOS 15.4 beta, which was released to developers and public beta testers on Tuesday.
The final version of the gender-neutral Siri voice will roll out in a few weeks as part of a software update. A United Nations body previously released a report warning that default female-sounding virtual assistants could propagate "problematic gender stereotypes" and normalize "one-sided command-based speech encounters with women."