Microsoft removes AI-based facial analysis tool that claims to identify emotions

Microsoft is unplugging several AI-based facial analysis tools, including one it says could identify a subject’s emotion from images and video. Experts have strongly criticized these “emotion recognition” tools, arguing that equating external emotional expressions with internal feelings is unscientific. They note that facial expressions once thought to be universal actually differ from one population to another.

The decision to restrict public access to these tools is part of a broader overhaul of the company’s policies on AI ethics. Microsoft’s updated responsible AI standards emphasize accountability for who uses its services and greater human oversight of where the tools are applied.

Concretely, Microsoft will limit access to certain features of its facial recognition service, Azure Face, while removing others entirely. Customers who want to use Azure Face for facial identification will need to apply for access, telling Microsoft how and where they would deploy the systems. Some use cases with less harmful potential will remain openly accessible.

In addition to removing public access to the emotion recognition tool, the Washington-based tech giant will also remove Azure Face’s ability to infer attributes such as age, gender, facial hair, hair, smile, and makeup.
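For context, these attributes were requested through the Azure Face detection API’s `returnFaceAttributes` parameter. The sketch below is illustrative only: the attribute names match the retired capabilities listed above, while the endpoint is a placeholder and no real request is sent.

```python
# Illustrative sketch of how a client would have requested the now-retired
# Azure Face attributes. The endpoint value is a placeholder; this only
# builds the query URL and does not call the service.
from urllib.parse import urlencode

# Attributes Microsoft is retiring from Azure Face, per the announcement.
RETIRED_ATTRIBUTES = ["emotion", "age", "gender", "facialHair", "hair", "smile", "makeup"]

def build_detect_url(endpoint: str) -> str:
    """Build the detect-API query URL that requests the retired attributes."""
    params = urlencode({
        "returnFaceAttributes": ",".join(RETIRED_ATTRIBUTES),
        "detectionModel": "detection_01",
    })
    return f"{endpoint}/face/v1.0/detect?{params}"

url = build_detect_url("https://example.cognitiveservices.azure.com")
print(url)
```

Under the new policy, requests for these attributes are rejected for customers without approved access.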

Announcing the news, Natasha Crampton, Chief Responsible AI Officer at Microsoft, wrote in a blog post: “Experts inside and outside the company have pointed to the lack of scientific consensus on the definition of ‘emotions’, the challenges of generalizing inferences across use cases, regions, and demographics, and heightened privacy concerns around this type of capability.”

Microsoft said it will stop offering the features to new customers starting June 21, 2022, while existing customers will have their access revoked on June 30, 2023.

However, Microsoft will continue to use the feature in at least one product: the Seeing AI app, which uses machine vision to describe the world for people with visual impairments.

Sarah Bird, Microsoft Group Senior Product Manager for Azure AI, said in a blog post that emotion recognition tools “can be useful when used for a set of controlled accessibility scenarios.”

Microsoft will also introduce similar restrictions to the Custom Neural Voice feature, which allows customers to create AI voices based on recordings of real people.
