Face Equality International

Facial Recognition Technology: friend or foe of the Facial Difference (FD) community?

Background

We hear from our community that facial recognition tech (‘FRT’) does not always work for them. Our 2024 community survey showed that people with facial differences report a range of difficulties with the technology, and the serious harms which can result. These can include difficulty applying for jobs, trouble accessing money through a banking app, delays getting through border control systems and many others.

We have started talking to people working in this space to ensure that FRT, as one part of our digital world, develops safely and equitably, taking into account the rights of the FD community. Please watch this video of our presentation at the Zero Project conference in 2026 to hear more about our approach. It explains why we are asking tech companies to engage ethically with our community as a whole.

We do not claim to be experts in the technology but want to share our understanding of the possible risks and rewards to promote discussion and also to help people with facial and visible differences make decisions about how or when they take part in shaping this technology.

The field of AI and related technology continues to move at a fast pace. The information below is a starting point that we will continue to review as the technology and our understanding evolve.

What is Facial Recognition Technology (FRT)? 

FRT is a form of biometric security which offers a way to identify someone using their face. FRT is increasingly widespread.

It can be used to open your phone without a pin code; to help law enforcement authorities trace suspects or find missing persons; to check someone’s identity before issuing an official document; to make background filters on remote video call platforms work; by banks to safeguard your money; to automate hospital admissions; to monitor school attendance or workers’ hours; to enable special effects filters on social media; to diagnose medical conditions – and so on.

It can be used to detect faces (as when a passport photo machine ensures that your image is correctly lined up within the on-screen circle). Or to analyse faces (as when some recruitment video software analyses facial expressions to make judgments about you). Or to identify you (as when border control e-gates verify your identity against a stored image of you).
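To make the "identify you" step more concrete, here is a purely illustrative sketch of how verification often works under the hood: the system converts a face image into a list of numbers (an "embedding") and checks how similar the live capture is to a stored template. The embeddings, the `verify` function and the 0.8 threshold below are all hypothetical – real systems use proprietary models with hundreds of dimensions – but the comparison logic is representative.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(live_embedding, stored_embedding, threshold=0.8):
    """Accept the live capture if it is close enough to the stored template."""
    return cosine_similarity(live_embedding, stored_embedding) >= threshold

# Hypothetical 4-number embeddings (real systems use hundreds of dimensions).
stored = [0.12, 0.80, 0.55, 0.31]
live_match = [0.10, 0.78, 0.57, 0.33]   # same person, slightly different capture
live_other = [0.90, 0.05, 0.20, 0.70]   # a different face

print(verify(live_match, stored))   # similar vectors -> True
print(verify(live_other, stored))   # dissimilar vectors -> False
```

Note that the system never decides "this *is* the same face" with certainty; it only decides whether a similarity score clears a threshold – which is why both convenience (below) and mistakes (further below) are possible.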

What are the rewards of facial recognition technology for the facial difference community?

  1. The biggest benefit of FRT is convenience and efficiency. Take border control: if the technology works, it allows fast and smooth progress through the e-gates. If it doesn’t, you may be redirected to the back of a different queue to have your passport checked by a human being, causing delay and frustration. FRT may also reduce the need to send paper documents when applying for things like jobs or bank accounts.
  2. FRT may provide an extra layer of security against identity theft – when it works.

What are the risks of FRT for the facial difference community?

  1. A major concern with FRT is privacy. Think about who is using FRT. You may have no problem with a prospective employer using it to check your identity, but how would you feel about them using it to search social media sites for images featuring your face? When is it OK for shops to use FRT to monitor you as a customer – or perhaps to tailor your retail experience based on your past shopping patterns? Also think about how your data might be used and stored. Images collected using FRT can form part of a wider profile about you. For example, if FRT is used to monitor lawful public protests, your presence at a protest may reveal information about your beliefs and associations. Consider the extent to which the law in your country protects you in these sorts of situations.
  2. Another concern about FRT relates to the potential for mistakes. There are international examples of people being banned from shops, or even arrested, because of a false FRT report. The potential for mistakes seems to be greater for some marginalised groups (including some ethnic groups and people with certain conditions), and it is unclear how reliable the technology is for people with a wide range of facial differences.
  3. FRT can be used in many different ways. If FRT can distinguish a visibly different face from a typical face, can we be sure that, everywhere in the world and at all times in the future, that capability will be used only to promote inclusion – and never to deepen marginalisation, or to drive up the cost of having a visible difference?
  4. Finally, there are practical concerns about training FRT systems to recognise visibly different faces. Our community includes many wonderful and visibly different faces, and training a system to recognise all of them may be challenging. Ensuring that any photos used to train the technology are adequately protected would also be critical.
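The "potential for mistakes" in point 2 above comes down to a threshold trade-off that can be sketched in a few lines. The match scores below are invented for illustration, but the pattern is general: raising the match threshold cuts false accepts (impostors let through) while raising false rejects (genuine users turned away) – and people whose faces the system handles poorly, which may include some facial differences, tend to sit at the low-scoring end.

```python
# Hypothetical match scores for illustration only.
genuine_scores = [0.91, 0.88, 0.62, 0.95, 0.58]   # same-person comparisons
impostor_scores = [0.40, 0.72, 0.35, 0.55]        # different-person comparisons

def error_rates(threshold):
    """Fraction of genuine users rejected, and impostors accepted, at a threshold."""
    false_rejects = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    false_accepts = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return false_rejects, false_accepts

for t in (0.5, 0.7, 0.9):
    fr, fa = error_rates(t)
    print(f"threshold={t}: false-reject rate={fr:.2f}, false-accept rate={fa:.2f}")
```

On these toy numbers, a strict threshold of 0.9 lets no impostors through but turns away three of the five genuine users – the kind of failure our community reports at e-gates and banking apps.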

What are FEI doing about all of this?

Our call-to-action can be summed up as follows:

Where do we go from here?

The changes brought by our digital world can seem overwhelming and frightening. But we are not alone as individuals or as a community – there are many people, organisations and legislators across the globe working to reduce or prevent some of these risks. As the digital world changes and we as a community learn more, we will continue to listen, adapt and welcome your views. We find reassurance in community.

If you have an FD and want to tell us about an experience using FRT, please contact us. If you feel safe doing so in your context, you may also want to raise it directly with the organisation concerned. We are exploring a potential university research partnership which aims to make it easier for the FD community to report these experiences and their impact on you. Follow us on socials for updates when we have them.

If you are an organisation already working (or keen to work) to ensure the ethical inclusion of the FD community within your digital systems, we would love to hear from you.

Get in touch about Facial Recognition Technology

