Background
We hear from our community that facial recognition technology (‘FRT’) does not always work for them. Our 2024 community survey showed that people with facial differences report a range of difficulties with the technology, and the serious harms that can result. These include difficulty applying for jobs, trouble accessing money through a banking app, delays getting through border control systems and many others.
We have started talking to people working in this space to ensure that FRT, as one part of our digital world, develops safely and equitably, taking into account the rights of the facial difference (‘FD’) community. Please watch this video of our presentation at the Zero Project conference in 2026 to hear more about our approach. It explains why we are asking tech companies to engage ethically with our community as a whole.
We do not claim to be experts in the technology but want to share our understanding of the possible risks and rewards to promote discussion and also to help people with facial and visible differences make decisions about how or when they take part in shaping this technology.
The field of AI and related technology continues to move at a fast pace. The information below is a starting point that we will continue to review as the technology and our understanding evolve.
What is Facial Recognition Technology (FRT)?
FRT is a form of biometric security which offers a way to identify someone using their face. FRT is increasingly widespread.
It has many everyday uses, for example:
- opening your phone without a PIN code;
- helping law enforcement authorities trace suspects or find missing persons;
- checking someone’s identity before issuing an official document;
- making background filters work on remote video call platforms;
- safeguarding your money at the bank;
- automating hospital admissions;
- monitoring school attendance or workers’ hours;
- enabling special effects filters on social media;
- diagnosing medical conditions – and so on.
It can be used to detect faces (as when a passport photo machine ensures that your image is correctly lined up within the on-screen circle). Or to analyse faces (as when some recruitment video software analyses facial expressions to make judgments about you). Or to identify you (as when border control e-gates verify your identity against a stored image of you).
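For readers curious about what ‘identifying’ someone actually involves: most systems convert a face image into a list of numbers (an ‘embedding’) and then measure how similar a live capture is to a stored one. The sketch below illustrates only that comparison step, using cosine similarity, a common way of comparing embeddings. The numbers, threshold and function names are made up for illustration; no vendor’s actual system is shown.

```python
import math

def cosine_similarity(a, b):
    # Measures how closely two face embeddings point in the same
    # direction (1.0 means identical direction).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(stored_embedding, live_embedding, threshold=0.8):
    # The system accepts the match only if similarity clears a tuned
    # threshold. Where that threshold sits sets the trade-off between
    # falsely accepting strangers and falsely rejecting the real person.
    return cosine_similarity(stored_embedding, live_embedding) >= threshold

# Illustrative, made-up embeddings: the live capture is close to,
# but not identical with, the stored passport image.
stored = [0.12, 0.80, 0.35, 0.44]
live = [0.10, 0.78, 0.40, 0.45]
print(verify(stored, live))  # prints True
```

The threshold is the crux for our community: a system tuned on typical faces may produce lower similarity scores for atypical ones, pushing real people below the cut-off and into repeated rejections.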
What are the rewards of facial recognition technology for the facial difference community?
- The biggest positive about FRT is convenience and efficiency. Take border control – if the tech works, it can allow fast and smooth progress through the e-gates. If it doesn’t, you may be redirected to the back of a different queue to have your passport checked by a human being, causing delay and frustration. It may also reduce the need to send paper documents when applying for things like jobs or bank accounts.
- FRT may provide an extra layer of security against identity theft, when it works.
What are the risks of FRT for the facial difference community?
- A major concern with FRT is privacy. Think about who is using FRT. You may have no problem with a prospective employer using it to check your identity, but how would you feel about them using it to search social media sites for images featuring your face? When is it OK for shops to use FRT to monitor you as a customer – or perhaps to tailor your retail experience based on your past shopping patterns? Also think about how your data might be used and stored. Images collected using FRT can form part of a wider profile about you. For example, if FRT is used to monitor lawful public protests, your presence at a protest may reveal information about your beliefs and associations. Consider the extent to which the law in your country protects you in these sorts of situations.
- Another concern about FRT is the potential for mistakes. There are international examples of people being banned from shops, or even arrested, because of a false FRT match. The potential for mistakes seems to be greater for some marginalised groups (including some ethnic groups and people with certain conditions), and it is unclear how reliable the technology is for people with a wide range of facial differences.
- FRT can be used in many different ways. If FRT can distinguish a visibly different face from a typical one, can we be sure that, everywhere in the world and into the future, that ability will be used only to promote inclusion, rather than to deepen marginalisation or drive up the cost of having a visible difference?
- Finally, there are practical concerns about training FRT systems to recognise visibly different faces. Among our community there are many wonderful and visibly different faces. Training a system to recognise all of them might be challenging. And ensuring that any photos used to train the technology are adequately protected would also be critical.
What are FEI doing about all of this?
Our call-to-action can be summed up as follows:
- The FD community must have a leading voice in the regulation and accountability of FRT. We welcome continuing discussions with disability groups, responsible AI forums and policy/industry leaders to ensure that the rights of the community are respected and the risks recognised and minimised. If this is you, please get in touch.
- Inclusion within opportunities is non-negotiable. People with FDs must not be blocked from applying for jobs or accessing other services because FRT does not recognise them. Improving FRT’s ability to recognise a wider range of faces might seem the obvious solution, but this brings with it difficulties, uncertainties and risks, as summarised above. Until these can be adequately understood and addressed on a global scale, alternatives to FRT must be built in. This might mean having a ‘human in the loop’ who can swiftly override the FRT. Alternative routes must be included at the design stage to ensure that every opportunity is truly accessible to the FD community.
- FRT systems must be designed to avoid the indignity of tech failure. None of us should have to be the person standing at the front of the queue having their photo taken for the 34th time while everyone behind gets impatient. It is not enough that an alternative method is available only when pushed for; it must be easily accessible, even when no human being is physically present (as when using online systems).
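The design principle behind these points can be sketched as a simple decision flow: FRT is one route to access, never the gatekeeper. Everything below is hypothetical – the function names, the confidence score and the fallback channels illustrate the principle rather than any real system.

```python
from dataclasses import dataclass

@dataclass
class FRTResult:
    matched: bool
    confidence: float  # 0.0 to 1.0: how sure the system is of the match

def check_identity(frt_result, human_available):
    # Hypothetical decision flow built around the call-to-action above.
    if frt_result.matched and frt_result.confidence >= 0.95:
        return "frt_pass"
    if human_available:
        # A 'human in the loop' can swiftly override an FRT failure.
        return "human_review"
    # Online or unstaffed systems must still offer a built-in alternative,
    # without the user having to push for it.
    return "alternative_route"  # e.g. document upload, one-time code

print(check_identity(FRTResult(matched=False, confidence=0.2),
                     human_available=False))
# prints alternative_route
```

The key design choice is that the fallback branches exist from the start: a failed or low-confidence FRT check routes the person onward automatically, rather than leaving them stuck at a gate.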
Where do we go from here?
The changes brought by our digital world can seem overwhelming and frightening. But we are not alone, as individuals or as a community – there are many people, organisations and legislators across the globe working to reduce or prevent these risks. We find reassurance in community. As the digital world changes and we as a community learn more, we will continue to listen, adapt and welcome your views.
If you have an FD and want to tell us about an experience using FRT, please contact us. If you feel safe doing so in your context, you may also want to raise it directly with the organisation concerned. We are exploring a potential university research partnership that aims to make it easier for the FD community to report these experiences and their impact. Follow us on socials for updates when we have them.
If you are an organisation already working (or keen to work) to ensure the ethical inclusion of the FD community within your digital systems, we would love to hear from you.
Get in touch about Facial Recognition Technology
Tags: AI, artificial intelligence, Blog, Face Equality, facial difference, Facial Recognition, Visibly Invisible