With phone cameras and facial-recognition-equipped surveillance cameras seemingly everywhere, and the world entering a new phase of social change, many people are looking at simple steps they can take to retain and protect their privacy rights.
As enshrined in data protection laws such as GDPR, and with biometrics now being used widely, our faces are part of the personal data that we need to protect. Concerns such as those expressed by the UK Information Commissioner (head of the ICO), Elizabeth Denham, that police facial recognition systems have issues including accuracy, are prompting many people to look at ways to protect themselves where necessary.
Public trust in facial recognition systems also still has some way to go as the technology progresses from what is a relatively early stage. For example, a recent survey released by Monash University in Australia found that half of Australians believe their privacy is being invaded by the presence of facial recognition technology in public spaces. In the U.S., government researchers at the National Institute of Standards and Technology (NIST) said (in May 2020) that not enough is being done to engender trust in decisions made by facial recognition and biometrics systems. And in Europe in January, the European Commission was considering a ban on the use of facial recognition in public spaces for up to five years while new regulations for its use were put in place.
In a democracy such as the UK, protests are allowed to take place over any number of issues, and the recent protests over the killing of George Floyd and in support of Black Lives Matter have brought into focus how to protect personal data and identity while exercising democratic rights.
For example, those wishing to obscure faces in their own protest photos before sharing them often use software to paint over faces with a solid colour, or use a mosaic (pixelation) technique, because these cannot be reversed. A simple blur effect, by contrast, is best avoided, as it is now possible for authorities to de-blur such images using neural networks.
This process of blocking out faces in photos can be carried out using the built-in photo editor on a smartphone. For example:
– On iOS, open Photos, tap the photo, select Edit (top right), tap the three dots to access Markup, and use solid circles or squares to block out faces.
– On Android (using the native Markup tool), open the Photos app, select the photo, tap Edit (bottom, second left), select Markup (bottom, second right), and block out faces, e.g. using the Pen tool.
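The same solid block-out and mosaic effects can also be reproduced away from the phone. A minimal sketch using the Pillow imaging library (an assumed dependency; the image and face coordinates below are hypothetical stand-ins for a real photo):

```python
# Irreversibly obscure a face region: either paint over it with a
# solid colour, or replace it with a coarse mosaic. Both discard the
# original pixels, unlike a simple blur.
from PIL import Image, ImageDraw


def block_out(img: Image.Image, box: tuple) -> Image.Image:
    """Paint a solid rectangle over the region, destroying those pixels."""
    out = img.copy()
    ImageDraw.Draw(out).rectangle(box, fill="black")
    return out


def mosaic(img: Image.Image, box: tuple, blocks: int = 8) -> Image.Image:
    """Replace the region with a coarse mosaic: crop it, shrink it down to
    a handful of pixels, then scale it back up, discarding the detail."""
    out = img.copy()
    region = out.crop(box)
    small = region.resize((blocks, blocks), Image.NEAREST)
    out.paste(small.resize(region.size, Image.NEAREST), box)
    return out


# Example: obscure a (hypothetical) face region in a stand-in photo.
photo = Image.new("RGB", (200, 200), "white")
face_box = (50, 50, 150, 150)
safe_to_share = block_out(photo, face_box)
```

Because both functions overwrite the region's pixel data rather than smoothing it, there is nothing left for a de-blurring network to recover.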
Removing a photo’s metadata (data stored in phone photos, e.g. type of device and camera, date, time and location) can be achieved by taking screenshots of the photos and making sure that there are no other identifying features in the screenshot.
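The screenshot trick works because a screenshot is a brand-new image that carries none of the original’s EXIF metadata. The same result can be achieved programmatically; a minimal sketch with the Pillow imaging library (an assumed dependency), copying only the pixel data into a fresh image:

```python
# Strip metadata by re-creating the image from its pixels alone: the
# new image has no EXIF block, so fields such as device model,
# timestamp and GPS location are left behind.
from PIL import Image


def strip_metadata(original: Image.Image) -> Image.Image:
    """Return a fresh image containing only the pixel data."""
    clean = Image.new(original.mode, original.size)
    clean.putdata(list(original.getdata()))
    return clean


# Example usage (hypothetical filenames):
# strip_metadata(Image.open("protest.jpg")).save("protest-clean.jpg")
```

As with a screenshot, this removes hidden metadata but not anything visible in the frame, so the photo itself still needs checking for identifying details.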
Tech and news commentators have noted recently how mask-wearing during the COVID-19 pandemic has proven to be a challenge for facial recognition systems, although it has also been suggested that AI facial recognition systems have now had the chance to receive more ‘training’ in correctly identifying mask-wearing people.
Facial recognition (if used responsibly, as intended) can help to fight crime in town and city centres, thereby helping the mainly retail businesses that operate there, although there are still questions about its accuracy and its impact on our privacy and civil liberties.
Where sharing photos and worries about privacy are concerned, there are smartphone apps that allow faces to be blocked out. On Facebook, for example, further steps that can help retain your privacy and security include: not using a close-up, clear photo of your face as a public profile picture; not revealing too much about where photos were taken; and not geotagging or posting photos that reveal your address or show valuable items in your home or where you keep valuables. Photos taken in the workplace, particularly those posted on websites and social media, should also be vetted to ensure that there are no implications for physical security and that any staff featured are happy for the photo to be shared.