Do Facial Recognition Cameras In Public Places Infringe On Our Privacy?
Most of us are aware that when out in public, we may be caught on camera. CCTV cameras are everywhere, on private property and public streets. Our own research here at Online Spy Shop has revealed that there are nearly 6 million CCTV cameras in the UK and that the average person is caught on camera around 70 times a day.
Many people aren’t fully comfortable with the idea of constant CCTV monitoring but have no choice but to accept it as a part of public life. They may also see it as a compromise worth making for the sake of improved public safety and security – after all, if you are a victim of crime, you may be grateful for CCTV footage when it comes to finding and prosecuting the perpetrator.
The rise of facial recognition cameras
However, the debate about CCTV cameras in public has now developed a new twist. Controversy arose after a UK police force began using facial recognition cameras to keep an eye on large crowds at popular public events. London’s Metropolitan Police used CCTV cameras teamed with facial recognition software to monitor security at the Notting Hill Carnival, which attracts thousands of revellers every summer but is also one of the biggest tests of public order for law enforcement. In 2016, 454 people were arrested and 45 police officers assaulted at the carnival.
The Metropolitan Police used the carnival as a pilot project, deploying cameras to scan faces and automated facial recognition software to check them against database entries. At the time, the police issued a statement justifying its use, saying:
“The technology involves the use of overt cameras which scan the faces of those passing by and flag up potential matches against a database of custody images. The database will be populated with images of individuals who are forbidden from attending carnival, as well as individuals wanted by police.”
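In broad terms, systems like the one described above reduce each face to a numeric "embedding" and compare it against stored watchlist embeddings, flagging anyone whose similarity score clears a threshold. The sketch below is purely illustrative: the watchlist names, the tiny hand-made vectors and the threshold are all invented, and a real system would use a neural network to produce embeddings with hundreds of dimensions.

```python
import math

# Hypothetical watchlist: in a real system a neural network would turn each
# custody image into a long numeric embedding; these tiny vectors are made up.
watchlist = {
    "suspect_a": [0.9, 0.1, 0.3],
    "suspect_b": [0.2, 0.8, 0.5],
}

def cosine_similarity(a, b):
    """Similarity between two embeddings: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def match_face(embedding, threshold=0.95):
    """Return the best watchlist match if it is similar enough, else None."""
    best_name, best_score = None, 0.0
    for name, stored in watchlist.items():
        score = cosine_similarity(embedding, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

passer_by = [0.88, 0.12, 0.31]    # invented embedding, close to suspect_a
print(match_face(passer_by))      # flagged as "suspect_a"

stranger = [0.1, 0.2, 0.9]        # invented embedding, far from both entries
print(match_face(stranger))       # no match: None
```

The threshold is the crucial policy lever: set it too low and innocent passers-by get flagged; set it too high and genuine matches are missed.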
The police, whose trial of the technology the year before had failed to recognise any faces, believe that facial recognition technology is rapidly improving and is a valuable tool for preventing crime and punishing perpetrators at large, crowded events.
Is this a breach of privacy?
In response to the announcement that the technology would be used at the 2017 Notting Hill Carnival, privacy groups quickly banded together to criticise the move as a breach of people’s privacy. Many already have concerns that the police are keeping records of the faces of innocent people as well as those found guilty. It was recently revealed that over 19 million photographs of people who had been arrested are held on the Police National Database, despite the fact that many are of individuals who were released without charge.
In a letter to Metropolitan Police Commissioner Cressida Dick, groups including Liberty, INQUEST, Big Brother Watch, Black Lives Matter UK and Privacy International said:
“The absence of a clear statutory footing and the lack of any parliamentary scrutiny raise a serious and urgent question as to the lawfulness of automatic facial recognition in public spaces. The use of automatic facial recognition in public spaces presents such a significant interference with the right to private life in particular that its use is likely to constitute a breach of the Human Rights Act.”
Speaking in favour of using the technology, with the right safeguards and conditions in place, a spokesperson for the National Police Chiefs’ Council (NPCC) told the Financial Times:
“Facial recognition technology can be important in the fight against crime, disrupting organised crime networks and identifying people who pose the highest threat.”
Does the technology discriminate?
In addition to claims that use of this technology breaches the privacy of individuals, research has also shown a more sinister side to facial recognition software. It appears to carry racial accuracy biases, often misidentifying black people. At an event such as Notting Hill, which celebrates the British African Caribbean community, race equality and privacy groups are concerned that there is a very real risk of the technology leading to discriminatory policing.
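One way researchers quantify this kind of bias is to compare, group by group, how often innocent people are wrongly flagged. The sketch below uses entirely invented records (the group labels and numbers are made up for illustration) to show the calculation:

```python
# Hypothetical data: each record is (group, was_on_watchlist, was_flagged).
# The groups and outcomes are invented purely to illustrate the calculation.
scans = [
    ("group_x", False, False), ("group_x", False, False),
    ("group_x", False, True),  ("group_x", True,  True),
    ("group_y", False, True),  ("group_y", False, True),
    ("group_y", False, False), ("group_y", True,  True),
]

def false_match_rate(records, group):
    """Share of people in a group who were NOT on the watchlist
    but were wrongly flagged by the system anyway."""
    innocent = [r for r in records if r[0] == group and not r[1]]
    wrongly_flagged = [r for r in innocent if r[2]]
    return len(wrongly_flagged) / len(innocent)

print(false_match_rate(scans, "group_x"))  # 1 of 3 innocent people flagged
print(false_match_rate(scans, "group_y"))  # 2 of 3 innocent people flagged
```

If one group’s false match rate is consistently higher than another’s, members of that group face more wrongful stops, which is exactly the concern the campaign groups raise.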
CCTV watchdog weighs in
In the latest development of this story reported by the Evening Standard, the government’s independent Surveillance Camera Commissioner Tony Porter has stated his belief that facial recognition CCTV infringes people’s privacy. He wrote a letter to the National Police Chiefs Council saying:
“If not responsibly considered and regulated [this] may adversely impact upon the public confidence which organisations seek to engender in the first place.”
“We don’t have transparency…and we don’t have absolute confidence that the processes and databases being used are proportionate and legitimate.”
Is your CCTV camera breaching anyone’s privacy?
You may not use facial recognition technology, but if you use CCTV or other surveillance cameras on your private property, home or business, you still need to double check that it isn’t infringing on anyone else’s privacy. Follow these tips to stay on the right side of the law:
- If your cameras face the street from your house, ensure they don’t point directly into a neighbour’s window
- When placing cameras in the workplace, never place them in areas where employees have a reasonable expectation of privacy – for example, changing rooms, showers or toilets.
- Keep the footage safe. You must not sell, share or otherwise give away any footage you capture on CCTV if it contains other people.
- You must only use the footage for its intended purpose, such as security.
- If you receive complaints from neighbours or employees, consider repositioning your cameras.
- Before installing cameras, consult with your neighbours or employees and explain the reasons why you feel they are necessary. You can explain how the cameras can benefit everyone, such as improving security, and that you will take all necessary steps to protect the privacy of others.
- Being upfront and transparent at the start can help to get other people on your side and to prevent complaints later on.