Video privacy from East to West: how are AI and facial recognition impacting the retail sphere across regions?

E-commerce has undoubtedly seen a massive boom in recent years, but as we transition out of the Covid-19 period, many customers have returned to brick-and-mortar stores. With this return, retailers may face renewed security concerns that call for different forms of video surveillance to help monitor their spaces, whether CCTV or body-worn cameras.

Moreover, video surveillance combined with AI and video analytics can offer retailers several benefits in marketing and selling to their consumers. 

For example, the London-based SPD group developed a system that can make product suggestions based on tracking the location and in-store actions of customers by analysing video footage. 

However, this also brings an increase in the amount of personal data collected, particularly video data, which often contains sensitive information.


Approaches and opinions on AI in the retail space differ across regions, especially between Asia and the West, but how so? 

Video analytics is one application of AI that has become increasingly common in the retail space. It can profile types of customers and their shopping habits, informing marketing strategies that boost sales. One particular form of video analytics is facial recognition technology (FRT). Retailers in the West use it to match customer profiles and personalise shopping experiences, as well as to enable cashless payment, whereby customers pay with their faces instead of cash or cards.

However, there is a lot of contention about its use due to data privacy concerns: it requires the mass collection and processing of sensitive biometric data.

In the UK, the privacy campaign group Big Brother Watch has brought legal complaints to the ICO regarding live facial recognition technology (LFRT) used in Co-op stores, arguing it is “highly invasive”.

In the US, there has been pushback against retailers’ use of LFRT, with public pressure on stores like Lowe’s and Macy’s to stop using the technology amid privacy concerns.

In Australia, a privacy investigation was also launched into the retailers Bunnings and Kmart over their use of FRT in stores, with concerns about how people’s personal information is handled and whether the practice complies with Australian data privacy law.

Of course, opinions differ depending on the use of FRT, but generally, there tends to be more scepticism and wariness of the technology in the West, particularly given bias in its algorithms, which can negatively impact certain demographics.

FRT requires processing sensitive biometric data, which often conflicts with data privacy rules such as the GDPR and UK GDPR, under which processing biometric data for identification purposes is not allowed without explicit consent.


In contrast, in the East, approaches to FRT are rather different.

China has often been ranked one of the most notorious countries for collecting biometric data, and 16 of the world’s most surveilled cities are located in China. Chinese FRT systems log almost every citizen across the country: in one example, more than 6.8 million records were captured in a single day from cameras across roads, parks, and tourist spots. In this vein, FRT is also widely used in China’s retail industry. Similarly, Japanese companies are increasingly implementing FRT across industries, and have rolled it out in stores to help create a “frictionless” retail experience and allow customers to pay with their faces.

From a data privacy perspective, this large-scale collection of sensitive data is questionable, and some pushback is beginning to be seen in the East too. A survey in Beijing found that 74% of respondents favoured traditional means of ID verification over FRT, and 80% were concerned about inadequate security measures in facial recognition systems.

China has recently passed and implemented the Personal Information Protection Law (PIPL), which imposes new rules on how data is collected: FRT and facial biometrics can only be used for a specific purpose and when necessary, and only alongside a risk assessment. Despite this, valid concerns remain about the large-scale collection of this data in retail.


While FRT has received far more pushback in the West, body-worn cameras are another form of video surveillance that has become increasingly common there, particularly in the retail sector.

They can help maintain security, manage inventory shrinkage, and deter violent behaviour. For example, in the UK, stores such as Boots, Co-op, and Sainsbury's have rolled out body-worn cameras to members of their staff to help counter and document customer abuse.

In comparison, in the East, body-worn cameras tend to be less common in the retail sphere and are more commonly used by law enforcement.

From a retail perspective, video surveillance can be crucial.

It can help monitor shop floor activity, deter theft and violence, and provide evidence should any incidents occur. Similarly, different forms of AI-based video analytics can greatly benefit retailers, for example, people counting and tracking movement around stores. While FRT remains contentious, AI-based video analytics can still be very beneficial; what matters is how the data is collected and protected. By anonymising it, retailers can still derive value from video and improve the customer experience.
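To make the anonymisation idea concrete, here is a minimal, illustrative sketch of region-based redaction: pixelating a face region so identifying detail is destroyed while the frame remains usable for analytics such as people counting. This is not any particular product's method; it assumes a bounding box has already been produced by a separate face detector, and the frame is simplified to a 2D grid of greyscale pixel values.

```python
def pixelate_region(frame, box, block=8):
    """Redact the region (x, y, w, h) of a 2D pixel grid by replacing
    it with coarse blocks of averaged values (a simple pixelation).

    `frame` is modified in place and also returned. Assumes `box` lies
    fully inside the frame; a real pipeline would clamp coordinates
    and take boxes from a face-detection step.
    """
    x, y, w, h = box
    for by in range(y, y + h, block):
        for bx in range(x, x + w, block):
            # Gather the pixels inside this block (clipped to the box)...
            ys = range(by, min(by + block, y + h))
            xs = range(bx, min(bx + block, x + w))
            vals = [frame[r][c] for r in ys for c in xs]
            avg = sum(vals) // len(vals)
            # ...and overwrite the whole block with the average,
            # discarding the fine detail that made the face identifiable.
            for r in ys:
                for c in xs:
                    frame[r][c] = avg
    return frame
```

Because only the boxed region is altered, aggregate signals (how many people passed, where they walked) survive the redaction, which is what lets anonymised footage still feed analytics.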

Data privacy laws across countries impose strict rules on how personal data can be processed. Companies looking to enter markets across these regions need to be well aware of the data protection rules in each country and how they can and cannot use data. Particularly when it comes to collecting and using video and sensitive biometric data, their approach will need to be tailored to wherever they are operating.

Provided the personal data (i.e. faces) is properly anonymised and protected, retailers can derive the full benefits of this technology.

We have found that many organisations have faced GDPR fines and reprimands for misusing data (including video data), so anonymisation and effective redaction of personal information is a must-have.


Want to use video analytics safely?
