LP Magazine EU

INDUSTRY FOCUS

Retail Strides vs Surveillance Creep

by John Wilson, executive editor


When, in 2003, novelist William Gibson was famously quoted in The Economist as saying, “The future is already here—it’s just not evenly distributed,” he could easily have had the uneven development of CCTV in his cross hairs.

Like the prophetic writer H. G. Wells before him, Gibson—who coined the term “cyberspace” as early as 1982—has a forensic eye for trends, but as his statement implies, there is a mismatch between the application of technologies such as CCTV and feature recognition (FR) and what many see as the tortoise-like regulations meant to govern their use.

Interactive Round Table

The infinitely long reach of technology versus the perceived legal lag of regulation formed the basis of an interactive round table industry summit entitled “From CCTV to Video Analytics: Developing a Framework for Facial and Feature Recognition Technology in Retailing”, sponsored by Canadian digital IP innovation specialists Genetec at Goodenough College in London.

Facilitated by Colin Peacock, the group strategy advisor for ECR Community, the session asked leading UK retail delegates to spell out their own aspirations for, and concerns about, a feature recognition technology roll-out across retail.

The delegates were asked to list their questions at the beginning of the day, with a view to the round table discussions providing answers during the close-of-play feedback. Questions included, “To what extent do existing regulatory codes cover emerging technological changes?” and “How can retailers look beyond profitability to ensure that they remain legal in their quest for the ultimate customer experience?”

The purpose of the day was to understand how retailers using the technologies can do so without breaching these codes or potentially generating highly negative media coverage, as happened with RFID in the early 2000s. Apart from the immaturity of the emerging technology and cost issues that made a clear ROI harder to demonstrate, the roll-out of RFID was smothered by a privacy backlash, particularly in the US, and the technology has only now re-emerged as a potent solution.

Furthermore, how can regulatory bodies work more closely with retailers to develop robust new codes of practice that can support the ever-changing technological landscape?

While we accept the use of CCTV as common practice and part of the everyday street furniture of our society, moderation of its use is still largely a matter for voluntary codes of practice. Although oversight has become more formalised with the establishment of the Surveillance Camera Commissioner in the UK, retailers are increasingly looking to go beyond traditional CCTV for LP and security use and into the realm of data analytics in order to truly understand their customers’ journeys.

Questions of greater interest to the retailer than simply, “Are they going to steal from me?” include: How many people came into the store? What were they looking at, and for how long? What do customers look like, and what kind of mood are they in (an indicator of their propensity to spend more money)?

Colin Peacock put it succinctly: “Bricks-and-mortar retailers are trying to understand their shoppers in order to catch up with the almost perfect visibility that online retailers have to hand.”

According to Professor Adrian Beck of the Department of Criminology at the University of Leicester, a leading academic in the field of loss prevention and the origins of CCTV, and one of the keynote speakers on the day, “The key differences to consider here are consent and perceived value. When you shop online, you consent to giving your information—how else are you going to receive and pay for your goods electronically? It has to be thought about transactionally—what’s in it for the parties concerned to share or give data? It could be argued that when a customer’s data is collected automatically and without perceived benefit to themselves, in the high street, then privacy becomes an issue.”

Pushing the Boundaries

Genetec’s (former) retail strategist, Carl Boutet, set a scene of great technological strides, driven by companies pushing the boundaries, both geographical and economic, against the backdrop of the General Data Protection Regulation (GDPR), due to come into force in May 2018.

Boutet, whose focus was bridging the physical and digital divide, described his work as looking at the “analytics-fuelled intersection of algorithms and outcomes.”

“China is the harbinger of the future,” Boutet told the audience of more than sixty retail delegates. “Half of its population is under surveillance, and technology such as feature recognition is being used for a myriad of purposes, from gaining access to theme parks to the dispensing of toilet paper.”

Commenting on the younger generation’s ease with the technology and their comfort with sharing their private information in return for a better shopping experience, Boutet added: “However, it would appear that the benefits seem to outweigh the privacy issues, so how can we use ‘scary’ technology for good?”

Boutet said that algorithm-based technology has been embraced by many householders adopting electronic personal assistants such as Amazon’s Alexa-powered Echo and Google’s Home product, all to make their domestic lives easier.

In China, Suning, the country’s third-largest retailer, which began life as an appliance company and now owns a 68 per cent stake in AC Milan Football Club, uses feature recognition to link customers to their bank accounts as they walk into its stores, allowing an almost seamless customer journey.

This led Boutet to pose the question, “Is convenience the driver for the consumer rather than the need to protect liberty and our own personal privacy?”

Like RFID in its infancy, feature recognition is surrounded by technological hype about its true capability. This is in part because of the problems of getting the technology to work in complex and challenging environments where the lighting is poor or fields of view are restricted. A video shown to delegates highlighted an FR system in a crowded canteen becoming overwhelmed by the volume of footfall, an issue that also affected the Amazon Go concept store, although the tech giant ironed out these problems ahead of the first store’s opening in Seattle in January.

Boutet described this as “no technological walk in the park” and went on to outline a forthcoming trial at Dubai Airport, where feature recognition will ostensibly be used as a security measure as passengers pass rapidly through a media-rich advertising tunnel that encourages them to look all around so that the system can capture every aspect of their face.

“This ‘face trap’ concept could help solve the problems of getting a good field of vision and allows passengers through security quickly for commercial reasons. The tech knows that your levels of stress influence your appetite for transactions at the airport, which has the potential to make a huge percentage of its income from terminal sales.

“Therefore, it is likely that people like LP professionals will become purveyors of market intelligence in an age where machine learning is making great strides,” he said.

But he cautioned, “There is also a general concern about the security of security. The availability of all this technology and data increases the need to properly protect against vulnerabilities.”

Privacy Concerns

Professor Beck provocatively asked delegates, “What do we mean by privacy in a social media age when we share what we had for breakfast?

“During the 1990s when companies were looking to roll out RFID, a libertarian group in the US called CASPIAN created a sense of fear around privacy issues to the point that retailers steered clear of the technology. Now this is regarded as a non-event.

“The public in the UK have shown far less concern about the widespread use of a range of surveillance technologies in public spaces, something that has not been seen in other European countries. The boundaries of acceptability have changed over time. In the 1990s, there was concern about CCTV. In the early noughties, there was concern over RFID, but for both of these technologies, concerns about privacy have largely evaporated as public attitudes have changed.

“Arguably, FR sits on the new frontier of privacy acceptability, and so users need to be aware that its use could cause concern in the short and medium term, but over time, this may dissipate as opinion levels and tolerances change.”

But what about the issue of compliance? Stephen Grieve, the lead assessor with the Security Systems and Alarms Inspection Board (SSAIB), who looks at the management and operation of surveillance systems, asked, “Never mind about the future. Are you compliant with the present?”

While retailers prepare themselves for the arrival of GDPR and a future of facial recognition, there is a danger that contemporary practices and applications of CCTV fall outside the current law and guidelines, particularly where signage is not displayed prominently. There is an argument, however, that the public do not worry about this, as there has been minimal outrage, other than that created by regulators who have been accused of trying to “close the stable door long after the technological horse has bolted.”

Data can only be captured for specific reasons, and there is redress enshrined in law under the 1998 Data Protection Act (DPA). Under the Act, individuals have the right to a copy of their personal data, the right to object to potentially damaging processing of personal data, the right to object to marketing using personal data, the right to object to automated processing of data, the right to have inaccurate personal data rectified or erased, and the right to claim compensation for any breaches of these rights.

Grieve therefore argued, “Technology is running ahead of the regulations, which only cover CCTV and not the recent additions of FR, automatic number plate recognition, and drones, all of which are being experimented with by the retail industry. Just because you can do it, it does not mean you should, so systems should include privacy by design to tell people what you are doing because a more informed public is a trusting public.”

Karen Round, the senior policy officer at the Information Commissioner’s Office (ICO), whose job it is to engage stakeholders in order to uphold information rights in the public interest, went further, setting out the principles of data protection: “It must be gathered fairly, transparently, and lawfully. It must have relevance, it must not be excessive, it must be secure, and it must be given with consent. It must also indicate who has access to it and how they are disposing of the data.”

She said that the recent high-profile data breaches were indications that many companies were not compliant with the existing DPA, let alone the forthcoming GDPR, which has greater enforcement powers and larger fines.

The ICO will continue to monitor technological advances and their implications for the shopper and will advise retailers and other organisations on the legal boundary between lawful data collection, where adequate signage and explanations are given for accessing personal data, and “creep”, where shoppers receive useful marketing information to their phones but consent cannot be adequately demonstrated.

Another speaker, Andrew Charlesworth, reader in information technology law at the University of Bristol, said that it was a legitimate question to ask, “Why am I being watched?”

He continued, “There is a danger that technology is being rolled out without public involvement or consultation, which could result in a backlash, which could ultimately be a waste of money.

“There is a public perception that it’s all about crime, but this is no longer the case. There is also the fear of bad practice and the rapid roll-out of IP camera technology as this causes more issues. This is because digital technology by its nature is borderless and stored in the cloud, which begs the question, ‘Can it be hacked or be used for denial of service (DoS)?’ If you disregard people’s views and treat them as ‘pathetic dots’ you will get pushback.”

He said casinos highlighted the benefits of consent and of taking the public with you when they openly used facial recognition to identify problem gamblers. He added that the technology could also be used positively with age-restricted sales, part of a process Charlesworth refers to as “innovation with care” that involves and engages the customer.

Worries over GDPR Compliance

The outcome of the event was not a message to retailers to “shut up shop” on facial recognition but highlighted the desire for a roadmap of how to align the technology with the new GDPR, which comes into force in May, if this is possible without stifling innovation. Indeed, compliance is the end game, but there is still a debate to be had about how regulation and best practice can more effectively keep pace with emerging fields of technology.

Until recently, CCTV and the up-and-coming facial recognition technology were tools for crime prevention, but data analytics and the information that can be mined from customers have changed the landscape. LP professionals are now gatekeepers of intelligence that can enhance the bricks-and-mortar customer experience in an era of mushrooming online sales.

The existing in-store signage advising customers that CCTV is in operation fails to drill deeper and explain exactly what it is being used for. It is assumed to be for crime prevention, but would it be acceptable, as new feature recognition technology makes possible, for retailers to admit that they are harvesting marketing data to better sell merchandise to you? Would that meet the existing DPA rule around consent, let alone the more rigorous GDPR with its heftier penalties for breaches?

The DPA and the GDPR act as a form of conscience for big business chasing the big bucks by tapping into more and more of the data that we as individuals seem happy to disclose through our social media profiles, for example. The more of our information companies hold, the more exposed they are to fines and brand damage in cases of international data breaches.

More worryingly, a survey conducted in December 2017 found that while 54 per cent of UK businesses expect a data breach in the next twelve months, only 48 per cent of respondents believe that their companies are financially prepared to cover the fines, which can run to 4 per cent of a corporation’s annual worldwide turnover.
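To put that ceiling in context, the GDPR’s upper tier of administrative fines is capped at the greater of 20 million euros or 4 per cent of total worldwide annual turnover. The short Python sketch below is purely illustrative; the turnover figure in the example is hypothetical.

    # Illustrative only: the GDPR upper-tier fine ceiling (Article 83(5)) is
    # the greater of EUR 20 million or 4% of annual worldwide turnover.
    def max_gdpr_fine(annual_worldwide_turnover_eur: float) -> float:
        return max(20_000_000.0, 0.04 * annual_worldwide_turnover_eur)

    # Hypothetical retailer with EUR 2 billion annual worldwide turnover
    print(f"Fine ceiling: EUR {max_gdpr_fine(2_000_000_000):,.0f}")  # EUR 80,000,000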

A study by cyber-security strategy company Proofpoint, carried out among 1,500 IT decision makers across the UK, France, and Germany, highlighted a “disconnect” between perception and reality when it comes to GDPR preparedness. While 77 per cent believed they would be fully compliant by 24 May 2018, only 5 per cent had all of the necessary data governance strategies in place to meet their new obligations.

The report said that with data breaches becoming “the new normal,” businesses cannot afford to be found wanting when it comes to their new obligations. Apart from the financially significant fines, broken customer trust, and brand damage, such breaches bring potentially crippling disruption to the business.

More than a third of the UK businesses surveyed (36 per cent) had been on the receiving end of cyber-attacks, and 23 per cent of those had suffered multiple breaches. In France, 78 per cent of IT experts believed they would be a victim of a breach, while in Germany only 46 per cent believed they would be vulnerable to an EU personal data breach. Worst of all, only 50 per cent of companies know, and have documented, what personal EU data their organisations currently hold. This, according to the report authors, demonstrates that while some companies recognise the importance of GDPR compliance, they are destined to fail and face the new draconian fines regime because, under their current protocols, they would not be able to identify where EU personal data sits within their businesses.

The jury is out as to whether the questions raised at the beginning of the workshop were adequately answered; the regulators would conclude that, for many retailers, compliance with the current rules is still an aspiration. So in terms of facial recognition, the maxim must be “proceed with caution,” and, as Andrew Charlesworth suggested, the technology must be introduced with care and consent so that developments are willingly accepted rather than surveillance creep being met with negative reaction and having to be defended.

There is also the argument over whether regulators really understand what the public wants. Would they win a court case against the use of feature recognition if the retailer were able to demonstrate that it had reduced incidents of violence as a result of its installation? In the 1970s, a Woolworths store manager was murdered, and the perpetrator escaped. The store was widely accused of a dereliction of duty because it did not have CCTV. The pendulum of public opinion swings both ways on this issue, but the clock continues to tick towards the future, as technology will continue to develop well beyond the introduction of GDPR in May of this year.
