This is the second of three articles produced collaboratively by Dr Sophie Taysom, an independent consultant at Keyah Consulting, and Dr Sue Chadwick, a Strategic Planning Advisor at Pinsent Masons LLP, on smart buildings and data.
In the previous article, Sophie set out some of the opportunities and challenges in the development and maintenance of smart buildings. The focus of this article, by Sue, is the more problematic aspects of smart buildings – in particular the collection, storage, and sharing of data – illustrated through the current concerns around automatic facial recognition (AFR).
Most of us are familiar with the General Data Protection Regulation (GDPR) and the Data Protection Act 2018, which came into force last year. Among the key requirements established: a data controller must have a lawful basis for processing personal data; in the case of special category data, it must obtain specific consent for the processing; and consent must be ‘unambiguous’ and indicated through ‘clear affirmative action’. The GDPR also sets out seven key principles underlying data management – lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability – described as a “fundamental building block for good data protection practice” by the Information Commissioner’s Office (ICO).
Emerging technologies challenge the scope, interpretation and application of these principles, particularly when the processing is automated and the data is not just personal but biometric. AFR is a perfect example of this kind of technology: it is new and evolving rapidly, yet its implications are not fully understood, nor is it appropriately regulated.
AFR has been in the news a lot recently. In August, the Swedish Data Protection Authority fined a school for using facial recognition to record student attendance – even though the trial ran for a limited period and covered just one class – and the ICO announced that it was investigating the use of facial recognition technology at King’s Cross. Earlier this month the High Court ruled in Bridges v South Wales Police, a judgment that began with the acknowledgment that “this is the first time that any court in the world has considered AFR.” It accepted that the use of AFR had led to two arrests and had allowed the police to find an individual who made a bomb threat, but also noted the “potential baleful uses to which AFR could be put by agents of the state and others”.
The Bridges case focused on the use of an AFR-equipped van by South Wales Police on two occasions – once in a busy shopping area and again at a large public exhibition. Digital images of faces were extracted from a live CCTV feed and processed to produce facial data that was then compared against a number of different watchlists (a simplified sketch of such a pipeline follows the list below). Although much of the ruling is not relevant in a wider context, because it considered the specific scope of the data protection powers and responsibilities of a law enforcement body under Part 3 of the Data Protection Act 2018, the principles established are highly relevant to the use of this technology generally:
- The use of AFR by a public body does engage – and infringe – the ‘right to privacy’ under Article 8 of the European Convention on Human Rights, even where images are captured in a public space and discarded almost immediately.
- AFR data “clearly does comprise personal data” for the purposes of Part 3 of the Data Protection Act 2018, and “It is beyond argument” that it is also “biometric data”, requiring specific consent to be given for its use.
- The court dismissed the challenge raised on equalities grounds, but noted that, in view of the evidence presented, the police might want to consider “whether the NeoFace Watch software may produce discriminatory impacts”.
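To make the data flow at issue in Bridges more concrete, the sketch below shows what an AFR-style matching pipeline can look like in outline. It uses the open-source Python `face_recognition` library rather than the NeoFace Watch software considered by the court, and the image filenames and watchlist are hypothetical placeholders – a minimal illustration under those assumptions, not a description of any deployed system.

```python
# Minimal, illustrative AFR-style pipeline using the open-source
# `face_recognition` library. All filenames here are hypothetical.
import face_recognition

# Build the watchlist: one 128-dimensional face encoding per known image.
# Each encoding is a biometric template derived from a person's face
# (assumes each watchlist image contains exactly one face).
watchlist_files = ["person_a.jpg", "person_b.jpg"]  # hypothetical images
watchlist_encodings = [
    face_recognition.face_encodings(face_recognition.load_image_file(f))[0]
    for f in watchlist_files
]

# Process one frame captured from a (notional) live CCTV feed.
frame = face_recognition.load_image_file("cctv_frame.jpg")  # hypothetical

# Detect every face in the frame, compute its encoding, and compare
# each encoding against the watchlist templates.
for encoding in face_recognition.face_encodings(frame):
    matches = face_recognition.compare_faces(watchlist_encodings, encoding)
    if any(matches):
        print("Possible watchlist match in this frame")
```

The legal significance sits in the middle of that loop: an encoding is computed for every face in the frame, watchlist match or not, which is why biometric personal data can be processed even where the images of non-matches are discarded almost immediately.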
The ICO investigation at King’s Cross may be the first time that the built environment and data governance have collided, but it will not be the last. AFR is just one example of an emerging technology that raises complex questions about the ethics of data and has already produced unintended consequences; as buildings grow smarter, these questions will proliferate. If information law and practice are struggling to keep up with the pace of change, the planning system is barely in the race, but we cannot ignore this issue for much longer: if the Hackitt recommendations are adopted, there will soon be a requirement for a “golden thread” of data to be maintained through the life of a building – starting with its design.
The question we address in the final article in this series is: how do we get the balance right? How do we make the most of the opportunities presented by ‘smart’ buildings while remaining mindful of regulatory requirements?