Computer Vision and GDPR: like oil and water or a perfect match?

Buddywise
May 27, 2020

How can GDPR guide us on how to use computer vision for good, while respecting people’s right to privacy? Could computer vision systems really be better for privacy than regular cameras?

Alice Giannini interviewed on computer vision & privacy

To discuss these important questions, we had a virtual chat with Alice Giannini: friend of Buddywise, Consultant at Stefanelli & Stefanelli Law Firm, and superstar PhD Student at the University of Florence.

Alice, what brought you to this niche field of GDPR and advanced software?

I am passionate about building a bridge between my academic background in law and practical applications in the field of new technologies. While researching criminal behaviour and the accountability of Artificial Intelligence systems, I have been working alongside start-ups deploying advanced software, assisting them with their GDPR compliance. Innovation is essential for progress, and I believe the law has a key role to play in making innovation possible in a way that respects people's rights and society's expectations.

What do you think has been the main impact of GDPR for businesses?

One of the main pillars of the not-so-new General Data Protection Regulation is the concept of accountability. Companies need to take a proactive approach to complying with the principles of privacy and data protection: it's no longer about formal requirements; it's a matter of being able to prove that privacy has been on your mind since the very beginning of a processing activity.

What is your take on computer vision and the right to privacy?

Computer vision is an exceptional tool that could prove vital, especially when it comes to ensuring the safety of workers, as in the case of Buddywise. The development of new technologies shouldn't be seen as a threat to the right to privacy, and businesses in this field should ask themselves how they can foster trust in their products. I strongly believe that privacy and safety shouldn't be mutually exclusive: what is imperative is to strike a meaningful balance between them, and to be able to prove it!

How does computer vision compare to regular camera systems, from a data protection perspective?

When it comes to GDPR-related obligations, computer vision is no exception. Photographs and videos are categorised as personal data, since they can be linked to identifiable individuals. However, one of the strengths of computer vision is the algorithm's ability to extract the relevant information from a very large dataset without storing the irrelevant data. Standard video surveillance requires storing thousands of hours of footage; computer vision can avoid keeping large amounts of irrelevant data, which is exactly what the principle of data minimisation asks for. With computer vision the focus shifts from the footage to the intelligence, and to the automated decision drawn from it.
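To make the data minimisation point concrete, here is a minimal sketch of how such a pipeline could keep only the derived intelligence (a small event record) while the raw frames are discarded. The detector, the event fields, and the threshold are illustrative assumptions, not a description of Buddywise's actual system.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Iterable, List

@dataclass
class SafetyEvent:
    """Minimal derived record stored instead of the raw footage."""
    camera_id: str
    timestamp: datetime
    event_type: str      # e.g. "missing_helmet"
    confidence: float

def detect_missing_ppe(frame) -> float:
    """Hypothetical placeholder for a PPE-detection model.

    A real system would run an object detector here; the constant return
    value only keeps this sketch runnable end to end.
    """
    return 0.0

def process_stream(frames: Iterable, camera_id: str,
                   threshold: float = 0.8) -> List[SafetyEvent]:
    """Analyse frames one by one and store only event metadata.

    The frame itself is never written to disk: it goes out of scope after
    each iteration, which is what supports the data minimisation argument.
    """
    events: List[SafetyEvent] = []
    for frame in frames:
        score = detect_missing_ppe(frame)
        if score >= threshold:
            events.append(SafetyEvent(
                camera_id=camera_id,
                timestamp=datetime.now(timezone.utc),
                event_type="missing_helmet",
                confidence=score,
            ))
    return events
```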

In articles on data protection you often read about data-protection-by-design, what does that mean?

Applying a data-protection-by-design approach to computer vision means, for example, avoiding models affected by inherent bias. One way to do so is to ensure the transparency and the explainability of the models used.

Explainability ensures that data subjects (i.e. the individuals who are recorded by the cameras) are informed correctly and effectively about how the algorithm works and how their data is processed by it. Subjects must be able to understand why a particular decision was made by the system and what would need to change for a different decision to be made. To take a Buddywise-relevant example: why does the system state that I am not wearing the right safety equipment, and what does it base that on? Most likely the answer lies in the safety policies that underpin the system's decision-making, but that must be explainable!
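As a purely illustrative sketch of what an explainable decision record could look like (the field names and the policy wording are assumptions, not a prescribed format), each automated decision could be stored together with the policy rule it relies on, the observations that triggered it, and what would lead to a different outcome:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Explanation:
    policy_rule: str         # the safety policy the decision is based on
    observations: List[str]  # what the model reported seeing
    counterfactual: str      # what would lead to a different decision

@dataclass
class Decision:
    outcome: str
    explanation: Explanation

# Example of the kind of record a data subject could be shown on request.
decision = Decision(
    outcome="non_compliant",
    explanation=Explanation(
        policy_rule="Site policy: hard hats are mandatory in zone A",
        observations=[
            "person detected in zone A",
            "no hard hat detected (confidence 0.91)",
        ],
        counterfactual="Wearing a hard hat in zone A would make the decision compliant",
    ),
)
```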

Transparency brings up another key aspect of computer vision and GDPR: meaningful consent. Data subjects need to know, in plain and straightforward language, the purpose of the data processing activity, and the images collected need to be relevant and proportionate to the defined objective. The Data Controller should adopt a layered approach: first, there should be adequate signs indicating where cameras are installed; second, the signs should point to a more in-depth privacy policy that is easy to obtain (a good example would be scanning a QR code on the sign). Legal mumbo jumbo and techie bla bla are definitely a no-go!

Any famous last words of advice for a business that is considering adopting a computer vision system like Buddywise?

On a final note, I'd like to stress a significant compliance tool: the data protection impact assessment (DPIA). It is crucial for companies to prove, in writing, that they have asked themselves the right questions about how their activities could affect individuals' rights before the processing starts.

Thank you for sharing with us, Alice!

If you want to hear more about how we work with data protection at Buddywise, or just share your critical questions, please contact us at info@buddywise.co and let’s talk!


Buddywise

Buddywise is a computer vision startup hell-bent on creating safer industrial workplaces and saving lives and limbs!