GDPR and Technology One Year On
Sometimes it feels difficult to know how to say anything new in relation to the General Data Protection Regulation (GDPR). For about two years now, those of us in the industry with the ‘GDPR expert’ tag—as if there is such a thing; it is a bit like Brexit for me: everyone’s an expert and no-one has a clue—have been telling various businesses, ‘It is a great thing, it keeps data owners honest’, ‘You should have always been doing this. What’s the fuss about?’ and, ‘But seriously, have you not been keeping your data secure before now?’
But even I have to admit that the introduction of this legislation, and the media coverage it has received—including in the mainstream—has shone a light on some maybe slightly dusty dark corners of both business and consumer behaviours. From a business perspective, it has been a case of ‘can we, can’t we?’, with the main decision-making process being, “Tell you what, let us play safe and not do anything that might be deemed a little risky.” The whole theory of accountability causes problems because it is judgement-based rather than an absolute rule, and so businesses that have put data into the operational camp, where rules are binary, find it hard to know how to apply a variable data strategy within the confines of the legislation.
From a consumer’s point of view, it is a whole new world of rights to understand and, in some cases, exploit. The legislation can range from fostering the belief that every email a company sends requires a written invitation before it is sent, through to the right to make subject access requests (SARs), or to be forgotten, whatever the context or circumstance. The fact that businesses still need to process personal information for contractual or operational purposes can get lost in translation. And then there are those who, in recent cases, have shown that individuals can fraudulently exploit businesses by making SARs about other individuals, relying on the business running scared of its obligations and so not insisting on ID validation.
So, some 2019 tips that I might not have felt the need to specifically call out a year ago:
Data security: Yes, it has always been vital, and it has always been a criminal target. The difference is that the value of data is now better understood by lower-level petty criminals. You are probably already all over your cyber-attack and malware security protocols, but now you will need to look as much within your organisation as without. How your own employees interact with the data they have at their fingertips can help you identify weaknesses in this very particular kind of valuable, and highly saleable, asset. You will have heard the oft-repeated line that data is the new oil; except it is not. You can only steal and sell oil once; data never dies once someone has access to it. There are lots of data security tools out there, but Forcepoint and Darktrace are a couple I have come across.
Consent: No, consent is not the be-all and end-all of permission to communicate and process, but yes, it does matter, and will do more and more as PECR gets into gear for its next iteration (European lawyer availability notwithstanding), and then you will have to worry about cookie consent too. Get some decent software to help you manage consents (look up Syrenis, very impressive), ideally one that also gives you the secondary benefit of creating a single customer view to help with SAR demands.
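By way of illustration only, here is a minimal sketch of the record-keeping that consent-management software automates: every decision is logged per purpose with a timestamp, the latest decision wins, and the per-subject history doubles as a crude single customer view for SAR responses. All names here (`ConsentStore`, `record`, `has_consent`, `export`) are invented for the example and do not describe any particular product.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    purpose: str          # e.g. "email-marketing", "cookies-analytics"
    granted: bool
    recorded_at: datetime

class ConsentStore:
    """Keeps a full consent history per data subject, so the latest
    decision for each purpose is always recoverable and auditable."""

    def __init__(self) -> None:
        self._records: dict[str, list[ConsentRecord]] = {}

    def record(self, subject_id: str, purpose: str, granted: bool) -> None:
        # Append, never overwrite: the history is the audit trail.
        self._records.setdefault(subject_id, []).append(
            ConsentRecord(purpose, granted, datetime.now(timezone.utc)))

    def has_consent(self, subject_id: str, purpose: str) -> bool:
        # Only the most recent decision for that purpose counts.
        for rec in reversed(self._records.get(subject_id, [])):
            if rec.purpose == purpose:
                return rec.granted
        return False  # no record means no consent

    def export(self, subject_id: str) -> list[ConsentRecord]:
        # Everything held on one subject: a simple single customer
        # view, ready to fold into a SAR response.
        return list(self._records.get(subject_id, []))
```

The design point is the append-only history: a consent that was granted and later withdrawn leaves both events on record, which is what lets you demonstrate accountability after the fact.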
Consumer requests: Be clear about what you are obliged to share, what you are not, and whether it differs by the market you operate in. And do not hand data out willy-nilly—it is not unreasonable to put identity protocols in place to ensure the person requesting everything you hold on an individual, from their home address to their inside leg measurement, really is who they say they are. Again, there are loads of players here, but a nice one I have seen in the context of e-commerce is Onfido’s.
Then, once you have got your hygiene factor kit-bag in place, it is about how you use it and that is all about common sense, roles and responsibilities, and putting yourself (truly putting yourself) in the shoes of the consumer.
All of our automated processing capability (I hesitate to use the terms artificial intelligence or machine learning, but they are there) is growing and will only continue to do so. In that context, the need to be clear on why we are doing certain things, where the customer is valued, and where the checks and balances are as we set these robotic hares running will become paramount. Humans need to decide that, not machines.
Only you can know how your consumers are likely to feel about your brand’s behaviour, so use that knowledge. Show consumers you have learned from their interactions with you and respond appropriately. Sometimes you will get it wrong, but as long as you can say with a clear conscience that you have done all you could to safeguard their data—how it is used and by whom—you are not going to get into trouble with the ICO.