Debating access to personal data
From A-List celebrities seeking to protect their intimate photos stored in the cloud from public scrutiny, to service users and customers surrendering personal details on social networks, the issue of personal data protection is ubiquitous and problematic.
In its report, 'Big Data and Data Protection', the UK's Information Commissioner's Office (ICO) has recently issued new guidance for companies and public bodies seeking to harvest personal data from their customers and service users.
The use of massive collections of personal data for research, advertising and decision making has become commonplace. Social networking services, for instance, gather transactional data which is aggregated and sold on to advertising networks. Though such aggregated data is not itself considered personal data, it can still be de-anonymised by combining it with other publicly available sources.
David Haynes, a Visiting Lecturer and PhD student in the School of Mathematics, Computer Science and Engineering, welcomes the emphasis on risk management emerging from the ICO's new report:
"If there is one message that comes out of the ICO's latest consultation paper, Big Data and Data Protection, it is that risk management is a fundamental part of effective data protection. In its report, the ICO suggests a pragmatic approach where the risk of de-anonymisation is assessed so that a judgement can be made which balances the benefits and the risks. The immediate impulse of organisations will be to consider the organisational risks, such as loss of reputation, official sanctions such as fines, or the risk of being sued by aggrieved individuals."
Haynes, who is based in the Department of Library & Information Science and whose dissertation investigates the relationship between risk, regulation and access to personal data, also regards the ICO report's consideration of tools such as privacy seals for enforcing data protection as a step in the right direction.
"The report's emphasis on tools is commendable and could be a useful technique as part of risk assessments - particularly from the perspective of risks to the individual. Privacy Impact Assessments, for instance, have been promoted by the ICO for some time now. Privacy seals were also suggested - again a great idea in principle - but who will manage this? The TRUSTe scheme in the United States is one example. Or perhaps it could be some form of co-regulation, as in the advertising industry in the UK, where the government oversees the industry's self-regulatory regime. Bodies such as the Advertising Standards Authority (ASA) and the Internet Advertising Bureau (IAB) provide interesting models for consideration in the coming year."
Haynes's main criticism of the report is that it gives the impression that "if only there were more tools, data protection would be sorted out".
"Targeted guidance to organisations that collect and use 'big data' could refer to tools such as privacy by design or data minimisation as part of the array of responses available for information management. It strikes me that the principle-based approach is the best guarantee that innovations arising from analysis of big data are not smothered by over-regulation while keeping pace with developments in technology. Risk is an effective means of evaluating different approaches to regulation."
'Big Data and Data Protection' was published on 4th August 2014. The Information Commissioner welcomes comments on the report, which can be sent in by 12th September 2014.
Please visit this weblink for more information on the courses offered in the Department of Library & Information Science at City.