What’s the difference between Privacy and Data Protection?

In Ireland, it’s increasingly common to see “privacy” used interchangeably with, or as a substitute for, “data protection”. This may be due to a lack of awareness, the influence of U.S.-focused organisations, or a marketing preference for the catchier term. Whatever the reason, it is important to know the legal difference between the terms in order to avoid confusion about rights and obligations.

Privacy is a broad term encompassing a number of rights, including the right to be let alone and the right to respect for private and family life, home and communications. A useful description comes from the UK’s Calcutt Committee report of 1990, which defined privacy as “the right of the individual to be protected against intrusion into his personal life or affairs, or those of his family, by direct physical means or by publication of information.” In Ireland, privacy rights derive mainly from the Constitution of Ireland (as one of the unenumerated personal rights recognised under Article 40.3), Article 8 of the European Convention on Human Rights and Article 7 of the EU Charter of Fundamental Rights.

Data protection means the protection of individuals in relation to the collection and use of their personal data. In Ireland, data protection is governed by Article 8 of the EU Charter of Fundamental Rights, the General Data Protection Regulation (GDPR) and the Data Protection Act 2018. Organisations have legal obligations in relation to the processing of personal data, and individuals have rights, including the rights to information, access, rectification, objection and erasure. Individuals also have the right to lodge a complaint with the Data Protection Commission and to receive compensation for damage suffered as a result of an organisation not complying with its GDPR obligations.

The terms “privacy” and “data protection” are related in practice because the same factual situation can engage both concepts. For example, the risk to an individual resulting from how their personal data is processed could, depending on the circumstances, also involve a breach of privacy.

However, we recommend keeping “privacy” and “data protection” distinct to avoid confusion and misunderstandings about legal rights and obligations. It is important for individuals to know that they have data protection rights, and for organisations to know that they have data protection obligations, in situations which often have nothing to do with privacy.

Common ground for GDPR and blockchain?

Image: Blockchain network concept – distributed ledger technology (https://www.stockvault.net/photo/241737/block-chain-network-concept-distributed-ledger-technology)

On 12 July 2019 the European Parliament released its report on Blockchain and the General Data Protection Regulation. The report aims to clarify the tensions between the rights of data subjects and blockchain technology and to propose solutions. At the same time, it reassures proponents of the technology that the EU institutions recognise its potentially game-changing applications across multiple industries, as addressed in the European Parliament Resolution of 3 October 2018.

The GDPR is principles-based, and its recitals expressly state that it is intended to be technology neutral and therefore future-proof. Blockchain technology is built on blocks of digital information held in a ledger that is distributed across multiple nodes, often operated by different participants. Each block builds on the last and, to maintain the integrity of the chain, cannot be modified or altered once a transaction is completed.
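To see why records on a chain are so hard to change, here is a minimal sketch, in Python, of a hash-linked ledger. It illustrates the general hash-chaining idea only, not any particular blockchain protocol, and every name and value in it is invented for the example.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    """Append a block that commits to the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Each block must still match the hash recorded by its successor."""
    return all(
        chain[i + 1]["prev_hash"] == block_hash(chain[i])
        for i in range(len(chain) - 1)
    )

chain: list = []
add_block(chain, "Alice pays Bob")
add_block(chain, "Bob pays Carol")
print(is_valid(chain))   # True

# "Rectifying" or "erasing" data in an earlier block breaks every later link.
chain[0]["data"] = "[REDACTED]"
print(is_valid(chain))   # False
```

Because each block commits to the hash of the one before it, altering or deleting data in an earlier block invalidates everything that follows, which is the crux of the tension with the rights to rectification and erasure.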

The challenges that blockchain presents to the GDPR framework are immediately apparent. The GDPR is built on the presumption of an identifiable data controller, or joint controllers, accountable for how personal data is processed. Moreover, the technical specificities of the blockchain model are not easily aligned with data subjects’ rights to rectification or erasure of personal data, or the right to be forgotten. Because the technology creates a perpetual ledger, the principles of data minimisation and storage limitation are also difficult to satisfy.

The report also identifies ways in which blockchain can be used to advance GDPR objectives: without the need for a single (or joint) data controller, it offers transparency over who has accessed data, and data subject rights of access and portability are facilitated by the technology. Ultimately, where blockchain technology has been used in the processing of personal data, its compliance with the GDPR should be assessed on a case-by-case basis, taking into consideration factors such as the context (public versus private blockchains) and whether the encryption of the data meets the threshold for anonymisation.

The above-mentioned EP resolution makes it clear that there is an explicit intention to support the adoption of blockchain technology across the EU. On GDPR compliance, the report proposes regulatory guidance, codes of conduct and certification mechanisms. Alternatively, research funding could be made available for interdisciplinary research on blockchain protocols that could be ‘compliant by design’.

What is clear is that, at present, there is nothing concrete in the pipeline that will assuage the concerns of privacy advocates, and the question remains: where there is a will, can a way be found?

Commission decision threatens residential listings

Image: Former Woolworths store in Acocks Green, “Retail Unit To Let” sign (https://www.flickr.com/photos/ell-r-brown/4325812085)

Last week the Workplace Relations Commission (WRC) ordered property website daft.ie to remove adverts that were found to be in breach of the Equal Status Acts. Adverts posted by users containing terms like “suit young professionals”, “rent allowance not accepted” and “references required” were held to be discriminatory by the Commission.

The WRC found that daft.ie was vicariously liable for advertisements placed on its website by third parties where these were in breach of the Acts. It held that daft.ie was not shielded from these obligations by its status as an ‘Information Society Service Provider’ (ISSP) under the eCommerce Directive (2000/31/EC), which gives internet intermediaries a limited exemption from secondary liability as ‘conduits’ or transmitters of information.

The WRC adjudicator declared that there was no evidence before her to allow the exemption under the Directive, and she directed daft.ie to implement a methodology to “identify, monitor and block discriminatory advertising on its website”. Without a more detailed description of how the adjudicator arrived at her decision – and none is offered in the ruling – it is impossible to assess the weight that was given to the various factors outlined in the submissions.

The carve-out of this exemption is a cornerstone of the eCommerce Directive and aims to preserve the intermediary model and prevent unintended collateral censorship. The Directive does allow a service provider to be required to terminate or prevent an infringement at the direction of a court or administrative authority. However, the adjudicator’s direction that daft.ie put in place a system to automatically flag potentially discriminatory postings conflicts with the provisions of the Directive that prohibit member states from imposing on ISSPs a general obligation to monitor the information they transmit or store, or a general obligation to actively seek facts or circumstances indicating illegal activity.

Considering the near inevitability of occasional users of daft.ie’s services posting adverts using discriminatory language, the question remains whether daft.ie is the correct entity to automatically screen out all potentially discriminatory language. In light of rapidly evolving technological solutions to ‘policing’ the internet, there have been high-level discussions about whether the current Directive needs to be revised. Certainly, the WRC decision referred to daft.ie competitors that have adopted a process such as the one it proposes. Nevertheless, as the law currently stands, there is a question mark over whether the WRC correctly applied EU law and whether on appeal the High Court would uphold its decision.
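By way of illustration only, the kind of screening the ruling contemplates could be as simple as matching adverts against a list of flagged phrases. The snippet below is a sketch under that assumption; the phrases are taken from the examples in the decision, but the function names and the approach are invented here and are not anything the WRC prescribed.

```python
import re

# Phrases of the kind the WRC found objectionable in user adverts (illustrative).
FLAGGED_PHRASES = [
    r"suit young professionals",
    r"rent allowance not accepted",
    r"references required",
]

def flag_advert(text: str) -> list[str]:
    """Return any flagged phrases found in an advert."""
    return [p for p in FLAGGED_PHRASES if re.search(p, text, re.IGNORECASE)]

advert = "Two-bed apartment, would suit young professionals. References required."
matches = flag_advert(advert)
if matches:
    print(f"Advert held for review; matched: {matches}")
```

Even this naive example shows the difficulty: a phrase list catches obvious wording and nothing else, while anything broader drifts towards exactly the kind of general monitoring obligation the Directive prohibits.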

Risk-based approach to GDPR compliance

In practice, we are coming across some confusion about the GDPR’s risk-based approach to compliance. It is important not to confuse this compliance concept with the risk facing your business or organisation.

Risk is the possibility of something happening which may cause loss, harm or damage, and it is usually assessed in terms of likelihood × severity.
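As a purely illustrative sketch of that formula, the grid below scores likelihood and severity on simple one-to-three scales; the scales, thresholds and labels are assumptions made for the example and are not prescribed by the GDPR or any regulator.

```python
# Illustrative likelihood x severity scoring grid (not a GDPR requirement).
LIKELIHOOD = {"remote": 1, "possible": 2, "likely": 3}
SEVERITY = {"minimal": 1, "significant": 2, "severe": 3}

def risk_score(likelihood: str, severity: str) -> int:
    """Risk assessed as likelihood multiplied by severity."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

def risk_level(score: int) -> str:
    """Map a numeric score onto a simple low/medium/high banding."""
    return "high" if score >= 6 else "medium" if score >= 3 else "low"

score = risk_score("likely", "significant")   # 3 x 2 = 6
print(score, risk_level(score))               # 6 high
```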

In a GDPR compliance context, and where a risk-based approach is relevant, this means risk to the rights of individuals that may result from you processing their personal data. In other words, risk to individuals, not to your business or organisation.

GDPR compliance obligations that refer to risk include accountability measures, security measures, data breach notification, Data Protection Impact Assessment (DPIA), and data protection by design and by default.

While the term “risk” is not defined in the GDPR, Recital 75 provides useful guidance.

When undertaking risk assessments, risk for GDPR compliance purposes (risk to individuals) must be understood and distinguished from the important operational risks affecting your organisation (risk to you), including the possible regulatory sanctions, reputational harm and/or legal action that might result from GDPR non-compliance.

Privacy Notices and GDPR

With the prospect of increased regulatory activity ahead, it is important for businesses and organisations to ensure that their Privacy Notices are compliant.

Under GDPR, data controllers must provide individuals with mandatory information about the processing of their personal data and their rights, and the most effective way to do this is by a Privacy Notice (sometimes referred to as a Privacy Policy or a Privacy Statement, but the name is not important – what matters is the content and how it is provided).

Privacy Notices are an essential part of GDPR transparency obligations: it should be clear to individuals that their personal data is being processed, to what extent, and what their rights are.

More than one Privacy Notice version may be needed depending on the category of individuals involved (customers, employees, etc.).

The information has to be provided in a clear, easily accessible format at the time personal data is obtained from individuals, or within one month when obtained from another source.

The mandatory categories of information to be provided are set out in Articles 13-14, and include purposes of processing, legal basis for processing, legitimate interests for processing (if applicable), data sharing, international transfers, and data retention.

Working all this out, with documentation to meet the requirements of accountability, can be challenging for businesses and organisations.

It may be necessary to refresh data mapping or review justifications for legal basis.

Privacy Notices should also align with the Records of Processing Activities (Article 30).
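As a small sketch of what that alignment check might look like in practice, the snippet below compares the processing purposes recorded in an Article 30 register against those described in a Privacy Notice; the data and names are entirely invented for the example.

```python
# Toy cross-check between an Article 30 register and a Privacy Notice (illustrative).
ropa_purposes = {"order fulfilment", "payroll", "marketing emails"}
privacy_notice_purposes = {"order fulfilment", "payroll"}

missing = ropa_purposes - privacy_notice_purposes
if missing:
    print(f"Purposes in the RoPA but not in the Privacy Notice: {sorted(missing)}")
else:
    print("Privacy Notice covers every purpose in the RoPA.")
```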

Privacy Notices are a critical part of GDPR compliance, but they are not a once-off exercise, and must be kept under review to reflect your processing activities.

FP Logue secures social welfare payments for client who refused to apply for Public Services Card

FP Logue received confirmation recently that the Department of Employment Affairs and Social Protection has agreed to pay social welfare benefits to a client who refused to register for the Public Services Card (PSC).

Our client had presented a passport and proof of address with an application for benefits and received a formal decision from the Deciding Officer that the payments would be available for collection in the local post office in due course.

Some days later our client was informed by a member of staff that the approval had been a mistake and that the payments would be suspended until such time as an application for the PSC was processed. Our client refused to make the application and asked for written reasons to be provided. The position was subsequently confirmed in writing that payments were suspended until a PSC application was processed.

We wrote to the relevant official on our client’s behalf, pointing out that there is no requirement under social welfare law for an applicant to register for the PSC, that the payment had been unlawfully suspended, and that our client had been grossly misinformed by officials as to their rights.

We have now received confirmation that the payments have been released, vindicating our position that PSC registration is neither mandatory nor compulsory for the purposes of accessing social welfare benefits.

Introducing #InfoLaw2019 – 22 March

It’s nearly a year since the GDPR came into force and we are beginning to see what the post-GDPR world looks like. Just yesterday the French regulator handed Google a €50 million fine, and it seems there is more to come. We are seeing the rise of the non-profit complainant taking on cases for individuals. DPOs are beginning to find their feet, and the first cohort is starting to experience what it is really like to hold this position in the public sector and in large organisations.

There are still many questions on people’s minds:

  • Will there be an avalanche of litigation?
  • What is the DPO really supposed to do?
  • Will the GDPR change the way the State and the public sector handle personal data?
  • How are big organisations adapting?

At #InfoLaw2019 you can find out the answers to these questions and more from some of Ireland’s leading lawyers, DPOs and industry professionals. We’ll announce speakers and topics in the next week or two, so check our website for details, or order your discounted early-bird ticket below:

Data controllers at risk if they presume mixed personal data can’t be accessed by data subjects without third party consent


The biggest data protection myth out there is that third party personal data cannot be disclosed under a subject access request that covers “mixed” personal data, i.e. information that contains personal data of more than one individual.

If I had €10 every time a data controller made this claim I wouldn’t need to write this update because I’d have already retired a rich man and would be sitting in my vineyard in the South of France enjoying the good life.

The reality is that there is no presumption against disclosure of third party personal data in a mixed access request. Obviously third parties have privacy rights which cannot be adversely affected, but that does not mean they have to consent to disclosure. A data controller has to balance the competing interests and make a decision in line with the GDPR; that is what the law says.

While it may be a difficult decision to make in some circumstances, generally there should be no real issue: the GDPR facilitates the processing of other people’s personal data as long as it is lawful, and responding to a subject access request is no different.

Litigation risk

Data controllers are risking legal proceedings or complaints to the Data Protection Commission if they wrongly assume that all third party data must be purged when responding to subject access requests.

In many situations the rationale for the subject access request is precisely to access information about other people, for example family members or professionals, and in those circumstances data subjects may have a very strong legitimate interest in accessing mixed data.

B v General Medical Council

The English Court of Appeal considered this issue in B v General Medical Council [2016] EWCA Civ 1497, which concerned a request by a patient to access a report prepared by the General Medical Council after the patient had complained about his treatment by a doctor. The doctor objected to the release of the report, saying that it contained both his and the patient’s personal data and that his right to privacy therefore prevented the report from being released to his former patient.

The General Medical Council nevertheless decided that, on balance, the rights of the patient favoured releasing the report to him. The doctor successfully challenged that decision in the High Court, but the Court of Appeal overturned that ruling on the basis that there is no presumption in favour of refusing access to mixed data, that the data controller is best placed to make the evaluation, and that in this instance it had done so correctly and lawfully.

Data controllers need to take heed

This case shows that data controllers have a wide margin of discretion but nevertheless have to weigh up the competing interests when handling a subject access request for mixed personal data. There is no presumption that access to mixed data must be refused or that the third party data subject must consent to its release.

Any data controller that handles a subject access request based on these presumptions risks litigation or a complaint to the Data Protection Commission.

This article was also published on LinkedIn