A clip from the TG4 Uchtú documentary where I discuss subject access rights with broadcaster Evanne Ní Chuilinn
Government fined more than €5 million by European Court for decade-long environmental law breach
Ireland was sanctioned by the Court of Justice of the European Union (CJEU) for failing to carry out an environmental impact assessment of a wind farm development at Derrybrien, Co Galway. Ireland was ordered by the court to conduct an assessment in 2008 but failed to comply with the court's directions in the decade that followed. The European Commission then took Ireland back to court, resulting in substantial fines.
European law states that an environmental impact assessment must be carried out before permission is granted for any project likely to have significant effects on the environment. Despite this clear requirement, no assessment was carried out before the construction of the Derrybrien wind farm. Construction of the wind farm in 2003 caused a massive landslide which killed thousands of fish and severely damaged the surrounding environment. Following this, Ireland was taken to court in 2008 and lost.
Ireland was then given two months to carry out an environmental impact assessment of the land. The State came up with a draft plan for a non-statutory assessment, but even this came to nothing. Ireland was granted extra time by the EU, with December 2016 as the final deadline, but still no action was taken.
This year the European Commission took Ireland back to the CJEU, on the grounds that Ireland had made no significant effort to carry out an environmental impact assessment of the project, nor any concrete plans to do so. The court decided that the delay in complying could not be justified and that there was no excuse for the inaction.
Ireland argued that it had no power to direct the company that owns the land (which is itself publicly owned) to carry out the assessment, on the basis that a judgment cannot affect third parties who have not been heard in the proceedings. It also argued that the measures Ireland was required to take were never specifically identified, meaning that its steps towards a non-statutory assessment technically complied with the 2008 judgment. The court rejected these arguments and decided in favour of the European Commission.
A large financial penalty was imposed on Ireland to prevent the recurrence of similar infringements of EU law. The court found that the best way to do this was a lump sum, followed by a significant daily amount for as long as the breach continued, to encourage Ireland to carry out the long-awaited environmental impact assessment. The final amount decided on was a lump sum of €5,000,000 followed by a periodic penalty payment of €15,000 per day from the date of delivery of the judgment until the date of compliance with the 2008 judgment.
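To put those figures in context, here is a minimal sketch of how the penalty accumulates; the number of days used is purely hypothetical.

```python
# Illustrative only: cumulative penalty under the CJEU ruling.
LUMP_SUM = 5_000_000     # one-off fine in euro
DAILY_PENALTY = 15_000   # periodic penalty per day of continued breach

def total_penalty(days_until_compliance: int) -> int:
    """Total owed: the lump sum plus the daily penalty for each day of delay."""
    return LUMP_SUM + DAILY_PENALTY * days_until_compliance

# A further year of non-compliance, for example, would roughly double the bill:
print(total_penalty(365))  # 10,475,000 euro
```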
It is clear that all of this expense could have been avoided had Ireland ensured that the wind farm operator, which it owns, met its responsibilities and conducted an environmental impact assessment.
This post was authored by Daire Murray, a TY student from Loreto Kilkenny, who spent the week working with us.
Transparency for political advertising – will it all end in tears?
Against the backdrop of the International Grand Committee on Disinformation and 'Fake News', which convened this week in Dublin, the government has approved a proposal to regulate the transparency of online political advertising. The legislation aims to increase transparency around paid advertising and to prevent the electoral process from being captured by a narrow range of interests aligned with harmful content and electoral interference. The proposed regulation is essentially a stopgap until a statutory Electoral Commission is established to oversee a wider reform of the electoral process.
It's old hat by now that the electoral process has proven especially vulnerable to certain interest groups' desire to wedge their way into social platforms and manipulate the spread of potentially harmful information. Platforms themselves are divided on how to manage political content; Facebook has decided to take an entirely hands-off approach, while Twitter has announced a ban on all political advertising.
Both positions have their detractors – the unequivocal refusal to regulate, which tacitly condones the spread of contentious content, versus the decision to become the arbiter of political content in your feed. Following the Twitter ban, the Taoiseach voiced his reservations about disabling a significant channel for political representatives to reach voters. He also expressed concern that such a ban could act as a contagion, spreading a ban on political advertising across all media, including billboards and newspapers.
While such a view has more than a whiff of scaremongering, it reflects the inherent tension in regulating political content online. Apart from a consensus that 'something must be done', there is very little agreement on where to draw the line, or on how easy it will be to police once drawn. Watch this space.
Enforcing EU environmental law
Fred Logue’s slides from this recent event which took place in UCC on Friday 18 October 2019
Garda Requests Under Section 41(b) of the Data Protection Act 2018
The General Data Protection Regulation (GDPR) restricts how personal data may be processed by a data controller. In particular, personal data may not be used for a purpose incompatible with the purpose for which the data was initially collected.
An exception to this is Section 41(b) of the Data Protection Act 2018. This allows a data controller operating in Ireland to disclose personal data to a third party to the extent that this is "necessary and proportionate for the purposes of preventing, detecting, investigating or prosecuting criminal offences".
Typically, this may arise following a request from An Garda Síochána or another law enforcement body for disclosure of information containing personal data. Such requests are not uncommon in sectors such as financial services and telecommunications, although it is up to each company to handle such requests in a legally compliant manner.
However, you might be alarmed to receive a request from the Gardaí seeking disclosure of information under Section 41(b) of the Data Protection Act 2018. This might be for information relating to named individuals or for a copy of CCTV footage. The request might be marked urgent, refer to criminal allegations, or be somewhat broad or exploratory in nature, and you might feel under pressure to comply.
The most important thing to know is that, unless there is a legal obligation or a mandatory reporting requirement, you don’t have to comply with a request for disclosure of personal data under Section 41(b) of the Data Protection Act 2018. However, the catch is that if you choose to comply with the request in full or in part under Section 41(b), you bear the risk as data controller. This means being satisfied that disclosing the personal data is necessary and proportionate for the purpose of preventing, detecting, investigating or prosecuting criminal offences. This places a burden on you as data controller to justify the processing and keep appropriate records to demonstrate GDPR compliance. You also have other obligations, including transparency to data subjects, data minimisation, facilitating data subject rights, and ensuring appropriate data security.
If information concerning individuals or video footage is important for a criminal investigation, the Gardaí can and often will get a District Court order or even a search warrant. And if this is served on you, there will be a legal obligation to provide the specific information, and you will have protection as a result. Depending on the circumstances, this may be preferable to complying voluntarily with a request for disclosure under Section 41(b), and taking on the risk and potential liability of getting it wrong.
And if you choose not to comply with a request for disclosure under Section 41(b) of the Data Protection Act 2018, which you are entitled to do in the absence of any other legal obligation or mandatory reporting requirement, bear in mind that the communication received may well contain sensitive or confidential information that should not be retained unless there is a specific reason to do so.
Who decides the truth?
Earlier this month the Minister for Health, Simon Harris, launched a Vaccine Alliance – a network of healthcare professionals, policy makers, patient advocates, students and representatives from groups most affected by vaccine hesitancy – to boost the uptake rate of childhood vaccines. Recently four European countries, including the UK, lost their measles-free status, and there are fears Ireland could follow suit. The decline in vaccine uptake has been linked to the spread of misinformation – or "fake news" – on social media platforms. Minister Harris threw down the gauntlet to social media companies to "decide which side they want to be on" and take decisive action to help reverse the trend.
The challenge of regulating tech companies in the public interest, particularly social media platforms, has been explored in depth. Proposed measures become entangled in overlapping areas of tech, policy, privacy, free speech and platform liability. Differentiating between illegal speech and 'opinions I don't agree with' (like vaccine disinformation) presents serious challenges to freedom of expression and plurality; at the same time, making social media platforms the arbiters of truth is manifestly undesirable. Regulatory overreach would likely be detrimental to free access to the services on which modern society has come to rely, as well as stifling innovation.
On the other hand, the algorithms that drive social media companies' traffic favour provocative content that engages users and prolongs their time on the platform, providing a captive audience for targeted ads. In effect, social media has become weaponised to serve advertisers – disinformation is a profitable business.
Across the board the response has been a patchwork of work-arounds. In 2018, the European Commission published its report on a 'Multi-dimensional Approach to Disinformation', which opted for a co-regulatory Code of Practice and the promotion of media literacy. Companies themselves have taken initiatives to manage misinformation – the broad consensus around the potential harm caused by vaccine misinformation has assisted this agenda. Facebook works with third-party fact-checkers to reduce the distribution of stories that have been flagged as misleading. Instagram has said it would hide hashtags that have a "high percentage" of inaccurate vaccine information, with mixed results. Twitter is launching a new tool that directs users to credible public health resources. In February this year, YouTube announced that it is demonetising anti-vaccination content. This month, Google adjusted its search algorithm to boost original journalism.
Clearly, a degree of self-regulation has already been adopted by these tech giants. But when private entities that act as change agents in the areas of privacy, competition, freedom of speech, national security and law enforcement operate without oversight, there is a risk of the tail wagging the dog.
As part of its remit to transpose the Audiovisual Media Services Directive (EU) 2018/1808 (AVMSD) into Irish law by September 2020, the Broadcasting Authority of Ireland (BAI) will effectively become the EU-wide watchdog for video on-demand services based in Ireland. Under the Directive, providers will be required to implement age verification, parental controls and a 'robust' complaints mechanism. The BAI would become a statutory regulator with legally enshrined enforcement powers to police social media sites' video content.
The UK government published its Online Harms White Paper on 26 June 2019, proposing both government and industry-led initiatives, including a regulatory framework with an independent regulator, user redress, and a statutory duty of care imposed on social media companies. That duty of care focuses on a set of desirable outcomes, leaving it to the companies to decide how to implement them, not unlike the implementation regime for the GDPR. Apart from the measures to be adopted as part of its duties with respect to the AVMSD, the Irish government has not proposed any parallel regime, despite the obvious and pressing need to do so in light of Ireland's unique position as the EU country of incorporation for a large number of global social media companies. Until it does, government talk about the onus being on social media companies to decide which side they want to be on is cheap.
What’s the difference between Privacy and Data Protection?
In Ireland, it's increasingly common to see the term "privacy" being used interchangeably with, or as a substitute for, "data protection". This may be due to a lack of awareness, the influence of U.S. terminology, or marketing preferences for a catchier term. Whatever the reason, it is important to understand the difference between the two terms in order to avoid confusion about legal obligations and rights.
Privacy is a broad term encompassing a number of rights, such as the right to be let alone and the right to respect for private and family life, home and communications. A useful description of privacy is from the UK’s Calcutt Committee report of 1990 as “the right of the individual to be protected against intrusion into his personal life or affairs, or those of his family, by direct physical means or by publication of information.” In Ireland, privacy rights derive mainly from the Constitution of Ireland (as an unenumerated personal right recognised by Article 40.3), Article 8 of the European Convention on Human Rights, and Article 7 of the EU Charter of Fundamental Rights.
Data protection means the protection of individuals in relation to the collection, use or processing of personal data, i.e. information that relates to them as an identified or identifiable person. In Ireland, data protection is governed by the General Data Protection Regulation (GDPR) and the Data Protection Act 2018. Businesses and organisations have data protection obligations, including having a legal basis for collecting, using or processing personal data, complying with the data protection principles, and having technical and organisational measures in place to meet accountability requirements. Individuals have data protection rights, including rights to information, access and erasure, as well as the right to make a complaint to the Data Protection Commission or take legal action where their rights have been infringed or they have suffered damage.
Where a breach of the GDPR is likely to cause risk or harm to an individual, one of the adverse impacts could of course also include a loss of privacy. However, the GDPR is not a privacy law. In fact, the word “privacy” does not appear anywhere in its articles or recitals.
It’s important to know the difference between privacy and data protection to avoid confusion and misunderstanding about legal obligations and rights. It is also essential for businesses and organisations to understand that they have data protection obligations, and individuals have data protection rights, in situations which often have nothing to do with privacy.
Common ground for GDPR and blockchain?
On 12 July 2019 the European Parliament released its report on Blockchain and the General Data Protection Regulation. The report aims to clarify the existing tensions between the rights of data subjects and blockchain technology and propose solutions, while reassuring its proponents that the EU institutions recognise the potentially game-changing applications for blockchain technology across multiple industries, as was addressed in a European Parliament Resolution of 3 October 2018.
The GDPR is principles-based, and these principles inform everything that flows from its application, including its express intention, mentioned in the recitals, to be technology-neutral and thus future-proof. Blockchain technology is built on blocks of digital information replicated across a distributed network of nodes, which may span multiple data controllers. Each block builds on the last and, to maintain the integrity of the chain, cannot be modified or altered after the transaction is completed.
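To illustrate why records in a chain cannot simply be altered after the fact, here is a minimal sketch of a hash-linked chain of blocks; the structure and field names are illustrative rather than drawn from any particular blockchain implementation.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Each new block commits to the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Editing any earlier block changes its hash and breaks every later link."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, "transaction 1")
append_block(chain, "transaction 2")
append_block(chain, "transaction 3")
print(is_valid(chain))          # True
chain[0]["data"] = "tampered"   # a retrospective 'rectification'
print(is_valid(chain))          # False: the chain's integrity is broken
```

Rewriting any block without rebuilding every block after it, on every node, is detectable by design, which is exactly what puts rectification and erasure in tension with the technology.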
The challenges that blockchain presents to the GDPR framework are immediately apparent. The GDPR is built on the presumption of an identifiable data controller, or joint controllers, accountable for how personal data is processed. Moreover, the technical specificities of the blockchain model are not easily aligned with data subjects' rights to rectification or erasure of personal data, or the right to be forgotten. As the technology creates a perpetual ledger, principles such as data minimisation and storage limitation are also difficult to satisfy.
The report also identifies various ways in which blockchain can be used to advance GDPR objectives; without the need for a single (or joint) data controller, it offers transparency over who has accessed data, and data subject rights of access and portability are facilitated by the technology. Ultimately, where blockchain technology has been used in the processing of personal data, its compliance with the GDPR should be assessed on a case-by-case basis, taking into consideration factors such as the context (public v private) and whether the encryption of the data meets the threshold for anonymisation.
The above-mentioned EP resolution makes it clear that there is an explicit intention to support the adoption of blockchain technology across the EU. For GDPR compliance, the report proposes regulatory guidance, codes of conduct and certification mechanisms. Alternatively, funding could be made available for interdisciplinary research on blockchain protocols that could be 'compliant by design'.
What is clear is that, at present, there is nothing concrete in the pipeline to assuage the concerns of privacy advocates, and the question remains – where there is a will, can a way be found?
GDPR update for Food & Drink businesses
These are Niall’s slides from yesterday’s event at City West
Commission decision threatens residential listings
Last week the Workplace Relations Commission (WRC) ordered property website daft.ie to remove adverts found to be in breach of the Equal Status Acts. Adverts posted by users containing terms like "suit young professionals", "rent allowance not accepted" and "references required" were held by the Commission to be discriminatory.
The WRC found that daft.ie was vicariously liable for advertisements placed on its website by third parties where these were in breach of the Acts. It held that daft.ie was not shielded from these obligations by its status as an 'Information Society Service Provider' (ISSP) under the eCommerce Directive (2000/31/EC), which gives internet intermediaries a limited exemption from secondary liability as 'conduits' or transmitters of information.
The WRC adjudicator declared that there was no evidence before it to allow the exemption under the Directive and it directed daft.ie to implement a methodology to “identify, monitor and block discriminatory advertising on its website”. Without a more detailed description – and none is offered in the ruling – of how the WRC adjudicator arrived at her decision, it is impossible to assess the weight that was given to various factors outlined in the submissions.
This exemption is a cornerstone of the eCommerce Directive, aimed at preserving the intermediary model and preventing unintended collateral censorship. The Directive does allow a service provider to be required to terminate or prevent an infringement at the direction of a court or administrative authority. However, the adjudicator's direction that daft.ie put in place a system to automatically flag potentially discriminatory postings conflicts with provisions of the Directive that prohibit member states from imposing on ISSPs a general obligation to monitor the information they transmit or store, or to actively seek facts or circumstances indicating illegal activity.
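To see why such a direction amounts to general monitoring, consider a minimal, purely hypothetical sketch of the kind of keyword screening daft.ie might be expected to run on every posting; the phrase list and function names are illustrative, as the ruling specifies no particular methodology.

```python
# Hypothetical keyword screen applied to every new advert; the phrases
# are taken from the examples cited in the WRC decision, for illustration.
FLAGGED_PHRASES = [
    "rent allowance not accepted",
    "suit young professionals",
    "references required",
]

def flag_advert(text: str) -> list[str]:
    """Return any flagged phrases found in an advert, ignoring case."""
    lowered = text.lower()
    return [phrase for phrase in FLAGGED_PHRASES if phrase in lowered]

advert = "Bright two-bed apartment to let. Rent allowance not accepted."
matches = flag_advert(advert)
if matches:
    print(f"Advert held for review; matched: {matches}")
```

Even this trivial filter must inspect every advert on the site, and a blunt phrase match will inevitably catch legitimate postings too, which is precisely the kind of collateral censorship the Directive's monitoring prohibition was designed to avoid.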
Considering the near inevitability that occasional users of daft.ie's services will post adverts using discriminatory language, the question remains whether daft.ie is the correct entity to screen out all potentially discriminatory language automatically. In light of rapidly evolving technological solutions for 'policing' the internet, there have been high-level discussions about whether the current Directive needs revision. Certainly, the WRC decision referred to daft.ie competitors that have adopted a process such as the one it proposes. Nevertheless, as the law currently stands, there is a question mark over whether the WRC correctly applied EU law and whether the High Court would uphold its decision on appeal.