On March 19th, the European Data Protection Board (EDPB) adopted, via written procedure, a formal statement on the processing of personal data in the context of the COVID-19 outbreak. The full statement is available here
Statement by the EDPB Chair on the Processing of Personal Data in the Context of the COVID-19 Outbreak
Governments, public and private organisations throughout Europe are taking measures to contain and mitigate COVID-19. This can involve the processing of different types of personal data.
Andrea Jelinek, Chair of the European Data Protection Board (EDPB), said: “Data protection rules (such as European General Data Protection Regulation (GDPR)) do not hinder measures taken in the fight against the coronavirus pandemic. However, I would like to underline that, even in these exceptional times, the data controller must ensure the protection of the personal data of the data subjects. Therefore, a number of considerations should be taken into account to guarantee the lawful processing of personal data.”
The GDPR is a broad piece of legislation and also provides rules that apply to the processing of personal data in a context such as that of COVID-19. Indeed, the GDPR provides legal grounds that enable employers and the competent public health authorities to process personal data in the context of epidemics, without the need to obtain the consent of the data subject. This applies, for instance, when the processing of personal data is necessary for employers for reasons of public interest in the area of public health, to protect vital interests (Art. 6 and 9 of the GDPR), or to comply with another legal obligation.
For the processing of electronic communication data, such as mobile location data, additional rules apply. The national laws implementing the ePrivacy Directive provide that location data can only be used by the operator when they have been made anonymous, or with the consent of the individuals. Public authorities should first aim to process location data in an anonymous way (i.e. processing data aggregated in a way that cannot be reversed to personal data). This could make it possible to generate reports on the concentration of mobile devices at a certain location (“cartography”).
When it is not possible to only process anonymous data, Art. 15 of the ePrivacy Directive enables the member states to introduce legislative measures pursuing national security and public security *. This emergency legislation is possible under the condition that it constitutes a necessary, appropriate and proportionate measure within a democratic society. If such measures are introduced, a Member State is obliged to put in place adequate safeguards, such as granting individuals the right to judicial remedy.
* In this context, it shall be noted that safeguarding public health may fall under the national and/or public security exception.
The Swedish Data Protection Authority imposes a fine of 75 million Swedish kronor (approximately 7 million euro) on Google for failure to comply with the European General Data Protection Regulation (GDPR). Google as a search engine operator has not fulfilled its obligations in respect of the right to request delisting.
In 2017 the Swedish Data Protection Authority (DPA) finalised an audit concerning how Google handles individuals’ right to have search result listings for searches that include their name removed from Google’s search engine, for example where the results are inaccurate, irrelevant or superfluous. In its decision the DPA concluded that a number of search result listings should be removed and subsequently ordered Google to do so.
In 2018, due to indications that Google had not fully complied with the previously issued order, the DPA initiated a follow-up audit. This audit is now finalised and the DPA is issuing a fine against Google.
– The GDPR increases the level of responsibility for organisations that collect and process personal data, and strengthens the rights of individuals. An important part of those rights is the possibility for individuals to have their search result delisted. We have found that Google is not fully complying with its obligations in relation to this data protection right, says Lena Lindgren Schelin, Director General at the Swedish DPA.
The Swedish Data Protection Authority is critical of the fact that Google did not properly remove two of the search result listings that the DPA had ordered it to remove back in 2017. In one of the cases, Google interpreted too narrowly which web addresses needed to be removed from the search result listing. In the second case, Google failed to remove the search result listing without undue delay.
When Google removes a search result listing, it notifies the website to which the link is directed in a way that gives the site-owner knowledge of which webpage link was removed and who was behind the delisting request. This allows the site-owner to re-publish the webpage in question at another web address that will then be displayed in a Google search, which in practice renders the right to delisting ineffective.
– In its delisting request form Google states that the site-owner will be notified of the request in a way that might result in individuals refraining from exercising their right to request delisting, thereby undermining the effectiveness of this right, says Olle Pettersson, legal advisor at the Swedish DPA who has participated in this audit of Google.
Google does not have a legal basis for informing site-owners when search result listings are removed and, furthermore, gives individuals misleading information through the statement in the request form. That is why the DPA orders Google to cease and desist from this practice.
Facts about the right to have search result listings removed
In May 2014 the Court of Justice of the EU ruled that an individual may request that a search engine provider such as Google remove a search result listing containing the individual’s name where the listing is incorrect, irrelevant or superfluous. This right was strengthened when the GDPR entered into force on 25 May 2018. The right is, however, not absolute: individuals cannot demand that all search results be removed. Individuals who wish to exercise their right to request delisting should contact the search engine provider directly.
What happens next?
Google may appeal the decision of the Swedish DPA within three weeks. If Google decides not to appeal, the decision will enter into force by the end of that time period. Once the decision has entered into force it will be handed over to the Legal, Financial and Administrative Services Agency (Kammarkollegiet) that handles the administration of fines under the GDPR.
Note to editors:
The personal data processing in question is part of the processing operations carried out by Google as a search engine operator. For this part of Google’s activity it is Google LLC (parent company of the Google group) established in the United States that decides the purpose and means of the processing. Since there is no main establishment within the EU for this part of Google’s operations, each Supervisory Authority in the EU is competent for investigating possible infringements of the GDPR within their territory.
To read the press release in Swedish, click here
To read the full decision in Swedish, click here
For further information, please contact the Swedish SA: email@example.com
The Danish Data Protection Agency has reported the municipality of Gladsaxe and the Municipality of Hørsholm to the police, as it finds that the municipalities have not met the requirements of an adequate level of security under the General Data Protection Regulation (GDPR).
Fines of DKK 100.000 and DKK 50.000 have been proposed for the municipalities of Gladsaxe and Hørsholm, respectively.
The Data Protection Agency became aware of the cases when both municipalities notified the agency of personal data breaches relating to the theft of computers containing personal data.
Neither computer was protected by encryption, and the municipalities’ loss of personal data therefore posed an undue risk to their citizens.
In one of the cases, the lack of security resulted in a serious personal data breach, as a computer containing personal data of 20.620 citizens, including information of a sensitive nature, was stolen from Gladsaxe City Hall.
The second security breach took place when the computer of an employee of the municipality of Hørsholm was stolen from his car. The computer held information on about 1.600 employees of the municipality of Hørsholm, including information of a sensitive nature.
These specific breaches illustrate some of the possible consequences of an insufficient level of security, which poses a high risk to all citizens whose data the municipalities process.
Municipalities have a great deal of responsibility
“A municipality processes very large amounts of personal data concerning its citizens, including information of a sensitive nature. As a citizen, it is not possible to opt out of the municipality’s processing of information about oneself, and the municipality therefore bears a great responsibility to prevent the information from being disclosed,” said Frederik Viksøe Siegumfeldt, Head of Unit of the Supervisory Unit at the Danish Data Protection Agency. He explains:
“It is simple to access the files stored on the computer when a computer’s hard drive is not encrypted, for example by moving the hard drive to another computer. Therefore, when personal data are stored locally on the computer, it is very imprudent that the municipalities' computers were not encrypted.”
Proposal of fines
The Danish Data Protection Agency has decided to report the Municipality of Gladsaxe and the Municipality of Hørsholm to the police and proposes that the two municipalities be fined DKK 100.000 and DKK 50.000 respectively.
To read the press release in Danish, click here
For further information, please contact the Danish DPA: firstname.lastname@example.org
On 5 March 2020, the Icelandic Supervisory Authority (SA) took the decision to impose an administrative fine of ISK 3.000.000 (EUR 20.643) on the National Center of Addiction Medicine in a case relating to a personal data breach.
The National Center of Addiction Medicine is an NGO that operates a detoxification clinic and four inpatient and outpatient rehabilitation centers, as well as a center for family services and a social center in Iceland. Its services are delivered by a staff of medical doctors, psychologists, registered nurses, nurse practitioners and licensed counselors.
The breach occurred when a former employee of the National Center of Addiction Medicine received boxes containing what were supposed to be personal belongings that he had left there. However, it turned out that the boxes contained patient data as well, including health records of 252 former patients and records containing the names of approximately 3.000 people who had attended rehabilitation for alcohol and substance abuse.
After carrying out an investigation of the data breach, the SA concluded that the breach resulted from the controller’s failure to implement appropriate data protection policies and appropriate technical and organisational measures to protect the data. The lack of appropriate measures to protect the personal data therefore constituted violations of, inter alia, Art. 5(1)(f) and Art. 32 of the European General Data Protection Regulation (GDPR).
When determining the fine, the SA referred to the nature of the personal data involved in the breach, which were data concerning health, and the large scope of the processing. The SA also cited the nature of the National Center of Addiction Medicine as a non-profit health care provider and the fact that the Center had made considerable efforts to improve handling of personal data, beginning before the breach came to light.
The full decision in Icelandic is available here
For further information, please contact the Icelandic SA: email@example.com
The President of the Personal Data Protection Office imposed a fine of PLN 20 000 in connection with the breach consisting in the processing of biometric data of children when using the school canteen.
The school processed special categories of data (biometric data) of 680 children without a legal basis, although it could in fact have used other forms of student identification.
For that breach, an administrative fine was imposed on Primary School No. 2 in Gdansk. In addition, the President of the Personal Data Protection Office (UODO) has ordered the erasure of the personal data processed in the form of digital information on the specific fingerprints of the children and the cessation of any further collection of personal data.
Following ex officio administrative proceedings, the President of the UODO established that the school uses a biometric reader at the entrance to the school canteen that identifies the children in order to verify payment of the meal fee.
The proceedings have shown that the school obtains the data and processes them on the basis of the written consent of the parents or legal guardians. The solution has been in place since 1 April 2015. In the school year 2019/2020, 680 pupils used the biometric reader, while four pupils used an alternative identification system.
In this case, it is important to stress that the processing of biometric data is not essential for achieving the goal of verifying a child’s entitlement to receive lunch. The school may carry out the identification by other means that do not interfere so much with the child’s privacy. Moreover, the school allows the services of the school canteen to be used not only by means of fingerprint verification, but also by electronic card or by giving a name and contract number. Thus, alternative forms of identifying the child’s entitlement to receive lunch already exist in the school.
In the fined Primary School No. 2, in accordance with the lunch rules, available on the website of the school’s canteen, students who do not have biometric identification have to wait at the end of the queue until all the students with biometric identification enter the canteen. Once all the students with biometric identification have entered the canteen, the students without biometric identification are allowed to enter, one by one. In the opinion of the President of the UODO, such rules introduce unequal treatment of students and their unjustified differentiation, as they clearly favour students with biometric identification. Moreover, in the authority’s view, the use of biometric data, considering the purpose for which they are processed, is significantly disproportionate.
The President of the UODO, in the grounds of his decision, emphasised that children require special protection of their personal data. Moreover, in the present case, the processed data constitute data of special categories. The biometric system identifies characteristics which are not subject to change, as is the case with dactyloscopic data. Because biometric data are unique and permanent, and cannot change over time, they should be used with due care. Biometric data are unique in the light of fundamental rights and freedoms and therefore require special protection. Their possible leakage may result in a high risk to the rights and freedoms of natural persons.
To read the press release in Polish, click here
The Polish text of the decision is available here
For further information, please contact the Polish SA: firstname.lastname@example.org
The Dutch DPA imposed a fine of EUR 525,000 on tennis association KNLTB for selling the personal data of its members. In 2018, KNLTB unlawfully provided personal data of a few hundred thousand of its members to two sponsors.
Fine for tennis association for selling personal data
The Dutch DPA (Autoriteit Persoonsgegevens, AP) is imposing a fine of 525,000 euro on tennis association KNLTB for selling personal data. In 2018, the KNLTB unlawfully provided, against payment, personal data of a few hundred thousand of its members to two sponsors.
The Royal Dutch Lawn Tennis Association (KNLTB) provided the sponsors with personal data such as name, gender and address, so that they could approach a selection of KNLTB members with tennis-related and other offers. One sponsor received the personal data of 50,000 members, the other of more than 300,000. The sponsors approached some of these KNLTB members by post or telephone.
Sale of personal data
For every processing of personal data, the organisation processing the data must be able to rely on one of the six legal bases in the GDPR, for example that the data subject has given consent to the processing. Selling personal data without the consent of the person behind the data is generally prohibited. The KNLTB considered that it had a legitimate interest in selling the data. The AP disagrees and has ruled that the KNLTB had no legal basis for passing the personal data on to the sponsors.
KNLTB complaint about the AP
During the investigation into the KNLTB, the tennis association filed a complaint against the AP, which the AP declared well-founded. The complaint concerned the appearance of AP chairman Aleid Wolfsen on the television programme Nieuwsuur on 17 December 2018, in which Wolfsen indicated that the AP was investigating “a sports association”. In response to the complaint, the AP acknowledged that the broadcast created the impression that the KNLTB’s conduct was improper while the investigation was still ongoing. The KNLTB perceived an appearance of bias in these statements, which the AP regrets. On the recommendation of the National Ombudsman, the AP hereby notes that Wolfsen’s statements wrongly anticipated the outcome of the investigation.
The KNLTB has objected to the fining decision. The AP will now assess this objection.
To read the full decision, click here
For further information, please contact the Dutch DPA: https://autoriteitpersoonsgegevens.nl/nl
Following a decision by the European Data Protection Board (EDPB) Chair, the EDPB March Plenary Session has been cancelled due to safety concerns surrounding the outbreak of the Coronavirus (COVID-19). The EDPB hereby follows the example of other EU institutions, such as the European Parliament, which have restricted the number of large-scale meetings.
The March Plenary Session was scheduled to take place on 19 and 20 March. You can find an overview of upcoming EDPB Plenary Meetings here