I. Importance of informational self-determination
1. Informational self-determination as fundamental right
a. The topic of "informational self-determination" frequently features in press coverage. Especially in the context of eavesdropping, dragnet investigations, online searches by the so-called "Federal Trojan" or video surveillance at the workplace, privacy law concerns are raised - depending on the source - rightly or wrongly.
Before examining the aforementioned phenomena in detail it is important to establish a common understanding of informational self-determination.
Research into the term quickly leads to the so-called census decision of the German Federal Constitutional Court of December 15, 1983.
b. The German Federal Constitutional Court takes the General Personality Right as its starting point, which is primarily protected by Art. 2 (1) in connection with Art. 1 (1) of the German Constitution. According to the still valid considerations of the German Federal Constitutional Court, the General Personality Right also comprises the individual's right, ensuing from the concept of self-determination, to decide autonomously when and within which limits personal life circumstances are revealed.
With regard to that right the German Federal Constitutional Court explained the following as early as 1983 in the aforementioned census decision:
"Under the current and future circumstances of automatic data processing this right requires special protection. It is especially endangered because it is no longer required for decision making processes to rely on manually collected filing cards and paper documents, but today - with the help of automatic data processing - details concerning the personal or material circumstances of an identified or identifiable natural person can be stored for an unlimited period of time and, regardless of distances, can be retrieved at any time within seconds. They can furthermore - especially for the implementation of integrated information systems - be combined with other data bases to form a partially or almost complete personality image, without the person concerned being able to sufficiently verify its correctness and use. As a consequence the possibilities of accessing and influencing, which may impact an individual's behavior already as a consequence of the psychological pressure of public attention, have been enlarged in a manner unknown up to now."
c. The Federal Constitutional Court qualified the possibility of building personality profiles as a danger not only for the General Personality Right as such but also for other fundamental rights. An individual may refrain from exercising his fundamental rights of freedom of assembly and freedom of association if he expects that his participation in an assembly or in a citizens' initiative will be officially registered and that this may create risks for him.
d. As a consequence the German Federal Constitutional Court draws the following conclusion:
“Under the circumstances of modern data processing the free personality development requires the protection of the individual against unlimited collection, storage, use and transfer of his personal data.
This protection is awarded by the fundamental right of Art. 2 (1) in connection with Art. 1 (1) of the German Constitution. The fundamental right therefore grants the individual, as a rule, the right to decide on his own about the disclosure and use of his personal data."
e. It can therefore be concluded that informational self-determination describes a situation in which an individual knows and controls the information available about him. What is to be avoided in any case is a "transparent citizen". This protection was strengthened in 2008 by the so-called "IT fundamental right", which aims at protecting the integrity and confidentiality of IT systems and thus also informational self-determination.
f. It is also one of the central statements of the Federal Constitutional Court that an individual does not have unlimited command over his data. Each individual is exposed to certain influences as a consequence of social interaction. The Federal Constitutional Court puts this as follows:
"Information, even to the extent it is related to an individual, constitutes an image of social reality, which cannot be attributed exclusively to the individual concerned. The German Constitution has - as is often underlined in the judgments of the Federal Constitutional Court - resolved the tension between individual and community in favor of the individual's relatedness to and dependence on the community [...]. As a consequence the individual has to accept restrictions of his right of informational self-determination in the interest of an overriding public interest."
Today, the resulting limitation of the de-facto self-determination over information held by third parties is of the utmost importance. The following applies: an individual does not have unlimited control over information about himself. Certain acts of collection, processing and use of personal data have to be accepted, even without or against the individual's will.
2. The concept of personal data
a. The subject matter of the right of informational self-determination is personal data. In Sec. 3 (1) of the Federal Data Protection Act the legislator defines them as "any information concerning the personal or material circumstances of an identified or identifiable natural person". Consequently, data are personal not only if information relates to a named individual but also if the individual's identity can be deduced from the circumstances, as is the case, for example, with social security or employee numbers, the combination of bank account number and sort code, or e-mail addresses.
b. The described concept of personal data is very broad, which is why the question of informational self-determination arises in many day-to-day situations: when going shopping, when using geo-localization services or when using mobile phones. Any neutral set of data becomes personal data solely by attributing its content to a natural person.
c. It has to be specially emphasized that there is nothing like unimportant or unprotected personal data. The legislator has taken the fundamental decision against establishing gradually different levels of protection for personal data.
As the sole exception, the so-called "sensitive data" (legally referred to as 'special categories of personal data') enjoy particularly strong protection. Special categories of personal data shall mean information on racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, health or sex life.
3. The "I have nothing to hide" error
a. As mentioned before there is a wide-spread lack of understanding of the discipline of privacy law. In a number of cases a person who raises objections against public or private surveillance measures is confronted with the question whether he or she has something to hide.
This question is the result of an error.
It is the error of believing that the right to informational self-determination and privacy law - although legally founded - are no more than bureaucratic excess and amount to an attempt to hide something negative.
b. In his article "Why Privacy Matters Even if You Have 'Nothing to Hide'", published in The Chronicle Review on May 15, 2011, Daniel J. Solove sums up said error as the "I have nothing to hide" error. The very interesting article lists various reasons why the protection of informational self-determination is required. At the same time it establishes that protecting privacy has nothing to do with hiding something.
Mere observation already impedes the exercise of fundamental rights, even where it is not about hiding things.
II. Concept of legal protection
1. Central statutory provisions
It is impossible to fully describe the theoretical foundations of the right of informational self-determination in this article.
For this reason here are the most important statutory provisions of privacy law by means of which the legislator wants to implement the concept of informational self-determination:
(1) Data economy and data reduction: Sec. 3a of the Federal Data Protection Act contains the requirement of data economy and data reduction. The key message is that a controller should always try to collect as little personal data as possible and that the procedures of anonymization and pseudonymization shall be used. In this context anonymization means the de-facto irreversible removal of the information's link to the person, whereas mere pseudonymization can be reversed with some effort.
(2) Prohibition principle: The collection, processing and use of personal data is prohibited unless it is permitted. This so-called principle of prohibition subject to permission expresses the fact that the handling of personal data always requires a permission. Such permission can be contained in other laws or in the Federal Data Protection Act, for example if a certain data processing measure is required for the fulfilment of a contract. There are also statutory provisions in the Federal Data Protection Act in which the permission requires a weighing of interests. Finally, there is consent. In the employment relationship consent is perceived as a problematic permission since the prevailing opinion considers the employees' decision to declare or withhold consent to be involuntary. For this reason the German legislator currently plans to introduce significant limitations on the admissibility of employee consent.
(3) Principle of direct collection: The next important protection element is the principle of direct collection according to Sec. 4 (2) of the Federal Data Protection Act. From a privacy law standpoint information about a certain person should always be collected directly from that person. The wide-spread habits of so-called "background screening" or of "calling the old employer" are not compliant with this principle. It can be stated without exaggeration that in practice the principle of direct collection is among the least respected principles.
(4) Purpose limitation: Another very important principle requires the prior determination of a certain purpose for the collection, processing and use of personal data and the subsequent limitation of use in compliance with such purpose. There is no room for a "let's wait and see" approach. The heated debate about the planned centralized storage of various data about German citizens (e.g., participation in lawful or illegal strikes, times of absence at the workplace etc.) was inter alia fuelled by the lack of clear purposes of such central data collection, and the project has not been pursued further in the originally planned form.
(5) Technical protection: The controller is obliged to protect the personal data within its sphere of influence also from a technical point of view. Here passwords, authorization concepts, the use of encryption techniques and the like come to bear. The corresponding provisions aim at preventing data losses and the corresponding dangers for individuals. This objective becomes particularly obvious when it comes to security breach notification obligations vis-à-vis the competent data protection authority and the data subjects in case of the loss of certain data (e.g., bank and credit card data).
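One elementary building block of such technical protection - storing passwords only as salted, iterated hashes rather than in plain text - can be sketched as follows (a generic illustration; the chosen hash function and iteration count are assumptions for the example, not statutory requirements):

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 200_000):
    """Store only a salted, iterated hash - never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 200_000) -> bool:
    """Recompute the hash from the candidate password and compare."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

The point for privacy law is that, should such a password database be lost, the attacker obtains no directly usable credentials - exactly the kind of risk reduction the breach-notification provisions presuppose.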
(6) Transparency: Another important objective of privacy law is to maintain a maximum of transparency vis-à-vis the data subjects, which is closely linked to the subject matter of informational self-determination. Under certain circumstances and pursuant to certain statutory provisions the data subjects have the right to claim information about, or the deletion, correction or blocking of personal data.
(7) Transfer to third countries: The statutory provisions with the greatest impact on the economy are Secs. 4b and 4c Federal Data Protection Act restricting the transfer of personal data to countries outside the EU/EEA which do not provide for an adequate level of data protection. The group of these so-called "third countries" is large. Pursuant to the parameters contained in the Data Protection Directive (Directive 95/46/EC), which have been transposed into German law, third countries are all countries outside Europe with the exception of states on the so-called White List. Examples of third countries are the USA but also India and China. For the states on the White List the European Commission has acknowledged an adequate level of data protection pursuant to the Data Protection Directive. This applies to Argentina, Canada, Guernsey, the Isle of Man, Switzerland and now also Israel. The data protection level in all other states is inadequate from a European point of view and therefore has to be compensated in each individual case on the level of the companies acting as data importers. This can be achieved by entering into model contracts or, in the case of the USA, by means of a self-certification (known as the Safe Harbor certification).
2. Role of the data protection officer
According to Sec. 4f of the Federal Data Protection Act public and private bodies which process personal data by automated means shall appoint a data protection officer in writing. For private bodies this applies only if more than nine persons are permanently employed in the automated processing of personal data or if the automated processing measures present special risks to the rights and freedoms of data subjects.
The data protection officer shall work to ensure compliance with the Federal Data Protection Act and with other data protection provisions. He constitutes the central point of contact for questions related to privacy law.
The concept of having a data protection officer is successful. In the meantime other Member States are evaluating the possibility of also introducing the data protection officer in their respective laws. Recently the European Commission has presented a draft regulation on European privacy law, which contains the requirement to appoint a data protection officer for companies of a certain size.
3. Role of data protection authorities and courts
Independent data protection authorities in the 16 German Federal States supervise the application of the Federal Data Protection Act as well as of other data protection provisions. The data protection authorities also have the obligation to advise and support the data protection officers and the data controllers with respect to their respective typical needs.
In practice the data protection authorities advise and support by means of a great number of publications and suggestions on how to handle certain privacy law questions. The publications are sometimes coordinated among the 16 different data protection authorities.
On many questions the data protection authorities agree. It is, for example, their unanimous opinion that an employee can hardly give voluntary consent and that, as a consequence, employee consent cannot be recommended as the foundation for the collection, processing and use of personal data in the employment relationship.
There are also questions, however, on which the data protection authorities hold different points of view, for example whether companies have the right to screen their employees' address data against the so-called anti-terror lists of the European Union.
Court decisions on privacy law questions are relatively rare. This can be explained by companies' reluctance to risk the publicity of court proceedings when they are the subject of measures by a data protection authority. As a rule, companies with problems in the area of privacy law avoid public attention. The result of the interplay between the described principles and the guidance provided by the data protection authorities is sometimes hard for data subjects to anticipate. This may be one of the reasons for the "popularity" of privacy law as a legal discipline. As a matter of fact it is not always easy to deduce privacy law-compliant behavior from merely reading the law.
III. Application of the law and cases of doubt
1. Whistleblowing
The admissibility of Whistleblowing, be it via telephone or via a web interface, has frequently been the subject of discussion during the last years. The concept of Whistleblowing is to give employees the possibility of reporting violations of a code of conduct to a central point of contact, potentially even anonymously.
For the qualification of such Whistleblowing systems from a privacy law point of view it is essential to know which conduct rules are contained in the code of conduct, the violation of which can potentially be the subject of a Whistleblowing report by another employee. Often the Sarbanes-Oxley Act is referred to as the sole relevant source for codes of conduct.
The Sarbanes-Oxley Act - itself triggered by cases such as Enron or Worldcom - inter alia aims at improving the transparency of companies' financial reporting. Upon closer examination it turns out that the Sarbanes-Oxley Act requires a code of conduct but, from a content point of view, only addresses financially relevant behavior and bookkeeping. The New York Stock Exchange Corporate Governance Code goes further for companies listed on the New York Stock Exchange. It does not only contain a list of minimum contents but also the statement that each company has the right to create its own rules. This gives room for a corporate ethical identity. A couple of years ago companies created conduct rules against romantic involvements at the workplace.
A German labor court which had to analyze the concerned code of conduct in the context of a court procedure considered such rules to constitute a violation of human dignity.
Slightly simplified the prevailing opinion differentiates as follows in accordance with the degree of risk for the company:
As regards crimes which are committed against the company or from inside the company against third parties as well as regards violations of human rights the company’s interest in receiving the report prevails in order to protect the company and its resources. For such reporting items Whistleblowing is admissible.
For violations of merely internal conduct rules, however, the interests of the employees being subject of a report prevail. The violation of such conduct rules does not constitute significant risks for the company. For such reporting items Whistleblowing is not admissible.
The German data protection authorities’ guidance as regards the implementation of a Whistleblowing scheme deals with the legitimate purposes of Whistleblowing, the group of persons potentially subject of a report, how to avoid anonymous reports, the notification and information obligations, the transfer of data to third parties, the blocking, correction and deletion of data, the involvement of the data protection officer, how to involve third parties and the required technical and organizational measures.
2. Cloud Computing
a. The idea of cloud computing is based on accessing centrally stored data and programs with small and - as a rule - cheap hardware devices (e.g., desktop computers with minimal equipment). The central administration of data and programs is referred to as a main advantage of cloud computing, as it simplifies the creation of backups. Also, updates of the relevant software programs can be executed centrally (i.e., on the cloud provider's systems) instead of having to update installations on a multitude of end user devices.
b. From a privacy law point of view the provider of cloud computing services is normally qualified as a data processor. The company storing its data in the "cloud" is the data controller. One of the main differences between cloud computing and "conventional" data processing is most likely that the cloud provider no longer maintains the required resources within his own data centre but "buys" them flexibly from external resource providers, which are not party to the primary data processing agreement.
c. It ensues from the classification as a case of data processing and the phenomenon of third-party resource providers being involved that the data controller has to deal with the requirements of international data transfers and of the involvement of sub-processors.
d. Within the current legal framework there are a couple of open legal questions regarding privacy law compliance of cloud computing. As a consequence it will not always be possible to find a compliant solution. This is especially true when it comes to storing sensitive data in the cloud. Also the involvement of sub-processors in third countries turns into a complex undertaking as a consequence of privacy law requirements. It is only since the beginning of 2010 that a new model contract (i.e., 2010/87/EU) provides for a possibility to involve non-European sub-processors in a compliant manner.
The annoying defect of the new model contract is that it only applies if the first data processor (i.e., the cloud provider) has its seat outside the European Union. As a consequence it still requires some courage from European cloud providers to involve sub-processors in their own name. A clear solution to this problem is not in sight.
3. Freedom of opinion versus informational self-determination
a. The teacher evaluation platform “Spick mich” (an internet platform allowing students to anonymously evaluate their teachers) is a good example of how the freedom of opinion and informational self-determination can be in conflict with each other.
b. Here are the facts of the case: a teacher perceived a negative evaluation of her teaching and her personality, accessible over the internet on the "Spick mich" platform, as a violation of her right to informational self-determination. The teacher pursued her cease-and-desist claims against the operators of the "Spick mich" platform through all court instances up to the Federal Supreme Court. Without success. The Federal Supreme Court inter alia argued that the freedom of opinion according to Art. 5 (1) of the German Constitution also protects anonymous opinions. The weighing of interests carried out by the Federal Supreme Court eventually led to the permission to continue operating the evaluation platform.
c. The judgment came as a surprise to many observers. The reason, from a privacy law point of view, was that the statutory permission (i.e., Sec. 29 of the Federal Data Protection Act) which the Federal Supreme Court used to justify the operation of the evaluation platform does not provide for a weighing of interests. Sec. 29 of the Federal Data Protection Act (now) has the following wording:
“The commercial collection, recording, alteration or use of personal data for the purpose of transfer […] shall be lawful if
1. there is no reason to believe that the data subject has a legitimate interest in ruling out the possibility of collection, recording or alteration […].”
A strict application of the wording of the law would have led to prohibiting the evaluation platform. However, given the collision with the freedom of opinion, the Federal Supreme Court deemed it impossible to decide the case like this.
IV. The future of privacy law
1. EU-regulation on privacy
Based on the European Data Protection Directive, data protection laws have been created in all Member States or, where such laws already existed, they have been adapted to match the requirements of the Directive. It would, however, be wrong to celebrate a harmonization of privacy laws in Europe achieved by the Directive. As a matter of fact the project of harmonizing privacy laws has failed, at least according to the European Commission. In her speech of November 28, 2011 Viviane Reding, vice president of the European Commission and commissioner for justice of the European Union, (not without reason) criticized the extreme fragmentation of privacy laws in the European Union and announced that she would cure this situation by means of a regulation.
Mrs. Reding did not only address the legal differences between the Member States resulting from the different implementations of the Directive but also the financial burden placed on companies by having to abide by the laws of 27 Member States. It is especially the undifferentiated filing and registration requirements in some Member States which the European Commission would like to abolish.
At the end of 2011 the first draft of a European privacy law regulation by the European Commission became known. On January 25, 2012 the official draft of the EU privacy law regulation was published by means of which the Commission intends to create a uniform European privacy law.
2. The European Court of Justice’s decision dated November 24, 2011
Already on November 24, 2011 the European Court of Justice took a step in the same direction as the European Commission. On that date the European Court of Justice rendered a judgment in a procedure according to Article 267 of the Treaty on the Functioning of the European Union. In the two relevant cases C-468/10 and C-469/10 the Court of Justice's preliminary ruling had been requested by the Spanish Supreme Court (Tribunal Supremo), which was uncertain about the compliance of certain provisions of Spanish law with Directive 95/46/EC. In this judgment the European Court of Justice stated that Art. 7(f) of Directive 95/46/EC is directly applicable and that the Member States must impose neither stricter nor less strict requirements. The European Court of Justice thereby answered a question which has been discussed for years: are the Member States entitled to impose stricter requirements for the collection, processing and use of personal data than those contained in Directive 95/46/EC?
The answer of the European Court of Justice is “no”.
3. Data protection as protection against oneself: self-exposure as legal problem
a. Another challenge for informational self-determination is created by the virtually ubiquitous social networks. Here the limitations of informational self-determination are created by the data subjects themselves. According to publicly available figures, Facebook for example is used by approximately 800 million people worldwide. Within a standard user profile created in a social network, users make plenty of details about their professional and private lives accessible on the internet.
Users who are active on Twitter voluntarily report details about their daily life.
Individuals who are willing to share their experiences during night life upload pictures of a successful evening to the platform www.nachtagenten.de. It is obvious that such pictures might have undesired consequences in professional life. The adage "what happens in Vegas, stays in Vegas" stopped applying a long time ago.
Individuals who are willing to present their professional profile to a broader audience publish their professional curriculum on www.xing.com.
In order to not only name examples related to the internet, one can take a closer look at how data are disclosed when it comes to game shows. Experiments have shown that consumers reply to almost any question about themselves and disclose address and bank account data, if the organizer of the game show asks for such information as “compensation” for taking part in the game show.
These cases of rather uncontrolled disclosure of personal data have to be differentiated from cases in which the controller obtains personal data based on the data subjects' consent in order to be able to render certain services. One example is the financial industry. Sometimes the declarations of consent used in the financial industry are referred to as a "normative parallel world". The reason for this term is that the usually very detailed declarations of consent create a set of permissible data processing measures which does not result from applying the law. This situation is a long way from the objective of being able to tell what is admissible and what is inadmissible solely by taking a look at the Federal Data Protection Act.
b. The German legislator plans to react to the phenomenon of self-exposure on the internet with amendments to the Federal Data Protection Act limiting the employer's right to conduct internet research about job candidates. In the future, internet research about job candidates shall only be admissible if the job candidate has been informed and has agreed. Research in social networks intended for private communication will not be admissible. The so-called "googling" for interesting details about a job candidate shall in the future only be admissible if the job candidate has been made aware of the fact that such research occurs.
c. It is quite obvious that the planned reform of the Federal Data Protection Act will not be sufficient to counteract users' self-exposure. It is most likely impossible, for example, to prove that an employer has used a search engine to find out details about a job candidate. It is certain, however, that the readiness to abide by a statutory provision decreases if the likelihood of getting caught is close to zero.
As a conclusion it is quite certain that it will take many more years to answer all the legal and factual questions related to informational self-determination. Especially the new technical possibilities, the global computer networks, the ever growing offer of geo-localization services and social networks absolutely require the creation of a uniform legal framework.