Dangerous myths peddled about data subject access rights

By EDRi · April 19, 2017

With the date on which the General Data Protection Regulation (GDPR) becomes enforceable rapidly approaching, the European Data Protection Authorities (DPAs) are in the process of clarifying their shared positions on various topics, including profiling. This is done through stakeholder consultation meetings.

During the latest meeting, one of the more contentious issues turned out to be the transparency requirements for the algorithms used in automated decision-making and profiling. While industry representatives in general provided constructive input on the various topics, this issue proved more challenging: several of them pushed for a very narrow interpretation of the right of access regarding the logic involved in automated decision-making.

The basic argument is that industry has a right to hide the precise details of the calculations used to make decisions that discriminate against individuals. Three arguments were put forward in support of the claim that the right to information about the logic of processing should not extend to disclosure of the actual algorithms used:

  1. the algorithms would be protected trade secrets;
  2. intellectual property rights would preclude such disclosure;
  3. disclosure would create a moral hazard where profiling is applied to fraud prevention.

Regarding the protection of trade secrets, the situation is fairly simple. The Trade Secrets Directive (2016/943/EU), for all its flaws, states specifically in its recitals that it shall not affect, among other rights, the right of access for data subjects. Since this Directive has to be implemented by 9 June 2018, while the GDPR becomes enforceable on 25 May 2018, there is only a window of a few weeks in which trade secret protections in some Member States could, theoretically, prejudice data subject access to the logic used in automated decision-making. For all practical intents and purposes, trade secret legislation therefore cannot be invoked to prevent disclosure of the underlying algorithms.

As far as intellectual property rights are concerned, this is even more of a non-issue. The only so-called intellectual property rights of any relevance here are copyright law and patent law.

Software copyright does not cover underlying algorithms, a view reiterated in the ruling in SAS Institute Inc. v World Programming Ltd (C‑406/10), in which the Court of Justice of the European Union (CJEU) held that the functionality of a computer program is not protected by copyright under the Computer Programs Directive (91/250/EEC).

As far as patent law is concerned, the European Patent Convention states that “schemes, rules and methods for performing mental acts, playing games or doing business, and programs for computers” shall not be regarded as patentable inventions (Article 52(2)(c)). It would be difficult to argue that the logic used for automated decision-making in the profiling of personal data is not a method for doing business. Moreover, patent protection requires disclosure of the underlying technology, which makes it even less plausible that patent law could prejudice disclosure of the logic involved in automated decision-making. Given that none of the other intellectual property rights comes close to covering the logic of algorithms, it follows that intellectual property law poses no barrier to such disclosure.

Even if there were intellectual property rights covering the underlying logic of software algorithms, it would still not be a given that these should override data protection legislation. In cases where the CJEU has had to balance competition law against intellectual property, it has repeatedly found the competition interests to outweigh the intellectual property interests.

The last argument, that of a moral hazard, may or may not come into play in the context of fraud detection and insurance risk assessment. First, the European legislator has made no exception for it in the GDPR. Secondly, the concern can be addressed by disclosing the logic as applied to a specific data subject, instead of the general logic as applied to all data subjects affected.
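To make that distinction concrete, here is a minimal, hypothetical sketch in Python of what per-subject disclosure could look like. It is not drawn from this article or from any real fraud-detection system: the scoring model, field names, weights and threshold are all invented for illustration.

    # Hypothetical toy model: the distinction between the general logic
    # (the full weight table applied to everyone) and the logic as applied
    # to a single data subject. Everything here is invented for illustration.

    WEIGHTS = {"late_payments": 2.0, "account_age_years": -0.5, "claims_last_year": 1.5}
    THRESHOLD = 3.0  # scores above this trigger a manual fraud review

    def score(subject: dict) -> float:
        """The general logic: a weighted sum over the subject's attributes."""
        return sum(WEIGHTS[key] * subject[key] for key in WEIGHTS)

    def subject_specific_disclosure(subject: dict) -> dict:
        """What could be disclosed to one data subject: the outcome and the
        factors that contributed most to it, without handing over the complete
        weight table used for all data subjects."""
        contributions = {key: WEIGHTS[key] * subject[key] for key in WEIGHTS}
        decision = "manual review" if sum(contributions.values()) > THRESHOLD else "approved"
        top_factors = sorted(contributions, key=contributions.get, reverse=True)[:2]
        return {"decision": decision, "main_factors": top_factors}

    if __name__ == "__main__":
        alice = {"late_payments": 2, "account_age_years": 4, "claims_last_year": 1}
        print(subject_specific_disclosure(alice))
        # {'decision': 'manual review', 'main_factors': ['late_payments', 'claims_last_year']}

Whether such a per-subject explanation satisfies the GDPR in a given case is a legal question, not a technical one; the sketch only illustrates that explaining the decision made about one individual does not require publishing the general algorithm.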

The logical conclusion for DPAs enforcing the GDPR is to treat the aforementioned arguments from parts of industry with a great deal of scepticism. They simply have no basis in EU law, or in reality.

Rejections of data subject access requests concerning the underlying logic of automated decision-making that are based on “trade secrets” or “intellectual property rights” should be treated by DPAs as violations of the GDPR and addressed accordingly.

The Trade Secrets Directive (2016/943/EU)
http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32016L0943

Ruling of the SAS Institute Inc. v World Programming Ltd case
http://curia.europa.eu/juris/document/document.jsf?text=&docid=122362&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=154228

European Patent Convention
http://www.epo.org/law-practice/legal-texts/html/epc/2016/e/index.html

Insurance: How a simple query could cost you a premium penalty (30.09.2013)
https://www.theguardian.com/money/2013/sep/30/insurance-query-higher-premiums

(Contribution by Walter van Holst, EDRi member Vrijschrift, the Netherlands)
