December 14, 2023

9 mins read

To score is to decide: GDPR and the SCHUFA case

Can the act of assigning a score to someone constitute a decision? The Court of Justice of the European Union’s answer to a highly controversial question, and its broad implications.

About the SCHUFA case

Can the act of assigning a score to someone constitute a decision? This, in essence, is the question the Court of Justice of the European Union (CJEU) had to answer in Case C-634/21. And the Court’s answer is yes, following in the footsteps of the Advocate General’s opinion on the case. Delivered on 7 December 2023, this ruling was eagerly awaited, as it was the first time the Court had the opportunity to interpret the notorious Article 22 of the General Data Protection Regulation (GDPR), which prohibits decisions “based solely on automated processing”.

The CJEU was asked to give a preliminary ruling by a German court (the Wiesbaden Administrative Court, or Verwaltungsgericht Wiesbaden) in a case involving three parties: a German consumer (the applicant), SCHUFA – a private German company specializing in credit information and credit scoring – and the Data Protection Commissioner of the Land of Hesse, whose decision the applicant had challenged.

The dispute arose from a credit score produced by SCHUFA, which led a third party (a bank or other type of credit institution) to refuse a loan to the applicant. After being refused the loan, the applicant asked SCHUFA to disclose information about the data stored and to erase incorrect data. In response, SCHUFA informed her that her score was 85.96% and provided some general information about the methodology used, but refused to say more, claiming business confidentiality. The applicant then turned to the Data Protection Commissioner of the Land of Hesse, asking it, on the basis of various provisions of the GDPR, to order SCHUFA to comply with her requests. After the Commissioner refused to take action against SCHUFA, the applicant challenged that decision before the Verwaltungsgericht Wiesbaden, which referred the matter to the CJEU.

At the heart of the dispute lies the question of the applicability of Article 22 of the GDPR to SCHUFA scores. But the CJEU’s ruling will have a wider impact, notably on all situations where a score or profile produced by one entity (whether private or public) is used by another. The path chosen by the Court is very welcome because it recognises that, very often, the simple act of scoring is in itself a decision, but it also raises important questions that will no doubt give rise to much discussion.

Scoring and consumer credit agencies

When banks and credit institutions decide whether or not to grant credit to a consumer, they have two options: either they use only their own data, or they rely on information and data analysis provided by a third party, i.e. a consumer credit agency. In some countries, such credit agencies do not exist, notably in Belgium and France, where banks do not share customer data with each other. But in many other countries, these companies play a crucial role in the credit market as they enable the circulation of information.

Consumer credit agencies have a long history, dating back to the 1800s. Over the course of the second half of the 20th century, they began to use statistical analysis in order to score individuals. In the United States, the main credit agencies and a company named FICO started producing the famous “FICO scores”, which are assigned to the vast majority of Americans. In Germany, SCHUFA does the same job: it collects consumer data from a wide range of financial players and uses algorithmic methods to produce a score that is supposed to describe the creditworthiness of each individual.
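To make this mechanism a little more concrete, here is a minimal, purely illustrative sketch (in Python) of how an agency might condense consumer data into a single percentage score. The features, weights and logistic model below are my own assumptions for the sake of illustration; SCHUFA’s actual methodology is confidential and is precisely what the applicant could not obtain.

```python
import math

# Hypothetical features and weights -- NOT SCHUFA's actual (confidential) model;
# they only illustrate the general idea of condensing consumer data
# into a single percentage score.
WEIGHTS = {
    "years_of_credit_history": 0.08,
    "missed_payments": -0.9,
    "open_credit_lines": -0.05,
    "income_to_debt_ratio": 1.2,
}
BIAS = 0.5

def credit_score(consumer: dict) -> float:
    """Map a consumer's data to a creditworthiness score between 0 and 100."""
    z = BIAS + sum(w * consumer.get(k, 0.0) for k, w in WEIGHTS.items())
    probability = 1.0 / (1.0 + math.exp(-z))  # logistic model
    return round(probability * 100, 2)

print(credit_score({
    "years_of_credit_history": 6,
    "missed_payments": 0,
    "open_credit_lines": 3,
    "income_to_debt_ratio": 0.8,
}))  # prints 85.69 -- a percentage of the same form as the 85.96% in the case
```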

SCHUFA itself grants no loans and provides no services to individuals; it merely offers information to commercial actors. Strictly speaking, SCHUFA takes no decisions concerning anyone, as it has no contractual relationship with consumers.

However, a score’s influence varies depending on how credit institutions use it. It may be one of many factors considered or, conversely, the only criterion taken into account by the lending institution. In many cases, a lending institution will not formally base its decision solely on the credit agency’s score, yet will automatically refuse to grant credit to applicants whose score falls below a certain threshold. In such a case, it is difficult to determine where exactly the decision is taken. Is the decision taken solely by the credit institution, or is the credit agency’s score itself a decision?
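To see where the decision gets blurred, here is a minimal sketch of a hypothetical lending workflow in which the bank formally takes the decision but applies an automatic threshold rule, so that the agency’s score effectively settles the applicant’s fate. The threshold value and the bank’s other criteria are assumptions for illustration only.

```python
REFUSAL_THRESHOLD = 90.0  # hypothetical cut-off chosen by the bank

def bank_decision(agency_score: float, bank_file: dict) -> str:
    """Hypothetical lending rule: the bank formally decides, but any score
    below the threshold is refused automatically, so the agency's score
    effectively determines the outcome."""
    if agency_score < REFUSAL_THRESHOLD:
        return "refused"  # automatic refusal: the bank's own criteria never come into play
    # Only above the threshold do the bank's own criteria matter.
    return "granted" if bank_file.get("income_verified") else "referred to an advisor"

print(bank_decision(85.96, {"income_verified": True}))  # -> refused
```

In this toy setup, the bank can truthfully say that it, and not the agency, takes the decision, yet for any applicant under the threshold the outcome was fixed the moment the score was produced.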

Credit scores and Article 22 of the GDPR

If this question matters, it is because of the famous Article 22 of the GDPR, which prohibits automated decisions. Should this provision apply, would that mean that such use of a SCHUFA score is necessarily contrary to the GDPR? No, because the second paragraph provides for three exceptions to this general prohibition of automated decision-making: an automated decision may be authorised (1) if it is necessary for entering into or performing a contract, (2) if there is a legal basis authorising such decisions, or (3) if it is based on the data subject’s explicit consent. And the scores produced by credit agencies could possibly fall within one of these exceptions (one aspect of the dispute is precisely that German law allows the use of probability values to assess creditworthiness, which could constitute a legal basis, but I leave that question aside here). But even if the score fell within one of these exceptions and was therefore lawful, the applicability of Article 22 would still have serious consequences for SCHUFA, because the GDPR imposes additional requirements on automated decisions.

The CJEU identifies two arguments – put forward by the German referring court and endorsed by the Advocate General – in favour of the applicability of Article 22.

The first argument relates to the conditions set out in paragraph 1 of Article 22, which reads as follows:

“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

Three conditions are set out: first, there must be a decision taken about an individual; second, the decision must be based solely on automated processing; third, the decision must produce legal effects or similarly significantly affect the data subject.

Although the Court formulates the matter less clearly than the Advocate General did (curiously, it seems to reverse the first and third conditions: the fact that the case concerns the granting of credit is examined under the question of whether there is a “decision”, while the question of whether the score itself determines the consumer’s fate is examined under the production of “legal effects”), two elements were not in dispute: first, the granting or refusal of credit is the type of decision covered by Article 22 (as Recital 71 of the Regulation shows); second, SCHUFA scores are produced automatically. However, as explained above, it is less clear that a SCHUFA score can be considered a decision. To settle this issue, the CJEU relies on the description given by the referring court, according to which the third party draws “strongly” on credit scores, so that an insufficient SCHUFA score leads “in almost all cases, to the refusal of that bank to grant the loan applied for” (paragraph 48). It is therefore a factual finding that leads the Court to treat the score itself as a decision.

The second argument relates to the lack of protection of the rights of the data subjects that would result from the non-applicability of Article 22. Indeed, under Article 13(2)(f), Article 14(2)(g) and Article 15(1)(h) of the GDPR, data subjects who are the target of an automated decision within the meaning of Article 22 are entitled to receive “meaningful information about the logic involved [in the automated decision], as well as the significance and the envisaged consequences of such processing for the data subject”.

However, if we were to consider that the credit agency’s score is not itself a decision, the people being assessed would not be able to ask the credit agencies to explain the underlying logic, since Article 22 would not apply. If they were to turn to the credit institution, the latter would not be able to respond, since it did not produce the score itself. Consequently, for the right to obtain meaningful information to be effective and to avoid a “lacuna in legal protection” (paragraph 61), a score must be qualified as a decision within the meaning of Article 22, states the Court.

The decisional nature of scores depending on their future use?

A notable aspect of the Court’s reasoning is that the decisional nature of the credit agency’s score appears to depend on the way in which the score is used by the purchaser. The same score could, in one case, be classified as a decision on the ground that it is the only criterion used by a bank, and in another case be regarded as not constituting a decision on the ground that it is only one element among others taken into account by another credit institution. This seems difficult to apply in practice, notably because the information rights set out in Articles 13 and 14 must be implemented before the data is processed by the data controller, i.e. the credit agency (as Binns and Veale point out), which requires knowing in advance whether the score will qualify as a decision.

The whole problem with scores is that we cannot anticipate their future uses. Credit bureau scores are often used not only by lending institutions, but also by landlords, employers or insurance companies. How should a score be qualified in these different contexts? One possible solution would be to consider that credit agency scores are in all cases decisions within the meaning of Article 22. In this sense, the US Fair Credit Reporting Act will perhaps be seen as a source of inspiration for EU regulators. Indeed, the Act requires “consumer reporting agencies” to disclose credit scores to consumers – for a fair and reasonable fee and independently of their future uses – and to describe the “key factors that adversely affected the credit score of the consumer in the model used” (see Section 609(f) of the Act).

However, the issue does not only concern credit agencies, but more broadly all situations where a score or profile produced by one entity (whether private or public) is used by another. The CJEU’s ruling is thus very welcome, in particular because it gives effective scope to Article 22, but, as we can see, it also opens up important questions that will continue to be debated.

 

Categories:

Deep Dives, I wrote

Picture credits:

Nguyen Dang Hoang Nhu