Amidst the controversy surrounding the Chinese government's initiative to implement a compulsory Social Credit Score System in the country by 2020, we decided to speak with two Master's students whose areas of expertise could help us figure out what's at stake with this technology. Christian Orriëns, a Master of Psychology student, and Ammu Joshy, a Master of Philosophy of Science, Technology and Society student, share their opinions and concerns.
Currently run by Ant Financial, an affiliate of the Alibaba Group, the system might monitor how quickly citizens pay off their debts, the products they consume, and who is in their immediate social circle. The exact criteria the algorithms will take into account are still unclear, but the Chinese government has announced that partaking in the system will be compulsory for every citizen as of 2020. While a high score could grant citizens larger loans and greater social benefits, a low score could mean, for example, losing the ability to enroll in certain universities or even to travel.
From a philosophical perspective, do you see any benefits of having a Social Credit Score System?
Joshy: ‘The developers of the system and the Chinese government say they’re doing it to increase the sincerity and loyalty of the citizens. It’s like any product: how do you improve its quality? By getting feedback on it. Then you can turn that feedback into a rating. In those terms, I agree it works. But when what you are rating is humans, it can become quite problematic. Something quantitative, your score, is derived from something qualitative, your loyalty or responsibility. But who assigns these arbitrary numbers? And with what methods of calculation?’
As a psychologist, would you say the system could indeed increase the sincerity and loyalty of the citizens?
Orriëns: ‘One of the fundamental principles behind the system is a common psychological theory called operant conditioning. It comes down to behaviors being extinguished or reinforced by giving out punishments or rewards. This is applied in psychiatric settings, to some extent to good effect, but it is dangerous to apply it to an entire society, because you’re valuing people on the basis of a single number that might not capture everything the individual does.’
How could these scores impact how citizens make sense of themselves?
Orriëns: ‘There have been psychological experiments suggesting people are less likely to do something they know they shouldn’t, like taking money from an unsupervised container, if there is a mirror around, because it makes them confront themselves. Now, imagine that Sesame Credit is this mirror, with your score as a continuous reflection of yourself.’
Joshy: ‘I can see it encouraging pretension. You could adopt a fake identity in society to blend in with everyone and gain a better rating. But that can create a conflict of self, where your outer self is concerned with the general opinion while your inner self is completely contradicted.’
You both seem concerned with the system reducing abstract traits of one’s personality to numbers, but couldn’t we argue the system will look for more objective criteria? Take paying your taxes on time: you either do it or you don’t. If you don’t, you’re not reliable.
Joshy: ‘Right, but then you would not take into account instances where the person was sick and couldn’t pay the taxes on time. Will that person’s score get lowered for that? An algorithm misses the nuances. It cannot capture personal or exceptional situations.’
Orriëns: ‘Exactly, this is why we have judges. If you have a situation where you couldn’t pay your bill on time, a judge would weigh the circumstances and might let you off or actually fine you. With the Sesame Credit system, the human factor is taken away. I’m not sure we should be happy about that.’
Why is it considered outrageous when the government monitors details of your life, yet when private corporations such as Facebook do it, people are generally okay with it?
Orriëns: ‘The major difference is that I choose to join and check the box saying “I agree with the terms and conditions”. Therefore, to some extent, I am giving them permission to do it.’
Joshy: ‘Here, the ability to opt out is the defining factor. With a mandatory mass Credit Score System, there is no choice.’
What about rating people? We seem concerned when the government does it, yet we frequently do it ourselves through applications like BlaBlaCar or Airbnb. Are we as far away from these systems as we think we are?
Joshy: ‘Not really. Applications like Uber, where the customer has to rate the driver and the driver has to rate the customer, are already asymmetrical systems. For the customer, a low rating might mean missing a few rides; for a driver, a low rating means losing his job. I think if we continue using these types of apps, eventually we won’t even notice when these bigger systems come into place. We will already be okay with it.’