35C3 preroll music Herald Angel: Alright. Then it's my great pleasure to introduce Toni to you. She's going to talk about "The Social Credit System," which kind of feels to me like a Black Mirror episode coming to life. So I'm slightly nervous and really curious what we're going to learn today. Please give a huge, warm round of applause and welcome Toni! Applause Toni: Good morning, everyone! Before I get into my talk, I'm just going to announce the Chinese translation stream, for everyone who doesn't speak English. speaks Chinese Applause Because today's talk is about China, we figured it would be good to have it in Chinese as well. I'm going to be talking today about the Social Credit system in China — where "the" Social Credit system that you always hear about in Western media actually doesn't really exist, and most of my talk will be about all the things we don't know, which could fill an entire hour or even more. I'm just going to focus on some of the things most interesting to me. First of all, a little bit about me. I'm an economist, but I'm not only concerned with money. I look at economics as the study of incentives, which means that what I'm really interested in is how humans respond to different kinds of incentives. I don't believe that humans are completely rational, but I do believe that they try to maximize what they think is their best interest. Some words about me: I studied math, economics and political science in a couple of different cities around the world. I spent 19 months in China overall. Most recently I was there in July on a government scholarship, which was really, really interesting, because I had read all of these Western newspaper articles about the Chinese Social Credit system, and while there I went to a pretty good university and asked people: So what do you think about this system?
And most of them basically looked at me blankly and said: What system? I haven't even heard of this! That was an interesting experience for me, because in the West it's portrayed as this huge, all-encompassing system, while in China most people who aren't directly in touch with it actually don't know anything about it. I'm broadly interested in the impact of technology on society, life, and the economy, and in my free time I do a lot of data science and machine learning with Python and R. So I thought it was quite interesting to look at the Social Credit system from this point of view as well, because you always hear that it's this big-data initiative, and then, when you come down to it, what you actually see is that they don't use machine learning all that much. They have, basically, a rule-based catalog: if you do this, you lose 50 points; if you do that, you lose another 50 points. And then they have a lot of people reporting on other people's behavior. I'm going to talk about how exactly that looks later on, but I was very, very surprised: after reading a lot of Western newspaper articles that were basically "oh, this is a big Orwellian dystopia powered by big data," you read what's actually happening, and they have huge lists of "if you jaywalk, you get 10 points deducted," this kind of thing. If you want to get in touch with me you can use Twitter, or you can use either my professional or my personal e-mail address, which you can both see there. If you have any thoughts on this, or are interested in a little more, I can give you more resources as well, because obviously today's talk will only be scratching the surface. So, perceptions of the Social Credit System. One of the interesting things I mentioned before is how the perception in the West and in China is completely different.
So in the West — this image is from financialtimes.com — you see this huge, overwhelming guy who basically puts every Chinese person under a microscope. They're all kind of hunched over, everyone has a score attached to them, and they seem pretty sad: a very, very Orwellian picture. Whereas in China — this is actually from Chinese state media — what it says is: with this new system we can all live in harmony and trust each other. And interestingly, Chinese people actually believe that, to some degree. They believe that technology will fix the current problems in society, especially because in China, currently, trust is a rare commodity, and that this new system will lead to more efficiency, more trust, and a better life. And I have a really, really interesting quote from a Western scholar that summarizes the Western perspective: "What China is doing here is selectively breeding its population to select against the trait of critical, independent thinking. This may not be the purpose, indeed I doubt it's the primary purpose, but it's nevertheless the effect of giving only obedient people the social ability to have children, not to mention successful children." This plays on the idea that if you have a low score — currently, in the cities that are already testing this system — your children can't attend good schools, you cannot take trains or planes, you cannot book good hotels. Your life just becomes very, very inconvenient. And this is by design; this is the plan. The Chinese government puts it a little differently: the idea is to change people's conduct by ensuring they are closely associated with it. One of the main things about this system is that there isn't very much new data being generated for it.
Instead, what's happening is that all the existing data already collected about you is combined into one big database entry per person, keyed by your ID number. In China, once you're born, you get an ID number, similar to a Social Security number in the U.S. — we don't really have a comparable concept in Germany. It used to be that your ID number was only necessary for government matters, but now you need it to open a bank account, and you need it to buy a cell phone — even a prepaid one. So all the online activity that happens through your cell phone is associated with your ID number, which means you can't really do anything anonymously, because it all leads back to your ID number. There are a couple of predecessor systems, some going back to the 1990s, that are supposed to be integrated into the new system. Two of them are blacklists. One is a court blacklist. Courts in China work a little differently: they do give you fines, as courts in other countries do, but they also like ordering apologies. For example, if you're a company and your food safety wasn't up to par, you have to pay a fine — but in addition you also have to publish an apology letter in the newspaper, saying how very sorry you are that this happened, that it was a moral failing on your part, and that it won't happen again. If you don't do that, you go on this blacklist. Similarly, if you take out a line of credit and don't make any payments for three months, you go on the debtors blacklist. If you're on such a blacklist — which, again, is associated with your shēnfènzhèng, your ID number — you cannot take trains, you cannot take planes.
Your life basically becomes very, very inconvenient: your children can't go to good public schools, they can't go to private schools, they can't go to universities — all of these issues suddenly come up. There is also a company database called Credit China, which is similar to the public debtors blacklist, but is essentially a credit score for companies. And then there's the Credit Reference Center of the People's Bank of China, which is a credit score for individuals — it was supposed to be like the German Schufa or the U.S. FICO score. But one of the big problems in China is that a lot of people aren't part of the formal economy. A lot of people are migrant workers: they get their money in cash, they don't have bank accounts, they don't pay rent or utilities or anything like that because they live in the countryside. They own their own home, which they built themselves, so they didn't even finance it — and the home isn't even officially theirs, because in China you can't actually own property; the government leases it to you. So there were a lot of people not covered by this system. The last figure I had was that less than 10 percent of Chinese adult citizens were actually in the system and had any sort of exposure to banks, which is very, very little. And that meant people couldn't get credit, because banks would only lend to people who were in the system, or people where they had some sort of handle on whether they would be paid back. Now, the implementation details of the new system are very, very scarce, but the basic idea is that Chinese citizens are divided into trustworthy individuals and what the Chinese call "trust-breakers." Sometimes there are five different groups, sometimes two, but in general there's a cut-off: above this line you're good, below this line you're bad.
This is a graphic from the Wall Street Journal that shows some of the inputs that go into the system. And one of the things we see is that the inputs are crazily varied. Do you pay income taxes? Do you pay your utility bills on time? Do you respect your parents — however they measure that? Do you have a criminal record? Do you pay for public transportation, or have you been caught not paying? What about your friends? Do you retweet or use WeChat to distribute information against the party — which they call "reliability"? In actuality it's not about whether the information is factual; it's about whether it's against the party or not. Where do you buy and what do you buy? Apparently buying diapers is better for your score than buying video games — because, you know, if you buy video games you're obviously not very responsible, while if you buy diapers you have a kid and are conforming to the societal ideal. And then your score is supposed to feed into all these different categories: you're supposed to have better access to social services if your score is good, and better access to internet services. In theory, the idea is that at some point, if your score is too bad, you're not allowed to use WeChat anymore, not allowed to use Alibaba anymore. You can't become a government worker. You cannot take planes or high-speed trains. You cannot get a passport. And your insurance premiums will go up. So it's supposed to be this really, really big, overwhelming system. But the stated goals, in their own words: "it's a shorthand for a broad range of efforts to improve market security and public safety by increasing integrity and mutual trust in society." So one idea is to allocate resources more efficiently. Resource allocation in China is a pretty big problem, because people grow up with: there are 1.3 billion people, so everything is always going to be scarce.
People grow up with this idea that everything is very, very scarce, and current distribution strategies — which are mostly financially based, but also often guanxi-based — don't really seem fair. For example, public transport in China is highly subsidized, which means the price does not reflect true scarcity. So in theory it's first come, first served; in practice there are people buying up all the tickets for, say, the high-speed train from Shanghai to Beijing and then selling them at a profit, or selling them to certain companies with good ties to the government. That seems very unfair. So the new system is supposed to distribute these resources more fairly and more efficiently. The other goal is restoring trust in people. Perceived interpersonal trust and trust in institutions is extremely low in China. If you're from Germany, you might have heard that there are Chinese gangs basically buying up German milk powder and selling it in China. This is actually happening, because in 2008 there was a big scandal with laced milk powder, and ever since then, anyone who can afford it does not use Chinese milk powder — they don't trust the government, the regulations, or the firms enough to buy it — so they import it. And the big irony is: sometimes this milk powder is produced in China, exported to Germany, and then exported back to China. The Social Credit system is then supposed to identify those who deserve trust. The third point is a sort of re-education of people. The idea is: they want to remake people into the image the Communist Party thinks people should conform to. And in addition to the punishments and rewards, one way this could work is through the feeling of being surveilled.
Because you can't do anything anonymously, you will automatically adapt your behavior, because you know someone is watching you all the time. This is actually how a lot of the Chinese firewall works: most of the more educated people I know have ways to circumvent it, but they also know they're always being watched — so they don't; they censor themselves. As I said before, allocation of scarce resources so far happens mainly through financial and guanxi channels. Guanxi is basically an all-permeating network of relationships with a clear status hierarchy. If I attend a school, everyone who also attended this school will be part of my guanxi network. The idea is that within my in-group we trust each other and do favors for each other, while everyone outside my immediate group I don't trust and don't do favors for. In some ways the guanxi system is currently a substitute for formal institutions in China. For example, if you want a passport right now, you can of course apply through regular channels, which might take months and months — or you can apply through knowing someone who knows someone, which might take only two days. Whereas in Germany you have very regular, formal institutions, in China they still use guanxi. But increasingly, especially young people find guanxi very unfair, because so much of it comes down to where you went to school — which is determined by where you were born — who your parents are, and all these things. Another thing is important to understand: the system works through public shaming. In a lot of Western societies we can't really imagine that — I wouldn't really care if my name appeared in a newspaper as someone who jaywalked, for example. It would be: oh well, that's okay.
But in China this is actually a very, very serious thing — saving face is very, very important. When I went to school there, we had a dormitory, an all-foreigners dormitory, where the staff responsible for it felt that the foreigners were not behaving the way they should. So their idea was to put the names, pictures, and offenses of the foreigners in the elevator, to shame them publicly. For example, if you brought a person of the opposite sex to your room, they would put your name, your offense and your room number in the elevator. One posting simply read: "She brought alcohol." And of course this didn't work, because for a lot of Western people it was basically: "oh well, I'm going to try to be on there as often as possible, because this is a badge of honor for me" — while the Chinese students figured: "this is really, really shameful and I'm losing face." So it didn't work at all. But this is the mindset behind a lot of these initiatives. As I said, there's a lot we don't really know about what's going to happen. One way to see what might happen is to look at pilot systems. Ever since the Communist Party took hold, the Chinese government has done a lot of policy experimentation. Whenever they try a new policy, they don't roll it out everywhere; they choose different pilot cities or pilot districts — "this is the district where I'm going to try this system, and I'll try another system in another district or city." This is also what they're doing for the Social Credit system. Now, I have three systems that I looked at intensively for this presentation — overall there are about 70 that I know of: the Suining system (Suining is a city in China), the Rongcheng system (another Chinese city), and Sesame Credit.
Sesame Credit is a commercial system from Alibaba — I assume everyone knows Alibaba; they're basically the Chinese Amazon, except bigger, with more users, and actually more profitable. And they have their own little system. One of the problems with this kind of system, which I found when I tried modeling it, is that it's a very, very complex system, and small changes in input change the output significantly. Usually when they run these pilots, they have a couple of them, choose the one that works best, and roll it out everywhere. But for something with this many complex interactions, that might not be the best way to do it. The Suining system is considered the predecessor of all current systems. It had a focus on punishment, and it was quite interesting: at the beginning of the trial period they published a catalog of scores and consequences. Here is an example, taken from that catalog. Everyone started with 1000 points. If you took out bank loans and didn't repay them, 50 points were deducted. If you didn't pay back your credit cards, also 50 points. Tax evasion: 50 points. Selling fake goods: 35 points deducted. The system was abolished, I think in 2015 or 2016, because Chinese state media and a lot of internet citizens talked about what an Orwellian system it was — very centralized, with everything you do recorded centrally.
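To make the flavor of such a rule catalog concrete, here is a minimal sketch in Python. The deduction values are the ones just mentioned from the Suining catalog; the dictionary keys and function names are my own invention.

```python
# Minimal sketch of a Suining-style rule-based score. Everyone starts
# at 1000 points, and each listed offence deducts a fixed amount.
# Deduction values are the examples from the published catalogue;
# the event names are invented labels.
DEDUCTIONS = {
    "unpaid_bank_loan": 50,
    "unpaid_credit_card": 50,
    "tax_evasion": 50,
    "selling_fake_goods": 35,
}

START_SCORE = 1000

def suining_score(offences):
    """Apply the fixed deduction for each recorded offence."""
    score = START_SCORE
    for offence in offences:
        score -= DEDUCTIONS.get(offence, 0)
    return score

# A citizen who evaded taxes and sold fake goods:
print(suining_score(["tax_evasion", "selling_fake_goods"]))  # 915
```

The point of the sketch is that there is no statistics or learning involved at all: the score is a fixed lookup table applied to reported events.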
But Creemers writes: "Nonetheless, the Suining system already contained the embryonic forms of several elements of subsequent social credit initiatives: The notion of disproportional disincentives against rule breaking, public naming and shaming of wrongdoers, and most importantly, the expansion of the credit mechanism outside of the market economic context, also encompassing compliance with administrative regulations and urban management rules." One thing that is difficult, especially for German speakers, is that "credit" in Chinese, xìnyòng, means credit as in "loan," but also credit as in "trust." The Social Credit System is one way of trying to conflate those two — economic credit and trust credit — into one big system. But the Suining system basically failed. So they adapted it, and are now trying a new kind of system: the Rongcheng system. Whenever you read a Western newspaper article on the social credit system, chances are the reporters went to Rongcheng, because the city just received a couple of awards from the Chinese government for being so advanced at this social credit thing. But it's difficult to call it "one system," because it's actually many, many intertwined systems. There is a city-level system, where city-level offenses are recorded — for example tax evasion, with rules like: if you evade taxes, your score goes down 50. But then, if you live in one neighborhood, your score might go up for volunteering with the elderly, while in another neighborhood it might go up for, say, planting trees in your garden or backyard. So depending on your neighborhood, your score might be different. If you work for a taxi cab company, for example, they also have their own little score system: your score might go up if you get good reviews from your passengers, and down if you don't follow traffic rules — these kinds of things.
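As a rough sketch of how such intertwined rule tables could feed a single score — the specific rules and point values here are invented for illustration, since the actual tables aren't published:

```python
# Sketch of a Rongcheng-style multi-level structure: city-wide rules
# plus neighbourhood- and employer-specific rule tables, all feeding
# one score per person. All rules and values are invented examples.
CITY_RULES = {"tax_evasion": -50}
NEIGHBOURHOOD_RULES = {
    "north": {"volunteering_with_elderly": +5},
    "south": {"planting_trees": +5},
}
EMPLOYER_RULES = {
    "taxi_co": {"good_passenger_review": +2, "traffic_violation": -5},
}

def central_score(neighbourhood, employer, events, start=1000):
    """Merge every rule table that applies to this person, then
    look each reported event up in the merged table."""
    rules = dict(CITY_RULES)
    rules.update(NEIGHBOURHOOD_RULES.get(neighbourhood, {}))
    rules.update(EMPLOYER_RULES.get(employer, {}))
    return start + sum(rules.get(e, 0) for e in events)

# A taxi driver in the north: planting trees earns nothing here,
# because that rule only exists in the south.
print(central_score("north", "taxi_co",
                    ["good_passenger_review", "planting_trees"]))  # 1002
```

Note that the same event list yields different scores depending on where you live and work — which is precisely why scores from different districts are hard to compare.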
There are designated scorekeepers at each level. Each district chooses a couple of people responsible for passing information about who did what up to the next higher level. There is supposed to be an official appeals procedure — whenever your score changes, you're supposed to be notified — but apparently that's not happening for most people at this point. Again, it's a system of data sharing, and one thing they haven't really disclosed yet is what kind of data is shared. Are they only sharing the points — so if I'm in a district and plant some trees, does the central system get the information "person A planted some trees," or does it get "person A got 5 points"? We don't know at this point, and it would make a big difference for how the system could be used. But the end result, at this point, is that there is one score: one central score, fed by all these different smaller systems. Currently about 85 percent of people are between 950 and 1050 — you start off with a thousand, and those are basically the normal people. Anyone above 1050 is considered trustworthy, and anyone below 950 is considered a trust-breaker. And, as I said before, with the naming and shaming: what you can see here is a billboard with the most trustworthy families in Rongcheng — the families with the highest scores. Sesame Credit is a little different. It's the only system that actually uses machine learning and artificial intelligence to determine the outputs. Rongcheng, for example, does use computer vision: cameras that try to recognize you when you jaywalk.
And when they recognize you jaywalking, you get a short SMS: "we just saw you jaywalking, your score is now dropping." But how the score develops in response to your jaywalking isn't determined by machine learning or artificial intelligence; it's determined by rules — one instance of jaywalking deducts five points, and this is stated somewhere. Sesame Credit doesn't work like that. Instead it uses a secret algorithm. I talked to some people who work for Sesame Credit, or for Alibaba, and the way they described it was: they clustered people based on behavior, gave scores to these clusters, and then basically reverse-engineered their own score using machine learning, so that whenever something new happens, you can move to a different cluster. Sesame Credit was actually refused accreditation as a credit score in 2017, so banks are not allowed to use it to decide whether to give you a loan. Because Sesame Credit is quite ingenious: obviously Alibaba wants to keep you within their platform, so if you buy using Alibaba and Alipay, your score goes up; if you buy using WeChat Pay, a competing platform, your score goes down. It uses many of the same reward mechanisms as the official government systems. This is just an illustration of the possible scores: apparently they range between 350 and 850, with five different levels in Chinese — at 385 you're a "trust-breaker" or "missing trust," while at 731 "trust is exceedingly high." So, one way I tried to approach this issue was through agent-based modeling. The Social Credit System operates at the individual level, but what I'm really interested in is the societal-level consequences: if everyone gets this score, what does that mean for society?
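The cluster-then-score approach described for Sesame Credit can be sketched roughly like this — the features, centroids, and cluster scores are entirely invented, since the real algorithm is secret and presumably far more elaborate:

```python
# Sketch of cluster-then-reverse-engineer scoring. Step 1: users have
# already been grouped into behavioural clusters. Step 2: each cluster
# carries a score. Step 3: any (new) behaviour is scored by assigning
# it to the nearest cluster, so fresh data moves a user between
# clusters rather than triggering a fixed rule.

def nearest_cluster(user, centroids):
    """Index of the behaviourally closest cluster (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(centroids)), key=lambda i: dist(user, centroids[i]))

# Invented centroids over two invented features:
# (share of purchases made via Alipay, on-time payment rate)
centroids = [(0.9, 0.95), (0.5, 0.7), (0.1, 0.3)]
# Invented scores per cluster, kept inside the published 350-850 range.
cluster_scores = [780, 600, 400]

def sesame_like_score(user):
    return cluster_scores[nearest_cluster(user, centroids)]

print(sesame_like_score((0.85, 0.9)))  # lands in the top cluster: 780
print(sesame_like_score((0.2, 0.25)))  # lands in the bottom cluster: 400
```

Once clusters and their scores exist, scoring is just nearest-cluster assignment — which is why users who try to game one visible behavior can still see their score move in unexpected directions.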
And agent-based modeling works quite well for that, because it allows us to imbue agents with some sort of rationality — but a bounded rationality. What does bounded rationality mean? Usually in economics, people assume agents are completely rational: they are profit maximizers and have all the information. In reality, agents don't have all the information and struggle to keep everything in mind, so a lot of the time they won't choose the best thing in the world, but the best thing that they see. Bounded rationality allows us to account for this — for heuristics and so on. What I did was take the propensity for specific behaviors from current state-of-the-art research, mostly from behavioral economics. For example, I looked at tax evasion — who is likely to evade taxes in a system — and then obviously there was some stochastic element, some chance, but the distribution I chose is grounded in the current research. I also checked that my model produces results similar to the Rongcheng system, which I modeled at the beginning: on average, 87% of my agents end up with a score within 10 percent of the original, which matches the data that Rongcheng city actually publishes. Now, for the most part, I compared design choices along two axes: a centralized versus a multi-level system, and a rule-based versus a machine-learning system. In a centralized system, all the information is kept centrally, and everyone in Rongcheng — or wherever — has exactly the same scoring opportunities. A centralized system makes expectations very clear, which is good. But at the same time, acceptance among the population was really, really low, as they found during the Suining experiment. And there's also the problem of a single point of failure.
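A minimal version of such an agent-based model might look like this — all parameters here are invented for illustration; the actual model described above was calibrated against the behavioral-economics literature:

```python
import random

# Toy agent-based sketch: each agent has a propensity to evade taxes
# (a stochastic draw against an evasion probability), the scoring
# system deducts points when the agent is caught, and boundedly
# rational agents adapt only in response to penalties they actually
# experience, not to full knowledge of the rules.
random.seed(42)  # reproducible run

class Agent:
    def __init__(self, evasion_prob):
        self.evasion_prob = evasion_prob  # behavioural propensity
        self.score = 1000                 # everyone starts at 1000

    def step(self, detection_prob=0.5, penalty=50):
        """One period: maybe evade; if caught, lose points and adapt."""
        if random.random() < self.evasion_prob:   # chooses to evade
            if random.random() < detection_prob:  # gets caught
                self.score -= penalty
                # Bounded rationality: react to the penalty just felt.
                self.evasion_prob *= 0.8

agents = [Agent(evasion_prob=random.uniform(0.0, 0.3)) for _ in range(1000)]
for _ in range(20):  # simulate 20 periods
    for a in agents:
        a.step()

share_normal = sum(950 <= a.score <= 1050 for a in agents) / len(agents)
print(f"share of agents in the 950-1050 band: {share_normal:.0%}")
```

Even this toy version reproduces the qualitative pattern from the pilots: a large "normal" band around 1000 and a tail of repeatedly caught agents drifting below it.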
Who decides the central catalog? Depending on who holds power, such a system just reproduces existing power structures: because there is this central catalog, the same people who are in power centrally decide on a scoring mechanism that works very well for them, so that they and their families will have high scores. A multi-level system has the advantage that local adaptation works, and there are many points of failure instead of one. But in my model, when I allowed localities to set their own rules, they started competing. This district of Rongcheng and that district of Rongcheng compete for the people they want to attract, and suddenly you have a race to the bottom, where people move to where they won't be prosecuted — to places with fewer cameras, for example. At the same time, there are many points of failure, especially the way it's currently set up, with people reporting data to the next higher level. And a lot of the time — we've actually seen this in Rongcheng — they reported data on people they didn't like more often than on people they did like, or their families got better scores than strangers. So it also reproduces these biases. The rule-based system has the advantage that people were more willing to adapt their behavior, because they actually knew what they needed to do. But the score didn't really correlate with the characteristics the system's designers actually cared about. And then there's the machine-learning system — you know how, in Germany, we don't really know the Schufa algorithm; I, for example, don't know exactly what I could do to improve my Schufa score. It's a similar situation in China with the Sesame Credit score.
A lot of people say: "I really want to adapt my behavior to improve my score, but when I tried doing that, my score actually got worse." And you can get different biases, which I'll talk about in a bit. There's also this big problem of incentive mismatch. Take the decentralized, rule-based system like Rongcheng, which is the system I analyzed the most. Why? Because I believe it's the system we're moving towards right now: Rongcheng won a lot of awards, and the way the Chinese government usually works is that they try pilots, choose the best couple of systems, give them awards, and then roll the approach out nationwide. So I assume the eventual system will be similar to the Rongcheng system. Now, one problem I actually saw in my simulation was this possible race to the bottom. There's also a conflict of interest in those who set the rules: a lot of the time, your company, together with your party leaders, decides on the rules of the scoring system — but the scores of all your employees determine your company's score. If you employ a lot of people with high scores, you get a better score. So you have an incentive to hand out high scores and make sure everyone gets them. At the same time, the government has an incentive for scores to be comparable. So there's a lot of incentive mismatch. The government also has an incentive to keep false negatives down — the way the Chinese system currently works, they emphasize catching trust-breakers more than rewarding trustworthy people. False positives are less important to them, but false positives erode trust in the system, and they lead to a lot less behavioral adaptation.
I was actually able to show this using nudging research: as soon as you introduce an error probability — you can be caught for something you didn't do — the probability of people changing their behavior based on the score actually goes down. And in Rongcheng, one of the perverse things they're doing is that you can donate money to the party, or to party-affiliated social services, and this gives you points — a kind of indulgence system. Which is quite interesting, especially because a lot of these donation schemes work linearly: you donate 50,000 renminbi and get 50 points, then you donate another 50,000 renminbi and get another 50 points. So you can basically donate a lot of money, behave however you want, and still have a good score. Trust in other people can actually go down even further under this system, because suddenly you only trust them because of their scores — the current system is set up so that you can look up the score of everyone you want to work with, and if their score isn't high enough, suddenly you don't want to work with them. Trust in the legal system can also decrease. Why? Because trust in the legal system in China is already low, and a lot of these things, like jaywalking, are already illegal in China, as they are here — but no one cares. And suddenly you have this parallel system that punishes you for whatever. Why not just try to fix the legal system instead? That would be my approach. Illegal activity could also suddenly move more offline — and this is quite interesting: in countries that have moved towards mobile payments and away from cash, you see fewer robberies, but you don't actually see less crime. Instead you see new types of crime: more credit card fraud, more phone robberies, these kinds of things.
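The arithmetic of that donation mechanic is worth spelling out. The 50,000-renminbi-per-50-points rate is the figure just mentioned; the function names and everything else are an invented illustration:

```python
# Sketch of the "indulgence" problem with linear, repeatable
# donations: because money converts to points at a flat rate,
# every offence has a fixed price.
POINTS_PER_DONATION = 50
RMB_PER_DONATION = 50_000

def points_from_donations(rmb):
    """Flat-rate conversion: every full 50,000 RMB buys 50 points."""
    return (rmb // RMB_PER_DONATION) * POINTS_PER_DONATION

def price_of_offence(deduction_points):
    """RMB needed to fully offset a given point deduction
    (donations only come in whole 50,000 RMB increments)."""
    donations_needed = (deduction_points // POINTS_PER_DONATION
                        + (deduction_points % POINTS_PER_DONATION > 0))
    return donations_needed * RMB_PER_DONATION

print(points_from_donations(100_000))  # two donations buy 100 points
print(price_of_offence(50))            # a 50-point offence costs 50,000 RMB
```

Because the conversion is linear and repeatable, the "punishment" becomes a price list rather than an incentive — which is exactly what makes it an indulgence system.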
And this is also where things could move in the Chinese case. One major problem is also that this new system – I've talked a little bit about this – can introduce a lot of new bias, and reproduce existing bias even more. For example, China is a country of 55 minorities. The Han are a big majority, about 94 percent of the population. And computer vision systems, it has been shown, are really, really bad at discriminating between individuals in smaller ethnic groups. In the U.S., most computer vision systems perform worse for African-Americans, they perform worse for women, because almost all of the training sets are male and white, and maybe Asian. In China, all of these systems actually perform worse for ethnic minorities, for the Uyghurs, for example. And one way that they could abuse the system – what they're also doing already in Xinjiang – is to basically just identify, "oh, this is a person from a minority, I'm just going to go and check him or her more thoroughly." This is actually what happens in Xinjiang. If you're in Xinjiang and you look like a Turkic person, say from Turkmenistan, you are a lot more likely to be questioned. You're a lot more likely to be stopped, and they ask you, or require you, to download spyware onto your phone. This is what currently happens, and this new kind of system can actually help with that. I've said that it can reproduce these kinds of power structures, and obviously we all know neutral technology doesn't really exist, but in the social credit case they don't even pretend otherwise – they say, "well, this is neutral technology and it makes everything a lot better," but actually it's the people currently in power who decide what earns you points and what deducts points. Another problem: currently the entire system is set up so that it all goes together with your shēnfènzhèng, with your I.D. card.
What if you don't have an I.D. card? That's foreigners, for one. But it's also people in China who were born during the one-child policy and were not registered. There are quite a lot of them, actually. They're not registered anywhere, and suddenly they can't do anything, because they don't have a score; they can't get a phone, they can't do anything, really. And part of the push with this social credit system is actually to move away from cash. So you need to use your phone to pay, but to get a phone you need your shēnfènzhèng. If you don't have a shēnfènzhèng, well, tough luck for you. And currently the system in Rongcheng is set up in a way that you can check other people's scores and you can also see what they lose points for. So you can actually choose to discriminate against people that are gay, for example, because they might have lost points for going to a gay bar, which you can lose points for. Another big issue, currently, is data privacy and security. Personal data is grossly undervalued in China. If you ask a Chinese person, "what do you think, how much is your data worth?", they say, "what data? I don't have data." And currently, the way it works is, if you have someone's ID number, which is quite easy to find out, you can actually buy access to a lot of personal information for a small fee. So you pay about 100 euros and you get all hotel bookings of the last year, you get information on who booked these hotels with them, you get information on where they stayed, you get train bookings, you get access to all of the official databases for this one person. And for another 700 renminbi you can actually get live location data, so you can see where this person is right now, or where his or her phone is right now – but if you've ever been to China, you know that where the phone is, the people usually aren't far.
Supchina actually did an experiment where a couple of journalists tried buying that, because these kinds of services are actually offered on WeChat, pretty publicly. And you can just buy them, quite easily. One additional thing that I looked at, which is quite interesting: the idea of credit here is twofold. Credit is trust credit, but credit is also loan credit. So what if credit institutions actually use this unified credit score to determine credit distribution? The idea is that it's supposed to lead to reduced information asymmetry, so fewer defaults and overall more credit creation. New people are supposed to get access to credit, and there's supposed to be less shadow banking. But what actually happens? I'm not going to talk about how I set up the model, just about my results. If you have this kind of score that includes credit information but also includes measures of being morally good, what you get is, in the beginning, about 30 percent more agents get access to credit, and especially people that previously had no credit access suddenly have credit access. But the problem is that this social credit score, which mixes all of these different issues, correlates only very, very weakly with repayment ability or repayment willingness, and thus suddenly you have all of these non-performing loans. And what we see is: banks give out fewer loans because they have so many non-performing loans, then the non-performing loans are written off, and suddenly banks give out more loans again. So you have this oscillating financial system, where you give out a lot of loans, a lot of them are non-performing, then you give out a lot of loans again. And this is very, very vulnerable to crisis.
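The lending cycle she describes can be sketched with a two-line feedback loop: banks cut new lending in proportion to last period's non-performing loans, while weak screening keeps producing fresh defaults. All parameter values are illustrative assumptions, not calibrated to her actual model.

```python
# Toy version of the oscillating lending dynamics described above:
# a score only weakly correlated with repayment leads to waves of
# non-performing loans (NPLs), write-offs, and renewed lending.
# Parameters are illustrative assumptions, not the speaker's model.

def simulate(periods=10, capacity=100.0, default_rate=0.4, sensitivity=2.0):
    """Each period, last period's NPLs are written off, but banks cut
    new lending in proportion to them."""
    npl = 0.0
    path = []
    for _ in range(periods):
        new_loans = max(0.0, capacity - sensitivity * npl)
        npl = new_loans * default_rate  # weak screening -> fresh NPLs
        path.append(new_loans)
    return path

print([round(x, 1) for x in simulate()])
```

With these parameters, lending overshoots, collapses as NPLs pile up, then overshoots again as they are written off: exactly the oscillation that makes the system fragile if a real crisis hits during a high-NPL phase.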
If you have a real economic crisis during a time when non-performing loans are high, then a lot of banks will actually default, which is very, very dangerous for a financial system as nationalized as the Chinese one. Now, what are some possible corrections? You could create a score that is basically the same as the Schufa score, so that it looks only at credit decisions. But then you lose a lot of the incentives of the social credit score, if the social credit score doesn't matter for credit distribution anymore. Another option, and this is, I think, the more likely one, is that you have a blacklist for people that have not repaid a loan in the past. So you basically get one freebie, and afterwards, if you didn't repay your loan in the past, then you will not get a loan in the future. You will still be part of the social credit system, and your social credit score will still be important for all of these other access issues, but it won't be important for access to loans anymore, once you've been on this blacklist. Which is probably something that the Chinese government could get behind, but it's also more effort to take care of; then you have to think about, "well, you can't leave them on the blacklist forever, so how long do you leave them on the blacklist? Do they have to pay back the loan and then they get off the blacklist? Or do they have to pay back the loan and then stay out of default for a year, or for five years?" There are a lot of small decisions that, in my opinion, the Chinese government hasn't really thought about up until now, because they're basically doing all these pilot studies, and all of these regional governments are thinking about all these small things, but they're not documenting everything that they're doing. So once they roll it out – they want to roll it out by 2020, by the way, nationwide – there's a pretty big chance, in my opinion, that they'll have a lot of unintended consequences.
A lot of things that they haven't thought about, and that they will then have to look at. So, I believe that some sort of system is likely to come, just in terms of how much energy they've expended on this one, and for the Chinese government, for the party, it would at this point mean losing face if they did not introduce any such system, because they've been talking about it for a while. But most likely, it will be a kind of decentralized data-sharing system. And when I ran my simulation... By the way, I will make my code public; I used some proprietary data for my model, and I still need permission to publish that. Once I publish it, I will also tweet it, and put it on GitHub for everyone to play around with, if you want to. Some of the implementation details that were very important in determining model outcomes were: do we have a relative or an absolute ranking? So far, all of the systems I looked at had absolute rankings, but there's a point to be made for relative rankings. Do we have one score, where, basically, if you're a Chinese person you get one score? Or do we have different sub-scores in different fields? Do we have people reporting behavior, or do we have automatic behavior recording? How do you access other people's scores? How much information can you get from other people's scores? Currently, if someone is on a blacklist, for example, and you have their ID number, again, you can put it into this blacklist, and it will say, "oh, this person is on this blacklist for not following this judge's order," and then it says what kind of judge's order it was. So, most likely, it will be something like this. The idea is that the Social Credit system isn't only for individuals, but also for firms and for NGOs. So, what kind of role will firms play in the system? I haven't looked at that in detail at this point, but it would be very interesting.
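The absolute-versus-relative ranking choice she lists can be made concrete with two small grading functions. The scores and thresholds are made-up illustrations; the point is only that under a relative ranking, one person's gain is necessarily another person's loss.

```python
# Sketch of the absolute vs. relative ranking design choice.
# Scores and thresholds are hypothetical illustrations.

def absolute_grade(score, threshold=1000):
    """Absolute ranking: your grade depends only on your own score."""
    return "trustworthy" if score >= threshold else "untrustworthy"

def relative_grade(score, population, top_share=0.5):
    """Relative ranking: your grade depends on where you stand in the
    population, so others improving their scores can demote you."""
    rank = sum(s <= score for s in population) / len(population)
    return "trustworthy" if rank >= 1 - top_share else "untrustworthy"

scores = [900, 950, 1000, 1050, 1100]
print(absolute_grade(950))           # untrustworthy
print(relative_grade(1100, scores))  # trustworthy
```

Under `absolute_grade`, everyone can in principle reach the "trustworthy" band; under `relative_grade`, a fixed share of the population is always graded badly no matter how everyone behaves, which changes the incentives considerably.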
Another idea that Western people often talk about is: do people also rank each other? Currently, that's not part of the system in China, but it might be at one point. And lastly, where does the aggregation happen? I've said that a lot of it is actually data sharing in China. So what kind of data is shared? Is the raw data shared – "person A did something"? Or is the aggregated data shared – "person A got this score"? At this point, most of the time, it is actually the raw data that is shared, but that also has these data privacy issues, of course, that I've talked about. OK, perfect! Now there's 10 more minutes. Thank you for your attention! If you have questions or remarks, you can ask them now, or you can catch me later. You can tweet at me or send me an e-mail, whatever you're interested in. Thank you very much! applause Herald Angel: Hello! As Toni said, we have 10 minutes left for questions. If you have a question in the room, please go crouch in front of our five microphones. If you're watching the stream, please ask your questions through IRC or Twitter, and we'll also try to make sure to get to those. Let's just go ahead and start with mic one. Question: Good! Thank you very much for this beautiful talk. I was wondering, how did the Chinese government, companies, and most of all, the citizens themselves respond to you doing this research? Or, let's put it differently: if you had been in the system yourself, how would your research have affected your social credit score? laughter Answer: So, um... There are actually two different responses that I've seen. When I talked to the government themselves, because I was there on a government scholarship, and mentioned that I'm really interested in this, they basically said, "oh well, this is just a technical system. You don't really need to be concerned with this. It is not very important. Just, you know, it's just a technicality. It's just for us to make life more efficient and better for everyone."
So I assume my score would actually go down from doing this research. But when I talked to a lot of people at universities, they were very interested in my research, and a lot of them mentioned that they didn't even know the system existed! Herald: Before we go to a question from our signal angel, a request for all the people leaving the room: please do so as quietly as possible, so we can continue this Q and A. The signal angel, please! Signal Angel: Jaenix wants to know, is this score actually influenced by association with people with a low score? Meaning, is there any peer pressure to stay away from people with bad scores? Answer: The Sesame Credit score definitely is influenced by your friends' scores. The Rongcheng score, so far, apparently is not influenced, but it is definitely in the cards, and it is planned that it will be part of this. I think WeChat, which is the main platform – it's sort of like WhatsApp, except it can do a lot more – WeChat is still not connected to the social credit score in Rongcheng. Once they do that, it will most likely also reflect your score. Herald: All right, let's continue with mic 3. Q: I have a question about your models. I'm wondering, what kind of interactions are you modeling? Or actions – what can the agents actually do? You mentioned moving somewhere else. And what else? A: Okay, so the way I set up my model was, I set up a multilevel model, so I looked at different kinds of levels. I started out with, basically: they can evade taxes, they can get loans and repay loans, they can choose where to live, and they can follow traffic rules or not follow traffic rules. These were, sort of, four big issues that were mentioned in all of the different systems, so I started out with these issues and looked at: what kind of behavior do I see?
I used some research – some friends of mine actually sent out surveys to people and asked them, "well, you're now part of the system. Did your behavior change, and how did it change, depending on your responses, depending on your score, and depending on the score system that exists?" And I basically used that, and some other research on nudging and on behavioral adaptation, to look at how likely it is that someone would change their behavior based on the score. Herald: All right, let's do another question from the interwebs. Q: Yeah, it's actually two questions in one. How does this system work for Chinese people living abroad, or for noncitizens that do business in China? A: Currently, the system does not work for noncitizens that do business in China, because it works through the shēnfènzhèng. You only get a shēnfènzhèng if you're a Chinese citizen or you've lived in China for 10 or more years. So everyone who is not Chinese is currently excluded. Chinese people not living in China, if they have a shēnfènzhèng, are in this system, but there's not a lot of information. Herald: All right, mic 4. Q: Well, we've come a long way since the Volkszählungsurteil. Can you tell us anything about the dynamics in the time dimension? How quickly can I regain credit that was lost? Do you have any observations there? A: In the Suining system, what they actually did was they had a very, very strict period. So if you evaded taxes, your score would be down for two years and then it would rebound. In the Rongcheng system, they did not publish this kind of period. So my assumption is that it's going to be more on a case-by-case basis. Because I looked at the Chinese data, I looked at the Chinese policy documents, and for most of the stuff, they didn't say how long it would count.
For the blacklists, which are kind of the predecessor that we can look at currently, the way it works is, you stay on there until whatever the reason for the blacklist is has been resolved. So you stay on there until you send off this apology that the judge ordered you to send. And then, usually, you still need to apply to get off. So for blacklists, it does not work that you automatically get off. You need to apply, you need to show that you've done what they've asked you to do, and then you can get off the blacklist. And I assume it will be a similar sort of appeals procedure for the system. Herald: All right. Let's go to mic 2. Q: Thank you. I just wanted to ask if looking up someone else's data in detail, like position et cetera, affects your own score? A: Currently, it apparently does not, or at least they haven't published that it does. It might in the future, but most likely it's actually behavior that they want. They want you to look up other people's scores before doing business with them. They want you to, basically, use this to decide who you're going to associate with. Q: Thank you! Herald: All right, do we have another question from the Internet, maybe? Signal: Yes, I do! Standby... The question is, how is this actually implemented for the offline rural population in China? A: Quite easily answered: not at all, at this point. The idea is that by 2020 they will actually have all of this implemented. But the offline rural population in China is getting smaller and smaller. Even in rural villages, you have about 50-60% of people that are online. And most of them are online via smartphone, and their smartphone is connected to the shēnfènzhèng. So it's not very complicated to do that for everyone who is online. For everyone who's offline, of course, this is more problematic, but I think the end goal is to not have people offline at all. Herald: All right.
Let's jump right back to microphone 2, please. Q: Thank you for the very good and frightening talk so far. First, I have to correct you on one point: in Germany we have a similar system, because we have this tax I.D., which is assigned at birth and remains for 30 years after a person's death. So we have a lifelong I.D. A: You're right. I just... I don't know mine, so I figured... dismissive sound Q: No problem! But at least we could establish a similar system, if we had a government that wanted it. A question for you: you mentioned this "guanxi." Is it a kind of social network? I didn't understand it, really. A: Yes, it is a kind of social network, but one that is a lot more based on hierarchies than in the West. So you have people that are above you and people that are below you. And the expectation is that, while it's a quid pro quo, people that are above you in the hierarchy will give you less than you will give to them. Q: Aha, okay. Herald: OK, all right. Unfortunately, we are out of time, so please give another huge round of applause for Toni! applause postroll music subtitles created by c3subtitles.de in the year 2019. Join, and help us!