35C3 - "The" Social Credit System

  • 0:00 - 0:19
    35C3 preroll music
  • 0:19 - 0:23
    Herald Angel: Alright. Then it's my great
    pleasure to introduce Toni to you
  • 0:23 - 0:28
    She's going to talk about "the Social Credit
    System," which kind of feels to me
  • 0:28 - 0:36
    like a Black Mirror episode coming to
    life. So, I'm slightly nervous and really
  • 0:36 - 0:41
    curious what we're going to learn today.
    So please give a huge, warm round of
  • 0:41 - 0:46
    applause and welcome Toni!
    Applause
  • 0:52 - 0:55
    Toni: Good morning, everyone! Before I
    go into my talk,
  • 0:55 - 1:00
    I'm just going to be presenting the
    Chinese translation streams for everyone
  • 1:00 - 1:03
    who doesn't speak English.
  • 1:03 - 1:17
    speaks Chinese
  • 1:17 - 1:23
    Applause
  • 1:23 - 1:27
    So because today's talk is about China we
    figured it would be good to have it
  • 1:27 - 1:32
    in Chinese as well. And, I'm going to be
    talking today about
  • 1:32 - 1:35
    the Social Credit system in China – though
    "the" Social Credit system that you
  • 1:35 - 1:39
    always hear about in Western media
    actually doesn't really exist
  • 1:39 - 1:43
    and most of my talk will actually be
    talking about what we don't know,
  • 1:43 - 1:47
    which could fill an entire hour or even
  • 1:47 - 1:51
    more. But I'm just going to be focusing on
    some of the most interesting things for
  • 1:51 - 1:58
    me. First of all, a little bit about me.
    I'm an economist, but I'm not only
  • 1:58 - 2:03
    concerned with money. I'm kind of looking
    at economics as the study of
  • 2:03 - 2:08
    incentives, which means that what I'm
    really interested in is how humans respond
  • 2:08 - 2:12
    to different kinds of incentives. I don't
    believe that humans are completely
  • 2:12 - 2:18
    rational. But I do believe that humans do
    try to maximize what they think is their
  • 2:18 - 2:24
    best interest. Now, some words about me: I
    studied math, economics and political
  • 2:24 - 2:29
    science in a couple of different cities
    all around the world. I spent overall 19
  • 2:29 - 2:35
    months in China. Most recently I was there
    in July on a government scholarship, which
  • 2:35 - 2:40
    was really, really interesting, because
    while there I read all of these Western
  • 2:40 - 2:44
    newspaper articles about the Chinese
    Social Credit system, and I went to a
  • 2:44 - 2:48
    pretty good university and I asked them:
    So what do you think about this system?
  • 2:48 - 2:52
    And most of them basically looked at me
    blankly, and were like: What system, I
  • 2:52 - 2:57
    haven't even heard of this! So that was
    kind of an interesting experience to me
  • 2:57 - 3:02
    because in the West it's like this huge,
    all-encompassing system. And in China,
  • 3:02 - 3:07
    most people that
    aren't directly in touch with it actually
  • 3:07 - 3:11
    don't know anything about this. I'm
    broadly interested in the impact of
  • 3:11 - 3:17
    technology on society, life, and the
    economy, obviously, and in my free time I
  • 3:17 - 3:21
    do a lot of data science and machine
    learning with Python and R. So, I thought
  • 3:21 - 3:27
    it was quite interesting to look at the
    Social Credit system, also from this point
  • 3:27 - 3:32
    of view because you always hear that it's
    like this big data initiative, and then
  • 3:32 - 3:35
    when it comes down to it, what you actually
    see is that they don't actually use
  • 3:35 - 3:40
    machine learning all that much. They have,
    basically, a rule based catalog where, if
  • 3:40 - 3:44
    you do this you get 50 points, if you do
    this you get 50 points, and then they
  • 3:44 - 3:49
    actually have a lot of people that are
    reporting on other people's behavior. I'm
  • 3:49 - 3:53
    going to be talking about how exactly it
    looks later on, but I was very, very
  • 3:53 - 3:57
    surprised after reading a lot of the
    Western newspaper articles that were
  • 3:57 - 4:02
    basically "Oh, this is this big dystopia,
    Orwellian, with big data working." And
  • 4:02 - 4:08
    then, you read what's actually happening
    and they have huge lists of "if you
  • 4:08 - 4:15
    jaywalk, you get 10 points deducted from
    you," this kind of thing. If you want to
  • 4:15 - 4:19
    get in touch with me you can use Twitter
    but you can also use different e-mails
  • 4:19 - 4:23
    either my professional e-mail or my
    personal e-mail address, that you can both
  • 4:23 - 4:28
    see there. If you have any thoughts on
    that or are interested in this a little
  • 4:28 - 4:32
    more I can give you more resources as
    well, because obviously today's talk will
  • 4:32 - 4:38
    only be scratching the surface. So,
    perceptions of the Social Credit System.
  • 4:38 - 4:43
    One of the interesting things that I've
    talked about before was how, in the West
  • 4:43 - 4:48
    and in China, the perception is completely
    different. So in the West – this image is
  • 4:48 - 4:55
    from financialtimes.com – you see this huge,
    overwhelming guy, and he basically puts
  • 4:55 - 4:59
    every Chinese person under a microscope.
    They're all kind of hunched over, and
  • 4:59 - 5:04
    everyone has this score attached to them,
    and they seem pretty sad and, like, very,
  • 5:04 - 5:09
    very Orwellian concept. Whereas, in China,
    this is actually from Chinese state
  • 5:09 - 5:14
    media, and what it says is, well, we can
    all live in harmony with this new system
  • 5:14 - 5:19
    and all trust each other. And
    interestingly Chinese people actually
  • 5:19 - 5:24
    believe that, to some degree. They believe
    that technology will fix all the current
  • 5:24 - 5:30
    problems in society, especially because,
    in China currently, trust is a rare
  • 5:30 - 5:39
    commodity. And this new system will lead
    to more efficiency and trust, and a better
  • 5:39 - 5:44
    life. And I have a really, really
    interesting quote from a Western scholar,
  • 5:44 - 5:48
    that really summarizes the Western
    perspective: "What China is doing here is
  • 5:48 - 5:53
    selectively breeding its population to
    select against the trait of critical,
  • 5:53 - 5:57
    independent thinking. This may not be the
    purpose, indeed I doubt it's the primary
  • 5:57 - 6:02
    purpose, but it's nevertheless the effect
    of giving only obedient people the social
  • 6:02 - 6:07
    ability to have children, not to mention
    successful children." This, basically,
  • 6:07 - 6:12
    plays with the idea that if you have a low
    score, currently, in the cities that are
  • 6:12 - 6:17
    already testing this system, what happens
    is, your children can't attend good
  • 6:17 - 6:23
    schools. What happens is, you cannot take
    trains, you cannot take planes. You cannot
  • 6:23 - 6:31
    book good hotels. Your life is just very,
    very inconvenient. And this is by design.
  • 6:31 - 6:39
    This is kind of the plan. The Chinese
    government, they say it's a little
  • 6:39 - 6:42
    different, the idea is about changing
    people's conduct by ensuring they are
  • 6:42 - 6:48
    closely associated with it. One of the
    main things about this system is, there
  • 6:48 - 6:53
    isn't very much new data being generated
    for the system. Instead, what's happening
  • 6:53 - 7:00
    is, all the existing data that is already
    collected about you is, basically,
  • 7:00 - 7:06
    combined into one big database for each
    and every person by your ID number. So, in
  • 7:06 - 7:10
    China, once you're born, you get an ID
    number, which is similar to a Social
  • 7:10 - 7:14
    Security number in the U.S. We don't
    really have a similar concept in Germany,
  • 7:14 - 7:19
    and it used to be that your ID number was
    only necessary for
  • 7:19 - 7:26
    government stuff, but now you need your ID
    number for getting a bank account, you
  • 7:26 - 7:30
    need your ID number for buying a cell
    phone, even if it's a prepaid cell phone,
  • 7:30 - 7:35
    you still need your ID number. So all your
    online activity that happens with your
  • 7:35 - 7:38
    cell phone is associated with your ID
    number, which means you can't really do
  • 7:38 - 7:44
    anything anonymously, because it's all
    going back to your ID number. There's a
  • 7:44 - 7:49
    couple of predecessors, some of them
    actually going back to the 1990s, that are
  • 7:49 - 7:55
    supposed to be integrated into the new
    system. One of them, or like two of them
  • 7:55 - 8:01
    are blacklists. One of them is a court
    blacklist. So in China, courts work a
  • 8:01 - 8:07
    little bit differently. They tend to like
    giving you fines, as they do in other
  • 8:07 - 8:12
    countries, but they also like making you
    apologize. So one of the things,
  • 8:12 - 8:17
    if you do something – for example, you're a
    company and your food safety wasn't up to par
  • 8:17 - 8:22
    – you have to pay a fine. But in addition
    to this fine you also have to write a
  • 8:22 - 8:27
    public apology letter in the newspaper,
    saying you are very sorry that this happened,
  • 8:27 - 8:32
    that it was a
    moral failing on your part, and that it won't
  • 8:32 - 8:37
    happen again. And if you don't do that,
    you go on this blacklist. Similarly, if
  • 8:37 - 8:43
    you take out a line of credit and don't
    pay it back within three months – or
  • 8:43 - 8:48
    don't make any payments for three
    months – you go on this debtors' blacklist.
  • 8:48 - 8:52
    If you're on this blacklist, which again
    is associated with your shēnfènzhèng, so
  • 8:52 - 8:58
    your ID number – what happens is you
    cannot take trains, you cannot take planes.
  • 8:58 - 9:02
    Your life basically becomes very very
    inconvenient, your children can't go to
  • 9:02 - 9:07
    good public schools, your children can't
    go to private schools, your children can't
  • 9:07 - 9:15
    go to universities, all of these issues
    are suddenly coming up. There is also a
  • 9:15 - 9:20
    company database that's called Credit
    China which is basically similar to the
  • 9:20 - 9:25
    public debtors' blacklist, but it's
    essentially a credit score
  • 9:25 - 9:29
    for companies. And then there's the credit
    reference center of the People's Bank of
  • 9:29 - 9:35
    China which is a credit score. It was
    supposed to be like Schufa or like the
  • 9:35 - 9:43
    U.S. FICO for individuals. But one of the
    big problems in China is that there are a
  • 9:43 - 9:47
    lot of people that aren't part of the
    formal economy. A lot of people are
  • 9:47 - 9:53
    migrant workers. They get their money in
    cash. They do not have bank accounts. They
  • 9:53 - 10:00
    do not have anything… they do not have rent
    or utilities or anything like this because
  • 10:00 - 10:05
    they live in the country. So they own
    their own home, which they built themselves,
  • 10:05 - 10:11
    so they didn't even finance it. And their
    home isn't officially theirs, because in
  • 10:11 - 10:16
    China you can't actually own property.
    Instead the government leases it to you.
  • 10:16 - 10:21
    So there were a lot of people that were
    not covered in this system, and I think
  • 10:21 - 10:27
    the last data that I had was that less
    than 10 percent of Chinese adult citizens
  • 10:27 - 10:33
    were actually in the system and had any
    sort of exposure to banks, which is very,
  • 10:33 - 10:38
    very little. And that meant that people
    couldn't get credit because banks would
  • 10:38 - 10:42
    only give credit to people that were in
    the system or people where they had
  • 10:42 - 10:47
    some sort of handle on whether they
    would be paid back. Now, the
  • 10:47 - 10:52
    implementation details of the new system
    are very very scarce, but the basic idea
  • 10:52 - 10:56
    is that Chinese citizens are divided into
    trustworthy individuals and what the
  • 10:56 - 11:02
    Chinese call "trust-breakers". Sometimes
    you have five different groups, sometimes
  • 11:02 - 11:05
    you have two different groups, but in
    general there's sort of this cut-off:
  • 11:05 - 11:14
    above this line it's good and below this
    line it's bad. This is one graphic from
  • 11:14 - 11:20
    the Wall Street Journal that just shows
    some of the inputs that go into the
  • 11:20 - 11:25
    system. And one of the things that we see
    is that the inputs are crazy, crazy
  • 11:25 - 11:35
    varied. So it is: do you pay income taxes?
    Do you pay your utility bills on time?
  • 11:35 - 11:40
    Do you respect your parents? However they
    measure that. Do you have a criminal
  • 11:40 - 11:47
    record? Do you pay for public
    transportation or have you been caught
  • 11:47 - 11:50
    not paying? What about your friends?
  • 11:50 - 11:58
    Do you retweet or use WeChat
    to distribute information against
  • 11:58 - 12:04
    the party, which they call "reliability". In
    actuality it's not about whether it's
  • 12:04 - 12:08
    factual, it's about whether it's against
    the party or not. Where do you buy and
  • 12:08 - 12:12
    what do you buy, apparently if you buy
    diapers it's better than if you buy
  • 12:12 - 12:17
    videogames, for your score. Because you
    know if you buy videogames obviously
  • 12:17 - 12:20
    you're not very responsible. And if you
    buy diapers you have a kid, you are sort
  • 12:20 - 12:29
    of conforming to the societal ideal. And
    then your score is supposed to go into all
  • 12:29 - 12:34
    these different categories, you're
    supposed to have better access to social
  • 12:34 - 12:39
    services if your score is good. You're
    supposed to have better access to internet
  • 12:39 - 12:43
    services. So in theory the idea is that at
    one point if your score is too bad, you're
  • 12:43 - 12:48
    not allowed to use WeChat anymore. You're
    not allowed to use Alibaba anymore. You
  • 12:48 - 12:57
    can't become a government worker. You
    cannot take planes and high-speed trains. You
  • 12:57 - 13:05
    cannot get a passport. And your insurance
    premiums will go up. So it's supposed to
  • 13:05 - 13:13
    be this really really big, overwhelming
    system. But in actuality what they say
  • 13:13 - 13:19
    their stated goals are, is "it's a
    shorthand for a broad range of efforts to
  • 13:19 - 13:25
    improve market security and public safety
    by increasing integrity and mutual trust
  • 13:25 - 13:32
    in society." So one idea is to allocate
    resources more efficiently. Resource
  • 13:32 - 13:39
    allocation in China is a pretty big
    problem, because people grow up with the idea:
  • 13:39 - 13:44
    there's 1.3 billion people, so everything
    is always going to be scarce. A lot
  • 13:44 - 13:50
    of stuff just is very, very scarce, and
  • 13:50 - 13:55
    current distribution strategies, which are
    mostly financially based but also often
  • 13:55 - 13:59
    guanxi-based, don't really seem fair. For
    example, public transport in China is
  • 13:59 - 14:05
    highly subsidized, which means that the
    price does not
  • 14:05 - 14:10
    reflect true scarcity. So currently the
    way it works is in theory it's first come
  • 14:10 - 14:15
    first serve, in practice there's people
    that are buying up all the tickets for,
  • 14:15 - 14:19
    for example, the high-speed train from
    Shanghai to Beijing and then selling them at
  • 14:19 - 14:23
    a profit, or selling them to certain
    companies that have good ties to the
  • 14:23 - 14:28
    government. That seems very unfair. So the
    new system is supposed to distribute them
  • 14:28 - 14:34
    more fairly and more efficiently. The
    other thing is restoring trust in people.
  • 14:34 - 14:38
    Perceived inter-personal trust and trust
    in institutions is extremely low in China.
  • 14:38 - 14:43
    If you're from Germany, you might have
    heard that there are Chinese gangs
  • 14:43 - 14:48
    basically buying up German milk powder and
    selling it in China. This is actually
  • 14:48 - 14:55
    happening, because in 2008 there was a big
    scandal with laced milk powder. And ever
  • 14:55 - 14:59
    since then, anyone who can afford it does
    not use Chinese milk powder, because they
  • 14:59 - 15:04
    don't trust the government, or the
    regulations, the firms, enough to buy
  • 15:04 - 15:10
    Chinese milk powder so they are actually
    importing this. And the big irony is:
  • 15:10 - 15:16
    sometimes this milk powder is produced in
    China, exported to Germany, and then
  • 15:16 - 15:22
    exported back to China. The Social Credit
    system is then supposed to identify those
  • 15:22 - 15:27
    that deserve the trust. And the third
    point is sort of a reeducation of people.
  • 15:27 - 15:33
    The idea is: they want to mold people into
    the image that the Communist Party thinks
  • 15:33 - 15:41
    people should be. And one additional way,
    besides the punishments and rewards, this could
  • 15:41 - 15:45
    work, is the feeling of being surveilled.
    Because you can't do anything anonymously,
  • 15:45 - 15:49
    you will automatically adapt your behavior
    because you know someone is watching you
  • 15:49 - 15:56
    all the time, and this is how a lot of the
    Chinese firewall actually works, because
  • 15:56 - 16:01
    most people I know that are
    more educated, they know ways to
  • 16:01 - 16:05
    circumvent the Chinese firewall, but they
    also know that they're always being
  • 16:05 - 16:09
    watched, so they don't do that because,
    you know, they're being watched, so they
  • 16:09 - 16:18
    censor
    themselves. As I said before, allocation
  • 16:18 - 16:24
    of scarce resources so far is mainly
    through financial and guanxi channels. Guanxi
  • 16:24 - 16:29
    is basically an all-permeating network of
    relationships with a clear status
  • 16:29 - 16:34
    hierarchy. So if I attend a school,
    everyone who also attended this school
  • 16:34 - 16:40
    will be sort of in my guanxi network. And
    there's this idea that we will have a
  • 16:40 - 16:46
    system where we are all in-group, and in-
    group we trust each other and we do favors
  • 16:46 - 16:51
    for each other, and everyone who's outside
    of my immediate group I don't trust and I
  • 16:51 - 16:56
    don't do favors for. And in some ways the
    guanxi system right now is a substitute
  • 16:56 - 17:03
    for formal institutions in China. For
    example, if you want a passport right now,
  • 17:03 - 17:07
    you can of course apply for a passport
    through regular channels, which might take
  • 17:07 - 17:12
    months and months. Or you can apply for a
    passport through knowing someone who
  • 17:12 - 17:17
    knows someone, which might take only two
    days. Whereas in Germany you have these
  • 17:17 - 17:23
    very regular, formal institutions, in
    China they still use guanxi. But,
  • 17:23 - 17:27
    increasingly especially young people find
    that guanxi are very unfair, because a lot
  • 17:27 - 17:32
    of these are: where you went to school,
    which is determined by where you're born,
  • 17:32 - 17:37
    who your parents are, and all these
    things. Another thing that's important to
  • 17:37 - 17:42
    understand is that the system works
    through public shaming. And in a lot of
  • 17:42 - 17:48
    Western societies we can't really imagine
    that, like, I wouldn't really care if my
  • 17:48 - 17:55
    name was in a newspaper as someone who
    jaywalked for example. It would be: oh
  • 17:55 - 18:00
    well, that's okay. But in China this is
    actually a very very serious thing. So
  • 18:00 - 18:05
    saving face is very very important in
    China. And when I went to school there I
  • 18:05 - 18:13
    actually – we had this dormitory, and it
    was an all-foreigners dormitory, where the
  • 18:13 - 18:18
    staff that were responsible for the
    dormitory felt that foreigners were not
  • 18:18 - 18:25
    behaving in the way they should. So their
    idea was to put the names, the pictures,
  • 18:25 - 18:29
    and the offenses of the foreigners in the
    elevator to shame them publicly. So for
  • 18:29 - 18:34
    example if you brought a person of the
    opposite sex to your room, they would put
  • 18:34 - 18:40
    your name, your offense and your room
    number in the elevator. And of course this
  • 18:40 - 18:45
    didn't work because for a lot of western
    people it was basically like: "oh well I'm
  • 18:45 - 18:49
    going to try to be there as often as
    possible because this is like a badge of
  • 18:49 - 18:54
    honor for me," and the Chinese people
    figured, "well, this is really, really shameful
  • 18:54 - 18:58
    and I'm losing face." ("She brought
    alcohol.") So this didn't really work at
  • 18:58 - 19:06
    all. But this is kind of the mindset that
    is behind a lot of these initiatives. As I
  • 19:06 - 19:12
    said there's a lot of problems with – we
    don't really know what's going to happen.
  • 19:12 - 19:17
    And one of the ways that we can see what
    might happen is actually to look at pilot
  • 19:17 - 19:24
    systems. China has – or like ever since
    the Communist Party took hold – the
  • 19:24 - 19:29
    Chinese government has tried a lot of
    policy experimentation. So whenever they
  • 19:29 - 19:34
    try a new policy, they don't roll it out
    all over, but they choose different pilot
  • 19:34 - 19:39
    cities or pilot districts, and then they
    say, "oh well, this is the district where
  • 19:39 - 19:43
    I'm going to be trying this system and I'm
    going to be trying another system in
  • 19:43 - 19:48
    another district or city". And this is
    also what
  • 19:48 - 19:53
    they're doing for the Social Credit
    system. Now I have three systems that I
  • 19:53 - 19:58
    looked at intensively for this
    presentation, overall there's about 70
  • 19:58 - 20:07
    that I know of - the Suining system,
    Suining is a city in China, the Rongcheng
  • 20:07 - 20:11
    system, another city in China and Sesame
    Credit. Sesame Credit is a commercial
  • 20:11 - 20:16
    system from Alibaba - I assume everyone
    knows Alibaba, they're basically the
  • 20:16 - 20:23
    Chinese Amazon, except they're bigger and
    have more users and make more money,
  • 20:23 - 20:27
    actually. And they have their own little
    system. One of the problems with this kind
  • 20:27 - 20:32
    of system that I found when I tried
    modeling it, was that it's a very very
  • 20:32 - 20:39
    complex system and small changes in input
    actually changed the output significantly.
  • 20:39 - 20:44
    So usually when they try
    this kind of pilot system, they basically have a
  • 20:44 - 20:48
    couple of pilots, then they choose the
    pilot that is best and they roll it out
  • 20:48 - 20:53
    all over. But for this kind of thing,
    where you have a lot of complex issues, it
  • 20:53 - 20:59
    might not be the best way to do that. The
    Suining system is actually considered the
  • 20:59 - 21:06
    predecessor of all current systems. It had
    a focus on punishment, and it was quite
  • 21:06 - 21:12
    interesting. At the beginning of the trial
    period they published a catalogue of
  • 21:12 - 21:17
    scores and consequences. Here is an
    example. This is basically taken from this
  • 21:17 - 21:23
    catalog. So if you took out bank loans and
    didn't repay them, you got deducted 50
  • 21:23 - 21:27
    points. Everyone started with 1000
    points for this system. If you didn't pay
  • 21:27 - 21:33
    back your credit cards you also got
    deducted 50 points. If you evaded taxes,
  • 21:33 - 21:44
    also 50 points. If you sold fake goods, 35
    points were deducted. And actually the
  • 21:44 - 21:50
    system was abolished I think in 2015,
    2016, because all the Chinese state media
  • 21:50 - 21:56
    and also a lot of Internet citizens talked
    about how it's an Orwellian system and how
  • 21:56 - 22:03
    it's not a good system, because it's all
    very centralized and everything that you
  • 22:03 - 22:09
    do is basically recorded centrally. But
    Creemers writes: "Nonetheless, the Suining
  • 22:09 - 22:13
    system already contained the embryonic
    forms of several elements of subsequent
  • 22:13 - 22:17
    social credit initiatives: The notion of
    disproportional disincentives against rule
  • 22:17 - 22:22
    breaking, public naming and shaming of
    wrongdoers, and most importantly, the
  • 22:22 - 22:27
    expansion of the credit mechanism outside
    of the market economic context, also
  • 22:27 - 22:30
    encompassing compliance with
    administrative regulations and urban
  • 22:30 - 22:34
    management rules." So one of the things
    that is difficult especially for German
  • 22:34 - 22:40
    speakers is that credit in Chinese,
    xìnyòng, means credit as in "loan", but
  • 22:40 - 22:46
    also means credit as in "trust". So the
    Social Credit System is one way of trying
  • 22:46 - 22:52
    to conflate those two – the economic
    credit and the trust credit – into one big
  • 22:52 - 23:02
    system. But the Suining system basically
    failed. So, they adapted the system and
  • 23:02 - 23:07
    are now practicing a new kind of system,
    the Rongcheng system. Whenever you read a
  • 23:07 - 23:11
    newspaper article on the social credit
    system in the West, the reporters went to
  • 23:11 - 23:15
    Rongcheng, because the city just received a
    couple of awards from the Chinese
  • 23:15 - 23:22
    government for being so advanced at this
    social credit thing. But it's very
  • 23:22 - 23:26
    difficult to call this "one system"
    because there's actually many many
  • 23:26 - 23:31
    intertwined systems. There is one city-
    level system, where city-level offenses
  • 23:31 - 23:39
    are recorded. For example tax evasion, and
    there's a couple of rules. If you evade
  • 23:39 - 23:44
    taxes your score goes down 50. But then if
    you live in one neighborhood your score
  • 23:44 - 23:48
    might go up for volunteering with the
    elderly. If you live in another
  • 23:48 - 23:54
    neighborhood your score might go up for,
    for example, planting some trees in
  • 23:54 - 23:59
    your garden or backyard. So depending on
    your neighborhood, your score might be
  • 23:59 - 24:06
    different. If you work
    for a taxi cab company, for example, they
  • 24:06 - 24:11
    also have their own little score system
    and your score might go up if you get good
  • 24:11 - 24:17
    reviews from your
    passengers. Your score might go down if
  • 24:17 - 24:26
    you don't follow traffic rules, these
    kinds of things. There are designated
  • 24:26 - 24:32
    scorekeepers at each level. So, each
    district chooses a couple of people who
  • 24:32 - 24:38
    are responsible for passing on the
    information to the next higher level,
  • 24:38 - 24:43
    about who did what. There is supposed to
    be an official appeals procedure, so
  • 24:43 - 24:47
    whenever your score changes you're supposed
    to be notified, but apparently that's not
  • 24:47 - 24:54
    happening at this point for most people.
    Again, it's a system of data sharing, and
  • 24:54 - 24:59
    one thing that they haven't really
    disclosed yet is what kind of data is
  • 24:59 - 25:05
    shared. Are they only sharing the points,
    so if I'm in a district and I plant some
  • 25:05 - 25:11
    trees, does the central system get the
    information "person A planted some trees,"
  • 25:11 - 25:16
    or does the central system get the
    information "person A got 5 points?" We
  • 25:16 - 25:22
    don't know at this point. And it would
    mean something very different for how the
  • 25:22 - 25:27
    system could be used. But still the end
    result, at this point, is that there's one
  • 25:27 - 25:31
    score. So you have one central score and
    there are all these different
  • 25:31 - 25:36
    smaller systems that go into this score.
    But at the end, everyone has one central
  • 25:36 - 25:45
    score, and currently about 85 percent of
    people are between 950 and 1050. So you
  • 25:45 - 25:50
    start off with a thousand – and those are
    basically the normal people – and then
  • 25:50 - 25:59
    anyone above 1050 is considered a
    trustworthy person, and anyone below 950
  • 25:59 - 26:05
    is considered a trust-breaker. And, as
    I've said before, with the naming and
  • 26:05 - 26:12
    shaming and all these things, what you can
    actually see here is a billboard with the
  • 26:12 - 26:18
    best trustworthy families in Rongcheng. So
    these are the families that have the
  • 26:18 - 26:24
    highest scores, for example. Sesame Credit
    is a little different. It's the only
  • 26:24 - 26:28
    system that actually uses machine learning
    and artificial intelligence to determine
  • 26:28 - 26:33
    the outputs. In Rongcheng, for example,
    they do have some artificial intelligence – they
  • 26:33 - 26:37
    have computer vision, for the most part,
    and the computer-vision cameras
  • 26:37 - 26:42
    try to recognize you when you
    jaywalk. And then when they recognize you
  • 26:42 - 26:48
    when jaywalking, you get a small SMS:
    "well, we just saw you jaywalking, your
  • 26:48 - 26:57
    score is now dropping." But how the score
    develops, depending on your jaywalking,
  • 26:57 - 27:01
    isn't really determined by machine
    learning or artificial intelligence.
  • 27:01 - 27:07
    Instead, it's determined by rules. You
    know: one time jaywalking deducts five
  • 27:07 - 27:12
    points, and this is stated somewhere.
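A rule catalog like that is trivial to express in code. In this sketch the deduction values are the ones named in the talk (the Suining-style −50s and −35, and the five-point jaywalking rule); the event names and the function itself are illustrative, not from any real system.

```python
# Sketch of a rule-based score: a fixed lookup table of deductions,
# no machine learning involved. Deduction values follow the talk;
# event names and the function are illustrative.

CATALOG = {
    "unrepaid_bank_loan": -50,
    "unpaid_credit_card": -50,
    "tax_evasion": -50,
    "selling_fake_goods": -35,
    "jaywalking": -5,
}

def score(events, start=1000):
    """Everyone starts at 1000; each recorded event applies its fixed rule."""
    return start + sum(CATALOG.get(event, 0) for event in events)

print(score(["tax_evasion", "jaywalking"]))  # 1000 - 50 - 5 = 945
```

The point is that such a system is a static lookup table plus human reporting, not a learned model.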
    Sesame Credit doesn't work like that.
  • 27:12 - 27:20
    Instead it uses a secret algorithm, and
    the way– I talked to some people that work
  • 27:20 - 27:25
    for Sesame Credit or for Alibaba, and the
    way they described it was: they basically
  • 27:25 - 27:33
    clustered people based on behavior, then
    gave scores to these clusters, and
  • 27:33 - 27:41
    then afterwards basically reverse-
    engineered their own score, using machine
  • 27:41 - 27:46
    learning, so that whenever something new
    happens, you can move to a different
  • 27:46 - 27:54
    cluster. Sesame Credit was actually
    refused accreditation as a credit score in
  • 27:54 - 28:03
    2017, so banks are not allowed to use the
    Sesame Credit score
  • 28:03 - 28:09
    to determine whether
    they give you loans or not. Because Sesame
  • 28:09 - 28:14
    Credit is quite ingenious – obviously
    Alibaba wants to keep you within their
  • 28:14 - 28:20
    platform – so if you buy using Alibaba and
    using Alipay, your score goes up. If you
  • 28:20 - 28:29
    buy using WeChat Pay, which is a competing
    platform, your score goes down. This uses
  • 28:29 - 28:34
    many of the same rewards mechanisms of the
    official government systems, and this is
  • 28:34 - 28:38
    just an illustration of what kind of
    scores you can have, apparently your
  • 28:38 - 28:46
    scores can go between 350 and 850, and
    there are basically five different
  • 28:46 - 28:56
    levels. So 385 is a "trust-breaker" or
    "missing trust". And then 731 is "trust is
  • 28:56 - 29:06
    exceedingly high". So one way I tried to
    approach this issue was through agent-
  • 29:06 - 29:10
    based modeling. The Social Credit System
    is individual-level, but what we're really
  • 29:10 - 29:13
    interested in, or what I'm really
    interested in, is actually societal-level
  • 29:13 - 29:19
    consequences. So if everyone gets this
    score, what does that mean for society?
  • 29:19 - 29:24
    And agent-based modeling works quite well
    for that, because it allows us to imbue
  • 29:24 - 29:29
    agents with some sort of rationality, but
    with a bounded rationality. What does
  • 29:29 - 29:33
    bounded rationality mean? Usually in
    economics people assume agents are
  • 29:33 - 29:38
    completely rational, so they are profit
    maximizers, they have all the information.
  • 29:38 - 29:45
    But in reality, agents don't have all the
    information, they have a lot of issues
  • 29:45 - 29:51
    with keeping stuff in their mind. So a lot
    of the time, they won't choose the best
  • 29:51 - 29:58
    thing in the world, but they choose the
    best thing that they see. And bounded
  • 29:58 - 30:02
    rationality allows us to account for
    this, for
  • 30:02 - 30:08
    heuristics and similar effects. And what I
    did is I took the propensity for specific
  • 30:08 - 30:12
    behavior from current state-of-the-art
    research, mostly from behavioral
  • 30:12 - 30:18
    economics. For example, I looked at tax
    evasion, and I looked at who is likely to
  • 30:18 - 30:25
    evade taxes in a system, and then
    obviously there was some stochastic –
  • 30:25 - 30:31
    some chance element. But the
    distribution that I chose is related to
  • 30:31 - 30:38
    the current research. And I also checked
    that my model has similar results to the
  • 30:38 - 30:45
    Rongcheng model, which I modeled at the
    beginning. So on average 87% of my users
  • 30:45 - 30:49
    have a score within 10 percent of the
    original score, which is also the data
  • 30:49 - 31:01
    that Rongcheng city actually publishes.
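As a rough illustration of what such an agent-based model looks like, here is a toy sketch with invented propensities, point values, and adaptation rules. It is not the talk's calibrated model, which used proprietary survey data:

```python
import random

# Toy agent-based sketch of a score-based system. Agents are
# boundedly rational: they act on a noisy individual propensity,
# not on a full optimization over all consequences, and they adapt
# that propensity after being sanctioned. All parameters are invented.

random.seed(42)

class Agent:
    def __init__(self):
        self.score = 1000
        self.evasion_propensity = random.uniform(0.0, 0.3)

    def step(self):
        if random.random() < self.evasion_propensity:
            self.score -= 50              # caught evading taxes
            # Behavioral adaptation: a sanction lowers the
            # propensity to offend again.
            self.evasion_propensity *= 0.5
        else:
            self.score += 1               # small reward for compliance

agents = [Agent() for _ in range(1000)]
for _ in range(50):                       # 50 simulated periods
    for a in agents:
        a.step()

mean = sum(a.score for a in agents) / len(agents)
print(round(mean, 1))                     # population mean after 50 periods
```

The interesting societal-level questions then become what the *distribution* of scores looks like, not any one agent's trajectory.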
    Now, for the most part, I compared design
  • 31:01 - 31:05
    choices along two axes. One of them was a
    centralized system versus a multi-level
  • 31:05 - 31:11
    system, and a rule-based system versus a
    machine learning system. The centralized
  • 31:11 - 31:21
    system is basically: all the
    information is kept centrally, and
  • 31:21 - 31:27
    everyone in China, or in
    Rongcheng has the exact same scoring
  • 31:27 - 31:36
    opportunities. Now, if you have a
    centralized system the expectations
  • 31:36 - 31:40
    were pretty clear. But, at the same time,
    the acceptance from the population was
  • 31:40 - 31:47
    really, really low, which they found
    during the Suining experiment. And
  • 31:47 - 31:51
    there's also the problem of a single point
    of failure. Who decides the central
  • 31:51 - 31:59
    catalog? And depending on who
    has the power, it just
  • 31:59 - 32:05
    reproduces power structures. So because
    you have this central catalog, the same
  • 32:05 - 32:11
    people that are in power centrally, they
    are basically deciding some sort of score
  • 32:11 - 32:15
    mechanism that works for them very well,
    so that they and their family will have
  • 32:15 - 32:23
    high scores. And a multi-level system has
    the advantage that local adaptation kind
  • 32:23 - 32:29
    of works, and there's sort of many points
    of failure. But in my model, when I
  • 32:29 - 32:37
    allowed locals to basically set their own
    rules, what happened was that they
  • 32:37 - 32:43
    competed. So, it started out being this
    district of Rongcheng, for example, and
  • 32:43 - 32:46
    this district of Rongcheng, they compete
    for the best people that they want to
  • 32:46 - 32:52
    attract, and suddenly you have this kind
    of race to the bottom, where people want
  • 32:52 - 32:58
    to move where they wouldn't be prosecuted,
    so they move to places where there are fewer
  • 32:58 - 33:04
    cameras, for example. At the same time,
    there's many points of failure, especially
  • 33:04 - 33:14
    the way it's currently set up, with people
    reporting data to the next higher level.
  • 33:14 - 33:19
    And, a lot of the time, what we have
    actually seen in Rongcheng, was that they
  • 33:19 - 33:24
    reported data on people they didn't like
    more than data on people they did like.
  • 33:24 - 33:29
    Or, their families got better scores than
    people they didn't know. So it also kind
  • 33:29 - 33:41
    of reproduced these biases. The rule-based
    system has the advantage that people were
  • 33:41 - 33:47
    more prone to adapt their behavior,
    because they actually knew what they
  • 33:47 - 33:51
    needed to do in order to adapt their
    behavior. But the score didn't really
  • 33:51 - 33:54
    correlate with the important
    characteristics that they actually cared
  • 33:54 - 34:02
    about. As opposed to that, in a machine
    learning system – you know how in Germany
  • 34:02 - 34:07
    we don't really know the Schufa algorithm.
    And I, for example, don't exactly know
  • 34:07 - 34:12
    what I could do in order to improve my
    Schufa score. And this is a similar system
  • 34:12 - 34:17
    in China with the Sesame Credit score. A
    lot of people don't really – they say,
  • 34:17 - 34:22
    "well I really want to adapt my behavior
    to the score, to improve my score, but
  • 34:22 - 34:29
    when I tried doing that my score actually
    got worse." And you can have different
  • 34:29 - 34:36
    biases, that I'm going to be talking about
    in a little bit. There's also this big
  • 34:36 - 34:43
    problem of incentive mismatch. So, the
    decentralized, rule-based system in
  • 34:43 - 34:47
    Rongcheng is the system that I
    analyzed the most. Why? Because I believe
  • 34:47 - 34:52
    this is the system that we're moving
    towards right now. Because Rongcheng won a
  • 34:52 - 34:57
    lot of awards. So the Chinese government,
    the way they usually work is, they try
  • 34:57 - 35:02
    pilots, then they choose the best couple
    of systems, they give them awards, and
  • 35:02 - 35:07
    then they roll out the system nationwide.
    So I assume that the system that's going
  • 35:07 - 35:13
    to be – the system in the end will be
    similar to the Rongcheng system. Now, one
  • 35:13 - 35:20
    problem that I actually saw in my
    simulation was that you could have this
  • 35:20 - 35:25
    possible race to the bottom. There's also
    this conflict of interest in those that
  • 35:25 - 35:30
    set the rules, because a lot of the time,
    the way it works is, you have your
  • 35:30 - 35:36
    company, and your company, you, in
    combination with your party leaders,
  • 35:36 - 35:44
    actually decide on the rules for the score
    system. But the scores of all your
  • 35:44 - 35:48
    employees actually determine your
    company's score. If you employ a lot of
  • 35:48 - 35:53
    people with high scores you get a better
    score. So you will have this incentive to
  • 35:53 - 35:57
    give out high scores and to make sure that
    everyone gets high scores. But at the same
  • 35:57 - 36:05
    time the government has an incentive for
    scores to be comparable. So there's a lot
  • 36:05 - 36:10
    of incentive mismatch. The government
    also has the incentive to keep false
  • 36:10 - 36:16
    negatives down, but they actually, the way
    the Chinese system currently works is,
  • 36:16 - 36:23
    they emphasize catching trust-breakers
    more than rewarding
  • 36:23 - 36:29
    trustworthy people. So, false positives,
    for them, are less important, but false
  • 36:29 - 36:34
    positives erode the trust in the system,
    and they lead to a lot less behavioral
  • 36:34 - 36:41
    adaptation. I was actually able to show
    this using some nudging research that
  • 36:41 - 36:48
    showed that as soon as you introduce an
    error probability and you can be caught
  • 36:48 - 36:55
    for something that you didn't do, your
    probability of changing your behavior
  • 36:55 - 37:02
    based on this score is actually lower. And
    in Rongcheng, one of the perverse things
  • 37:02 - 37:10
    that they're doing is, you can donate
    money to the party or to
  • 37:10 - 37:17
    party-affiliated social services, and this will
    give you points, which is kind of an
  • 37:17 - 37:24
    indulgence system. Which is quite
    interesting, especially because a lot of
  • 37:24 - 37:33
    these donation systems work in a way that
    you can donate 50,000 renminbi and you get
  • 37:33 - 37:37
    50 points, and then you donate another
    50,000 renminbi and you get another 50
  • 37:37 - 37:44
    points. So you can basically donate a lot
    of money and then behave however you want,
  • 37:44 - 37:56
    and still get a good score. And the trust
    in other people can actually go down even
  • 37:56 - 38:01
    more in this system, because suddenly you
    only trust them because of their scores,
  • 38:01 - 38:05
    and the current system is set up so that
    you can actually look up scores of
  • 38:05 - 38:10
    everyone that you want to work with, and
    if they don't have a score high enough
  • 38:10 - 38:15
    then suddenly you don't want to work with
    them. The trust in the legal system can
  • 38:15 - 38:22
    also decrease, actually. Why? Because trust
    in the legal system in China is already
  • 38:22 - 38:26
    low, and a lot of the things, like
    jaywalking, they're already illegal in
  • 38:26 - 38:30
    China, as they are here, but no one cares.
    And suddenly, you have this parallel
  • 38:30 - 38:37
    system that punishes you for whatever.
    But, why don't you just try to fix the
  • 38:37 - 38:45
    legal system, which would be my approach.
    Suddenly, illegal activity could happen
  • 38:45 - 38:52
    more offline, and this is one of those
    things that is quite interesting. In
  • 38:52 - 38:58
    countries that we've seen that have moved
    towards mobile payments, and away
  • 38:58 - 39:05
    from cash, you see fewer robberies but you
    don't actually see less crime. Instead you
  • 39:05 - 39:12
    see more new types of crime. So, you see
    more credit card fraud, you see more phone
  • 39:12 - 39:19
    robberies, these kinds of things. And this
    is also where things could move in the
  • 39:19 - 39:30
    Chinese case. One major problem is also
    that this new system – I've talked a
  • 39:30 - 39:34
    little bit about this one, but – it can
    introduce a lot of new bias, and reproduce
  • 39:34 - 39:45
    the bias even more. So, for example, China
    is a country of 55 minorities. The Han are
  • 39:45 - 39:50
    a big majority, they have about 94 percent
    of the population. So any computer vision
  • 39:50 - 39:58
    task, it's been shown that they are really,
    really bad at distinguishing between
  • 39:58 - 40:05
    individuals in smaller ethnic groups. In
    the U.S., most computer vision tasks
  • 40:05 - 40:10
    perform worse for African-Americans, they
    perform worse for women, because all of
  • 40:10 - 40:17
    the training sets are male and white, and
    maybe Asian. In China, all of these tasks
  • 40:17 - 40:27
    are actually performing worse for ethnic
    minorities, for the Uyghurs, for example.
  • 40:27 - 40:32
    And one way that they could try to abuse
    the system is to basically just – what
  • 40:32 - 40:38
    they're also doing already in Xinjiang is
    – to basically just identify, "oh this is
  • 40:38 - 40:46
    a person of the minority, well I'm just
    going to go and check him or her more
  • 40:46 - 40:50
    thoroughly." This is actually what happens
    in Xinjiang. If you're in Xinjiang and you
  • 40:50 - 40:59
    look like a Turkic person, or like from
    Turkmenistan, from a Turkic people, you
  • 40:59 - 41:04
    are a lot more likely to be questioned.
    You're a lot more likely to be stopped and
  • 41:04 - 41:13
    they ask you or require you to download
    spyware on your phone. And this is
  • 41:13 - 41:18
    currently what happens and this new kind
    of system can actually help with that.
  • 41:18 - 41:25
    I've said that it can reproduce these kind
    of power structures, and now obviously we
  • 41:25 - 41:30
    all know neutral technology doesn't really
    exist, but in the Chinese case, in the
  • 41:30 - 41:34
    social credit case, they don't even
    pretend otherwise – they always say "well, this
  • 41:34 - 41:37
    is neutral technology and it's all a lot
    better," but actually it's the people
  • 41:37 - 41:44
    currently in power, they decide on what
    gives you points and what deducts points
  • 41:44 - 41:50
    for you. Another problem, currently the
    entire system is set up in a way that it
  • 41:50 - 41:55
    all goes together with your shēnfènzhèng,
    with your I.D. card. What if you don't
  • 41:55 - 41:59
    have an I.D. card? That's foreigners for
    one. But it's also people in China that
  • 41:59 - 42:05
    were born during the one child policy and
    were not registered. There's quite a lot
  • 42:05 - 42:09
    of them, actually. They're not registered
    anywhere and suddenly they can't do
  • 42:09 - 42:14
    anything, because they don't have a score,
    they can't get a phone, they can't do
  • 42:14 - 42:21
    anything, really. And part of the push
    with this social credit system is to go
  • 42:21 - 42:27
    away from cash, actually. So you need
    to use your phone to pay, but for your
  • 42:27 - 42:30
    phone you need your shēnfènzhèng.
    If you don't have a shēnfènzhèng,
  • 42:30 - 42:33
    well, tough luck for you.
  • 42:33 - 42:39
    And currently the system in Rongcheng
    is set up in a way that you can check
  • 42:39 - 42:45
    other people's scores and you can also see
    what they lose points for. So you can
  • 42:45 - 42:50
    actually, sort of, choose to discriminate
    against people that are gay, for example,
  • 42:50 - 42:53
    because they might have lost points for
    going to a gay bar, which you can lose
  • 42:53 - 43:02
    points for. Another big issue, currently,
    is data privacy and security. Personal
  • 43:02 - 43:07
    data is grossly undervalued in China. If
    you ask a Chinese person, "what do you
  • 43:07 - 43:15
    think, how much is your data worth?," they
    say "what data? I don't have data." And,
  • 43:15 - 43:19
    currently, the way it works is, if you
    have someone's ID number, which is quite
  • 43:19 - 43:25
    easy to find out, you can actually buy
    access to a lot of personal information
  • 43:25 - 43:31
    for a small fee. So you pay about 100
    euros and you get all hotel bookings of
  • 43:31 - 43:36
    the last year, you get information of who
    booked these hotels with them, you get
  • 43:36 - 43:41
    information of where they stay, you get
    train bookings, you get access to all of
  • 43:41 - 43:47
    the official databases for this one
    person. And for another 700 renminbi you
  • 43:47 - 43:53
    can actually get live location data, so
    you can get the data of where this person
  • 43:53 - 43:56
    is right now, or where his or her phone is
    right now, but if you've ever been to
  • 43:56 - 44:03
    China you know that where the phone is,
    usually, the people aren't far. Supchina
  • 44:03 - 44:08
    actually did an experiment where a couple
    of journalists tried buying that, because
  • 44:08 - 44:14
    these kinds of services are
    offered on WeChat, pretty publicly. And
  • 44:14 - 44:26
    you can just buy them, quite easily. So
    one additional thing that I looked at is,
  • 44:26 - 44:31
    because one of the things that is quite
    interesting is, you have this idea of
  • 44:31 - 44:39
    credit as twofold. Credit is trust credit
    but credit is also loan credit, and what
  • 44:39 - 44:44
    if credit institutions actually use this
    unified credit score to determine credit
  • 44:44 - 44:49
    distribution? The idea is that it's
    supposed to lead to reduced information
  • 44:49 - 44:55
    asymmetry, obviously, so fewer defaults
    and overall more credit creation. New
  • 44:55 - 45:00
    people are supposed to get access to
    credit, and there's supposed to be less
  • 45:00 - 45:05
    shadow banking. But what actually happens?
    I'm not going to be talking about how I
  • 45:05 - 45:09
    set up the model but just about my
    results. If you have this kind of score
  • 45:09 - 45:14
    that includes credit information but also
    includes morally good – or measures of
  • 45:14 - 45:19
    being morally good – what you have is, in
    the beginning, about 30 percent more
  • 45:19 - 45:24
    agents get access to credit, and
    especially people that previously have not
  • 45:24 - 45:30
    gotten credit access suddenly have credit
    access. But the problem is that this
  • 45:30 - 45:36
    social credit score that correlates all of
    these different issues, it correlates only
  • 45:36 - 45:42
    very, very weakly with repayment ability
    or willingness to repay, and thus suddenly you
  • 45:42 - 45:48
    have all of these non-performing loans.
    What we see is sort of
  • 45:48 - 45:52
    like this: we have non-performing loans.
    Banks give out fewer loans because they
  • 45:52 - 45:59
    have so many non-performing loans, and
    then the non-performing loans are written
  • 45:59 - 46:04
    off, and suddenly banks give out more
    loans. But you have this oscillating
  • 46:04 - 46:09
    financial system, where you give out a lot
    of loans, a lot of them are non-
  • 46:09 - 46:13
    performing, then you give out a lot of
    loans again. And this is very, very
  • 46:13 - 46:19
    vulnerable to crisis. If you have a real
    economic crisis during the time where non-
  • 46:19 - 46:25
    performing loans are high, then a lot of
    banks will actually default, which is
  • 46:25 - 46:30
    very, very dangerous for a financial
    system like the Chinese one.
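The oscillation described here can be reproduced in a toy simulation. All of the numbers are invented for illustration; the only assumption carried over from the talk is that the score screens borrowers only weakly:

```python
# Toy simulation of the lending cycle described above: banks lend
# heavily while their books look clean, a score that only weakly
# screens borrowers lets bad loans pile up, lending collapses, the
# bad loans are written off, and lending expands again. All numbers
# are invented; this is not the talk's calibrated model.

def simulate(periods=40, weak_correlation=0.2):
    npl_stock = 0.0   # stock of non-performing loans (NPLs) on bank books
    history = []
    for _ in range(periods):
        # Credit boom while NPLs are low, credit crunch once they pile up.
        loans = 100.0 if npl_stock < 30.0 else 20.0
        # Screening by the social score barely filters bad borrowers,
        # so a large share of new loans goes bad regardless.
        npl_stock += loans * 0.4 * (1 - weak_correlation)
        # Half of the bad-loan stock is written off each period.
        npl_stock *= 0.5
        history.append(loans)
    return history

history = simulate()
print(history[:6])  # the series alternates between booms (100) and crunches (20)
```

The fragility is visible in the sketch: if an external shock hits during a high-NPL period, lending capacity is already exhausted.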
  • 46:30 - 46:37
    Now, what are some possible corrections?
    You could create a score that basically is
  • 46:37 - 46:41
    the same as the Schufa score. So that it
    looks only at credit decisions, but
  • 46:41 - 46:45
    suddenly, you lose a lot of incentives for
    the social credit score, if the social
  • 46:45 - 46:48
    credit score doesn't matter for credit
    distribution anymore.
  • 46:48 - 46:52
    Another thing, and this is, I
    think, the more likely one,
  • 46:52 - 46:56
    is that you have a blacklist for people
    that have not repaid a loan
  • 46:56 - 46:59
    in the past. So you basically
    get one freebie, and afterwards
  • 46:59 - 47:04
    if you didn't repay your loan in the past
    then you will not get a loan in the
  • 47:04 - 47:08
    future. You will still be part of the
    social credit system, and your social
  • 47:08 - 47:12
    credit score will still be important for
    all of these other access issues, but it
  • 47:12 - 47:16
    won't be important for access to loans
    anymore, once you've been on this
  • 47:16 - 47:22
    blacklist. Which is probably something
    that the Chinese government could get
  • 47:22 - 47:30
    behind, but it's also more effort to take
    care of it; then you have to think about,
  • 47:30 - 47:34
    "well, you can't leave them on the
    blacklist forever, so how long do you
  • 47:34 - 47:38
    leave them on the blacklist? Do they have
    to pay back the loan and then they get off
  • 47:38 - 47:46
    the blacklist? Or do they have to pay back
    the loan and then stay not in default
  • 47:46 - 47:53
    for a year, or for five years?" There's a
    lot of small decisions that, in my
  • 47:53 - 47:58
    opinion, the Chinese government hasn't
    really thought about, up until now,
  • 47:58 - 48:01
    because they're basically doing all these
    pilot studies, and all of these regional
  • 48:01 - 48:05
    governments are thinking of all these
    small things, but they're not documenting
  • 48:05 - 48:10
    everything that they're doing. So, once
    they – they want to roll it out by 2020,
  • 48:10 - 48:15
    by the way, nationwide – once they've
    rolled it out there's a pretty big chance,
  • 48:15 - 48:19
    in my opinion, that they'll have a lot of
    unintended consequences. A lot of things
  • 48:19 - 48:29
    that they haven't thought about, and that
    they will then have to look at. So, I
  • 48:29 - 48:33
    believe that some sort of system is likely
    to come, just in terms of how much energy
  • 48:33 - 48:37
    they've expended into this one, and for
    the Chinese government at this point, for
  • 48:37 - 48:42
    the party, it would be losing face if they
    did not include any such system, because
  • 48:42 - 48:46
    they've been talking about this for a
    while. But most likely, it would be a kind
  • 48:46 - 48:53
    of decentralized data sharing system. And
    when I ran my simulation... By the way, I
  • 48:53 - 49:00
    will make my code public; I
    used some proprietary
  • 49:00 - 49:06
    data for my model, and I still need
    permission to publish this. Once I publish
  • 49:06 - 49:11
    this one I will also tweet it, and we'll
    put it on GitHub for everyone to play
  • 49:11 - 49:16
    around with, if you want to. And some of
    these implementation details that were
  • 49:16 - 49:20
    very important in determining model
    outcomes were: "do we have a relative or
  • 49:20 - 49:25
    absolute ranking?" So far, all of the
    systems I looked at had absolute rankings,
  • 49:25 - 49:31
    but there's a point to be made for
    relative rankings. Do we have one score,
  • 49:31 - 49:35
    where, basically, if you're a Chinese
    person you get one score? Or do we have
  • 49:35 - 49:41
    different sub-scores in different fields?
    Do we have people reporting behavior, or
  • 49:41 - 49:46
    do we have automatic behavior recording?
    How do you access other people's scores?
  • 49:46 - 49:50
    How much information can you get from
    other people's scores? Currently, if
  • 49:50 - 49:56
    someone is on a blacklist, for example, if
    you have their ID number, again, you can
  • 49:56 - 50:00
    put it into this blacklist, and then they
    will say "oh, this person is on this
  • 50:00 - 50:05
    blacklist for not following this judge's
    order," and then it says what kind of
  • 50:05 - 50:11
    judge's order it was. So, most likely, it
    will be something like this. The idea is
  • 50:11 - 50:16
    that the Social Credit system isn't only
    for individuals, but also for firms and
  • 50:16 - 50:22
    for NGOs. So, what kind of roles will
    firms play in the system? I haven't looked
  • 50:22 - 50:28
    at that, in detail, at this point, but it
    would be very interesting. Another idea
  • 50:28 - 50:34
    that western people often talk about is,
    do people also rank each other? Currently,
  • 50:34 - 50:39
    that's not part of the system in China,
    but it might be at one point. And lastly,
  • 50:39 - 50:45
    where does the aggregation happen? So I've
    said that a lot of it is actually data
  • 50:45 - 50:54
    sharing in China. So what kind of data is
    shared? Is the raw data shared? "Person A
  • 50:54 - 51:04
    did something." Or is the aggregated data
    shared? "Person A got this score." At this
  • 51:04 - 51:08
    point, most of the time, it is actually
    the raw data that is shared, but that also
  • 51:08 - 51:13
    has sort of these data privacy issues, of
    course, that I've talked about. OK,
  • 51:13 - 51:19
    perfect! Oh, there's 10 more minutes. Thank
    you for your attention! If you have
  • 51:19 - 51:24
    questions, remarks you can ask them now or
    you can catch me up later. You can tweet
  • 51:24 - 51:30
    to me or send me an e-mail, whatever
    you're interested in. Thank you very much!
  • 51:30 - 51:37
    applause
  • 51:37 - 51:42
    Herald Angel: Hello! As Toni said, we have
    10 minutes left for questions. If you have
  • 51:42 - 51:47
    a question in the room, please line up in
    front of our five microphones. If you're
  • 51:47 - 51:50
    watching the stream, please ask your
    questions through IRC or Twitter, and
  • 51:50 - 51:54
    we'll also try to make sure to get to
    those. Let's just go ahead and start with
  • 51:54 - 51:56
    mic one.
    Question: Good! Thank you very much for
  • 51:56 - 52:03
    this beautiful talk. I was wondering how
    did the Chinese government, companies, and
  • 52:03 - 52:07
    most of all, the citizens themselves,
    respond to you doing this research, or,
  • 52:07 - 52:12
    let's put it differently, if you would
    have been in the system yourself,
  • 52:12 - 52:14
    how would your research affect your
    social credit score?
  • 52:14 - 52:17
    laughter
  • 52:17 - 52:26
    Answer: So, um... There's actually two
    different responses that I've seen. When I
  • 52:26 - 52:31
    talk to the government themselves, because
    I was there on a government scholarship,
  • 52:31 - 52:35
    and mentioned that I'm really interested
    in this, they basically said oh well this
  • 52:35 - 52:39
    is just a technical system. You don't
    really need to be concerned with this. It
  • 52:39 - 52:43
    is not very important. Just, you know,
    it's just a technicality. It's just for us
  • 52:43 - 52:49
    to make life more efficient and better for
    everyone. So I assume my score would
  • 52:49 - 52:55
    actually go down from doing this research,
    actually. But when I talk to a lot of
  • 52:55 - 53:01
    people at universities, they were also
    very – they were very interested in my
  • 53:01 - 53:05
    research, and a lot of them mentioned that
    they didn't even know that the system
  • 53:05 - 53:10
    existed!
    Herald: Before we go to a question from
  • 53:10 - 53:15
    our signal angel, a request for all the
    people leaving the room, please do so as
  • 53:15 - 53:20
    quietly as possible, so we can continue
    this Q and A. The signal angel, please!
  • 53:20 - 53:26
    Signal Angel: Jaenix wants to know, is
    this score actually influenced by
  • 53:26 - 53:32
    association with people with a low score?
    Meaning that, is there any peer pressure
  • 53:32 - 53:36
    to stay away from people with bad scores?
    Answer: The Sesame Credit score definitely
  • 53:36 - 53:43
    is influenced by your friends' scores, the
    Rongcheng score, so far, apparently, is
  • 53:43 - 53:48
    not influenced, but it is definitely in
    the cards, and it is planned that it will
  • 53:48 - 53:54
    be part of this. I think WeChat, which is
    the main platform – it's sort of like
  • 53:54 - 54:00
    WhatsApp, except it can do a lot
    more – WeChat is still not connected to
  • 54:00 - 54:05
    the Social Credit Score in Rongcheng. Once
    they do that, it will most likely also
  • 54:05 - 54:10
    reflect your score.
    Herald: All right, let's continue with
  • 54:10 - 54:15
    mic 3.
    Q: I have a question about your models.
  • 54:15 - 54:20
    I'm wondering, what kind of interactions
    are you modeling? Or actions, like, what
  • 54:20 - 54:25
    can the agents actually do? You mentioned
    moving somewhere else. And, what else?
  • 54:25 - 54:31
    A: Okay so the way I set up my model was,
    I set up a multilevel model. So I looked
  • 54:31 - 54:38
    at different kinds of levels. I started
    out with, basically, they can evade taxes,
  • 54:38 - 54:47
    they can get loans and repay loans, they
    can choose where to live, and they can
  • 54:47 - 54:54
    follow traffic rules or not follow traffic
    rules. And because these were, sort of,
  • 54:54 - 54:59
    four big issues that were mentioned in all
    of the different systems, so I started out
  • 54:59 - 55:05
    with these issues, and looked at, what
    kind of behavior do I see? I used some
  • 55:05 - 55:11
    research that – some friends of mine
    actually sent out surveys to people and
  • 55:11 - 55:16
    asked them "well, you're now part of the
    system. Did your behavior change, and how
  • 55:16 - 55:23
    did it change depending on your responses,
    depending on your score, and depending on
  • 55:23 - 55:28
    the score system that exists?" And I,
    basically, used that, and some other
  • 55:28 - 55:34
    research on nudging and on behavioral
    adaptation, to look at how likely is it
  • 55:34 - 55:39
    that someone would change their behavior
    based on the score.
  • 55:39 - 55:42
    Herald: All right let's do another
    question from the interwebs.
  • 55:42 - 55:48
    Q: Yeah, it's actually two questions in
    one. How does this system work for Chinese
  • 55:48 - 55:54
    people living abroad, or for noncitizens
    that do business in China?
  • 55:54 - 56:01
    A: Currently the system does not work for
    noncitizens that do business in China,
  • 56:01 - 56:05
    because it works through the shēnfènzhèng.
    You only get a shēnfènzhèng if you're a
  • 56:05 - 56:11
    Chinese citizen or you live in China for
    10 or more years. So everyone who is not
  • 56:11 - 56:16
    Chinese is currently excluded. Chinese
    people not living in China, if they have a
  • 56:16 - 56:21
    shēnfènzhèng, are on this system, but
    there's not a lot of information.
  • 56:21 - 56:29
    Herald: All right, mic 4.
    Q: Well, we've come a long way since the
  • 56:29 - 56:34
    Volkszählungsurteil. Can you
    tell us anything about the dynamic in the
  • 56:34 - 56:44
    time dimension? How quickly can I regain
    credit that was lost? Do you have any
  • 56:44 - 56:47
    observations there?
    A: So in the Suining system what they
  • 56:47 - 56:54
    actually did was they had a very, very
    strict period. So if you evaded taxes your
  • 56:54 - 56:59
    score would be down for two years and then
    it would rebound. In the Rongcheng
  • 56:59 - 57:03
    system, they did not publish this kind of
    period. So, my assumption is that it's
  • 57:03 - 57:09
    going to be more on a case by case basis.
    Because, I looked at the Chinese data, I
  • 57:09 - 57:14
    looked at the Chinese policy documents,
    and they didn't really, for most of the
  • 57:14 - 57:20
    stuff, they didn't say how long it would
    count. For the blacklists, which was kind
  • 57:20 - 57:25
    of the predecessor that we look at
    currently, the way it works is you stay on
  • 57:25 - 57:29
    there until whatever the reason for the
  • 57:29 - 57:35
    blacklist is has been resolved. So, you
    stay on there until you send off this
  • 57:35 - 57:40
    apology that the judge ordered you to. And
    then, usually, you still needed to apply to
  • 57:40 - 57:45
    get off. So it doesn't – for blacklists,
    it does not work that you automatically
  • 57:45 - 57:49
    get off. You need to apply, you need to
    show that you've done what they've asked
  • 57:49 - 57:53
    you to do, and then you can get off this
    blacklist. And I assume it will be a
  • 57:53 - 57:57
    similar sort of appeals procedure for the
    system.
  • 57:57 - 58:04
    Herald: All right. Let's go to mic 2.
Q: Thank you. I just wanted to ask if looking
  • 58:04 - 58:09
    up someone else's data in details, like
    position et cetera, does affect your own
  • 58:09 - 58:12
    score?
    A: Currently, it apparently does not, or
  • 58:12 - 58:17
    at least they haven't published that it
    does. It might in the future, but most
  • 58:17 - 58:23
    likely it's actually behavior that they
    want. So they want you to look up other
  • 58:23 - 58:26
    people's scores before doing business with
    them. They want you to, basically, use
  • 58:26 - 58:30
    this to decide who you're going to
    associate with.
  • 58:30 - 58:33
    Q: Thank you!
    Herald: All right, do we have another
  • 58:33 - 58:39
    question from the Internet, maybe?
    Signal: Yes, I do! Standby... The question
  • 58:39 - 58:48
    is, how is this actually implemented for
    the offline rural population in China?
  • 58:48 - 58:53
    A: Quite easily; not at all at this point.
    The idea is, by 2020, that they will
  • 58:53 - 58:59
actually have all of this implemented.
But even so, the
  • 58:59 - 59:06
    offline rural population in China is
    getting smaller and smaller. Even in rural
  • 59:06 - 59:13
    villages you have about 50-60% of people
    that are online. And most of them are
  • 59:13 - 59:16
    online via smartphone, and their
    smartphone is connected to the
  • 59:16 - 59:21
    shēnfènzhèng. So it's not very complicated
    to do that for everyone who is online. For
  • 59:21 - 59:26
everyone who's offline, of course, this
    is more problematic, but I think the end
  • 59:26 - 59:32
    goal is to not have people offline at all.
    Herald: All right. Let's jump right back
  • 59:32 - 59:38
    to microphone 2, please.
    Q: Thank you for the very good and
  • 59:38 - 59:45
    frightening talk, so far. At first I have
    to correct you in one point. In Germany we
  • 59:45 - 59:50
    have a similar system because we have this
tax I.D., which is assigned at birth and
  • 59:50 - 59:59
    lasts until 30 years after a person's death.
    Yeah. So we have a lifelong I.D.
  • 59:59 - 60:02
    A: You're right. I just... I don't know
    mine, so I figured… dismissive sound.
  • 60:02 - 60:08
    Q: No problem! But, at least we could
    establish a similar system, if we have a
  • 60:08 - 60:16
    government which would want it. A question
    for you: you mentioned this "guanxi." Is
  • 60:16 - 60:21
    it a kind of a social network? I didn't
    understand it, really.
  • 60:21 - 60:26
    A: Yes, it is a kind of social network,
    but one that is a lot more based on
  • 60:26 - 60:32
    hierarchies than it is in the West. So you
    have people that are above you and people
  • 60:32 - 60:37
    that are below you. And the expectation is
    that, while it's a quid pro quo, people
  • 60:37 - 60:42
    that are above you in the hierarchy will
    give you less than you will give to them.
  • 60:42 - 60:47
    Q: Aha, okay.
    Herald: OK, all right. Unfortunately, we
  • 60:47 - 60:52
    are out of time, so, please give another
    huge applause for Toni!
  • 60:52 - 60:54
    applause
  • 60:54 - 60:58
    postroll music
  • 60:58 - 61:17
    subtitles created by c3subtitles.de
    in the year 2019. Join, and help us!