Yi Tingyue has a lot going for her. She won a scholarship to China’s prestigious Sichuan University, where she graduated with a master’s in graphic design. She drives an Audi A4 and owns a penthouse apartment on the outskirts of provincial capital Chengdu. Vacations are spent touring Japan, Thailand and the U.S. Little wonder Yi is an 805.
That’s the score assigned to Yi by Sesame Credit, which is run by Jack Ma’s online-shopping empire Alibaba, placing the 22-year-old near the top of the scheme’s roughly 500 million–strong user base. Sesame determines a credit-score ranking—from 350 to a theoretical 950—based on “a thousand variables across five data sets,” according to the firm.
Unlike Western-style credit systems, Sesame takes in a broad range of behaviors both financial and social, all underwritten by an invisible web of Big Data. It’s the most prominent in a rising network of social-credit-score systems in China that are dramatically expanding the concept of creditworthiness—and raising fears internationally about Orwellian overreach by an autocratic regime.
In China, cash has long been king. As recently as 2011, only 1 in 3 Chinese people had a bank account. The nation’s rapid rise from collectivized penury to the world’s No. 2 economy meant it never had the chance to develop Western-style credit histories. That meant people could default on loans, or sell shoddy or counterfeit goods, with few repercussions. Society was dogged by a question: Whom can you trust?
In 2015, the government set about addressing this by allowing eight companies—including Sesame parent Ant Financial—to run trial commercial credit scores. The official guidance called for a nationwide system that would “allow the trustworthy to roam everywhere under heaven, while making it hard for the discredited to take a single step,” to be in place by 2020.
Data, of course, is key. As Sesame had access to the records of Alibaba’s mobile-payment app Alipay, which today boasts over 1 billion users worldwide, the company stole an easy march on its rivals in China. The system functions like a frequent-flyer scheme on steroids—one that transforms all of society into a first-class lounge. High scorers like Yi enjoy a bevy of perks. She can rent cars without a deposit, get better rates of foreign exchange and even skip hospital waiting lines. Rail stations have special waiting rooms for high Sesame scorers. For a time, high scorers could use a designated security queue at Beijing airport.
Yi keeps her score high in part by traditional brand loyalty: using Alipay wherever she goes and investing in the firm’s savings fund. But Sesame’s scheme doesn’t merely encourage good behavior. Yi fears “unsociable” behavior might also impact her score; she says she “wouldn’t dream” of parking a shareable bike in an undesignated place, for example.
The fear stems from the fact that as commercial credit systems like Sesame have started taking off in China, around half a dozen local authorities have launched pilot credit systems specifically designed to socially engineer behavior. These, too, ascribe a number to citizens. Good deeds gain points; bad deeds lose them, with perks and hardships attached. If you fall afoul of a pilot system, you cannot get a loan or mortgage, even if your offense is nonfinancial, like quarreling with neighbors. Conversely, nonfinancial “good deeds”—like giving blood—can cause your loan’s interest rate to drop.
This overlapping mishmash of commercial and state-run systems has stoked fears that China’s autocratic Communist Party is planning by 2020 to implement a single citizen score using opaque algorithms that’s largely political. In a speech Oct. 4, U.S. Vice President Mike Pence described it as “an Orwellian system premised on controlling virtually every facet of human life.”
In fact, China’s social-credit scoring is best understood not as a single system but as an overarching ideology: encompassing punishments and rewards, to improve governance and stamp out disorder and fraud. Commercial schemes mostly handle the perks, state schemes the punishments. Both work in concert to encourage socially responsible behavior.
But they are undeniably intrusive. Government agencies compile and share data on judgments against individuals or companies. Fail to pay a fine or court-ordered compensation, or default on your debts, and you will be put on the “List of Untrustworthy Persons.” Blacklisted individuals cannot make “luxury purchases,” such as high-speed rail and air tickets or hotel rooms. Five million people have been barred from high-speed trains and 17 million from flights under the scheme, according to the official website. “The ripple effect on every part of your life becomes a multiplier on punishments,” says professor Frank Pasquale, a Big Data expert at the University of Maryland.
And some elements are indeed worthy of dystopian fiction. In certain areas of China, call a blacklisted person on the phone and you will hear a siren and recorded message saying: “Warning, this person is on the blacklist. Be careful and urge them to repay their debts.” When a blacklisted person crosses certain intersections in Beijing, facial-recognition technology projects their face and ID number on massive electronic billboards. Beijing-based lawyer Li Xiaolin was blacklisted after a court apology he gave was deemed “insincere.” Unable to buy tickets, he was stranded 1,200 miles from home.
“It was ridiculous,” he says.
Although it sounds draconian, the repercussions of blacklisting “tend to be coercive rather than punitive,” China legal scholar Jeremy Daum says. Once you’ve complied with the court ruling, you’re theoretically scrubbed from the blacklist (although many report that this is easier said than done). Essentially, the system uses technology to enforce existing legal prohibitions through shame and hardship rather than by adding a new layer of arbitrary regulation. Despite being blacklisted himself, Li is in favor of the system overall. “It’s good for society,” he says.
The fear, however, is that it will also be good for a Chinese government eager to quash free speech and root out dissenters. Under the state systems, “spreading rumors” is disproportionately penalized. In one region, neighborhoods have dedicated watchers to record deeds and misdeeds. As the state continues to persecute religious minorities like the Uighurs and silence outspoken academics, the worry is what metrics the social-credit systems may extend to next.
Still, for now people like Yi are happy to relish the newfound order that a social-credit score can bring—as well as enjoy the perks. “At first I worried about all my information being exposed,” says Yi, sipping free Starbucks coffee courtesy of Sesame’s loyalty points. “But then I thought, I’m just a normal kid, a regular person. What harm could there be?”