Week 7 Readings, Viewings and Discussion: Social Scoring #23
Replies: 13 comments 9 replies
-
Responding to the second question regarding the equality of the social credit score, I do not believe that it is actually possible to gauge all citizens equally. I think that 'trustworthiness' is too vague a quality to measure with concrete, solid criteria, which leads to biased and inconsistent results. The article says the system started out by simply analyzing whether or not a citizen fulfilled a contract or legal commitment, but that seems like a slippery slope to begin on, especially once local governments, which face less direct regulation, get involved. The article then describes how the system began to award points for good behavior and deduct points for bad behavior such as traffic violations. Lastly, it touched on incorporating filial piety into the scoring system, meaning it would measure how frequently one visits one's parents, and so on. With all of these abstract measurements, I cannot see how all citizens could possibly be gauged equally. When it comes to filial piety especially, I believe citizens of lower socioeconomic standing are particularly at risk. These citizens may already be working several jobs, taking on extra hours, and handling other daily tasks just to make ends meet, leaving them no time to visit their parents. It would not be fair to deduct points from citizens who cannot afford to spend time away from their jobs when they are simply trying to support their families. Lastly, we have also already seen that the creators of algorithms design them with biases and other persuasions built in.
I believe this social credit score would be no different: the designers deciding what to measure from each citizen can bias the measurements, and the algorithm that collects and analyzes the data can begin to incorporate its own biases as well. This raises the concern that people with a lower socioeconomic status, people of color, and other minority groups will be at risk of receiving low social credit scores through no fault of their own. Every system we have studied so far this semester has developed its own bias, and I don't believe the social credit score will be any different. For these reasons, I do not believe it will be possible to gauge all citizens equally under this score.
-
One thing I would like to mention, as someone who identifies as of Chinese descent, is that I don't think the English translations of the Chinese terms do justice to their meaning, and they impose a certain assumption or bias. 信用 (xinyong) is a positive term in my head, yet it got converted into "credit" and treated in the context of something similar to the American FICO score, which often carries a negative connotation. Of course, something is always lost in translation. I mention this because there is a huge difference between "credit" and "trustworthiness" in English, and it creates a real gap between what I believe I understood and what the wording of the readings led me to understand. I especially did not like how Hvistendahl glossed over the deep anxiety around pianzi (swindlers) without expanding on why it runs so deep historically and socially, because it is completely different from how we view swindlers in America. To answer the fourth question, I find Chen and Cheung's argument more convincing because it is well supported by background analysis, citations, and examples. Additionally, I disagree with some of the arguments Horsley brings up, especially the claim that the social credit system does not collect data on every citizen. Even if the system collects no information on an individual, that absence is itself data: it records that the individual does not use certain services. Several of the other readings briefly touch on the fact that if you aren't in the system, you are ignored by the system. Another reason I believe more in Chen and Cheung's argument is that their paper comes from the University of Hong Kong, whose political ties with mainland China have always been strained but inseparable.
Their understanding of the Social Credit System simply holds more weight for me, in addition to their detailed analysis.
-
For question 5, I would like to bring up something that interested me. Chen and Cheung mention how children's education is affected as a sanction of the scoring system, and Hvistendahl mentions punishments for those who cheated the scoring system. I do not know if those in this discussion are aware, but the class ranking system in China is very public compared to the American one (where high schools have been abolishing class rank over the years). It has already caused a lot of societal issues in China, especially with the rise of terminology such as xueba (学霸) and xuezha (学渣). You can read a little more about it in "Student Rankings Are Making Parents and Kids More Miserable." I think the social credit system would exaggerate the pressures people already experience from youth, as their lives continue revolving around a score, starting with school class rank and then the gaokao (高考) score (the score that determines what college you attend, and the most important score in a student's life). The readings don't really expand on this, but I would like to pose a question: how do you think the social credit system would change how students transition from adolescence to adulthood? The rise of mobile payments and big tech is, after all, expedited by our generation and the one below us.
-
Something I found surprising in the reading was that citizens in China do not even know what contributes to their social score, in particular which of the data collected on them is used to generate it, and the difference between how big data is used in the U.S. compared to China. Although big data has many positive uses (companies trying to understand customer behavior online and offer personal recommendations, or thermal scanners during a pandemic, for instance) and is very helpful in making people's lives easier, the way it is used in this case in China is, in my opinion, very unethical. In the U.S., it is currently not legal to collect the kind of data that China collects on its citizens, and much of what is collected in the U.S. is gathered by companies rather than being directly tied to the government, unlike in China (where warrants are not needed to collect these types of data on people). We have also had many laws in place regarding personal data over the years, whereas in China there are none, keeping the situation vague on purpose, in my opinion. I was surprised that this collection is barely regulated by law in China and that there is no legal system protecting personal data. People simply have no right to privacy at all and are not being treated fairly, giving me the impression that citizens are stripped of their freedom and must constantly stress about doing good things for society and not sharing too much online, under pressure of receiving a bad score. For example, the government partners with internet "titans" like WeChat and other private entities to identify trendsetters in social groups and share their data with the government (from social media and phone purchases to gaming data). All of this reminds me of Black Mirror's "Nosedive" episode, with the social scoring system driving people's quality of life, behavior, and actions.
What's ironic to me is that the system's stated purpose is to assess citizens' trustworthiness in complying with laws, norms, and professional and ethical standards, yet the process itself seems unethical in how extensively it uses big data on people. Therefore, I believe China needs to tighten this loose legislation on data privacy, for instance by not partnering with companies that share all kinds of data directly with the government, and by enforcing actual laws that protect citizens and their personal information.
-
In response to prompt 5, something I found surprising in the readings was the pervasiveness of big tech companies such as Alibaba in China. The personal account of Lazarus Liu in Mara Hvistendahl's "Inside China's Vast New Experiment in Social Ranking" was especially eye-opening. In this article, Hvistendahl recounts a typical day for Lazarus Liu in China, focusing on all the interactions he has with Alibaba technology. I was shocked by the extent of Alibaba's reach into the lives of Chinese citizens. For example, the article mentions how Alibaba's Alipay even imitates Samsung's mobile OS, displaying third-party applications on its home screen and allowing Alibaba to track usage data that goes beyond its own application's services. As the article progresses, Hvistendahl discusses her own experiences with Alibaba's expansive network of data collection points: on a single afternoon, Alibaba was aware of almost all of her actions and behaviors. While this is primarily attributable to Alibaba's service Alipay, the article also discusses how Alibaba owns its own grocery stores (which accept only Alipay) and holds several strategic investments in other popular businesses, such as the bike-sharing service Ofo. This scale enables Alibaba to monitor essentially every aspect of its users' lives, and now, with the Zhima Credit score, this monitoring can be distilled into a score that directly impacts those users' livelihoods. I believe this example demonstrates the malicious potential of big tech when corporations can operate under limited regulation and oversight. While I believe this is an issue in the United States as well, I think it is much more extreme in China due to the widespread monopolization of technology there, as well as the Chinese government's strict oversight of its citizens.
This combination poses a great threat to the privacy of the Chinese people and gives the government seemingly unlimited latitude in how it will monitor its citizens going forward. Looking ahead, I believe it is crucial to establish clear laws and regulations for technology companies, especially when it comes to monopolization. In our constantly expanding digital world, scaling has become increasingly easy for large corporations, and the pervasiveness of technology gives them ever greater reach into our daily lives. Therefore, in order to mitigate this almost god-like access to people's information, there must be limits on how much information these corporations can control.
-
The sanctioning done in China once someone falls below a certain threshold is not a good system. Barring people from travel and job opportunities leaves them stuck in holes they cannot climb out of. For example, if someone's social credit was dinged because they failed to visit family members who live across the country, the resulting travel restrictions would make it even harder to raise the filial piety component of their score. Many of the sanctions discussed create slippery slopes that push those who receive them even further down the social ladder while limiting their opportunities to climb back up. In the Black Mirror episode, this kind of slippery slope is exaggerated when Lacie is unable to get a premium plane ticket because her social score dipped below 4.2. I can also see how these types of sanctions could be illegitimate. In the US, a clear example of illegitimate sanctions is the use of the no-fly list by the TSA. This has affected many Middle Eastern and Muslim Americans in the past, and as recently as January it was suggested as a way to punish those who participated in and were present at the Capitol Hill riots. There are many stories of people barred from travel simply because they shared a name with an actual criminal or terrorist; they were accidentally included on the list without their knowledge. The main reason these sanctions often become illegitimate is that there is no form of due process. Potential members aren't called into court to defend themselves and often aren't even notified of their inclusion until they try to board an airplane. All of the power and control over these lists is given to federal agents, and federal law enforcement officials can place anyone they want on them for many reasons, from lack of cooperation to being an actual terrorist. The proposed Chinese sanctions are very similar to the US no-fly list.
Additionally, China's legal system has been heavily criticized for its lack of due process (given the arrests and imprisonment of reporters and dissenters on bogus charges). I would expect this social system and its sanctions to follow a similar or worse path than the no-fly list in the US because of the untrustworthy legal system that exists in China: Chinese citizens will be sanctioned with little notification and no way to contest the decision and get themselves removed from these lists. Article on the US no-fly list: https://www.rollcall.com/2021/01/14/lawmakers-want-capitol-attackers-added-to-u-s-no-fly-list/
-
In response to prompt four, I found Horsley's argument more convincing than Chen and Cheung's regarding the social credit system. It seems that this system has been demonized through widespread myths and rumors. Its intention appears similar to that of ordinary rules: to shape the behaviors and actions of citizens for the good of the country. It is also often true that people think the government is watching or monitoring everything they do, when in reality the focus is in many cases on the aggregate rather than the individual. I find this more believable than the view that the Chinese government is personally and individually monitoring each and every one of its citizens. Overall, Horsley just seems to have a more realistic view of the system than Chen and Cheung. The potential for harm and risk is there, but the system does not seem nearly as intrusive and damaging as it has been made out to be. Using people's data to determine rewards and consequences seemed like a horrifying and unimaginable concept to me at first, but after examining these systems relative to my social location, I can see that data usage like this is happening all around me. I also found it interesting to compare China's social credit system to systems in the United States used for similar purposes. Credit scores in the United States determine where you can live, what you can buy, and how much you can spend. Though they may not be based on the same level of sensitive data that the social credit system uses, US credit scores do control and determine many aspects of your life. Another comparable system is the American Social Security number. This number is essentially tied to your identity and allows the government to track you and many aspects of your life.
Your licenses, addresses, financials, and criminal history are all tied to this number you were assigned at birth, and it is readily accessible by the government and law enforcement. Private companies, too, have access to your consumer wants and needs through the Internet, and technology is being developed to track you, learn from you, and, to an extent, manipulate your behavior. While it is important to consider the ethical implications of big data usage and privacy, it is also important to recognize that other parts of the world may use technology in ways contrary to our democratic, capitalist society. America's current systems have their own abundance of issues and perpetuate many harmful societal problems like racism, sexism, and classism.
-
The entire Wired reading "Inside China's Vast New Experiment in Social Ranking" was very interesting but not very surprising. I do not see it as shocking, since technology around the world is progressing at an exponential rate. While reading the article, it seemed like society as a whole continues to have the same long-existing problems, which are perpetuated by new technologies like the ones discussed in the article, Ant Financial and Alipay. I find it interesting that the creators of the social ranking system will not provide details on what they actually track, and would rather only highlight the positives, like how "your data will magically open doors for you." But will your data actually close more doors than it opens? From the article, it seemed like the algorithm feeds into the systemic inequality societies have faced for centuries: someone in a lower class has a nearly impossible path to moving up, since the system already targets them. I say this because the technology discussed docks scores for "behaviors that have nothing to do with consumer etiquette." The technology also rates people lower if their contacts have poor ratings; how, then, is a low-scoring person supposed to befriend people with higher scores if doing so would decrease those people's scores? It logically follows that a person with a lower score will stay at the bottom. This technology introduces a whole new, intricate level of social class. Additionally, what I find interesting, and do not really understand, is why this technology makes it a goal to create "consequences for dishonest behavior" when, in my opinion, the institution that experienced the dishonesty should be the one to pursue consequences. It seems that if a person makes a mistake or shows a flaw in judgment once, they are heavily persecuted and might even be labeled as "common folk." This technology perpetuates a society built upon toxicity and dishonesty.
For example, rating people higher if their friends have high scores leads people to make assumptions about their contacts and about whom they need to stay away from. Lastly, I do think this technology could be used for good if pursued in a just and equal way; however, I do not think its inventors have mostly good intentions.
-
Although it depicts a superficially utopian society that does not reflect the reality of any actual social credit scoring system, I was intrigued by the way over-obsession with social scores can affect everyday life and interactions in the Black Mirror episode "Nosedive." The idea cited in Horsley's argument that "whoever violates the rules somewhere shall be restricted everywhere" is crucial in Lacie's life in this episode. As her social score spirals downward, she is increasingly restricted not only from social events, like Naomi's wedding, but also from renting a car or catching a ride from a stranger. Because of a mere score, she is looked down upon by everyone who interacts with her. Especially in a world where a person's social score is the first thing you see upon meeting them, the sheer impact this has on how someone is treated is astonishing. Even Lacie, whose score has deteriorated into the 2 range, is hesitant to take a ride with a perfectly nice woman just because her score is a 1.4. This numbers game dominates her life, and the message the episode puts forth is that one can only find true happiness when freed from the forced 'perfect' society, finally able to express honest opinions without fear of being rated lower for that honesty. With China facing the prospect of a national social credit score that would govern and restrict a person's activity in society, Nosedive paints a dramatized yet concerning picture of what a society dictated by numbers looks like. The goal of improving governance and order in a country by publishing blacklists and red lists, which indicate a person's trustworthiness, may be well-intentioned.
However, by collecting biased data that does not truly reflect a person's character and then using it to exclude people, this type of social credit system has the potential to dominate every aspect of a citizen's life, as it did for the members of the society in Nosedive. Factors like social behavior, consumption habits, and political loyalty do not necessarily reflect a person's credit trustworthiness. Something that particularly struck me in Hvistendahl's article "Inside China's Vast New Experiment in Social Ranking" was the idea that a person's friends and acquaintances could also impact their social credit score. Although Facebook has revised its platform policies to prohibit this kind of data from impacting credit eligibility, the idea that a person's social network could someday shape the government's assessment of their trustworthiness is truly terrifying. In Nosedive, Lacie shuns a coworker whose score dropped after a breakup, not wanting to be associated with someone lower-scored for fear of it negatively impacting her own rating. A big data expert in Hvistendahl's article likewise commented, "you could imagine a future where people are watching to see if their friends' credit is dropping and then dropping their friends if that affects them." The notion that the people someone associates with could affect their score is, I feel, fundamentally wrong and something to be cautious about as these algorithms progress.
-
With regard to prompt #2, scoring people on "filial piety" and other cultural norms seems hard to do in a fair and consistent manner. From a Western point of view, it at first seemed rather unacceptable to me that these social credit scores intend to use a measure of how well one takes care of one's parents as a reflection on the person, because I immediately imagined scenarios where children who are estranged from their parents, or who suffered unhealthy relationships with them, would be unfairly punished for cutting ties. That said, taking care of one's parents and grandparents is a common cultural expectation in China, and specifically is considered a duty, much as we in the U.S. consider things like serving on a jury to be an honor and a duty. So it makes sense that a social credit score would try to include qualities that Chinese culture values when measuring a person's social status. However, while it is understandable to me that the score would try to include such qualities, I don't believe it is fair, or even possible, to truly measure them. In fact, the whole concept of social credit scores seems inequitable and punitive. To begin with, treating how well-fed someone's parents are as a reflection on that person reinforces the social subjugation of the poor. A person raised in poverty in China may do their best to take care of their parents, but if the cyclical nature of poverty leaves them with poor options for work when they are older, they may not be able to care for their parents to the creditors' satisfaction, and they may be punished with a drop in their social score. This lower score would then reinforce their limited opportunities in life, keeping them in poverty and unable to access certain programs and benefits. The same logic applies to most of the system at large.
If being friends with a person of a lower score harms your own, then there is little incentive for friends to uplift friends. Instead, people get stuck in the social strata they begin with, unable to climb their way to a better score, which effectively reinforces people's existing social standing and prevents those the creditors deem "subpar" from moving up in society. There is simply no fair, comprehensive way to measure the quality of a person, because values, opportunities, and personal situations differ so widely that there cannot be a single yardstick for measuring everybody against one another. This attempt to do so through social credit merely exacerbates the struggles of the lower class while creating competition for benefits among the upper class.
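The feedback loop described above, where merely being connected to a low-scoring person bleeds points from your own score, can be made concrete with a toy simulation. To be clear, this is a hypothetical rule invented for illustration (the actual scoring formulas are not public); the threshold, penalty size, and names are all assumptions:

```python
# Toy model of a one-way contact penalty (hypothetical rule, NOT the
# actual social credit formula, which is unpublished): each round, a
# person loses points for every contact whose score sits below a
# "blacklist" threshold, but gains nothing from high-scoring contacts.

THRESHOLD = 550   # assumed cutoff below which a contact counts against you
PENALTY = 10      # assumed points lost per low-scoring contact per round

def step(scores, contacts):
    """One synchronous round: everyone is penalized per low-scoring contact."""
    new = {}
    for person, score in scores.items():
        low_friends = sum(1 for f in contacts[person] if scores[f] < THRESHOLD)
        new[person] = max(0, score - PENALTY * low_friends)
    return new

# Two cliques joined by one bridge: a/b/c start high, x/y/z start low,
# and only "c" keeps a friendship across the divide.
contacts = {
    "a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "x"],
    "x": ["c", "y", "z"], "y": ["x", "z"], "z": ["x", "y"],
}
scores = {"a": 800, "b": 780, "c": 790, "x": 450, "y": 430, "z": 440}

for _ in range(10):
    scores = step(scores, contacts)

print(scores)
# After 10 rounds: a and b are untouched, the bridge node "c" has bled
# 100 points for its one low-scoring friendship, and the low clique has
# each lost 200 points with no mechanism for recovery.
```

Even in this tiny sketch the dynamic matches the argument: the only rational move for c is to drop x as a contact, and the low-scoring clique can only fall, never rise.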
-
In the article by Chen and Cheung, it is stated that the Shanghai government considered including filial piety in calculating a person's social credit score. They provide two examples of how it could factor into the score: the "frequency in which an individual visits his or her parents" and "whether an individual's parents had enough food". For both examples, there are multiple barriers that may prevent a person from satisfying the criteria. Given that this was a potential criterion in the score, it is difficult to say the score would objectively and accurately gauge a person's "worthiness" of a high rating. Just as in the United States credit system, a score does not reflect the full experience of an individual; rather, it potentially reflects whether a person made poor choices at some point in their life. Considering the circumstances that can affect an individual's relationship with their parents, evaluating how they treat their parents, or how well cared for their parents are, would not accurately depict the individual as a whole, only that at some point they made decisions that could have affected their parents. Sanctioning, as referenced in the article, has the potential to drown citizens who already have low social credit scores. An example would be someone with a low score attempting to move to a new city to find better employment: if a sanction has been placed on them, this person may find themselves out of a job and without the opportunity to increase their social credit score, preventing them from ever rising out from under the overbearing measure. This approach of, for lack of a better phrase, "beating a dead horse" has little upside for those who have strayed from the right path, and all the benefit goes to those with the luxury of never making a mistake that would negatively impact their score.
After reading Horsley's article, it is difficult to say which argument is more convincing. Horsley seems to shrug the concept off as nothing more than a bad interpretation of the system, while Chen and Cheung provide detailed reporting on the implementation and implications of the social credit system. While Horsley provides examples of what the "actual" credit system in China looks like, Chen and Cheung provide examples of how an individual's life can be affected by it.
-
This is something I was watching the other day, and I think it adds to the discussions we've been having. WongFu Productions released their movie "Everything Before Us" (2015) on YouTube, and it actually came out a year (I believe?) before the Black Mirror "Nosedive" episode. The movie depicts an emotional integrity score system controlled and handled by the DEI (Department of Emotional Integrity). I thought it would be interesting to share. It's a romantic drama film, but I think it portrays another form of social scoring that we haven't explicitly talked about but sometimes inherently see in society. Link to the YouTube playlist: https://www.youtube.com/watch?v=qcHbQi-jznQ&list=PLSHabwxChOtWGtvKtFaUrIbWYDlk4uEi-&ab_channel=WongFuProductions
-
Half of me feels like social scoring could be an effective means of proving trustworthiness, while the other half thinks only about the privacy issues, social impact, and potential for complete government control. As someone who does believe in rewards systems but also dislikes communism, I found myself constantly debating the usefulness and ethical concerns of this system. In the Hvistendahl article, the author talks to a journalist named Liu Hu who, without knowing it, made an error when paying off a fine equivalent to $1,350. It was not until he tried to travel much later that he realized his error had cost him far more than $1,350: it had landed him a spot on the blacklist, a position that restricted his services and traveling privileges. At first, I tried to be open-minded and saw this as a more efficient form of credit: the contributing members of society benefit and those who do not contribute are penalized. Considering that the government did not know his failure to pay the fine was accidental, this made sense to me. Of course the system has its errors, much like our credit system. For example, Liu tried to pay the fine correctly and tell the court he had fixed his error, but, as he mentions, "there are many mistakes in implementation that go uncorrected." I thought this was similar to a case of identity theft in the US credit system. Obviously, payment errors and identity theft do not happen to everyone, but when they do, they can go uncorrected for a detrimental amount of time. Nevertheless, playing devil's advocate did not last long after reading about the social implications and the methods for gaining a better score. After Liu told his unfortunate story, he mentioned that friends begin to "quietly drop you as a contact" if they discover you are on the blacklist, out of fear of dropping their own score.
While the algorithm might be effective at determining trustworthiness, these flaws eliminate the human aspect of social interactions and leave little room for second chances. Moreover, the methods by which someone can increase their score seem disingenuous as well. Mara Hvistendahl mentions in her article that "helping the poor merited 10 points," but helping "the poor in a way that was reported by the media: 15." While this may have referred to an earlier version of social scoring, the idea alone was the closest parallel I could find between the social scoring system and Nosedive. With the accelerating rate of technological growth and the ethical dilemmas of social scoring in mind, I find it ironic that Horsley states that "a single and all-pervasive ranking system isn't [a thing to be worried about]—yet." This is exactly what we should be worried, or at least concerned, about, considering its potential for global spread. As Chen and Cheung mention, "given the inadequate protection afforded to personal data in China, the country is an ideal social laboratory for big data experimentation." This is something policy-makers need to take into consideration when looking to integrate such a system into their own societies. Chinese society has already integrated this system, but its people were not given much of a choice. Since China regulates "speech, association, and other civil rights," the exact parameters needed to integrate this system into other societies would have to be vetted extensively, with strict guidelines for what counts as "good and bad," and the privacy restrictions on government use would have to be clearly identified. In a country like the United States, I have trouble seeing a full integration of this essentially involuntary technology, especially a government-assigned SCS, but these are issues we should prepare for rather than waiting until it is too late.
-
Week 7 (March 16/18)
Lead: Team 5
Blog: Team 3
Required Readings and Viewings (for everyone):
Response prompts:
The response prompts focus on the Chen and Cheung article, which will be the focus of the class meeting on 16 March. Post a response to the readings that does at least one of these options by 10:59pm on Sunday, 14 March (Team 1 and Team 3) or 5:59pm on Monday, 15 March (Team 2 and Team 4):