Beyond Surveillance Capitalism: Privacy, Regulation and Big Data in Europe and China
To cite this article: Brett Aho & Roberta Duffield (2020): Beyond surveillance capitalism:
Privacy, regulation and big data in Europe and China, Economy and Society, DOI:
10.1080/03085147.2019.1690275
Abstract
Keywords: surveillance capitalism; social credit system; GDPR; data privacy; big
data; data regulation.
Introduction
In recent years China and the European Union have each adopted major policies
on big data, placing their digital economies on two fundamentally different
paths of development. In China, the social credit system (SCS) is being culti-
vated as one of the most substantial social and economic reform projects in
national history and is expected to emerge as a defining institution shaping
China’s continued development in the information age. In Europe, the
General Data Protection Regulation (GDPR) has been formulated as a compre-
hensive regulation on data protection and privacy, defining the way that both
companies and states are able to collect and use data. This paper proposes
that each governance project represents a radically different approach as to
how data are conceptualized, with substantial implications for future social
and economic progress. China’s SCS has been in development since 2014
and aims to have most of its basic structures in place by 2020, whilst the
GDPR was proposed in 2012 and adopted by the European Parliament in
2016, with most provisions entering into force in 2018.
This paper asserts that both the GDPR and the SCS have emerged in
response to the globalized expansion of a relatively new logic of accumulation
that scholar Shoshana Zuboff (2015, 2019) refers to as ‘surveillance capitalism’.
In brief, technology corporations, bolstered by a dearth of regulatory oversight,
have gradually expanded their capacities for data collection and analysis as more
and more human activity has moved online. Empowered by new algorithmic
technologies, these corporations have developed capacities to mine vast data-
bases of behavioural data, transforming individuals into data subjects whose
actions, decisions and attitudes can be understood and manipulated for profit.
This has resulted in a situation where a relatively small number of corporations
now wield a substantial degree of power over the social and economic beha-
viours of consumers and populations around the world. This is the context in
which both China’s SCS and Europe’s GDPR have emerged.
Although at first glance comparing these two pol-
icies may seem an exercise in apples and oranges, the SCS and GDPR can be inter-
preted as concrete steps that each government has taken in response to the
proliferation of data surveillance infrastructures. In a sense, they represent
more than the policies that they inscribe on society and can be seen as broad
normative statements on how each governing entity conceptualizes big data
and how its associated technologies should be harnessed or restrained. Both
governance projects represent assertions of state control over digital sectors,
and both fundamentally alter the shape and course of digital economic develop-
ment. Two very different futures emerge, with the European Union attempting
to limit the power of surveillance capitalism with the passage of the GDPR, and
China fully embracing its logics for further state use. The implications of these
steps are substantial, placing Europe and China on very different paths in terms
of social and economic development. In effect, what is emerging is two dissim-
ilar forms of capitalism, operating on fundamentally different sets of logics.
The rapid adoption of digital technologies has led to the production of massive
troves of data by humans simply carrying out day-to-day activities. With every
text and e-mail, every social media post, every website visit, every click or swipe,
every song listened to, video watched, item purchased, game played, bill paid,
place visited or medical symptom researched, behavioural information is now
collected and stored. Massive databases have emerged, containing personal
data on billions of individuals that can be analysed, studied and instrumentalized
to modify future human choices. Tech firms have been amongst the first to
embrace these possibilities, selling data and behavioural predictions to commer-
cial ventures and advertisers, who in turn harness these insights to more effec-
tively market products to consumers. Zuboff (2015, 2019) refers to this new
structure of consumer relations as ‘surveillance capitalism’, arguing that data-
driven consumption operates on fundamentally different logics than traditional
market capitalism, in large part due to its propensity to anticipate and modify
human behaviours. Karl Polanyi (2001[1944]) suggests that industrial market
capitalism is largely based on the construction of three ‘fictional commodities’
in which human life is reframed as labour, nature is reframed as real estate,
and exchange as money. Zuboff (2015) builds on this conceptualization,
suggesting that surveillance capitalism has reframed a fourth fictional commod-
ity; reality itself. As she elaborates:
it has ever been before. As proposed by James C. Scott (1998), if something can
be rendered legible, it can also be manipulated. In examining the foundations of
past state endeavours to harness the power of data to implement utopian social
projects, he identifies four key elements:
the legibility of a society provides the capacity for large scale social engineering,
high-modernist ideology provides the desire, the authoritarian state provides the
determination to act on that desire, and an incapacitated civil society provides the
leveled social terrain on which to build. (Scott, 1998, p. 5)
In the case of surveillance capitalism, big data provides the capacity, shareholder
value provides the desire, powerful firms provide the determination to act on
that desire, and an unwitting or indifferent populace provides the levelled
social terrain on which to build. Within a milieu of deregulation, where corpor-
ate actors have become the dominant power exerting influence over social
forces, high modernism has become privatized, motivated not by a utopian
ideal, but by the generation of profit.
Zuboff uses the term ‘extraction’ to describe the relationship between corpor-
ate entities and individuals under surveillance capitalism, describing a process
by which data are mined from a population, then analysed, operationalized
and deployed to shape or modify behaviour (Zuboff, 2015, 2019). Data col-
lection capacities have expanded dramatically as the sharing of personal
information by citizens has become normalized, and through the application of
powerful algorithms, firms are often able to understand and predict individual
behaviour better than individuals themselves can. Today, applications are developed and
tweaked using extensive A/B testing that seeks to maximize the time individuals
spend on apps, as well as the breadth and depth of information that is dissemi-
nated and collected (Christian, 2012). In a TED talk entitled ‘How a handful of
tech companies control billions of minds every day’, former Google design ethi-
cist Harris (2017) stresses that firms are effectively hijacking human brain
activity for profit. Indeed, the work of corporate data scientists is largely to
understand and learn how human-technology interactions can be more effec-
tively manipulated. As Nicholas G. Carr (2011) describes, on a neurobiological
level, the human brain itself is becoming the subject of profit maximization
strategies as human-technology interaction becomes increasingly integrated
into daily life.
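The engagement-maximizing A/B testing described above can be reduced to a simple decision rule. The sketch below is purely illustrative, not any firm's actual pipeline: the session data and function name are hypothetical, and a real deployment would use a statistical significance test rather than a bare comparison of means.

```python
# Illustrative sketch of the bare logic of an A/B test on time-on-app,
# the kind of engagement optimization the text describes.
# All data and names here are hypothetical.
from statistics import mean

def pick_variant(time_a, time_b):
    """Ship whichever interface variant kept users engaged longer on average."""
    return "A" if mean(time_a) >= mean(time_b) else "B"

# Minutes per session under two interface designs (made-up numbers).
variant_a = [12.0, 9.5, 14.2, 11.1]
variant_b = [13.4, 15.0, 12.8, 16.1]
print(pick_variant(variant_a, variant_b))  # prints B
```

In practice such tests run continuously over many interface variations at once, which is what allows interfaces to drift, iteration by iteration, toward whatever maximizes the collection of behavioural data.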
This is not to suggest that technology has transformed the individual into a
mindless automaton whose free will has been subjugated by algorithms con-
trolled by data operators. However as more and more aspects of human life con-
tinue to move from the analogue to the digital, it is important to consider that
most digital interfaces are being developed by corporations whose motivations
are first and foremost profit-driven (Alaimo & Kallinikos, 2017). As ‘nudge’
capacities increase, these interfaces are increasingly designed with behavioural
modification in mind, with substantial implications for power relations within
modern capitalism (Rouvroy, 2012). As Zuboff (2015) notes,
False consciousness is no longer produced by the hidden facts of class and their
relation to production, but rather by the hidden facts of commoditized behavior
modification. If power was once identified with the ownership of the means of
production, it is now identified with ownership of the means of behavioral modi-
fication. (p. 82)
This power over the means of behavioural modification is precisely what China
seeks to harness in its development of SCS, and what Europe seeks to challenge
in its adoption of the GDPR.
The rise of surveillance capitalism can be largely connected to the neoliber-
alization of political and economic structures as witnessed in the Atlantic region
during the latter half of the twentieth century (Zuboff, 2019). Having come of
age during an era of growing global free-market hegemony, Silicon Valley and
the wider Western tech sector has emerged as one of the least regulated indus-
tries in modern history relative to its size. This is compounded by the fact that
most existing regulatory institutions are simply not designed to respond to the
challenges of the information economy, having been conceptualized in an era
when industrialism was the primary driver of development (Cohen, 2016). As
a result, most major technology companies have been allowed to operate free
from state oversight, enabling the development of surveillance and social engin-
eering capacities that might otherwise raise the alarm of state regulators.
As information technologies have begun to push civilization towards new
forms of capitalist relations, at least two governments have made moves to
assert some control over the ways that these technologies impact their societies,
demonstrating a certain democratic (European Union) or authoritarian (China)
self-determination over the forces of unrestrained techno-capitalism. In much
of the rest of the world, including the United States, business models premised
on mass data surveillance seem to be expanding. Even in the wake of the Cam-
bridge Analytica scandal and foreign interference in the 2016 presidential elec-
tion, the United States has yet to adopt any substantial data privacy laws at the
federal level. Rather, surveillance capitalism as a new mode of accumulation has
begun to permeate a range of industries, including finance, insurance,
automotive, retail and travel (Zuboff, 2019). With the consider-
able lobbying power of the tech industry, and the considerable profit-
maximizing potential that surveillance-centred business models provide to
other industries, it will be difficult for liberal market economies to regulate
firms’ use of data. Indeed, in much of the world, a sort of social contract
seems to have emerged in which citizens tacitly accept data surveillance as
long as firms continue to provide desirable services. In less developed countries,
the focus of major corporations seems to be on securing new sources of data
extraction, perhaps best exemplified by Facebook’s controversial ‘Free Basics’
programme (Hempel, 2018; Yim et al., 2017).
Scholars tend to form similar predictions around how unregulated tech
sectors will gradually reshape societies. Columbia law professor Wu (2010) pre-
dicts an expansion of cartels and monopolies. Similarly, scholars Mayer-
Schönberger and Ramge (2018) argue that power will become concentrated
amongst those companies that control the most valuable data. Historian Yuval
Noah Harari (2018) concurs, asserting that regulation of the ownership of
data is the key to preventing the concentration of wealth and power amongst
a small elite. In the field of surveillance studies, scholars often highlight how
asymmetrical data accumulation dispossesses subjects of agency over their per-
sonal information, laying the foundation for unjust data practices, including
social sorting and discrimination (Cinnamon, 2017; Lyon, 2003, 2007). Most
seem to agree that barring some form of political intervention, surveillance
capitalism will continue to exacerbate trends of rising social and wealth
inequality.
However, China’s adoption of the SCS and the European Union’s adoption
of the GDPR have placed each state on a different path. In the case of China, the
general aim is to transfer the power of data surveillance from the private sector
to the public sector, repurposing existing surveillance infrastructures and tech-
nologies to advance state agendas. In Europe, the GDPR reflects broad norma-
tive aims to protect individual privacy and preserve individual freedom,
consequently limiting the degree of behavioural control that corporations can
exert over consumers. The next two sections will examine the SCS and the
GDPR in turn. In the final section, the implications of these two paths will
be examined.
Background
China is currently in the process of developing ‘the boldest and most ambitious
governance reform programme launched by China since 1978’ (Sapio, 2017).
Political scientist Sebastian Heilmann (2016) refers to the project as establishing
a ‘new digital Leninism’, describing SCS as ‘the most ambitious Orwellian
scheme in human history, seeking to establish an all-seeing state’ (p. 17). The
project builds off the fundamental logics of surveillance capitalism as well as
its technological infrastructures, expanding upon the surveillance and social
engineering capacities whilst harnessing their potentials for purposes of state-
craft. The foundation of the project is the assignment of dynamic credit
scores for every economic actor operating within the national market, from
giant conglomerates and state-owned enterprises down to small businesses
and individuals. These scores are assigned by a series of algorithms operation-
ally managed by central government authority, allowing the state to encourage
desired social and economic behaviours whilst discouraging undesirable beha-
viours through a system of tailored rewards and pun-
ishments. What emerges is a novel system that enables data-informed
economic and social planning on a national scale.
The root of the SCS lies in the rapid digitization of the Chinese economy.
According to official statistics, in 2017 China was home to 731 million internet
users and 695 million mobile internet users, and Chinese consumers are now
responsible for 40 per cent of the value of all global e-commerce transactions
(CNNIC, 2017; Woetzel et al., 2017). As headlines in the Financial Times
have declared: ‘China’s digital economy is a global trailblazer’ (20 March
2017); ‘China gears up for leap into digitization of industry’ (19 December
2017); ‘China mobile payments dwarf those in US as fintech booms, research
shows’ (23 February 2017). In light of the rapid digitization of the Chinese
economy, computer scientist Kai-Fu Lee (2018) describes China as the
‘Saudi Arabia of data’, referring to the unparalleled amount of data that
Chinese citizens produce every day. He argues that the depth of digital inte-
gration in China is unmatched in the rest of the world, as mobile payments
replace hard cash for most daily economic transactions, and apps such as
WeChat become central features in everyday social and economic life. As smart-
phones have made everyday life legible through the technological architectures
of surveillance capitalism, the Chinese state now seeks to use these capacities as a
new source of power and control.
Ideological foundation
State surveillance has a substantial history in modern China, and the develop-
ment of the SCS is not without precedent. In the wake of the 1949 revolution, all
Chinese citizens were organized into a danwei, or work unit, which served as an
early form of government surveillance and control over the personal lives of
individuals. These administrative formations were responsible for state over-
sight over a wide range of everyday human activities, including travel, marriage,
housing, education, population control and health care. The danwei served as
the primary tool by which the Communist Party of China (CPC) sought to
organize the economic ambitions of Mao’s Great Leap Forward, and any
dissent to the Party’s vision was recorded in a dang’an, a government file for
maintaining personal records on Chinese citizens. However, a tradition of
social governance in China stretches back further. In his study of social govern-
ance in China, scholar Bray (2005) advocates for a genealogical method, which
highlights a complex process of layering in which seemingly disparate practices
from the past come together to explain the development and emergence of a
modern system. In this regard, elements of the SCS can be interpreted as
modern reflections of a range of surveillance and control policies evident
throughout China’s dynamic past, from practices of Confucian bureaucracy,
to the social policies of the Yan’an Rectification Movement and the influence
of Soviet planning advisors in the 1950s.
Although the danwei still exists, it is only a single piece of a much more pro-
found surveillance society that has more recently emerged in China. Surveil-
lance and censorship policies, commonly referred to as the Great Firewall of
China, have been part and parcel of the Chinese internet since its inception. In
public spaces, China is currently in the process of building the world’s biggest
camera surveillance network equipped with facial recognition technology, with
170 million CCTV cameras installed by 2017, and an estimated 400 million
more to be running by 2020 (BBC News, 2017). If a particular group is
deemed a security risk, coercive surveillance is intensified, as exemplified by
the ongoing securitization of Xinjiang and its marginalized Uighur population
(Mitchell & Diamond, 2018). A perceived deficit of social trust has contributed
to the expansion of Chinese surveillance capacities, whilst a culture of informing
on one’s neighbours remains widespread (Hawkins, 2017; Lubman, 2017). In
this context, public support for the SCS and its surveillance imperatives
remain high, and it is broadly regarded as a way to bring about a more honest
and harmonious society (Kostka, 2018).
Much of the impetus for the SCS’s development stems from a historical lack
of trust between economic actors in China, alongside weak institutions that
have struggled to rein in unsustainable and corrupt business practices (Dai,
2018). In development discourse it has become common to acknowledge effec-
tive institutions as important drivers of economic growth; over the past two
decades, China’s rapid economic development has largely outpaced its
growth in institutional capacity (Ezrow et al., 2016; Nederveen Pieterse,
2015). As a result, many laws and regulations concerning economic activity
in China remain poorly and selectively enforced across a range of industries
(Zhang & Zhang, 2016). The core concept SCS strives to impose is ‘self-regu-
lation of enterprise’ in which businesses comply with laws and regulations of
their own accord, thereby easing the burdens on existing enforcement and
compliance structures (State Council, 2017). Hence, the SCS is being
implemented as a way to complement the ultimate functions of other insti-
tutions by shaping individual and firm behaviour, ensuring compliance with
government laws and regulations and incentivizing corporate social and
environmental responsibility. In one regard, the SCS represents a means to
reconsolidate control in the face of weak institutions, representing a creative
and novel enforcement mechanism that, rather than strengthening existing
institutions, diminishes their importance.
As the Chinese State Council’s Planning Outline notes, the goal of the system is
to ‘broadly shape a thick atmosphere in the entire society that keeping trust is
glorious and breaking trust is disgraceful, and ensure that sincerity and trust-
worthiness become conscious norms of action amongst all the people’ (State
Council, 2014). By achieving extensive surveillance and control over social
and market behaviours, the SCS seeks to ensure good behaviour between econ-
omic subjects, as well as compliance with regulations and participation in gov-
ernment agendas. The behaviour of subjects is controlled not by means of force,
pollution each year is 10% of same year’s added GDP and is likely to rise higher
in 2020’ (p. 1989). For these reasons, the amelioration of environmental con-
cerns is amongst the early aims that SCS has forwarded.
Much of the difficulty China faces in the environmental arena stems from
poor enforcement of pollution regulations, corruption and feeble institutions
disincentivizing firm compliance (Eaton & Kostka, 2017; Wang et al., 2003).
Again, much of SCS’s power lies in its ability to bypass institutional weakness
and coerce economic actors into self-regulation. The Planning Outline (2014)
proposes new state capabilities for ecological monitoring through establishing
‘credit evaluation structures for enterprises’ environmental behavior’. Former
head analyst at the Mercator Institute for China Studies Mirjam Meissner
(2017) notes that the government’s current goals include the development of
a system able to carry out real-time emissions and energy consumption over-
sight of polluting industries using sensors in chimney stacks and smart
meters. Pilot projects have already been launched, such as Green Horizon,
which monitors and responds to Beijing’s pollution spikes with real-time
measures for traffic and industries (Cooper, 2016).
Besides immediate regulatory enforcement, SCS also incentivizes business
participation in initiatives designed to improve performance over longer
periods of time. For example, firms that voluntarily reduce energy consumption
or carbon footprints may see credit scores improved and rewards such as
decreased tax rates. Because different places and industries face their own
unique set of environmental problems, programmes can be tailored to local spe-
cifics. For example, increased water consumption penalties may be levied
against industries in water-scarce areas as compared to regions where supplies
are bountiful. SCS incentives could even theoretically promote positive compe-
tition between firms. Nor does regulation stop at environmental concerns; the
same principles of bespoke monitoring and coordination exist for any other cat-
egory of economic interest, such as corporate social responsibility, consumer
satisfaction, product output and so on.
Given the large volume of data collected, SCS could eventually reach deep
into the lives of citizens and provide a range of policy insights related to the
well-being of citizens and employees. For example, a firm’s occupational
health practices could be measured through the number of transactions that
employees make at medical establishments. High workforce spending on medi-
cine and healthcare may serve as an indicator of poor occupational health prac-
tices and the social credit score of the unhealthy enterprise reduced accordingly.
To raise the score, the firm would be incentivized to adopt policies that improve
relevant working conditions, and the success of those policies could be analysed
in real-time by continuing to monitor employees’ medical transactions. Further-
more, through surveillance of personal communication and social media, digital
algorithms could theoretically measure the attitudes and mental health of
workers, enabling the creation of complex metrics that could potentially revolu-
tionize labour relations and human resources practices. The potentials for firm
regulation are virtually limitless within a system that permits unrestricted sur-
veillance and intervention.
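The tailored reward-and-penalty logic described in the passages above can be illustrated with a toy scoring function. Everything below is hypothetical: the indicators, weights and thresholds are invented for illustration, and neither the article nor public sources specify the SCS's actual algorithms.

```python
# Illustrative sketch only: a toy model of how a firm's credit score
# adjustment might combine behavioural indicators with locally tailored
# weights. All names, weights and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class FirmProfile:
    emissions_tonnes: float          # e.g. monitored via chimney-stack sensors
    water_use_m3: float              # metered consumption
    medical_spend_per_worker: float  # proxy for occupational health

def credit_adjustment(firm: FirmProfile, water_scarce_region: bool) -> float:
    """Return a score delta: positive rewards, negative penalties."""
    delta = 0.0
    # Emissions above a (hypothetical) cap are penalized.
    if firm.emissions_tonnes > 100.0:
        delta -= 0.1 * (firm.emissions_tonnes - 100.0)
    # Water penalties weigh more heavily in scarce regions, mirroring
    # the text's point about locally tailored programmes.
    water_weight = 0.05 if water_scarce_region else 0.01
    delta -= water_weight * firm.water_use_m3
    # High per-worker medical spending is read as poor occupational health.
    if firm.medical_spend_per_worker > 500.0:
        delta -= 1.0
    return delta

firm = FirmProfile(emissions_tonnes=120.0, water_use_m3=50.0,
                   medical_spend_per_worker=600.0)
print(credit_adjustment(firm, water_scarce_region=True))  # prints -5.5
```

The point of the sketch is structural rather than numerical: once behavioural data streams exist, any indicator can be folded into the score and any weight tuned per region or industry, which is what makes the regulatory reach of such a system effectively open-ended.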
Over the past 100 years, we have come to believe that the market economy is the
best system, but in my opinion, there will be a significant change in the next three
decades, and the planned economy will become increasingly big. Why? Because
with access to all kinds of data, we may be able to find the invisible hand of the
market. (Global Times, 2017)
This return to economic planning can trace its roots to the emergence of
systems engineering as an interdisciplinary field of study. As commentator
Hvistendahl (2018) observes, interest in systems engineering has waned in
Western education and industries but has exploded in China to the point
where today it is a mandatory subject for all students at the CPC’s Central
Party School in Beijing. The centrality of systems engineering in CPC planning
is particularly visible in the policy prescriptions laid out by Xi Jinping, who
noted in 2013 that ‘comprehensively deepening reform is a complex systems
engineering problem’ (Hvistendahl, 2018). Systems engineering approaches
lie at the centre of the SCS and will be used to continually develop the networks
as well as the policies and programmes embedded within it. In the eyes of the
CPC, the economy represents the mother of all systems, which, with enough
data points, can be studied, manipulated and understood. Recent developments
in machine learning represent the primary technological developments that have
made it possible for China to use big data and the SCS as a means of steering
economic growth.
With huge amounts of data, real-time feedback and machine learning algor-
ithms that can process and understand outputs, the SCS presents the CPC with
an instrument that can help the party respond to the fluctuations of the
market almost instantaneously. Using the SCS, economic planners now have
Background
On 25 May 2018 the GDPR was formally implemented, ending a two-year tran-
sitional period that followed the regulation’s final adoption in April 2016.
GDPR has been heralded as ‘one of the most robust data privacy laws in the
world’ that sets a new global standard for data collection, storage and use
(Pardes, 2018). Its twofold aim is to ‘enhance data protection rights of individ-
uals and to improve business opportunities by facilitating the free flow of per-
sonal data in the digital single market’ (Council of the European Union, 2015, p. 1).
By unifying digital protection practices, the general goal is to enhance the
degree of control that ordinary citizens have over their personal data, as well
as how it is collected and used in an age of data-driven capitalism. Substantively,
the regulations place meaningful limits on how corporations can collect and use
personal data, effectively hindering the practices of surveillance capitalism.
GDPR replaces the Data Protection Directive 95/46/EC (DPD) introduced
in 1995 to uniformly standardize the multitude of extant data protection laws
within each EU nation, in turn based upon an older set of principles known as
the Fair Information Practices. DPD’s primary objective was to uphold the pro-
tection of the individual with regards to the processing of personal data and its
free movement. The shifting technological landscape soon outstripped DPD’s
oversight capabilities, creating legislative gaps and fragmentation. In addition,
its directive status allowed individual member states to differentially interpret
and modify the original edict with supplementary national laws. As a result,
data regulation in Europe has previously been uneven, such that businesses
operating across borders in the EU’s single market found themselves navigating
an increasingly complex legal framework. In response, a proposal for new regu-
latory legislation on digital data protection was issued in January 2012, paving
the way for what would eventually become the GDPR (Pardes, 2018; Ryz &
Grest, 2016; Tankard, 2016).
GDPR’s implementation involves a reimagination of geographical borders to
match a new digital imaginary. The regulation applies to all individuals within
the EU and European Economic Area, regardless of nationality or origin. Con-
trollers and processors of personal data located outside the EU are also subject
to GDPR if their business practices involve the data of any EU individual or
entity (Ryz & Grest, 2016). In the case of the United Kingdom, royal assent
for a new Data Protection Act was granted in May 2018 that allows for a con-
tinuation of the GDPR in a post-Brexit nation (Burgess, 2018).
Vying against each other are those societies that believe that individuals have an
absolute right to control their personal data – to exercise the same kind of domin-
ion over data that they do over their bodies or their personal property – and those
that believe that personal data is a good to be traded on the open market and thus
subject to the same market forces at play elsewhere … The EU stands firmly for
the interests of the individual. (Pendergast, 2018, emphasis added)
Indeed, GDPR can be seen to take on a semi-messianic role, with pundits asking
questions such as ‘will the spring of 2018 be remembered as the time when the
right to privacy was enshrined as a fundamental human right?’ (Pendergast,
2018). This rhetoric is perhaps reflective of the sea-change in the practices
and principles of market economics that Zuboff (2015, 2019) maps out with
regards to surveillance capitalism and its contemporary challenges to the
modern liberal order.
The regulatory language of the GDPR does indeed seem to reflect this sense
of urgency, solidifying the safeguarding of individual rights throughout its 99
articles. Under Articles 13 and 14 on the right to notification, all users must
be informed of how their data are to be used and given the opportunity to
opt in or out of the process (European Union, 2016). This consent must be
given freely and unambiguously, without coercion or entrapment, disallowing
such practices as the provision of extra services to those who agree to share per-
sonal information. In addition, no data may be transferred to third parties or
outside of the EU without specific prior agreement of the involved parties
(Ryz & Grest, 2016). Further, under Article 15 on the right to access, individ-
uals now hold the authority to view information held on them and withdraw
from data processing if they change their mind. The right to be forgotten has
also been reinforced, now requiring data controllers to remove information that
is considered to be extraneous, inadequate or no longer relevant. These require-
ments naturally demand that data-collecting entities exercise greater over-
sight over what information they hold, where it is held and how it is being
used at all times (Council of the European Union, 2015; Tankard, 2016).
Implicit throughout these terms is a notion of the individual as an active agent
in determining their own positionality of self, where the ‘right to explanation’ or
‘right to be informed’ moves beyond mere passive ‘data protection’. Data pro-
tection scholars Malgieri and Comandé (2017) argue that this significance arises
from the nexus between the rights to access, notification and not to be subject to
any automated decision-making made on a solely algorithmic basis. Indeed, the
inclusion of the latter precept is particularly telling when considered through
the lens of Zuboff’s (2015) re-conception of false consciousness as produced
by hidden facts of commoditized behaviour modification. Implicitly, GDPR
recognizes the shift in power she outlines from ownership of the means of pro-
duction to commercially-driven data analytics and the role this technology plays
in subject formation. Article 22(1) of the GDPR which governs the automation
of data analytics states that:
The data subject shall have the right not to be subject to a decision based solely
on automated processing, including profiling, which produces legal effects con-
cerning him or her or similarly significantly affects him or her. (European Union,
2016, emphasis added)
This is further supported by Articles 13(2)(f) and 14(2)(g), which both state:
The existence of automated decision-making, including profiling, referred to in
Article 22(1) and (4) and, at least in those cases, meaningful information about the
logic involved, as well as the significance and the envisaged consequences of such pro-
cessing for the data subject. (European Union, 2016, emphasis added)
as the compulsion for market knowability, then it is only for the eyes of corpor-
ate elite interests. GDPR can therefore be seen as a reclamative response of
transparency for the individual (by rendering visible and providing choice),
but crucially not of the individual. As the epistemic contents of big data's black box are revealed, the personal lives of citizens are simultaneously re-privatised, if they so choose.
The right not to be subject to any automated decision-making made on a
solely-algorithmic basis also has implications for the notion of consent that
GDPR forwards. Automation serves to bypass human rationalities by trans-
forming data from an abstract reduction to a rendition of human behaviour
‘increasingly understood as approaching reality itself’ (Chandler, 2015,
p. 836). Jurist and technology research analyst Antoinette Rouvroy (2012)
terms this the 'truth regime' of algorithmically generated insight, which presents claims to pure factuality by yielding insights that appear to have always existed but remained obscured beneath the chaotic surface of reality and the human fallibility of heuristic bias and emotion. GDPR can therefore also be seen as a reclamation of human agency from the post-human, infallible authority of big data analytics, emphasizing the individual's integrity once more.
Where processing is likely to pose a high risk given its possible impact on the subjects involved, a data impact assessment and delineation of requisite safeguards will be required before analysis can move forwards. These measures also reaffirm the responsibility of data processors for
the security of the information they hold (Ryz & Grest, 2016). Sanctions for
non-compliance are stricter than under the DPD, with violators liable for
fines of up to 4 per cent of total revenues or 20 million euros, whichever is
higher, for serious breaches (Tankard, 2016). GDPR’s reach is not exhaustive,
however. Data processing outside its requirements is still permitted in matters of state security, justice and the military, for example, or when conducted by individuals within a personal household.
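The fine ceiling described above reduces to a simple maximum of two figures. As an illustration only (the function name is ours, and reading 'total revenues' as a firm's annual turnover in euros is our assumption), the calculation can be sketched as:

```python
def gdpr_max_fine(annual_revenue_eur: float) -> float:
    """Upper bound of a GDPR fine for a serious breach:
    4 per cent of total revenues or EUR 20 million,
    whichever is higher."""
    return max(0.04 * annual_revenue_eur, 20_000_000)

# A firm with EUR 1 billion in revenues faces a cap of EUR 40 million,
# while even a small firm remains exposed to the EUR 20 million floor.
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
print(gdpr_max_fine(5_000_000))      # 20000000
```

The 'whichever is higher' clause is what gives the sanction teeth against small and large data controllers alike: the percentage term scales with the largest firms, while the fixed term prevents small revenues from trivializing the penalty.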
Given the transnational nature of the tech industry, these divergent policy
regimes have already begun to come into contention. Concerns about data col-
lection and surveillance practices lie at the centre of political spats, national
security debates and trade disputes between Europe, the United States,
China and beyond. A report from the NATO Cooperative Cyber Defence
Centre of Excellence warns about the security threats posed by Huawei’s 5G
technology, citing Sun Tzu: ‘the supreme art of war is to subdue the enemy
without fighting’ (Kaska et al., 2019). US President Donald Trump explains
that EU competition chief and tech industry regulator Margrethe Vestager
‘hates the United States, perhaps worse than any person I’ve ever met’
(Stevis-Gridneff, 2019). Citing the cases of Huawei, Google and Samsung, pol-
itical scientist Abraham Newman (2019) describes how the United States and
China have begun strategically weaponizing supply chains in what he describes
as a new ‘quiet war’. As digital technologies become tied with divergent models
of social and economic development, it seems likely that data collection and sur-
veillance will only continue to grow as a substantial component of ideological
clashes.
The political will of the State Council to see the SCS through to completion is undeniable, and the project draws on China's industry advantages in surveillance and data collection. Yet,
the ambitious infrastructures demanded by SCS are without precedent and
must be built almost from scratch. Technological feasibility, bureaucratic bar-
riers and parity of enforcement pose significant challenges to the CPC’s
grand vision of economic omniscience, particularly concerning information
pipelines from rural areas and smaller towns where extant technology use and
state oversight are weak. Data theft, fraud and the emergence of a shadow industry of loopholes are likely, spurred by fear of sanctions for non-compliance or by powerful private enterprises loath to share their valuable data assets with the state (Meissner, 2017). Even if correctly implemented on some scale, there is no guarantee that the system will be successful, nor any way to predict how disruptive the teething problems associated with installing a new regime of this scale will be.
Despite the epistemic armour of the algorithmic truth regime, its insights do
not guarantee accuracy and remain dependent on the veracity and quality of data
mined. Of concern too is state misuse of data, as raised above in the case of the
marginalized Uighur population in Xinjiang and potentially on wider scales
beyond. Big data surveillance enhances the capture and categorization of differ-
ence, breeding potential for systemic social polarization as human subjects are
identified and sorted according to worth and risk (Lyon, 2003), as already evidenced by Mara Hvistendahl's (2017) account of a nascent digital underclass.
GDPR has fallen foul of similar logistical and operational pitfalls given its
ambitious scope. By the European Union’s own admission
companies seem to be treating the GDPR more as a legal puzzle, in order to pre-
serve their own way of doing things … rather than adapting their way of working
to better protect the interests of those who use their services. (EDPS, 2019, p. 5)
Indeed, under the rules of the GDPR, the 'lead regulator' of a multinational firm must be located in the country where the firm has its 'main establishment', which for most large firms, including Google, Facebook, Twitter and Microsoft, is Ireland.
is Ireland. Despite thousands of alleged data privacy violations, Ireland’s Data
Protection Commission has been slow to take enforcement actions, causing
some to raise concerns about regulatory capture (Vinocur, 2019). This disparity
between identification and action appears to be common across the EU, with
reporting from the period of GDPR’s inception in May 2018 until January
2019 indicating 59,000 data breach notifications but only 51 fines levied,
mostly of low value (DLA Piper, 2019). However, despite the implementation
challenges of GDPR and SCS, the direction of each policy regime is clear. As
with any new governance paradigm, processes of regulatory learning will
ensue, and each society is likely to continue progressing along prescribed trajec-
tories towards the normative vision embedded within.
Economic implications
The economic ambition of the CPC’s social credit system is clear – it is to be the
powerhouse for growth that will deliver Xi Jinping’s vision of national prosper-
ity and influence. In order to achieve this, China must recentre its export-based economy on a development model grounded in consumption and quality rather than price competition if it wishes to secure sustainable, sensible economic practices
(Nederveen Pieterse, 2015). In theory, SCS is geared to deliver an economy
operating at its maximum potential in all possible contingencies through the
exploitation of mass data collection, machine-learning algorithms and, even-
tually, real-time cybernetic feedback and adjustment. The eventual goal is a
revival of the planned economy equipped for the information age, where big
data and its analytics are tools for economic advancement, first and foremost.
In essence, China is taking the architectures first developed under the guise
Social implications
Political implications
one must dedicate a 50-plus year career to political service in order to advance to
the upper echelons of the Party’s hierarchy (Leonard, 2012; Li, 2013). The SCS
represents the careful life work of experts in systems engineering, social psychol-
ogy, computer science, artificial intelligence, economics, political science, and
countless other fields. It is not the result of democratic will, but of cadres of
highly-trained scientific minds seeking to apply their knowledge for the advance-
ment of society as a collective project.
In contrast, GDPR is only reactive, playing catch-up to legislative fragmen-
tation and data infringement scandals such as Cambridge Analytica. Although
GDPR’s attempt to forward personal rights is commendable (at least within
the context of Western liberal thinking), when considered against its wider pol-
itical and cultural context, the forecast is less certain. GDPR is highly signifi-
cant, precisely because it stands alone as a bold move against plutocratic
currents and corporate government. Its look to the longue durée is again worthy but, again, represents the exception rather than the rule. Compared
to the foresight of the CPC’s comprehensive developmental approach, the capi-
talist democracies of the EU member states remain preoccupied with short-term
election-cycle projects and are blindsided by crises such as the 2008
financial crash. GDPR can thus be seen as a warning, not a success story. It
is symbolic of a greater sickness within the Western liberal order; that of the
post-ideological turn in an age of neoliberal normativism, where legislation
must undo past wrongs in order to build for the future. Its precepts guard
only against perceived threat; again, emblematic of the crisis of capitalism
that can no longer deliver sustainable futures for its citizens in a globalized
world. The proof is in China’s steadily rising living standards and falling cor-
ruption indices (Nederveen Pieterse, 2015) – the mirror image of Western
decline. This neither forgives nor justifies Chinese social repression and an autocratic political system in light of its ideological fortitude, but rather serves to
deepen the debate around the morals and meaning of digital surveillance and
its commercial applications that the contemporary world must address.
Whilst China proceeds with constructive confidence, Europe lags behind,
searching for a way to function in the global information civilization that is com-
patible with established Western political and social values.
Disclosure statement
References
Woetzel, J., Seong, J., Wei Wang, K., Manyika, J., Chui, M. & Wong, W. (2017, August). China's digital economy: A leading global force. McKinsey Global Institute. Retrieved from https://www.mckinsey.com/global-themes/china/chinas-digital-economy-a-leading-global-force
Wu, T. (2010). The master switch: The rise and fall of information empires. New York, NY: Vintage.
Yim, M., Gomez, R. & Carter, M. (2017, January). Facebook's 'free basics' and implications for development: IT identity and social capital. Proceedings of the 50th Hawaii International Conference on System Sciences (pp. 2590–2599). Washington, DC: IEEE Computer Society Press.
Zerlang, J. (2017). GDPR: A milestone in convergence for cyber-security and compliance. Network Security, 2017(6), 8–11.
Zhang, K. & Zhang, F. (2016). Report on the construction of the social credit system in China's Special Economic Zones. In Y. Tao & Y. Yuan (Eds.), Annual Report on the Development of China's Special Economic Zones (pp. 153–171). Singapore: Springer Singapore.
Zuboff, S. (2015). Big Other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. London: Profile Books.
Brett Aho is a PhD student in the Department of Global Studies at the University of
California, Santa Barbara. He has previously earned degrees from the University of
Leipzig, University of Roskilde and the University of Redlands. His current research
focuses on technology and regulatory governance in the United States, EU and China.
Roberta Duffield is a freelance researcher currently living in Cairo, Egypt. She holds
previous degrees from the University of California, Santa Barbara and Oxford Univer-
sity. Her research focuses on urbanism, public space, and the politics of technology
and infrastructure.