Spaces for the Future: A Companion to Philosophy of Technology
1st Edition, Joseph C. Pitt (Editor)
Focused on mapping out contemporary and future domains in philosophy of technology, this
volume serves as an excellent, forward-looking resource in the field and in cognate areas of
study. The 32 chapters, all of them appearing in print here for the first time, were written by
both established scholars and fresh voices. They cover topics ranging from data discrimination and engineering design, to art and technology, space junk, and beyond. Spaces for the
Future: A Companion to Philosophy of Technology is structured in six parts: (1) Ethical Space
and Experience; (2) Political Space and Agency; (3) Virtual Space and Property; (4) Personal
Space and Design; (5) Inner Space and Environment; and (6) Outer Space and Imagination. The
organization maps out current and emerging spaces of activity in the field and anticipates the
big issues that we soon will face.
Joseph C. Pitt, a Fellow of AAAS, has been teaching at Virginia Tech for forty-five years.
His major area of scholarship is philosophy of technology with an emphasis on the impact of
technological innovation on scientific change. He is the author of four books and the editor/
co-editor of twelve others. He is past president of the Society for Philosophy and Technology
and served as editor-in-chief of Techné: Research in Philosophy and Technology, on whose editorial board he currently serves.
Contents
List of Contributors viii
Editor Introduction xii
PART 1
Ethical Space and Experience 1
1 Data, Technology, and Gender: Thinking About (and From) Trans Lives 3
ANNA LAUREN HOFFMANN
2 Discrimination 14
D. E. WITTKOWER
5 Technoart 54
DON IHDE
PART 2
Political Space and Agency 71
8 Educational Technology 82
ANDREW WELLS GARNAR
9 Religious Transcendence 92
CARL MITCHAM
PART 3
Virtual Space and Property 117
14 Cyberwarfare 141
BENJAMIN C. JANTZEN
PART 4
Personal Space and Design 183
20 Design 208
ANN JOHNSON
PART 5
Inner Space and Environment 249
26 Fracking 281
ADAM BRIGGLE
PART 6
Outer Space and Imagination 303
Index 353
Contributors
and the Path to Nanotechnology (2011) and The Long Arm of Moore’s Law: Microelectronics and American Science (2016).
Zachary Pirtle is an engineer at NASA Headquarters and a doctoral candidate in Systems
Engineering at George Washington University. Before joining NASA in 2010, he worked
with the Consortium for Science, Policy, and Outcomes at Arizona State University, where
he obtained degrees in mechanical engineering and philosophy, and a master’s degree in
civil engineering.
Joseph C. Pitt is Professor of Philosophy and of Science and Technology in Society at Virginia Tech. A Fellow of the AAAS, he is the author of four books and the editor/co-editor of
thirteen others. His area of research is the impact of technological innovation on scientific
change.
Nicholas Rescher is a Distinguished University Professor of Philosophy at the University of
Pittsburgh, where he has also served as Chairman of the Philosophy Department and a Director of the Center for Philosophy of Science. In a productive research career extending over six decades, he has established himself as a systematic philosopher with more than one hundred books to his credit, ranging over all areas of philosophy, with sixteen of them translated
from English into eight other languages.
Robert Rosenberger studies the phenomenology of technology, investigating topics such as
homelessness, driving impairment, classroom tech, and laboratory imaging. He is Associate
Professor in the School of Public Policy at the Georgia Institute of Technology.
Evan Selinger is Professor of Philosophy at Rochester Institute of Technology. He specializes
in the philosophy of technology, focusing on ethical issues of emerging technology and privacy. Currently, he is co-writing Being Human in the 21st Century with Brett Frischmann.
Ashley Shew, Assistant Professor in the Department of Science and Technology in Society at
Virginia Tech, works on issues in philosophy of technology related to technological knowledge, animal studies, biotech, and disability studies. Her first book, Animal Constructions
and Technological Knowledge, is under contract and forthcoming.
Tina Sikka is currently a lecturer at Newcastle University in the Media Culture and Heritage
Department. She has published extensively on the importance of public participation in technological design, feminist analyses of climate science, and scientific knowledge production.
She is currently writing a book on feminist empiricism, critical theory, and geoengineering
that is under contract and scheduled for publication in late 2017.
Johnny Hartz Søraker, formerly a faculty member at University of Twente, works as a policy
specialist for Google in Dublin, Ireland. His main research specialization lies in the intersection of information and communication technologies, philosophy, and psychology, with a
particular emphasis on virtual environments and quality of life.
Julie C. Swierczek is Librarian for Primary Resources and Metadata Services at Swarthmore College. She holds a master’s in Philosophy from Miami University of Ohio and a Master of Science in Library and Information Science from the University of Illinois at Urbana-Champaign. Her interests include metadata, the organization of information, digital forensics, and information ethics.
David C. Tomblin is Director of the Science, Technology, and Society Scholars program at
the University of Maryland, College Park. He runs a Robotics Service Learning program
and an Infrastructure and Society Service Learning program. He works with a consortium
of universities, science museums, and nonprofits called Expert and Citizen Assessment of
Technology to develop and do research on public engagement exercises for government
agencies such as NASA, the Department of Energy, the EPA, and NOAA.
Shannon Vallor is Associate Professor of Philosophy at Santa Clara University and President
of the International Society for Philosophy and Technology. Her primary research expertise
is the philosophy and ethics of emerging science and technology.
Yoni Van Den Eede is a postdoctoral fellow of the Research Foundation–Flanders (FWO),
affiliated with the research groups Centre for Ethics and Humanism and Culture, Emancipation, Media, and Society, both at the Free University of Brussels. He conducts research into
the philosophy of technology, media theory, and media ecology. He is the author of Amor
Technologiae: Marshall McLuhan as Philosopher of Technology (2012).
Pieter E. Vermaas is Associate Professor in the Philosophy Department of Delft University
of Technology, the Netherlands. He does research on design methodology, design for moral
values, and validation in design research, and is editing the book series Philosophy of Engineering and Technology and Design Research Foundations.
D. E. Wittkower is Associate Professor of Philosophy at Old Dominion University, where he
teaches philosophy of technology, philosophy of social media, information ethics, and information literacy and digital culture.
Monique Wonderly is the Harold T. Shapiro Postdoctoral Research Associate in Bioethics at
the Princeton University Center for Human Values. She has published in the areas of applied
ethics, philosophy of emotion, and history of philosophy.
Editor Introduction
This volume provides some important things to think about when it comes to philosophical
problems arising from our technologies, past, present, and future. The philosophy of technology can be seen as a relatively young field, but the questions the field poses—about the nature
of humanity, technology, and our relationship to each other and the world—are ancient. We do
not offer a history of the philosophy of technology here because that has already been done, and
several other collections already present the classics. Rather, we offer a series of reflections on
the world we live in, witness, and foresee, primarily characterized in terms of the technologies
we have, emerging technologies, and technologies that may develop in the future. Many of our
authors belong to a new generation of philosophers of technology and bring an excitement to
their work, as adventurers and explorers of the future, laying out projects that need attention.
We have chosen ‘space’ as an organizing theme for mapping out the current and emerging
state of our field. We see ourselves, our work, and our ideals as living in ethical, political, virtual, personal, inner, and outer space. There may be other spaces to explore; these categories
are not intended to be exhaustive, but we think they represent the spaces that philosophy of
technology is now and will be exploring in the near future. By stressing space we also stress our
communal nature and the fact that we are all in this together.
The volume we are proud to bring together here features many fresh voices—and more seasoned ones that offer new material. We hope to chart the course of philosophy of technology as
it will soon be written. These collected new works complement the existing canon and serve to
frame the technological world in which we currently exist.
Joseph C. Pitt
Ashley Shew
Blacksburg, Virginia
Part 1
Ethical Space and Experience

Chapter 1
Data, Technology, and Gender: Thinking About (and From) Trans Lives
Anna Lauren Hoffmann

Introduction
For scholars and students interested in topics of gender identity, data, and information technology, the current historical moment is a curious one. The proliferation of personal computing devices—from laptops to mobile phones to “smart” watches—combined with widespread internet access, means that people are producing unprecedented amounts of digital data, leading some scholars and technology evangelists to declare a “big data” revolution. At the same time, issues of sexism and gender inequality have taken on new urgency as women face increasing levels of harassment online, especially on large social networking sites like Twitter. The blame for this falls, in part, on platform owners and developers that fail to thoroughly consider the role of design in promoting safety for the most vulnerable users. Finally, the emergence of high-profile transgender activists, performers, and celebrities—from Laverne Cox to Caitlyn Jenner—has brought attention to a minority population of trans, nonbinary, and gender-nonconforming people and communities that have been (until now, at least) largely overlooked, often to the detriment of the health and safety of these populations.
Of course, some would view these three trends as mostly unrelated: at a quick glance, big data, gender and sexism online, and the health and needs of transgender people seem to have little to do with one another. Against this easy assumption, however, this chapter suggests that—while not wholly reducible to one another—these three issues intersect in important ways and, in particular, they shine a light on the ongoing struggles minority identities and lives face in our increasingly data-driven world. The ‘big data revolution’ cannot be divorced from the technologies and systems that support it—technologies and systems that have long struggled to account for diverse and nonnormative lives.
In the following, these three trends are woven together to further our thinking about gender,
identity, and technology. The first section attends to the biases and assumptions that underwrite
the idea of ‘big data.’ It argues that big data and the quantitative insights into human behavior
they stand to provide are not given but, rather, they are something we make and remake in practice. The second section traces key strands of thinking about the relationship between gender and technology, offering deeper insight into the ways in which gendered biases or stereotypes are built into the practice of scientific and technological development. Finally, the third section takes these lessons and extends them to thinking about the lives and identities of gender minorities, such as transgender individuals. I should note, however, that the discussions of relevant literature throughout this chapter are not intended to be comprehensive (indeed, a fully comprehensive literature review of any section’s topic would fall outside the scope of this chapter). Rather, I mean only to hit on the most salient trends and points as they relate to and help illuminate issues of data, technology, information systems, and gender identity.
work[s] through correlation of purchases without passing through the vapid categories of
the marketers—you don’t need to know whether someone is male or female, queer or
straight, you just need to know his or her patterns of purchases and find similar clusters.
(Bowker 2014: 1796)
The seductiveness of this idea has led some big data evangelists to proclaim that we have
reached the “end of theory,” a point in time where knowledge production is simply a matter of
“[throwing] numbers into the biggest computing clusters the world has ever seen and [letting]
statistical algorithms find patterns where science cannot” (Anderson 2008: n.p.). As Caroline Bassett (2015) summarizes the idea, “Big Data ushers in new forms of expertise and promises to render various forms of human expertise increasingly unnecessary” through “automation of forms of data capture, information gathering, data analysis and ultimately knowledge production”
(549). In Robert W. Gehl’s (2015) words, “a common refrain . . . is that we are in for a revolution, but only if we recognize the problem of too much data and accept the impartial findings of
data science for the good of us all” (420). In short, big data appear to make “human expertise
seem increasingly beside the point” (Bassett 2015: 549).
But one can only proclaim the “end of theory” if one also accepts uncritically the mythology of big data. Many scholars—including those cited earlier—warn that this myth is dangerous,
as it overlooks the ways in which our very ideas about what constitutes ‘data’ are themselves
framed by theoretical perspectives and assumptions. At a fundamental level, the mere act of
calling some things data (and disregarding other things as ‘not data’) represents a kind of theory
itself: even unstructured data rely on categories of chronological time or textual sources that
have already been shaped by assumptions about the world enforced by data collection instruments. Any given data set is, by necessity, limited by its sources or its aims—no single data
set, even the most massive ones, can contain all conceivable data points because not everyone
or everything is conceived of as ‘data.’ Consequently, big data continue to suffer from “blind
spots and problems of representativeness, precisely because [they] cannot account for those
who participate in the social world in way that do not register as digital signals” (Crawford
et al. 2014: 1667).
Assumptions about what constitute ‘data’ are built into the instruments and tools we use to
collect, analyze, and understand the data itself. These tools “have their own inbuilt limitations
and restrictions”—for example, data available through social networking sites like Twitter and
Facebook are constrained by the poor archiving and search functions of those sites, making it
easy for researchers to look at events or conversations in the present and immediate past but
also difficult to track older events or conversations (boyd and Crawford 2012: 666). As a consequence, research conducted on or through these sites often inherits a temporal bias, and given the constraints of these social platforms, researchers prioritize immediacy over more reflective or temporally distant analyses. The mythology of big data—its appeal to automated, technologically sophisticated systems and claims to objectivity—works to obscure these biases and their limits for accounting for certain kinds of people or communities (Crawford et al. 2014: 1667). As Bowker (2014) puts it: “just because we have big (or very big, or massive) data does not mean that our databases are not theoretically structured in ways that enable certain perspectives and disable others” (1797).
To be critical scholars and students of big data we must be vigilant against its mythology. It
is imperative that we pierce the veil of technological wonder and readily scrutinize big data’s
claims to impartiality or neutrality and recognize that data and knowledge are made legible and
valuable not in a vacuum, but in context. As Tom Boellstorff (2013) rightfully asserts: “There
is a great need for theorization precisely when emerging configurations of data might seem to
make concepts superfluous—to underscore that there is no Archimedean point of pure data
outside conceptual worlds” (n.p.). To be sure, these limits and biases do not automatically mean
that large-scale, data-intensive research is necessarily bad or unimportant. Rather, they simply
underscore the continued relevance of theoretical and other types of inquiry even in the midst
of a big data ‘revolution.’ As Crawford et al. (2014) argue,
the already tired binary of big data—is it good or bad?—neglects a far more complex reality that is developing. There is a multitude of different—sometimes completely opposed—
disciplinary settings, techniques, and practices that still assemble (albeit uncomfortably)
under the banner of big data.
(1665)
This section focuses on the work of scholars and commentators that show how scientific and technological
practices (and the knowledge they produce) are shaped and constrained by considerations of
gender.
Early work on gender and technology focused almost exclusively on highlighting the over-
looked contributions of women to the history and development of science and technology. Work
in this vein sometimes focuses on women’s contributions to sites conventionally associated
with men—such as industry, engineering, or scientific research—and demonstrates how the
narratives that have emerged around these sites have tended to privilege the work and ideas of
men despite the presence and contributions of women. For example, a focus on the men who
built the first electronic, all-purpose computer—the Electronic Numerical Integrator and Computer (ENIAC)—overlooks the fact that it was a team of women mathematicians that worked to
program the machine and make it operational (Sydell 2014). These sorts of skewed narratives
“have tended to make the very male world of invention and engineering look ‘normal,’ and thus
even more exclusively male than is actually the case” (Lerman, Mohun, and Oldenziel 2003:
431). As Nathan Ensmenger (2010) summarizes, “the idea that many . . . computing professions
were not only historically unusually accepting of women, but were in fact once considered
‘feminized’ occupations, seems . . . unbelievable” against a backdrop that so heavily associates
computing with men and masculinity (120).
Other approaches work in a different direction, looking instead at activities and spaces historically associated with women but overlooked as significant sites of technological activity.
Building on feminist critiques of Marxism that emphasized the role of unpaid and domestic
labor (most often performed by women), work in this area examines the relationship between
gender and technology outside of conventional sites of scientific or technological production.
Cynthia Cockburn and Susan Ormrod (1993)—in their now-classic work Gender and Technology in the Making—examined the history and rise of the microwave oven not only in its
design and development phase, but through to its dissemination into kitchens and the home.
Cockburn and Ormrod (1993) show how a technology that starts out as a high-tech innovation
ends up—through processes of marketing, retailing, and associations with ‘women’s work’ like
household cooking—viewed as a rote household appliance, ultimately ignoring the ways that
women’s specific technical knowledge (of cooking, for example) also contributed to the design,
distribution, and use of a particular technology.
Despite progress in recognizing the contributions of women in the history of science and
technology, however, biases still persist in our narratives about novel or innovative technologies. The story of the relatively recent and much-lauded Google Books project, for example, foregrounds the vision of Google’s founders Sergey Brin and Larry Page as well as the company’s (male-dominated) engineering teams that developed a novel way for quickly and effectively scanning, digitizing, and bringing entire library collections online (thus greatly expanding access to recorded knowledge). Lost in this narrative are the contributions of librarians (primarily women) who collected, organized, curated, and maintained the collections upon which Google Books is built (Hoffmann and Bloom, forthcoming) as well as the women and people of color who performed the manual labor of turning pages for Google’s scanning machines (Wen 2014).
Further approaches to gender and technology center not on the narratives that grow up
around particular technologies, but instead on the ways in which gender biases influence the
development and design of technology itself. Work in this vein seeks to uncover how sexist assumptions and stereotypes end up designed—or ‘baked’—into various systems and artifacts. For example, video games that offer only male avatars for players (or male and highly
sexualized female avatars) implicitly encode the assumption that only (heterosexual) men play
video games (Friedman 1996). More recently, commentators have pointed out how software
applications and personal tracking tools also fail to account for the specific needs of women.
For example, the release of Apple’s HealthKit for its popular mobile phones (iPhones) promised
a set of comprehensive tools for tracking personal health and biometric information. However,
HealthKit’s first iteration failed to include a tool for tracking menstruation (Duhaime-Ross
2014). Studying the relationship between gender and technology in this way allows us to identify and destabilize these seemingly ‘natural’ defaults by revealing the ways in which they actively
construct biased or even harmful ideas about women. (For more thorough summaries of the state of gender and technology studies at different points in its development, see McGaw 1982; Lerman, Mohun, and Oldenziel 2003; Wajcman 2009).
Finally, gender has also played an important role in normative analyses of science, helping
to shed light on the moral and ethical consequences of scientific and technological progress.
Sandra Harding’s (1991) foundational work on feminist studies of science implored scholars
to pay close attention to “the problematics, agendas, ethics, consequences, and status” of science-as-usual, that is, scientific practice as we commonly (and uncritically) understand it (1). Doing so means going beyond simply harnessing the tools of science to explore overlooked questions or areas (like, for example, women’s health needs); instead, it requires a thorough examination of the tools themselves—the methods, instruments, practices, and ethics that have come to typify scientific practice. For example, simply pointing the tools and technologies of science at issues relevant to women’s lives reinforces the assumption that gender is binary and that men and women are categorically different, a problematic assumption that has historically undergirded research on sex difference (Fausto-Sterling 1985; Richardson 2013).
Against the ingrained biases and problematic assumptions of conventional scientific
inquiry, many feminist researchers emphasize not one particular ‘female’ way of knowing,
but—rather—advocate for a plurality of methods for gathering, analyzing, and making sense
of the world. Regardless of method, feminist research should share—as Alison Wylie (2007)
argues—at least four basic commitments: (1) research should be relevant to feminist aims of
exposing and addressing questions of oppression, gendered or otherwise; (2) research should
be grounded in the experiences of marginalized populations, especially women; (3) researchers should be accountable, in an ethical sense, to the persons and populations they study; and (4) researchers should be reflexive, that is, they should foreground (rather than obscure) the context and assumptions that underwrite their work. Combined, these four dimensions articulate a normative vision for science that rejects the sort of objectivity and neutrality presupposed by positivist and other understandings of science. (For a more thorough discussion of these commitments,
see Crasnow et al. 2015.)
At its simplest, the term transgender refers to “people who move away from the gender
they were assigned at birth, people who cross over (trans-) the boundaries constructed by their
culture to define and contain that gender” (Stryker 2009: 1). It is sometimes described as the
opposite of cisgender, a term that refers to people who identify with the gender they were
assigned at birth—most likely binary, either male or female (man or woman, boy or girl). Etymologically, the prefix cis- derives from the Latin term meaning “on this side of,” while the
prefix trans- derives from the Latin term meaning “on the other side of.” It is important to note,
however, that not all members of gender minority groups (those who are not either cisgender
men or cisgender women) necessarily identify as transgender. A range of terms have emerged
to describe a range of identities and lived experiences of gender—from genderqueer individuals who do not subscribe to any discrete gender category to nonbinary individuals who reject the binary categories of male/female altogether.
Going beyond this relatively straightforward definition, Megan Davidson (2007) explains
that “the term transgender has no singular, fixed meaning but is instead . . . conceptualized by
both scholars and activists as inclusive of the identities and experiences of some (or perhaps
all) gender-variant, gender- or sex-changing, gender-blending, and gender-bending people”
(Davidson 2007: 60). In some sense, then, ambiguity has long been integral to any conception
of the term (Nataf 1996). It may, at times, include (or exclude)
transsexual people (of all operative statuses), cross-dressers, drag kings and queens, genderqueer people, gay men and lesbians who queer gender lines (such as butch lesbians),
the partners of trans people, and any number of other people who transgress binary sex and
gender in all sorts of named and yet unnamed ways.
(Davidson 2007: 61)
For the purposes of readability, however, the remainder of this section (following Haimson et al. 2015) uses the shorthand “trans” to refer to the transgender and broader gender nonconforming population.
Trans people are relevant to thinking about data and information systems in different ways.
Contemporary practices of collecting, mining, analyzing, and otherwise making use of data
represent new avenues for the exercise of social control (Andrejevic 2013). In addition, efforts
to classify and categorize things are caught up in processes of power and control (Boellstorff
2013: n.p.; see also Bowker and Star 1999). As such, they represent new methods for defining
and containing categories of gender—methods that may or may not account for the identities and needs of trans populations. For example, paper or online forms that offer only binary
options—only male and female check boxes—pose problems for trans people trying to access
health or other social services, online communities, or even dating sites. Trans women, for
example, have had problems using the popular online dating application Tinder, a site that
offers users only the option to identify by binary (and presumably cis) gender (i.e., man or
woman). Men seeking women on the site have repeatedly reported the accounts of trans women
as fraudulent based on the perceived failure of these women to meet the men’s normative standard of what a ‘real woman’ is or looks like (Vincent 2016). These reports often result in trans
women’s accounts being suspended and trans users being kicked off the site.
Trans people’s struggles with information systems and biased categories also go beyond
mere check boxes for gender. Many trans people, when socially transitioning genders, choose
a new name for themselves—one that better reflects who they are. However, national and local
policies may make it more or less difficult to legally change the name one was assigned at birth
(also known as a “deadname”). As a result, trans people often find themselves being forced to
disclose information (like their deadname) through bureaucratic or administrative practices that
do not account for or permit the use of chosen names that are not yet legally recognized. For
example, trans people may wish to sign up to use a website like Facebook—a social networking site with more than one billion registered users—using an identity that is different from the one that appears on their birth certificate or other legal documents. However, because Facebook enforces a ‘real name’ policy, doing so is often not possible.
Beyond social networking sites, the administrative tensions generated by limited and inflexible data categories and information systems can inform all aspects of a trans individual’s life.
Take, for example, the importance of name and gender information in a university context:
Because college officials use gender in assigning campus housing, determining which bathrooms and locker rooms students are permitted to use and deciding on which sports teams students can compete, a gender marker that does not correspond to how a student identifies might mean that their institution will place them in unfair, uncomfortable, and potentially dangerous situations.
(Beemyn and Brauer 2015: 480)
More than just an administrative headache, being forced to reveal or go by the wrong gender
or the wrong name can trigger feelings of dysphoria and humiliation. In some cases, it can
also lead to harassment, abuse, and even death. As Dean Spade (2015) forcefully demonstrates
in his book Normal Life, these sorts of conflicts—between prescribed categories and lived or
actual identities—have severe consequences, often leading to trans people being denied housing, employment, medical or mental health care, and access to homeless or domestic violence
shelters.
As discussed in the first section, the mythology of big data cannot be divorced from the systems and practices upon which the big data revolution relies—systems and practices that struggle to account for trans identities and lives. As Jeffrey Alan Johnson (2014) reiterates, “it should
be clear by now that, contrary to the common perception of data as an objective representation
of reality, the content of data systems is an interpretation” (160). Nonetheless, making sense
of data and navigating information systems, he argues, necessarily requires something like the
illusion of objective representation—an illusion that “establish[es] certain state[s] of the world
as within the realm of normalcy to the exclusion of others” (Johnson 2014: 162). Trans lives and
identities challenge the normalized gender assumptions imposed by information systems in at
least two ways: (1) categorically (through the rejection of binary gender) and (2) conceptually
(through resistance to singular, fixed meanings). In doing so, they expose the limits of quantitative and big data–driven understandings of the world that rely on rigid and reductive categories
in the face of fluid or shifting identities.
In addition, contemporary data science and information systems stand to further marginalize
individuals (binary, trans, or otherwise) whose identities are coupled with other identities that
entail other forms of oppression, such as racial or socioeconomic discrimination. Here, discussions of gender, identity, and data featured in the relatively new journal Transgender Studies
Quarterly are instructive as they often emphasize not only gender in their analyses, but other
sources of oppression—racial, ableist, classist, and beyond—as well. They embrace the idea
of intersectional feminism, a concept that refers to a line of critique and activism rooted in
multiracial feminist movements in the second half of the twentieth century and eventually concretized in the work of legal scholar Kimberlé Crenshaw (1989). Following Crenshaw’s (1991)
powerful discussions of violence against women of color, embracing intersectional analyses
means recognizing that the various sources that oppress marginalized groups—be they racism,
Conclusion
In her feminist account of big data, Elizabeth Losh (2015) reminds us that “individuals do not
float free in a loose matrix of voluntary social relations” (1651). They are, rather, constrained
by power structures or practices that impose their own meanings at different levels. As boyd
and Crawford (2012) put it:
Data are not generic. There is value to analyzing data abstractions, yet retaining context
remains critical, particularly for certain lines of inquiry. Context is hard to interpret at scale
and even harder to maintain when data are reduced to fit into a model.
(671)
The experiences of trans people extend and challenge our understandings of big data and the
relationship between gender and technology in important ways. They lay bare the limits of
rigid or fixed data categories for capturing fluid or multifaceted identities and they urge further
examination—both theoretical and empirical—into the ways data subjects are constrained (and
impacted) by biases and assumptions in scientific and technological development.
While issues of identity, data, and information systems may seem—on one level, at least—merely interesting conceptual or philosophical problems to ponder, they also expose the urgency of
recognizing the very real and lived challenges these tensions and the rapid rise and adoption
of data-intensive technologies and platforms generate for already vulnerable trans and queer
populations. The continued exclusion from or subjugation of these populations to information
systems that do not represent their lives or needs represents a continuation of the “administrative violence” described by Dean Spade (2015)—a phenomenon that we might rightly call data violence in order to also capture the harm inflicted on trans and gender nonconforming people not only by government-run systems, but also by the information systems that permeate our everyday social lives.
References
Anderson, C. (2008). The End of Theory: The Data Deluge Makes the Scientific Method Obsolete. Wired.
Available at: www.wired.com/2008/06/pb-theory/ [Accessed April 16, 2016].
Andrejevic, M. (2013). Infoglut: How Too Much Information Is Changing the Way We Think and Know.
New York: Routledge.
Bassett, C. (2015). Plenty as a Response to Austerity? Big Data Expertise, Cultures and Communities.
European Journal of Cultural Studies, 18(4–5), 548–563.
Beemyn, G., and Brauer, D. (2015). Trans-Inclusive College Records: Meeting the Needs of an Increasingly Diverse U.S. Student Population. TSQ: Transgender Studies Quarterly, 2(3), 478–487.
Bivens, R. (2015). The Gender Binary Will Not Be Deprogrammed: Ten Years of Coding Gender on Facebook. New Media & Society, 1–19. doi:10.1177/1461444815621527.
Boellstorff, T. (2013). Making Big Data, in Theory. First Monday, 18(10). doi:10.5210/fm.v18i10.4869
Bowker, G. (2014). The Theory/Data Thing. International Journal of Communication, 8, 1795–1799.
Bowker, G., and Star, S. L. (1999). Sorting Things Out: Classification and Its Consequences. Cambridge,
MA: MIT Press.
boyd, d., and Crawford, K. (2012). Critical Questions for Big Data. Information, Communication and
Society, 15(5), 662–679.
Cockburn, C., and Ormrod, S. (1993). Gender and Technology in the Making. Thousand Oaks, CA: Sage.
Crasnow, S., Wylie, A., Bauchspies, W. K., and Potter, E. (2015). Feminist Perspectives on Science. In Edward N. Zalta (ed.) The Stanford Encyclopedia of Philosophy. Stanford, CA: Metaphysics Research Lab, Stanford University.
Crawford, K., Miltner, K., and Gray, M. L. (2014). Critiquing Big Data: Politics, Ethics, Epistemologies.
International Journal of Communication, 8, 1663–1672.
Crenshaw, K. (1989). Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of
Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. University of Chicago Legal
Forum, 1989(1), 139–167.
Crenshaw, K. (1991). Mapping the Margins: Intersectionality, Identity Politics, and Violence Against
Women of Color. Stanford Law Review, 43(6), 1241–1299.
Davidson, M. (2007). Seeking Refuge Under the Umbrella: Inclusion, Exclusion, and Organizing Within
the Category Transgender. Sexuality Research & Social Policy: Journal of NSRC, 4(4), 60–80.
Dodge, M., and Kitchin, R. (2005). Codes of Life: Identification Codes and the Machine-Readable World.
Environment and Planning D: Society and Space, 23(6), 851–881.
Duhaime-Ross, A. (2014). Apple Promised an Expansive Health App, So Why Can’t I Track Menstruation? The Verge. Available at: www.theverge.com/2014/9/25/6844021/apple-promised-an-expansive-health-app-so-why-cant-i-track [Accessed April 16, 2016].
Ensmenger, N. (2010). Making Programming Masculine. In T. J. Misa (ed.) Gender Codes: Why Women
Are Leaving Computing. Hoboken, NJ: Wiley, 115–142.
Fausto-Sterling, A. (1985). Myths of Gender: Biological Theories About Women and Men. New York: Basic Books.
Friedman, B. (1996). Value-Sensitive Design. Interactions, 3(6), 16–23.
Gehl, R. W. (2015). Sharing, Knowledge Management and Big Data: A Partial Genealogy of the Data
Scientist. European Journal of Cultural Studies, 18(4–5), 413–428.
Haimson, O. L., Brubaker, J. R., Dombrowski, L., and Hayes, G. R. (2015). Disclosure, Stress, and Support During Gender Transition on Facebook. In CSCW ’15. Vancouver, BC, Canada: ACM, 1176–1190.
Harding, S. (1991). Whose Science? Whose Knowledge? Thinking From Women’s Lives. Ithaca, NY: Cornell University Press.
Harrison-Quintana, J., Grant, J. M., and Rivera, I. G. (2015). Boxes of Our Own Creation: A Trans Data
Collection Wo/Manifesto. TSQ: Transgender Studies Quarterly, 2(1), 166–174.
Hoffmann, A. L., and Bloom, R. (forthcoming). Digitizing Books, Obscuring Women’s Work: Google
Books, Librarians, and Ideologies of Access. Ada: A Journal of Gender, New Media, and Technology, 9.
Johnson, J. A. (2014). From Open Data to Information Justice. Ethics and Information Technology, 16(4),
263–274.
Jurgenson, N. (2014). View From Nowhere. The New Inquiry. Available at: http://thenewinquiry.com/essays/view-from-nowhere/ [Accessed April 16, 2016].
Kitchin, R. (2014a). Big Data, New Epistemologies and Paradigm Shifts. Big Data & Society, 1(1), 1–12.
Kitchin, R. (2014b). The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. Thousand Oaks, CA: Sage.
Kitchin, R., and McArdle, G. (2016). What Makes Big Data, Big Data? Exploring the Ontological Characteristics of 26 Datasets. Big Data & Society, 3(1), 1–10.
Landström, C. (2007). Queering Feminist Technology Studies. Feminist Theory, 8(1), 7–26.
Lerman, N. E., Mohun, A. P., and Oldenziel, R. (2003). The Shoulders We Stand On/The View From
Here: Historiography and Directions for Research. In N. E. Lerman, R. Oldenziel, and A. P. Mohun
(eds.) Gender & Technology: A Reader. Baltimore, MD: The Johns Hopkins University Press, 425–450.
Losh, E. (2015). Feminism Reads Big Data: “Social Physics,” Atomism, and Selfiecity. International
Journal of Communication, 9, 1647–1659.
Mayer-Schönberger, V., and Cukier, K. (2013). Big Data: A Revolution That Will Transform How We Live,
Work, and Think. London: John Murray.
McGaw, J. A. (1982). Women and the History of American Technology. Signs, 7, 798–828.
Nataf, Z. I. (1996). Lesbians Talk: Transgender. New York: Scarlet Press.
Noble, S. (2013). Google Search: Hyper-Visibility as a Means of Rendering Black Women and Girls Invisible. InVisible Culture, 19. Available at: http://ivc.lib.rochester.edu/google-search-hyper-visibility-as-a-means-of-rendering-black-women-and-girls-invisible/.
Richardson, S. (2013). Sex Itself: The Search for Male and Female in the Human Genome. Chicago: University of Chicago Press.
Spade, D. (2015). Normal Life: Administrative Violence, Critical Trans Politics, and the Limits of Law.
Durham, NC: Duke University Press.
Stryker, S. (2009). Transgender History. Berkeley, CA: Seal Press.
Sydell, L. (2014). The Forgotten Female Programmers Who Created Modern Tech. NPR: All Tech Considered. Available at: www.npr.org/sections/alltechconsidered/2014/10/06/345799830/the-forgotten-female-programmers-who-created-modern-tech [Accessed April 16, 2016].
Vincent, A. R. (2016). Does Tinder Have a Transphobia Problem? The Huffington Post. Available at:
www.huffingtonpost.com/addison-rose-vincent/does-tinder-have-a-transp_b_9528554.html [Accessed
April 16, 2016].
Wajcman, J. (2009). Feminist Theories of Technology. Cambridge Journal of Economics, 1–10. doi:10.1093/cje/ben057.
Wen, S. (2014). The Ladies Vanish. The New Inquiry. Available at: http://thenewinquiry.com/essays/the-ladies-vanish/ [Accessed April 16, 2016].
Wylie, A. (2007). The Feminism Question in Science: What Does It Mean to “Do Social Science as a
Feminist”? In S. Hesse-Biber (ed.) Handbook of Feminist Research. Thousand Oaks, CA: Sage.
Zikopoulos, P., and Eaton, C. (2011). Understanding Big Data: Analytics for Enterprise Class Hadoop
and Streaming Data. New York: McGraw Hill.
Chapter 2
Discrimination
D. E. Wittkower
Langdon Winner’s famous article, “Do Artifacts Have Politics?” (1980), must be the first thing mentioned in any discussion of what philosophy of technology has contributed to our understanding of discrimination. The examples addressed, most of all the famous ‘racist bridges’ of Robert Moses—allegedly built low in order to specifically exclude New York City buses, and the kind of person more likely to be using public transportation, from certain beaches—make clear that artifacts can be said at least to have political effects, including advancing racial discrimination.
Winner’s work has been highly cited, and rightly so, but it is not a full theory of discriminatory technologies, and it is not grounded in multiple theoretical perspectives in order to maximize its applicability within the field. In the following, I will develop such a theory; connect it with Heideggerian, Latourian, and postphenomenological theoretical structures; and demonstrate its applicability to a wide and widening range of forms of normativity, exclusion, and discrimination. This analysis will be limited to an American cultural context, as cultural constructions of discriminatory norms, especially those of race, gender, and religion, are far too varied across regions and societies to be meaningfully addressed simultaneously, and my goal in this article is in-depth analysis rather than cross-cultural exploration. I hope that readers in Germany will be able to see parallel issues presented to Turks; that Israeli readers will see parallel problems in a different religious social normativity; that those from Brazil and India will see similarities in the way that their societies normativize ‘whiteness’ within more of a spectrum, but still resulting in significant discrimination; that Canadians and Swedes will see connections to the cultural erasure of their indigenous peoples; and so on.
First, it must be asked what it would be to have a full theory of discriminatory technologies. Next, it must be asked what we are to make of the idea of a ‘discriminatory technology.’ Following this, we may approach the three theoretical groundings described earlier in order to provide support to the theory and to provide direction in seeing what kinds of artifacts it can help to identify and understand as technologies of discrimination.
Band-Aids come in a variety of shapes and sizes, showing their responsivity to a variety of contexts of use. In philosopher Luciano Floridi’s language (2014), the bandage needs to have the right protocol to fit its prompter—in this case, the minor cut in the skin. This is why it is made to minimize infection, with a mesh to discourage adhesion to the healing flesh, and made available in different sizes in order to match up with the naturally occurring diversity of bleeding gashes.
There are limits to the diversity of adhesive bandage sizes and shapes, however. Three sizes to a pack is good enough to cover most cases satisfactorily well, and we recognize it would be unreasonable to expect just the right bandage for each particular wound. Having only a single size is not responsive enough to the relevant cases, while having a dozen different sizes in every box is far more responsivity than is necessary and would likely result in a bunch of odd shapes and sizes—which would pile up in half-empty boxes accumulating in the corners of our medicine cabinets, eventually to be discarded.
The invariance of the color of adhesive bandages, until relatively recently, places significant
variance from ‘White’ skin in this same category of irrelevance. Dark skin is apparently a
prompter to which it is not necessary to design a protocol to respond. This may be an effect of
‘color-blindness’: the (White) product designers failed to consider that ‘flesh colored’ might not
be the same thing for everyone. Although exceedingly unlikely in this particular case, this could
be an effect of conscious discrimination akin to Moses’s bridges, where the designers specifically chose to design a Whites-only product. Or this could simply reflect the reality of the market,
where a color is chosen that will match best for the largest set of similarly colored consumers.
In any case, the ontology established by the object is the same: the function of the bandage’s
color is to match the skin; when it fails to do so, it implicitly claims that this flesh is not ‘flesh
colored.’ The proper function of the technology contains within it an ontology that may define
some persons as normative and others as lesser or deviant Others.
A full theory of technologies of discrimination should engage with technologies at this
level—not by reading them as texts, not by producing analyses of particular effects or even
kinds of effects of technology, but by theorizing how those technologies embody, transmit, and
produce ontologies of normativity that result in privilege and discrimination.
Irish to be White. The Polish have similarly and recently disappeared as a category distinguished from ‘Whites.’ Hungarians and Bulgarians, in the mid-twentieth century, were subject to racial slurs (bohunk, hunky) largely unrecognizable today to the people that these words were
meant to Other and to denigrate.
Religious and language differences play at least as large a role as more distinctively race-related features in determining who counts as White. In European history, Spaniards and Italians have not always been considered White, especially those of Muslim faith. As Dyer points out (1997: 42), Jews seem to have been considered Black for some time, and became White only in the latter part of the twentieth century. Semitic persons, especially when Muslim or immigrants, are Caucasian but nonetheless are often not White—the same is true of Latina/os, especially when English is a second language.
What then is it to be White? The approximate answer from critical race theory is that being
‘White’ just means that it is not noted in any consequential way that you are raced at all. If you
are encountered in the context of a racial identification, you are a person of color (PoC); if you
are not, you are White. In this way we see that identification as White is a lack of judgment
rather than a concrete claim: that you are White means nothing besides that you have not been
identified as something else.
This judgment requires training—just as does having a discriminating palate. The way in
which formerly non-White persons become White involves a decrease in the weight placed
upon prejudicial claims against the minority in question, but it also involves a decrease in the
amount of training people receive in identifying those persons as a group in the first place. The
Nazis produced materials specifically designed to help Whites to identify and out Jews, but
much more innocuous racial caricatures, as in political cartoons, play a similar role.
This is why status as White or PoC is more a matter of how we are interpreted than a matter of fact, although matters of fact form the basis of any interpretation, and certainly
can limit the range of interpretation available. Someone of mixed race may pass as White and
identify as White, and have no idea that they have non-White ancestors. A person of European
descent with curly hair may be darker skinned than a person of African descent with straight
hair, but we have been trained to interpret racial cues in such consistent and nuanced ways that
there may be little or no controversy about the whiteness of the former and the blackness of the
latter. Many people, including myself, are raced differently in different contexts or when wearing clothes or hairstyles with or without ethnic markers. Few people today are raised in environments in which they are trained to recognize my features or my surname as racial markers—but
some are. To most, I am White, but I have been threatened with violence by White racists for
my non-White identity.
This is what we mean when we say that race or gender are socially constructed: they are the
product of human labor, manufactured using some physical basis, genetic and phenotypic, but
reducible to or determined by that basis in only the same kind of limited sense that other manufactured goods are reducible to or determined by their raw materials.
An invisible starting point in our encounter with one another, prior to the construction of
difference, was described by Martin Heidegger (1927/1996) as das Man (“the One,” or “the
they”). By “the One” he means only to indicate the approach to others named by the word one
in phrases like “one doesn’t do that.” The “One” who does this or doesn’t do that is no person
in particular, or even a description of a variety or totality of actually existing persons, but is
instead a set of expectations we are trained to have, on the basis of which we judge others and
ourselves. The One is normativity of all kinds: One is kind, One doesn’t tell lies, and One sets
the table with the fork on the left of the plate and the knife on the right. And one is called to
account when one doesn’t do what One does!