Spaces for the Future

Focused on mapping out contemporary and future domains in philosophy of technology, this
volume serves as an excellent, forward-looking resource in the field and in cognate areas of
study. The 32 chapters, all of them appearing in print here for the first time, were written by
both established scholars and fresh voices. They cover topics ranging from data discrimina-
tion and engineering design, to art and technology, space junk, and beyond. Spaces for the
Future: A Companion to Philosophy of Technology is structured in six parts: (1) Ethical Space
and Experience; (2) Political Space and Agency; (3) Virtual Space and Property; (4) Personal
Space and Design; (5) Inner Space and Environment; and (6) Outer Space and Imagination. The
organization maps out current and emerging spaces of activity in the field and anticipates the
big issues that we soon will face.

Joseph C. Pitt, a Fellow of AAAS, has been teaching at Virginia Tech for forty-five years.
His major area of scholarship is philosophy of technology with an emphasis on the impact of
technological innovation on scientific change. He is the author of four books and the editor/
co-editor of twelve others. He is past president of the Society for Philosophy and Technology
and served as editor-in-chief of Techné: Research in Philosophy and Technology, of which he
is currently on the editorial board.

Ashley Shew works in philosophy of technology on topics relating to emerging technologies,
animals, disability, and environment. Her first book, Animal Constructions and Technological
Knowledge, will be published next year. Her latest work concerns how disabled bodies are used
in technological imagination and how narratives about technology often fail to match up to the
lived experience of disability. She collects class materials and information about disability and
technology on her website, http://techanddisability.com.
Spaces for the Future

A Companion to Philosophy
of Technology

Edited by Joseph C. Pitt
and Ashley Shew
First published 2018
by Routledge
711 Third Avenue, New York, NY 10017
and by Routledge
2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2018 Taylor & Francis
The right of Joseph C. Pitt and Ashley Shew to be identified as the authors of
the editorial material, and of the authors for their individual chapters, has been
asserted in accordance with sections 77 and 78 of the Copyright, Designs and
Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or
utilised in any form or by any electronic, mechanical, or other means, now
known or hereafter invented, including photocopying and recording, or in any
information storage or retrieval system, without permission in writing from the
publishers.
Trademark notice: Product or corporate names may be trademarks or registered
trademarks, and are used only for identification and explanation without intent
to infringe.
Library of Congress Cataloging-in-Publication Data
A catalog record for this book has been requested
ISBN: 978-0-415-84296-9 (hbk)
ISBN: 978-0-203-73565-7 (ebk)
Typeset in Times New Roman
by Apex CoVantage, LLC
Visit the companion website: http://spacesforthefuture.com
Contents

List of Contributors viii
Editor Introduction xii

PART 1
Ethical Space and Experience 1

1 Data, Technology, and Gender: Thinking About (and From) Trans Lives 3
ANNA LAUREN HOFFMANN

2 Discrimination 14
D. E. WITTKOWER

3 Video Games and Ethics 29
MONIQUE WONDERLY

4 Social Networking: The Dialectics of Sharing 42
YONI VAN DEN EEDE AND KATLEEN GABRIELS

5 Technoart 54
DON IHDE

6 The Philosophy of Art and Technology 62
JESSIE MANN

PART 2
Political Space and Agency 71

7 Robots With Guns 73
SHANNON VALLOR

8 Educational Technology 82
ANDREW WELLS GARNAR

9 Religious Transcendence 92
CARL MITCHAM

10 Agency and Citizenship in a Technological Society 98
ANDREW FEENBERG

11 Animals in Philosophy of Technology 108
ASHLEY SHEW

PART 3
Virtual Space and Property 117

12 Obscurity and Privacy 119
EVAN SELINGER AND WOODROW HARTZOG

13 Copyright Between Economic and Cultural Models of Creativity 130
GORDON HULL

14 Cyberwarfare 141
BENJAMIN C. JANTZEN

15 The Cloud 154
CRAIG A. CONDELLA AND JULIE C. SWIERCZEK

16 Ethical Issues in Big Data 164
DEBORAH G. JOHNSON

17 Virtual Environments 174
JOHNNY HARTZ SØRAKER

PART 4
Personal Space and Design 183

18 The Organization of User Experience 185
ROBERT ROSENBERGER

19 Engineering Design 196
PIETER E. VERMAAS

20 Design 208
ANN JOHNSON

21 Dancing the Device: A Translational Approach to Technology 216
ALBRECHT FRITZSCHE

22 On Harnessing of Birth in the Technical Age 224
DANA S. BELU

23 Moore’s Regula 238
CYRUS C. M. MODY

PART 5
Inner Space and Environment 249

24 Is Renewable Energy Technology Revolutionary? 251
ROBERT DARROW

25 Geoengineering and Climate Change 263
TINA SIKKA

26 Fracking 281
ADAM BRIGGLE

27 Climate Change and Philosophy of Technology 292
ROBERT-JAN GEERTS

PART 6
Outer Space and Imagination 303

28 Space Tourism and Science Fiction 305
ROSALYN W. BERNE

29 Well-Ordered Engineering: Participatory Technology Assessment at NASA 314
ZACHARY PIRTLE AND DAVID C. TOMBLIN

30 When Loving Your Planet Isn’t Enough 326
DIANA P. HOYT

31 Transcendence in Space 340
JOSEPH C. PITT

32 The Role of Technology in Natural Science 346
NICHOLAS RESCHER

Index 353
Contributors

Dana S. Belu is Associate Professor of Philosophy in the Philosophy Department at California
State University, Dominguez Hills. She works at the intersection of phenomenology, phi-
losophy of technology, and feminist philosophy. Her book Heidegger, Reproductive Technol-
ogy, and the Motherless Age came out in 2017.
Rosalyn W. Berne explores the intersecting realms of emerging technologies, science, fiction,
and myth. She is author of Nanotalk: Conversations with Scientists and Engineers About
Ethics, Meaning and Belief in the Development of Nanotechnology (2005), Creating Life
From Life: Biotechnology and Science Fiction (2014), and the science fiction novel Waiting
in the Silence (2012).
Adam Briggle is Associate Professor in the Philosophy and Religion Department at the Uni-
versity of North Texas. He is the author of A Field Philosopher’s Guide to Fracking (2015)
and co-author with Robert Frodeman of Socrates Tenured: The Institutions of 21st Century
Philosophy (2016).
Craig A. Condella is Associate Professor in the departments of Philosophy and of Cultural,
Environmental, and Global Studies at Salve Regina University. His interests include the
philosophy of technology, environmental ethics, and the history and philosophy of science.
Robert Darrow is a PhD Candidate in Political Science at University of Massachusetts,
Amherst, and holds a master’s degree in Philosophy from Virginia Tech. He is currently
researching the large-scale adoption of renewable energy technology in Denmark. He has
also written about mountaintop-removal coal mining in Appalachia.
Andrew Feenberg is Canada Research Chair in Philosophy of Technology at Simon Fraser
University, and Directeur de Programme in the College International de Philosophie in Paris.
He is the author of Questioning Technology (1999), Between Reason and Experience (2010),
and The Philosophy of Praxis: Marx, Lukács, and the Frankfurt School (2014).
Albrecht Fritzsche works at the Institute of Information Systems of the University of Erlangen-
Nuremberg. He holds doctoral degrees in philosophy and management and worked for many
years in the manufacturing industry. His research is focused on innovation studies.
Katleen Gabriels is Assistant Professor at Eindhoven University of Technology in the Nether-
lands. Her research focuses on the ethical aspects of the Internet of Things and the virtualiza-
tion of morality. She publishes within philosophy and ethics of technology and media studies.
Andrew Wells Garnar is Senior Lecturer in Clemson University’s Department of Philosophy
and Religion. He teaches classes on science, technology, and postmodernism. His research
deals with these topics as well, from the perspective of American pragmatism.
Robert-Jan Geerts is Assistant Professor in Ethics and Technology at University of Twente.
He is interested in the interface between philosophy of technology and environmental phi-
losophy, and his work explores avenues to broaden debates on energy transition.
Woodrow Hartzog is the W. Stancil Starnes Professor of Law at Samford University. His
research focuses on privacy law, media law, internet law, intellectual property, torts, and
contracts. Currently he is writing Privacy’s Blueprint: The Battle to Control the Design of
New Technologies.
Anna Lauren Hoffmann is Assistant Professor at the Information School at the University
of Washington. Her research focuses on information, technology, culture, and ethics, with
particular focus on how design and discourses of information technology can promote or
hinder social justice.
Diana P. Hoyt is a senior policy analyst at National Aeronautics and Space Administration
(NASA) Headquarters and currently manages NASA’s Regional Economic Development
and Innovation program. Her primary research interest is the philosophy of technology.
Gordon Hull is Associate Professor of Philosophy and Public Policy at the University of North
Carolina, Charlotte, where he directs the Center for Professional and Applied Ethics. He works
on topics in law, technology, and political philosophy such as intellectual property and privacy.
Don Ihde is Distinguished Professor of Philosophy, Emeritus, at Stony Brook University, and
author of twenty-two books, most recently Acoustic Technics (2015) and Husserl’s Missing
Technologies (2016).
Benjamin C. Jantzen joined the Philosophy Department at Virginia Tech in 2011. He works
broadly in the philosophy of science and the logic of discovery. Currently, he is developing
computer algorithms for the automated discovery of scientifically relevant natural kinds.
Ann Johnson passed away before the completion of this volume and is dearly missed by her
colleagues in history and philosophy of engineering. She did qualitative research on engi-
neers and engineering, focusing particularly on the way engineers produce and circulate new
knowledge and the role of specialized communities in the dissemination of knowledge. Her
book, Hitting the Brakes (2009), is a study of knowledge production and community forma-
tion in the design of antilock braking systems for passenger automobiles.
Deborah G. Johnson is the Anne Shirley Carter Olsson Professor of Applied Ethics, Emeritus,
in the Science, Technology, and Society Program at the University of Virginia. Best known
for her work on computer ethics and engineering ethics, Johnson’s research examines the
ethical, social, and policy implications of technology.
Jessie Mann is an artist whose work can be found in the permanent collections of the J. Paul
Getty Museum, the St. Petersburg Museum, and the Art Institute of Chicago. Mann’s research
is focused on the cognitive neuroscience of self-simulation and the underlying neurological
mechanisms that support neurorehabilitative therapies that rely on self-simulation, such as
mental imagery and avatar-assisted therapies.
Carl Mitcham is Professor of Philosophy of Technology at Renmin University of China and of
Liberal Arts and International Studies at the Colorado School of Mines. His work focuses on
ethics and policy issues related to science, technology, and society.
Cyrus C. M. Mody is Professor and Chair in the History of Science, Technology, and Innovation
at Maastricht University. He is the author of Instrumental Community: Probe Microscopy
and the Path to Nanotechnology (2011) and The Long Arm of Moore’s Law: Microelectron-
ics and American Science (2016).
Zachary Pirtle is an engineer at NASA Headquarters and a doctoral candidate in Systems
Engineering at George Washington University. Before joining NASA in 2010, he worked
with the Consortium for Science, Policy, and Outcomes at Arizona State University, where
he obtained degrees in mechanical engineering and philosophy, and a master’s degree in
civil engineering.
Joseph C. Pitt is Professor of Philosophy and of Science and Technology in Society at Vir-
ginia Tech. A Fellow of the AAAS, he is the author of four books and the editor/co-editor of
thirteen others. His area of research is the impact of technological innovation on scientific
change.
Nicholas Rescher is a Distinguished University Professor of Philosophy at the University of
Pittsburgh, where he has also served as Chairman of the Philosophy Department and a Direc-
tor of the Center for Philosophy of Science. In a productive research career extending over
six decades, he has established himself as a systematic philosopher with more than one hun-
dred books to his credit, ranging over all areas of philosophy with sixteen of them translated
from English into eight other languages.
Robert Rosenberger studies the phenomenology of technology, investigating topics such as
homelessness, driving impairment, classroom tech, and laboratory imaging. He is Associate
Professor in the School of Public Policy at the Georgia Institute of Technology.
Evan Selinger is Professor of Philosophy at Rochester Institute of Technology. He specializes
in the philosophy of technology, focusing on ethical issues of emerging technology and pri-
vacy. Currently, he is co-writing Being Human in the 21st Century with Brett Frischmann.
Ashley Shew, Assistant Professor in the Department of Science and Technology in Society at
Virginia Tech, works on issues in philosophy of technology related to technological knowl-
edge, animal studies, biotech, and disability studies. Her first book, Animal Constructions
and Technological Knowledge, is under contract and forthcoming.
Tina Sikka is currently a lecturer at Newcastle University in the Media Culture and Heritage
Department. She has published extensively on the importance of public participation in tech-
nological design, feminist analyses of climate science, and scientific knowledge production.
She is currently writing a book on feminist empiricism, critical theory, and geoengineering
that is under contract and scheduled for publication in late 2017.
Johnny Hartz Søraker, formerly a faculty member at University of Twente, works as a policy
specialist for Google in Dublin, Ireland. His main research specialization lies in the intersec-
tion of information and communication technologies, philosophy, and psychology, with a
particular emphasis on virtual environments and quality of life.
Julie C. Swierczek is Librarian for Primary Resources and Metadata Services at Swarthmore
College. She holds a master’s in Philosophy from Miami University of Ohio and a Master
of Science in Library and Information Science from the University of Illinois at Urbana-
Champaign. Her interests include metadata, the organization of information, digital foren-
sics, and information ethics.
David C. Tomblin is Director of the Science, Technology, and Society Scholars program at
the University of Maryland, College Park. He runs a Robotics Service Learning program
and an Infrastructure and Society Service Learning program. He works with a consortium
of universities, science museums, and nonprofits called Expert and Citizen Assessment of
Technology to develop and do research on public engagement exercises for government
agencies such as NASA, Department of Energy, the EPA, and NOAA.
Shannon Vallor is Associate Professor of Philosophy at Santa Clara University and President
of the International Society for Philosophy and Technology. Her primary research expertise
is the philosophy and ethics of emerging science and technology.
Yoni Van Den Eede is a postdoctoral fellow of the Research Foundation–Flanders (FWO),
affiliated with the research groups Centre for Ethics and Humanism and Culture, Emancipa-
tion, Media, and Society, both at the Free University of Brussels. He conducts research into
the philosophy of technology, media theory, and media ecology. He is the author of Amor
Technologiae: Marshall McLuhan as Philosopher of Technology (2012).
Pieter E. Vermaas is Associate Professor in the Philosophy Department of Delft University
of Technology, the Netherlands. He does research on design methodology, design for moral
values, and validation in design research and is editing the book series Philosophy of Engi-
neering and Technology and Design Research Foundations.
D. E. Wittkower is Associate Professor of Philosophy at Old Dominion University, where he
teaches philosophy of technology, philosophy of social media, information ethics, and infor-
mation literacy and digital culture.
Monique Wonderly is the Harold T. Shapiro Postdoctoral Research Associate in Bioethics at
the Princeton University Center for Human Values. She has published in the areas of applied
ethics, philosophy of emotion, and history of philosophy.
Editor Introduction

This volume provides some important things to think about when it comes to philosophical
problems arising from our technologies, past, present, and future. The philosophy of technol-
ogy can be seen as a relatively young field, but the questions the field poses—about the nature
of humanity, technology, and our relationship to each other and the world—are ancient. We do
not offer a history of the philosophy of technology here because that has already been done, and
several other collections already present the classics. Rather, we offer a series of reflections on
the world we live in, witness, and foresee, primarily characterized in terms of the technologies
we have, emerging technologies, and technologies that may develop in the future. Many of our
authors belong to a new generation of philosophers of technology and bring an excitement to
their work, as adventurers and explorers of the future, laying out projects that need attention.
We have chosen ‘space’ as an organizing theme for mapping out the current and emerging
state of our field. We see ourselves, our work, and our ideals as living in ethical, political, vir-
tual, personal, inner, and outer space. There may be other spaces to explore; these categories
are not intended to be exhaustive, but we think they represent the spaces that philosophy of
technology is now and will be exploring in the near future. By stressing space we also stress our
communal nature and the fact that we are all in this together.
The volume we are proud to bring together here features many fresh voices—and more sea-
soned ones that offer new material. We hope to chart the course of philosophy of technology as
it will soon be written. These collected new works complement the existing canon and serve to
frame the technological world in which we currently exist.
Joseph C. Pitt
Ashley Shew
Blacksburg, Virginia
Part 1

Ethical Space and Experience

Chapter 1

Data, Technology, and Gender
Thinking About (and From) Trans Lives

Anna Lauren Hoffmann

Introduction
For scholars and students interested in topics of gender identity, data, and information tech-
nology, the current historical moment is a curious one. The proliferation of personal comput-
ing devices—from laptops to mobile phones to “smart” watches—combined with widespread
internet access, means that people are producing unprecedented amounts of digital data, leading
some scholars and technology evangelists to declare a “big data” revolution. At the same time,
issues of sexism and gender inequality have taken on new urgency as women face increasing
levels of harassment online, especially on large social networking sites like Twitter. The blame
for this falls, in part, on platform owners and developers that fail to thoroughly consider the role
of design in promoting safety for the most vulnerable users. Finally, the emergence of high-profile
transgender activists, performers, and celebrities—from Laverne Cox to Caitlyn Jenner—has
brought attention to a minority population of trans, nonbinary, and gender-nonconforming peo-
ple and communities that have been (until now, at least) largely overlooked, often to the detri-
ment of the health and safety of these populations.
Of course, some would view these three trends as mostly unrelated: at a quick glance, big
data, gender and sexism online, and the health and needs of transgender people seem to have
little to do with one another. Against this easy assumption, however, this chapter suggests
that—while not wholly reducible to one another—these three issues intersect in important ways
and, in particular, they shine a light on the ongoing struggles minority identities and lives face
in our increasingly data-driven world. The ‘big data revolution’ cannot be divorced from the
technologies and systems that support it—technologies and systems that have long struggled to
account for diverse and nonnormative lives.
In the following, these three trends are woven together to further our thinking about gender,
identity, and technology. The first section attends to the biases and assumptions that underwrite
the idea of ‘big data.’ It argues that big data and the quantitative insights into human behavior
they stand to provide are not given but, rather, they are something we make and remake in prac-
tice. The second section traces key strands of thinking about the relationship between gender
and technology, offering deeper insight into the ways in which gendered biases or stereotypes
are built into the practice of scientific and technological development. Finally, the third section
takes these lessons and extends them to thinking about the lives and identities of gender minori-
ties, such as transgender individuals. I should note, however, that the discussions of relevant
literature throughout this chapter are not intended to be comprehensive (indeed, a fully com-
prehensive literature review of any section’s topic would fall outside the scope of this chapter).
Rather, I mean only to hit on the most salient trends and points as they relate to and help to
discuss issues of data, technology, information systems, and gender identity.
Confronting the Mythology of Big Data
The term big data represents many things. As Rob Kitchin (2014a) describes, the term often
refers to data sets and databases that are ‘big’ along three lines: volume, velocity, and variety
(the 3Vs) (67–68; see also: Zikopoulos and Eaton 2011). Under this definition, big data are
unique because of their massive size (petabytes or even zettabytes), the rapidity of their pro-
duction (sometimes near real time, as with data generated by social networking sites), and their
diversity (they are expansive, contain data and metadata, and they can be both structured and
unstructured) (Kitchin 2014b: 1). Big data are also sometimes marked by other features, like
scalability (Mayer-Schönberger and Cukier 2013), the ease by which they are combined with
other data (Kitchin and McArdle 2016), and their often fine-grained or detailed nature (Dodge
and Kitchin 2005). Beyond technical features, big data also represent a kind of mythology.
As boyd and Crawford (2012) put it, big data simultaneously are produced and thrive on a
“widespread belief that large data sets offer a higher form of intelligence and knowledge that
can generate insights that were previously impossible, with the aura of truth, objectivity, and
accuracy” (663).
Although the technical features of big data may raise their own practical and philosophical
issues, the focus of this section is the mythology of big data. This myth—the idea that more
and bigger data equals more and greater truth—is a seductive one; it suggests that the social
world can be explained from a value-neutral, objective point in much the same way that the
physical universe is understood through measurable and mathematically quantifiable features
(Jurgenson 2014). Instead of filtering our data through the ideas and theories that make up vari-
ous branches of the social sciences (like sociology, linguistics, or psychology) we can simply
harness the power of today’s computers to perform automated statistical analyses on massive
data sets that capture traces of human behavior. Computers can find patterns and identify cor-
relations that humans cannot, patterns that—while not proof of causation—are basically good
enough to do the job of predicting (rather than explaining) future behavior. As Geoffrey Bowker
(2014) describes, such an approach seems—at least superficially—to “[avoid] funneling our
findings through vapid stereotypes” (1796). Amazon, for example, deploys an online recom-
mender system that

work[s] through correlation of purchases without passing through the vapid categories of
the marketers—you don’t need to know whether someone is male or female, queer or
straight, you just need to know his or her patterns of purchases and find similar clusters.
(Bowker 2014: 1796)
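
The correlation-only recommending that Bowker describes can be sketched as nearest-neighbor collaborative filtering over bare purchase vectors. The following is an illustrative toy: the data, function names, and choice of cosine similarity are assumptions made for exposition, not a description of Amazon's actual system.

```python
import math

# Toy purchase matrix: each (anonymous) user is a 0/1 vector over four
# products. No demographic fields exist anywhere in the data.
purchases = {
    "u1": [1, 1, 0, 0],
    "u2": [1, 1, 0, 1],
    "u3": [0, 0, 1, 1],
}

def cosine(a, b):
    """Cosine similarity between two purchase vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def most_similar(user):
    """Return the user whose purchase pattern correlates most closely."""
    others = [(cosine(purchases[user], v), u)
              for u, v in purchases.items() if u != user]
    return max(others)[1]

def recommend(user):
    """Suggest item indices the nearest neighbor bought but `user` has not."""
    neighbor = purchases[most_similar(user)]
    return [i for i, (mine, theirs) in enumerate(zip(purchases[user], neighbor))
            if theirs and not mine]

print(most_similar("u1"))  # → u2
print(recommend("u1"))     # → [3]
```

Gender, sexuality, and every other marketing category are simply absent from this representation; the "similar clusters" fall out of co-purchase correlation alone, which is what lends such systems their apparent neutrality.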

The seductiveness of this idea has led some big data evangelists to proclaim that we have
reached the “end of theory,” a point in time where knowledge production is simply a matter of
“[throwing] numbers into the biggest computing clusters the world has ever seen and [letting]
statistical algorithms find patterns where science cannot” (Anderson 2008: n.p.). As Caroline
Bassett (2015) summarizes the idea, “Big Data ushers in new forms of expertise and promises to
render various forms of human expertise increasingly unnecessary” through “automation of forms
of data capture, information gathering, data analysis and ultimately knowledge production”
(549). In Robert W. Gehl’s (2015) words, “a common refrain . . . is that we are in for a revolu-
tion, but only if we recognize the problem of too much data and accept the impartial findings of
data science for the good of us all” (420). In short, big data appear to make “human expertise
seem increasingly beside the point” (Bassett 2015: 549).
One can only admit the “end of theory,” however, if one also accepts uncritically the mythology of
big data. Many scholars—including those cited earlier—warn that this myth is dangerous,
as it overlooks the ways in which our very ideas about what constitutes ‘data’ are themselves
framed by theoretical perspectives and assumptions. At a fundamental level, the mere act of
calling some things data (and disregarding other things as ‘not data’) represents a kind of theory
itself: even unstructured data rely on categories of chronological time or textual sources that
have already been shaped by assumptions about the world enforced by data collection instru-
ments. Any given data set is, by necessity, limited by its sources or its aims—no single data
set, even the most massive ones, can contain all conceivable data points because not everyone
or everything is conceived of as ‘data.’ Consequently, big data continue to suffer from “blind
spots and problems of representativeness, precisely because [they] cannot account for those
who participate in the social world in ways that do not register as digital signals” (Crawford
et al. 2014: 1667).
Assumptions about what constitute ‘data’ are built into the instruments and tools we use to
collect, analyze, and understand the data itself. These tools “have their own inbuilt limitations
and restrictions”—for example, data available through social networking sites like Twitter and
Facebook are constrained by the poor archiving and search functions of those sites, making it
easy for researchers to look at events or conversations in the present and immediate past but
also difficult to track older events or conversations (boyd and Crawford 2012: 666). As a con-
sequence, research conducted on or through these sites often inherits a temporal bias, and given
the constraints of these social platforms, researchers prioritize immediacy over more reflective
or temporally distant analyses. The mythology of big data—its appeal to automated, techno-
logically sophisticated systems and claims to objectivity—works to obscure these biases and
their limits for accounting for certain kinds of people or communities (Crawford et al. 2014:
1667). As Bowker (2014) puts it: “just because we have big (or very big, or massive) data does
not mean that our databases are not theoretically structured in ways that enable certain perspec-
tives and disable others” (1797).
To be critical scholars and students of big data, we must be vigilant against this mythology. It
is imperative that we pierce the veil of technological wonder and readily scrutinize big data’s
claims to impartiality or neutrality and recognize that data and knowledge are made legible and
valuable not in a vacuum, but in context. As Tom Boellstorff (2013) rightfully asserts: “There
is a great need for theorization precisely when emerging configurations of data might seem to
make concepts superfluous—to underscore that there is no Archimedean point of pure data
outside conceptual worlds” (n.p.). To be sure, these limits and biases do not mean that
large-scale, data-intensive research is necessarily bad or unimportant. Rather, they simply
underscore the continued relevance of theoretical and other types of inquiry even in the midst
of a big data ‘revolution.’ As Crawford et al. (2014) argue,

the already tired binary of big data—is it good or bad?—neglects a far more complex real-
ity that is developing. There is a multitude of different—sometimes completely opposed—
disciplinary settings, techniques, and practices that still assemble (albeit uncomfortably)
under the banner of big data.
(1665)

Surfacing the Role of Gender in the Design and Production of Science and Technology
The previous section challenged the seeming neutrality and objectivity of big data by reassert-
ing the importance of paying critical attention to the social, political, and technological biases
that underlie processes of collecting, analyzing, and making sense of data. This section builds
on this idea by zeroing in on one particular kind of social and political bias: gender bias. It
6 Anna Lauren Hoffmann

focuses on the work of scholars and commentators that show how scientific and technological
practices (and the knowledge they produce) are shaped and constrained by considerations of
gender.
Early work on gender and technology focused almost exclusively on highlighting the over-
looked contributions of women to the history and development of science and technology. Work
in this vein sometimes focuses on women’s contributions to sites conventionally associated
with men—such as industry, engineering, or scientific research—and demonstrates how the
narratives that have emerged around these sites have tended to privilege the work and ideas of
men despite the presence and contributions of women. For example, a focus on the men who
built the first electronic, all-purpose computer—the Electronic Numerical Integrator and Com-
puter (ENIAC)—overlooks the fact that it was a team of women mathematicians that worked to
program the machine and make it operational (Sydell 2014). These sorts of skewed narratives
“have tended to make the very male world of invention and engineering look ‘normal,’ and thus
even more exclusively male than is actually the case” (Lerman, Mohun, and Oldenziel 2003:
431). As Nathan Ensmenger (2010) summarizes, “the idea that many . . . computing professions
were not only historically unusually accepting of women, but were in fact once considered
‘feminized’ occupations, seems . . . unbelievable” against a backdrop that so heavily associates
computing with men and masculinity (120).
Other approaches work in a different direction, looking instead at activities and spaces his-
torically associated with women but overlooked as significant sites of technological activity.
Building on feminist critiques of Marxism that emphasized the role of unpaid and domestic
labor (most often performed by women), work in this area examines the relationship between
gender and technology outside of conventional sites of scientific or technological production.
Cynthia Cockburn and Susan Ormrod (1993)—in their now-classic work Gender and Tech-
nology in the Making—examined the history and rise of the microwave oven not only in its
design and development phase, but through to its dissemination into kitchens and the home.
Cockburn and Ormrod (1993) show how a technology that starts out as a high-tech innovation
ends up—through processes of marketing, retailing, and associations with ‘women’s work’ like
household cooking—viewed as a rote household appliance, a framing that obscures the ways that
women’s specific technical knowledge (of cooking, for example) also contributed to the design,
distribution, and use of a particular technology.
Despite progress in recognizing the contributions of women in the history of science and
technology, however, biases still persist in our narratives about novel or innovative technolo-
gies. The story of the relatively recent and much-lauded Google Books project, for example,
foregrounds the vision of Google’s founders Sergey Brin and Larry Page as well as the com-
pany’s (male-dominated) engineering teams that developed a novel way for quickly and effec-
tively scanning, digitizing, and bringing entire library collections online (thus greatly expanding
access to recorded knowledge). Lost in this narrative are the contributions of librarians (pri-
marily women) who collected, organized, curated, and maintained the collections upon which
Google Books is built (Hoffmann and Bloom, forthcoming) as well as the women and people
of color who performed the manual labor of turning pages for Google’s scanning machines
(Wen 2014).
Further approaches to gender and technology center not on the narratives that grow up
around particular technologies, but instead on the ways in which gender biases influence the
development and design of technology itself. Work in this vein seeks to uncover how sex-
ist assumptions and stereotypes end up designed—or ‘baked’—into various systems and arti-
facts. For example, video games that offer only male avatars for players (or male and highly
sexualized female avatars) implicitly encode the assumption that only (heterosexual) men play
video games (Friedman 1996). More recently, commentators have pointed out how software
Data, Technology, and Gender 7

applications and personal tracking tools also fail to account for the specific needs of women.
For example, the release of Apple’s HealthKit for its popular mobile phones (iPhones) promised
a set of comprehensive tools for tracking personal health and biometric information. However,
HealthKit’s first iteration failed to include a tool for tracking menstruation (Duhaime-Ross
2014). Studying the relationship between gender and technology in this way allows us to reveal
and destabilize these seemingly ‘natural’ defaults by revealing the ways in which they actively
construct biased or even harmful ideas about women. (For more thorough summaries of the
state of gender and technology studies at different points in its development, see McGaw 1982;
Lerman, Mohun, and Oldenziel 2003; Wajcman 2009.)
Finally, gender has also played an important role in normative analyses of science, helping
to shed light on the moral and ethical consequences of scientific and technological progress.
Sandra Harding’s (1991) foundational work on feminist studies of science implored scholars
to pay close attention to “the problematics, agendas, ethics, consequences, and status” of
science-as-usual, that is, scientific practice as we commonly (and uncritically) under-
stand it (1). Doing so means going beyond simply harnessing the tools of science to explore
overlooked questions or areas (like, for example, women’s health needs); instead, it requires
a thorough examination of the tools themselves—the methods, instruments, practices, and
ethics that have come to typify scientific practice. For example, simply pointing the tools and
technologies of science at issues relevant to women’s lives reinforces the assumption that
gender is binary and that men and women are categorically different, a problematic assump-
tion that has historically undergirded research on sex difference (Fausto-Sterling 1985; Rich-
ardson 2013).
Against the ingrained biases and problematic assumptions of conventional scientific
inquiry, many feminist researchers emphasize not one particular ‘female’ way of knowing,
but—rather—advocate for a plurality of methods for gathering, analyzing, and making sense
of the world. Regardless of method, feminist research should share—as Alison Wylie (2007)
argues—at least four basic commitments: (1) research should be relevant to feminist aims of
exposing and addressing questions of oppression, gendered or otherwise; (2) research should
be grounded in the experiences of marginalized populations, especially women; (3) research-
ers should be accountable, in an ethical sense, to the persons and populations they study; and
(4) researchers should be reflexive, that is, they should foreground (rather than obscure) the
context and assumptions that underwrite their work. Combined, these four dimensions articu-
late a normative vision for science that rejects the sort of objectivity and neutrality assumed by positivist
and other understandings of science. (For a more thorough discussion of these commitments,
see Crasnow et al. 2015.)

Data, Information Technology, and Transgender Lives


The preceding discussions of big data and gender, science, and technology share a careful atten-
tion to the ways in which biases, stereotypes, and problematic assumptions shape our under-
standings of the world. They resist any easy or uncomplicated claims to neutrality or objectivity,
whether in science generally or in analyses of massive data sets specifically. For scholars criti-
cal of big data, this resistance means understanding what is admitted as ‘data’ in the first place
(and what is left out) as well as being cognizant of the histories and politics that shape the cat-
egories we use to make sense of data. For feminist scholars of science and technology, it means
bringing gender and oppression to the fore of our analyses and rejecting the supposed neutrality
of scientific and technological production. Bringing these discussions together helps to open
up a critical discussion of data, information technology, and the continued marginalization of
gender minorities like transgender and gender nonconforming individuals.
At its simplest, the term transgender refers to “people who move away from the gender
they were assigned at birth, people who cross over (trans-) the boundaries constructed by their
culture to define and contain that gender” (Stryker 2009: 1). It is sometimes described as the
opposite of cisgender, a term that refers to people who identify with the gender they were
assigned at birth—typically one of the binary categories, male or female (man or woman, boy or girl). Ety-
mologically, the prefix cis- derives from the Latin term meaning “on this side of,” while the
prefix trans- derives from the Latin term meaning “on the other side of.” It is important to note,
however, that not all members of gender minority groups (those who are not either cisgender
men or cisgender women) necessarily identify as transgender. A range of terms have emerged
to describe a range of identities and lived experiences of gender—from genderqueer individuals
who do not subscribe to any discrete gender category to nonbinary individuals who reject the
binary categories of male/female altogether.
Going beyond this relatively straightforward definition, Megan Davidson (2007) explains
that “the term transgender has no singular, fixed meaning but is instead . . . conceptualized by
both scholars and activists as inclusive of the identities and experiences of some (or perhaps
all) gender-variant, gender- or sex-changing, gender-blending, and gender-bending people”
(Davidson 2007: 60). In some sense, then, ambiguity has long been integral to any conception
of the term (Nataf 1996). It may, at times, include (or exclude)

transsexual people (of all operative statuses), cross-dressers, drag kings and queens, gen-
derqueer people, gay men and lesbians who queer gender lines (such as butch lesbians),
the partners of trans people, and any number of other people who transgress binary sex and
gender in all sorts of named and yet unnamed ways.
(Davidson 2007: 61)

For the purposes of readability, however, the remainder of this section (following Haimson
et al. 2015) uses the shorthand “trans” to refer to the transgender and broader gender noncon-
forming population.
Trans people are relevant to thinking about data and information systems in different ways.
Contemporary practices of collecting, mining, analyzing, and otherwise making use of data
represent new avenues for the exercise of social control (Andrejevic 2013). In addition, efforts
to classify and categorize things are caught up in processes of power and control (Boellstorff
2013: n.p.; see also Bowker and Star 1999). As such, they represent new methods for defining
and containing categories of gender—methods that may or may not account for the identi-
ties and needs of trans populations. For example, paper or online forms that offer only binary
options—only male and female check boxes—pose problems for trans people trying to access
health or other social services, online communities, or even dating sites. Trans women, for
example, have had problems using the popular online dating application Tinder, a site that
offers users only the option to identify by binary (and presumably cis) gender (i.e., man or
woman). Men seeking women on the site have repeatedly reported the accounts of trans women
as fraudulent based on the perceived failure of these women to meet the men’s normative stand-
ard of what a ‘real woman’ is or looks like (Vincent 2016). These reports often result in trans
women’s accounts being suspended and trans users being kicked off the site.
Trans people’s struggles with information systems and biased categories also go beyond
mere check boxes for gender. Many trans people, when socially transitioning genders, choose
a new name for themselves—one that better reflects who they are. However, national and local
policies may make it more or less difficult to legally change the name one was assigned at birth
(also known as a “deadname”). As a result, trans people often find themselves being forced to
disclose information (like their deadname) through bureaucratic or administrative practices that
do not account for or permit the use of chosen names that are not yet legally recognized. For
example, trans people may wish to sign up to use a website like Facebook—a social network-
ing site with more than one billion registered users—using an identity that is different from the
one that appears on their birth certificate or other legal documents. However, because Facebook
enforces a ‘real name’ policy, doing so is often not possible.
Beyond social networking sites, the administrative tensions generated by limited and inflex-
ible data categories and information systems can inform all aspects of a trans individual’s life.
Take, for example, the importance of name and gender information in a university context:

Because college officials use gender in assigning campus housing, determining which
bathrooms and locker room students are permitted to use and deciding on which sports
team students can compete, a gender marker that does not correspond to how a student
identifies might mean that their institution will place them in unfair, uncomfortable, and
potentially dangerous situations.
(Beemyn and Brauer 2015: 480)

More than just an administrative headache, being forced to reveal or go by the wrong gender
or the wrong name can trigger feelings of dysphoria and humiliation. In some cases, it can
also lead to harassment, abuse, and even death. As Dean Spade (2015) forcefully demonstrates
in his book Normal Life, these sorts of conflicts—between prescribed categories and lived or
actual identities—have severe consequences, often leading to trans people being denied hous-
ing, employment, medical or mental health care, and access to homeless or domestic violence
shelters.
As discussed in the first section, the mythology of big data cannot be divorced from the sys-
tems and practices upon which the big data revolution relies—systems and practices that strug-
gle to account for trans identities and lives. As Jeffrey Alan Johnson (2014) reiterates “it should
be clear by now that, contrary to the common perception of data as an objective representation
of reality, the content of data systems is an interpretation” (160). Nonetheless, making sense
of data and navigating information systems, he argues, necessarily requires something like the
illusion of objective representation—an illusion that “establish[es] certain state[s] of the world
as within the realm of normalcy to the exclusion of others” (Johnson 2014: 162). Trans lives and
identities challenge the normalized gender assumptions imposed by information systems in at
least two ways: (1) categorically (through the rejection of binary gender) and (2) conceptually
(through resistance to singular, fixed meanings). In doing so, they expose the limits of quantita-
tive and big data–driven understandings of the world that rely on rigid and reductive categories
in the face of fluid or shifting identities.
In addition, contemporary data science and information systems stand to further marginalize
individuals (binary, trans, or otherwise) whose identities are coupled with other identities that
entail other forms of oppression, such as racial or socioeconomic discrimination. Here, discus-
sions of gender, identity, and data featured in the relatively new journal Transgender Studies
Quarterly are instructive as they often emphasize not only gender in their analyses, but other
sources of oppression—racial, ableist, classist, and beyond—as well. They embrace the idea
of intersectional feminism, a concept that refers to a line of critique and activism rooted in
multiracial feminist movements in the second half of the twentieth century and eventually con-
cretized in the work of legal scholar Kimberlé Crenshaw (1989). Following Crenshaw’s (1991)
powerful discussions of violence against women of color, embracing intersectional analyses
means recognizing that the various sources that oppress marginalized groups—be they racism,
sexism, ableism, homophobia, classism, or beyond—cannot be understood independently of
one another. Rather, they intersect in ways that generate unique experiences of oppression and
marginalization. For example, “the intersection of racism and sexism factors into Black wom-
en’s lives in ways that cannot be captured wholly by looking at the race or gender dimensions
of those experiences separately” (Crenshaw 1991: 1244).
Careful attention to the intersections of identity generates even richer and more diverse
understandings of gender not as an isolated feature of identity, but as something shaped and realized in com-
plex webs of identity and social relations best understood intersectionally. Instead of an abstract
category of ‘woman’ that attempts to account for otherwise disparate experiences, more com-
plex categories emerge—suddenly, the distinct experiences of White women and Black women
or cisgender women and transgender women (or Black cisgender women and Black transgen-
der women, and so on) become available for analysis. Oftentimes, however, large-scale, data-
intensive research and design fail to account for these local, context-dependent intersections
and the specific experiences of violence and oppression they generate. For example, Safiya
Noble (2013) has shown how the data, algorithms, and processes that produce Google search
results for the term Black girls reduce the identities of Black women to stereotypical or hyper-
sexualized representations only, demonstrating the unique confluence of racist and sexist
oppressions faced by Black women online.
Achieving broad inclusivity in data and design is challenging even for those squarely con-
cerned with problems of gender and technology. As Catharina Landström (2007) notes, even
when “captur[ing] the ways in which technology is shaped by gender and gender is shaped by
technology” (8), feminist scholars of science and technology can reinforce normative standards that
are constraining rather than emancipatory for some. For example, work discussed in the second
section tends to “not question the definition of gender as a heterosexual coupling of opposites,
female and male, masculine and feminine” (Landström 2007: 10). The conception of gender as
binary (and further constrained by the heteronormative connotations of a male/female dichot-
omy) presents further challenges for feminist discussions of big data. As Elizabeth Losh (2015)
notes in her reading of big data through the politics of the “selfie,” in certain approaches to study-
ing big data, “gender is presented in strongly binary terms, with ‘female’ and ‘male’ as the main
categories separated by a territory demarcated by a question mark” (1635). Even where online
tools—and commercial models, such as Facebook’s expanded and open-ended gender identity
options—make possible a diversity of representations “essential . . . for studying how under and
sexuality are performed online,” defaults tend to emphasize an either/or logic (Losh 2015: 1653).
The tension between fixed, reductive categories and fluid, transitional identities is further
reflected in the most progressive efforts to account for trans and other queer identities through
data collection and design. In 2014, Facebook revised its gender options for users—going from
two categories to fifty-eight (and eventually adding an open-ended option as well). Though
this was lauded as an important move towards broad gender inclusivity, Rena Bivens (2015) has
demonstrated how—at a deep level and despite the addition of expanded gender options—
Facebook continues to enforce a binary logic encoded both in their business model and at the
level of software. In a different example, Jack Harrison-Quintana, Jaime M. Grant, and Ignacio
G. Rivera (2015) reflect on their experiences developing the National Transgender Discrimina-
tion Survey (NTDS) and note the challenges of developing “liberating versus limiting” boxes
that capture identity in data in ways that do not “collapse and marginalize trans experience” but
“expand and uncover the richness and complexities of trans lives” (167). Though not necessarily
congenial to the production of efficient and data-intensive quantitative analyses, such diverse
understandings are integral to a richer and more broad ranging understanding of human behav-
ior and experience.
Conclusion
In her feminist account of big data, Elizabeth Losh (2015) reminds us that “individuals do not
float free in a loose matrix of voluntary social relations” (1651). They are, rather, constrained
by power structures or practices that impose their own meanings at different levels. As boyd
and Crawford (2012) put it:

Data are not generic. There is value to analyzing data abstractions, yet retaining context
remains critical, particularly for certain lines of inquiry. Context is hard to interpret at scale
and even harder to maintain when data are reduced to fit into a model.
(671)

The experiences of trans people extend and challenge our understandings of big data and the
relationship between gender and technology in important ways. They lay bare the limits of
rigid or fixed data categories for capturing fluid or multifaceted identities and they urge further
examination—both theoretical and empirical—into the ways data subjects are constrained (and
impacted) by biases and assumptions in scientific and technological development.
While issues of identity, data, and information systems seem to be—on one level, at least—
an interesting conceptual or philosophical problem to ponder, they also expose the urgency of
recognizing the very real and lived challenges these tensions and the rapid rise and adoption
of data-intensive technologies and platforms generate for already vulnerable trans and queer
populations. The continued exclusion of these populations from, or subjugation of them to, infor-
mation systems that do not represent their lives or needs represents a continuation of the “administra-
tive violence” described by Dean Spade (2015)—a phenomenon that we might rightly call data
violence in order to also capture the harm inflicted on trans and gender nonconforming people
not only by government-run systems, but also the information systems that permeate our eve-
ryday social lives.

References
Anderson, C. (2008). The End of Theory: The Data Deluge Makes the Scientific Method Obsolete. Wired.
Available at: www.wired.com/2008/06/pb-theory/ [Accessed April 16, 2016].
Andrejevic, M. (2013). Infoglut: How Too Much Information Is Changing the Way We Think and Know.
New York: Routledge.
Bassett, C. (2015). Plenty as a Response to Austerity? Big Data Expertise, Cultures and Communities.
European Journal of Cultural Studies, 18(4–5), 548–563.
Beemyn, G., and Brauer, D. (2015). Trans-Inclusive College Records Meeting the Needs of an Increas-
ingly Diverse U.S. Student Population. TSQ: Transgender Studies Quarterly, 2(3), 478–487.
Bivens, R. (2015). The Gender Binary Will Not Be Deprogrammed: Ten Years of Coding Gender on Face-
book. New Media & Society, 1–19. doi:10.1177/1461444815621527.
Boellstorff, T. (2013). Making Big Data, in Theory. First Monday, 18(10). doi:10.5210/fm.v18i10.4869
Bowker, G. (2014). The Theory/Data Thing. International Journal of Communication, 8, 1795–1799.
Bowker, G., and Star, S. L. (1999). Sorting Things Out: Classification and Its Consequences. Cambridge,
MA: MIT Press.
boyd, d., and Crawford, K. (2012). Critical Questions for Big Data. Information, Communication and
Society, 15(5), 662–679.
Cockburn, C., and Ormrod, S. (1993). Gender and Technology in the Making. Thousand Oaks, CA: Sage.
Crasnow, S., Wylie, A., Bauchspies, W. K., and Potter, E. (2015). Feminist Perspectives on Science. In
Edward N. Zalta (ed.) The Stanford Encyclopedia of Philosophy. Stanford, CA: Metaphysics
Research Lab, Stanford University.
Crawford, K., Miltner, K., and Gray, M. L. (2014). Critiquing Big Data: Politics, Ethics, Epistemologies.
International Journal of Communication, 8, 1663–1672.
Crenshaw, K. (1989). Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of
Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. University of Chicago Legal
Forum, 1989(1), 139–167.
Crenshaw, K. (1991). Mapping the Margins: Intersectionality, Identity Politics, and Violence Against
Women of Color. Stanford Law Review, 43(6), 1241–1299.
Davidson, M. (2007). Seeking Refuge Under the Umbrella: Inclusion, Exclusion, and Organizing Within
the Category Transgender. Sexuality Research & Social Policy: Journal of NSRC, 4(4), 60–80.
Dodge, M., and Kitchin, R. (2005). Codes of Life: Identification Codes and the Machine-Readable World.
Environment and Planning D: Society and Space, 23(6), 851–881.
Duhaime-Ross, A. (2014). Apple Promised an Expansive Health App, So Why Can’t I Track Menstrua-
tion? The Verge. Available at: www.theverge.com/2014/9/25/6844021/apple-promised-an-expansive-
health-app-so-why-cant-i-track [Accessed April 16, 2016].
Ensmenger, N. (2010). Making Programming Masculine. In T. J. Misa (ed.) Gender Codes: Why Women
Are Leaving Computing. Hoboken, NJ: Wiley, 115–142.
Fausto-Sterling, A. (1985). Myths of Gender: Biological Theories About Women and Men. New
York: Basic Books.
Friedman, B. (1996). Value-Sensitive Design. Interactions, 3(6), 16–23.
Gehl, R. W. (2015). Sharing, Knowledge Management and Big Data: A Partial Genealogy of the Data
Scientist. European Journal of Cultural Studies, 18(4–5), 413–428.
Haimson, O. L., Brubaker, J. R., Dombrowski, L., and Hayes, G. R. (2015). Disclosure, Stress, and Sup-
port During Gender Transition on Facebook. In CSCW ’15. Vancouver, BC, Canada: ACM, 1176–1190.
Harding, S. (1991). Whose Science? Whose Knowledge? Thinking From Women’s Lives. Ithaca, NY: Cor-
nell University Press.
Harrison-Quintana, J., Grant, J. M., and Rivera, I. G. (2015). Boxes of Our Own Creation: A Trans Data
Collection Wo/Manifesto. TSQ: Transgender Studies Quarterly, 2(1), 166–174.
Hoffmann, A. L., and Bloom, R. (forthcoming). Digitizing Books, Obscuring Women’s Work: Google
Books, Librarians, and Ideologies of Access. Ada: A Journal of Gender, New Media, and Technology, 9.
Johnson, J. A. (2014). From Open Data to Information Justice. Ethics and Information Technology, 16(4),
263–274.
Jurgenson, N. (2014). View From Nowhere. The New Inquiry. Available at: http://thenewinquiry.com/
essays/view-from-nowhere/ [Accessed April 16, 2016].
Kitchin, R. (2014a). Big Data, New Epistemologies and Paradigm Shifts. Big Data & Society, 1(1), 1–12.
Kitchin, R. (2014b). The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Conse-
quence. Thousand Oaks, CA: Sage.
Kitchin, R., and McArdle, G. (2016). What Makes Big Data, Big Data? Exploring the Ontological Char-
acteristics of 26 Datasets. Big Data & Society, 3(1), 1–10.
Landström, C. (2007). Queering Feminist Technology Studies. Feminist Theory, 8(1), 7–26.
Lerman, N. E., Mohun, A. P., and Oldenziel, R. (2003). The Shoulders We Stand On/The View From
Here: Historiography and Directions for Research. In N. E. Lerman, R. Oldenziel, and A. P. Mohun
(eds.) Gender & Technology: A Reader. Baltimore, MD: The Johns Hopkins University Press, 425–450.
Losh, E. (2015). Feminism Reads Big Data: “Social Physics,” Atomism, and Selfiecity. International
Journal of Communication, 9, 1647–1659.
Mayer-Schönberger, V., and Cukier, K. (2013). Big Data: A Revolution That Will Transform How We Live,
Work, and Think. London: John Murray.
McGaw, J. A. (1982). Women and the History of American Technology. Signs, 7, 798–828.
Nataf, Z. I. (1996). Lesbians Talk: Transgender. New York: Scarlet Press.
Noble, S. (2013). Google Search: Hyper-Visibility as a Means of Rendering Black Women and Girls
Invisible. InVisible Culture, 19. Available at: http://ivc.lib.rochester.edu/google-search-hyper-visibility-
as-a-means-of-rendering-black-women-and-girls-invisible/.
Richardson, S. (2013). Sex Itself: The Search for Male and Female in the Human Genome. Chicago: Uni-
versity of Chicago Press.
Spade, D. (2015). Normal Life: Administrative Violence, Critical Trans Politics, and the Limits of Law.
Durham, NC: Duke University Press.
Stryker, S. (2009). Transgender History. Berkeley, CA: Seal Press.
Sydell, L. (2014). The Forgotten Female Programmers Who Created Modern Tech. NPR: All Tech Con-
sidered. Available at: www.npr.org/sections/alltechconsidered/2014/10/06/345799830/the-forgotten-
female-programmers-who-created-modern-tech [Accessed April 16, 2016].
Vincent, A. R. (2016). Does Tinder Have a Transphobia Problem? The Huffington Post. Available at:
www.huffingtonpost.com/addison-rose-vincent/does-tinder-have-a-transp_b_9528554.html [Accessed
April 16, 2016].
Wajcman, J. (2009). Feminist Theories of Technology. Cambridge Journal of Economics, 1–10.
doi:10.1093/cje/ben057.
Wen, S. (2014). The Ladies Vanish. The New Inquiry. Available at: http://thenewinquiry.com/essays/the-
ladies-vanish/ [Accessed April 16, 2016].
Wylie, A. (2007). The Feminism Question in Science: What Does It Mean to “Do Social Science as a
Feminist”? In S. Hesse-Biber (ed.) Handbook of Feminist Research. Thousand Oaks, CA: Sage.
Zikopoulos, P., and Eaton, C. (2011). Understanding Big Data: Analytics for Enterprise Class Hadoop
and Streaming Data. New York: McGraw Hill.
Chapter 2

Discrimination
D. E. Wittkower

Langdon Winner’s famous article, “Do Artifacts Have Politics?” (1980), must be the first thing
mentioned in any discussion of what philosophy of technology has contributed to our under-
standing of discrimination. The examples addressed, most of all the famous ‘racist bridges’
of Robert Moses—allegedly1 built low in order to specifically exclude New York City buses,
and the kind of person more likely to be using public transportation, from certain beaches—
make clear that artifacts can be said at least to have political effects, including advancing racial
discrimination.
Winner’s work has been highly cited, and rightly so, but it is not a full theory of discrimina-
tory technologies, and it is not grounded in multiple theoretical perspectives in order to maxi-
mize its applicability within the field. In the following, I will develop such a theory; connect
it with Heideggerian, Latourian, and postphenomenological theoretical structures; and dem-
onstrate its applicability to a wide and widening range of forms of normativity, exclusion, and
discrimination. This analysis will be limited to an American cultural context, as cultural con-
structions of discriminatory norms, especially those of race, gender, and religion, are far too
varied across regions and societies to be meaningfully addressed simultaneously, and my goal
in this article is in-depth analysis rather than cross-cultural exploration. I hope that readers in
Germany will be able to see parallel issues presented to Turks; that Israeli readers will see paral-
lel problems in a different religious social normativity; that those from Brazil and India will see
similarities in the way that their societies normativize ‘whiteness’ within more of a spectrum,
but still resulting in significant discrimination; that Canadians and Swedes will see connections
to the cultural erasure of their indigenous peoples; and so on.
First, it must be asked what it would be to have a full theory of discriminatory technologies.
Next, it must be asked what we are to make of the idea of a ‘discriminatory technology.’
Following this, we may approach the three theoretical groundings described earlier in order to
provide support to the theory and to provide direction in seeing what kinds of artifacts it can
help to identify and understand as technologies of discrimination.

What Would Be a Full Theory of Discriminatory Technologies? The Ontology of a Band-Aid
Using a brilliant illustration from Preston Wilcox’s White Is, Richard Dyer (1997: 41) prompts
us to consider the Band-Aid as a paradigmatic example of normative whiteness. His work in
this famous book, White, seems to me among the finest examples of the field of cultural stud-
ies one could find. What philosophy of technology can add to this is a movement from reading
the artifact as a text to looking at the way the object is concretely active in the construction of
exclusionary normativity.
Band-Aids come in a variety of shapes and sizes, showing their responsivity to a variety of
contexts of use. In philosopher Luciano Floridi’s language (2014), it needs to have the right
protocol to fit its prompter—in this case, the minor cut in the skin. This is why it is made to
minimize infection, with a mesh to discourage adhesion to the healing flesh, and available in
different sizes in order to match up with the naturally occurring diversity of bleeding gashes.
There are limits to the diversity of adhesive bandage sizes and shapes, however. Three sizes
to a pack is good enough to cover most cases satisfactorily, and we recognize it would be
unreasonable to expect just the right bandage for each particular wound. Having only a single
size is not responsive enough to the relevant cases, while having a dozen different sizes in every
box is far more responsivity than is necessary and would likely result in a bunch of odd shapes
and sizes—which would pile up in half-empty boxes accumulating in the corners of our medi-
cine cabinets, eventually to be discarded.
The invariance of the color of adhesive bandages, until relatively recently, places significant
variance from ‘White’ skin in this same category of irrelevance. Dark skin is apparently a
prompter to which it is not necessary to design a protocol to respond. This may be an effect of
‘color-blindness’: the (White) product designers failed to consider that ‘flesh colored’ might not
be the same thing for everyone. Although exceedingly unlikely in this particular case, this could
be an effect of conscious discrimination akin to Moses’s bridges, where the designers specifi-
cally chose to design a Whites-only product. Or this could simply reflect the reality of the market,
where a color is chosen that will match best for the largest set of similarly colored consumers.
In any case, the ontology established by the object is the same: the function of the bandage’s
color is to match the skin; when it fails to do so, it implicitly claims that this flesh is not ‘flesh
colored.’ The proper function of the technology contains within it an ontology that may define
some persons as normative and others as lesser or deviant Others.
A full theory of technologies of discrimination should engage with technologies at this
level—not by reading them as texts, not by producing analyses of particular effects or even
kinds of effects of technology, but by theorizing how those technologies embody, transmit, and
produce ontologies of normativity that result in privilege and discrimination.

What Is a Discriminatory Technology?


Without saying anything too contentious about a proper definition of ‘technology,’ we can per-
haps say that a common-sense description might be that a technology is a way to get something
done. By speaking of a ‘discriminatory technology’ we must mean a way of getting something
done which produces a discriminatory effect.
By speaking of ‘discrimination’ we clearly do not intend the word in the sense in which a
gourmand has ‘discriminating taste,’ although the two senses are related. Discrimination in the
political sense has to mean something like when a morally irrelevant characteristic is allowed
undue influence in a determination of individual or distributive justice. For discrimination of
this political kind to take place, there must first be discrimination in the amoral sense of drawing
distinctions—in this case, distinctions among persons.
This may seem to be a trivial conceptual point: of course a distinction must be first made
before it can be given undue influence. But this point is of deep and historical consequence,
for, arguably at least, the most prominent way that discrimination has been overcome is not by
equalizing the judgments made about one group who has been distinguished from another, but
instead by ceasing to make the distinction between these groups to begin with.
We see this in the history of the meaning of ‘White’ in a racial sense. In the colonial period,
we find records of Blacks and Irish rising up against Whites, although today we consider the
Irish to be White. The Polish have similarly and recently disappeared as a category distin-
guished from ‘Whites.’ Hungarians and Bulgarians, in the mid-twentieth century, were subject
to racial slurs (bohunk, hunky) largely unrecognizable today to the people that these words were
meant to Other and to denigrate.
Religious and language differences play at least as much of a role as more distinctively
race-related differences in determining who counts as White. In European history, Spaniards and Italians
have not always been considered White, especially those of Muslim faith. As Dyer points out
(1997: 42), Jews seem to have been considered Black for some time, and became White only in
the latter part of the twentieth century. Semitic persons, especially when Muslim or immigrants,
are Caucasian but nonetheless are often not White—the same is true of Latina/os, especially
when English is a second language.
What then is it to be White? The approximate answer from critical race theory is that being
‘White’ just means that it is not noted in any consequential way that you are raced at all. If you
are encountered in the context of a racial identification, you are a person of color (PoC); if you
are not, you are White. In this way we see that identification as White is a lack of judgment
rather than a concrete claim: that you are White means nothing besides that you have not been
identified as something else.
This judgment requires training—just as does having a discriminating palate. The way in
which formerly non-White persons become White involves a decrease in the weight placed
upon prejudicial claims against the minority in question, but it also involves a decrease in the
amount of training people receive in identifying those persons as a group in the first place. The
Nazis produced materials specifically designed to help Whites to identify and out Jews, but
much more innocuous racial caricatures, as in political cartoons, play a similar role.
This is why status as White or PoC is more a matter of how we are interpreted than a matter of
a matter of fact, although matters of fact form the basis of any interpretation, and certainly
can limit the range of interpretation available. Someone of mixed race may pass as White and
identify as White, and have no idea that they have non-White ancestors. A person of European
descent with curly hair may be darker skinned than a person of African descent with straight
hair, but we have been trained to interpret racial cues in such consistent and nuanced ways that
there may be little or no controversy about the whiteness of the former and the blackness of the
latter. Many people, including myself, are raced differently in different contexts or when wear-
ing clothes or hairstyles with or without ethnic markers. Few people today are raised in environ-
ments in which they are trained to recognize my features or my surname as racial markers—but
some are. To most, I am White, but I have been threatened with violence by White racists for
my non-White identity.
This is what we mean when we say that race or gender are socially constructed: they are the
product of human labor, manufactured using some physical basis, genetic and phenotypic, but
reducible to or determined by that basis in only the same kind of limited sense that other manu-
factured goods are reducible to or determined by their raw materials.
An invisible starting point in our encounter with one another, prior to the construction of
difference, was described by Martin Heidegger (1927/1996) as das Man (“the One,” or “the
they”). By “the One” he means only to indicate the approach to others named by the word one
in phrases like “one doesn’t do that.” The “One” who does this or doesn’t do that is no person
in particular, or even a description of a variety or totality of actually existing persons, but is
instead a set of expectations we are trained to have, on the basis of which we judge others and
ourselves. The One is normativity of all kinds: One is kind, One doesn’t tell lies, and One sets
the table with the fork on the left of the plate and the knife on the right. And one is called to
account when one doesn’t do what One does!