Data Versus Democracy
How Big Data Algorithms Shape Opinions and Alter the Course of History
Kris Shaffer
Data versus Democracy: How Big Data Algorithms Shape Opinions and Alter the Course of History
Kris Shaffer
Colorado, USA
ISBN-13 (pbk): 978-1-4842-4539-2
ISBN-13 (electronic): 978-1-4842-4540-8
https://doi.org/10.1007/978-1-4842-4540-8
Copyright © 2019 by Kris Shaffer
This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or
part of the material is concerned, specifically the rights of translation, reprinting, reuse of
illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way,
and transmission or information storage and retrieval, electronic adaptation, computer
software, or by similar or dissimilar methodology now known or hereafter developed.
Trademarked names, logos, and images may appear in this book. Rather than use a trademark
symbol with every occurrence of a trademarked name, logo, or image we use the names, logos,
and images only in an editorial fashion and to the benefit of the trademark owner, with no
intention of infringement of the trademark.
The use in this publication of trade names, trademarks, service marks, and similar terms, even if
they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.
While the advice and information in this book are believed to be true and accurate at the date of
publication, neither the authors nor the editors nor the publisher can accept any legal
responsibility for any errors or omissions that may be made. The publisher makes no warranty,
express or implied, with respect to the material contained herein.
Managing Director, Apress Media LLC: Welmoed Spahr
Acquisitions Editor: Shiva Ramachandran
Development Editor: Laura Berendson
Coordinating Editor: Rita Fernando
Cover designed by eStudioCalamar
Distributed to the book trade worldwide by Springer Science+Business Media New York, 233
Spring Street, 6th Floor, New York, NY 10013. Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail
[email protected], or visit www.springeronline.com. Apress Media, LLC is a California
LLC and the sole member (owner) is Springer Science + Business Media Finance Inc (SSBM
Finance Inc). SSBM Finance Inc is a Delaware corporation.
For information on translations, please e-mail [email protected], or visit http://www.apress.com/rights-permissions.
Apress titles may be purchased in bulk for academic, corporate, or promotional use. eBook
versions and licenses are also available for most titles. For more information, reference our Print
and eBook Bulk Sales web page at http://www.apress.com/bulk-sales.
Any source code or other supplementary material referenced by the author in this book is available to readers on GitHub via the book's product page, located at www.apress.com/9781484245392. For more detailed information, please visit http://www.apress.com/source-code.
Printed on acid-free paper
Blessed are the peacemakers.
Contents
About the Author
Acknowledgments
Introduction: From Scarcity to Abundance
Part I: The Propaganda Problem
Chapter 1: Pay Attention: How Information Abundance Affects the Way We Consume Media
Chapter 2: Cog in the System: How the Limits of Our Brains Leave Us Vulnerable to Cognitive Hacking
Chapter 3: Swimming Upstream: How Content Recommendation Engines Impact Information and Manipulate Our Attention
Part II: Case Studies
Chapter 4: Domestic Disturbance: Ferguson, GamerGate, and the Rise of the American Alt-Right
Chapter 5: Democracy Hacked, Part 1: Russian Interference and the New Cold War
Chapter 6: Democracy Hacked, Part 2: Rumors, Bots, and Genocide in the Global South
Chapter 7: Conclusion: Where Do We Go from Here?
Index
About the Author
Kris Shaffer, PhD, is a data scientist and Senior
Computational Disinformation Analyst for New
Knowledge. He coauthored “The Tactics and
Tropes of the Internet Research Agency,” a report
prepared for the United States Senate Select
Committee on Intelligence about Russian interference in the 2016 U.S. presidential election on
social media. Kris has consulted for multiple U.S.
government agencies, nonprofits, and universities
on matters related to digital disinformation, data
ethics, and digital pedagogy.
In a former (professional) life, Kris was an academic and digital humanist. He
has taught courses in music theory and cognition, computer science, and digital studies at Yale University, the University of Colorado Boulder, the
University of Mary Washington, and Charleston Southern University. He
holds a PhD from Yale University.
Acknowledgments
How do you write the acknowledgments section for a book like this?
If researching and writing this book has taught me anything, it’s that some
things are best said in private and, if possible, in person. So in lieu of public
acknowledgments, I have decided to give a personal, hand-written note to
those who educated, inspired, or otherwise helped me on this project. And
hopefully, that note will be delivered in person and accompanied by a drink or
a meal.
However, there is one group of people too large to thank individually, but
whose influence and motivation have been immeasurable. To all my students
over the years, who have inspired me with their hard work, their brilliance,
and their desire to make the world a better place, I thank you. May each of
you make your own dent in the universe and nudge humanity, even a little bit,
in the right direction.
Introduction: From Scarcity to Abundance
A Brief History of Information and the Propaganda Problem
As long as we’ve had information, we’ve had disinformation. As long as we’ve
had advertising, we’ve had attempts at “psychographic profiling.” Ever since
the invention of the printing press, we’ve had concerns about the corrupting
influence of mass media. But there are some things that are new in the past
decade. Information is abundant in a way we couldn’t conceive of just a decade
or two ago, and the new science of recommendation engines—algorithmically
selected content, based on personal data profiles—dominates the modern
media landscape. In this introduction, we will clear away misconceptions and focus on the heart of the problem—the intersection of information abundance, human psychology, user data profiling, and media recommendation
algorithms. This is where propaganda finds its way into modern society.
The Lay of the Land
Have you ever used a search engine to find a stock photo? Maybe you needed
a slick header for your blog post, some scenery for your family holiday letter,
a background for an office party flyer. The results can be pretty good, especially if you’re on a professional stock image site (and know how to choose
your search terms).
But have you ever used a regular search engine to find images of something
generic? Try it sometime. Search for images of doctor, then nurse. Professor, then teacher. What do you see?
Chances are you found some pretty stark stereotypes. White-haired professors, wearing tweed, lecturing in front of a chalkboard. Teachers also in front
of chalkboards, smiling at their eager pupils. Doctors in white coats, deftly
wielding their stethoscopes or making notes on their patients’ charts. Nurses
in blue scrubs, also masters of their charts and scopes. You get the picture.
Walk through a school, a university, a hospital, a general practitioner’s office,
though, and you’ll see little that matches these images. Most schools and universities abandoned chalk long ago, in favor of dry erase boards and electronic
projectors. And lecturing to seas of students in rows is increasingly rare,
particularly with young students. And, by the way, all of these professionals
tend to dress less formally, and certainly more diversely, than the subjects of
these search result images.
Search engines don’t give us reality. They give us the results we expect to see.
Using a combination of human programming, data from user interaction, and
an ever-repeating feedback loop of the two, the results of these searches
gradually become more like the generalizations in our minds. The stereotypes
we hold in our minds determine what we click on, and those clicks form the
raw data used by the search engine’s algorithms. These algorithms, in turn,
form generalizations of expected user behavior, based on our collective clicks,
and serve up the results we’re most likely to click on, based on that data. We
perceive, we generalize, we search, we click, the machine perceives (the
clicks), the machine generalizes, the machine returns results.
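To make that loop concrete, here is a minimal sketch, in Python, of a toy ranker that orders image results purely by accumulated clicks, with simulated users who are slightly more likely to click images matching their mental schema (and whatever is already ranked first). The file names, probabilities, and ranking rule are all invented for illustration; no real search engine works this simply.

import random

# Toy model of the perception -> click -> ranking feedback loop described above.
# The candidate images, click probabilities, and ranking rule are invented for
# illustration; this is not how any real search engine ranks results.
candidates = {
    "tweed_chalkboard_lecture.jpg": {"stereotypical": True,  "clicks": 0},
    "lab_fieldwork_photo.jpg":      {"stereotypical": False, "clicks": 0},
    "seminar_discussion_photo.jpg": {"stereotypical": False, "clicks": 0},
}

def serve_results(images):
    """Machine generalization: rank images by the clicks they have accumulated."""
    return sorted(images, key=lambda name: images[name]["clicks"], reverse=True)

def simulate_user(ranked, images):
    """Human generalization: a user is likelier to click an image that matches
    their schema, with an extra boost for whatever is already ranked first."""
    for position, name in enumerate(ranked):
        p = 0.6 if images[name]["stereotypical"] else 0.3  # schema bias
        if position == 0:
            p += 0.2                                        # position bias
        if random.random() < p:
            images[name]["clicks"] += 1
            return

random.seed(1)
for _ in range(10_000):  # many users searching over time
    simulate_user(serve_results(candidates), candidates)

print(serve_results(candidates))
print({name: data["clicks"] for name, data in candidates.items()})

Run long enough, the small schema bias compounds: the stereotypical image collects more clicks, which earns it the top spot, which earns it the position bonus and still more clicks. The machine is only mirroring our collective behavior back at us, but the mirror is not neutral.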
It doesn’t end there. Those search results get used all over the web and in
print (isn’t that why we were searching in the first place?). Those images
become part of the backdrop of our view of the world and further fuel the
generalizations formed by our mind. Thus forms an endless loop: human perception → human generalization → human behavior → machine perception → machine generalization → machine behavior → human perception → human behavior … And in each turn through the feedback loop, the stereotype gets more stereotypical. Reality is lost. Though, because the stereotypes
become part of our media landscape, in a very real sense, reality is also formed.
But did you notice something else strange about those image results?
How many of those doctors were men, and how many were women? What
about the nurses? According to The Wall Street Journal, 32% of doctors in
the United States in 2012 were women, and the proportion is rising.1 Is that
the percentage you saw in your search results? According to the National
Center for Education Statistics, 49% of tenure-track university faculty and
57% of non-tenure-track faculty in the United States are women.2 How did
your professor search results compare?
1 Josh Mitchell, “Women Notch Progress,” The Wall Street Journal, published December 4, 2012, www.wsj.com/articles/SB10001424127887323717004578159433220839020.
2 “Quick Take: Women in Academia,” Catalyst, published October 20, 2017, www.catalyst.org/knowledge/women-academia.
Chances are your search results were more stereotype than reality. That’s
partly our brains’ fault. Our brains make generalizations about what we perceive in the world, and those generalizations allow us to make predictions about the world that help us interact with it more efficiently. Cognitive scientists have also found that when we form generalizations—called schemas—we
tend to define those schemas, in part, by contrast with other schemas. In
other words, we emphasize their differences, often making them more distinct
from each other in our minds than they are in reality. While this method
of defining ideas and categories in our mind is usually helpful, it sometimes
works against us by reinforcing the bias of our environment, including the
(stereotype-ridden) media we encounter. And when the feedback loop of
human generalizations, machine generalizations, and media representation
goes online, that bias gets propagated and reinscribed at near light speed.
It’s all connected—our media, our memory, our identity, our society. The way
we interact with the world is directly influenced by the “mental map” we have
of the world—what’s real, what’s not, and where it all belongs. And while that
mental map is primarily formed in light of our experiences (with a little help
from hundreds of thousands of years of human evolution), that map is increasingly influenced by the media that we consume.
I say “increasingly” not because something in our brains has changed, but
because our media landscape has changed so drastically in the last century—
even the last decade. We have moved from information scarcity to information abundance, from media desert to media ubiquity. And for most—though,
importantly, not all—humans on this planet, our access to the information
and media that exists has expanded just as rapidly. Our lived experiences are
increasingly mediated through media.
The Limits of Attention
But one important thing has not changed: the limits of the human body,
including the brain. Sure, infant mortality3 and life expectancy4 have improved in most societies over the past century, and quality of life has improved for
many as the result of scientific and humanistic advancement. But the human
cognitive system—the interaction of the brain and the body, memory and the
3 “Infant Mortality,” World Health Organization Global Health Observatory (GHO) Data,
accessed February 5, 2019, www.who.int/gho/child_health/mortality/neonatal_infant_text/en/.
4 “Life Expectancy,” World Health Organization Global Health Observatory (GHO) Data,
accessed February 5, 2019, www.who.int/gho/mortality_burden_disease/life_tables/situation_trends_text/en/.
senses—took its more-or-less modern form tens of thousands of years ago.5
The amount of information our brains can hold has not changed, nor have the limits of conscious attention.
And with all the media clamoring for our attention, that attention has become our most precious—if also our most overlooked—resource.
Let’s step back to the world of search engines for a moment. They can be
hacked. I don’t mean a full-on security breach (though that’s certainly possible). I mean they can be manipulated by highly motivated users. Just as what
people click on determines (in part) the search result rankings, the terms that
people search for determine (in part) the terms that pop up on autocomplete
as you type. (For some fun, go to Google and search for your home country/state: “Why is [Colorado] so” and see how Google thinks you might want to complete that search.) If the searches typed into a search engine can determine the autocomplete terms, then a group of people willing to put the time in can search for the same thing over and over again until it dominates the autocomplete results, increasing the number of people who see it, are influenced by it, and click on it.
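As a sketch of the mechanism, imagine an autocomplete that does nothing but count how often each query has been searched and suggests the most frequent matches for what you have typed so far. That is a deliberate simplification; real systems like Google’s weigh many more signals and have safeguards, and the queries and counts below are invented. Even this toy version shows how a coordinated group can outvote organic traffic:

from collections import Counter

# Toy, frequency-only autocomplete. Real autocomplete systems use many more
# signals; the queries and counts here are invented for illustration.
query_log = Counter()

def record_search(query):
    """Every search a user submits adds one count to the log."""
    query_log[query.lower()] += 1

def autocomplete(prefix, n=3):
    """Suggest the n most frequently logged queries that begin with the prefix."""
    matches = Counter({q: c for q, c in query_log.items()
                       if q.startswith(prefix.lower())})
    return [q for q, _ in matches.most_common(n)]

# Ordinary, organic traffic: a spread of common searches.
for query in ["why is colorado so sunny",
              "why is colorado so expensive",
              "why is colorado so windy"]:
    for _ in range(100):
        record_search(query)

# A small, motivated group repeats a single query far more often.
for _ in range(500):
    record_search("why is colorado so overrated")

print(autocomplete("why is colorado so"))
# The repeated query now tops the suggestions, so more people see it, click it,
# and reinforce it, feeding the same loop described earlier.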
This very thing happened in 2016. Members of the so-called alt-right movement (right-wing extremists, often affiliated with groups espousing hateful views like white supremacy, anti-Semitism, and hyper-masculine antifeminism) successfully manipulated Google’s autocomplete to suggest racist searches and pro-alt-right messages.6 When this manipulation was brought to Google’s attention, they responded with changes to the system. But no system is completely impervious to hacking.
Not even the human mind.
Cognitive Hacking
Any system that draws conclusions based on statistics, in part or in whole,
can be gamed by a manipulation of the statistics. That’s how the human brain
works—over time, the things we perceive are aggregated together into generalized schemas, which are constantly changing with new information. (The
more established schemas change more slowly, of course.) By altering the
statistical input of the brain, a “hacker” can impact the schemas our brain
5 Though scientists still debate just how many tens of thousands. See Erin Wayman, “When
Did the Human Mind Evolve to What It Is Today?,” Smithsonian Magazine, published June
25, 2012, www.smithsonianmag.com/science-nature/when-did-the-human-mind-evolve-to-what-it-is-today-140507905/.
6 Olivia Solon and Sam Levin, “How Google’s Search Algorithm Spreads False Information