“By pulling back the curtain and drawing attention to forms of coded inequity, not only do we become more aware of the social dimensions of technology but we can work together against the emergence of a digital caste system that relies on our naivety when it comes to the neutrality of technology… It includes an elaborate social and technical apparatus that governs all areas of life.”
- Ruha Benjamin, Race After Technology
We are Cyber Collective, a group of technology professionals with backgrounds in cyber security, privacy, public policy, and ethics. Through our collective industry, academic, and lived experience, we have become intimately aware of the ways in which technologies deployed in today’s landscape have produced, and continue to produce, unintended harmful consequences at scale for the vulnerable and marginalized¹ in society.
Our approach to this problem is to bring Black, Indigenous, people of color (BIPOC), women, and marginalized peoples to the center of tech product and policy development. We do this by educating our diverse community of technology users—@cybercollective social media followers and newsletter subscribers, including librarians, teachers, women, BIPOC in tech, and concerned citizens—teaching them how to challenge the digital world we live in, and then using their insights to advocate for change.
We also recognize the work that tech companies and policymakers are doing to increase accountability to people impacted by technologies in our present-day landscape. Our goal is to provide some of the missing pieces of the puzzle in this work: by centering the experiences and voices of the historically marginalized in tech, we aim to identify and improve outcomes produced by technology for the marginalized and society at large.
This research brief, coupled with community events we held in October 2020, is our first step in this work. In this document, we discuss the problem we’re tackling as an organization, detail the approach we have taken to understand our community’s knowledge and knowledge gaps, and share our findings.
Our Problem Space
While marginalized peoples are frequently left out of the product development process in the tech industry, technical fixes to problems in society often end up producing harms experienced by the most vulnerable². Decision-making algorithms, for example, like those that use credit scores to evaluate employment candidates, frequently penalize the poor, while search engine results often reinforce sexist and racist conceptions of women and BIPOC³.
The technologies many people encounter in wide-ranging aspects of life—social media platforms, search engines, and surveillance technologies, for example—have implications beyond simply connecting users with information and people. Ruha Benjamin, associate professor of African American Studies at Princeton University and author of Race After Technology, suggests that today’s technology “includes an elaborate social and technical apparatus that governs all areas of life” and reaches far beyond our screens, ultimately presenting a challenge to our autonomy and free will as human beings⁴.
The far-reaching effects of these technologies have also led to public policy efforts to regulate the collection, handling, and use of the mass amounts of consumer data that make the deployment of such technologies possible⁵. While some ballot measures have provided tangible consumer protections, others have represented the interests of industry—raising the likelihood that future technologies will continue to produce the harms with which the members of our team have become so familiar.
To its credit, the tech industry has begun the work of understanding and mitigating this impact, in large part as a response to outside pressure to make these considerations. Emanuel Moss and Jacob Metcalf of the Data & Society Research Institute have found that a number of players in the tech industry are intentionally working to resolve ethical dilemmas faced at the corporate level, assigning leaders to operationalize ethics and scale these practices⁶.
However, these discussions often lack considerations of race and racial equity, and many technologies continue to “operate within parameters that assume a universal, raceless default subject position synonymous with whiteness”⁷. When the development process lacks insight into the experiences of people who are disproportionately impacted, these technologies will continue to produce harms in similar ways.
By looking at these gaps, researchers have identified one key component for addressing these ethical dilemmas: centering the vulnerable and marginalized in discussions around tech and data ethics⁸. Our approach as an organization is to do just this: to bring into the spotlight the voices and experiences of the marginalized in the tech ethics and policy dialogue.
We combine traditional and creative research approaches, using grounded theory to collect qualitative and quantitative empirical data during virtual research events, including workshops, seminars, and community conversations. Throughout each event, our research team recorded participant feedback through dialogue, typed comments, and survey responses.
During the U.S. General Election, we worked with our community members to identify their gaps in knowledge around tech policy and to teach them how to think critically about our digital world more broadly. Our event lineup for October 2020 included:
CyCo 101 - The more accessible the internet, data, and technology become, the greater the threats we face as individuals. Concepts in cyber security need to be digestible, relatable, and accessible to a broad audience in order to impact our dialogue and awareness at the individual, community, and global level. Our first event was an introduction to our organization, National Cybersecurity Awareness Month, and a few best practices for securing our data.
Election Security: Digging Into Our Digital Democracy - During the 2020 U.S. Presidential election, we wanted to hold space for conversation around data exploitation, discuss what happened in the 2016 presidential election, and share actionable steps to ensure secure and fair elections.
Election Security 101 - Misinformation and disinformation spread during the 2020 U.S. General Election sparked concern about election hacking and the security of the democratic process in the United States. Maggie MacAlpine, election security specialist, joined us to debunk common myths about election security.
Big Tech Little Ethics: Why 11 States Are Suing Google - Earlier this year, the U.S. House antitrust subcommittee launched an investigation into the Big Four: Amazon, Apple, Facebook, and Google. Now, Google is being sued by 11 states over alleged antitrust violations. This event brought the headline down to the consumer level and discussed how it impacts us.
What We Found
Our research team developed a series of questions to gauge our participants’ knowledge level before and after an event, what they learned, and what inferences they made after learning about the event topics. These questions were included in a survey we released during each event. The following section details and analyzes our participants’ responses.
During our events, we shared techniques for protecting ourselves online against common manipulation tactics and for challenging mis- and disinformation, finding that many participants were previously unaware of these topics and their implications. Participants shared reflections and questions like the following:
“Coming to the realization that as consumers we really need to take policy change and data protection into our own hands. And I came away with some resources and suggested reading.”
“How can we connect human rights and privacy law? As someone who is going into this area of law in about a year and a half, I have serious concern over this. I am wondering how we can make the changes even at a local level. I know the UN has been preaching this for years to no avail.”
“In a world where technology advances faster than the federal laws and regulations of data privacy and data capitalism, how do we help get those laws up to speed with the technology?”
[Survey charts: responses to “Do you know how your personal information is being handled on the internet?” and “Are you concerned with how your personal information is being handled on the internet?”]
- When asked, “Should consumers have the right to know what happens to their data?” 100% answered ‘yes’.
- When asked, “Do you think data protection laws exist in the U.S.?” 23% answered ‘yes’, 55% answered ‘no’, and 22% answered ‘unsure’.
- When asked, “Do you think data protection laws should exist?” 100% answered ‘yes’.
The ultimate goal of our research is to center diverse voices and experiences in the technology ethics dialogue and to improve the outcomes produced by tech for all. After a month of connecting with, educating, and learning from our community, we have gauged our community’s awareness of and concern around data privacy and security, finding that, while many members were not initially aware of how their personal information is handled online, they are inclined to take action as their awareness increases.
We have learned that creating a space for critical thinking around the intersection of technology, global policy, and ethics, led by marginalized voices, enables representation of otherwise unheard perspectives and produces original research.
Our next steps in conducting this research include continuing to educate our community members about privacy and security topics as well as continuing to explore, investigate, and document the knowledge gap between our community members and industry experts.