Themes

“We keep falling into the same ditches, you know? I mean, we learn more and more about the physical universe, more about our own bodies, more technology, but somehow, down through history, we go on building empires of one kind or another, then destroying them in one way or another. We go on having stupid wars that we justify and get passionate about, but in the end, all they do is kill huge numbers of people, maim others, impoverish still more, spread disease and hunger, and set the stage for the next war. And when we look at all of that in history, we just shrug our shoulders and say, well, that’s the way things are. That’s the way things always have been.”

Octavia Butler, Parable of the Talents

 

How does the internet deepen global inequalities?

Whose knowledge is, and is not, represented on the internet?

Who produces this knowledge and how?

Whose internet, and whose freedoms, are we really defending?

Knowledge

Only a tiny fraction of the world’s knowledge systems are captured in books or other forms of visual and oral material, and the internet – for all its democratic, emancipatory potential – further skews what we use as knowledge every day.

Google estimated in 2010 that there are about 130 million books in about 480 languages. Of these, only about 20% are freely accessible in the public domain and 10-15% are in print. In a world of 7 billion people speaking nearly 7000 languages and dialects, we estimate that only about 7% of those languages are captured in published material; a smaller fraction of the world’s knowledge is converted into digital knowledge; and a still smaller fraction of that is available on the internet.

We’re piloting resources and methods for centering the knowledge and expertise of marginalized communities on the internet, starting with the online knowledge repository Wikipedia. As one of the world’s most visited websites, Wikipedia is a good proxy for knowledge on the internet more broadly, and yet we know that Wikipedia is not representative of the knowledge of the world.

Using Wikipedia as a proxy indicator of freely available online knowledge, we know that only 20% of the world (primarily white male editors from North America and Europe) currently edits 80% of Wikipedia, and we estimate that only 1 in 10 editors self-identifies as female. Studies by Mark Graham and colleagues at the Oxford Internet Institute have found that 84% of Wikipedia articles focus on Europe and North America, and most articles about the global South are still written by people in the global North, so even where content is present, skewed representations remain.

Communities like the Dalits from India and the US, queer feminists from Bosnia and Herzegovina, and the Kumeyaay Native Americans have led the way in mapping their own knowledge to find critical gaps in Wikipedia, and then creating and improving related content to fill those gaps. We support and amplify these efforts and look for new opportunities to build alliances and create space for other communities online as well.

Questions We Care About:

  • Who participates in the production of knowledge?
  • How can we diversify our sources?
  • Is ‘neutrality’ a virtue?

 

Surveillance, Privacy and Security

Digital technologies have ushered in an era of unprecedented mass and individualized surveillance. The inherent tension between unrestricted data retention and personal privacy has prompted some governments to take action. In the EU, the ‘right to be forgotten’ has been enshrined in law, and the new General Data Protection Regulation places limits on how organizations can use individuals’ data. In recent years, revelations like the Edward Snowden disclosures have trained a spotlight on the ways in which governments (including in the EU) surveil their citizens and human rights defenders without warrant or cause.

Inequality is a central feature of surveillance and privacy in the digital age. As privacy becomes a privilege, protected in the global north, data extraction from the global south is likely to accelerate.

Digital surveillance has both political and socio-economic dimensions. Human rights defenders and political activists are frequently targeted for surveillance. Research by the Tactical Technology Collective has shown that states often compromise the online safety and privacy of human rights defenders, through both everyday forms of harassment and extraordinary forms of intervention. As a result, the space for alternative political visions and projects (including the democratization of the internet itself) shrinks.

In addition, women, people of non-heteronormative gender identities, and people of color are often at significant risk of online harassment and bullying. The United Nations Report on Cyber Violence against Women and Girls found that 73% of women have been exposed to, or have experienced, some form of online violence. These marginalized communities face discrimination offline, and they also often find the internet a hostile environment for sharing their views. We need to be aware of the distinct privacy concerns of different communities in order to build an internet where all are welcome and where privacy entails not only the right to non-interference but also the right to safety and security for those at risk. The privacy policies of internet corporations can also undermine diversity and participation online: Facebook’s ‘Real Names’ policy, for example, adversely affected LGBTQI and Native American communities.

Questions We Care About:

  • How is privacy connected to privilege in terms of geography, class, race, and gender?
  • Whose privacy does a right to privacy defend?
  • Whose security do we value?
  • How do we realize privacy not simply as non-intervention, but as a positive duty to protect?
  • How do we bridge the knowledge gap between internet users and data policy makers?
  • How do we build solidarity networks between internet users whose privacy is at stake, and tech projects committed to the defense of our privacy?

 

Digital Infrastructure

Buildings are made up of bricks and steel, wires and fuses. The internet is also constructed. It is made up of material and immaterial infrastructure — from servers to fiber optic cables to code. This underlying architecture shapes how users can engage with the internet as a space or a resource. Importantly, much of this architecture is effectively invisible. We don’t always know where the server that hosts a website is located, and we rarely know what code is running behind the scenes to generate our search engine results. Like the buildings we construct, the internet is the product of human decisions about how it should be. So it matters who’s making those decisions!

Since we usually can’t see the infrastructure of the internet, any biases built into the system are often similarly invisible. But biases exist. Most servers (which host websites and store data from internet users) are physically located in the United States or Europe, even though three quarters of internet users come from the global south. Some regions and countries are more ‘connected’ than others because of long, complicated histories of privilege or exclusion, and those histories have geographical consequences online. Most computer scientists and engineers are (still) men, and many algorithms written to perform some of the most basic functions of the internet demonstrate tendencies toward racism and sexism. As we start to build algorithms that can think for themselves (‘artificial intelligence’), we have to ask how we’re teaching them to think and who is doing the teaching.
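
To make that mechanism concrete, here is a deliberately minimal, hypothetical sketch (in Python; the click log, the names, and the suggest() function are invented for illustration, not drawn from any real system). A ranking function that simply mirrors historical click counts reproduces whatever skew exists in that history, and because its output shapes future clicks, the feedback loop can deepen the original bias.

    # Hypothetical toy example: an "autocomplete" that ranks suggestions
    # purely by how often past users clicked them. The code itself looks
    # neutral, but if the historical log is skewed, the ranking reproduces
    # (and, through feedback, can amplify) that skew.
    from collections import Counter

    # Invented click log: names past users clicked after typing "scientist"
    historical_clicks = [
        "marie curie", "albert einstein", "albert einstein", "albert einstein",
        "isaac newton", "isaac newton", "albert einstein", "isaac newton",
    ]

    def suggest(click_log, top_n=3):
        """Rank suggestions by raw historical frequency, with no correction
        for whose queries (and whose interests) the log over-represents."""
        counts = Counter(click_log)
        return [name for name, _ in counts.most_common(top_n)]

    print(suggest(historical_clicks))
    # ['albert einstein', 'isaac newton', 'marie curie']
    # The under-clicked name sinks to the bottom of every future suggestion
    # list, gets clicked even less, and the skew compounds over time.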

Questions We Care About:

  • What does the geography of the internet really look like?
  • Who builds the internet we see and use everyday?
  • What biases are built into the invisible architecture of the internet?
  • We can’t make code neutral (because humans write it!), but how can we humanize it?
  • How can we imagine a more decentralized and autonomous internet infrastructure?