
ZunZuneo and Ethics in Technology for Democracy

Last week’s Technology Salon New York City touched on ethics in technology for democracy initiatives. We heard from lead discussants Malavika Jayaram, Berkman Center for Internet and Society; Ivan Sigal, Global Voices; and Amilcar Priestley, Afrolatin@ Project. Though the topic was catalyzed by the Associated Press’ article on ‘Zunzuneo’ (a.k.a. ‘Cuban Twitter’) and subsequent discussions in the press and elsewhere, we aimed to cover some of the wider ethical issues encountered by people and organizations who implement technology for democracy programs.


Salons are off-the-record spaces, so no attribution is made in this post, but I’ve summarized the discussion points here:

First up: Zunzuneo

The media misinterpreted much of the Zunzuneo story. Zunzuneo was not a secret mission, according to one Salon participant, as it is not in USAID’s remit to carry out covert operations. The AP article conflated a number of ideas about how USAID works and the contracting mechanisms involved in this case, he said. USAID and the Office of Transition Initiatives (OTI) frequently disguise the members, organizations, and contractors that work for them on the ground, for security reasons. (See USAID’s side of the story here.) This may still raise ethical questions, but it is not technically ‘spying.’ The project was known within the OTI and development community, but on a ‘need to know’ basis. It was not a ‘fly by night’ operation; it was more a ‘quietly and not very effectively run project.’

There were likely both legal and ethical breaches in Zunzuneo. It is not clear whether the data and phone numbers collected from the Cuban public for the project were obtained in a legal or ethical way. Some reports say they were obtained through a mid-level employee (a “Cuban engineer who had gotten the phone list,” according to the AP article). (Note: I spoke separately to someone close to the project who told me that user opt-in/opt-out and other standard privacy protocols were in place.) It is also not entirely clear whether, as the AP states, the user information collected was being categorized into segments of users deemed loyal or disloyal to the Cuban government, information which could put users at risk if discovered.

Zunzuneo took place in a broader historical and geopolitical context. As one person put it, the project followed Secretary Clinton’s speeches on Internet Freedom. There was a rush to bring technology into the geopolitical space, and ‘the articulation of why technology was important collided with a bureaucratic process in USAID and the State Department (the “F process”) that absorbed USAID into the State Department and made development part of the State Department’s broader political agenda.’ This agenda had been in the works for quite some time, and was part of a wider strategy of quietly moving into development spaces and combining development, diplomacy, and defense (the so-called ‘3 D’s’), along with intelligence.

Implementers failed to think through the design, ethics, and community aspects of the work. In a number of projects of this type, the idea was that if you give people technology, they will somehow create bottom-up pressure for political and social change. As one person noted, ‘in the Middle East, as a counterexample, the tech was there to enable and assist people who had spent 8-10 years building networks. The idea that we can drop tech into a space and an uprising will just happen, and it will coincidentally push the US geopolitical agenda, is a fantasy.’ Often these kinds of programs start with a strategic communications goal that serves a political end of the US Government. They are designed with the idea that a particular input equals some kind of specific result down the chain. The problem comes when the people seeding the ideas and inputs are not familiar with the context they will be operating in; they are injecting inputs into a space that they don’t understand. The bigger ethical question is: why does this thought process prevail in development? Much of the answer is found in US domestic politics and the ways that initiatives get funded.

Zunzuneo was not a big surprise for Afrolatino organizations. According to one discussant, Afrolatino organizations were not surprised when the Zunzuneo article came out, given the geopolitical history and the ongoing presence of the US in Latin America. Zunzuneo was seen as a 21st-century version of what has been happening for decades. Though it was criticized, it was not seen as particularly detrimental. Furthermore, the Afrolatino community (within the wider Latino community) has had a variety of relationships with the US over time; for example, some Afrolatino groups supported the Contras. Many Afrolatino groups have felt that they were not benefiting overall from the mestizo governments who have held power. In addition, much of Latin America’s younger generation is less tainted by the Cold War mentality and does not see US involvement in the region as necessarily bad. Programs like Zunzuneo come with a lot of money attached, so wider concerns about their implications are often not at the forefront, because organizations need to access funding. Central American and Caribbean countries are only just entering a phase of deeper analysis of digital citizenship, and views and perceptions on privacy are still being developed.

Perceptions of privacy

There are differences in perception when it comes to privacy, and these perceptions are contextual. They vary within and across countries and communities based on age, race, gender, economic level, comfort with digital devices, political perspective, and history. Some older people, for example, are worried about the privacy violation of having their voice or image recorded, because the voice, image, and gaze hold spiritual value and power. These angles of privacy need to be considered as we think through what privacy means in different contexts, and we need to adapt our discourse accordingly.

Privacy is hard to explain, as one discussant said: ‘There are not enough dead bodies yet, so it’s hard to get people interested. People get mad when the media gets mad, and until an issue hits the media, it may go unnoticed. It’s very hard to conceptualize the potential harm from lack of privacy. There may be a chilling effect but it’s hard to measure. The digital divide comes in as well, and those with less exposure may have trouble understanding devices and technology. They will then have even greater trouble understanding beyond the device to data doubles, disembodied information and de-anonymization, which are about 7 levels removed from what people can immediately see. Caring a lot about privacy can get you labeled as paranoid or a crazy person in many places.’

Fatalism about privacy can also hamper efforts. In the developing world, many feel that everything is corrupt and inept, and that there is no point in worrying about privacy and security. ‘Nothing ever works anyway, so even if the government wanted to spy on us, they’d screw it up,’ is the feeling. This is often the attitude of human rights workers and others who could be at greatest risk from privacy breaches or data collection, such as that which was reportedly happening within Zunzuneo. Especially among populations and practitioners who have less experience with new technologies and data, this can create large-scale risk.

Intent, action, context and consequences

Good intentions with little attention to privacy vs. data collection with a hidden political agenda. Where are the lines when data that are collected for a ‘good cause’ (for example, to improve humanitarian response) might be used for a different purpose that puts vulnerable people at risk? What about data that are collected with less altruistic intentions? What about when the two scenarios overlap? Data might be freely given or collected in an emergency that would be considered a privacy violation in a ‘development’ setting, or the data collection may lead to a privacy violation post-emergency. Often, slapping the ‘obviously good and unarguably positive’ label of ‘Internet freedom’ on something implies that it’s unquestionably positive when it may in fact be part of a political agenda with a misleading label. There is a long history of those with power collecting data that helps them understand and/or control those with less power, as one Salon participant noted, and we need to be cognizant of that when we think about data and privacy.

US Government approaches to political development often take an input/output approach when, in fact, political development is not the same as health development. ‘In political work, there is no clear and clean epidemiological goal we are trying to reach,’ noted a Salon participant. Political development is often contentious, and its targets and approaches are very different from those of health. When a health model and its rhetoric are used to work on other development issues, the result is misleading. The wholesale adoption of these disease-model approaches leaves people and communities out of decision making about their own development. Similarly, the rhetoric of strategic communications entered the development agenda after the War on Terror, and it too is a poor fit for political development. The rhetoric of ‘opening’ and ‘liberating’ data is similar: these arguments may work well for one kind of issue, but they are not transferable to a political agenda. One Salon participant also pointed to the rhetoric of the privatization model, explaining that a profound yet rarely considered implication of privatizing services is that once a service passes to the private sector, the Freedom of Information Act (FOIA) no longer applies, and citizens and human rights organizations lose FOIA as a tool. Examples included the US prison system and the Blackwater case of several years ago.

It can be confusing for implementers to know what to do, what tools to use, what funding to accept, and when it is OK to bring in an outside agenda. Salon participants provided a number of examples where they had to make choices and felt ethics could have been compromised. Is it OK to sign people up on Facebook or Gmail during an ICT and education project, given these companies’ marketing and privacy policies? What about working on aid transparency initiatives in places where human rights work or crime reporting can get people killed, or where individual philanthropists/donors might be kidnapped or extorted? What about a hackathon where the data and solutions are later given to a government’s civilian-military affairs office? What about telling LGBT youth about a social media site that encourages them to connect openly with one another, in light of recent harsh legal penalties against homosexuality? What about employing a user-centered design approach for a project that will eventually be overlaid on top of a larger platform, system, or service that does not pass the privacy litmus test? Is it better to contribute to improving healthcare while knowing that your software system might compromise privacy and autonomy because it sits on top of a biometric system, for example? Participants at the Salon face these ethical dilemmas every day, and as one person noted, ‘I wonder if I am just window dressing something that will look and feel holistic and human-centered, but that will be used to justify decisions down the road that are politically negative or go against my values.’ Participants said they normally rely on their own moral compass, but clearly many are wrestling with the potential ethical implications of their actions.

What can we do? Recommendations from Salon participants

Work closely with and listen to local partners, who should be driving the process and decisions. There may be a role for an outside perspective, but the outside perspective should not trump the local one. Encourage and support local communities to build their own tools, narratives, and projects. Let people set their own agendas. Find ways to facilitate long-term development processes centered on communities rather than subject to agendas from the outside.

Consider this to be ICT for Discrimination and think in every instance and every design decision about how to dial down discrimination. Data lead to sorting, and data get lumped into clusters. Find ways during the design process to reduce the discrimination that will come from that sorting and clustering process. The ‘Do no harm’ approach is key. Practitioners and designers should also be wary of the automation of development and the potential for automated decisions to be discriminatory.

Call out hypocrisy. Those of us who sit at Salons or attend global meetings hold tremendous privilege and power as compared to most of the rest of the world. ‘It’s not landless farmers or disenfranchised young black youth in Brazil who get to attend global meetings,’ said one Salon attendee. ‘It’s people like us. We need to be cognizant of the advantage we have as holders of power.’ Here in the US, the participant added, we need to be more aware of what private sector US technology companies are doing to take advantage of and maintain their stronghold in the global market and how the US government is working to allow US corporations to benefit disproportionately from the current Internet governance structure.

Use a rights-based approach to data and privacy to help frame these issues and situations. Disclosure and consent are sometimes considered extraneous, especially in emergency situations. People think, ‘this might be the only time I can get into this disaster or conflict zone, so I’m going to hoover up as much data as possible without worrying about privacy.’ On the other hand, sometimes organizations are paternalistic and make choices for people about their own privacy. Consent and disclosure are not new issues; they are merely manifested in new ways as new technology changes the game and we can no longer guarantee anonymity or privacy for research subjects. There is also a difference between information a person actively volunteers and information that is passively collected and used without a person’s knowledge. Framing privacy in a human rights context can help place importance on both processes and outcomes that support people’s right to control their own data and that increase empowerment.

Create a minimum standard for privacy. Though we may not be able to determine a ceiling for privacy, one Salon participant said we should at least consider a floor or a minimum standard. Actors on the ground will always feel that privacy standards are a luxury because they have little know-how and little funding, so creating and working within an ethical standard should be a mandate from donors. The standard could be established as an M&E criterion.

Establish an ethics checklist to decide on funding sources and create policies and processes that help organizations to better understand how a donor or sub-donor would access and/or use data collected as part of a project or program they are funding. This is not always an easy solution, however, especially for cash-strapped local organizations. In India, for example, organizations are legally restricted from receiving certain types of funding based on government concerns that external agencies are trying to bring in Western democracy and Western values. Local organizations have a hard time getting funding for anti-censorship or free speech efforts. As one person at the Salon said, ‘agencies working on the ground are in a bind because they can’t take money from Google because it’s tainted, they can’t take money from the State Department because it’s imperialism and they can’t take money from local donors because there are none.’

Use encryption and other technology solutions. Given the low levels of understanding and awareness of these tools, more needs to be done so that more organizations learn how to use them, and the tools themselves need to be made simpler, more accessible, and more user-friendly. ‘Crypto Parties’ can help get organizations familiar with encryption and privacy, but better outreach is needed so that organizations understand the relevance of encryption and feel welcome in tech-heavy environments.
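To illustrate how little code basic encryption can require these days, here is a minimal sketch using Python’s open-source cryptography library (the library choice and scenario are my own illustrative assumptions, not tools discussed at the Salon):

```python
# Minimal sketch: symmetric encryption with Python's "cryptography" library.
# Illustrative only; in practice the hard parts are key storage and usability.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # a secret key; store it somewhere safe
f = Fernet(key)

token = f.encrypt(b"sensitive field notes")  # ciphertext, safe to store or send
plaintext = f.decrypt(token)                 # raises InvalidToken if tampered with
print(plaintext)                             # b'sensitive field notes'
```

As the sketch suggests, the encryption itself is the easy part; key management and day-to-day usability are where most organizations struggle, which is exactly why the outreach described above matters.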

Thanks to participants and lead discussants for the great discussions and to ThoughtWorks for hosting us at their offices!

If you’d like to attend future Salons, sign up here!
