Data Ethics, Power, and Privacy in COVID-19 Digital Response

Data is top of mind during the COVID-19 response. Quality data can help to direct the response and to make decisions, but it’s not the silver bullet that everyone is hoping for. We need to remain aware of the power dynamics represented in how data is collected, treated, protected, and interpreted. There needs to be transparency, scrutiny, and accountability with regard to how governments, the private sector, and non-profit agencies are collecting, using, and sunsetting data.

In the rush to apply technology to a problem, marginalized and under-represented communities are often left out. The very methodologies deployed in data collection are determined by the biases of the collectors, and that can mean certain groups are excluded.

COVID-19 is an opportunity to make societal changes. We all need to lift our voices to work towards a more ethical reality. We also need to remember that each data point is an individual person, and ensure that we treat data with the care that those persons deserve.


Report Back on the Salon

by Allana Nelson, Digital Impact Alliance, and Linda Raftree, Independent Consultant and New York City Tech Salon Convener

Our April 30th Salon focused on upholding data ethics and privacy during COVID-19. This was the third virtual Salon in the Tech in the Time of Coronavirus series.

We heard from five experts in the field:

  • Zara Rahman, Deputy Director, The Engine Room
  • Tracey Gyateng, Independent, Responsible Data & Ethics
  • Amanda Makulec, Data Visualization Lead at Excella & Operations Director, Data Visualization Society
  • Reema Patel, Head of Public Engagement, Ada Lovelace Institute
  • Sean McDonald, Co-Founder, Digital Public + CEO of FrontlineSMS


Key Points From Speakers


Zara Rahman

The Engine Room

Zara opened the Salon by discussing the role of big data in the COVID-19 response and why it may not be the silver bullet everyone is hoping for. In particular, she noted that many development and policy professionals in key leadership positions are pursuing big data sources but don’t know how best to use that information once they have access to it. Despite having a wealth of data at their disposal, some leaders still do not know what questions they should be seeking answers to, or how to design the collection of citizen data in a way that addresses current needs.

This has contributed to a widespread lack of transparency and accountability in how personal information is collected and used across governments, despite the astounding number of public-private partnerships being crafted at this time. Scrutiny of such relationships is “at an all-time low,” according to Zara. “There are so many interesting writings [out there] about [how] transparency needs to be at an all-time high, but this requires that people in those positions open themselves up to that scrutiny and transparency.”

Zara pointed out that we all need to consider what the long-term impact of “fast-moving or quickly-made decisions” will be: What will happen to all of the data currently being traded once the COVID-19 pandemic is over? Are we going to hold government leaders accountable now, or will we wait until this pandemic is over? While many advocates of big data use may want us to focus more on the results and benefits rather than the methods for the collection of data, Zara reminded us that, “how data is being gathered matters just as much as the insights.”


Sean McDonald

FrontlineSMS and Digital Public

Sean began his presentation with a staggering statistic: “At present, at least 84 countries have declared public emergencies, encouraged national lockdowns, or have suspended citizen rights in some way because of COVID-19.” As a result, lockdowns that started out as protective measures for public health have quickly become law enforcement measures. In particularly vulnerable countries, this gives governments a window of opportunity to take advantage of the current health crisis for their own gain. “What starts as medical advice becomes a law issue and we have already seen deaths in some places.”

Sean emphasized that the way those of us in the development and humanitarian communities utilize and leverage technology and data in this context is extremely important. “Each field has its own approach to ensure the things it produces are moral, fit for good, and are valid,” Sean said. “Those approaches are often disconnected and [because of COVID-19], authorities that would otherwise be focused on scrutiny are instead refocused on response, so there are institutional gaps where quality control isn’t taking place.”

Within this challenge of scrutiny and oversight, new mobile phone applications are being created or launched weekly that claim to solve an important transmission problem: understanding the proximity of an infected person to others, and the likelihood that the virus will spread as a result. There is a lot of experimentation in the creation of these apps, and “in that chaos, one looks to every kind of tool that you can use to arrange and secure relationships,” whether they are safe and appropriate or not. Sean asked the Salon participants, “What does accountability look like in the process of creating such apps?”
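
To make concrete what these apps attempt, here is a purely illustrative sketch of the kind of proximity-and-duration heuristic a contact tracing app might experiment with. Every name, threshold, and weight below is hypothetical, not drawn from any real app.

```python
# Hypothetical sketch of a proximity-risk heuristic: closer and longer
# encounters contribute more to a cumulative exposure score. Real apps
# estimate distance from Bluetooth signal strength, which is noisy, and
# would calibrate these numbers with epidemiologists.

from dataclasses import dataclass

@dataclass
class Encounter:
    distance_m: float    # estimated distance between two phones
    duration_min: float  # how long the phones stayed in range

def exposure_score(encounters: list[Encounter]) -> float:
    """Crude cumulative exposure score; all cutoffs are invented."""
    score = 0.0
    for e in encounters:
        if e.distance_m <= 2.0:  # hypothetical "close contact" cutoff
            # Weight by duration, inversely by distance (floored to avoid blow-up).
            score += e.duration_min / max(e.distance_m, 0.5)
    return score

if __name__ == "__main__":
    day = [Encounter(1.0, 15), Encounter(3.5, 60), Encounter(0.8, 5)]
    print(f"exposure score: {exposure_score(day):.1f}")
```

Even this toy version makes the accountability question concrete: whoever sets the cutoff and the weighting decides who gets flagged, and nothing in the code itself records who made that choice or why.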

Sean argued that accountability guardrails should be put in place through legally binding contracts. Contract language offers an opportunity to build accountability into new technology and to bind those creating it. Sean called this “architecting accountability,” and noted that it could be a path to normalizing data rights and creating an expectation for what is included in future proposal responses. “When we are talking about data, we are talking about both the information we know is contained in it, but also the social license to act on that data. As a responsible data community this [means we] can have an impact on the permission we give for people to act on and access [our data].”

Sean closed by recommending that we utilize contracts to create operational practices that have “teeth and actionable frameworks” for securing accountability and oversight of personal data rights in the future. “Power relationships aren’t all that complicated; a lot of the way we can intervene is by strategically negotiating those relationships.”


Reema Patel

Ada Lovelace Institute

Reema joined this Salon as a speaker following last month’s publication of a new report from the Ada Lovelace Institute on the COVID-19 response, Exit Through the App Store, which sets forth guidance to the UK government on the “technical considerations and societal implications of using technology to transition from the COVID-19 crisis.”

Having assessed the state of play for digital technologies in the COVID response, she brought interesting insights to the issue of data use and privacy. She reasoned that the controversial topic isn’t quite so clear-cut: “questions about privacy are more about the society we want to build; digital contact tracing is actually about what we understand in the evidence base and how are we making our decisions.” But, she noted, reasonable use of data has given way to scope creep. “The rate of change [of data use] means that you can start with one intent and it could morph into something that lacks clear remit and functionality; it’s escalated out of control.”

So, how do we combat such scope creep in data use? According to Reema, we must make sure that society holds technology providers accountable for the technology they create. As a society, we must make sure that the value of an app is understood and that it has not just been created as a “knee jerk reaction” to the current state of events (see the report from our first Salon in the Tech in the Time of Coronavirus Series for takeaways on this topic). Many nations have mandated contact tracing via apps as a means of behavioral control. As Reema stated, “it speaks to broader societal pressure. How does this [crisis event] impact how governments should legitimately behave? Should governments use technology to shape the way people behave?”

Reema closed by pointing out that there is almost no evidence that digital contact tracing has worked for COVID-19 as it has for past viruses. The World Health Organization (WHO) has even put out an official policy position that immunity certification doesn’t actually work, because a past COVID-19 infection does not guarantee immunity later on. This important fact shows that many “technology solutions” proposed thus far may not actually solve the challenges we face. “We need to understand the scale of the problem and the nature of the disease” before we start applying technology to it. Otherwise, efforts will be futile.

Tracey Gyateng

Independent Practitioner

Tracey’s talk focused heavily on the issue of inclusion during the COVID-19 crisis, and how the data currently available in many communities is not actually representative of all the groups that may be impacted by the virus. “I have not yet heard a clear definition of the problem that a service is trying to fix,” she opened. “If we don’t clearly understand the problem and the stakeholders it affects, then it is bound to fail. This shouldn’t be controversial, but the allure of data science and [Artificial Intelligence] can be intoxicating, especially for organizations that want to be seen as innovative.”

Tracey noted that oftentimes, in the rush to apply technology to a problem, marginalized and under-represented communities are left out due to the way we conduct requirements gathering. The very methodologies deployed in data collection are determined by the biases of the collectors, and that can mean certain groups are excluded. “Nine out of ten Black and minority ethnic charities are at risk of closing within a few months. In focusing on the problem [of COVID-19], we need to discuss what the potential consequences will be for everyone, and use this information when deciding what action we should take.”

Importantly, Tracey pointed out that “while this virus certainly can affect anyone, it will affect some people more than others: poorer communities deal with worse effects than affluent ones; minorities are largely in low-paid jobs and face structural inequalities; women make up a disproportionate share of healthcare workers and [traditionally] carry more domestic duties.” Tracey emphasized that any technological solutions created to respond to COVID-19 will only increase inequalities if the technology community does not address these inherent problems. “Inclusion must stretch across the whole tech development pipeline, and we must collect and analyze data on minority groups so that we know if interventions are ethically just.”

Tracey closed with a note of optimism and a call to action on the future that can be created as an outcome of this pandemic. “If COVID-19 is an opportunity to make societal changes and we are seeing some policies that were said to be impossible being made possible, then we all need to lift our voices to work towards a more ethical reality.”

Amanda Makulec

Data Visualization Society and Excella

Amanda shared her experience and background in data visualization, and how it can be leveraged to better understand the COVID-19 pandemic and the impact it is having around the world. She noted that many news outlets use confusing or conflicting data visuals to explain to the public what is happening with the virus, which has left many people feeling overwhelmed by information or simply confused by the graphs and tables being published. Amanda identified three key constraints and challenges that make all of the coronavirus-related data so hard to interpret.

  • Saturation: Visualizations of data currently have a wide reach because of the draw that COVID-19 has, but Amanda noted that we have to ask ourselves how relevant this information is to each of us and how it informs our day-to-day decision making. If we determine that it isn’t immediately relevant to us personally, then “it’s okay to step away.”
  • Incomplete information: “There are a lot of gaps and challenges in the data. We don’t have good disaggregated data.” Amanda pointed to the same concerning issues that Tracey raised: large swaths of society are missing from data collection, leaving gaps in the information that is then visualized. This results in a misrepresentation of the current situation and makes it harder for individuals to make decisions about their own well-being.
  • A lot of unknown unknowns: Oftentimes, data visualizations do not disclose the information they have failed to capture or are unsure of. Relationships between different data sets are not always evident (or do not exist at all). Public data reporting is not always complete, and infection calculations are not always accurate (or sample sizes may be too small). “Think about uncertainty,” Amanda said. “It’s an important piece and something we see in the models. Some models plot firm, thick lines; look instead for people who are open about uncertainty and the limitations of what they’ve created.” Amanda shared that suppliers of data who are upfront about these uncertainties and limitations are likely better sources to turn to. Additionally, she said that we “must look to epidemiologists to lead. Power BI and Tableau can do a lot, but they aren’t always correct. When we start to plot numbers, they start to seem like numbers that we know, running on the best data that we have… [but] epidemiologists are the leaders here, we need to listen to them.” A minimal sketch of plotting uncertainty as a band rather than a single firm line follows this list.
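
As a rough illustration of Amanda’s point about uncertainty, the sketch below plots a projection as a shaded band rather than a single firm line. The numbers are fabricated for illustration; this is not an epidemiological model.

```python
# Minimal sketch: show a projection as an uncertainty band, not one line.
# All values are made up for illustration.

import numpy as np
import matplotlib.pyplot as plt

days = np.arange(0, 30)
median = 100 * np.exp(0.08 * days)  # hypothetical central projection
lower = 100 * np.exp(0.05 * days)   # hypothetical optimistic bound
upper = 100 * np.exp(0.11 * days)   # hypothetical pessimistic bound

fig, ax = plt.subplots()
ax.fill_between(days, lower, upper, alpha=0.3, label="uncertainty range")
ax.plot(days, median, label="central estimate")
ax.set_xlabel("days from today")
ax.set_ylabel("projected daily cases")
ax.set_title("Plot the band, not just the line")
ax.legend()
plt.show()
```

A reader who sees the band immediately learns what a single thick line hides: how far apart the plausible futures are.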

Amanda closed by reminding us that, at the end of the day, each statistic and data entry is actually a person. “Humanity behind the data is so incredibly important. Whether we look at a chart of cases and deaths, or a chart on unemployment, each individual is a person. We can easily forget this, but we must remember there is a person behind it.”

Breakout Group Discussions

Following the speakers, we broke into six smaller groups to discuss key issues that Salon participants wanted to raise. Below are the main points from these rich discussions.

Get beyond privacy and think about ethics

We need to focus more broadly than just “privacy.” As Sean pointed out in his talk, crises such as this pandemic allow governments to exercise “emergency powers,” and normal laws don’t really apply because due process checks get placed on hold. So as technology and development practitioners, we need to go above and beyond in terms of evaluating pre-existing biases. “COVID highlights deep inequities, and often a technocratic solution deepens and perpetuates that.”

There are several issues here, including how data is used now and getting the objectives right for that, but also thinking about what happens afterwards. “We need to think in terms of scenarios: what might happen if we take certain actions or allow certain actions to happen, what might the future look like if we allow access to this data without mechanisms to stop access afterwards? How do we police this? I would find it useful to have sets of scenarios that we can explain to people and put into context how our futures might diverge based on decisions made now.”

COVID-19 health data being collected in mobile apps and being connected to financial records is highly concerning. Other problematic situations include non-consensual collection of COVID-19 related health data. “In the UK, we passed laws for data. If you call a healthcare provider now, that data is captured and allowed to be linked with all your other health data. These laws encapsulate capabilities to arrest or detain people, even children. That’s why the ethics of this are so important. We are capturing that data and no consent is required. This is really shocking. Can we at least put up posters informing people that their data is being collected?”

Understand whether there’s a use case

As Tracey and Amanda both noted, we need to understand the use cases for data and avoid the “solution looking for a problem” syndrome. “There’s too much stuff being built because data sets are too easy to get to, but we need to think about what we are publishing. We need to rethink this because it creates a lot of noise. The opportunity to miss the mark and be misinformed is too high.”

Additionally, data visualization challenges are problematic. “I’m bugged by data visualization challenges. It’s a lot of ‘here’s the data, build things!’ People are jumping on this. My organization has leaned out and decided not to sponsor support to these types of initiatives because they do more harm than good. I hate the idea of people being able to play around with data and publish it without really understanding it.” As a sector, we need to ask uncomfortable questions like “Is this a priority? Does it meet the goal? Does it mislead the users? It’s highly political, regardless of whether you’re in the government, public, or private sector, and working with these tools.”

Mistrust of authorities is a foundational challenge

In some communities, people are less aware of privacy risks and not concerned about them, but in others there is “well-grounded distrust of NGOs, big tech firms and governments.” One person said they had worked in a refugee camp in South Sudan where “people weren’t worried about how data could get to the government because they didn’t understand how the government could use the data.” Elsewhere, however, distrust in the authorities has meant that groups and individuals have refused to provide data. In the US, for example, a COVID-19 testing site run by the National Guard was less well visited than other testing sites because the community didn’t trust what the National Guard might do with the data. In the UK, Black citizens have less trust in the government and what it might do with data.

One way to address this is bringing constituents into the conversation. At one organization, “beneficiaries were initially opposed to this idea of sharing [data] across departments, but once we explained how we’d be using it, they were open to it. We also made sure it was written down and we asked for feedback on policies. We set up a panel, where we met regularly to keep everyone abreast of whether we were answering the question that we set out to.”

Hold both companies and governments accountable (but how?)

Salon participants had mixed opinions about whether governments or companies were the bigger problem in terms of data protection and ethics. “There is a tendency to focus on government. I’m more concerned about private companies. We are creating a lot of technologies and seeing what sticks, and then later trying to rein it back in to make sure that it can’t be misused,” said one participant. Another, however, said “I’m a lot more worried about governments than private companies, particularly in some of the sub-Saharan African contexts…. We will likely see companies and governments taking advantage of this data.” Unfortunately, in times of coronavirus, we seem to be racing straight ahead without any strong policies in place that put boundaries around this.

“Who should we be worried about? Should companies be responsible for protecting data?” “We are in such a fast-moving environment, is it not then up to the companies that are working with governments to protect the data? Can we install a safe switch? Can we not put pressure on them? A time limit, like one year, then all the data is gone?” Others raised the privacy-by-design framework, saying that technology companies need to make it an integral part of design. “You should have a privacy professional and bring them into the design.”
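
One concrete reading of the “time limit” suggestion above is a scheduled retention sweep that deletes records older than a fixed window. The sketch below assumes a SQLite store with an invented table and column name; it is illustrative, not a recommendation of any specific design.

```python
# Hypothetical retention sweep: "one year, then all the data is gone."
# Assumes records live in SQLite with an ISO-8601 'collected_at' column;
# the database, table, and column names are invented for illustration.

import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 365

def purge_expired(db_path: str) -> int:
    cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM contact_events WHERE collected_at < ?", (cutoff,)
        )
        return cur.rowcount  # number of expired rows removed

if __name__ == "__main__":
    print(f"purged {purge_expired('tracing.db')} expired records")
```

The harder part, as participants noted, is not the code but the governance: someone has to be obligated, by contract or law, to actually run the sweep and to prove that backups and copies are purged too.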

Another issue is how to encourage governments to collect only the data that is necessary and absolutely required. How do you protect people’s privacy over time? How do you balance these needs? One person commented that “the question is not whether we should use the data for good (because the answer is yes, with caveats), but why we are allowing it to be collected under baseline conditions. Imagine a world where a national emergency allows the data to be collected for good reason. But right now, it’s being collected by telcos, and governments don’t really know what telcos are doing with it. How do we get policy makers to understand the attributes, qualities, and behaviors of data? Because they don’t currently understand its implications and how to use it properly.”

Another question was about where the data goes. “If we’re just switching on services for big tech, supported by governments, collecting data into for-profit company databases, what happens after this? Who will have access to it? What are their other uses for it? Filling out missing pieces of the picture of digital consumers, perhaps, so they can better influence elections, buying habits, etc.? This data shouldn’t be going into private sources but a government source, which has its own issues, but it’s separated from private companies and their abuses.”

“What technology can do is outpacing us. New capabilities are there before we realize it. How can we get policy in place in the next 3 months if these things are rolling out today? What tools can we use now to make sure that PII is not being picked up on?” There are some best practices around anonymization, but “you anonymize it one week and then the next week you can de-anonymize it.” “Tech is fast, but science is slow because of the need for verification. How do we make sure epidemiologists are central to the data we’re visualizing? How do we make sure they’re part of every step?”
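
To make the de-anonymization worry concrete, the toy sketch below shows a classic linkage attack: joining a “de-identified” health extract to a public dataset on quasi-identifiers (ZIP code, birth year, sex) restores names. All of the data is fabricated.

```python
# Toy linkage attack: dropping names is not anonymization when
# quasi-identifiers remain. Every record below is fabricated.

import pandas as pd

deidentified = pd.DataFrame({
    "zip": ["10025", "10025", "94110"],
    "birth_year": [1971, 1985, 1990],
    "sex": ["F", "M", "F"],
    "test_result": ["positive", "negative", "positive"],
})

public_register = pd.DataFrame({
    "name": ["A. Jones", "B. Smith", "C. Lee"],
    "zip": ["10025", "10025", "94110"],
    "birth_year": [1971, 1985, 1990],
    "sex": ["F", "M", "F"],
})

# The join re-attaches names to supposedly anonymous health records.
reidentified = deidentified.merge(public_register, on=["zip", "birth_year", "sex"])
print(reidentified[["name", "test_result"]])
```

This is why “anonymized” releases keep getting re-identified: the identifying power sits in combinations of ordinary attributes, not in the name field alone.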

COVID-19 is being used as an excuse for a data grab

A big concern is that “we’re on the verge of big tech doing the most massive personal data grab that’s ever occurred. How can that be managed? What does the oversight look like? Is big tech doing a runaround of GDPR because they are working in conjunction with governments?” Contact tracing is an example of a crisis being used for data grabs. “How many solutions are coming from health workers on the front line and how many are from data groups or organizations that don’t have the right context of the challenges? You collect data and that becomes interesting to a group that is finding patterns. It’s not a solution to the real problems that people are facing, but it is used to justify collecting and using that data.”

Fear is leading to a “Shock Doctrine” situation

As one participant said, “people are driven by fear. This is also a pandemic of power. How will we come back from that? Will we be able to come back from that?” Another asked, “how can we avoid something like Naomi Klein’s ‘shock doctrine’? How do we make sure the new normal is not created from the abnormal? How do we determine what is acceptable going forward, and how do we make sure some things stop at a certain point?” The surveillance that arrived with 9/11 has remained in place for the past 20 years.

“The long-term goal is having systems that we are the master of. We should be able to choose what data to unlock and relock. That puts us back in control, rather than having our data out there and bought and sold.” Many privacy lawyers are talking about how to hold the government to account and ensure there is no mission creep. “There has to be a balance. There are some things we should be able to do for COVID. We need to be open and transparent about the choices that have to be made.”

Not only is data being grabbed; power is being grabbed too. Governments are taking advantage of the crisis to increase their powers. For example, Israel activated its domestic police to move people to COVID-19 hotels against their wishes. The question is whether we end up extending the power of authoritarians as a trade-off for curtailing the virus as quickly as possible.

What happens after the crisis?

The COVID-19 crisis will have plenty of downstream effects, and the timelines are long. In times of crisis, things that would never be accepted otherwise get deployed. How do we design with the user when there is this power imbalance? What is the role of government and regulation in providing boundaries? “What we haven’t seen is someone trying to put legislative limitations on where we are getting this data, or having such laws in effect. No one has put down a statement of how long it will last. That doesn’t give me confidence,” said one person. It is hard to find good examples of governments that are enforcing sufficient accountability. “I wonder what we need to do to hold people in power accountable for what they are doing now… mapping and cataloging legislation that wouldn’t necessarily go through stringent processes, or things that are out of the ordinary. If we assume that those in power are not making great decisions right now, what do we need to do to hold them accountable?”

The Czech Republic was cited as an example of strong controls, thanks to its activist judiciary. “The legislation was challenged in court, and the court intervened and forced the government to take a different route to make sure that the government was within the law.” There have also been announcements that measures are temporary, but the endpoint is very unclear. The endpoint is often “once we contain the crisis” and not “when we hit certain numbers or a date. That is an issue, obviously. It is a slippery slope; without a hard deadline, it could be prolonged forever.”

One suggestion was to assume the worst will happen and use that mindset to design projects better. “Can we design this stuff with the idea that a malicious actor or a crazy man will run your country? In that case, what different project or product design decisions would you make? Make the assumption that something terrible will happen and work backwards from there.”

Participants with experience in national security said that “once the door is open, it is near impossible to close it. Once the government has [certain data], it will be taken advantage of. There needs to be an incentive and punitive or regulatory structure that ensures that companies and governments don’t take advantage of access to new data and that there are sufficient penalties in place to discourage bad actors.”

Are there any bright spots?

Though there are plenty of concerns, people did identify some bright spots. For example, a few organizations are seeing data issues being pushed up the management chain. “The urgency that COVID brings, and the polarizing debates about what should or should not be done (and what can be done) with mobile data and other sources, have escaped the small niche community into the mainstream. I think that’s a good thing.”

“I’m seeing more learning and conversations about ‘hang on, what’s going on with privacy?’ than before,” said one person. “Because COVID is impacting the US, the UK, and other countries in the Global North, there is now less tolerance for privacy issues. For example, things that happened during Ebola would never fly in the UK.” “There’s more awareness, more civil society, judicial prudence. It’s beneficial,” said another person, “but I don’t see it automatically trickling down in developing countries, because the UK may have policies about contact tracing but those might not be applied in Nigeria.”

On the regulatory side, it was added, “there are lots of interesting movements across Africa. There’s been a push from countries that historically have weak privacy policies. Although it’s not exclusively COVID related, there have been two things in Kenya: the government pushed NIIMS [the National Integrated Identity Management System], and a Nubian rights civil society group took it to court. The government wanted to collect GPS and DNA data for all people living in Kenya, which was unnecessary. The court had to balance the registration function against protecting people’s privacy and control over personal data. GPS and DNA data were determined to be excessive, while other biometrics were deemed necessary.”

As one person noted, however, the biggest challenge for many governments is operationalizing good policy, and few are able to do it. This point applies to organizations as well. “The biggest challenge in executing policy is getting everyone to believe in the policy itself. It’s a huge leadership challenge.”

Importantly, there are key things to learn from this crisis so that we are better prepared for the next wave of COVID-19 and future pandemics. It likely won’t be another 100 years before we experience the next one.


Resource List

Check out our Resource List for additional materials on this topic!

Tech in the Time of Coronavirus

The Tech in the Time of Coronavirus Series is co-organized and supported by Technology Salon in conjunction with ThoughtWorks, Pivotal Act, the UN Foundation’s Digital Impact Alliance (DIAL), and GitHub. This particular Salon was also supported by The Engine Room. The series aims to bring together the wider technology sector with humanitarian and crisis response sector experts, specifically those who have worked on past crisis situations, to highlight good practices and to avoid repeating well-documented mistakes and re-inventing wheels.

We also hope that through connections made at these Salons we can find effective and impactful ways to work together on the COVID-19 response. The first part of each Salon is recorded so that it can be publicly shared. For the second hour, participants are divided into moderated, off-the-record break-out groups for frank and open discussions aimed at identifying and working through challenges and moving towards collaboration.

We will cover several topics over the next few months, including the responsible and ethical use of data during COVID-19; effective ways to volunteer; the role of corporate, foundation, and other donors; and the impact of COVID-19 on online education, the economy and jobs, domestic abuse and gender violence, mental health and substance abuse, and other emerging secondary effects of the pandemic.

We will also cover topics that aim to help agencies working on the crisis move toward an effective digital response, one necessitated by the need to avoid face-to-face contact with communities and with one another, and by government quarantine mandates intended to avoid spreading the virus.

