9 Considerations for Designing Safe Digital Solutions for Sensitive Services

As the world became more digital in the wake of COVID-19, the number of mobile applications and online services increased exponentially. Many of these apps offer important support to people who live and move in contexts where they are at risk.

Digital apps for sensitive services (such as mental health, reproductive health, shelter and support for gender-based violence, and safe spaces for LGBTQI+ communities) can expose people to harm at the family, peer, and wider societal level if not designed carefully. This harm can be severe – for example, detention or death. Though people who habitually face risk have their own coping mechanisms, those designing digital apps and services also have a responsibility to mitigate harm.

At our March 8 Technology Salon NYC (hosted at Thoughtworks), we discussed how to create safe, private digital solutions for sensitive services with a group of key thought leaders.

Key Takeaways from the conversation

1. Do constant threat modeling.

Threat modeling needs to include a wide range of potential challenges, including mis- and disinformation, hostile family and community members, shifting legal landscapes, and law enforcement tactics. The latter are especially important if you are working in environments where people are being persecuted by their government.

Roughly 70 countries, most of them in Sub-Saharan Africa, criminalize consensual same-sex activities and some forms of gender expression. The US is placing ever greater legal restrictions on gender expression and identity and on reproductive rights, and laws differ from state to state, making the legal landscape highly complex.

Hate groups are organizing online to perpetrate violence against women, girls and LGBTQI+ people in many other parts of the world as well. In Egypt, police have used the dating app Grindr to entrap, arrest and prosecute gay men. Similar tactics were used in the US to identify and ‘out’ gay priests. Since political and social contexts and the tactics of those who want to do harm change rapidly, ongoing threat modeling is critical. Your threat models will look different in each context and for each digital app.
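
One lightweight way to keep threat modeling ongoing is to treat the model as a living artifact that is reviewed on a regular schedule rather than written once. The sketch below is purely illustrative; the structure, fields, and example entries are assumptions, not a prescribed format.

```typescript
// Illustrative only: a minimal structure for a living threat model.
// The fields and example entries are assumptions, not a standard.

interface Threat {
  actor: string;          // who could cause harm (state, family, hate group, platform)
  vector: string;         // how the harm could occur
  impact: "low" | "medium" | "high";
  mitigations: string[];  // design or operational responses
  lastReviewed: string;   // ISO date of the last review
}

const threatModel: Threat[] = [
  {
    actor: "Law enforcement",
    vector: "Subpoena of server-side message logs",
    impact: "high",
    mitigations: ["Store messages on-device only", "Minimize retention"],
    lastReviewed: "2023-03-01",
  },
  {
    actor: "Hostile family member",
    vector: "Inspection of the phone and installed apps",
    impact: "high",
    mitigations: ["App disguise option", "Quick-delete feature"],
    lastReviewed: "2023-03-01",
  },
];

// Flag entries that have not been revisited recently, since contexts shift quickly.
const STALE_AFTER_DAYS = 90;
const now = Date.now();
const stale = threatModel.filter(
  (t) => now - Date.parse(t.lastReviewed) > STALE_AFTER_DAYS * 24 * 60 * 60 * 1000
);
stale.forEach((t) => console.log(`Review needed: ${t.actor} / ${t.vector}`));
```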

2. Involve communities and other stakeholders and experts.

Co-creation processes are vital for identifying what to design as well as how to design for safety and privacy. By working together with communities, you will have a much better idea of what they need and want, the various challenges they face to access and use a digital tool, and the kinds of risks and harms that need to be reduced through design and during implementation.

For example, a lot of apps have emergency buttons designed to protect women, one Salon participant explained. These often alert the police; however, that might be absolutely the wrong choice. “Women will tell you about their experiences with police as perpetrators of gender-based violence” (GBV). It’s important to hire tech designers who identify with the groups you are designing for/with.

Subject matter experts are key stakeholders, too. There are decades of experience working with groups who are at risk, so don’t re-invent the wheel. Standards exist for how to work on themes like GBV, data protection, and other aspects of safe design of apps and digital services – use them!

3. Collect as little data as possible.

Despite the value of data for measuring impact and use and for adapting interventions to the needs of the target population, collecting personal and sensitive data is extremely dangerous for the people using these apps and for the organizations providing the services.

Data collected from individuals who explicitly or implicitly admit to same-sex activities or gender non-conforming behavior could, in theory, be used by their family and community as evidence in their persecution. Similarly, sexual activity and fertility data tracked in a period tracker could be used to ‘prove’ that a girl or woman is/was fertile or infertile, had sex, miscarried, or aborted — all of which can be a risk depending on the family, social, or legal context.

Communication on sensitive topics increases the risk of prosecution because email, web searches, social media posts, text messages, voice messages, call logs, and anything that can be found on a phone or computer can be used as evidence. If a digital app or service can function without collecting data, then it should! For example, it’s not necessary to collect a person’s data to provide them with legal advice or to allow them to track their period.
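
As a concrete illustration of functioning without collecting data, a period tracker can keep every entry on the device and never send anything to a server. The snippet below is a minimal sketch assuming a browser environment with localStorage; the names and data shape are illustrative, not drawn from any particular app.

```typescript
// Minimal sketch of on-device-only storage for a period tracker.
// Assumes a browser environment (localStorage); no account, no server calls.

interface CycleEntry {
  date: string;   // ISO date, e.g. "2023-03-08"
  note?: string;
}

const STORAGE_KEY = "cycle-entries";

function loadEntries(): CycleEntry[] {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as CycleEntry[]) : [];
}

function addEntry(entry: CycleEntry): void {
  const entries = loadEntries();
  entries.push(entry);
  // Data never leaves the device: no network request, no analytics event.
  localStorage.setItem(STORAGE_KEY, JSON.stringify(entries));
}

function deleteAllEntries(): void {
  // A single call wipes everything, which also supports a "quick delete" feature.
  localStorage.removeItem(STORAGE_KEY);
}
```

Because nothing is collected centrally in this pattern, there is also nothing held by the organization that could be subpoenaed or sold.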

4. Be thoughtful about where data is stored.

When using third party apps to help manage a digital solution, it’s important to know exactly what data is stored, whether the data can be deleted, and whether it can be subpoenaed. Also consider that if an app or third-party data processor is sold to another company, the data they store will likely be sold along with the app, and the policies related to data might change.

While sometimes it is safer to store data on an individual’s device, in other cases it might be safer for data to live in the cloud and/or in a different country. This will depend on the threat landscape and actors. You’ll also want to review data privacy regulations for the countries where you are based, where the data is stored, and where your target end users live. You may need to comply with all of these regulations, depending on where data is collected, processed, and stored.

Some countries have “data sovereignty laws” that dictate that data must reside in the country where it was collected. Some governments have even drafted laws that require the government to have access to this data. Others have so-called “hostage” laws that require digital platforms to maintain at least one employee in the country. These employees have been harassed by governments that push them to comply with certain types of censorship or surrender data from their digital platforms.

If government is your main threat actor, you might need to decide whether non-compliance with data laws is a risk that you are willing to take.

5. Improve consent processes and transparency.

Consent cannot be treated as a one-off process, because circumstances change and so does consent. Digital platforms generally do a terrible job of telling people what happens to their data and informing them of the possible risks to their privacy and safety. It’s complicated to explain where data goes and what happens to it, but we all need to do better with consent and transparency. Engaging the people who will use your app in designing a good process is one way to develop easy-to-understand language and explanations.
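
One way to make consent ongoing rather than one-off is to version the consent language and re-prompt whenever it changes. The sketch below is a rough illustration of that idea, assuming browser storage; the version constant and field names are assumptions.

```typescript
// Sketch: versioned consent that is re-requested when the terms change.
// The version constant and storage approach are illustrative assumptions.

const CURRENT_CONSENT_VERSION = 3; // bump whenever data practices or risks change

interface ConsentRecord {
  version: number;
  grantedAt: string; // ISO timestamp
}

function getStoredConsent(): ConsentRecord | null {
  const raw = localStorage.getItem("consent");
  return raw ? (JSON.parse(raw) as ConsentRecord) : null;
}

function needsConsentPrompt(): boolean {
  const record = getStoredConsent();
  // Prompt again if consent was never given or was given for an older version.
  return record === null || record.version < CURRENT_CONSENT_VERSION;
}

function recordConsent(): void {
  const record: ConsentRecord = {
    version: CURRENT_CONSENT_VERSION,
    grantedAt: new Date().toISOString(),
  };
  localStorage.setItem("consent", JSON.stringify(record));
}
```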

6. Help people protect themselves.

Add content to your website, app, or bot that helps people learn how to adjust their privacy settings, understand the risks of using your service, and protect themselves while doing so. Features mentioned by Salon participants include: letting people disguise the apps they are using; letting them quickly delete their data and/or the app itself; masking or ‘forgetting’ phone numbers so that a number won’t appear in the contact list and text message content won’t repopulate if the number is used again to send a text; and using different phone numbers for the organization’s website and for outreach so the numbers are harder to trace back to the organization or a service.
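
Several of these protective features come down to making data easy to remove in a hurry. The sketch below shows one possible “quick delete” routine for a browser-based service; which stores it clears, and the neutral placeholder page it navigates to afterwards, are assumptions that would need to be adapted to the actual app.

```typescript
// Sketch of a "quick delete" routine for a browser-based service.
// What it clears, and where it navigates afterwards, are illustrative assumptions.

async function quickDelete(): Promise<void> {
  // Remove everything the app keeps on the device.
  localStorage.clear();
  sessionStorage.clear();

  // Clear cached responses if the app uses the Cache API (e.g. a PWA).
  if ("caches" in window) {
    const keys = await caches.keys();
    await Promise.all(keys.map((key) => caches.delete(key)));
  }

  // Finally, navigate away so nothing sensitive remains on screen or in history.
  window.location.replace("https://www.example.org/"); // placeholder neutral page
}
```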

7. Plan for the end of your project and/or funding.

It’s important to plan for how you will safely delete all your data and any data held by third parties at the end of your funding cycle if the app or service is discontinued. In addition, you’ll need to think about what happens to the people who relied on your service. Will you leave them high and dry? Some organizations think of this as an “off ramp” and recommend that you plan for the end of the effort from the very beginning.

8. Take care of your staff.

Ensure that you have enough staff capacity to respond to any incoming requests or needs of the people your service targets. Additionally, keep staff safe from harm. Countries like Hungary, Russia, and Indonesia have laws that make it challenging to provide educational material related to LGBTQI+ identities, especially to minors. Similarly, some countries and some US states prohibit any type of counseling related to abortion or gender-affirming care.

This poses a risk to organizations that establish legal entities and employ people in these countries and states, and to their staff. It’s critical to ensure that you have enough resources to keep staff safe. You will also want to provide support to help them avoid burnout and deal with any vicarious trauma. Keeping staff safe and healthy is not only good for them, but also for your service, because better morale means higher quality support services.

9. Accept that there will be trade-offs.

Password-protected apps are more secure, but they can pose higher barriers to use because they introduce friction. If your app doesn’t collect personal data, it will be safer, but it will be more difficult to offer password reset or recovery options, which is a usability challenge, especially in places where people have lower literacy and less experience using apps and passwords. When data is stored locally, it’s less susceptible to large-scale data mining; however, it may be more at risk of a family member or law enforcement forcing it to be shared, and if the device is lost or broken, the data is lost with it.
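
The password trade-off is concrete in code: if data is encrypted on the device with a key derived from the user’s passphrase, there is nothing for the service to reset, and a forgotten passphrase means the data is gone. The sketch below uses the Web Crypto API (PBKDF2 and AES-GCM) to illustrate the idea; the iteration count and other parameters are illustrative, not a vetted configuration.

```typescript
// Sketch: encrypting local data with a key derived from a passphrase (Web Crypto API).
// If the passphrase is forgotten, the data cannot be recovered -- that is the trade-off.
// Iteration count and other parameters are illustrative, not a vetted configuration.

async function deriveKey(passphrase: string, salt: Uint8Array): Promise<CryptoKey> {
  const material = await crypto.subtle.importKey(
    "raw",
    new TextEncoder().encode(passphrase),
    "PBKDF2",
    false,
    ["deriveKey"]
  );
  return crypto.subtle.deriveKey(
    { name: "PBKDF2", salt, iterations: 310_000, hash: "SHA-256" },
    material,
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt", "decrypt"]
  );
}

async function encryptLocally(passphrase: string, plaintext: string) {
  const salt = crypto.getRandomValues(new Uint8Array(16));
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const key = await deriveKey(passphrase, salt);
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(plaintext)
  );
  // Salt and IV are not secret; they must be stored alongside the ciphertext.
  return { salt, iv, ciphertext: new Uint8Array(ciphertext) };
}
```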

Large platforms may be more prone to commercial privacy risks, yet in some ways they provide greater data security. As one person said, “We decided to just go with WhatsApp because we could never develop a platform as secure as theirs – we simply don’t have the engineering power that they do.” Another person mentioned that they offer a Signal option (which is encrypted) for private messaging but that many people do not use Signal and prefer to communicate through platforms they already use.

These more popular platforms are less secure, so the organization had to find other ways to set protective parameters for people who use them. Some organizations have decided that, despite the legal challenges it might bring, they simply will not hand over data to law enforcement. To prevent this situation from happening, they have only set up legal entities in countries where human rights protections for the populations they serve are strong.

You’ll want to carefully discuss all these different privacy and usability choices, including with potential end users, to come to the best decision for each app or service.
