Freedom of expression is a basic requirement for maintaining democracy and open societies where citizens are able to stay informed, express opinions and participate actively in public life. Over the summer the Solidarity Action Network (SANE) explored different aspects of freedom of expression through a series of curated conversations looking at the role that international civil society organisations (ICSOs) can play in protecting and increasing free expression and civic space, particularly in relation to digital space and freedoms.
The first example came from Takura Zhangazha and Arthur Steiner from Hivos, who shared how Hivos has played an ‘incubator’ role in supporting young artists and makers to champion free expression, and to increase and even celebrate civic space. Through its R.O.O.M. Program Hivos has designed interventions that support young creatives, strengthening their resilience to remain critical and independent so that they can continue to challenge damaging narratives and shrinking civic space. Beyond the solidarity shown through this form of ‘incubation’ and direct support, Hivos has also made use of facilitation as a method of solidarity. The programme has facilitated the convening and connecting of young makers and creative hubs, enabling cross-fertilisation between these makers and other actors in support of Pan-African solidarity that can counter closing civic space.
These forms of solidarity are brought to life through stories of R.O.O.M. Program partners – in particular, the Magamba Network based in Zimbabwe. The network focuses on the arts, digital media, activism and innovation. It has opened up space for free expression online through supporting and incubating young bloggers and new media start-ups speaking truth to power, and has inspired the creation of other, similar hubs in Southern Africa. As one of its activities, the network has convened discussions around the topic of internet access and ownership, inviting makers and creative hubs across the region to discuss the rise in internet regulations, cyber-crime laws, internet shutdowns, and – in some countries – the increasingly prohibitive costs of internet access.
(Still from video: Who owns civic space? by Hivos featuring Magamba Network)
The question of how international actors can show solidarity – by supporting local civil society actors to maintain and defend freedom of expression online and to protect online civic space – led us on to the next curated conversation, with Felicia Anthonio from Access Now. Felicia shared insights on the role that Access Now has played in coordinating and convening #KeepItOn, a global campaign and coalition that aims to end internet shutdowns.
Members of the #KeepItOn coalition work together to prevent shutdowns through awareness-raising, advocacy, capacity-building and litigation. Access Now further builds resilience among affected communities through technical support and grassroots grants, and applies multiple forms of solidarity at local, national and international levels.
(Graphic from: #KeepItOn update: who is shutting down the internet in 2021? by Access Now)
The campaign uses public solidarity to tackle shutdowns, for example through advocacy at the global or national level calling for specific internet shutdowns to be ended. Tensions can arise between public solidarity on the one hand and access on the other, and risks to access or to staff and partners’ safety can act as barriers to ICSOs signing on to open letters or speaking out publicly on the issue of internet shutdowns. However, as the #KeepItOn coalition’s work demonstrates, there is a spectrum of different modes of solidarity available to ICSOs.
There are different examples of ‘quieter’ acts of solidarity that ICSOs can take, such as helping to document restrictions in a particular context, or supporting local communities or groups affected by a shutdown. The coalition itself employs multiple modes and levels of solidarity; for example, combining awareness-raising at the multilateral level with litigation or advocacy at the national level, alongside strengthening local capacities to deal with the impacts of shutdowns. So, whatever their appetite or capacity for risk might be, ICSOs can contribute to the protection of digital space and freedoms, and joining the #KeepItOn coalition can be an effective first step.
Further details about these two examples can be found in the Solidarity Playbook, in the case studies on Hivos and Access Now. These cover the protection of online free expression and its potential to bring about social change, revealing different strategies for solidarity in the face of closing civic space. We encourage you to delve deeper into these topics by reading the cases!
NGOs and UN agencies have collected the sensitive personal data of millions of people in the global South. But with due prudence?
Aid and charitable organisations (hereinafter referred to as NGOs) have gone digital in recent years. One aspect of this involves collecting and processing large amounts of data about the very people they assist and interact with. Such data is often very personal and may include health and educational records, family relations, and religious and political beliefs. These data sets identify a person by name, address, picture, and increasingly also by biometric features.
“Trust is the fuel of our societies,” says the contemporary historian and philosopher Yuval Harari. This remark raises an important question: how do the digital data practices of NGOs affect this vital trust? Can their policies and practices stand up to public scrutiny, and are they worthy of our trust? Can we be sure that personal and often sensitive data has been obtained voluntarily from each of the data subjects from whom NGOs gather it? Is it right to assume that people have a clear understanding of the purpose, benefits, and risks related to the sensitive data they allow NGOs to collect, store and process?
In general, it can be said that NGOs fall under the jurisdiction of the country where they operate. That might reassure the public in the countries of the North, where data protection laws have been passed and are being enforced. However, on a global scale, as more and more governments enact data protection laws, we find ourselves in the paradoxical situation that NGOs often face new obligations to disclose personal data in the name of national sovereignty and security. Countries like Turkey and Rwanda have modelled their data protection laws on the European GDPR and demand that the personal data of their citizens and residents be kept exclusively on servers on their territory. But NGO data is sometimes also shared with donors to obtain funding, or as part of specific data-sharing agreements. And each contractual arrangement with commercial service providers – be it outsourced IT services such as storage and data analysis, or simple bank and mobile pay transfers – discloses the identity of, and information about, aid recipients.
Those who manage this personal data – senior NGO managers and data protection officers – are aware of the issues at stake. They will mention the noble mandate and the non-profit character of their organisation. Some will point out that they have internal data protection policies and regulations. International NGOs in particular, the western institutions led and funded from the global North, will claim that they follow the GDPR, the current international gold standard in data protection, even in countries outside of Europe.
It is certainly true that the data of their institutional and individual donors, including those recruited on the High Street to donate 10 dollars or euros a month, is kept and processed in a GDPR-compliant manner. NGOs know that they have to gain and maintain the trust of that important segment of the public, since their financial and reputational survival is at stake.
But what about the people at the margins – those data subjects in the global South who have surrendered their personal data to NGOs because they need protection and assistance in the face of war, natural disaster, or poverty? In such cases, specific questions need to be asked and addressed, about consent, purpose, and the safeguards in place.
Unfortunately, answers to these and other questions are in most cases likely to fall short of even minimum standards of data protection. Organisational culture and practices in many NGOs often fail to put the necessary focus on and resources into conducting impact and risk assessments in a particular context before decisions are made to collect, store and process personal data. Financial audits are common while data audits remain an exception. Operations security processes and strategies for protecting critical and sensitive data are rare within organisations and absent at an inter-organisational level. Self-policing is the norm, rather than submitting to independent oversight.
Special mention must be made of the large specialised UN agencies, such as IOM, WFP and UNHCR. These organisations have, like no other non-governmental bureaucracy, amassed personal data files on tens of millions of people around the globe. Their data subjects surrender their biometric imprints (commonly a fingerprint or an iris scan) for a bar of soap, a sack of rice or a cash transfer, but also for a residence permit or the opportunity to be resettled in another country. Unfortunately, their policies for safeguarding such sensitive biometric data – and, importantly, the implementation of data protection – remain largely opaque. Take the data-sharing agreement that UNHCR concluded with the government of Bangladesh: Human Rights Watch alleges that personal data, including biometrics, collected jointly by UNHCR and the Bangladesh authorities from Rohingya refugees who have fled genocide in Myanmar, has been transmitted to the government of Myanmar. Or take WFP, which announced that it had signed a contract with Palantir, a company specialised in data analytics and part of the data-military-industrial complex in the United States. Palantir was recently criticised by Amnesty International for its failure to conduct human rights due diligence around its contracts. These and other data-related treaties, contracts, and practices of UN agencies escape public scrutiny and cannot be challenged in any national court because of the agencies’ diplomatic immunity. Yet various UN agencies routinely roll out large-scale data collection programmes in many countries whilst failing to ensure appropriate oversight bodies and recourse procedures for their data subjects.
When, two months ago, western military and civil organisations evacuated their personnel from Afghanistan, large amounts of sensitive personal data, including biometric data, were left behind. Only time will tell whether that data has been adequately protected and cannot be abused. Closer to home, the German Red Cross received the 2018 edition of the Big Brother Award from a civil society organisation for its digital system of asylum shelter management. The Red Cross software instituted humanitarian surveillance and total control of asylum seekers and refugees: movement tracking to and within the shelter, detailed recording of medical checks, food consumption, relationships, religious and ethnic affiliations, and much more. The question is not whether there are similar systems in different settings and places elsewhere in the world, but rather how many of them exist.
Looking ahead, we see the contours of different trends: Some civil society actors recognise the emergence of unforeseen risks relating to the use of personal data that have been collected in many different contexts. They now increasingly advocate for intensified discussions of approaches to responsible uses of personal, in particular biometric data. Indeed, the ICRC and Oxfam have defined red lines for the use of biometrics and discuss their data policies and practices with the people they protect and assist.
On the other hand, however, the trend towards surveillance and biometric overkill continues. The most worrying example is the use of DNA by the US immigration agency ICE. Here a precedent is set whereby biological features are used not only to uniquely identify an individual but also to establish the biological relationship between several persons. Meanwhile, UNICEF and the Gates Foundation are looking into ways of using biometrics for infant and child healthcare in Africa. Newly founded start-ups propose introducing biometrics for schoolchildren in Africa so as to monitor their school attendance.
But there is resistance too. Political organisations working among Rohingya refugees living in camps in Bangladesh are campaigning among their fellow refugees not to hand over their personal data, notably biometrics, to the United Nations and the Bangladesh authorities, out of fear that such sensitive personal data could end up in the hands of the military dictatorship in Myanmar. During the Covid-19 pandemic, the UN agency for Palestinian refugees in the Middle East (UNRWA) failed to introduce biometric ration and cash cards: the Palestinian refugees would have none of it and simply refused to cooperate.
And so it is not surprising that personal data can also become a weapon. A case in point is Yemen, where two competing governments fight a civil war for control of the country. Since 2019, WFP has rejected the lists of beneficiaries put together by the government based in the capital Sana’a, demanding that it be put in charge of biometrically registering those eligible to receive food aid. The competing government, supported by the US and Saudi Arabia, allowed WFP to biometrically register food aid recipients in its areas of control. As a result, the bulk of food aid goes to the areas where WFP was able to register the population.
Accordingly, widening current debates and including a broad range of stakeholders will be crucial as we move forward. When it comes to sensitive personal data and trust, charitable intentions, progressive mandates, and a non-profit posture are noble starting points, but unfortunately they count for little. What happens to the personal data of the so-called data subjects is what matters. For years NGOs have been discussing concepts like do no harm and accountability. Many claim that they are accountable to the people they assist, protect, and seek to empower, rather than to their donors. Now the opportunity has come to implement and showcase meaningful accountability: give those who are marginalised and in need of protection and assistance the digital rights they are entitled to.
This article was first published in German by Netzpolitik.org on 8 October 2021.
Members of the Scanning the Horizon community recently met online to continue our exploration of ‘tools for inclusive futures’, engaging methods to democratise futures conversations in organisations, using digital tools which do not require previous experience from either facilitators or participants. These tools have been highlighted in our recent Sector Guide on Strategic Decision-Making in a Whirly World.
This time, we wanted to find out more about Futures Frequency, from the Finnish innovation and futures fund Sitra. The idea behind Futures Frequency is that it inspires thinking and action towards positive, preferred futures and can be ‘used and applied by anyone’. You can check out an intro video here.
We decided to use it to explore futures of human diversity, and felt that a group of 9–12 is a good size to allow the discussion parts to take place in threes. No advance preparation was requested from participants – just encouragement to join with an open mind and be ready to ‘enjoy the ride’, go with the process, and put their heads in a different, more creative and playful space.
Setting the stage
We started with some relaxed individual reflection on the big ‘what if’ question – in relation to futures of human diversity in 2050 – which occurred to each of us. Then we introduced ourselves and our big question in plenary, and it was really interesting to see the different angles people had come up with – from gender fluidity, to intergenerational working with people living longer, to racism being history, to humans being seen as just another part of nature. Just this initial sharing encourages you to open up and expand your own thinking.
First stage, challenge your assumptions about the future
Then we had to activate our imagination muscles further by moving into the first main stage of the Futures Frequency method: challenging assumptions. We were given an audio drama snippet to listen to individually, and then as a small group we discussed what assumptions we heard in the piece and how it connected to our own assumptions or what felt familiar. This was a really interesting process to go through, surfacing both small assumptions or questions and bigger ones – about when in the future the conversation was set, or whether we were simply defaulting to the assumption that things in this future still worked in a similar way to the present. From a facilitation angle, you could either use one of the many supporting resources which Sitra provides for this, or create your own snippet – audio or written – linked to the theme you’re exploring.
This process highlights biases you weren’t aware of in your own thinking, and shows how your brain tries to ‘fill in the gaps’ around incomplete information about a situation. It also helps you better understand and appreciate how those you are working with are thinking. This would be particularly important in a very diverse group, or when exploring potentially sensitive topics together. This stage increases your awareness of why you think certain things, before you move on to imagining preferred futures.
Second stage, imagine your preferred futures
In this stage, you again start with individual reflection: imagining what the theme – for us, human diversity – might look like, without boundaries and with new possibilities, and envisioning a mental snapshot of the future you personally prefer, trying to engage different senses to bring this image to life. Then, moving into Miro or another digital whiteboarding space, each person in the group writes up their personal vision in one sentence on a post-it and shares it with the others. You then all work together to combine your (three) different visions into a new statement which integrates the main ‘spirit’ of each. We didn’t really have enough time for this, as we were primarily exploring the method rather than the topic, but in a full session this stage clearly needs a good amount of time. Again, Sitra provides templates for all of this.
Take action towards your preferred futures
The final stage involves thinking through actions you can take towards bringing this vision about. First, we were guided through an individual brainstorm to come up with ideas that would lead us to our vision – time was the creative constraint here. In our small groups we were then tasked with coming up with a news headline from the future which captured what had happened in the intervening period. We imagined we were living in 2030 and working as reporters for ‘Future News’, sharing our headline and a short explanation of the actions that had taken place, and answering questions from the other groups. We could add visual images to represent the story as well.
Final reflections on the method
It’s recommended to add further methods to this final phase if you want to build the process out into more detailed action planning – for instance, backcasting or futures literacy labs. But from a first experience, it really is a very useful way of getting participants into a different space to share ideas and inspire others, appreciating the diversity of perspectives in the group, and encouraging everyone to use their imaginations, within a simple but effective framework. It really does feel like a universal method which anyone can just pick up and use!