Prolonged humanitarian crises, the rise of generative artificial intelligence, the use of disinformation to polarise societies and manipulate elections, the suppression of civil society by state and non-state actors, and decreasing funding are just a few of the ever-growing challenges that social justice, humanitarian, and development organisations must contend with. As progressive and rights-based civil society organisations – from the local to the international level – grapple with the crises of the past few years, the International Civil Society Centre once again had the honour of curating its annual conference, Global Perspectives, around these topics. “Global Perspectives – Moments of Truth” took place on 9 November and brought together hundreds of online participants across five sessions, not just to ponder these challenges but to share concrete examples and explore ideas on how we can tackle them collaboratively.
Though each session was independently curated, three overarching themes emerged.
The Future of Civic Space is Now
Anticipating the factors that will constrict civic space a decade from now demands our attention today. Civic space has been in decline and is likely to continue on this trajectory. Are we adequately addressing the issues that will shape our societies – and our civic engagement – over the next decade? In the session “Learn From and Engage on Futures Scenarios for Civic Space”, participants learned about the outcomes of the Centre’s “A History of Civic Space 2024-2034” exercise, in which representatives from 15 civil society organisations collaborated to develop possible future scenarios for civic space. Session participants engaged with the scenarios and identified actionable steps to either advance desirable outcomes or prevent undesirable ones. For example, a likely scenario in which artificial intelligence (AI) first enables a great deal of good work at scale and then backfires on civil society as an “obstacle to progress” highlighted the urgency of getting into the AI game now. Later in the day, the “Digital Dialogue – AI: Solution or Threat to Mis-/Disinformation?” drove the point home: two scholars, Liz Orembo from Research ICT Africa and Admire Mare from the University of Johannesburg, called on civil society organisations to address AI now – as governance advocates and watchdogs, and by helping to increase media literacy. This is especially needed given the number of key elections coming up next year, in which we will likely see sophisticated disinformation campaigns. Henry Parker from Logically informed us that there is great potential to use AI to identify disinformation campaigns and hold the responsible actors to account. During “A Sector Conversation”, Stéphane Duguin from the CyberPeace Institute warned that we need to create a comparable countermodel if we wish to increase our capacity to oppose disinformation campaigns.
Read their approach to the responsible use of artificial intelligence here, and watch this space: we are launching our Sector Guidance on Mis-, Dis-, and Mal-information: Insights and Foresights in early 2024.
Two sessions, “The Truth is in the Telling” and “Exploring Personal Realities (of Marginalisation)”, delved into the importance of representation. Insights from individuals working with and identifying as members of marginalised communities underscored the need for more direct dialogue with those in power. Nana Afadzinu from WACSI emphasised in “A Sector Conversation” the need for introspection and acknowledgement of systemic inequity. Festus Odingo from the SDG Kenya Forum stressed the significance of partnerships as a key force for change, and how they can broaden the scope and effect of community-based initiatives. Representation, of course, also happens through communication pieces – donor reports, flyers, fundraising advertisements and much more. Undeniably, communication about Global North-financed Global South projects has played a big part in manifesting white saviourism and entrenching power imbalances. By now, several organisations have begun to examine this reality and make changes. The speakers presented the Ethical Story Telling Guideline, a toolkit that PATH and Metro Group DRC contributed to, which can assist organisations in becoming more ethical through, for instance, concrete adjustments to the planning process. Communication audits, such as the ones conducted every two years by CARE International, can be useful in holding teams accountable and providing incentives for improvement. As part of its bottom-up strategy to alter various communication channels, CARE has made significant efforts to maximise informed consent and minimise unconscious bias. Yet there remain incredibly difficult dilemmas when organisations must weigh communication subjects’ agency against their safety, for example when portraying female CSO workers in Afghanistan. The emphasis is on respect.
A key takeaway from our sessions: aim to do no harm, but accept that failing forward is inevitable as we push one another to improve and recognise ethical storytelling as a fundamental component of power shifts within the sector.
Weathering a Perfect Storm
This year our speakers highlighted a shrinking civic space, humanitarian crises piling up and worsening, colonial structures still fostering inequity, and growing cybercrime and disinformation adding to the complexity. All of this is happening in the face of a challenging global economy and a sharp decline in funding for our causes. Are we experiencing a perfect storm? Yet, for most in the sector, there is a firm resolve to plough on despite difficult circumstances. There is no alternative. It has been inspiring to see leaders from ICVA, CIVICUS, WACSI, the CyberPeace Institute and others sharing resources, knowledge and honest invitations to collaborate more, helping each other to overcome our deficiencies and capitalise on our respective expertise and strengths.
To continue weathering the storm, the International Civil Society Centre will keep bringing attention to the incredible innovations being developed in the field. As Mirela Shuteriqi from ICVA said in her closing statement, we must also transform ourselves. We must encourage a culture of change and push for reforms at the UN level, using this as a chance to collaborate and tackle social justice issues. We remain dedicated to facilitating dialogue, sharing innovations, and fostering collaboration within the sector. It is through collective determination, thoughtful introspection, and ethical storytelling that we can face the challenges that lie ahead, transform ourselves, and forge a path towards a more just and equitable future. The journey is ongoing, but together, as a united force, we embark on it with unwavering resolve.
Special thanks to all our speakers – Jennifer Abomnger, Nana Afadzinu, Stéphane Duguin, Patrick Gathara, Arnold Gekonge, Eva Gondor, Heather Hutchings, Wolfgang Jamann, Lysa John, Hussam Joudah, Admire Mare, Shalini Moodley, Patricia Mugenzi, Levis Nderitu, Nana Nwachukwu, Festus Odingo, Elizabeth Orembo, Henry Parker, Neha Rayamajhi, Mirela Shuteriqi, Clare Spurrell, David Verga, and Rachel Wilkinson.
For the first time in four years, the Centre convened its Leading Together conference in person again. Leading Together is our annual space for the global directors of the ICSO divisions of Human Resources, Policy/Advocacy, and Programmes. These groups hold parallel peer-group discussions as well as joint sessions on topics that concern them all. This year, the Scanning the Horizon community of futures-focused senior sector professionals also joined the group. We were thrilled to welcome participants on our home turf in Berlin and to spend 48 hours learning, debating, and reflecting. The Centre team is busily following up on the 47+ ideas and insights generated during the event, and we would like to share three insights that were key to us:
At Global Perspectives 2022, we heard and stressed that anticipatory capacity in (I)CSOs is a collective muscle we need to exercise constantly. We took this advice and focused peer and joint sessions on the topic: in discussions with Russell Reynolds on the role of leadership and on using AI for the good of ICSOs, as well as in sessions on shaping the future through participatory strategy making, participants engaged with trends and with how to “organise futures”. The Policy/Advocacy Directors discussed the future of human rights diplomacy with David Griffiths, Associate Fellow at Chatham House.
We also challenged participants with freshly generated scenarios from a collective exercise (ParEvo) the Centre has just concluded. Participants had to discuss and reflect on how civil society (organisations) might deal with and shape civic space after a series of mega-tsunamis hits the world and severs all IT infrastructure. While some scenarios stretched participants’ willingness to suspend disbelief, many highlighted the exercise as an important encouragement to imagine futures differently. A series of mega-tsunamis would throw the world into disarray (not unlike a global pandemic) and might primarily demand our crisis-response capacity. However, spending time on creating long-term visions for different futures can put us, as civil society sector professionals, in a different kind of driving seat, rather than racing to cope with the futures narratives that others – usually more powerful actors – are creating.
What was particularly enriching at this year’s conference was the participants’ willingness to engage with the substance of discussions, their openness to critical challenge, and generally a learning mentality. We tried not to gloss over differences with buzzwords like “power shift” or “decolonising” but acknowledged the complexity of the matters we deal with – and that we may get some things right and others wrong along the journey. Similarly, a joint discussion between the Programme and Policy/Advocacy directors, in exchange with AWID, on anti-rights groups and the threat they pose to civic space was exemplary of constructive engagement: participants brought great nuance to the discussion and – it might sound like a cliché – embraced the diversity of viewpoints and showed patience in understanding each other’s individual and organisational contexts. These high-quality discussions were incredibly enriching and displayed a high degree of collective responsibility for advancing as a sector.
A few years back, someone said that “‘power shift’ is the water we all swim in”. This was certainly true for Leading Together. In so many sessions, participants explored topics arising from our journeys to become organisations that are at least more power-aware, or that even mirror the decolonised, equitable and just society we want to see. It was hugely encouraging to see the spread of organisational initiatives and the degree to which ambitions for change are permeating organisations: to learn from WaterAid’s participatory strategy-making journey, and to engage with Superrr Lab on what it takes to break Western-centred views of futures making. In a similar vein, Mission Talent and the cohort of Human Resources directors discussed the challenges and possibilities our changing sector holds for building more diverse organisations; the Programme Directors explored with Comic Relief ways to work differently with bilateral donors to enable more equitable partnerships; and the Policy/Advocacy Directors are already experiencing how the shifting mandates of ICSOs bring increased expectations for their departments. Senior leaders from the ICSOs are demonstrating resolve, yet acknowledge that these are uncharted waters where peer exchange, inspiration and support are just what is needed.
If you are also an ICSO senior leader and you want to learn more about our offer, do reach out. We already look forward to the next round of Leading Together in 2024 – online – and in-person in 2025!
Karl Steinacker and Michael Kubach are digital experts based in Berlin, Germany. This article is a reflection on their recent teaching assignment, “Digital Transformation and Social Justice”, at the Alice-Salomon University in Berlin, Germany.
During the 1990s, the world of public administrations and civil society organisations (CSOs) was awash with Change Management Teams. Humanitarian and development organisations, in particular, were tasked to professionalise their work, to overcome charity, and to create impact. The backbone of the process was the introduction of the Logical Framework, or LogFrame. This tool was originally developed for the US military and then adapted by NASA for its space programmes. After it was adopted by USAID and GTZ, an entire nomenclature for the humanitarian and development sectors was built around it: Results Based Management.
One of the authors even proudly joined a Change Management Team and worked on what were then the buzzwords: Modernisation and Innovation. At the top of the list of taboo concepts stood the term Welfare. In fact, the task was to implement neo-liberal policies at the level of people living in extreme poverty. In this context, the idea of unconditional support to ensure that members of a society can meet basic human needs, such as food and shelter, was simply rejected. The measure of support was not what a society could possibly afford but a revised paradigm of social justice aiming to reward effort and measure effectiveness. Management manuals were updated with an array of re-defined terminology stretching from authority to responsibility and, most importantly, accountability, to ensure everybody’s compliance with these new policies.
Our journey into this realm of non-profit business administration would last for many years. The first task was to develop indicators: performance and impact indicators, at times SMART, later KPIs, but above all measurable. Thus datafication started, on Excel sheets. For organisations mandated to protect and assist individuals, group statistics were no longer deemed sufficiently precise. Registration systems had to change too: individual registration was to replace all systems that had been developed and used to provide collective support to families or other groups.
Think tanks and govtech consulting firms were always eager to help, for a fee: the digitalisation and datafication of the social sphere would replace blanket statistical categorisations and allow for a more precise documentation of the individual. This idea of fine-tuning corresponds to the sense of justice of many people working in the aid sector: a detailed analysis is much fairer than stereotypical evaluations.
While the early days of social statistics were characterized by the attempt to define meaningful categories, the growth of personalized data means that individuals are no longer counted as part of one big class, but increasingly as social-statistical singularities. The use of metrics and algorithms allows previously overlooked, hidden or willfully ignored differences to be identified, and hence utilized.
Twenty-five years on, we find ourselves in a university classroom. Students, soon to enter the job market in the social sector, are confronted with our professional experience as we discuss digital transformation and social justice. But the learning curve is steep for both students and lecturers. For the latter, having stepped back from limited areas of responsibility in a single set of organisations, the bigger picture emerges: yes, individual registration is indeed an empowerment tool, providing an identity to somebody who is otherwise invisible, excluded, unbanked, and unable to participate in society and the (digital) economy. However, it also allows for humanitarian and social surveillance – not as an abstract possibility but as an everyday reality.
Today, authorities all over the world are experimenting with predictive algorithms. That sounds technical and innocent, but as we dive deeper into the issue, we realise that the real meaning is rather specific: fraud detection systems in social welfare payment systems. In the meantime, the hitherto banned terminology has made its comeback: welfare and social safety nets have, for some years now, been en vogue again. But in the centuries-old Western tradition, welfare recipients must be monitored and, if necessary, sanctioned, while those who work and contribute must be assured that there is no waste. So it comes as no surprise that even today’s algorithms focus on the prime suspect: the individual fraudster, the undeserving poor.
Fraud detection systems promise that the taxpayer will no longer fall victim to fraud and that efficiency gains can be redirected to serve more people. Yet the true extent of welfare fraud is regularly exaggerated, while the costs of such systems are routinely underestimated. A comparison of the estimated losses and the investments does not take place; what prevails is the principle of detecting and punishing fraudsters. Other issues do not rank high either, for example how to distinguish between honest mistakes and deliberate fraud. And the more time caseworkers spend entering and analysing data in front of a computer screen, the less time and inclination they have to talk to real people and to understand the context of their lives at the margins of society.
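The missing cost-benefit comparison can be made concrete with a back-of-envelope sketch. Only the fraud-rate estimates below come from this article’s sources (national auditors: 0.2–0.4 per cent of benefits spending; consulting firms: up to nearly 5 per cent); the benefits budget and system cost are hypothetical round numbers invented purely for illustration.

```python
# Back-of-envelope sketch of the comparison that "does not take place".
# Fraud rates are from the article's sources; the budget and system
# cost are hypothetical figures, not data about any real system.

benefits_spending = 10_000_000_000  # hypothetical annual benefits budget (EUR)
system_cost = 40_000_000            # hypothetical annual cost of a fraud-detection system (EUR)

estimates = {
    "auditors (0.2-0.4%)": 0.003,   # midpoint of the auditors' range
    "consultants (~5%)": 0.05,
}

for label, rate in estimates.items():
    losses = benefits_spending * rate
    ratio = system_cost / losses
    print(f"{label}: estimated fraud EUR {losses:,.0f}; "
          f"system cost = {ratio:.0%} of those losses")
```

Under the auditors’ estimate, this hypothetical system costs more than the entire fraud it could ever recover; only under the consultants’ far higher figure does it look like an obvious saving – which is precisely why the choice of estimate matters.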
Thus it can be said that hundreds of thousands of people are routinely being scored. Take Denmark: there, a system called Udbetaling Danmark was created in 2012 to streamline the payment of welfare benefits. Its fraud control algorithms can access the personal data of millions of citizens, not all of whom receive welfare payments. In contrast to the hundreds of thousands affected by this data mining, the number of cases referred to the police for further investigation is minute.
In the Dutch city of Rotterdam, data on 30,000 welfare recipients are investigated every year in order to flag suspected welfare cheats. An analysis of its machine-learning-based scoring system, however, showed systemic discrimination with regard to ethnicity, age, gender, and parenthood, and revealed evidence of other fundamental flaws that make the system both inaccurate and unfair. What might appear to a caseworker as a vulnerability is treated by the machine as grounds for suspicion. Despite the scale of data used to calculate risk scores, the output of the system is no better than random guesses. Yet the consequences of being flagged by the “suspicion machine” can be drastic, with fraud controllers empowered to turn the lives of suspects inside out.
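What “no better than random guesses” means for a risk-scoring model can be shown with a small, self-contained sketch – an illustration of the metric, not of the Rotterdam system or its data. A common yardstick is the AUC: the probability that a randomly chosen true fraud case is ranked above a randomly chosen non-fraud case. Scores that carry no information about the labels yield an AUC of about 0.5, i.e. coin-flip performance; all names and numbers below are hypothetical.

```python
import random

def auc(scores, labels):
    """Probability that a randomly chosen positive (fraud) case receives
    a higher risk score than a randomly chosen negative case; ties count half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

random.seed(0)
# Hypothetical population: roughly 3% of 5,000 cases are actual fraud.
labels = [1 if random.random() < 0.03 else 0 for _ in range(5_000)]
# "Risk scores" drawn independently of the labels: no real signal at all.
uninformative_scores = [random.random() for _ in labels]

print(round(auc(uninformative_scores, labels), 2))  # close to 0.5: coin-flip ranking
```

A perfect ranker would score 1.0 on this measure; a system whose elaborate risk scores land near 0.5 performs no better than the uninformative scores above, however much data went into computing them.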
As reported by the World Bank, the recent Covid-19 pandemic provided a great push to implement digital social welfare systems in the Global South. In fact, for the World Bank, so-called Digital Public Infrastructure (DPI), enabling “Digitizing Government to Person Payments (G2Px)”, is as fundamental for social and economic development today as physical infrastructure was for previous generations. Hence, the World Bank finances systems around the globe modelled after the Indian Aadhaar system, in which more than a billion persons have been registered biometrically. Aadhaar has become, for all intents and purposes, a precondition for 800 million Indian citizens to receive subsidised food and other assistance.
Important international aid organisations are behaving no differently from states. The World Food Programme alone holds data on more than 40 million people in its SCOPE database. Unfortunately, WFP, like other UN organisations, is not subject to data protection laws or the jurisdiction of courts. This makes the communities they work with particularly vulnerable.
In most places, the social sphere will become the metric, with algorithms determining the operational conduit for delivering, controlling and withholding assistance, especially welfare payments. In other places, the power of algorithms may go even further, as part of trust systems, creditworthiness assessments, and social credit. Social credit systems for individuals are highly controversial, as they aim to track behaviour beyond financial solvency and therefore require mass surveillance. The social credit score of a citizen might suffer not only from incomplete or inaccurate data, but also from assessments of political loyalty and conformist social behaviour.
Hence, the question becomes urgent: what is social justice in a metric society, which role will CSOs play in this emerging environment, and what will be their raison d’être?
Indian CSOs have chosen a pragmatic approach and engaged to ensure that these systems work in favour of those in need. They have analysed and strategised and, on the basis of their principles and values, endeavour to improve the newly emerging systems. Thus, some are engaged in advocacy and in political and even legal battles to ensure that data protection measures are enacted and implemented. Others assist vulnerable individuals to register and claim their entitlements, turning low-resolution into high-resolution citizens. Yet others run educational programmes to teach users about the new digital environment and their rights therein.
Reflection and discourse might need to go even further and look at transparency and other societal issues, in particular with regard to algorithms. The city of Rotterdam was the only one of several cities that agreed to a third-party review of the algorithms deployed for fraud prevention. Hence, there is a vast area still to be charted: from the preservation of confidentiality, copyright and intellectual property rights to the demand for transparency where appropriate.
Lately it has been suggested that anti-racism and decolonisation will fundamentally change the long-term strategic thinking of important civil society organisations and that this would require developing concrete performance metrics and progress indicators. This shows that the issue will not go away: should those advocating and working on counter-power use the same methodologies and tools as those currently holding power?
Our own personal story, from LogFrame to Logarithms, provides a number of lessons. The most important one is to understand the basic concepts underpinning the digital transformation of our societies: power, social welfare in a market economy, individuals and their identity, and networks and platforms. Accepting that there is no escape from the megatrend of digital transformation – neither for the individual nor for the societies we live in – means that engaging with these new technologies remains imperative. For the organisations that constitute civil society, even more is at stake: it is a race for relevance. Wherever human rights and social justice issues are at stake, and whether an organisation focuses on advocacy or service delivery, what is needed are concepts and strategies to actively shape the ongoing transformation based on a clear idea of values and rights. Failing to do so will leave it to the govtech industry to come up with and implement its solutions unimpeded.
Steffen Mau, The Metric Society, 2019, p. 167.
Alice Salomon Hochschule, Berlin, Digital Transformation and Social Justice, part of the ICM programme.
Suspicion Machines – Unprecedented experiment on welfare surveillance algorithm reveals discrimination, in: Lighthouse Reports, 6 March 2023.
Consulting firms talk about welfare fraud of up to nearly 5 per cent of benefits spending, while some national auditors’ offices estimate it at between 0.2 and 0.4 per cent. Ibid.
Suspicion Machines, see footnote 5 above.
Ranjit Singh, Steven Jackson, Seeing Like an Infrastructure: Low-resolution Citizens and the Aadhaar Identification Project, in: Proceedings of the ACM on Human-Computer Interaction, Vol. 5, No. CSCW2, Article 315, October 2021.
International Civil Society Centre, Sector Guide #2: Strategic Decision-Making in a Whirly World, July 2021, p. 26.