The concept of localisation has officially been at the top of the humanitarian agenda since the 2016 World Humanitarian Summit. It is, however, increasingly called into question by the idea of the ‘decolonisation of aid’, which starts from a different premise: ‘two different conversations [are taking place] simultaneously: a technical exchange on ways to improve humanitarian aid, and a moral debate on how to deal with the broader dynamics of geopolitical power which are the reason a country may need humanitarian aid in the first place’1.
Despite the growing importance in recent years of data management programmes2, the challenges they present seem at first sight somewhat removed from questions of governance or power relationships within the international solidarity movement. In a 2023 study of the topic (Changing the outlook: for a local approach to data3), CartONG demonstrated the crucial role of data in putting localisation into practice and more generally in changing power relationships within the humanitarian aid sector.
Bias and accountability run counter to attempts at localisation
Our basic vision of the sector is biased by our thinking about why and how we collect data. By way of example, a case study4 showed that while ‘3W’ (Who/What/Where) operational reports produced by OCHA (the UN’s humanitarian affairs coordination body) already minimised the importance of local actors in humanitarian interventions, these local actors disappeared entirely from key reports produced by the IATI (International Aid Transparency Initiative), because in the absence of financial indicators their presence was undetectable.
In the same way, the failure to take local languages into account – especially when devising and carrying out surveys and evaluations – introduces a fundamental bias into the data used as the basis for decisions on humanitarian interventions. The organisation Translators without Borders/Clear Global has supplied several examples of humanitarian projects where accountability, relevance or basic feasibility were called into question for lack of accurate translation into languages that beneficiaries understand5. Today the all-too-common practice is to rely on whoever happens to be on hand to translate questionnaires that are often quite complex, with no way of checking the quality of the translation; people may even simply be left out of surveys because no translation can be provided. This biases the data produced and leads to the systematic under-representation of certain population groups (women, people with a lower level of education, minorities …). A similar observation might be made about taking gender into account in surveys, although progress is being made on this.
These problems of bias are part of a humanitarian system in which upwards accountability carries ever more weight. Civil society organisations (CSOs) and national statistical institutes share this view: 73% of the CSOs surveyed by CartONG6, and respondents to a survey of 140 national statistical institutes in low- and middle-income countries7, report that the data collected is intended primarily for donors and international humanitarian actors, far more than for local government actors, civil society, or those who actually use data within these organisations.
International agencies (donors, the UN, major international CSOs) continue to take the lead in determining research projects and their objectives, the associated indicators and, often, the methods for collecting and managing data. Local actors (CSOs, research units, individual researchers, and the communities who are ultimately the most concerned) are confined to an implementation role and treated as little more than sources of data8. This situation affects all CSOs in the international solidarity sector, which are obliged, in the name of compliance, to adopt measures that run counter to humanitarian principles. The clearest example is screening, which some donors try to impose: verifying that humanitarian workers and suppliers, as well as the intended beneficiaries of CSOs’ interventions, do not appear on international sanctions lists. Such screening conflicts with the humanitarian values of impartiality and non-discrimination, and it creates a new set of challenges related to reliability, security and (mis)trust of the people concerned9. It was precisely this issue of screening, in other words a dispute over the responsible handling of data, that led to the most significant confrontation between French CSOs and their government, including the Agence Française de Développement (the French Development Agency), with some parties flatly refusing to talk to each other, until the Conseil d’État (France’s highest administrative court) was eventually asked to rule – which it did in favour of the CSOs10.
This type of bias can have serious consequences: funds intended to help oppressed people may fail to reach them11. More generally, it tends to create a vicious circle in which problems of access lead to less data being collected, needs are not properly identified, and funds are under-allocated as a result.
Neo-colonialism of data?
The international solidarity sector cannot ignore today’s global digital systems, dominated by a few major companies (American, for the most part). The transformation of our economic system by digitalisation and data (cf. debates on ‘surveillance capitalism’12 or the claim that ‘data are the new oil’13) is, in terms of the capture of resources, comparable in scale to the emergence of colonialism. Extracting personal data for private interests (and the way this process has been made invisible and normalised) follows the same logic as colonial extractivism14.
Experts therefore speak of ‘algorithmic colonisation’, which reproduces the methods of colonialism. The issue is essentially ‘the way technologies imported [sc. into African countries] by the West not only encode western values, objectives and interests, but are a positive hindrance to Africa’s own technologies, which better suit its needs’15. This is reflected in a failure to adapt solutions to local contexts, contempt for the people behind the data and blind trust in technology. Among the most striking examples of these new inequalities are the exploitation of the ‘clickworkers’ who keep artificial intelligence (AI) platforms running, at the cost of their own mental health16; the ‘plundering’ of biometric data from vulnerable people in crisis-torn countries like Argentina by the Californian start-up WorldCoin17; and the concentration in countries of the South of the environmental externalities of digitalisation (pollution from mineral extraction, toxic waste, etc.).
Behind the grand international declarations about achieving global consensus on technology in the service of development (such as the recent Global Digital Compact) lie sharply opposed interests: on one side the huge American companies, on the other states and civil society (the latter not only in the Global South); the former openly oppose the latter’s ambitions for digital sovereignty18. It is worth noting that the imposition of western norms is open to challenge even in its apparently least controversial features, such as the adoption of norms compatible with the European GDPR, whose essentially individual vision of rights is not shared by every culture19.
Artificial intelligence, currently hailed by many as revolutionary for the humanitarian sector, in reality does no more than shine a spotlight on the challenges that data presents. AI automatically reproduces the biases in its data and may even amplify them (an unintended consequence), while reducing transparency and thus the scope for correcting errors; it also, obviously, makes it even more difficult for communities to be involved in the use of their own data20. If we are not vigilant, AI will simply reinforce existing power dynamics and encourage techno-solutionism (for example, the temptation, where in-country collection is complicated by difficulties of access, to extrapolate data for that country from data on neighbouring countries, at the inevitable cost of methodological rigour).
Gender bias in AI is already well documented: most tools and methods have been devised by professions in which women are structurally under-represented, and the data that feed major AI learning systems such as ChatGPT are drawn from a corpus of knowledge that reflects the inequalities of western society, including gender inequality21.
A genuinely responsible approach to data
Dealing with bias and inequality calls for several structural changes in the organisation of international solidarity. First, we need to support the growing capacity of local actors to handle data-related challenges, taking their needs – rather than the demand for ever more accountability – as the starting point; this requires funding and training. Several organisations recommend investing as a priority in local data infrastructure (local administrations, civil registration and census systems, schools, hospitals …), thereby building a ‘backbone’ to support a wider data system, including for humanitarian aid22. They also recommend recognising the legitimacy of a sovereign statistical system developed by national actors, rather than setting up a parallel system, as the humanitarian aid sector too often does. We should also reverse the trend towards ever greater accountability, as discussed above. The need for change extends to project management, which requires a less narrow conception of indicators, with communities and local actors heard and given real weight in project development (which entails a change of approach to project financing, too).
Frameworks are available today to guide work on data inclusivity, including the Principles for Digital Development (which address data but also go further), the Inclusive Data Charter23 and the Data Values Project24. They all call for greater involvement of communities in defining data requirements and making use of data, as well as for more transparency, open data and opportunities to acquire data literacy. There are also examples of decentralised international networks structured to promote egalitarian ecosystems – such as the OpenStreetMap platform or Flying Labs – whose objective is to reduce the ‘power footprint’ (on the model of the ‘carbon footprint’) of actors from the Global North among their users25. Alongside initiatives like these, we also need to be willing to find ways of drawing on the potential of tomorrow’s humanitarian leaders and experts from the Global South.
CartONG’s new overview study on the challenges that data poses for the international solidarity movement (‘Beyond the Numbers: Balancing Innovation, Ethics, and Impact’26) shows that the structural transformations described above would benefit everyone. By listening and paying more attention to local communities when developing projects; by being more inclusive and thereby improving the quality of the data used for needs assessments; by making more use of qualitative, introspective approaches; by transforming our ways of collaborating and learning: in taking these steps, our sector would have far greater impact. The point is equally valid for mid-sized NGOs, particularly French ones, which risk being left behind in the present context of rapidly evolving technology and increasingly complex demands for accountability.
A sustainable approach to data must therefore find ways to address the following challenges: responsible data use (protection of personal data); cybersecurity; sustainability, in the sense of promoting affordable, resilient systems compatible with ecologically sound policies; inclusivity (accessibility, gender-awareness, sensitivity to local language requirements and to data literacy); and the wider issue of digital sovereignty. This approach should resonate with humanitarian actors wishing to apply the principle of ‘do no harm’ to data management, as well as with national and governmental actors in the Global South and a range of public-sector humanitarian actors in the Global North (especially Europe), all of whom are confronted today with similar digitalised systems. A ‘decolonisation of data’, or of digitalisation more widely, is needed if responsible, sustainable data are to materialise.
It is essential that discussions on the future of data, and on the use of digitalisation within the international solidarity sector, should not be confined to specialists, and that our sector should join the wider process of reflection on how to achieve responsible digitalisation. Otherwise, CSOs could find themselves at odds with humanitarian principles, perhaps without even realising it.
Martin Noblecourt, Senior Fundraising & Research Officer, CartONG
- Heba Aly. ‘Policymakers and racial justice activists came together to discuss decolonising aid.’ The New Humanitarian. August 2022.
- CartONG. ‘Program Data: The silver bullet of the humanitarian and development sectors? Panorama of the practices and needs of francophone CSOs.’ September 2020.
- CartONG, ‘Changing the outlook: for a local approach to data’. January 2024.
- Development Initiatives. ‘Improving the visibility of local and national actors in humanitarian aid data.’ 2021.
- Translators Without Borders. ‘Listen and learn: The link between language and accountability for the future of the Grand Bargain.’ June 2021.
- CartONG. ‘Beyond the Numbers: Balancing Innovation, Ethics, and Impact.’ October 2024.
- Mihir Prakash, Tanya Sethi. ‘Measuring and responding to demand for official statistics.’ AidData. December 2018.
- Mahad Wasuge, Ahmed M. Musa, Tobias Hagmann. ‘Who owns data in Somalia? Ending the country’s privatised knowledge economy.’ Somali Public Agenda. July 2021.
- CartONG. ‘Screening and accountability. Responsible data management toolbox.’ June 2023.
- Coordination SUD. ‘Annulation des lignes directrices en matière de criblage par le Conseil d’État.’ (The annulment by the Council of State of guidelines relating to screening.) February 2023.
- Mariam Ibrahim, Fionna Smyth, Claudia Wells, Euan Ritchie. ‘When the data doesn’t tell the full story: improving gender-responsive climate finance.’ Development Initiatives Blog. November 2023.
- Shoshana Zuboff. ‘L’âge du capitalisme de surveillance.’ (The age of surveillance capitalism.) Editions Zulma: October 2020.
- The Economist. ‘The world’s most valuable resource is no longer oil, but data.’ May 2017.
- Nick Couldry, Ulises A. Mejias. ‘Making data colonialism liveable: how might data’s social order be regulated?’ Internet Policy Review, 8(2). May 2018.
- Abeba Birhane. ‘Algorithmic Colonization of Africa.’ Imagining AI: How the World Sees Intelligent Machines. Oxford Academic: 2023.
- Marion Douet. ‘Au Kenya, des « entraîneurs » de ChatGPT s’élèvent contre leurs conditions de travail.’ (In Kenya, ChatGPT “trainers” protest their working conditions.) Le Monde. October 2023.
- Louise André-Williams. ‘De l’argent contre des données biométriques : la start-up américaine qui profite de la misère.’ (Money in exchange for biometric data: the American start-up that profits from extreme poverty.) Médiapart. March 2024.
- Stephen Chacha & Bill Anderson. ‘Digital Compacts: Global ideals, regional realities.’ Development Initiatives. September 2024.
- Siddharth Peter de Souza, Hellen Mukiri Smith & Linnet Taylor. ‘Decolonial Data Law and Governance.’ Technology and Regulation. 2024, pp. 1-11.
- Op. cit. See note 6 above.
- Linda Raftree. ‘How can we apply feminist frameworks to AI governance?’ MERL Tech. September 2023.
- Bernard Sabiti, Bill Anderson & Sam Wozniak. ‘The data side of leaving no one behind.’ Development Initiatives. September 2021.
- Global Partnership for Sustainable Development Data. ‘Inclusive Data Charter.’ 2018.
- Data Values Project. ‘The #DataValues Manifesto: Demanding a fair data future.’ 2021.
- WeRobotics. ‘Here’s How We Expanded Locally Led Action to Shift the Power.’ March 2024.
- Op. cit. See notes 6 and 20 above.