Who asks the questions and who responds? A broader and deeper assessment of The State of the Humanitarian System

ALNAP
Aug 25, 2022

By Alice Obrecht

Alice Obrecht, Head of Research at ALNAP and a lead author of this year’s upcoming State of the Humanitarian System report, explains the approach the team used to include more voices and views when measuring the performance of the system.

Aid workers in Dhaka, Bangladesh, surveying affected communities about the aid they receive and their awareness regarding COVID-19. Photo credit: Flickr/UN Women Asia and the Pacific

When I tell people I’ve just met that I work on research in the humanitarian aid system, the most common question I hear is, ‘So, does any of that aid accomplish anything?’ (immediately followed by ‘Which organisation is the best?’, which, as an employee of an impartial multi-stakeholder network, I typically decline to answer).

Interestingly, when ALNAP asked people who had received humanitarian support to share the big questions they had about aid, they had similar concerns. They wanted to know if aid was going to the right people and how much of it was diverted due to corruption. They wanted to know why they weren’t being offered more opportunity to influence decisions in aid organisations and wondered how these decisions were made.

It’s these questions and more that motivate ALNAP’s State of the Humanitarian System research series, the only regular assessment of performance in the humanitarian system.

For this year’s State of the Humanitarian System report, the fifth edition since 2012, three things stood out to us in the research process that I think have wider relevance for how knowledge is produced and used in the humanitarian system.

The State of the Humanitarian System report will launch on 7 Sept.

A more inclusive approach to research and evidence production

Every edition of the SOHS has relied on interviews and/or surveys with people supported by humanitarian assistance to understand their perspectives on the system and how well it is doing. But, as any researcher knows, whoever sets the research questions has significant control over the findings and how they are framed.

Therefore, to broaden our research framework from measuring the system against its own expectations to measuring its performance against the priorities of the people it supports, we consulted people supported by humanitarian assistance during the early design phase of the research, asking participants in three different response contexts what we should include and integrating their priorities into our framework. This led to a greater emphasis on the issues that matter most to them: targeting, corruption, do-no-harm, and accountability to affected populations. While some of these were expected, the strength of people’s desire to see us tackle targeting and corruption led to a much stronger emphasis on these issues in the report.

Read more in The State of the Humanitarian System: Inception report

We also partnered with the NEAR network and The Research People, working with a team of researchers who are nationals of crisis-affected countries, to undertake two country studies on localisation in Somalia and Turkey, and country-level interviews and focus group discussions in six locations: Bangladesh, DRC, Ethiopia, Lebanon, Venezuela, and Yemen. The dedication these researchers showed in uncovering key challenges with humanitarian aid in their contexts was inspiring, all the more so because of the personal risks involved, which led most of them to request anonymity as authors of their case studies in the main report.

Even with these measures, ALNAP was still, rightly, challenged by our research partners on how much of our pre-set research framework we were really willing to let go of. In response, for the first time we restructured the report to detach it from the OECD-DAC criteria and make it feel less like an assessment of the system against its own preferred expectations. But there is still a lot more we can do in this regard. Balancing the need to expand and reorient our perspective on performance against the desire to ensure comparability with previous SOHS reports will be an important challenge for us to address in the next edition.

Looking outside ‘the system’

Our working definition of the humanitarian system is:

‘The network of interconnected institutional and operational entities through which humanitarian action is undertaken when local and national resources are, on their own, insufficient to meet the needs of a population in crisis.’

These boundaries are fluid and there have always been examples that challenge where we draw the lines. While our assessment of performance remains focused on those who give and receive international funds for humanitarian crisis response, for the first time in this edition, we also took a more systematic look at other vital networks and sources of support for people in crisis.

Multiple key informant interviews and responses to aid recipient surveys illustrated the importance of actors who do not fit the traditional ‘humanitarian’ definition: survivors and community-led responders (sclr), diaspora groups, philanthropists, host communities and private-sector entities, among others.

What was surprising, though perhaps it should not have been, was how many examples of this kind of support came up in focus group discussions and in our aid recipient survey, reaffirming that formal humanitarian assistance is only one of many important sources of support in a crisis. Humanitarian researchers need to walk the talk on seeing and understanding these wider forms of support in order to help actors in the system better connect with them.

Data gaps and the future of the humanitarian sector

What this report cannot say is almost as important as what it does say. Persistent data gaps limit our ability to provide clear, definitive answers to key questions about humanitarian assistance:

  • How many people does it reach?
  • Does it save lives?
  • How cost-effective is it?

For this edition we went to new lengths to locate or generate this data. But addressing these gaps requires more resources and effort than a single research project can muster, even one as long-running and large in scope as The State of the Humanitarian System.

There have been repeated calls over many years for the humanitarian system to improve its evidence base. Resource pressures in the sector mean that knowledge production, monitoring and evaluation, and data quality and accessibility tend to get de-prioritised.

Our collective failure, as a sector, to properly resource this work risks undermining the humanitarian project as a whole. Better evidence could not only guide improvements; it could help to demonstrate the system’s value in the context of vast economic pressures.

As we come to the end of this edition of the State of the Humanitarian System, we hope that the next edition will be able to draw on richer and more accurate data.

The State of the Humanitarian System 2022 looks at the period from January 2018 to December 2021, and draws comparisons with previous editions to take a 15-year view. It assesses the size, shape and performance of the humanitarian system against key criteria over time. It is independent and based on evidence from frontline practitioners, crisis-affected populations, academics, policy-makers and donors. It draws on a mixture of qualitative and quantitative data from primary and secondary sources, including evaluation syntheses, quantitative reviews, surveys, interviews and focus group discussions, and longitudinal analysis of our unique 12-year dataset. Feedback and research outputs from affected populations form a significant part of the report.
