Not just no data, but also the wrong type of data
Readers who have had anything to do with the Global Fund to fight AIDS, Tuberculosis and Malaria will know that there is about to be a huge flurry of activity, as the Fund prepares to launch its new funding model after a hiatus of a couple of years.
Among many, many other things, the new model emphasizes the need to make sure that programmes are based on sound epidemiological data. In principle this is a good thing: it is a way of making sure that programmes are designed to reach the right people, the people who, particularly in the context of HIV and AIDS, have been overwhelmingly neglected and marginalised.
But, as Stef Baral wrote on this site a few posts back, the same systems and prejudices that lead to the neglect of groups such as sex workers, men who have sex with men, transgender people, and people who use drugs, also operate to ensure that we don’t really have good data. Perversely, we end up in a situation where no data = no programmes.
That’s not the only problem, however. The types of data generally considered “robust” and valid for strategic and programming purposes tend to be fairly limited in the insights they provide: we’re talking about estimates of the sizes of different population (or “at risk”) groups, and estimates of HIV prevalence rates among those groups. While this information, if available, can help to indicate where the majority of new infections are occurring in a given country or area, it tells us little or nothing about the reasons why, or about the challenges that highly affected groups are facing. It reduces these groups to epidemiological risk factors.
Community-led research can help bridge some of this gap. It may not provide the hard numbers that decision makers want as a basis for funding allocations, but it is an excellent way of reframing priorities that can help improve how services and programmes are designed.
Sex worker led research in Namibia
A few years ago, UNFPA and UNAIDS wanted to do some qualitative research to look in more detail at what was going on for sex workers in Namibia. There had been very little research on sex work in the country, and most of the programmes designed to support sex workers were framed around a very narrow HIV focus: information, condoms, and cajoling or even coercing people to get tested and have STI check-ups, with no attention to issues like violence, discrimination, and insecurity.
Although I’m a big user of epidemiological research (quantitative and qualitative), in this context it wasn’t particularly feasible (given the resources available) or appropriate to treat this as a classic research project, with publication in a peer-reviewed journal or changes to national policy as the ultimate goal. What seemed more important, given that a major new HIV programme aimed at sex workers was about to be launched, was to document some of the specific situations in the towns the programme was going to target, to help influence the sorts of things that get addressed, and to identify any gaps in the programme. Moreover, there were quite a few sex workers in Namibia who were already very involved in community work, whether in relation to HIV or more broadly, and we wanted to help them get even more involved.
So we decided to provide some introductory training on one qualitative method – focus group discussions – and got them to think through what sorts of issues their colleagues might want to discuss. We used those suggestions to develop a guide, and sent them out to conduct their own research. Unsurprisingly, when they developed the guide, they did not start with a list of questions about “condom negotiation” or “access to HIV testing”. They started with questions about violence, abuse by the police, and discrimination in health services. This is important because the whole idea behind the work we did in Namibia was to move away from the standard survey approaches that ask sex workers the same standard questions about condom use and access to services, and to give sex workers space to talk, among other things, about the issues that HIV programmes aren’t helping them with and maybe even won’t help them with.
The report describes the results in detail. It also describes the limitations, of which there are many. Although I remain adamant that the purpose of this activity was never to extract data that would tell the whole story and represent the realities of sex workers throughout Namibia, some common themes come out of each of the five towns. But there are also differences, and it’s the differences that interest me. I wanted to give people an opportunity to discuss and think about what was going on in their own towns, and what, practically and immediately, might be done to fix some of the problems in each town. And to an extent, I think that’s what we got. It’s not generalisable; in fact the results from each town are probably very biased. We know, for instance, that in most of the towns we failed to talk to any male or transgender sex workers. But if the biases and their relevance to each town are recognised, and the findings are used to achieve positive change in each town, then that’s OK.
We also got a team with a new set of skills, who could do the same thing again, or can replicate it in other towns, or – why not – help other marginalised groups like men who have sex with men, migrants, or people living in slums do the same thing. The team has, on a number of occasions, used the findings to frame their input into national discussions about HIV programming – so this research, in a very real way, helped to plug the data gap.
Maybe “research” is the wrong term for using research techniques in creative ways like this. This participatory approach isn’t new to community development work: far from it. It isn’t new to public health researchers either; practitioners have been advocating it for decades. But it remains a marginal rather than a mainstream practice. When there’s no data, however, this sort of research is an excellent starting point for filling the gap.