Understanding how evidence informs policy is essential for designing and evaluating interventions that aim to deliver better-informed policy.
By and large, the frames we use to understand evidence-informed policy rely on linear or logical models. The use of terms like “bridge”, “broker”, “translate”, “to” or “from” to describe the relationship between research and policy illustrates our conceptual limits.
One useful alternative frame through which to examine this process is the concept of information density, originally proposed by Peruvian intellectual Mirko Lauer. This concept refers to the concentration of knowledge, actors, and institutions that produce, disseminate, and apply evidence within a particular geography, sector, or policy domain.
This concept is consistent with the marketplace of ideas metaphor – a high-density context would be akin to a highly competitive and well-functioning market.
The density of information strongly influences the degree to which evidence can shape policy.
In high-density contexts — such as economic policy in most countries — a well-developed ecosystem of universities, think tanks, research centres, international agencies, media organisations, popular literature and the private sector contributes to a dynamic environment in which knowledge flows readily as it is produced, used and disseminated throughout the system. In such settings, debate and peer review are common, and policy decisions are more likely to be based on diverse and rigorous sources of evidence. Even when policy mistakes occur, they are often corrected swiftly (or at least contested) through scrutiny and feedback.
As in a well-functioning market, high-density contexts enjoy little information asymmetry, many producers, intermediaries and users, low barriers to entry and exit, an efficient allocation of resources, trust in the rules, minimal externalities, etc.
By contrast, low-density contexts — which may include neglected policy areas such as disability inclusion, gender-responsive budgeting, or subnational health services, as well as certain geographies — lack a critical mass of relevant expertise and institutions. These contexts may suffer from fragmented funding, weak research and decision-making infrastructure, and limited political or media interest. As a result, policy decisions in these fields may be driven by anecdote, political pressure, or isolated studies rather than robust, systematically generated evidence. In this context, poor policy can persist unchecked.
High-density contexts are not guaranteed to deliver better results, but then again, nobody can guarantee this.
What to do?
As On Think Tanks’ work has highlighted, improving the use of evidence in policy requires a context-sensitive approach to capacity development. There is no universal solution; instead, interventions must be tailored to the characteristics and needs of the local evidence system.
But frameworks to guide context-sensitive interventions are still useful. And these frameworks need to relate to observable actions and indicators.
In a recent conversation, Octavio Gómez Dantés offered a way to go from talking about information density (or field building, as he described it) to action. Gómez Dantés suggests that:
- Low-density contexts are akin to spaces with an underdeveloped, weak or infant field, and the focus must be on building capacity – from the ground up. This might involve supporting the emergence of new research chairs, research programmes or think tanks; developing a cadre of early-career researchers; investing in data systems; and fostering new communities of practice. In low-density contexts, there isn’t a research community or field to talk about – yet. It has to be built.
- In medium-density contexts, where relevant actors and evidence exist but have limited influence or coordination, the goal should be to strengthen their capacity. This includes improving research quality, fostering collaborations, and enhancing the ability of evidence producers to engage effectively with policymakers, civil society organisations, and the media. In medium-density contexts, the research community or field exists but is still emerging.
- In high-density contexts, the priority becomes enhancing performance. This involves encouraging innovation, ensuring transparency, incorporating citizen perspectives, and promoting diversity within the evidence base. Even in mature systems, challenges around credibility, independence, and elite capture persist. There is always room for improvement.
Both Lauer’s and Gómez Dantés’ conceptualisations imply a need to prioritise expectations of capacity and impact.
For example, in low-density contexts where capacity still needs to be built, does it make sense to push for and demand short-term policy impact? In the absence of sufficient research-based evidence, intermediaries, informed debate and capacity to use evidence in government, it would be unreasonable to expect policies to be better informed (I would argue it may even be dangerous to demand it, as the system is not yet ready to qualify and use evidence responsibly).
Evidence use can be a desirable secondary outcome, but should not be the primary expected impact of any intervention.
The same prioritisation of expectations is found in IDRC’s Research Quality Plus (RQ+) framework for assessing the quality of research. Instead of comparing research produced in different contexts (e.g. high-density and low-density) like for like, IDRC advocates for a careful consideration of the conditions in which research is produced. After all, having access to robust data, financial resources, an extensive peer network, and institutional support can greatly affect the results of research. Still, none of these define the quality of the research itself.
But strengthening evidence-informed policymaking requires a systems perspective that looks beyond individual organisations and considers their relationships. This involves engaging a wide range of actors: funders, universities, civil society, the media, and policymakers themselves. It involves shaping incentives, embedding new norms, and building long-term relationships between all of them – ideally based on trust.
Hence, these three levels of investment can be applied to all actors in the system – not just researchers and their work. Gómez Dantés’ focus was on researchers and research institutions, but the same approach can be used to invest in policymaking bodies, research funders and scientific councils, political parties, etc. In fact, Mirko Lauer’s reflection on density was in response to a question we posed to him on the role of the media in promoting evidence-informed policy; hence, support for the media’s intermediary capabilities could follow the same logic.
Furthermore, the approach allows for cross-cutting considerations such as equity: density and capacity may increase in overall terms but at the expense of certain voices (e.g. women or the most vulnerable communities).
Moreover, this kind of intervention demands patience and sustained investment. It is not a linear process – political dynamics, institutional arrangements, and knowledge demands evolve and suffer setbacks over time. Support for evidence systems must therefore be adaptive, strategic, rooted in deep contextual understanding, and sustained over the long term.
By recognising and responding to different levels of information density or field development, and by fostering collaboration across diverse actors and disciplines, we can maximise long-term sustainable change. In doing so, we pave the way for a more inclusive, effective, and democratic policymaking process — one that better reflects the complexity of the challenges we face and the plurality of perspectives needed to address them.
In practice
I took advantage of my conversation with Octavio Gómez Dantés to ask about the types of indicators that could help monitor and assess the impact of interventions to promote evidence-informed policy in different contexts. Using the example of health systems in Latin America, he suggested the following first draft:
- Low density: For countries in the initial stages of health systems research development, such as those in Central America, capacity building indicators could include the number of researchers, supporters, and departments of research institutions devoted to teaching health systems.
- Medium density: For countries like Costa Rica and Panama, potentially in this “middle stage” of field development, relevant indicators of capacity strengthening could include the number of health systems teaching programmes, research centres with programmes on – or even devoted to – health systems, and perhaps academic journals that routinely publish on the issue.
- High density: For countries with more robust health systems and a mature health systems research community, such as Chile, Brazil and Mexico, more sophisticated indicators of performance enhancement are appropriate. These could include the financial investment in health research systems and the impact factor of health systems journals published. He also suggests that the presence of strong health economics research within a country is a good measure of sophistication and an indicator of a strong health systems research field.
Progress in any field is not always linear. Gómez Dantés was quick to point to the case of Mexico, where the use of evidence in health policy has recently declined despite the existence of a strong research community.
But his point, as well as Mirko Lauer’s, is that while you cannot guarantee that a mature and high-density field will lead to better informed decisions and policy outcomes, there is a much better chance that bad policy decisions will be contested and improved.
What about the context?
Julie LaFrance commented on an early version of this article and asked: “What other contextual factors may influence density of use even when research ecosystem infrastructure and policy conditions are high density? While I like all the indicators (see the draft score card below), to what extent are the political, economic and social conditions considered as influencing factors? The more nuanced things like political and social polarisation, public trust in evidence, disinformation/fake news, etc.”
Context matters! Results from the 2025 State of the Sector Report show that higher levels of trust in evidence in a country “protect” think tanks against the negative effects of political polarisation. We know, too, that state capture can cancel out the positive impact of higher performance and density; the case of economic policy in Peru is a good example of this.
In answer to Julie’s question, I expect that the political economy of policymaking would have to be part of the sense-making that this type of analysis demands. I am not advocating for a purely quantitative assessment.
At the same time, I also think that this approach captures some of the contextual factors – albeit with, maybe, a certain time lag.
The case of Peru, again: Peru is likely still a high-density country when it comes to economic policy. But it has lost ground on evidence use in politics and is beginning to see some loss in evidence use in the broader civil society and private sector – especially among certain increasingly radical groups and new actors emerging from the informal and illegal economy. Conducting the assessment in 2019 and then again in 2021, 2023, and 2025 would have shown a clear trend.
Thus, a combination of easily measurable indicators and sense-making informed by contextual evidence would offer, in my view, a useful and accurate-enough assessment of the state of evidence-informed policy in any given polity.
Monitoring progress – A Quick Score matrix and facilitation guide
I have been “playing” with these ideas in search of a practical tool to support funders and evidence-informed policy practitioners. Below is a list of observable and measurable indicators of information density, covering evidence production, dissemination, and use across research, media, government, civil society, and the private sector.
To assess the level of density of information, we could consider the following categories:
- Evidence production
- Evidence intermediation and dissemination
- Evidence use in politics
- Evidence use in the broader civil society and the private sector
Indicators in each category could be used to determine the level of information density in a specific geography, sector or polity.
We could calculate both the overall density and the level of density for each category above. This could highlight cases where density in evidence is sufficient but greater efforts are needed in intermediation, for example.
After the list, you’ll find a practical Quick Score matrix to assess levels of density in a given geography or sector.
Finally, some guidance on how to facilitate sensemaking discussions informed by the results.
Please consider them as a starter-for-ten.
Sample indicators of information density
Each category can be informed by multiple indicators – as many as make sense or are useful. Below I suggest an initial set of indicators and what or where to look for data. Across all indicators, questions of equity could be asked – for instance, the number of active researchers in the field, disaggregated by gender or location.
1. Evidence production
| Indicator | What or where to look for |
| --- | --- |
| Number of research centres, think tanks, or policy institutes. | Look for global, regional, national or sectoral directories, donor/project listings, or academic networks, e.g. the Open Think Tank Directory. |
| Presence of bachelor’s degree and postgraduate programmes in the relevant fields (e.g. economics, public policy, health, education). | Scan university offerings — especially MA and PhD degrees. |
| Number of active researchers or analysts in the field. | Count faculty, affiliated experts, or contributors to public research. |
| Number of recent research publications (academic and grey literature). | Count academic publications as well as working papers, reports, and articles from the last 2 years. |
| Presence of research training or capacity-building programmes. | Fellowships, summer schools, PhD support, etc. |
| Locally led, externally or nationally funded research projects. | Evidence of research projects at local think tanks, universities, public policy centres, etc. |
| Funding for relevant research (this can also include a per capita (number of researchers) breakdown of the funding). | Review the national budget for research, science granting councils, and foreign funders’ portfolios. |
2. Evidence intermediation and dissemination
| Indicator | What or where to look for |
| --- | --- |
| Media outlets with sector-specific editors or journalists (e.g. health, economics). | Editors, permanent sections, and regular bylines or columns by specialists in newspapers, radio, or TV. |
| Popular publications for general audiences (e.g. popular science/policy books, opinion columns). | Number of books, explainers, or special issues published in the last 12–24 months. |
| Media programmes focusing on evidence or policy issues (TV, radio, podcasts). | Number of evidence-based talk shows, science policy programmes, and interviews with researchers. |
| Public events discussing research or data (forums, conferences, launches). | Number of events open to the public, online or in-person, where evidence is discussed. |
| Presence of knowledge brokers or communication staff in research institutions. | The number and seniority of dedicated individuals or teams translating research for non-expert audiences. |
| Existence of digital evidence platforms or repositories. | Government, donor, or academic portals with accessible research/data. |
| Number of “social media” experts or thought leaders. | Influencers or accounts on social media platforms that regularly discuss relevant issues. |
3. Evidence use in politics
| Indicator | What or where to look for |
| --- | --- |
| Existence of evidence-use guidelines or policies. | Regulations requiring data or research to inform policy (e.g. in budgeting, health). |
| Government units or teams dedicated to policy planning, research, or evaluation. | Policy analysis units, “EdLabs”, data and M&E departments within ministries, parliament or sub-national governments. |
| Government research centres or bodies. | Publicly funded national research bodies linked to line ministries on a range of issues. |
| Policy documents that reference research or data. | White papers, strategies, or legislation citing studies, statistics, or expert input. |
| Formal partnerships between government and research organisations. | MOUs, collaborative projects, or joint advisory bodies. |
| Training for civil servants in policy analysis or evidence use. | Courses, workshops, or certifications in public administration. |
| Political parties with research teams or formal/informal associations with research organisations. | Political party organograms, MoUs, and overlap between politicians and think tanks. |
| Public funding allocated to research. | Size and/or number of public funds or programmatic allocations via granting councils or public research bodies. |
4. Evidence use in civil society and the private sector
| Indicator | What or where to look for |
| --- | --- |
| CSOs referencing or producing research in advocacy. | Campaigns citing evidence, publishing briefs, or hosting evidence-based events. |
| Private sector entities commissioning or applying research. | Companies funding studies or using evidence in product, service, or CSR planning. |
| Presence of innovation hubs or incubators tackling policy challenges. | Labs or accelerators linked to education, health, environment, etc. |
| Participation of non-state actors in policy consultations or debates. | Public records or media showing active engagement by NGOs, unions, and business groups. |
Quick Score matrix
A scoring table like the one below could be used to assess each context or sector. The values in the example are arbitrary and would need to be developed on the basis of real cases:
| Density indicator | High (2) | Medium (1) | Low (0) |
| --- | --- | --- | --- |
| Evidence production | | | |
| Research centres / think tanks | >5 active | 2–5 | <2 |
| Postgrad programmes in policy-related fields | >3 | 1–3 | None |
| Active researchers producing public outputs | >10 identifiable experts | 3–10 | <3 |
| Recent research publications (past 2 years) | >10 outputs | 3–10 | <3 |
| Intermediation and dissemination | | | |
| Sector-specific media journalists | Regular presence | Occasional mentions | Absent |
| Popular books/articles on the issue | >10 in last 2 years | 3–10 | <3 |
| Media programmes on policy/evidence | Multiple regular programmes | Occasional | None |
| Public events on research/policy | Monthly or more | A few annually | Rare/none |
| Evidence use in politics | | | |
| Policy documents citing evidence | Frequently | Occasionally | Rarely/never |
| Government or Parliament evidence-use units | In multiple ministries | In 1–2 | None |
| Civil servant training on evidence | Systematic | Occasional | None |
| Government or parliament partnerships with research orgs | Multiple active partnerships | One or two | None |
| Public funding (entirely dependent on context) | ~ > USD 250,000 | ~ < USD 250,000 | None |
| Evidence use in civil society | | | |
| CSOs using research in advocacy | Common practice | Sporadic | Rare |
| Private sector engaging in research | Several examples | A few | None |
| Innovation hubs tackling policy | Multiple visible hubs | One or two | None |
| Evidence portals/repositories | Widely used | Existing but limited | None |
Total Score (out of 34):
- High Density: ~ 27–34
- Medium Density: ~ 14–26
- Low Density: ~ 0–13
A similar analysis can be made for each category.
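To make the scoring concrete, here is a minimal sketch in Python of how the overall and per-category calculations could work. The indicator scores below are entirely hypothetical, and the classification simply scales the ~14 and ~27 band boundaries from the matrix above to each category’s maximum score.

```python
# Sketch of the Quick Score calculation. All scores are hypothetical;
# each indicator is scored 0 (low), 1 (medium) or 2 (high).

def classify(score: int, max_score: int) -> str:
    """Bucket a score into Low / Medium / High density, scaling the
    overall bands (roughly 0-13 / 14-26 / 27-34 out of 34) to max_score."""
    share = score / max_score
    if share >= 27 / 34:
        return "High"
    if share >= 14 / 34:
        return "Medium"
    return "Low"

# Hypothetical indicator scores for one geography or sector,
# grouped by the four categories above (17 indicators, max 34).
scores = {
    "Evidence production": [2, 1, 2, 1],
    "Intermediation and dissemination": [1, 0, 1, 1],
    "Evidence use in politics": [0, 1, 0, 1, 0],
    "Evidence use in civil society": [1, 0, 1, 1],
}

total = sum(sum(vals) for vals in scores.values())
max_total = sum(2 * len(vals) for vals in scores.values())
print(f"Overall: {total}/{max_total} -> {classify(total, max_total)} density")

# Per-category scores can flag imbalances, e.g. strong production
# but weak intermediation.
for category, vals in scores.items():
    print(f"{category}: {sum(vals)}/{2 * len(vals)} -> "
          f"{classify(sum(vals), 2 * len(vals))}")
```

In this invented example the overall score (14/34) lands in the Medium band, while the per-category view shows production outpacing intermediation and use – exactly the kind of imbalance the sense-making discussion below is meant to surface.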
Sense-making
The data could be used to inform discussions at the level of each space – sector, geography or polity. Questions to facilitate the discussion could include:
- Is there balance in the system? Is production matched by spaces or vehicles to disseminate, communicate or use evidence?
- Is the system biased in favour of or against certain constituencies – women, vulnerable communities, certain regions or disciplines?
- Are there any political, economic or social trends or developments that could explain the relative density or strength found?
- How might these developments affect the relative density or strength of the system in the future?
- Are there sectors within the same geography or polity with high density or strength that could offer relevant and useful advice or practical interventions to address low density or system weakness elsewhere?