The Spiral: Systems MEL in Practice


Systems MEL is inherently non-linear—iterative, adaptive, and deeply shaped by context. To help navigate this complexity, we introduce three tools:


  • The Spiral, a way of practicing Systems MEL that exemplifies the dynamic nature of Systems MEL and of systems change.


  • Four core contradictions that surface when adopting Systems MEL, with suggestions for working through them.


  • A Systems MEL Readiness Check to help gauge where in your organizational context there is appetite for, and resistance to, Systems MEL.

01

The Spiral: Systems MEL in Action

The Spiral exemplifies the dynamic nature of Systems MEL and of systems change. By cycling through the Spiral, practitioners get to work with their system insights dynamically throughout implementation. The iterations also allow teams to practice Systems MEL in settings that don’t yet support it.


The Spiral is not prescriptive; it is a dynamic learning orientation. Rather than imposing a fixed sequence, the Spiral helps you:

  • Identify where to begin, based on your context, entry point, or capacity

  • Loop back, reflect, and adapt as system dynamics evolve

  • Revisit and realign as new actors join, insights emerge, or power shifts

The Spiral is anchored in your Orientation Lens—the foundational elements that keep your MEL practice grounded:

  • Strategic Intent: clarifies long-term purpose

  • System Boundaries: defines what’s in or out of focus

  • Spheres of Control, Influence, and Interest: situates your role and agency within the system

Together, these lenses help you stay focused on purpose, scope, and role as you navigate change.


The Spiral isn’t a set of fixed phases—it’s a set of focal areas that teams revisit again and again, adjusting their approach as the system shifts.

02

The Spiral Model

The Spiral model is made up of five areas: Review Direction, Revisit System Boundaries, Surface Patterns, Shift Course, and Cultivate Coherence. The sections below describe the purpose of each area and some useful techniques for engaging with it. It is important to note that you can start anywhere on the Spiral, and you do not need to work through all five areas every time you cycle through it. Instead, we encourage you to find the best area for you to start with and, over time, to add in new Spiral areas.


Review Direction

Purpose

Revisit your shared long-term purpose. Re-anchoring MEL in systemic transformation, rather than outputs, helps guide meaningful inquiry and adaptive strategy.

Example Questions

  • Has our shared purpose evolved?

  • Are our activities aligned with deeper goals?

  • What types of systemic shifts do we hope to see?

  • Who is involved in shaping that purpose?

Systems MEL Techniques

  • Use your Strategic Intent to update contribution pathways (ToCs, impact hypotheses)

  • Refine learning questions tied to system outcomes

  • Identify relevant Domains of Change

  • Review your indicators for alignment with transformation goals

Revisit System Boundaries

Purpose

Reassess your system landscape. This helps ensure your MEL scope still reflects current dynamics, actors, and feedback.

Example Questions

  • What’s changed in roles, relationships, or feedback loops?

  • Are new risks or opportunities appearing?

  • Do current MEL priorities match the system context?

Systems MEL Techniques

  • Update and/or deepen system maps and actor influence diagrams

  • Revise stakeholder analysis (e.g., power-interest grids)

  • Reassess leverage points and spheres of influence

  • Define signals of change

Surface Patterns

Purpose

Observe and interpret signals of system change. Go beyond binary results to explore patterns, tensions, and unexpected developments.

Example Questions

  • What early signals are emerging?

  • Are we seeing new behaviors, norms, or alliances?

  • What might these shifts indicate about system evolution?

Systems MEL Techniques

  • Use outcome journals or signal-tracking worksheets

  • Collect stories, observations, and informal feedback

  • Document intermediate outcomes and ripple effects

  • Track and interpret signals of change

Shift Course

Purpose

Apply what you’ve learned to refine your approach. Use MEL data and sensemaking to guide real-time decisions.

Example Questions

  • What needs to change in strategy, design, or engagement?

  • Are we still aligned with our Strategic Intent?

  • Where are new leverage points or risks emerging?

Systems MEL Techniques

  • Hold pause-and-reflect or learning review sessions

  • Revise activities, partnerships, or investments

  • Adjust indicators or learning questions accordingly

Cultivate Coherence

Purpose

Ensure that actors, strategies, and structures remain aligned with the shared vision. This step is about strengthening coherence—not just coordination.

Example Questions

  • Are stakeholders still aligned in purpose?

  • Is our MEL reinforcing shared understanding?

  • Are we integrating multiple perspectives into learning?

Systems MEL Techniques

  • Synthesize system-wide insights and feedback

  • Share learning products across levels (local to policy)

  • Facilitate joint reflection and re-prioritization

More:

Why a Spiral

03

Practicing Systems MEL in Organizations

Not every organization is immediately ready or equipped to embed systems practices—and that’s perfectly normal. Embracing Systems MEL involves a learning curve, where teams gradually build comfort, capacity, and alignment over time.


A few key considerations:

The journey is rarely linear. Progress may be followed by setbacks. For instance, a leadership change or new funder priorities might shift resources away from a Systems MEL approach.

Different individuals and departments within the same organization often demonstrate varying levels of readiness. You might have strong leadership support and technical skills but lack sufficient resources or funder interest to sustain Systems MEL.

Adopting Systems MEL takes time. It challenges core operational assumptions and requires navigating between two coexisting worldviews: the more rigid Results-Based Management and the more adaptive, systems-oriented practices. Integrating both approaches often involves deliberate reflection, negotiation, and phased implementation.

More:

Navigating the Practical Challenges of Adopting Systems MEL

Deeper Learning:

Advanced Systems Practices

Linked Content

Systems MEL Readiness Check

04

Four Core Contradictions

Navigating Contradictions in Systems MEL

Working in complex systems often means operating within contradictions—situations where two seemingly opposing truths are both valid and must coexist. In Systems MEL, these contradictions arise because the demands of accountability, evidence, and performance often clash with the realities of emergence, adaptation, and long-term change.


Rather than trying to eliminate or resolve these tensions, effective practitioners learn to hold and work within them—using them as sources of insight, creativity, and balance. Contradictions reveal where Systems MEL must stretch: between donor expectations and adaptive practice, between proof and learning, between short-term results and enduring transformation.


The following section outlines four core contradictions that commonly show up in Systems MEL practice. For each contradiction, we describe:

  • What it looks like in practice: how the tension manifests in real work.

  • How to navigate it: practical ways to hold both sides, integrate perspectives, and use the contradiction as a learning opportunity.

Together, these contradictions offer a way to deepen practice—helping teams stay accountable and credible while still embracing the complexity, emergence, and adaptive nature of systems change.

Accountability vs. Learning

What it looks like in practice: Donors require predefined outcomes, rigid logframes, and proof of impact, but your system change work needs flexibility to learn and adapt.

How to navigate it: Use both/and framing. For example, monitor traditional accountability metrics and integrate reflective learning sessions into your management. In your reporting, include both results against your metrics and sections demonstrating your growing understanding of the system, how this is informing your activities, and how it is contributing to change.

Evidence Expectations vs. System Complexity

What it looks like in practice: Funders and leaders expect “rigorous” evidence (e.g., validated data, independent evaluations) focused on demonstrating project, program, or portfolio performance, but your systems work is full of nuance, multiple perspectives, and indirect outcomes.

How to navigate it: Build legitimacy for an expanded definition of rigor. You might do this by:

  • Co-creating what counts as valid evidence with funders and stakeholders, including those closest to the change. You could do this during the design phase, prior to commissioning an evaluation, or at a formal review point during implementation.

  • Demonstrating rigor through transparency, including the rationale for a different approach, the methodology, and the assumptions in your plans and reports.

  • Showing how different types of evidence are being used to support change in context.

Short-Term Results vs. Long-Term System Change

What it looks like in practice: Project timelines are 3 to 5 years long, but system change unfolds over decades. Teams feel pressure to show results too soon.

How to navigate it: Surface and show the relationship between your intervention and your Strategic Intent. Here are some ideas:

  • Use nested theories of change to connect short-term actions to your long-term direction.

  • Use interim indicators or signals of change to track system shifts.

Standardized Evaluation Criteria vs. Adaptive Usefulness

What it looks like in practice: The real value of your MEL lies not only in proving impact, but also in surfacing pathways to change and supporting collective insight and adaptation.

How to navigate it: Link MEL and management activities where you can. This might include:

  • Using evaluative tools that support both accountability and adaptive sense-making, for example complementing rubrics or metrics with facilitated learning sessions.

  • Leading up to a planned evaluation, co-designing the evaluation brief with partners and stakeholders, and including a facilitated judgement process to ensure that multiple perspectives influence the evaluation findings and their use.

Working with the Contradictions

Rather than trying to “resolve” these contradictions, we recommend learning how to hold them. Here are some ways to do that:

1. Start with Assumptions

Facilitate reflective discussions about the underlying assumptions of your initiative, your operational context, the system, and your MEL approach. This surfaces the gaps between what your initiative does and its potential to contribute to systems change.


Useful tools for surfacing assumptions: Systems MEL Readiness Check, Core Foundation Tools: Strategic Intent, System Boundaries, Domains of Change.

2. Experiment and Learn

Don’t try to overhaul everything at once. Try small experiments: a learning-focused reflection session alongside a donor report, or a facilitated sense-making moment after data collection. Learn what works.


Select a tool from the How-To Sheets to test out, and follow it up with a short After Action Review to capture what worked, what didn’t, and how you’d adapt it for next time.

3. Build Internal Champions

Equip MEL staff, project managers, and funders with language and tools to talk about systems work and Systems MEL. Share examples and stories of how this has been done elsewhere.


Use our case studies in the Food Systems Toolkit as a starting point.

4. Engage Donors Constructively

Most funders are not opposed to Systems MEL; they need help to see and experience its value.


You might include learning logs or adaptive management stories in your reporting. Invite funders to participate in Collective Sensemaking, or in the formulation or revisiting of your Strategic Intent, System Boundaries, and Spheres of Influence.

5. Blend Approaches

You can honor programmatic commitments while also experimenting with systems practices.


For example, keep a Results Framework and introduce a Strategic Intent and/or Learning Questions that explore the gap between your initiative’s ToC and your Strategic Intent, and/or introduce systems-informed indicators to consider alongside your Results Framework.

Implemented by:

United Nations
Development Programme

FUNDED BY:

MEL 360 is part of the Systems, Monitoring, Learning and Evaluation initiative (SMLE) of UNDP, funded by the Gates Foundation.

WEBSITE DESIGNED IN 2025 BY RAFA POLONI FOR UNDP