Evaluating systems change and place-based approaches – Online Course – November 2019
How does MSC compare to traditional monitoring?

Clear Horizon – September 23, 2019

MSC is different from the traditional methods of monitoring and evaluation that you might be using. It complements traditional approaches because it can fill in some of the gaps they leave. In general, MSC is a qualitative method, while traditional monitoring methods are more quantitative in nature. To illustrate just how different MSC is, and why it works so well as a supplementary method, compare the two approaches below:

MSC
  • Inductive – about unexpected outcomes
  • Diversity of views (from field staff and beneficiaries)
  • Open questioning
  • Participatory analysis
  • Puts events in context – ‘thick description’
  • Enables a changing focus on what’s important
  • Outer edges of experience
Traditional
  • Deductive – about expected outcomes
  • Indicators often determined by senior staff
  • Closed or specific questioning
  • Analysis by management
  • Based on numbers – no context
  • About ‘proving’
  • Central tendencies

The sections below explore in more detail how MSC and traditional monitoring compare.

Inductive vs. deductive views

Traditional methods often use deductive reasoning. Usually, there is a theory about what is supposed to happen, and then we analyse quantitative data to find out whether the outcome we expected occurred. But what about the outcomes we don't expect? Deductive reasoning isn't ideal for that. MSC uses inductive reasoning instead. By asking participants to make sense of events after they have happened, MSC can tell us about the outcomes we expect as well as the outcomes we do not expect. This is useful because it can tell us things that we don't realise we need to know.

By using the MSC technique to gather information and encourage regular reflection from participants about the intangible and indirect consequences of their work, teams can change direction to achieve more of the outcomes that are important to them.

Diverse vs. limited views

In many monitoring and evaluation systems, the indicators we measure are defined by people who are distant from where events happen. When senior executives and specialist research units define indicators for monitoring and evaluation, these are defined by looking outwards from the project (program out). In MSC, this is done differently. The people closest to the event, such as field staff, beneficiaries, front-line staff and clients, are the ones who identify the stories they think are relevant (context in). Other participants then choose the most significant stories, so diversity of views is a core part of how the organisation decides which direction to take.

Open vs. closed questions

Here are some examples of closed questions:

  • Did you like the program?
  • On a scale of 1-10, how would you rate the program?

Questions like these result in numerical data that can be analysed quantitatively. MSC analyses qualitative data, so it uses open questions. For example:

  • Over the past five years, how would you describe your experience of the program?
  • From your point of view, what was the most significant change that took place concerning the quality of people’s lives?

Using MSC, participants use their judgement to identify and select stories. To do this, they use open questions like these, which give beneficiaries, field staff, clients and front-line staff a voice in the process.

Participatory vs. centralised analysis

Often in traditional methods of monitoring and evaluation, data is analysed at a senior level. Typically, fieldworkers do not analyse the data they collect; they pass the information on for others to analyse.

In the MSC process, information is not managed centrally but is distributed through the organisation and processed locally. Staff collect information about events and evaluate that information according to their local perspective.

Context vs. lack of context

Quantitative data is often analysed without context. Tables of statistics are usually sent from field offices to central office staff for analysis, but the people analysing the data are a long way from the field site. With only limited text comments from fieldworkers, the analysis happens without much context and without the perspectives of beneficiaries and staff.

MSC uses ‘thick description’: detailed accounts of events in the local context, with detail about people and their views. These descriptions are usually given through anecdotes or stories that also capture the writer’s interpretation of what’s significant. This makes drivers for change visible. With this additional information, teams can see what happened and why, and can focus on what has changed and why this is important.

Static vs. dynamic indicators

In most monitoring and evaluation systems, indicators remain the same for each reporting period. The same questions are asked repeatedly, and the focus doesn’t change. With MSC, the type of data collected is dynamic and changes over time. Participants choose what to report, and these choices reflect real change in the world, and changing views in the organisation about what matters. That information can then inform project activities, ensuring that the project reflects what’s important to the people involved. This makes MSC particularly good for working in emergent and complex contexts.

Outer edges vs. central tendencies

In most types of social science research, and in evaluation, we are mostly concerned with finding out what most people's experience of a program or intervention is. This reflects the scientific research approach, where the main focus is on proving or disproving hypotheses. MSC, in contrast, focuses on the outer edges of experience rather than generalising about the most common experience. This makes MSC useful for investigating the unintended outcomes of programs, but it also means that MSC is not intended to produce generalisable results.

First cousins

  • Appreciative inquiry
  • Success case method – Brinkerhoff
  • Critical incident technique

Downloads

Diverse vs. Limited Views

