Bringing sense to complexity

Actively contributing members (in alphabetical order): Alison Ng, Ana Prodanovic, Sally Dover, Sophie Lehouelleur

Our challenge: untangle unexpected, overriding influences on an intervention so that the intervention's impact can be analyzed clearly.

Our stakeholders: demanding funders and worried evaluation managers

Our solution: a methodology that is fit-for-purpose

Make sure to check out our FULL Project Summary!

--- `07.07.2020 14:19`

Two new members on our team! Yay!

--- `08.07.2020 14:21`

Reading up… from evaluation to statistics to Wong-Baker FACES… we are consumed

--- `08.07.2020 14:22`

First interviews are in motion :D

--- `08.07.2020 14:22`

Miro is becoming our best friend – it takes anything we throw at it

--- `09.07.2020 14:24`

Under the hub… swiftly moving towards rocket science

--- `10.07.2020 14:25`

Task for tomorrow: start thinking about our pitch

--- `10.07.2020 14:43`

Today we're on the lookout for more prototype critics...

--- `11.07.2020 10:01`

Video in production today

--- `12.07.2020 10:01`

Launched at Evaluation Hackathon by aprodanovic, sophie_lehouelleur, binta_moustapha, alison_ng


Covid-19 factor

Accounting for Covid-19 as a confounding factor during impact evaluations

We know from numerous studies that exogenous factors such as economic crises, natural disasters or escalating conflicts can overlay the societal developments that development interventions target, and that evaluations then try to measure. As evaluators we are familiar with such challenges and know how to deal with them adequately. However, things get trickier when the magnitude of a confounding factor's influence exceeds the expected size of an intervention's impact, and trickier still when more than one such factor is at play.

So what do we do when, technically speaking, the 'signal-to-noise ratio' between our subject of investigation and the unwanted interference becomes too small for the usual corrective manoeuvres? How can we attribute an observable change to an intervention if that change is influenced significantly more by something else than by the intervention itself? How would you evaluate an intervention aiming to improve the resilience of vulnerable groups at the very moment their resilience is severely tested by a collapsing regional economy?
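A tiny simulation can make the attribution problem concrete. This is a hypothetical sketch with invented numbers (an intervention effect of +2 points on a resilience score, a shock of -10 points hitting everyone): a naive before/after comparison on the treated group is dominated by the shock, while a comparison group that experienced the same shock lets the confounder cancel out.

```python
import random

random.seed(42)

# Invented magnitudes: the confounding shock (the "noise") dwarfs
# the intervention effect (the "signal").
INTERVENTION_EFFECT = 2.0   # intervention raises resilience score
SHOCK_EFFECT = -10.0        # e.g. regional economic collapse, hits everyone
BASELINE_MEAN = 50.0        # known pre-intervention mean score
N = 1000                    # individuals per group

def outcome(treated: bool) -> float:
    """Post-intervention resilience score for one individual."""
    base = random.gauss(BASELINE_MEAN, 3)          # individual variation
    effect = INTERVENTION_EFFECT if treated else 0.0
    return base + SHOCK_EFFECT + effect            # shock hits both groups

treated = [outcome(True) for _ in range(N)]
control = [outcome(False) for _ in range(N)]

# Naive before/after on the treated group alone: the shock swamps the
# signal and the intervention appears to have made things worse.
naive = sum(treated) / N - BASELINE_MEAN

# With a comparison group exposed to the same shock, the shared
# confounder cancels and the intervention effect is recoverable
# (a simple difference in means here; a difference-in-differences
# design would do the same job in a real evaluation).
adjusted = sum(treated) / N - sum(control) / N

print(f"naive before/after estimate: {naive:+.1f}")   # near -8, wrong sign
print(f"comparison-group estimate:   {adjusted:+.1f}")  # near +2
```

The point is not the specific estimator but the design: when the confounder's magnitude exceeds the effect size, only a counterfactual that absorbs the confounder (a comparison group, not a pre/post baseline) leaves a readable signal.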
