Why's and what's of evaluation: SAGE reveals the secrets of measuring impact
The Australian Academy of Technology and Engineering and Australian Academy of Science established SAGE in 2015 to pilot the UK’s Athena SWAN charter in Australia. This framework seeks to improve the retention of women in STEM. Now an independent organisation, SAGE Ltd supports the hiring, promotion, participation and retention of women in STEM and the broader higher education and research sector through an accreditation and awards system. Subscribing organisations are awarded a Bronze, Silver or Gold award.
We spoke with Wafa El-Adhami, CEO of SAGE Ltd, and Larissa Fedunik, Communications Officer, who shared with us some of their best evaluation tips. They assured us high quality evaluation can be done cheaply and with time to spare — when it is well planned. This quote from Wafa sums it up nicely: “Spend quality time planning the process, and you will be rewarded with how you manage evaluation.”
“Make it core at the outset”
Much like Tech Girls, also interviewed for this series, SAGE is emphatic about integrating evaluation from the outset. Here’s why they suggest starting early can help you.
Manage costs, time and quality
SAGE indicates that the program design stage is the ideal time to design your evaluation. An early start means you can break evaluation down into smaller, manageable pieces and schedule evaluation points that move hand in hand with the program itself. From a time-management perspective, early planning lets you allocate time for evaluation and assign responsibilities across your team, so team members know when they are expected to complete evaluation-related tasks.
Setting expectations is also important for stakeholders. Wafa shared an observation: when stakeholders receive evaluation tasks with little to no notice, they may assume something is wrong with the program. With evaluation designed at the outset, you can communicate relevant milestones to stakeholders participating in the evaluation process, and they are more likely to trust you and be confident in the process.
All the previous steps have already helped you reduce costs. “[Evaluations] often tend to become expensive exercises if we have not really started doing it from the outset,” says Wafa. Building evaluation into the program reduces the risk of having to bring in external parties to evaluate your program as it draws to a close.
High quality evaluation is specific
A broad statement like “evaluating the impact” is not enough. Defining the scope and objectives of evaluation is as important as defining the scope and objectives of the program. This helps program owners avoid a common challenge: if evaluation is the last thing you do, you might be left trying to work out what worked well and what didn’t, instead of answering specific evaluation questions. Your findings can, and should, help continuously improve your program. However, continuous improvement should not be the sole focus and objective of your evaluation. What else could you do with your findings?
Make the best use of the information you collect
Evaluation provides you with a lot of information, which is why a well-defined evaluation scope can guide your decision on how you use your findings. SAGE reminds us there is a difference between:
- The evidence you collect
- How you use that evidence
Ask yourself the following questions to help you use the information for your defined purposes. Will your findings help you draw out solutions to a particular problem you are trying to resolve? Will they inform recommendations to certain stakeholders?
One best piece of evaluation advice: “Start evaluation from the outset and apply a system wide approach”
SAGE employs a systems-thinking approach to evaluation, and the team suggests you do the same. Systems thinking maximises your ability to:
- Save time and money
- Evaluate throughout the program
- Find tools and instruments that are a good fit to evaluate the program, and adapt them if necessary
- Foresee the intended and unintended outcomes of the program
What is SAGE’s ideal ‘evaluation world’? “When it will no longer need to be said that the program will be evaluated. When it is already implied that this will happen, and everything we do is going to be evaluated at different points in time,” says Wafa.
About the series
‘Why’s and what’s of evaluation: Program leaders reveal the secrets of effective evaluation’ is a series by the Office of the Women in STEM Ambassador which provides insights for program owners looking to evaluate their programs. We delve into why it’s important to evaluate, the barriers organisations face with evaluation and how they overcome them.
Still have a question about evaluation?
Download “Evaluating STEM gender equity programs: A guide to effective program evaluation” and find other tools to drive gender equity in STEM.
Find out how to define the specifics of your evaluation with the guide we have developed for STEM gender equity programs.
Clara Gomes is a communicator and writer. She is passionate about creating a more equitable STEM sector in Australia. As the Digital Content Officer for the Office of the Women in STEM Ambassador, Clara creates, edits and shares content on the Office's social channels and website.
Isabelle Kingsley is a researcher, science communicator and educator. Her research focuses on measuring the impacts of science education and outreach. As Research Associate for the Office of the Women in STEM Ambassador, Isabelle is investigating gender issues to inform how to drive needed cultural and social change for gender equity in STEM across Australia.