Designing the right thing
Feedback on the plans for the STEM Equity Evaluation Portal
We know that people who run STEM gender equity programs (or any programs) often do not feel confident or equipped to evaluate them. This is part of the reason why evaluation often falls by the wayside. Without evaluation of STEM equity programs, we don’t know what is working or how to improve what isn’t.
Why evaluation is a priority
Evaluation of STEM gender equity programs is a subject of national interest. These programs seek to dismantle barriers to attract, retain and progress girls and women in STEM. The end goal: a diverse and gender-balanced STEM workforce to confront important world issues. With such significant outcomes at stake, we need to know if what we are doing is working.
The National Evaluation Guide
Last year, the Office of the Women in STEM Ambassador delivered on this key national priority. We produced an evaluation guide (the Guide) to make evaluation simple and easy for program leaders. The Guide is a user-friendly, how-to resource that breaks evaluation down into a simple four-step process.
But wait, there’s more! The STEM Equity Evaluation Portal
In July 2021, the Office received funding from the Department of Industry, Science, Energy and Resources to develop an online evaluation tool and repository for STEM equity programs. The STEM Equity Evaluation Portal (the Portal) will function as an extension and complement to the Guide. As a planning and reporting tool, it will contain standardised and interdependent elements that users can ‘click & select’ to build an evaluation plan and report on findings. It will also include a bank of recommended tools (surveys, tests and other instruments) that users can choose from for their evaluations. As a repository, it will function as a database of existing STEM equity programs and their efficacy.
The aim of the Portal (and the associated Guide) is to:
- Enable project-level evaluation and demonstrate what works to attract, retain and progress girls and women in STEM
- Support a culture of evidence-based practice by enabling activities to be improved based on evaluation data
- Create consistency and comparability of evaluation data
- Publish and collate evaluation data in a national repository to (1) improve awareness of existing programs and their efficacy, (2) identify and/or address any gaps, and (3) inform decision making about what works and what should be scaled up and/or funded across sectors
- Support and incentivise collaboration between providers of programs within and across sectors to create stronger cohesion and consolidate efforts and resources
Before jumping into development, we asked for feedback on the Portal plan from the people who are likely to use it. We reached out to over 250 stakeholders across the country, as well as the broader public via newsletters and social media. We wanted to know what potential users want, so that we could design the right thing for the right people.
Feedback on the Portal plans
The feedback on the Portal plan was largely positive. Respondents said that the Portal features would be helpful for planning and reporting evaluations. The most popular feature was the bank of recommended tools (surveys, tests and other instruments). Respondents also liked that the Portal will contain both standardised and customisable elements.
Respondents commented that the Portal would streamline evaluation, contribute to collective insights and allow comparisons between findings. They also noted that the Portal would make it easier to know what others are doing and identify gaps and opportunities.
A general concern was that using the Portal would increase reporting workload. Respondents also expressed some uncertainty around whether the Portal would apply to their needs or align with the existing measures and systems that they currently use.
One respondent noted the importance of broad uptake:
“Uptake amongst program owners is critical to allow seekers to have confidence that they are evaluating all available programs.”
Another respondent suggested mandating the use of the Portal:
“Perhaps a condition of being awarded a grant could be to use the portal for evaluation, as all grants include some form of evaluation and success reporting anyway.”
A short summary report provides an overview of the feedback.
We are taking this valuable feedback on board as we begin to build the Portal. We will promote broad uptake of the Portal. We will work with funders of STEM equity programs to recommend using the Guide and Portal to evaluate programs. We will work with them to align reporting requirements with the Portal to limit duplication and reduce the reporting burden. The long-term goal is to create a national evidence base of consistent and comparable program evaluations to see if we are doing what works.
* Formal feedback via the online survey closed on 31 October. However, we still welcome informal feedback. You can contact us here.