
STEM Equity Evaluation Portal: Stakeholder Consultation

CLOSED 31 October 2021

"Early and broad consultation helps us know if we're designing the right thing for the right people."
Isabelle Kingsley
Research Associate | Office of the Women in STEM Ambassador

Consultation purpose

The purpose of this public consultation is to seek feedback and input from the people who are likely to use the STEM Equity Evaluation Portal (the users). We want to know what you want to do (user needs) in order to build a service that works for you. We want to know: Are we designing the right thing?

Executive summary

The STEM Equity Evaluation Portal (the Portal), developed by the Office of the Women in STEM Ambassador (the Office), will be an online evaluation tool for STEM equity programs that will also act as a national repository of evaluations.

The Portal is being developed in response to recommendations from the Women in STEM Decadal Plan for a standardised national evaluation framework that guides evaluation efforts across all existing and future STEM gender equity initiatives in Australia. In July 2021, the Office was awarded funding from the Department of Industry, Science, Energy and Resources to develop the Portal as an integral part of the Australian Government’s Advancing Women in STEM 2020 Action Plan.

The Portal will function as an extension and complement to the National Evaluation Guide (the Guide), published by the Office in 2020. The Guide is a simple 5-step evaluation framework that outlines how to create an evaluation plan, put it into action and share the findings. As an extension to the Guide, the Portal will contain standardised and interdependent elements that users can easily ‘click & select’ to build an evaluation plan and report on findings. It will also include a bank of recommended tools (surveys, tests and other instruments) that users can choose from for their evaluations. As a repository, it will function as a database of existing STEM equity programs and their efficacy.

The aim of the Portal (and the associated Guide) is to:

  • Enable project-level evaluation and demonstrate what works to attract, retain and progress girls and women in STEM
  • Support a culture of evidence-based practice, by enabling activities to be refined and improved based on evaluation data
  • Create consistency and comparability of evaluation data
  • Publish and collate evaluation data in a national repository to: (1) improve awareness of existing programs and their efficacy, (2) identify and/or address any gaps, and (3) inform decision making about what works and what should be scaled up and/or funded across sectors
  • Support and incentivise collaboration between providers of programs within and across sectors to create stronger cohesion and consolidate efforts and resources

How you can help

The information below describes who the Portal users are and how it will be designed to serve their needs. We would like your feedback and input. Please carefully review the content below, then click on the ‘Give your feedback’ button to tell us what you think.


Who are the users?

Who are the Portal users, what ‘job’ are they trying to get done and why?

Program owners

People and/or organisations who are running and/or evaluating a STEM equity program. They may include (across business, research, education and government): 

  • Key personnel (professional staff, managers, senior leaders, CEOs)
  • Organisations
  • Collaborators (individuals or organisations)

Program owners may use the Portal to:

  • Use a simple resource that helps them easily plan and report program evaluations
  • Share evaluations publicly to contribute to the evidence base
  • Find/attract funding from organisations seeking to invest in programs
  • Find/attract program partners/collaborators to create stronger cohesion and consolidate efforts/resources
  • Promote program and evaluation findings 
  • Improve program accountability and transparency by publicly sharing the evaluation plan and report
  • Improve program credibility by using a reputable framework and portal
  • Contribute to improving coherence, relevance and reporting of equity program evaluations to a common standard
  • Contribute to creating an evidence-based repository of cohesive, consistent and comparable evaluations

Seekers

People and/or organisations seeking evaluation data, equity programs and/or providers of programs. They may include: 

  • Funding organisations such as government, academia, industry, community groups and philanthropic organisations
  • Leaders in business, research, education and government
  • Program owners across business, research, education and government

Seekers may use the Portal to:

  • Funding organisations: seek evidence-based approaches to drive investment into measures that work
  • Leaders: seek to inform decision making about what works and what should be scaled up and/or funded across sectors
  • Leaders and program owners: seek coherent, standardised reporting to identify and/or address any gaps
  • Program owners: seek to partner/collaborate on equity programs to create stronger cohesion and consolidate efforts and resources
  • Program owners: seek an evidence base to inform, design, refine and/or improve programs

Note. We acknowledge that users will have varying needs, such as different levels of digital confidence and different access to a digital environment (for example, in remote areas or on different devices). We will seek to consult with a varied range of potential Portal users.

User journeys

What is the series of actions that the users will perform on the Portal to achieve what they need to do?

General Portal attributes

The portal will: 

  • Have appropriate legal, privacy and security measures in place for any data that the Portal, and its users, will use and/or create.
  • Employ responsive design methods for various devices
  • Be accessible to all users regardless of their ability and environment (e.g., users with varied digital confidence, in remote areas, or on different devices)
  • Make all new source code open and reusable by default 

Evaluation Planning and Reporting Tool

The Evaluation Planning and Reporting Tool is a key feature of the Portal and is described in more detail in this section. The feature aligns with the National Evaluation Guide (the Guide). Each section within this feature will be colour-coded and follow the 5 steps outlined in the Guide (see diagram). It will be organised into two main sections: the PLAN section guides planning and designing the program and evaluation, and the REPORT section guides reporting of evaluation findings.

The Evaluation Planning and Reporting Tool will be organised as follows: 

  • Program overview
  • PLAN
    • Define
    • Plan
    • Design
    • Execute
    • Review & publish
  • REPORT
    • Share
    • Review & publish
The 5-step process from the National Evaluation Guide
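
For readers who think in code, here is a minimal sketch, in TypeScript, of how the two sections and their tabs might be modelled. All names here are assumptions for illustration, not the Portal's actual implementation:

```typescript
// Illustrative sketch only: assumed names, not the Portal's real code.
type PlanTab = "Define" | "Plan" | "Design" | "Execute" | "ReviewAndPublish";
type ReportTab = "Share" | "ReviewAndPublish";

interface TabState {
  complete: boolean; // drives the "viewable but not editable" gating described below
  fields: Record<string, unknown>; // the tab's form values
}

interface EvaluationRecord {
  plan: Record<PlanTab, TabState>;     // PLAN: plan and design the program and evaluation
  report: Record<ReportTab, TabState>; // REPORT: share the findings
  planPublished: boolean; // the REPORT section unlocks once PLAN is published
}
```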

Program overview 

The program overview section will collect general, high-level information about the program.

Program owners will be able to: 

  • Enter the name of the program: text entry field
  • Enter short program description: text entry field
  • Enter sponsors/funders/partners: text entry field
  • Select one or more available options for the following categories: ‘Broad areas of interest’; ‘Industrial classifications (ANZSIC)’; ‘Discipline of interest’; ‘Program reach’
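
As one way to picture the overview form's data, a hedged sketch follows; the field and category names are assumptions drawn from the list above, not the Portal's actual schema:

```typescript
// Illustrative only: assumed field names for the program overview form.
interface ProgramOverview {
  name: string;                    // program name (text entry)
  description: string;             // short program description (text entry)
  sponsors: string;                // sponsors/funders/partners (text entry)
  broadAreasOfInterest: string[];  // one or more selected options per category
  anzsicClassifications: string[]; // Industrial classifications (ANZSIC)
  disciplinesOfInterest: string[];
  programReach: string[];
}
```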

PLAN

The PLAN section will guide program owners through the process of planning and designing their program and evaluation. It will contain 5 tabs, as described below.

Program owners will be able to: 

  • Problem (what issues does the program address?) Select one or more options from a list of available ‘Problems’ or ‘write your own’ in the text entry field; with the option to write additional detail about the selected ‘Problems’

  • Program participants (who is the program for?) Select one or more options from a list of available ‘Program participants’ or ‘write your own’ in the text entry field; with the option to write additional detail about the selected ‘Program participants’
  • Evaluation audience (who is interested in the evaluation?) Select one or more options from a list of available ‘Evaluation audiences’ or ‘write your own’ in the text entry field; with the option to write additional detail about the selected ‘Evaluation audiences’
  • Goals (what are the short-, medium- and long-term outcomes/impacts?) Select one or more options from a list of available ‘Goals’ or ‘write your own’ in the text entry field.

Note. The ‘Goals’ field is dependent on the ‘Problems’ field. If the user selected one or more options from the list of ‘Problems’ in the step above, then the ‘Goals’ mapped to the selected ‘Problems’ will appear in a different format to the rest of the goals in the list (e.g., highlighted green). If the user wrote their own, then all the ‘Goals’ in the list will display in the usual format. See the example in the image below.

‘Goal’ options related to the selected ‘Problems’ are highlighted to signal to the user that they are most relevant
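
One plausible way to implement this dependency is sketched below; the `problemToGoals` lookup and function names are hypothetical, not the Portal's published logic:

```typescript
// Sketch only: `problemToGoals` is a hypothetical mapping between Problems and Goals.
const problemToGoals: Record<string, string[]> = {
  "Low participation of girls in school STEM subjects": [
    "Increase enrolment of girls in STEM subjects",
  ],
  // ...one entry per Problem in the Portal's list
};

// Returns, for each Goal, whether it should display highlighted (e.g., green).
function goalHighlights(selectedProblems: string[], allGoals: string[]): Map<string, boolean> {
  const relevant = new Set(selectedProblems.flatMap((p) => problemToGoals[p] ?? []));
  // If the user wrote their own Problem, nothing matches and every Goal
  // displays in the usual, unhighlighted format.
  return new Map(allGoals.map((g) => [g, relevant.has(g)] as [string, boolean]));
}
```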

(Plan tab is viewable but not editable until Define tab is complete)
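
This progressive gating recurs throughout the PLAN and REPORT sections, and could be derived roughly as follows (an illustrative sketch; names assumed):

```typescript
// Sketch: a tab is editable only when every tab before it is complete.
const planTabs = ["Define", "Plan", "Design", "Execute", "Review & publish"] as const;

function isEditable(tab: (typeof planTabs)[number], completed: Set<string>): boolean {
  const index = planTabs.indexOf(tab);
  return planTabs.slice(0, index).every((earlier) => completed.has(earlier));
}

// e.g. with only "Define" complete:
//   isEditable("Plan", new Set(["Define"]))   -> true  (editable)
//   isEditable("Design", new Set(["Define"])) -> false (viewable only)
```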

  • Activities (what will you do to achieve the goals?) Select one or more options from a list of available ‘Activities’ or ‘write your own’ in the text entry field; with the option to write additional detail about the selected ‘Activities’
  • Inputs (what is needed?) Select one or more options from a list of available ‘Inputs’ or ‘write your own’ in the text entry field; with the option to write additional detail about the selected ‘Inputs’
  • Outputs (what will be delivered?) Select one or more options from a list of available ‘Outputs’ or ‘write your own’ in the text entry field; with the option to write additional detail about the selected ‘Outputs’
  • Evaluation Priorities (1-3 priorities for the evaluation) Text entry field
  • Key evaluation questions (what questions will the evaluation answer?) Text entry field
  • Indicators (what demonstrates the outcomes?) Select one or more options from a list of available ‘Indicators’ or ‘write your own’ in the text entry field.

Note. The ‘Indicators’ field is dependent on the ‘Goals’ fields, as previously described. See the two examples in the images below.

‘Goal’ options related to the selected ‘Problems’ are highlighted to signal to the user that they are most relevant

‘Indicator’ options related to the selected ‘Goals’ are highlighted to signal to the user that they are most relevant

(Design tab is viewable but not editable until Plan tab is complete)

  • Data collection tool (how will you measure the indicators?) Select one or more options from a list of available ‘Data collection tools’ or ‘write your own’ in the text entry field.

Note 1. The ‘Data collection tools’ field is dependent on the ‘Indicators’ fields, as previously described.

Note 2. The tools are from the Portal’s bank of recommended existing surveys, tests and other instruments that were developed by experts, have been tested and shown to be valid and reliable. Valid and reliable means that the instrument accurately and consistently measures what it is meant to measure. Validity and reliability bring rigour to your evaluation and credibility to your findings.

Users will be able to view the details of each recommended tool/instrument to determine which they want to use (see the ‘Search for and view data collection tools/instruments’ function above).

‘Data collection tool’ options related to the selected ‘Indicators’ are highlighted to signal to the user that they are most relevant

  • Design approach (which evaluation approach will you choose?) Select one or more options from a list of available ‘Approaches’ or ‘write your own’ in the text entry field; with the option to write additional detail about the selected ‘Approaches’
  • Data collection method (which data collection method will you choose?) Select one option from a list of available ‘Data collection methods’; with the option to write additional detail about the selected ‘Data collection method’
  • Data management plan (where and how will you manage your data?) Complete the data management plan table by (1) selecting one of the options for ‘Data type’ and (2) writing details in each cell; with the option to add additional rows as needed.

Example data management plan; users can add rows for different types of data
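
A rough sketch of how one row of that table might be represented follows; the data types and cell names are assumptions, not the Portal's actual options:

```typescript
// Illustrative: one row of the data management plan per data type.
type DataType = "Survey responses" | "Interview recordings" | "Other"; // assumed options

interface DataManagementRow {
  dataType: DataType;              // (1) selected from the available options
  details: Record<string, string>; // (2) free-text details, one entry per cell
}

// Users can add rows as needed:
const plan: DataManagementRow[] = [];
plan.push({ dataType: "Survey responses", details: { storage: "...", access: "..." } });
```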

(Execute tab is viewable but not editable until Design tab is complete)

  • Consent: Do you plan to recruit evaluation participants? Select ‘Yes/No’ option; if ‘Yes’, then respond to 3 follow-up questions: (1) ‘How will you invite people to take part in your evaluation?’ (2) ‘How will you get informed consent from participants?’ and (3) ‘Are you following the consent requirements outlined in the National Statement on Ethical Conduct in Human Research?’ – ‘Yes/No’
  • Ethics: Do you plan to publish in an academic journal or use the data for academic research? Select ‘Yes/No’ option; if ‘Yes’, then respond to 3 follow-up questions: (1) ‘Are you following the human ethics guidelines outlined in the National Statement on Ethical Conduct in Human Research?’ – ‘Yes/No’; (2) ‘Have you obtained human ethics approval through a registered Human Research Ethics Committee (HREC)?’ – ‘Yes/No’; and (3) if ‘Yes’ to (2), please enter the HREC approval number; if ‘No’, then error message “You must obtain ethics approval from a registered HREC if you plan to publish in an academic journal or use the data for academic research” (a rough validation sketch follows this list).
  • Analysis approach (how will you analyse the data?) Select one or more options from a list of available ‘Analysis approaches’ or ‘write your own’ in the text entry field; with the option to write additional detail about the selected ‘Analysis approaches’
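
As flagged in the Ethics item above, that branching could be validated along these lines; the field names and exact flow are assumptions, not the Portal's actual code:

```typescript
// Sketch: enforce the HREC requirement when academic use is planned.
interface EthicsAnswers {
  plansAcademicUse: boolean;        // publish in a journal / use data for research?
  followsNationalStatement?: boolean;
  hasHrecApproval?: boolean;
  hrecApprovalNumber?: string;
}

function validateEthics(a: EthicsAnswers): string | null {
  if (!a.plansAcademicUse) return null; // follow-up questions shown only on "Yes"
  if (!a.hasHrecApproval) {
    return (
      "You must obtain ethics approval from a registered HREC if you plan to " +
      "publish in an academic journal or use the data for academic research"
    );
  }
  if (!a.hrecApprovalNumber) return "Please enter the HREC approval number";
  return null; // no validation error
}
```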

(Review & publish tab is viewable but not editable until Execute tab is complete) 

  • Review Define, Plan and Design sections at once; with the option to edit specific sections
  • Save PLAN section as ‘Draft’ or ‘Publish’
  • If ‘Publish’ is selected, choose the visibility of the PLAN section from available options: public (anyone can view), private (only the user can view), unlisted (anyone with a link can view)
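
The three visibility options could map onto a simple access check like this (an illustrative sketch; names assumed):

```typescript
// Sketch: who may view a published PLAN (or REPORT) section.
type Visibility = "public" | "private" | "unlisted";

function canView(visibility: Visibility, isOwner: boolean, hasLink: boolean): boolean {
  switch (visibility) {
    case "public":   return true;               // anyone can view
    case "private":  return isOwner;            // only the user can view
    case "unlisted": return isOwner || hasLink; // anyone with a link can view
  }
}
```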

REPORT

The REPORT section will guide program owners through reporting the findings of their program evaluation. It will contain 2 tabs, as described below. The REPORT section is viewable but not editable until the PLAN section is published.

Program owners will be able to: 

  • Share options Choose from two types of ‘Share’ options: ‘Progress report’ or ‘Final report’. Multiple progress reports can be created over time. Only one final report can be created at the end of the program/evaluation. The following fields reflect the ‘Share’ option selected.
  • Executive summary (final report only) Summarise the program background, goals, evaluation design and findings: text entry field
  • Findings summary table (progress and final reports) Summarise the findings for each ‘Indicator’: text entry field

Note. The table will be auto-populated with the ‘Evaluation priority’, ‘Key evaluation question’ and ‘Indicators’ from the ‘Plan’ tab (a sketch of this auto-population follows the list below).

The summary table will be auto-populated with the relevant content from the Plan tab above to make it easy to report on each specific question and indicator
  • Evaluation detail (progress and final reports) Complete the text entry fields for the following items:
    • Outline the program outputs (deliverables) achieved by the program [so far/end date]. Do the achieved outputs align with those you intended and specified in the Plan stage? If yes, please describe how. If no, explain why.
    • Outline the program outcomes (changes) achieved by the program [so far/end date]. Do the achieved outcomes align with those you intended and specified in the Plan stage? If yes, please describe how. If no, explain why.
    • Has your program addressed the problem(s) you intended to address? If yes, describe how. If no, explain why.
    • What ongoing impact will the program have? Describe how your program will continue to achieve the intended outcomes/impact.
    • Did the program result in any unexpected benefits? If yes, explain what and how.
    • Did the program result in any unintended consequences? If yes, explain what and how.
  • Lessons learnt (progress report only) What lessons can you draw from the findings so far? What can be improved, and how?
  • Conclusions (final report only) What conclusions can you draw from the evaluation findings? What can we learn from them?
  • Recommendations (final report only) Make recommendations based on your evaluation. What should we do with the insights from the findings? Make your recommendations action-oriented and feasible. Arrange them in order of importance.
  • Supplementary materials (progress and final reports) Upload supplementary materials, if applicable (e.g., reports, presentations, deliverables)
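
As flagged in the note above, the auto-population of the findings summary table might work roughly as follows (a sketch; the names and the index-based pairing of fields are assumptions):

```typescript
// Sketch: seed one findings row per Indicator from the published Plan tab.
interface PlanFields {
  evaluationPriorities: string[];
  keyEvaluationQuestions: string[];
  indicators: string[];
}

interface FindingsRow {
  priority: string;
  question: string;
  indicator: string;
  findings: string; // text entry, completed by the program owner
}

function seedFindingsTable(plan: PlanFields): FindingsRow[] {
  return plan.indicators.map((indicator, i) => ({
    priority: plan.evaluationPriorities[i] ?? "",
    question: plan.keyEvaluationQuestions[i] ?? "",
    indicator,
    findings: "", // left blank for the user to summarise
  }));
}
```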

(Review & publish tab is viewable but not editable until Share tab is complete) 

  • Review Share section; with the option to edit specific sections
  • Save the REPORT section as ‘Draft’ or ‘Publish’
  • If ‘Publish’ is selected, choose the visibility of the REPORT section from available options: public (anyone can view), private (only the user can view), unlisted (anyone with a link can view)

At any point throughout the process, the program owner will be able to: 

  • Save program with two options: ‘Save and continue’ or ‘Save and exit’
  • Hover the cursor over the ‘question mark’ icon next to each field to get tooltip/help content taken from the Guide.
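
The tooltip behaviour could be as simple as a lookup from field to Guide text, as in the sketch below; the keys and help snippets are invented placeholders, not content from the Guide:

```typescript
// Sketch: tooltip help content per field, sourced from the National Evaluation Guide.
const helpText: Record<string, string> = {
  problems: "Describe the issue(s) your program addresses...",   // placeholder text
  goals: "State the short-, medium- and long-term outcomes...",  // placeholder text
};

function tooltipFor(fieldId: string): string {
  return helpText[fieldId] ?? "See the National Evaluation Guide for guidance.";
}
```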

Tell us what you think

Now that you have reviewed who the Portal users are and how it will be designed to serve their needs, we want your feedback. Click on the ‘Give your feedback’ button to tell us what you think.

CLOSED 31 October 2021