Collective Evaluation Pathway
Sometimes a different approach is effective: first recruit organizations that share the same goals and desired outcomes and that are using, or are interested in using, the same evaluation tool(s) or approach(es).
Example: Improving EE distance learning
In the midst of the pandemic in 2020, faculty at Virginia Tech and Clemson recruited 43 EE youth-serving organizations, all unexpectedly pivoting from in-person to online programming, to create a networked community with a culture of continuous improvement and learning. The organizations had varying levels of prior collaboration with one another, ranging from none to significant. All shared similar desired program outcomes and agreed to use a consistent outcome measure (a survey that assesses a broad range of EE outcomes: Environmental Education for the 21st Century, EE21) to evaluate their programs. This allowed the organizations to draw conclusions about their programs, identify potential areas for improvement, and share ideas about what works best both within and across organizations.
Learning cycles: The program included two learning cycles. In the first cycle, educators were coached and guided through evaluation and data collection processes. After collecting data on their programs, they received an evaluation report that summarized their results. The educators were then asked to reflect on their programs and identify areas for improvement based on the evaluation results. The second cycle was an opportunity to evaluate programs after these adjustments were implemented and to reflect as a community on what was learned, fostering collective learning. This model supports educator autonomy and decision-making, and takes advantage of the broader community of peers to develop new insights and innovations.
Other activities: The facilitators used a number of techniques to foster evidence-based learning across the network.
- In network meetings, participants could choose breakout rooms to network with specific organizations or people with whom they shared commonalities.
- Facilitators provided examples of, and led discussions about, programs that attained high outcomes.
- Facilitators chose specific topics for spring meetings based on member requests and interests (e.g., attracting participation in new online programs) and facilitated presentations and small-group discussions on those topics.
- One-on-one consultations with each organization were added to help interpret and reflect on data from the spring reports.
Extending to collective evaluation:
The EE21 team has a number of initiatives that extend its evaluative learning to the broader field of environmental education.
- Results from these networks were widely shared to inform the field about promising practices.
- Findings about the network itself, including what worked and what could be improved, will be shared widely so that other organizations and evaluation specialists can replicate the model or develop similar ones.
- The survey instrument (EE21) is provided here in its entirety.
- Examples of evidence-based practices for online environmental education programs, designed by participants in our network, can be found on this website.