Citation
Powell, R. B., Stern, M. J., Frensley, B. T., & Moore, D. (2019). Identifying and developing crosscutting environmental education outcomes for adolescents in the twenty-first century (EE21). Environmental Education Research, 25(9), 1281–1299. https://doi.org/10.1080/13504622.2019.1607259
Background
This tool was developed by Dr. Marc J. Stern, Dr. Robert B. Powell, and their team in collaboration with practitioners, organizations, and academics to identify and develop crosscutting outcomes for environmental education in the twenty-first century. The EE21 questionnaire measures outcomes that many environmental education programs seek to address and influence in youth participants. The team initially tested the tool at six STEM-related environmental education programs in the United States and has since used it with over 400 EE organizations.
Format
The EE21 surveys include two single-item questions and 10 scales, collectively measuring 12 outcomes. The questions ask youth to respond to a series of statements on a scale of 0 to 10.
Audience
The EE21 surveys are designed for program participants ages 11 and up.
When and how to use the tool
There are two ways to use these surveys.
- Post-experience-only retrospective survey. This survey should typically be administered immediately following a program and can be used to understand how the program affected each outcome area. The post-experience-only method is best used comparatively, that is, to compare outcomes across multiple programs. Pre-experience and post-experience surveys are more appropriate for evaluation purposes, but only for longer-duration programs.
- Pre-experience and post-experience surveys. These surveys are designed to be given to students before the start of the environmental education program and again at the end. They are not appropriate for programs that last only a few hours; rather, they are ideal for longer-term programs or overnight experiences.
Further guidance on how to use these surveys can be found here.
How to analyze
We recommend entering survey responses into a spreadsheet using a program such as Microsoft Excel. Create a spreadsheet with a column for each of the statements and a row for each individual. Assign each survey a record number, and enter each individual’s responses (ranging from 0 to 10) across the corresponding row. Enter a dot if the response was skipped.
Next, create an average score for each outcome area. The authors have provided a table (EE21 outcomes) that shows which survey items are combined to measure each key outcome area (such as self-efficacy or connection to place). For each individual, add up their responses to the set of questions in an outcome area and divide by the number of questions to get that individual's average score. Do not include skipped questions for which you entered a dot: if a student (a case) skipped a question that is part of a set used to measure a key outcome (for example, three questions are used to measure "connection to place"), leave the average for that case blank. Averages will be between 0 and 10. You can then average these individual scores across all respondents to summarize each outcome area for your program.
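If your data are also saved as a file (for example, a CSV export of the spreadsheet), the same averaging can be scripted. The sketch below is one possible approach rather than part of the authors' guidance; the file name responses.csv, the item column names, and the connection_to_place grouping are hypothetical placeholders, so substitute the actual items listed in the EE21 outcomes table.

    # A minimal sketch of the averaging step described above. It assumes a CSV
    # export with one row per participant, a RecordNumber column, one column per
    # survey item (scored 0-10), and a dot wherever a question was skipped.
    # The item names and the "connection_to_place" grouping are placeholders.
    import pandas as pd

    # Treat dots (skipped questions) as missing values when reading the data.
    df = pd.read_csv("responses.csv", na_values=["."])

    # Hypothetical example: three items that together measure one outcome area.
    outcome_items = {
        "connection_to_place": ["place_1", "place_2", "place_3"],
    }

    for outcome, items in outcome_items.items():
        # Average each participant's responses across the items in the set, but
        # leave the average blank if the participant skipped any of those items.
        df[outcome] = df[items].mean(axis=1).where(df[items].notna().all(axis=1))

    # Program-level summary: the mean of participants' averages (0-10 scale).
    print(df[list(outcome_items)].mean())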
When administering the pre-experience and post-experience surveys, you can conduct higher-level statistical analyses on your data to determine whether participants showed significant changes in the outcome areas after participating in the program.
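A paired t-test is one common choice for this kind of pre/post comparison (the authors do not prescribe a particular test). The sketch below assumes hypothetical files pre.csv and post.csv that share a RecordNumber column and already contain the per-participant outcome averages computed above.

    # A minimal sketch of a pre/post comparison using a paired t-test. The file
    # names, column names, and the "connection_to_place" outcome are hypothetical.
    import pandas as pd
    from scipy.stats import ttest_rel

    pre = pd.read_csv("pre.csv")
    post = pd.read_csv("post.csv")

    # Match each participant's pre- and post-program scores by record number.
    merged = pre.merge(post, on="RecordNumber", suffixes=("_pre", "_post"))

    # Drop participants missing either score, then test whether the average
    # change in scores is statistically significant.
    paired = merged[["connection_to_place_pre", "connection_to_place_post"]].dropna()
    t_stat, p_value = ttest_rel(paired["connection_to_place_pre"],
                                paired["connection_to_place_post"])
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

If your scores are far from normally distributed or your sample is very small, a Wilcoxon signed-rank test is a common nonparametric alternative.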
What to do next
Once you’ve administered your survey and analyzed the data, consider the following suggestions about what to do next:
- If initial results show that your audience has lower scores for certain outcomes than for others, you might want to determine whether and how your organization wants to improve programming for those outcomes.
- You could compare results across different groups of participants (for example, urban vs. rural youth, different grades, or different gender identities) to determine whether some groups score differently than others and whether your program is serving participants equitably. This could also provide justification for program revision or development, marketing, or funding proposals.
- If you used the EE21 to measure changes in outcomes by administering it to participants before and after programming, do you see a change in scores between the pre-test and post-test? Keep in mind that you may not see a change, particularly if your program is short in duration or is not designed to influence a particular outcome.
- Invite program staff or other partners to look over the data. Together you might consider:
  - What do these results tell us about our programming? Why do we think we got these results?
  - What results did we expect to get? Did these data support our goals?
  - If our results did not support our goals, can we brainstorm ways to adjust program content or delivery to better influence the specific outcome area? What changes should be made to existing programming, or how should new programs be designed?
  - Which stakeholders should we reach out to for a collaborative discussion of program design?
  - With which people or organizations can we share what we have learned?
How to see if this tool would work with your program
The EE21 can be used to measure the impact of an environmental education program across 12 outcomes (enjoyment, place connection, learning, interest/motivation, 21st Century Skills, meaning/self-identity, self-efficacy, environmental attitudes, environmental behaviors, school behaviors, cooperation, and behavior change). If you seek to measure these outcomes in your program participants, we recommend reviewing the EE21 Survey Guidance and EE21 Outcomes. Are the outcomes relevant and appropriate for your program? If so, review each statement to further determine whether it is relevant. Discuss with staff to decide whether the results will be useful to you, and pilot test the survey with youth ages 11 and older who represent your audience.
Tool Tips
Here is some additional information about the EE21:
- You may want to customize the survey. For example, you may wish to insert the word "field trip" in place of "experience" or "program," or include a somewhat different introduction. You might also consider adding an open-ended question or two. If you decide to add items, they should come at the end of the survey, not before.
- If you do not want to measure all of the outcomes in the EE21, you can pull out only those outcomes you wish to measure. Be sure to use all of the questions for each subscale you keep.
- The authors are also developing a shorter EE21 survey, which they plan to publish in the future.