Climate Change Attitude Survey

Outcome

Citation

Christensen, R., & Knezek, G. (2015). The survey: Measuring middle school student beliefs and intentions to enact positive environmental change. International Journal of Environmental and Science Education, 10(5), 773–788.

Background 

This tool was created to measure middle school students’ beliefs about climate change and their intentions to do something about it. Some of the items are drawn from the Wisconsin Center for Environmental Education’s survey, and others are adapted from published studies about climate change. The items were reviewed by middle school science teachers and tested with over 1,500 middle school students in 29 schools in 8 states in the U.S. at the beginning of the 2014 school year. The analysis of the instrument suggests it will measure attitudes and beliefs about climate change and students’ intentions to work toward resolutions. 

Format 

Survey. This survey consists of 18 statements to which respondents reply on a five-point scale, with 1 being “strongly disagree,” 3 being “undecided,” and 5 being “strongly agree.”

Audience 

Middle school youth, grades 5-8.

When and how to use the tool 

To measure attitude change due to programming, this survey should be implemented on the first day and on the last day of the program. It was originally tested with a year-long program, but has also been used with shorter programs. Survey results can also be used to compare subpopulations of youth, such as among genders, or between urban and rural dwellers. 

To use this survey, the program you are implementing must include activities that are intended to impact climate change attitudes, such as investigating local social and environmental impacts of climate change, or service-learning opportunities. 

How to analyze 

We recommend entering survey responses into a spreadsheet using a program such as Microsoft Excel. Create a spreadsheet with 18 columns for the 18 statements and a row for each individual. Assign each survey a unique identifier, and enter each individual’s responses across the corresponding row, using the numeric value of each response (1 for “strongly disagree” through 5 for “strongly agree”). Enter a dot if a response was skipped.

Before you calculate an average score for each individual across the scale, five questions need to be reverse coded. In the original tool, Part 1 question #9 and Part 2 questions #2, #3, #4, and #5 are negatively worded and must be reverse coded for accurate data analysis. For example, the tool includes the following statement: “Things I do have no effect on the quality of the environment.” In this case, a score of 1 (strongly disagree) implies the respondent thinks their actions can affect the environment. To align the data from this statement with the other questions in the tool, reverse code the responses (a response of 1 becomes a 5, 2 becomes a 4, and so on). 
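If you prefer to script the analysis rather than do it by hand in a spreadsheet, the reverse-coding step can be sketched as follows. This is a minimal illustration, not part of the original tool; the question labels (e.g. `"P1Q9"`) are hypothetical names for Part 1 question #9 and Part 2 questions #2–5.

```python
def reverse_code(score):
    """Flip a 1-5 Likert response: 1 <-> 5, 2 <-> 4, 3 stays 3."""
    return 6 - score

# Negatively worded items in the original tool (hypothetical labels):
# Part 1 #9 and Part 2 #2, #3, #4, #5.
NEGATIVE_ITEMS = {"P1Q9", "P2Q2", "P2Q3", "P2Q4", "P2Q5"}

def recode_response(question, score):
    """Reverse code a response only if its question is negatively worded."""
    return reverse_code(score) if question in NEGATIVE_ITEMS else score
```

For instance, a "strongly disagree" (1) on "Things I do have no effect on the quality of the environment" becomes a 5, matching the positive direction of the other items.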

Once all of the appropriate questions have been reverse coded, create an average score for each individual by adding all of their responses and dividing by the number of questions answered. Do not include skipped questions for which you entered a dot. You can also calculate averages for different subpopulations. Each average will fall between 1 and 5: a score of 1–2 indicates a low sense of efficacy to change the environment, a score of 3 is undecided, and a score of 4–5 indicates a high sense of efficacy to change the environment. 
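The averaging rule above (divide by the number of questions actually answered, excluding skipped items) can be sketched in code. This is an illustrative sketch, assuming skipped responses are represented as `None` (standing in for the dot in the spreadsheet).

```python
def individual_average(responses):
    """Mean of answered items on a 1-5 scale; skipped items (None) are excluded."""
    answered = [r for r in responses if r is not None]
    if not answered:
        return None  # no questions answered
    return sum(answered) / len(answered)

# Hypothetical row for one respondent: 18 responses, two skipped.
row = [4, 5, 3, None, 4, 4, 5, 3, 4, 4, None, 5, 4, 3, 4, 5, 4, 4]
score = individual_average(row)  # divides by 16, not 18
```

A subpopulation average is then simply the mean of these individual scores for the group of interest.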

What to do next 

Once you’ve administered your survey and analyzed the data, consider the following suggestions about what to do next: 

  • What do the data tell you? Is there a significant difference between the pre- and post-program scores? If the scores improve, you have evidence that your program is enhancing students’ beliefs and intentions to act on climate change issues. It would also be helpful to interview some students to find out how they think they could act, and whether they believe they have the skills to do so. 
  • You could compare populations to determine if members have a different attitude or intention than others. This could also provide justification for program development, marketing, or funding proposals.
  • Invite program staff or other partners to look over the data. Together you might also consider:
    • What do these results tell us about our programming? Why do we think we got these results?
    • What did we think we would see with respect to attitudes? And did these data support our goals?
    • If our results did not support our goals, can we brainstorm on areas within the programming or delivery to influence attitudes? What changes should be made to programming, or how should new programs be designed?
    • Who in our network should we reach out to for a collaborative discussion of program design?
    • Who or what organizations can we share our learning with?

How to see if this tool would work with your program 

Consider whether the terms “global climate change” will mean current, anthropogenic climate change to your students. If they understand this to mean changes to climate over eons, you won’t be able to interpret the results! Also consider whether students consider the “action of individuals” to mean collective action or individual action.

Tool Tips 

  • We recommend using this tool in its entirety. 
  • Allow ample program time for youth to complete the survey.