DEVISE: Self Efficacy for Environmental Action

Outcome

Citation

Porticella, N., Phillips, T., & Bonney, R. (2017). Self-Efficacy for Environmental Action (SEEA, Generic). Technical Brief Series. Ithaca, NY: Cornell Lab of Ornithology.


Background

The tool was originally developed for use with citizen scientists in Audubon's Great Backyard Bird Count and other citizen-science projects focused on water quality.

Format

Survey. This survey consists of 8 statements to which people respond on a five-point scale, with 1 being “strongly disagree”, 3 being “neutral”, and 5 being “strongly agree”.

Audience

Adult

When and how to use the tool

This tool can be used for a one-time measure of self-efficacy, or to measure change in self-efficacy by administering the survey before and after a program. The tool was tested in informal science learning environments.

How to analyze

The tool download (from the link provided above) includes detailed information on how to analyze these data.
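The downloadable guide is the authoritative source for scoring. Purely as a hypothetical illustration (the function name and scoring rule below are assumptions, not DEVISE's documented procedure), instruments of this kind are often scored by averaging a respondent's item responses into a single composite score:

```python
# Hypothetical scoring sketch -- consult the official DEVISE guide for the
# actual analysis instructions. Assumes each respondent answered all 8 items
# on the 1-5 scale described above.

def composite_score(responses):
    """Return the mean of the 8 item responses (1 = strongly disagree,
    5 = strongly agree)."""
    if len(responses) != 8 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected 8 responses, each between 1 and 5")
    return sum(responses) / len(responses)

# One respondent's (made-up) answers to the 8 statements:
respondent = [4, 5, 3, 4, 4, 2, 5, 4]
print(composite_score(respondent))  # 3.875
```

A composite score near 5 would indicate high self-reported efficacy; a score near 1, low efficacy.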

What to do next

Once you’ve administered your survey and analyzed the data, consider the following suggestions about what to do next: 

  • If a baseline measurement suggests that your audience has low efficacy scores, you might want to design a program that can help increase efficacy. 
  • You could compare populations to determine whether members have different efficacy scores than the general population, or whether one geographic area you serve differs from another. This could also provide justification for program development, marketing, or funding proposals.
  • If you used this survey to evaluate your program, do you see a change in scores between the pretest and posttest? Keep in mind that you may not see a change, particularly if your program is short in duration or is not designed to influence efficacy. 
  • Invite program staff or other partners to look over the data. Consider questions together, like:
      • What do these results tell us about our programming? Why do we think we got these results?
      • What did we think we would see with respect to efficacy? Did these data support our goals?
      • If our results did not support our goals, can we brainstorm areas within the programming or its delivery that could influence efficacy? What changes should be made to existing programming, or how should new programs be designed?
      • Who in our community should we reach out to for collaboratively discussing program design?
      • Who, or what organizations, can we share our learning with?
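If you collected matched pretest and posttest scores, one simple first look at change is the average per-person difference. The sketch below is illustrative only; the paired scores are made up, and your own analysis should follow the instructions included with the tool download:

```python
# Illustrative pretest/posttest comparison on matched composite scores.
# All scores below are fabricated for demonstration purposes.
pretest  = {"p1": 3.1, "p2": 2.5, "p3": 4.0, "p4": 3.4}
posttest = {"p1": 3.6, "p2": 3.2, "p3": 4.1, "p4": 3.3}

# Per-person change: positive values mean efficacy scores went up.
changes = [posttest[p] - pretest[p] for p in pretest]
mean_change = sum(changes) / len(changes)
print(round(mean_change, 2))  # 0.3
```

Note that a mean change near zero can hide variation: some participants may have gained while others declined, which is exactly the kind of pattern worth discussing with program staff.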

How to see if this tool would work with your program

To assess whether the tool is appropriate for your audience, please review the items carefully and pilot test the tool with a small group that represents your population. To pilot test, ask a small group of willing participants who are part of your target audience to talk to you as they complete the tool. What are they thinking when they read each item? What experiences come to mind when they respond? As long as their responses are what you expect and will give you relevant information, you are on the right track! If the answers vary widely from person to person when they should be more similar given participants' experiences, you may need to look at other tools. 

Tool tips

  • If using this survey to measure change in self-efficacy, you are unlikely to see major gains in posttest scores for individuals who already scored high on the pretest. 
  • If measuring change, you might also consider grouping learners who scored low on the pretest with those who scored high during program activities.
  • A custom version of the tool can be requested from DEVISE; if you want to make changes, contact the team to discuss them first.
  • This survey takes about 10 minutes to complete. Allow ample program time for participants to finish it.
  • This survey can be administered online, over the phone, or on paper.