DEVISE: Interest in Science and Nature (Adults)

Outcome

Citation

Phillips, T., Porticella, N., Bonney, R., and Grack-Nelson, A. (2015). Interest in Science and Nature Scale (Adult Version). Technical Brief Series. Cornell Lab of Ornithology, Ithaca, NY.

Background

This tool was developed by the evaluation research team at the Cornell Lab of Ornithology. Initial survey development was conducted with citizen science participants, and the survey was then pilot tested with participants from the Great Backyard Bird Count.

Format

This survey consists of 12 statements to which people respond on a five-point scale, with 1 being “strongly disagree” and 5 being “strongly agree.”

Audience

Adults, no specified age

When and how to use this tool

This tool can be used as a one-time measure of interest in science and nature, or to measure change in interest by administering the survey before and after a program.

How to analyze

The tool download (see the citation above) includes detailed information on how to analyze the data.
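The download contains the authoritative scoring instructions. As an illustration only, the sketch below shows one common way to score a Likert scale like this one: averaging a respondent's 12 item responses into a single composite score. This assumes all items are scored in the same direction; check the official instructions for any reverse-coded items before using this approach.

```python
# Illustrative scoring sketch (assumption: the composite score is the mean
# of the 12 item responses, each on the 1-5 scale; consult the official
# tool download for authoritative scoring, including any reverse-coded items).

def score_respondent(responses):
    """Return the mean of one respondent's 12 Likert responses (1-5)."""
    if len(responses) != 12:
        raise ValueError("Expected responses to all 12 items")
    if not all(1 <= r <= 5 for r in responses):
        raise ValueError("Responses must be on the 1-5 scale")
    return sum(responses) / len(responses)

# Hypothetical respondent who mostly agrees with the statements:
example = [4, 5, 4, 4, 3, 5, 4, 4, 5, 3, 4, 4]
print(round(score_respondent(example), 2))  # → 4.08
```

A single composite score per respondent also makes pre/post comparison straightforward, since each participant contributes one number per administration.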

What to do next

Once you’ve administered your survey and analyzed the data, consider the following suggestions about what to do next: 

  • If you used this scale once as a baseline measurement, you may want to consider those scores when designing programming. If participants scored low, you may want to find new ways to engage them in activities. 
  • Is there a particular item, out of the 12, on which participants scored notably higher or lower? This may be worth investigating when considering programmatic changes based on what participants are interested in. 
  • If you used this scale to evaluate your program, do you see a change in scores between the pretest and posttest? You might want to follow up with participants through interviews to find out which activities were most interesting and engaging to them.
  • Invite program staff or other partners to look over the data. Consider questions together, such as: 
      • What do these results tell us about our programming? Are participants engaged? Why do we think we got these results?
      • What interest scores did we expect to see, and do these data support our goals?
      • If the results did not support our goals, can we brainstorm changes to the programming or its delivery that might increase interest scores?
      • Who in our community should we reach out to for a collaborative discussion of program design?
      • Who, or what organizations, can we share our learning with?
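If you administered the survey before and after a program, the pre/post comparison mentioned above can start as simply as computing each participant's change in composite score. The sketch below uses hypothetical scores (the names and numbers are invented for illustration; the tool download covers the real analysis guidance).

```python
# Hedged sketch of a pre/post comparison using hypothetical composite
# scores (mean of the 12 items, on the 1-5 scale) for the same five
# participants before and after a program.
from statistics import mean

pre = [2.8, 3.1, 2.5, 3.4, 2.9]    # hypothetical pretest composites
post = [3.4, 3.3, 3.0, 3.6, 3.5]   # hypothetical posttest composites

# Per-participant change, pairing each pretest with the same person's posttest.
changes = [after - before for before, after in zip(pre, post)]
print(f"Mean change: {mean(changes):+.2f}")  # → Mean change: +0.42
```

A positive mean change suggests interest rose on average, but with small groups it is worth looking at individual changes too, and pairing the numbers with follow-up interviews as suggested above.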

How to see if this tool would work with your program

To assess whether the tool is appropriate for your audience, review the items carefully and pilot test the tool with a small group that represents your population. To pilot test, ask a few willing participants from your target audience to talk to you as they complete the tool. What are they thinking when they read each item? What experiences come to mind when they respond? If their interpretations match what you expect and your evaluation will yield relevant information, you are on the right track. If answers vary widely from person to person when, given their experiences, they should be similar, you may need to look at other tools. If the problems are minor and limited to changing a few words to make items simpler or more relevant, you can revise the language in the tool.

Tool tips

  • The creators suggest using the survey in its entirety.
  • The survey takes about 10 minutes to complete. Be sure to allow ample program time for participants to complete it.
  • The survey can be administered on paper or formatted for online completion using Google Forms or SurveyMonkey. 
  • The survey's creators suggest the tool is best used with non-traditional audiences that are less regularly exposed to science. Audiences such as citizen science participants most likely already have a high interest in science and nature.