Citation
Flagg, B. N., Porticella, N., Bonney, R., and Phillips, T. (2016). Interest in Science and Nature Scale (Youth Version). Copyright Twin Cities Public Television. Technical Brief Series. Cornell Lab of Ornithology, Ithaca NY.
Background
This tool was developed by evaluation researchers at the Cornell Lab of Ornithology. The scale was developed with input from in-person focus groups and teacher feedback. Small groups of students ages 8-24 also reviewed the scale, and young citizen scientists pilot tested it.
Format
This survey consists of 10 statements to which people respond on a five-point scale: 1 ("disagree strongly"), 2 ("disagree a little"), 3 ("unsure"), 4 ("agree a little"), and 5 ("agree strongly").
Audience
This scale is intended to be used with youth. It has mostly been used with elementary school-aged girls who participated in citizen science projects.
When and how to use the tool
This tool can be used for a one-time measure of interest in science and nature, or to measure change in interest by administering the survey before and after a program.
How to analyze
The tool download (from the link provided above) will include detailed information on how to analyze this data.
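The downloaded brief documents the official scoring procedure. As a rough illustration of the kind of analysis involved, the Python sketch below (with made-up responses) computes a mean interest score across the 10 items for each respondent, assuming all items are scored 1-5 in the same direction with no reverse-coded items.

```python
# Illustrative only: follow the scoring guidance in the downloaded brief.
# Assumes all 10 items are scored 1-5 in the same direction (no reverse-coding).

def interest_score(responses):
    """Mean of a respondent's 10 item ratings (1 = disagree strongly ... 5 = agree strongly)."""
    if len(responses) != 10:
        raise ValueError("Expected responses to all 10 items")
    if not all(1 <= r <= 5 for r in responses):
        raise ValueError("Each response must be on the 1-5 scale")
    return sum(responses) / len(responses)

# Hypothetical data: one row of 10 item ratings per respondent.
survey_data = [
    [5, 4, 4, 5, 3, 4, 5, 4, 3, 4],
    [2, 3, 2, 1, 3, 2, 2, 3, 1, 2],
]
scores = [interest_score(row) for row in survey_data]
print(scores)  # one overall interest score per respondent
```

A per-respondent mean keeps scores on the familiar 1-5 scale, which makes them easy to compare across participants or across administrations.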
What to do next
Once you’ve administered your survey and analyzed the data, consider the following suggestions about what to do next:
- If you used this scale once as a baseline measurement, consider those scores when designing programming. If participants scored low, you may want to find new ways to engage them in activities.
- Is there a particular item, out of the 10, on which participants scored notably higher or lower? This may be worth investigating when considering programmatic changes based on what participants are interested in.
- If you used this scale to evaluate your program, do you see a change in scores between the pretest and posttest? You might want to follow up with participant interviews to find out which activities were most interesting and engaging to them.
- Invite program staff or other partners to look over the data. Consider questions together, like:
- What do these results tell us about our programming? Are participants engaged? Why do we think we got these results?
- What level of engagement scores did we expect to get? Did these data support our goals?
- If our results did not support our goals, can we brainstorm areas within the programming or its delivery that might increase interest scores?
- Who in our community should we reach out to for collaboratively discussing program design?
- Who or what organizations can we share our learning with?
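If you administered the survey before and after the program, a simple starting point for the pretest/posttest comparison suggested above is to look at the change in each item's mean score. The Python sketch below uses invented item names and scores purely for illustration; a real evaluation would also consider sample size and statistical significance.

```python
# Hypothetical per-item mean scores (1-5 scale) from a pretest and posttest.
pre_means = {"item_1": 3.2, "item_2": 2.8, "item_3": 3.9}
post_means = {"item_1": 3.8, "item_2": 2.9, "item_3": 4.4}

# Per-item change, rounded for readability: positive = increased interest.
change = {item: round(post_means[item] - pre_means[item], 2) for item in pre_means}

# List items from largest gain to smallest.
for item, delta in sorted(change.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{item}: {delta:+.2f}")
```

Sorting items by change makes it easy to see which statements moved most, which can feed directly into the group discussion questions above.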
How to see if this tool would work with your program
To assess whether the tool is appropriate for your audience, review the items carefully and pilot test the tool with a small group that represents your population. To pilot test, ask a few willing participants from your target audience to talk to you as they complete the tool. What are they thinking when they read each item? What experiences come to mind when they respond? If their answers are what you expect and your evaluation will yield relevant information, you are on the right track! If the answers differ from person to person when, given participants' shared experiences, they should be similar, you may need to look at other tools. If the problems are minor and limited to changing a few words to make them simpler or more relevant, you could revise the language in the tool. For example, you could change "beach" to "pond" and "heron" to "squirrel," but you would likely change the intent of the item if you changed "beach" to "landfill" or "heron" to "rat."
Tool tips
- The survey should be used in its entirety.
- The survey takes about 10 minutes to complete. Be sure to allow ample program time for participants to complete it.
- The survey can be administered on paper or formatted for online completion using Google Forms or SurveyMonkey.