DEVISE: Motivation for Doing and Learning Science

Outcome

Motivation for doing and learning science

Citation

Porticella, N., Phillips, T., and Bonney, R. (2017). Motivation for Doing and Learning Science Scale (Generic). Technical Brief Series. Cornell Lab of Ornithology, Ithaca, NY.

Background

This tool was developed by evaluation researchers at the Cornell Lab of Ornithology through focus groups and expert reviews with adult citizen science participants.

Format

This survey consists of 16 statements to which people respond on a five-point scale, with 1 being “strongly disagree” and 5 being “strongly agree.” Alternatively, you can choose one of two 8-item scales: one measures motivation to learn and understand science, and the other measures motivation to engage in science activities. Each 8-item scale contains two subscales for calculating intrinsic and extrinsic motivation.
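
For readers who plan to score responses themselves, the sketch below illustrates one possible way to compute subscale scores as item means in Python. The item-to-subscale mapping, column names, and example responses are placeholders only; the official item assignments and scoring rules are provided in the tool download.

```python
# Illustrative scoring sketch; not the official DEVISE scoring procedure.
# Assumes responses are recorded as 1-5 and that the item-to-subscale
# mapping below is a placeholder to be replaced with the mapping from
# the tool download.
import pandas as pd

# Hypothetical wide-format responses: one row per participant,
# columns q1..q16 holding 1-5 Likert ratings.
responses = pd.DataFrame({f"q{i}": [4, 3, 5] for i in range(1, 17)})

# Placeholder subscale assignments -- replace with the official mapping.
subscales = {
    "learning_intrinsic": ["q1", "q2", "q3", "q4"],
    "learning_extrinsic": ["q5", "q6", "q7", "q8"],
    "doing_intrinsic":    ["q9", "q10", "q11", "q12"],
    "doing_extrinsic":    ["q13", "q14", "q15", "q16"],
}

# Score each subscale as the mean of its items for every participant.
scores = pd.DataFrame({
    name: responses[items].mean(axis=1) for name, items in subscales.items()
})

# Each 8-item scale score is the mean of its component items.
scores["learning_total"] = responses[
    subscales["learning_intrinsic"] + subscales["learning_extrinsic"]
].mean(axis=1)
scores["doing_total"] = responses[
    subscales["doing_intrinsic"] + subscales["doing_extrinsic"]
].mean(axis=1)

print(scores.round(2))
```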

Audience

Adults

When and how to use the tool

This tool can be used for a one-time measure of motivation for doing and learning science, or to measure change in motivation by administering the survey before and after a program.

How to analyze

The tool download (see the citation above) includes detailed information on how to analyze these data.
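
As one generic illustration of a pre/post comparison, the sketch below pairs each participant’s pretest and posttest subscale scores and runs a paired t-test in Python. The column names and values are hypothetical, and this is not a substitute for the analysis guidance in the tool download.

```python
# Illustrative pre/post comparison (generic; not from the tool download).
# Assumes each participant has a matched pre and post subscale score.
import pandas as pd
from scipy import stats

# Hypothetical matched scores for one subscale (e.g., intrinsic motivation
# to learn science), one row per participant.
data = pd.DataFrame({
    "pre":  [3.2, 2.8, 3.5, 4.0, 3.0, 3.6],
    "post": [3.6, 3.0, 3.4, 4.2, 3.5, 3.9],
})

# Paired t-test on the pre/post difference.
t_stat, p_value = stats.ttest_rel(data["post"], data["pre"])
mean_change = (data["post"] - data["pre"]).mean()

print(f"Mean change: {mean_change:.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```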

What to do next

Once you’ve administered your survey and analyzed the data, consider the following suggestions about what to do next: 

  • If you used this scale once as a baseline measurement, consider those scores when designing programming. If participants scored low, you may want to design a program to increase motivation.
  • If you used this scale to measure change in motivation, do you see a change in scores between the pretest and posttest? Keep in mind you may not see a change, particularly if your program is short in duration or is not designed to influence motivation.
  • Invite program staff or other partners to look over the data. Consider questions together, like:
      • What do these results tell us about our programming? Are we building motivation? Why do we think we got these results?
      • What scores did we expect to see with respect to motivation, and did these data support our goals?
      • If our results did not support our goals, can we brainstorm areas within the programming or delivery that might increase motivation scores?
      • Who in our community should we reach out to for a collaborative discussion of program design?
      • Who or what organizations can we share our learning with?

How to see if this tool would work with your program

To assess whether the tool is appropriate for your audience, review the items carefully and pilot test the tool with a small group that represents your population. To pilot test, ask a small group of willing participants from your target audience to talk to you as they complete the tool. What are they thinking when they read each item? What experiences come to mind when they respond? As long as this is what you expect and you will gain relevant information from your evaluation, you are on the right track. If the answers differ widely from person to person when they should be more similar given participants’ experiences, you may need to look at other tools. If the problems are minor and limited to changing a few words to make them simpler or more relevant, you can revise the language in the tool. For example, you could change “beach” to “pond” and “heron” to “squirrel,” but you are likely to change the intent of the item if you change “beach” to “landfill” or “heron” to “rat.”

Tool tips

  • The survey is intended to be used in its entirety.
  • The survey takes about 10 minutes to complete. Be sure to allow ample program time for participants to complete it.
  • The survey can be administered on paper or formatted for online completion using Google Forms or SurveyMonkey.