Empathy Toward Animals Scale

Outcome

Empathy toward animals

Citation

This tool was created as part of the Measuring Empathy: Collaborative Assessment Project, whose partners include Mary Jackson, Kathryn Owen, Kathayoon Khalil, and Jim Wharton. Learn more here: https://www.informalscience.org/measuring-empathy-collaborative-assessment-project 

Background 

This tool was developed with a team of conservation psychologists and evaluation practitioners who work in the zoo and aquarium field. Their goal in creating this tool was to help zoos and aquariums assess whether they were meeting their organization’s goals. 

Format

A survey with 11 items rated on a 5-point scale, where 1 is “strongly disagree”, 3 is “neither agree nor disagree”, and 5 is “strongly agree”. There is one additional multiple-choice item with three options, and two open-ended questions. 

Audience

Teen/tween (10-14 years old)

When and how to use the tool

To measure change due to the program, administer this survey on the first day and on the last day of the program. It is recommended for long-term programs lasting at least a week, such as a camp. The program must include activities intended to increase empathy, such as taking the perspective of an animal, closely observing an animal, or understanding the needs and challenges that animals face. 

How to analyze

We recommend entering survey responses into a spreadsheet using a program such as Microsoft Excel. Create a spreadsheet with 11 columns for the 11 statements and a row for each individual. Assign each survey a record number (this can be the month and day the participant was born, as recorded in the first question), and enter each individual’s responses (ranging from 1 for “strongly disagree” to 5 for “strongly agree”) across the corresponding row. Enter a dot if the response was skipped. Create three additional columns for the multiple-choice question and the two open-ended questions and record those responses as well, though they will not be included in the quantitative data analysis. 

Create an average score for each individual by adding all of their responses and dividing by the number of questions answered. Do not include skipped questions for which you entered a dot. The average will be between 1 and 5. Scores of 1–2 indicate low empathy for animals, a score of 3 indicates neither low nor high empathy for animals, and scores of 4–5 indicate a higher level of empathy for animals. Calculate an average for both the pretest and the posttest. 
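If you prefer to script this scoring step, here is a minimal sketch in Python using pandas. It assumes your spreadsheet has been exported to a CSV file with one row per participant, columns q1 through q11 for the scale items, a record-number column, and a dot for skipped items; the file name, column names, and the exact cut points used for the score bands are our assumptions for illustration, not part of the tool.

```python
import pandas as pd

# Hypothetical file and column names; "." marks a skipped item and is
# read in as a missing value.
items = [f"q{i}" for i in range(1, 12)]
pre = pd.read_csv("pretest_responses.csv", na_values=["."])

# Average only the items each person answered; missing values are
# excluded from the mean automatically.
pre["avg_score"] = pre[items].mean(axis=1, skipna=True)

# One possible reading of the score bands described above.
def band(score):
    if score <= 2:
        return "lower empathy"
    if score < 4:
        return "neither clearly low nor high"
    return "higher empathy"

pre["band"] = pre["avg_score"].apply(band)
print(pre[["record_id", "avg_score", "band"]].head())
```

Repeating the same steps on the posttest file gives the second set of averages referred to above.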

When you administer both pre-experience and post-experience surveys, you can conduct higher-level statistics on your data to understand whether participants showed significant changes in the outcome areas after participating in the program. Ideally, analysis of these data would use a paired t-test, so that outliers can be recognized and the significance of the program’s impact confirmed before making program changes. Other, more descriptive techniques, such as a simple bar chart, can be useful for discussion, but should not drive major program changes if the significance of the change has not been tested.
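For the paired t-test itself, a short sketch using scipy is below. It assumes you have already computed each participant’s pretest and posttest averages and saved them alongside the record number used to link the two surveys; the file names and column names are our assumptions.

```python
import pandas as pd
from scipy import stats

# Hypothetical files: one row per participant, with the record number
# and the average score computed as in the previous sketch.
pre = pd.read_csv("pretest_scores.csv")
post = pd.read_csv("posttest_scores.csv")

# Keep only participants who completed both surveys.
paired = pre.merge(post, on="record_id", suffixes=("_pre", "_post"))

result = stats.ttest_rel(paired["avg_score_pre"], paired["avg_score_post"])
print(f"n = {len(paired)}, t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
# A p-value below your chosen threshold (0.05 is common) suggests the
# pre-post change is unlikely to be due to chance alone.
```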

For more information on how to do data analysis, look at this part of the Processes page on this website. 

What to do next

Once you’ve administered your survey and analyzed the data, consider the following suggestions about what to do next: 

  • Look at the results: do you see a change in empathy scores between the pre-test and the post-test? Keep in mind that you may not see a change, particularly if your program is short in duration or is not designed to influence empathy toward animals. 
  • You could compare populations to determine whether members have a different empathy score than the general population, or whether one geographic area of your community differs from another. This comparison could also provide justification for program development, marketing, or funding proposals.
  • Invite program staff or other partners to look over the data. Together you might also consider:
    • What do these results tell us about our programming? Why do we think we got these results?
    • What did we think we would see with respect to empathy for animals? And did these data support our empathy goals?
    • If our results did not support our goals, can we brainstorm areas within the programming or its delivery where empathy toward animals could be increased? What changes should be made to the programming?
    • Which stakeholders should we reach out to so we can discuss program changes collaboratively?
    • Who or what organizations can we share our learning with?

How to see if this tool would work with your program

To assess whether the tool is appropriate for your audience, review the items carefully and pilot test the tool with a small group that represents your population. To pilot test, ask a small group of willing participants who are part of your target audience to talk to you as they complete the tool. What are they thinking when they read each item? What experiences come to mind when they respond? If this is what you expect and you will gain relevant information from your evaluation, you are on the right track. If the answers are different for each person when they should be more similar given their experiences, you may need to look at other tools. If the problems are minor and limited to changing a few words to make them simpler or more relevant, you could revise the language in the tool. For example, you could change “beach” to “pond” and “heron” to “squirrel,” but you are likely to change the intent of the item if you change “beach” to “landfill” or “heron” to “rat.”

In communities where hunting is the norm, for example, this scale is likely to be problematic. It may be most appropriate for urban youth in Western cultures; it did not work well for youth in rural India (Salazar et al., 2021).

Tool Tips

  • The survey should be used in its entirety.
  • The survey takes about 5 to 7 minutes to complete. Be sure to allow ample program time for participants to complete it.
  • This survey can be administered on paper or formatted for online completion using Google Forms or Survey Monkey. 
  • This survey works best with paired pre-post data, so having teens add an identifier (e.g., birthdate) to link pre and post surveys is helpful. Note, however, that in a group of just 23 people the odds that two share a birthdate already exceed 50 percent (the classic birthday problem). To avoid that problem, ask participants to use the last four digits of their phone number or some other more unique number, or code groups differently, so that the June 15 camp can be separated from the July 15 camp; see the sketch below for one way to build such an identifier.
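As a small follow-on to the last tip, here is one way to build a combined identifier in pandas so that the same four-digit code used in different camp sessions does not collide. The column names (“session” and “code”) and file names are hypothetical choices for this sketch.

```python
import pandas as pd

# Read codes as strings so leading zeros are not dropped.
pre = pd.read_csv("pretest_responses.csv", dtype={"code": str})
post = pd.read_csv("posttest_responses.csv", dtype={"code": str})

# Combine the camp/session label with each participant's code, e.g.
# "June15-0423", so identical codes from different camps stay distinct.
for df in (pre, post):
    df["record_id"] = df["session"].str.strip() + "-" + df["code"].str.strip()

paired = pre.merge(post, on="record_id", suffixes=("_pre", "_post"))
print(f"Matched {len(paired)} of {len(pre)} pre-surveys to a post-survey.")
```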