Evaluation Methods

Below you will find a chart explaining the different types of tools you might encounter in the Tool Library. Each method comes with advantages and challenges, along with other considerations for working toward evaluation that centers a CREE process.

Different Types of Evaluation Methods

Adapted from Ernst, Monroe, & Simmons (2012, pp. 32-33)

Method plus Definition | Overall Purpose | Advantages | Challenges | CREE Considerations | Tools
Concept maps

To gather information about someone’s understanding of and attitudes toward a complex subject or topic.

Can offer a more comprehensive and complex view of someone’s thinking than a test does.

Could be a better tool for visual learners or test-phobic people.

Can gather data on both understanding and attitudes.

Can be used as a form of embedded assessment.

Takes training to complete properly.

Takes training for the program leader to learn how to administer.

Can be challenging and time-consuming to score. Can be difficult to analyze and interpret.
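To make the scoring challenge above concrete, here is a small illustrative sketch of one common structural scoring approach for concept maps. The weights loosely follow the Novak & Gowin (1984) scheme (propositions x1, hierarchy levels x5, cross-links x10); the function name, inputs, and weights are assumptions for illustration, not part of this table.

```python
def score_concept_map(propositions, hierarchy_levels, cross_links):
    """Return a single structural score for one concept map.

    propositions     -- number of valid concept-link-concept statements
    hierarchy_levels -- depth of the map's hierarchy
    cross_links      -- valid links connecting different branches
    """
    # Weighted sum: cross-links count most because they show integrated thinking
    return propositions * 1 + hierarchy_levels * 5 + cross_links * 10

# Example: 12 valid propositions, 3 hierarchy levels, 2 cross-links
print(score_concept_map(12, 3, 2))  # -> 47
```

Even with a rubric like this, a trained scorer still has to judge which propositions and cross-links are valid, which is where most of the time goes.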

Drawings/Journal Prompts

To learn how participants make sense of a concept or idea, or can be used as a way to check for understanding.

Offers a different way for participants to express themselves.

Drawings can be hard to interpret; asking participants to label their drawings or explain them verbally may help.

Focus groups

To explore a topic in depth through group discussion, e.g. reactions to an experience or suggestion, or understanding common complaints. Useful in evaluation and marketing.


Can quickly and reliably produce collective impressions.

Can be an efficient way to gather range and depth of information.

Can be useful in groups of children or youth who may be more comfortable sharing in a group setting.

Provides quality control, in that participants tend to check each other and weed out false or extreme views.

Often highly enjoyable for participants.

A great way to get youth to share stories and to get comfortable with participants.

Incentives can be easily provided, such as a snack or meal.


Can be hard to analyze responses.

Needs a good facilitator for safety and closure. Managing attendees is essential so that the focus group is not dominated by one or two people.

Effective facilitators:

  • Demonstrate active listening techniques
  • Demonstrate positive regard for participants
  • Restrain from expressing personal views
  • Show curiosity and rigorously pursue answers to the questions

Difficult to schedule 6 to 8 people together.

Can’t go as deep in conversation with each person.

Constrained by limited time, so can ask fewer questions.

If using this approach, carefully consider dynamics in the group setting: are all participants comfortable sharing their experiences with the group and the focus group leaders?

Make sure the focus group is representative of your audience, and/or hold multiple focus groups.

Focus groups can help ensure that commonalities among participants are brought to the table and that differences are not marginalized.

Consider times to use focus groups other than at the end of a program.

Interviews


To understand someone’s impression or experience or learn more about their answers to questionnaires.

Provides better range and depth of information.

Promotes relationship with respondent.

Allows for follow-up questions.

Can be time consuming.

Can be hard to analyze and compare.

Can be costly.

Interviewers can bias responses.

Generalizations may be limited.

Pay attention to the language used in your interview questions and prompts. Try not to use jargon; some of the words used regularly in EE have different meanings, or no meaning at all, for some people. Using prompts can help let the respondent's voice shine.

Consider times to use interviews other than at the end of a program. For example, you can interview user groups and members in the development phase of a program to be sure their perspectives and goals are reflected in planning.

Need a representative group.

Value and respect the time interviewees are giving. 

An interview is a way to build rapport with participants. When interviewing, the interviewer can also share their own stories to make the interviewee comfortable and build a relationship with them. 

Use interview responses in program reports, and then follow up with the “numbers.”

Consider what compensation for respondents might be appropriate to acknowledge contribution. 

Consider “cultural guides” as interviewers, for example having youth interview other youth.

If relevant, consider having your tool be in the primary language of respondents, and/or be culturally relevant.

Observations


To gather more information about how a project actually operates, particularly about processes.

Allows for viewing of project operations as they are actually occurring. 

Allows for adaptation of events as they occur.

Observations can be used on staff as well; observe how they interact with participants and deliver programs.


Can be difficult to interpret behaviors.

Observations can be difficult to categorize.

Can influence participants' behaviors.

Can be expensive.

Evaluator needs to be trained on observation protocols.

Observation protocol, or guide, should be carefully constructed to be culturally relevant.

Photovoice


To use photographs taken by participants or researchers to guide and learn more about their experience.

This method can be woven into project-based curriculum.

Offers a different way for participants to express themselves.

Need equipment to take photos, such as phones or cameras.

Time intensive to conduct and analyze data.

Still and moving images provide identifiable data, even if people are shown from the back, so human subjects research protocols must be considered.

Questionnaires and surveys

To quickly and/or easily get a lot of information from people in a non-threatening way.

This is also a common way of tracking program metrics.

Can be completed anonymously.

Inexpensive to administer.

Easy to compare and analyze.

Can be administered to many people.

Can get lots of data.

Easy to create: many sample questionnaires already exist.

Might not get careful feedback.

Wording can bias audience responses.


Analysis may require sampling and statistical expertise.

Doesn’t get the full story.

Who responds can impact results and diverse voices can be lost if diverse respondents aren’t included in the original sample. 

If crafting your own survey, carefully consider the questions you are asking. Work with staff and participants to design questions. 

If using existing surveys or questionnaires, look closely at the language used. Will it work with your participants?

Be careful to not center dominant cultures and perspectives.

Sustainability Attitudes Scale, Empathy Toward Animals Scale, Connection to Nature Index
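To illustrate the claim above that questionnaire data are easy to compare and analyze, here is a minimal tally sketch. The question names, the 1-5 response scale, and the sample responses are invented for illustration.

```python
from collections import Counter
from statistics import mean

# Invented sample of Likert-style (1-5) survey responses
responses = [
    {"connection_to_nature": 4, "would_recommend": 5},
    {"connection_to_nature": 3, "would_recommend": 4},
    {"connection_to_nature": 5, "would_recommend": 5},
]

def summarize(responses):
    """Return the mean score and response counts for each question."""
    summary = {}
    for question in responses[0]:
        scores = [r[question] for r in responses]
        summary[question] = {
            "mean": round(mean(scores), 2),
            "counts": dict(Counter(scores)),
        }
    return summary

print(summarize(responses))
```

Even a simple summary like this supports the table's other caution: deciding whom to sample and whether differences are meaningful can still require statistical expertise.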
Tests of knowledge

To determine the audience’s current state of knowledge or skill regarding an issue. 

Helps identify a problem or a deficiency in knowledge or skills.

Results are easily quantified.

Individual performances can be easily compared.

Helps determine if the problem is a training issue.

Limited availability of validated tests for specific situations.

Results can be influenced by attitudes.

Language or vocabulary can be an issue.

People may be concerned about how results will be used.

Adults may resent taking tests.

Tests of knowledge are just one way of evaluating what a participant knows or how much they have learned. Some people may not be good test-takers. Consider using other methods in conjunction with tests to see how much participants have learned, such as the other methods listed in this table, or project-based learning approaches that allow participants to connect what they learned with their own lives.
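The table notes that test results are easily quantified and compared. A small sketch shows why: each test reduces to a percent-correct score, and pre/post scores subtract to a gain. The answer key and responses below are invented for illustration.

```python
def percent_correct(answers, key):
    """Score one test as the percentage of answers matching the answer key."""
    correct = sum(a == k for a, k in zip(answers, key))
    return 100 * correct / len(key)

# Invented 5-question multiple-choice key and one participant's answers
key = ["b", "c", "a", "d", "b"]
pre = ["b", "a", "a", "d", "c"]   # before the program: 3 of 5 correct
post = ["b", "c", "a", "d", "c"]  # after the program: 4 of 5 correct

gain = percent_correct(post, key) - percent_correct(pre, key)
print(gain)  # -> 20.0
```

The ease of computing such numbers is exactly why the CREE consideration above matters: a single score hides test anxiety, language barriers, and learning that the test never asked about.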