Process D. Collect & Analyze Information

Environmental education can benefit from learning from participants, community members, partners, and funders who bring a wide diversity of experiences, perspectives, understandings, and traditions. Methods of inquiry and analysis that reflect this diversity lead to relevant and meaningful evidence, and meaningful evaluation evidence includes a wide variety of information types. In eeVAL, we intentionally name evaluation evidence as information rather than data, opening the conversation to include numerical, narrative, visual, and many other forms of communication. Relevant and meaningful evidence increases the accuracy of the conclusions drawn about environmental education programs and the people who engage with them. In this way, eeVAL contributes to liberatory narratives and programs.

eeVAL Tips to Collect and Analyze Information

D1. Engage Program Participants in the Design of Information Collection
Engage program participants as partners in designing the methods of inquiry and modes of collection. Participants can help you understand which methods of inquiry would best capture their feedback and experiences, and they can offer insights on what they find most relevant to investigate. You might consider inviting a few participants who bring differing perspectives to serve on an evaluation advisory group that advises on the design and pilots approaches for ease and relevance.

D2. Consider (Re)purposing Information That Is Already Generated
Do not forget the easy information, such as registrations, applications, attendance records, and participant projects. As part of your evaluation work plan, remember the "simple" indicators, such as the number of participants at events or the bags of litter collected on river clean-up day. This information is easy to overlook, yet it may provide important evidence of success and may be impossible to collect later on.
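As one illustration, a few lines of Python can turn existing sign-in records into simple counts. This is a minimal sketch; the record format and event names here are hypothetical, not part of eeVAL.

from collections import Counter

# Hypothetical attendance records repurposed from event sign-in sheets;
# each entry is the event a participant attended.
attendance = ["river clean-up", "bird walk", "river clean-up",
              "seed swap", "river clean-up", "bird walk"]

# Simple indicators: total participation and per-event counts.
print("Total sign-ins:", len(attendance))
for event, count in Counter(attendance).most_common():
    print(f"{event}: {count} participants")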

D3. Design or Adapt Information Collection Methods That Honor Program, Context, and Partners
Before you begin to collect information, consider the site and location where collection will take place, and consider what types of engagement might work best for your participants’ cultures, histories, and traditions within the local context. For example, a program that takes place outdoors versus indoors, or in person versus virtually, might call for very different approaches. Inquiry methods designed for adolescents may not translate well for elementary-aged children or adult participants, and different funders value different types of evidence. Pay attention to the local cultures and traditions present in the program context, and consider multiple forms of inquiry, including printed, verbal, and visual modes.
 
D4. Collect Information That Values Multiple Truths
Create or adapt methods that collect information in multiple ways and reflect the many ways that people know, experience, and make sense of the world around them. For example, Indigenous peoples hold traditional ecological and social knowledge of the environment that may not align with Western frameworks. Consider using storytelling, interviews, focus groups, chronicles, memoirs, songs, drawings, photos, fabric art, and programming artifacts in place of, or in addition to, surveys, questionnaires, and polls. Because both quantitative and qualitative information have advantages and disadvantages, many evaluations rely on a mix of the two (an approach known as mixed methods evaluation).

D5. Select Evaluation Participants
Initially, you may be tempted to collect feedback from everyone who participates in your program. This may be an appropriate strategy if the program serves a very small audience. With a large group of participants, however, this approach strains limited resources such as time and money. Instead of collecting information from everyone in your population (i.e., all of your participants), you can learn just as much about your program by collecting information from a “sample” of participants.

There are no hard and fast rules for selecting an appropriate sample size. Instead, try to set a minimum sample size based on what you think is necessary to adequately capture your outcomes of interest, and anticipate that this number may change once you start to collect information. Examine critically whose voices are being captured or excluded in your sample; you may recognize that adjustments are needed to address sampling biases.
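One way to guard against smaller groups being drowned out is stratified sampling: drawing from each subgroup separately rather than from the whole pool at once. Below is a minimal Python sketch under assumed record and field names (participants, age_group); it is an illustration, not a prescribed eeVAL method.

import random

# Hypothetical participant records; in practice these might come from
# registrations or attendance lists (see tip D2).
participants = [
    {"name": "A", "age_group": "youth"},
    {"name": "B", "age_group": "youth"},
    {"name": "C", "age_group": "adult"},
    {"name": "D", "age_group": "elder"},
]

def stratified_sample(records, key, per_group, seed=1):
    """Draw up to per_group records from each subgroup so that
    smaller groups are represented alongside larger ones."""
    rng = random.Random(seed)
    groups = {}
    for record in records:
        groups.setdefault(record[key], []).append(record)
    sample = []
    for members in groups.values():
        sample.extend(rng.sample(members, min(per_group, len(members))))
    return sample

chosen = stratified_sample(participants, key="age_group", per_group=2)
print([p["name"] for p in chosen])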

D6. Be Thoughtful About Who Collects Information
It matters who does the asking. You can take steps to minimize response bias, which is the tendency for respondents to give false or inaccurate answers because they believe certain responses are socially accepted or desired by the evaluation team. For example, in a youth program, does it make sense for youth to play an active role in collecting information (e.g., helping decide on relevant methods or co-facilitating discussions) instead of only providing it? If you’re conducting a BIPOC focus group, does it make sense for a BIPOC person to lead that focus group? Depending on the type and purpose of the evaluation (e.g., summative program and process feedback), does it make sense to have someone other than the person who delivers the program collect feedback on the experience?

D7. Collect Demographics with Intention and Sensitivity
If you’re interested in understanding differences across subgroups, respectfully ask participants how they identify across dimensions of difference. Working with community members can support the design of culturally relevant and respectful demographic questions: for example, what language do local community members use to describe their racial and ethnic heritage, gender identity, ability status, and residence status? When collecting demographic information, consider not just who is engaged, but also who is not and why. Minimize the collection of demographic information that is not essential to your evaluation inquiry. Lastly, include a write-in option (we suggest labeling it “Additional” instead of “Other”).

D8. Analyze Information with a Critical Lens 
Because analysis is based on interpretation, it is susceptible to the implicit biases that all people hold. Analyzing information in partnership with program partners and participants helps you critically reflect on and question biases that might show up during the process. Bring a spirit of curiosity about how each individual is interpreting the information.

D9. Create an Analysis Plan That Includes Cultural and Contextual Factors
Create a plan for analysis that includes cultural and contextual factors, which can increase the accuracy and relevance of interpretations. For example, examine patterns across dimensions of difference that are meaningful to the program, such as race, ethnicity, education, family composition, ability, and more. This is referred to as “disaggregating” the information.
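To make disaggregation concrete, here is a minimal Python sketch under a hypothetical response format (subgroup labels paired with 1–5 ratings); a real analysis would use your own categories and outcomes of interest.

from collections import defaultdict
from statistics import mean

# Hypothetical survey responses: each pairs a participant's subgroup
# with a numeric outcome (e.g., a 1-5 confidence rating).
responses = [
    ("youth", 4), ("adult", 5), ("youth", 3),
    ("elder", 5), ("adult", 4), ("elder", 2),
]

# Disaggregate: group ratings by subgroup before summarizing,
# rather than reporting only a single overall average.
by_group = defaultdict(list)
for group, rating in responses:
    by_group[group].append(rating)

print("Overall mean:", round(mean(r for _, r in responses), 1))
for group, ratings in sorted(by_group.items()):
    print(f"{group}: n={len(ratings)}, mean={mean(ratings):.1f}")

Comparing the per-group means against the overall mean is what surfaces patterns that a single aggregate figure would hide.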

Consider how you can dig deeper into the information to add context to the analysis or explore alternative interpretations. Are there disparities between subgroups in access to program services, whether because of structural factors or other reasons? How does your analysis address cultural biases or stereotypes about participants, groups of people, and communities?

Key Resources

Consult resources that support the collection and analysis of information.

Explore the Values

We encourage you to investigate how each value is incorporated into the evaluation process.

Explore the Evaluation Process

Explore other elements of evaluation to drive excellence in your environmental education program design, build stronger and more equitable relationships, contribute to meaningful experiences, and yield equitable outcomes.