Element E. Collect Data

One of the most profound ways that evaluations influence the prevailing thinking in our field is through the data we measure. By design, evaluations elevate the importance of some data over others. As you explore what data to collect for your program, reflect with your partners, participants, community members, and funders on any biases you may hold toward particular types of information or evidence. The Equitable Evaluation Framework™ offers strategies and solutions for achieving equity through thoughtful choices about the data we collect and measure.

Existing and Easy-to-Collect Data

As you identify data to collect, look at what information you and your partners already collect to avoid gathering duplicate data. This data may come from program registrations, attendance records, participant projects, and more. You should also review any assessments already used in your programming that can address your evaluation questions and outcomes. As a rule of thumb, avoid asking participants to share the same basic information about themselves more than once.

Think about data that are easy to collect and that demonstrate your program’s reach, such as the number of events in a program or the bags of litter collected on a clean-up day. These simple metrics can be easy to forget when you focus on developing and administering other evaluation instruments.

Sampling

Initially, you may be tempted to collect feedback from everyone who participates in your program. This may be an appropriate strategy if the program serves a very small audience. With a large group of participants, however, this approach strains limited resources, such as time and money. Instead of collecting information from everyone in your population (i.e., all of your participants), you can learn just as much about your program by collecting information from a sample of participants.

Sampling is the process of collecting and analyzing data from a subset of a population for the purpose of generalizing the results to the larger population. It can be applied not only to a population of participants, but also to program artifacts (such as participant journals and student assessments), environmental quality indicators, and more. Sampling can also be performed for different units of analysis, such as school districts, schools, individual classrooms, or program sites.
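If your participant list is stored electronically, drawing a simple random sample takes only a few lines of code. The sketch below is a minimal Python example; the roster names and sample size are hypothetical placeholders, and in practice the roster might come from your registration records.

```python
# A minimal sketch, assuming the participant roster is available as a list.
# The names and sample size are hypothetical placeholders.
import random

roster = ["Ana", "Ben", "Chen", "Dana", "Eli", "Farah", "Gus", "Hana"]

# Draw a simple random sample of three participants, without replacement.
sample = random.sample(roster, k=3)
print(sample)
```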

Choosing a sample size depends upon the type of data you are collecting and how you will analyze it. 

  • Qualitative: Because qualitative efforts are typically not about generalizing results to a larger population, there are no hard-and-fast rules for selecting an appropriate sample size. Instead, set a minimum sample size based on what you think is necessary to adequately capture your outcomes of interest, and anticipate that this number may change once you start to collect data.
  • Quantitative: If you plan to use descriptive statistics (such as means, medians, and percentages) to analyze and report your data, choose a sample large enough to accurately represent your population. If you will be using inferential statistics to test the significance of differences, a larger sample size is required. Rules of thumb exist for choosing an appropriate sample size, including this one at My Environmental Education Evaluation Resource Assistant; a worked example of one common rule appears below.
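
For context, one widely used rule of thumb when estimating a proportion with a quantitative survey is Cochran's formula with a finite-population correction. The sketch below is a minimal Python illustration; every number in it is an assumption chosen for the example rather than a recommendation for your program.

```python
# A minimal sketch of one common rule of thumb (Cochran's formula) for sizing
# a sample when estimating a proportion. The population size, confidence
# level, and margin of error are illustrative assumptions.
import math

population = 400        # hypothetical number of program participants
z = 1.96                # z-score for a 95% confidence level
p = 0.5                 # assumed proportion; 0.5 gives the most conservative size
margin_of_error = 0.05  # plus or minus 5 percentage points

n_infinite = (z ** 2) * p * (1 - p) / margin_of_error ** 2
n_adjusted = n_infinite / (1 + (n_infinite - 1) / population)  # finite-population correction
print(math.ceil(n_adjusted))  # about 197 respondents for this hypothetical program
```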

Sampling Bias

Sampling biases are important to consider. A sample that consists only of individuals who volunteer to participate in the evaluation (referred to as a "convenience" sample) can be problematic. The self-selected group may differ from the rest of the program participants, making it difficult to know whether results are truly representative of the larger group. This problem can arise even when random sampling strategies are used, if a large percentage of those selected decline to participate in the evaluation. Keep in mind that the quality of your sample has a great deal of influence over whether evaluation results truly represent the experiences of all your participants.
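
One practical way to watch for this bias is to compare the make-up of your respondents against the full participant list. The sketch below is a minimal, hypothetical Python illustration; your groups might be program sites, sessions, or demographic categories you already track.

```python
# A minimal sketch of a representativeness check, comparing who responded
# to the full participant list. The group names and counts are hypothetical.
participants = {"Site A": 120, "Site B": 80, "Site C": 50}   # everyone served
respondents = {"Site A": 60, "Site B": 15, "Site C": 30}     # completed surveys

total_participants = sum(participants.values())
total_respondents = sum(respondents.values())

for group, count in participants.items():
    share_of_program = count / total_participants
    share_of_sample = respondents.get(group, 0) / total_respondents
    print(f"{group}: {share_of_program:.0%} of participants, "
          f"{share_of_sample:.0%} of respondents")
```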

Prepare Your Data for Analysis

As data are being recorded, and before analysis begins, review them for any errors or inconsistencies. If more than one person is collecting data, ensure each is recording it the same way. Review these Quick Tips To Make Sure Your Electronic Data Are Accurate. A coding sheet is useful for keeping track of the questions asked on the evaluation instrument(s) and how data were collected and recorded. You will want your data to be “clean” and well organized before you begin analysis.
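
If your data are stored electronically, a short script can help flag duplicate or out-of-range entries before analysis. The sketch below is a minimal, hypothetical Python example assuming responses recorded as participant IDs and ratings on a 1-to-5 scale.

```python
# A minimal sketch of a data-cleaning pass, assuming responses were recorded
# as (participant ID, rating on a 1-5 scale). The records are hypothetical.
records = [
    ("P001", 4),
    ("P002", 5),
    ("P002", 5),   # duplicate entry for the same participant
    ("P003", 7),   # rating falls outside the 1-5 scale
]

seen_ids = set()
clean, flagged = [], []
for participant_id, rating in records:
    if participant_id in seen_ids or not 1 <= rating <= 5:
        flagged.append((participant_id, rating))   # set aside for manual review
    else:
        seen_ids.add(participant_id)
        clean.append((participant_id, rating))

print("clean records:", clean)
print("flagged for review:", flagged)
```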


Tips to Embed CREE


E1. Focus on Relationships

Use the data collection process to demonstrate respect for and build trusting relationships with your participants and their community. Share the purpose of the evaluation and how it will benefit them. Emphasize the voluntary nature of their participation. Allow unhurried time for introductions, inquiry, and concluding your data collection efforts. Show appreciation for participants’ contributions to your evaluation efforts.

E2. Collect Data That Values Multiple Truths

Create or adapt tools that collect data in multiple ways and reflect the many ways that people experience and make sense of the world around them. Avoid simplistic approaches to measuring environmental knowledge, skills, attitudes, behaviors, and other outcomes. Consider using storytelling, chronicles, memoirs, drawings, or fabric art in place of or in addition to surveys, questionnaires, and polls.

E3. Examine Existing Data 

Collaborate with your program partners, participants, and communities to identify what information already exists to help you answer your evaluation questions. For publicly available demographic data, explore how accurate or appropriate this information is from the perspective of your participants or their community. 

E4. Identify Baseline Data

Determine what baseline data may be needed to measure change. Consult with program partners and participants to establish how data such as prior knowledge and experience will be collected and then measured, particularly if that knowledge might not reflect Western norms, such as traditional ecological and social knowledge of the environment held by many Indigenous people. 

E5. Collect Demographics with Intention and Sensitivity 

Avoid collecting demographic data that is not essential to your evaluation. If you are interested in understanding differences across subgroups, respectfully ask participants how they identify along dimensions of difference. For example, organizations that want to broaden the cultures and communities with which they work might want to collect demographic information in culturally sensitive ways to learn more about the audiences participating in their programs.

When collecting demographic information, consider not just who is engaged, but also who is not and why. Consult Track Program Activities for guidance on how to ask culturally relevant demographic questions in an ethical way.

E6. Watch for Sampling Bias

Examine critically whose experiences are being measured through your data collection samples. Review data with program partners and participants as it is being collected to question whether information from some participants is being over- or under-represented. Adjust your samples or data collection methods and tools to address biases.

Resources

Consult these resources that support Data Collection:

[Image: Children observe bugs]

Explore the Values

We encourage you to investigate how each value is incorporated into the evaluation process.

[Image: Bamboo forest]

Explore the Evaluation Process

Explore other elements of evaluation to drive excellence in your environmental education program design, build stronger and more equitable relationships, contribute to meaningful experiences, and yield equitable outcomes.