AWE Reflection Question
Two Roads Consulting. (2020). Outcomes Toolkit for AWE Centers.
This method was developed by Two Roads Consulting while working with the Alliance for Watershed Education of the Delaware River. Their goal was to create a non-survey tool that would gather information about participant attitudes.
The format can be either a journal prompt or an exit slip. Record responses by collecting participants' written responses or taking a picture of them, and then enter the responses into a spreadsheet so they can be analyzed by coding for themes.
When and how to use the tool
This tool can be used after a program to give participants a chance to reflect on the program's impact. Make sure your program includes activities that encourage participants to feel more attached to the place. This tool could be used for a brief one-time program or for an ongoing program.
How to analyze
A first step in preparing non-numerical (qualitative) data for analysis is coding, or organizing your data into themes. Brainstorm a list of likely themes with program staff, then determine whether each response fits those themes. Create a spreadsheet in a program such as Microsoft Excel or Google Sheets, with a separate column for each theme. Start with the likely themes and add new themes as necessary; if you add themes, be sure to reanalyze responses you have already coded to account for the new themes. Assign each participant's response (recorded verbal response or written response) an identifier and create a row for each response in the spreadsheet. Mark in each row (with a 1) which themes are present in that response. Then you can look for which themes came up most and least frequently. We recommend that at least two people code the responses, compare their interpretations, discuss differences as needed, and come to agreed-upon conclusions together.
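If your team is comfortable with a little scripting, the spreadsheet described above can also be built programmatically. The sketch below is only an illustration: the themes, response IDs, and hand-coded assignments are hypothetical examples, and in practice the coding itself is done by people reading each response.

```python
# Minimal sketch of the theme-coding matrix described above.
# The themes and coded responses here are hypothetical examples;
# a real team would code each response by hand first.
import csv

themes = ["connection to place", "learned something new", "wants to return"]

# Hand-coded results: response ID -> set of themes judged present.
coded = {
    "P01": {"connection to place", "wants to return"},
    "P02": {"learned something new"},
    "P03": {"connection to place", "learned something new"},
}

# One row per response, one column per theme: 1 if present, else 0.
rows = []
for response_id, present in sorted(coded.items()):
    rows.append([response_id] + [1 if t in present else 0 for t in themes])

# Tally how often each theme appears across all responses.
counts = {t: sum(row[i + 1] for row in rows) for i, t in enumerate(themes)}

# Save the matrix so it can be opened in Excel or Google Sheets.
with open("theme_matrix.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["response_id"] + themes)
    writer.writerows(rows)

print(counts)
```

The `counts` dictionary shows which themes came up most and least frequently, and the CSV file gives each coder an identical matrix to fill in independently before comparing interpretations.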
What to do next
Once you have analyzed the reflection question responses, consider the following suggestions for what to do next:
- You could compare populations to determine whether members respond differently than the general public, or whether one geographic area differs from another. You might consider why different populations had different connections to the place: was it past experience? Was it how programming went that day? Was it something else? This could also provide justification for program development, marketing, or funding proposals. You might also follow up with participants to learn more about their responses.
- Invite program staff or other partners to look over the data. Together you might also consider:
- What do these results tell us about our programming? Why do we think we got these results?
- What results did we think we would get? And did these data support our goals?
- If our results did not support our goals, can we brainstorm areas within the programming or delivery that we could change? What changes should be made to programming, or how should new programs be designed?
- Which stakeholders should we reach out to so we can collaboratively discuss program design?
- Who or what organizations can we share our learning with?
How to see if this tool would work with your program
Do your participants moan and groan when you hand them a survey? Can you tell they are enjoying themselves, but the survey responses fall short? Qualitative tools like this offer a way to creatively and authentically capture youth experience.
Short on time? You can easily add this evaluation method as a short activity in your program. This tool can serve as a way for participants to reflect on their experience. Discussing responses as a group afterwards can aid the experiential learning process and let participants share and connect with each other.
- Allow ample time for participants to complete this activity.
- You can alter the question prompt for this method to better fit your program. For example, is your program in the mountains or the desert? Change the wording to better fit the place where participants spent time.