Mad Evaluation: Strategies and lessons learned in the daunting world of evaluation for youth-based programs
Abstract
Because of the diverse nature of the field, there is no “correct” or “standard” approach to citizen science evaluation. However, the recent development of evaluation toolkits is helping practitioners get a handle on how best to design, implement, collect, and assess data. These toolkits may advance the field considerably, making possible the collection of large volumes of quantitative data and ultimately increasing our understanding of long-term impacts on participants.
However, a challenge remains for many citizen science practitioners, who often work in the non-profit realm. Organizations with very few resources often find evaluation operationally challenging and daunting. Our goal is to share both the absurd and rational perspectives of one youth-based citizen science program, LiMPETS, in overcoming the challenges of conducting evaluation in a resource-limited environment. We will describe the challenges we have experienced in developing goals and an evaluation framework for our citizen science work, which operational approaches work best with a small budget and few staff, and what we think can be done to make evaluation more effective for the LiMPETS program and the broader field.
Authors
Amy Dean
(Farallones Marine Sanctuary Association)
Topic Area
Research/Evaluation of CitSci Experience
Session
3A » Speed Talks - Across Conference Themes (14:40 - Wednesday, 11th February, LL20A)