All of our projects include extensive evaluation planning done in collaboration with our partners.
This planning typically includes:
- Developing a program logic model or theory of change
- Brainstorming, prioritizing and selecting evaluation questions
- Identifying and selecting appropriate indicators and metrics
- Proposing and defining evaluation methods
- Developing a final evaluation plan that includes questions, indicators and metrics, methods and an analytic plan
- Developing a learning, communication and dissemination plan
For many of our projects, we serve as the cross-site evaluator. This means a funder contracts with us to conduct an evaluation across the funder’s grantees. Grantees may be funded to implement similar programs, or each may implement one of several different programs. In all cases, we work with the funder to do comprehensive evaluation planning.
A unique feature of our cross-site evaluations is the development of a core dataset. Through a collaborative process, the PiER Center works with the funder to identify a common core dataset that grantees are asked to collect. This dataset is critical to the cross-site evaluation: it allows us to assess progress across grantees and within subgroups of grantees and, if applicable, to compare effectiveness across strategies.
An additional unique feature of our cross-site evaluations is regular progress reporting, or monitoring, by grantees. This reporting is designed to answer the evaluation questions prioritized by the funder by asking grantees to report regularly (e.g., quarterly or semi-annually) on the core dataset.
Process & outcome evaluation
The PiER Center has expertise conducting formative, developmental, process, outcome and impact evaluations.
The formative, developmental and process evaluations are ongoing throughout the life of a project and include regular opportunities for our partners to learn from the evaluation.
The outcome evaluation is implemented at strategic time points throughout the project (for example, at baseline, midpoint and completion of a program). We almost always use a mixed-methods design, incorporating and triangulating qualitative and quantitative data to answer prioritized evaluation questions.
Evaluation Technical Assistance
Having a solid evaluation plan from the beginning is essential for programs to be successful and sustainable. A goal of our evaluation projects is to enhance partners’ evaluation capacity and skills by providing technical assistance and training. To achieve this goal, the PiER Center reviews the evaluation objectives, strategies and evaluation plans of all partners and provides suggestions for improving data collection, evaluation methods and reporting.
Additionally, we can provide training opportunities periodically that correspond to where partners are in their own evaluation process. We often start our technical assistance with an evaluation needs assessment to identify current evaluation skills among partners. Based on the results of the needs assessment, we develop a training and learning plan that incorporates individual and group learning delivered virtually or in person.
Web-based Progress Reporting
The PiER Center typically uses a web-based progress reporting or monitoring system. We have experience developing custom systems and building reporting into existing grants management software. All of our web-based systems are secure: users can see only their own data or data assigned to them. We typically collect the core dataset through this system. When feasible, the system includes reporting options so partners can access their own data and funders of a cross-site evaluation can access data across grantees. A web-based system also lets us build parameters into fields and interface elements that make accurate data entry more likely. Our systems can require fields to be completed before a user moves on to the next section, and they can validate entered data against set parameters. For example, an age field can be required to fall between 18 and 110; any value outside this range produces an error. We can also build in branching logic so that sections are presented or skipped depending on the needs of the user. These features provide assurance that the data are valid and usable.
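The validation rules described above can be illustrated with a minimal sketch. This is not the PiER Center's actual system; the field names (`participant_id`, `age`, `attended_program`, `sessions_attended`) are hypothetical, chosen only to show required fields, range checks and branching logic in simple Python:

```python
def validate_entry(entry):
    """Return a list of error messages; an empty list means the entry is valid.

    Illustrative only -- field names and rules are hypothetical examples of
    the kinds of checks a web-based reporting system can enforce.
    """
    errors = []

    # Required field: the entry cannot be submitted without it.
    if not entry.get("participant_id"):
        errors.append("participant_id is required")

    # Range validation: e.g., age must fall between 18 and 110.
    age = entry.get("age")
    if age is None or not (18 <= age <= 110):
        errors.append("age must be between 18 and 110")

    # Branching logic: the follow-up question applies only when a
    # prior answer makes it relevant; otherwise it is skipped entirely.
    if entry.get("attended_program"):
        if entry.get("sessions_attended") is None:
            errors.append("sessions_attended is required when attended_program is true")

    return errors
```

In a real web form these same rules would run as the user moves between sections, blocking submission until every error is resolved, which is what keeps the core dataset clean across grantees.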
Quantitative and Qualitative Data Collection
Most of our evaluation projects involve a mixed-methods approach, incorporating and triangulating qualitative and quantitative data to answer prioritized evaluation questions. The evaluation design, approach and methods are finalized in collaboration with our partners. Our data collection methods include:
- Survey design and implementation (paper, electronic or in person)
- Web-based progress reporting
- Social network analysis
- Direct observation techniques (e.g., SOPARC: System for Observing Play and Recreation in Communities)
- Sustainability assessments
- Geographic mapping using Geographic Information Systems (GIS)
- Population dose
- Built environment audits (virtual or in person)
- Key informant interviews
- Focus groups
- Concept Mapping
- Policy assessments
Dissemination and Learning
The PiER Center has a core philosophy that dissemination and learning are critical aspects of all evaluation projects. Programs are more effective and sustainable when they are part of an efficient feedback loop in which evidence-based evaluation results are shared widely, discussed frequently and used by programs. To accomplish this, we take a multi-modal approach to dissemination and learning.
Our products provide clear visuals of data to show progress and results throughout a project. They include:
- Brief reports
- Oral presentations
- Publications in academic journals
- Emergent learning
- White papers
- Case studies