Conducting an Evaluation Audit as a Quality Improvement Approach for Non-profits – A Canadian Case Study

This research examines the implementation and effectiveness of an evaluation audit as a quality improvement approach for non-profit organizations, using Access Alliance as a Canadian case study. The study assessed how well current evaluation practices at Access Alliance align with evaluation policy standards and practices.
The research demonstrates that evaluation audits are both feasible and valuable for non-profits, but they require an investment of time and staff buy-in.
What do you need to know?
The non-profit sector plays a vital role in Canada’s social fabric and economy. According to 2005 data, Canada’s more than 85,000 charitable organizations employed over 2 million workers and contributed approximately $76 billion to GDP.
Non-profits operate in an environment of increased accountability requirements. Most funders require program evaluation to ensure project goals are met. Community members want to know that service providers in their community are delivering the programs/services they were meant to deliver. Evidence-informed planning and evaluation in community health centres (CHCs) remains a developing area. Organizations often face limitations in capacity, resources, and budget for comprehensive evaluations.
Key questions asked in this research included:
- How does the organization ensure the quality of its evaluation practices?
- What facilitators influence the quality of evaluation?
- How are traditional evaluator roles being engaged in developing new approaches (e.g., appreciative inquiry, art-based methods) to conducting evaluation in non-profits?
What did the researchers find?
Having a program-specific logic model, ensuring access to quality data, and training staff members on evaluation, data quality, and data policy are foundational to improving the process and utility of evaluation. The researchers recommend that organizations implement a systematic evaluation audit approach to improve the quality of all programs and services, leverage resources, and support evidence-informed practice and service delivery.
Programs using a logic model (a planning tool that maps out how a program works) were much more likely to have been properly evaluated. If a program had a logic model, there was an 80% chance it had gone through a formal evaluation process using proper organizational tools and methods. When supportive leadership, organizational culture, and data quality combine, staff are receptive, curious, and engaged partners and drivers of successful evaluation practices.
Data Quality
- Access to high-quality, credible, valid data is essential to develop useful evidence from evaluation activity, to inform service provider practices, to support strategic program design, and to foster an organizational culture of capacity building and quality improvement.
- Data collection and management present operational challenges and are resource intensive. In organizations with multiple locations and no centralized data storage, data access and retrieval can be difficult.
- Staff training, clear data protocols, and technology decisions made with data accessibility in mind are critical to addressing these challenges.
- Clearly communicating the potential applications of the data can help staff not only understand the importance of data quality but also promote good practices in data collection.
Building an Evaluation Culture
- Evaluation needs to be embedded in the organizational fabric through formal policies and frameworks. This includes an organization-wide evaluation framework and policies, a program logic model template, evaluation tools, and internal evaluation capacity.
- Planning and evaluation should be actively promoted and discussed consistently across all teams/programs as key strategic activities that strengthen program and service planning and delivery processes. An evaluation-to-practice (E2P) conversation should be a standing agenda item at all team meetings to identify, discuss, and operationalize the planning implications of evaluation activities.
- To promote and enhance the uptake of systematic evaluation practices, organizations can create effective staff-focused communication strategies covering evaluation pathways, processes, consequences, and end-users. They should also offer capacity-building training sessions based on identified training needs.
Leadership Impact
- Organizational leadership is key to building effective data practices and an organizational evaluation culture. Senior management should nurture a culture of evaluation in the organization.
- Integration and cohesion in vision and activities between an organization’s internal evaluation team and senior management are crucial.
- It is essential for leadership to respond and build capacity when staff show interest and engagement in implementing effective evaluation practices.
What did the researchers do?
Researchers used a mixed-methods explanatory sequential approach. Quantitative data were collected via a survey covering 26 departments and programs and a mix of staff, managers, and leadership stakeholders. The survey asked key questions such as whether each program had a program-specific logic model and whether it had been evaluated during the period 2012-2015. Reports from 21 of the 26 programs were included in the analysis, and 16 key stakeholder interviews were conducted.
How can you use this research?
This research has practical applications at multiple levels:
For Non-profit Organizations
- Implement evaluation audits as quality improvement tools.
- Develop comprehensive evaluation frameworks.
- Strengthen accountability processes.
For Program Managers
- Use logic models to improve evaluation outcomes.
- Integrate evaluation into regular program planning.
- Focus on data quality management.
For Policy Makers
- Support capacity building for evaluation in non-profits.
- Develop sector-wide evaluation standards.
- Promote evidence-informed practice.
Future research recommendations include:
- Scaling up evaluation audit tools for broader sector use.
- Developing standardized evaluation protocols.
- Studying long-term impacts of evaluation audits on program quality.
Study authors and journal/book name
Authors: AKM Alamgir, Miranda Saroli, Axelle Janczur, Sonja Nerad
Journal for Worldwide Holistic Sustainable Development, 2019, Volume 5, Issue 1
Related Access Alliance Activities
Building Capacity for Equity-Informed Planning and Evaluation
This project focused on building organizational level knowledge, commitment and capacity to routinely use a health equity framework and evidence geared at overcoming systemic inequities in healthcare access, healthcare quality and health outcomes.
Annual Client Activity Report 2023-2024
A snapshot of Access Alliance clients’ demographic attributes, service needs, and program interactions over the past fiscal year, to ensure accountability, to support evidence-informed organizational planning, and to improve quality.
Annual Planning and Evaluation Report 2023-2024
The goal of this report is to facilitate a learning framework for building and sustaining a healthy, evidence-informed, and evaluation-focused organization.
Understanding the Experiences of Patients Accessing our Primary Care Services
Our Client Experience Survey provides a glimpse into patient experiences with Access Alliance’s primary care services. Each year we collect patient feedback to ensure service accountability, quality improvement, and evidence-informed practices. Patients are asked to rate their service experience.
Community Based Research Training for Peer Researchers
Our training introduces peer researchers (researchers in the community with lived experience of, or who work with, the population of interest) to ethical CBR practice. The training provides an overview of the research process: conceptualizing and designing a research protocol, collecting sensitive data or data from sensitive populations, the basics of analyzing data, interpreting the results in an accessible format, and knowledge mobilization. Trainees gain a working knowledge base of the research process.