State of Skills Report
Evaluation and Learning in the Skills and Training Ecosystem

Key Insights
The Future Skills Centre’s approach to evaluation and learning has evolved over time, reflecting the challenges of measuring social impact. It has shifted from a focus on common outcomes to a broader approach that embraces multiple methods for understanding impacts on individuals, institutions and systems.
Canada’s skills development ecosystem must encourage experimentation and learning from failures to identify effective, agile, and responsive approaches to skills development – expanding benefits across more people, sectors and regions.
To fully harness Canada’s workforce potential, the skills and training ecosystem requires greater practical support in implementing inclusive practices, not only in program design and delivery but also in evaluation and learning.
The Issue
Canada’s economy is evolving rapidly because of technological, demographic, environmental and geopolitical change. These shifts are reshaping the nature of work — creating new opportunities while introducing new challenges. The Future Skills Centre (FSC) was created in 2019 to accelerate innovation in skills development and support Canada’s workers, employers, and economy by generating insights, solutions, and systems changes that will allow our labour market to adapt and thrive into the future.
We centre our work on five focus areas that represent important and pressing skills and labour market development challenges now and into the future.
- Pathways to jobs: Technology and net-zero transitions mean increased labour market volatility and more job displacement. Now more than ever, workers need to navigate career pathways and transition between jobs and sectors as smoothly as possible.
- Technology and automation: Emerging technologies, including AI, are creating uncertainty about jobs, their skills composition and their impact on sectors and regions. Ensuring workers have the right skills to integrate and adapt to new technologies quickly and flexibly is increasingly important to keep pace with change.
- SME adaptability: Small and medium-sized enterprises (SMEs) make up a large part of the Canadian economy but often struggle with lower productivity, slow technology adoption, and underinvestment in training. Solutions that meet SMEs’ short-term skills needs and lower time and cost barriers have the best chance of driving greater investment in reskilling, upskilling and employee development.
- Inclusive economy: Some groups are disproportionately impacted by rapidly changing economic conditions and are excluded from employment in high-growth sectors. Skills-based strategies that promote labour market inclusion are essential for ensuring equal opportunities for all workers in Canada.
- Sustainable jobs: Impacts of the net-zero transition will vary from region to region, driving economic growth and job opportunities in some regions and unemployment, loss of income and out-migration in others. Place- and sector-based approaches are needed to coordinate workforce development strategies across a range of actors.
These priorities shape FSC’s partnerships, projects, research, evaluation and knowledge mobilization efforts.
Our Approach
FSC was set up to drive new ideas and to rigorously evaluate and learn from pilots, generating critical insights into what works and what doesn’t, for whom, and under what conditions. These insights will drive uptake and action – scaling impact through both policy and programs – and help future-proof the Canadian labour market.
In pursuit of this, we use a variety of comprehensive evaluation approaches that foster insights at multiple levels. These include:
- Clearly defining outcomes and corresponding measurement strategies for individuals, institutions and broader systems.
- Collaborating with and supporting partners – including training organizations, non-profits, postsecondary institutions, sector councils, workforce development agencies, municipal agencies and others – to invest in data collection, analysis, and intentional learning and improvement, strengthening program implementation and building a better understanding of both outcomes and impact.
- Supporting, as a pan-Canadian initiative, ideas and innovations from across Canada to generate a robust evidence base for decision makers across the skills ecosystem.
What We Investigated
The Future Skills Centre has evolved its approach to evaluation and learning.
Our learning journey: One-size-fits-all
In the beginning, our evaluation strategy was based on a common outcomes framework that established standardized metrics to assess and track outcomes across funded projects. Informed by a review of domestic and international employment-related best practices, this approach captured socio-demographic characteristics and measured key indicators such as employment outcomes, earnings, skills acquisition, program completion, enrollment in further education and job satisfaction, among others. Underpinning the framework was an assumption that innovation pilot programs would primarily focus on service delivery for individuals (namely, training). The common outcomes framework worked well for projects delivering direct training and those with strong data collection capabilities. Additionally, FSC provided intensive support to many of these projects through our consortium partner, Blueprint.
However, as FSC’s portfolio expanded to fund projects that weren’t necessarily delivering services to individuals – some were early-stage (pre-pilot) investments, while others targeted employer practices or policy change – so did the need for more flexible evaluation methods. Many interventions involved activities such as curriculum development, building skills frameworks, fostering employer relationships, and other complex mechanisms for influencing skills systems – areas where individual-level impact could take years to materialize. A diverse project portfolio was the right approach in an increasingly complex system, but it did require us to diversify our evaluation toolbox to include a wider variety of approaches.
From the outset, we have worked closely with project partners to generate meaningful evidence. Consortium members, including Blueprint and the Diversity Institute, held key responsibilities for supporting specific projects: gathering the requisite data, supporting program model development and iteration, conducting needs assessments, and formalizing processes. Blueprint, for example, has applied an integrated evidence approach to help training solutions advance along the innovation cycle, such as with the scaling portfolio, which includes Facilitating Access to Skilled Talent, and randomized controlled trials with the ADaPT, In Motion & Momentum+ and NPower Canada projects. The Diversity Institute has similarly supported partner-led evaluations and learning, including the Black African and Caribbean Entrepreneurship Leadership Program and the Capital Skills program.
Our learning journey: A desire for reflective evaluation
Many organizations in the skills ecosystem are eager to measure and learn. Some FSC-supported projects, particularly those led by postsecondary institutions, incorporated evaluation into their project plans, often hiring in-house staff or third-party evaluators. These projects designed and implemented their own evaluation plans alongside project implementation activities. A variety of evaluative approaches and methods were used, including process evaluation, outcome evaluation, document review, interviews, focus groups, surveys and analysis of administrative data. Examples of projects that initiated their own evaluation plans included the Trucking Human Resources Sector Council Atlantic and Food Processing Skills Canada. We reviewed the final evaluation reports from these projects, providing feedback to partners and using the findings to inform knowledge mobilization activities.
While many of these efforts produced valuable insights, challenges arose, including positivity bias and a reluctance to share critical findings. In a competitive funding environment, many organizations were averse to sharing unfavourable reflections on project design, implementation, or uptake with FSC, let alone more publicly. The quality of partner-managed evaluations also varied considerably, ranging from thorough, excellent evaluative assessments to superficial documents that revealed little about what actually occurred or was learned in the project.
Our learning journey: Expanding third-party evaluation
To enhance evaluation consistency, we have taken proactive steps to bring in external evaluators who work one-on-one with project partners to drive success. One example is our work with the British Columbia Institute of Technology (BCIT) on a project aimed at improving outcomes for newcomers and international students by building their confidence with language skills. Evaluation partner Goss Gilroy, hired by FSC, collaborated with the BCIT team to address both process and outcome questions, ultimately helping to refine implementation strategies.
In recent years, we have worked with a range of evaluation organizations – including Social Research and Demonstration Corporation (SRDC), Malatest, Goss Gilroy and Johnston Research, as well as our founding partners, Blueprint and the Diversity Institute – to conduct independent project evaluations. We plan to expand this network of evaluation partners in the coming year.
Indigenous-led evaluation
Recognizing the need for culturally relevant assessment, a group of Indigenous-led projects was paired with an Indigenous evaluator, Johnston Research, commissioned by FSC. Johnston Research developed an Indigenous evaluation framework focused on spirit, relationships and process. This approach ensured that Indigenous-led projects were assessed in ways that aligned with community values and priorities. Johnston Research produced final reports on the projects, which were published alongside other project deliverables on the FSC website.
Our learning journey: Strengthening learning processes
As our experience in innovation and scaling projects grows, we have adapted our approach to include key data collection and engagement processes that more effectively support overall learning and evaluation objectives. These processes include:
Learning and reflection forms. Since 2022, FSC has asked all project partners to complete a learning and reflection form at the end of their project, providing insights on key learning questions in addition to the reports authored by evaluators. The form asks project partners to provide written responses to the following questions:
- What were the issues and needs that your project was trying to address?
- What was the project’s approach to address those issues and needs?
- What was the project testing?
- Reflecting on the original goals, what was learned?
- What worked well?
- What did not work well?
The learning and reflection forms allow us to triangulate takeaways from the project that may differ between the implementation partner and the evaluator, which can inform knowledge mobilization activities. We have also begun to collect feedback on the evaluation experience and the support provided through surveys of project partners.
Evaluator guidance. In 2022, we began sharing a guidance document with evaluators working directly with projects – both those initiated by the projects themselves and third-party evaluators commissioned by FSC. The guidance included questions to be discussed in evaluation reports related to:
- Stakeholder and evaluation goals
- Learning-focused background and description of the project, including a theory of change
- Evaluation questions, data sources and indicators associated with implementation, effectiveness, efficiency and causal attribution
- Any benchmarking against which to gauge results
- Evaluation findings linked clearly to evaluation questions, as well as factors that impacted results, both positive and negative
- Discussion and implications for expansion, adoption, investment and/or further partnership, as well as lessons for service delivery
Thematic evaluations. Guided by strategic questions relating to our focus areas, we are increasingly identifying cross-cutting themes to assess broader trends across groups of projects. One example is a current formative review of projects whose primary objective was building a digital platform as the solution to a challenge. The thematic evaluation will identify key challenges, success factors and lessons learned across a group of FSC-funded technology-focused projects and make forward-looking recommendations to guide future investment.
The reports that result from these evaluation activities are reviewed and published on the FSC website along with our analysis and interpretation of the results.
What We’re Learning
To date, the Future Skills Centre has funded more than 379 projects and produced 690 research and evaluation reports.
The diversity of funded projects has presented a range of unique challenges and opportunities, which in turn have shaped the evolution of our evaluation and learning strategy.
Varying levels of evaluation capacity. Project partners’ capacity ranged from very advanced – institutions with professional evaluators on staff and prior experience working with external evaluators – to small community organizations with few staff and no prior experience with formal evaluation. Even among large research organizations, such as postsecondary institutions, evaluation capacity varied considerably.
Many project partners were entirely focused on project implementation – developing curricula, building training infrastructure, delivering training and so on. They often had limited experience documenting their project model or approach and limited capacity to generate, track and manage data. With timeline pressures made more acute by the pandemic, projects with prior experience working with evaluators were more readily able to leverage those relationships for the benefit of project learning overall. Partners with less capacity and experience needed more support to establish trust between project staff and evaluators, and to develop productive ways of working together.
Confusion between evaluation and research. Through engagement with project partners, we found that evaluation was often conflated with research. In certain cases, narrow research questions were pursued over broader evaluative questions that would lead to a critical assessment of the results and surface questions about the overall approach. This was particularly evident with postsecondary institutions, and we encountered many postsecondary partners working with a research team on a specific research question, without evaluating the overall project implementation and results.
The research conducted was certainly of value for the research teams involved and added to academic knowledge, but it was often at the expense of practical evaluative learning and evidence that would guide decision-making about expansion, adoption, investment and/or further partnership.
Hesitation to acknowledge failure. Building a culture of innovation requires openly recognizing what does not work, for whom and in what context. Most projects were reluctant to reflect on the elements of their interventions that were not successful and emphasized a positive interpretation of results. While understandable amidst a competitive funding environment, Canada’s skills development ecosystem needs to reward and acknowledge experimentation, which means learning from innovations that didn’t work as expected or planned.
Short funding timelines limit long-term outcome measurement. FSC required evaluation reports to be submitted within 60 days of project completion, which made it difficult to assess long-term impacts on individuals, institutions, or systems. In some instances, where a project received additional funding through FSC’s extension years, analysis of longer-term outcomes was possible and made a valuable contribution to understanding the potential for scaling these interventions. In others, Blueprint has been able to track long-term outcomes through an agreement with Statistics Canada to link participant data to tax filer data. The three projects involved in the randomized controlled trials (ADaPT, In Motion & Momentum+ and NPower Canada) also include linked administrative tax data, with results expected in early 2025. But for most projects, timelines were extremely compressed and didn’t allow for much outcome data at all – leading to an over-reliance on traditional output measures such as training completions.
The need for more robust and inclusive evaluation practices. Projects across the portfolio need more practical support to embed inclusion in program design and implementation, as well as in evaluation and learning approaches. Evaluation and learning plans and methods were often neither inclusive nor user focused. For example, a key issue for many skills interventions is attrition during the intervention. An inclusive evaluation approach would ask who is dropping out, and why. Many projects focused intently on the participants who remained enrolled, without examining patterns of attrition or the additional supports that these groups might require to remain engaged.
User voice was missing from many projects, and where it was present, engagement methods encountered challenges because they did not draw on evidence about what works for inclusively engaging a diverse range of people in evaluation and research. Too often, project partners were starting from scratch rather than building on well-established knowledge about effective approaches.
Why It Matters
Transformation and change in the global economy are not slowing down. With changing patterns of migration, trade, climate, and technology, strategic leadership is required to ensure Canada’s workers, employers, and economy do not fall behind. A formative evaluation of the Future Skills Centre, published in May 2023 by Employment and Social Development Canada, concluded that Canada’s skills development ecosystem continues to face challenges and that adjustments are needed to ensure Canadians and the Canadian economy can thrive. More information and evidence are needed to support workers in adapting to these changes.
Demand for FSC’s investments and partnerships over the last five years clearly demonstrates the need for leadership and investment in innovation in Canada’s skills development systems. We continue to be an essential partner in ensuring Canada’s labour force has the skills it needs, bolstering our competitive edge, and delivering prosperity for all people in Canada.
Ultimately, we want to see our investments and partnerships lead to impacts for people, employers, and skills development systems:
- People receive effective career and skills development advice, tools, and other supports throughout their working lives to help them navigate and succeed in a constantly changing world of work, with a focus on people who are vulnerable or underrepresented in the labour market.
- Employers, especially SMEs, embed skills development into workplaces and invest in their employees’ skills. There needs to be more effective integration between organizations that deliver skills development (supply) and the firms that employ those skilled people (demand).
- Skills development systems are supported by dynamic labour market and skills information. Programs that are proven effective are scaled, and the network of organizations and institutions working on skills has the right incentives to work together more effectively.
Investing in projects that test promising solutions to pressing problems, evaluating the impact of pilots and new partnerships, generating critical insights, and mobilizing knowledge about what works, for whom, and under what conditions are all essential to scaling and expanding the impact of the most promising approaches.
Going forward, we will continue to leverage the insights we are generating to inform both policy and practice. We know that all levels of policy making – federal, provincial, and municipal – face critical decisions ahead regarding the allocation of resources, improvements to program delivery, and ensuring Canada remains competitive into the future.
What’s Next
The evaluation strategy for our next phase of investment builds on the insights gained from earlier phases, strengthening the structure, inclusivity, and strategic focus of our approach. This will enhance capacity among project partners, provide clearer evaluation timelines, and ensure alignment with the Future Skills Centre’s strategic priorities across our focus areas.
The pillars of our approach include:
- Strategic questions integrated into funding proposals. In our recent call for proposals, Skills Horizon, we required proponents to articulate how their proposal would generate learning and evidence against our strategic questions. We want project partners to be thinking about learning and evaluation from the very start of our engagement together.
- Required third-party evaluation. We are continuing to require project partners to work with a designated evaluation partner. With very few exceptions, all funded innovation projects will work with an assigned third-party evaluator.
- Expand our evaluator roster. We have valued the unique capabilities and assets of the different evaluation organizations, and we will continue to work with a range of these service providers. We are particularly seeking to expand our collaborations with evaluators who work with underrepresented communities and use innovative or non-traditional approaches to evaluation.
- Continued support for project partners and evaluators. We are updating our measurement frameworks and guidance, and planning onboarding sessions to ensure clarity of roles and responsibilities throughout the evaluation process.
Have questions about our work? Do you need access to a report in English or French? Please contact communications@fsc-ccf.ca.
This report was produced based on projects funded by the Future Skills Centre (FSC), with financial support from the Government of Canada. The opinions and interpretations in this publication are those of the author(s) and do not necessarily reflect those of the Government of Canada.