

Grants and Contributions Applicants Client Experience Research (Year 3)
Employment and Social Development Canada [ESDC]
October 25, 2023
POR# 099-22
CONTRACT AWARD DATE: 2022-12-19
CONTRACT #: CW2266044 (G9292-24-2550)
Contract value: $149,885.85 (tax included)
Ce rapport est aussi disponible en français.
For more information on this report, please contact nc-por-rop-gd@hrsdc-rhdcc.gc.ca
Grants and Contributions Applicants Client Experience Research (Year 3)
It is available upon request in multiple formats (large print, MP3,
braille, e-text, DAISY), by contacting 1-800 O-Canada
(1-800-622-6232).
By teletypewriter (TTY), call 1-800-926-9105.
© His Majesty the King in Right of Canada, as represented by the
Minister of Families, Children and Social Development, 2023
https://publications.gc.ca/site/eng/services/copyPageTemplate.html
For information regarding reproduction rights: droitdauteur.copyright@HRSDC-RHDCC.gc.ca.
PDF
Cat. No.: Em20-148/2024E-PDF
ISBN: 978-0-660-67643-2
Recherche sur l’expérience client des subventions et contributions (Année 3)
Ce document offert sur demande en médias substituts (gros
caractères, MP3, braille, fichiers de texte, DAISY) auprès du 1-800
O-Canada (1-800-622-6232).
Si vous utilisez un téléscripteur (ATS), composez le 1-800-926-9105.
© Sa Majesté le Roi du chef du Canada, représenté par le ministre de la Famille, des Enfants et du Développement social, 2023
https://publications.gc.ca/site/fra/services/modeleDroitsAuteur.html
Pour des renseignements sur les droits de reproduction: droitdauteur.copyright@HRSDC-RHDCC.gc.ca.
PDF
Nº de cat. : Em20-148/2024F-PDF
ISBN : 978-0-660-67644-9
List of Acronyms
| Acronym | Meaning |
|---|---|
| **Program related** | |
| AS | Apprenticeship Service |
| CSJ | Canada Summer Jobs |
| EAF | Enabling Accessibility Fund |
| FELIP | Financial Empowerment of Low-Income People |
| NAAW | National AccessAbility Week |
| NHSP | New Horizons for Seniors Program |
| SIP | Sectoral Initiatives Program |
| STAR | Skilled Trades Awareness and Readiness Program |
| SDPP-C&F | Social Development Partnerships Program – Children and Families |
| SDPP-D | Social Development Partnerships Program – Disability Inclusion |
| SSLP | Supports for Student Learning Program |
| WER | Women’s Employment Readiness |
| WORBE | Workplace Opportunities: Removing Barriers to Equity |
| **Miscellaneous** | |
| CX | Client Experience |
| ESDC | Employment and Social Development Canada |
| FY | Fiscal year |
| GBA+ | Gender-Based Analysis Plus |
| GoC | Government of Canada |
| Gs&Cs | Grants and Contributions |
| GCOS | Grants and Contributions Online Services |
| MP | Member of Parliament |
| N/A | Not applicable |
| PO | Program Officer |
| POB | Program Operations Branch |
| SC | Service Canada |
Political Neutrality Statement
I hereby certify as Senior Officer of Ipsos that the deliverables fully comply with the Government of Canada political neutrality requirements outlined in the Policy on Communications and Federal Identity and the Directive on the Management of Communications. Specifically, the deliverables do not include information on electoral voting intentions, political party preferences, standings with the electorate or ratings of the performance of a political party or its leaders.
Mike Colledge
President
Ipsos Public Affairs
Additional information
Supplier Name: Ipsos Limited Partnership
PSPC Contract Number: CW2266044 (G9292-24-2550)
Contract Award Date: 2022-12-19
Executive Summary
Grants & Contributions CX Survey – Results At a Glance (Year 3)
- 3,041 SURVEYS CONDUCTED
- METHODOLOGY: ONLINE SURVEY
- FIELDWORK: April 19 to June 9, 2023
Overall Service Experience

Figure 1: Overall Service Experience
This horizontal bar chart shows responses to three questions about the overall service experience and presents results for Year 1, Year 2 and Year 3. Respondents were asked to provide ratings on a 5-pt scale and results were grouped by those who provided a rating of 4 or 5. All 3,041 respondents in Year 3 answered as follows:
- Overall Satisfaction: Year 3 68%. Year 2 77%. Year 1 70%.
- Ease: Year 3 78%. Year 2 79%. Year 1 74%.
- Effectiveness: Year 3 75%. Year 2 78%. Year 1 70%.
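The "rating of 4 or 5" grouping used throughout these figures is a standard top-2-box score. A minimal sketch of the calculation (the ratings below are invented for illustration, not survey data):

```python
# Hypothetical 5-point ratings; illustrative only, not actual survey responses.
ratings = [5, 4, 3, 4, 2, 5, 1, 4, 5, 3]

# Top-2-box score: percentage of respondents who rated 4 or 5.
top2 = sum(1 for r in ratings if r >= 4) / len(ratings) * 100
print(f"{top2:.0f}% rated 4 or 5")  # 6 of these 10 ratings are 4 or 5
```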
Satisfaction with Service Channels

Figure 2: Satisfaction with Service Channels
This horizontal bar chart shows responses to a question about satisfaction with the overall quality of service provided by the service channels used during the applicant process and presents results for Year 1, Year 2 and Year 3. Respondents were asked to provide ratings on a 5-pt scale and results were grouped by those who provided a rating of 4 or 5. Sample sizes vary by service channel, only those who used each channel during their experience were asked about it.
- Email support from Program Officer (842 respondents answered this question in Year 3): Year 3 77%. Year 2 79%. Year 1 80%.
- GCOS Web Portal (881 respondents answered this question in Year 3): Year 3 72%. Year 2 76%. Year 1 67%.
- Email support from SC (2442 respondents answered this question in Year 3): Year 3 68%. Year 2 70%. Year 1 65%.
- Online (1871 respondents answered this question in Year 3): Year 3 66%. Year 2 71%. Year 1 66%.
- Mail (154 respondents answered this question in Year 3): Year 3 65%. Year 2 58%. Year 1 63%.
- Phone support from SC (573 respondents answered this question in Year 3): Year 3 62%. Year 2 61%. Year 1 61%.
- In-Person (44 respondents answered this question in Year 3): Year 3 52%. Year 2 62%. Year 1 66%.
- 1 800 O Canada (94 respondents answered this question in Year 3): Year 3 42%. Year 2 48%. Year 1 49%.
Satisfaction with Client Experience by Program

Figure 3: Satisfaction with Client Experience by Program
This vertical bar chart shows responses to a question about satisfaction with overall service experience by program and presents results for Year 1, Year 2 and Year 3. Respondents were asked to provide ratings on a 5-pt scale and results were grouped by those who provided a rating of 4 or 5. Sample sizes vary by program.
- EAF (300 respondents answered this question in Year 3): Year 3 72%. Year 2 78%. Year 1 77%.
- NHSP (1296 respondents answered this question in Year 3): Year 3 82%. Year 2 83%. Year 1 73%.
- CSJ (1004 respondents answered this question in Year 3): Year 3 68%. Year 2 79%. Year 1 69%.
- SDPP (214 respondents answered this question in Year 3): Year 3 42%. Year 2 72%. Year 1 53%.
- AS (11 respondents answered this question in Year 3): Year 3 36%.
- WORBE (22 respondents answered this question in Year 3): Year 3 59%.
- SSLP (24 respondents answered this question in Year 3): Year 3 50%.
- WER (51 respondents answered this question in Year 3): Year 3 35%.
- STAR (3 respondents answered this question in Year 3): Year 3 33%.
- SIP (116 respondents answered this question in Year 3): Year 3 28%.
Funding approval

Figure description: Funding approval
This horizontal bar chart shows whether the applicants received funding approval or not and presents results for Year 1, Year 2 and Year 3 as follows:
- Year 3: Approved 79%. Denied 20%.
- Year 2: Approved 93%. Denied 7%.
- Year 1: Approved 90%. Denied 10%.
Satisfaction by Approval Status

| | Year 3 | Year 2 | Year 1 |
|---|---|---|---|
| Approved | 73% | 82% | 74% |
| Denied | 49% | 47% | 41% |
Strengths

Figure description: Strengths
- Service in choice of official language 91%
- Completing steps online made the process easier 84%
- Ease of finding general information on the program 82%
- Confident personal information protected 81%
- Ease of determining steps to apply 81%
Areas for Improvement

Figure description: Areas for Improvement
- Ease of follow-up before receiving a decision 52%
- Ease of determining the amount of time each phase is anticipated to take 56%
- Client journey took reasonable time 58%
- It was easy to get help when needed 63%
- It was clear what would happen next and when 64%
* referred to as [program] web portal in Year 1
Note: Program types, intakes, and streams in grants and contributions vary widely, meaning that some year-to-year or program comparisons should be done with caution.
Key Findings
Overall Satisfaction and Applicants Experiencing an Issue
Overall satisfaction with the service experience among applicants to Grants and Contributions programs declined compared to Year 2, returning to levels observed in Year 1.
- Nearly seven in ten (68%) applicants were satisfied with their overall experience, a decrease of nine points from Year 2 (77%) and consistent with results in Year 1 (70%). The proportion of applicants who were very satisfied has declined (33%, -9 pts), while those who were very dissatisfied has increased (7%, +4 pts).
- Satisfaction was highest among applicants to New Horizons for Seniors Program (NHSP) (82%), followed by Enabling Accessibility Fund (EAF) (72%), Canada Summer Jobs (CSJ) (68%) and Workplace Opportunities: Removing Barriers to Equity (WORBE) (59%). Half of Supports for Student Learning Program (SSLP) applicants were satisfied (50%), while ratings were considerably lower for applicants of Social Development Partnerships Program (SDPP) (42%), Apprenticeship Service (AS) (36%), Women’s Employment Readiness (WER) (35%), Skilled Trades Awareness and Readiness Program (STAR) (33%) and Sectoral Initiatives Program (SIP) (28%). Among programs included in Year 2, overall satisfaction has decreased among CSJ and SDPP applicants.
- The overall ease (79%) and effectiveness (78%) of the application process continued to be rated highly by the vast majority of applicants. However, those who applied to higher-complexity programs, most notably SDPP, SIP and WER, experienced more difficulty with nearly all aspects of ease and effectiveness.
- Trust in Service Canada to deliver services effectively to Canadians also declined compared to Year 2 (76%, -8 pts). This measure continued to be highly correlated with overall satisfaction, and ratings declined among CSJ and SDPP applicants.
Applicants to CSJ and SDPP experienced more issues related to the timeliness of service, and had more difficulty following up or getting assistance during the application process than in Year 2, which negatively impacted their satisfaction.
- The overall decrease in satisfaction was due primarily to lower ratings among applicants to CSJ, who represent the vast majority of Grants and Contributions applicants, and to a lesser extent SDPP applicants.
- CSJ applicants provided lower ratings for the timeliness of service, clarity of the application process and what to do if they had a problem or question. They experienced more challenges getting assistance during the application process and were more likely to feel it took too long to receive an update on their application or to receive a decision.
- SDPP applicants provided lower ratings across all aspects of the ease and effectiveness of the application process, in particular for the timeliness of service, clarity of the process and ease of getting assistance. They were more likely than clients overall to have encountered problems, most prominently that it took too long to receive an update or decision on their application and that the application form was too long or confusing.
Satisfaction Drivers and Awareness of Service Standards
The timeliness of service had the largest impact on satisfaction with service experience, followed by the ease of follow-up before receiving a decision, and confidence in the issue resolution process. In Year 3, all these aspects of service have increased in importance in driving overall satisfaction. Ratings in each of these aspects of service have declined compared to Year 2.
- The greatest opportunity to increase overall satisfaction is in reducing the amount of time the application process takes to complete and improving the ability of applicants to follow-up more easily before receiving a decision.
- Reducing the amount of time from the start of the application process to receiving a decision also aligned with the change applicants felt would have improved their experience the most. Timeliness is notably the most common suggested change across all programs.
Awareness of service standards remained relatively low and fewer knew of the time to acknowledge the submission and issue a funding decision than in Year 2. Applicants who were aware of each service standard continued to have a more positive experience. Notably, impressions have weakened year over year across several aspects of service among those who were not aware.
- More than four in ten applicants (44%, -4 pts) were aware of the stated service standards regarding issuing payment once a payment claim has been submitted, followed by fewer than four in ten (37%, -6 pts) for acknowledging the submission of a funding application and one third (33%, -6 pts) for issuing a funding decision notification. Compared to Year 2, awareness of the service standard for time to acknowledge submission and issue a funding decision declined.
- NHSP applicants were more likely to be aware of all service standards, and EAF applicants of the time to issue a decision, while SDPP applicants were less likely to be aware of all service standards and SIP applicants of the time to issue payment.
Selected Applicant Profiles and the online experience
Those not approved for funding continued to be much less satisfied and fewer applicants were approved for funding compared to Year 2, which has contributed to the decline in overall results. Applicants who were not approved had much more difficulty getting help with their application, were less likely to feel the process was clear and timeliness of service reasonable and few reported having received a debrief on the outcome or being satisfied with the explanation provided.
- Eight in ten applicants received approval for funding (79%, -14 pts), statistically lower than the figure reported in Year 2. Applicants to EAF, SDPP, WER and SIP were less likely to have received funding approval compared to all clients, while NHSP and CSJ applicants were more likely to have received funding.
- Among those who did not receive approval, just over one in ten (13%, -29 pts) were provided with an explanation why, statistically lower than in Year 2 (42%); among those who received an explanation, three in ten (29%, +6 pts) were satisfied with it. EAF, NHSP and SDPP applicants were more likely to have received an explanation, while CSJ applicants were less likely.
Virtually all applicants reported submitting their application online and ratings for the ease and timeliness of the process remained strong and consistent with Year 2. Applicants to higher complexity programs continued to find all steps of the process more difficult.
- Six in ten applicants (59%) submitted their application using the online fillable form, followed by one third (34%) who used the GCOS web portal. Fewer downloaded the application documents and submitted by email (5%) or mail (3%). Applicants to EAF, NHSP, SDPP, WER and SIP were more likely to have downloaded the application documents and submitted them by email; however, the vast majority submitted online. Compared to Year 2, NHSP and CSJ applicants were more likely to have submitted using the online fillable form, while SDPP applicants were more likely to have used the GCOS web portal.
- Impression of the ease of completing the application process remained consistent compared to Year 2. Applicants to CSJ continued to provide higher ratings for nearly all aspects of the application process, while applicants to SDPP, SIP, WER and to a lesser extent EAF and NHSP experienced more difficulty with all elements of the application process.
Satisfaction with Service Channels
Satisfaction with the service provided through most service channels was largely consistent and remained highest for support provided by email from a program officer, followed by the online channels. Fewer were satisfied with the Government of Canada website compared to Year 2 due to lower ratings among SDPP applicants who also had weaker impressions of the service provided by email.
- Nearly eight in ten (77%) were satisfied with email support from a program officer, followed by the GCOS web portal (72%), email support from a Service Canada office (68%), the Government of Canada website (66%) and mail service (65%). Fewer were satisfied with telephone support from a Service Canada office (62%), service at a Service Canada office (52%), and the 1 800 O-Canada phone line (42%).
- The highest rated aspects of service were the provision of service in choice of official language, completing steps online made the process easier, confidence in security of their personal information and ease of finding general information about the program or determining the steps to apply.
- Aspects of service with lower ratings included the ease of follow-up, ease of determining how long each phase of the process was anticipated to take, the timeliness of service, ease of getting help when needed and that it was clear what would happen next and when. Ratings across nearly all these service attributes declined compared to Year 2.
Learning about the program
Email outreach from Service Canada or the program, the Government of Canada website and program applicant guides were the primary ways applicants learned about the program they applied for. The vast majority who relied on the Government of Canada website continued to find it easy to navigate; however, applicants to higher-complexity programs had more difficulty. Further, more could be done to improve the ease of determining how long each phase of the process is anticipated to take.
- When learning about the program, applicants were most likely to have received an email from the GoC, ESDC, or the program they applied to directly (52%), followed by the Government of Canada website for the program (48%), the applicant guide (45%), talking to peers/community networks (27%) and the general Government of Canada website (20%). More applicants talked to peers/community networks than in Year 2, while fewer received an email directly from the program they applied to, went to the general GoC website, participated in a GoC information session or talked to their local MP.
- NHSP applicants relied more heavily on the applicant guide, peers/community networks, GoC info sessions and their local MP, while applicants to EAF and other higher-complexity programs like SDPP, WER and SIP were more likely to have used the GoC website and to have participated in a GoC information session. Those who applied to higher-complexity programs, and in particular SDPP, while more reliant on the GoC website, also had more difficulty finding the information they needed.
- The vast majority who used the GoC website continued to feel it was easy to find most types of information. Ratings were highest for the ease of finding general information about the program (82%), determining the steps to apply (81%) and whether their organization was eligible for funding (80%). Applicants felt it was more difficult to determine the amount of time each phase of the process was anticipated to take, and improving the ease of finding this information was the change identified by applicants that would have improved their experience the most.
Populations served by funding and project close-out
Funding sought by applicant organizations continued to be targeted largely at supporting diverse communities, however slightly less so than in Year 2.
- Nine in ten applicants (90%) reported that the funding they applied for would support diverse (GBA+) communities, clients or people. Seven in ten (71%) said the funding would support those who identify as youth, followed by women (56%), those belonging to a minority racial or ethnic background (49%), low socio-economic status (45%), Black Canadians (40%) and those who identify as Indigenous (39%).
- Satisfaction was consistent between applicants who assist GBA+ communities and those who do not, but has declined among those who assist GBA+ communities compared to Year 2, mirroring the trend observed overall.
The vast majority of funding recipients found it easy to complete the tasks associated with funding agreement close-out. Recipients of EAF and higher complexity programs had more difficulty and fewer recipients of CSJ felt the tasks were easy compared to Year 2.
- Among applicants approved for funding, seven in ten felt that it was easy to submit the final budget (70%, unchanged), complete the final project report (70%, -1 pt), submit the final project report (70%, -1 pt), and complete the final budget/final claim (68%, -1 pt). Fewer felt it was easy to resolve any outstanding issues with funding (49%, -2 pts).
- NHSP applicants were more likely to find it easy to complete most aspects of the funding agreement close-out. EAF applicants and those who applied to higher complexity programs were less likely and ratings have declined for recipients of CSJ for the ease of submitting and completing the final report and completing the final budget/claim.
Qualitative Research
Organizational Capacity to Complete the Application Process
- Many applicants felt well-equipped to complete the Gs&Cs application process and much of this was underpinned by past experiences of completing funding applications.
- Applicants from larger organizations tended to agree that the application process favours organizations such as their own, which have more resources and the necessary staff expertise at their disposal. Some applicants from smaller organizations shared this perspective based on their experiences, but this view was not universal. In contrast, other applicants from smaller organizations felt that the process of applying was “straightforward” and stressed that it was their responsibility to “educate” themselves about the process.
- When asked about inclusivity and catering to the diverse backgrounds of applicants, applicants reported few inclusivity or accessibility barriers.
Top-of-mind Associations with the Application Process
- When asked to reflect on the entire application process, applicants highlighted a number of aspects of the process which impressed them. This included:
- Reminder notifications and outreach from programs about upcoming funding opportunities.
- Smooth and straightforward application process.
- The existence of grants and contributions for important projects.
- Shift to digital application channels.
- Shorter forms.
- Helpfulness of Service Canada agents.
- Applicants tended to be more vocal around the negative than the positive experiences. This was especially true among applicants who recorded lower overall satisfaction scores in the quantitative survey. The common frustrations were:
- Extensive delays in notification of funding decisions, particularly in the case of the Sectoral Initiatives Program and the Skilled Trades Awareness and Readiness Program.
- Lack of success in receiving funding, underscored by a few Social Development Partnerships Program and Sectoral Initiatives Program applicants specifically.
- Too much detail required of applicants, notably in the case of the Social Development Partnerships Program and the Women's Employment Readiness pilot program.
- Often-tight project turnaround times after receiving a funding decision.
Detailed Findings on the Application Journey
- Overall, regardless of whether an organization received funding or not, ESDC was seen by many as doing an excellent job of raising awareness about the various funding programs that are available. Past applicants heavily relied on alerts from each program about upcoming rounds of funding.
- Program websites were the main source of support used by applicants and their feedback on the websites and supporting documents was predominantly positive.
- Program information sessions were appreciated but opinions were mixed on their perceived value. The main criticism was that the sessions mainly shared information already listed on the website. Still, many applicants appreciated the opportunity to learn more about programs and take part in the Q&A (if offered).
- Past experiences of funding application writing resulted in high levels of comfort with the application forms. The application forms were described as “standard” or “typical” to what applicants were used to filling out. Further, the clarity of instructions was appreciated. Perceived repetitiveness of questions and complexity of filling out budget forms were the main issues experienced by some applicants.
- The majority of applicants opted to fill out the applications online – via GCOS or program websites. The main pain points found were the perceived complicated multi-step process of creating a GCOS account and logging into GCOS, and the linking of personal and organizational accounts within GCOS. The latter did not make intuitive sense to some applicants. Experiences of using GCOS once within the site were largely positive.
- The research suggests that greater use of GCOS in future could be achieved through raising awareness of GCOS and its potential benefits, and simplifying the steps for creating an account and logging into the system.
- Many were satisfied with the level of guidance offered in the applicant guide and did not have to turn to any additional resources.
- Delays in receiving a funding decision were a common and major pain point among applicants. Some reported waiting for 6-12 months, in some cases even longer, to receive a funding decision. The lack of communication on when decisions will be made and the provision of very vague timelines exacerbated negative perceptions. Having to wait for an extended period forced many organizations into limbo and created feelings of uncertainty.
- Using GCOS to submit their reports tended to be a positive experience for funded applicants, because the portal provided a centralized place to submit feedback and upload documents.
Future Improvements and the Ideal Experience
Applicants offered numerous improvement suggestions. Highlights include:
- Raising Awareness. Offer an option to subscribe to funding opportunities, across federal departments and agencies, based on topic or population served.
- Application Supports. Add more examples in the applicant guide of what an optimal response looks like. For lower capacity organizations, offer coaching, mentors and provide one-on-one support especially in relation to the budget forms.
- Application Form. Consider rationalizing the number of questions, including more closed-ended questions and simplifying budget forms.
- Channels for Completing and Submitting an Application. More awareness and education on the benefits of a GCOS account while maintaining other channels for submitting an application. Opportunities also exist in simplifying the process for creating a GCOS account, offering the option to download application in a Word document for editing, and the option to upload completed applications on GCOS.
- Receiving a Decision. Two to three months was considered a reasonable timeframe for receiving a decision. When delays occur, proactive outreach with revised timelines should be sent.
- Reporting and Managing Funded Organizations. More education and support on collecting GBA+ data and more flexibility in adjusting projects based on changing context.
The Impact of Funding
- The most positive and most significant impact of receiving funding was on communities served by the organizations, as members of those communities were able to access services and supports they needed. In contrast, unfunded applications meant that programs or activities did not move forward or did so but in a reduced capacity.
Interest in future ESDC funding opportunities
- Most organizations, whether they have been successful or unsuccessful in their application, shared that they would certainly be applying for ESDC grants and contributions again in the future.
- The main considerations for determining whether to apply for funding in future were identifying whether there is a need for a program, and whether the organization has the capacity to execute it.
Objectives and Methodology
Background: Gs&Cs Client Experience Research
The Program Operations Branch (POB) within Employment and Social Development Canada (ESDC) handles the operation and coordination of most Grants and Contributions (Gs&Cs) programs across the Department. The Branch actively works to improve the design, administration and delivery of Grants and Contributions programs. This notably includes making the process of applying for funding accessible, efficient and effective through quick and easy online services and standardized forms and agreements.
To comply with the Treasury Board Policy on Service and Digital and the Employment and Social Development Canada (ESDC) Service Strategy, POB requires the gathering of data on the client experience to assist in effectively managing service delivery. To meet these requirements, POB uses the Client Experience (CX) Performance Measurement Framework to guide the research on the Gs&Cs business line of client service delivery experience. The data collected with the framework, which includes qualitative and quantitative dimensions, will provide key insights and diagnostics on client experience to help:
- Better understand the needs and expectations of organizations, including organizations of different types and serving different client groups;
- Identify obstacles and challenges from the perspective of the organization;
- Identify strengths and opportunities to improve CX, including opportunities to implement changes and test new approaches related to program design and delivery;
- Assess the extent to which clients’ expectations are being met;
- Identify and prioritize resources and opportunities tied to CX improvements;
- Assess the impact of improvements made to the CX over time; and
- Explore how ESDC’s leadership at all levels can play an important role in creating a positive CX.
This is the third year of POB’s Client Experience Research Program (FY 2022/23 into 2023/24). Year 3 builds on previous years of research to support a systematic and integrated approach to measuring and improving CX in Gs&Cs service delivery, which also allows the department to track progress on consistent and comparable CX indicators over time.
The detailed methodology and research instruments for all aspects of the research are available under a separate cover.
Note: Program intakes in grants and contributions vary widely, meaning that some year-to-year or program comparisons should be done with caution.
Research Objectives
The Client Experience Research Project is carried out in two phases, a quantitative phase and a qualitative phase.
The overarching objectives of the Year 3 quantitative research are to:
- Focus on monitoring selected POB Gs&Cs programs that were previously studied in Year 1 and Year 2;
- Capture CX insights from additional programs that have not previously been studied; and
- Increase awareness of organizational characteristics and diversity considerations in the CX space.
The research objectives for the quantitative research were to:
- Measure service satisfaction, ease, and effectiveness of the end-to-end client experience, taking into account the CX with various service channels and the CX of different client groups;
- Assess year-over-year changes for programs included in previous years;
- Provide diagnostic insights regarding the strengths and opportunities for improvement; and
- Assess how potential changes in service delivery might affect the CX.
The qualitative research explored the lived experiences of Gs&Cs applicants through focus group discussions and individual interviews. Building on the quantitative research, the qualitative phase of this project was structured around the following:
- Organizational capacity to effectively and efficiently complete the application process and the extent to which the process is inclusive;
- What impressed and what frustrated applicants when it came to their overall experience with Gs&Cs;
- A deep dive into all aspects of the application journey, including GCOS;
- The ideal application journey and opportunities for future improvements;
- The impact of funding and non-funding decisions; and
- The future interest in funding opportunities.
Methodology – Quantitative Research
An online survey was conducted with 3,041 Service Canada applicants across 11 programs. The survey was fielded from April 19 to June 9, 2023, and took on average approximately 16 minutes to complete. The survey sample size has a margin of error of +/-1.75%.
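For reference, the reported margin of error can be approximated with the standard formula for a simple random sample. This is a sketch only: it assumes a 95% confidence level and the most conservative proportion (p = 0.5), and since the report does not state whether a finite population correction was applied to reach +/-1.75%, both variants are shown.

```python
import math

# Margin of error for a simple random sample at an assumed 95% confidence
# level, using the most conservative proportion (p = 0.5). The population
# size comes from the weighting table in this report (48,812 applicants);
# treating it as the sampling frame is an assumption.
n = 3041      # completed surveys
N = 48812     # total applicant universe (assumption)

moe = 1.96 * math.sqrt(0.25 / n)           # without correction
fpc = math.sqrt((N - n) / (N - 1))         # finite population correction
moe_fpc = moe * fpc

# The reported +/-1.75% falls between these two values.
print(f"MoE: {moe:.2%}, with FPC: {moe_fpc:.2%}")
```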
Applicants were defined as organizations that applied for grants and contributions funding (whether or not they were funded) within the last two intake years (FY 2020/21 and 2021/22). A random sample of organizations that applied to CSJ or NHSP was included, while all organizations that applied to the remaining programs were invited to complete the survey. ESDC distributed the survey links to participating organizations.
The exact intake periods referred to in this study are as follows:
Fiscal Year 2021-22:
- Enabling Accessibility Fund (EAF)
- Enabling Accessibility Fund: Mid-sized (Contribution) – June 4, 2021 to August 3, 2021
- Enabling Accessibility Fund: Early Learning and Child Care (Grant) – December 3, 2021 to January 28, 2022
- Enabling Accessibility Fund: Youth Innovation (Grant) – June 4, 2021 to October 29, 2021
- New Horizons for Seniors Program (NHSP)
- Community Based stream – November 23, 2021 to December 21, 2021
- Canada Summer Jobs (CSJ) – December 16, 2021 to January 25, 2022
- Apprenticeship Service (AS) – August 3, 2021 to September 14, 2021
- Workplace Opportunities: Removing Barriers to Equity (WORBE) – January 18, 2022 to February 15, 2022
- Social Development Partnerships Program – Children and Families (SDPP-C&F) – May 25, 2021 to July 6, 2021
- Financial Empowerment of Low-Income People
- Social Inclusion of Vulnerable Children and Youth
- Social Development Partnerships Program – Disability (SDPP-D) – Disability Inclusion – August 4, 2022 to September 17, 2022
- National AccessAbility Week (Grant)
- Phase 1 – Partnerships (Contribution)
- Supports for Student Learning Program (SSLP) – August 6, 2021 to September 2, 2021
- Women’s Employment Readiness (WER) Pilot Program – June 1, 2021 to June 25, 2021
- Skilled Trades Awareness and Readiness Program (STAR) – May 13, 2021 to June 28, 2021
Fiscal Year 2020-21:
- Sectoral Initiatives Program (SIP*) – January 22, 2021 to March 4, 2021
*SIP has since been replaced by the Sectoral Workforce Solutions Program (SWSP), which builds on it.
Three (3) of the programs included in the survey have different streams that applicants can apply for.
The relevant streams referred to in this study are as follows:
- Enabling Accessibility Fund (EAF):
- Enabling Accessibility Fund: Mid-Sized (Contribution)
- Enabling Accessibility Fund: Early Learning and Child Care (Grant)
- Enabling Accessibility Fund: Youth Innovation (Grant)*
- Social Development Partnerships Program – Children and Families (SDPP-C&F):
- Financial Empowerment of Low-Income People
- Social Inclusion of Vulnerable Children and Youth
- Social Development Partnerships Program – Disability (SDPP-D) – Disability Inclusion:
- Accessible Canada - Phase 1 Partnerships (Contribution)
- National AccessAbility Week (NAAW) (Grant)
Of the 9,862 organizations that were invited to participate, a total of 3,041 organizations completed the survey. The response rate for the survey was 31%, which is considered strong compared to industry standards for a survey of this nature.
| | Total |
|---|---|
| Invited to participate | 9862 |
| Click-through | 3924 |
| Partial completes | 883 |
| Terminates | 0 |
| Over quota | 0 |
| Completed surveys | 3041 |
| Response rate | 31% |
| Abbreviation | Program | Invited | Completed | Response rate |
|---|---|---|---|---|
| CSJ | Canada Summer Jobs | 3250 | 1004 | 31% |
| EAF | Enabling Accessibility Fund | 1063 | 300 | 28% |
| NHSP | New Horizons for Seniors Program | 3250 | 1296 | 40% |
| SDPP-C&F | Social Development Partnerships Program – Children and Families | 904 | 168 | 19% |
| SDPP-D | Social Development Partnerships Program – Disability Inclusion | 200 | 46 | 23% |
| AS | Apprenticeship Service | 36 | 11 | 31% |
| WORBE | Workplace Opportunities: Removing Barriers to Equity | 79 | 22 | 28% |
| SSLP | Supports for Student Learning Program | 80 | 24 | 30% |
| WER | Women’s Employment Readiness | 214 | 51 | 24% |
| STAR | Skilled Trades Awareness and Readiness Program | 23 | 3 | 13% |
| SIP | Sectoral Initiatives Program | 763 | 116 | 15% |
| Total | | 9862 | 3041 | 31% |
Note: “n=” represents the number of respondents to a question; in statistical language, it is the size of the sample. Sample sizes below n=30 are considered small, and those below n=10 very small. Results based on small and very small samples should be interpreted with caution and viewed as directional in nature.
The quantitative survey also served as a recruitment tool for the qualitative research, asking whether organizations would be interested in voluntarily participating in focus groups or in-depth interviews at a later date.
Only those organizations with email contact information on file were invited to participate, so the invited sample does not represent the total volume of applicants.
Calibration of the Data – Quantitative Approach
Weighting adjustments were made to bring the sample into proportion with the universe by program volume (depending on the most recent intake for the particular program).
The final data was weighted by the number of respondents in each program in proportion to the total number of applicants as detailed below. The universe proportions used to develop the targets were based on figures provided by ESDC.
| Program | # of Applicants | % of Total |
|---|---|---|
| Canada Summer Jobs | 41463 | 84.94% |
| Enabling Accessibility Fund | 1040 | 2.13% |
| New Horizons for Seniors Program | 4176 | 8.56% |
| All programs but CSJ, EAF and NHSP | 1252 | 2.56% |
| Social Development Partnerships Program – Children and Families | 881 | 1.80% |
| Social Development Partnerships Program – Disability Inclusion | 195 | 0.40% |
| Apprenticeship Service | 36 | 0.07% |
| Workplace Opportunities: Removing Barriers to Equity | 74 | 0.15% |
| Supports for Student Learning Program | 75 | 0.15% |
| Women’s Employment Readiness | 210 | 0.43% |
| Skilled Trades Awareness and Readiness Program | 23 | 0.05% |
| Sectoral Initiatives Program | 639 | 1.31% |
| Total | 48812 | |
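The proportional weighting described above can be sketched as follows. This is a minimal illustration using the applicant and completion counts reported in this study for the three largest programs; ESDC's actual weighting procedure may include additional adjustments beyond program volume.

```python
# Sketch of proportional weighting by program: each respondent's weight is
# the program's share of the applicant universe divided by its share of the
# completed sample. Figures are taken from the tables in this report.
universe = {"CSJ": 41463, "EAF": 1040, "NHSP": 4176}   # applicants (partial list)
completes = {"CSJ": 1004, "EAF": 300, "NHSP": 1296}    # completed surveys
N_universe = 48812
n_sample = 3041

weights = {
    prog: (universe[prog] / N_universe) / (completes[prog] / n_sample)
    for prog in universe
}
for prog, w in weights.items():
    print(f"{prog}: weight = {w:.2f}")
```

As expected, responses from the over-represented smaller programs are weighted down (well below 1), while CSJ responses are weighted up to match its 84.94% share of the universe.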
Note Regarding Program Complexity
For the purpose of this study, program complexity has been defined as low, moderate, and high as outlined in the following table. These service standard clusters are informed by departmental reporting in the Performance Measurement and Management Framework.
Note: Canada Summer Jobs does not fit into these distinct clusters and has been analyzed as a separate group.
| Program Complexity Level | Description | Programs Included |
|---|---|---|
| Low complexity programs | Grant programs in the 112-day (16-week) review period | |
| Moderate delivery-complexity programs | Contribution streams in the 126-day (18-week) review period | |
| High delivery-complexity programs | Contribution streams in the 154-day (22-week) review period | |
Note on Reporting Conventions – Quantitative Data
Throughout the report, subgroup results have been compared to the average of all applicants (i.e., the total), and statistically significant differences at the 95% confidence level are noted using green and red boxes.
Where subgroup results are statistically higher than the total a green box has been used and where results are statistically lower than the total a red box has been used.
Additionally, where results in Year 3 were statistically higher than Year 2, a green arrow has been used and where results in Year 3 were statistically lower than Year 2, a red arrow has been used.
Significantly higher / lower than total
Significantly higher / lower than Year 2
For the purposes of legibility, values of less than 3% have not been labelled in charts throughout the report.
Bases marked with a * indicate a small sample size and with ** indicate very small sample size, so results should be interpreted with caution and findings viewed as directional in nature.
As part of the analysis, a key driver analysis was conducted to identify the factors that have the greatest impact on overall satisfaction. Throughout the report, the top 5 drivers have been identified using a yellow box.
Top 5 driver of satisfaction
Methodology – Qualitative Research
Respondents from the Gs&Cs client experience survey were asked whether they would be interested in taking part in follow-up qualitative research. After analyzing the opt-in sample to ensure a mix of programs and regions, and to ensure inclusion of applicants in both official languages, potential participants were contacted at random and asked whether they would like to be taken through the screening questionnaire to confirm their eligibility for an in-depth interview or online focus group.
As shown in the tables below, 4 focus groups and 26 in-depth interviews were conducted.
| Focus Group Composition | Date and Time |
|---|---|
| Group 1: Unfunded applicants to any program, NATIONAL - ENGLISH | July 26 at 10 AM ET; 6 applicants |
| Group 2: Funded applicants to any program, NATIONAL - ENGLISH | July 26 at 3 PM ET; 8 applicants |
| Group 3: Unfunded applicants to any program, QUEBEC or Official Language Minority Communities (OLMC) - FRENCH | July 27 at 10 AM ET; 7 applicants |
| Group 4: Funded applicants to any program, QUEBEC or Official Language Minority Communities (OLMC) - FRENCH | July 27 at 1 PM ET; 7 applicants |

| In-depth Interviews Composition | Date and Time |
|---|---|
| The following programs were prioritized: Apprenticeship Service (AS), Workplace Opportunities: Removing Barriers to Equity (WORBE), Social Development Partnerships Program - Disability (SDPP-D), Supports for Student Learning Program (SSLP), Sectoral Initiatives Program (SIP) | July 18 to August 10; 19 English applicants, 7 French applicants |
Methodology – Qualitative Research Data Collection, Analysis
Data Collection
With applicants’ consent, all qualitative research sessions are audio and video recorded. A verbatim transcript is created for every focus group and interview; however, names and personally identifying details are either not captured or are redacted by the moderator to protect applicants’ privacy.
Moderators also capture high-level findings on each topic from their own observations: the overall reaction, any nuances, and any non-verbal cues such as body language or tone. Because the transcripts are anonymous, it is not possible to comment on variations by subgroup unless participants were placed in separate groups; for example, moderators cannot provide a sense of differing opinions between older and younger applicants, or between males and females, on a given topic.
Data Analysis
We identify the following elements in the qualitative analysis:
- Universal agreement where applicants all agree, or there is agreement across different groups of stakeholders
- Consensus perspectives that reflect the view of most applicants; areas of wide agreement without much counter point (Many, most, several)
- Conflicting or polarized perspectives where views are much more divided, or if there is a spectrum or variety of views (Some vs others)
- Minority perspectives, often expressed by one or two applicants as a counterpoint to a consensus viewpoint, or if they have an individual take or example/story (a few, a couple, mention)
- Verbatim commentary, providing examples of what applicants actually said during a discussion (direct unattributed quotes)
- External context; for this project, the results of the quantitative research provided the foundation for the qualitative research conducted and the discussion questions posed.
Note on Interpretation of Qualitative Findings
The value of qualitative research is in exploring the issues and experiences of research participants in depth, free from the constraints of a structured quantitative questionnaire. Qualitative evidence is rich and allows researchers to hear first-hand the underlying factors shaping experiences and opinions, as well as the interplay between factors.
Qualitative findings should not be extrapolated to the broader population, as they are not statistically projectable. Notable nuances that emerged in the interviews have been highlighted where relevant and these should be treated as strictly directional.
The qualitative findings should thus be viewed as complementary to the quantitative survey findings in terms of building a more complete understanding of the Gs&Cs client experience.