Student Satisfaction Still A Topic Of Study


February 23rd's meeting of the college's Board of Governors sounded like a classic Rolling Stones concert, because there was much talk of satisfaction. 

Two update reports tabled before the Board examined student satisfaction levels, and how they are surveyed and analyzed. 


The first of the reports dealt with the satisfaction levels of those enrolled in St. Clair-delivered apprenticeship programs. 

Surveying apprentices has been a sporadic affair during the past several years. A few years ago, such students were included in the standard, provincial-government-mandated, annual Key Performance Indicator (KPI) surveying conducted among all students. But because apprentices aren't taught in the same manner as many regular students - and don't use many of the services provided to regular students - the provincial government scrapped their participation in the KPIs. 

The pandemic also suspended the 2019-20, in-person (in-class) surveying of apprentices. 

This year, however, the college was able to relaunch its surveying of apprentices, using its own internal questionnaire delivered on-line. 

Because it wanted a historical perspective for its study - to see whether rates were trending positively or negatively - the college has continued to use the now-scrapped provincial KPI results as something of a "base-line" for the new data. 

And that has provided some indicators of upward trends. 

Over the past two to three years of survey-taking in its various forms, it appears that St. Clair's servicing of apprentices - which had, a few years earlier, been "provincially below-par" in many instances - is greatly improving. 

The college's apprentices are now much more positive and complimentary about the quality of their programs, their learning experiences, and the services and facilities/resources available to them. 

[Graphic: apprenticeship survey results]

Indeed, if the previously mandatory provincial survey were still being administered at all of Ontario's colleges - if the red line on graphics like the one above could be extrapolated forward from where it stopped - it appears that the satisfaction rates of St. Clair's apprentices would now exceed the provincial average in almost every category. The accompanying graphic, for instance, shows that approximately 80 percent of St. Clair's apprentices during the past couple of years have been "very satisfied/satisfied" with their "overall" experience at the college. 


The 20-year-old, provincially mandated KPI surveying of students in general was suspended by the provincial government in 2019. It was supposed to be replaced by a new-and-improved survey system, but that plan was derailed when the pandemic broke out, because, historically, the surveying had been conducted in person during class time. 

As they awaited the development of the new provincial opinion-taking method, half of the province's two dozen colleges formed a consortium to continue some sort of student-opinion surveying in 2019-20 and 2020-21. 

They wanted the results for their own internal, quality-gauging purposes, and so that they would have some degree of "continuity trends" to track whenever (if ever) a new province-wide system is introduced. 

The new survey - renamed from Key Performance Indicators to the Student Experience Survey (SES) - was conducted on-line at St. Clair during the last two weeks of February. 

The administration's report on this new survey noted it had many benefits over the previous KPI system: 

In switching from the KPI to the SES, St. Clair is aligning with new survey practices that seek to determine if the lived student experience matches the planned student experience. Core capstone questions on knowledge and skills, learning experiences, student services, and facilities are maintained for continuity with historic KPI results.  However, sub-questions in these areas now profile the extent and means by which students develop and improve technical and transferable skills. New sections dedicated to remote learning and college-to-student communication better capture the expectations of modern education and provide data for audit requirements.  

Methodology Improvements: The SES will be administered online through a survey management platform, and will continue to be administered online once in-person classes resume. This administration change is responsive to student requests, and permits more detailed analysis of results as students' underlying data can be confidentially paired with student responses. Students will be able to express their thoughts, feelings, and concerns through ratings, rankings, and comments. In particular, comment cards are now integrated within the SES, with writing prompts that change according to previous responses. 

Analysis Improvements: Results will be shared using the Tableau visual analytics platform. All stakeholders can easily access the visualization appropriate for the analysis they are performing instead of collating multiple documents. The performance of programs and clusters of programs can be evaluated within and between schools, in addition to the usual year-over-year analysis. At a system level, the standardized data set permits efficient comparisons of program and college performance between institutions. 

Cost Benefits: The direct costs of administering the SES and preparing data for analysis are expected to be 60 percent less than the direct costs of the KPI. Indirect cost-savings include over 300 volunteer, work study, and reassigned work hours, and over 150 instructional hours reclaimed in moving the survey online. 
