
Eye of Education Blog

A Rose by Any Other Name: The Fraser Institute's Elementary School Rankings
Gordon MacIntyre

The Fraser Institute ranks our school highly every year. This is flattering, but I wonder: how would they know? In fact, the answer is that they don't. The Fraser Institute has never known, because its methodology, the 'science' it purports to use, is conceptually flawed and downright contrived. The Fraser Institute knows this, yet steams ahead regardless, driven by its unsupported and ill-conceived belief that rankings benefit schools.

The Fraser Institute typically releases its annual ranking of elementary schools around spring break. Although we anticipate that we will once again rank highly, we remain concerned about this misuse of data. I have written in detail about the glaring holes in the Fraser Institute's methodology for ranking secondary schools; the elementary rankings are similarly flawed.

First, the Foundation Skills Assessments (FSAs), the mandatory assessments in reading, writing and math for all students in Grade 4 and Grade 7, are the sole means by which the Fraser Institute ranks elementary schools. Forget for a moment that these are one-off, decontextualized tests; the FSAs are, by definition, no longer standardized tests. Although the multiple-choice sections are scored electronically by the Ministry, all of the written-response components in reading, writing and math are scored by individual schools and districts. The Ministry offloaded the marking of these sections to schools and districts a number of years ago, primarily as a cost-saving measure and ostensibly to make the data available in a timely manner. Before that, provincial teams of markers were rigorously trained by the Ministry each July so that there was a uniform standard of marking across the province.

There is no longer any assurance of a uniform standard of marking. How one school or district applies the scoring guides can differ from how another does; any pretense of inter-rater reliability has long since vanished. The Fraser Institute is aware that it uses subjective results scored and submitted by individual schools and districts, yet treats them as standardized data nonetheless.

Second, the Fraser Institute introduces a gender weighting category that comprises 25% of the overall ranking. If boys and girls in a school perform even slightly differently from each other, the school can be penalized. Obviously, this applies only to co-educational schools; single-sex schools enjoy an exemption. Just as it does when ranking secondary schools, the Fraser Institute uses a different formula for co-educational schools than for single-sex schools, yet compares them to each other nonetheless.
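
To make the distortion concrete, here is a minimal sketch of how a gap-based penalty of this kind could play out. The Fraser Institute does not publish its formula in a form reproduced here, so the function, the weight and the scores below are invented purely for illustration:

    # Hypothetical sketch of a gender-gap penalty (all numbers invented;
    # this is not the Fraser Institute's actual formula).

    def penalized_score(school_avg, boys_avg, girls_avg, weight=0.25):
        """Deduct a share of the score proportional to the boy/girl gap."""
        gap = abs(boys_avg - girls_avg)    # raw score gap in points
        return school_avg - weight * gap   # the gap counts against the school

    # Co-ed school: same overall average, but a 4-point gap costs it a point.
    print(penalized_score(80.0, 78.0, 82.0))   # 79.0

    # Single-sex school: no gap can exist, so no penalty is possible.
    print(penalized_score(80.0, 80.0, 80.0))   # 80.0

Under any formula of this shape, two schools with identical overall achievement end up ranked differently simply because one of them enrols both boys and girls.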

Third, some schools have figured out how to game the Fraser Institute's system by excluding students from writing the tests. For example, some local schools achieved a number-one ranking a couple of years ago by exempting 15% of their students from the reading and writing tests. The Fraser Institute did not penalize them for this high rate of exclusion because it fell below the provincial norm. Across the province, many parents refuse to have their children write the FSAs as a form of protest against the Fraser Institute's rankings. It may be legitimate in exceptional cases to excuse a student from sitting the tests because of a learning challenge, but exempting up to 15% of students at one school is tactical, not exceptional. For reference, Mulgrave typically has 97-100% of its students write the tests.

The Fraser Institute badly misuses educational data. This is not surprising, given that it has been widely discredited over its rankings of social and economic progress. Recently, a UN agency roundly dismissed its economic freedom index rankings of countries, citing major conceptual flaws (UN Agency Slams Fraser Institute's Methodology). These flaws included double-counting the same indicators, drawing data from multiple sources for the same variable, and cherry-picking which components to include without justifying the choices. The parallel to the school rankings is plain.

The Fraser Institute continues to claim its rankings are a service to schools. It continues to believe that if you call an unsightly weed a rose, it shall smell sweet. One can only hope that the Fraser Institute's scent shall waste itself on the desert air.

Gordon MacIntyre
Deputy Head of School
