Writing in the National Post, Macdonald-Laurier Institute Senior Fellow Philip Cross says the proliferation of pseudo-knowledge and “sham data” is obscuring key issues facing Canadians.
Cross cites as an example a recent CIBC study declaring that job quality in Canada has hit an all-time low – just one of many studies that have provided a skewed portrait of the country using unreliable numbers.
“The result is worse than having no data—it is a complete misrepresentation of reality,” Cross writes. “A little bit of data is as dangerous as a little bit of knowledge.”
By Philip Cross, March 9, 2015
The CIBC job quality index drew reams of attention last week because it hit an all-time low. If Statistics Canada had made such a claim, alarm bells would have gone off across the country. Seriously, is it possible that today’s quality of jobs is the worst ever? Worse than at the depth of the recession in 1982, when unemployment hit 13%? Worse than in 2008-2009, when financial markets around the world were frozen and some of our most iconic firms were filing for bankruptcy? Economists would have said that doesn’t pass the smell test, because it doesn’t.
I have to laugh at the image some people have of Statistics Canada’s employees, walking around in white lab coats and safety goggles dispassionately dissecting the latest statistical results as if they were as clear-cut as a science experiment. The reality is that there is a messy element of arbitrariness to every statistical measure; Statcan publishes 10 variants of the unemployment rate, three ways of measuring GDP, and endless permutations of low income. (And the best characterization I ever heard of Statcan’s dress code was “Star Wars bar.”)
But when I compare Statcan to some other organizations, I can see why statisticians have this pristine image of quasi-scientists. Take the CIBC job quality index, a composite of part-time work, self-employment and the industry of employment. Statcan would never publish something as unrefined as this measure of job quality because it doesn’t meet the minimum standards for data quality. For starters, it is far too simplistic in reducing the many dimensions of job quality to only three variables. The International Labour Organization lists over a dozen factors that relate to job quality, such as safety and accessibility, areas where there have been huge advances over time. Nor is there any rationale for how the index is constructed; weighting the components equally simply because that requires the least amount of thought is not a defensible methodology. There is no way of knowing what the weights should be, a sure-fire warning to proceed cautiously, if at all.
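CIBC’s published methodology is not reproduced here, but the equal-weighting approach Cross criticizes amounts to something like the sketch below; the column names, the z-score normalization and the sign conventions are assumptions made purely for illustration.

    # Illustrative sketch only: CIBC's exact formula is not given in the article.
    # It averages the three components Cross names with equal weights; the
    # z-score normalization and column names below are assumptions.
    import pandas as pd

    def equal_weight_index(df: pd.DataFrame) -> pd.Series:
        """Average three standardized components into one composite score."""
        components = ["part_time_share", "self_employment_share", "industry_pay_score"]
        # Standardize each series so differing units don't dominate the average.
        z = (df[components] - df[components].mean()) / df[components].std()
        # Equal weights: the choice Cross argues is indefensible, because
        # nothing tells us what the weights ought to be.
        return z.mean(axis=1)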
The job quality index’s biggest problem is that the three components chosen have no compelling rationale as measures of job quality. The share of part-time work and self-employment can move for reasons unrelated to the quality of jobs. For example, older workers have a marked preference for both part-time work and self-employment, so some of their recent growth reflects a voluntary choice by workers who might otherwise have permanently retired from the labour force.
The worst decision in constructing the index was using the average pay by industry as a proxy for what jobs are actually paying. I say worst, because you don’t have to use a proxy: Statcan publishes the pay levels earned by all employees, and here is what those data show. Since 2010, low-paying jobs at less than $12 an hour have fallen by half a million positions. Jobs paying between $12 and $20 an hour have grown slightly; those paying $20 to $30 an hour increased by 300,000. And jobs paying more than $30 an hour have expanded by a whopping one million positions. Does that sound like the quality of jobs is deteriorating, never mind sinking to the worst on record?
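Taking the figures cited above at face value, the arithmetic is straightforward; the tally below covers the three brackets for which numbers are given (the $12-to-$20 bracket, described only as growing “slightly,” is left out rather than guessed at).

    # Changes in employment since 2010 by hourly pay, using the figures cited above.
    # The $12-$20 bracket grew only "slightly" and is omitted rather than invented.
    changes = {
        "under $12/hr": -500_000,
        "$20-$30/hr": 300_000,
        "over $30/hr": 1_000_000,
    }

    net_change = sum(changes.values())
    gains = {bracket: jobs for bracket, jobs in changes.items() if jobs > 0}
    top_share = changes["over $30/hr"] / sum(gains.values())
    print(f"Net change across these brackets: {net_change:+,} jobs")
    print(f"Share of the gains paying over $30/hr: {top_share:.0%}")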
By comparison, the CIBC index shows that jobs have grown the most in industries that pay the least. However, that misses the fact that most job creation has been in jobs that pay the highest wages, irrespective of the average wage in a particular industry. The only reason to use a proxy for wages instead of the actual wage data is that the track record is longer (the wage data only go back to 1997). Again, no reputable statistical agency would make such a decision. If a time series is wrong and a better one exists for a shorter period, you always use the better data.
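To see why the proxy misleads, consider a hypothetical worker: under the industry-average approach, a $35-an-hour job is scored by the average wage of its industry rather than by what it actually pays. The industries and figures below are invented only to show the mechanism.

    # Hypothetical illustration of the proxy problem: the industry names and
    # average wages are made up; only the mechanism matters.
    industry_avg_wage = {"retail": 16.0, "finance": 38.0}

    def proxy_pay(industry: str) -> float:
        """Industry-average proxy: every job inherits its industry's average wage."""
        return industry_avg_wage[industry]

    def actual_pay(hourly_wage: float) -> float:
        """Direct measure: what the job actually pays."""
        return hourly_wage

    job = {"industry": "retail", "hourly_wage": 35.0}
    print(proxy_pay(job["industry"]))      # 16.0 -- scored as a low-quality job
    print(actual_pay(job["hourly_wage"]))  # 35.0 -- what the worker actually earns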
Aiding and abetting the media in popularizing this meaningless data series is the silence of economists at other financial institutions. These same economists gleefully jump on every mistake Statcan makes, which is fair enough, since the government compels taxpayers to feed its statistical beast. Douglas Porter, BMO’s chief economist, openly boasts on the C.D. Howe Institute website about “bringing to light the most serious error on record by Statistics Canada” (referring to a mistake in consumer prices). Yet for years no one has commented on the obvious flaws of the job quality index.
The CIBC job quality index is an example of the statistical-media complex at its worst. An index is produced, using minimal amounts of math, stats and economic theory. However, because it apparently quantifies something, a gullible media eats it up. Attach a number to something and lo it becomes sacrosanct in our society’s futile need for everything in our public discussions to be evidence-based, even when the smallest amount of reflection reveals the data make no sense. The result is worse than having no data—it is a complete misrepresentation of reality. A little bit of data is as dangerous as a little bit of knowledge.
The number one issue facing this country is not income inequality, as CIBC claims, but the proliferation of pseudo-knowledge and sham data—nonsense floating freely in our public discourse, yet treated seriously by both mainstream and fringe media and commentators. It is the most important issue because it has ramifications for every issue society faces and for what needs to be done about them.
Philip Cross is the former Chief Economic Analyst at Statistics Canada.