Consider a thought experiment about some two dozen Quebec voters in the recent election: a group of bilingual Quebec anglos living in a rural area, who turn out to include three or four surprising PQ voters, the rest voting for the provincial Liberals. Suppose further that this is how they are composed. ‘John Smith’, son of a poor Scots-Canadian millworker, has made a substantial fortune, first in a local pulp and paper business, then by shifting all his capital into other and distant fields. He has an annual income of $2 million. Through three consecutive unhappy and expensive marriages to now-deceased wives, he has produced a brood of over a dozen children, all of them hopeless wastrels, drunks, drug addicts, and militant animal liberationists, all incapable of making a living on their own, sometimes drifting away, but mostly living on the Smith rural estate. Little fond of any of them, the Smith paterfamilias has provided each with a separate trust fund permanently outside their control, paying each $30,000 a year, adjustable only for inflation. He has also prepared a challenge-proof will that will maintain the trust funds only if the principal is never touched. He wills all his major assets to his distant company employees, household servants, and favourite charities.
Further assume that some current political party spin doctor, doing an election post-mortem with provincial demographic and tax data, has found that the total incomes of the little region come to $2,500,000: $2 million from Smith, a quarter of a million from the trust-funded wastrel offspring still resident on the estate, and another quarter of a million from the household staff. The spin doctor, dividing the $2,500,000 among a total of 25 individuals, concludes that they have an ‘average’ income of $100,000, and hence that the region is composed of voters from the affluent part of the middle class. He thereby manages to be completely wrong about every single one of them.
A political operative of a half-century ago would have been even more likely than the current one to make so crude a statistical error; howlers of that kind were much more common fifty years ago, especially among journalists. On the other hand, he would also have taken for granted that, however tedious the labour, he needed to learn more about the Smith village itself. The current investigator, perhaps unlikely to be as wildly misled as the hypothetical tale suggests, would nonetheless be more likely to depend on his numerical data alone, even if more skilfully manipulated. Nowadays, political science or journalism graduates are likely at a minimum to have yawned their way through a course in ‘Quantitative Methods’, or may have acquired some quicker wisdom from the old Darrell Huff classic, How to Lie with Statistics. Most would immediately recognize the fallacy of using an arithmetic mean to describe the motley individual fortunes of the Smiths and the servant staff. But they might not realize that even a more sophisticated use of a weighted mean or a mode would still not entirely capture the Smith saga.
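The spin doctor's error, and the partial remedies those graduates would reach for, can be sketched in a few lines. The figures below are invented for illustration only, on the assumption that eight of the offspring remain resident and that sixteen staff share the quarter-million; only the shape of the distribution matters.

```python
# Hypothetical incomes for the 25 residents of the Smith village:
# one magnate, eight resident trust-funded offspring, sixteen staff.
# All figures are invented; only the lopsided shape matters.
from statistics import mean, median, mode

incomes = (
    [2_000_000]        # 'John Smith' himself
    + [30_000] * 8     # resident trust-funded offspring
    + [15_625] * 16    # household staff, sharing $250,000
)

print(f"mean:   ${mean(incomes):>9,.0f}")    # near $100,000 -- describes no one
print(f"median: ${median(incomes):>9,.0f}")  # a servant's wage
print(f"mode:   ${mode(incomes):>9,.0f}")    # the most common income
```

The mean lands near the spin doctor's $100,000, a figure matching nobody in the village; the median and mode at least point at the servants, yet even they say nothing about the wastrels or the will, which is the essay's point.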
They are, after all, living in a world in which, often without realizing it, they are far more likely to be really familiar only with some statistical version of the Smiths, or the Joneses, or the Tremblays, Gagnons, or Goldbergs, than with any extended personal contact with them, or with their detailed histories and biographies. For decades now, they will have found themselves surrounded and intellectually seduced, sometimes almost unconsciously, by an omnivorous flood of statistical information and explanation.
In some ways, this has been just a particular variation of the great transformation of all thought in the last half of the 20th century, as it grew more university-monopolized, professionalized, specialized, and exponentially expanded. But the particular advance of statistics has also been a kind of unnoticed revolution in its own right; a readable historical account of how this happened can be found in The Lady Tasting Tea, by David Salsburg, himself a distinguished statistician. Statistical data, sampling, and theory are now far more than useful tools; they have become dominant components in political policy studies, bureaucratic position papers, corporate marketing plans, research reports in the natural sciences, theses in the social sciences, and journalistic expositions.
Only men and women well over sixty are likely to be fully aware of how recent and remarkable this change has been. Many people may be misled by recalling the old jibe about ‘lies, damned lies, and statistics’, attributed variously to Disraeli and Mark Twain, and hence suggesting that annoyed familiarity goes back at least to the 19th century. But it is less often realized that, however annoying, the ‘statistics’ bemoaned by Disraeli or Twain still just meant ‘state information’, which could include things like geological survey maps. Even as the term came to be applied exclusively to numerical data, until well into the 20th century the data remained largely ‘raw’. As late as the early 1930s, it still most commonly consisted of overall government-tabulated demographic numbers, like births and deaths or employed and unemployed, and made no use of sampling and sample comparisons until late in that decade.
Quantitative reasoning about probability and statistics was already being developed in the 17th century by thinkers like William Petty and Blaise Pascal, and it gradually advanced over the next two centuries. But most of the major contributors to the mathematical statistics used today lived in the first half of the 20th century; the single most important of them, Sir Ronald Fisher, died in 1962. The textbooks that university ‘Q.M.’ students are now required to buy are mostly just glosses on a 1935 book by Fisher, The Design of Experiments. But even as late as the time of his death, statistics was still something of an academic orphan, usually represented by a single course given in the mathematics department, where it was often regarded with dislike by the pure mathematicians, who thought of it as a backwater for mediocrities.
This view of statistics carries the same irony found in almost all the social sciences. In all of them, founding fathers like Adam Smith and Max Weber were undoubtedly intelligent, but they fathered disciplines that eventually became natural homes for narrow and gullible intellectual mediocrity. Fisher, and William Sealy Gosset [who had to publish his major statistical papers under the pseudonym ‘Student’, as he was employed as a brewmaster at Guinness, statistical thinking having long been vital to the making of good beer], were intellectual giants in their own right, entirely in the same class as the ‘paradigmatic’ titans of physics and chemistry. On the other hand, just as very few of the now giant armies of social scientists, perhaps none, would be intellectual matches for Weber or Keynes, even when provided with advanced statistical techniques and computers on which to apply them, many current professors of statistics itself are intellectually mediocre, ill-educated, and unimaginative, just as the pure mathematicians had long complained.
Many intimidatingly technical statistical publications in all fields, although now seldom showing elementary howlers like the wrong choice of an unweighted mean, are still not very good. Many of them should have the ghosts of Fisher and Gosset regularly tearing their hair, since, like most real geniuses in every field of thought, their greatest intellectual achievements came in understanding the fundamental limitations of their methods. But beyond that, undeniably indispensable as many modern statistical analyses in every field have become, there is still something ‘wrong’ about even the better ones, especially in politics and economics. It is that we are now living in a culture, at the scholarly level as well as the mass-media one, in which it is regularly forgotten that the original purpose of converting information about human beings into numerical samples and quantified conclusions was to provide quick, roughly accurate, and practically useful insights about those human beings, not to replace altogether the kind of human understanding possible only through a combination of direct human experience and at least some knowledge of history, biography, literature, and philosophy.
Fisher and Gosset themselves understood that much of their work did not stand on entirely secure and consensually accepted philosophical premises. In fact, different founding premises yield different kinds of statistics, ‘frequentist’ versus ‘Bayesian’, and those fundamental premises contend with one another much as rival premises do in non-quantitative politics and economics. But many eyes glaze over at the sight of algebraic formulae, inducing a mistaken sense of mathematical infallibility, rather than the recognition of just another way of murdering to dissect. Anyone who really wants to understand society and politics, in Quebec or anywhere else, needs to meet, or at least read about, lots of real Smiths, Tremblays, and Goldbergs, ‘biased sample’ or not. Otherwise, they may fall into the habit of giving statistical abstractions more life and purpose in their minds than real human beings, which is just a modern kind of superstition.
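The frequentist–Bayesian contention is easy to state abstractly but easier to see on the village's own tiny numbers. A toy sketch, assuming four PQ voters were observed among the 25 (a figure taken loosely from the opening tale), and using the textbook uniform Beta(1, 1) prior purely for illustration:

```python
# The same tiny dataset, read by the two schools:
# 4 PQ voters observed among 25 villagers (figures from the thought experiment).
k, n = 4, 25

# Frequentist: the maximum-likelihood estimate of the PQ share is
# simply the sample proportion.
mle = k / n

# Bayesian: start from a uniform Beta(1, 1) prior over the true share;
# the posterior is then Beta(k + 1, n - k + 1), with mean (k + 1) / (n + 2).
posterior_mean = (k + 1) / (n + 2)

print(f"frequentist estimate:    {mle:.3f}")
print(f"Bayesian posterior mean: {posterior_mean:.3f}")
```

On 25 people the two answers already differ by a couple of percentage points, and which one is ‘right’ depends entirely on the founding premises, not on the algebra, which is exactly the sense in which the premises contend.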