
Lasting Conservative Lessons from Liberal Reasoners

Every year, new books appear about philosophical ideas and politics. Truth be told, even those getting rapturous reviews seldom have any impact at all on thought or action in public affairs. The interesting exceptions are often new editions, sequels, or reprints of the few that did. Two examples have appeared this year, both offering flashbacks to two great debates in which their authors had a central part. One is a follow-up to a bombshell book of 1945, at the outset of the Cold War; the other is a reprint of an even more sensational work of 1987, published only a little before the collapse of Communism, in the midst of a very different intellectual and political era, which it helped redefine. The first is a posthumously published and edited collection of late essays by Karl Popper, After ‘The Open Society’ (Routledge), and the other is a 25th anniversary edition of Allan Bloom’s The Closing of the American Mind (Simon & Schuster), with a new introductory essay by Andrew Ferguson.

Popper, an Austrian Jew by origin who fled from the Nazis to New Zealand in 1937, has had plenty of critics, but has remained influential for over half a century as both a philosophical theorist and a polemicist. Starting in the 1930s with a work in German on ‘the logic of scientific discovery’, he introduced a novel criterion for assessing the value of theories in both the natural sciences and the fuzzier social ones. The criterion was not one of empirical verification, but of refutation; he pointed out that no number of observed white swans, however large, could definitively ‘verify’ the universal claim that ‘all swans are white’, but the discovery of only a single black swan would refute it. He developed and expanded this ‘fallibilist’ epistemology in a long series of later books, using it to analyze both the persuasiveness of accomplishments in natural science and the feebleness of social and psychological theories. His first two English-language works, The Poverty of Historicism (1944) and The Open Society and its Enemies (1945), were also ferociously polemical, attacking Marx and Marxism, Freud and Freudianism, and what he regarded as ‘unfalsifiable’ and ‘prophetic’ theories in general. In these earlier years, Popper nonetheless remained essentially a ‘reformist liberal’ or non-Marxist social democrat, advocating a cautious kind of ‘piecemeal social engineering’.

The Open Society was both a sensation on its appearance and one of the most celebrated and enduring works of the intellectual Cold War, remaining in print for many years. The essays in After the Open Society show how Popper gradually became less and less persuaded that even moderate and evolutionary state-directed economic changes were either morally desirable or practically efficacious. Having once sharply distinguished his own views from those of Friedrich Hayek, he eventually came around to the latter’s entirely free-market ideas. He drew far less attention in making this change than he had with his earlier books; his was not that distinctive a voice in the wider neoconservative political current that arrived in the 1980s.

The new bombshell of that decade was Allan Bloom’s book, which became a bestseller beyond the level of all of Popper’s books put together. Bloom did not propose a single new organizing Grand Idea like Popper’s ‘refutationism’, although all his essays showed the effects of the close interpretive readings of classical and modern authors that he had learned from his mentor, Leo Strauss. The philosophical reasoning was a great deal more polished and subtle than that provided by Popper, who, like the logical positivist thinkers of the Vienna Circle with whom he had once associated, was far more affected by the apparent special authority of natural science in the modern world, especially as displayed in the evolutionary biology of Darwin and the relativity and quantum physics of Einstein and Planck.

However, Bloom the Plato scholar and Strauss disciple was just as much a no-quarter polemicist as Popper, though his main target was not the Marxist-Leninists and their mirror-image fascist opponents that Popper had regarded it as necessary to stretch on his rack. Bloom, an odd mixture of American academic liberal Democrat voter and elitist Europhile cultural conservative, offered a keening lament about the entire development of modern Western university education, which he charged with a complete failure to provide a moral and intellectual foundation for rising generations. The title he had originally intended for his book is in some ways a better condensation of his indictment than the eventual famous one: Souls Without Longing.

Both Andrew Ferguson, the conservative journalist who provided an excellent historiographic introduction to Bloom and his book for the new commemorative edition, and the two editors of the late Popper essays, have been bound to realize that the world has changed so much over the last three decades that many of the disputes that concerned these authors already have a somewhat archaic quality. For younger readers interested in philosophy and politics today, approaching these books might at first seem like reading, say, a couple of brilliant Victorian authors discussing church-state relations, or the debates between Marxist and pacifist poets and novelists that were a feature of the Great Depression. But Popper and Bloom are both very much still worth reading, not only for the many things they say that continue to have substantial importance, but even more to appreciate and understand the sheer intensity and force with which they analyzed the nature of modern – non-‘post-modern’ – Western culture and political thought.

The impact of both writers was not entirely because of what they said. Popper provided a striking individual technique for showing the fallacies of Marxist and Freudian reasoning, but his conclusions were not so different from those of many rival Cold War books by liberal and social democratic writers still on the left, but bitterly disillusioned with Marxism and the Soviet Union. Bloom gave a highly condensed, and very entertaining, summary of almost all the underlying discontents that historically-informed and even moderately literate adults have been bound to have about mass democracy, and the mass expansion of ‘higher’ education that has gone with it.

But what both writers also understood and exemplify is the remarkable, almost ‘poetic’ power that it is possible to pack into single real books. The Open Society, and a couple of Popper’s other best books, like Conjectures and Refutations (1963), Bloom’s Closing and a less-known but equally fine essay collection, Giants and Dwarfs, are not just components of respectable bibliographies and personal bookshelves; they were, and for some purposes still are, books that change minds. For countless young people, reading them was a watershed point in their lives, the end of their intellectual childhood. For many others, not even an entire undergraduate program in liberal arts or social sciences, or graduate studies and degrees for that matter, ever accomplishes the same thing. It is only necessary to observe, for example, the recent mass behaviour of college students in Montreal to realize how infrequent this transformation to intellectual adulthood can be. Reading Popper and Bloom, even in 2012, still offers splendid lessons in the use of genuine reasoned argument and individual moral conviction in politics, and it is greatly to be hoped that some young people will still learn the same lesson from these brave and brilliant men.

 

Idealist Politics from Plato to Ed Broadbent

Ed Broadbent has been much in the news lately. One short-term reason was that, as a widely-respected former leader of the NDP, he made it loudly clear that he did not favour Tom Mulcair as its new one, apparently fearing that Mulcair will be more interested in broadening the party’s Canada-wide appeal than in preserving what remains of the party’s traditional socialist pieties. This may not much worry Mulcair, or many other Canadians, for the present.

However, Broadbent also got some new publicity with a separate initiative that may become a more enduring problem, not only for Mulcair, but possibly even for the Harper Conservatives. He has created, and is actively promoting, a new left-wing think tank, the Broadbent Institute. Its arrival has been signalled with the publication of a poll carried out for it by Environics, which purports to show that a broad majority of Canadians, of all regions and all income levels, are ‘willing to pay slightly higher taxes’, if that will ‘save social programs’.

The polling questions were loaded in the customary fashion of such enterprises, but they will probably still attract the attention of professional politicians and their spin doctors, including Conservative ones, since even the respondents who identified themselves as Conservative voters included well over half who gave similar answers to those given by over two-thirds of the NDP and Liberal voters, favouring tax increases in general, larger bites for higher income taxpayers, a 35% inheritance tax on estates above $5 million, and ‘returning corporate taxes to 2008 levels’.

These results are presented as an opening salvo of a continuing ‘Broadbent Institute Equality Project’ with large ambitions: ‘Our research shows that…Canadians of all political stripes want income inequality resolved, [sic] are ready for solutions [sic], and see the problem as decidedly un-Canadian [sic]’. The report concluded, ‘It’s time to tackle income inequality once and for all, and Canadians are prepared to do their part.’

Kelly McParland dismissed these claims in the National Post, observing that politicians of all parties would be unwise to take polls of this kind too seriously. He pointed out that Stéphane Dion, in particular, had found out the hard way that Canadians tend to sound a lot greener in polls than they turn out to be in actual voting behaviour. But this is a little too quick. Even among Conservatives of free-market inclinations, there is a great deal less gusto for unabashed ‘Reaganism’ than there was worldwide before the 2008 Crash. Bay Street has not aroused as much resentment as Wall Street has, but it is still quite possible that some degree of Canadian economic nationalism and egalitarianism may now be in for a renewed run. It is easy enough to see why this prospect would not be warmly received by, say, Alberta oilmen. But understanding why it should not win over, say, unemployed Eastern university graduates either, requires a reminder of where Ed Broadbent is coming from.

University courses in political science sometimes have a way of being misleading at the very outset, because they often start by contrasting the ideas of Plato and Aristotle. This has long set the stage for a loaded argument, since the more instructive contrast would be between Plato and Thucydides, or even Pericles. That is, the really fundamental argument about thought and action in politics has always been less one between rival philosophical schools than one between thinkers, however labelled, who begin with an ideal notion of the good society and then try to project it on the one in which they live, and opposing thinkers, mainly historians rather than philosophers, or men of action, who start with the historical and empirical analysis of the societies in which they find themselves, and then construct any theories accordingly.

Permanent disciples of the philosophically idealist traditions in political philosophy have never been much shaken by centuries of accumulating empirical evidence that while they may succeed in introducing some enduring particular reforms, all attempts to achieve their grander objectives, state-imposed egalitarianism above all, have not only been unsuccessful in practice, but have frequently been the cause of epic human catastrophes.

The dangers of coercive utopianism were already recognized well before the 20th century by English political thinkers, both liberal and conservative. But the middle-class and gradualist Fabian leaders who appeared at the end of the Victorian age, Bernard Shaw and the Webbs, attempted an end run around these apprehensions, roughly synthesizing reformist liberalism with socialism, although with little concern for individual freedom. The Webbs were not only diligently industrious researchers and pamphleteers; they also created two enduring institutions, the London School of Economics and a popular middlebrow weekly review, The New Statesman. Both of these provided a venue for their most influential disciple of the following generation, Harold Laski. Laski, who was sometimes a very foolish man, was nonetheless an inspiring teacher of politics at the LSE throughout the 1930s and 1940s. Shocked by the 1931 Crash, he thereafter moved from being a fairly liberal Fabian to becoming a quasi-Stalinist without ever becoming an orthodox Marxist; like Shaw and the Webbs themselves in their old age, it was mainly the Saint-Simonian ‘planning’ of the Soviets that all of them came to gullibly adore. Laski deeply impressed large numbers of the intellectually and politically ambitious young throughout the first half of the 20th century, including many who would later be the largely destructive leaders of the new states carved out of the disintegrating British Empire.

He had the same kind of impact on a wide variety of Canadian students, from Pierre Trudeau to Dalton Camp. One of these, who came to study with him in 1933, was C. B. Macpherson (‘Brough’ to his friends). Macpherson became a professor of political philosophy at the University of Toronto a few years later and, like Laski, was highly popular with students, spending his own decades impressing on young Canadians the desirability of a democracy based on economic equality. Today, outside of university political science departments, most Canadians have probably never heard of him, but by the 1960s, he had become renowned and much-honoured in the academic world, especially for his quasi-Marxist interpretation of Thomas Hobbes and John Locke, The Political Theory of Possessive Individualism, and his 1964 Massey Lectures, The Real World of Democracy. In a little-read later book called Essays in Retrieval, he also wrote a trenchant criticism of Milton Friedman, who was himself a sort of idealist philosopher, but of the ‘possessive individualism’ that Macpherson loathed.

One of Macpherson’s admiring students was Ed Broadbent, who later became yet another university political science professor, writing his doctorate on John Stuart Mill’s idea of the good society. When Macpherson died in 1987, Broadbent wrote a warm tribute to him in This Magazine, declaring of his former teacher that ‘He made the most humane and significant contribution by any Canadian to modern political thought’. No doubt he would be disappointed to learn that the Bible of modern undergraduates, Wikipedia, does not include Macpherson in its elaborate lists of notable Canadians under either ‘Educators’ or ‘Scholars’, although room was found for figures like Northrop Frye, George Grant, and Marshall McLuhan.

But Broadbent’s tribute gave away a major reason that Macpherson is not so widely remembered now: it declared that Macpherson’s one great theme was his concept of ‘possessive individualism’, his term for what he saw as the underlying 17th century philosophical basis of modern societies built around inegalitarian capitalist democracy. Broadbent, still awestruck with admiration, thought that this made Macpherson an insightful authority on practically everything.

But Laski, Macpherson, and Broadbent never seem to have grasped the possibility that one might just as well make an idée maîtresse out of ‘the political theory of academic idealism’: the preservation of a kind of eternal world of Platonic forms, imagined as both shaping past political and economic history and providing a program for making its advocates the heralds and builders of the good society, taught to generation after generation, while ‘the real world of socialist democracy’ went right on delivering disaster after disaster. Broadbent will devote his new institute to carrying on this Sisyphean enterprise; conservatives will continue to prefer the evidence of actual history, and what their own eyes and ears tell them.

 

Nicholas Hoare’s Ancestors and Lloyd Blankfein’s Contemporaries

Many Montrealers have been saddened to hear that Nicholas Hoare, owner of the excellent small chain of bookstores bearing his name, is being forced by changing conditions to close his Montreal branch. While they know his surname from his stores, in England, and throughout the whole financial world, the name has long been more commonly associated with the family’s centuries of history in merchant banking, once a vital part of world finance. What gradually happened to Hoare & Co. over the last half century provides a capsule illustration of what happened to almost all banking, with effects that eventually reached every part of modern society.

In the 19th century, on both sides of the Atlantic, the ‘commercial’ banks, those mainly taking deposits and making loans, and the ‘merchant’ or ‘investment’ banks, specializing in riskier and more exciting direct business investments, were not that different in legal status. Both used to be run by small numbers of partners, affluent, financially skilled, and mostly prudent, since they were entirely liable for loans or investments that went wrong, even to being made personally destitute, which sometimes happened. By the middle of the century, however, the commercial banks, growing huge through Victorian industrialization, began to draw away from the investment specialists. Governments first cautiously allowed them to move to ‘extended liability’, allowing larger numbers of shareholders, who were still required to take a partial hit from financial failures. But by the end of the 1930s, the big banks everywhere were being allowed to become full ‘limited liability’ corporations.

Meanwhile, the investment specialists, like Hoare & Co. in London and Goldman Sachs in New York, remained partnerships with unlimited liability, serving the vital function of launching and expanding entirely new kinds of enterprise. In the 1920s, some big American banks ventured into creating investment banking subsidiaries. But this came to be regarded (although not entirely accurately) as one of the causes of the 1929 Crash, and the Roosevelt administration’s 1933 Glass-Steagall Act strictly divided the two kinds of banking. In any case, the two decades after 1929 did not see much life on the stock market of any kind. An ordinary 1929 portfolio hit by the crash could not be sold at a profit until the 1950s, although its holder might have done well from GM and GE dividends. From 1930 to 1960, most of even the most famous investment banks looked only a little more lively than museums or libraries, often with the same men sitting in the same dignified old buildings into which they had first walked in the 1920s.

From the late 1960s on, however, banking and finance have been subjected to a continuing triple revolution. First of all, a huge new wave of demand for equity investment capital re-appeared, almost more than the investment banking partnerships could handle. Secondly, a cascade of failures and frustrations in the highly regulated postwar ‘mixed economies’ of the West led, from the 1970s on, to a very powerful new current of free-market economic ideas, which found an academic base at the University of Chicago, was popularized effectively to the general public by Milton Friedman, and spread to both political and business leaders.

As well, academic research in statistics, and ‘finance’ as a new university discipline, began to be more and more integrated into the daily practice of banking. That change dovetailed with a third large one: rapidly improving telecommunications and computerization steadily transformed all of financial intermediation, rendering many traditional banking sources of profit obsolete, while opening up all kinds of new ones, promising but not all that well understood, whether by old bankers or young computer sophisticates. For example, academic economists like Robert Merton showed that big commercial banks could boost their profit margins substantially by deliberately embracing more volatility, while their total risk would still be ‘limited’ by their limited liability; little attention was given to the terrific ‘moral hazard’ this would create a few years later.

Commercial banks, their assets growing more gigantic every year, began pressing again to be allowed into the investment banking business. Eventually deregulation made this possible, often simply by buying out existing investment banks. While only Merrill Lynch had long functioned as a public company, in time all the other investment banks gave in to internal and external pressure to do the same, although Goldman Sachs held out to the end of the 1990s, and put most of its stock issue in the hands of existing or former partners. The investment bankers in the exact sense also found themselves in frequent bitter internal quarrels with the firms’ traders, firmly fixed on maximizing the volume and size of transactions, often regarding mere clients with something of a blank stare.

Hoare & Co. showed all the effects of these multiple revolutionary forces. Until it merged in 1970 with another merchant bank to become Hoare Govett, it was led by ‘Kit’ Hoare, who was said to do business on a nod or handshake, never wrote anything down, but knew how to get in on all the best deals. Hoare Govett remained an elite merchant bank for a while, but when deregulation made a new game possible, it was sold for a high price in 1982 to a Los Angeles bank, Security Pacific. Its purchaser went down in flames in the 1992 California real estate crash, and was swallowed by the Bank of America, which then sold Hoare Govett to the Dutch giant, ABN Amro.

In 2007, by which time banking risks had been buried under layers of complex derivatives full of bad American real estate loans, the Royal Bank of Scotland moved in on ABN Amro, both managing to make a catastrophic deal. That led to Hoare Govett being sold again, for very little this time, ironically enough to a relatively small ‘boutique’ American investment bank called Jefferies. The mountains laboured, and what was once a sleek and handsome British mouse was driven hither and yon, at first fattened, then starved, then delivered into a nest of voracious American rodents. The fate of Hoare Govett was largely shared by the other old British merchant banks from the late 1970s on, described bitterly in Philip Augar’s The Death of Gentlemanly Capitalism.

In the grim reckoning of 2007-2008, famous American investment banks either went broke outright or were hastily made a compulsory meal for almost equally tottering commercial banks. Their ‘limited’ liability left a horrendous bloodbath for their main shareholders, institutional investors mostly acting for pension funds. Their chief executives had been large shareholders in their own firms as well, so they took large losses of their own, but the wider public naturally shed few tears for them, seeing them depart with years of spectacular salaries and bonuses, leaving pensioners at the bottom of the food chain holding the bag.

There is a lesson in this for conservatives, many of whom were seduced over the last three decades by the ideas of doctrinaire economic libertarians. The 2008 Crash was not just another boom-and-bust of the kind that is an ordinary consequence of capitalism. From about 1998, even apparently successful new kinds of financial intermediation simply accounted for more and more of a proportion of entire Western economies built on disguised debt, frequently masking rather anaemic development in ordinary enterprise, save in China. Letting ‘animal spirits’ rip in ordinary business has been, on the whole, a success story. But not in banking.

The gentleman merchant bankers of the first two-thirds of the 20th century were not infallible, and could sometimes be ‘piratical’ in their own ways. But by and large, they knew what they were doing, they had to take real personal responsibility for their mistakes, and that knowledge disciplined what they did. A legislative mess like the Dodd-Frank Bill does not come remotely near returning anything like that discipline, and the increased international capital requirements of ‘Basel III’ are only a little better. Banking needs another revolution, one that will take years. For future bankers and political leaders alike, it should begin in a spirit of humility, with great suspicion of systematic economic theories and abstract mathematical formulas, and with intelligent responses to public pressure. The latter may prove especially necessary, including in Canada, to moderate the activities of those politicians who use the term ‘conservative’ only to describe an obedient attention to the lobbying demands of monsters too big to fail.

 

Moral Neophilia and its Gloomy Prophet

Some news stories missed by the mass media are signposts of deep historical change. This is a condensed version of one that recently appeared in Stars and Stripes, the U.S. military newspaper:

Camp Zama, Japan, Feb. 14 – The army is ordering its hardened combat veterans to wear fake breasts and empathy bellies so they can better understand how pregnant soldiers feel during pregnancy training. This week, 13 non-commissioned officers at Camp Zama took turns wearing the ‘pregnancy simulators’ as they stretched, twisted and exercised during a three-day class… [Male NCOs] all over the world are being ordered to take the Pregnancy Postpartum Physical Training Exercise Leaders Course, or PPPT, according to health promotion educator Jana York…”[Pregnant women] shouldn’t push themselves too hard or participate in high-impact activities such as snowboarding, bungee jumping, or horse riding”, York said.

York also described the male soldiers as being ‘timid’ on their first day of pseudo-pregnancy exercises, unlikely to be the correct adjective. Combat veterans, including those with no women in their own units, were ordered to do the training. An Army study had found that when female soldiers did not take physical exercises during their pregnancy, they frequently failed physical tests on their return.

Whether this is just one more bed of Procrustes generated by obsessive egalitarianism, or a display of feminist sadism, the more interesting question is why such grotesque adventures advance so relentlessly today. One persuasive answer can be found in the writings of the late Philip Rieff. Rieff, who died in 2006, was the Benjamin Franklin Professor of Sociology at the University of Pennsylvania, but he bore little resemblance to his academic colleagues. More like a modern version of an Old Testament prophet, he was a deeply and broadly learned grand theorist, closer to his 19th century predecessors than to the social thinkers of more recent times. He was above all a scholar of Freud, whose papers he edited. However, he was not at all a ‘Freudian’, as that term is commonly understood, but a pessimistic student of civilization in general, drawing on Freud to develop an interpretive framework all his own.

Rieff had much scholarly recognition in his lifetime, but he never reached a wider public. A memorable lecturer, he sometimes wrote densely clotted prose, which got worse in his later years. His most immediately influential books were his first two, Freud: The Mind of the Moralist (1959) and The Triumph of the Therapeutic (1966). They unveiled his complete re-interpretation of Freudian thought, as remote from the original as that of Augustine was from Plato.

The ‘scientific’ claims of Freudian psychoanalysis were by then already being discredited, a process that has continued. But that little concerned Rieff, who treated Freud as a profound but ultimately unsatisfactory philosopher, on whose work he could build. He rejected the idea of Freudianism as a ‘liberation’, preferring the pessimistic conservatism of Civilization and its Discontents. Whether or not Freud would have agreed, to Rieff he should be read mainly as a guide to the indispensability of order and restraint. He thought that Freud had rightly identified a ‘primal sense of guilt’, but failed to explain it with his hypothesis of a parricidal ‘primal crime’.

Rieff, by comparing past historical cultures, classical, Judeo-Christian, and ‘post-Enlightenment’, offered a different hypothesis. He maintained that the bedrock, the real defining principle of all historical civilizations, was not found in their social, political, and economic arrangements, or even in their religious theologies, although these came closer. Underlying even the latter, according to Rieff, were what he called the interdicts of a culture, its taboos. ‘Interdicts’ did not define what people thought they ‘should’ do, but what they were compelled to do; in fact, in a culture with powerful interdicts, acting against them would not merely be regarded as ‘immoral’, but as impossible, in the same sense that jumping over one’s own shadow is impossible.

The Freudian ‘liberationism’ of 1960s neo-Marxist gurus like Herbert Marcuse appalled Rieff. He saw the idea of limitless human possibility as the basis of an existential terror, which had only been superficially banished by ‘the triumph of the therapeutic’: the replacement of good and evil with sickness and health, efficiency and inefficiency. Primal, formless, uncontrollable instinct is paralysing and isolating, preventing trust in ourselves or in others. Only a full system of ‘interdicts’ can save a culture. The interdicts can be occasionally eased by accepted periodic ‘remissions’, like the suspensions of traditional moral rules common in Mardi Gras and in harvest festivals found worldwide, but these actually strengthen the interdicts.

The real threat to the interdicts comes from egoistic transgressions, which may be the work of anyone from popular artists to successful gangsters, and if the transgressions increase in scale and are not resisted, they gradually become the new interdicts. Every culture is a constant dialectic of prohibitions and permissions, and there must be an ‘unalterable’ interpretive authority to maintain the prohibitions. Otherwise, the primal self merely ‘expresses the fecundity of its own emptiness’. If everything could be expressed by everyone identically, nothing remains to be expressed individually. The supreme activity of culture is to prevent the expression of everything, and hence to prevent the one truly egalitarian dominion: nothingness.

He thus took a different position from the Canadian philosopher George Grant, who once defined modernity as ‘no taboo’. A better prophet, Rieff thought that only applied to brief transitional periods, like the 1970s. He correctly anticipated that modernity would rapidly convert what had once been transgressions into the new taboos. Scientific reason banishes supernaturalism, and permanently revolutionary capitalist technology banishes scarcity, but progressive rationalism does not banish guilt; it merely gives it new objects. At the end of his study of Freud, Rieff argued that Western culture until the 20th century had been the creation of three ideal character types: classical political man, dedicated to the glory of his city; religious man, dedicated to the glory of God; and, via Enlightenment liberalism, economic man, believing in doing good for others by doing well for himself, whether individually, through capitalism, or collectively, through socialism. But what now survived of economic man was psychological man, beyond ideals and illusions: at best a narcissist, at worst a thug.

The children of psychological man, raised without repressions, regard all authority as illegitimate. They enter into a society without sacred hierarchies, seeking salvation only in ‘the amplitude in living itself’, a world that can only end in moral squalor and chaos. Binding moral imperatives do not depend on the ‘reason’ of psychological man, but on guilt, fear, and faith, generating obedience, trust, dependence, and communal purpose. The psychology of normless release goes with the growing imposition of ‘practical’ restrictions, driven by universalist egalitarianism.

Rieff thus became the Savonarola of the post-1965 ‘counterculture’. In Fellow Teachers (1973), he savagely attacked rebellious students, hippie dropouts, and their acquiescent and applauding professors, contemptuous of their slogans of ‘Love of Humanity’ and ‘Power to the People’. “We will see in true light the craven aping and interminable apologies for the transgressive types at the bottom: the perverts, the underclass, all those who can do no wrong because they have been wronged…”

He endorsed Max Weber’s theory of charisma, but deplored its secularization: ‘no charisma without creed’. He did not mean traditional theologies; ‘creed’ was above all moral, an ordering of interdicts and remissions. True charisma does not abolish limits, but imposes new ones. Piety and submission to wisdom were the only paths to greatness of soul, happiness, and common life.

Rieff’s bleak pessimism and unabashed defence of legitimate authority mean he is unlikely ever to become popular. But he provided a more instructive guide to the real course of modern egalitarian rationalism than its celebrants. Astute ‘religious atheists,’ like Christopher Hitchens, sometimes reach the edges of what Rieff was saying. So have surviving religious traditionalists, despite their temptations to sink into obscurantism. Both realize that egalitarian rationalism is not moving the world to freedom, justice, and happiness, but to a kind of madness, in which ‘human rights councils’ police egalitarianism’s incoherent new standards of blasphemy and sacrilege, and male soldiers are forced to pretend they are pregnant women. Transgressions need to be resisted; otherwise they will destroy culture and civilization altogether.

 

Big Sister is Watching You

George Orwell, the greatest of the now largely forgotten international galaxy of disillusioned, despairing, Dostoievskian essayists and political novelists of the Cold War, is still universally known, but not very widely understood. His two great dystopian novels, Animal Farm and 1984, have survived as frequent high school reading assignments and in occasional film adaptations. But Orwell would be little cheered by this. Always more of an essayist and cultural critic than a novelist, he did not simply set out, as is now widely assumed, to show the evil of Stalinist dictatorship. He aimed at providing a much broader and more profound analysis of the enduring utopian illusions and dangers of all modern democracies; he is as relevant as ever, but must be read, and re-read repeatedly.
However, he and all the other bleak Cassandras of the 1930s to the 1950s largely missed two major factors in the shaping of the totalitarian mind, less important in the ideological politics of his time than they have been for the last half century. The first has been the revelation of the constant vulnerability of intelligent but cloistered young men and women to what they simply take to be the ‘forward-looking’ intellectual currents of their time, hence serving as foot soldiers of unending cultural revolution. The second has been that young women in particular, with their permanent drives to create social consensus and moral improvement, are far more likely than men to be the numerous, determined, and unbending agents of that cultural revolution, especially when they are able to centre it on collectivist theories of identity. This largely explains what happened to popular ideas from about 1975 to the present.
Of course those ideas did a great deal to improve the real status and power of women in general in modern society. But what Orwell realized, all issues about the status of women aside, was that democratic cultural revolution, beyond any of its stated objectives of the moment, is also nihilist, and can thus be almost casually directed, not to ‘liberation’, but to a gradual and pervasive enslavement of the human mind. Totalitarianism is not simply the work of an oppressive State, but is carried forward by a kind of mindless collective bullying. He saw that freedom can be slowly crushed, and even gradually made inconceivable, by a constant degradation and perversion of language, to the point of preventing people from even thinking like free men and women, scarcely requiring the visible coercion of a secret political police: people come to police themselves.
It has been this kind of cultural revolution that should be causing alarm today. Feminism, unlike other radical enthusiasms of the 1960s, took root in much of the general female population, some older as well as younger, all emancipated by the contraceptive pill, widening affluence, and increasing formal education. Women were attracted not only by the persuasive arguments for expanded social and legal equality and occupational opportunity, but also by the grander ‘gender-specific’ notion of the identity of the personal and the political.
In practice, what this proved to mean by the end of the century, with the full arrival of mass post-secondary education for women and the associated indoctrination in raised ‘consciousness’, has been the rise of the form of language and thought control called political correctness. The term actually goes back to the doctrinal commandments of Stalinist Communists in the 1930s and 1940s, but today’s version is an entirely post-1960s feminist creation, which has succeeded in persuading large numbers of women that it is no more than an improving direction in manners. PC has now reached far beyond hothouse campus assemblages, although these continue as an avant-garde in discovering new imagined offences. It now polices the public language of politicians, governments, corporations, and media outlets; at Queen’s a couple of years ago, there was even an attempt to monitor students’ private conversations. Still riding the runaway steamroller of moral entitlement, PC is totalitarian in the exact sense: it has no boundaries.
So it is now starting to mimic the thought control once practised by Gestapo and KGB informers in university classrooms: in the liberal arts and education faculties and, perhaps most of all, in the law schools, the madrassas of ‘meaningful change’. And the loudest voices raised in these policing activities have been those of earnest young women.
Consider a current example, reported in Canadian Lawyer & Law Times, about what happened when a big Toronto law firm tried to be funny in a recruiting ad. It led to a letter to the editor of Obiter Dicta, the Osgoode Hall student periodical, from a student named Kisha Monroe:
“This letter concerns the [Davies Ward Phillips & Vineburg Law Firm] ad on the back of the last issue where the ‘D’ in Davies is struck through and replaced with a graffitied ‘SL’, rendering the word ‘Slavies’. It has come to my attention that this is an informal hyperbolic nickname that some students who have articled there have coined to refer to the workload they experienced during their time with the law firm. In this ad, Davies appears to be re-claiming this reputation and re-positioning it as just part of their story…
I take real exception to the fact that there are people for whom this joke would even be funny… This is beyond my control…What is even more offensive is that the legacy of the Trans-Atlantic slave trade is still alive and well with regard to disparities in access to employment, education, wealth and justice that the descendants of slaves still suffer. It is beyond distasteful…Imagine the name of the law firm was not ‘Davies’ but instead rhymed with a particular concentration camp…I…will read any subsequent running of [the ad] as racial harassment as laid out in the Ontario Human Rights Code…I will be writing to Davies as well and encourage likeminded Osgoode community members to let them know that this ad is offensive and illegal…”
The letter was followed by a shorter supporting one in the same vein by another female Osgoode student, adding the pungent observation that ‘Davies Slavies’ were paid $1450 a week. A female ‘Director of Student Affairs’ at Davies immediately sent in an abject apology: ‘…It did not occur to our team that we would be seen as making light of slavery…Obviously it should have…’ But this was not enough for another law student, this time at Queen’s, Joy Wakefield, who demanded Davies provide evidence that they were taking steps to make sure that they never did anything like this again, and were taking action to increase ‘diversity’.
What a horribly depressing experience it is for a politically incorrect male to read the thoughts of these young women about what ‘has come to [their] attention’. Their idealism and desire to serve the good is patent, but so is their awe-inspiring historical myopia and ignorance, about freedom, about slavery, and about language. They need to re-read, or read for the first time, the whole of 1984, with special attention to Orwell’s appendix on Newspeak at the end of the book. They should follow it up by reading The Captive Mind by Czeslaw Milosz, and, equally vital, Milan Kundera’s The Joke. For the great irony of what they are doing, whether they ever become ‘Slavies for Davies’ or not, is that they are trying to turn themselves, their fellow students, and their whole society into slaves, slaves like those created by the coercive utopians of 20th century Europe, not those of past centuries. Secure in their righteous rage, they will not be deterred by the nervous laughter of male colleagues. They need to be opposed head on, for their own good and for the good of all men and women.

 

The Irony of The Iron Lady

The Iron Lady, which has been praised above all for Meryl Streep’s powerful portrayal of Margaret Thatcher, has some other good features as well, like the presence of several good supporting actors, especially Jim Broadbent, who plays Thatcher’s husband Denis, for most of the movie a deceased ghostly hallucination and Greek chorus. It has apparently not been that big a hit with general audiences, perhaps because too much of the story had been given away in the heavy advance publicity, perhaps due to its many and sometimes confusing quick jumps back and forth in time. But critics have mostly been kind, enraptured by Streep’s performance. Even admirers of the real Thatcher, including British Conservatives who knew her well personally, were relieved to find that the film was not just an elaborate hatchet job, although the amount of it devoted to the elderly and failing Thatcher led John O’Sullivan to comment aptly that it would have been better entitled The Lioness in Winter.
Nonetheless, The Iron Lady is an unsatisfactory film, even something worse. One major cause is not a weakness particular to it, but one found in a long list of movies of the last decade that have drawn Academy Awards and other plaudits for the lead actor or pair of actors. There is nothing wrong with an occasional play or film that is quite obviously more a vehicle for one or two splendid virtuoso performances, but enough already. It can scarcely be said anymore that the play’s the thing; nowadays, on Broadway and the West End as well as in movies, only the actors really matter. They are more and more diminishing the stories and scripts, with writers deploying most of their own talents in providing scenes in which the actors can shine. The tendency can be seen in films as different as Million Dollar Baby and The King’s Speech.
This doesn’t matter very much when the film tells a purely fictional story, or rediscovers new possibilities in otherwise minor past people and events, as in The Man Who Was Peter Pan or My Week With Marilyn. But movies that are almost entirely showcases for the actors can be immensely irritating when they are about significant historical personages and events that still matter in the present world. Dramatic films have not often been that successful in recapturing authentic history at the best of times, but their makers now seem to be losing even the capacity to provide alternative myths. Sometimes the result has been an obvious shambles, as in Anonymous, in which talented British actors were dragged through an asinine script. In others, like The King’s Speech and The Iron Lady, the script and editing are good enough that the audience and most critics are so delighted by the terrific lead performance that the film gets praised overall. But this has some serious bad consequences. Films of this kind are not only unlikely to endure as memorable and repeatedly enjoyed classics, but contribute to the general infantilization of modern culture, joining the more obviously flawed mixture of brilliant technical effects and brainless storytelling of blockbuster makers like Steven Spielberg and James Cameron.
Meryl Streep has provided a brilliant mimicry of Margaret Thatcher’s visible personality, and The Iron Lady also offers a more ‘universal’ portrait of ambitious youth, triumphant middle age, and frailty, loneliness, and dementia in the final years. But it is not a good ‘political’ movie, as this could be said of, say, the original All the King’s Men of 1949, in which Broderick Crawford played a fictionalized Huey Long, or even a bad one, like that film’s feeble recent remake with Sean Penn in the Crawford role. The Iron Lady is not a political movie at all. It is a woman’s movie, not all that different from movies about other famous female personalities, like Coco Chanel, Marilyn Monroe, or Jackie Kennedy, with similar preoccupations: physical appearance and style; courage and vulnerability; and above all else, relations with men.
Applied to Thatcher, that is quite enough to provide a compelling story, sometimes a quite moving one. But while the approach is all that is really required for a Coco or a Marilyn, it was bound to make a fundamentally inadequate movie about Thatcher, who was never, even to her strongest admirers of either sex, a female or feminist icon, but above all a political leader, and a leader drawing on political and economic ideas held with great conviction.
There is scarcely a trace of these ideas in the film, even as objects of criticism. Its rapid and very incomplete sequence of political events is provided without context or explanation. Younger audience members, for example, would gain no idea at all that there was such a thing as the Cold War, much less that Thatcher was a major player in fighting it right through its final decade. Nor would any viewer get much sense of the ideological divisions within the Conservative Party; there is no indication that Thatcher carried forward the ideas of immediate predecessors like Enoch Powell and Keith Joseph. There is no explanation of how and why Thatcher became the successor of Edward Heath, nor of the failures of both Heath’s policies and those of his Labour successors in the grim 1970s. The Labour leader she faced when in power, Michael Foot, is shown as a mere cipher. Nor is more than the quickest and most glancing attention given to her close alliance with Ronald Reagan. Sometimes the movie’s references are so quick as to make the viewer wonder if it is a ruthlessly trimmed version of a much longer account.
The movie labours the singularity of Thatcher’s triumph in the Conservative Party, but makes it incomprehensible. It shows only her husband Denis, her early champion Airey Neave, and at most one or two other Conservatives offering her support and loyalty. The rest of the leading figures of the 1970s and 1980s British Conservative Party are shown as an entirely privileged Old Boys’ club. This portrayal makes it impossible to understand why they would ever have let her become Party Leader and Prime Minister in the first place. It also has to ignore the fact that Edward Heath, not only her predecessor but her permanent bitter enemy, both personally and in terms of domestic and foreign policies, was as much the product of a grocer’s family as she was. Those who supported and opposed her were not simply divided on class lines; she won majority support in her party by successfully opposing the ideas of free-market economics and individual liberty to the powerful Conservative traditions of ‘one England’ and noblesse oblige.
So poetic license could serve well enough to offer a version of the mythic ‘woman’s story’ of the first female British Prime Minister, but it did not merely ignore the political history; it obliterated political ideas of all kinds from the last third of the 20th century. Scrubbing out that history did not just diminish the real scale of Thatcher’s achievement; it also gave a version of her story that was a great deal less important and interesting than the real one.
When Tom Cruise got the odd idea of making Valkyrie, in which he played Claus von Stauffenberg, the courageous German officer who tried to assassinate Hitler in 1944, he did not give anything like as virtuoso a Stauffenberg as Meryl Streep’s Thatcher. But at least Cruise seems to have realized that Stauffenberg was a more important person than he was. The Iron Lady does show Thatcher’s courage, and her forceful and uncompromising character in office. But it conveys little sense that she was not just the first female Prime Minister, but one of the most powerful people in the world thirty years ago, who stood for political and economic ideas, and for moral values, in a way seldom seen before and not seen since. The implicit message of the movie is that Meryl Streep is more important than Margaret Thatcher, and that is not just poetic license, but a terrible injustice.

 

The Lord of the Atom

This year will mark the 75th anniversary of the death of Ernest Rutherford, who deserves some new reflections. He was the greatest experimental physicist of the 20th century, and one of the most creative and influential scientists of all time. During his lifetime, he was admired by his scientific peers and by the public as much as was Albert Einstein. Many historians of science think that his overall achievements were even more substantial than Einstein’s.

But Rutherford’s name has not kept the enduring mystique attached to Einstein’s, especially in the U.S. Few there objected to a claim in a 1997 speech by Bill Clinton that Americans had ‘split the atom’, although it was two of Rutherford’s students who first did that in Cambridge in 1932. His many biographers recognize the impressive scale of his laboratory triumphs, and the powerful personality of the living man; like Teddy Roosevelt, he was often described as a force of nature. But for the wider public, that force did not long outlive him.

He has remained a favourite for historically-informed scientists and historians of science, but not an Einstein-like cultural icon. This offers a lesson in the differences between history and myth. Rutherford virtually invented nuclear physics, and the particle physics that succeeded it, and was the most important single individual in shaping the practice and prestige of natural science in the first half of the 20th century. One reason popular attention nonetheless faded has been that, like his contemporary Winston Churchill, Rutherford rose to fame when Britain and its Empire appeared to be at their height. Dying unexpectedly in 1937, he did not witness, as Churchill, Bertrand Russell, and other longer-living late-Victorian contemporaries did, the postwar decline of Britain and the rapid collapse of its Empire.

He thus also missed the American domination of all fields of science that began in World War II, spectacularly and horrifyingly displayed at Hiroshima. He had frequently dismissed the likelihood of drawing energy from the atom – although he quietly warned British officials to ‘keep an eye’ on future possibilities. They did. Several of his brightest British students, and Niels Bohr, his longtime Danish colleague and friend, would play an important part in the multinational Manhattan Project. It had been Rutherford’s onetime student in Montreal in 1905, Otto Hahn, who made the crucial experiment demonstrating nuclear fission in Berlin in late 1938. The world’s then tiny group of nuclear scientists immediately realized that the Bomb would be possible, and that it might first be built by Nazi Germany. Hahn was not a Nazi, but other Germans were, including Hans Geiger, another former Rutherford student. Hahn, a British captive by August 1945, collapsed when he heard of Hiroshima, and spent the rest of his life as a passionate opponent of nuclear weapons. But it was Robert Oppenheimer, the complex, cultivated, leftist scientific head at Los Alamos, who became the tragic hero of the nuclear age, portrayed not just in histories and biographies, but in plays and movies.

Albert Einstein, whose famous letter to Roosevelt warning about the possible atomic bomb was actually written for him by two émigré Hungarian nuclear scientists, had far less to do with nuclear physics or the creation of the Bomb than is commonly imagined. But as the century’s greatest theoretical physicist, Einstein was already acquiring his iconic status in the 1920s, after observations made during a 1919 eclipse of the sun provided powerful empirical support for his 1915 General Theory of Relativity. A worldwide poll in the 1920s found him one of the two most famous men in the world (the other was Charlie Chaplin). Other scientists shared ideas with him, but he did not establish his own distinct research tradition; he was not inaccurately seen as a solitary genius, persecuted as a Jew, and a welcome arrival and adornment in America.

Rutherford, a New Zealand Scot by origin, an English graduate student, a temporary Canadian, and eventually an English peer of the realm, was a British late Victorian above all, who sometimes reminded people of a sturdy and cheerful colonial farmer. But he was also the incarnation of a brief era in which a very small elite international community of laboratory physicists and chemists, mainly British and German and learning from each other, were the world’s leading discoverers. He used astonishingly simple and inexpensive laboratory equipment. He lived just long enough to recognize the increasing usefulness of his students with engineering backgrounds, and the larger and more powerful equipment they built for the Cavendish in the 1930s. But he never saw the full arrival of ‘Big Science’, financed by government and war. For him, until the rise of Stalin and Hitler, science had remained something of an idyllic arcadia, in which a few gifted researchers, many from aristocratic backgrounds who did not even need a university income, and even those who did – new professionals like Rutherford himself – were driven mainly by pure curiosity and the competitive drive for recognition.

Hence his career was filled with ironies, some not even requiring post-Hiroshima hindsight. For example, it has now long been taken for granted that physics is the ‘paradigmatic’ science, assumed to have held a predominance going back to Galileo and Newton. That was not the way things looked when Rutherford began his career at McGill in 1898. What is now identified as ‘physics’ was in the 19th century still thought of as either ‘natural philosophy’ or as chemistry. Even the word ‘physicist’, the ‘-ist’ implying a paid professional occupation, had only been invented a few years earlier (and was detested by Michael Faraday). Encyclopedias of the early 1900s devoted dozens of pages to chemistry, only a couple to physics. However, the last years of the 19th century had also seen engineering education move from expensive apprenticeships to comparatively cheap university programs, which offered guaranteed work for mathematically-expert young men, teaching calculus to undergraduates forever after. Rutherford and many of the other most famous physicists of the early 20th century, while well-trained in higher mathematics, otherwise took their undergraduate education in liberal arts.

He arrived at McGill with distinguished Cavendish credentials, but still as a young unknown. In 1901, a fateful encounter with an even younger Oxford chemist, Frederick Soddy, led to an outstandingly successful collaboration. Studying particles emitted from uranium, they introduced the concept that radioactivity was the product of atomic disintegration. That brought Rutherford a Nobel Prize in 1908 – to his amusement, in chemistry rather than physics. Otto Hahn was a chemist as well, a very German and patient one; his skill in making ultra-microscopic titrations helps explain why it was he, rather than any of the 1930s physicists of Germany, Italy, Britain, or America, who first accurately explained nuclear fission.
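The disintegration idea also had a precise quantitative core, worth recalling here in its standard modern form (a later rendering, not the notation of the original Rutherford–Soddy papers): each radioactive species decays at a rate proportional to the number of atoms present, so that

\[
N(t) = N_0 \, e^{-\lambda t}, \qquad t_{1/2} = \frac{\ln 2}{\lambda},
\]

where \(N_0\) is the initial number of atoms, \(\lambda\) a decay constant characteristic of the element, and \(t_{1/2}\) the half-life – the regularity that made ‘transmutation’ a lawful, measurable process rather than an alchemical mystery.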

Rutherford was irresistibly drawn back to England in 1907, with a chair at Manchester. By then collaborating with other great talents like Bohr and Geiger, he essentially created modern physics, still closely allied with chemistry. His most gifted Manchester student, H. G. J. Moseley – another chemist, killed in 1915 at Gallipoli – first worked out the table of elements based on atomic number, and Rutherford also caught the public imagination as an ‘alchemist’ when he ‘transmuted’ nitrogen, by alpha-particle bombardment, into an isotope of oxygen.
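In modern notation – a convention settled only later, not Rutherford’s own – that celebrated 1919 transmutation reads:

\[
{}^{14}_{7}\mathrm{N} \;+\; {}^{4}_{2}\mathrm{He} \;\longrightarrow\; {}^{17}_{8}\mathrm{O} \;+\; {}^{1}_{1}\mathrm{H}
\]

an alpha particle absorbed by a nitrogen nucleus, ejecting a proton and leaving behind a rare isotope of oxygen.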

In 1919, he replaced J. J. Thomson as Cavendish Director, and felt somewhat stalled in the 1920s, observing with annoyance that the theoretical physicists ‘have got their tails up again’. But in the 1930s, now acting mainly as a guide and inspiration to his younger colleagues, he entered another brilliant decade. His wife, a puritanical prohibitionist, grew rather shrewish with age, and both of them were devastated by the early death of their daughter and only child in 1930. But he still found joy in the laboratory, booming out that they were living ‘in the heroic age of physics’, greeting each new experimental triumph with loud choruses of ‘Onward, Christian soldiers.’ The almost entirely British scientists who studied with him won Nobel Prizes galore in the 1930s and 1940s, even as decisive overall leadership in science was passing to the United States.

Becoming Lord Rutherford of Nelson in 1931, he soon also became a friend and adviser of Prime Minister Stanley Baldwin. In the laboratory, as teacher and mentor, he constantly drove students back to experiment and observation. Highly skilled in mathematics, he would deliberately conceal his expertise from his students, and regularly warn them about the seductive dangers of purely mathematical reasoning, quickly flying too far from the world of experience. Teasingly irreverent even about chemistry, he would probably have been even more derisory about the arriving ‘social sciences’, heavily dependent on theory, neologisms, and statistics. ‘All science,’ he would only half-jokingly proclaim, ‘is either physics or stamp collecting.’

More surprisingly, he several times turned down financial bequests to the Cavendish. He believed that too much money actually stifled imagination and creativity; that an absence of resources compelled young men to think harder. Some of his lessons could well be worth re-learning today, with theoretical physics stalled for two decades in the abstractions of string theory, with grant-hunting and university bureaucracy frequently stifling original thought – and with money visibly having the bad effects he feared. Today’s professors and students – and not just in the sciences – could still learn a lot from him.

 

The Ignorance of the Learned

PAH First 2012 Article: The Ignorance of the Learned

Most university professors are happy enough when their publications are acclaimed by their colleagues, and by reviewers in influential periodicals. But some more ambitious ones hunt bigger game. They proclaim the creation of new ‘emergent disciplines’. These hothouse products increase the over-specialization already plaguing academic studies, and have also, too often, been the means by which entrenched professorial prejudices can be repackaged as new justifications for the oppressive alliance of political fashions and expanded state power.

For a decade now, just such an emerging nuisance, called ‘agnotology’, has been cultivated with loving care by two Stanford professors in the history of science, both Harvard Ph.D.s with many academic laurels, Robert Proctor and his wife, Londa Schiebinger. ‘Agnotology’ has already been sanctified by Wikipedia, which defines it as ‘the study of culturally-induced ignorance or doubt, particularly the publication of inaccurate or misleading scientific data’.

Proctor coined the word in a 1995 book, deriving it from the Greek ‘agnosis’, or ‘not knowing’. He maintains that ignorance is a complex entity, with a ‘changing political geography…often an excellent indicator of the politics of knowledge’. Later, he applied the term ‘only half jokingly’ to a study he made of the limited geologic knowledge of agate, discovered over two thousand years ago, contrasted with the more detailed research on rocks and minerals with commercial value. Agate had been the victim of a ‘structured apathy’, part of ‘the social construction of ignorance’.

Soon he was not joking. In 2003, he and his wife organized an agnotology ‘workshop’ at Pennsylvania State University. In 2004, Schiebinger gave a paper on gender effects on 18th century scientific explorations – her speciality is ‘gender relations in science’ – which found some more agnotological ignorance, ‘the outcome of cultural and political struggle’.

By 2005, the couple were able to launch a full-scale agnotology conference at Stanford; by then, the concept was beginning to appear in popular books. In 2010, a Marxist art and film critic, Michael Betancourt, joined the pursuit, claiming in an article that the whole 1980–2008 economy had been one long ‘bubble’, the creation of what he called ‘agnotologic capitalism’.

Exciting counterfactual historical investigations may thus now unfold, explaining why various forms of ‘knowledge’ did not ‘come to be’. At the 2005 conference, for example, Proctor claimed that the science of plate tectonics was delayed a decade by military censorship of evidence about the seabed. But he could have added that it was the WW II interest of the U. S. Navy in possible undersea hiding places for subs that had led to the seabed studies in the first place.

Proctor has achieved more public celebrity for his research on how tobacco companies, government regulators, scientists, and media reports shaped what his 1995 book called The Cancer Wars. He attacked the tobacco industry for ‘manufacturing doubt’ about the increased risk of cancer from tobacco use, including its subsidizing of scientific research about everything but tobacco hazards. He has also written on a topic that was apparently a big surprise to him, although familiar to many historians: The Nazi War on Cancer. He was quite stunned to discover that the Nazis were far ‘ahead’ of the U. S., not only in recognizing the carcinogenic nature of tobacco, but in waging a public anti-smoking campaign. The failure of America to learn quickly from the Nazis was clearly a case of agnotology striking again.

Agnotology has so many things wrong with it that another ‘emerging discipline’ could be launched to refute it. It could start by returning to the word’s Greek derivation, this time expanding not on agnosis but on gnosis – not its antonym, but another academic concept, this one advanced by the conservative thinker and historian of ideas, Eric Voegelin (1901-1985). Voegelin was an émigré German scholar who devoted his life to studying the causes of the political violence and totalitarianism of the 20th century. He is unlikely ever to become a favourite at Harvard. Apart from having a difficult style, and drawing on a deep understanding of classical and modern European history, Voegelin is simply too serious for Harvard, a standing rebuke to professional superficiality. But he has important things to say, not only about the ideas that shaped the 20th century totalitarians, but about present intellectual life.

Gnosis was one of Voegelin’s central concepts, ironically quite suitable for describing the reasoning of the agnotologists. Voegelin maintained that it was an intellectual deformation arising from the abandonment of a sense of transcendence and trust – something that can never be fully defined or described, but only partially comprehended in symbols, and which provides the enduring basis of order. Its ‘gnostic’ replacement takes the form of ‘a purported direct, immediate apprehension or vision of truth…the special gift of a spiritual and cognitive elite’.

Nowadays, that is not just a description of obvious ideological fanatics, but something close to the blind and bland incomprehension of Ivy League progressive professors, including the new agnotologists. First of all, only years of shelter in academia could allow anyone to take seriously the idea that public ‘confusion’ about the cancer risks of tobacco or the possible threat of global warming requires much in the way of conscious ‘manipulation’ to explain it.

No matter how serious these threats may be over the long term, the far more obvious reason for public resistance to coercive government policies intended to set them on a more healthy and righteous path is the simple remoteness of the threats, compared with both the pleasurable and highly addictive nature of tobacco, and the alarming immediate costs demanded for rapid conversion to what are still very uneconomic non-fossil-fuel means of providing energy.

Books and articles that minimize, dismiss, or at least introduce ‘doubt’ about health or climate risks may sometimes be tendentious and wrongheaded, but at best or worst have only a limited role in explaining public ‘ignorance’ of these topics. The far more powerful factor is the commonsensical attachment to what is immediately persuasive, and a recognition that learned theories about future dangers have so often been shown to be wrong. That stolid indifference may indeed sometimes lead the masses of mankind into folly, but it has also saved them from it. It was ‘future-oriented’ intellectuals who were most rapidly and dangerously seduced by Marxism, fascism, schemes of eugenic improvement, and all the other forms of what Voegelin called gnosticism.

Not only that, the agnotologists resemble past Marxists in their incapacity to consider the merits of arguments opposing them as such, being completely preoccupied with speculative theories about the parti pris of those who do not agree with them. For example, the evidence that actual smoking is a major risk factor for cancer has long been overwhelming, but the reason many scientists have raised ‘doubt’ about the risks attendant on second-hand smoke is not simply the calculated efforts of tobacco companies to confuse the issue, but the real weakness of the statistical evidence. It is the anti-smoking crusade itself which has confused that issue.

Similarly, even if ‘climate change sceptics’ have wrongly underestimated the seriousness and immediacy of the threat of global warming, their scepticism is genuine. It is simply nonsense to imagine that they are all merely tools of the giant oil companies; indeed, several of the latter have joined the ‘need for remedial action’ chorus. And whether the public policy issue is smoking restriction, government action on energy consumption, or something else, even arguments that are demonstrably partisan are not therefore demonstrably false. Even Hitler and the Nazis were not wrong about everything, disturbing as Proctor found this; they were quite clever about a number of things besides the dangers of smoking, but were still a gang of murderous thugs.

Oddest of all for two ‘historians of science’, Proctor and Schiebinger don’t seem to realize that they have merely introduced a minor verbal variation on Francis Bacon’s 400-year-old arguments about the influence on thought of what he called ‘idols’ – of the tribe, the theatre, and so on. Nor do they seem aware that the weakness of Baconianism, like that of Freudianism, Marxism, or the ‘sociology of knowledge’ theories of writers like Karl Mannheim, is that they all wind up, as do all other would-be ‘unmasking’ theories, entangled in a version of the Cretan liar’s paradox. That is, if Professor P attempts to dismiss Proposition X on the grounds, not that it is contradicted by evidence or logic, but that it was advanced by someone working out his Oedipus Complex, displaying his class-based ‘false consciousness’, dedicated to the spreading of agnotological confusion, or whatever, any third commentator can dismiss the objections of Professor P in exactly the same way, and all reasoning and dialogue therefore collapse.

Bacon, Freud, Marx, Mannheim, and the agnotologists can all still sometimes make persuasive arguments, as also holds for Neo-Marxists, Postmodernists, and the many other current academic descendants of the Laputans in Gulliver’s Travels. Even stopped clocks are right twice a day. But there is no place like the Ivy League for confusing stopped clocks with binoculars.

 

The Metahistorical Dramatists and the Triumph of Doubt

The Metahistorical Dramatists and the Triumph of Doubt

Students of history have sometimes been tempted to imagine it as a great drama, its author God or a surrogate of God, like Hegel’s Absolute. For a century, every decade or two has witnessed the arrival of a historian who aspires to be the ultimate drama critic or rival playwright, best called a ‘metahistorian’, who goes beyond his more prosaic contemporaries, and offers an overarching theory, giving a unified interpretation of past, present, and future.

From World War I to the 1930s, it was the pessimistic arch-conservative, Oswald Spengler, who achieved international acclaim with The Decline of the West, which portrayed all civilizations, from ancient times to the present, in cycles of birth, flowering, stasis, and final decline. Its first volume was published in 1918, and was an immediate sensation in his native Germany. Later translated into several languages, including English, it suited the intellectual mood after the First World War. Spengler’s play is unavoidably tragic; he once commented that ‘optimism is cowardice’.

From the 1930s to the 1950s, a far more elaborate but similar cyclical theory was provided by the English Christian historian and Foreign Office adviser, Arnold Toynbee, in his massive multi-volume work, A Study of History, written over two decades. His greatest influence was not in his own country, but in the United States, mainly through a one-volume abridged version. Toynbee was also a prolific journalist and lecturer, and he made the cover of Time in 1947. Americans liked Toynbee’s vision better than Spengler’s, still tragic but leavened by Christian hope. But while he lived until 1975, his popularity declined sharply after 1960.

Metahistory fell somewhat out of fashion from the 1960s through the 1980s, although Paul Kennedy’s The Rise and Fall of the Great Powers (1987) was in the tradition. Kennedy, a British historian teaching at Yale, brought a military and strategic emphasis. But while his book had some impact on publication, he never captured the imagination of the intellectual world as widely as Spengler and Toynbee had at the height of their popularity.

Something near that was accomplished, however, at the start of the 1990s, by The End of History and the Last Man, by the Japanese-American political scientist Francis Fukuyama; the book continued to be widely read and discussed throughout the decade. Fukuyama’s neo-Hegelian Big Idea was that ‘history’ was now coming to an end, as the entire world was moving to a common acceptance of liberal democracy in political institutions and capitalism in economics. In Bill Clinton’s America, the great drama ends in universal bourgeois happiness.

By then it was evident that media anointment is extended to only a single metahistorical dramatist at a time. Fukuyama has written several books since his early-1990s bombshell, but has long since lost the crown. In the English-speaking world of the 21st century, it has been firmly seized by Niall Ferguson, onetime Oxford don, now holder of two Harvard professorships.

Ferguson has some differences from all his predecessors. The earlier English historian he most resembles is not Toynbee, but another flamboyant Oxford don and TV star, A. J. P. Taylor, to whom Ferguson owes a great deal. Taylor was a genuine scholar of note, but his fame, or notoriety, derived far more from his most famous book, The Origins of the Second World War (1961), which offered a very unorthodox and even shocking thesis. In it, Taylor maintained that Hitler’s great evils were all in his domestic policies, while his foreign policies were what any German government would have tried to achieve in the 1930s. His big mistake, according to Taylor, was in going after the Sudetenland first, rather than first making his alliance with Stalin; had he reversed the order, Hitler might have achieved his objectives without launching the Second World War.

Ferguson launched his fame in England in almost exactly the same way, with a new history of the First World War, The Pity of War. Otherwise mainly notable for its emphasis on economic factors, it included a very Tayloresque attention-grabber, the media rising like trout to the bait: he claimed that the world would have been better off had the British simply stayed out. In that case, the Germans would have won the war in a year or two, sparing the world the ruinous effects of the war that actually took place: a total catastrophe, not only for its terrible bloodshed, but in bringing the collapse of the old dynastic empires, the rise of Bolshevism, Fascism, and Nazism, and the seeds of another even more terrible war.

Ferguson thus revealed himself from the start as an enthusiast, a far more thoroughgoing and repeated one than Taylor, for ‘counterfactual’ history. In fact, he edited a whole volume of essays by young historians, called Virtual History, of other interesting ‘what ifs’. But he was no new Taylor otherwise. Even historians who disliked the latter would agree that he was a master of English prose, both elegant and witty. Ferguson has always known how to be donnishly amusing, but has been a far more pedestrian writer overall. His books in his own field of financial history, like The Cash Nexus, despite being on interesting topics, have little of Taylor’s sparkle and flair. But he paid his scholarly dues with his two-volume work on the Rothschilds, written with unprecedented access to the Rothschild family archives – perhaps the last of his works to attain scholarly approval while finding few general readers.

Nonetheless, his ambition and energy have carried him to greater heights. Taylor was not only a lifelong leftist, but a despairing one; he explicitly declared that he had little belief that the writing of history, his own or anyone else’s, made much difference to the way actual history worked out, and he was almost indifferent to economics. Ferguson, a Thatcherite and neoconservative economic historian, has always had larger hopes.

Taylor also remained an almost entirely British popular phenomenon. Ferguson made his transatlantic jump early, giving popular courses to undergraduates on both sides of the water, and writing more and more books suitable for TV adaptation. To his adoring Harvard students and charmed TV interviewers he has also unveiled his own Big Idea: to teach the Americans about both the greatness and the follies of the British Empire, offering instructive parallels for the present U. S.

Ferguson’s recent books, Empire, Colossus, and this year’s Civilization: The West and the Rest, with its ‘six killer apps’, are not so much historical works as rivals of the journalistic ones of Thomas Friedman. But compared to past metahistorians, he has always been more commonsensical and less theoretical, even in his less popular works. Despite his fondness for Big Topics and dramatic counterfactualism, he has remained something of an Oxford empiricist. But adapting to some American public disillusionment with neoconservatism, he has also lately been forced into some backtracking; he must now struggle with the iron law that those who live by the trend shall perish by the trend.

In his fast footwork, suitable for the media revolution of his times, Ferguson may be offering the last gasp of the metahistorical project. No more the bleak pessimism of Spenglerian tragedy, the magisterial remoteness of Toynbee, the geostrategic calculations of Kennedy, or Fukuyama’s Americanized happy Hegelianism. A. J. P. Taylor, who actually taught Paul Kennedy, lived long enough to observe sadly that his own kind of leftist politics could go through its own decline, but there was one kind of Oxford wisdom he always represented, which both Fukuyama and Ferguson have had to rediscover: historians are more a mirror of their own times than reliable prophets.

 

Hedgehogs, Foxes, and the Idea of a Global Elite

Hedgehogs, Foxes, and the Idea of a Global Elite

Alfred North Whitehead once wisely remarked that, to understand the nature of an age, one should concentrate not on its visible disputes, but on the ideas that even the opponents agree on. But he might have added that you also have to look at the way people try to understand things in general. In particular, you have to distinguish between those who view the world as Platonist hedgehogs, drawn always to Grand Theories or constructing ones of their own, and those who are Aristotelian foxes, willing to regard all kinds of important events in the world as having little or nothing to do with each other.

Applying Whitehead’s principle to the years from just before the 2008 Crash to the present, it is easy to find the great hedgehog accord: a growing preoccupation with the concept of a single elite. Academic social scientists now join with internet conspiracy cranks, Tea Partiers with Wall Street Occupiers, Fox News talking heads with pious Guardian pundits. All have been seizing on the term lately, although with different choices of favourite charter members.

This breadth of acceptance is new. Conspiracy theories of ‘the elite’ have, of course, been around for over a century, often declaring it to be staffed with Freemasons and Illuminati. But even the less zany ones have appeared as paranoid fantasies to most sensible people. Several developments have prepared the way for the arrival of more rational ideas of a new global elite.

The pre-Crash decade-long expansion of the financial sector was one major cause. In the middle of the decade, for example, the investment banking arm of then-mighty Citigroup published a report candidly declaring that its authors saw no point in even pretending that the bank was still operating in a democracy: it was now operating in a ‘plutocracy’. They didn’t even bother to flatter their wealthy clients as a new ‘aristocracy’, ready instead simply to hail a new world of ‘rule by the rich’, with no apologetic qualifications.

Then there has been the highly visible impact of the dashing Niall Ferguson, Oxford history don turned American Ivy League professor and TV pundit. Ferguson’s most similar predecessor as celebrity historian, a few decades earlier, had been A. J. P. Taylor, but Taylor was a narrower British phenomenon, and a writer of social and political history. Ferguson has not only been more Anglo-American and ‘global’, but has specialized in the economic history of Top People, in books like The Cash Nexus, The Ascent of Money, and The House of Rothschild.

Then, just a few months before the Crash, a thinktank writer and consultant named David Rothkopf attempted an exact definition. His book, Superclass: The Global Power Elite and the World They Are Making, went beyond the annual ‘Top 400’ wealth rankings provided by Forbes for many years. Assuming a world population of six billion, rather than the seven lately reached, Rothkopf identified a ‘one in a million’ group of six thousand men and women as a new, international, self-conscious ‘power elite’. According to him, they increasingly have more in common with each other than with anyone else, including the non-elite people of their countries of origin. He did not include all of the world’s hyper-rich, excluding inheritors more fond of privacy than of schmoozing, but did count the CEOs of the very largest corporations and banks, threw in about two dozen of the most powerful political leaders, and added a sprinkling of superstar celebrities, like Bono and Angelina Jolie.

The bold cynics of soon-to-be-humbled Citigroup, Ferguson’s popular history, and Rothkopf’s nose-pressed-up-against-the-window reporting, all began making their impact on journalists. ‘Global elite’ articles have now appeared in Newsweek, The Economist, and The Atlantic. The media in general have also become fascinated with individual wealth concentration. Various studies through the last decade have estimated that the top 10% of adults own 85% of global wealth, and more startling, the top 1% alone own about 40%.

However, note that even a mere hundredth of this top 1% could scarcely be comfortably accommodated at the annual gatherings at Davos. 0.01% of the world’s population is still no less than 700,000 people; 0.01% of the U. S. alone would be over 30,000 – and that is making the highly unrealistic assumption that the super-rich are as likely to build their several homes in Chad or Mongolia as in Connecticut or Florida. The real American and European figures will obviously be tens of thousands higher.
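For readers who like the arithmetic spelled out (assuming the roughly seven billion world population and 310 million U. S. population of the time):

\[
7\times10^{9} \times 0.0001 = 700{,}000, \qquad 3.1\times10^{8} \times 0.0001 = 31{,}000,
\]

both figures far beyond the couple of thousand participants Davos reportedly hosts each year.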

This alone allows for some foxy Aristotelian scepticism. How can comprehensible purposes be assigned to such large statistical assemblages – especially of owners of assets so large as to require subordinate armies of fund managers merely to keep track of their possessions? Ferguson and Rothkopf, and other writers like them, seem to have forgotten why sensible people laughed off old-fashioned conspiracy theories. The latter, alongside complete fictions, sometimes drew on selected real historical events. They still produced fantastic explanations, because the conspiracists failed to recognize that the only real links between the people and events that obsessed them were found entirely in their own consciousness.

There is a lesson in this for more respectable scholars and journalists. Elites in the plural are found in every human activity, those at the top often have power over others in the same activity, and people at the top in countless fields can afford to stay in the same luxury hotels. The great difficulty of all theories of a single elite, however, is that the factor conferring elite status in any field lies only in that field itself. It will be only erratically true, and frequently not at all, that social interaction between top people in different fields has much to do with the way they actually exercise power – much more a ‘vertical’ process than a ‘horizontal’ one, save in the cosy but essentially secondary area of shared philanthropic projects. Most human beings – not just the completely powerless, but the hundreds of millions at intermediate levels of wealth and authority – are far more concerned with the power of those immediately above them than with the distant mighty. You have to be a bank employee yourself, or a fairly grand fromage in the corporate world, to feel directly the impact of decisions made by JP Morgan CEO Jamie Dimon; if you conduct business on any lesser scale, you will be more worried about the decisions of your own local bank manager. Much the same experience of the locality of power applies to hierarchies of bureaucrats, academics, professionals, and media people.

Even a corporate CEO, a bank president, and a top political leader or bureaucrat who attended the same university, eat in the same restaurants, and frequent the same clubs are not necessarily going to have any extended common purpose. Industrialists and investors do not always like or trust even their own bankers, much less their business and financial rivals, however frequently they exchange frozen smiles in their common venues.

Half a century ago, David Riesman, a Chicago lawyer before he became a famous cultural critic, provided an insight about elite theories that remains valid. He argued that, for Americans especially, and for both reactionaries and radicals, belief in a remote and all-powerful ruling group, even a malignant one, was preferable to a haunting deeper fear. What everyone from professors to prostitutes observes, in every modern American metropolis, is a buzzing profusion of peaceful commerce and blatant crime, brilliant achievement and brutal thuggery, daily tragedies and daily comedies. The most frightening thought that can arise from this is not that it all serves the purposes of a distant and possibly evil cabal. It is that, in the end, this is all there is. NO ONE is really minding the store.

Riesman still trumps Rothkopf. All the many in the world who are not only powerless, but who live in an environment that constantly reminds them of this powerlessness – like most university students and most journalists – are always subject to the hedgehog temptation: to view the few who are powerful as needing to conspire to exercise that power. But elite tycoons and elite financiers clash with each other as readily as bicyclists and taxi drivers, and have little more idea than the rest of us about what news awaits us all tomorrow.