Grab your popcorn and enjoy the show.
Shawn Macomb writes: So now that the Democratic party is well and truly feeling the Bern, how should those of us who identify as neither democratic socialists nor oligarchs nor oligarch-enablers feel about those lighter-shade-of-Mao “Bernie 2016” yard signs reddening up the landscape?
The perhaps counterintuitive answer is . . . thrilled. Ecstatic, even. The Sandernistas on the march will be more fun to watch than a crossover season of Girls and The Walking Dead—if, that is, one could still stomach watching Lena Dunham now that she’s thrown in her lot with that pantsuited Goldman Sachs subsidiary who portrays Hillary Clinton on various debate stages and social media accounts.
Skeptical? Allow me to relate a single line from Outsider in the House, Sanders’s memoir of his 1996 congressional campaign: “I’m not sure how many of them actually heard my fourteen-second speech about the dangers of Newt Gingrich, given when I stepped out of my tiger costume.”
Sanders is describing his collaboration with the Bread and Puppet Domestic Resurrection Circus, “a political company whose accomplished theatrical productions are,” the then-congressman assured us, “truly radical”—radical enough to induce a sitting congressman to hold up the hindquarters of a tiger costume, anyway. “It’s better than being a horse’s ass,” Sanders writes, though whether he speaks from experience is not immediately clear.
“Alas, the charge of ‘insufficient Leninism’ is not the campaign-killer it once was. The Sandernistas don’t care about realpolitik lectures from ex-congressmen or the bitter ravings of the man whose 2000 campaign on the Green party ticket robbed the nation of four-to-eight glorious years of prime-time PowerPoint presentations from President Gore.”
Sure, the tiger-costume anecdote is a bit rich coming from the same guy who a few pages before slagged freshman Republicans who slept in their offices to save taxpayer cash back in ’95 as “total nuts” making “some kind of weird political statement.” But Sanders’s tale takes an even more absurdist turn as he recounts his address to the all-volunteer Missisquoi Valley Emergency Rescue Service later that same day. “Person after person,” Sanders notes, “talked about the trauma of seeing people die and the joy of saving people’s lives.” The contrast “from radical theatrics to community-based service,” he allows, “was striking.” Indeed. But “the differences strike me as more superficial than deep,” Sanders inexplicably feels compelled to add, as “both the rescue workers and the drama troupe are focused on . . . giving of themselves to build community.”
Even if he isn’t plotting to replace America’s first-responders with a puppeteer corps, Bernie Sanders is clearly delusional enough to be president. But is he delusional in the appropriate way?
Many of his erstwhile ideological allies are not so sure. Former congressman Barney Frank of Massachusetts, for example, snarked to National Journal, “I don’t understand what [Sanders] running for president would do other than frankly show that his viewpoint is not the majority viewpoint.” In a scathing Salon piece, writer Charles Davis averred that while, yes, Sanders “tosses rhetorical Molotovs at America’s 21st-century robber barons like few other national politicians,” he’s also “rather non-threatening, his politics reformist, not revolutionary—more old-school liberal than Leninist.”
Noah Rothman writes: Anyone who believes the 15-year-old wounds resulting from the Supreme Court’s decision in Bush v. Gore must have healed by now should ask a Broward County Democrat for their thoughts on the matter. Resentment among those who perceived themselves to be on the losing end of that decision lingers.
The notion that former Vice President Al Gore won the popular vote and yet lost the presidency is perceived even today by partisan Democrats not only as (erroneously) anathema to the foundational precepts of American constitutional governance but as a veritable crime. Forget the merits of the case, which decidedly favor the plaintiff.
A pervasive sense of victimization continues to animate many a liberal Democrat. You would think a Clinton of all people would have internalized the lessons of 2000. Instead, the likely Democratic presidential nominee and the party she is vying to lead are sowing the seeds of similar discontent that might linger on for years.
Renaissance Florence, Enlightenment Edinburgh, Mozart’s Vienna: why have certain places at certain times created such monumental leaps in thought and innovation? This is the question at the heart of travel writer Eric Weiner’s latest book, The Geography of Genius: A Search for the World’s Most Creative Places, From Ancient Athens to Silicon Valley.
“This is a book about process, about how creative genius happens and what are the circumstances,” Weiner explains. “I believe in the power of place and the power of culture to shape our lives in unexpected ways.”
Traveling the globe, Weiner looked at the locations and cultures that fostered history’s greatest minds. Through his research he pieced together a list of ingredients he believes played a vital role in creating these “genius clusters,” including money, diversity, competition, and disorder.
“A little bit of chaos is good,” says Weiner. “The pot has to be stirred. If you are fully invested in the status quo—either as a person or a place—you are unlikely to create genius because you are too comfortable.”
So can a government build a city that will generate the geniuses of tomorrow? Weiner thinks not. “I wish I could sit here and tell you that there was a formula and if you applied that formula you could create the next Silicon Valley,” he says. “There is no formula.”
About 8 minutes, 30 seconds.
Camera by Austin Bragg and Joshua Swain. Hosted and edited by Meredith Bragg.
Bernie Sanders is surging ahead of Hillary and Donald Trump is surging ahead of everyone… so are the 2016 nominees a done deal?
On the essays shelf:
My grandmother had a big illustrated copy of Poor Richard’s Almanac, which I had practically memorized by the time I was 6 years old. The illustrations were goofy and elaborate, and I somehow “got the joke” that so much of it was a joke, a satire on the do-good-ish bromides of self-serious Puritans who worry about their neighbor’s morality. Obviously I wouldn’t have put it that way at age 6, but I understood that the book in my hands, the huge book, was not serious at all.
Clearly, many others did not get the joke. Benjamin Franklin, throughout his life, was a master of parody and satire, such a master that he is still fooling people! He was his very own The Onion! He presented ridiculous arguments and opinions in a way that had people nodding their heads in agreement, then afterwards wondering uneasily if they were being made fun of. Their uneasiness was warranted. Yes, Benjamin Franklin was making fun of them.
Franklin played a huge role not only in creating bonds between the colonies, through newspapers, his printing service, and the Almanac, but also in science and community service (he started the first fire brigade in Philadelphia on the British model, and opened the first public lending library in the colonies), as well as in his writing. He was an Elder Statesman among the relatively young men who made up the Revolution. Many of “those guys” played a hand in the Revolution, but Benjamin Franklin played perhaps the most crucial role in his time as a diplomatic presence in France. There he became so beloved a figure that the French commemorated him in songs and portraits, putting his mug on plates and cups and platters and buttons, so that in a time when nobody really knew what anybody looked like, Benjamin Franklin was instantly recognizable the world over.
You Know Less Than You Think About Guns
Brian Doherty writes: “There is a gun for roughly every man, woman, and child in America,” President Barack Obama proclaimed after the October mass shooting that killed 10 at Umpqua Community College in Oregon. “So how can you, with a straight face, make the argument that more guns will make us safer? We know that states with the most gun laws tend to have the fewest gun deaths. So the notion that gun laws don’t work—or just will make it harder for law-abiding citizens and criminals will still get their guns—is not borne out by the evidence.”
In this single brief statement, Obama tidily listed the major questions bedeviling social science research about guns—while also embodying the biggest problem with the way we process and apply that research. The president’s ironclad confidence in the conclusiveness of the science, and therefore the desirability of “common-sense gun safety laws,” is echoed widely with every new mass shooting, from academia to the popular press to that guy you knew from high school on Facebook.
In April 2015, the Harvard gun-violence researcher David Hemenway took to the pages of the Los Angeles Times to declare in a headline: “There’s scientific consensus on guns—and the NRA won’t like it.” Hemenway insisted that researchers have definitively established “that a gun in the home makes it a more dangerous place to be…that guns are not used in self-defense far more often than they are used in crime…and that the change to more permissive gun carrying laws has not reduced crime rates.” He concluded: “There is consensus that strong gun laws reduce homicide.”
But the science is a lot less certain than that. What we really know about the costs and benefits of private gun ownership and the efficacy of gun laws is far more fragile than what Hemenway and the president would have us believe.
More guns do not necessarily mean more homicides. More gun laws do not necessarily mean less gun crime. Finding good science is hard enough; finding good social science on a topic so fraught with politics is nigh impossible. The facts then become even more muddled as the conclusions of those less-than-ironclad academic studies cycle through the press and social media in a massive game of telephone. Despite the confident assertions of the gun controllers and decades of research, we still know astonishingly little about how guns actually function in society and almost nothing at all about whether gun control policies actually work as promised.
Do More Guns Mean More Homicides?
“More Americans have died from guns in the United States since 1968 than on battlefields of all the wars in American history,” New York Times columnist Nicholas Kristof wrote on August 26, 2015, just after the grisly on-air murder of two television journalists in Virginia. It’s a startling fact, and true.
But does the number of guns in circulation correlate with the number of gun deaths? Start by looking at the category of gun death that propels all gun policy discussion: homicides. (Gun suicides, discussed further below, are a separate matter whose frequent conflation with gun crime introduces much confusion into the debate.)
In 1994 Americans owned around 192 million guns, according to the U.S. Justice Department’s National Institute of Justice. Today, that figure is somewhere between 245 and 328 million, though as Philip J. Cook and Kristin A. Goss wisely concluded in their thorough 2014 book The Gun Debate: What Everyone Needs to Know (Oxford University Press), “the bottom line is that no one knows how many firearms are in private hands in the United States.” Still, we have reason to believe gun prevalence likely surpassed the one-gun-per-adult mark early in President Barack Obama’s first term, according to a 2012 Congressional Research Service report that relied on sales and import data.
Yet during that same period, per-capita gun murders have been cut almost in half.
One could argue that the relevant number is not the number of guns, but the number of people with access to guns. That figure is also ambiguous. A Gallup poll in 2014 found 42 percent of households claiming to own a gun, which Gallup reports is “similar to the average reported to Gallup over the past decade.” But those looking for a smaller number, to downplay the significance of guns in American life, can rely on the door-to-door General Social Survey, which reported in 2014 that only 31 percent of households have guns, down 11 percentage points from 1993’s 42 percent. There is no settled explanation for that discrepancy, and no way to be sure which figure is closer to correct—though some doubt, especially as gun ownership continues to be so politically contentious, that people always reliably report the weapons they own to a stranger literally at their door.
The gun murder rate in 1993 was 7.0 per 100,000, according to the Centers for Disease Control and Prevention‘s (CDC) National Center for Injury Prevention and Control. (Those reports rely on death certificate reporting, and they tend to show higher numbers than the FBI’s Uniform Crime Reporting program, though both trend the same.) In 2000 the gun murder rate per 100,000 was 3.8. By 2013, the rate was even lower, at 3.5, though there was a slight upswing in the mid-00s.
This simple point—that America is awash with more guns than ever before, yet we are killing each other with guns at a far lower rate than when we had far fewer guns—undermines the narrative that there is a straightforward, causal relationship between increased gun prevalence and gun homicide. Even if you fall back on the conclusion that it’s just a small number of owners stockpiling more and more guns, it’s hard to escape noticing that even these hoarders seem to be harming fewer and fewer people with their weapons, casting doubt on the proposition that gun ownership is a political crisis demanding action.
In the face of these trend lines—way more guns, way fewer gun murders—how can politicians such as Obama and Hillary Clinton so successfully capitalize on the panic that follows each high-profile shooting? Partly because Americans haven’t caught on to the crime drop. A 2013 Pew Research Poll found 56 percent of respondents thought that gun crime had gone up over the past 20 years, and only 12 percent were aware it had declined.
Do Gun Laws Stop Gun Crimes?
The same week Kristof’s column came out, National Journal attracted major media attention with a showy piece of research and analysis headlined “The States With The Most Gun Laws See The Fewest Gun-Related Deaths.” The subhead lamented: “But there’s still little appetite to talk about more restrictions.”
Critics quickly noted that the Journal‘s Libby Isenstein had included suicides among “gun-related deaths” and suicide-irrelevant policies such as stand-your-ground laws among its tally of “gun laws.” That meant that high-suicide, low-homicide states such as Wyoming, Alaska, and Idaho were taken to task for their liberal carry-permit policies. Worse, several of the states with what the Brady Campaign to Prevent Gun Violence considers terribly lax gun laws were dropped from Isenstein’s data set because their murder rates were too low!
Another of National Journal‘s mistakes is a common one in gun science: The paper didn’t look at gun statistics in the context of overall violent crime, a much more relevant measure to the policy debate. After all, if less gun crime doesn’t mean less crime overall—if criminals simply substitute other weapons or means when guns are less available—the benefit of the relevant gun laws is thrown into doubt. When Thomas Firey of the Cato Institute ran regressions of Isenstein’s study with slightly different specifications and considering all violent crime, each of her effects either disappeared or reversed.
In 2015 we witnessed a rare geopolitical power shift – and in the face of every kind of new external challenge the leaders of the EU and the USA have never looked weaker or more bemused.
Christopher Booker writes: As we enter this new year, what is the most significant feature of how the world is changing that went almost unnoticed in the year just ended? Two events last autumn might have given us a clue.
One was the very peculiar nature of that state visit in October, when the president of China was taken in a golden coach to stay at Buckingham Palace, down a Mall lined with hundreds of placard-waving pro‑China stooges, while the only people manhandled away by Chinese security guards were a few protesters against China’s treatment of Tibet and abuses of human rights.
Led by David Cameron, our politicians could not have fawned more humiliatingly on the leader of a country whose economy, before its recent wobbles, was predicted by the IMF to overtake that of the US as the largest in the world in 2016. While Britain once led the world in steel‑making and the civil use of nuclear power, the visit coincided with the crumbling of the remains of our steel industry before a flood of cheap Chinese steel, as our politicians pleaded for China’s help in building, to an obsolete design, the most costly nuclear power station in the world.
Three weeks later came the rather less prominent visit of Narendra Modi, prime minister of India, whose even faster-growing economy is predicted by financial analysts to become bigger than Britain’s within three years, and to overtake China’s as the world’s largest in the second half of the century.
REASON TV with Spiked’s Brendan O’Neill
The 41-year-old Londoner has similarly blunt and outspoken views about “left-wing environmentalism,” which he calls “an apology for poverty” and simply the latest iteration of religious “end-of-worldism” in which “we will be judged for our sins.”
O’Neill is also a critic of European policies that he says marginalize religious and ethnic minorities even as they “protect” immigrants by passing hate-speech laws and banning burqas. “In their efforts to enforce Enlightenment values,” he says, policymakers “actually undercut them.”
O’Neill got his start at the defunct Living Marxism, the publication of Britain’s Revolutionary Communist Party, and these days he sometimes calls himself a “Marxist libertarian.” “It seems like a contradiction in terms,” he acknowledges, “but that’s because people haven’t read the original Marx and Engels, the early stuff…if you read the early stuff it’s all about liberating humanity from poverty and from state diktat and allowing them to have as free a life as possible.”
Progressives and their media allies have launched a campaign to deny the ‘Ferguson effect’—but it’s real, and it’s increasingly deadly for inner cities.
Heather Mac Donald writes: Murders and shootings have spiked in many American cities—and so have efforts to ignore or deny the crime increase. The see-no-evil campaign eagerly embraced a report last month by the Brennan Center for Justice called “Crime in 2015: A Preliminary Analysis.” Many progressives and their media allies hailed the report as a refutation of what I and others have dubbed the “Ferguson effect”— cops backing off from proactive policing, demoralized by the ugly vitriol directed at them since a police shooting in Ferguson, Mo., last year. Americans are being asked to disbelieve both the Ferguson effect and its result: violent crime flourishing in the ensuing vacuum.
“Baltimore’s per capita homicide rate, for example, is now the highest in its history, according to the Baltimore Sun: 54 homicides per 100,000 residents, beating its 1993 rate of 48.8 per 100,000 residents. Shootings in Cincinnati, lethal and not, were up 30% by mid-September 2015 compared with the same period in 2014.”
In fact, the Brennan Center’s report confirms the Ferguson effect, while also showing how clueless the media are about crime and policing.
“Homicides in St. Louis were up 60% by the end of August. In Los Angeles, the police department reports that violent crime has increased 20% as of Dec. 5; there were 16% more shooting victims in the city, while arrests were down 9.5%. Shooting incidents in Chicago are up 17% through Dec. 13.”
The Brennan researchers gathered homicide data from 25 of the nation’s 30 largest cities for the period Jan. 1, 2015, to Oct. 1, 2015. (Not included were San Francisco, Indianapolis, Columbus, El Paso and Nashville.) The researchers then tried to estimate what 2015’s full-year homicide numbers for those 25 cities would be, based on the extent to which homicides were up from January to October this year compared with the similar period in 2014.
“To the Brennan Center and its cheerleaders, the nation’s law-enforcement officials are in the grip of a delusion that prevents them from seeing the halcyon crime picture before their eyes. For the past several months, police chiefs have been sounding the alarm about rising violent crime.”
The resulting projected increase for homicides in 2015 in those 25 cities is 11%. (By point of comparison, the FiveThirtyEight data blog looked at the 60 largest cities and found a 16% increase in homicides by September 2015.) An 11% one-year increase in any crime category is massive; an equivalent decrease in homicides would be greeted with high-fives by politicians and police chiefs. Yet the media have tried to repackage that 11% homicide increase as trivial.
Several strategies are employed to play down the jump in homicides. The simplest is to hide the actual figure. An Atlantic magazine article in November, “Debunking the Ferguson Effect,” reports: “Based on their data, the Brennan Center projects that homicides will rise slightly overall from 2014 to 2015.” A reader could be forgiven for thinking that “slightly” means an increase of, say, 2%.
Nothing in the Atlantic write-up disabuses the reader of that mistaken impression. The website Vox, declaring the crime increase “bunk,” is similarly discreet about the actual homicide rate, leaving it to the reader’s imagination. Crime & Justice News, published by the John Jay College of Criminal Justice, coyly admits that “murder is up moderately in some places” without disclosing what that “moderate” increase may be.
Mass surveillance may seem eerily futuristic, but it marks a return to a time when we were watched by an omniscient authority. We called it God.
Amanda Power writes: Humanity, according to the most influential origin story of Western culture, was created naked, unashamed, wholly willing to submit to the scrutiny of the god who made the world and its rules. Through an act of defiance urged on humans by an enemy of their happy state, “their eyes were opened”—they realized their own nakedness and sought to hide from view.
“Nothing is hidden from the eyes of the observing world.”
— Aleksandr Pushkin, 1837
The god was so angered by this that he threw them out of paradise to suffer and die. This was the original sin, the disobedience for which humans deserved to be punished through generations, centuries, and until the world ends. It was, quite simply, the pursuit of knowledge not sanctioned by the one who ruled them, and the hunger for privacy from surveillance. Or so the ruling elite, through its rabbis and priests, has told the population for thousands of years, through the brief and vivid story of the Fall.
Nor did variants on this god—depending on the teller: murderous or tender, wild with wrath or soberly judging, immediate or remote, but consistently male—cease watching after humanity’s expulsion from Eden. The resulting observations were the basis for a highly interventionist treatment of those he called his chosen people. When they obeyed him, he gave them, in his hot and possessive love, pleasant places to live, and he slaughtered their enemies. When they looked to other gods, he rained devastating punishments on them until they submitted once again.
He could see into their hearts and enter their dreams. Much of this remained the same in his Christian incarnation, but the dazzling promise that immortality could be regained through Christ’s death was yoked to the demand for a particular kind of self-scrutiny: the constant examination and exposure of one’s inner self. He knew us but also insisted we know ourselves and share our knowledge with him. Participation in our own surveillance was the price of entry into heaven.
For centuries the history of Western nations was traced from these beginnings, and so for centuries this god was part of how we legitimized our forms of government and those individuals who governed us. The flawed nature of societies characterized by inequality and injustice was simply another aspect of life in the unsatisfactory world created by mankind’s original sin. Around 1159 John of Salisbury, discussing governance in his Policraticus, observed that even tyrants of the worst kind were “ministers of God, who by His just judgment has willed them to be preeminent over both soul and body.
By means of tyrants, the evil are punished and the good are corrected and trained.” All this, he believed, was a result of humans reaching a “rash and reckless hand toward the forbidden tree of knowledge,” and thereby plunging themselves into misery and death. The only remedy lay in submission to God; the only comfort in hard times was His watchful eye. So useful a tool did the idea of God prove to be—to ruler and ruled alike—that it has been carried, through the teeth of the so-called Enlightenment, into the social imagination of many republics and democracies. And it would not be surprising if these ideas, reiterated so consistently over the centuries, informed our attitudes toward the sort of surveillance we now experience as a novel aspect of modern life.
For it seems to be such a contemporary issue: the mass surveillance of the global population by corporations and government bureaucracies that has transcended all pretense of democratic accountability. The technologies that enable it are sophisticated, sleek, and silent. A sort of cyborg omniscience is obtained by those who control the information. If we have drifted into the dystopia of which George Orwell and Aldous Huxley warned, then surely, we are inclined to think, we have entered a terrifying new world.
But those who see in all this something eerily futuristic may have it backward. In our modern surveillance state, it’s possible we have in some perverse and unexpected fashion actually regained something of the comforts of being known by a higher authority—something that the modern West had largely lost, and for which we have perhaps unconsciously longed.
At its most essential level, the notion of an omniscient, omnipotent, interested, judging God was translated into our inherited forms of governance through the Roman Catholic interpretation of Christ’s words to Peter, in the Gospel According to Matthew. “Upon this rock I will build my church,” Christ says to his apostle, “and the gates of Hades shall not overpower it. I will give you the keys of the kingdom of heaven; and whatever you shall bind on earth shall be bound in heaven, and whatever you shall loose on earth shall be loosed in heaven.” The Church alleged that this authority had been transmitted through the succession of the bishops of Rome, and flowed from pope on down through the clerical hierarchy, so that every priest shared in the power to bind and loose on earth, in the knowledge that their decisions would be upheld by God.
Through the priests, God’s power to watch and judge had a human embodiment. They were not to shed blood, but there were circumstances in which they were to hand over obdurate individuals to secular authorities for execution. God’s dispersed authority was thus delegated even to laypeople whose individual jurisdiction extended no further than towns and villages. At the top of the secular hierarchy, monarchs were anointed by priests, thus symbolizing their religious legitimacy. As in John of Salisbury’s “ministers of God,” these monarchs’ worst abuses were sanctioned by the assertion of the elites that governments always operated with the backing of watchful divine will.
The philosopher talks to Mick Hume about politics, marriage and Islam.
Mick Hume writes: Ours is an age of intellectual conformism, in which expressing offensive opinions often seems to be deemed the worst offence of all; academia is decreed a ‘safe space’ where ‘uncomfortable’ ideas are banished, and using the wrong word can see you accused of committing a ‘microaggression’. And you are supposed to apologise at the first sign of a wagging finger.
“When I was in Paris in ’68 I became indignant at the total ignorance of the people who tried to tell me that this revolution was something important. I couldn’t argue with them about the thing that really mattered to me, culture. To them that was just ‘bourgeois’. This word bourgeois really got up my nose.”
Roger Scruton apparently didn’t get the memo. During our conversation, the conservative philosopher gently but unapologetically delivered blunt and cutting opinions on subjects ranging from Slavoj Zizek to Jeremy Corbyn, from banning the veil to Islamist terrorism, from homosexuality to fox hunting. Whatever anybody thinks of his views, they should surely endorse his aversion to the ‘radical censorship of anything that disturbs people’ and his insistence that the controversial ‘needs to be discussed’ rather than continually ‘pushed under the carpet’.
“I decided, yes, of course there is such a thing as the bourgeoisie and you are it, these well-fed, pampered middle-class students whose one concern was to throw stones at working-class people who happened to be in a policeman’s uniform.”
Now 71, Scruton has been the bête noire of British left intellectuals for more than 30 years, and gives them another beastly mauling in his new book Fools, Frauds and Firebrands: Thinkers of the New Left. It is a tour de force that, the introduction concedes, is ‘not a word-mincing book’, but rather ‘a provocation’. In just under 300 pages he Scruton-izes a collection of stars, past and present, of the radical Western intelligentsia – the likes of Eric Hobsbawm and EP Thompson in Britain, JK Galbraith and Ronald Dworkin in the US, Jurgen Habermas, Louis Althusser, Jacques Lacan and Gilles Deleuze in Europe. An expanded and updated version of his controversial Thinkers of the New Left (1985), the book ends with a new chapter entitled ‘The kraken wakes’ dealing with the ‘mad incantations’ of Alain Badiou and the left’s marginally newer academic celebrity, the Slovenian Zizek.
The slightly pained look on his face suggests that I am not the first to ask Scruton why he has devoted a book to taking on a collection of largely declining or deceased intellectuals and a culture that he concedes ‘now survives largely in its academic redoubts’. ‘They may seem like obscure intellectuals to the man in the street but actually they are still dominant on the humanities curriculum’, he explains. ‘If you study English or French, even musicology or whatever, you have to swallow a whole load of Lacan and Deleuze. Take Deleuze’s book, A Thousand Plateaus – the English translation has only been out a few years, but it’s already gone through 11 printings. A huge, totally unreadable tome by somebody who can’t write French.’
‘Yet this is core curriculum throughout the humanities in American and English universities. Why? The one sole reason is it’s on the left. There is nothing that anybody can translate into lucid prose, but for that very reason, it seems like a suit of armour around the age-old prejudices against power and authority, the old unshaped and unshapeable agenda.’
Defending academic freedom against the forces of conformity matters to Scruton because ‘My life began, insofar as it had a beginning, in the university. That’s where I grew up, and I love my subject, philosophy, love the whole idea of the academic and scholarly life, that one has a place apart where people are pursuing the truth and communicating that to people who are eager to learn it.
‘And this thing has completely destroyed the intellectual life.’ He considers these leftists prime culprits in what might be called the closing of the university mind, though ‘whether they caused the closing of the mind or are the effect of it is another matter’.
Scruton’s powerful aversion to ‘the French gurus of ’68 and their jargon-ridden prose’ dates from that student revolt in Paris in 1968. It gave birth to a generation of radical thinkers, and, in the process, helped turn at least one young Englishman into a conservative. ‘I was there in Paris and I was indignant at the stupidity of what I observed. I was a normal young person in England, I was brought up in a Labour Party family and as far as I had any views they’d be vaguely on the left.’ His father was a working-class lad from Manchester who became a schoolteacher and moved his family south, where Scruton attended High Wycombe Royal Grammar School, played bass guitar and listened to The Beatles before being expelled shortly after winning a scholarship to Cambridge University.
Suicide in The Fast Lane: European Civilization in Accelerated Decline, Politically Correct Universities ‘Are Killing Free Speech’
Posted: December 19, 2015
British universities have become too politically correct and are stifling free speech by banning anything that causes the least offence to anyone, academics argue.
Javier Espinoza writes: A whole generation of students is being denied the “intellectual challenge of debating conflicting views” because self-censorship is turning campuses into over-sanitised “safe spaces”, they say.
Oriel College says the statue of Rhodes, on a building he paid for, jars with the values of a modern university. It is facing a battle with Historic England, which has listed the statue as an object of historical interest.
Writing in The Telegraph, the academics, led by Frank Furedi, professor of sociology at the University of Kent, and Joanna Williams, education editor of Spiked, say it is part of a “long and growing” list of people and objects banned from British campuses, including pop songs, sombreros and atheists.
They say the “deeply worrying development” is curtailing freedom of speech “like never before” because few things are safe from student censors.
Because universities increasingly see fee-paying students as customers, they do not dare to stand up to the “small but vocal minority” of student activists who want to ban everything from the Sun newspaper to the historian David Starkey.
“In September, the University of East Anglia banned students from wearing free sombreros they were given by a local Tex-Mex restaurant because the student union decided non-Mexicans wearing the wide-brimmed hats could be interpreted as racist.”
The letter says: “Few academics challenge censorship that emerges from students. It is important that more do, because a culture that restricts the free exchange of ideas encourages self-censorship and leaves people afraid to express their views in case they may be misinterpreted. This risks destroying the very fabric of democracy.
“An open and democratic society requires people to have the courage to argue against ideas they disagree with or even find offensive. At the moment there is a real risk that students are not given opportunities to engage in such debate.
“A generation of students is being denied the opportunity to test their opinions against the views of those they don’t agree with.”
Calling on vice-chancellors to take a “much stronger stance” against all forms of censorship, they conclude that “students who are offended by opposing views are perhaps not yet ready to be at university”.
Professors have complained recently that they are being bullied online by students who are easily offended by opposing views.
In recent months, students at British universities have banned, cancelled or challenged a host of speakers and objects because some found them offensive. Maryam Namazie, a prominent human rights campaigner who is one of the signatories to the letter, was initially banned from speaking at Warwick University because she is an atheist who, it was feared, could incite hatred on campus. She spoke at Warwick in the end.