‘They are convinced that slavery was an American problem.’
Kate Hardiman – University of Notre Dame: For 11 years, Professor Duke Pesta gave quizzes to his students at the beginning of the school year to test their knowledge on basic facts about American history and Western culture.
The most surprising result from his 11-year experiment? Students’ overwhelming belief that slavery began in the United States and was almost exclusively an American phenomenon, he said.
“They cannot tell you many historical facts or relate anything meaningful about historical biographies, but they are, however, stridently vocal about the corrupt nature of the Republic, about the wickedness of the founding fathers, and about the evils of free markets.”
“Most of my students could not tell me anything meaningful about slavery outside of America,” Pesta told The College Fix. “They are convinced that slavery was an American problem that more or less ended with the Civil War, and they are very fuzzy about the history of slavery prior to the Colonial era. Their entire education about slavery was confined to America.”
“Most alarmingly, they know nothing about the fraught history of Marxist ideology and communist governments over the last century, but often reductively define socialism as ‘fairness.’”
— Professor Duke Pesta
Pesta, currently an associate professor of English at the University of Wisconsin Oshkosh, has taught the gamut of Western literature—from the Classics to the modern—at seven different universities, ranging from large research institutions to small liberal arts colleges to branch campuses. He said he has given the quizzes to students at Purdue University, University of Tennessee Martin, Ursinus College, Oklahoma State University, and University of Wisconsin Oshkosh.
The origin of these quizzes, which Pesta calls “cultural literacy markers,” was his increasing discomfort with gaps in his students’ foundational knowledge.
“They came to college without the basic rudiments of American history or Western culture and their reading level was pretty low,” Pesta told The Fix.
Before even distributing the syllabus for his courses, Pesta administered his short quizzes with basic questions about American history, economics and Western culture. For instance, the questions asked students to circle which of three historical figures was a president of the United States, or to name three slave-holding countries over the last 2,000 years, or define “capitalism” and “socialism” in one sentence each.
Often, more students connected Thomas Jefferson to slavery than could identify him as president, according to Pesta. On one quiz, 29 out of 32 students responding knew that Jefferson owned slaves, but only three out of the 32 correctly identified him as president. Interestingly, more students—six of 32—actually believed Ben Franklin had been president.
Te-Ping Chen reports: Chinese teachers should be on their guard against the infiltration of Western ideas, the country’s education minister says. Also, while they’re at it, they should stop complaining and venting their grievances in front of students as well.
The minister, Yuan Guiren, made the comments at a conference Thursday at which representatives from some of China’s best universities were assembled. According to Mr. Yuan, as cited by state news agency Xinhua, universities should avoid use of teaching materials that “disseminate Western values.”
As well, Xinhua said, Mr. Yuan declared that the government “absolutely could not allow teachers to whine while teaching, air their resentments or spread negative spirits to their students.” The report didn’t elaborate on the nature of grumbling that the government was opposed to.
“Since assuming office, Chinese President Xi Jinping has actively pushed the study of traditional Chinese culture. Such a push has also come in tandem with a backlash against certain Western traditions, notably Christmas.”
Mr. Yuan’s comments come amid a growing scrutiny of ideology on China’s campuses. Earlier this month, the State Council General Office released an opinion on the need to “further strengthen and improve propaganda and ideology work.” It declared that higher education is a key “battlefield” in the struggle for ideology.
Michael Tomasky almost makes a good case here, but his credibility is strained by some perplexing comments. For example, the worst kind of wishful thinking is revealed in statements like this: “If states were to alter their conceptions of sharia law so that blasphemy and apostasy were lesser crimes, or preferably not crimes at all…” Well, of course we prefer they’re “not crimes at all.” Islamic legal scholars are pretty much on record preferring otherwise. I’d prefer that fresh coffee be delivered to my desk each morning by a team of pink unicorns. Who wouldn’t? But in the real world, I still have to go out and get my own coffee. To adherents and advocates of sharia law — perhaps not in its Western incarnations and deviations, but certainly in the Islamic world — recommending that sharia be liberalized to the point of irrelevance is itself arguably blasphemous. Or at the least, unrealistic to the point of being dangerously blind. Perhaps I’m wrong; maybe sharia has more potential for flexibility than I’m aware of. But current global trends certainly suggest otherwise.
Further, Tomasky’s flimsy defense of CAIR is questionable, and his call for maturity is rank snobbery disguised as insight: “Groups like CAIR and leading intellectuals and imams have been denouncing acts like these for years. It’s just that they don’t often make the news when they do it. So let’s please just grow out of that one,” he writes. Really? Let’s not grow out of that one, Mr. Tomasky. Terrorist front-group CAIR pays lip service to such things, but their blood-soaked insincerity is as ripe and thick as their FBI rap sheet. Let’s not even pretend that CAIR is a legitimate organization, if we’re trying to have a serious discussion. Those complaints aside? It’s a good article. And a worthwhile debate to have. Anyone willing to defend blasphemy, and advocate reform, is one of the good guys. Read the whole thing here, at The Daily Beast.
Michael Tomasky writes: Today, Saudi Arabia will flog a blogger for blasphemy. We may not be able to stop terrorists from killing, but can we pressure states?
As you go about your business today and think once or twice (as I hope you will) of Charb and his colleagues in Paris, spare another thought for Raif Badawi. He is, or was, a blogger in Saudi Arabia. Not the most agreeable place to ply the trade, as he learned in 2012 when he was arrested and charged with using his web site, “Free Saudi Liberals,” to engage in electronic insult of Islam. I read on Jonathan Turley’s blog that today, Friday, he will receive the first dose of his sentence in the form of 50 lashes.
“Have a look at this telling research from Pew on blasphemy and apostasy laws around the world. We do see that a few European countries have them on the books: Germany, Poland, Italy, Ireland, a couple more. In these countries, the punishment is typically a fine. Maybe in theory a short stint in the cooler, but in reality the laws in these countries are rarely enforced, and in some countries there hasn’t been a prosecution in years or decades.”
Badawi’s crime was to run a web site that “violates Islamic values and propagates liberal thought.” Interesting that those who sat in judgment of him found those two sets of beliefs to be incompatible. He was originally sentenced to seven years and 600 lashes. A huge international outcry ensued. He was retried, and sure enough his sentence was adjusted. It was increased—to 10 years and 1,000 lashes. But give the Kingdom credit for its sense of mercy: The lashes will be administered only 50 at a time.
Like Nick Kristof, I have been gratified to see that my Twitter feed has been bursting to the rafters with tweets from Muslims and Arabs condemning the Paris attacks in the strongest possible terms. Gratified but not surprised. Anyone who’s paid attention has known for some time now that there are millions of Muslims and Arabs (obviously, not all Muslims are Arabs, and vice versa) who espouse and fight for liberal secular values. I know some. They’re some of the most courageous people I’ve ever met.
“The most notorious states are Saudi Arabia and Pakistan, where death is an acceptable legal remedy. In 2009, a Pakistani Christian woman got into a religious argument with some Muslim women with whom she was harvesting berries. Asia Bibi, as she is known, was arrested and sentenced to death.”
It’s high time—and if this tragedy has prodded Western culture to turn this particular corner, then that’s one good thing that will have come of it—that we stop demanding of Muslims and Arabs that they denounce acts of terrorism just because they’re Muslims and Arabs.
John Armstrong writes: For decades, Western culture has been reluctant to assign an inherent value or a purpose to art—even as it continues to hold art in high esteem. Though we no longer seem comfortable saying so, our reverence for art must be founded on a timeless premise: that art is good for us. If we don’t believe this, then our commitment—in money, time, and study—makes little sense. In what way might art be good for us? The answer, I believe, is that art is a therapeutic instrument: its value lies in its capacity to exhort, console, and guide us toward better versions of ourselves and to help us live more flourishing lives, individually and collectively.
Resistance to such a notion is understandable today, since “therapy” has become associated with questionable, or at least unavailing, methods of improving mental health. To say that art is therapeutic is not to suggest that it shares therapy’s methods but rather its underlying ambition: to help us to cope better with existence. While several predominant ways of thinking about art appear to ignore or reject this goal, their ultimate claim is therapeutic as well.
Art’s capacity to shock remains for some a strong source of its contemporary appeal. We are conscious that, individually and collectively, we may grow complacent; art can be valuable when it disrupts or astonishes us. We are particularly in danger of forgetting the artificiality of certain norms. It was once taken for granted, for instance, that women should not be allowed to vote and that the study of ancient Greek should dominate the curricula of English schools. It’s easy now to see that those arrangements were far from inevitable: they were open to change and improvement.
The new paternalism is so nonconfrontational, anti-ideological, and unwilling to claim moral authority that it can hardly be called “paternal.” Let’s call it “maternalism”…
Nancy McDermott looks at the cultural assault on masculinity
‘Nanny and Sammy followed their mother’s instructions without a murmur; indeed, they were overawed. There is a certain uncanny and superhuman quality about all such purely original undertakings as their mother’s was to them. Nanny went back and forth with her light loads, and Sammy tugged with sober energy.’ (From ‘The Revolt of Mother’ by Mary E Wilkins (1)).
“…what we are seeing today is the dismantling of the historic gains of the Enlightenment in the name of The Mother”
The idea for this essay began percolating about a year ago, when I reviewed Hanna Rosin’s The End of Men. She made the case that women are achieving parity with men and even surpassing them in a number of important ways. Although I didn’t quite buy all her explanations, I liked Rosin’s book and was sorry to see so many reviewers dismiss it in what seemed like a rush to reiterate the persistence of women’s oppression. I thought her observations were reasonable, but more importantly they seemed to throw the contours of something else into relief, something beyond gender roles. It was only when I began to look at the question of paternalism that it dawned on me what this might be.
Paternalism has emerged as the dominant form of authoritarianism in our society. Across the world, policymakers are quietly working behind the scenes to save us from ourselves, nudging us towards Jerusalem with smaller fast-food cups, architecture intended to make us climb more stairs, and maternity wards that encourage bonding and breastfeeding. These policies are seldom debated or even noticed. When they are, the routine argument is not whether they are a good idea but how ‘hard’ or openly coercive they should be. Why value autonomy at all when people, left to their own devices, continually make poor choices that foil their aspirations and create a social burden in the process?
Pity the baby boomers, blamed in their youth for every ill and excess of American society and now, in their dotage, for threatening to sink the economy and perhaps Western civilization itself.
The revival of The Great Gatsby serves as a reminder that continuing to blame boomers even in their old age was not a foregone conclusion. The young people of the 1920s were as controversial to their older contemporaries as their counterparts in the 1960s and 1970s. They were called flappers (less commonly “sheiks,” in the case of men), or Bright Young Things in England. The cartoons of John Held, Jr. have memorialized their hair styles, bobbed for women, slicked back for men — the Beatles cuts and Afros of their own time. But the gilded youth of that earlier age, having enjoyed bootleg liquor and cigarettes rather than stronger substances, were allowed to make a discreet transition to middle age and then little old lady and gentleman status without the medical clucking or cultural sneers of journalists. They vanished back into the multitude while the so-called Boomers seem destined to be hounded to death. Why?
One obvious contrast is that the high-flying former young people of the 1920s suffered with their elders and their children in the Depression, and some of them were still young enough to serve alongside teenagers in the Second World War. But the turbulent 1970s were succeeded not by a new depression but by the Reagan-era boom of the 1980s, in which the Boomers metamorphosed into new folk heroes/villains, the Yuppies. Only the prosperous ones were noted as constituting a generation; the poor melted back into their communities.
There was a second difference. Age consciousness had been growing since the late nineteenth century but was still relatively rudimentary in the 1920s; “middle age,” for example, had just been invented and was not fully part of the culture until Walter B. Pitkin’s Life Begins at Forty (1933). But it was the postwar media world that created a distinctive youth mass market and thus began the definition of a generation by its popular music and amusements. In the nineteenth century, generations referred to cohorts who shared momentous political and military events that their younger siblings didn’t: the Revolutionary War, the Civil War, the First World War. Scott Fitzgerald wrote a classic description of his own cohort in its historic framework:
We were born to power and intense nationalism. We did not have to stand up in a movie house and recite a child’s pledge to the flag to be aware of it. We were told, individually and as a unit, that we were a race that could potentially lick ten others of any genus. This is not a nostalgic article for it has a point to make — but we began life in post-Fauntleroy suits (often a sailor’s uniform as a taunt to Spain). Jingo was the lingo. …
That America passed away somewhere between 1910 and 1920; and the fact gives my generation its uniqueness — we are at once prewar and postwar. We were well-grown in the tense Spring of 1917, but for the most part not married and settled. The peace found us almost intact — less than five percent of my college class were killed in the war, and the colleges had a high average compared to the country as a whole. Men of our age in Europe simply do not exist. I have looked for them often, but they are twenty-five years dead.
So we inherited two worlds — the one of hope to which we had been bred; the one of disillusion which we had discovered early for ourselves. And that first world was growing as remote as another country, however close in time.
Third, there was a vast difference in the experience of world history. Fitzgerald’s generation — at least the white upper middle class to which he belonged — shared a unifying experience of expansionist patriotism and post-World War I disillusionment. Vietnam, on the other hand, divided the young as it did the rest of the country. In fact, as the political scientist Gordon L. Bowen has written:
Contrary to the myth, when Americans were asked whether they supported or opposed the war, the youngest set of Americans were uniformly more supportive of the war than were the oldest set of Americans. Moreover, 20-somethings also were almost uniformly more likely to be supportive of the war than were 30 to 49 year olds.
Bowen also shows that throughout the war, college graduates were more likely to favor it than were people whose education stopped at elementary school.
Finally, there is a fourth reason. Old age wasn’t really officially defined in America until the Social Security Act set it at 65. The youth of the 1920s began to pay into the system and benefited in the 1960s and 1970s from pensions and Medicare thanks in part to the payments of young people entering the work force then. Now that they are reaching retirement age, they are a ripe target for demonization in the interest of “entitlement reform” as their grandparents never were. There are legitimate arguments about the financing and extent of Social Security and the level of contributions by wealthier people; I don’t mean to dismiss such concerns. But Boomerphobia — with no counterpart in Fitzgerald’s time — appears to have filled the media niche left by the political incorrectness of older stereotypes. If this collective scapegoat didn’t exist, it would have to be invented.
via The Atlantic.
The other day I visited The Oakland Museum, and while I wandered through one of its rooms this scene presented itself to me:
Immediately a thought struck me: This is it — the decline and fall of Western culture is encapsulated perfectly in this one scene.
Let me explain.
In the foreground we have a marble sculpture entitled “California Venus,” in a timeless neo-classical style.
It was carved in 1895 by sculptor Rupert Schmid.
In the background, just a few steps away, we have its companion piece, a sculpture entitled “Pink Lady.”
It was created in 1965 by artist Viola Frey.
In just 70 years, the state of sculpture in America went from beautiful and exquisitely refined to ugly, klutzy and incompetent.
I don’t know whether the curators at the Oakland Museum juxtaposed these two pieces intentionally, or if it was just an accident, but either way they deftly summarized everything that went wrong with 20th century art.
Striving for Beauty — or for Ugliness?
The very goal of art changed radically between 1895 and 1965. Back at the end of the 19th century no one yet questioned the assumption that art was an attempt to capture or create beauty. It had been that way for millennia. Little did anyone know that within just a few decades the very philosophy of art would move away from idealization first toward abstraction, then to realism, and finally to grotesquerie.