Posted: October 2, 2013 Filed under: Mediasphere | Tags: Government shutdown, History, Honor Flight, National World War II Memorial, Twentieth Century, Wars and Conflicts, Washington DC, White House, World War II
STANDING UP TO OPPRESSIVE GOVERNMENT: You’ll love the ‘trophy’ World War II vets took home from the government shutdown blockade.
Instapundit – Twitchy
Posted: October 2, 2013 Filed under: History, Self Defense, White House | Tags: CCL, Eleanor Roosevelt, FDR, Gun License, Gun rights, History, Pistol, Revolver, Second Amendment, Self-defense, State of New York, Weapons Permit
Posted: September 11, 2013 Filed under: History, War Room | Tags: Barack Obama, Bush George Walker, George W. Bush, History, New York, President, United States
President George W. Bush: Bullhorn Speech to Emergency Rescue Workers at 9/11 Ground Zero, New York. Delivered 14 September 2001.
via ▶ YouTube
Posted: September 10, 2013 Filed under: Global, Mediasphere, War Room | Tags: BarackObama, Bashar al-Assad, Damascus, Hisham Jaber, History, Obama, Syrian Armed Forces, United States, White House, YouTube
Posted: September 5, 2013 Filed under: Japan | Tags: History, Japan, Japanese cuisine, Supersize, Tokyo
There’s a stereotype that only American food is jumbo-sized. That stereotype is wrong.
Japan also has its own super-sized helpings—known as “ohmori” (大盛り) or “large serving.” Recently on 2ch, Japan’s largest web forum, images of “mega ohmori” meals were uploaded. Here’s a look at those, along with some other huge meals restaurants offer. Some of these dishes are called “mega” (メガ) or even “jumbo” (ジャンボ or “janbo”) in Japanese.
Read the rest of this entry »
Posted: August 29, 2013 Filed under: War Room | Tags: Barack Obama, History, Huffington Post, John Kerry, President, United States, White House, YouTube
by CHRISS W. STREET
As President Barack Obama has outwardly attempted to curtail Americans’ Constitutional Second Amendment right to bear arms, his Administration has approved huge increases in defense spending and export sales. The Administration is now seeking to eliminate stringent State Department controls on exports and foreign licensing of dozens of categories of weapons and technology from the United States Munitions List (USML) by transferring control to the pro-business Commerce Department.
In spite of the Government Accountability Office (GAO) warnings this change could increase terrorist access to dangerous weapons, the Administration claims this “reform” would enhance “the competitiveness of key United States manufacturing and technology sectors.”
Read the rest of this entry »
Posted: August 27, 2013 Filed under: Mediasphere | Tags: American Sniper, Bradley Cooper, Chris Kyle, Clint Eastwood, History, military, Steven Spielberg, The Hollywood Reporter
Eastwood is in early talks to direct the movie, based on the autobiography of Navy SEAL Chris Kyle, The Hollywood Reporter has confirmed.
Steven Spielberg was previously on board to direct the project but left earlier this month after he and the studio couldn’t come to agreement on a budget. (The parting of ways was quite amicable, according to several sources.) Bradley Cooper is attached to star and has been developing the project as a producer.
If a deal is made, that puts Eastwood in a tight schedule squeeze. The veteran filmmaker is about to begin directing Jersey Boys, the adaptation of the Broadway musical about the rise and fall of Frankie Valli and the Four Seasons.
Sniper must shoot early next year because of Cooper’s many commitments. But Eastwood is famously known for his short and efficient shoots, so the studio has no fear that he won’t be able to pull it off.
Sniper is an adaptation of Kyle’s book American Sniper: The Autobiography of the Most Lethal Sniper in U.S. Military History. It reveals how Kyle came to record the highest number of sniper kills for an American. The book has been praised for its frankness in telling a first-person account of a warrior who shoots from far and close distances.
Kyle was killed at a shooting range by a fellow veteran in February.
Twitch first reported the news.
Posted: August 26, 2013 Filed under: Mediasphere | Tags: Crime, Glamour, History, Mugshots, Photography
The Sydney Living Museum recently made public a series of mugshots taken in the 1920s. Yes, I know, I know, Australia’s always had a rep for being an island of criminals, but these mugshots are actually super cool. The collection comprises both male and female convicts, posing artfully in both close-up and full-body shots.
Read the rest of this entry »
Posted: August 25, 2013 Filed under: Mediasphere, Reading Room | Tags: Benjamin Franklin Museum, Casson Mann, Founding Father, History, Philadelphia, Robert Venturi
Benjamin Franklin (1706-1790), portrait by David Martin, 1767
For the latest word on biographical museums, check out the newly revamped Benjamin Franklin Museum in Philadelphia. If it hasn’t shaken the “great man of history” myth entirely, it at least offers more transparency about Franklin’s life–and his living quarters.
Unlike the old museum, the new one, opening today, devotes considerable attention to Franklin’s personal life. It shows his very expansive view of family, his rich social network and his evolving attitudes about slavery. In some respects, Franklin seems quite contemporary–his career as a political revolutionary gained steam after a British court chastised him for leaking some letters.
“People don’t know very much about his personal life; they don’t know much about him as a man,” says curator Page Talbott. “Our goal is to augment that.”
Read the rest of this entry »
Posted: July 30, 2013 Filed under: Economics | Tags: Economic history, England, George Mason University, Gordon, History, Human, Industrial Revolution, Second Industrial Revolution, Standard of living, Sweden, Tyler Cowen, United States
What if everything we’ve come to think of as American is predicated on a freak coincidence of economic history? And what if that coincidence has run its course?
By Benjamin Wallace-Wells
Illustration by Mario Hugo
Picture this, arranged along a time line.
For all of measurable human history up until the year 1750, nothing happened that mattered. This isn’t to say history was stagnant, or that life was only grim and blank, but the well-being of average people did not perceptibly improve. All of the wars, literature, love affairs, and religious schisms, the schemes for empire-making and ocean-crossing and simple profit and freedom, the entire human theater of ambition and deceit and redemption took place on a scale too small to register, too minor to much improve the lot of ordinary human beings. In England before the middle of the eighteenth century, where industrialization first began, the pace of progress was so slow that it took 350 years for a family to double its standard of living. In Sweden, during a similar 200-year period, there was essentially no improvement at all. By the middle of the eighteenth century, the state of technology and the luxury and quality of life afforded the average individual were little better than they had been two millennia earlier, in ancient Rome.
Then two things happened that did matter, and they were so grand that they dwarfed everything that had come before and encompassed most everything that has come since: the first industrial revolution, beginning in 1750 or so in the north of England, and the second industrial revolution, beginning around 1870 and created mostly in this country. That the second industrial revolution happened just as the first had begun to dissipate was an incredible stroke of good luck. It meant that during the whole modern era from 1750 onward—which contains, not coincidentally, the full life span of the United States—human well-being accelerated at a rate that could barely have been contemplated before. Instead of permanent stagnation, growth became so rapid and so seemingly automatic that by the fifties and sixties the average American would roughly double his or her parents’ standard of living. In the space of a single generation, for most everybody, life was getting twice as good.
At some point in the late sixties or early seventies, this great acceleration began to taper off. The shift was modest at first, and it was concealed in the hectic up-and-down of yearly data. But if you examine the growth data since the early seventies, and if you are mathematically astute enough to fit a curve to it, you can see a clear trend: The rate at which life is improving here, on the frontier of human well-being, has slowed.
If you are like most economists—until a couple of years ago, it was virtually all economists—you are not greatly troubled by this story, which is, with some variation, the consensus long-arc view of economic history. The machinery of innovation, after all, is now more organized and sophisticated than it has ever been, human intelligence is more efficiently marshaled by spreading education and expanding global connectedness, and the examples of the Internet, and perhaps artificial intelligence, suggest that progress continues to be rapid.
But if you are prone to a more radical sense of what is possible, you might begin to follow a different line of thought. If nothing like the first and second industrial revolutions had ever happened before, what is to say that anything similar will happen again? Then, perhaps, the global economic slump that we have endured since 2008 might not merely be the consequence of the burst housing bubble, or financial entanglement and overreach, or the coming generational trauma of the retiring baby boomers, but instead a glimpse at a far broader change, the slow expiration of a historically singular event. Perhaps our fitful post-crisis recovery is no aberration. This line of thinking would make you an acolyte of a 72-year-old economist at Northwestern named Robert Gordon, and you would probably share his view that it would be crazy to expect something on the scale of the second industrial revolution to ever take place again.
“Some things,” Gordon says, and he says it often enough that it has become both a battle cry and a mantra, “can happen only once.”
Read the rest of this entry »
Posted: June 6, 2013 Filed under: Mediasphere | Tags: Abraham Lincoln, BarackObama, History, National Security Agency, New York Times, Obama administration, President, United States
Completely Lost Its Mind, Lost All Credibility
“Nearly all men can stand adversity, but if you want to test a man’s character, give him power.”
NYT Not as Thrilled as It Was a Few Months Ago
Posted: April 9, 2013 Filed under: Mediasphere, War Room | Tags: History, Holocaust, Holocaust Memorial Days, Jews, Nazism, Twentieth Century, Warsaw Ghetto Uprising, Yom HaShoah
Victims. Helpless. Downtrodden. That’s the narrative that’s been spread about Jews in the 70 years since the Holocaust. We’ve embraced it to our detriment. We can’t seem to address antisemitism without running to the world and screaming that we’re being persecuted, rather than standing up strongly in defiance, aware of our own inner strength.
The Holocaust has scarred us, a yetzer hara, a sneaky bastard of a voice in our heads that keeps trying to tell us how we are defined by our past, controlled by events that happened to us, instead of using those moments as points of growth.
And, in a weird way, that’s why all those images of us looking so helpless, so gaunt, in heaps of nameless bodies, have become a morbid fascination for us. We, and by extension the rest of the world, have chosen to define the Holocaust with these images. But there are other images. Images that show a more subtle, more true, story. A story that shows our inner power, our inner turmoil in dealing with a situation we cannot comprehend, our attempts to gain justice, and our final steps into moving above and beyond our past and into a new future.
These are the images you will see below. Some of them may be disturbing to you. Some of them may inspire you. But in the end, they do one thing that we desperately need as a people: they tell the real story of the Holocaust. A story that goes beyond victimhood and into our present-day lives. And today, on Yom HaShoah, 2013, it’s about time that story got told.
via 20 Photos — Pop Chassid
Posted: October 8, 2012 Filed under: Breaking News | Tags: Andrew Sullivan, Barack Obama, Daily Beast, Elections, History, Mitt Romney, President, United States
“The Pew poll is devastating, just devastating…”
This is devastating…devastating…
I‘m … I’m… Did I mention that I’m devastated?
>> Andrew Sullivan
Posted: October 8, 2012 Filed under: Mediasphere | Tags: Barack Obama, Democratic Party, History, Obama, President, United States
“I’m shocked, shocked…”
“Obama phone”: Program to aid the poor lining the pockets of the wealthy
A wireless company profiting from the so-called “Obama phone” giveaway program is run by a prominent Democratic donor whose wife has raised more than $1.5 million for the president since 2007…
More >> via Washington Free Beacon
Posted: October 3, 2012 Filed under: War Room | Tags: Anti-Americanism, History, journalism, media, Middle East, Terror
The US state department believes American journalist Austin Tice, who disappeared in Syria in August, is in the custody of the Bashar al-Assad regime. This video, which emerged on Monday, purports to show a blindfolded Tice being led up a rocky pathway. The state department cannot confirm the authenticity of the video.
via >> guardian.co.uk
Posted: September 30, 2012 Filed under: War Room | Tags: Barack Obama, History, Terror, United States
“Actually, this is much more than an issue of semantics. Calling it a terrorist attack would have given Obama powers under the Authorization for the Use of Military Force Against Terrorists (AUMF) to use military action, including drone warfare, against the perpetrators. If he were serious about “bring[ing] to justice the killers,” which he vowed to do in the speech, then labeling this incident a terrorist attack if he believed that’s what it was would have been critical. Instead, we now have the FBI sitting with its hands bound in Tripoli, unable to move forward with a serious investigation.”
via Commentary Magazine
Posted: September 29, 2012 Filed under: Reading Room, War Room | Tags: al Qaeda, Anti-Americanism, History, Israel, Libya, Terror, United States
On Nov. 9, 1938, thousands of German storm troopers, acting under direct orders, launched the Jewish pogrom known as Kristallnacht. The attacks left approximately 100 Jews dead and 7,500 Jewish businesses damaged. Hundreds of homes and synagogues were vandalized.
The mastermind of the pogrom, Nazi propaganda minister Joseph Goebbels, explained it to the world as a “spontaneous” reaction to the murder of a German diplomat by Herschel Grynszpan, a 17-year-old Jew. Goebbels said the pogrom showed the “healthy instincts” of the German people.
Some Jewish organizations, while strongly condemning German actions, expressed concern about the pogrom’s alleged cause. The World Jewish Congress stated that it “deplored the fatal shooting of an official of the German Embassy by a young Polish Jew.” These displays of contrition did not help. Kristallnacht was soon followed by the Holocaust, in which more than six million European Jews died.
What can we learn from that tragic history? First, atrocities on such a scale are rarely “spontaneous.” They require preparation and organization. Equally important is the lesson that accepting enemy propaganda makes us look weak and shortsighted. Any appreciation of the pretexts for such atrocities makes their perpetrators bolder and more aggressive.
Unfortunately, these lessons have not been learned. America’s ambassador to Libya is dead, U.S. embassies in Egypt and other Muslim countries are under siege, the American flag is being burned, and the Obama administration and media have blamed a video clip instead of denouncing the perpetrators.
“…accepting enemy propaganda makes us look weak and shortsighted…”
The lack of realism is stunning. “We reject all efforts to denigrate the religious beliefs of others,” said President Barack Obama—not about the murder of Americans or the persecution of Christians and Jews in Muslim countries, but about an amateur film on YouTube. Secretary of State Hillary Clinton said that the film is “disgusting and reprehensible.” These sentiments were echoed by the chairman of the Joint Chiefs of Staff, the U.S. ambassador to the U.N., and a multitude of pundits.
Yet all evidence indicates that events of Sept. 11, 2012, were not a “spontaneous reaction” to the 14-minute trailer, but were pre-organized—not only in Benghazi but in Cairo as well. The film, “Innocence of Muslims,” was available on YouTube for a long time without attracting any attention. Two days before the riots, the film was broadcast in Arabic on the Salafi Egyptian television channel Al-Nas. Several popular preachers on other conservative Islamic satellite channels called upon people to turn out Tuesday at the U.S. Embassy in Egypt. If this was not organization, what was it?
Still, America’s leaders have effectively accepted that the main blame for the embassy attacks should be put on the producers of the video clip, rather than on the organizers and participants of the violence. America’s leaders did not stand up for freedom of speech. Instead, they practically apologized for the lack of censorship in the U.S….
More via >> How to Win an Ideological War | Defining Ideas | Hoover Institution
Mr. Yarim-Agaev is a scientist and human-rights activist who was a leading dissident in the Soviet Union in the 1970s. He is currently a distinguished visiting fellow at the Hoover Institution.
Posted: September 29, 2012 Filed under: Mediasphere | Tags: Campaign, History, journalism, media
“…the pundits and pollsters, whose collective integrity could almost fill a shot glass…”
(watch till the end for the concluding statement)
via >> The American Gob: News Politics and Culture >> via PJMedia
Posted: September 24, 2012 Filed under: Economics | Tags: Barack Obama, History, Iran, Mitt Romney, Obama, President, Republicans, United States
The annotated Obama: How 90% of the deficit becomes somebody else’s fault
A question raised by President Obama’s immortal line on CBS’s “60 Minutes” on Sunday—”I think that, you know, as President, I bear responsibility for everything, to some degree”—is what that degree really is. Maybe 70% or 80% of the buck stops with him? Or is it halfsies?
Nope. Now we know: It turns out the figure is 10%. The other 90% is somebody else’s fault.
This revelation came when Steve Kroft mentioned that the national debt has climbed 60% on the President’s watch. “Well, first of all, Steve, I think it’s important to understand the context here,” Mr. Obama replied. Fair enough, so here’s his context in full, with our own annotation and translation below:
“When I came into office, I inherited the biggest deficit in our history.1 And over the last four years, the deficit has gone up, but 90% of that is as a consequence of two wars that weren’t paid for,2 as a consequence of tax cuts that weren’t paid for,3 a prescription drug plan that was not paid for,4 and then the worst economic crisis since the Great Depression.5
“Now we took some emergency actions, but that accounts for about 10% of this increase in the deficit,6 and we have actually seen the federal government grow at a slower pace than at any time since Dwight Eisenhower, in fact, substantially lower than the federal government grew under either Ronald Reagan or George Bush.”7
Footnote No. 1: Either Mr. Obama inherited the largest deficit in American history or he won the 1944 election, but both can’t be true. The biggest annual deficit the modern government has ever run was in 1943, equal to 30.3% of the economy, to mobilize for World War II. The next biggest years were the following two, at 22.7% and 21.5%, to win it.
The deficit in fiscal 2008 was a mere 3.2% of GDP. The deficit in fiscal 2009, which began on October 1, 2008 and ran through September 2009, soared to 10.1%, the highest since 1945.
Mr. Obama wants to blame all of that on his predecessor, and no doubt the recession that began in December 2007 reduced revenues and increased automatic spending “stabilizers” like jobless insurance. But Mr. Obama conveniently forgets a little event in February 2009 known as the “stimulus” that increased spending by a mere $830 billion above the normal baseline.
The recession ended in June 2009, but spending has still kept rising. The President has presided over four years in a row of deficits in excess of $1 trillion, and the spending baseline going forward into his second term is nearly $1.1 trillion more than in fiscal 2007.
Federal spending as a share of GDP will average 24.1% over his first term including 2013. Even if you throw out fiscal 2009 and blame that entirely on Mr. Bush, the Obama spending average will be 23.8% of GDP. That compares to a post-WWII average of a little under 20%. Spending under Mr. Bush averaged 20.1% including 2009, and 19.6% if that year is left out.
Footnotes No. 2 through 4: Liberals continue to claim that the main causes of the current fiscal mess are tax rates established in, er, 2001 and 2003 and the post-9/11 wars on terror. But by 2006 and 2007, those tax rates were producing revenue of 18.2% and 18.5% of GDP, near historic norms.
Another quandary for Mr. Obama’s apologists is that he has endorsed nearly all of these policies. The 2003 Medicare drug benefit wasn’t offset by tax hikes or spending cuts, but Democrats expanded the program as part of ObamaCare.
The President also extended all the Bush tax rates in 2010 for two more years in the name of helping the economy, and he now wants to continue them for people earning under $200,000, which is where 71% of their “cost” resides. The Iraq campaign was won and beginning to be wound down when he took office, and he himself surged more troops in Afghanistan.
Footnote No. 5: Mr. Obama keeps dining out on the excuse of the recession, but that ended halfway through his first year. The main deficit problems since 2009 are a permanently higher spending base (see Footnote No. 1) and the slowest economic recovery in modern history. Revenues have remained below 16% of the economy, compared to 18% to 19% in a normal expansion.
The 2008 crisis is long over. The crisis now is Mr. Obama’s non-recovery.
Footnote No. 6: Even at face value, Mr. Obama’s suggestion that he is “only” responsible for 10% of what the government does is ludicrous. Note that in addition to his stimulus, what he calls “emergency actions” include his new health-care entitlement that will cost taxpayers $200 billion per year when fully implemented and grow annually at 8%, even using low-ball assumptions.
But the larger point concerns executive leadership. Every President “inherits” a government that was built over generations, which he chooses to change, or not to change, to suit his priorities. Mr. Obama chose to take the government he inherited and grow it faster than any President since LBJ.
The pre-eminent political question now is whether to reform the government we have to make it affordable going forward, or to keep growing the government and raise taxes to finance it, if that is even possible.
Mr. Obama favors the second option, though he pretends he can merely tax the rich to do it. Nobody who has looked honestly at the numbers believes that—not his own Simpson-Bowles commission and not the Congressional “super committee” he sanctioned but then worked to undermine.
At every turn he has demagogued the Romney-Ryan proposals to modernize the entitlement state so it is affordable, and he personally blew up the “grand bargain” House Speaker John Boehner was willing to strike last summer.
Footnote No. 7: Mr. Obama’s posture as the tightest skinflint since Eisenhower is a tutorial in how to dissemble with statistics. The growth rate seems low because he’s measuring from the end of fiscal 2009, after a one-year spending increase of $535 billion. That is the year of his stimulus and thus spending is growing off a much higher base. The real annual pace of government growth is closer to 5%, and that doesn’t count ObamaCare.
Note the passive voice, as if the President’s re-election campaign is disembodied from the President. If Mr. Obama’s campaign seems dishonest enough that even Mr. Obama is forced to admit it, this is because it’s coming from the top.
Via >> REVIEW & OUTLOOK
Posted: September 23, 2012 Filed under: Reading Room | Tags: Books, Christopher Hitchens, George Orwell, History, media, Picture of Dorian Gray, Tyranny
When looking back on the life of the late Christopher Hitchens, one sees that his persona is oddly like that of Oscar Wilde’s character Lord Henry Wotton from The Picture of Dorian Gray: loved by an assortment of people for assorted reasons, often when they cannot square with him on something else. Like Wotton, Hitchens was popular with individuals not because they agreed with him, but because they disagreed with him. When faced with the cultivated erudition, wit, conviction, and eloquence that “Hitch” displayed, peacocking before a podium or a writer’s desk, one couldn’t help but fall for him, like those in Dorian Gray who despised the hedonist Wotton and yet couldn’t stay away from his conversation.
It’s hard to say where Hitchens’ greatest popularity lies, but much Hitch-love comes from his status as the successor to George Orwell. Orwell’s manner, if anything, was the opposite of Hitchens’ strut. But the two are compared because they both criticized the Left from within on matters of international policy, albeit in independent ways. Hitchens broke from the Left over the so-called war on terror, quitting his literary homestead, The Nation, and making particularly derisive comments about his comrades. These actions were viewed as the strongest individual leftist dissent by a writer since Orwell’s infamous break over the Spanish Communists and the Soviet Union. To boot, Hitchens offered strong, vocal admiration for the elder English author and polemicist, and invoked Orwell on matters of principle and ethics regarding his own conservative turn. Indeed, the two are similarly noteworthy for their incorporation of morals into their politics.
Nevertheless, does all or any of this suffice to anoint Hitchens the inheritor, not of Orwell’s work, but of Orwell’s pen? The idea certainly has its critics. In his obituary of Hitchens, the New Statesman’s editor Jason Cowley argued that many of the comparisons made between the two are false. And although it’s popular to identify Hitchens with Orwell, the only serious, fleshed-out argument for exactly how the younger furthered the elder’s work that I’ve seen is from the Orwell scholar John Rodden, whose excellent essay on the topic appeared in The Kenyon Review in 2004. Rodden considers the idea thoroughly and concludes that there was “an intellectual passing of the torch between the two men,” and that Hitchens viewed his break with the Left as what Orwell would have done, although Rodden also writes that the comparisons were too simplistic, and he had reservations about such phrasings of inheritance.
However, the connection is a very useful way, if not the best way, to understand Hitchens’ importance—one that hasn’t been properly discerned. Because Hitch didn’t just follow Orwell in similarities over leftish dissent. What he did was to further Orwellian work on the totalitarian, namely by showing the importance of overcoming tyrannies held over the individual through a lack of robust criticism. This, along with his exceptional personality, is why Hitchens will be remembered and studied, because it takes the idea of the totalitarian to the next level, treating the concept as more sublime than is often believed. “The totalitarian, to me, is the enemy,” Hitchens said in his final interview…
More >> via The Humanist.
Posted: September 20, 2012 Filed under: Reading Room | Tags: History
A long-awaited book on Ronald Reagan’s secret alliance with the FBI…
Hoover and Reagan. The FBI director trusted few but found a comrade in the former-actor-turned-politician. For a time, they shared a foe: UC Berkeley…
via Arts & Letters Daily
Posted: September 20, 2012 Filed under: Reading Room | Tags: Anti-Americanism, Books, History, Iran, Iraq, Libya
Andrew McCarthy‘s new book, Spring Fever: The Illusion of Islamic Democracy is out …
tip via The Greenroom.
Posted: September 19, 2012 Filed under: Mediasphere | Tags: effects of lsd, History, james fadiman, science
For decades, the U.S. government banned medical studies of the effects of LSD. But for one longtime, elite researcher, the promise of mind-blowing revelations was just too tempting…
The volunteers—one from Stanford, the other from Hewlett-Packard—donned eyeshades and earphones, sank into comfy couches, and waited for their government-approved dose of LSD to kick in. From across the suite and with no small amount of anticipation, Dr. James Fadiman spun the knobs of an impeccable sound system and unleashed Beethoven’s “Symphony No. 6 in F Major, Op. 68.” Then he stood by, ready to ease any concerns or discomfort.
For this particular experiment, the couched volunteers had each brought along three highly technical problems from their respective fields that they’d been unable to solve for at least several months. In approximately two hours, when the LSD became fully active, they were going to remove the eyeshades and earphones, and attempt to find some solutions. Fadiman and his team would monitor their efforts, insights, and output to determine if a relatively low dose of acid—100 micrograms to be exact—enhanced their creativity.
It was the summer of ’66. And the morning was beginning like many others at the International Foundation for Advanced Study, an inconspicuously named, privately funded facility dedicated to psychedelic drug research, which was located, even less conspicuously, on the second floor of a shopping plaza in Menlo Park, Calif. However, this particular morning wasn’t going to go like so many others had during the preceding five years, when researchers at IFAS (pronounced “if-as”) had legally dispensed LSD. Though Fadiman can’t recall the exact date, this was the day, for him at least, that the music died. Or, perhaps more accurately for all parties involved in his creativity study, it was the day before.
At approximately 10 a.m., a courier delivered an express letter to the receptionist, who in turn quickly relayed it to Fadiman and the other researchers. They were to stop administering LSD, by order of the U.S. Food and Drug Administration. Effective immediately. Dozens of other private and university-affiliated institutions had received similar letters that day.
That research centers once were permitted to explore the further frontiers of consciousness seems surprising to those of us who came of age when a strongly enforced psychedelic prohibition was the norm. They seem not unlike the last generation of children’s playgrounds, mostly eradicated during the ’90s, that were higher and riskier than today’s soft-plastic labyrinths. (Interestingly, a growing number of child psychologists now defend these playgrounds, saying they provided kids with both thrills and profound life lessons that simply can’t be had close to the ground.)
When the FDA’s edict arrived, Fadiman was 27 years old, IFAS’s youngest researcher. He’d been a true believer in the gospel of psychedelics since 1961, when his old Harvard professor Richard Alpert (now Ram Dass) dosed him with psilocybin, the magic in the mushroom, at a Paris café. That day, his narrow, self-absorbed thinking had fallen away like old skin. People would live more harmoniously, he’d thought, if they could access this cosmic consciousness. Then and there he’d decided his calling would be to provide such access to others. He migrated to California (naturally) and teamed up with psychiatrists and seekers to explore how and if psychedelics in general—and LSD in particular—could safely augment psychotherapy, addiction treatment, creative endeavors, and spiritual growth. At Stanford University, he investigated this subject at length through a dissertation—which, of course, the government ban had just dead-ended.
Couldn’t they comprehend what was at stake? Fadiman was devastated and more than a little indignant. However, even if he’d wanted to resist the FDA’s moratorium on ideological grounds, practical matters made compliance impossible: Four people who’d never been on acid before were about to peak…
The Heretic – The Morning News…
Posted: September 18, 2012 Filed under: Mediasphere | Tags: History, religion
A historian of early Christianity at Harvard Divinity School has identified a scrap of papyrus that she says was written in Coptic in the fourth century and contains a phrase never seen in any piece of Scripture: “Jesus said to them, ‘My wife …’”
The faded papyrus fragment is smaller than a business card, with eight lines on one side, in black ink legible under a magnifying glass. Just below the line about Jesus having a wife, the papyrus includes a second provocative clause that purportedly says, “she will be able to be my disciple.”
The finding is being made public in Rome on Tuesday at an international meeting of Coptic scholars by the historian Karen L. King, who has published several books about new Gospel discoveries and is the first woman to hold the nation’s oldest endowed chair, the Hollis professor of divinity.
The provenance of the papyrus fragment is a mystery, and its owner has asked to remain anonymous. Until Tuesday, Dr. King had shown the fragment to only a small circle of experts in papyrology and Coptic linguistics, who concluded that it is most likely not a forgery. But she and her collaborators say they are eager for more scholars to weigh in and perhaps upend their conclusions.
Even with many questions unsettled, the discovery could reignite the debate over whether Jesus was married, whether Mary Magdalene was his wife and whether he had a female disciple. These debates date to the early centuries of Christianity, scholars say. But they are relevant today, when global Christianity is roiling over the place of women in ministry and the boundaries of marriage.
The discussion is particularly animated in the Roman Catholic Church, where despite calls for change, the Vatican has reiterated the teaching that the priesthood cannot be opened to women and married men because of the model set by Jesus.
Dr. King gave an interview and showed the papyrus fragment, encased in glass, to reporters from The New York Times, The Boston Globe and Harvard Magazine in her garret office in the tower at Harvard Divinity School last Thursday. She left the next day for Rome to deliver her paper on the find on Tuesday at the International Congress of Coptic Studies.
She repeatedly cautioned that this fragment should not be taken as proof that Jesus, the historical person, was actually married. The text was probably written centuries after Jesus lived, and all other early, historically reliable Christian literature is silent on the question, she said.
But the discovery is exciting, Dr. King said, because it is the first known statement from antiquity that refers to Jesus speaking of a wife. It provides further evidence that there was an active discussion among early Christians about whether Jesus was celibate or married, and which path his followers should choose.
“This fragment suggests that some early Christians had a tradition that Jesus was married,” Dr. King said. “There was, we already know, a controversy in the second century over whether Jesus was married, caught up with a debate about whether Christians should marry and have sex…”
More via NYTimes.com…