An inside look at the single largest public outreach program for the Department of Defense — and the Pentagon’s most elaborate propaganda operation.
But the number printed in the newspaper in December 1955, in a Sears ad inviting children to call Santa, had a digit wrong — it was instead the direct line into the secret military nerve center in Colorado Springs, Colo., where the Pentagon was on the lookout to prevent nuclear war. Col. Harry Shoup, the Air Force officer and World War II fighter pilot who took the first call that day for Father Christmas, thought it was a crank and sternly said so.
“The little kid started crying,” Shoup’s daughter, Terri Van Keuren, recalled in an interview. “So Dad went into his ‘Ho ho ho’ and got the kid’s list.”
Sixty-two years later, the Continental Air Defense Command is now the North American Aerospace Defense Command, and its interactive NORAD Tracks Santa has become the largest single public outreach program for the Defense Department. It’s also, you might say, the Pentagon’s most elaborate propaganda operation.
On Christmas Eve, while monitoring the heavens for North Korean missile launches or Russian military aircraft flying too close to the U.S. or Canada, NORAD will also be reporting the progress of Santa and his reindeer as they travel from the North Pole around the world delivering presents and holiday cheer. It will correlate the jolly elf’s journey with its network of 47 radar stations, spy satellites in “geosynchronous” orbit 22,300 miles above the earth, fighter jets and a suite of special high-tech “SantaCams.” Or so the publicity stunt’s plan goes.
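As an aside, the "22,300 miles" figure checks out. Here is a back-of-the-envelope sketch (not from the article): the radius of a geosynchronous orbit follows from Kepler's third law, using Earth's gravitational parameter and one sidereal day.

```python
import math

# Illustrative sanity check on the "22,300 miles" altitude quoted above:
# Kepler's third law gives the orbit radius r = (GM * T^2 / (4*pi^2))^(1/3).
GM_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY = 86_164.1     # one rotation of the Earth, in seconds
EARTH_RADIUS = 6.378e6      # equatorial radius, in meters
METERS_PER_MILE = 1_609.344

orbit_radius = (GM_EARTH * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_miles = (orbit_radius - EARTH_RADIUS) / METERS_PER_MILE
print(round(altitude_miles))  # about 22,236 miles, commonly rounded to 22,300
```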
“The moment our radar tells us that Santa has lifted off, we begin to use the same satellites that we use in providing warning of possible missile launches aimed at North America,” says NORAD’s detailed 14-page internal handbook for the operation, which is replete with Santa stats (first flight believed to be Dec. 24, 343 A.D.) and even talking points for that uncomfortable question many parents also confront: “Is there a Santa Claus?”
It’s all part of the ornamented script that more than 1,500 volunteers — including the four-star general in charge of defending North America — are using to field an anticipated 150,000 calls and an avalanche of emails and social media posts (2 million Facebook followers so far), all from people seeking to locate Ole St. Nick on his starlight odyssey.
“As soon as you’re hanging up there’s another kid wanting to talk to you,” Preston Schlachter, NORAD’s Track Santa program manager and its director of community outreach, said of the 23-hour period leading up to Christmas when volunteers work in two-hour shifts, backed up by dozens of sponsors ranging from Microsoft to the National Defense Industrial Association, Taco Bell and the local Amy’s Donuts in Colorado Springs.
In the past, VIPs like former first lady Michelle Obama have also taken a turn at the phones.
“It is the best two hours you’ll ever experience,” Schlachter added in an interview. “You are getting these calls from all over the world. One of the coolest things I like about the program is the multi-generational aspect of it. We are seeing feedback on social media, people who call in and tell us they tracked Santa when they were kids and they’ve introduced it to their kids and now they’re introducing it to their grandkids.” Read the rest of this entry »
We should undo the Obama administration’s rules that regulate the web like a 1930s utility.
Ajit Pai writes: As millions flocked to the web for the first time in the 1990s, President Clinton and a Republican Congress decided “to preserve the vibrant and competitive free market that presently exists for the Internet.” In the Telecommunications Act of 1996, the government called for an internet “unfettered by Federal or State regulation.” The result of that fateful decision was the greatest free-market success story in history.
Here’s my plan to repeal the Obama Administration’s heavy-handed regulation of the Internet. This time–unlike in 2015–you can read it before the @FCC votes. https://t.co/xcPDkxPgW7 https://t.co/wnshqlJoMa pic.twitter.com/wACDCspuEP
— Ajit Pai (@AjitPaiFCC) November 22, 2017
Encouraged by light-touch regulation, private companies invested over $1.5 trillion in nearly two decades to build out American communications networks. Without having to ask anyone’s permission, innovators everywhere used the internet’s open platform to start companies that have transformed how billions of people live and work.
But that changed in 2014. Just days after a poor midterm election result, President Obama publicly pressured the Federal Communications Commission to reject the longstanding consensus on a market-based approach to the internet. He instead urged the agency to impose upon internet service providers a creaky regulatory framework called “Title II,” which was designed in the 1930s to tame the Ma Bell telephone monopoly. A few months later, the FCC followed President Obama’s instructions on a party-line vote. I voted “no,” but the agency’s majority chose micromanagement over markets.
This burdensome regulation has failed consumers and businesses alike. In the two years after the FCC’s decision, broadband network investment dropped more than 5.6%—the first time a decline has happened outside of a recession. If the current rules are left in place, millions of Americans who are on the wrong side of the digital divide would have to wait years to get more broadband.
The effect has been particularly serious for smaller internet service providers. They don’t have the time, money or lawyers to cut through a thicket of complex rules. The Wireless Internet Service Providers Association, which represents small fixed wireless companies that generally operate in rural America, found that more than 80% of its members “incurred additional expense in complying with the Title II rules, had delayed or reduced network expansion, had delayed or reduced services and had allocated budget to comply with the rules.” They aren’t alone. Other small companies have told the FCC that these regulations have forced them to cancel, delay or curtail upgrades to their fiber networks.
The uncertainty surrounding the FCC’s onerous rules has also slowed the introduction of new services. One major company reported that … (read more)
FCC Head Ajit Pai: Killing Net Neutrality Will Set the Internet Free
Promises that “we’re going to see an explosion in the kinds of connectivity and the depth of that connectivity” like never before.
Nick Gillespie & Ian Keyser report: In an exclusive interview today just hours after announcing his plan to repeal “Net Neutrality” rules governing the actions of Internet-service providers (ISPs) and mobile carriers, Federal Communications Commission (FCC) Chairman Ajit Pai has an in-your-face prediction for his critics: “Over the coming years, we’re going to see an explosion in the kinds of connectivity and the depth of that connectivity,” he said this afternoon. “Ultimately that means that the human capital in the United States that’s currently on the shelf—the people who don’t have digital opportunity—will become participants in the digital economy.”
Pai stressed that regulating the Internet under a Title II framework originally created in the 1930s had led to less investment in infrastructure and a slower rate of innovation. “Since the dawn of the commercial internet, ISPs have been investing as much as they can in networks in order to upgrade their facilities and to compete with each other,” he says. “Outside of a recession we’ve never seen that sort of investment go down year over year. But we did in 2015, after these regulations were adopted.” In a Wall Street Journal column published today, Pai says Title II was responsible for a nearly 6 percent decline in broadband network investment as ISPs saw compliance costs rise and the regulatory atmosphere become uncertain. In his interview with Reason, Pai stressed that the real losers under Net Neutrality were people living in rural areas and low-income Americans who were stuck on the bad end of “the digital divide.”
Proponents of Net Neutrality maintain that rules that went into effect in 2015 are the only thing standing between rapacious businesses such as Comcast, Verizon (where Pai once worked), and Spectrum and an Internet choking on throttled traffic, expensive “fast lanes,” and completely blocked sites that displease whatever corporate entity controls the last mile of fiber into your home or business. Pai says that is bunk and noted that today’s proposed changes, which are expected to pass full FCC review in mid-December, return the Internet to the light-touch regulatory regime that governed it from the mid-1990s until 2015.
“It’s telling that the first investigations that the prior FCC initiated under these so-called Net Neutrality rules were involving free data offerings,” says Pai, pointing toward actions initiated by his predecessor against “zero-rating” services such as T-Mobile’s Binge program, which didn’t count data used to stream Netflix, Spotify, and a host of other services against a customer’s monthly data allowance. “To me it’s just absurd to say that the government should stand in the way of consumers who want to get, and companies that want to provide, free data.”
The FCC is not completely vacating its oversight role. ISPs, he says, will need to be completely transparent with customers about all practices related to prioritizing traffic, data caps, and more. Pai believes that market competition for customers will prove far more effective in developing better and cheaper services than regulators deciding what is best for the sector. “In wireless,” he says, “there’s very intense competition—you have four national carriers and any number of regional carriers competing to provide 4G LTE, and a number of different services. In those marketplaces where there’s not as much competition as we’d like to see, to me at least, the solution isn’t to preemptively regulate as if it were a monopoly, as if we’re dealing with ‘Ma Bell,’ but to promote more competition.” Read the rest of this entry »
Chance Miller reports: Earlier this year, Apple was forced to remove several VPN apps from the App Store in China due to regulatory reasons. At the time, Tim Cook explained that he would rather not remove them, but was forced to comply.
Now, United States Senators Ted Cruz and Patrick Leahy are pressing Apple for more information…
In a letter sent to Tim Cook, Cruz and Leahy say Apple may be “enabling the Chinese government’s censorship and surveillance of the Internet,” noting that China has an “abysmal human rights record.”
Specifically, Cruz and Leahy pointed to Cook’s acceptance of the Newseum’s 2017 Free Expression Award. While receiving the award, Cook remarked that Apple “enables people around the world to speak up.” The senators, however, argue that Apple’s removal of VPN apps in China does the exact opposite of that:
While Apple’s many contributions to the global exchange of information are admirable, removing VPN apps that allow individuals in China to evade the Great Firewall and access the Internet privately does not enable people in China to “speak up.” To the contrary, if Apple complies with such demands from the Chinese government, it inhibits free expression for users across China, particularly in light of the Cyberspace Administration of China’s new regulations targeting online anonymity.
Cruz and Leahy outline a list of questions they want Cook to answer. Read the rest of this entry »
WASHINGTON (AP) — It sounds sort of like a mass of crickets. A high-pitched whine, but from what? It seems to undulate, even writhe. Listen closely: There are multiple, distinct tones that sound to some like they’re colliding in a nails-on-the-chalkboard effect.
The Associated Press has obtained a recording of what some U.S. Embassy workers heard in Havana in a series of unnerving incidents later deemed to be deliberate attacks. The recording, released Thursday by the AP, is the first disseminated publicly of the many taken in Cuba of mysterious sounds that led investigators initially to suspect a sonic weapon.
The recordings themselves are not believed to be dangerous to those who listen. Sound experts and physicians say they know of no sound that can cause physical damage when played for short durations at normal levels through standard equipment like a cellphone or computer.
What device produced the original sound remains unknown. Americans affected in Havana reported the sounds hit them at extreme volumes.
Whether there’s a direct relationship between the sound and the physical damage suffered by the victims is also unclear. The U.S. says that in general the attacks caused hearing, cognitive, visual, balance, sleep and other problems.
The recordings from Havana have been sent for analysis to the U.S. Navy, which has advanced capabilities for analyzing acoustic signals, and to the intelligence services, the AP has learned. But the recordings have not significantly advanced U.S. knowledge about what is harming diplomats.
The Navy did not respond to requests for comment on the recording. State Department spokeswoman Heather Nauert wouldn’t comment on the tape’s authenticity.
Cuba has denied involvement or knowledge of the attacks. The U.S. hasn’t blamed anyone and says it still doesn’t know what or who is responsible. But the government has faulted President Raul Castro’s government for failing to protect American personnel, and Nauert said Thursday that Cuba “may have more information than we are aware of right now.”
“We believe that the Cuban government could stop the attacks on our diplomats,” said White House chief of staff John Kelly.
Not all Americans injured in Cuba heard sounds. Of those who did, it’s not clear they heard precisely the same thing.
Yet the AP has reviewed several recordings from Havana taken under different circumstances, and all have variations of the same high-pitched sound. Individuals who have heard the noise in Havana confirm the recordings are generally consistent with what they heard.
“That’s the sound,” one of them said.
The recording being released by the AP has been digitally enhanced to increase volume and reduce background noise, but has not been otherwise altered.
The sound seemed to manifest in pulses of varying lengths — seven seconds, 12 seconds, two seconds — with some sustained periods of several minutes or more. Then there would be silence for a second, or 13 seconds, or four seconds, before the sound abruptly started again. Read the rest of this entry »
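For readers curious how analysts pick apart "multiple, distinct tones" in a recording like the one described above, here is a minimal, hypothetical sketch (not the AP's or the Navy's actual analysis): it synthesizes a pulsed signal containing several made-up high-pitched tones, then uses an FFT to recover the distinct frequency components. The frequencies and pulse timings are invented for the example.

```python
import numpy as np

RATE = 44_100  # samples per second, typical of a phone recording

def pulsed_multitone(freqs_hz, pulse_s, gap_s, repeats):
    """One burst of summed sine tones followed by silence, repeated."""
    t = np.arange(int(RATE * pulse_s)) / RATE
    burst = sum(np.sin(2 * np.pi * f * t) for f in freqs_hz)
    gap = np.zeros(int(RATE * gap_s))
    return np.concatenate([np.concatenate([burst, gap]) for _ in range(repeats)])

def dominant_tones(signal, top_n):
    """Return the top_n strongest frequencies (Hz) in the signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / RATE)
    strongest = np.argsort(spectrum)[-top_n:]
    return sorted(int(round(f)) for f in freqs[strongest])

# Hypothetical closely spaced tones, pulsed the way the article describes.
tones = [7000.0, 7200.0, 7400.0]
sig = pulsed_multitone(tones, pulse_s=2.0, gap_s=1.0, repeats=3)
print(dominant_tones(sig, top_n=3))  # [7000, 7200, 7400]
```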
New technologies have disrupted news media over the past 20 years — but one report says that’s just the beginning.
Washington (AFP) – If you think technology has shaken up the news media — just wait, you haven’t seen anything yet.
The next wave of disruption is likely to be even more profound, according to a study presented Saturday to the Online News Association annual meeting in Washington.
News organizations which have struggled in the past two decades as readers moved online and to mobile devices will soon need to adapt to artificial intelligence, augmented reality and automated journalism and find ways to connect beyond the smartphone, the report said.
“Voice interface” will be one of the big challenges for media organizations, said the report by Amy Webb, a New York University Stern School of Business faculty member and founder of the Future Today Institute.
The institute estimates that 50 percent of interactions that consumers have with computers will be using their voices by 2023.
“Once we are speaking to our machines about the news, what does the business model for journalism look like?” the report said.
“News organizations are ceding this future ecosystem to outside corporations. They will lose the ability to provide anything but content.”
Webb writes that most news organizations have done little experimentation with chat apps and voice skills on Amazon’s Alexa and Google Home, the likes of which may be key parts of the future news ecosystem.
Because of this, she argues that artificial intelligence or AI is posing “an existential threat to the future of journalism.”
“Journalism itself is not actively participating in building the AI ecosystem,” she wrote.
One big problem facing media organizations is that new technologies shaping the future of news, such as AI, are out of their control and instead are in the hands of tech firms like Google, Amazon, Tencent, Baidu, IBM, Facebook, Apple and Microsoft, according to Webb.
“News organizations are customers, not significant contributors,” the report said. Read the rest of this entry »
Hackers had the power to cause blackouts, Symantec says. And yes, most signs point to Russia.
Andy Greenberg writes: In an era of hacker attacks on critical infrastructure, even a run-of-the-mill malware infection on an electric utility’s network is enough to raise alarm bells. But the latest collection of power grid penetrations went far deeper: Security firm Symantec is warning that a series of recent hacker attacks not only compromised energy companies in the US and Europe but also resulted in the intruders gaining hands-on access to power grid operations—enough control that they could have induced blackouts on American soil at will.
Symantec on Wednesday revealed a new campaign of attacks by a group it is calling Dragonfly 2.0, which it says targeted dozens of energy companies in the spring and summer of this year. In more than 20 cases, Symantec says the hackers successfully gained access to the target companies’ networks. And at a handful of US power firms and at least one company in Turkey—none of which Symantec will name—their forensic analysis found that the hackers obtained what they call operational access: control of the interfaces power company engineers use to send actual commands to equipment like circuit breakers, giving them the ability to stop the flow of electricity into US homes and businesses.
“There’s a difference between being a step away from conducting sabotage and actually being in a position to conduct sabotage … being able to flip the switch on power generation,” says Eric Chien, a Symantec security analyst. “We’re now talking about on-the-ground technical evidence this could happen in the US, and there’s nothing left standing in the way except the motivation of some actor out in the world.”
Never before have hackers been shown to have that level of control of American power company systems, Chien notes. The only comparable situations, he says, have been the repeated hacker attacks on the Ukrainian grid that twice caused power outages in the country in late 2015 and 2016, the first known hacker-induced blackouts.
The Usual Suspects
Security firms like FireEye and Dragos have pinned those Ukrainian attacks on a hacker group known as Sandworm, believed to be based in Russia. But Symantec stopped short of blaming the more recent attacks on any country or even trying to explain the hackers’ motives. Chien says the company has found no connections between Sandworm and the intrusions it has tracked. Nor has it directly connected the Dragonfly 2.0 campaign to the string of hacker intrusions at US power companies—including a Kansas nuclear facility—known as Palmetto Fusion, which unnamed officials revealed in July and later tied to Russia.
Chien does note, however, that the timing and public descriptions of the Palmetto Fusion hacking campaigns match up with its Dragonfly findings. “It’s highly unlikely this is just coincidental,” Chien says. But he adds that while the Palmetto Fusion intrusions included a breach of a nuclear power plant, the most serious Dragonfly intrusions Symantec tracked penetrated only non-nuclear energy companies, which have less strict separations of their internet-connected IT networks and operational controls. Read the rest of this entry »
Laura Geggel reports: During medieval times, bookmakers fashioned the pages and cover of a rare copy of the Gospel of Luke out of five different types of animals: calves, two species of deer, sheep and goat, according to new research.
In addition, one more type of animal left its mark on the cover of this 12th-century book: Beetle larvae likely chewed holes into the leather binding, the researchers said.
Now, researchers are learning unexpected secrets about the manuscript by noninvasively testing the proteins and DNA on the book’s pages, the researchers told Live Science.
Rare books — such as this copy of the Gospel of Luke — are difficult to study because they’re fragile, prompting many librarians to bar any research that would harm such manuscripts or their pages.
This rule is all too familiar to Matthew Collins, a biochemist at both the University of York in the United Kingdom and the University of Copenhagen. He wanted to sample parchments — documents made from animal skins — as a way to determine how people have managed livestock throughout history.
When Collins and Sarah Fiddyment, a postdoctoral fellow of archaeology at the University of York, approached librarians at the University of York’s Borthwick Institute for Archives, “we were told that we would not be allowed to physically sample any of the parchment documents, as they are too valuable as cultural-heritage objects,” Fiddyment told Live Science.
But Fiddyment didn’t give up. She spent several months learning how librarians conserve rare parchments, and, surprisingly, found a new method that allows scientists to study these specimens without disturbing them — one that involves an eraser.
Typically, librarians “dry clean” parchments by gently rubbing a polyvinyl chloride eraser against them. This technique pulls fibers off the page, and the resulting debris is usually thrown away.
But Fiddyment realized this debris held valuable clues about the book. By isolating proteins and other biological fragments within the debris, and examining them with a mass spectrometer — an instrument that identifies different compounds by their masses — researchers could learn all kinds of information about the manuscripts, she found.
“This was Sarah’s brilliant idea,” Collins told Live Science in an email. “Oddly enough, I think we relished the challenge.”
It wasn’t long before Fiddyment put this technique into action. A historian bought the aforementioned Gospel of Luke at a 2009 Sotheby’s auction. An analysis of its “prickly” style of script indicated that scribes at St. Augustine’s Abbey in Canterbury, in the United Kingdom, created it around A.D. 1120, Bruce Barker-Benfield, the curator of manuscripts at the Bodleian Libraries at the University of Oxford, told the journal Science.
To learn more about the gospel, the historian contacted Collins. Using Fiddyment’s method, Collins and his colleagues learned that the book’s white leather cover came from the skin of a roe deer—a common species in the United Kingdom. The book’s strap came from a larger deer species—either a native red deer or a fallow deer, an invasive species likely brought from continental Europe after the Normans invaded in 1066. Read the rest of this entry »
Another widespread cyber attack is causing massive problems across Europe Tuesday.
Ukraine has been hit particularly hard as government and company officials have reported serious intrusions across the Ukrainian power grid, banks and government offices. The country’s prime minister says that the cyber attack affecting his country is “unprecedented,” but “vital systems haven’t been affected.”
Ukrainian Deputy Prime Minister Pavlo Rozenko on Tuesday posted a picture of a darkened computer screen to Twitter, saying that the computer system at the government’s headquarters has been shut down.
There’s very little information about who might be behind the disruption, but technology experts who examined screenshots circulating on social media said it bears the hallmarks of ransomware, the name given to programs that hold data hostage by scrambling it until a payment is made.
“A massive ransomware campaign is currently unfolding worldwide,” said Romanian cybersecurity company Bitdefender. In a telephone interview, Bitdefender analyst Bogdan Botezatu said that he had examined samples of the program and that it appeared to be nearly identical to GoldenEye, one of a family of hostage-taking programs that has been circulating for months. Read the rest of this entry »
Siri will be the conductor of a suite of devices, all tracking your interactions and anticipating your next moves.
Apple Inc. will still sell an iPhone, but expect the device to morph into a suite of apps and services, enhanced with AI and AR, part of a ‘body area network’ of devices, batteries and sensors.
Christopher Mims writes: It’s 2027, and you’re walking down the street, confident you’ll arrive at your destination even though you don’t know where it is. You may not even remember why your device is telling you to go there.
There’s a voice in your ear giving you turn-by-turn directions and, in between, prepping you for this meeting. Oh, right, you’re supposed to be interviewing a dog whisperer for your pet-psychiatry business. You arrive at the coffee shop, look around quizzically, and a woman you don’t recognize approaches. A display only you can see highlights her face and prints her name next to it in crisp block lettering, Terminator-style. Afterward, you’ll get an automatically generated transcript of everything the two of you said.
As the iPhone this week marks the 10th anniversary of its first sale, it remains one of the most successful consumer products in history. But by the time it celebrates its 20th anniversary, the “phone” concept will be entirely uprooted: That dog-whisperer scenario will be brought to you even if you don’t have an iPhone in your pocket.
Sure, Apple may still sell a glossy rectangle. (At that point, iPhones may also be thin and foldable, or roll up into scrolls like ancient papyri.) But the suite of apps and services that is today centered around the physical iPhone will have migrated to other, more convenient and equally capable devices—a “body area network” of computers, batteries and sensors residing on our wrists, in our ears, on our faces and who knows where else. We’ll find ourselves leaving the iPhone behind more and more often.
Trying to predict where technology will be in a decade may be a fool’s errand, but how often do we get to tie up so many emerging trends in a neat package?
In the wake of the U.K.’s most recent terrorist attacks, its prime minister is talking tough on Internet regulation, but what she’s suggesting is impractical.
Tragically, there have been three major terrorist attacks in the U.K. in less than three months’ time. After the second, in Manchester, Prime Minister Theresa May and others said they would look into finding ways to compel tech companies to put cryptographic “back doors” into their services, so that law enforcement agencies could more easily access suspects’ user data.
May repeated her stance in broader terms Sunday, following new attacks in London. “The Internet, and the big companies” are providing “safe spaces” for extremism, she said, and new regulations are needed to “regulate cyberspace.” She offered no specifics, but her party’s line, just days from the June 8 national election, is clear: a country that already grants its government some of the most sweeping digital surveillance powers of any democracy needs more and tougher laws to prevent terrorism (see “New U.K. Surveillance Law Will Have Worldwide Implications”).
The trouble is, this kind of talk ignores how the Internet and modern consumer technology works. As Cory Doctorow points out in a detailed look at how you would actually go about creating services with cryptographic holes, the practicalities of such a demand render it ludicrous bordering on impossible. Even if all of the necessary state-mandated technical steps were taken by purveyors of commercial software and devices—like Google or Apple, say—anyone who wanted to could easily skirt their restrictions by running open-source versions of the software, or unlocked phones.
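Doctorow's point can be made concrete: strong encryption is not a product feature that can be regulated away, because it takes only a few lines of code anyone can write. As an illustrative sketch (not from the article), here is a one-time pad, which is provably unbreakable when the key is truly random, as long as the message, and never reused, built entirely from Python's standard library:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Return (key, ciphertext). The key must be random and used exactly once."""
    key = secrets.token_bytes(len(plaintext))          # cryptographically random
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR with the same key recovers the plaintext."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"meet at dawn")
assert otp_decrypt(key, ct) == b"meet at dawn"
```

No mandate on commercial software reaches code like this, which is why skirting state-mandated back doors requires nothing more than open-source tools.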
That isn’t to say that May and the Conservatives’ general idea that the government should be able to probe user data as part of an investigation should be dismissed out of hand. The balancing act between national security and digital privacy has become one of the central themes of our digital lives (see “What If Apple Is Wrong?”). And while there are advocates aplenty on both sides, simple answers are hard to come by. Read the rest of this entry »
“We’ve gone to a modern [broadcast] system that has a lot of places where stuff can happen without permission,” says Thomas W. Hazlett, a former chief economist of the FCC, a professor at Clemson University, and author of the new book The Political Spectrum: The Tumultuous Liberation of Wireless Technology, from Herbert Hoover to the Smartphone. “And we have seen that the smartphone revolution and some other great stuff in the wireless space has really burgeoned… That comes from deregulation.”
So-called net neutrality rules are designed to solve a non-existent problem and threaten to restrict consumer choice, Hazlett tells Reason’s Nick Gillespie. “The travesty is there’s already a regulatory scheme [to address anti-competitive behavior]—it’s called antitrust law.”
Greater autonomy and consumer freedom led to the development of cable television, the smartphone revolution, and the modern internet. While we’ve come a long way from the old days of mother-may-I pleading with the FCC to grant licenses for new technology, Hazlett says, “there’s a lot farther to go and there’s a lot of stuff out there that’s being suppressed.”
He points to the history of radio and television. Herbert Hoover and Lyndon Johnson exercised extraordinary control over spectrum allocation, which they used for their own political and financial gain. With liberalization, we now have hundreds of hours of varied television programming as compared to the big three broadcast networks of the ’60s, an abundance of choices in smartphone providers and networks as compared to the Ma Bell monopoly, and more to come. Read the rest of this entry »
“We face a possible future where people not only ignore scientific evidence, but seek to eliminate it entirely,” warns the march’s mission statement. “Staying silent is a luxury that we can no longer afford. We must stand together and support science.”
From whom do the marchers hope to defend science? Certainly not the American public: Most Americans are fairly strong supporters of the scientific enterprise. An October 2016 Pew Research Center poll reported, “Three-quarters of Americans (76%) have either a great deal (21%) or a fair amount of confidence (55%) in scientists, generally, to act in the public interest.” The General Social Survey notes that public confidence in scientists stands out among the most stable of about 13 institutions rated in the GSS since the mid-1970s. (For what it’s worth, the GSS reports only 8 percent of the public say that they have a great deal of confidence in the press, but at least that’s higher than the 6 percent who say the same about Congress.)
The mission statement also declares, “The application of science to policy is not a partisan issue. Anti-science agendas and policies have been advanced by politicians on both sides of the aisle, and they harm everyone—without exception.”
I thoroughly endorse that sentiment. But why didn’t the scientific community march when the Obama administration blocked over-the-counter access to emergency contraception to women under age 17? Or dawdled for years over the approval of genetically enhanced salmon? Or tried to kill off the Yucca Mountain nuclear waste storage facility? Or halted the development of direct-to-consumer genetic testing? Read the rest of this entry »
The GBU-43/B Massive Ordnance Air Blast (MOAB pronounced /ˈmoʊ.æb/, commonly known as the Mother of All Bombs) is a large-yield conventional (non-nuclear) bomb, developed for the United States military by Albert L. Weimorts, Jr. of the Air Force Research Laboratory. At the time of development, it was touted as the most powerful non-nuclear weapon ever designed. The bomb was designed to be delivered by a C-130 Hercules, primarily the MC-130E Combat Talon I or MC-130H Combat Talon II variants.
Since then, Russia has tested its “Father of All Bombs,” which is claimed to be four times as powerful as the MOAB.
The U.S. military dropped the largest non-nuclear bomb in eastern Afghanistan on Thursday just days after a Green Beret was killed fighting ISIS there, a U.S. defense official confirmed to Fox News.
The GBU-43/B, a 21,000-pound conventional bomb, was dropped in Nangarhar Province.
The MOAB (Massive Ordnance Air Blast) is also known as the “Mother of All Bombs.” It was first tested in 2003, but had not been used in combat before Thursday.
Aside from two test articles, the only known production is of 15 units at the McAlester Army Ammunition Plant in 2003 in support of the Iraq War. As of early 2007, none of those were known to have been used, although a single MOAB was moved to the Persian Gulf area in April 2003.
On April 13, 2017, a MOAB was dropped on a target in the Nangarhar Province inside Afghanistan. It was the first non-testing use of the bomb.
The basic operational concept bears some similarity to the BLU-82 Daisy Cutter, which was used to clear heavily wooded areas in the Vietnam War and in Iraq to clear mines and later as a psychological weapon against the Iraqi military. After the psychological impact of the BLU-82 on enemy soldiers was witnessed, and no BLU-82 weapons remained, the MOAB was developed partly to continue the ability to intimidate Iraqi soldiers. Pentagon officials had suggested their intention to use MOAB as an anti-personnel weapon, as part of the “shock and awe” strategy integral to the 2003 invasion of Iraq. Read the rest of this entry »
The Defense Department still uses 8-inch floppy disks and computers from the 1970s to coordinate nuclear forces
Posted: April 3, 2017
Mackenzie Eaglen writes: Dale Hayden, a senior researcher at the Air Force’s Air University, told an audience of aerospace experts earlier this month that proliferation of antisatellite technology has put America’s communications networks at risk. “In a conflict, it will be impossible to defend all of the space assets in totality,” he said. “Losses must be expected.”
It has never been easier for America’s adversaries—principally Russia and China, but also independent nonstate actors—to degrade the U.S. military’s ability to fight and communicate. Senior military officials have expressed grave doubts about the security of the Pentagon’s information systems and America’s ability to protect the wider commercial virtual infrastructure.
The U.S. Navy, under its mission to keep the global commons free, prevents tampering with undersea cables. But accidents—and worse—do happen. Last year a ship’s anchor severed a cable in the English Channel, slowing internet service on the island of Jersey. In 2013 the Egyptian coast guard arrested three scuba divers trying to cut a cable carrying a third of the internet traffic between Europe and Egypt. “When communications networks go down, the financial services sector does not grind to a halt, rather it snaps to a halt,” warned a senior staffer to Federal Reserve Chairman Ben Bernanke in 2009. Trillions of dollars in daily trading depends on GPS, which is kept free by the Air Force.
There are now an estimated 17.6 billion devices around the world connected to the internet, including more than six billion smartphones. The tech industry expects those numbers to double by 2020. That growth is dependent, however, on secure and reliable access to intercontinental undersea fiber-optic cables, which carry 99% of global internet traffic, and a range of satellite services.
The U.S. military is working on ways of making them more resilient. For instance, the Tactical Undersea Network Architectures program promises rapidly deployable, lightweight fiber-optic backup cables, and autonomous undersea vehicles could soon be used to monitor and repair cables. In space, the military is leading the way with advanced repair satellites as well as new and experimental GPS satellites, which will enhance both military and civilian signals. Read the rest of this entry »
Rob Pyers was a laid-off grocery bagger who learned to code on YouTube. Now the website he runs, the California Target Book, is shining a light on spending by politicians, their campaigns, and outside groups.
Rob Pyers didn’t set out to bring transparency to establishment politics. In fact, he didn’t even have any programming experience before he built the electronic systems for the California Target Book, a go-to resource for political transparency in the state. He initially came to Los Angeles with aspirations of becoming a screenwriter, but ended up stuck in his day job, bagging groceries. Then Walgreens laid him off, and he needed something else to do.
After joining the Target Book, Pyers taught himself how to code, mostly by watching YouTube videos. Two years later, the 41-year-old has built its systems from the ground up, and now runs the website from his cramped West Hollywood one-bedroom. He is often the first to publicize major donations and new candidates, making his Twitter feed invaluable to campaign consultants and journalists alike.
Pyers, who describes himself as “95 lbs of concentrated tech geek,” has become an expert on pulling data from hundreds of voter databases, election filings, and campaign finance disclosures. He’s done all this despite the fact that the state’s main resource for campaign information is an inaccessible hodgepodge of ZIP archives and tables that even the current Secretary of State has called a “Frankenstein monster of outdated code.”
“California’s Cal-Access website is notorious for being just sort of an ungodly, byzantine mess,” says Pyers. “If you have no idea what you’re doing, it’s almost impossible to get any useful information out of.”
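The kind of extraction Pyers describes — pulling usable records out of ZIP archives of raw tables — can be sketched in a few lines. This is a minimal illustration, not his actual pipeline; the archive member name, column names, and values below are hypothetical stand-ins for a Cal-Access-style export:

```python
import csv
import io
import zipfile

# Build a stand-in for a Cal-Access-style export: a ZIP archive
# containing a tab-delimited table. The member name and columns
# here are hypothetical, purely for illustration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "CVR_CAMPAIGN_DISCLOSURE.TSV",
        "FILER_ID\tAMOUNT\tDATE\n"
        "12345\t5000\t2016-03-01\n"
        "12345\t2500\t2016-04-15\n",
    )

# Read the archive back and total contributions per filer,
# without ever extracting files to disk.
totals = {}
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    with zf.open("CVR_CAMPAIGN_DISCLOSURE.TSV") as fh:
        reader = csv.DictReader(io.TextIOWrapper(fh, "utf-8"), delimiter="\t")
        for row in reader:
            totals[row["FILER_ID"]] = totals.get(row["FILER_ID"], 0) + int(row["AMOUNT"])

print(totals)  # {'12345': 7500}
```

The real archives are far messier — inconsistent encodings, malformed rows, tables that only join through undocumented keys — which is what makes a cleaned-up, queryable version of the data so valuable.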
The state is currently working on a multi-million dollar upgrade to the site, with an expected rollout in 2019. But while the government builds its new system, the Target Book has already proven its worth. During one 2016 Congressional race, the L.A. Times used Pyers’ data to reveal that candidate Isadore Hall may have misused hundreds of thousands of dollars of campaign cash. Read the rest of this entry »
Dissidents are using USB drives to smuggle information into authoritarian regimes.
“The struggle for freedom is one that used to be about who has more guns. Now information is a key component in making sure the government doesn’t get away with winning the day with its narrative and pushing what governments tend to do, which is the use of fear to control the population.”
But if you were looking for something truly disruptive at SXSW, look no further than a group of activists using tech to spread information to citizens oppressed by authoritarian regimes.
“The people out there they don’t have satellites, they don’t have internet, they have nothing,” says Abdalaziz Alhamza who escaped Syria and co-founded Raqqa is Being Slaughtered Silently. “To be stuck with only ISIS propaganda, it will affect them.”
Alhamza and dissidents from Eritrea, Afghanistan and Cuba were brought together by the Human Rights Foundation (HRF) for a panel discussion called “The Real Information Revolution.” Reason caught up with the group at the HRF booth on the convention floor, centered around a large wall of Kim Jong Un faces with USB ports for mouths. Attendees were invited to donate USB drives into the display. The drives will later be smuggled into North Korea after being wiped and filled with films and information from the outside world. Read the rest of this entry »
The U.S. stands to lose 80 million jobs to automation.
Thomas Phippen reports: The robotic labor revolution is coming quickly, and the workforce may not be able to adapt without long periods of unemployment, according to economists at the Bank of England.
“Economists should seriously consider the possibility that millions of people may be at risk of unemployment, should these technologies be widely adopted.”
“Economists should seriously consider the possibility that millions of people may be at risk of unemployment, should these technologies be widely adopted,” BOE economists Mauricio Armellini and Tim Pike wrote in a post on Bank Underground, a blog for bank employees, Wednesday.
Artificial intelligence (AI) “threatens to transform entire industries and sectors,” the authors write, arguing that the speed at which industries adopt new technologies won’t give the labor force time to adjust. Read the rest of this entry »
Brainchild of artist and actor Jack Millard causes stir along highway in Arizona
(1) Titan launch test from Cape Canaveral: only the first-stage engine was tested, the second stage being a dummy; the engine, with 300,000 lbs of thrust, was successful. (2) News in Brief: Berlin mayor Willy Brandt arrives in the U.S., speaks in English. (3) “Virginia”: Fort Myer, Va., funeral of six bodies returned by Russia, crew of a plane shot down by Russia; no word of the other 11 crew missing (partial newsreel).
1959: The United States successfully test-fires its first Titan I intercontinental ballistic missile. The threat of global nuclear holocaust moves from the plausible to the likely.
Tony Long writes: The Titan I was not the first ICBM: Both the United States and Soviet Union had already deployed ICBMs earlier in the 1950s (the Atlas A by the Americans, the R-7 by the Russians). But the Titan represented a new generation, a liquid-fueled rocket with greater range and a more powerful payload that upped the ante in the Cold War.
The Titan that the U.S. Air Force successfully launched from Cape Canaveral featured a two-stage liquid rocket capable of delivering a 4-megaton warhead to targets 8,000 miles away. A 4-megaton detonation, puny by today’s standards, nevertheless dwarfed the destructive power of the atomic bombs dropped on Japan.
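The scale of that warhead can be put in rough numbers. Assuming the commonly cited figure of about 15 kilotons for the Hiroshima bomb (an approximation, not an exact specification), a back-of-the-envelope comparison:

```python
# Rough comparison of a Titan I warhead to the Hiroshima bomb.
# Both yields are approximate public figures.
titan_yield_kt = 4_000      # 4 megatons = 4,000 kilotons
hiroshima_yield_kt = 15     # commonly cited ~15 kilotons

ratio = titan_yield_kt / hiroshima_yield_kt
print(f"One 4-megaton warhead is roughly {ratio:.0f}x the Hiroshima bomb")
```

By that estimate, a single Titan I warhead carried on the order of a couple hundred times the destructive yield of the bomb that leveled Hiroshima.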
The Titan’s range meant that, firing from its home turf, the United States was now capable of hitting targets in Eastern Europe, the western Soviet Union and the Soviet Far East.
The first squadron of Titan I’s was declared operational in April 1962. By the mid-’60s, five squadrons were deployed in the western United States.
The missiles were stored in protective underground silos, but had to be brought to the surface for firing. The Titan II, which began appearing in large numbers during the mid-’60s and eventually supplanted the Titan I, would be the first ICBM that could be launched directly from its silo.
Today, ICBMs can be launched from silos, from mobile launchers and, most effectively, from submarines. Read the rest of this entry »