The woman killed herself after opening fire with an AK-47-type automatic rifle. Another man was killed in the raid, a police official said. Three police officers were injured.
“A woman blew herself up at the start of a dawn raid Wednesday on an apartment in the northern Paris suburb of Saint-Denis targeting suspects linked to Friday’s massacre in Paris.”
As the raid was unfolding, the Paris prosecutor said three men holed up in the apartment were detained, as well as a man and a woman nearby. Investigators didn’t immediately identify the detainees.
French authorities suspect Abdelhamid Abaaoud, a Belgian-born Islamic State operative and presumed mastermind of the deadly Paris terror attacks, may be in Saint-Denis, where elite police were conducting the raid in the early hours of Wednesday, a spokeswoman for the Paris prosecutor said.
If confirmed, Mr. Abaaoud’s presence in Saint-Denis—near the sports arena where three suicide bombers detonated their explosive vests on Friday—would deepen concerns about Europe’s security. It would raise questions over how an Islamic State operative who featured prominently on Western militaries’ target lists slipped back through borders to sow terror in the heart of the continent.
Adam Rogers writes: Imagine an election—a close one. You’re undecided. So you type the name of one of the candidates into your search engine of choice. (Actually, let’s not be coy here. In most of the world, one search engine dominates; in Europe and North America, it’s Google.) And Google coughs up, in fractions of a second, articles and facts about that candidate. Great! Now you are an informed voter, right? But a study published this week says that the order of those results, the ranking of positive or negative stories on the screen, can have an enormous influence on the way you vote. And if the election is close enough, the effect could be profound enough to change the outcome.
In other words: Google’s ranking algorithm for search results could accidentally steal the presidency. “We estimate, based on win margins in national elections around the world,” says Robert Epstein, a psychologist at the American Institute for Behavioral Research and Technology and one of the study’s authors, “that Google could determine the outcome of upwards of 25 percent of all national elections.”
Epstein’s paper combines a few years’ worth of experiments in which Epstein and his colleague Ronald Robertson gave people access to information about the 2010 race for prime minister in Australia—held two years before the experiments—and then let mock voters learn about the candidates via a simulated search engine that displayed real articles.
One group saw positive articles about one candidate first; the other saw positive articles about the other candidate. (A control group saw a random assortment.) The result: people were more likely to vote for whichever candidate the positive results favored—by more than 48 percent. The team calls that number the “vote manipulation power,” or VMP. The effect held—strengthened, even—when the researchers swapped a single negative story into the number-four and number-three spots. Apparently it made the results seem more neutral and therefore more trustworthy.
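The article does not spell out how VMP is calculated; a minimal sketch, assuming VMP is the relative increase in support for the boosted candidate over a control group (the function name and the illustrative numbers are hypothetical, not taken from the study):

```python
def vote_manipulation_power(control_share: float, biased_share: float) -> float:
    """Relative increase in support for the boosted candidate.

    control_share: fraction of mock voters preferring the candidate
                   when rankings are neutral (the control group).
    biased_share:  fraction preferring the candidate after seeing
                   positive results ranked first.
    """
    if not (0 < control_share <= 1 and 0 <= biased_share <= 1):
        raise ValueError("shares must be fractions in (0, 1]")
    return (biased_share - control_share) / control_share

# Illustrative numbers only: support rising from 50% to 74% of mock
# voters would correspond to a VMP of 48 percent.
print(round(vote_manipulation_power(0.50, 0.74), 2))  # 0.48
```

Under this reading, a "more than 48 percent" VMP describes the size of the swing relative to the baseline, not an absolute share of the vote.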
But of course that was all artificial—in the lab. So the researchers packed up and went to India in advance of the 2014 Lok Sabha elections, a national campaign with 800 million eligible voters. (Eventually 430 million people voted over the weeks of the actual election.) “I thought this time we’d be lucky if we got 2 or 3 percent, and my gut said we’re gonna get nothing,” Epstein says, “because this is an intense, intense election environment.” Voters get exposed, heavily, to lots of other information besides a mock search engine result.