No Matter What It Takes

Author: name hidden by the user, December 20, 2010, 21:03, essay

Description of the work

Apparently the accountants caught wind of this and told the bosses how much such a switch would cost (we're talking several billion dollars, at least). So now, the final decision (for the moment) is that Bulava will be made to work, no matter what it takes. Moreover, an investigative committee determined that most of the problems may have been due to sloppy manufacturing. So the construction of the Bulavas was ordered moved to another factory. That decision was also reversed, after someone did the math. Several senior development officials have already been fired. More jobs are on the line, although the latest successful test has saved several careers.

     It all began back in the 1970s, when some CIA analysts discovered a new way to analyze the mountains of information they were receiving. The new tool was predictive analysis. What does this do for intelligence analysts? Predictive analysis was the result of a fortuitous combination of OR (Operations Research), large amounts of data and more powerful computers. OR is one of the major (and generally unheralded) scientific developments of the early 20th century. OR is basically the application of mathematical analysis to problems. OR turned out to be a major "weapon" for the Allies during World War II. Like radar, OR was developed in the 1930s, just in time for a major war, when whatever was available was put to work to win the conflict. OR is also, half jokingly, called a merger of math and common sense. It is widely used today in science, industry and, especially, in business (it's the primary tool of MBAs, where it's called "management science"). With predictive analysis, the most important OR tool was the ability to "backtest" (see if a simulation of a situation could accurately predict the outcome of something that had already happened, given the same historical decisions). For predictive analysis of contemporary situations, the backtest becomes, instead, a predictive tool that reveals likely outcomes.
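A minimal sketch of what backtesting a combat model looks like in practice, using a simple Lanchester-style attrition loop. Every number below (strengths, loss rates, the "historical" outcome, the 15 percent tolerance) is invented for illustration; none of it comes from any real campaign.

```python
def simulate(attacker, defender, a_rate, d_rate, days):
    """Step a simple Lanchester-style attrition model one day at a time."""
    for _ in range(days):
        a_loss = d_rate * defender   # attacker losses inflicted by defender
        d_loss = a_rate * attacker   # defender losses inflicted by attacker
        attacker = max(attacker - a_loss, 0)
        defender = max(defender - d_loss, 0)
    return attacker, defender

# Backtest: run the model on a (hypothetical) historical starting
# situation and see if it lands near the outcome that actually happened.
predicted = simulate(attacker=100_000, defender=60_000,
                     a_rate=0.02, d_rate=0.01, days=10)
historical = (93_000, 42_000)   # invented "known" final strengths

errors = [abs(p - h) / h for p, h in zip(predicted, historical)]
passed = all(e < 0.15 for e in errors)   # within 15% counts as a pass
print(predicted, passed)
```

If the model keeps passing backtests against campaigns with known outcomes, the same loop, fed a current situation, becomes the predictive tool the paragraph describes.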

     Predictive analysis, like OR in general, creates a framework that points you towards the right questions, and often provides the best answers as well. Like many OR problems, especially in the business world, the simulation framework is often quite rough. But in war, as in commerce, anything that will give you an edge can lead to success over your opponents. A predictive analysis is similar to what engineers call "a 60 percent solution" that can be calculated on the back of an envelope.

     The one form of predictive analysis that the general public is aware of is wargames, and these have been increasingly useful in predicting the outbreak, and outcomes, of wars. There have even been commercial manual (like chess) wargames that successfully applied predictive analysis, producing some impressive results when it came to actual wars.

     In late 1972 a game ("Year of the Rat") was published covering the recent (earlier in the year) North Vietnamese invasion of South Vietnam. This game didn't predict the outcome of the war, but it got the attention of people in the intelligence community, especially those who knew something about wargames, for it was a convincing demonstration of what a manual wargame, using unclassified data, could do in representing a very recently fought campaign. There was even talk that these games could actually predict the outcome, and details, of a future war. The next year, wargames did just that, accurately portraying the outcome of the 1973 Arab-Israeli war. The game ("Sinai") was about to be published when the war broke out, but some people in the intelligence community knew about it. A member of the Israeli UN delegation had watched the game in development (he was a wargamer), and was assigned to camp out at the publisher's offices, while the war raged, and report what the game was predicting.

     There weren't many wars to practice these predictive techniques on after that, until 1990, when Iraq invaded Kuwait. Months before the Coalition counterattacked, a game ("Arabian Nightmare") appeared that predicted everything, including the low Coalition casualties. This time, the media got wind of it, and the game was featured on "Nightline" in October, 1990. This didn't cause much excitement with the general public; it was just some more weird stuff on the tube.

     What about the war on terror? From a wargamer's perspective, it's not a difficult conflict to simulate. International terrorists are nothing new, and if you know how to work out the media impact on this, you've got yourself a wargame. Actually, you can do most of this stuff on a spreadsheet (which is a good vehicle for many types of predictive analysis). Same with the war in Iraq, or Afghanistan. Both countries are behaving as they have for centuries. Anyone familiar with the history of these two places won't be surprised by what's going on there now, or how it's all going to turn out. Forget the media; they haven't a clue, and don't need one to stay in business.
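The "do it on a spreadsheet" claim can be illustrated with a toy example: a few rows of invented incident data and a weighted-sum "formula" column, exactly the sort of thing a spreadsheet cell would compute. Every name and weight here is hypothetical.

```python
# Spreadsheet-style rows: (name, casualties, media_mentions).
# All data below is invented for illustration.
incidents = [
    ("incident_a", 5, 120),
    ("incident_b", 0, 300),
    ("incident_c", 12, 80),
]

def impact(casualties, media_mentions, w_cas=10.0, w_media=0.5):
    """The equivalent of =B2*10 + C2*0.5 in a spreadsheet: a weighted sum."""
    return casualties * w_cas + media_mentions * w_media

# Compute the "formula column" and sort, as a spreadsheet would.
table = [(name, impact(c, m)) for name, c, m in incidents]
table.sort(key=lambda row: row[1], reverse=True)
for name, score in table:
    print(name, score)
```

The point is not the particular weights, which an analyst would argue over, but that the whole model fits in a few columns and one formula.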

     Remember, wargamers are also historians. They look at things from a historical perspective, and immediately apply an OR approach to any event they are studying. The first thing they think of is: who has what, what can they do with it, and what are the goals of the different factions? The Afghan tribes have issues, always have, and always will until the tribal system fades away. In Iraq, the Sunni Arab minority wants to be in charge, and some of them are willing to fight on to avoid war crimes trials and confiscation of the oil money they stole. Al Qaeda is yet another attempt by Islamic conservatives to conquer the world. The Turks kept them in check for centuries, but thousand-year-old dreams die hard, especially in a culture that has found so many ways to fail.

     Wargames and predictive analysis put things in perspective. They force you to face reality. As a result, this kind of tool is not popular with politicians (who have a different kind of reality) or journalists (who want headlines, not reality). But people in the military still use these tools to quickly get a grasp of fast-moving situations. General Barry McCaffrey, CINC of SOUTHCOM, for example, was faced with a war between Peru and Ecuador in 1995. The Pentagon and the White House were looking to him for a quick analysis of the situation. Fortunately for him, the guy who designed Arabian Nightmare (Austin Bay, a reserve officer mobilized to debrief former Cuban soldiers among the Cuban refugees being moved through Panama) was in the area. LTC Bay came to the attention of a colonel on the SOUTHCOM staff, who remembered seeing some of Bay's wargaming work at the Army War College, and asked LTC Bay if he could whip up a Peru-Ecuador wargame overnight, so they could put together an analysis for GEN McCaffrey. It was done, and, when McCaffrey briefed the Joint Chiefs, he used LTC Bay's game, and its analysis. It was noted that McCaffrey's tools were better than anything that Leavenworth or DC area analysts were able to come up with. McCaffrey gave Bay a commendation medal.

     The CIA uses wargames to get a better sense of the big picture, but found that turning those predictive analysis tools to the problem of identifying Islamic terrorists hiding out among Pushtun tribesmen along the Afghan border also worked.

     Russians Rue The Good Old Days

     August 10, 2010: The Russian prime minister, Vladimir Putin, recently praised the ten Russian spies who had been discovered earlier this year in the United States, and exchanged for four spies held in Russian prisons. Until this recent praise from Putin, it was uncertain how Russia would react to what was actually an espionage debacle. Now we know that Russia will stand by, and look after, its ten inept spies.

     This ineptitude is in sharp contrast to the Cold War (1945-91), when the Soviet Union had a much larger number of spies, and much better ones at that. Russia used to be the premier trainer of all types of spies. The ten Russian spies caught in the United States are called, in the trade, "illegals." This is because the most important spies usually have official jobs at the embassy, and thus are protected ("legal") by diplomatic immunity. During the Cold War, the U.S. and Russia restricted how many diplomatic personnel they allowed the other side to have. This is a normal part of establishing diplomatic relations, determining how many diplomatic personnel will be "recognized" (as immune to arrest, for any crime). All you can do with unwanted diplomatic personnel is order them to leave the country, and this is usually done when "legal" spies are caught.

     "Illegals" are spies who do not have diplomatic immunity, and can be imprisoned, or even executed, if caught. Most countries use a lot of  diplomatic personnel, without diplomatic immunity, for this job. But the most important illegals were those who were living in a foreign country pretending to be locals, or migrants from some friendly nation. The Russians were very good at creating convincing "legends" (fake identities and back stories) for their illegals. During the Cold War, the Russians were so good that they were rumored to have special boarding schools where promising Russian children were sent to learn how to speak and act like an American (or German, or Briton or Brazilian or whatever). This was mostly fantasy, but there were schools that taught the customs of foreign nations, and language institutes where illegals could have their accent tweaked to eliminate all trace of its Russian origin.

     Russia would also recruit spies in third countries, and train them to be illegals in another nation, like the United States (where there were always a lot of migrants.) All these illegals were employees of the KGB (the Russian CIA/FBI), had KGB ranks, and, if they stayed alive and were successful, would eventually retire to a comfortable life on their KGB pension. Many did so, although dozens were caught and served long jail terms. A few were exchanged for U.S. spies in Russian jails. Some illegals switched sides, and had to worry about KGB death squads until the Soviet Union collapsed in 1991.

     All those KGB schools, and most of the world class KGB expertise, disappeared with the Soviet Union in the 1990s. The ten illegals caught in the United States this year were strictly amateurs, although they had some training and were employees of the FSB (the much smaller Russian successor to the KGB). But their language and cultural training were not up to KGB Cold War standards. Neither were their espionage skills. All ten were quickly detected and put under surveillance by the American FBI, which hoped to learn as much as possible about how the FSB operated before rounding the illegals up. This crew was arrested after one of them apparently began to suspect they were being watched, and reported this back to Russia. The FBI was indeed watching, and managed to arrest ten of the eleven Russian illegals they were monitoring. The eleventh spy may have been a double agent, as the Russians have said little about him.

     The FBI, obviously, is not releasing many details of this case, because there are likely other Russian illegals being watched. Some of these may not have been confirmed as illegals, or may have been called back to Russia. Details on that sort of thing will be revealed in the future. Needless to say, all this espionage continues, much as it did when the Soviet Union collapsed. During the 1990s, a lot of suddenly (or potentially) unemployed KGB personnel (including legals, and officials back in Russia) offered to sell information to the CIA and FBI. Many of these deals were consummated, and Russia's formidable Cold War espionage network took a lot of damage in the 1990s. In the last decade, Russia has been rebuilding. But it won't be the same. Now that we know how extensive the KGB espionage network was (due to all those 1990s turncoats), it's unlikely anyone else will have the resources, or find the same ignorance in the West, to pull it off.

     Total Recall

     August 2, 2010: The CIA (U.S. Central Intelligence Agency) and Internet search firm Google have joined forces to create a new software system, Recorded Future, that uses data mining of Internet data, and predictive analysis, to determine what people are up to. Google and the CIA have both been doing this sort of thing for a long time (decades, in the case of the 63-year-old intelligence agency). Google pioneered using data from the Internet to help advertisers find likely customers. The CIA has always been on the lookout for those who are seeking to harm America and Americans.

     This effort was triggered by Major Nidal Malik Hasan's murder of 13 people at Fort Hood last November 5th. Hasan's attack was the 13th reported (in the news media) act of Islamic terrorism in the United States that year. The other twelve incidents consisted of arrests, or failed attacks. In one incident, a mosque official opened fire on FBI agents, and was shot dead. What the CIA noted was that most of these attacks were carried out by people who had left clues on the Internet about their intentions.

     The Hasan attack resulted in the first Americans killed by Islamic terrorists in the United States since September 11, 2001. But it's not for want of trying. Naturally, the FBI and intelligence agencies don't want to talk about their work, because most of it consists of keeping tabs on people who may, or may not, be terrorists. And, as terrorists like to point out, they can fail many times, and it's not news. But if the counter-terror effort fails once, it's big news. Thus the contortions the U.S. government went through to label the Hasan attack as anything but terrorism.

     In the United States, Australia, and Europe, police continue to arrest local Moslems (most of them recent immigrants) for trying to organize terror attacks. The police invariably have compelling proof, in the form of emails, phone recordings or videos. Many of these terrorists would not have been caught before September 11, 2001. That's because the intelligence gathering tools, and attitudes towards them, have changed a lot in the last decade.

     It's not just that a lot more people have been hired to seek out terrorists. The big change is the technology. More and more, it's robots that are looking for the terrorists. This approach has raised some interesting legal questions. For example, are privacy rights violated if only a robot is looking at the information? Many people aren't concerned with robots watching what they do, or have done. But American law, and the courts that interpret it, still give privacy rights primacy, even if no humans are involved in the surveillance. It wasn't always that way.

     Privacy rights have become a growing issue since World War II. But, since September 11, 2001, it's become obvious that protecting those rights can get people killed. For example, the investigation of the September 11 attacks revealed that a terrorist suspect had been captured before the attacks, with information on his laptop that could have exposed the preparations for the attack. The FBI did not look at the laptop's hard drive because of concerns over violating the suspect's privacy rights.

     Privacy in the modern world is a misunderstood concept. While the law keeps the government from using many forms of information, or information searching, for law enforcement or national security tasks, there are far fewer restrictions on commercial use of similar data and tools. The difference is that, without the access of commercial users to credit card, real estate, and other commercial transactions, the cost of these transactions would go up because of increased fraud. Thus the public tolerates this degree of surveillance to reduce fraud, and what they pay for things. And then there's data mining, an old technique that, as long ago as the 1970s, was used to identify and arrest terrorists in Germany. Yet the same techniques today are seen by many as an assault on privacy rights. Meanwhile, data mining has been used by commercial firms for decades to determine who your best customers are.

     What it comes down to is people not trusting their government, or at least trusting banks, credit card companies and mass marketing companies more than politicians. There's probably some wisdom in that, but it constantly puts intelligence officers up against a choice: track down terrorists and risk breaking the law, or ignore the problem and make sure all the paperwork is in order when the post-attack investigators come looking for reasons "why this happened."

     The distrust of politicians and government officials rests more on attitudes than facts. There's far more abuse of databases by private individuals than by government officials (who are more likely to get caught and prosecuted). As a result, there are very few cases of these government data searches actually being abused. But the fear is great, just like the irrational fear of nuclear power plants, alongside a tolerance for much more dangerous coal- and oil-fired plants. It's why people feel safer driving to an airport than flying off on an aircraft. It's much more dangerous to travel in the car, but we're not talking about logic and truth here, but emotion and fears that can be exploited.

     But now robots are doing the searching, and suddenly the fears are going away. Take video surveillance. For a long time this was seen as yet another intrusion on privacy, even though almost all the surveillance was of public spaces. But then everyone has an "aha" moment when they realize that the cameras are recording, and nearly all those videos are never seen by human eyes unless a crime has been committed. At that point, you can more easily identify the criminal, and prosecute with little muss or fuss. The criminals, at least the ones with half a brain, now avoid places where there are cameras, and crime rates go down in those areas.

     But trying to make the same case for data mining databases in search of terrorists, even when nearly all the work is done by robots, still raises the hackles of civil libertarians who see this as an infringement on privacy. The government can't be trusted, even though there is no track record of government abuse in this area. It's not just an American problem. In the 1970s, after German police used data mining to shut down a lethal bunch of leftist terrorists, the data mining program was dismantled, lest some bureaucrat do some unnamed, but really terrible, mischief. The terrorists are back, and the police have had to carefully sneak the data mining tools back in.

     The same thing is happening in the United States. With paranoid lawyers at their sides, for protection, intelligence agencies are using data mining in innovative ways that catch the terrorists, while keeping the data miners out of jail. So far. Members of Congress who have been briefed have let the roundabout methods pass, for now. Members of Congress have been known to suddenly develop amnesia if something they have let pass suddenly becomes a war crime in the struggle to protect privacy.

     The Recorded Future project is different in that it is using data available to anyone. This has already become a problem for many people, who find that it's much more difficult to hide their past, at least if they have spent any time on the Internet. Google is criticized for using this data to assist in selling ads. But so are many other companies that use public data to improve their marketing efforts. What is particularly scary to some people is that a powerful software system that scours the entire Internet will probably be able to find out anything about anyone. That's what the CIA is after, because CIA analysts and statisticians know that data mining (looking through vast quantities of data in search of patterns and connections) works. It also works for companies out to sell more of their products.
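As a rough illustration of what "looking for patterns and connections" means in practice, here is data mining in miniature: counting which items co-occur across records unusually often. The records and item names below are invented, not drawn from any real case.

```python
from itertools import combinations
from collections import Counter

# Hypothetical "records" (calls, transactions, contacts) as sets of items.
records = [
    {"phone_a", "phone_b", "wire_transfer"},
    {"phone_a", "phone_b"},
    {"phone_c", "wire_transfer"},
    {"phone_a", "phone_b", "phone_c"},
]

# Count every pair of items that appears together in the same record.
pair_counts = Counter()
for rec in records:
    for pair in combinations(sorted(rec), 2):
        pair_counts[pair] += 1

# Pairs seen in at least 3 records are "connections" worth a closer look.
frequent = [p for p, n in pair_counts.items() if n >= 3]
print(frequent)
```

Commercial data miners run the same kind of co-occurrence counting over purchase records to find out who their best customers are; the technique is identical, only the data changes.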

     The way predictive analysis works is quite simple. With more data (from any source), it's possible to create a model (or simulation) of what potential terrorist activity looks like. Thus, if the CIA analysts see certain patterns of Internet activity, they can accurately predict where the Islamic terrorists are, what they are doing and, often, what they plan to do. At that point, the FBI can be alerted to put closer surveillance on the suspects. The track record of the accuracy of these predictions has been striking. It's used to find targets for UAV missile attacks in Pakistan and Afghanistan. Few civilians have been attacked; nearly all the targets have been, as the predictive analysis indicated, terrorists. Now the CIA wants to use Internet data to identify terrorists worldwide, and have a chat with them before anything nasty happens.
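The prediction step described above can be sketched as simple pattern scoring: weight a few activity features, score each profile against them, and flag anything over a threshold for human follow-up. The feature names, weights, profiles, and threshold here are all invented for illustration.

```python
# Invented weights describing what the "suspicious activity" pattern
# looks like; a real model would learn these from historical cases.
pattern_weights = {
    "forum_posts": 0.5,
    "bomb_queries": 3.0,
    "travel_lookups": 1.5,
}

# Hypothetical activity profiles to be scored.
profiles = {
    "user_1": {"forum_posts": 2, "bomb_queries": 0, "travel_lookups": 1},
    "user_2": {"forum_posts": 8, "bomb_queries": 4, "travel_lookups": 3},
}

def score(profile):
    """Weighted sum of activity features against the pattern."""
    return sum(pattern_weights[k] * v for k, v in profile.items())

# Flag profiles scoring above an (invented) threshold for a human look.
flagged = [u for u, p in profiles.items() if score(p) > 10]
print(flagged)
```

The robot only ranks; the whole point of the design is that a human analyst decides what, if anything, to do with the flagged names.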

     The Craven Defenders

     October 31, 2010: The CIA recently revealed its internal investigation of how a suicide bomber got into a CIA compound in Afghanistan a year ago and killed seven CIA personnel and a Jordanian intelligence agent. The CIA concluded that no one still alive was responsible for the security debacle. There were multiple errors made, most of them the responsibility of the agent in charge of the Afghan base, who was killed in the attack. While the CIA acknowledged that there were failures farther up the chain of command, no one was punished. These errors included warnings (that the bomber, a local informant, was working for the enemy) that were not passed along to the agent-in-charge in Afghanistan. There were also training lapses for the agent-in-charge, as well as some obviously ineffective supervision of such operations, reaching as far back as senior officials in the United States. While the Russians, and the U.S. Navy, are notable for using the "vertical chop" (firing many, if not all, commanders in the chain of command when such things happen), most large government bureaucracies take care of their own, especially in the senior ranks. At worst, poor performers may be transferred to other jobs, or encouraged to quietly retire. That may have happened in this case, but it is kept quiet, to protect reputations and morale at the top.

     For an organization that should depend a lot on risk taking, the CIA has, like most government bureaucracies, become risk-averse. An example of this occurred earlier this year, when media reports revealed that the CIA was paying Afghan government officials for information. The CIA wanted background info on what Afghan politicians and parties were really up to, and found more of it could be obtained with "gifts". The news reports described these payments as somewhat distasteful behavior. What the media reports really revealed was how difficult it was for the CIA to do its job.

     All this became an issue after September 11, 2001, when the CIA undertook a massive recruiting program (of analysts and field operators), and introduced lots of new technology (especially for the analysts) and techniques. All this was largely needed because the CIA had been kept in a sort of semi-hibernation since the late 1970s, an aftereffect of the Church Committee, an investigative operation sponsored by Congress that sought to reform, and punish, the CIA.
