The weakest link in anti-terror systems

President Barack Obama has described recent lapses in intelligence that allowed a would-be terrorist to board a Christmas Day flight to Detroit as systemic failures. But these were not failures of systems; they were failures of people using the systems.

It's important to remember that the best of systems are no good if they are not used properly or not used at all. The use of computers and networks to share information and make it available throughout a widespread intelligence community has improved greatly in the last eight years, but in the end, our ability to use the intelligence depends on our ability to put eyeballs on the data and make decisions.

It was the eyeballs that apparently failed in the Christmas Day incident, not the technology. Correcting the problems that led to that failure is not going to happen quickly.

Effective sharing of intelligence traditionally has been hampered by technology and by culture. Computer systems that store, process and transmit the data were built with an eye toward controlling it rather than sharing it. And in a culture where knowledge is power, you don't give away information.

When it became clear that this environment was failing to protect the nation from many threats, correcting the technology part was relatively simple. The databases and other systems now serving the intelligence community might not be perfect but, in this case, they appear to have performed as intended and the necessary information was available.

It is not clear that the problem of culture has been completely solved. Things have improved, but priorities still seem to fall along organizational lines so that pieces of information do not get the attention they deserve outside the organization that generated them. That probably is due in part to the amount of manpower available to do the analysis.

The failures described in Obama's review of the incident were primarily on the analysis side. Available information was not properly prioritized and followed up, and the would-be terrorist's name, although in the system, was not moved to the proper list. The failure was one of focus and priorities, not of information sharing.

The president has promised to sharpen the focus and make officials accountable for following up critical information. But no mention has been made of the critical element needed to make these promises pay off: manpower. We have automated systems to collect, filter, process and transmit the data. But we still need eyeballs to examine it and make decisions.

We need people making critical decisions because computers, although they can be fast and efficient, are stupid. They will search for and find exactly what they are told to look for, but they need people to tell them what that is. In the end, it is far more effective for a human brain to pick out the tell-tale traits of a terrorist than it is to try to describe those traits to a computer.

Computers are useful in filtering data according to set criteria, flagging it when criteria are met and then alerting someone. But eventually someone with reasoning power needs to make the decision whether the conditions identified by a computer amount to terrorism.

We don’t know exactly why the whistles did not go off when a person whose name was in a database purchased a one-way ticket with cash and then checked no luggage on the flight. That is the kind of correlation a computer should be good at. But the plot could have been identified well before that point, and even if the system is tweaked to blow the whistles when something like that occurs, we still need more people looking at the data.
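The kind of correlation described above amounts to simple rule-based flagging. As a rough illustration only, the sketch below shows what such a filter might look like; the field names, watch list and rules are hypothetical and do not describe any real screening system.

```python
# Hypothetical sketch of rule-based passenger flagging. Field names,
# rules and the watch list are illustrative, not from any real system.

def flag_passenger(record, watchlist):
    """Return the list of risk indicators matched by one booking record."""
    hits = []
    if record["name"] in watchlist:
        hits.append("name on watchlist")
    if record["payment"] == "cash" and record["trip"] == "one-way":
        hits.append("one-way ticket bought with cash")
    if record["checked_bags"] == 0:
        hits.append("no checked luggage")
    return hits

booking = {"name": "J. Doe", "payment": "cash",
           "trip": "one-way", "checked_bags": 0}
alerts = flag_passenger(booking, watchlist={"J. Doe"})
print(alerts)  # all three rules match, so an analyst would be alerted
```

The point of the column stands even in this toy form: the computer can only raise the flags it was told to raise, and a person still has to decide what the combination of flags means.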

If Obama's reforms are to work, if focus is going to be sharpened and people held accountable for improved results, we are going to need more trained, experienced people examining suspicious data, and that is going to take time. We should start now finding, hiring and training those people.

About the Author

William Jackson is a Maryland-based freelance writer.

Reader Comments

Fri, Jan 29, 2010 Robert L O'Dell Jr/Volunteer Intelligence Richmond, Mo. U.S.

I wonder how many scenario tests have come out of this problem. If there were a basic scenario for each type of targeted intelligence to look for, there may be a better way to identify the targeted threats posed by every known penetration. Then crossing the threats into a camouflage type of scenario testing would leave a threat more exposed and easier to detect. This would make it easier to distinguish between complex threats and simple threats. Then human error might not be so hard to avoid. If it makes sense, it may gain an advantage on threat initiatives, as well as detections. The best threat detection may be as easy as intimidation, too.

Tue, Jan 19, 2010

I suspect the system the president referred to includes the computer technology and the people who use it.

Thu, Jan 14, 2010 M Reston, VA

I heard an Israeli security consultant state it the other way around: Security systems need to back up the best-qualified human beings. This matches my experience with any other type of system in solving unstructured problems.
