Police Tech & Gear
with Tim Dees
Predicting criminal behavior: DHS takes a page from 'Minority Report'
The Future Attribute Screening Technology (FAST) Project uses high-tech sensors to detect indicators of potential criminal activity
By nature or training, law enforcement officers use a variety of cues either to predict the behavior of others or to read meaning into their more obvious actions. A suspect who suddenly tenses his muscles, or who looks furtively toward a possible escape route, telegraphs to the officer that he is about to fight or flee. Responding to questions with indirect answers or more questions ("Where do I live? I just got into town last week") can indicate outright deception or an attempt to conceal information. These signs by themselves don't provide enough foundation for an arrest, but any cop who ignored them wouldn't remain a cop, or alive, for very long. Research underway at the U.S. Department of Homeland Security (DHS) is intended to detect cues of criminal behavior from much farther away, and could determine the way an officer or an entire tactical unit responds to an incident.
That research, the Future Attribute Screening Technology (FAST) Project, uses high-tech sensors to monitor physiological activity: heart and respiration rates and sudden changes or variances in them, eye movement and gaze direction, thermal patterns on visible skin surfaces, and movements of limbs and facial features. In short, DHS is trying to detect at a distance the cues that officers perceive when up close and personal with people. DHS is also considering adding pheromone detectors to the mix, to pick up the scents we generate when emotionally stressed.
The objective seems to be to quantify and recognize the behaviors and cues of people who are, as cops are fond of saying, "getting hinky." Most of us can recognize signs of nervousness or stress in others, even if we might have difficulty describing them to someone else. Polygraphs measure the effect of these stresses to some extent, recording blood pressure, pulse rate, respiration, and changes in the conductivity of the skin as the tested subject sweats, presumably in fear of being revealed as a liar. Advocates of voice stress analysis claim they can detect microtremors in the voice that indicate stress and attempts at deception, although the science behind it has been largely debunked.
A system that could identify a “hinky” person in a crowd would be very useful at mass transit checkpoints and public gatherings. Officers who patrol and monitor these environments develop the ability to pick the bad actors out from the crowd, but anyone can be overwhelmed by fatigue and sheer numbers. Even when the stressed individual is identified, officers aren’t able to act decisively because their gut instinct is insufficient to justify interference with that person’s freedom of movement. An automated system would assign threshold values to indicators of criminal behavior or intent, and thus provide a basis for stopping and questioning the person under scrutiny.
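FAST's actual algorithms and thresholds are not public, so the following is only a hypothetical sketch of the threshold-based flagging described above; every indicator name and cutoff value here is invented for illustration.

```python
# Hypothetical sketch of threshold-based screening. FAST's real indicators,
# thresholds, and decision logic are not public; all values are invented.

def screening_score(indicators, thresholds):
    """Count how many physiological indicators exceed their threshold."""
    return sum(1 for name, value in indicators.items()
               if value > thresholds.get(name, float("inf")))

THRESHOLDS = {
    "heart_rate_bpm": 110,       # elevated heart rate
    "respiration_per_min": 24,   # rapid breathing
    "gaze_shifts_per_min": 30,   # frequent furtive glances
    "skin_temp_delta_c": 1.5,    # thermal change on visible skin
}

# Simulated readings for one person passing a checkpoint
subject = {
    "heart_rate_bpm": 118,
    "respiration_per_min": 26,
    "gaze_shifts_per_min": 12,
    "skin_temp_delta_c": 0.4,
}

score = screening_score(subject, THRESHOLDS)
flagged = score >= 2  # flag for secondary questioning if 2+ indicators trip
```

The point of such a scheme is exactly what the article describes: it converts a gut feeling into a numeric score that can be cited as the basis for stopping and questioning someone.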
The potential for abuse here is fairly grave. Some people are instinctively nervous in crowds or at the thought of getting into an aircraft. A traveler with no harmful intentions could be nervous about an upcoming job interview, or feel guilty over an affair conducted on a business trip without their spouse. Should the government be allowed to detain and question someone about why they are showing signs of stress, solely because a few individuals under stress might be planning to bring down an aircraft?
Already, critics of FAST are comparing it to the system of "precogs" described in Philip K. Dick's short story "The Minority Report" and the 2002 Tom Cruise movie Minority Report. In that universe, a trio of precognitive mutants foresaw murders that were to take place within the next hour or so, so that cops from the "Precrime" unit could arrest the pre-killers and hold them in an electronically maintained coma before they could do any harm. If people could be imprisoned for what they merely thought about doing, I would be serving life sentences many times over.
In tests with shills inserted into wide-area environments, FAST has been about 70 percent effective at identifying people demonstrating the target behaviors and indicators. That not only allows a lot of bad actors to slip through; it also flags people who had no evil in mind at all. The technology is new, and may never be deployed for real-world protection. In the meantime, relax. It's not like anyone's watching.
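Back-of-envelope arithmetic shows why even a 70 percent hit rate flags far more innocents than threats. The 70 percent figure is from the reported test results; the crowd size, number of actual bad actors, and false-positive rate below are assumptions for illustration only.

```python
# Base-rate arithmetic. Only the 70 percent detection rate comes from the
# reported FAST tests; every other number here is an assumption.

crowd = 10_000          # travelers screened in a day (assumed)
bad_actors = 10         # actual threats in that crowd (assumed)
detection_rate = 0.70   # from the reported test results
false_positive = 0.05   # innocents wrongly flagged (assumed)

true_hits = bad_actors * detection_rate               # 7 threats caught
missed = bad_actors - true_hits                       # 3 slip through
false_alarms = (crowd - bad_actors) * false_positive  # 499.5 innocents flagged

# Of everyone the system flags, only a small fraction are actual threats
precision = true_hits / (true_hits + false_alarms)    # roughly 1.4 percent
```

Under these assumed numbers, for every genuine threat detained the system would also stop dozens of nervous travelers, which is the civil-liberties concern in a nutshell.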