
3 keys for responsible use of artificial intelligence

Focus on specific use cases and proven technologies to help address community concerns about AI tools like facial recognition


AI and facial recognition technologies are among the latest tools to help law enforcement agencies serve their communities more efficiently. It’s important to apply these tools thoughtfully and to explain to the public how they will be used in order to build trust. (image/Getty)

Sponsored by Motorola Solutions

By Police1 BrandFocus Staff

The phrase “artificial intelligence” often raises questions about how it’s being used, particularly when it comes to public safety. Those who actually work in law enforcement know the technology can play a critical role in keeping communities safe.

It has always been important to educate the public about the benefits of these technologies. But now, given the protests around the country in the wake of George Floyd’s death, it is more important than ever that law enforcement makes every effort to share with the public why and how it is using these technologies to protect and serve.

What best practices, then, can law enforcement use to help citizens understand that AI and facial recognition technologies are valuable tools to help law enforcement agencies serve their communities more efficiently?

Start with transparency, advises Roger Rodriguez, a former NYPD detective who led the NYPD’s first facial recognition program. Now a senior consultant with Motorola Solutions, which offers several AI tools designed specifically for public safety, Rodriguez says it’s important to work with citizens to understand their concerns, then share the specific goals your agency is looking to accomplish. Use those goals to create policies governing how your agency will use these technologies, and share successful use cases with the media and the communities you serve to build trust.

“A core complaint of law enforcement is inefficiency,” he said. “AI technology helps law enforcement create more efficient workflows. Facts outweigh assumptions and can help quell the rhetoric that unfairly criticizes this technology.”

Building trust with the community is first and foremost, and it has arguably never been more important than it is today. Second is understanding state and local laws and rules, then working with your vendor of choice to build a program and a policy that meet your goals and comply with those laws.

In addition to its AI tools, Motorola Solutions also offers guidance to help address community concerns about these emerging technologies, including three principles that guide the company’s use of AI technology, such as facial recognition:

1. USE TECHNOLOGY TO PROVIDE SUGGESTIONS, NOT DECISIONS

At the core of responsible AI use is a human decision-maker, a concept that Motorola Solutions dubs “the human in the loop,” says Rodriguez. The company’s philosophy states that the best use of AI in public safety is to assist and accelerate human decision-making, not replace it.

This means that AI tools suggest leads but never make the decision for the officer or investigator. With the company’s AI tools, the human operator always has the final decision on whether a lead is accurate or not.

The idea that technology is making the decisions instead of human operators is the biggest misconception, Rodriguez says.

“That’s not the case,” he said. “Yes, AI is there to assist you. Yes, AI is exciting and an emerging piece of technology, but it is not the final deciding factor. The human has the deciding factor. That’s why you have the law enforcement expert in front of that computer or responding to that call or making those investigative decisions.”

For example, facial recognition programs can provide leads for investigators to consider, but that doesn’t mean someone flagged by the program is an automatic candidate for further investigation.

Instead, it is always up to the agency to decide whether the match generated by the technology becomes an investigative lead. Toward that end, Motorola Solutions encourages agencies with facial recognition programs to ensure their face examiners are properly trained, to have a policy in place governing the program and never to rely on facial recognition results alone to establish probable cause.
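The "human in the loop" principle described above can be sketched in code. This is a minimal illustration, not any real Motorola Solutions API: all names (`CandidateLead`, `ReviewQueue`) are hypothetical. The key property is that the system may rank candidates by algorithmic confidence, but no score threshold ever promotes a candidate automatically; only an examiner's explicit approval produces an investigative lead.

```python
from dataclasses import dataclass

# Hypothetical sketch of human-in-the-loop lead review: the algorithm
# ranks candidates, but only a human examiner's decision makes a lead.

@dataclass
class CandidateLead:
    subject_id: str
    similarity: float          # algorithm's confidence score, 0.0 - 1.0
    examiner_approved: bool = False

class ReviewQueue:
    def __init__(self, candidates):
        # Highest-confidence first, so the examiner sees likely matches
        # early -- but no threshold auto-approves anything.
        self.candidates = sorted(candidates, key=lambda c: c.similarity,
                                 reverse=True)

    def approve(self, subject_id: str) -> None:
        """Record the human examiner's decision on one candidate."""
        for c in self.candidates:
            if c.subject_id == subject_id:
                c.examiner_approved = True

    def investigative_leads(self):
        # Only human-approved candidates ever leave the system as leads.
        return [c for c in self.candidates if c.examiner_approved]

queue = ReviewQueue([
    CandidateLead("A-101", 0.94),
    CandidateLead("A-205", 0.87),
])
assert queue.investigative_leads() == []   # no leads until a human acts
queue.approve("A-101")
print([c.subject_id for c in queue.investigative_leads()])  # ['A-101']
```

The design choice worth noting is that `investigative_leads` filters on the examiner's flag, never on `similarity`, mirroring the guidance that facial recognition results alone should never establish probable cause.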

2. USE TECHNOLOGY TO BOOST THE EFFICIENCY OF SPECIFIC POLICING TASKS

Overreach is a common concern cited in efforts to ban or block the use of AI technology in policing, such as California’s three-year ban, enacted in October 2019, on using bodycam footage for facial recognition.

The best way to address these concerns, says Rodriguez, is by focusing on very specific law enforcement use cases for AI and other smart technology applications. Motorola Solutions offers several smart technologies designed specifically for law enforcement that support policing activities from pre-event (detecting, alerting) through mid-event (response, reporting) and post-event (investigative leads, analysis).

Many people equate artificial intelligence with facial recognition when, in fact, AI can be applied to many other aspects of law enforcement. For example, AI is used to streamline reporting by auto-populating and cross-checking fields, to transcribe speech to text and to power voice interactions.

For example, Motorola Solutions’ new smart radio, APX NEXT, has a voice-activated assistant (similar to Siri or Google Assistant) that helps officers focus on the task at hand. An officer can say, “Hey ViQi, look up license plate ABCD,” and the radio’s AI looks up the plate and tells the officer whether there’s a warrant or other relevant record. This helps officers respond to or mitigate a situation without burning precious minutes walking back to the cruiser to look up critical information, and, more importantly, without taking their eyes off the situation.

Officers can use the voice-activated assistant in Motorola Solutions’ new smart radio, APX NEXT, to look up license plates and other critical information on the go, which helps them save time and maintain focus on the situation. (image/Motorola Solutions)


The intelligence analytics within the radio help make officers more efficient and safe, says Rodriguez, because they get critical information faster and can keep their eyes on the scene, and the push-to-talk chatter is transcribed into a report to cut down on after-action paperwork.

AI technology can benefit investigations in multiple ways, especially when it comes to video analysis.

“We work very closely with IJIS [the Integrated Justice Information Systems Institute] and PERF [the Police Executive Research Forum] to really understand where AI can be applied to video, whether that is facial recognition or other forms of analysis,” said Rodriguez.

For example, AI technology can be applied to a bank of cameras in a real-time crime center to detect and analyze unusual traffic, such as a vehicle going the wrong way, and alert the human operator.

“AI can pick that up without the human having to watch the screen,” said Rodriguez. “The AI would pick up on that car going in the wrong direction, alert the analyst, and the analyst would then verify and respond accordingly.”

This application makes AI a force multiplier, he adds, because no human analyst is able to watch and analyze the feeds of multiple cameras simultaneously. The AI can analyze multiple feeds at the same time, and more importantly, it can highlight areas of interest. For example, through its Avigilon platform, Motorola Solutions offers an application called Focus of Attention that helps the human analyst focus on the camera feed that is most important at any given moment.
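The wrong-way-vehicle scenario above can be sketched as a simple per-feed check. This is an illustrative toy, not how Avigilon or Focus of Attention actually works: a real system infers headings from video, whereas here each camera feed simply reports observed vehicle headings in degrees, and any feed whose traffic deviates sharply from the expected direction is surfaced for a human analyst to verify. All names and the tolerance threshold are hypothetical.

```python
# Hypothetical sketch: flag camera feeds with wrong-way traffic so a
# human analyst can verify and respond. The AI screens many feeds at
# once; the analyst makes the call.

def angular_difference(a: float, b: float) -> float:
    """Smallest absolute difference between two compass headings."""
    diff = abs(a - b) % 360
    return min(diff, 360 - diff)

def feeds_needing_attention(feeds, tolerance_deg=90):
    """Return IDs of feeds with a vehicle far off the expected heading.

    Each feed is (feed_id, expected_heading_deg, observed_headings_deg).
    """
    alerts = []
    for feed_id, expected_heading, observed_headings in feeds:
        if any(angular_difference(h, expected_heading) > tolerance_deg
               for h in observed_headings):
            alerts.append(feed_id)   # surface this feed to the analyst
    return alerts

feeds = [
    ("cam-1", 90, [85, 92, 88]),     # all traffic roughly eastbound: OK
    ("cam-2", 90, [91, 268, 89]),    # one vehicle heading west: alert
]
print(feeds_needing_attention(feeds))  # ['cam-2']
```

The force-multiplier effect is visible even in the toy: the loop scans every feed continuously, while the analyst only looks at the short list it returns.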

3. ADOPT PROVEN TECHNOLOGIES DESIGNED SPECIFICALLY FOR LAW ENFORCEMENT

Artificial intelligence isn’t one-size-fits-all, cautions Rodriguez. Motorola Solutions carefully considers which investigative functions would best be served by AI technology and how to ethically apply proven AI components to those functions.

“We are rigorously testing every single piece of technology that has some AI component within it to ensure that it meets law enforcement standards,” he said. “Just as important, we’ve established the Motorola Solutions Technology Advisory Committee to serve as a ‘technical conscience’ and advise the company on the legality, ethics, limitations and implications of specific product technologies, including AI and facial recognition. MTAC guides how we innovate and encourages the responsible use of our technologies by our customers.”

Using the simplest and most mature solutions makes the AI more predictable and reliable, he adds. The facial recognition program Motorola Solutions offers through Vigilant FaceSearch uses AI to generate leads by comparing a photo of a person of interest against a gallery of images or photos provided by a complainant or witness as part of the overall investigation.

For example, facial recognition can be used to find an abducted, lost or missing individual, as in the case of a Silver Alert or Amber Alert. Similarly, it can be used when an agency must identify a homicide victim who has no other form of identification. This proven technology applies biometric algorithms to facial landmarks, such as the shape of the person’s face, to find potential matches and narrow down the candidate pool.

“AI really plays the factor in making it more efficient,” said Rodriguez. “Instead of looking through hundreds of thousands of photos manually, which is what officers used to do, now it’s just a repository of those photos processed by a facial analysis system that makes it easier to search.”

More importantly, a law enforcement investigator’s use of a proven facial recognition solution results in better identification analysis. A key study from the National Academy of Sciences found that a well-trained human, coupled with an accurate facial recognition engine, produces more accurate results than either multiple collaborating humans or the algorithm by itself.

Privacy is understandably a concern, but it’s important to educate the public about the basic investigative functions law enforcement seeks to enhance with data and AI tools.

Do your research and choose the right vendor to meet the needs of your agency and your community, advises Rodriguez.

“Really analyze who your vendors are and if they really understand law enforcement’s application of these technologies,” he said. “Choose a vendor who deploys a responsible solution that covers system oversight and reporting tools, audit capabilities, compliance controls and controls for data sharing and user permissions, and who recommends policy and offers training.”

The bottom line? Focus on specific use cases for technologies like facial recognition at your agency, craft smart policies to guide usage, enforce those policies to ensure proper and responsible use, and educate citizens in order to build trust and allay privacy concerns.

Visit Motorola Solutions for more information.

This article originally appeared in “AI in Law Enforcement: Harnessing the Power of Emerging Technologies.”
