Would you believe that the latest weapon to fight crime in cities around the world isn’t really a weapon at all?

Roger that.

As it turns out, artificial intelligence (AI) and machine learning are helping police take a bigger bite out of crime than ever before, even preventing crimes before they happen.

One emerging trend, referred to as predictive policing, is answering the call, leveraging the same predictive analytics technology that's benefiting industries like energy, financial services, retail, and real estate.

A recent BBC article covered the story of how California-based startup PredPol adapted technology originally designed to predict earthquakes into a predictive policing app. Today, the company works with more than 50 police departments across the US, claiming it can reduce a city's crime rate by anywhere from 10 to 50 percent. Using machine-learning algorithms, PredPol analyzes three data points (crime type, location, and time) to predict when and where crimes are likely to happen. The result helps police patrol proactively, which can not only reduce crime rates but also save lives by better preparing officers for what they may face.
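As a toy illustration of the kind of frequency-based risk scoring this describes (not PredPol's actual model, which is proprietary), one could bucket historical incidents by location cell and hour of day and rank the busiest buckets. All of the data and function names below are invented for the example:

```python
from collections import Counter

# Hypothetical historical incident log: (crime_type, grid_cell, hour_of_day).
# Real systems would draw on far larger, richer records.
incidents = [
    ("burglary", "A1", 22), ("burglary", "A1", 23),
    ("theft", "B2", 14), ("burglary", "A1", 22),
    ("assault", "C3", 1), ("theft", "B2", 15),
]

def hotspot_scores(records):
    """Count past incidents per (cell, hour) bucket; in this naive
    frequency model, a higher count means higher predicted risk."""
    return Counter((cell, hour) for _, cell, hour in records)

def top_patrol_targets(records, n=2):
    """Return the n (cell, hour) buckets with the most past incidents."""
    return [bucket for bucket, _ in hotspot_scores(records).most_common(n)]

print(top_patrol_targets(incidents))  # e.g. [('A1', 22), ...]
```

A production system would replace raw counts with a statistical model (PredPol's published work draws on earthquake aftershock models, where one crime raises the short-term risk of nearby follow-on crimes), but the input schema of type, place, and time is the same.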

While law enforcement has long used practices like "hot spot analysis" to fight crime in a similar manner, predictive policing goes further. With AI and machine learning, police can "predict" the future rather than "react" to what happened yesterday. That's because these systems can analyze far more data than humans can, including both real-time and historical data. It's this added intelligence, delivered at speed, that enables police to be more effective by spotting crime patterns they had not seen before.

Other useful policing apps that leverage AI and machine learning include facial recognition, advanced surveillance, and even sensors that can detect gunfire before the police do.

But while solutions like predictive policing are showing promise for fighting crime, not all AI-powered policing apps are out walking the daily beat. A controversial solution known as sentencing software has faced a backlash and accusations of fostering racial bias. Sentencing software is supposed to determine how likely a defendant is to become a repeat offender by analyzing information on defendants, their crimes, and other activity. However, a paper published earlier this year in the journal Science Advances found that the AI-powered software was no better at predicting repeat offenders than humans.

The applications of predictive analytics are endless, with the latest breakthroughs protecting citizens in ways we never before thought possible. While there may be some missteps along the way, there's no doubt that AI-powered policing is here to stay.