08 November 2021

City of Mesa, Arizona Opts for OVER-POLICING...LAPD Not so much

A recent decision by the Mesa City Council to fund twice as many new police department positions as voters had earlier approved was remarkable, to say the least, unless the council expects an improbable, sudden spike in crime rates or intends to curry political favor with a powerful bloc that can deliver votes in the next election. Either explanation is a toss-up in the scheme of things. There was never any public outcry here in Mesa, and hardly any negative news coverage.
A different story comes to light in Los Angeles

LAPD ended predictive policing programs amid public outcry. A new effort shares many of their flaws

The LAPD documents show how data-driven programs validated existing policing patterns. Illustration: Ricardo Santos/The Guardian

"The Los Angeles police department has been a pioneer in predictive policing, for years touting avant-garde programs that use historical data and software to predict future crime.
But newly revealed public documents detail how PredPol and Operation Laser, the department’s flagship data-driven programs, validated existing patterns of policing and reinforced decisions to patrol certain people and neighborhoods over others, leading to the over-policing of Black and brown communities across the city.
> “This is helping automate the harm, automate the banishment, automate the displacement that policing has always been responsible for.”
> -- Shakeer Rahman, Stop LAPD Spying
The documents, which include internal LAPD documents and emails and were released as part of a report by the Stop LAPD Spying coalition, also suggest that pledges to reform the programs amid rising public criticism largely rang hollow.
LAPD’s efforts to rebrand its predictive policing experiments mirror a broader shift in the private surveillance industry, experts say, as companies increasingly reinvent existing products in response to negative press on predictive policing.
“Rather than re-evaluating their whole business model, they’re just trying to reframe the value of the product,” said Albert Fox Cahn, the founder of the Surveillance Technology Oversight Project (Stop), another anti-police-surveillance advocacy group. “They’re saying: here’s how you can prevent crime by allocating officers and changing patrols and changing who you engage with. And that’s going to result in the exact same outcomes.”

How Operation Laser created a vicious cycle

Launched in 2011, Operation Laser (an acronym for Los Angeles Strategic Extraction and Restoration) got its name from what LAPD hoped it would do: extract “offenders” with the precision of a doctor using laser surgery to remove a tumor.

On its face, using a data-backed approach to remove a “tumor” may seem logical. The problem was, according to critics and experts, that the data the program ran on was malignant.

Activists protest outside the Palantir Technologies software company in 2019. Photograph: Shannon Stapleton/Reuters

Operation Laser used historical information such as data on gun-related crimes, arrests, and calls to map out “problem areas” (called “laser zones”) and “points of interest” (called “anchor points”) for officers to focus their efforts on. A newly established group, the crime intelligence detail, worked to create chronic offender bulletins, assigning criminal risk scores to people based on arrest records, gang affiliation, probation and field interviews. Information collected during these policing efforts was again fed into computer software that further helped automate the department’s crime-prediction efforts.
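For illustration, the kind of scoring the chronic offender bulletins describe can be sketched as a simple point system. The field names, weights, and thresholds below are hypothetical assumptions, not LAPD's actual criteria; the point is the general shape of the calculation, in which records generated by policing itself (arrests, gang designations, field interviews) feed directly into the score.

```python
# Illustrative sketch only: the weights and field names are invented, not
# LAPD's actual formula. It shows the shape of a point-based risk score
# built entirely from records produced by past policing activity.
from dataclasses import dataclass

@dataclass
class PersonRecord:
    violent_crime_arrests: int     # prior arrests for violent crimes
    gun_related_arrests: int       # prior gun-related arrests
    gang_affiliated: bool          # flagged as gang-affiliated
    on_probation_or_parole: bool   # current probation or parole status
    field_interviews: int          # stops and contacts logged by officers

def chronic_offender_score(p: PersonRecord) -> int:
    """Sum hypothetical point values for each recorded risk factor."""
    score = 0
    score += 5 * p.violent_crime_arrests
    score += 5 * p.gun_related_arrests
    score += 5 if p.gang_affiliated else 0
    score += 5 if p.on_probation_or_parole else 0
    score += 1 * p.field_interviews   # every logged stop raises the score,
                                      # even when no offense is found
    return score

# Someone stopped repeatedly but never arrested still accumulates points.
print(chronic_offender_score(PersonRecord(0, 0, False, False, 12)))  # -> 12
```

Under any scheme of this shape, each additional stop raises a person's score, and a higher score invites more stops, which is exactly the circularity the article goes on to describe.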

Central to Operation Laser’s success, wrote Craig Uchida, the program’s architect at LAPD, in a research paper in 2012, was Palantir. The software, controversial for aiding US Immigration and Customs Enforcement in surveilling immigrants, made it easier and faster for the department to create chronic offender bulletins and put together information from various sources on people deemed suspicious or inclined to commit a crime, Uchida said.

But the picture of crime in LA the software drew up was based on calls for service, crime reports and information collected by officers, the documents show, creating a vicious loop.
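To see why critics call this a loop, consider a deliberately simplified toy simulation. The zone names, rates, and patrol rule below are invented for illustration and are not drawn from LAPD's software: if incidents only enter the data where officers are sent, and officers are sent where the data shows incidents, an initial imbalance perpetuates itself regardless of where crime actually occurs.

```python
# Toy feedback-loop simulation (assumptions: incidents are recorded only
# where officers patrol, and patrols follow past recorded counts).
import random

random.seed(0)

TRUE_RATE = {"A": 0.3, "B": 0.3}   # both zones have the same underlying rate
recorded = {"A": 10, "B": 5}       # but zone A starts with more recorded incidents
PATROLS_PER_DAY = 10

for day in range(100):
    # patrols go to the zone with the most recorded incidents so far
    zone = max(recorded, key=recorded.get)
    for _ in range(PATROLS_PER_DAY):
        # an incident only enters the data if an officer is there to record it
        if random.random() < TRUE_RATE[zone]:
            recorded[zone] += 1

print(recorded)  # zone A's count keeps climbing; zone B's never moves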

> In 2019, the LAPD inspector general, Mark Smith, said the criteria used in the program to identify people likely to commit violent crimes were inconsistent.

[. . .] As the Guardian revealed on Sunday, one of the locations that Operation Laser targeted was the Crenshaw district, where the rapper Nipsey Hussle was based. Hussle had long complained about policing in his neighborhood, saying in a 2013 interview that LAPD officers “come hop out, ask you questions, take your name, your address, your cell phone number, your social, when you ain’t done nothing. Just so they know everybody in the hood.” [. . .] The consequences could be severe. The information of civilians stopped at the intersection would be fed into the data system, even if they hadn’t committed any offenses.

PredPol’s earthquake theory of crime

In addition to running Operation Laser, LAPD contracted with PredPol, a company that grew out of a research project between LAPD and the UCLA professor Jeff Brantingham.

PredPol applied an earthquake prediction model to crime. The underlying theory – which the company once compared to the unproven and controversial broken windows policing strategy – was that, like earthquakes and their aftershocks, smaller crimes were gateways to bigger crimes and occurred in similar places. While the mathematics might look complicated for “normal mortal humans”, PredPol said in a 2014 presentation obtained by Motherboard, the model was “based on nearly seven years of detailed academic research into the causes of crime pattern formation”.
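The academic work the company grew out of describes a self-exciting, aftershock-style model. The sketch below is only a loose illustration of that idea, with invented parameter values and a far simpler functional form than any production system: each recorded incident temporarily boosts the predicted rate in its grid cell, so the cells flagged as “hot” are, by construction, the cells where incidents were most recently recorded.

```python
# Rough sketch of the "aftershock" idea: each past incident raises the
# predicted rate in its own grid cell, decaying over time. Parameter values
# and the functional form are illustrative assumptions, not PredPol's algorithm.
import math

def predicted_rate(cell, now, incidents, background=0.05, boost=0.4, decay=0.1):
    """Background rate plus a decaying 'aftershock' term for each past
    incident recorded in this cell. `incidents` is a list of (cell_id, time)."""
    rate = background
    for inc_cell, t in incidents:
        if inc_cell == cell and t <= now:
            rate += boost * math.exp(-decay * (now - t))
    return rate

# Whichever cell had the most recent recorded incidents gets flagged for patrol.
incidents = [("c1", 1.0), ("c1", 3.0), ("c2", 2.0)]
cells = ["c1", "c2", "c3"]
hot = max(cells, key=lambda c: predicted_rate(c, now=4.0, incidents=incidents))
print(hot)  # -> "c1": officers are sent back where incidents were recorded
```

Nothing in a score like this distinguishes where crime happened from where crime was recorded, which is the core of the academic criticism that follows.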

Jeff Brantingham displays computer-generated predictive policing process at an LAPD post in 2012. Photograph: Damian Dovarganes/AP

But academics say the theory is flawed, and the math the company pitched to police was too simple to effectively predict crime. The model was essentially assessing where arrests had been made and sending police back to those locations, according to those academics.

More than a dozen police departments experimented with PredPol, including in Palo Alto and Mountain View. But by the end of 2019, both Operation Laser and PredPol had garnered intense criticism, with skeptics charging that the systems perpetuated discrimination.

By that time, several police departments had dropped their contracts with PredPol, saying there was little proof it helped reduce crime. After three years of use, the Palo Alto police department “didn’t get any value out of it”, a spokesperson, Janine De la Vega, said at the time.

LAPD initially promised reform, but ultimately shuttered Operation Laser in April 2019 and canceled its contract with PredPol in April 2020. LAPD conceded the data used in Operation Laser “was inconsistent” and needed to be reassessed. PredPol, it said, was terminated because of budgetary constraints due to the pandemic. Still, the police chief, Michel Moore, maintained the underlying principles of the program were valuable.

A bid to establish ‘digital trust’

With the data-driven programs the LAPD had promoted for years gone, a new effort took their place. Days before announcing the end of PredPol, LAPD published information about what it called data-informed community-focused policing (DICFP). The intention of DICFP, the department said, was to establish a deeper relationship between community members and police and address some of the concerns the public had with previous policing programs, all while working to prevent crime. “The legitimacy of a police department is dependent on a community’s trust in its police officers,” an April 2020 LAPD brochure on the program read.

[. . .]

The similarities start with what LAPD now calls “neighborhood engagement areas” or “neighborhoods experiencing crimes and low community engagement”. Like anchor points, those areas are identified based on information such as crime data and calls for service, which include anything from calls about robberies to traffic-related incidents and “non-emergency” calls, according to a daily operations guide.

To address crime in neighborhood engagement areas, according to the brochure, LAPD would use a problem-solving model first introduced under Operation Laser called Sara – an acronym for scanning, analysis, response and assessment. As part of that model, police and stakeholders would use tools such as increased patrolling and surveillance to prevent future crimes.

An LAPD Sara report from 2020. Photograph: LAPD records from Stop LAPD Spying report

With a process so similar to Operation Laser’s, DICFP would lead to similar results: at least one anchor point under the previous regime was also selected as a neighborhood engagement area in 2020, and at least one other area of interest was located within what was previously a Laser zone, the documents show.

[. . .]

“I don’t know about you but I’m not building trust with someone who spies on me,” said Tracey Corder, the deputy campaign director at Acre, a group that helps local organizations campaign against racial injustice.

“It sounds like a rebrand,” she continued. “It’s a co-option of organizer demands and organizing wins. We have set the stage and said policing as it exists does not work. All of this has been an effort to not actually change, but rebrand and reuse what they’ve already been doing.”

Cahn, the Surveillance Technology Oversight Project founder, said: “It seems like the worst sort of fear of organizers. Rather than actually addressing any of the substantive harms that come from predictive policing, they’re simply providing this veneer of community engagement.”

LAPD did not reply to repeated and detailed requests for comment.

‘They’re trying to do some whitewashing’

The LAPD was not alone in rebranding its predictive policing efforts. A month after the department introduced DICFP, PredPol changed its name to Geolitica. On the company website, where there was once a banner that said it was “the predictive policing company” that works to “predict critical events”, Geolitica now boasts “data-driven community policing” that helps public safety teams “be more transparent, accountable, and effective.”

Privacy advocates say LAPD and PredPol’s efforts were part of a larger trend in the predictive policing industry – both in police departments and private companies. In response to public criticism of predictive policing, companies have rebranded existing products or launched new products that promote police accountability and transparency.

Police departments are ‘definitely aware of all the negative connotations of predictive policing’, says Brian Hofer of the reform advocacy group Secure Justice. Photograph: Mel Melcon/Los Angeles Times/Rex/Shutterstock

“They’re definitely aware of all the negative connotations of predictive policing,” said Brian Hofer, the executive director of the government reform advocacy group Secure Justice and the chair of the Oakland Privacy Commission. “They’re trying to really do some whitewashing by rebranding different verbiage and talking about serving these communities instead.”

But safeguards shouldn’t be left to police or tech companies to implement, Corder argues.

“When you think about the way police respond to any kind of calls for reforms from civilians, it’s always oppositional,” Corder said of police departments that use these purported accountability services. “But now all of a sudden we’re supposed to believe they are fine with oversight coming from tech companies? Anybody should be concerned about that and we should start asking the question of why.”

Sam Levin contributed reporting

