Tuesday, December 28, 2021

PREDICTIVE POLICING (PredPol) is Racist Policing...Confirmation Bias Built Into The System | TechDirt

Mis-Uses of Technology - latest update from Tim Cushing, published yesterday:
"Don't kid yourselves, techbros. Predictive policing is regular policing, only with confirmation bias built in. The only question for citizens is whether or not they want to pay tech companies millions to give them the same racist policing they've been dealing with since policing began.

Unsecured Data Leak Shows Predictive Policing Is Just Tech-Washed, Old School Biased Policing

from the hey-that-data-isn't-accurate-says-AI-company-confirming-data's-accuracy dept

 
The data Gizmodo and The Markup obtained reinforces everything that's been reported about this form of "smarter" policing, confirming its utility as a law enforcement echo chamber that allows cops to keep harassing minorities because that's what they've always done in the past.

Between 2018 and 2021, more than one in 33 U.S. residents were potentially subject to police patrol decisions directed by crime-prediction software called PredPol.

The company that makes it sent more than 5.9 million of these crime predictions to law enforcement agencies across the country—from California to Florida, Texas to New Jersey—and we found those reports on an unsecured server.

Gizmodo and The Markup analyzed them and found persistent patterns.

Residents of neighborhoods where PredPol suggested few patrols tended to be Whiter and more middle- to upper-income. Many of these areas went years without a single crime prediction.

By contrast, neighborhoods the software targeted for increased patrols were more likely to be home to Blacks, Latinos, and families that would qualify for the federal free and reduced lunch program.

Targeted more? In some cases, that's an understatement. Predictive policing algorithms compound existing problems. If cops frequently patrolled minority neighborhoods in the past because of biased, pre-predictive policing habits, feeding that data into the system returns "predictions" of more crime in the areas where officers have historically spent the most time.

The end result is what you see summarized above: non-white neighborhoods receive the most police attention, resulting in more data to feed to the machine, which results in more outputs that say cops should do the same thing they've been doing for decades more often. Run this feedback loop through enough iterations and it results in the continued infliction of misery on certain members of the population.
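To make that feedback loop concrete, here is a minimal, hypothetical Python sketch. The neighborhood names, starting patrol counts, and crime rates are invented for illustration, and this is not PredPol's actual model (which the article does not disclose); it only shows how allocating patrols in proportion to past patrol-driven records locks in an initial bias.

```python
import random

random.seed(0)  # reproducible toy run

def predicted_patrols(history, patrols_per_day=100):
    """'Prediction' step: allocate today's patrols in proportion to where
    patrols (and therefore recorded incidents) have already been."""
    total = sum(history.values())
    return {hood: round(patrols_per_day * count / total)
            for hood, count in history.items()}

def simulate(history, true_rate, days=365):
    """Toy feedback loop: patrols -> recorded incidents -> more history -> patrols."""
    history = dict(history)
    for _ in range(days):
        for hood, patrols in predicted_patrols(history).items():
            # Recorded incidents scale with patrol presence, not with any
            # difference in the underlying crime rate (identical here).
            recorded = sum(random.random() < true_rate[hood] for _ in range(patrols))
            history[hood] += recorded
    return history

# Two neighborhoods with identical underlying crime but an unequal,
# historically biased patrol record to start from.
start = {"A": 90, "B": 10}
rate = {"A": 0.3, "B": 0.3}
print(simulate(start, rate))
# The 90/10 patrol split never corrects itself, and the absolute gap in
# recorded incidents keeps growing: the system reproduces its own bias.
```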

These communities weren’t just targeted more—in some cases, they were targeted relentlessly. Crimes were predicted every day, sometimes multiple times a day, sometimes in multiple locations in the same neighborhood: thousands upon thousands of crime predictions over years.

That's the aggregate. . . 

=========================================================================

INSERT

3 weeks ago · ... software company specializing in predictive policing (hence the name, which it has since changed to Geolitica) through machine learning.
4 weeks ago · That may sound shocking, but really, it's par for the course with PredPol, which rebranded itself as Geolitica in order to distance itself...
5 months ago · Santa Cruz-based Geolitica, which helped popularize the term “predictive policing,” changed its name in March to better reflect how police...
5 days ago · But these systems have always been found to be affected by cognitive bias. The systems developed by US pioneer PredPol (today Geolitica), for...
9 months ago · Geolitica claims to avoid the bias programmed into individual risk assessment softwares by excluding personal characteristics. Its data is...
3 months ago · As part of PredPol remodeling itself as Geolitica, the company is reframing its forecasting software as a tool for greater police...
1 month ago · ... introduced DICFP, PredPol changed its name to Geolitica. ... Geolitica now boasts “data-driven community policing” that helped public...
2 months ago · This is why we changed our name from PredPol to Geolitica earlier this year," said CEO Brian MacDonald. But police use of AI technology is...
4 weeks ago · We sent our methodology as well as the underlying data to PredPol, which renamed itself Geolitica earlier this year.

=========================================================================

[..] That's not policing. That's oppression. Both law enforcement and a percentage of the general public still believe cops are capable of preventing crime, even though that has never been a feature of American law enforcement. PredPol software leans into this delusion, building on bad assumptions fueled by biased data to claim that data-based policing can convert police omnipresence into crime reduction. The reality is far more dire: residents in over-policed areas are confronted, detained, or rung up on bullshit charges with alarming frequency. And this data gets fed back into the software to generate more of the same abuse.

None of this seems to matter to the law enforcement agencies paying for this software with federal and local tax dollars. Only one law enforcement official -- Elgin (IL) PD's deputy police chief -- called the software "bias by proxy." For everyone else, it was law enforcement business as usual.

That also goes for the company supplying the software. PredPol -- perhaps recognizing some people might assume the "Pred" stands for "Predatory" -- rebranded to the much more banal "Geolitica" earlier this year.

[image: www.santacruztechbeat.com/wp-content/uploads/2021/...]

The logo swap doesn't change the underlying algorithms, which have accurately predicted biased policing will result in more biased policing.

When confronted with the alarming findings following Gizmodo's and The Markup's examination of Geolitica predictive policing data, the company's first move was to claim (hilariously) that data found on unsecured servers couldn't be trusted.

PredPol, which renamed itself Geolitica in March, criticized our analysis as based on reports “found on the internet.”

Finding an unsecured server with data isn't the same thing as finding someone's speculative YouTube video about police patrol habits. What makes this bizarre accusation about the supposed inherent untrustworthiness of the data truly laughable is Geolitica's follow-up:

But the company did not dispute the authenticity of the prediction reports, which we provided, acknowledging that they “appeared to be generated by PredPol.”

Geolitica says everything is good.

[image: geolitica.com/assets/images/blog_img/phm_boxes_map...]

Its customers aren't so sure.

> Gizmodo received responses from 13 of 38 departments listed in the data and most sent back written statements that they no longer used PredPol. That includes the Los Angeles Police Department, an early adopter that sent PredPol packing after discovering it was more effective at generating lawsuits and complaints from residents than actually predicting or preventing crime.

This report -- which is extremely detailed and well worth reading in full -- shows PredPol is just another boondoggle, albeit one that's able to take away people's freedoms along with their tax dollars. Until someone's willing to build a system that doesn't consider all cop data to be created equal, so-called "smart" policing is just putting a shiny tech sheen on old-school cop work that relies on harassing minorities to generate biased busywork for police officers."

Filed Under: biased policing, garbage in, garbage out, police tech, predictive policing, predpol
Companies: geolitica 

__________________________________________________________________________

RELATED CONTENT

Aug 13, 2021 · Geolitica was founded in 2012 with the goal of bringing greater transparency and accountability to policing through the use of objective data
 
> Geolitica stories at Techdirt.
www.techdirt.com › blog › company=geolitica
 
11 hours ago · stories about: "geolitica". Unsecured Data Leak Shows Predictive Policing Is Just Tech-Washed, ...

About Geolitica℠

Trusted Services for Safer Communities

We run operations for public safety teams to be more transparent, accountable, and effective.

Effectiveness

Identify highest risk locations to patrol

Add points of interest (POIs) or custom boxes using SARA, Risk-Terrain Modeling, or your agency's unique intelligence

Proactively patrol to reduce crime rates and victimization

Geolitica is currently being used to help protect roughly one out of every 30 people in the United States

Accountability

Analyze daily patrol patterns

Manage patrol operations in real time

Create patrol heat maps

Identify resource hotspots 

Elon Musk JUST ANNOUNCED Tesla's EXTRAORDINARY NEW UPGRADED Bot!

Monday, December 27, 2021

Why Sex Education Is So Bad In The U.S.

'Bloomberg Surveillance: Early Edition' Full Show (12/27/2021)

JUST IN: Biden, National Governor's Association Discuss Omicron Response

DuckDuckGo may be the search engine for you. . . | Bleeping Computer

Privacy-focused search engine DuckDuckGo grew by 46% in 2021

"The privacy-focused search engine DuckDuckGo continues to grow rapidly, with the company now averaging over 100 million daily search queries and growing by almost 47% in 2021.

Unlike other search engines, DuckDuckGo says they do not track your searches or your behavior on other sites. Instead of building user profiles used to display interest-based ads, DuckDuckGo search pages display contextual advertisements based on the searched keywords.

This means that if you search on DuckDuckGo for a television, that search query will not be used to display television ads at every other site you visit.
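As an aside, the difference between contextual and profile-based ad selection can be sketched in a few lines of hypothetical Python; the keyword table and function names below are invented for illustration and are not DuckDuckGo's actual ad system:

```python
# Hypothetical illustration only -- not DuckDuckGo's real ad pipeline.
CONTEXTUAL_ADS = {
    "television": ["4K TV sale", "soundbar bundle"],
    "running shoes": ["trail-shoe clearance"],
}

def contextual_ads(query):
    """Pick ads from the current query's keywords only; nothing about the
    user is remembered between searches."""
    q = query.lower()
    return [ad for keyword, ads in CONTEXTUAL_ADS.items() if keyword in q
            for ad in ads]

def profile_based_ads(profile):
    """Contrast: behavioral targeting keys off an accumulated interest
    profile -- the per-user history DuckDuckGo says it does not build."""
    return [ad for interest in profile.get("interests", [])
            for ad in CONTEXTUAL_ADS.get(interest, [])]

print(contextual_ads("best television for gaming"))  # ads follow this query...
print(contextual_ads("weather tomorrow"))            # ...and not you, afterwards
```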

Furthermore, to build their search index, the search engine uses the DuckDuckBot spider to crawl sites and receive data from partners, such as Wikipedia and Bing. However, they do not build their index using data from Google.

DuckDuckGo shows rapid growth

While Google remains the dominant search platform, DuckDuckGo has seen impressive year-over-year growth.

> In 2020, DuckDuckGo received 23.6 billion total search queries and achieved a daily average of 79 million search queries by the end of December.

> In 2021, DuckDuckGo received 34.6 billion total search queries so far and currently has an average of 100 million search queries per day, showing a 46.4% growth for the year.
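A quick back-of-the-envelope check of those figures (a reader's calculation, not from the article; the small gap from the cited 46.4% presumably comes from rounding in the reported totals):

```python
# Growth implied by the quoted (rounded) totals.
total_2020 = 23.6e9   # total search queries, 2020
total_2021 = 34.6e9   # total search queries, 2021 year to date
print(f"total-query growth: {(total_2021 - total_2020) / total_2020:.1%}")    # ~46.6%

daily_2020 = 79e6     # average daily queries, end of 2020
daily_2021 = 100e6    # average daily queries, late 2021
print(f"daily-average growth: {(daily_2021 - daily_2020) / daily_2020:.1%}")  # ~26.6%
```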

While DuckDuckGo's growth is considerable, it still only has 2.53% of the total market share, with Yahoo at 3.3%, Bing at 6.43%, and Google holding a dominant share of 87.33% of search engine traffic in the USA.

However, as people continue to become frustrated with how their data is being used by tech giants like Google, Facebook, Microsoft, and Apple, we will likely see more people switch to privacy-focused search engines.

To further help users protect their privacy, DuckDuckGo released an email forwarding service in 2021 called 'Email Protection' that strips email trackers and allows you to protect your actual email address.

They also introduced 'App Tracking Protection for Android,' which blocks third-party trackers from Google and Facebook found in apps.

More recently, DuckDuckGo announced they are releasing a DuckDuckGo Privacy Browser for Desktop that will not be based on Chromium and will be built from scratch.

"No complicated settings, no misleading warnings, no "levels" of privacy protection – just robust privacy protection that works by default, across search, browsing, email, and more," explains a recent blog post about the new browser.

"It's not a "privacy browser"; it's an everyday browsing app that respects your privacy because there's never a bad time to stop companies from spying on your search and browsing history."

For those looking to take back control of their data and add more privacy to their search behavior, DuckDuckGo may be the search engine for you."

AUTO-CONTROL: Your Ride is a Streaming Catch of Profitable Data-Points

Intro: HAVE NO FEAR that tech companies will soon be doing to cars what they did to phones: Tying their exclusive operating systems to specific products to force out competitors and dominate a huge swath of the global economy.
Now, having missed the boat as the tech giants cornered the market on smartphones, some policymakers and regulators believe the battle over connected cars represents a chance to block potential monopolies before they form.
By LEAH NYLEN

". . .Indeed, the smartphone wars are over, and Google and Apple won. Now they — and Amazon — are battling to control how you operate within your car. All three see autos as the next great opportunity to reach American consumers, who spend more time in the driver’s seat than anywhere outside their home or workplace. And automakers, after years of floundering to incorporate cutting-edge technologies into cars on their own, are increasingly eager for Silicon Valley’s help — hoping to adopt both its tech and its lucrative business models where consumers pay monthly for ongoing services instead of shelling out for a product just once. . .

The stakes are enormous. Tech companies and the automakers envision a future where riders can seamlessly blend work, play and chores, easily ordering groceries, scheduling work meetings or watching TV from the comfort of their cars. The data coming off those vehicles also could automatically update maps, notify city workers about potholes and tell brick-and-mortar retailers where customers travel from.

[...]

Brand loyalty or monopoly?

While Silicon Valley and automakers are thrilled about the future of connected and autonomous cars, regulators and privacy advocates are less so.

“These companies have an amount of data on us that they shouldn’t have, and they have a history of not using it in responsible ways,” said Katharine Trendacosta of the digital civil liberties group Electronic Frontier Foundation. “They have a history of going back on promises they have made about that data.”

[...] READ IN-BETWEEN-THE-LINES https://www.politico.com/news/magazine/2021/12/27/self-driving-car-big-tech-monopoly-525867

“I know it’s hard to see in the future and difficult to make guesses about what companies should be allowed to do with technology that doesn’t exist. But we know what they are doing with things that already exist,” Trendacosta said. “If we had rules that carry forward to whatever they make in the future, we’d be in a better place.”