Tuesday, December 28, 2021

IT HAPPENED! Elon Musk FINALLY Reveals Artificial Gravity Starship 2021!

2021 BIG $CORE$$$$$$ FOR ENTERTAINMENT... | Forbes

Here's some 'news' from Lisette Voytko, writing in Forbes:

From ‘South Park’ To Springsteen: The Entertainment Industry’s 2021 Scorecard

"Eyebrows were raised when, just six days into the year, Neil Young announced that he had spun “Heart of Gold” and the rest of his song catalog into actual gold by selling them to song management company Hipgnosis for $150 million. Coming on the heels of Bob Dylan’s similar, $300 million deal made just a few weeks earlier, it seemed clear that it was just the beginning of an entertainment bull market.

Then came the March announcement that Paul Simon was selling his song catalog for $250 million, followed by sales involving members of Fleetwood Mac, heirs to the Prince estate and the Red Hot Chili Peppers. The year ended on an even higher note, when news broke that Bruce Springsteen had reportedly sold his catalog to Sony Music in December for at least half a billion dollars, in apparently the biggest deal ever for the life's work of an individual artist.

The music sales were just the tip of a massive intellectual property iceberg that soared in value during 2021. In May, Friends once again cashed in on a show that has funneled more than $1.4 billion to the cast and creators as they switched loyalties from Netflix to HBO Max. Then Viacom signed a deal to pay the creators of South Park more than $900 million over a number of years.

And some say the pop-culture horse trading is just getting started, so look for plenty more drama in 2022. Until then, here’s a look at 2021’s highlights:
Reese Witherspoon cashed out.
Rihanna and Kim Kardashian West became billionaires.
And it was “Glory Days” for Bruce.
But not all stars glittered in a year of massive deals.  
 

With Friends Like These...

The actors and creative minds behind Friends, one of network TV’s most beloved sitcoms, made a big splash with their reunion show, which aired on HBO Max in May. The cast earned an estimated $5 million each for their appearance. That’s just the icing on the cake: altogether, Forbes estimates the show has brought in more than $1 billion in pretax earnings since its 1994 premiere; the cast earned $816 million pretax, or $136 million each.

Bilbo Baggins’ Billions 

Lord of the Rings director Peter Jackson’s foray into visual effects through his shop Weta Digital paid off big time. The New Zealand-based firm, which he cofounded to provide effects for Heavenly Creatures (1994), grew into a Hollywood powerhouse, providing computer-generated imagery for everything from Wolverine (2013) to the 2019 remake of Lady and the Tramp. In November, Weta announced that a chunk of its assets was being sold for $1.6 billion to Unity Software, which makes software for video games. Forbes conservatively estimates the deal officially makes Jackson a billionaire.

Cuomo Non Grata

It wasn’t a great year for the Cuomo family—Andrew Cuomo resigned in August as New York governor amid fallout from his sexual harassment scandal—while younger brother Chris Cuomo was fired from his CNN anchor job in December after legal documents revealed the extent to which he aided Andrew in his battle against the harassment allegations—seen as an unforgivable breach of journalistic ethics. 

Gone, But Not Forgotten 

Even after death, success in show business can still command a marquee price tag. In September, 31 years after the creator of Charlie and the Chocolate Factory died of cancer at age 74, Netflix paid a reported $684 million for the Roald Dahl Story Company and rights to the British novelist’s stories."

 

 

The Batman - Official Trailer #3 (2022) Robert Pattinson, Zoe Kravitz, ...

YOU ARE INVITED --> Go Read This

Intro: 
It's NOT A SURPRISE to read this finding in an excerpt: "Besides supposedly predicting individual crimes, a 2018 report by The Verge looked into Pentagon-funded research by PredPol’s founder Jeff Brantingham about using the software to predict gang-related crime.
The former University of California-Los Angeles anthropology professor adapted earlier research in forecasting battlefield casualties in Iraq to create the platform, and the paper — “Partially Generative Neural Networks for Gang Crime Classification with Partial Information” — raised concern over the ethical implications."

Go read this data analysis that uncovers predictive policing’s flawed algorithm

A co-reported investigation with Gizmodo and The Markup into PredPol

"Gizmodo released a deep-dive look into the data collection process behind its co-reported investigation with The Markup into PredPol, a software company specializing in predictive policing (hence the name, which it has since changed to Geolitica) through machine learning.

PredPol’s algorithm is supposed to make predictions based on existing crime reports. However, since crimes aren’t equally reported everywhere, the readings it provides to law enforcement could simply copy the biases in reporting over each area. If police use this to decide where to patrol, they could end up over-policing areas that don’t need a larger presence.

When Gizmodo and The Markup evaluated the areas, they found that the places PredPol’s software targeted for increased patrols “were more likely to be home to Blacks, Latinos, and families that would qualify for the federal free and reduced lunch program.”
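The kind of comparison Gizmodo and The Markup describe can be sketched in a few lines. Everything below (block IDs, counts, demographic shares, the 100-prediction threshold) is invented for illustration; the real analysis joined millions of prediction records against census data.

```python
# Toy sketch of a Gizmodo/The Markup-style comparison: join per-block
# prediction counts with demographic shares, then compare the blocks the
# software targeted against those it left alone. All data is invented.
blocks = [
    # (block_id, predictions_received, minority_share, free_lunch_share)
    ("0101", 1200, 0.81, 0.74),
    ("0102",  950, 0.77, 0.69),
    ("0201",    0, 0.22, 0.18),
    ("0202",    3, 0.30, 0.25),
]

def mean(xs):
    return sum(xs) / len(xs)

targeted   = [b for b in blocks if b[1] > 100]   # heavily predicted blocks
untargeted = [b for b in blocks if b[1] <= 100]  # rarely/never predicted

print("minority share, targeted blocks:  ", mean([b[2] for b in targeted]))
print("minority share, untargeted blocks:", mean([b[2] for b in untargeted]))
```

With these made-up rows, the targeted blocks average a far higher minority share than the untargeted ones, which is the shape of the disparity the investigation reported.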

Even as police tactics evolved to include crime and arrest data, there has been a historical disparity in how these tactics affect communities of color. As Gizmodo points out in its analysis, in New York in the 1990s, researchers at the time found that the methods reduced crime without simply displacing it to other areas. However, the approach included tactics like stop-and-frisk, which have been criticized as violations of civil rights.

PredPol’s algorithm has already been looked into and criticized by academics more than once. As Vice quoted Suresh Venkatasubramanian, a member of the board of directors for ACLU Utah, in 2019:

“Because this data is collected as a by-product of police activity, predictions made on the basis of patterns learned from this data do not pertain to future instances of crime on the whole,” Venkatasubramanian’s study notes. “In this sense, predictive policing is aptly named: it is predicting future policing, not future crime.”

Still, there hasn’t been an investigation as thorough as this one. This investigation used figures retrieved from public data available via the web. According to Gizmodo and The Markup, they found an unsecured cloud database linked from the Los Angeles Police Department’s website. That data contained millions of predictions stretching back several years.

Besides supposedly predicting individual crimes, a 2018 report by The Verge looked into Pentagon-funded research by PredPol’s founder Jeff Brantingham about using the software to predict gang-related crime. The former University of California-Los Angeles anthropology professor adapted earlier research in forecasting battlefield casualties in Iraq to create the platform, and the paper — “Partially Generative Neural Networks for Gang Crime Classification with Partial Information” — raised concern over the ethical implications.

Critics said this approach could do more harm than good. “You’re making algorithms off a false narrative that’s been created for people — the gang documentation thing is the state defining people according to what they believe ... When you plug this into the computer, every crime is gonna be gang-related,” activist Aaron Harvey told The Verge. . .

[...] Relying on some algorithms can work magic for some industries, but their impact can come at a real human cost. With bad data or the wrong parameters, things can go wrong quickly, even in circumstances that are less fraught than policing. Look no further than Zillow recently having to shut down its house-flipping operation after losing hundreds of millions of dollars despite the “pricing models and automation” it thought would provide an edge. . .

Overall, Gizmodo and The Markup’s reporting is a good consideration of how significantly predictive algorithms can affect the people unknowingly targeted by them. The accompanying analysis by Gizmodo provides relevant data insight while giving readers a behind-the-scenes look into these measures.

The report indicates that 23 of the 38 law enforcement agencies tracked are no longer PredPol customers, despite initially signing up for it to help distribute crime-stopping resources. Perhaps by using methods that build transparency and trust on both sides, law enforcement could spend less time on tech that leads to pieces like this, which highlights the exact opposite approach."

Blastoff! 36 Oneweb satellites launch atop Soyuz rocket

The World Ahead 2022: five stories to watch out for | The Economist

PREDICTIVE POLICING (PredPol) is Racist Policing...Confirmation Bias Built Into The System | TechDirt

Mis-Uses of Technology - latest update from Tim Cushing, published yesterday:
"Don't kid yourselves, techbros. Predictive policing is regular policing, only with confirmation bias built in. The only question for citizens is whether or not they want to pay tech companies millions to give them the same racist policing they've been dealing with since policing began.

Unsecured Data Leak Shows Predictive Policing Is Just Tech-Washed, Old School Biased Policing

from the hey-that-data-isn't-accurate-says-AI-company-confirming-data's-accuracy dept

 
The data they obtained reinforces everything that's been reported about this form of "smarter" policing, confirming its utility as a law enforcement echo chamber that allows cops to harass more minorities because that's always what they've done in the past.

Between 2018 and 2021, more than one in 33 U.S. residents were potentially subject to police patrol decisions directed by crime-prediction software called PredPol.

The company that makes it sent more than 5.9 million of these crime predictions to law enforcement agencies across the country—from California to Florida, Texas to New Jersey—and we found those reports on an unsecured server.

Gizmodo and The Markup analyzed them and found persistent patterns.

Residents of neighborhoods where PredPol suggested few patrols tended to be Whiter and more middle- to upper-income. Many of these areas went years without a single crime prediction.

By contrast, neighborhoods the software targeted for increased patrols were more likely to be home to Blacks, Latinos, and families that would qualify for the federal free and reduced lunch program.

Targeted more? In some cases, that's an understatement. Predictive policing algorithms compound existing problems. If cops patrolled neighborhoods mainly populated by minorities frequently in the past due to biased pre-predictive policing habits, the introduction of that data into the system returns "predictions" that "predict" more crime to be committed in areas where officers have most often been located historically.

The end result is what you see summarized above: non-white neighborhoods receive the most police attention, resulting in more data to feed to the machine, which results in more outputs that say cops should do the same thing they've been doing for decades more often. Run this feedback loop through enough iterations and it results in the continued infliction of misery on certain members of the population.
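That feedback loop is easy to reproduce in a toy simulation. Note the numbers here (crime rate, patrol counts, rounds) are invented, and this is not PredPol's actual model: two areas share the same true crime rate, but patrols are reallocated each round in proportion to previously observed reports.

```python
# Toy feedback-loop simulation: areas A and B have the SAME underlying
# crime rate, but A starts with 4x the patrols. Because crimes are only
# observed where patrols go, allocating patrols by observed reports
# concentrates them further each round. All numbers are invented.
import random

random.seed(0)

TRUE_RATE = 0.1                  # identical in both areas
patrols  = {"A": 80, "B": 20}    # historical bias in initial deployment
observed = {"A": 0, "B": 0}      # cumulative reports fed back to the model

for _round in range(20):
    # Each patrol observes a crime with the same probability in either area.
    for area, n in patrols.items():
        for _ in range(n):
            if random.random() < TRUE_RATE:
                observed[area] += 1
    # "Prediction" step: next round's 100 patrols follow past reports.
    total = observed["A"] + observed["B"]
    if total:
        patrols["A"] = round(100 * observed["A"] / total)
        patrols["B"] = 100 - patrols["A"]

print("observed reports:", observed)
print("final patrol split:", patrols)
```

Despite identical true rates, the area that started with more patrols ends up with most of the observed reports, and therefore most of the future patrols: the echo chamber described above.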

These communities weren’t just targeted more—in some cases, they were targeted relentlessly. Crimes were predicted every day, sometimes multiple times a day, sometimes in multiple locations in the same neighborhood: thousands upon thousands of crime predictions over years.

That's the aggregate. . . 

=========================================================================

INSERT

3 weeks ago · ... software company specializing in predictive policing (hence the name, which it has since changed to Geolitica) through machine learning.
4 weeks ago · That may sound shocking, but really, it's par for the course with PredPol, which rebranded itself as Geolitica in order to distance itself...
5 months ago · Santa Cruz-based Geolitica, which helped popularize the term “predictive policing,” changed its name in March to better reflect how police...
5 days ago · But these systems have always been found to be affected by cognitive bias. The systems developed by US pioneer PredPol (today Geolitica), for...
9 months ago · Geolitica claims to avoid the bias programmed into individual risk assessment softwares by excluding personal characteristics. Its data is...
3 months ago · As part of PredPol remodeling itself as Geolitica, the company is reframing its forecasting software as a tool for greater police...
1 month ago · ... introduced DICFP, PredPol changed its name to Geolitica. ... Geolitica now boasts “data-driven community policing” that helped public...
2 months ago · This is why we changed our name from PredPol to Geolitica earlier this year," said CEO Brian MacDonald. But police use of AI technology is...
4 weeks ago · We sent our methodology as well as the underlying data to PredPol, which renamed itself Geolitica earlier this year.

=========================================================================

[..] That's not policing. That's oppression. Both law enforcement and a percentage of the general public still believe cops are capable of preventing crime, even though that has never been a feature of American law enforcement. PredPol software leans into this delusion, building on bad assumptions fueled by biased data to claim that data-based policing can convert police omnipresence into crime reduction. The reality is far more dire: residents in over-policed areas are confronted, detained, or rung up on bullshit charges with alarming frequency. And this data gets fed back into the software to generate more of the same abuse.

None of this seems to matter to law enforcement agencies paying for this software with federal and local tax dollars. Only one law enforcement official -- Elgin (IL) PD's deputy police chief -- called the software "bias by proxy." For everyone else, it was law enforcement business as usual.

That also goes for the company supplying the software. PredPol -- perhaps recognizing some people might assume the "Pred" stands for "Predatory" -- rebranded to the much more banal "Geolitica" earlier this year.


The logo swap doesn't change the underlying algorithms, which have accurately predicted biased policing will result in more biased policing.

When confronted with the alarming findings following Gizmodo's and The Markup's examination of Geolitica predictive policing data, the company's first move was to claim (hilariously) that data found on unsecured servers couldn't be trusted.

PredPol, which renamed itself Geolitica in March, criticized our analysis as based on reports “found on the internet.”

Finding an unsecured server with data isn't the same thing as finding someone's speculative YouTube video about police patrol habits. What makes this bizarre accusation about the supposed inherent untrustworthiness of the data truly laughable is Geolitica's follow-up:

But the company did not dispute the authenticity of the prediction reports, which we provided, acknowledging that they “appeared to be generated by PredPol.”

Geolitica says everything is good.

Its customers aren't so sure.

> Gizmodo received responses from 13 of 38 departments listed in the data and most sent back written statements that they no longer used PredPol. That includes the Los Angeles Police Department, an early adopter that sent PredPol packing after discovering it was more effective at generating lawsuits and complaints from residents than actually predicting or preventing crime.

This report -- which is extremely detailed and well-worth reading in full -- shows PredPol is just another boondoggle, albeit one that's able to take away people's freedoms along with their tax dollars. Until someone's willing to build a system that doesn't consider all cop data to be created equally, so-called "smart" policing is just putting a shiny tech sheen on old-school cop work that relies on harassing minorities to generate biased busywork for police officers."

Filed Under: biased policing, garbage in, garbage out, police tech, predictive policing, predpol
Companies: geolitica 

__________________________________________________________________________

RELATED CONTENT


Aug 13, 2021 · Geolitica was founded in 2012 with the goal of bringing greater transparency and accountability to policing through the use of objective data
 
> Geolitica stories at Techdirt.
www.techdirt.com › blog › company=geolitica
 
11 hours ago · stories about: "geolitica". Unsecured Data Leak Shows Predictive Policing Is Just Tech-Washed, ...

About Geolitica℠

Trusted Services
for Safer Communities

We run operations for public safety teams to be more transparent, accountable, and effective.

Effectiveness

Identify highest risk locations to patrol

Add points of interest (POI’s) or custom boxes using SARA, Risk-Terrain Modeling, or your agency's unique intelligence

Proactively patrol to reduce crime rates and victimization

Geolitica is currently being used to help protect roughly one out of every 30 people in the United States

Accountability

Analyze daily patrol patterns

Manage patrol operations in real time

Create patrol heat maps

Identify resource hotspots 
