#predictivepolicing

"Watching you – Die Welt von Palantir und Alex Karp" (2024)

As if "Minority Report" had replaced thinking in German government offices. Klaus Stern's documentary introduces the man who sells this vision, and who, along with it, makes a technoid version of total surveillance socially acceptable. The bitter irony is that the film itself becomes part of this normalisation: it shows what it fails to criticise. That is its biggest problem. (ARD, new)

NexxtPress · "Watching you – Die Welt von Palantir und Alex Karp" (2024)
#alexkarp #ard #bayern

First the police, now Big Tech wants to put 'crime-predicting' tech in UK probation services.

A lack of transparency and reliance on flawed data means that institutional racism will be hardwired into the justice system.

All at the expense of dignity and rights.

theguardian.com/society/2025/j

The Guardian · Tech firms suggested placing trackers under offenders’ skin at meeting with justice secretary · By Robert Booth

"Predictive policing technologies infringe human rights “at their heart” and should be prohibited in the UK, argues Green MP Siân Berry, after tabling an amendment to the government’s forthcoming Crime and Policing Bill.

Speaking in the House of Commons during the report stage of the bill, Berry highlighted the dangers of using predictive policing technologies to assess the likelihood of individuals or groups committing criminal offences in the future.

“Such technologies, however cleverly sold, will always need to be built on existing, flawed police data, or data from other flawed and biased public and private sources,” she said. “That means that communities that have historically been over-policed will be more likely to be identified as being ‘at risk’ of future criminal behaviour.”"

computerweekly.com/news/366626

ComputerWeekly.com · MPs propose ban on predictive policing · By Sebastian Klovig Skelton
Continued thread

#KI at the #Polizei – #Vera, take over! A murder could be about to happen here!

The Bavarian #Polizei wants to use software from the US company #Palantir to prevent crimes before they are committed. Does that make everyone a suspect?

'"Verfahrensübergreifende Recherche- und Analyseplattform" (cross-case research and analysis platform) is the name of the software in question. VeRA, an abbreviation that brings tears of rage to the eyes of data protection advocates. A tool the Stasi could only have dreamed of, they say. Bavaria has been using it for around nine months.

Detectives want to use the software for "predictive policing", that is, prediction-based police work.'

zeit.de/2025/26/ki-polizei-bay

DIE ZEIT · KI bei der Polizei: A murder could be about to happen here · The Bavarian police wants to use software from the US company Palantir to prevent crimes before they are committed. Does that make everyone a suspect?

"At their heart, these technologies infringe human rights."

Last week @sianberry tabled an amendment to the UK Crime and Policing Bill that would prohibit the use and deployment of dangerous 'crime-predicting' police tech.

These systems will subject overpoliced communities to more surveillance. More discrimination. More injustice.

Sign the petition to BAN it ➡️ you.38degrees.org.uk/petitions

Oops! AI did it again... you're not that innocent.

Nectar, a 'crime-predicting' system developed with #Palantir, could be rolled out nationally after a pilot with Bedfordshire police (UK).

Data such as race, sex life, trade union membership, philosophical beliefs and health are used to 'predict' criminality so people can be targeted for #surveillance.

inews.co.uk/news/police-use-co

The i Paper · Police use controversial AI tool that looks at people’s sex lives and beliefs · Senior MPs and privacy campaigners have expressed alarm at the deployment of Palantir’s AI-powered crime-fighting software with access to sensitive personal information
Replied in thread

@heidilifeldman

#USpol #TheBrownSpiderWeb

(2/n)

👉2025 is set to become #1933 and "#1984" at the same time.👈
With the real #KingMaker's (#PeterThiel) #spyware and #surveillance products (#Palantir,) 2026 is set to add a next-generation ingredient to #Fascism: #PredictivePolicing, a brand new way in RL to persecute #ThoughtCrime.

@heidilifeldman

That said, I agree 100% with the excellent @guardian article:

theguardian.com/commentisfree/

At #DOGE "...#AI is...

Replied in thread

@tg9541 @mattotcha

#UKpol #UKpolitics
#Precrime #ThoughtCrime #FreeSpeech #PeacefulProtest
#CivilRights #Legal

👉A friendly warning to the #Starmer Government👈

(3/n)

... advent of #PredictivePolicing and the continuing crackdown on the right to #PeacefulProtest in the #UK, the #Starmer government seems to be following down that road.

👉The despicable use of anti-terror force by 30 #policemen in a place of #worship in #London against six young women👈 discussing...

‘Predictive’ policing tools in France are flawed, opaque, and dangerous.

A new report from @LaQuadrature, now available in English as part of a Statewatch-coordinated project, lays out the risks in detail.

The report finds that these systems reinforce discrimination, evade accountability, and threaten fundamental rights. La Quadrature is calling for a full ban—and we support them.

📄 Read more and access the full report: statewatch.org/news/2025/may/f

Continued thread

How algorithms in #Deutschland are supposed to "foresee" crimes #PredictivePolicing

"The report "Automating Injustice" examines selected systems that are being developed or used by police, law-enforcement authorities, and prisons in Germany. It also analyses publicly available information about such practices in order to explain how these systems work, what data they use, why they can lead to greater discrimination, and why they pose a general threat to fundamental rights..."

algorithmwatch.org/de/predicti via @algorithmwatch

AlgorithmWatch · Automated police work: How algorithms in Germany are supposed to "foresee" crimes · Police, law-enforcement authorities, and prisons in Germany are increasingly trying to digitally "predict" and "prevent" crimes. The report "Automating Injustice" provides an overview of such algorithmic systems developed and deployed in Germany.

"Alexander, more than midway through a 20-year prison sentence on drug charges, was making preparations for what he hoped would be his new life. His daughter, with whom he had only recently become acquainted, had even made up a room for him in her New Orleans home.

Then, two months before the hearing date, prison officials sent Alexander a letter informing him he was no longer eligible for parole.

A computerized scoring system adopted by the state Department of Public Safety and Corrections had deemed the nearly blind 70-year-old, who uses a wheelchair, a moderate risk of reoffending, should he be released. And under a new law, that meant he and thousands of other prisoners with moderate or high risk ratings cannot plead their cases before the board. According to the department of corrections, about 13,000 people — nearly half the state’s prison population — have such risk ratings, although not all of them are eligible for parole.

Alexander said he felt “betrayed” upon learning his hearing had been canceled. “People in jail have … lost hope in being able to do anything to reduce their time,” he said.

The law that changed Alexander’s prospects is part of a series of legislation passed by Louisiana Republicans last year reflecting Gov. Jeff Landry’s tough-on-crime agenda to make it more difficult for prisoners to be released."
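The rule the excerpt describes (a "moderate" or "high" risk rating now bars a prisoner from pleading before the parole board at all) reduces to a simple conditional layered on top of the opaque scoring system. A minimal Python sketch with hypothetical names, since the state's actual scoring tool is not public:

```python
# Hypothetical sketch of the eligibility rule described in the ProPublica
# excerpt above: under the new Louisiana law, only prisoners whose
# algorithmic risk rating is "low" (and who are otherwise parole-eligible)
# may still appear before the board. Function and parameter names are
# invented for illustration.

def may_request_parole_hearing(risk_rating: str, otherwise_eligible: bool) -> bool:
    """Return True only for otherwise-eligible prisoners rated 'low' risk."""
    return otherwise_eligible and risk_rating == "low"

# Alexander's situation: otherwise eligible, but rated "moderate" risk.
print(may_request_parole_hearing("moderate", True))  # prints: False
```

The point the article makes is visible in the sketch: the hearing itself is gated on the rating, so there is no forum left in which to contest the score.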

propublica.org/article/tiger-a

ProPublica · An Algorithm Deemed This Nearly Blind 70-Year-Old Prisoner a “Moderate Risk.” Now He’s No Longer Eligible for Parole.

"The UK government is developing a “murder prediction” programme which it hopes can use personal data of those known to the authorities to identify the people most likely to become killers.

Researchers are alleged to be using algorithms to analyse the information of thousands of people, including victims of crime, as they try to identify those at greatest risk of committing serious violent offences.

The scheme was originally called the “homicide prediction project”, but its name has been changed to “sharing data to improve risk assessment”. The Ministry of Justice hopes the project will help boost public safety but campaigners have called it “chilling and dystopian”."

theguardian.com/uk-news/2025/a

The Guardian · UK creating ‘murder prediction’ tool to identify people most likely to kill · By Vikram Dodd