101010.pl is one of the many independent Mastodon servers you can use to participate in the fediverse.
101010.pl is the oldest Polish Mastodon server. We support posts of up to 2,048 characters.

Server stats: 489 active users

#predictivepolicing


First the police, now Big Tech wants to put 'crime-predicting' tech in UK probation services.

A lack of transparency and reliance on flawed data means that institutional racism will be hardwired into the justice system.

All at the expense of dignity and rights.

theguardian.com/society/2025/j

The Guardian · Tech firms suggested placing trackers under offenders’ skin at meeting with justice secretary, by Robert Booth

"At their heart, these technologies infringe human rights."

Last week @sianberry tabled an amendment to the UK Crime and Policing Bill that would prohibit the use and deployment of dangerous 'crime-predicting' police tech.

These systems will subject overpoliced communities to more surveillance. More discrimination. More injustice.

Sign the petition to BAN it ➡️ you.38degrees.org.uk/petitions

Oops! AI did it again... you're not that innocent.

Nectar, a 'crime-predicting' system developed with #Palantir, could be rolled out nationally after a pilot with Bedfordshire police (UK).

Data such as race, sex life, trade union membership, philosophical beliefs and health are used to 'predict' criminality so people can be targeted for #surveillance.

inews.co.uk/news/police-use-co

The i Paper · Police use controversial AI tool that looks at people’s sex lives and beliefs. Senior MPs and privacy campaigners have expressed alarm at the deployment of Palantir’s AI-powered crime-fighting software with access to sensitive personal information.
Replied in thread

@heidilifeldman

#USpol #TheBrownSpiderWeb

(2/n)

👉2025 is set to become #1933 and "#1984" at the same time.👈
With the real #KingMaker's (#PeterThiel) #spyware and #surveillance products (#Palantir), 2026 is set to add a next-generation ingredient to #Fascism: #PredictivePolicing, a brand-new way to persecute #ThoughtCrime in real life.

@heidilifeldman

That said, I agree 100% with the excellent @guardian article:

theguardian.com/commentisfree/

At #DOGE "...#AI is...

Replied in thread

@tg9541 @mattotcha

#UKpol #UKpolitics
#Precrime #ThoughtCrime #FreeSpeech #PeacefulProtest
#CivilRights #Legal

👉A friendly warning to the #Starmer Government👈

(3/n)

... advent of #PredictivePolicing and the continuing crackdown on the right to #PeacefulProtest in the #UK, the #Starmer government seems to be following down that road.

👉The despicable use of anti-terror force by 30 #policemen in a place of #worship in #London against six young women👈 discussing...

‘Predictive’ policing tools in France are flawed, opaque, and dangerous.

A new report from @LaQuadrature, now available in English as part of a Statewatch-coordinated project, lays out the risks in detail.

The report finds that these systems reinforce discrimination, evade accountability, and threaten fundamental rights. La Quadrature is calling for a full ban—and we support them.

📄 Read more and access the full report: statewatch.org/news/2025/may/f

Continued thread

How algorithms in #Deutschland are supposed to "foresee" crimes #PredictivePolicing

"The report 'Automating Injustice' examines selected systems developed or deployed by the police, law-enforcement agencies and prisons in Germany. It also analyses publicly available information about such practices in order to explain how the systems work, what data they use, why they can lead to greater discrimination and why they pose a general threat to fundamental rights."

algorithmwatch.org/de/predicti via @algorithmwatch

AlgorithmWatch · Automated policing: how algorithms in Germany are supposed to "foresee" crimes. Police, law-enforcement agencies and prisons in Germany are increasingly trying to digitally "predict" and "prevent" crimes. The report "Automating Injustice" gives an overview of such algorithmic systems developed and deployed in Germany.

Perils of predictive policing

Amnesty publishes a report warning of the perils of predictive policing

February 2025

Many TV detective series have technology at their core as our heroes vigorously pursue the wrongdoers. CCTV footage is scrutinised for the criminals' movements, DNA evidence is obtained and of course fingerprints are taken. The storylines of countless detective series feature forensic evidence as a key component of police detection, and they are reassuring: law enforcement officers use every technique, scientific and technological, to keep us all safe and lock up the bad guys. Surely using science and algorithms to enable police forces to predict crime must be a good idea?

It is not. The Amnesty report, and other research, explain in detail the problems and the risks. One of the persistent biases in the justice system is racism, and it is worth reading The Science of Racism by Keon West (Picador, 2025). The author takes the reader through copious peer-reviewed research, conducted over many years in different countries, demonstrating the extent of racism. Examples include the many CV studies (US: resume) in which identical CVs, differing only in names that indicate the ethnicity of the candidates, produce markedly different results. There are similar examples from medicine and academia. Racism is endemic and persists. As Keon West acknowledges, a similar book could be written about how women are treated differently.

The Amnesty report notes that Black people are twice as likely to be arrested, three times as likely to be subjected to force and four times as likely to be subjected to stop and search as white people. With such bias in place, the risk is that predictive policing will simply perpetuate existing prejudice. The concern partly centres on skin colour, where people live and socio-economic background all being used as predictive inputs.

People have a deep faith in technology. On a recent Any Answers? programme (BBC Radio 4), in a debate about the death penalty and the problem of mistakes, several callers showed a touching faith in DNA in particular, implying that mistakes cannot happen. People are mesmerised by the white-suited forensic officers on television, who convey a sense of science and certainty. Technology, however, is only as good as the human systems which use it. There have been many wrongful arrests and prison sentences of innocent people despite DNA, fingerprints, CCTV and all the rest. Mistakes are made. The worry is that predictive policing could entrench discrimination.

People who are profiled have no way of knowing that they have been. There is a need to publish details of what systems the police and others are using; the police, the report notes, are reluctant to do this. What is the legal basis for effectively labelling people by their skin colour, where they live and their socio-economic status?

The police are keen on the idea and around 45 forces use it. The evidence for its effectiveness is doubtful. The risks are considerable.


Amnesty's new report shows that the police are supercharging racism through predictive policing.

At least 33 UK police forces have used prediction or profiling tools.

“These systems are developed and operated using data from policing and the criminal legal system. That data reflects the structural and institutional racism and discrimination in policing and the criminal legal system.”

#policing #police #precrime #predictivepolicing #codedbias #ukpolitics #ukpol

theguardian.com/uk-news/2025/f

The Guardian · UK use of predictive policing is racist and should be banned, says Amnesty, by Vikram Dodd

@philip_cardella

#LEOBR #Legal #USpol

(1/3)

Litigation is a huge issue in the #US. Penalties are often astronomical seen from an international perspective. Also, malfeasance by police officers, including #RacialProfiling and, IMO, #PredictivePolicing, seems to be much more frequent there, or maybe just more widely reported.

However, my bottom line is that
a) #SystemicRacism will never be overcome w/o abolishing #LEOBR|s. Police officers are nothing but citizens in uniform. They're no animals more...