CSS pop

Tuesday, January 12, 2021

Justice in policing 2020

 https://www.google.com/url?sa=t&source=web&rct=j&url=https://judiciary.house.gov/uploadedfiles/fact_sheet_justice_in_policing_act_of_2020.pdf&ved=2ahUKEwjE6eSsgpfuAhXFU80KHdP3C4YQFjAAegQIARAB&usg=AOvVaw2VF5YBzc9A-8yWCMdfkRg_

It looks like qualified immunity is gone. I could only find that this passed the House; maybe it's made it all the way through, I don't know.

This document might explain some of the questions I've had as well:

https://www.google.com/url?sa=t&source=web&rct=j&url=https://www.justice.gov/crs/file/836401/download&ved=2ahUKEwjcr7ukhJfuAhV8Ap0JHf5MCTEQFjAAegQIARAB&usg=AOvVaw0Swb1ZuYIO-RK8_H5bRv1A

I don't think the first article goes far enough. If no one's paying attention externally, we still have a problem with all types of discrimination. It's really a trivial task to let AI and deep-learning algorithms draw inferences, as long as there's good data in the databases, with entries marked good or questionable.

I can't remember for sure if it was Minnesota, but I noticed police telling people, online or on a department site, that an officer didn't have to allow a report to be filed if he didn't believe a crime had been committed. That would seem to short-circuit the justice system if no investigation is being done and no records are being kept. I'm not saying they have to investigate everything, but we could do quite a bit with the data to figure out what's actually going on, and whether anyone looks like they're being discriminated against, if we simply required a record of the following any time someone called to try to file a report:

It would be as simple as gender, age, ethnicity, and the claim made, regardless of whether the officer believes it. If it went a bit further and recorded why the officer believes it or doesn't, that would be ideal for a number of reasons, even if the public never directly sees it.
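To make the idea concrete, here's a minimal sketch of what that intake record and a basic disparity check could look like. The field names, the `IntakeRecord` class, and the sample data are all hypothetical; the post only names the fields themselves.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class IntakeRecord:
    # Hypothetical schema: the post names gender, age, ethnicity, the
    # claim made, whether the officer believed it, and optionally why.
    gender: str
    age: int
    ethnicity: str
    claim: str
    officer_believed: bool
    reason: str = ""  # the optional "why" field

def denial_rates(records):
    """Fraction of attempted reports the officer didn't believe, by ethnicity."""
    attempts, denials = Counter(), Counter()
    for r in records:
        attempts[r.ethnicity] += 1
        if not r.officer_believed:
            denials[r.ethnicity] += 1
    return {group: denials[group] / attempts[group] for group in attempts}

# Made-up sample data for illustration only.
records = [
    IntakeRecord("F", 34, "A", "theft", True),
    IntakeRecord("M", 22, "A", "assault", True),
    IntakeRecord("M", 41, "B", "theft", False),
    IntakeRecord("F", 19, "B", "assault", False),
]
print(denial_rates(records))  # {'A': 0.0, 'B': 1.0}
```

Even this trivial aggregation would surface a pattern like "group B's reports are never believed" that no one would ever see if denied reports were simply never written down.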

Most people don't understand how much AI can do now. For instance, we've decoded, without internal electrodes, an image shown to a subject on a screen. They told the subject to concentrate on it, turned off the screen, and, with the type of electrodes you could have fixed to the top of a baseball hat, deep-learning algorithms managed to decode the electrical signals coming through the person's skull and draw the image on a screen from the person's mind.

That takes pattern recognition most humans couldn't manage even if they looked at all the data the algorithm was being presented.
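To give a feel for that kind of pattern recognition, here's a toy sketch, not the actual EEG study: a template matcher pulling a known waveform out of noise heavy enough that the raw samples look like static to the eye. Everything here (the waveforms, noise level, and correlation scoring) is an illustrative assumption.

```python
import math
import random

random.seed(1)

def sine(t):
    return math.sin(2 * math.pi * t)

def cosine(t):
    return math.cos(2 * math.pi * t)

def sample(wave, noise=0.8, n=64):
    """A noisy 'recording' of the underlying waveform."""
    return [wave(i / n) + random.gauss(0, noise) for i in range(n)]

def classify(signal, templates):
    """Pick the template that correlates most strongly with the signal."""
    def corr(a, b):
        return sum(x * y for x, y in zip(a, b))
    return max(templates, key=lambda name: corr(signal, templates[name]))

templates = {
    "sine":   [sine(i / 64) for i in range(64)],
    "cosine": [cosine(i / 64) for i in range(64)],
}

# Classify 100 noisy recordings of the sine pattern.
hits = sum(classify(sample(sine), templates) == "sine" for _ in range(100))
print(hits)  # the matcher recovers the hidden pattern almost every time
```

The noise swamps any single sample, but correlating against the whole template averages it out; a deep network decoding skull signals is doing a vastly more complicated version of the same trick.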

It also has no reason to be biased, though it could probably be made to be biased.

Chicago, I read, is doing something that seemed similar but not exactly like this. What I just described requires reports to be taken and that info entered. Actually, if you had body cams whose footage was uploaded in real time or at the end of a shift, you might get a pretty good estimation from video alone: not quite facial recognition, but at least race and age. But Chicago is trying to use it to build "the hot list," as they called it. The problem is that if you don't require reports, you can easily bias that list without actively doing anything other than denying a report: if you're not inputting any of the "why" for some people, but you are for others, you have a list the computer picked from info it didn't have the full picture on. That's not a good list. Humans do the same thing; it's not magic.

Actually, it isn't a good list, unless your intent is to obscure that you're profiling people.

If your intent is against equal justice, then running a system like that without any thought to safeguards is exactly how you're going to do it. But if that is truly how they're running it, without any safeguards like that, then I'm guessing someone at some level already knows, because it's the perfect thing to cloak the fact that you're doing that from the public. It might as well be magic for most people.
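The point about denied reports silently skewing a hot list can be shown with a tiny simulation. The neighborhoods, incident counts, and filing rates below are invented for illustration: two areas with identical true incident counts, but reports from one area get filed far less often.

```python
import random

random.seed(0)

# Two hypothetical neighborhoods with identical true incident counts.
true_incidents = {"north": 100, "south": 100}

# Assumed recording bias: "north" reports get filed 90% of the time,
# while "south" reports are denied or skipped more often than not.
file_rate = {"north": 0.9, "south": 0.4}

recorded = {
    area: sum(random.random() < file_rate[area] for _ in range(n))
    for area, n in true_incidents.items()
}

# A "hot list" ranked only on recorded counts inherits the filing bias.
hot_list = sorted(recorded, key=recorded.get, reverse=True)
print(recorded)  # north records roughly twice as many incidents as south
print(hot_list)  # north tops the list despite identical true counts
```

Nothing in the ranking code is biased; the skew comes entirely from which reports were allowed into the data, which is exactly why the input stage needs safeguards.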

