this post was submitted on 14 Aug 2023
1014 points (100.0% liked)
196
18274 readers
Be sure to follow the rule before you head out.
Rule: You must post before you leave.
Other rules
Behavior rules:
- No bigotry (transphobia, racism, etc…)
- No genocide denial
- No support for authoritarian behaviour (incl. Tankies)
- No namecalling
- Accounts from lemmygrad.ml, threads.net, or hexbear.net are held to higher standards
- Other things seen as clearly bad
Posting rules:
- No AI generated content (DALL-E etc…)
- No advertisements
- No gore / violence
- Mutual aid posts are not allowed
NSFW: NSFW content is permitted but it must be tagged and have content warnings. Anything that doesn't adhere to this will be removed. Content warnings should be added like: [penis], [explicit description of sex]. Non-sexualized breasts of any gender are not considered inappropriate and therefore do not need to be blurred/tagged.
If you have any questions, feel free to contact us on our matrix channel or email.
founded 2 years ago
you are viewing a single comment's thread
Like what?
You want examples of systemic racism?
That's capitalism's fault. Poor white kids would face the same issues.
That's just a lack of data points, not a system constructed by anyone. The data points should increase naturally over time.
It'd be awesome if we could just solve language barriers generally. Until we can, having a single official language in work settings seems unavoidable for productivity.
Not related to racism.
Is this happening? I think it's flat-out wrong to predict criminals with AI trained on previous data.
All in all, I agree that many of the existing systems suck, but I don't think it's helpful to link every problem to racism. Disclaimer: I'm not black or white.
Yeah, cops are using AI facial recognition to identify suspects. As mentioned, AI facial recognition is really bad at identifying black people in particular. This results in an increased incidence of wrongful arrest. In America, something like 2/3 of cases end in a plea deal because the people involved cannot afford to fight the charges, so this also results in more wrongful convictions.
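To make the chain above concrete, here's a toy back-of-the-envelope sketch. All the numbers (search volumes, false-match rates, group names) are made up for illustration; only the ~2/3 plea-deal figure comes from the comment above.

```python
# Toy sketch (illustrative numbers only, not real statistics):
# if a face-match system has a higher false-match rate for one group,
# the same search volume produces more wrongful flags for that group.
searches_per_group = 10_000
false_match_rate = {"group_a": 0.001, "group_b": 0.01}  # assumed 10x gap

wrongful_flags = {g: searches_per_group * r for g, r in false_match_rate.items()}

# If roughly 2/3 of charged cases end in a plea deal, wrongful flags that
# turn into charges convert to convictions at roughly that rate.
plea_rate = 2 / 3
est_wrongful_convictions = {g: f * plea_rate for g, f in wrongful_flags.items()}
print(est_wrongful_convictions)
```

The point isn't the exact figures: any multiplicative gap in the false-match rate passes straight through to the wrongful-conviction estimate.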
Cops also use AI to decide where to focus patrols. This uses historical arrest data, often dating back to the 50s. The police tend to arrest more people in areas they patrol more heavily, so the AI ends up suggesting more patrols in areas which were previously subject to discriminatory laws. They've automated away discriminatory over-policing.
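The feedback loop described above can be sketched in a few lines. This is a toy model with invented numbers, not any real system: two neighborhoods have the same underlying crime rate, but one starts with more historical arrests due to past over-policing, patrols are allocated in proportion to arrests, and arrests scale with patrol presence.

```python
# Toy model of the patrol/arrest feedback loop (all numbers assumed).
TRUE_CRIME_RATE = [0.05, 0.05]   # identical in both neighborhoods
arrests = [300.0, 100.0]         # biased historical record: A was over-policed
TOTAL_PATROLS = 100

for year in range(10):
    total = sum(arrests)
    # patrols allocated in proportion to the historical arrest record
    patrols = [TOTAL_PATROLS * a / total for a in arrests]
    # new arrests depend on patrol presence, not just on actual crime
    new = [p * r * 100 for p, r in zip(patrols, TRUE_CRIME_RATE)]
    arrests = [a + n for a, n in zip(arrests, new)]

share_a = arrests[0] / sum(arrests)
print(f"share of arrests in neighborhood A after 10 years: {share_a:.2f}")
```

Even though both neighborhoods have identical true crime rates, neighborhood A's share of arrests never falls below its biased starting share: the system reproduces the historical bias indefinitely instead of correcting it.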