this post was submitted on 05 Apr 2024
87 points (89.9% liked)

News

30771 readers
2497 users here now

Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good-faith argumentation only; accusing another user of being a bot or paid actor counts as a violation. Trolling is uncivil and is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts should contain a source (url) that is as reliable and unbiased as possible and must only contain one link.


Obvious right- or left-wing sources will be removed at the mods' discretion. Supporting links can be added in comments or posted separately, but not in the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should be the same as the title of the source article.


Posts whose titles don't match the source won't be removed, but the autoMod will notify you, and if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you; just ignore it, and we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip are allowed. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the autoMod will leave a message. Please remove your post if the autoMod is correct. If the post that matches your post is very old, we refer you to rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, credible sources must be provided.


9. No link shorteners.


The autoMod will contact you if a link shortener is detected; please delete your post if it is right.


10. Don't copy the entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 2 years ago
all 14 comments
[–] Car 16 points 1 year ago

Insider look at the AI assistance:

“Here’s a 100 square meter zone of unmolested real estate. Would you like me to generate a press statement for why this is an important and legitimate target?”

[–] [email protected] 12 points 1 year ago

That’s strange, I could have sworn they were using a magic 8-ball.

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago)

Leaked AI code:

10 if israeli goto 30

20 kill

30 end

[–] [email protected] 11 points 1 year ago (1 children)

This just underscores that it is not AI we should fear, it is how some choose to use it.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

I mean, I'd assume that people are going to have a go at using it for pretty much everything. If not LLMs now, then more-sophisticated systems down the road.

[–] [email protected] 6 points 1 year ago (2 children)

The officials, quoted in an extensive investigation by the online publication jointly run by Palestinians and Israelis, said that the AI-based tool was called “Lavender” and was known to have a 10% error rate.

So, there's pretty much no information to decipher what it's actually doing. But I think that one could at least use a human baseline. For a human in a similar role, assuming that a human can approximate whatever it's doing, what's the error rate?

[–] [email protected] 10 points 1 year ago (1 children)

So, there’s pretty much no information to decipher what it’s actually doing. But I think that one could at least use a human baseline. For a human in a similar role, assuming that a human can approximate whatever it’s doing, what’s the error rate?

The Verge had a piece on it.

Lavender was trained to identify “features” associated with Hamas operatives, including being in a WhatsApp group with a known militant, changing cellphones every few months, or changing addresses frequently. That data was then used to rank other Palestinians in Gaza on a 1–100 scale based on how similar they were to the known Hamas operatives in the initial dataset.

Basically, they're looking at habits and social connections and the AI matches people.
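For intuition, here's a minimal sketch of what that kind of similarity ranking could look like, assuming simple yes/no behavioural features. Everything here (the feature names, the scoring rule, the 1-100 mapping) is hypothetical illustration, not taken from the reporting:

# Hypothetical sketch: rank a person on a 1-100 scale by how closely their
# behavioural profile matches the closest known operative in a reference set.
FEATURES = ["in_chat_group_with_known_militant",
            "changes_phone_every_few_months",
            "changes_address_frequently"]

def to_vector(profile):
    # profile: dict mapping feature name -> bool
    return [1 if profile.get(f) else 0 for f in FEATURES]

def similarity(a, b):
    # fraction of features on which the two profiles agree
    return sum(1 for x, y in zip(a, b) if x == y) / len(FEATURES)

def rank_1_to_100(profile, known_operatives):
    # known_operatives: non-empty list of profiles in the reference dataset
    vec = to_vector(profile)
    best = max(similarity(vec, to_vector(k)) for k in known_operatives)
    return round(1 + 99 * best)   # 1 = no resemblance, 100 = identical profile

The real system presumably uses far more features and a learned model rather than a hand-written distance, but the shape of the computation (profile in, similarity score out) is what the quote describes.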

part of the problem?

To build the Lavender system, information on known Hamas and Palestinian Islamic Jihad operatives was fed into a dataset — but, according to one source who worked with the data science team that trained Lavender, so was data on people loosely affiliated with Hamas, such as employees of Gaza’s Internal Security Ministry. “I was bothered by the fact that when Lavender was trained, they used the term ‘Hamas operative’ loosely, and included people who were civil defense workers in the training dataset,” the source told +972.

shit data in, shit data out.

[–] [email protected] 2 points 1 year ago (1 children)

Hmm.

I believe that law enforcement has done that sort of thing for a long time: built databases to look for correlating factors and relationships. And it sounds like they're explicitly writing up the criteria, or else they probably wouldn't be able to rattle them off. So I kind of doubt that they're using machine learning to find new criteria.

If I had to guess from your text, what they did is have people come up with all the criteria they could think of that are likely to indicate that someone is Hamas. Then they had some database of known Hamas figures, ran their classifiers against it, and let the system figure out weightings for each of those criteria. I don't know if that last bit is standard practice for law enforcement software used to identify likely suspects, but I can believe that it might be.
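To make that guess concrete, here's a minimal sketch of what "let the system figure out weightings for each criterion" could mean in practice, assuming yes/no criteria and a simple Bayesian-style weight per criterion. The function names and the smoothing are my own illustration, not anything from the article:

import math

def train_weights(positives, negatives, criteria):
    # positives/negatives: lists of dicts mapping criterion name -> bool
    # Returns a log-likelihood-ratio weight per criterion (Laplace-smoothed).
    weights = {}
    for c in criteria:
        p = (sum(1 for x in positives if x.get(c)) + 1) / (len(positives) + 2)
        q = (sum(1 for x in negatives if x.get(c)) + 1) / (len(negatives) + 2)
        weights[c] = math.log(p / q)   # positive weight: criterion leans toward a match
    return weights

def score(record, weights):
    # Naive-Bayes-style score: sum the weights of the criteria the record satisfies.
    return sum(w for c, w in weights.items() if record.get(c))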

"AI" might be a slightly ambitious term to use for that. I have used SpamAssassin, which uses Bayesian classifiers to identify spam, for decades. It does something comparable, but I don't think that people have generally called SpamAssassin "AI".

[–] [email protected] 2 points 1 year ago

So it’s machine learning.

They have lots of data: social media footprint, addresses, names of friends and coworkers, who they call, where they go for coffee or happy hour after work; quite literally everything they can get on these guys.

They then give it a known list and tell the machine to look for patterns (like switching burner cell phones every so often) that are consistent across all of them.

It even weights lower strength correlations as softer evidence.

They then take that and run it against everyone they have in their database. And it spits out people that match the same things.

As for it being artificial intelligence: it is, just not general AI (like Data in Star Trek, R2-D2 in Star Wars, or Kryten in Red Dwarf). These systems are more like idiot savants that are very good at this one task and suck at literally anything else.

The problem is mostly in the shit data it was trained on, and the assumption that it would always be right. It can recognize patterns, but there's always some natural variation in the pattern.
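The "run it against everyone they have in their database" step is then just batch scoring with those learned weights and a cut-off. A hypothetical sketch, with invented weights and threshold (none of these numbers come from the reporting):

# Hypothetical batch-scoring step: weaker correlations get smaller weights
# ("softer evidence"), and anything over an invented threshold gets flagged.
WEIGHTS = {
    "changes_phone_every_few_months": 2.0,     # strong correlation
    "in_chat_group_with_known_militant": 1.5,
    "changes_address_frequently": 0.5,         # weak correlation, soft evidence
}
THRESHOLD = 3.0   # invented cut-off

def flag_matches(database):
    # database: iterable of (person_id, dict mapping feature name -> bool)
    for person_id, traits in database:
        total = sum(w for f, w in WEIGHTS.items() if traits.get(f))
        if total >= THRESHOLD:
            yield person_id, total

This is also where the earlier "shit data in, shit data out" point bites: if the training set labelled civil defense workers as operatives, the weights encode that mistake and the batch scoring repeats it at scale.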

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago)

AI doesn’t know who is an aid worker, so chooses targets like WCK volunteers in armored vehicles transporting food to starving people.

[–] [email protected] 5 points 1 year ago

I don’t understand how Israel can sit there and justify being evil by claiming everyone has ties to Hamas. Like yeah, they’re the terrorist group ruling Gaza, I’m sure everyone knows someone involved with them.

By that logic, since the IDF has been terrorizing Palestine for decades, and since military service is mandatory in Israel, are there no true civilians and only terrorists in Israel too?

Of course that’s silly, but it’s a good way to point out how flawed Israel’s logic is when they always claim casualties in Palestine have ties to Hamas.

[–] [email protected] 4 points 1 year ago

Seemingly the AI from GoldenEye 64

[–] [email protected] 3 points 1 year ago

AI: All of them.

Israel: OK!

Fuck Israel.