LedgeDrop

joined 2 years ago
[–] [email protected] 21 points 9 months ago

I think OP is referring to the fact that bad actors exploiting facets of SEO (rather than providing "meaningful" content) used to have to programmatically generate content (pre-AI/LLM).

For a real reader, it was obvious (at a quick glance) that this was meaningless garbage, as it would often be large walls of text that didn't make sense, or just lists of random keywords.

With LLM/AI, they're still walls of text and random keywords, but now they're grammatically/structurally correct and require no real effort to generate. Unfortunately, that means the reader actually needs to invest time in reading them. You'll also notice a growing trend in articles (especially "compare X vs Y" type articles) where the same content is recycled and rephrased to "pad" the article and give it a higher SEO ranking.

[–] [email protected] 2 points 9 months ago

Fantastic! Thank you for looking into the source code and verifying it!

[–] [email protected] 12 points 9 months ago* (last edited 9 months ago) (5 children)

Not true.

The links just need to have a "nofollow" attribute (which is something that Lemmy could add, if they haven't already).

> These links do not influence the search engine rankings of the destination URL because Google does not transfer PageRank or anchor text across them. In fact, Google doesn't even crawl nofollowed links.
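
For illustration, here's a minimal sketch of the idea, not Lemmy's actual code: tag every link in user-generated HTML before it's shown. The function name and DOM handling are my own assumptions, and it assumes a browser/DOM environment.

```typescript
// Toy sketch: add rel="nofollow" to every link in a rendered comment.
// NOT Lemmy's implementation, just an illustration of the idea.
// Assumes a browser/DOM environment (DOMParser is not available in plain Node).
function addNofollow(html: string): string {
  const doc = new DOMParser().parseFromString(html, "text/html");
  for (const anchor of Array.from(doc.querySelectorAll("a"))) {
    // "nofollow" tells crawlers not to pass ranking signals to the target.
    anchor.setAttribute("rel", "nofollow noopener");
  }
  return doc.body.innerHTML;
}

// Example: a spammy comment link comes out tagged.
console.log(addNofollow('<a href="https://spam.example">cheap pills</a>'));
// -> <a href="https://spam.example" rel="nofollow noopener">cheap pills</a>
```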

edit: added relevant blob of text.

[–] [email protected] 3 points 10 months ago

Welp, I guess this means something bad is gonna happen and Spez is trying to get in front of the inevitable protests.

I wonder what it could be....

[–] [email protected] 2 points 10 months ago

There has to be a better way to keep the strengths of federating without partitioning the community smaller and smaller until there is no community left.

Can you imagine Lemmy with a similar number of users to Reddit? Any time you posted, you'd have to replicate it across X number of instances (for visibility). Conversations would be fragmented and duplicated, votes would be duplicated. To me this almost sounds like "work"...

There has to be something better.

For example, instead of "every instance is an island" (meaning the current hierarchy is "instance" -> "community" -> "post" -> "threads"), we could instead have "community (ie: asklemmy)" -> "post (ie: this post)" -> "instance (Lemmy.ml, Lemmy.world, etc.)" -> "threads (this comment)".

From a technical perspective, it would mean that each instance would replicate the community names and posts, which is already being done (this post is a perfect example). Each instance would just need to share a unique identifier to associate the two communities/posts as "the same thing" (and this could simply be a hash of the community/post name). Everything else would be UX. Each instance would take ownership of its copy of the community and post, which means it could moderate it according to its own standards.
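
As a rough sketch of what that shared identifier could look like (the normalization rules and names here are invented for illustration, not a proposal for Lemmy's actual schema):

```typescript
import { createHash } from "node:crypto";

// Toy sketch of the proposed shared identifier: every instance derives the
// same ID from the community/post name, so local copies can be linked as
// "the same thing". Normalization rules are made up for this example.
function sharedId(community: string, postTitle: string): string {
  const normalized = `${community.trim().toLowerCase()}|${postTitle.trim().toLowerCase()}`;
  return createHash("sha256").update(normalized).digest("hex");
}

// Two instances compute the same ID for their local copy of this post,
// so the UX layer can stitch the threads together however it likes.
const onLemmyWorld = sharedId("asklemmy", "Why is Lemmy federated?");
const onLemmyMl = sharedId("AskLemmy", "Why is Lemmy federated?  ");
console.log(onLemmyWorld === onLemmyMl); // true
```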

[–] [email protected] 2 points 10 months ago

I fixed the link. For some reason the Lemmy client (Voyager) keeps generating '.ml' links (even though I'm on Lemm.ee).

This whole identical thread really confused Voyager, I thought I was seeing double.

[–] [email protected] 12 points 10 months ago

Off-topic: Lemmy really needs better crosspost functionality.

Lemmy is a small group of people, let's not divide it further by having the exact same conversation in two (or more) places.

[–] [email protected] 9 points 10 months ago* (last edited 10 months ago) (6 children)

Off-topic: Lemmy really needs better crosspost functionality.

Lemmy is a small group of people, let's not divide it further by having the exact same conversation in two (or more) places.

edit: Fixed the link.

[–] [email protected] 17 points 10 months ago

I don't have anything meaningful to add, other than my sincere gratitude to you for posting this.

I haven't laughed so hard in a good while.

[–] [email protected] 4 points 10 months ago

Sure, they could block based on your VPN provider, but they're probably also using Deep Packet Inspection.

The ELI5 version: it's possible to just "watch" your traffic and notice that it's not "normal" HTTPS traffic (which is the most common kind). This can be done by fingerprinting the request itself or just watching the amount of traffic. For example, if you "visit" a website but upload and download 3 megabytes of data, and it takes 15 minutes to send/receive that data... well, that looks suspicious... and depending on the country, you may have some people knocking on your door.
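
A toy version of that "just watch the traffic" heuristic might look like the sketch below. Real DPI systems are far more sophisticated; the thresholds and field names here are made up.

```typescript
// Toy illustration only: flag flows whose shape doesn't look like
// ordinary web browsing. Thresholds are arbitrary, for illustration.
interface Flow {
  bytesUp: number;     // bytes sent by the client
  bytesDown: number;   // bytes received by the client
  durationSec: number; // how long the connection stayed open
}

function looksLikeNormalBrowsing(flow: Flow): boolean {
  // A typical page visit downloads much more than it uploads and finishes
  // quickly. A long-lived flow pushing megabytes both ways looks more like
  // a tunnel than a web page.
  const uploadHeavy = flow.bytesUp > 1_000_000;
  const longLived = flow.durationSec > 10 * 60;
  return !(uploadHeavy && longLived);
}

// The example from above: ~3 MB each way over 15 minutes gets flagged.
console.log(
  looksLikeNormalBrowsing({ bytesUp: 3_000_000, bytesDown: 3_000_000, durationSec: 15 * 60 }),
); // -> false
```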

[–] [email protected] 121 points 10 months ago (8 children)

Begins?!? Docker Inc. was waist-deep in enshittification the moment they started rate limiting Docker Hub, which was nearly 3 or 4 years ago.

This is just another step towards the deep end. Companies that could easily move away from Docker Hub did so years ago. The companies that remain struggle to leave and will continue to pay.

[–] [email protected] 2 points 11 months ago

All aboard the gold train!
