catty

joined 1 week ago
[–] [email protected] 4 points 1 day ago (1 children)

Ahh, seeing images load line by line and getting excited as you can actually start to tell what the image is!

[–] [email protected] 17 points 1 day ago* (last edited 1 day ago) (1 children)

I'm finding I'm using YouTube less and less, so much so that I can't actually remember the last time I used it for anything. YouTube has so much crap now.

Because of their monetisation scheme for content creators, it was profitable for creators to drag 15 minutes of information out to an hour, then to use shocked faces as thumbnails to get more clicks, and then to chase viewers who only watch a bit of the video as viewership dwindles into mindlessly watching suggestions.

I feel sorry for those creators who have gone balls deep with youtube content and who have no "plan b".

[–] [email protected] -3 points 1 day ago* (last edited 1 day ago)

And we're supposed to believe these people are smart enough for nukes? I wonder how many of the people in this photo can actually read or write?

[–] [email protected] 8 points 1 day ago

Totally nothing to do with Trump's own personal crypto currency scheme.

[–] [email protected] -2 points 1 day ago

"civil"

Recently in England a Turkish man was burning the Koran. A Muslim guy came out and lunged at him several times with a knife he just happened to have on him, shouting "you will not burn my holy book" or some such. The book burner was charged by the police (the Bible can be burnt in England, but for some reason the Koran cannot). Keir Starmer didn't want to progress an investigation into child abuse because of how it would expose the Muslim men abusing working-class young white girls, because their religion states girls who don't cover their hair are sluts.

[–] [email protected] 11 points 1 day ago

Mental health nurses who work in an asylum/"hospital"/"mental health unit" too, according to a friend who works in one as a nurse.

[–] [email protected] 2 points 1 day ago (1 children)

Any suggestions for solutions?

[–] [email protected] 3 points 1 day ago* (last edited 1 day ago)

Retail traders beg to differ though and are doing as they're told.

[–] [email protected] 16 points 2 days ago

It costs significantly less to have a sense of humour.

[–] [email protected] 2 points 2 days ago

You're conflating me asking how to use these tools with you misusing them. I see you still don't accept that what you're doing is wrong. But go you.

[–] [email protected] 14 points 2 days ago* (last edited 1 day ago) (3 children)

I fucking hate this rhetoric which is: OIL! GAS! BUY OIL! BUY GAS! KEEP PUTIN HAPPY! OIL! GAS!

No one was taking the 'war' between Israel and Iran seriously last week after the initial surge in the oil price, because the USA was like... "yeah, nah". Now, one quick call from Putin later and bam, all this BS about how the US is pandering for war to drive commodity and share prices up, when it's a nothingburger.

Remember kids, buy oil, buy gas and do as you're told.

[–] [email protected] 34 points 2 days ago (10 children)

Can we hurry up with this please. I want to cum buckets in my femboy slut. OK, thx, bye.

 

I've tried coding, and every model I've tried fails unless it's really, really basic small functions, like what you learn as a newbie, compared to say 4o mini, which can spit out more sensible stuff that works.

I've tried explanations, and they just regurgitate sentences that can be irrelevant or wrong, or they get stuck in a loop.

So, what can I actually use a small LLM for? Which ones? I ask because I have an old laptop and the GPU can't really handle anything above 4B in a timely manner; 8B is about 1 t/s!
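
If it helps with comparing models, here's a rough sketch (not from the original post) that times tokens per second using the gpt4all Python bindings. The model filename is just an assumption; swap in whatever small GGUF you actually have on disk:

```python
# Rough timing sketch for a small local model with the gpt4all Python bindings.
# The model filename below is an assumption; use whatever GGUF you have.
import time
from gpt4all import GPT4All

model = GPT4All("Phi-3-mini-4k-instruct.Q4_0.gguf")  # hypothetical ~4B model file

prompt = "Summarise the plot of Hamlet in three sentences."
start = time.time()
with model.chat_session():
    reply = model.generate(prompt, max_tokens=128)
elapsed = time.time() - start

# Very rough token estimate: ~0.75 words per token of output.
approx_tokens = int(len(reply.split()) / 0.75)
print(f"~{approx_tokens} tokens in {elapsed:.1f}s (~{approx_tokens / elapsed:.1f} t/s)")
print(reply)
```

Running that for a 1-3B model and again for a 7-8B one would at least put numbers on the "timely manner" question for your particular GPU.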

 

I was looking back at some old Lemmy posts and came across GPT4All. Didn't get much sleep last night as it's awesome, even on my old (10-year-old) laptop with a Compute 5.0 Nvidia card.

Still, I'm after more. I'd like to be able to get image generation and view it in the conversation; if it generates Python code, to be able to run it (I'm using Debian and have a default Python env set up). Local file analysis would also be useful. CUDA Compute 5.0 / Vulkan compatibility is needed too, with the option to use some of the smaller models (1-3B, for example). A local API would also be nice for my own Python experiments.

Is there anything that can tick the boxes? Even if I have to scoot across models for some of the features? I'd prefer more of a desktop client application than a docker container running in the background.
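
For the local API part, here's a minimal sketch, assuming an OpenAI-compatible local server is running (GPT4All's optional local API server and llama.cpp's server both speak this format). The port, endpoint and model name here are assumptions, so adjust them to whatever your server actually exposes:

```python
# Minimal sketch: query a local OpenAI-compatible chat endpoint from Python.
# Base URL, port and model name are assumptions; change them to match your setup.
import requests

BASE_URL = "http://localhost:4891/v1"  # hypothetical local server address

payload = {
    "model": "Llama-3.2-1B-Instruct",  # hypothetical small model name
    "messages": [
        {"role": "user", "content": "Write a one-line Python hello world."}
    ],
    "max_tokens": 64,
    "temperature": 0.2,
}

resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

The nice thing about the OpenAI-compatible format is that the same snippet should work whichever local backend ends up serving the model, so experiments aren't tied to one desktop client.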

 

I'm watching some retro television, and this show is wild! Beauty contests with 16-year-old girls (though at the time it was legal for 16-year-old girls to pose topless for newspapers), old racist comedians from working men's clubs doing their routines, Boney M, English singers of the time, and happy dance routines!

vid
