fubarx

joined 2 years ago
[–] [email protected] 1 points 11 months ago* (last edited 11 months ago)

NEW, automated children's bicycle. Guaranteed to teach the little tyke how to ride! *

  • ^Comes with training wheels, and an adult monitor standing by at all times.^
[–] [email protected] 3 points 11 months ago (4 children)

I can think of only two reasons to have a venv inside a container:

  • If you're running multiple third-party services inside one container, each pinned to a different Python version.

  • If you do local development without Docker and have scripts that activate the venv from inside the script. If you move those scripts inside the container, you no longer have a venv. But then it's easy to just check an environment variable and skip activation when inside Docker (rough sketch below).

For most applications, it seems like an unnecessary extra step.
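
For that second case, here's a minimal sketch of the "check and skip" idea. The `RUNNING_IN_DOCKER` variable and the `.venv` path are my own placeholders, not anything standard; the `/.dockerenv` marker is just a common heuristic.

```python
import os
import sys
from pathlib import Path

def in_docker() -> bool:
    # An env var you'd set yourself in the Dockerfile, plus the /.dockerenv
    # marker file most Docker runtimes create at the container root.
    return os.environ.get("RUNNING_IN_DOCKER") == "1" or Path("/.dockerenv").exists()

def ensure_venv(venv_dir: str = ".venv") -> None:
    # Already inside Docker, or already running under a venv? Then do nothing.
    if in_docker() or sys.prefix != sys.base_prefix:
        return
    venv_python = Path(venv_dir) / "bin" / "python"
    if venv_python.exists():
        # Re-exec this script with the venv's interpreter for local development.
        os.execv(str(venv_python), [str(venv_python), *sys.argv])

if __name__ == "__main__":
    ensure_venv()
    # ... the actual script logic runs under the venv locally, system Python in Docker
```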

[–] [email protected] 2 points 11 months ago

Pretty damned impressive they kept the lights on with 2M new users. Old Twitter would Blue-whale if you sneezed at it.

[–] [email protected] 2 points 11 months ago

I've been using ChatGPT, specialized ones on Huggingface, and a bunch of local ones via ollama. A colleague who is deep into this says Claude is giving him the best results.

Thing is, it depends on the task. For coding, I've found they all suck. ChatGPT gets you up to a point, then puts out completely wrong stuff. Gemini, Microsoft, and CodeWhisperer put out half-baked rubbish. If you don't already know the domain, you'll have a frustrating time finding the bugs.

For images, I've tried DALL-E for placeholder graphics. Problem is, if you change a single prompt element to refine the output, it will generate completely different images with no way to go back. Same with Adobe generators. Folks have recommended Stability for related images. Will be trying that next.

Most LLMs are just barely acceptable. Good for casual messing around, but I wouldn't bet the business on any of them. Once the novelty wears off, and the CFOs tally up the costs, my prediction is a lot of these are going away.

[–] [email protected] 15 points 11 months ago (1 children)

They missed speculation, hearsay, and guesstimation.

[–] [email protected] 26 points 11 months ago (3 children)

Let's hope no single person worked on that thing for the full 8 years it was under development. They'd be crushed.

[–] [email protected] 3 points 11 months ago

Installed RabbitMQ for use with Python Celery (task queue and crontab). Was pleasantly surprised it also offered MQTT support.

Was originally planning on using a third-party, commercial combo websocket/push notification service. But between RabbitMQ/MQTT with websockets and Firebase Cloud Messaging, I'm getting all of it: queuing, MQTT pubsub, and cross-platform push, all for free. 🎉

It all runs nicely in Docker, and when it's time to deploy and scale, I trust RabbitMQ more since it has solid cluster support.
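
For the Celery side, a minimal sketch of how it might be wired to RabbitMQ. The broker URL assumes a local RabbitMQ with the default guest/guest credentials, and the app/task names are just placeholders I made up:

```python
# Save as tasks.py (hypothetical module name).
from celery import Celery
from celery.schedules import crontab

# RabbitMQ as the broker over AMQP.
app = Celery("tasks", broker="amqp://guest:guest@localhost:5672//")

@app.task
def send_push(user_id: int, message: str) -> None:
    """Placeholder task: hand the payload to FCM or publish to an MQTT topic."""
    print(f"push to {user_id}: {message}")

# Crontab-style periodic task, covering the "task queue and crontab" use case.
app.conf.beat_schedule = {
    "nightly-digest": {
        "task": "tasks.send_push",
        "schedule": crontab(hour=3, minute=0),
        "args": (0, "nightly digest"),
    },
}
```

Locally you'd start it with something like `celery -A tasks worker --beat --loglevel=info`, with RabbitMQ running in its own container.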

[–] [email protected] 15 points 11 months ago

Once they get Threads support, their target audience will be the non-Twitter universe. This would make it easier for businesses, governments, journalists, and non-technical folks like influencers and celebrities to switch over. That's how you get mass adoption.

I just tried it last week. Good start. Lots of promise.

[–] [email protected] 6 points 11 months ago (1 children)

How great-grandpa caught syphilis? Towards the end, doctors said a lot of his ailments were because of that one little issue. You can speculate, but...

[–] [email protected] 3 points 11 months ago (2 children)

Hate to be pedantic, but what kind of phone is that in the photo? It has the white wired headphones and an actual headphone jack.

It says it's a campaign event on Monday.
