Let’s do the math.
Let’s take an SDXL porn model, with no 4-step speed augmentations, no hand-written quantization/optimization schemes like SVDQuant, or anything else; just an early, raw, inefficient implementation:
https://www.baseten.co/blog/40-faster-stable-diffusion-xl-inference-with-nvidia-tensorrt/#sdxl-with-tensorrt-in-production
So that’s 2.5 seconds on an A100 for a single image. Let’s batch it (because that’s what’s done in production) and run it on the now-popular H100 instead, and very conservatively assume 1.5 seconds per image (though it’s likely much faster).
That’s on a 700W SXM Nvidia H100, usually in a server box with 7 others, so let’s say 1000W including its share of the CPU and everything else, and 1400W once you count networking, idle time, and whatever else is going on.
That’s 1400 W × 1.5 s ≈ 2.1 kJ, or about 0.6 watt-hours, per image.
…Or about the energy of browsing Lemmy for 30-60 seconds. And again, this is a high estimate, yet it’s still only a fraction of a second of runtime for a home AC system.
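If you want to sanity-check the arithmetic, here’s a minimal sketch using the assumptions above (the 3.5 kW home-AC figure is my own ballpark, not from the linked post):

```python
# Back-of-the-envelope check of the figures above. Inputs are the
# assumptions from this comment: 1.5 s per batched image on an H100,
# 1400 W total draw including CPU share and overhead.

seconds_per_image = 1.5   # conservative batched H100 latency (assumed)
total_power_w = 1400      # 700 W GPU + CPU share + networking/idle overhead

energy_j = total_power_w * seconds_per_image   # joules per image
energy_wh = energy_j / 3600                    # watt-hours per image

ac_power_w = 3500         # hypothetical central-AC draw, my own estimate

print(f"{energy_j / 1000:.1f} kJ per image")           # -> 2.1 kJ
print(f"{energy_wh:.2f} Wh per image")                  # -> 0.58 Wh
print(f"{energy_j / ac_power_w:.1f} s of AC runtime")   # -> 0.6 s
```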
…So yeah, booby pictures take very little energy, and the per-image cost is dropping dramatically.
Training lightweight, open models like DeepSeek, Qwen, or SDXL takes very little energy, as does running them. The GPU farms they use are tiny, and dwarfed by something like an aluminum plant.
What slurps energy is AI Bros like Musk or Altman trying to brute-force their way to a decent model by scaling out instead of improving efficiency, and mostly they’re blowing that out of proportion to hype the market and convince investors that AI will be expensive and grow infinitely (so people will give them money).
That isn’t going to work for very long. Small on-device models are going to be too cheap to compete with.
https://escholarship.org/uc/item/2kc978dg
So this is shit, and they should be turning off AI farms too, but your porn images are a drop in the bucket compared to AC costs.
TL;DR: There are a bazillion things to flame AI Bros about, but inference for small models (like porn models) is objectively not one of them.
The problem is billionaires.
Not only are they cheaper than AC, but doing the math shows they’re more energy-efficient than a human doing the same work, since humans run at around 80-100W, 24 hours a day. (Assuming the output is worth anything, of course.)
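To put rough numbers on that (with an admittedly made-up drawing time): a human sketching for one hour at ~90 W burns 90 W × 3600 s ≈ 324 kJ, versus roughly 2 kJ for the GPU image above, over 100× the energy.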
let's not use the term "efficiency" with humans making art, please. you're not helping anyone with that argument, you're just annoying both sides.
Humans at least run on renewable energy.
The computer you draw your art on, not so much. Reject modern art, embrace traditional carvings and cave paintings!