edinbruh

joined 2 years ago
[–] [email protected] 3 points 1 week ago

This is what Araki takes inspiration from

[–] [email protected] 3 points 1 week ago

Woe, brick be upon ye!

[–] [email protected] 4 points 2 weeks ago

Try checking the sampling rate in your PipeWire config. It should be 48000. I don't remember exactly how to set it; check the Arch Wiki.

Last time I had issues with digital audio that was it.
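If it helps, on recent PipeWire versions the rate can be pinned with a drop-in config instead of editing the main file. A minimal sketch from memory (double-check the exact keys on the Arch Wiki):

```
# ~/.config/pipewire/pipewire.conf.d/99-rate.conf
context.properties = {
    default.clock.rate = 48000
}
```

Then restart with `systemctl --user restart pipewire pipewire-pulse`.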

[–] [email protected] 9 points 2 weeks ago

Woe, brick be upon ye

[–] [email protected] 6 points 2 weeks ago

Flatpak 🤷

[–] [email protected] 11 points 2 weeks ago

My potions are only for the strongest and you are not of the strongest you are clearly the weakest

[–] [email protected] 5 points 2 weeks ago (1 children)

The PS3 doesn't have an ATI GPU. TL;DR: it's Nvidia.

The PS3 has a weird, one-of-a-kind IBM processor called Cell. You can think of it as a hybrid design that is both a CPU and a GPU (not "a chip with both inside" but "a chip that is both"), meant for multimedia and entertainment applications (like a game console). It was so peculiar that developers took a long time to learn how to use it effectively. Microsoft didn't want to risk it, so they went with a different CPU, also from IBM, that shared some of the Cell's design but without the special GPU-ish parts, and paired it with an ATI GPU.

Now, Sony wanted to get away with only the Cell and use it as both CPU and GPU, but various tests showed that, despite everything, it wasn't powerful enough to keep up with the graphics they expected. So they reached out to Nvidia (not ATI) for an additional GPU, and Nvidia designed a modified version of the 7800 GTX to work together with the Cell. To fully utilise the PS3's graphics hardware, one has to use the GPU for most of the graphics work and assist it with the Cell's special hardware, which is harder.

[–] [email protected] 4 points 2 weeks ago

"you are in a Venn diagram, Max"

"... I was in a Venn diagram, funny as hell it was the most horrible thing I could think of"

[–] [email protected] 9 points 3 weeks ago (1 children)

Meh, I have at least two HDD enclosures that use that cable.

Standards don't mean that much when the hardware manufacturer just doesn't care

[–] [email protected] 2 points 3 weeks ago

Famine be like: 😭😭😭🫣🫣😭😭😭😭

[–] [email protected] 29 points 3 weeks ago

If I chop you up in a meat grinder, and the only thing that comes out becomes you again... you are probably a sponge

[–] [email protected] 13 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

I know what I said. I looked up the symbols they use, and that is the symbol for castration: it's like a Mars symbol, but barred. The one for spaying (?) is like the Venus symbol, but without the little horizontal line

Edit: apparently there's also a barred Venus symbol; I don't know if they have different meanings

 

Can I get a better Nvidia+Wayland experience by using PRIME and connecting the display to an AMD iGPU? I saw that Nvidia PRIME had some improvements in the last year; do they make this feasible?

I can't just try it because I have yet to buy said AMD iGPU, and I'd like to know before buying.
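For reference, if I do end up with the hybrid setup, my understanding is that render offload is invoked per-application with Nvidia's environment variables, something like:

```
# Desktop runs on the iGPU; run a specific app on the Nvidia card
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor
```

What I can't judge without the hardware is how smooth this is under Wayland, hence the question.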

 

I don't like my SSH keys being stored in plain sight, and I also don't like having to type a passphrase to use them.

On Windows, once you run ssh-add, the key is stored securely and managed by some kind of session manager (source). At that point you can delete the key file and go about your life knowing that the key is safe and that you won't need to type a password again.

I would like something similar on Linux, like storing the key via libsecret as you do with git credentials, so that you can access your servers without having a key in plain text.

I think it's possible to generate a key with a passphrase and have gnome-keyring or KWallet remember the passphrase, but it would be nicer to just store the key itself securely.

Can that be done?
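For reference, the fallback I mention above would look something like this (assuming gnome-keyring's SSH agent component is running; the paths and names are the usual defaults):

```
# Generate a key protected by a passphrase
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519

# Add it to the agent once; the keyring can offer to remember the passphrase
ssh-add ~/.ssh/id_ed25519
```

Plus `AddKeysToAgent yes` in ~/.ssh/config so keys get loaded into the agent on first use. But that still leaves the (encrypted) key file on disk, which is exactly what I'd like to avoid.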

 

I have a projector that needs limited RGB range, but for some reason (maybe a faulty HDMI-VGA dongle) the Intel driver selects full range. I want to force limited RGB range when I plug in the projector, but I need it set to auto normally, because my usual monitor needs full range.

I read this guide that explains how to use proptest to switch the mode while on Wayland. The problem is that running the command while the GNOME session is open doesn't work and returns error 243 (I can't find it in errno.h, but Google says it's EACCES). The guide deals with this by launching the command with systemd before GDM starts, but, as I said, I only want to force the limited range when using the projector.

I noticed that I can switch to a TTY, set the range, and switch back to GNOME while everything is still running, and it works; that's my current "workaround", and I'd like to automate it. So I figure there's a moment when GNOME "takes control" of a screen during which this can be set. I tried using a udev rule to switch as soon as a monitor is plugged in, but it exits with 243 as usual. I suspect GDM has a way to automate such things that might work, but I can't find it; I've only read about some Xorg scripts.

Also, there's this issue that's being worked on. One of the commenters uses a udev rule as a workaround, but it doesn't work for me.
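For context, the proptest invocation from the guide is roughly this (the IDs are placeholders that come from running proptest with no arguments; on i915 the "Broadcast RGB" values are 0 = Automatic, 1 = Full, 2 = Limited 16:235, if I read the driver docs right):

```
# List connectors and their properties
proptest -M i915 -D /dev/dri/card0

# Force limited range on one connector (IDs are examples)
proptest -M i915 -D /dev/dri/card0 <CONNECTOR_ID> connector <PROP_ID> 2
```

This is the command that returns 243 from inside the GNOME session.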

 

When the jack is inserted, the internal speakers stop making sound and the only analog out is the jack, as is common on laptops. But I want to address the two analog outputs individually so that I can:

  • Still select the speakers when headphones are plugged in
  • Have different sounds come from the headphones and the speakers
  • Mix them with Carla or other audio software

My ALSA/PipeWire settings are all default; I'm on a ThinkPad T480s with Fedora 38. My sound card is an Intel HD Audio card with a Realtek ALC257 analog codec.

I tried disabling auto_mute and raising the volume from alsamixer, but nothing happens. Then I tried switching PipeWire to "Pro Audio", but it doesn't separate the analog outputs. I also tried setting the indep_hp hint from hdajackretask, but it doesn't change anything.

The hint enables a new "Independent HP" option in alsamixer, but it can only be enabled from the CLI, and it doesn't work either.
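For completeness, this is roughly what I ran (control names vary by codec, so `amixer -c 0 scontrols` lists the real ones; the value for the independent-HP switch may be ON/Enabled depending on the driver):

```
# Keep the speakers active when the jack is detected
amixer -c 0 sset 'Auto-Mute Mode' Disabled

# After setting the indep_hp hint, toggle the new switch from the CLI
amixer -c 0 scontrols                 # find the exact control name
amixer -c 0 sset 'Independent HP' ON  # toggles, but no audible change
```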

I can provide configuration files or other info if needed, but since they are all pretty long I didn't include them in the post. Also, I haven't edited them, so they are just Fedora's defaults.

Thanks

 
 
 

Is there a way to apply an OpenGL shader to the entire screen in either GNOME or KDE using Wayland? I know Hyprland has something like that, but I don't use tiling WMs.

I have an old projector with misaligned RGB panels that I mainly use for game streaming or Jellyfin. This model in particular cannot be adjusted; you can only replace the whole prism assembly (which I have no intention of buying). But I have tested that a shader that simply samples about one pixel to the left/right is enough to fix the problem almost entirely.

Also, it would be perfect if I could also pass the shader a uniform sampler from an image file, which I need to perform some extra color correction. The green color is weaker in some areas, and I have a picture to use as a mask of those areas.
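To make it concrete, the kind of shader I mean is roughly this (GLSL sketch; the shift direction, mask name, and green gain are illustrative and would need tuning per projector):

```
#version 330
uniform sampler2D screenTex;  // current frame
uniform sampler2D greenMask;  // grayscale mask of the weak-green areas
in vec2 uv;
out vec4 fragColor;

void main() {
    vec2 px = 1.0 / vec2(textureSize(screenTex, 0)); // one pixel in UV units
    float r = texture(screenTex, uv + vec2(px.x, 0.0)).r; // red panel is shifted, so sample sideways
    float g = texture(screenTex, uv).g;                   // green is the reference panel
    float b = texture(screenTex, uv - vec2(px.x, 0.0)).b; // blue shifted the other way
    g *= 1.0 + 0.3 * texture(greenMask, uv).r;            // boost green where the mask says it's weak
    fragColor = vec4(r, g, b, 1.0);
}
```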

 

Can I submit posts from Infinity?

 

Note that this is not a request for review bombing, but rather a request for your opinion.

Despite all the rage Reddit is getting, more than half of the reviews are five stars, which means the people complaining are not expressing their opinion through the "proper channel".

And remember to be honest in the review; don't just invent problems for the sake of it. That defeats the purpose and just comes across as griefing.

 