TheMightyCat
🤷‍♂️
I'm not sure what you mean by that?
They provide an AppImage https://api.wooting.io/public/wootility/download?os=linux, as well as instructions on how to set it up https://help.wooting.io/article/147-configuring-device-access-for-wootility-under-linux-udev-rules.
Can even get it from the AUR if you want https://aur.archlinux.org/packages/wootility-appimage.
Or use the web version, but I was talking about the native Linux version.
NATIVE LINUX APP
I repeat
NATIVE LINUX APP
... and Linux – all you need is a Chromium-based browser.
That's already a lot better than most manufacturers can say.
That was a huge rant. I also don't like the Microsoft Authenticator, so guess what, I don't use it, and the issue of your private keys getting stolen if your PC is hacked has long been solved with password-protected keys.
All of these issues pretty much amount to nothing; the standard works and is more secure than passwords, for the same reason that enabling password login on SSH is not recommended.
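For example (just an illustration using the Python cryptography package, not how any particular authenticator actually stores keys), a private key encrypted at rest is useless to someone who only copies the file:

```python
# Illustration only: encrypting a private key at rest with a passphrase.
# An attacker who grabs the PEM file still needs the passphrase to use the key.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

passphrase = b"correct horse battery staple"  # placeholder passphrase

key = ed25519.Ed25519PrivateKey.generate()
encrypted_pem = key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.BestAvailableEncryption(passphrase),
)

# Loading fails without the correct passphrase.
serialization.load_pem_private_key(encrypted_pem, password=passphrase)
```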
I was getting really hyped seeing these specs, essentially a better EPYC 9575F, but only 8 channels is rough. Can't have it all I suppose.
Currently I use Code OSS, which is not my favorite, but it works.
Out of all the IDEs I've tried (VS Code, WebStorm, Code OSS, Kate, KDevelop), regular old Visual Studio 2022 is still my all-time favorite; using it is such a smooth experience.
Its biggest flaw, and why I had to switch, is no Linux support :(
I mean the US has fielded and expanded GBI for years now and no nuclear war has broken out.
As an EU citizen hearing daily how Putin threatens to kill us all, I'm glad we are investing in missile defense as well, instead of our lives being in the hands of a madman.
Windows users when they see a Wine user???
I'm glad WSL exists so I don't have to bother with Windows and people can still run my programs.
I'm trying to understand how this is used. My current self-hosted setup is quite simple:
frontend - backend - vLLM
Using the OpenAI-compatible API, vLLM returns tool calls that need to be handled by the backend, and the results are then sent to the frontend by the backend; the frontend never communicates with vLLM directly.
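Roughly, the tool-call loop in my backend looks something like this (a minimal sketch using the standard OpenAI tool-calling shape; the base URL, model name, and the get_weather tool are just placeholders, not my actual code):

```python
import json
from openai import OpenAI

# Placeholder endpoint/model for a local vLLM OpenAI-compatible server.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")

TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # made-up example tool
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def get_weather(city: str) -> str:
    # Stand-in implementation; a real backend would call its own services here.
    return json.dumps({"city": city, "temp_c": 21})

def handle_chat(messages: list[dict]) -> str:
    """Run the request/tool-call round trips; return the text the backend sends to the frontend."""
    while True:
        response = client.chat.completions.create(
            model="my-model",  # whatever model vLLM is serving
            messages=messages,
            tools=TOOLS,
        )
        message = response.choices[0].message
        if not message.tool_calls:
            # No more tool calls: this is the final answer for the frontend.
            return message.content
        # The backend (not the frontend) executes each tool call and feeds the result back.
        messages.append(message)
        for call in message.tool_calls:
            args = json.loads(call.function.arguments)
            result = get_weather(**args)  # dispatch on call.function.name in real code
            messages.append({
                "role": "tool",
                "tool_call_id": call.id,
                "content": result,
            })

print(handle_chat([{"role": "user", "content": "What's the weather in Delft?"}]))
```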
From what I can gather, AG-UI is a way for the frontend to directly communicate with (in my case) vLLM in a secure way, letting the frontend handle the tool calls?
This is why I like publishing under the LGPL: people can still use it in proprietary software, but the library itself is better protected.