this post was submitted on 13 Jul 2023
6 points (100.0% liked)

Cyberpunk 2077

top 2 comments
[email protected] 5 points 2 years ago

With local models and inference engines like llama.cpp now available, I wish the modder had instead spent his energy on models that run locally, possibly even fine-tuned on the in-game world. Instead, this mod requires a metered API with billing and an always-on network connection, while serving only a generic language model with little in-game knowledge.
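
For illustration only, here's a minimal sketch of what the local route could look like: querying a llama.cpp `llama-server` instance on your own machine instead of a metered cloud API. The port, endpoint path, system prompt, and function name are my assumptions for the example, not anything from the mod:

```python
# Hypothetical sketch: query a locally running llama.cpp server
# instead of a metered cloud API. Assumes `llama-server` is running
# at http://localhost:8080 with its OpenAI-compatible chat endpoint.
import requests

def ask_local_model(prompt: str) -> str:
    # No API key, billing, or internet connection needed for a local instance.
    resp = requests.post(
        "http://localhost:8080/v1/chat/completions",
        json={
            "messages": [
                # A fine-tuned model or a lore-heavy system prompt could
                # carry the in-game knowledge here (illustrative content).
                {"role": "system", "content": "You are an NPC in Night City."},
                {"role": "user", "content": prompt},
            ],
            "temperature": 0.7,
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_local_model("Heard any news on the street?"))
```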

[email protected] 1 point 2 years ago

Probably better to get it working first and then optimize. Most users probably won't have the performance headroom to run both the game and a local model anyway.

Excited about the possibilities of a true radiant quest system, especially combining this with VR and voice input.
