I get good mileage out of the Jan client and the Void editor. Various models will work, but Jan-4B tends to do OK; a Meta-Llama model could do alright too. The Jan client has a setting to start a local OpenAI-compatible server, and Void can be configured to point at that localhost URL and port and at specific models. If you want to go the extra mile for privacy and you're on a Linux distro, install firejail from your package manager and run both Void and Jan inside the same network namespace with outside networking disabled, so they can only talk to each other over localhost. E.g.: `firejail --noprofile --net=none --name=nameGoesHere Jan` and `firejail --noprofile --net=none --join=nameGoesHere void`, where one of them sets up the namespace (`--name=`) and the other one joins it (`--join=`).
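Spelled out as two terminal sessions, the setup looks like this; `nameGoesHere` is just a placeholder sandbox name, and the binary names (`Jan`, `void`) may differ depending on how you installed them:

```shell
# Terminal 1: start Jan in a fresh sandbox.
# --net=none creates a new, unconnected network namespace whose only
# interface is loopback, so nothing inside can reach the internet.
firejail --noprofile --net=none --name=nameGoesHere Jan

# Terminal 2: launch Void joined to that same sandbox, so it shares
# the loopback-only namespace and can reach Jan's OpenAI-compatible
# server on localhost.
firejail --noprofile --net=none --join=nameGoesHere void
```

With this arrangement, Void's localhost base URL resolves inside the shared namespace, and neither app can send your prompts anywhere else.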


Yeah, Linux likes to fill the page cache entirely. If you want it to do that less and performance isn't a concern, set up something that periodically runs this as root to drop the caches:
echo 1 > /proc/sys/vm/drop_caches
(`echo 1` drops only the page cache; `echo 3` also drops dentries and inodes. Running `sync` first flushes dirty pages so more of the cache can actually be freed.)
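For the "every few hours" part, a root cron job is the simplest option; the four-hour interval below is just an illustrative choice, not something from the original comment:

```
# Root crontab entry (edit with `sudo crontab -e`):
# at minute 0 of every 4th hour, flush dirty pages, then drop the page cache
0 */4 * * * sync && echo 1 > /proc/sys/vm/drop_caches
```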