I’d like to set up a local coding assistant so that I can stop asking Google complex questions and digging through search results.
I really don’t know what I’m doing or if there’s anything available that respects privacy. I don’t necessarily trust search results for this kind of query either.
I want to run it on my desktop: Ryzen 7 5800XT + Radeon RX 6950 XT + 32 GB of RAM. I don’t need or expect data-center performance out of this thing. I’m also a strict Sublime user, so I’d like to avoid VS Code suggestions as much as possible.
My coding laptop is an oooooold MacBook Air, so I’d like something that can be run on my desktop and used from my laptop if possible. No remote access needed, just use over the same home network.
Something like LM Studio and Qwen sounds like what I’m looking for, but since I’m unfamiliar with what exists, I figured I’d ask for Lemmy’s opinion.
Is LM Studio + Qwen a good combo for my needs? Are there alternatives?
I’m on Lemmy Connect and can’t see comments from other instances when I’m logged in, but to whoever melted down over this question, your relief is in my very first sentence:
so that I can stop asking Google complex questions and digging through search results.


I get good mileage out of the Jan client and the Void editor. Various models will work, but Jan-4B tends to do OK, and maybe a Meta Llama model could do alright too. The Jan client has settings where you can start up a local OpenAI-compatible server, and Void can be configured to point at that localhost URL + port and at specific models. If you want to go the extra mile for privacy and you’re on a Linux distro, install firejail from your package manager and run both Void and Jan inside the same namespace with outside networking disabled, so they can only talk on localhost. E.g.:
`firejail --noprofile --net=none --name=nameGoesHere Jan` and `firejail --noprofile --net=none --join=nameGoesHere void`, where one of them sets up the namespace (`--name=`) and the other one joins it (`--join=`).
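Once Jan’s local server is running, anything that can speak the OpenAI chat-completions protocol can talk to it, including a plain Python script. Here’s a minimal sketch; the port `1337` and model id `jan-v1-4b` are assumptions (check Jan’s Local API Server settings for your actual values), and from the MacBook you’d swap `localhost` for the desktop’s LAN IP and make sure the server listens on the network, not just loopback:

```python
import json
import urllib.request

# Assumed defaults -- verify these in Jan's Local API Server panel.
BASE_URL = "http://localhost:1337/v1"  # from another machine: http://<desktop-LAN-IP>:1337/v1
MODEL_ID = "jan-v1-4b"

def build_request(prompt, model=MODEL_ID):
    """Build an OpenAI-style chat-completions request for the local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

# To actually send it (requires the server to be up):
# with urllib.request.urlopen(build_request("Explain list comprehensions")) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```

Note that if you sandbox the server with `--net=none` as above, only processes inside the same firejail namespace can reach it, so a LAN setup and the firejail lockdown are mutually exclusive — pick one depending on how paranoid you’re feeling.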