I actually started running this in the middle of last week on a Minisforum UM790 Pro (Ryzen 9, 64 GB of RAM, 1 TB SSD).
It's awesome. Setup was surprisingly easy, and being able to park the mini PC in my closet and talk to the agent from my laptop via the relay is fantastic. I have built some games, a couple of websites, and other fun little projects.
Qwen is a great model. The code it produces is top notch for local LLMs.
Bonus points for the iPhone app: it's so cool to chat with my own server running in my closet at home without any extra setup or opening any ports.
This is good stuff! Big win for local agentic coding!
Including OpenCode in the "how it stacks up" comparison is a bit misleading, since OpenCode is just the agent and can be used with many other providers; "Zen" is their in-house provider.
I have been using this since it dropped last week. Super interesting project. Obviously not perfect yet, but it has a ton of potential. I have been cranking through some projects, and the best part is I can leave it running 24x7, guilt free!
I'm using this right now with an RTX A5000 (24 GB VRAM) for a few .NET projects at work. It is the first local LLM setup I have used that produces usable code.
Looks and sounds interesting... Is there anything beyond glue that makes the Qwen models it uses better for development than what you get with local models through Ollama in an IDE or editor of your choice?
We have engineered tweaks at each layer. But it is a full, open-source agent with subagents, so you control every layer of the stack. Plus it provides a free dual-box setup: you can leave inference running at home and use the agent remotely from anywhere, which is our custom setup and very handy.
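For readers curious what a dual-box setup looks like in general terms, here is a minimal sketch using a plain SSH tunnel. This is not how the product's relay works; the hostname, port, and environment variable below are placeholders, and the exact base-URL variable depends on which agent you use.

```shell
# On the laptop: forward a local port to the inference server on the home box.
# "home-box" and port 8000 are hypothetical placeholder values.
ssh -N -L 8000:localhost:8000 user@home-box

# In another terminal, point your agent at the tunneled endpoint.
# The variable name varies by tool; many OpenAI-compatible clients use this one.
export OPENAI_BASE_URL=http://localhost:8000/v1
```

The relay described in the thread removes even this step (no open ports, no manual tunnel), but the sketch shows the basic idea: heavy inference stays on the home machine while the lightweight agent runs wherever you are.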