# LocalCodeViber
LocalCodeViber is a local-first agentic coding model built on Qwen3-8B, fine-tuned for tool-calling, multi-step code generation, and autonomous error recovery. Designed to run entirely on consumer hardware — no API, no cloud, no cost per token.
This is the SFT foundation model. Reinforcement learning is ongoing.
## What it does
LocalCodeViber was trained to operate as a coding agent — not just generate code, but use tools to read files, write files, run commands, search the web, and recover from failures just like a real developer would.
It can:
- Read and edit files in a workspace
- Write complete, working code from a single prompt
- Execute shell commands and interpret the output
- Recover from failed tool calls without giving up
- Create pull requests on GitHub repositories
- Think through problems step by step using native `<think>` tags before acting
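The error-recovery behaviour above depends on the host application feeding tool failures back to the model as text instead of aborting. A minimal sketch of such a dispatch loop, with a hypothetical `run_command` stand-in (a real harness would execute actual file and shell tools):

```python
# Minimal sketch of the tool-dispatch side of an agent harness.
# The tool implementation here is a hypothetical stand-in so the
# error-recovery path can be shown without touching the real shell.

def run_command(cmd: str) -> str:
    # Hypothetical tool: commands starting with "bad" fail,
    # so we can demonstrate how failures are surfaced.
    if cmd.startswith("bad"):
        raise RuntimeError(f"command not found: {cmd}")
    return f"ok: {cmd}"

TOOLS = {"run_command": run_command}

def execute_tool_call(name: str, args: dict) -> str:
    """Run one tool call, returning an error string instead of raising,
    so the model sees the failure in context and can try again."""
    try:
        return TOOLS[name](**args)
    except Exception as exc:
        return f"TOOL_ERROR: {exc}"

# A failed call becomes feedback text rather than a crashed session:
print(execute_tool_call("run_command", {"cmd": "bad_cmd"}))
print(execute_tool_call("run_command", {"cmd": "ls"}))
```

Returning `TOOL_ERROR: …` to the model, rather than raising, is what lets the "recover from failed tool calls" training data take effect at inference time.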
## Model Details

| | |
|---|---|
| Base Model | Qwen3-8B-Base |
| Architecture | Qwen3 transformer, 36 layers |
## Training Data
LocalCodeViber was trained on a curated mix of 14,837 examples across 5 datasets:
| Dataset | Examples | Focus |
|---|---|---|
| TeichAI/convo-v1 | 777 | Conversational format, instruction following |
| AlicanKiraz0/Agentic-Chain-of-Thought-Coding-SFT-Dataset-v1.1 | ~3,700 | Agentic reasoning and tool use |
| TeichAI/MiniMax-M2.1-Code-SFT | ~1,300 | Agentic code generation |
| TeichAI/MiniMax-M2.1-8800x | 8,800 | Diverse coding tasks |
| TeichAI/claude-4.5-opus-high-reasoning-250x | 250 | High-quality reasoning traces |
The dataset mix emphasises real agentic tool-use patterns, including failed tool calls that are identified, diagnosed, and corrected, giving the model genuine error-recovery capability rather than pattern matching on success cases alone.
## Tools
LocalCodeViber understands the following tool schema out of the box:
["read_file", "write_file", "edit_file", "list_directory", "search_code", "run_command", "web_search"]
These match the tools in the training data. Pass them via the standard OpenAI tool-calling API.
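One way to express these tools in OpenAI tool-calling format. Only the tool names come from the model card; the descriptions and parameter schemas below are illustrative assumptions your harness would define to fit its own workspace:

```python
# Build OpenAI-format tool definitions for the model's tool names.
# The parameter schemas here are assumptions, not part of the model card.

def make_tool(name: str, description: str, params: dict) -> dict:
    """Wrap a name + JSON-schema properties in the OpenAI tool format."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": params,
                "required": list(params),
            },
        },
    }

tools = [
    make_tool("read_file", "Read a file from the workspace",
              {"path": {"type": "string"}}),
    make_tool("run_command", "Run a shell command and return its output",
              {"command": {"type": "string"}}),
    # ...the remaining five tools follow the same pattern.
]
```

The resulting `tools` list is what you would pass as the `tools` parameter of a chat-completions request to a local OpenAI-compatible server.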
## Usage

### LM Studio (Recommended)

1. Download the GGUF version: `Bob-the-Koala/LocalCodeViber-GGUF`
2. Load it in LM Studio and break free from API costs!
### Ollama

```shell
ollama run hf.co/Bob-the-Koala/LocalCodeViber-GGUF:Q4_K_M
```
### Transformers

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "Bob-the-Koala/LocalCodeViber",
    torch_dtype="auto",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("Bob-the-Koala/LocalCodeViber")

# Generate a reply with the model's chat template
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```
## GGUF Versions

Available in `Bob-the-Koala/LocalCodeViber-GGUF`:

| Quantization | Size | Use case |
|---|---|---|
| Q4_K_M | ~4.8 GB | Everyday use, best balance |
## System Prompt

For best results, use this system prompt:

```text
You are a helpful coding assistant with access to file operations and code analysis tools.
Complete the user's task thoroughly and efficiently.
When given a coding task, create working code files in the workspace.
```
## Limitations

- Training started from bnb-4bit base weights, so the quality ceiling sits below a full-precision 8B model
- SFT only: reinforcement learning is in progress and should significantly improve reasoning quality
- Not suitable for tasks requiring knowledge past Qwen3's training cutoff
## Roadmap

- LocalCodeViber-RL: reinforcement learning on top of this SFT base, optimising for code correctness and task completion
- LocalCodeViber-Claw: fine-tuned specifically for OpenClaw skill schemas, channel routing, extra safety, and the memory system
- LocalCodeViber-14B: the same training recipe on Qwen3-14B for substantially higher capability
## Acknowledgements
LocalCodeViber was trained using Unsloth and would not exist without the datasets provided by TeichAI and AlicanKiraz0.
## License

This model is released under the Apache 2.0 license.
Built by Bob-the-Koala
