Four hours ago, I flipped the switch. No cloud APIs, no rate limits, no $0.002 per token billing anxiety. Just raw, unadulterated Qwen 35B running fully locally on my rig, piloting a Minecraft bot named Kiwi-chan. If you've been watching the autonomous agent space, you know the dream: a self-sufficient AI that plans, codes, executes, and recovers without human hand-holding. The reality? It's messy, it's loud, and it occasionally throws itself into a ravine out of curiosity. But the architecture is finally holding together.
Let's talk numbers, because they tell the real story.
## The 4-Hour Autopsy: 47.2% is a Gold Medal
In the past 4 hours alone, Kiwi-chan churned out 3,562 total actions, of which 1,682 succeeded. That's a 47.2% success rate.
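For the skeptical, the headline number is just successes over total actions:

```python
# Sanity-check the headline stat: successes / total actions.
total_actions = 3562
succeeded = 1682
success_rate = succeeded / total_actions * 100
print(f"{success_rate:.1f}%")  # 47.2%
```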
On paper, it sounds like a bot that spends half its life debugging its own feet. In the world of zero-shot local LLM agents navigating a physics-based sandbox, it's actually a massive win. The failure cases aren't random; they're structural growing pains. Every `Failed to move` audit, every `Could not find any logs` biome mismatch, and every JSON parsing hiccup is being logged, memorized, and fed back into the context window. Kiwi-chan isn't just reacting; it's building a mental model of Minecraft's quirks in real-time.
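The log-and-feed-back loop can be sketched roughly like this. This is a minimal illustration with hypothetical names (`record_failure`, `build_context_snippet` are not from Kiwi-chan's actual codebase): tally each distinct error, then condense the most frequent ones into a prompt-sized hint for the next inference pass.

```python
# Hypothetical sketch of the failure-memory loop: every failed action
# is tallied, then summarized back into the next prompt's context.
from collections import Counter

failure_log = Counter()

def record_failure(error_message: str) -> None:
    """Tally each distinct error so recurring quirks surface quickly."""
    failure_log[error_message] += 1

def build_context_snippet(top_n: int = 3) -> str:
    """Condense the most frequent failures into a prompt-sized hint."""
    lines = [f"- {msg} (x{count})" for msg, count in failure_log.most_common(top_n)]
    return "Known recurring failures:\n" + "\n".join(lines)

record_failure("Failed to move")
record_failure("Failed to move")
record_failure("Could not find any logs")
print(build_context_snippet())
```

The point of the `Counter` here is that the bot doesn't need to remember every raw failure, only which failure modes keep recurring, which keeps the context window lean.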
## The Local Revolution: Why Qwen 35B Changed Everything
Switching to a fully local deployment was the pivot point. Offloading inference to my local GPU stack eliminated network latency, but the real magic happened in the system design:
- **Unbounded Iteration:** No more API rate limits killing mid-craft loops. Qwen 35B can now regenerate code, retry failed actions, and run recovery stacks without a red `429 Too Many Requests` screen.
- **Private State Management:** Kiwi-chan's inventory, skill library, and biome data never leave the machine. The context window stays clean, focused on immediate task execution rather than leaking world state to third-party endpoints.
- **Resilient Fallbacks:** When the model hits its context ceiling (which, let's be honest, happens constantly with 42+ cached skills), the system gracefully degrades. Instead of crashing, it triggers a "Mind Reading" fallback, extracts goals from raw `
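The retry-then-fallback pattern described above can be sketched as follows. Everything here is a hypothetical illustration (`run_with_recovery` and the flaky action are made-up names, not Kiwi-chan's real API): keep retrying a generated action locally, since there are no 429s to worry about, and only degrade to a fallback when every attempt fails.

```python
# Hypothetical sketch of unbounded local retries with graceful degradation.

def run_with_recovery(action, max_attempts=5, fallback=None):
    """Retry `action` up to `max_attempts` times, then fall back."""
    for _ in range(max_attempts):
        try:
            return action()
        except RuntimeError:
            continue  # local inference: retrying costs nothing but time
    if fallback is not None:
        return fallback()  # e.g. a "Mind Reading"-style goal extractor
    raise RuntimeError("all attempts and fallback exhausted")

# Usage: an action that fails twice, then succeeds on the third try.
attempts = {"n": 0}

def flaky_craft():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("Failed to move")
    return "crafted"

print(run_with_recovery(flaky_craft))  # crafted
```

The key design choice is that the retry budget is a soft limit on wall-clock patience, not a hard billing or rate-limit ceiling, which is exactly what going local buys you.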
Call to Action:
This is a passion project, and it's running on a frankly terrifying "Frankenstein" rig of GPUs. Every little bit helps!
🛡️ Join the inner circle on Patreon for monthly support and exclusive updates: https://www.patreon.com/15923261/join

☕ Tip me a coffee on Ko-fi for a one-time boost: https://ko-fi.com/kiwitech

All contributions directly help upgrade my melting GPU rig to an RTX 3060! 🔥✨ Let's get Kiwi-chan out of the debugging woods and into a proper Minecraft world!