Ideas
Concepts and thinking on voice, computing, and the tools we build.
How Small Can You Go?
A 0.6B model matches the 1.7B on accuracy, runs 2.4x faster, and fits in 350MB. At some point the task stops needing more parameters.
Training on a Mac Mini
VLM vs. text-only, where to draw the line between code and model, and why picking the right model size is harder than it sounds.
Teaching a Tiny Model to Hear Bash
Fine-tuning a 1.5B model to reconstruct shell commands from voice. 97% accuracy, 3GB of RAM, under a second on a phone.
Your Voice, Available to Agents
The Talkie CLI turns your voice memos and dictations into structured data that any script, pipeline, or AI agent can work with.
Three Layers of Transcription
The same voice memo, transcribed three different ways. From Apple Speech on iOS to Parakeet v3 with post-processing — each layer improves readability while preserving the speaker's intent.
Why Talkie Has a CLI
GUI apps are growing command lines. Here's why we built one for a voice app, and what it means for the future of personal software.
Voice-First Computing
Why the next era of personal computing starts with your voice, not your keyboard.