Local-first AI coding assistant powered by Ollama LLMs
Privacy-first AI that runs entirely on your machine. Project-aware context, two-phase planning, and safe file operations.
$ devbldr /path/to/project
Indexing project... done
Found 42 files, 156 symbols
> Add authentication to the user API_
DevBldr uses Ollama to run AI models locally, keeping your code private while providing intelligent assistance for planning, writing, and refactoring.
Uses Ollama for privacy-first AI inference. Your code never leaves your machine.
Automatically indexes your codebase using tree-sitter and builds intelligent context for each request.
Plans changes before implementing them. Review and approve the plan before any code is written.
Tree-sitter parsing for C, C++, Python, JavaScript, TypeScript, Rust, Go, Java, and more.
Colorized diffs, approval workflow, and automatic backups. No changes without your consent.
Recognizes frameworks including Flask, Django, FastAPI, Express, Next.js, React, and Vue.
Get started in seconds on your Mac
# Add the tap and install
$ brew tap jefferyabbott/devbldr
$ brew install devbldr
# Or install in one command
$ brew install jefferyabbott/devbldr/devbldr
Make sure Ollama is installed and running before launching DevBldr.
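DevBldr talks to a local Ollama server, so Ollama needs to be installed, running, and have at least one model pulled before the first run. A minimal setup might look like this (the model name below is an example, not necessarily DevBldr's default; check the docs for the recommended model):

```shell
# Install and start Ollama (macOS via Homebrew)
$ brew install ollama
$ brew services start ollama   # or run `ollama serve` in a separate terminal

# Pull a code-capable model (example; substitute your preferred model)
$ ollama pull qwen2.5-coder

# Verify the server is responding
$ curl http://localhost:11434/api/tags
```

Once `curl` returns a JSON list of installed models, the server is ready and DevBldr can connect to it.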
If you find DevBldr useful, consider buying the developer a coffee!
☕ Buy Me a Coffee