Ollama Code CLI is an open-source AI agent that brings the power of local LLMs to your terminal through Ollama, with advanced tool-calling features.
- Features
- Installation
- Requirements
- Usage
- Available Tools
- Examples
- Interactive Mode
- Project Structure
- Dependencies
- Contributing
- License
## Features

- 🎨 **Elegant CLI Interface**: Rich colors and structured output
- 🤖 **Local AI Power**: Interact with local LLMs through Ollama
- 🛠️ **Tool Calling**: Execute coding-related tools (file operations, code execution, etc.)
- 🔒 **Permission Prompts**: Safety prompts before executing potentially dangerous operations
- 💬 **Interactive Mode**: Maintain conversation context for multi-turn interactions
- 📝 **Markdown Support**: Elegantly formatted responses with syntax highlighting
- 📋 **Structured Output**: Clear panels and tables for tool calls and results
## Installation

First, install a compatible model in Ollama:
```bash
# Choose one of these models:
ollama pull qwen3:4b
ollama pull qwen2.5:3b
```

Then install the CLI:

```bash
pip install ollama-code-cli
```

If you see an error like `externally-managed-environment` when using `pip3 install`, it is because modern macOS and Homebrew Python prevent global package installation. Use one of these methods instead:
**Option 1: Use pipx**

```bash
brew install pipx
pipx ensurepath
pipx install ollama-code-cli
```

**Option 2: Use a virtual environment**

```bash
python3 -m venv ~/.venvs/ollama-code-cli
source ~/.venvs/ollama-code-cli/bin/activate
pip install ollama-code-cli
```

**Option 3: Use uv (fast Python package manager)**

```bash
brew install uv
uv tool install ollama-code-cli
```

## Requirements

- Python 3.13+
- Ollama installed and running
- An Ollama model that supports tool calling (e.g., Qwen3 or Qwen2.5)
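To confirm that the Ollama server is reachable and see which models you have pulled, the Ollama Python client offers a quick check. A minimal sketch, assuming `pip install ollama` and a recent version of the client:

```python
# Sanity check: list locally pulled models via the Ollama Python client.
# Assumes the Ollama server is running on its default port (11434).
import ollama

for model in ollama.list().models:
    print(model.model)  # e.g., "qwen3:4b"
```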
## Usage

Start an interactive session:
```bash
ollama-code-cli --model qwen3:4b
```

Run a single command:

```bash
ollama-code-cli "Create a Python function to calculate factorial"
```

Use a specific model:

```bash
ollama-code-cli --model qwen3:4b "Explain how async/await works in Python"
```

Disable permission prompts (use with caution):

```bash
ollama-code-cli --no-permission "Create and run a Python script"
```

The CLI includes built-in security features. By default, it asks for your permission before executing potentially dangerous operations such as:
- Writing or modifying files
- Executing code
- Running shell commands
- Running Python files
These operations are considered safe and don't require permission:
- Reading files
- Listing directory contents
You can disable permission prompts with the `--no-permission` flag:

```bash
ollama-code-cli --no-permission "Your prompt here"
```

**Warning:** Disabling permission prompts allows the AI to execute operations without user confirmation. Only use this in trusted environments.
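As a rough illustration of how such a permission gate can work, here is a minimal sketch using `click.confirm`; `call_tool` and `DANGEROUS_TOOLS` are hypothetical names for demonstration, not the project's actual internals:

```python
# Illustrative permission gate: confirm before running dangerous tools.
# call_tool and DANGEROUS_TOOLS are hypothetical, not the CLI's real code.
import click

DANGEROUS_TOOLS = {"write_file", "execute_code", "run_command"}

def call_tool(name: str, args: dict, registry: dict, require_permission: bool = True):
    """Run a registered tool, asking for confirmation if it is dangerous."""
    if require_permission and name in DANGEROUS_TOOLS:
        if not click.confirm(f"Allow tool '{name}' with args {args}?"):
            return "Tool call denied by user."
    return registry[name](**args)
```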
## Available Tools

- `read_file`: Read the contents of a file
- `write_file`: Write content to a file
- `execute_code`: Execute code in a subprocess
- `list_files`: List files in a directory
- `run_command`: Run a shell command
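Under the hood, tools like these are advertised to the model through Ollama's tool-calling API. A minimal sketch of what registering `read_file` with the Ollama Python client can look like (illustrative only; the project's actual definitions live in `tool_manager.py` and may differ):

```python
# Illustrative: advertising a read_file tool to the model via ollama.chat.
import ollama

def read_file(path: str) -> str:
    """Read and return the contents of a file."""
    with open(path, "r", encoding="utf-8") as f:
        return f.read()

response = ollama.chat(
    model="qwen3:4b",
    messages=[{"role": "user", "content": "Read main.py and summarize it"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "read_file",
            "description": "Read the contents of a file",
            "parameters": {
                "type": "object",
                "properties": {"path": {"type": "string", "description": "Path to the file"}},
                "required": ["path"],
            },
        },
    }],
)

# If the model decided to call the tool, execute it locally.
for call in response.message.tool_calls or []:
    if call.function.name == "read_file":
        print(read_file(**call.function.arguments))
```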
## Examples

1. Create a Python script and save it to a file:

```bash
ollama-code-cli "Create a Python script that calculates factorial and save it to a file named factorial.py"
```

2. Read a file and explain its contents:

```bash
ollama-code-cli "Read the contents of main.py and explain what it does"
```

3. Execute a shell command:

```bash
ollama-code-cli "List all files in the current directory"
```

## Interactive Mode

Launch interactive mode for a conversational experience:

```bash
ollama-code-cli
```

In interactive mode, you can:
- Have multi-turn conversations with the AI
- See elegantly formatted responses with Markdown support
- Watch tool calls and results in real-time with visual panels
- Clear the conversation history with the `clear` command
- Exit gracefully with the `exit` command
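Behind the scenes, multi-turn context is just a growing message list. A stripped-down sketch of such a loop, using only the Ollama Python client (not the project's actual code, which also handles tool calls and Rich rendering):

```python
# Minimal multi-turn REPL sketch with conversation history.
import ollama

messages = []
while True:
    user_input = input("> ").strip()
    if user_input == "exit":
        break
    if user_input == "clear":
        messages.clear()
        continue
    messages.append({"role": "user", "content": user_input})
    response = ollama.chat(model="qwen3:4b", messages=messages)
    reply = response.message.content
    messages.append({"role": "assistant", "content": reply})
    print(reply)
```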
## Project Structure

```
ollama-code-cli/
├── ollama_code_cli/
│   ├── __init__.py
│   ├── cli/
│   │   ├── __init__.py
│   │   └── cli.py            # Main CLI interface
│   └── tools/
│       ├── __init__.py
│       └── tool_manager.py   # Tool implementations
├── pyproject.toml            # Project configuration
├── LICENSE
└── README.md
```
## Dependencies

- Rich — Elegant terminal formatting
- Click — Command-line interface
- Ollama Python Client — Ollama integration
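To give a feel for the structured output, here is a tiny Rich example of framing a tool result in a panel (purely illustrative; the CLI's actual rendering code may differ):

```python
# Illustrative: a Rich panel in the spirit of the CLI's tool-result output.
from rich.console import Console
from rich.panel import Panel

console = Console()
console.print(Panel("Wrote 214 bytes to factorial.py", title="write_file", border_style="green"))
```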
## Contributing

Contributions are welcome! Please open an issue or submit a pull request for any improvements, bug fixes, or suggestions.
## License

This project is licensed under the MIT License.