Ollama Code CLI

PyPI version Python 3.13+ License: MIT

Ollama Code CLI is an open-source AI agent that brings the power of local LLMs, via Ollama, right into your terminal, with advanced tool-calling features.


Features

  • 🎨 Elegant CLI Interface: Rich colors and structured output
  • 🤖 Local AI Power: Interact with local LLMs through Ollama
  • 🛠️ Tool Calling: Execute coding-related tools (file operations, code execution, etc.)
  • 🔒 Permission Prompts: Safety prompts before executing potentially dangerous operations
  • 💬 Interactive Mode: Maintain conversation context for multi-turn interactions
  • 📝 Markdown Support: Elegantly formatted responses with syntax highlighting
  • 📋 Structured Output: Clear panels and tables for tool calls and results
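The tool-calling flow in the feature list can be sketched as follows. This is a hypothetical illustration, not the package's actual code: a model reply that requests tool calls is dispatched to local handler functions, and each result is fed back into the conversation for the next model turn.

```python
# Hypothetical sketch of a tool-calling loop (not the CLI's actual code).
# The model's reply names a tool and its arguments; we run the matching
# handler and collect role="tool" messages to send back to the model.

def read_file(path: str) -> str:
    """Return the text contents of a file."""
    with open(path, encoding="utf-8") as f:
        return f.read()

# Registry mapping tool names (as advertised to the model) to handlers.
TOOLS = {"read_file": read_file}

def dispatch(tool_calls: list[dict]) -> list[dict]:
    """Run each requested tool and collect its result as a tool message."""
    results = []
    for call in tool_calls:
        handler = TOOLS[call["name"]]
        output = handler(**call["arguments"])
        results.append({"role": "tool", "name": call["name"], "content": output})
    return results
```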

Installation

First, pull a compatible model in Ollama:

# Choose one of these models:
ollama pull qwen3:4b
ollama pull qwen2.5:3b

Then install the CLI:

pip install ollama-code-cli

macOS Installation Note

If you see an error like "externally-managed-environment" when using pip3 install, this is because modern macOS and Homebrew Python prevent global package installation. Use one of these methods instead:

Option 1: Use pipx

brew install pipx
pipx ensurepath
pipx install ollama-code-cli

Option 2: Use a virtual environment

python3 -m venv ~/.venvs/ollama-code-cli
source ~/.venvs/ollama-code-cli/bin/activate
pip install ollama-code-cli

Option 3: Use uv (a fast Python package manager)

brew install uv
uv tool install ollama-code-cli

Requirements

  • Python 3.13+
  • Ollama installed and running
  • An Ollama model that supports tool calling (e.g., Qwen3 or Qwen2.5)
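As a quick preflight for the "Ollama installed and running" requirement, you can check that a server is answering before launching the CLI. This helper is a generic illustration, not part of the package; 11434 is Ollama's default port.

```python
# Preflight check: is an Ollama server answering at the given address?
# 11434 is Ollama's default port. This helper is illustrative only and
# is not part of the ollama-code-cli package.
import urllib.error
import urllib.request

def ollama_reachable(base_url: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an HTTP server responds with 200 at base_url."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```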

Usage

Start an interactive session:

ollama-code-cli --model qwen3:4b

Run a single command:

ollama-code-cli "Create a Python function to calculate factorial"

Use a specific model:

ollama-code-cli --model qwen3:4b "Explain how async/await works in Python"

Disable permission prompts (use with caution):

ollama-code-cli --no-permission "Create and run a Python script"

Security Features

The CLI includes built-in security features to protect against potentially dangerous operations:

Permission Prompts

By default, the CLI will ask for your permission before executing potentially dangerous operations such as:

  • Writing or modifying files
  • Executing code
  • Running shell commands
  • Running Python files

Safe Operations

These operations are considered safe and don't require permission:

  • Reading files
  • Listing directory contents
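The safe/dangerous split above could be expressed as a simple allowlist gate. The tool names follow the Available Tools list, but the function itself is a hypothetical sketch, not the package's actual implementation:

```python
# Hypothetical permission gate mirroring the safe/dangerous split above.
# Only read-only tools skip the confirmation prompt; the --no-permission
# flag bypasses prompting entirely.

SAFE_TOOLS = {"read_file", "list_files"}  # read-only operations

def needs_permission(tool_name: str, no_permission: bool = False) -> bool:
    """Return True when the user should be prompted before running the tool."""
    if no_permission:  # --no-permission bypasses all prompts
        return False
    return tool_name not in SAFE_TOOLS
```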

Bypassing Permission Prompts

You can disable permission prompts using the --no-permission flag, but this should be used with caution:

ollama-code-cli --no-permission "Your prompt here"

Warning: Disabling permission prompts allows the AI to execute operations without user confirmation. Only use this in trusted environments.


Available Tools

  • read_file: Read the contents of a file
  • write_file: Write content to a file
  • execute_code: Execute code in a subprocess
  • list_files: List files in a directory
  • run_command: Run a shell command
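Tools like these are typically advertised to the model as JSON-schema function definitions, the shape Ollama's chat API accepts for tool calling. The schema below for read_file is an assumed example, not the package's actual definition:

```python
# Assumed JSON-schema definition for the read_file tool, in the shape
# Ollama's chat API accepts for tool calling (not the package's real code).
READ_FILE_TOOL = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read the contents of a file",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Path to the file"},
            },
            "required": ["path"],
        },
    },
}
```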

Examples

1. Create a Python script and save it to a file:

ollama-code-cli "Create a Python script that calculates factorial and save it to a file named factorial.py"

2. Read a file and explain its contents:

ollama-code-cli "Read the contents of main.py and explain what it does"

3. Execute a shell command:

ollama-code-cli "List all files in the current directory"

Interactive Mode

Launch the interactive mode for a conversational experience:

ollama-code-cli

In interactive mode, you can:

  • Have multi-turn conversations with the AI
  • See elegantly formatted responses with Markdown support
  • Watch tool calls and results in real-time with visual panels
  • Clear conversation history with the clear command
  • Exit gracefully with the exit command
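The clear and exit handling above can be sketched as a small command filter. This is a hypothetical illustration of the behavior, not the CLI's actual implementation:

```python
# Hypothetical sketch of interactive-mode command handling:
# "clear" resets the conversation history, "exit" ends the session,
# and anything else is treated as a prompt for the model.

def handle_input(line: str, history: list[dict]) -> str:
    cmd = line.strip().lower()
    if cmd == "exit":
        return "quit"
    if cmd == "clear":
        history.clear()  # drop the conversation context
        return "cleared"
    history.append({"role": "user", "content": line})
    return "prompt"
```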

Project Structure

ollama-code-cli/
├── ollama_code_cli/
│   ├── __init__.py
│   ├── cli/
│   │   ├── __init__.py
│   │   └── cli.py          # Main CLI interface
│   └── tools/
│       ├── __init__.py
│       └── tool_manager.py # Tool implementations
├── pyproject.toml          # Project configuration
├── LICENSE
└── README.md

Contributing

Contributions are welcome! Please open an issue or submit a pull request for any improvements, bug fixes, or suggestions.


License

This project is licensed under the MIT License.
