# oli - Open Local Intelligent assistant

oli is an open-source alternative to Claude Code with powerful agentic capabilities for coding assistance.

Features:
- A modern hybrid architecture:
  - Rust backend for performance and core functionality
  - React/Ink frontend for a beautiful, interactive terminal UI
- Support for both cloud APIs (Anthropic Claude 3.7 Sonnet, OpenAI GPT-4o, and Google Gemini) and local LLMs (via Ollama)
- Strong agentic capabilities, including file search, editing, and command execution
- Tool use support across all model providers (Anthropic, OpenAI, Google, and Ollama)
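Supporting tool use across all four providers implies some provider-neutral tool description that each adapter translates into its provider's function-calling schema. A minimal sketch of what such an abstraction could look like (the trait and method names are hypothetical, not oli's actual internals):

```rust
use serde_json::Value;

/// A tool the agent can invoke, described in a provider-neutral way.
/// Each provider adapter translates this into its own function-calling format.
pub trait Tool {
    /// Unique tool name the model refers to, e.g. "read_file".
    fn name(&self) -> &str;
    /// Human-readable description shown to the model.
    fn description(&self) -> &str;
    /// JSON Schema describing the tool's arguments.
    fn parameters_schema(&self) -> Value;
    /// Execute the tool with the arguments the model supplied.
    fn call(&self, args: Value) -> anyhow::Result<Value>;
}
```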
⚠️ This project is in a very early stage and is prone to bugs and issues! Please file issues as you encounter them.
## Installation

### From Source
```sh
# Clone the repository
git clone https://github.com/amrit110/oli
cd oli

# Build both the backend and the frontend
./build.sh

# Run the hybrid application
./run.sh
```
## Environment Setup

### Development Setup
```sh
# Install Python dependencies (for pre-commit)
python -m pip install uv
uv venv
uv pip install -e .

# Install pre-commit hooks
pre-commit install

# Run Rust linting and formatting
cargo fmt
cargo clippy

# Run TypeScript checks in the UI directory
cd ui
npm run lint
npm run format
```
### Cloud API Models
For API-based features, set up your environment variables:
```sh
# Create a .env file in the project root
echo "ANTHROPIC_API_KEY=your_key_here" > .env
# OR
echo "OPENAI_API_KEY=your_key_here" > .env
# OR
echo "GEMINI_API_KEY=your_key_here" > .env
```
### Using Anthropic Claude 3.7 Sonnet (Recommended)
Claude 3.7 Sonnet provides the most reliable and advanced agent capabilities:
1. Obtain an API key from Anthropic
2. Set the `ANTHROPIC_API_KEY` environment variable
3. Select the "Claude 3.7 Sonnet" model in the UI
This implementation includes:
- Optimized system prompts for Claude 3.7
- JSON schema output formatting for structured responses
- Improved error handling and retry mechanisms
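For context on what a request to this model involves, here is a bare-bones call to Anthropic's Messages API in Rust. This is a sketch using `reqwest` and `serde_json`, not oli's internal client, and the model id is an assumption; check Anthropic's documentation for the current one.

```rust
// Cargo.toml: reqwest = { version = "0.12", features = ["blocking", "json"] }, serde_json = "1"
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("ANTHROPIC_API_KEY")?;
    let body = json!({
        // Model id is an assumption; see Anthropic's docs for the current name.
        "model": "claude-3-7-sonnet-20250219",
        "max_tokens": 1024,
        "system": "You are a coding assistant. Reply concisely.",
        "messages": [{ "role": "user", "content": "Summarize the Cargo.toml file" }]
    });
    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post("https://api.anthropic.com/v1/messages")
        .header("x-api-key", api_key)
        .header("anthropic-version", "2023-06-01")
        .json(&body)
        .send()?
        .error_for_status()?
        .json()?;
    // For a simple text reply, the assistant's answer is in content[0].text.
    println!("{}", resp["content"][0]["text"]);
    Ok(())
}
```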
### Using Ollama Models
oli supports local models through Ollama:
1. Install Ollama if you haven't already
2. Start the Ollama server:

   ```sh
   ollama serve
   ```

3. Pull the model you want to use (we recommend models with tool-use capabilities):

   ```sh
   # Examples of compatible models
   ollama pull qwen2.5-coder:14b
   ollama pull qwen2.5-coder:3b
   ollama pull llama3:8b
   ```

4. Start oli and select the Ollama model from the model selection menu
Note: For best results with tool use and agent capabilities, use a model such as Qwen 2.5 Coder that supports function calling.
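Assuming the server is running on its default port (11434), a minimal non-streaming chat request with a single declared tool could look like the following in Rust. The tool name and the `reqwest`-based client are illustrative, not oli's actual code.

```rust
// Cargo.toml: reqwest = { version = "0.12", features = ["blocking", "json"] }, serde_json = "1"
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // One chat request against a local Ollama server, declaring one tool
    // so a function-calling model (e.g. qwen2.5-coder) can invoke it.
    let body = json!({
        "model": "qwen2.5-coder:14b",
        "stream": false,
        "messages": [{ "role": "user", "content": "List the files in this project" }],
        "tools": [{
            "type": "function",
            "function": {
                "name": "list_files", // hypothetical tool name
                "description": "List files in the working directory",
                "parameters": { "type": "object", "properties": {} }
            }
        }]
    });
    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post("http://localhost:11434/api/chat")
        .json(&body)
        .send()?
        .json()?;
    // A tool-capable model replies with message.tool_calls instead of plain text.
    println!("{}", serde_json::to_string_pretty(&resp["message"])?);
    Ok(())
}
```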
## Usage
1. Start the application:

   ```sh
   ./run.sh
   ```

2. Select a model:
   - Cloud models (Claude 3.7 Sonnet, GPT-4o, Gemini 2.5) for full agent capabilities
   - Local models via Ollama (Qwen, Llama, etc.)

3. Make your coding query in the chat interface:
   - Ask for file searches
   - Request code edits
   - Execute shell commands
   - Get explanations of code
## Architecture
The application uses a hybrid architecture:
```
┌───────────────┐        ┌───────────────┐
│ React + Ink UI│◄───────┤ Rust Backend  │
│               │  JSON  │               │
│ - UI          │  RPC   │ - Agent       │
│ - Task Display│        │ - Tool Exec   │
│ - Loading     │        │ - Code Parse  │
└───────────────┘        └───────────────┘
```
- Rust Backend: Handles agent functionality, tool execution, and API calls
- React/Ink Frontend: Provides a modern, interactive terminal interface with smooth animations
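The JSON RPC boundary means every UI action travels as a small serializable message. A hedged sketch of what a request/response pair might look like on the Rust side (the field and method names are illustrative, not oli's actual protocol):

```rust
// Cargo.toml: serde = { version = "1", features = ["derive"] }, serde_json = "1"
use serde::{Deserialize, Serialize};

/// One request from the Ink UI to the Rust backend (illustrative fields).
#[derive(Serialize, Deserialize, Debug)]
struct RpcRequest {
    id: u64,        // correlates the response with the request
    method: String, // e.g. "agent.query" or "tools.run" (hypothetical names)
    params: serde_json::Value,
}

/// The backend's reply, sent back over the same channel.
#[derive(Serialize, Deserialize, Debug)]
struct RpcResponse {
    id: u64,
    result: Option<serde_json::Value>,
    error: Option<String>,
}

fn main() -> serde_json::Result<()> {
    let req = RpcRequest {
        id: 1,
        method: "agent.query".into(),
        params: serde_json::json!({ "prompt": "List all files in the project" }),
    };
    // The UI would write this line to the backend's stdin (or a socket).
    println!("{}", serde_json::to_string(&req)?);
    Ok(())
}
```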
## Examples
Here are some example queries to try:
- "Explain the codebase and how to get started"
- "List all files in the project"
- "Summarize the Cargo.toml file"
- "Show me all files that import the 'anyhow' crate"
## License
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.
## Acknowledgments
- This project is inspired by Claude Code and similar AI assistants
- Uses Anthropic's Claude 3.7 Sonnet model for optimal agent capabilities
- Backend built with Rust for performance and reliability
- Frontend built with React and Ink for a modern terminal UI experience
- Special thanks to the Rust and React communities for excellent libraries and tools