Bob is an LLM-powered coding agent built in Rust with a hexagonal (ports & adapters) architecture. It connects to language models via the genai crate and to external tools via MCP servers using rmcp.
- 🤖 Multi-Model Support: Works with OpenAI, Anthropic, Google, Groq, and more
- 🔧 Tool Integration: Connect to MCP servers for file operations, shell commands, and custom tools
- 🎯 Skill System: Load and apply predefined skills for specialized tasks
- 💬 Interactive REPL: Chat with the AI agent through a terminal interface
- 🔄 Streaming Responses: Real-time streaming of LLM responses
- 📊 Observability: Built-in tracing and event logging
- 🏗️ Clean Architecture: Hexagonal (ports & adapters) design for extensibility
This workspace contains the following crates:
| Crate | Description |
|---|---|
| bob-core | Core domain types and port traits |
| bob-runtime | Runtime orchestration layer |
| bob-adapters | Adapter implementations |
| cli-agent | CLI application |
- `bin/cli-agent` — CLI composition root (config, REPL)
- `crates/bob-core` — Domain types and port traits (`LlmPort`, `ToolPort`, `SessionStore`, `EventSink`)
- `crates/bob-runtime` — Scheduler FSM, prompt builder, action parser, `CompositeToolPort`
- `crates/bob-adapters` — Concrete adapter implementations (genai, rmcp, in-memory store, tracing)
```
┌─────────────────────────────────────────────────────────────┐
│                       CLI Agent (bin)                        │
│   ┌─────────────────────────────────────────────────────┐   │
│   │                 DefaultAgentRuntime                 │   │
│   │ ┌──────────┐  ┌──────────┐  ┌──────────────────┐    │   │
│   │ │Scheduler │→ │Prompt    │→ │Action Parser     │    │   │
│   │ │ FSM      │  │Builder   │  │                  │    │   │
│   │ └──────────┘  └──────────┘  └──────────────────┘    │   │
│   └─────────────────────────────────────────────────────┘   │
└─────────────────────────────────────────────────────────────┘
              ↓ uses ports (traits) from bob-core
┌─────────────────────────────────────────────────────────────┐
│                   Adapters (bob-adapters)                    │
│  ┌──────────┐  ┌──────────┐  ┌──────────┐  ┌──────────┐     │
│  │GenAI LLM │  │MCP Tools │  │In-Memory │  │ Tracing  │     │
│  │          │  │          │  │  Store   │  │ Events   │     │
│  └──────────┘  └──────────┘  └──────────┘  └──────────┘     │
└─────────────────────────────────────────────────────────────┘
```
See docs/design.md for the full design document.
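The runtime depends only on the port traits in `bob-core`, so any adapter that implements a port can be swapped in without touching orchestration code. As a rough sketch of the pattern, assuming the `async-trait`, `anyhow`, and `serde_json` crates (the method signatures below are illustrative, not the actual `bob-core` API):

```rust
use async_trait::async_trait;

// Illustrative shape of the ports; the real trait definitions live in
// crates/bob-core and differ in detail.
#[async_trait]
pub trait LlmPort: Send + Sync {
    /// Ask the model for its next message given the prompt so far.
    async fn complete(&self, prompt: &str) -> anyhow::Result<String>;
}

#[async_trait]
pub trait ToolPort: Send + Sync {
    /// Execute a named tool (e.g. one exposed by an MCP server).
    async fn call_tool(
        &self,
        name: &str,
        args: serde_json::Value,
    ) -> anyhow::Result<serde_json::Value>;
}

/// The runtime is written against the ports, never against genai or
/// rmcp directly; concrete adapters plug in at the composition root.
pub struct Runtime<L: LlmPort, T: ToolPort> {
    pub llm: L,
    pub tools: T,
}
```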
```bash
# Install Rust (stable)
rustup install stable

# Install dev tools
just setup
```

```bash
# Clone the repository
git clone https://github.com/longcipher/bob.git
cd bob

# Build
cargo build --release

# Run
cargo run --release --bin cli-agent -- --config agent.toml
```

```bash
# Install the CLI agent
cargo install --git https://github.com/longcipher/bob cli-agent

# Run
cli-agent --config agent.toml
```

Create an `agent.toml` in the project root:
```toml
[runtime]
default_model = "openai:gpt-4o-mini"
max_steps = 12
turn_timeout_ms = 90000
model_context_tokens = 128000
# Optional: Configure MCP tool servers
[mcp]
[[mcp.servers]]
id = "filesystem"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
tool_timeout_ms = 15000
# Optional: Configure skills
[skills]
max_selected = 3
token_budget_ratio = 0.1
[[skills.sources]]
type = "directory"
path = "./skills"
recursive = false
# Optional: Configure policies
[policy]
deny_tools = ["local/shell_exec"]
allow_tools = ["local/read_file", "local/write_file"]
```
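At startup the CLI deserializes this file. A minimal sketch of what that can look like, assuming the serde and toml crates (the struct names are hypothetical, not the actual types in `cli-agent`):

```rust
use serde::Deserialize;

// Hypothetical mirror of the [runtime] table above; the real config
// types live in the cli-agent crate. Unknown tables like [mcp] or
// [skills] are simply ignored by serde unless modeled.
#[derive(Debug, Deserialize)]
struct RuntimeConfig {
    default_model: String,
    max_steps: u32,
    turn_timeout_ms: u64,
    model_context_tokens: u32,
}

#[derive(Debug, Deserialize)]
struct AgentConfig {
    runtime: RuntimeConfig,
}

fn main() -> anyhow::Result<()> {
    let raw = std::fs::read_to_string("agent.toml")?;
    let config: AgentConfig = toml::from_str(&raw)?;
    println!("model: {}", config.runtime.default_model);
    Ok(())
}
```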
Set your LLM provider API key:

```bash
# For OpenAI
export OPENAI_API_KEY="sk-..."

# For Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# For Google
export GEMINI_API_KEY="..."
```

Then start the agent:

```bash
cargo run --bin cli-agent -- --config agent.toml
```

The REPL prints `>` when ready. Type a message and press Enter. Use `/quit` to exit.
```
> Read the main.rs file and explain what it does

I'll read the main.rs file for you...
[uses filesystem tool to read the file]

The main.rs file implements the CLI agent composition root. It loads
configuration, wires up adapters (LLM, tools, storage, events), creates
the runtime, and runs the REPL loop.

> Now add error handling to that function

[agent modifies the code]
I've added error handling to the function. The changes include:
- Using `Result` return type
- Adding context with `.wrap_err()`
- Handling specific error cases
```
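Each turn like the ones above is driven by the scheduler FSM in `bob-runtime`: build a prompt, get a model response, parse it into an action, and either run a tool or stop with a final answer, bounded by `max_steps`. A schematic of that loop, reusing the port sketch from earlier (`Action`, `parse_action`, and `run_turn` are illustrative names, not the runtime's real API):

```rust
// Schematic of the scheduler loop; the real FSM in bob-runtime is
// more involved (streaming, timeouts, event logging).
enum Action {
    ToolCall { name: String, args: serde_json::Value },
    FinalAnswer(String),
}

fn parse_action(reply: &str) -> anyhow::Result<Action> {
    // Toy stand-in for the action parser, which would recognize
    // structured tool calls in the model output.
    Ok(Action::FinalAnswer(reply.to_string()))
}

async fn run_turn(
    llm: &dyn LlmPort,
    tools: &dyn ToolPort,
    mut prompt: String,
    max_steps: u32,
) -> anyhow::Result<String> {
    for _ in 0..max_steps {
        let reply = llm.complete(&prompt).await?;
        match parse_action(&reply)? {
            Action::FinalAnswer(text) => return Ok(text),
            Action::ToolCall { name, args } => {
                // Feed the tool result back into the next prompt step.
                let result = tools.call_tool(&name, args).await?;
                prompt.push_str(&format!("\n[tool {name}]: {result}"));
            }
        }
    }
    anyhow::bail!("turn exceeded max_steps without a final answer")
}
```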
Bob supports all providers available through genai:
| Provider | Model Examples | Configuration |
|---|---|---|
| OpenAI | gpt-4o, gpt-4o-mini | Set OPENAI_API_KEY |
| Anthropic | claude-3-5-sonnet-20241022 | Set ANTHROPIC_API_KEY |
| Google | gemini-2.0-flash-exp | Set GEMINI_API_KEY |
| Groq | llama-3.3-70b-versatile | Set GROQ_API_KEY |
| Cohere | command-r-plus | Set COHERE_API_KEY |
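Model identifiers in `default_model` use a `provider:model` form, as in the `[runtime]` example above. A toy illustration of splitting such an identifier (the helper is hypothetical; actual model resolution happens inside the genai adapter):

```rust
/// Toy split of a "provider:model" identifier as used in agent.toml.
/// Illustrates the naming convention only.
fn split_model_id(id: &str) -> (Option<&str>, &str) {
    match id.split_once(':') {
        Some((provider, model)) => (Some(provider), model),
        None => (None, id),
    }
}

fn main() {
    assert_eq!(
        split_model_id("openai:gpt-4o-mini"),
        (Some("openai"), "gpt-4o-mini")
    );
}
```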
Bob integrates with Model Context Protocol (MCP) servers:
- Filesystem: `@modelcontextprotocol/server-filesystem`
- GitHub: `@modelcontextprotocol/server-github`
- PostgreSQL: `@modelcontextprotocol/server-postgres`
- Slack: `@modelcontextprotocol/server-slack`
You can build custom MCP servers in any language that supports the protocol.
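On Bob's side, every configured server is wrapped in a `ToolPort`, and `CompositeToolPort` routes each call to the right server. A schematic of that routing, again reusing the earlier port sketch (the fields and the `server/tool` naming are assumptions based on the namespaced tool ids in the `[policy]` example, e.g. `local/read_file`; the real implementation is built on rmcp in `bob-adapters`):

```rust
use std::collections::HashMap;

/// Schematic composite: one ToolPort per configured MCP server.
/// The real CompositeToolPort lives in bob-runtime; fields assumed.
pub struct CompositeToolPort {
    servers: HashMap<String, Box<dyn ToolPort>>,
}

#[async_trait::async_trait]
impl ToolPort for CompositeToolPort {
    async fn call_tool(
        &self,
        name: &str,
        args: serde_json::Value,
    ) -> anyhow::Result<serde_json::Value> {
        // Assume namespaced ids like "filesystem/read_file".
        let (server_id, tool) = name
            .split_once('/')
            .ok_or_else(|| anyhow::anyhow!("expected server/tool id"))?;
        let server = self
            .servers
            .get(server_id)
            .ok_or_else(|| anyhow::anyhow!("unknown server: {server_id}"))?;
        server.call_tool(tool, args).await
    }
}
```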
```bash
# Format code
just format

# Run lints (typos, clippy, machete, etc.)
just lint

# Run all tests
just test

# Full CI check (lint + test + build)
just ci
```

```
.
├── bin/
│   └── cli-agent/      # CLI application
├── crates/
│   ├── bob-core/       # Domain types and ports
│   ├── bob-runtime/    # Runtime orchestration
│   └── bob-adapters/   # Adapter implementations
├── docs/
│   └── design.md       # Architecture design
├── specs/              # Task specifications
└── .opencode/          # AI development skills
```
The workspace uses strict clippy lints with the following principles (a crate-level sketch follows the list):

- Pedantic by default: Enable all pedantic lints, then allow specific ones that are too noisy
- Panic safety: Deny `unwrap`, `expect`, and `panic`; use proper error handling instead
- No debug code: Deny `dbg!`, `todo!`, and `unimplemented!`
- No stdout in libraries: Use `tracing` instead of `println!`/`eprintln!`
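These rules are typically enforced once via workspace lint tables in Cargo.toml. Expressed as crate-level attributes, the policy looks roughly like this (the exact lint selection is illustrative, not the project's verbatim list):

```rust
// Crate-level equivalent of the lint policy above; the workspace
// centralizes this configuration rather than repeating it per crate.
#![warn(clippy::pedantic)]
#![deny(
    clippy::unwrap_used,  // panic safety
    clippy::expect_used,
    clippy::panic,
    clippy::dbg_macro,    // no debug code
    clippy::todo,
    clippy::unimplemented,
    clippy::print_stdout, // no stdout in libraries: use tracing
    clippy::print_stderr,
)]
```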
Always use `cargo add`:

```bash
# Add to workspace
cargo add <crate> --workspace

# Add to specific crate
cargo add <crate> -p <crate-name>
```

Each library crate can be published independently:
```bash
# Publish bob-core
cargo publish -p bob-core

# Publish bob-runtime
cargo publish -p bob-runtime

# Publish bob-adapters
cargo publish -p bob-adapters
```

Documentation is automatically generated on docs.rs when published to crates.io.
Contributions are welcome! Please read our contributing guidelines before submitting PRs.
- Fork the repository
- Create a feature branch
- Make your changes
- Run `just ci` to ensure all checks pass
- Submit a pull request
- Persistent session storage (SQLite, PostgreSQL)
- Web UI for agent interaction
- Multi-agent collaboration
- Custom skill marketplace
- Agent memory and context management
- Tool composition and chaining
- More MCP server integrations
Licensed under the Apache License, Version 2.0. See LICENSE.md for details.
- genai - Unified LLM API
- rmcp - MCP client implementation
- agent-skills - Skill system
- Model Context Protocol - Tool integration protocol
- Documentation: docs.rs
- Issues: GitHub Issues
- Discussions: GitHub Discussions
Note: This project is in active development. APIs may change between versions.