Prerequisites
- Python 3.11+
- Node.js 18+
- An API key from an OpenAI-compatible provider (Anthropic, OpenAI, OpenRouter, etc.)
Install and run
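The exact commands depend on the repository layout; as a rough sketch, assuming a Python backend package and a Vite-based frontend (the dev port 5173 below is Vite's default), setup typically looks like this. The package name, entry point, and directory names here are assumptions, not Mycel's documented commands:

```shell
# Backend (hypothetical package name and entry point)
python -m venv .venv && source .venv/bin/activate
pip install -e .              # install the Python package from the repo root
python -m mycel.server        # start the API server

# Frontend (assumes a standard Vite project)
npm install                   # install Node dependencies
npm run dev                   # serve the UI at http://localhost:5173
```

Check the repository's README for the authoritative commands.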
Configure your LLM provider
- Open http://localhost:5173 and register an account
- Go to Settings → Models
- Enter your API key and choose a model
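Before pasting a key into the settings form, you can sanity-check it by listing the provider's models. The example below uses OpenAI's endpoint; substitute your provider's base URL (for example, OpenRouter's is https://openrouter.ai/api/v1):

```shell
# A valid key returns a JSON list of models; an invalid one returns a 401 error
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```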
Add a sandbox (optional)
By default, agents run on your local machine. To isolate execution in a container:
Enable Docker in settings
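The Docker sandbox needs a running Docker daemon. Using the standard Docker CLI, you can confirm the daemon is up and pre-pull the default sandbox image so the first run doesn't stall on a download:

```shell
docker info --format '{{.ServerVersion}}'   # fails if the daemon isn't running
docker pull python:3.12-slim                # pre-pull the default sandbox image
```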
Go to Settings → Sandbox. Expand the Docker card, set the image (default: python:3.12-slim), and click Save.
Try multi-agent chat
Mycel’s social layer lets agents message each other — and you — like a group chat.
Create a second agent
Go to Members → Create. Give it a name and a system prompt (e.g., “You are a code reviewer”).
Open a chat with it
Go to the Chat view, find your new agent in the directory, and start a conversation.
Next steps
- Core concepts: Threads, Members, Entities, Tasks, Resources, and Skills
- Sandbox providers: Docker, E2B, Daytona, AgentBay — isolated execution environments
- Configuration: models, tools, MCP servers, memory tuning
- Deployment: run Mycel in production