# Quick Start
Get the full YouLab stack running locally.
## Prerequisites
## Services Overview
| Service | Port | Purpose |
|---|---|---|
| OpenWebUI | 3000 | Chat frontend |
| Ollama | 11434 | Local LLM inference (optional) |
| Letta Server | 8283 | Agent framework with persistent memory |
| HTTP Service | 8100 | FastAPI bridge (AgentManager, StrategyManager) |
```
Browser → OpenWebUI:3000 → HTTP Service:8100 → Letta:8283 → Claude API
```
## Step 1: Clone and Setup

```sh
git clone https://github.com/youlab/youlab.git
cd youlab
```
```sh
# Install dependencies and pre-commit hooks
make setup
```
## Step 2: Start Docker Services
### Option A: Full Stack (Recommended)

Start OpenWebUI, Ollama, and Letta together:
```sh
# Start OpenWebUI + Ollama
cd OpenWebUI/open-webui
docker compose up -d
```
```sh
# Start Letta (from project root)
cd ../..
docker run -d \
  --name letta \
  -p 8283:8283 \
  -v letta-data:/root/.letta \
  letta/letta:latest
```
### Option B: Letta Only (API Development)

If you only need the backend API without the chat UI:
```sh
docker run -d \
  --name letta \
  -p 8283:8283 \
  -v letta-data:/root/.letta \
  letta/letta:latest
```
### Verify Docker Services

```sh
# Check all containers are running
docker ps

# Expected output:
# CONTAINER ID   IMAGE                           PORTS                    NAMES
# ...            ghcr.io/open-webui/open-webui   0.0.0.0:3000->8080/tcp   open-webui
# ...            ollama/ollama                   11434/tcp                ollama
# ...            letta/letta                     0.0.0.0:8283->8283/tcp   letta
```
```sh
# Test Letta health
curl http://localhost:8283/v1/health
# Should return: {"status": "ok"}
```
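Right after `docker run`, Letta may need a few seconds before the health endpoint responds. A minimal polling sketch in Python (standard library only; the retry schedule is an arbitrary choice, and the URL is the same one used in the curl check above):

```python
import time
import urllib.error
import urllib.request


def backoff_delays(retries: int, base: float = 1.0) -> list[float]:
    """Exponential backoff schedule: base, 2*base, 4*base, ..."""
    return [base * (2**i) for i in range(retries)]


def wait_for_health(url: str, retries: int = 5) -> bool:
    """Poll `url` until it returns HTTP 200, or give up after `retries` tries."""
    for delay in backoff_delays(retries):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # container not ready yet
        time.sleep(delay)
    return False


# With the stack running:
#   wait_for_health("http://localhost:8283/v1/health")
```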
## Step 3: Configure Environment

```sh
cp .env.example .env
```

Edit `.env` with your API keys:
```sh
# Letta connection
LETTA_BASE_URL=http://localhost:8283

# LLM provider (choose one)
OPENAI_API_KEY=sk-...
# or
ANTHROPIC_API_KEY=sk-ant-...
```
## Step 4: Start HTTP Service

```sh
uv run youlab-server
```

The service starts on http://localhost:8100.
```sh
# Verify it's running
curl http://localhost:8100/health
# Should return: {"status": "ok", "letta_connected": true, ...}
```
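If you script this check, the response body can be inspected rather than eyeballed. A small sketch; only the `status` and `letta_connected` fields shown above are assumed, and any additional fields are ignored:

```python
import json


def service_ready(body: str) -> bool:
    """True only if the HTTP service is up AND it can reach Letta."""
    data = json.loads(body)
    return data.get("status") == "ok" and data.get("letta_connected") is True


# Example payloads shaped like the curl output above:
healthy = '{"status": "ok", "letta_connected": true}'
degraded = '{"status": "ok", "letta_connected": false}'
```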
## Step 5: Access the UI

Open http://localhost:3000 in your browser.
First-time setup:
- Create an admin account (first user becomes admin)
- Go to Admin Panel → Settings → Functions → Pipes
- Add the YouLab Pipe from `src/youlab_server/pipelines/letta_pipe.py`
- Configure the Pipe valves:
  - `LETTA_SERVICE_URL`: `http://host.docker.internal:8100`
  - `AGENT_TYPE`: `tutor`
## Step 6: Test the API
### Create an Agent
Section titled “Create an Agent”curl -X POST http://localhost:8100/agents \ -H "Content-Type: application/json" \ -d '{"user_id": "test-user", "agent_type": "tutor"}'Response:
{ "agent_id": "agent-abc123", "user_id": "test-user", "agent_type": "tutor", "agent_name": "youlab_test-user_tutor"}Send a Message
### Send a Message

```sh
curl -X POST http://localhost:8100/chat \
  -H "Content-Type: application/json" \
  -d '{
    "agent_id": "agent-abc123",
    "message": "Help me brainstorm essay topics about my identity"
  }'
```
### Stream a Response

```sh
curl -N -X POST http://localhost:8100/chat/stream \
  -H "Content-Type: application/json" \
  -d '{
    "agent_id": "agent-abc123",
    "message": "What makes a compelling personal narrative?"
  }'
```
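The stream arrives as newline-delimited chunks. The exact wire format of `/chat/stream` is not documented here, so the sketch below assumes SSE-style `data:` lines carrying one JSON object each, with a `[DONE]` sentinel (common in streaming LLM APIs, but an assumption):

```python
import json
from collections.abc import Iterable, Iterator


def iter_sse_json(lines: Iterable[str]) -> Iterator[dict]:
    """Yield decoded JSON payloads from SSE-style `data:` lines.

    Assumes one JSON object per `data:` line; a `[DONE]` sentinel
    ends the stream.
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alives and comment lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return
        yield json.loads(payload)


# Simulated stream chunks for illustration:
sample = [
    'data: {"delta": "A compelling"}',
    "",
    'data: {"delta": " narrative..."}',
    "data: [DONE]",
]
```

With a live server, feed the response's line iterator (e.g., from `urllib.request.urlopen` or `httpx.stream`) into `iter_sse_json`.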
## Quick Reference
### Start Everything

```sh
# Docker services
cd OpenWebUI/open-webui && docker compose up -d && cd ../..
docker start letta  # if already created
```
```sh
# HTTP service
uv run youlab-server
```
### Stop Everything

```sh
# Stop HTTP service: Ctrl+C

# Stop Docker
docker stop open-webui ollama letta
```
### Restart After Reboot

```sh
# Containers may be stopped or paused after reboot
docker start open-webui ollama letta
```
# If containers show as "Up" but are unreachable:docker unpause open-webui ollama letta
```sh
# Start HTTP service
uv run youlab-server
```
## Troubleshooting
### OpenWebUI not reachable (localhost:3000)

```sh
# Check container status
docker ps

# If status shows "(Paused)":
docker unpause open-webui

# If container isn't running:
docker start open-webui

# Check logs
docker logs open-webui --tail 50
```
### Letta server not responding

```sh
# Check if container is running
docker ps | grep letta

# View logs
docker logs letta --tail 50

# Restart
docker restart letta
```
### HTTP service port already in use

```sh
# Check what's using port 8100
lsof -i :8100

# Kill the process if needed
kill -9 <PID>
```
### Agent creation fails

```sh
# Verify Letta health
curl http://localhost:8283/v1/health

# Check service logs (run with debug)
LOG_LEVEL=DEBUG uv run youlab-server
```
### Port 3000 conflict with docs server

OpenWebUI uses port 3000. If you need to serve docs simultaneously:
```sh
npx serve docs -p 3001
```
## Verify Everything Works

Run the test suite:
```sh
make verify-agent
```

Expected output:
```
✓ Ruff check
✓ Ruff format
✓ Typecheck
✓ Tests (45 tests in 2.1s)
```
## Next Steps

- Architecture - Understand the system design
- HTTP Service - Explore all API endpoints
- Pipeline - OpenWebUI Pipe integration details
- Configuration - All environment variables