Get the full YouLab stack running locally.

Prerequisites:
  • Python 3.11+
  • uv package manager
  • Docker (for OpenWebUI, Ollama, and Letta)

| Service | Port | Purpose |
| --- | --- | --- |
| OpenWebUI | 3000 | Chat frontend |
| Ollama | 11434 | Local LLM inference (optional) |
| Letta Server | 8283 | Agent framework with persistent memory |
| HTTP Service | 8100 | FastAPI bridge (AgentManager, StrategyManager) |

Request flow:

Browser → OpenWebUI:3000 → HTTP Service:8100 → Letta:8283 → Claude API

Clone the repository and install dependencies:

```sh
git clone https://github.com/youlab/youlab.git
cd youlab

# Install dependencies and pre-commit hooks
make setup
```

Start OpenWebUI, Ollama, and Letta together:

```sh
# Start OpenWebUI + Ollama
cd OpenWebUI/open-webui
docker compose up -d

# Start Letta (from project root)
cd ../..
docker run -d \
  --name letta \
  -p 8283:8283 \
  -v letta-data:/root/.letta \
  letta/letta:latest
```

If you only need the backend API without the chat UI:

```sh
docker run -d \
  --name letta \
  -p 8283:8283 \
  -v letta-data:/root/.letta \
  letta/letta:latest
```

Verify that everything is running:

```sh
# Check all containers are running
docker ps
# Expected output:
# CONTAINER ID   IMAGE                           PORTS                    NAMES
# ...            ghcr.io/open-webui/open-webui   0.0.0.0:3000->8080/tcp   open-webui
# ...            ollama/ollama                   11434/tcp                ollama
# ...            letta/letta                     0.0.0.0:8283->8283/tcp   letta

# Test Letta health
curl http://localhost:8283/v1/health
# Should return: {"status": "ok"}
```
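
Containers can take a few seconds to become responsive after they start. If you'd rather script the check than retry curl by hand, a small polling loop like this works (a sketch: only Letta's /v1/health path appears in this guide; treating a plain 200 from OpenWebUI's root page as "up" is an assumption):

```python
# wait_for_services.py: poll until the stack is reachable.
import time
import urllib.request

SERVICES = {
    "Letta": "http://localhost:8283/v1/health",
    "OpenWebUI": "http://localhost:3000/",  # assumption: root returns 200 when up
}

def wait_for(name: str, url: str, timeout: float = 60.0) -> None:
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                print(f"{name}: up (HTTP {resp.status})")
                return
        except OSError:
            time.sleep(1)  # not ready yet; retry
    raise TimeoutError(f"{name} not reachable within {timeout}s")

for name, url in SERVICES.items():
    wait_for(name, url)
```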

Copy the example environment file:

```sh
cp .env.example .env
```

Edit .env with your API keys:

```sh
# Letta connection
LETTA_BASE_URL=http://localhost:8283

# LLM provider (choose one)
OPENAI_API_KEY=sk-...
# or
ANTHROPIC_API_KEY=sk-ant-...
```
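
If you want to fail fast on a missing key before starting the server, a quick check along these lines can help (a sketch using python-dotenv; it assumes only the variable names shown above):

```python
# check_env.py: fail fast if .env is missing a required key.
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads .env from the current directory

missing = []
if not os.getenv("LETTA_BASE_URL"):
    missing.append("LETTA_BASE_URL")
if not (os.getenv("OPENAI_API_KEY") or os.getenv("ANTHROPIC_API_KEY")):
    missing.append("OPENAI_API_KEY or ANTHROPIC_API_KEY")
if missing:
    raise SystemExit(f"Missing in .env: {', '.join(missing)}")
print("Environment looks good.")
```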

Start the HTTP service:

```sh
uv run youlab-server
```

The service starts on http://localhost:8100.

```sh
# Verify it's running
curl http://localhost:8100/health
# Should return: {"status": "ok", "letta_connected": true, ...}
```

Open http://localhost:3000 in your browser.

First-time setup:

  1. Create an admin account (the first user becomes admin)
  2. Go to Admin Panel → Settings → Functions → Pipes
  3. Add the YouLab Pipe from src/youlab_server/pipelines/letta_pipe.py
  4. Configure the Pipe valves (a simplified sketch of the Pipe follows this list):
    • LETTA_SERVICE_URL: http://host.docker.internal:8100
    • AGENT_TYPE: tutor
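
For orientation, an OpenWebUI Pipe that forwards chat messages to the HTTP service looks roughly like this (a simplified sketch, not the actual letta_pipe.py; the fixed agent_id and the "response" field name are stand-ins):

```python
# Simplified sketch of an OpenWebUI Pipe that forwards the latest user
# message to the HTTP service. Illustrative only; the real implementation
# lives in src/youlab_server/pipelines/letta_pipe.py.
import requests
from pydantic import BaseModel


class Pipe:
    class Valves(BaseModel):
        LETTA_SERVICE_URL: str = "http://host.docker.internal:8100"
        AGENT_TYPE: str = "tutor"

    def __init__(self):
        self.valves = self.Valves()

    def pipe(self, body: dict) -> str:
        # OpenWebUI passes the chat history in body["messages"]
        message = body["messages"][-1]["content"]
        resp = requests.post(
            f"{self.valves.LETTA_SERVICE_URL}/chat",
            # per-user agent lookup is omitted; a fixed agent_id stands in
            json={"agent_id": "agent-abc123", "message": message},
            timeout=120,
        )
        resp.raise_for_status()
        # "response" as the reply field is an assumption
        return resp.json().get("response", "")
```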

Create an agent via the API:

```sh
curl -X POST http://localhost:8100/agents \
  -H "Content-Type: application/json" \
  -d '{"user_id": "test-user", "agent_type": "tutor"}'
```

Response:

```json
{
  "agent_id": "agent-abc123",
  "user_id": "test-user",
  "agent_type": "tutor",
  "agent_name": "youlab_test-user_tutor"
}
```
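
The same call from Python, if you prefer scripting against the API (the endpoint, payload, and response fields are exactly those shown above):

```python
# create_agent.py: same request as the curl above, from Python.
import requests

resp = requests.post(
    "http://localhost:8100/agents",
    json={"user_id": "test-user", "agent_type": "tutor"},
    timeout=30,
)
resp.raise_for_status()
agent = resp.json()
print(agent["agent_id"])  # e.g. "agent-abc123"
```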

Send a message:

```sh
curl -X POST http://localhost:8100/chat \
  -H "Content-Type: application/json" \
  -d '{
    "agent_id": "agent-abc123",
    "message": "Help me brainstorm essay topics about my identity"
  }'
```

Or stream the response:

```sh
curl -N -X POST http://localhost:8100/chat/stream \
  -H "Content-Type: application/json" \
  -d '{
    "agent_id": "agent-abc123",
    "message": "What makes a compelling personal narrative?"
  }'
```
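
A Python client for the streaming endpoint might look like this (a sketch: it assumes the endpoint emits newline-delimited chunks, which this guide doesn't specify; adapt the parsing to whatever framing the service actually uses, e.g. SSE):

```python
# stream_chat.py: consume /chat/stream incrementally from Python.
import requests

with requests.post(
    "http://localhost:8100/chat/stream",
    json={
        "agent_id": "agent-abc123",
        "message": "What makes a compelling personal narrative?",
    },
    stream=True,  # like curl -N: read the body as it arrives
    timeout=120,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if line:
            print(line)
```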

For day-to-day development, start the stack with:

```sh
# Docker services
cd OpenWebUI/open-webui && docker compose up -d && cd ../..
docker start letta  # if already created

# HTTP service
uv run youlab-server
```

To stop everything:

```sh
# Stop HTTP service: Ctrl+C
# Stop Docker
docker stop open-webui ollama letta
```

After a reboot:

```sh
# Containers may be stopped or paused after reboot
docker start open-webui ollama letta

# If containers show as "Up" but are unreachable:
docker unpause open-webui ollama letta

# Start HTTP service
uv run youlab-server
```

If OpenWebUI is unreachable:

```sh
# Check container status
docker ps

# If status shows "(Paused)":
docker unpause open-webui

# If container isn't running:
docker start open-webui

# Check logs
docker logs open-webui --tail 50
```

If Letta isn't responding:

```sh
# Check if container is running
docker ps | grep letta

# View logs
docker logs letta --tail 50

# Restart
docker restart letta
```

If port 8100 is already in use:

```sh
# Check what's using port 8100
lsof -i :8100

# Kill the process if needed
kill -9 <PID>
```

If the HTTP service can't reach Letta:

```sh
# Verify Letta health
curl http://localhost:8283/v1/health

# Check service logs (run with debug)
LOG_LEVEL=DEBUG uv run youlab-server
```

OpenWebUI uses port 3000. If you need to serve docs simultaneously:

```sh
npx serve docs -p 3001
```

Run the test suite:

```sh
make verify-agent
```

Expected output:

```
✓ Ruff check
✓ Ruff format
✓ Typecheck
✓ Tests (45 tests in 2.1s)
```