# Honcho Integration
Honcho provides message persistence for theory-of-mind (ToM) modeling in YouLab.
## Overview
Honcho captures all chat messages for long-term analysis:
- User messages - What students say
- Agent responses - What the tutor replies
- Session context - Chat IDs, titles, agent types
This enables future ToM features like:
- Student behavior modeling
- Learning pattern analysis
- Personalized recommendations
```
┌─────────────────────────────────────────────────────────────┐
│                          Chat Flow                          │
│                                                             │
│  User Message ──► HTTP Service ──► Letta Server             │
│                        │                                    │
│                        ▼                                    │
│                   HonchoClient                              │
│                (fire-and-forget)                            │
│                        │                                    │
│                        ▼                                    │
│                  Honcho Service                             │
│              (message persistence)                          │
└─────────────────────────────────────────────────────────────┘
```

## Architecture
### Honcho Concepts
| Concept | YouLab Mapping | Example |
|---|---|---|
| Workspace | Application | `youlab` |
| Peer | Message sender | `student_{user_id}`, `tutor` |
| Session | Chat thread | `chat_{chat_id}` |
| Message | Individual message | User or agent content |
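As a rough sketch of these conventions, the Honcho identifiers can be derived from YouLab's own IDs. The helper functions below are hypothetical; only the `student_{user_id}` / `chat_{chat_id}` / `tutor` naming comes from the table above:

```python
# Hypothetical helpers illustrating the naming conventions above.
WORKSPACE_ID = "youlab"
TUTOR_PEER_ID = "tutor"

def student_peer_id(user_id: str) -> str:
    """Peer ID for a student, e.g. 'student_user123'."""
    return f"student_{user_id}"

def chat_session_id(chat_id: str) -> str:
    """Session ID for a chat thread, e.g. 'chat_abc123'."""
    return f"chat_{chat_id}"
```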
### Data Model
```
Workspace: "youlab"
├── Peer: "student_user123"
│   └── Messages from this student
├── Peer: "student_user456"
│   └── Messages from this student
├── Peer: "tutor"
│   └── All agent responses
└── Session: "chat_abc123"
    └── Messages in this chat thread
```
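To make the mapping concrete, the exchange below shows (as plain illustrative data, not the SDK's actual types) which workspace, session, and peers a chat's messages belong to:

```python
# Illustrative only: where one chat's messages land in the model above.
example_exchange = {
    "workspace": "youlab",
    "session": "chat_abc123",
    "messages": [
        {"peer": "student_user123", "content": "Help me brainstorm essay topics"},
        {"peer": "tutor", "content": "Great! Let's explore some topics..."},
    ],
}
```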
## HonchoClient

Location: `src/youlab_server/honcho/client.py`
### Initialization
```python
from youlab_server.honcho import HonchoClient

client = HonchoClient(
    workspace_id="youlab",
    api_key=None,          # Required for production
    environment="demo",    # demo, local, or production
)
```

### Lazy Loading
The Honcho SDK client is lazily initialized on first use:
```python
@property
def client(self) -> Honcho | None:
    if self._client is None and not self._initialized:
        self._initialized = True
        # Initialize Honcho SDK...
    return self._client
```

If initialization fails (network error, invalid credentials), `client` returns `None` and persistence is silently skipped.
### Methods
#### persist_user_message()
Persist a user’s message:
```python
await client.persist_user_message(
    user_id="user123",
    chat_id="chat456",
    message="Help me brainstorm essay topics",
    chat_title="Essay Brainstorming",
    agent_type="tutor",
)
```

#### persist_agent_message()
Persist an agent’s response:
```python
await client.persist_agent_message(
    user_id="user123",   # Which student this was for
    chat_id="chat456",
    message="Great! Let's explore some topics...",
    chat_title="Essay Brainstorming",
    agent_type="tutor",
)
```

#### check_connection()
Verify Honcho is reachable:
```python
if client.check_connection():
    print("Honcho is available")
```

#### query_dialectic()
Query Honcho for insights about a student (theory-of-mind):
```python
from youlab_server.honcho.client import SessionScope

response = await client.query_dialectic(
    user_id="user123",
    question="What learning style works best for this student?",
    session_scope=SessionScope.ALL,
    recent_limit=5,
)

if response:
    print(response.insight)  # "This student prefers hands-on examples..."
```

| Parameter | Type | Default | Description |
|---|---|---|---|
| `user_id` | string | Required | Student identifier |
| `question` | string | Required | Natural language question |
| `session_scope` | `SessionScope` | `ALL` | Which sessions to include |
| `session_id` | string | `None` | Specific session (reserved) |
| `recent_limit` | int | `5` | Number of recent sessions (reserved) |
`SessionScope` enum:
| Value | Description |
|---|---|
| `ALL` | All sessions for this user |
| `RECENT` | Last N sessions |
| `CURRENT` | Current/active session only |
| `SPECIFIC` | Explicit session ID |
Returns: `DialecticResponse` or `None` if unavailable.
```python
@dataclass
class DialecticResponse:
    insight: str                 # Honcho's analysis
    session_scope: SessionScope
    query: str                   # Original question
```
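A minimal sketch of how calling code might consume this result, assuming only the API shown above; the function name and fallback string are illustrative:

```python
from youlab_server.honcho import HonchoClient
from youlab_server.honcho.client import SessionScope

async def student_learning_hint(client: HonchoClient, user_id: str) -> str:
    """Ask Honcho for a ToM insight, degrading gracefully when unavailable."""
    response = await client.query_dialectic(
        user_id=user_id,
        question="What learning style works best for this student?",
        session_scope=SessionScope.ALL,
    )
    if response is None:  # Honcho disabled or unreachable
        return "No insight available yet."
    return response.insight
```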
## Fire-and-Forget Pattern

Location: `src/youlab_server/honcho/client.py:310-363`
Messages are persisted asynchronously without blocking the chat response:
```python
from youlab_server.honcho.client import create_persist_task

# In chat endpoint - doesn't block response
create_persist_task(
    honcho_client=honcho,
    user_id="user123",
    chat_id="chat456",
    message="User's message",
    is_user=True,
    chat_title="My Chat",
    agent_type="tutor",
)
```
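The idea behind `create_persist_task` is to schedule the persistence coroutine on the running event loop and log (rather than raise) any failure. Below is a minimal sketch of that pattern, not the actual implementation:

```python
import asyncio
import logging

log = logging.getLogger(__name__)

def fire_and_forget(coro) -> None:
    """Schedule a coroutine without awaiting it; never let failures propagate."""
    task = asyncio.create_task(coro)

    def _log_failure(done: asyncio.Task) -> None:
        # Log exceptions so they are visible, but never re-raise them.
        if not done.cancelled() and done.exception() is not None:
            log.warning("Honcho persistence failed: %s", done.exception())

    task.add_done_callback(_log_failure)
```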
Section titled “Graceful Degradation”- If
honcho_clientisNone, persistence is skipped - If
chat_idis empty, persistence is skipped - If Honcho is unreachable, errors are logged but not raised
- Chat functionality continues regardless of Honcho status
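A minimal sketch of those guards as they might appear around a persistence call; the wrapper function is hypothetical, and the conditions simply mirror the list above:

```python
from youlab_server.honcho import HonchoClient
from youlab_server.honcho.client import create_persist_task

def maybe_persist(honcho_client: HonchoClient | None, chat_id: str, **kwargs) -> None:
    """Skip persistence rather than fail when Honcho can't be used."""
    if honcho_client is None:   # Honcho disabled or failed to initialize
        return
    if not chat_id:             # Nothing to attach the message to
        return
    # Errors raised while persisting are logged, never re-raised.
    create_persist_task(honcho_client=honcho_client, chat_id=chat_id, **kwargs)
```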
## HTTP Service Integration
Location: `src/youlab_server/server/main.py`
### Initialization
Honcho is initialized during service startup:
```python
@asynccontextmanager
async def lifespan(app: FastAPI):
    # ... other initialization ...

    if settings.honcho_enabled:
        app.state.honcho_client = HonchoClient(
            workspace_id=settings.honcho_workspace_id,
            api_key=settings.honcho_api_key,
            environment=settings.honcho_environment,
        )
        honcho_ok = app.state.honcho_client.check_connection()
        log.info("honcho_initialized", connected=honcho_ok)
    else:
        app.state.honcho_client = None
        log.info("honcho_disabled")
```

### Health Endpoint
The `/health` endpoint reports Honcho status:
{ "status": "ok", "letta_connected": true, "honcho_connected": true, "version": "0.1.0"}Chat Endpoints
Both `/chat` and `/chat/stream` persist messages:
- User message - Persisted before sending to Letta
- Agent response - Persisted after receiving from Letta (see the sketch below)
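A hedged sketch of that ordering inside a `/chat` handler, using the fire-and-forget helper shown earlier; the `ChatRequest` model, handler signature, and the Letta placeholder are illustrative, not YouLab's actual endpoint:

```python
from fastapi import FastAPI, Request
from pydantic import BaseModel

from youlab_server.honcho.client import create_persist_task

app = FastAPI()

class ChatRequest(BaseModel):
    user_id: str
    chat_id: str
    message: str
    chat_title: str | None = None

@app.post("/chat")
async def chat(request: Request, payload: ChatRequest):
    honcho = request.app.state.honcho_client  # None when Honcho is disabled

    # 1. Persist the user message before sending it to Letta
    create_persist_task(
        honcho_client=honcho,
        user_id=payload.user_id,
        chat_id=payload.chat_id,
        message=payload.message,
        is_user=True,
        chat_title=payload.chat_title,
        agent_type="tutor",
    )

    # 2. Get the tutor's reply from Letta (placeholder for the real call)
    reply = "..."

    # 3. Persist the agent response after receiving it
    create_persist_task(
        honcho_client=honcho,
        user_id=payload.user_id,
        chat_id=payload.chat_id,
        message=reply,
        is_user=False,
        chat_title=payload.chat_title,
        agent_type="tutor",
    )
    return {"response": reply}
```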
## Configuration
### Environment Variables
| Variable | Default | Description |
|---|---|---|
| `YOULAB_SERVICE_HONCHO_ENABLED` | `true` | Enable persistence |
| `YOULAB_SERVICE_HONCHO_WORKSPACE_ID` | `youlab` | Workspace ID |
| `YOULAB_SERVICE_HONCHO_API_KEY` | `null` | API key (production) |
| `YOULAB_SERVICE_HONCHO_ENVIRONMENT` | `demo` | Environment |
### Environments
| Environment | Use Case | API Key |
|---|---|---|
| `demo` | Development/testing | Not required |
| `local` | Local Honcho server | Not required |
| `production` | Production deployment | Required |
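Putting the two tables together, a production-style configuration might look like this (the API key value is a placeholder):

```bash
YOULAB_SERVICE_HONCHO_ENABLED=true
YOULAB_SERVICE_HONCHO_WORKSPACE_ID=youlab
YOULAB_SERVICE_HONCHO_ENVIRONMENT=production
YOULAB_SERVICE_HONCHO_API_KEY=your-honcho-api-key   # required when environment=production
```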
## Metadata
Messages include metadata for context:
```python
metadata = {
    "chat_id": "chat456",
    "agent_type": "tutor",
    "chat_title": "Essay Brainstorming",  # Optional
    "user_id": "user123",                 # Agent messages only
}
```

## Related Pages
- Architecture - System overview with Honcho
- HTTP Service - Chat endpoint details
- Background Agents - Background agents using dialectic queries
- Agent Tools - Agent tools including query_honcho
- Configuration - Environment variables
- Roadmap - ToM integration plans