Memory Integration

Chat automatically manages conversation memory through the Memory system. It handles conversation history, summarization, and user profiles.

Memory Features

Chat integrates with Memory to provide:
  • Full Session Memory: Complete conversation history stored in InteractionSession.chat_history
  • Summary Memory: Automatic conversation summarization (requires model configuration)
  • User Analysis Memory: User profile tracking and analysis (requires model configuration)

Basic Memory Usage

from upsonic import Agent, Chat

agent = Agent("openai/gpt-4o")

chat = Chat(
    session_id="session1",
    user_id="user1",
    agent=agent,
    full_session_memory=True
)

# Chat automatically uses memory
await chat.invoke("My name is Alice")
await chat.invoke("What's my name?")  # Agent remembers from previous message

Advanced Memory Configuration

from upsonic import Agent, Chat

agent = Agent("openai/gpt-4o")

chat = Chat(
    session_id="session1",
    user_id="user1",
    agent=agent,
    full_session_memory=True,
    summary_memory=True,  # Requires model (uses agent.model)
    user_analysis_memory=True,  # Requires model (uses agent.model)
    num_last_messages=50,  # Limits to last 50 message pairs
    feed_tool_call_results=True  # Include tool calls in history
)
Note: summary_memory and user_analysis_memory require the agent’s model to be configured. They use agent.model for LLM-based summarization and user trait extraction.

Memory with Custom Storage

from upsonic import Agent, Chat
from upsonic.storage.providers import SqliteStorage

storage = SqliteStorage("sessions", "profiles", "chat.db")
agent = Agent("openai/gpt-4o")

chat = Chat(
    session_id="session1",
    user_id="user1",
    agent=agent,
    storage=storage,
    full_session_memory=True,
    summary_memory=True
)

# Memory persists across sessions
await chat.invoke("Remember: I prefer dark mode")
# Later, in a new session with same user_id
chat2 = Chat(
    session_id="session2",
    user_id="user1",
    agent=agent,
    storage=storage,
    full_session_memory=True,
    summary_memory=True
)
await chat2.invoke("What's my preference?")  # Agent remembers

User Profile Schema

Use a custom Pydantic model for structured user profiles:
from pydantic import BaseModel
from upsonic import Agent, Chat

class UserProfile(BaseModel):
    name: str
    preferences: dict

agent = Agent("openai/gpt-4o")

chat = Chat(
    session_id="session1",
    user_id="user1",
    agent=agent,
    user_profile_schema=UserProfile,
    user_analysis_memory=True
)
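With user_analysis_memory enabled, traits mentioned during the conversation are extracted into the configured schema as the chat progresses. A minimal sketch, continuing the example above (the retrieval API for the stored profile is not shown here):
# Traits such as the name and preferences are captured into UserProfile
# by the user-analysis step as the conversation progresses.
await chat.invoke("Hi, I'm Alice. I prefer short, bullet-point answers.")
await chat.invoke("Please also keep code examples in Python.")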

Dynamic Profile Schema

Enable automatic schema generation from conversations:
from upsonic import Agent, Chat

agent = Agent("openai/gpt-4o")

chat = Chat(
    session_id="session1",
    user_id="user1",
    agent=agent,
    dynamic_user_profile=True,  # Automatically generate schema fields
    user_analysis_memory=True  # Required for dynamic profiles
)

# The schema will be automatically generated based on user conversations
# Fields are extracted as the user mentions information
When dynamic_user_profile=True, the system automatically identifies 2-5 relevant traits from conversations and creates a dynamic schema. This is useful when you don’t know the user profile structure in advance. Note: If both dynamic_user_profile=True and user_profile_schema are provided, the dynamic schema takes precedence.
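A minimal sketch, continuing the example above (which traits become schema fields is decided by the model, so the exact fields may vary):
# The analysis step can derive fields such as a name, an occupation,
# or a formatting preference from turns like these.
await chat.invoke("I'm Bob, I work in finance and I like concise summaries.")
await chat.invoke("Keep answers under three sentences when possible.")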

Memory Modes

The user_memory_mode parameter controls how user profiles are updated:
  • update: Incrementally update the user profile, merging new traits with existing ones (default)
  • replace: Replace the user profile completely on each update
from upsonic import Agent, Chat

agent = Agent("openai/gpt-4o")

chat = Chat(
    session_id="session1",
    user_id="user1",
    agent=agent,
    user_analysis_memory=True,
    user_memory_mode="update"  # or "replace"
)
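A minimal sketch of the default update mode, continuing the example above: traits from earlier turns are kept and merged with newly discovered ones, whereas "replace" rebuilds the profile on each analysis pass.
# With user_memory_mode="update", the name from the first turn is kept
# when the second turn adds a new preference to the profile.
await chat.invoke("My name is Carol.")
await chat.invoke("I prefer metric units.")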

Limiting Message History

The num_last_messages parameter limits the conversation history fed to the agent to the last N request-response pairs:
from upsonic import Agent, Chat

agent = Agent("openai/gpt-4o")

chat = Chat(
    session_id="session1",
    user_id="user1",
    agent=agent,
    full_session_memory=True,
    num_last_messages=20  # Keep last 20 request-response pairs
)
This helps manage context size and reduce token costs for long conversations.
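A minimal sketch, continuing the example above: in a long-running conversation, only the most recent 20 request-response pairs are fed back to the agent on each turn, so older turns eventually drop out of the context.
# Once more than 20 request-response pairs have accumulated,
# the oldest pairs are no longer included in the agent's context.
for question in ["Question 1", "Question 2", "Question 3"]:
    await chat.invoke(question)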

Tool Call Filtering

By default, tool calls are excluded from memory. Enable them with feed_tool_call_results:
from upsonic import Agent, Chat

agent = Agent("openai/gpt-4o")

chat = Chat(
    session_id="session1",
    user_id="user1",
    agent=agent,
    full_session_memory=True,
    feed_tool_call_results=True  # Include tool execution results
)
When feed_tool_call_results is False (the default), tool-related messages are filtered out of chat_history to keep memory focused on user interactions.