class SimpleChatMemory
A simple chat memory class that stores and manages a rolling history of user and assistant message pairs with a configurable maximum history length.
File: /tf/active/vicechatdev/OneCo_hybrid_RAG copy.py
Lines: 685-716
Complexity: simple
Purpose
SimpleChatMemory provides a lightweight mechanism for maintaining conversational context in chatbot applications. It stores message exchanges between users and assistants, automatically trims old messages when the history exceeds the configured limit, and formats the conversation history for injection into prompts. This is useful for maintaining context in multi-turn conversations with language models.
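To show where this fits in practice, here is a minimal, hedged sketch of a multi-turn loop; `generate_reply(prompt)` is a hypothetical stand-in for whatever language-model call the application uses and is not part of this module:

```python
# Minimal sketch of a multi-turn loop around SimpleChatMemory.
# generate_reply(prompt) is a hypothetical stand-in for an LLM call.
def chat_turn(memory: "SimpleChatMemory", user_text: str) -> str:
    # Inject the rolling history into the prompt so the model sees prior turns.
    history = memory.get_formatted_history()
    prompt = (f"{history}\n" if history else "") + f"User: {user_text}\nAssistant:"

    reply = generate_reply(prompt)  # hypothetical LLM call

    # Persist this exchange so the next turn includes it in the history.
    memory.save_context(
        {'role': 'user', 'content': user_text},
        {'role': 'assistant', 'content': reply},
    )
    return reply
```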
Source Code
```python
class SimpleChatMemory:
    """
    A bare-bones memory mechanism that stores a list of
    user and assistant messages.
    """
    def __init__(self, max_history: int = 3):
        self.messages = []
        self.max_history = max_history

    def save_context(self, user_msg: Dict[str, str], assistant_msg: Dict[str, str]):
        """
        Expects something like:
          user_msg = {'role': 'user', 'content': 'Hello'}
          assistant_msg = {'role': 'assistant', 'content': 'Hi!'}
        """
        self.messages.append(user_msg)
        self.messages.append(assistant_msg)
        # Trim messages if we exceed max_history*2
        if len(self.messages) > self.max_history * 2:
            self.messages = self.messages[-(self.max_history):]

    def get_formatted_history(self) -> str:
        """
        Return the last N pairs of messages as a formatted string
        that can be injected into a prompt.
        """
        lines = []
        for msg in self.messages[-(self.max_history * 2):]:
            role = msg['role']
            content = msg['content']
            lines.append(f"{role.title()}: {content}")
        return "\n".join(lines)
```
Parameters
| Name | Type | Default | Kind |
|---|---|---|---|
| max_history | int | 3 | positional or keyword |
Parameter Details
max_history: The maximum number of message pairs (user + assistant) to retain in memory. Default is 3, meaning up to 3 user messages and 3 assistant responses will be kept. When exceeded, older messages are automatically removed to maintain this limit.
Return Value
Instantiation returns a SimpleChatMemory object. The get_formatted_history() method returns a string containing formatted conversation history with each message on a new line in the format 'Role: content'. The save_context() method returns None but modifies internal state.
Class Interface
Methods
__init__(self, max_history: int = 3)
Purpose: Initializes the SimpleChatMemory instance with an empty message list and configurable history limit
Parameters:
max_history: Maximum number of message pairs to retain (default: 3)
Returns: None (constructor)
save_context(self, user_msg: Dict[str, str], assistant_msg: Dict[str, str])
Purpose: Stores a user-assistant message pair and automatically trims history if it exceeds the maximum limit
Parameters:
user_msg: Dictionary with 'role' key set to 'user' and 'content' key containing the user's message
assistant_msg: Dictionary with 'role' key set to 'assistant' and 'content' key containing the assistant's response
Returns: None (modifies internal state)
get_formatted_history(self) -> str
Purpose: Returns the conversation history as a formatted string suitable for prompt injection, with each message on a new line
Returns: A newline-separated string where each line is formatted as 'Role: content' (e.g., 'User: Hello\nAssistant: Hi!')
Attributes
| Name | Type | Description | Scope |
|---|---|---|---|
| messages | List[Dict[str, str]] | List storing all message dictionaries in chronological order, where each dictionary contains 'role' and 'content' keys | instance |
| max_history | int | Maximum number of message pairs to retain in memory before trimming occurs | instance |
Dependencies
typing
Required Imports
from typing import Dict
Usage Example
```python
# Create a memory instance with max 3 message pairs
memory = SimpleChatMemory(max_history=3)

# Save a conversation exchange
user_msg = {'role': 'user', 'content': 'What is Python?'}
assistant_msg = {'role': 'assistant', 'content': 'Python is a programming language.'}
memory.save_context(user_msg, assistant_msg)

# Save another exchange
user_msg2 = {'role': 'user', 'content': 'Is it easy to learn?'}
assistant_msg2 = {'role': 'assistant', 'content': 'Yes, Python is beginner-friendly.'}
memory.save_context(user_msg2, assistant_msg2)

# Get formatted history for prompt injection
history = memory.get_formatted_history()
print(history)
# Output:
# User: What is Python?
# Assistant: Python is a programming language.
# User: Is it easy to learn?
# Assistant: Yes, Python is beginner-friendly.
```
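Because the formatted history is plain text, it can be dropped into any prompt template. The sketch below is purely illustrative (the template wording is not part of the class) and assumes the `memory` instance from the example above:

```python
# Minimal sketch: embed the formatted history in a prompt template.
history = memory.get_formatted_history()
prompt = (
    "You are a helpful assistant. Continue the conversation below.\n\n"
    f"{history}\n"
    "User: Can you give me an example?\n"
    "Assistant:"
)
# `prompt` can now be sent to whatever completion API the application uses.
```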
Best Practices
- Always pass messages as dictionaries with 'role' and 'content' keys to ensure proper formatting
- The max_history parameter controls message pairs, so the actual message count will be max_history * 2
- Messages are automatically trimmed when the limit is exceeded, with oldest messages removed first
- Use get_formatted_history() to retrieve conversation context for prompt injection rather than accessing messages directly
- The class maintains state through the messages list, so create separate instances for different conversation threads
- Note that the trimming logic in save_context has a bug: the slice self.messages[-(self.max_history):] keeps only max_history individual messages (half the intended window) instead of max_history pairs; it should be self.messages[-(self.max_history * 2):]. See the corrected sketch after this list
- This is a simple in-memory storage solution with no persistence; messages are lost when the object is destroyed
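A minimal sketch of the corrected trimming slice referenced in the note above; it assumes `from typing import Dict` is already in scope, as listed under Required Imports:

```python
# Corrected trimming: keep max_history *pairs*, i.e. max_history * 2 messages.
def save_context(self, user_msg: Dict[str, str], assistant_msg: Dict[str, str]):
    self.messages.append(user_msg)
    self.messages.append(assistant_msg)
    if len(self.messages) > self.max_history * 2:
        # Slice by message count (pairs * 2), not by pair count.
        self.messages = self.messages[-(self.max_history * 2):]
```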
Similar Components
AI-powered semantic similarity - components with related functionality:
- class Reminder (42.6% similar)
- class ChatGptRequestOptions (40.8% similar)
- class TeamChannelManager (40.3% similar)
- class PresenceStatusMessage (39.4% similar)
- class Conversation (39.3% similar)