@ GHOST
2025-02-12 02:20:27

I fought with making this work for two days before I finally figured out that I had made a simple SQL formatting mistake. I asked ChatGPT to write a tutorial on how to do it based on what I did and the work I did with it. Sharing it in case anyone finds it useful.
# Enabling Persistent Memory in Open WebUI with Ollama

This tutorial walks you through enabling persistent memory for AI models running locally with Ollama and Open WebUI on Debian Linux. By the end of this guide, your AI will be able to remember the last 20 conversations you've had with it.
## Prerequisites
- Debian Linux system
- Ollama installed and configured
- Open WebUI installed and running
## Step 1: Setting Up the Database for Persistent Memory
We'll use SQLite to store conversation history.
### 1.1 Create `conversation_memory.py`

Navigate to your Open WebUI backend directory and create a new file called `conversation_memory.py`:

```bash
cd /home/your_username/Documents/open-webui-0.5.10/backend
nano conversation_memory.py
```
Paste the following code into `conversation_memory.py`:

```python
import sqlite3

# Set DB_FILE to the absolute path of memory.db in the same directory as this script
DB_FILE = "/home/your_username/Documents/open-webui-0.5.10/backend/memory.db"


def init_db():
    """Create the database table if it doesn't exist."""
    conn = sqlite3.connect(DB_FILE)
    c = conn.cursor()
    c.execute("""
        CREATE TABLE IF NOT EXISTS memory (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            user TEXT,
            ai TEXT
        )
    """)
    conn.commit()
    conn.close()


def save_conversation(user_input, ai_response):
    """Save a conversation entry and keep only the last 20 entries."""
    try:
        conn = sqlite3.connect(DB_FILE)
        c = conn.cursor()
        c.execute("INSERT INTO memory (user, ai) VALUES (?, ?)", (user_input, ai_response))
        c.execute("""
            DELETE FROM memory
            WHERE id NOT IN (
                SELECT id FROM memory ORDER BY id DESC LIMIT 20
            )
        """)
        conn.commit()
        conn.close()
        print(f"Successfully saved: User - {user_input}, AI - {ai_response}")
    except Exception as e:
        print(f"Error saving conversation: {e}")


def get_last_conversations(limit=5):
    """Retrieve the last `limit` conversations, newest first."""
    try:
        conn = sqlite3.connect(DB_FILE)
        c = conn.cursor()
        c.execute("SELECT user, ai FROM memory ORDER BY id DESC LIMIT ?", (limit,))
        conversations = c.fetchall()
        conn.close()
        return conversations
    except Exception as e:
        print(f"Error retrieving conversations: {e}")
        return []


# Initialize the database when this module is imported or run
init_db()
```
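Before wiring this into Open WebUI, you can sanity-check the rolling 20-entry window with a standalone sketch that mirrors the same SQL against an in-memory database, so your real `memory.db` is untouched:

```python
import sqlite3

# Mirror the table and pruning logic from conversation_memory.py,
# but against an in-memory database so memory.db is untouched.
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("""
    CREATE TABLE memory (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        user TEXT,
        ai TEXT
    )
""")

# Insert 25 fake exchanges, then prune to the newest 20
for i in range(25):
    c.execute("INSERT INTO memory (user, ai) VALUES (?, ?)", (f"q{i}", f"a{i}"))
c.execute("""
    DELETE FROM memory
    WHERE id NOT IN (
        SELECT id FROM memory ORDER BY id DESC LIMIT 20
    )
""")
conn.commit()

count = c.execute("SELECT COUNT(*) FROM memory").fetchone()[0]
oldest = c.execute("SELECT user FROM memory ORDER BY id ASC LIMIT 1").fetchone()[0]
print(count, oldest)  # 20 q5 — the five oldest rows were pruned
conn.close()
```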
## Step 2: Integrating Memory into Open WebUI Middleware
We'll modify the Open WebUI middleware to save and retrieve conversations.
### 2.1 Edit `middleware.py`

Open the `middleware.py` file for editing:

```bash
nano middleware.py
```
### 2.2 Import Memory Functions

At the top of the file, import the memory functions:

```python
from conversation_memory import save_conversation, get_last_conversations
```
### 2.3 Retrieve and Append Conversation History
Locate the function responsible for processing chat payloads. Add the following code to retrieve and append the last 20 conversations:
```python
# Retrieve past conversations (e.g., the last 20 exchanges, newest first)
conversation_history = get_last_conversations(limit=20)

# Format past conversations as context
history_text = "\n".join(
    f"User: {conv[0]}\nAI: {conv[1]}" for conv in conversation_history
)

# Append conversation history to the current user message
user_message = get_last_user_message(form_data["messages"])
if history_text:
    combined_message = (
        f"Previous conversation:\n{history_text}\n\n"
        f"New message:\nUser: {user_message}"
    )
else:
    combined_message = f"User: {user_message}"

# Update the last user message with the combined history
form_data["messages"][-1]["content"] = combined_message
```
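To see exactly what the model ends up receiving, here is a self-contained sketch of the same formatting logic with fake history rows; the hard-coded `user_message` stands in for what Open WebUI's `get_last_user_message()` would return:

```python
# Fake rows in the (user, ai) shape returned by get_last_conversations()
conversation_history = [("Hi", "Hello!"), ("What's 2+2?", "4")]

# Same formatting as in the middleware snippet above
history_text = "\n".join(
    f"User: {conv[0]}\nAI: {conv[1]}" for conv in conversation_history
)

user_message = "Do you remember me?"  # stand-in for get_last_user_message()
if history_text:
    combined_message = (
        f"Previous conversation:\n{history_text}\n\n"
        f"New message:\nUser: {user_message}"
    )
else:
    combined_message = f"User: {user_message}"

print(combined_message)
```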
### 2.4 Save New Conversations
Ensure that new conversations are saved after the AI generates a response. Add the following code where the AI response is handled:
```python
# Extract the AI response content
if isinstance(ai_response, dict) and "choices" in ai_response:
    ai_response_content = ai_response["choices"][0]["message"]["content"]
else:
    ai_response_content = ""

# Save the new conversation, skipping empty responses
if ai_response_content.strip():
    save_conversation(user_message, ai_response_content)
```
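The snippet above assumes an OpenAI-style response dict. You can verify the extraction logic in isolation with a hand-built response (the content string is just an example):

```python
# Hand-built OpenAI-style response, the shape the middleware snippet expects
ai_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Paris is the capital of France."}}
    ]
}

# Same extraction logic as above
if isinstance(ai_response, dict) and "choices" in ai_response:
    ai_response_content = ai_response["choices"][0]["message"]["content"]
else:
    ai_response_content = ""

print(ai_response_content)  # Paris is the capital of France.
```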
## Step 3: Testing Persistent Memory
### 3.1 Run the Script to Test Saving

Run `conversation_memory.py` to ensure it's saving data correctly:

```bash
python3 /home/your_username/Documents/open-webui-0.5.10/backend/conversation_memory.py
```
### 3.2 Query the Database to Verify Data

Use SQLite to check whether conversations are being saved:

```bash
sqlite3 /home/your_username/Documents/open-webui-0.5.10/backend/memory.db
sqlite> SELECT * FROM memory;
```
You should see your test conversations listed.
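If you'd rather check from Python than the `sqlite3` shell, a small helper along these lines works too (`dump_memory` is just an illustrative name, not part of the tutorial's code):

```python
import sqlite3


def dump_memory(db_path, limit=5):
    """Return the last `limit` saved exchanges as (id, user, ai) rows, newest first."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT id, user, ai FROM memory ORDER BY id DESC LIMIT ?", (limit,)
    ).fetchall()
    conn.close()
    return rows


# Example:
# dump_memory("/home/your_username/Documents/open-webui-0.5.10/backend/memory.db")
```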
## Step 4: Final Verification in Open WebUI
- Restart the Open WebUI server to apply changes.
- Start a conversation with the AI.
- After several interactions, verify that the AI references past conversations.
- Query `memory.db` again to ensure new conversations are being saved:

```bash
sqlite3 /home/your_username/Documents/open-webui-0.5.10/backend/memory.db
sqlite> SELECT * FROM memory;
```
## Conclusion
You’ve successfully enabled persistent memory for your AI models running with Ollama and Open WebUI! The AI will now remember the last 20 conversations, creating a more dynamic and personalized user experience.
Feel free to adjust the memory limit or expand the functionality as needed. Happy coding!
Advocating for privacy does not finance itself. If you enjoyed this article, please consider zapping or sending Monero:
82XCDNK1Js8TethhpGLFPbVyKe25DxMUePad1rUn9z7V6QdCzxHEE7varvVh1VUidUhHVSA4atNU2BTpSNJLC1BqSvDajw1