
# mumpu

Middleware for Universal Memory Persistence and Understanding

A transparent HTTP relay proxy that adds long-term memory to any LLM application. Point your tools at mumpu and it remembers everything across sessions — extracts knowledge, builds connections, and injects relevant context automatically.

```
  Your tools ──► mumpu (:8420) ──► OpenAI / Anthropic / Gemini
                   │
              Extracts memories
              Builds connections
              Injects context
```
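The context-injection step above can be sketched in a few lines. This is a hypothetical illustration, not mumpu's actual request handling: the function name and memory format are assumptions. The idea is that the relay prepends retrieved memories as a system message to the outgoing chat request, leaving everything else untouched.

```python
# Hypothetical sketch of how a relay proxy could inject memory context.
# `inject_context` and the memory format are illustrative, not mumpu's real API.

def inject_context(request_body: dict, memories: list[str]) -> dict:
    """Prepend retrieved memories as a system message, leaving the rest intact."""
    if not memories:
        return request_body
    context = "Relevant memories from past sessions:\n" + "\n".join(
        f"- {m}" for m in memories
    )
    body = dict(request_body)  # shallow copy; original request is not mutated
    body["messages"] = [{"role": "system", "content": context}, *request_body["messages"]]
    return body

original = {"model": "gpt-4o", "messages": [{"role": "user", "content": "Where did we leave off?"}]}
patched = inject_context(original, ["User prefers TypeScript", "Project uses SQLite"])
print(patched["messages"][0]["role"])  # the injected system message now leads the conversation
```

Because the tool on the other side just sees a normal chat request with one extra system message, no client changes are needed.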

## Quick Start

```bash
# Install from a local clone (editable mode, with dev extras)
pip install -e ".[dev]"

# Start the proxy with TUI dashboard
mumpu start

# In another terminal, use Claude through the proxy
export ANTHROPIC_BASE_URL=http://localhost:8420
mumpu claude
```

Open http://localhost:8420/dashboard to see the memory graph grow in real time.

## How It Works

- **Middleware** — relays traffic between your tools and the provider API
- **Universal** — works with any tool and any provider (OpenAI, Anthropic, Gemini)
- **Memory** — extracts and stores knowledge from conversations
- **Persistence** — memories survive across sessions in SQLite
- **Understanding** — smart retrieval over graph-based connections, not dumb storage
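A toy version of a SQLite-backed memory graph can make the last three points concrete. The schema here (a `memories` table plus typed `edges`) is an assumption for illustration only; mumpu's actual schema may differ.

```python
import sqlite3

# Toy memory graph: nodes are extracted facts, edges connect related facts.
# Table and column names are assumptions for illustration, not mumpu's real schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE memories (id INTEGER PRIMARY KEY, fact TEXT NOT NULL);
CREATE TABLE edges (src INTEGER REFERENCES memories(id),
                    dst INTEGER REFERENCES memories(id),
                    relation TEXT NOT NULL);
""")
conn.executemany("INSERT INTO memories (id, fact) VALUES (?, ?)", [
    (1, "User is building a CLI tool"),
    (2, "The CLI tool is written in Rust"),
    (3, "User prefers verbose error messages"),
])
conn.executemany("INSERT INTO edges VALUES (?, ?, ?)", [
    (1, 2, "elaborates"),
    (1, 3, "preference_of"),
])

# Graph-based retrieval: find a matching fact, then pull its connected neighbours
# instead of returning only the literal keyword match.
rows = conn.execute("""
    SELECT m2.fact
    FROM memories m1
    JOIN edges e ON e.src = m1.id
    JOIN memories m2 ON m2.id = e.dst
    WHERE m1.fact LIKE '%CLI%'
""").fetchall()
print([fact for (fact,) in rows])
```

The point of the graph hop is that a query about the CLI also surfaces the user's error-message preference, which plain keyword storage would miss.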
