# localgpt-app/localgpt

A local-first AI assistant built in Rust, with persistent memory, autonomous tasks, and a ~27 MB binary. Inspired by and compatible with OpenClaw.

cargo install localgpt

  • Single binary — no Node.js, Docker, or Python required
  • Local-first — runs entirely on your machine; your data stays yours
  • Persistent memory — markdown-based knowledge store with full-text and semantic search
  • Autonomous heartbeat — delegate tasks and let it work in the background
  • Multiple interfaces — CLI, web UI, desktop GUI
  • Multiple LLM providers — Anthropic (Claude), OpenAI, Ollama
  • OpenClaw compatible — works with SOUL, MEMORY, HEARTBEAT markdown files and skills format

# Initialize configuration
localgpt config init

# Start interactive chat
localgpt chat

# Ask a single question
localgpt ask "What is the meaning of life?"

# Run as a daemon with heartbeat, HTTP API, and web UI
localgpt daemon start

LocalGPT uses plain markdown files as its memory:

~/.localgpt/workspace/
├── MEMORY.md            # Long-term knowledge (auto-loaded each session)
├── HEARTBEAT.md         # Autonomous task queue
├── SOUL.md              # Personality and behavioral guidance
└── knowledge/           # Structured knowledge bank (optional)
    ├── finance/
    ├── legal/
    └── tech/
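
As a rough illustration of how a markdown workspace like the one above can be consumed, here is a hypothetical Python sketch (the loader function and its behavior are illustrative assumptions, not LocalGPT's actual code; only the file names come from the tree above):

```python
import tempfile
from pathlib import Path

# Build a throwaway workspace shaped like the tree above.
workspace = Path(tempfile.mkdtemp()) / "workspace"
(workspace / "knowledge" / "tech").mkdir(parents=True)
(workspace / "SOUL.md").write_text("# Personality\nBe concise.\n")
(workspace / "MEMORY.md").write_text("# Long-term knowledge\n- prefers Rust\n")

def load_session_context(ws: Path) -> str:
    """Hypothetical helper: concatenate the auto-loaded markdown files
    into one prompt preamble, personality first."""
    parts = []
    for name in ("SOUL.md", "MEMORY.md"):
        f = ws / name
        if f.exists():
            parts.append(f.read_text())
    return "\n\n".join(parts)

context = load_session_context(workspace)
print("prefers Rust" in context)  # True
```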

Files are indexed with SQLite FTS5 for fast keyword search, and with sqlite-vec for semantic search over local embeddings.
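The FTS5 half of that can be sketched with nothing but Python's stdlib sqlite3 module (assuming the interpreter's SQLite is compiled with FTS5, as most are; the table and column names here are illustrative, not LocalGPT's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Illustrative schema: one FTS5 row per markdown file.
conn.execute("CREATE VIRTUAL TABLE notes USING fts5(path, body)")
conn.executemany(
    "INSERT INTO notes VALUES (?, ?)",
    [
        ("MEMORY.md", "User prefers Rust and terse answers."),
        ("knowledge/tech/sqlite.md", "FTS5 supports prefix and phrase queries."),
    ],
)
# Full-text keyword search, best match first (FTS5 is case-insensitive
# for ASCII by default, so "rust" matches "Rust").
rows = conn.execute(
    "SELECT path FROM notes WHERE notes MATCH ? ORDER BY rank", ("rust",)
).fetchall()
print(rows)  # [('MEMORY.md',)]
```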

Configuration is stored at ~/.localgpt/config.toml:

[agent]
default_model = "claude-cli/opus"

[providers.anthropic]
api_key = "$💬"

[heartbeat]
enabled = true
interval = "30m"
active_hours = 💬

[memory]
workspace = "~/.localgpt/workspace"

# Chat
localgpt chat                     # Interactive chat
localgpt chat --session <id>      # Resume session
localgpt ask "question"           # Single question

# Daemon
localgpt daemon start             # Start background daemon
localgpt daemon stop              # Stop daemon
localgpt daemon status            # Show status
localgpt daemon heartbeat         # Run one heartbeat cycle

# Memory
localgpt memory search "query"    # Search memory
localgpt memory reindex           # Reindex files
localgpt memory stats             # Show statistics

# Config
localgpt config init              # Create default config
localgpt config show              # Show current config

When the daemon is running:

| Endpoint | Description |
|---|---|
| GET /health | Health check |
| GET /api/status | Server status |
| POST /api/chat | Chat with the assistant |
| GET /api/memory/search?q= | Search memory |
| GET /api/memory/stats | Memory statistics |
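
With the daemon running, these endpoints can be hit from any HTTP client. A minimal Python sketch that builds (but does not send) a POST /api/chat request — the localhost port and the "message" field name are assumptions, so verify them against your daemon's actual bind address and request schema:

```python
import json
import urllib.request

# Assumed bind address -- check `localgpt daemon status` for the real one.
BASE = "http://127.0.0.1:3000"

def chat_request(message: str) -> urllib.request.Request:
    """Build a POST /api/chat request (field name is an assumption)."""
    body = json.dumps({"message": message}).encode()
    return urllib.request.Request(
        f"{BASE}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("Summarize MEMORY.md")
print(req.full_url)      # http://127.0.0.1:3000/api/chat
print(req.get_method())  # POST
# To actually send it (daemon must be running):
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```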

Built with Rust, Tokio, Axum, SQLite (FTS5 + sqlite-vec), fastembed, and eframe.

Licensed under Apache-2.0.

Posted on February 8, 2026 (Unix timestamp 1770518550).
