Rowboat | Your AI coworker, with memory



Introduction

Rowboat is an open-source, local-first AI coworker designed to turn everyday work documents, emails, and meeting transcripts into an enduring, transparent knowledge graph. Unlike typical RAG systems that search files on demand and start cold with every query, Rowboat builds and structures context over time into an Obsidian-compatible vault of plain Markdown files. Drawing on this local ‘working memory’, Rowboat can autonomously draft context-rich emails, synthesize meeting briefs, generate PDF roadmaps, and execute complex workflows through local LLMs (via Ollama or LM Studio) or cloud APIs, augmented by Model Context Protocol (MCP) integrations.

Use Cases

  • Meeting Intelligence & Preparation
    Automatically pulls past decisions, commitments, open questions, and relevant conversational threads to generate a comprehensive brief or audio note before an upcoming meeting.
  • Automated Presentation & Document Generation
    Translates high-level prompts like ‘Build me a deck about our next quarter roadmap’ into tangible artifacts like PDFs by pulling explicit priorities directly from your local knowledge graph.
  • Multi-Platform Live Tracking
    Creates ‘Live Notes’ when you type @rowboat in any document, continuously tracking and summarizing competitors, projects, or market deals across Reddit, X (Twitter), and live news feeds.
  • Context-Aware Content Drafting
    Drafts highly customized emails, memos, and project roadmaps that require deep historical familiarity without forcing you to re-explain foundational details.
  • Sovereign Personal Knowledge Base
    Serves as a completely private, offline-capable ‘Second Brain’ that records voice notes, transcribes them and extracts takeaways via Deepgram, and keeps all data under your control.

Features & Benefits

  • Local-First Markdown Architecture
    Stores all notes and structural backlinks on your local disk as plain, human-inspectable, and editable Markdown text files, entirely avoiding proprietary data lock-in.
  • Model Context Protocol (MCP) Hub
    Natively integrates with external tools, CRMs, or databases via MCP and Composio.dev, enabling the AI to interact with live web services and internal environments.
  • Bring Your Own Model (BYOM)
    Fully decoupled from specific LLM providers; easily swaps between local offline setups (Ollama, LM Studio) and high-end cloud endpoints while keeping your data layer intact.
  • Persistent Memory Compounding
    Maintains a running, long-lived conceptual graph of relationships between people, projects, and key decisions that continuously refines itself through user feedback.
  • Multimodal Audio Pipeline
    Supports optional integrations with Deepgram for voice transcription and ElevenLabs for realistic spoken briefings, controlled entirely via local configuration files.
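
The local-first Markdown architecture above can be pictured with a short sketch: because the memory is just a vault of plain Markdown files, the backlink graph can be rebuilt by scanning for Obsidian-style `[[wiki-links]]`. This is a minimal, hypothetical illustration of the idea, not Rowboat's actual code; the function name and vault layout are assumptions.

```python
import re
from pathlib import Path

# Captures the target of [[Target]], [[Target|alias]], or [[Target#heading]]
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def backlink_graph(vault: Path) -> dict[str, set[str]]:
    """Map each note name to the set of notes that link to it."""
    graph: dict[str, set[str]] = {}
    for note in vault.glob("**/*.md"):
        text = note.read_text(encoding="utf-8")
        for match in WIKILINK.finditer(text):
            target = match.group(1).strip()
            graph.setdefault(target, set()).add(note.stem)
    return graph
```

Because every edge in the graph is derived from human-readable text, editing a note in any Markdown editor immediately changes what the AI "remembers".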

Pros

  • Complete Data Privacy & Ownership
    Ideal for privacy-sensitive or enterprise use cases, as data never leaves your local machine unless explicitly routed through your chosen external APIs.
  • Human-Editable AI Memory
    Because the underlying knowledge graph is composed of plain Markdown files, you can manually tweak, update, or prune connections to correct errors or hallucinated links.
  • Highly Extensible Agent Logic
    Leverages the standardized MCP layer to turn standard LLMs into highly capable, cooperating agents that interact natively with your desktop and local terminal.
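
To make the MCP layer concrete: MCP messages are JSON-RPC 2.0 envelopes, and a tool invocation uses the `tools/call` method with a tool name and arguments. The sketch below builds such a request; the tool name and arguments are hypothetical, and a real client would send this over stdio or HTTP to an MCP server (e.g., one wired up through Composio.dev).

```python
import json
from itertools import count

_request_ids = count(1)  # JSON-RPC requests need unique ids

def mcp_tool_call(tool: str, arguments: dict) -> str:
    """Serialize an MCP `tools/call` request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_request_ids),
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })
```

The standardized envelope is what lets any MCP-aware model drive any MCP server: the LLM only has to emit a tool name and arguments, and the protocol layer handles the rest.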

Cons

  • Manual Configuration Overhead
    Users must set up local directories, configure API keys in JSON for modular tools (e.g., Exa, ElevenLabs), and route local model endpoints themselves.
  • Hardware Demands
    Running background agent tasks alongside continuous local model inference or vector operations requires a reasonably capable local machine.
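
The configuration overhead above might look something like this sketch: a loader that merges a JSON config file over defaults and lets environment variables supply API keys. The file shape, service names, and default endpoint are assumptions for illustration (the endpoint shown is Ollama's usual local default), not Rowboat's documented format.

```python
import json
import os
from pathlib import Path

# Hypothetical config shape; Rowboat's actual file layout may differ.
DEFAULTS = {
    "model_endpoint": "http://localhost:11434",  # common Ollama default
    "api_keys": {},
}

def load_config(path: Path) -> dict:
    """Merge a JSON config file over defaults; env vars override API keys."""
    cfg = dict(DEFAULTS)
    if path.exists():
        cfg.update(json.loads(path.read_text(encoding="utf-8")))
    for service in ("EXA", "ELEVENLABS", "DEEPGRAM"):
        key = os.environ.get(f"{service}_API_KEY")
        if key:
            cfg.setdefault("api_keys", {})[service.lower()] = key
    return cfg
```

Keeping keys in environment variables rather than the JSON file itself means the on-disk vault and config can be synced or shared without leaking credentials.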

