Rememberizer MCP Servers

Configure and use Rememberizer MCP servers to connect your AI assistants with your knowledge

The Model Context Protocol (MCP) is a standardized protocol for integrating AI models with external data sources and tools. It uses a client-server architecture, making it easier to build complex workflows and agents with greater flexibility and security.
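
To make the client-server exchange concrete: MCP messages are JSON-RPC 2.0. The sketch below (plain Python, no SDK; the `tools/list` method name comes from the MCP specification) builds the request a client sends to discover which tools a server exposes:

```python
import json

def make_tools_list_request(request_id: int) -> str:
    """Serialize a JSON-RPC 2.0 request asking an MCP server
    to enumerate the tools it exposes (the MCP tools/list method)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    })

print(make_tools_list_request(1))
```

In practice an MCP client (such as Claude Desktop) sends this for you; it is shown here only to illustrate the wire format.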

Rememberizer MCP Server

The Rememberizer MCP Server is an MCP server tailored for interacting with Rememberizer's document and knowledge management API. It allows LLMs to efficiently search, retrieve, and manage documents and integrations. The server is available as a public package on mcp-get.com and as an open-source project on GitHub.

Integration Options

The Rememberizer MCP Server can be installed and integrated through multiple methods:

Via mcp-get.com

npx @michaellatman/mcp-get@latest install mcp-server-rememberizer

Via Smithery

npx -y @smithery/cli install mcp-server-rememberizer --client claude

Via SkyDeck AI Helper App

If you have SkyDeck AI Helper app installed, you can search for "Rememberizer" and install the mcp-server-rememberizer.

SkyDeck AI Helper

Tools Available

The Rememberizer MCP Server provides the following tools for interacting with your knowledge repository:

  1. retrieve_semantically_similar_internal_knowledge

    • Finds semantically similar matches from your Rememberizer knowledge repository

    • Parameters:

      • match_this (string, required): The text to find matches for (up to 400 words)

      • n_results (integer, optional): Number of results to return (default: 5)

      • from_datetime_ISO8601 (string, optional): Filter results from this date

      • to_datetime_ISO8601 (string, optional): Filter results until this date

  2. smart_search_internal_knowledge

    • Performs an agentic search across your knowledge sources

    • Parameters:

      • query (string, required): Your search query (up to 400 words)

      • user_context (string, optional): Additional context for better results

      • n_results (integer, optional): Number of results to return (default: 5)

      • from_datetime_ISO8601 (string, optional): Filter results from this date

      • to_datetime_ISO8601 (string, optional): Filter results until this date

  3. list_internal_knowledge_systems

    • Lists all your connected knowledge sources

    • No parameters required

  4. rememberizer_account_information

    • Retrieves your Rememberizer account details

    • No parameters required

  5. list_personal_team_knowledge_documents

    • Returns a paginated list of all your documents

    • Parameters:

      • page (integer, optional): Page number for pagination (default: 1)

      • page_size (integer, optional): Documents per page (default: 100, max: 1000)

  6. remember_this

    • Saves new information to your Rememberizer knowledge system

    • Parameters:

      • name (string, required): Name to identify this information

      • content (string, required): The information to memorize
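
Once connected, an MCP client invokes one of the tools above with a `tools/call` request. A minimal sketch of that payload for the semantic-search tool (plain Python; the argument names match the parameters listed above, and the example query text is purely illustrative):

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 tools/call request for an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Ask for up to 3 semantically similar matches from the knowledge repository.
payload = make_tool_call(
    2,
    "retrieve_semantically_similar_internal_knowledge",
    {"match_this": "quarterly revenue projections", "n_results": 3},
)
print(payload)
```

Again, the client application constructs these messages automatically; you only phrase your request in natural language.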

Setup

Step 1: Sign up for a new Rememberizer account at rememberizer.ai.

Step 2: Add your knowledge to the Rememberizer platform by connecting sources such as Gmail, Dropbox, or Google Drive.

Step 3: To selectively share your knowledge, set up a Mementos Filter. This allows you to choose which information is shared and which remains private. (Guide here)

Step 4: Share your knowledge by creating a "Common Knowledge" (Guide here and here)

Step 5: To access your knowledge via APIs, create an API key (Guide here)

Step 6: If you're using Claude Desktop app, add this to your claude_desktop_config.json file.

{
  "mcpServers": {
    "rememberizer": {
      "command": "uvx",
      "args": ["mcp-server-rememberizer"],
      "env": {
        "REMEMBERIZER_API_TOKEN": "your_rememberizer_api_token"
      }
    }
  }
}
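
If Claude Desktop does not pick up the server, a common cause is malformed JSON in the config file. A quick sanity check (a hedged sketch, not an official tool; the `check_mcp_config` helper is our own):

```python
import json

def check_mcp_config(config_text: str, server_name: str) -> bool:
    """Return True if the config JSON defines the given MCP server
    with a command and a REMEMBERIZER_API_TOKEN env entry."""
    config = json.loads(config_text)  # raises ValueError on malformed JSON
    server = config.get("mcpServers", {}).get(server_name)
    return (
        server is not None
        and "command" in server
        and "REMEMBERIZER_API_TOKEN" in server.get("env", {})
    )

example = """
{
  "mcpServers": {
    "rememberizer": {
      "command": "uvx",
      "args": ["mcp-server-rememberizer"],
      "env": {"REMEMBERIZER_API_TOKEN": "your_rememberizer_api_token"}
    }
  }
}
"""
print(check_mcp_config(example, "rememberizer"))  # True
```

Paste the contents of your own claude_desktop_config.json in place of the example string to verify it before restarting Claude Desktop.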

Step 7: If you're using SkyDeck AI Helper app, add the env REMEMBERIZER_API_TOKEN to mcp-server-rememberizer.

Congratulations, you're done!

With support from the Rememberizer MCP server, you can now ask questions like these in your Claude Desktop app or SkyDeck AI GenStudio:

  • What is my Rememberizer account?

  • List all documents that I have there.

  • Give me a quick summary about "..."

Rememberizer Vector Store MCP Server

The Rememberizer Vector Store MCP Server lets LLMs interact with the Rememberizer Vector Store, supporting document management and retrieval through semantic similarity search.
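
To illustrate what "semantic similarity search" means at its core: documents and queries are mapped to embedding vectors, and candidates are ranked by how closely their vectors point in the same direction. A toy sketch (this is a conceptual illustration, not the Vector Store's actual implementation; real embeddings have hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy 3-dimensional "embeddings" for a query and two documents.
query = [0.9, 0.1, 0.2]
doc_a = [0.8, 0.2, 0.1]   # topically close to the query
doc_b = [0.1, 0.9, 0.7]   # topically distant

# The topically closer document scores higher.
print(cosine_similarity(query, doc_a) > cosine_similarity(query, doc_b))  # True
```

This is why the Vector Store can match documents that share meaning with a query even when they share no exact keywords.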

Integration Options

The Rememberizer Vector Store MCP Server can be installed through the same channels as the main Rememberizer MCP Server:

Via Smithery

npx -y @smithery/cli install mcp-rememberizer-vectordb --client claude

Via SkyDeck AI Helper App

If you have SkyDeck AI Helper app installed, you can search for "Rememberizer Vector Store" and install the mcp-rememberizer-vectordb.

SkyDeck AI Helper - Vector Store Installation

Installation

To install the Rememberizer Vector Store MCP Server, follow the guide here.

Setup

Step 1: Sign up for a new Rememberizer account at rememberizer.ai.

Step 2: Create a new Vector Store (Guide here)

Step 3: To manage your Vector Store via APIs, you need to create an API key (Guide here)

Step 4: If you're using Claude Desktop app, add this to your claude_desktop_config.json file.

{
  "mcpServers": {
    "rememberizer": {
      "command": "uvx",
      "args": ["mcp-rememberizer-vectordb"],
      "env": {
        "REMEMBERIZER_VECTOR_STORE_API_KEY": "your_rememberizer_vector_store_api_key"
      }
    }
  }
}

Step 5: If you're using SkyDeck AI Helper app, add the env REMEMBERIZER_VECTOR_STORE_API_KEY to mcp-rememberizer-vectordb.

Congratulations, you're done!

With support from the Rememberizer Vector Store MCP server, you can now ask questions like these in your Claude Desktop app or SkyDeck AI GenStudio:

  • What is my current Rememberizer vector store?

  • List all documents that I have there.

  • Give me a quick summary about "..."

Conclusion

The Rememberizer MCP Servers demonstrate the value of the Model Context Protocol by providing an efficient, standardized way to connect AI models with your knowledge. They let LLM agents search, retrieve, and manage documents with precision, using semantic search rather than keyword matching.
