Using LibreChat to host your own MCP chatbot server

Evin Callahan

07 Oct 2025 - 03 Mins read

Having Claude Desktop or Cline reach out to external services via MCP is powerful for development, but what if you want to deploy a specialized chatbot for end users? Imagine:

  • A customer service bot with access to your product catalog and documentation, so it can give high-quality answers
  • Users who need zero setup time to start using it
  • Users who can manage their accounts in your service directly from the chat

And it all takes less than a day's worth of effort to build and deploy!

Enter LibreChat: a fantastic open-source AI chat platform that makes this possible. It provides a production-ready interface with built-in support for Model Context Protocol (MCP) servers, allowing you to create custom AI assistants that connect to your business data and services.

Goals

By the end of this guide, you'll have:

  • A self-hosted LibreChat instance with OAuth authentication
  • MCP servers connected to your business data sources
  • A chatbot accessible to users through a web interface
  • A chatbot whose knowledge of your system is enhanced with pre-existing documentation and anything else you want your users to know about

Requirements

  • Docker Desktop installed on your system
  • OAuth Provider (Google, GitHub, Auth0, etc.) for user authentication
  • API Keys for your chosen LLM provider (OpenAI, Anthropic, etc.)
  • Basic familiarity with command line and YAML configuration

Getting LibreChat Up and Running

LibreChat uses Docker for easy initial setup, so we'll use it to get started.

Download and Install

Clone the LibreChat repository:

git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat

Configure secrets and settings

Copy the example environment file and configure your settings:

cp .env.example .env

Edit .env to add your LLM provider API keys. At minimum, you'll need:

# OpenAI Configuration
OPENAI_API_KEY=your_openai_key_here

# Or Anthropic
ANTHROPIC_API_KEY=your_anthropic_key_here

# MongoDB (included in docker-compose)
MONGO_URI=mongodb://mongodb:27017/LibreChat

# Session Security
CREDS_KEY=your_32_char_key_here
CREDS_IV=your_16_char_key_here
JWT_SECRET=your_jwt_secret_here
JWT_REFRESH_SECRET=your_refresh_secret_here

Use LibreChat's credentials generator tool to generate secure keys.
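If you prefer to stay on the command line, the same values can be generated locally with openssl. The lengths here follow the comments in .env.example (32-byte key, 16-byte IV, hex-encoded); double-check them against your LibreChat version:

```shell
# Generate the session-security values locally.
CREDS_KEY=$(openssl rand -hex 32)     # 64 hex characters (32 bytes)
CREDS_IV=$(openssl rand -hex 16)      # 32 hex characters (16 bytes)
JWT_SECRET=$(openssl rand -hex 32)
JWT_REFRESH_SECRET=$(openssl rand -hex 32)

# Print them in .env format so they can be pasted (or redirected) into .env
printf 'CREDS_KEY=%s\nCREDS_IV=%s\nJWT_SECRET=%s\nJWT_REFRESH_SECRET=%s\n' \
  "$CREDS_KEY" "$CREDS_IV" "$JWT_SECRET" "$JWT_REFRESH_SECRET"
```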

Launch LibreChat

Start the application with Docker:

docker compose up -d

LibreChat will be available at http://localhost:3080.

By default, anyone can sign up with any email address and no verification.
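For production you will usually want to turn open signup off once OAuth is in place. LibreChat exposes this through .env flags — a sketch; check the .env.example in your version for the exact names:

```
# Disable open email registration; social (OAuth) login remains available
ALLOW_REGISTRATION=false
ALLOW_SOCIAL_LOGIN=true
```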

Setting Up OAuth Authentication

For production deployments, configure OAuth to allow users to sign in with existing accounts:

Google OAuth Example

There are more detailed instructions for this on the LibreChat site, but a quick summary follows:

  1. Create OAuth credentials in Google Cloud Console
  2. Add authorized redirect URI: http://localhost:3080/oauth/google/callback
  3. Add to your .env:
DOMAIN_CLIENT=http://localhost:3080
DOMAIN_SERVER=http://localhost:3080

GOOGLE_CLIENT_ID=your_client_id
GOOGLE_CLIENT_SECRET=your_client_secret
GOOGLE_CALLBACK_URL=/oauth/google/callback

Restart LibreChat to apply changes.

docker compose restart

Users can now sign in with Google.

Creating and Connecting MCP Servers

MCP servers extend your chatbot's capabilities by connecting to external tools and data sources. LibreChat supports MCP through its librechat.yaml configuration file.

MCP Configuration

Create or edit librechat.yaml in your LibreChat directory:

version: 1.1.7

mcpServers:
  # Custom business MCP
  business-api:
    type: streamable-http
    url: https://api.yourbusiness.com/mcp
    headers:
      X-User-ID: "{{LIBRECHAT_USER_ID}}"
      Authorization: "Bearer ${API_TOKEN}"
    timeout: 30000
    serverInstructions: true

The key authentication piece here is the business-api.headers object: it forwards the credentials established when the user logged in to LibreChat, so the MCP server and the upstream service know which user is making each request.
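Remote streamable-http servers aren't the only option: LibreChat can also spawn local stdio MCP servers. A minimal sketch using the reference filesystem server from the MCP project (adjust the shared path for your setup):

```yaml
mcpServers:
  filesystem:
    type: stdio
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-filesystem"
      - /path/to/shared/files
```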

After adding MCP servers, restart LibreChat.
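Note that when running under Docker, librechat.yaml also has to be mounted into the container. A docker-compose.override.yml along these lines does it, assuming the default api service name (see the LibreChat docs on compose overrides):

```yaml
# docker-compose.override.yml
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```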

Creating Agents with MCP Tools

To keep all of the agent configuration in one place (i.e. docs, instructions, MCP configurations, etc.), create an Agent in LibreChat:

  1. Navigate to the Agents section on the right sidebar
  2. Click Create Agent
  3. Select Add Tools and choose the MCP servers you want the agent to use
  4. Configure which tools from each server to enable (enabling all of them is a reasonable starting point)
  5. Be sure to click the "Share" button and give all users the ability to see this agent. Without this, only you will have access to it.
  6. [Optional] Add documents to the agent; they provide additional knowledge the agent can draw on during every chat request.
  7. Save your agent

Users can then select this agent from the chat interface to access the connected MCP tools and documentation.

Getting Graphics and Code

Artifacts enable your chatbot to generate interactive content like React components, HTML pages, and Mermaid diagrams that users can view, edit, and download.

Enabling Artifacts

Configure artifacts at the agent level for granular control:

  1. Create or edit an agent
  2. Enable the Artifacts toggle
  3. Save the agent

When users interact with this agent, generated code and diagrams will appear in a side panel where they can be previewed, edited, and exported.

Use Cases for Artifacts

  • Code Generation: Create React components or HTML templates
  • Data Visualization: Generate charts and graphs with Mermaid
  • Prototyping: Rapidly iterate on UI designs
  • Documentation: Create interactive examples and tutorials

Next Steps

  • Deploy to production through your server provider of choice (Digital Ocean, AWS, GCP)
  • Explore the LibreChat documentation for advanced configuration
  • Join the LibreChat discord for support and ideas
  • Build custom MCP servers for your specific use cases

Start building your AI-powered chatbot today with LibreChat and MCP!
