Using LLMs with MotherDuck
MCP Server
What is MCP?
The Model Context Protocol (MCP) is an open standard that enables AI assistants to interact with external data sources and tools. Think of MCP like a USB-C port for AI applications - it provides a standardized way to connect AI models to different data sources and tools. Learn more at modelcontextprotocol.io.
Purpose
MotherDuck's DuckDB MCP Server implements this protocol to allow AI assistants like Claude, or AI IDEs like Cursor to directly interact with your local DuckDB or MotherDuck cloud databases. It enables conversational SQL analytics without complex setup, letting you analyze your data through natural language conversations.
Key Features
- Query data from local DuckDB and/or MotherDuck cloud databases
- Access data in cloud storage (AWS S3) through MotherDuck's integrations
- Execute SQL analytics using natural language requests
Getting Started
While the MCP server can connect to MotherDuck, you can also use it without any cloud connection for purely local DuckDB operations. Find out more about connecting to local DuckDB here.
To use the MCP server with MotherDuck, you'll need:
- A MotherDuck account and access token
- Claude Desktop, Cursor, VS Code or another MCP-compatible client
Setup guides:
If the MCP server is exposed to third parties and should only have read access to data, we recommend using a read scaling token and running the MCP server in SaaS mode.
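As a rough sketch of what a client configuration might look like — assuming a Claude Desktop-style `mcpServers` config file and the `uvx`-installable `mcp-server-motherduck` package; check the MCP Server repository for the exact command and flag names:

```json
{
  "mcpServers": {
    "motherduck": {
      "command": "uvx",
      "args": [
        "mcp-server-motherduck",
        "--db-path", "md:",
        "--motherduck-token", "<YOUR_MOTHERDUCK_TOKEN>"
      ]
    }
  }
}
```

For a purely local setup, you would point `--db-path` at a local `.duckdb` file instead of `md:` and omit the token; for the read-only SaaS scenario described above, you would supply a read scaling token as the token value.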
For detailed information, view our MCP Server repository.
llms.txt
You can access the DuckDB and MotherDuck documentation in Markdown format at motherduck.com/docs/llms.txt and duckdb.org/docs/stable/llms.txt. These files are designed to help Large Language Models (LLMs) answer questions about DuckDB and MotherDuck based on the official documentation.
Purpose
The llms.txt file follows the emerging llmstxt.org standard for organizing documentation in a format optimized for AI assistants. It helps tools like ChatGPT, LangChain agents, and other LLMs:
- Discover relevant information about MotherDuck’s features and capabilities
- Understand and explain MotherDuck’s SQL dialect (including DuckDB and MotherDuck-specific syntax)
- Assist with integration and setup questions
- Troubleshoot common issues more effectively
By pointing AI tools to our llms.txt, you make it easier for them to provide accurate, up-to-date answers based on our official documentation.
Available Files
MotherDuck
- llms.txt: Contains key information about MotherDuck's features and capabilities.
- llms-full.txt: Comprehensive documentation covering all MotherDuck pages.
DuckDB
- llms.txt: Focused on DuckDB's SQL dialect and features.
- llms-full.txt: Full documentation for DuckDB.
Difference between llms.txt and llms-full.txt
While llms.txt is an index with hyperlinks to documentation pages, the llms-full.txt file provides the complete text of all documentation pages for either MotherDuck or DuckDB. It is designed for scenarios where the AI tool cannot navigate hyperlinks, and offers a way to make the full documentation accessible at once. It requires a model with a large context window (>128k tokens).
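Because llms-full.txt only pays off when the whole file fits in the model's context window, it can help to estimate its token count before pasting it in. A minimal sketch, assuming the common rule of thumb of roughly four characters per token for English prose (actual tokenizers vary):

```python
def rough_token_count(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English prose.
    return len(text) // 4

def fits_context(text: str, context_tokens: int = 128_000) -> bool:
    # Check whether a document is likely to fit in a model's context window.
    return rough_token_count(text) <= context_tokens
```

If the full file does not fit, fall back to the indexed llms.txt and let the tool follow links, or paste only the relevant sections.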
Usage Tips
- For MotherDuck-specific features and capabilities, use the MotherDuck llms.txt or llms-full.txt.
- For DuckDB SQL dialect-specific questions, refer to the DuckDB llms.txt or llms-full.txt.
- If unsure, add both files to the context to ensure comprehensive coverage of your question.
Example Usage
To prompt an LLM with questions using the llms.txt or llms-full.txt files:
- Copy the content from an available llms.txt or llms-full.txt file.
- Use the following prompt format:
Documentation:
{paste documentation here}
---
Based on the above documentation, answer the following:
{your question about DuckDB or MotherDuck}
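If you are assembling such prompts programmatically, the format above is easy to template. A minimal sketch — the `fetch_docs` helper and its default URL are illustrative, and fetching requires network access:

```python
import urllib.request

def fetch_docs(url: str = "https://motherduck.com/docs/llms.txt") -> str:
    # Download an llms.txt file; requires network access.
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def build_prompt(docs: str, question: str) -> str:
    # Wrap pasted documentation and a question in the prompt format above.
    return (
        "Documentation:\n"
        f"{docs}\n"
        "---\n"
        "Based on the above documentation, answer the following:\n"
        f"{question}"
    )
```

For example, `build_prompt(fetch_docs(), "What is MCP?")` produces a prompt ready to paste into (or send to) your LLM of choice.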
When using Cursor, the llms-full.txt file can be added directly to the chat context with @ (Weblink), e.g. @motherduck.com/docs/llms-full.txt.