No More Writing SQL for Quick Analysis

2026/01/21

This video demonstrates how to set up and use the MotherDuck MCP (Model Context Protocol) server with Claude Desktop for AI-powered data analysis without writing SQL.

What You'll Learn

  • What is MCP? The Model Context Protocol is a standard that lets AI assistants like Claude interact with external tools. Think of it like a USB interface—you can connect any MCP server to your AI tool and perform actions like querying databases.

  • Quick Setup with Claude Desktop: Add the MotherDuck MCP connector in just a few clicks. Go to Settings → Connectors → Browse, search for MotherDuck, and authenticate with your free MotherDuck account.

Three Practical Use Cases

1. Query a Cloud Database

Ask broad questions like "give me some analytics on the Hacker News database" and watch Claude automatically explore the schema, run queries, and return insights—including top stories, most active users, and data shape analysis across 40+ million rows.
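Under the hood, a broad prompt like this gets translated into ordinary SQL. A hedged sketch of the kind of query Claude might generate — the `sample_data.hn.hacker_news` table follows MotherDuck's shared sample database, and the exact column names are assumptions:

```sql
-- Top Hacker News stories by score.
-- "by" is quoted because it is a SQL keyword.
SELECT title, "by" AS author, score
FROM sample_data.hn.hacker_news
WHERE type = 'story'
ORDER BY score DESC
LIMIT 10;
```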

2. Query Private Files on S3

Connect to private Parquet files stored in AWS S3 by setting up AWS credentials in MotherDuck. The video shows querying a movie embeddings dataset and running vector similarity searches to find movies similar to Toy Story or Batman.
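The AI's workflow here maps to plain DuckDB SQL. A hedged sketch of what the generated queries might look like — the bucket path and the `title`/`embedding` column names are hypothetical, not the video's actual dataset, and depending on how the embeddings are stored, `array_cosine_similarity` may apply instead of `list_cosine_similarity`:

```sql
-- Let MotherDuck pick up S3 credentials (e.g. from your AWS SSO/CLI session).
CREATE SECRET my_s3 (TYPE S3, PROVIDER credential_chain);

-- Find the movies whose embeddings are closest to Toy Story's.
WITH movies AS (
    SELECT * FROM read_parquet('s3://my-private-bucket/movies.parquet')
)
SELECT m.title,
       list_cosine_similarity(m.embedding, t.embedding) AS similarity
FROM movies m,
     (SELECT embedding FROM movies WHERE title = 'Toy Story') t
WHERE m.title <> 'Toy Story'
ORDER BY similarity DESC
LIMIT 5;
```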

3. Query Public APIs

Use DuckDB's HTTP capabilities through Claude to query public APIs like GitHub's repository language statistics. Get insights like "C++ dominates data infrastructure" across repos like DuckDB, ClickHouse, Spark, and Trino.
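DuckDB can read a JSON endpoint directly through its `httpfs` extension. A minimal sketch — the GitHub endpoint is real, but the exact query the AI generated is not shown in the video:

```sql
INSTALL httpfs;
LOAD httpfs;

-- GitHub returns one JSON object mapping each language to its byte count,
-- so this yields a single row with one column per language.
SELECT *
FROM read_json_auto('https://api.github.com/repos/duckdb/duckdb/languages');
```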

Key Takeaways

  • No SQL required: Natural language questions are converted to SQL automatically
  • Self-correcting: When queries fail, Claude iterates and fixes errors without manual intervention
  • Read-only and safe: The MCP server only performs read queries, so there's no risk of data modification
  • Fast cloud queries: Data stays in the cloud—nothing transits to your laptop

Perfect for exploratory data analysis, quick insights, and anyone who wants to leverage AI for analytics without deep SQL knowledge.

0:00 Hello, data nerds. In this video, you'll learn how to quickly get insights on any data using MotherDuck. The only thing you need to know is SQL. I'm just kidding, AI will actually do that for you. So you just need to speak English, or whatever language you're more comfortable with.

0:27 So we'll be able to ask natural language questions in our AI tool and get back results, thanks to the MotherDuck MCP server. MCP, in a nutshell, is a standard for LLMs to interact with external tools.

0:40 Think of the MCP protocol like a USB interface: you can connect any MCP server to your LLM, which acts as an MCP client and can perform the actions defined in that MCP server. For us, that action is querying a database. We'll do a deep dive on MCP on this channel, but that's for another video. So yeah, you'll want to subscribe to

1:00 this channel so you don't miss it. To set up your MCP client, you usually have two options in your LLM tool, whether it's ChatGPT, Claude, Cursor, or others. Option one is an approved remote MCP server, where the setup is just a few clicks. Option two, you run the MCP server locally. In this video,

1:19 we'll go with option one, since the easiest route is Claude Desktop, where the MotherDuck MCP server is already approved.

1:27 At the time of recording this video, we are in the process of getting verified on other popular platforms as well, so check the docs in the description for the latest support on that topic. On Claude Desktop, you can add the MCP server via Settings, then Connectors, then Browse connectors, and then you can search for

1:46 the MotherDuck MCP connector. You'll be prompted to go through web authentication. If you don't have an account on MotherDuck yet, you can create one for free. Then you'll be prompted to open the Claude app again, and the configuration is done. And as you can see, if we go back to the

2:08 settings, I can now see all the actions that my MCP server can do: I can ask documentation questions, list columns, list databases, and the most important one, of course, query the data. On the

2:25 right side, I can change how the LLM will ask for my permission (or not) before running an action. If you allow everything by default, the loop is faster, because if the LLM makes an error, it's not going to prompt again to confirm a specific action; it'll just do it directly. But that also carries more

2:47 risk. I'm going to activate YOLO mode here anyway, because the MCP server can only run read-only SQL queries, so there's no real danger. All right, so now let's confirm that the MCP is actually working. We're going to ask: can you list my MotherDuck DBs? A really simple action. You see, it's done an

3:08 action to list databases, and it can list all the databases I have, whether it's a MotherDuck share or another database. So

3:18 now we're all set to continue and start asking questions in plain English. So first: give me some analytics on the Hacker News

3:28 database. I gave it a really broad question on purpose. Of course, the more accurate and narrower your prompt is, the better the results. But it's just to show you that for simple exploratory data analysis, this is really,

3:46 really good. You see, it's getting the columns, getting the schema, and then performing relevant queries for analytics. So now I have more time, I guess, to talk with you, our viewers. All right, so now it's giving me some information. You see, it's already quite a large table.

4:05 We have more than 40 million rows, plus the date range. So it's done some min/max queries just to get an idea of the shape of the data, and basically the number of comments, and of course some analytics on the most popular stories, and you see you have the relevant years, and

4:27 then other analytics, like the most active users and so forth. That's the first use case. For the second use case, we're going to query a file on object storage, so not stored on MotherDuck: typically a Parquet file on AWS S3, and in this case it's in a private bucket.

4:45 So the first thing I'm going to do is log in using the AWS CLI with my SSO

4:53 credentials. When that's done, I can launch any local DuckDB client, but the easiest one is the DuckDB CLI client. I'm

5:03 going to connect to MotherDuck using the ATTACH 'md:' command. So now I'm authenticated to MotherDuck from the DuckDB CLI, and then I create a secret in MotherDuck.

5:14 I'm refreshing my AWS credentials in MotherDuck. You can also go to the web UI, head over to the Secrets tab, and then

5:25 add it manually there. So now I can ask: give me some analytics on that

5:32 file, and I'm going to copy-paste the path to my private file in the private bucket. Because MotherDuck has access to

5:42 this bucket, I'm leveraging the cloud network, so nothing actually transits to my laptop, and of course the bandwidth for the query, whether it's a large file or not, is faster. And as you can see, there are sometimes some errors, but the LLM just iterates and

6:03 fixes things based on the errors it's getting. So there's really no copy-pasting for me to say "hey, this query failed, try something else." It's doing that automatically for me. So now I can see that the movies Parquet file is actually a dataset with embeddings: a vector-ready dataset, good for semantic search or

6:25 recommendation systems. And the LLM is actually asking me: do you want to run some sample vector similarity searches?

6:36 So let's actually do that. Now it's going to query using some specific DuckDB functions to do a vector similarity search. And as a result, I have some information about movies similar to Toy Story or Batman, as you can see, with the similarity scores. All right. For a third example, we're going to push things a bit

6:58 further and try to query a public API, again with just MotherDuck and DuckDB

7:05 behind the scenes, because it can query an HTTP endpoint and parse a JSON response from an API really well. So I'm going to ask: use the MotherDuck MCP to query some common

7:21 data repos to see what's the top programming language used. For instance, I'm giving it the example of the GitHub

7:33 endpoint for the DuckDB repo. This endpoint returns the amount of each programming language in a given repo, as byte counts. And you see it's going to go through multiple repositories; I haven't actually specified which repositories it should look at, again just a wide, broad question about data repositories. And now it starts to give me a nice

8:01 top language per repository, and you see we have Trino, ClickHouse, Spark, and what the top language is in each of those. And it's actually given a specific conclusion and key insights: C++ dominates data infrastructure; DuckDB, ClickHouse, and Arrow are all C++ based; Rust is rising; Python is everywhere, but mostly as bindings,

8:28 not for core engines; and the JVM still matters. I have to disagree with that, but that's for another video. That's it for this video. We saw how easy it is to set up the MotherDuck MCP server using Claude, and we didn't write any SQL. The

8:47 LLM just iterates by itself, loops if there are errors, or suggests things, and we can get first insights really quickly with really broad questions. I didn't push the prompts at all. If you want to see other AI workflows like this, let me know in the comments what you would like to see, and in the meantime, take care of your ducks, and

9:08 I'll see you in the next one. Cheers!

FAQS

What is the Model Context Protocol (MCP)?

MCP is a standard for large language models to interact with external tools. Think of MCP like a USB interface — you can connect any MCP server to your LLM, which acts as an MCP client, and it can perform actions defined in that server. With the MotherDuck MCP server, that action is querying a database using natural language instead of writing SQL.

What is an MCP server and how does it connect to an LLM?

An MCP server is a connector that gives an LLM access to external tools or data sources. Your LLM tool — whether it's Claude Desktop, ChatGPT, Cursor, or others — acts as the MCP client. You connect an MCP server to it, and the LLM can then perform the actions that server defines, like querying databases, listing tables, or searching documentation.

How do I set up the MotherDuck MCP server with Claude Desktop?

There are two options: use a remote MCP server (just a few clicks) or run one locally. The easiest route is the remote option in Claude Desktop. Go to Settings, then Connectors, browse for the MotherDuck MCP connector, and authenticate with your free MotherDuck account. Once configured, you can see all available actions — querying data, listing databases, listing columns, and asking documentation questions.
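The remote connector needs no code at all, but if you prefer to verify access from a local DuckDB client (the flow used later in the video when setting up S3 secrets), a minimal sketch from a local DuckDB CLI session looks like this — it opens browser-based authentication unless a MotherDuck token is already configured:

```sql
-- Attach your MotherDuck account to the local DuckDB session.
ATTACH 'md:';

-- Your cloud databases are now queryable locally.
SHOW DATABASES;
```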

Can I analyze data with AI without writing SQL?

Yes. With the MotherDuck MCP server, the LLM converts your natural language questions into SQL automatically. You can ask broad questions like "give me some analytics on this database" and it will discover the schema, run relevant queries, and return insights — all without you writing a single line of SQL. If a query fails, the LLM iterates and fixes the error on its own without any copy-pasting from you.

Is it safe to let an AI query my database through MCP?

Yes. The MotherDuck MCP server can only execute read-only SQL queries, so there is no risk of data being modified or deleted. You can also configure permission levels in your LLM tool to control whether the AI asks for confirmation before running each action, or allow it to run freely. Since all queries are read-only, there is no real danger either way.

What data sources can I query through the MotherDuck MCP server?

The video demonstrates three types of data sources: cloud databases hosted on MotherDuck (like a 40-million-row Hacker News dataset), private files on Amazon S3 (such as Parquet files, using your AWS credentials stored as secrets in MotherDuck), and public APIs over HTTP (like GitHub's repository endpoints). Because queries run in the cloud via DuckDB, nothing needs to transit to your laptop — even for large files.

How does the AI handle errors when querying data?

The LLM automatically iterates and self-corrects when a query fails. As shown in the video, when querying a Parquet file on S3, the AI encountered errors but fixed them on its own based on the error messages it received — no manual intervention or copy-pasting required. It loops through attempts, adjusts the SQL, and continues until it gets a valid result.

Related Videos

"MCP: Understand It, Set It Up, Use It" video thumbnail

9:09

2026-02-13

MCP: Understand It, Set It Up, Use It

Learn what MCP (Model Context Protocol) is, how its three building blocks work, and how to set up remote and local MCP servers. Includes a real demo chaining MotherDuck and Notion MCP servers in a single prompt.


" Preparing Your Data Warehouse for AI: Let Your Agents Cook" video thumbnail

2026-01-27

Preparing Your Data Warehouse for AI: Let Your Agents Cook

Jacob and Jerel from MotherDuck showcase practical ways to optimize your data warehouse for AI-powered SQL generation. Through rigorous testing with the Bird benchmark, they demonstrate that text-to-SQL accuracy can jump from 30% to 74% by enriching your database with the right metadata.


"The MCP Sessions - Vol 2: Supply Chain Analytics" video thumbnail

2026-01-21

The MCP Sessions - Vol 2: Supply Chain Analytics

Jacob and Alex from MotherDuck query data using the MotherDuck MCP. Watch as they analyze 180,000 rows of shipment data through conversational AI, uncovering late delivery patterns, profitability insights, and operational trends with no SQL required!
