Why web developers should care about analytical databases
2024/12/11
You often start your web application with an OLTP database like Postgres, but should you crunch your analytics there? In this video, Mehdi explains what OLAP databases are, when to use them, and more.
Transcript
0:00 If you are building web apps, whether frontend or backend, you're probably fine using Postgres or another transactional database for most of your use cases. But as soon as your app needs data-intensive features, like an analytics dashboard for users or insights on product usage, things can slow down. That's because transactional databases aren't built for complex analytical queries.
0:26 In the past you would often hand this off to a separate team with a specialized setup, but today the infrastructure is simpler and SQL has become the go-to tool for analytics, which makes the whole thing more accessible. In this video we quickly cover what analytical databases are, when to use them, how to move data from your OLTP database, and a practical example of
0:50 using an OLAP cloud service like MotherDuck, which is DuckDB in the cloud, directly in your Vercel application. Let's start with what an analytical database is. Analytical databases, or OLAP databases, are designed for querying and analyzing large datasets. Unlike transactional databases like Postgres, which are great at handling fast, small-scale operations like creating and updating records, OLAP
1:18 databases are optimized for heavy, read-intensive operations. They are built for complex queries like calculating averages across millions of rows, filtering data by multiple criteria, or aggregating metrics over time. They are also much faster at these operations because they store and process data differently, typically using columnar storage. In short, OLAP databases are ideal for scenarios where you need to crunch large
1:48 datasets to find trends, patterns, or insights.
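To make that concrete, here is a minimal sketch of the kind of query an OLAP engine is built for, using DuckDB's Python API; the `events.parquet` file and its columns are hypothetical stand-ins for your own data:

```python
# Hedged sketch: a typical analytical query, aggregating metrics
# over time across many rows. File and column names are made up.
import duckdb

con = duckdb.connect()  # in-memory analytical database

con.sql("""
    SELECT
        date_trunc('month', event_time) AS month,
        count(*)                        AS events,
        avg(duration_ms)                AS avg_duration_ms
    FROM 'events.parquet'
    WHERE event_type = 'page_view'
    GROUP BY month
    ORDER BY month
""").show()
```

A scan-and-aggregate like this over millions of rows is exactly where columnar storage pays off.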
2:13 So when should you use an analytical database? First, it's important to know that it's perfectly fine to start prototyping your analytics use case on your current transactional database, like Postgres. Many analytical projects begin like that, especially for smaller datasets or simple reporting. However, as your application grows and the complexity or volume of data increases, you'll likely hit a performance bottleneck. That's when you should consider moving to an analytical database, as you don't want these analytical queries consuming all the resources of your entire database. So here are some common scenarios for an analytical database. First, user-facing analytics: if your application needs to show users detailed analytics, like tracking usage trends or performance
2:42 metrics, OLAP databases make it easy to generate fast, interactive reports. The second one is product insights: if you want to understand how your users are interacting with your application, like which features are most popular or what leads to churn, an OLAP database lets you run exploratory queries efficiently. Yes, there are a lot of tools that cover
3:05 product insights, but at the end of the day you're going to have data that you need to crunch, and owning this data in your own database will give you much more freedom. The third one is combining data sources: if you need to merge data from multiple systems, like CRM data with application usage data, an analytical database simplifies this process by handling large, diverse datasets.
3:27 These use cases are not exclusive, of course, but they are the most common ones you might see. Now, how do you actually get data into an OLAP database? Let's cover, at a high level, three common methods. The first one is ETL pipelines. ETL stands for Extract, Transform, Load, and this is the standard
3:46 approach to move data: you extract it from your OLTP database, clean or reformat it, and load it into your OLAP database. You typically have a separate process, in Python or what have you, that moves that data. There are two classic approaches: either you load directly into the OLAP system, processing and writing the data straight into the analytical database, or you
4:10 offload to object storage: you can write your data to an object store like AWS S3, which gives you more flexibility to process the data later and leaves you free in your choice of processing tool, instead of leveraging the OLAP database directly.
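As a rough illustration of the batch approach, here is a minimal sketch, assuming a Postgres `orders` table and made-up connection details, that extracts recent rows, reshapes them in pandas, and loads them into a local DuckDB file (swapping the file path for an `md:` connection string would target MotherDuck instead):

```python
# Hedged ETL sketch: extract from Postgres, transform in pandas,
# load into DuckDB. Table, columns, and connection details are
# hypothetical placeholders for your own setup.
import duckdb
import pandas as pd
import psycopg2

# Extract: pull the last day of rows from the OLTP database.
pg = psycopg2.connect("dbname=app user=app host=localhost")
with pg.cursor() as cur:
    cur.execute(
        "SELECT id, user_id, amount, created_at FROM orders "
        "WHERE created_at >= now() - interval '1 day'"
    )
    rows = cur.fetchall()

# Transform: clean or reformat before loading.
df = pd.DataFrame(rows, columns=["id", "user_id", "amount", "created_at"])
df["amount"] = df["amount"].astype(float)

# Load: append into the analytical database. Alternatively, write
# Parquet to object storage (e.g. S3) here and load it later.
olap = duckdb.connect("analytics.duckdb")
olap.execute(
    "CREATE TABLE IF NOT EXISTS orders "
    "(id BIGINT, user_id BIGINT, amount DOUBLE, created_at TIMESTAMP)"
)
olap.execute("INSERT INTO orders SELECT * FROM df")
```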
4:31 The second is real-time streaming: if you need live updates for dashboards or analytics, you can use real-time streaming tools like Kafka or AWS Kinesis to move data. These are event streaming services, often integrated with change data capture (CDC) tools, that track and stream changes in real time. They are really great for capturing incremental updates and syncing them efficiently from your OLTP database into your OLAP database.
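For a feel of what streaming ingestion looks like, here is a hedged sketch that consumes simplified JSON change events from a Kafka topic and applies them to DuckDB; the topic name, message shape, and table are hypothetical, and a real CDC pipeline (e.g. Debezium) has richer envelopes and delivery guarantees:

```python
# Hedged streaming sketch: apply CDC-style change events from Kafka
# to an analytical copy of the table. All names are placeholders.
import json

import duckdb
from confluent_kafka import Consumer

olap = duckdb.connect("analytics.duckdb")
olap.execute("CREATE TABLE IF NOT EXISTS orders (id BIGINT, amount DOUBLE)")

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "olap-sync",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["app.public.orders"])  # hypothetical CDC topic

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    change = json.loads(msg.value())
    # Upsert: replace the analytical copy of the changed row.
    olap.execute("DELETE FROM orders WHERE id = ?", [change["id"]])
    olap.execute("INSERT INTO orders VALUES (?, ?)",
                 [change["id"], change["amount"]])
```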
4:55 And finally, the third way to move data is direct querying from your OLAP database. Some OLAP systems allow direct queries against your transactional database without moving the data or relying on another process. For example, DuckDB has a Postgres scanner: with DuckDB you can connect directly to Postgres and run analytical queries on your existing data. The pg_duckdb extension is another method: this is a new Postgres extension that embeds DuckDB directly inside Postgres, letting you
5:22 leverage DuckDB's analytical capabilities without additional infrastructure, staying within Postgres but also able to connect to MotherDuck directly. So the right method depends on your application's needs: real-time streaming is ideal for live dashboards, ETL is great for batch analytics, and direct querying works really well for smaller-scale use cases, as it's really easy to get started.
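To make the DuckDB side of this concrete, here is a minimal sketch using DuckDB's postgres extension (the Postgres scanner); the connection string and `orders` table are hypothetical:

```python
# Hedged sketch: run analytical queries against live Postgres data
# from DuckDB, without moving it first.
import duckdb

con = duckdb.connect()
con.execute("INSTALL postgres")
con.execute("LOAD postgres")
con.execute("ATTACH 'dbname=app host=localhost user=app' AS pg (TYPE postgres)")

# Aggregate directly over the transactional tables.
con.sql("""
    SELECT user_id, count(*) AS orders, sum(amount) AS revenue
    FROM pg.public.orders
    GROUP BY user_id
    ORDER BY revenue DESC
    LIMIT 10
""").show()
```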
5:46 So let's now dive into an example of how you can connect your web application to an OLAP database in your data stack, using Vercel and its native integration with MotherDuck, which runs DuckDB in the cloud. If you are not familiar with what DuckDB and MotherDuck are, I recommend you check the video linked in the description.
6:05 So in this use case, we'll query analytical data stored in MotherDuck and feed it directly into your application. With the native Vercel integration, you can create a MotherDuck account without ever leaving Vercel, streamlining the process with a single platform for both setup and building. To start with the integration, just go to the marketplace listing and pick MotherDuck. If you want to start
6:28 from a template, you can head over to the template listing and deploy a ready-made template that queries data from MotherDuck. In this video we are showcasing a Vercel data dashboard, and as you'll notice, it's incredibly fast and responsive. Here is why: first, it leverages MotherDuck's cloud for handling larger queries, and second, it uses DuckDB-Wasm, enabling an
6:50 analytical database to run directly in the browser. This approach takes advantage of the client's processing power, reducing extra I/O traffic, and the result is a smoother experience for the user and a lower compute cost for the developer. So let's deploy this example directly in our Vercel account: you just go over to the template, click on the deploy button, and we're going to
7:14 follow the steps from there. The first thing to do is specify which repository you want to clone to, and then basically click on 'Add storage' for the MotherDuck integration. Here you're going to specify the tier you want to start with (you can start with the free tier), and finally you give a database name. Behind the scenes, a new
7:33 organization and a new account in MotherDuck are going to be created, along with a database with the same name you just gave. So you can see everything is deployed: on my Vercel dashboard I can go to the Storage tab and see the MotherDuck integration, and we have two types of tokens populated. I'll leave
7:53 you more information about the exact difference between the MotherDuck read scaling token and the normal service token, but these two are essentially all you need to connect your application to MotherDuck. The MotherDuck token is a standard token for read and write, and the other one is really meant for read-only applications.
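For reference, a minimal sketch of using such a token from DuckDB's Python client; the database and table names are placeholders, and the token is read from the environment rather than hardcoded:

```python
# Hedged sketch: connect to MotherDuck over an md: connection string.
# duckdb picks up the token from the MOTHERDUCK_TOKEN environment
# variable; database and table names are placeholders.
import duckdb

con = duckdb.connect("md:my_database")
con.sql("SELECT count(*) FROM my_table").show()
```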
8:14 To wrap up, analytical databases unlock a world of possibilities for web developers. They help you handle data-intensive features like user dashboards, gain deeper insight into your product, and combine data from multiple sources, all without overloading your transactional database. And with modern tools and SQL as the common language, setting up these workflows has never been easier; it's just a click on
8:37 the template. So take care of your OLAP databases, and keep quacking. [Music]
Related Videos

2026-01-27
Preparing Your Data Warehouse for AI: Let Your Agents Cook
Jacob and Jerel from MotherDuck showcase practical ways to optimize your data warehouse for AI-powered SQL generation. Through rigorous testing with the Bird benchmark, they demonstrate that text-to-SQL accuracy can jump from 30% to 74% by enriching your database with the right metadata.
AI, ML and LLMs
SQL
MotherDuck Features
Stream
Tutorial
2026-01-21
The MCP Sessions - Vol 2: Supply Chain Analytics
Jacob and Alex from MotherDuck query data using the MotherDuck MCP. Watch as they analyze 180,000 rows of shipment data through conversational AI, uncovering late delivery patterns, profitability insights, and operational trends with no SQL required!
Stream
AI, ML and LLMs
MotherDuck Features
SQL
BI & Visualization
Tutorial
2026-01-13
The MCP Sessions Vol. 1: Sports Analytics
Watch us dive into NFL playoff odds and PGA Tour stats using MotherDuck's MCP server with Claude. See how to analyze data, build visualizations, and iterate on insights in real-time using natural language queries and DuckDB.
AI, ML and LLMs
SQL
MotherDuck Features
Tutorial
BI & Visualization
Ecosystem

