YouTube · AI, ML and LLMs · BI & Visualization

AI Powered BI: Can LLMs REALLY Generate Your Dashboards? ft. Michael Driscoll

2025/05/20

The Evolution of Business Intelligence: From Clicks to Code

Business Intelligence is experiencing a fundamental shift from traditional drag-and-drop interfaces to code-based approaches, accelerated by the rise of large language models (LLMs). This transformation promises to make data analytics more accessible while maintaining the precision and version control benefits that come with code-based solutions.

Understanding BI as Code

BI as code follows the same principles that revolutionized infrastructure management with infrastructure as code. Instead of configuring dashboards through complex user interfaces, the entire BI stack—data sources, models, semantic layers, and dashboards—exists as declarative code artifacts in a GitHub repository.

This approach offers several advantages:

  • Version control: Every change is tracked and reversible
  • Collaboration: Teams can review and contribute through standard development workflows
  • Expressiveness: Code provides far more flexibility than any UI could offer
  • AI compatibility: LLMs excel at generating and modifying code
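As a rough illustration, a BI-as-code repository might version a dashboard as a plain YAML file alongside its SQL models. The layout and field names below are hypothetical, not any specific tool's format:

```yaml
# dashboards/revenue.yaml — hypothetical dashboard definition
title: Revenue Overview
model: orders            # points to a SQL model checked into the same repo
timeseries: order_date
measures:
  - name: total_revenue
    expression: SUM(amount)
    format: usd
dimensions:
  - name: region
  - name: product_category
```

Because the whole definition is a text file, a change to a measure is a one-line diff that can be reviewed, reverted, or generated by an LLM.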

The Role of LLMs in Modern BI Workflows

Large language models are particularly well suited to BI as code frameworks. Unlike the low-code and no-code interfaces that trended a few years ago, which still required users to learn a proprietary UI, plain English has become the universal interface: LLMs take natural-language prompts and generate the code needed to build metrics, dashboards, and analyses.

Schema-First AI Generation

One of the most powerful approaches involves sending only the schema of a dataset to an LLM, rather than the entire dataset. This method:

  • Preserves data privacy and security
  • Reduces processing time significantly
  • Still provides enough context for intelligent metric generation
  • Allows the LLM to infer meaningful metrics based on column names and data types
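A minimal sketch of the schema-first idea: query only the table's structure and build the LLM prompt from that, so no row data is ever sent. SQLite is used here as a stdlib stand-in for whichever local engine you run; DuckDB exposes the same information via `DESCRIBE`.

```python
import sqlite3

# Build an LLM prompt from the schema alone (column names and types).
# The row inserted below never appears in the prompt.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT, order_date TEXT)")
con.execute("INSERT INTO orders VALUES (1, 19.99, 'EU', '2025-05-01')")

# PRAGMA table_info yields (cid, name, type, notnull, default, pk) per column.
schema = [(name, col_type) for _, name, col_type, *_ in con.execute("PRAGMA table_info(orders)")]

prompt = (
    "Suggest useful BI metrics for a table with these columns:\n"
    + "\n".join(f"- {name} ({col_type})" for name, col_type in schema)
)
print(prompt)
```

The prompt carries enough context for the model to propose metrics like revenue by region, while the actual data stays local.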

English vs SQL: Finding the Right Balance

The debate between using natural language versus SQL for analytics reveals interesting nuances:

When English Works Better

  • Simple data transformations
  • Date truncations and timezone conversions
  • Basic aggregations
  • Initial exploration of datasets

When SQL Excels

  • Complex window functions
  • Precise time-based calculations
  • Multi-step algorithms
  • Production-grade queries requiring exact specifications
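To make the contrast concrete, here is the kind of query where SQL's precision pays off: a per-region running total via a window function, something a vague English prompt leaves underspecified. SQLite serves as a stand-in engine; the same SQL runs unchanged in DuckDB.

```python
import sqlite3

# Per-region running revenue, ordered by day — an exact specification
# that "show me revenue building up over time by region" only gestures at.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, day TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("EU", "2025-05-01", 100.0), ("EU", "2025-05-02", 50.0), ("US", "2025-05-01", 80.0)],
)
rows = con.execute(
    """
    SELECT region, day,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM sales
    ORDER BY region, day
    """
).fetchall()
print(rows)  # [('EU', '2025-05-01', 100.0), ('EU', '2025-05-02', 150.0), ('US', '2025-05-01', 80.0)]
```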

The Time Complexity Challenge

A compelling example involves specifying time ranges. Asking for "revenue over the last three days" in English seems straightforward, but raises multiple questions:

  • Should it include today's partial data?
  • Should it truncate to complete hours?
  • When comparing periods, should partial days be handled consistently?
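The ambiguity is easy to see once you write both readings down. Code forces a choice that the English phrase quietly dodges:

```python
from datetime import datetime, timedelta

# Two defensible readings of "revenue over the last three days".
now = datetime(2025, 5, 20, 14, 37)  # a fixed "now" for reproducibility

# Reading 1: the trailing 72 hours, including today's partial data.
start_rolling = now - timedelta(days=3)

# Reading 2: the last three *complete* days, excluding today entirely.
today = now.replace(hour=0, minute=0, second=0, microsecond=0)
start_complete, end_complete = today - timedelta(days=3), today

print(start_rolling)                  # 2025-05-17 14:37:00
print(start_complete, end_complete)   # 2025-05-17 00:00:00 2025-05-20 00:00:00
```

A dashboard defined in code pins down one of these windows explicitly; a natural-language request leaves it to the model's guess.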

These nuances demonstrate why code-based approaches maintain their value even as natural language interfaces improve.

Local Development with DuckDB

DuckDB has emerged as a powerful engine for local BI development. By embedding DuckDB directly into BI tools, developers can:

  • Work with multi-million row datasets on their local machines
  • Get instant feedback on metric changes
  • Avoid cloud processing costs during development
  • Maintain complete data privacy

The combination of DuckDB's performance and BI as code principles creates a development experience similar to modern web development—write code, see immediate results, iterate quickly.

MCP Servers: Bridging AI and Data

Model Context Protocol (MCP) servers represent a new paradigm for AI-data interactions. Instead of developing separate integrations for each service, MCP provides a common interface that allows AI assistants to:

  • Query databases directly
  • Access real-time data
  • Generate insights without manual data export
  • Maintain context across multiple queries

Local MCP Implementation

Running MCP servers locally with tools like DuckDB offers unique advantages:

  • Complete data privacy—no data leaves your machine
  • Instant query execution
  • Direct access to local datasets
  • No cloud infrastructure required
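At the wire level, MCP is built on JSON-RPC 2.0, and an assistant invoking a server's tool sends a `tools/call` request. The sketch below shows the rough shape of such a message; the tool name `query` and its `sql` argument are hypothetical, since each server defines its own tools:

```python
import json

# Simplified shape of an MCP tool invocation (JSON-RPC 2.0).
# The "query" tool and its "sql" argument are invented for illustration;
# real tool names come from the server's advertised tool list.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query",
        "arguments": {"sql": "SELECT COUNT(*) FROM commits"},
    },
}
print(json.dumps(request, indent=2))
```

Because every server speaks this same protocol, the assistant needs no bespoke integration per database or service.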

Practical Implementation: From Data to Dashboard

The modern BI workflow demonstrates impressive efficiency:

  1. Data Import: Connect to data sources (S3, local files, databases)
  2. AI-Generated Metrics: LLMs analyze schemas and suggest relevant metrics
  3. Code Generation: Create YAML configurations for metrics and dashboards
  4. Instant Preview: See results immediately with local processing
  5. Iterative Refinement: Use natural language to adjust and improve

Real-World Example: GitHub Analytics

Analyzing GitHub commit data showcases the power of this approach:

  • Identify top contributors across multiple usernames
  • Analyze code changes by file type and category
  • Generate insights about project focus areas
  • Complete complex analyses in minutes rather than hours
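The "multiple usernames" wrinkle in the first bullet is a small data-cleaning step before aggregation. A stdlib sketch, with an alias map invented for illustration:

```python
from collections import Counter

# Commits often carry several author strings for the same person.
# Map aliases to one canonical name, then count. All names are made up.
aliases = {"jdoe": "Jane Doe", "jane-doe": "Jane Doe"}
commits = ["jdoe", "jane-doe", "alice", "jdoe", "alice", "bob"]

counts = Counter(aliases.get(author, author) for author in commits)
print(counts.most_common(2))  # [('Jane Doe', 3), ('alice', 2)]
```

In practice an LLM can draft both the alias map (from the observed author strings) and the aggregation query, which is where the minutes-not-hours speedup comes from.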

The Future of BI: Notebooks, Dashboards, or Both?

The evolution of BI interfaces suggests multiple modalities will coexist:

Traditional Dashboards

  • Best for monitoring key metrics
  • Daily business operations
  • Executive reporting
  • Standardized views

Notebook-Style Interfaces

  • Ideal for exploratory analysis
  • Root cause investigations
  • Iterative questioning
  • Sharing analytical narratives

Challenges and Considerations

Despite the excitement around AI-powered BI, several challenges remain:

Technical Hurdles

  • LLMs perform better with established coding patterns
  • Innovation in interface design may be constrained by AI training data
  • Documentation quality directly impacts AI effectiveness

Human Factors

  • Metadata often exists in people's heads, not systems
  • Business logic requires human understanding
  • Quality control remains essential

The Augmentation Perspective

Rather than replacement, AI in BI represents augmentation. Data professionals who embrace these tools will find their capabilities enhanced, not diminished. The technology handles routine tasks while humans provide context, validation, and strategic thinking.

Best Practices for AI-Powered BI

To maximize success with AI-powered BI:

  1. Follow conventions: Use established patterns that LLMs recognize
  2. Document thoroughly: Quality documentation improves AI performance
  3. Start with schemas: Let AI infer from structure before accessing data
  4. Maintain human oversight: Verify generated code and results
  5. Iterate quickly: Leverage local processing for rapid development

Looking Ahead

The convergence of BI as code, powerful local databases like DuckDB, and sophisticated LLMs creates unprecedented opportunities for data analysis. As these technologies mature, we can expect:

  • More sophisticated metric generation
  • Better handling of complex business logic
  • Seamless integration between exploration and production
  • Continued importance of human expertise in context and validation

The future of business intelligence isn't about choosing between humans or AI, dashboards or notebooks, clicks or code. It's about combining these elements to create more powerful, flexible, and accessible analytics experiences for everyone.

CONTENT
  1. The Evolution of Business Intelligence: From Clicks to Code
  2. Understanding BI as Code
  3. The Role of LLMs in Modern BI Workflows
  4. English vs SQL: Finding the Right Balance
  5. Local Development with DuckDB
  6. MCP Servers: Bridging AI and Data
  7. Practical Implementation: From Data to Dashboard
  8. The Future of BI: Notebooks, Dashboards, or Both?
  9. Challenges and Considerations
  10. Best Practices for AI-Powered BI
  11. Looking Ahead

Related Videos

  • Faster Data Pipelines development with MCP and DuckDB (YouTube, 2025-05-13): Discover how the Model Context Protocol (MCP) accelerates data pipeline development with AI tools, DuckDB, and MotherDuck, streamlining schema discovery and code generation for data engineers.
  • More Than a Vibe: AI-Driven SQL That Actually Works (Talk, 2025-04-22): Jacob Matson shares insights from AI-powered spatial data analysis, exploring how to "vibe code" with AI-generated SQL using MotherDuck and DuckDB for real-world decision-making scenarios.
  • A duck in the hand is worth two in the cloud (YouTube, 2024-11-08, 33:49): What if I told you that you could complete a JSON parse and extract task on your laptop before a distributed compute cluster even finishes booting up?