AI-Powered BI: Can LLMs REALLY Generate Your Dashboards? ft. Michael Driscoll
2025/05/20
The Evolution of Business Intelligence: From Clicks to Code
Business Intelligence is experiencing a fundamental shift from traditional drag-and-drop interfaces to code-based approaches, accelerated by the rise of large language models (LLMs). This transformation promises to make data analytics more accessible while maintaining the precision and version control benefits that come with code-based solutions.
Understanding BI as Code
BI as code follows the same principles that revolutionized infrastructure management with infrastructure as code. Instead of configuring dashboards through complex user interfaces, the entire BI stack—data sources, models, semantic layers, and dashboards—exists as declarative code artifacts in a GitHub repository.
This approach offers several advantages:
- Version control: Every change is tracked and reversible
- Collaboration: Teams can review and contribute through standard development workflows
- Expressiveness: Code provides far more flexibility than any UI could offer
- AI compatibility: LLMs excel at generating and modifying code
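As a concrete illustration, here is a minimal sketch of what such a declarative artifact might look like. The field names are hypothetical rather than any particular tool's format; the point is that the whole stack is plain text under version control:

```python
import yaml  # pip install pyyaml

# A hypothetical "BI as code" artifact: source, metric, and dashboard
# definitions living together as plain text in a Git repository.
DASHBOARD_SPEC = """
source:
  name: orders
  connector: s3
  path: s3://my-bucket/orders/*.parquet
metrics:
  - name: total_revenue
    expression: SUM(amount)
    description: Gross revenue across all orders
dashboard:
  title: Revenue Overview
  charts:
    - metric: total_revenue
      dimension: order_date
      type: line
"""

spec = yaml.safe_load(DASHBOARD_SPEC)
print(spec["dashboard"]["title"])        # Revenue Overview
print(spec["metrics"][0]["expression"])  # SUM(amount)
```

Because the artifact is just text, every change to a metric or chart shows up as a reviewable diff in a pull request.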
The Role of LLMs in Modern BI Workflows
Large language models are particularly well suited to BI as code frameworks. Where low-code and no-code interfaces were the trend a few years ago, plain English is now emerging as the universal interface: LLMs take natural language prompts and generate the code needed to build metrics, dashboards, and analyses.
Schema-First AI Generation
One of the most powerful approaches involves sending only the schema of a dataset to an LLM, rather than the entire dataset (see the sketch after this list). This method:
- Preserves data privacy and security
- Reduces processing time significantly
- Still provides enough context for intelligent metric generation
- Allows the LLM to infer meaningful metrics based on column names and data types
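A rough sketch of the idea with DuckDB, assuming a local orders.parquet file; the call_llm function is a placeholder for whichever LLM client you use:

```python
import duckdb

# Extract only the schema (column names and types) from a local dataset.
# No rows ever leave the machine.
con = duckdb.connect()
schema = con.sql("DESCRIBE SELECT * FROM 'orders.parquet'").fetchall()
schema_text = "\n".join(f"{name}: {dtype}" for name, dtype, *_ in schema)

prompt = (
    "Given this table schema, suggest useful business metrics "
    "as SQL aggregate expressions:\n" + schema_text
)

def call_llm(prompt: str) -> str:
    """Placeholder: substitute a real LLM API client here."""
    raise NotImplementedError

# suggestions = call_llm(prompt)  # only the schema is sent, never the data
```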
English vs SQL: Finding the Right Balance
The debate between using natural language versus SQL for analytics reveals interesting nuances:
When English Works Better
- Simple data transformations
- Date truncations and timezone conversions
- Basic aggregations
- Initial exploration of datasets
When SQL Excels
- Complex window functions
- Precise time-based calculations
- Multi-step algorithms
- Production-grade queries requiring exact specifications
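For example, a seven-day trailing moving average is trivial to state precisely in SQL but hard to pin down unambiguously in English. A self-contained sketch using DuckDB with synthetic data:

```python
import duckdb

con = duckdb.connect()
# Synthetic daily order data, standing in for a real table.
con.sql("""
    CREATE TABLE orders AS
    SELECT DATE '2025-01-01' + i AS order_date,
           100 + i * 10 AS amount
    FROM range(30) t(i)
""")

# A 7-day trailing moving average: exact window-frame semantics
# that a plain-English request would leave ambiguous.
result = con.sql("""
    SELECT order_date,
           SUM(amount) AS daily_revenue,
           AVG(SUM(amount)) OVER (
               ORDER BY order_date
               ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
           ) AS revenue_7d_avg
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
""")
print(result)
```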
The Time Complexity Challenge
A compelling example involves specifying time ranges. Asking for "revenue over the last three days" in English seems straightforward but raises several questions:
- Should it include today's partial data?
- Should it truncate to complete hours?
- When comparing periods, should partial days be handled consistently?
These nuances demonstrate why code-based approaches maintain their value even as natural language interfaces improve.
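Code forces these choices into the open. Two reasonable but different readings of "the last three days", sketched against a hypothetical orders table:

```python
import duckdb

con = duckdb.connect()
con.sql("CREATE TABLE orders(order_ts TIMESTAMP, amount DOUBLE)")  # empty stand-in

# Reading 1: the last three *complete* days, excluding today's partial data.
complete_days = con.sql("""
    SELECT SUM(amount) AS revenue
    FROM orders
    WHERE order_ts >= CURRENT_DATE - INTERVAL 3 DAY
      AND order_ts <  CURRENT_DATE
""")

# Reading 2: a rolling 72-hour window ending now, including partial data.
rolling_72h = con.sql("""
    SELECT SUM(amount) AS revenue
    FROM orders
    WHERE order_ts >= NOW()::TIMESTAMP - INTERVAL 72 HOUR
""")
```

Both queries are valid answers to the English question; only the code makes explicit which one you actually get.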
Local Development with DuckDB
DuckDB has emerged as a powerful engine for local BI development. By embedding DuckDB directly into BI tools, developers can:
- Work with multi-million row datasets on their local machines
- Get instant feedback on metric changes
- Avoid cloud processing costs during development
- Maintain complete data privacy
The combination of DuckDB's performance and BI as code principles creates a development experience similar to modern web development—write code, see immediate results, iterate quickly.
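A minimal sketch of that loop, assuming a local events.parquet file with event_type and user_id columns:

```python
import duckdb

# DuckDB runs in-process: no server, no warehouse, no data leaving the
# laptop. Scanning a multi-million row Parquet file is typically fast
# enough for interactive iteration.
con = duckdb.connect()

def preview_metric(expression: str, dimension: str) -> None:
    """Re-run a metric definition and show results immediately."""
    con.sql(f"""
        SELECT {dimension}, {expression} AS value
        FROM 'events.parquet'
        GROUP BY {dimension}
        ORDER BY value DESC
        LIMIT 10
    """).show()

# Tweak the metric, re-run, see the result instantly.
preview_metric("COUNT(*)", "event_type")
preview_metric("COUNT(DISTINCT user_id)", "event_type")
```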
MCP Servers: Bridging AI and Data
Model Context Protocol (MCP) servers represent a new paradigm for AI-data interactions. Instead of developing separate integrations for each service, MCP provides a common interface that allows AI assistants to:
- Query databases directly
- Access real-time data
- Generate insights without manual data export
- Maintain context across multiple queries
Local MCP Implementation
Running MCP servers locally with tools like DuckDB offers unique advantages:
- Complete data privacy—no data leaves your machine
- Instant query execution
- Direct access to local datasets
- No cloud infrastructure required
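As a sketch of what this can look like, here is a minimal local MCP server exposing a read-only query tool over an embedded DuckDB database, using the FastMCP helper from the official MCP Python SDK (the analytics.duckdb file is an assumption):

```python
# pip install mcp duckdb
import duckdb
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("local-duckdb")

# Open an existing local DuckDB file read-only; nothing leaves the machine.
con = duckdb.connect("analytics.duckdb", read_only=True)

@mcp.tool()
def run_query(sql: str) -> str:
    """Run a read-only SQL query against the local database and return rows."""
    return str(con.sql(sql).fetchall())

if __name__ == "__main__":
    # Serve over stdio so a local AI assistant can connect directly.
    mcp.run()
```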
Practical Implementation: From Data to Dashboard
The modern BI workflow demonstrates impressive efficiency (a condensed sketch follows the steps):
1. Data Import: Connect to data sources (S3, local files, databases)
2. AI-Generated Metrics: LLMs analyze schemas and suggest relevant metrics
3. Code Generation: Create YAML configurations for metrics and dashboards
4. Instant Preview: See results immediately with local processing
5. Iterative Refinement: Use natural language to adjust and improve
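Condensing steps 3 through 5: take a metric definition like the one an LLM might suggest, render it to SQL, and preview it locally. A minimal sketch, assuming a local orders.parquet file (all names are illustrative):

```python
import duckdb
import yaml

# Step 3: a metric definition as code, here written as if suggested by an LLM.
metric = yaml.safe_load("""
name: total_revenue
expression: SUM(amount)
dimension: order_date
""")

# Step 4: instant local preview with DuckDB, no cloud round trip.
# Assumes orders.parquet has amount and order_date columns.
con = duckdb.connect()
con.sql(f"""
    SELECT {metric['dimension']}, {metric['expression']} AS {metric['name']}
    FROM 'orders.parquet'
    GROUP BY {metric['dimension']}
    ORDER BY {metric['dimension']}
""").show()

# Step 5: edit the YAML, re-run, iterate; every change is a reviewable Git diff.
```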
Real-World Example: GitHub Analytics
Analyzing GitHub commit data showcases the power of this approach:
- Identify top contributors across multiple usernames
- Analyze code changes by file type and category
- Generate insights about project focus areas
- Complete complex analyses in minutes rather than hours
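A sketch of the first of these, assuming commits have been exported to a local commits.parquet file with author, additions, and deletions columns (the username aliases are hypothetical):

```python
import duckdb

con = duckdb.connect()

# Fold multiple usernames into one canonical contributor: awkward
# in a UI, but a single CASE expression in SQL.
top_contributors = con.sql("""
    SELECT CASE author
             WHEN 'jdoe'     THEN 'Jane Doe'
             WHEN 'jane.doe' THEN 'Jane Doe'
             ELSE author
           END AS contributor,
           COUNT(*) AS commits,
           SUM(additions + deletions) AS lines_changed
    FROM 'commits.parquet'
    GROUP BY contributor
    ORDER BY commits DESC
    LIMIT 10
""")
print(top_contributors)
```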
The Future of BI: Notebooks, Dashboards, or Both?
The evolution of BI interfaces suggests multiple modalities will coexist:
Traditional Dashboards
- Best for monitoring key metrics
- Daily business operations
- Executive reporting
- Standardized views
Notebook-Style Interfaces
- Ideal for exploratory analysis
- Root cause investigations
- Iterative questioning
- Sharing analytical narratives
Challenges and Considerations
Despite the excitement around AI-powered BI, several challenges remain:
Technical Hurdles
- LLMs perform better with established coding patterns
- Innovation in interface design may be constrained by AI training data
- Documentation quality directly impacts AI effectiveness
Human Factors
- Metadata often exists in people's heads, not systems
- Business logic requires human understanding
- Quality control remains essential
The Augmentation Perspective
Rather than replacement, AI in BI represents augmentation. Data professionals who embrace these tools will find their capabilities enhanced, not diminished. The technology handles routine tasks while humans provide context, validation, and strategic thinking.
Best Practices for AI-Powered BI
To maximize success with AI-powered BI:
- Follow conventions: Use established patterns that LLMs recognize
- Document thoroughly: Quality documentation improves AI performance
- Start with schemas: Let AI infer from structure before accessing data
- Maintain human oversight: Verify generated code and results
- Iterate quickly: Leverage local processing for rapid development
Looking Ahead
The convergence of BI as code, powerful local databases like DuckDB, and sophisticated LLMs creates unprecedented opportunities for data analysis. As these technologies mature, we can expect:
- More sophisticated metric generation
- Better handling of complex business logic
- Seamless integration between exploration and production
- Continued importance of human expertise in context and validation
The future of business intelligence isn't about choosing between humans or AI, dashboards or notebooks, clicks or code. It's about combining these elements to create more powerful, flexible, and accessible analytics experiences for everyone.