
Power BI Composite Models + Aggregations: Enterprise Pattern for 100B+ Row Semantic Models
Power BI composite models and aggregations for 100B+ row enterprise semantic models. Architecture patterns, performance tuning, governance for Fortune 500 deployments.

A Fortune 500 financial-services enterprise we work with maintains a customer-transaction semantic model with 100 billion rows of transactional detail spanning 12 years of history. Pure Import mode is impractical (memory pressure, refresh time). Pure DirectQuery is too slow for executive dashboards. The architectural answer: composite model with Import-mode aggregations over DirectLake detail. Executives query the aggregations and get sub-second response; analysts drill through to the detail when needed.
This guide details the composite model + aggregations pattern for enterprise scale.
A composite Power BI semantic model combines multiple data sources and storage modes (Import, DirectQuery, DirectLake) within a single model. The storage modes work together: a query that touches tables across modes is resolved by the Power BI engine, which orchestrates the appropriate data access for each table.
An aggregation table is a pre-summarized version of a fact table at a defined grain (typically date × dimension keys). The engine automatically routes queries at or above that grain to the aggregation rather than the detail table.
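To make the grain concept concrete, here is a minimal Python sketch of rolling detail rows up to a date × region grain. The column names (`date`, `region`, `product`, `amount`) are hypothetical; in practice the aggregation is produced by an upstream ELT process or the engine itself, not application code.

```python
from collections import defaultdict

def build_aggregation(detail_rows, grain):
    """Roll detail rows up to the aggregation grain, summing the measure.
    detail_rows: iterable of dicts; grain: tuple of grouping column names."""
    agg = defaultdict(float)
    for row in detail_rows:
        key = tuple(row[c] for c in grain)
        agg[key] += row["amount"]
    return [dict(zip(grain, key), amount=total) for key, total in agg.items()]

# Four detail rows collapse to two rows at the date x region grain.
detail = [
    {"date": "2025-01-01", "region": "EMEA", "product": "A", "amount": 10.0},
    {"date": "2025-01-01", "region": "EMEA", "product": "B", "amount": 5.0},
    {"date": "2025-01-01", "region": "AMER", "product": "A", "amount": 7.0},
    {"date": "2025-01-01", "region": "AMER", "product": "B", "amount": 3.0},
]
summary = build_aggregation(detail, grain=("date", "region"))
```

The same reduction is what bounds memory at enterprise scale: 100B detail rows summarized to a date × dimension grain typically yield an aggregation orders of magnitude smaller.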
The standard enterprise pattern combines composite mode with aggregations: the aggregation table in Import mode, dimensions in Dual mode so they can join both the Import and non-Import sides, and the detail fact table in DirectLake or DirectQuery.
The result: executive dashboards hit the aggregation (sub-second); analyst drill-through hits the detail (also fast on DirectLake). Memory consumption is bounded by the aggregation plus dimensions, not the full detail.
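The engine's routing decision can be pictured as a set check: if every column a query groups or filters by exists in the aggregation, the aggregation answers it; otherwise the query falls through to detail. This is a simplified sketch, not the engine's actual algorithm (which also weighs summarization functions and aggregation precedence), and the column names are hypothetical.

```python
def route_query(requested_columns, agg_columns):
    """Return which table serves the query: the Import aggregation when the
    query's columns are all covered by the aggregation grain, else the
    DirectLake/DirectQuery detail table."""
    return "aggregation" if set(requested_columns) <= set(agg_columns) else "detail"

AGG_COLUMNS = {"date", "region", "product_category", "amount"}  # hypothetical grain

# Executive dashboard query: fully covered by the aggregation grain.
assert route_query({"date", "region", "amount"}, AGG_COLUMNS) == "aggregation"
# Drill-through to transaction_id: not in the grain, falls through to detail.
assert route_query({"date", "transaction_id"}, AGG_COLUMNS) == "detail"
```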
For a single 100B+ row fact table with executive dashboards, the pattern is Import-mode aggregations over DirectLake detail, as in the financial-services example above.
For multi-source analytics combining, e.g., on-premises SQL Server and Azure Data Lake, a composite model mixes Import or Dual dimensions with per-source DirectQuery and DirectLake fact tables.
For workloads with substantial current-period activity and infrequent historical access, a hot/cold hybrid keeps current-period data in memory (Import) while historical data is queried on demand (DirectQuery or DirectLake).
The aggregation grain should match the dominant query pattern: if executive dashboards slice daily totals by region and product, for example, the aggregation grain is date × region × product.
Aggregation tables in Import mode refresh on schedule. The refresh aggregates the detail data to the aggregation grain. Common refresh cadences range from nightly (for daily-grain aggregations) to hourly or more frequent for intraday reporting.
The Power BI Performance Analyzer shows whether queries are using the aggregation or falling through to detail. Periodic validation confirms aggregation use; queries falling through unexpectedly indicate the aggregation grain may need adjustment.
Composite models with aggregations have specific capacity-sizing characteristics:
EPC Group's sizing approach: estimate the in-memory working set for aggregations plus dimensions; estimate DirectLake/DirectQuery resource consumption from detail-query frequency; combine the two for the F-SKU capacity recommendation.
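As a back-of-the-envelope illustration of that sizing approach, the sketch below adds an Import estimate to a detail-query allowance. Every coefficient here (bytes per compressed aggregation row, GB touched per drill-through) is an assumed placeholder for illustration, not Microsoft or EPC Group guidance; real sizing is validated during a pilot.

```python
def estimate_capacity_gb(agg_rows, bytes_per_agg_row, dim_memory_gb,
                         detail_queries_per_hour, gb_per_detail_query):
    """Rough working-set estimate: Import portion (aggregation + dimensions)
    plus a frequency-driven allowance for detail queries. Illustrative only."""
    import_gb = agg_rows * bytes_per_agg_row / 1e9 + dim_memory_gb
    detail_gb = detail_queries_per_hour * gb_per_detail_query
    return import_gb + detail_gb

# e.g. a 50M-row aggregation at ~40 bytes/row compressed, 4 GB of dimensions,
# 20 detail drill-throughs/hour touching ~0.5 GB of column segments each.
total = estimate_capacity_gb(50_000_000, 40, 4.0, 20, 0.5)  # -> 16.0 (GB)
```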
A Power BI composite model combines multiple data sources and storage modes (Import + DirectQuery + DirectLake) within a single semantic model. The Power BI engine orchestrates query execution across the modes.
Aggregations are pre-summarized versions of fact tables at defined grains. The Power BI engine automatically routes appropriate queries to the aggregation, providing fast query performance for the aggregation grain while preserving detail-level access for drill-through.
For very large fact tables where Import mode is impractical, for multi-source analytics combining different data sources, or for hot/cold hybrid patterns where current-period data is in memory and historical data is queried on demand.
Aggregations and composite models complement each other. The standard enterprise pattern combines them: aggregation in Import mode for fast executive queries; detail in DirectLake or DirectQuery for drill-through.
With composite models and DirectLake, semantic models can address 100+ billion rows of detail. The aggregation pattern keeps memory consumption bounded.
DirectLake reads Delta files in OneLake directly with column-segment caching. DirectQuery sends each query to the source system. DirectLake performance is closer to Import; DirectQuery performance is bound by the source.
Estimate aggregation + dimension memory for the Import portion. Estimate DirectLake column-segment working-set for the detail portion. Estimate DirectQuery resource consumption based on detail-query frequency. Combine for capacity sizing. Validate during pilot.
Composite models with DirectQuery over a real-time-updated source can serve near-real-time queries. For sub-minute latency, Fabric Real-Time Intelligence with Eventhouse is typically a better architectural fit.
Power BI Performance Analyzer shows whether the aggregation was used. If queries fall through unexpectedly, common causes include: filter context doesn't match aggregation columns, measures reference columns not in the aggregation, or the aggregation precedence is misconfigured.
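The first two causes reduce to set checks between the query and the aggregation's columns. The following is an illustrative diagnostic sketch with hypothetical column names, not Performance Analyzer's actual logic.

```python
def diagnose_fallthrough(query_columns, measure_columns, agg_columns):
    """List reasons a query cannot be answered by the aggregation,
    mirroring the common fallthrough causes. Simplified illustration."""
    reasons = []
    missing_filters = query_columns - agg_columns
    if missing_filters:
        reasons.append(f"filter/group-by columns not in aggregation: {sorted(missing_filters)}")
    missing_measure = measure_columns - agg_columns
    if missing_measure:
        reasons.append(f"measure references columns not in aggregation: {sorted(missing_measure)}")
    return reasons

AGG = {"date", "region", "amount"}
# A report filtering on customer_id while summing fee misses the aggregation twice.
issues = diagnose_fallthrough({"date", "customer_id"}, {"fee"}, AGG)
```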
Composite models work in Power BI Pro, Power BI Premium Per User, Power BI Premium Per Capacity, and Microsoft Fabric F-SKU. Aggregations are available across the same tiers. Some advanced patterns (very large datasets, DirectLake) require Premium or F-SKU.
The migration is typically: identify the detail table appropriate for DirectLake/DirectQuery, restructure the model to keep dimensions and aggregations in Import, and switch the detail to the appropriate non-Import mode. Validate performance after each step.
Composite models can reference other Power BI semantic models (formerly called datasets). The pattern is sometimes called "thin reports over composite models." Multi-source composition extends naturally to multiple Power BI semantic models.
EPC Group works with Fortune 500 enterprises on composite model implementations for very large semantic models. The standard engagement is 12 weeks for a substantial composite model build with aggregations. Our consultants — including Microsoft Press bestselling author Errin O'Connor — bring direct enterprise composite model experience.
V-Order optimization applies to Delta files in OneLake that back DirectLake tables. Composite models using DirectLake benefit from V-Order on the underlying Delta tables.
Yes. Row-Level Security applies across composite models. RLS rules can reference columns in Import tables, DirectQuery tables, or DirectLake tables. The engine enforces RLS during query execution regardless of storage mode.
If your enterprise has semantic models exceeding pure-Import scale, the composite model + aggregations pattern described here is the standard architectural answer.
EPC Group has 29 years of enterprise Microsoft consulting experience and is a Microsoft Solutions Partner with the core designations. We were the oldest continuous Microsoft Gold Partner in North America from 2016 until the program's retirement. To discuss your semantic model architecture, contact EPC Group for a 30-minute discovery call.