
A Step-by-Step Guide to Improve Your Data Model Performance Using Best Practices in Power BI

Errin O'Connor
December 2025
8 min read

Data model performance is the foundation of every fast, responsive Power BI report. A poorly designed model can turn a simple dashboard into a frustrating experience with 30-second load times and timeout errors. At EPC Group, we have optimized Power BI data models for Fortune 500 organizations handling billions of rows, and the principles we apply are consistent: reduce model size, simplify relationships, optimize DAX, and let the VertiPaq engine do what it does best.

Step 1: Implement a Star Schema Design

The single most impactful change you can make to your Power BI data model is restructuring it into a star schema. The VertiPaq columnar engine is specifically optimized for star schemas, where narrow fact tables connect to descriptive dimension tables via single-column relationships.

  • Separate your data into fact tables (transactions, events, measurements) and dimension tables (products, dates, customers, geography)
  • Remove all table-to-table relationships that create many-to-many paths without bridge tables
  • Denormalize dimension tables so each dimension is a single flat table rather than a snowflake of normalized tables
  • Create a dedicated Date dimension table using CALENDARAUTO() or a custom date table with fiscal year support
  • Ensure every relationship is one-to-many from dimension to fact, with single-direction cross-filtering

Organizations that migrate from flat, wide tables to a proper star schema typically see 40-70% improvement in query response times and 30-50% reduction in model file size.
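The date-table bullet above can be sketched in DAX roughly as follows. The `Sales` table, its `OrderDate` column, and the July fiscal-year start are assumptions for illustration, not names from your model:

```dax
Date =
VAR MinDate = MIN ( Sales[OrderDate] )    // Sales[OrderDate] is a placeholder fact-table column
VAR MaxDate = MAX ( Sales[OrderDate] )
RETURN
    ADDCOLUMNS (
        CALENDAR ( MinDate, MaxDate ),
        "Year", YEAR ( [Date] ),
        "Month Number", MONTH ( [Date] ),
        "Fiscal Year",                    // assumes a fiscal year starting in July
            IF ( MONTH ( [Date] ) >= 7, YEAR ( [Date] ) + 1, YEAR ( [Date] ) )
    )
```

After creating the table, mark it as a date table (Table tools > Mark as date table) so time-intelligence functions resolve against it rather than against hidden auto date/time tables.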

Step 2: Reduce Column Cardinality and Remove Unnecessary Data

VertiPaq compresses data at the column level, and compression efficiency is directly tied to cardinality (the number of unique values in a column). Lower cardinality means better compression, smaller model size, and faster queries.

  • Remove columns you do not use in any visual, filter, or DAX measure. Every unused column consumes memory.
  • Split DateTime columns into separate Date and Time columns to dramatically reduce cardinality
  • Round decimal values to the precision you actually need (e.g., two decimal places instead of fifteen)
  • Replace high-cardinality text columns with integer keys and corresponding dimension lookups
  • Use Power Query to filter rows at import time: exclude historical data beyond your reporting window
  • Disable Auto Date/Time in Power BI Desktop options to prevent hidden date tables from inflating your model

Use the VertiPaq Analyzer (available via DAX Studio or the external tool Bravo) to identify which columns are consuming the most memory. Focus optimization efforts on the top 10 largest columns first.
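To see the cardinality impact of splitting a DateTime column, you can compare distinct counts directly with a DAX query in DAX Studio. Table and column names here are placeholders:

```dax
// Run as a query in DAX Studio; a DateTime column typically has orders of
// magnitude more unique values than its Date-only counterpart
EVALUATE
ROW (
    "OrderDateTime cardinality", DISTINCTCOUNT ( Sales[OrderDateTime] ),
    "OrderDate cardinality", DISTINCTCOUNT ( Sales[OrderDate] )
)
```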

Step 3: Optimize DAX Measures and Calculations

Inefficient DAX is the most common cause of slow report rendering. The difference between a well-written and a poorly written DAX measure can be the difference between a 1-second query and a 60-second one.

  • Replace calculated columns with measures wherever possible. Measures compute on demand; calculated columns consume memory permanently.
  • Avoid using FILTER(ALL(Table)) on large tables. Use CALCULATE with direct filter arguments instead.
  • Use SUMX, AVERAGEX, and other iterators only when row-by-row calculation is genuinely required
  • Replace nested IF statements with SWITCH(TRUE(), ...) for cleaner and sometimes faster evaluation
  • Use variables (VAR / RETURN) to store intermediate results and avoid recalculating the same expression multiple times
  • Prefer DISTINCTCOUNT over COUNTROWS(DISTINCT(...)) for better query plan generation
  • Avoid EARLIER and EARLIEST functions; use variables or CALCULATETABLE instead

Use DAX Studio and Performance Analyzer in Power BI Desktop to profile slow measures. Look at the Storage Engine (SE) and Formula Engine (FE) timing breakdown to determine whether the bottleneck is data retrieval or calculation.
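A minimal sketch of the filter and variable patterns above, using a hypothetical Sales table and an existing [Total Sales] measure:

```dax
// Slower pattern: FILTER ( ALL ( ... ) ) iterates the entire fact table
West Sales (slow) =
CALCULATE (
    SUM ( Sales[Amount] ),
    FILTER ( ALL ( Sales ), Sales[Region] = "West" )
)

// Faster pattern: a direct, single-column filter argument
West Sales =
CALCULATE ( SUM ( Sales[Amount] ), Sales[Region] = "West" )

// Variables compute intermediate results once instead of re-evaluating them
Sales YoY % =
VAR CurrSales  = [Total Sales]
VAR PriorSales = CALCULATE ( [Total Sales], DATEADD ( 'Date'[Date], -1, YEAR ) )
RETURN
    DIVIDE ( CurrSales - PriorSales, PriorSales )
```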

Step 4: Optimize Relationships and Cross-Filtering

Relationship configuration has a direct impact on query performance and model behavior. Misconfigured relationships can cause unexpected results and slow queries.

  • Use single-direction cross-filtering for all relationships unless bidirectional is absolutely required for a specific visual
  • Avoid bidirectional relationships on large tables as they expand the filter context and increase memory usage
  • Set inactive relationships where you need alternate date paths and activate them with USERELATIONSHIP in specific measures
Enable "Assume referential integrity" (available for DirectQuery sources) when your source guarantees it, allowing the engine to use INNER JOINs instead of OUTER JOINs
  • Eliminate circular dependency warnings by restructuring relationships or using disconnected tables
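The inactive-relationship bullet looks like this in practice, assuming an inactive relationship exists between a hypothetical Sales[ShipDate] column and the Date dimension:

```dax
// Activates the inactive ShipDate relationship for this measure only;
// the default OrderDate relationship stays active everywhere else
Sales by Ship Date =
CALCULATE (
    [Total Sales],
    USERELATIONSHIP ( Sales[ShipDate], 'Date'[Date] )
)
```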

Step 5: Leverage Query Folding and Incremental Refresh

Query folding pushes data transformation logic back to the source database, which is almost always faster than processing data in the Power Query engine. Incremental refresh ensures you only reload new and changed data, not the entire dataset.

  • Check query folding status by right-clicking steps in Power Query and looking for "View Native Query"
  • Keep folding-compatible transformations (filters, column selection, joins) before non-foldable steps
  • Avoid custom M functions, sorting, and type changes early in the query pipeline as they can break folding
  • Configure incremental refresh with a rolling window (e.g., 3 years of history, refresh last 30 days)
  • Combine incremental refresh with real-time data using DirectQuery partitions in Premium/Fabric capacities
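In Power Query, the incremental refresh window is driven by the reserved RangeStart and RangeEnd parameters, and the filter step should fold to the source for partitions to refresh efficiently. The table and column names below are placeholders:

```m
// Filter step added in Power Query; Power BI substitutes RangeStart/RangeEnd
// per partition at refresh time. Use >= on one bound and < on the other so
// boundary rows are not duplicated across partitions.
FilteredRows = Table.SelectRows(
    Source,
    each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
)
```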

Step 6: Test, Measure, and Monitor Continuously

Performance optimization is not a one-time exercise. As data volumes grow and new reports are added, models can degrade. Establish a continuous monitoring practice.

  • Use Performance Analyzer in Power BI Desktop to capture query timing for every visual on a page
  • Run DAX Studio queries against production models to benchmark specific measures
  • Monitor dataset refresh duration trends in the Power BI Admin portal
  • Set up Power Automate alerts when refresh durations exceed acceptable thresholds
  • Conduct quarterly model health reviews using the Best Practice Analyzer rules in Tabular Editor
  • Document your model design decisions and share them with the team via a model governance wiki
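A typical DAX Studio benchmark for the second bullet is a query shaped like what a matrix visual would send, run with Server Timings enabled so the SE/FE split is captured. Measure and column names are assumed:

```dax
// Compare Storage Engine vs Formula Engine time in the Server Timings pane
EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Year],
    Product[Category],
    "Total Sales", [Total Sales]
)
```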

Why EPC Group for Power BI Data Model Optimization

EPC Group has optimized data models for organizations processing billions of rows across healthcare, finance, manufacturing, and government sectors. Our Microsoft-certified Power BI architects use DAX Studio, Tabular Editor, ALM Toolkit, and VertiPaq Analyzer as part of every engagement.

  • Deep expertise with VertiPaq engine internals and storage engine optimization
  • Star schema migrations from legacy flat-table models with zero downtime
  • DAX performance audits with line-by-line measure optimization
  • Incremental refresh and hybrid DirectQuery architecture design
  • Ongoing model governance and performance monitoring services

Need Help Optimizing Your Power BI Data Model?

Contact EPC Group for a performance audit of your Power BI models. We will identify the bottlenecks and deliver a prioritized optimization roadmap.

Request a Performance Audit | Call (888) 381-9725

Frequently Asked Questions

How do I know if my Power BI data model has performance issues?

Open Performance Analyzer in Power BI Desktop (View tab) and refresh your report page. Any visual taking more than 3 seconds to render has a performance issue. Also check your dataset size: if it exceeds 500 MB for a Pro workspace or 10 GB for Premium, you likely have optimization opportunities. DAX Studio can show you exactly which columns and tables are consuming the most memory.

Should I use Import mode or DirectQuery for better performance?

Import mode is almost always faster for end-user query performance because the data is stored in the highly optimized VertiPaq in-memory engine. DirectQuery should only be used when you need real-time data freshness or when the dataset is too large to fit in memory. For most enterprise scenarios, EPC Group recommends Import mode with incremental refresh, or a composite model that uses Import for historical data and DirectQuery for real-time tables.

What is the biggest mistake organizations make with Power BI data models?

The most common mistake is importing entire database tables without filtering columns or rows. Organizations bring in 200-column tables when they only need 15 columns for reporting. The second most common mistake is using calculated columns instead of measures, which permanently inflates model size. A close third is not using a proper star schema, which forces the VertiPaq engine to work against its design.

How often should I review my data model for performance?

EPC Group recommends a quarterly model health check that includes VertiPaq Analyzer profiling, Best Practice Analyzer rule scanning in Tabular Editor, and a review of refresh duration trends. Additionally, any time you add new tables, relationships, or complex DAX measures, you should run a targeted performance test before deploying to production.

Can EPC Group optimize an existing Power BI model without rebuilding it from scratch?

Absolutely. Most of our optimization engagements involve improving existing models incrementally. We use DAX Studio to identify the highest-impact optimization opportunities, then apply changes in priority order: remove unused columns, fix relationships, optimize top DAX measures, and implement incremental refresh. A full rebuild is only recommended when the underlying schema is fundamentally incompatible with a star schema pattern.

Power BI Strategy: 2026 Considerations for Data Model Performance

Direct Lake mode has changed the economics of enterprise Power BI in 2026: instead of importing data into VertiPaq, semantic models now query OneLake-resident Parquet files at near-Import-mode performance without the refresh-window cost. For a Fortune 500 finance organization migrating from a 30-minute Import-mode refresh, the equivalent Direct Lake model typically queries fact data in under 800 ms while removing the entire refresh-orchestration job from Azure Data Factory.

Row-level security (RLS) and object-level security (OLS) in Power BI Premium and Fabric F-SKU capacities are the single most-overlooked compliance control in HIPAA, SOC 2, and FINRA-regulated environments. RLS scoped via service principal authentication (rather than embedded UPN passes) is the only pattern that survives a SOC 2 Type II auditor privilege-walk test. EPC Group includes service-principal RLS as a default in every regulated-industry Power BI engagement.
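For reference, an RLS role filter is a DAX boolean expression on the secured table; a common pattern maps the signed-in identity to permitted rows through a security table. The table and column names here are illustrative only:

```dax
// Role filter on a Region dimension: the user sees only regions
// mapped to their identity in a UserSecurity mapping table
Region[RegionKey]
    IN CALCULATETABLE (
        VALUES ( UserSecurity[RegionKey] ),
        UserSecurity[UserEmail] = USERPRINCIPALNAME()
    )
```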

Decision factors EPC Group evaluates

  • Copilot grounding quality assessment of semantic-model metadata
  • Direct Lake mode adoption for Fabric-resident semantic models
  • License optimization audit (Pro vs Premium Per User vs F-SKU)
  • Row-level security via service principal authentication
  • Capacity sizing decision (F2/F4/F64+) tied to peak concurrent users and refresh window

See related EPC Group services at /services or schedule a discovery call at /contact.

Enterprise Power BI Data Model Performance Consulting from EPC Group

EPC Group delivers Power BI data model performance optimization as a core practice within its Microsoft consulting portfolio. Engagements are led by senior architects with hands-on Fortune 500 delivery experience and a bench of hundreds of Microsoft-certified consultants spanning SharePoint, Microsoft 365, Power BI, Azure, Microsoft Copilot, and Microsoft Purview.

Every data model performance engagement is engineered for the regulatory and operational environment it serves. Healthcare deployments carry HIPAA controls from day one; financial services deployments meet SOC 2 and FINRA retention requirements; government deployments map to FedRAMP and CMMC controls with audit-ready evidence.

Manufacturing and energy

For multi-plant manufacturers and energy operators, EPC Group integrates Microsoft 365 with operational technology, protects intellectual property through Purview labels and Endpoint DLP, and provisions frontline workers with F1 and F3 licensing patterns. Multi-region rollouts include data residency planning and offline-capable Power Platform apps for shop-floor environments.

How EPC Group engages

Six-phase methodology applied to every engagement, compressed for fixed-fee accelerators and extended for full programs.

  1. Discovery — two-week assessment of the current estate, gap analysis, risk register, target architecture, costed remediation roadmap.
  2. Design — senior architect produces the target topology, identity framework, Conditional Access, Purview, governance model, and security posture, reviewed by client leads.
  3. Pilot — 25 to 100 user pilot in a real business unit. Migrate, apply baselines, test integrations, capture feedback.
  4. Wave rollout — migrate in waves of 500 to 2,500 users with communications, training, hypercare, and a per-wave retrospective.
  5. Adoption — role-based training, Champions network, executive sponsor enablement, metrics tracked against a measured baseline.
  6. Operate — optional managed-services retainer for license optimization, governance reviews, security monitoring, and quarterly business reviews.

Microsoft-only since 1997

29 years of Microsoft-exclusive consulting. Microsoft Solutions Partner with core designations across Modern Work, Security, and Data & AI.

EPC Group was the oldest continuous Microsoft Gold Partner in North America from 2016 until program retirement in 2022. Errin O'Connor authored four Microsoft Press bestsellers covering Power BI, SharePoint, Azure, and large-scale migrations.

Financial services

For banks, asset managers, and broker-dealers, EPC Group engineers SOC 2 audit trails, FINRA Rule 4511 and SEC 17a-4 retention, MNPI containment, and Communication Compliance for trading floors. Microsoft Purview Audit Premium with seven-year tamper-evident retention is the standard baseline; Defender for Cloud Apps detects shadow-AI exfiltration before it reaches a compliance event.

Engagement models

Three engagement models cover most enterprise needs. Most clients start with a fixed-fee accelerator and grow into a full program or a managed-services retainer.

  • Fixed-fee accelerators — Copilot Readiness, Security Hardening, Tenant Health Check, SharePoint Migration, Teams Governance. Defined scope and price. Typical range $25,000 to $150,000 over four to twelve weeks.
  • Project engagements — full migration or governance program with milestone-based billing. Discovery through hypercare. Typical range $150,000 to $750,000-plus over three to nine months.
  • Managed services — tiered retainer for ongoing operations. Named senior architect on the account. From $3,500 per month with a twelve-month minimum.

Senior-architect-led delivery

Every engagement is led and staffed by consultants with 15 to 20 years of experience. No rotating juniors learning on your tenant. The bench includes hundreds of Microsoft-certified consultants who have shipped real production environments for Fortune 500 customers across SharePoint, Microsoft 365, Power BI, Azure, and Microsoft Copilot.

Talk to a senior architect

30-minute discovery call. No pitch deck. Call (888) 381-9725 or schedule a discovery call and a senior architect responds within one business day.