EPC Group - Enterprise Microsoft AI, SharePoint, Power BI, and Azure Consulting


Getting Started with Microsoft Fabric: Enterprise Guide

By Errin O'Connor, Chief AI Architect at EPC Group  |  Published April 2026  |  Updated April 15, 2026

Your first 30 days with Microsoft Fabric will determine whether your organization sees it as transformative or just another platform to manage. Here is how to get it right.

Before Day 1: Capacity Sizing for Trial

Microsoft offers a 60-day Fabric trial capacity that any Power BI admin can activate. The trial provides an F64-equivalent capacity — enough for serious evaluation but not enough for production load testing. Here is how to approach capacity sizing for your first real deployment:

F-SKU | Monthly Cost | Best For
------|--------------|---------
F2    | ~$262        | Individual developer sandbox
F4    | ~$524        | Small team development/testing
F8    | ~$1,048      | Proof-of-concept with real data
F64   | ~$8,400      | Production: mid-size enterprise
F128  | ~$16,800     | Production: large enterprise
F256  | ~$33,600     | Production: heavy concurrent workloads

EPC Group recommendation: Start with the free trial (F64-equivalent) for evaluation. When moving to production, begin with F64 and monitor with the Fabric Capacity Metrics App for 30 days. Scale up or down based on actual utilization data. F-SKUs support pause/resume — unlike old P-SKUs — so you can pause non-production capacities overnight and on weekends to cut costs by 60%.
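The pause/resume arithmetic above is easy to sanity-check with a short sketch. The weekly schedule below is an illustrative assumption (a 12-hour weekday window), not a billing rule, and it lands close to the ~60% figure cited:

```python
# Rough estimate of savings from pausing a non-production Fabric capacity
# outside working hours. Assumes the capacity bills only while running;
# the active-hours schedule is illustrative, not official pricing guidance.

HOURS_PER_WEEK = 7 * 24  # 168


def pause_savings_pct(active_hours: float, total_hours: float = HOURS_PER_WEEK) -> float:
    """Percentage of a 24/7 bill avoided by pausing outside active hours."""
    return round(100 * (1 - active_hours / total_hours), 1)


# Running 12 hours per weekday only (60 active hours/week):
print(pause_savings_pct(5 * 12))  # 64.3 (percent of the 24/7 bill avoided)
```

A stricter 10-hour weekday window (`pause_savings_pct(50)`) saves roughly 70%, which is why auto-pause on development capacities is usually the first cost lever to pull.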

Week 1: Choose Your First Project

The first Fabric project should be high-visibility, low-risk, and deliver value within 2 weeks. Here are the three project types EPC Group recommends:

Option A: Direct Lake Conversion (Best for Power BI-heavy orgs)

Take your largest Import-mode Power BI dataset — the one that takes 45 minutes to refresh every morning and occasionally fails. Move its source data into a Fabric Lakehouse using a Dataflow Gen2 or Pipeline. Rebuild the semantic model to use Direct Lake connectivity.

Result: Near-instant report loading with no scheduled refresh. The report looks identical to users, but it is always current and never fails due to refresh timeouts. This is the single most compelling Fabric demo for business stakeholders.

Option B: Dataflow Gen2 Replacing Manual ETL (Best for data teams)

Identify a process where someone downloads data from a source system, transforms it in Excel, and uploads it to a SharePoint list or Power BI dataset. Replace that manual process with a Dataflow Gen2 that connects to the source, transforms the data visually, and lands it in a Lakehouse table.

Result: Hours of manual work eliminated. Data freshness improves from daily/weekly to every 15 minutes. The data team sees Fabric as a productivity multiplier, not just another platform.

Option C: Real-Time Dashboard (Best for IT/Operations)

Connect Azure Event Hubs, IoT Hub, or a custom Kafka stream to a Fabric KQL Database using Eventstream. Build a Real-Time Dashboard showing live metrics — server health, application errors, transaction volumes, or IoT sensor readings.

Result: A live operational dashboard built in hours, not weeks. IT leadership sees immediate value. This is particularly compelling for organizations that have been trying to build real-time dashboards with Power BI streaming datasets and hitting limitations.

Week 2: Lakehouse vs Warehouse — Your First Storage Decision

Both Lakehouse and Warehouse store data in OneLake. Both support SQL queries. The choice comes down to your team's skills and your data's structure:

Factor                  | Lakehouse                      | Warehouse
------------------------|--------------------------------|------------------------------
Query languages         | Spark (Python, Scala, R) + SQL | T-SQL only
Schema approach         | Schema-on-read (flexible)      | Schema-on-write (structured)
Data types              | Structured + semi-structured   | Structured only
Best for teams with     | Python/data engineering skills | SQL Server/T-SQL skills
ML/data science support | Native (Spark notebooks)       | Limited (SQL only)
Direct Lake support     | Yes                            | Yes

EPC Group recommendation: Start with Lakehouse unless your entire team is SQL-only. The Lakehouse includes a SQL analytics endpoint that lets T-SQL users query it while data engineers use Spark — you get both interfaces. You can always add a Warehouse later for specific structured workloads.
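The decision table above collapses into a simple rule of thumb. The function below is only a sketch of that guidance, with deliberately simplified inputs, not an official decision tool:

```python
def recommend_store(sql_only_team: bool, semi_structured_data: bool, needs_ml: bool) -> str:
    """Encode the Lakehouse-vs-Warehouse decision table: default to Lakehouse
    unless the team is SQL-only AND the data is fully structured AND there is
    no ML/data-science requirement. (Simplified sketch of the guidance above.)"""
    if sql_only_team and not semi_structured_data and not needs_ml:
        return "Warehouse"
    return "Lakehouse"


print(recommend_store(sql_only_team=True, semi_structured_data=False, needs_ml=False))  # Warehouse
print(recommend_store(sql_only_team=True, semi_structured_data=True, needs_ml=False))   # Lakehouse
```

Because the Lakehouse exposes a SQL analytics endpoint anyway, the cost of defaulting to "Lakehouse" when in doubt is low.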

Week 3: Connecting Existing Power BI

Your existing Power BI reports do not need to be rebuilt. Moving a Power BI workspace to a Fabric capacity is a configuration change, not a migration. Here is the process:

  1. Assign workspace to Fabric capacity: In the Power BI admin portal, assign your workspace to the Fabric F-SKU capacity. All reports, datasets, and dataflows continue to work.
  2. Identify Direct Lake candidates: Look for Import-mode datasets larger than 1GB with scheduled refreshes. These benefit most from Direct Lake conversion.
  3. Create a Lakehouse for source data: Use a Pipeline or Dataflow Gen2 to land the source data that currently feeds your Power BI Import datasets into Lakehouse Delta tables.
  4. Rebuild semantic models for Direct Lake: Create new semantic models that point to the Lakehouse tables using Direct Lake mode. Reconnect existing reports to the new semantic models.
  5. Deprecate old datasets: Once reports are running on Direct Lake, decommission the old Import datasets and their refresh schedules.

EPC Group typically converts 10-20 Power BI datasets to Direct Lake in the first month of a Fabric engagement. The performance improvement is immediately visible to report consumers.
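Step 2 above (finding Direct Lake candidates) is easy to script once you have a dataset inventory, for example one exported from the Power BI admin APIs. The field names in this sketch are hypothetical, chosen for illustration:

```python
def direct_lake_candidates(datasets: list[dict], min_size_gb: float = 1.0) -> list[str]:
    """Filter a dataset inventory for Direct Lake conversion candidates:
    Import mode, over the size threshold, with a scheduled refresh.
    Field names ("storage_mode", "size_gb", ...) are illustrative."""
    return [
        d["name"]
        for d in datasets
        if d["storage_mode"] == "Import"
        and d["size_gb"] >= min_size_gb
        and d["scheduled_refresh"]
    ]


inventory = [
    {"name": "Sales",   "storage_mode": "Import",      "size_gb": 4.2, "scheduled_refresh": True},
    {"name": "HR",      "storage_mode": "Import",      "size_gb": 0.3, "scheduled_refresh": True},
    {"name": "Ops",     "storage_mode": "DirectQuery", "size_gb": 9.0, "scheduled_refresh": False},
]
print(direct_lake_candidates(inventory))  # ['Sales']
```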

Week 4: When to Keep Synapse

Not everything should move to Fabric on day one. Keep Azure Synapse if any of these conditions apply:

  • Dedicated SQL Pool with complex T-SQL: Synapse Dedicated SQL Pools support stored procedures, materialized views, and result set caching that Fabric Warehouse is still maturing on. If your data warehouse relies on 500+ stored procedures, the migration effort is significant.
  • Custom Spark configurations: Synapse Spark pools allow custom library installations, cluster sizing, and autoscale configurations that Fabric Spark does not fully support yet. If your data science team needs specific Spark configurations, keep Synapse.
  • Synapse Link integrations: Synapse Link for Cosmos DB and Synapse Link for Dataverse provide near-real-time replication. Fabric supports Mirroring as an alternative, but it is newer and may not cover all your Synapse Link scenarios.
  • Compliance requirements locked to specific Azure regions: Fabric capacities are available in most Azure regions, but if your compliance framework requires data residency in a region where Fabric is not yet available, Synapse is the safer choice.

The hybrid approach works: keep Synapse for complex workloads, use Fabric for new workloads and Power BI. OneLake shortcuts can connect to Azure Data Lake Storage that Synapse writes to, letting Fabric read Synapse output without data duplication.
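The shortcut side of that hybrid pattern can be sketched as a request body for the Fabric shortcuts REST API (POST against the target Lakehouse item). Verify the endpoint and field names against current Microsoft documentation before use; every ID, account name, and path here is a placeholder:

```python
import json

# Illustrative body for creating a OneLake shortcut to ADLS Gen2 storage that
# Synapse writes to. Field names follow the Fabric shortcut API as understood
# at the time of writing; confirm against current docs. All values are placeholders.
shortcut_body = {
    "path": "Tables",                 # where the shortcut appears in the Lakehouse
    "name": "synapse_output",         # hypothetical shortcut name
    "target": {
        "adlsGen2": {
            "connectionId": "00000000-0000-0000-0000-000000000000",    # placeholder connection
            "location": "https://mystorageacct.dfs.core.windows.net",  # hypothetical account
            "subpath": "/curated/sales",                               # hypothetical container path
        }
    },
}
print(json.dumps(shortcut_body, indent=2))
```

The point of the shortcut is that Fabric reads the Synapse output in place: no copy job, no second storage bill.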

Common Mistakes in the First 30 Days

  • Starting with F2/F4 and judging performance: F2 has 2 Capacity Units. F64 has 64. Performance on F2 is not representative of production experience. Always evaluate on at least F64 or the free trial capacity.
  • Migrating everything at once: Fabric is not a big-bang migration. Start with one workload (usually Power BI with Direct Lake), prove value, and expand. Trying to move ADF + Synapse + Power BI + ML simultaneously creates risk with no quick wins.
  • Ignoring OneLake governance: OneLake is multi-tenant by default within your Fabric tenant. Without workspace-level permissions and sensitivity labels, data engineers in one workspace can see data in another. Set up governance policies from day one.
  • Not monitoring capacity utilization: Install the Fabric Capacity Metrics App immediately. If your capacity is consistently above 80% utilization, you need to scale up before users experience throttling.
  • Skipping the Center of Excellence: Fabric touches data engineering, data science, BI, and IT infrastructure. Without a cross-functional governance group (Center of Excellence), workspace sprawl and ungoverned data proliferation will happen within weeks.
  • Forgetting about cost management: Fabric capacities bill 24/7 unless paused. Set up auto-pause for development capacities. Use workload management settings to prevent a single runaway Spark job from consuming the entire capacity.
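The 80% utilization rule above translates directly into an alert check. A minimal sketch, assuming you export recent CU utilization percentages from the Capacity Metrics App; both thresholds are illustrative:

```python
def capacity_action(utilization_samples: list[float], high: float = 80.0, low: float = 30.0) -> str:
    """Recommend a scaling action from recent CU utilization percentages.
    Thresholds are illustrative: sustained load above `high` risks throttling,
    sustained load below `low` means paying for idle Capacity Units."""
    avg = sum(utilization_samples) / len(utilization_samples)
    if avg >= high:
        return "scale up"
    if avg <= low:
        return "scale down"
    return "hold"


print(capacity_action([85, 92, 88]))  # scale up
print(capacity_action([50, 60]))      # hold
```

In practice you would run this against a rolling window (say, 7 days of samples) rather than a handful of points, so one busy afternoon does not trigger a resize.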

The 30-Day Implementation Timeline

Days 1-5: Foundation

Activate trial or provision F64. Assign Power BI workspaces. Install Capacity Metrics App. Establish workspace naming conventions. Set OneLake permissions.
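The naming-convention step can be enforced mechanically from day one. The pattern below encodes one hypothetical convention (department-environment-purpose); it is an example, not a Fabric requirement:

```python
import re

# Hypothetical convention: <DEPT>-<ENV>-<Purpose>, e.g. "FIN-PROD-Sales".
WORKSPACE_PATTERN = re.compile(r"^[A-Z]{2,5}-(DEV|TEST|PROD)-[A-Za-z0-9]+$")


def is_valid_workspace_name(name: str) -> bool:
    """Check a workspace name against the (illustrative) naming convention."""
    return bool(WORKSPACE_PATTERN.match(name))


print(is_valid_workspace_name("FIN-PROD-Sales"))  # True
print(is_valid_workspace_name("my workspace 1"))  # False
```

Running a check like this in a scheduled script against the workspace list catches sprawl early, before the Center of Excellence has to clean it up.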

Days 6-10: First Lakehouse

Create your first Lakehouse. Ingest data from one source system via Pipeline or Dataflow Gen2. Validate data quality. Build a SQL analytics endpoint view.

Days 11-15: Direct Lake

Build a Direct Lake semantic model on the Lakehouse. Connect existing Power BI reports. Benchmark performance against Import mode. Share with stakeholders.

Days 16-20: Second Workload

Add a second workload: Dataflow Gen2 for a manual ETL process, or a Notebook for data science exploration. Begin training the team on the new workload.

Days 21-25: Governance

Deploy sensitivity labels on Lakehouse tables. Set up Purview integration. Define workspace access policies. Create a Fabric governance runbook.

Days 26-30: Decision

Review Capacity Metrics. Calculate TCO comparison vs current stack. Present findings to leadership. Decide: expand to production, adjust capacity, or stay on current stack.

Frequently Asked Questions

What is the minimum Fabric capacity for enterprise workloads?

F64 (~$8,400/month) is the minimum production capacity for enterprise workloads. F2 ($262/month) and F4 ($524/month) are suitable for development, testing, and proof-of-concept only — they will throttle under production load. F64 provides enough Capacity Units (CUs) to run concurrent Power BI reports, data pipelines, and notebook workloads for a mid-size team. For large enterprises with 500+ active analytics users, F128 or F256 is typical. Use the Fabric Capacity Metrics App to monitor utilization and right-size after 30 days of production usage.

Should I start with a Lakehouse or Warehouse in Fabric?

Start with a Lakehouse if your team includes data engineers comfortable with Python/PySpark and you need to process semi-structured data (JSON, CSV, Parquet). Start with a Warehouse if your team is SQL-first and your data is already structured in relational databases. Both store data in OneLake and both support SQL queries. The Lakehouse also supports Spark notebooks, making it more flexible for data science workloads. If unsure, start with Lakehouse — it has a SQL analytics endpoint that lets SQL users query it while data engineers use Spark. You can always add a Warehouse later.

Can I connect my existing Power BI reports to Fabric without rebuilding them?

Yes. When you move a Power BI workspace to a Fabric capacity (F-SKU), all existing reports, datasets, and dataflows continue to work unchanged. There is no rebuild required. If you want to take advantage of Direct Lake mode, you will need to create a Lakehouse or Warehouse in Fabric, land your data there, and then rebuild the semantic model (dataset) to use Direct Lake connectivity instead of Import or DirectQuery. The reports themselves do not change — only the underlying dataset connection changes.

When should I keep Azure Synapse instead of moving to Fabric?

Keep Synapse if: 1) You have a mature Synapse Dedicated SQL Pool with complex T-SQL stored procedures, views, and security policies that would require significant refactoring. 2) Your team relies on Synapse Spark pools with custom cluster configurations not yet supported in Fabric. 3) You have Synapse Link for Cosmos DB or Dataverse integrations in production. 4) Your organization is not ready to adopt capacity-based billing (Fabric) vs. resource-based billing (Synapse). Microsoft has committed to long-term Synapse support. Migration to Fabric is optional, not mandatory.

What are the best quick wins to demonstrate Fabric value in the first 30 days?

Three quick wins EPC Group recommends: 1) Direct Lake on your largest Power BI dataset — if you have a 5GB+ Import dataset that takes 30+ minutes to refresh, move the source data to a Lakehouse and switch to Direct Lake. Report performance stays fast and you eliminate the refresh schedule entirely. 2) Dataflow Gen2 replacing a manual ETL process — identify a team that exports data to Excel, transforms it, and uploads to Power BI. Replace that with a Dataflow Gen2 that lands data directly in OneLake. 3) Real-Time Dashboard for IT operations — connect an Event Hub or Log Analytics workspace to a KQL Database and build a real-time dashboard in 2 hours. These three wins demonstrate Fabric's value to business users, data teams, and IT leadership simultaneously.

Get a Fabric Quick Start Engagement

EPC Group's 4-week Fabric Quick Start gets your enterprise from zero to production-ready: capacity provisioning, first Lakehouse, Direct Lake conversion, governance setup, and team training. Fixed scope, fixed price.

Call (888) 381-9725 or schedule a consultation below.

Schedule a Fabric Quick Start
