EPC Group - Enterprise Microsoft AI, SharePoint, Power BI, and Azure Consulting

A Brief Guide To Business Central Data Integration In Azure Storage

Errin O'Connor
December 2025
8 min read

Dynamics 365 Business Central is Microsoft's cloud ERP platform for small-to-midsize enterprises, managing financials, supply chain, manufacturing, and project operations. While Business Central provides built-in reporting, enterprises increasingly need to integrate Business Central data with Azure Storage services (Blob Storage, Data Lake Storage Gen2, and Azure SQL Database) to enable advanced analytics, cross-system reporting, long-term archival, and AI/ML workloads that exceed the capabilities of Business Central's native tools. EPC Group architects data integration pipelines that connect Business Central to Azure's analytics ecosystem, unlocking deeper insights from your ERP data.

Why Integrate Business Central with Azure Storage?

Business Central stores operational data optimized for transactional processing, not analytics. Moving data to Azure Storage enables scenarios that are impractical within Business Central alone:

  • Advanced Analytics: Combine Business Central financial data with CRM data (Dynamics 365 Sales), operational data (IoT telemetry), and market data in Azure Synapse or Databricks for comprehensive business intelligence that Business Central's built-in reports cannot provide.
  • Power BI at Scale: DirectQuery against Business Central has performance limitations. Extracting data to Azure SQL or Data Lake and building Power BI datasets on top provides faster, more scalable reporting with support for large historical datasets and complex DAX calculations.
  • Data Archival and Compliance: Regulatory requirements (SOX, GDPR, industry-specific retention rules) may require long-term storage of transactional data beyond Business Central's practical retention window. Azure Blob Storage with lifecycle policies provides cost-effective, compliant archival.
  • AI and Machine Learning: Train predictive models (demand forecasting, customer churn, cash flow prediction) using historical Business Central data stored in Azure Data Lake. Azure Machine Learning can access the data directly for model training and batch scoring.
  • Cross-System Integration: Azure Data Factory pipelines can combine Business Central data with data from other systems (Salesforce, SAP, custom applications) in a centralized data lake for unified reporting.

Integration Methods

There are several approaches to moving Business Central data into Azure Storage, each with different trade-offs in complexity, latency, and cost:

  • Business Central API + Azure Data Factory: Use ADF's REST connector to call Business Central OData or API v2.0 endpoints and load data into Azure Blob Storage, Data Lake, or SQL Database. This is the most common approach. ADF handles scheduling, pagination, error handling, and incremental loading. Best for daily or hourly batch integration.
  • Business Central Change Log + Event-Driven: Configure Business Central's change log to track modifications to key tables (customers, items, sales orders, GL entries). A Logic Apps or Azure Function workflow polls the change log and writes incremental changes to Azure Storage. Provides near-real-time integration for critical data entities.
  • Dataverse Integration: Business Central data can sync to Microsoft Dataverse (via virtual tables or direct integration), and Dataverse data can then be exported to Azure Data Lake using the Azure Synapse Link for Dataverse. This is the preferred path when you already use Dataverse for Dynamics 365 cross-app integration.
  • Custom AL Extensions: Build custom Business Central extensions (in AL language) that directly write data to Azure Blob Storage or Azure Service Bus using the Azure REST APIs. Provides the most control and lowest latency but requires AL development expertise and Business Central development lifecycle management.
  • Third-Party Connectors: Tools like Jet Reports, Zetadocs, and CData provide pre-built connectors that simplify Business Central to Azure data movement without custom development. Suitable for organizations that prefer configuration over code.
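
As a minimal sketch of the direct-API approach, the Python below pulls one entity through the Business Central API v2.0 with OData paging. The tenant GUID, environment name, company GUID, and access token are placeholders you would supply; in practice ADF's REST connector handles this loop for you.

```python
import json
import urllib.request

BASE = "https://api.businesscentral.dynamics.com/v2.0"

def entity_url(tenant_id: str, environment: str, company_id: str, entity: str) -> str:
    """Business Central API v2.0 endpoint for one entity in one company."""
    return f"{BASE}/{tenant_id}/{environment}/api/v2.0/companies({company_id})/{entity}"

def fetch_all(url: str, token: str) -> list:
    """Follow @odata.nextLink until every page has been consumed."""
    rows = []
    while url:
        req = urllib.request.Request(
            url,
            headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=60) as resp:
            body = json.load(resp)
        rows.extend(body.get("value", []))
        url = body.get("@odata.nextLink")  # absent on the final page
    return rows

# Usage (hypothetical IDs):
# orders = fetch_all(
#     entity_url("TENANT-GUID", "production", "COMPANY-GUID", "salesOrders"),
#     access_token,
# )
```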

Architecture: Business Central to Azure Data Lake

The most scalable architecture uses Azure Data Lake Storage Gen2 as the central landing zone for Business Central data, with downstream consumption by analytics and AI services:

  • Extraction Layer: Azure Data Factory pipelines call Business Central API v2.0 endpoints on a schedule (e.g., every 4 hours for financials, hourly for inventory). Incremental extraction uses delta tokens or timestamp-based filtering to pull only changed records.
  • Landing Zone (Raw): Raw JSON or Parquet files land in a "raw" container in Azure Data Lake Storage Gen2. Files are organized by entity, date, and extraction batch for traceability.
  • Transformation Layer: ADF Data Flows or Databricks notebooks cleanse, deduplicate, and transform raw data into analytics-ready format. Apply business rules, currency conversions, and dimensional modeling (star schema) for reporting.
  • Curated Layer: Clean, transformed data is written to a "curated" container in Parquet format or loaded into Azure Synapse dedicated SQL pool for high-performance query access.
  • Consumption Layer: Power BI connects to the curated layer for enterprise dashboards. Azure Machine Learning accesses historical data for model training. Azure Data Explorer can be used for ad-hoc exploration of operational data.
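
The extraction and landing-zone conventions above can be sketched in a few lines. `lastModifiedDateTime` is the standard modification timestamp exposed on most API v2.0 entities; the exact folder naming and batch-ID scheme here are illustrative assumptions, not a fixed standard.

```python
from datetime import datetime, timezone

def raw_path(entity: str, batch_id: str, extracted_at: datetime) -> str:
    """Raw-zone layout: files organized by entity, date, and extraction batch."""
    d = extracted_at.astimezone(timezone.utc)
    return f"raw/{entity}/{d:%Y/%m/%d}/{entity}_{batch_id}.parquet"

def incremental_filter(watermark: datetime) -> str:
    """OData $filter pulling only rows changed since the last successful run."""
    w = watermark.astimezone(timezone.utc)
    return f"lastModifiedDateTime gt {w:%Y-%m-%dT%H:%M:%SZ}"
```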

Security and Authentication

Securing the integration between Business Central and Azure Storage requires attention at multiple layers:

  • Business Central Authentication: Use Microsoft Entra ID (Azure AD) service principal authentication for API access. Register an application in Entra ID, grant it API permissions for Business Central, and configure ADF linked services with the client ID and secret (stored in Azure Key Vault).
  • Azure Storage Security: Enable Entra ID RBAC on Storage accounts. Use managed identities for ADF to access storage without storing credentials. Enable storage firewall rules and private endpoints to restrict access to authorized networks.
  • Data Encryption: Data is encrypted in transit (TLS 1.2+) and at rest (Azure Storage Service Encryption with Microsoft-managed or customer-managed keys). For sensitive financial data, consider additional encryption layers with Azure Key Vault.
  • Access Control: Implement least-privilege access using Azure RBAC and storage ACLs. Data engineers get write access to raw and transformation layers; business analysts get read-only access to the curated layer.
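
The service-principal authentication described above boils down to a client-credentials token request against the Entra ID v2.0 token endpoint, using the `.default` scope for the Business Central resource. A minimal sketch (in production the client secret would be read from Azure Key Vault, not passed as a literal):

```python
import json
import urllib.parse
import urllib.request

BC_SCOPE = "https://api.businesscentral.dynamics.com/.default"

def token_request_body(client_id: str, client_secret: str) -> str:
    """Client-credentials grant body for the Entra ID v2.0 token endpoint."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,  # fetch from Azure Key Vault in production
        "scope": BC_SCOPE,
    })

def acquire_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Exchange app credentials for a bearer token usable against the BC API."""
    req = urllib.request.Request(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data=token_request_body(client_id, client_secret).encode(),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["access_token"]
```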

Why EPC Group for Business Central Integration

Integrating Business Central with Azure requires expertise across the Dynamics 365 ecosystem, Azure data services, and enterprise data architecture. EPC Group provides:

  • Data Strategy: We assess your reporting, analytics, and AI requirements to determine which Business Central data entities need to be integrated, at what frequency, and to which Azure storage targets.
  • Pipeline Development: Our team builds production ADF pipelines with incremental extraction, error handling, retry logic, and monitoring for reliable, automated data integration.
  • Data Modeling: We design dimensional models (star/snowflake schemas) optimized for Power BI reporting on Business Central financial, sales, inventory, and manufacturing data.
  • Power BI Dashboards: We build enterprise Power BI dashboards and reports on top of the Azure-hosted Business Central data, providing faster performance and richer analytics than native Business Central reporting.
  • Compliance: For organizations in regulated industries, we ensure data integration pipelines comply with SOX, GDPR, and industry-specific data handling requirements including audit logging and data retention policies.

Unlock Your Business Central Data

Contact EPC Group to design a data integration strategy that connects Dynamics 365 Business Central to Azure's analytics ecosystem. From pipeline development to Power BI dashboards, we help you extract maximum value from your ERP data.

Schedule a Consultation
Call (888) 381-9725

Frequently Asked Questions

Can I do real-time integration between Business Central and Azure?

Near-real-time integration is achievable using Business Central webhooks or change notifications combined with Azure Functions or Logic Apps. When a record changes in Business Central (e.g., a sales order is posted), a webhook triggers an Azure Function that writes the change to Azure Storage or Service Bus. True real-time (sub-second) integration is limited by Business Central's webhook delivery frequency and API rate limits. For most enterprise analytics scenarios, batch integration every 15-60 minutes provides sufficient data freshness at lower complexity and cost.
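
The webhook flow above can be sketched as a small handler: Business Central first issues a handshake request carrying a `validationToken` that the endpoint must echo back, and subsequent notifications carry the changed resources in a `value` array. This is framework-agnostic logic you would wrap in an Azure Function HTTP trigger; the downstream queuing step is left as a comment.

```python
import json

def handle_notification(query: dict, body: str):
    """Return (HTTP status, response body) for a Business Central webhook call."""
    token = query.get("validationToken")
    if token:
        # Subscription handshake: echo the token back with a 200
        return 200, token
    changes = json.loads(body).get("value", [])
    # Here: write each change to Azure Storage or publish it to Service Bus
    return 202, f"accepted {len(changes)} change(s)"
```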

What are the API rate limits for Business Central?

Business Central online enforces API rate limits to protect the shared environment. The limits include a maximum of 6,000 OData requests per 5-minute window per environment, and concurrent session limits. For large data extractions, EPC Group implements pagination, incremental loading (delta queries), and request throttling in ADF pipelines to stay within limits. We also schedule large extractions during off-peak hours and use batch endpoints where available to maximize throughput within the rate limit boundaries.
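
The request-throttling piece reduces to a simple backoff policy: when the service answers HTTP 429 with a `Retry-After` header, honor it; otherwise back off exponentially. The base and cap values below are illustrative defaults, not Microsoft-mandated numbers.

```python
def retry_delay(attempt, retry_after=None, base=2.0, cap=60.0):
    """Seconds to wait before retrying a throttled (HTTP 429) request.

    attempt      -- zero-based retry count
    retry_after  -- value of the Retry-After header, if the service sent one
    """
    if retry_after is not None:
        return min(float(retry_after), cap)  # service-suggested wait wins
    return min(base ** attempt, cap)         # 1, 2, 4, 8, ... capped
```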

Should I use Dataverse or direct API integration?

Use Dataverse integration if you already use Dynamics 365 Sales, Customer Service, or Field Service and need cross-app data integration. Dataverse provides a unified data model and the Azure Synapse Link for Dataverse simplifies Data Lake export. Use direct API integration (ADF + Business Central API) if Business Central is your only Dynamics 365 application, you need more control over the extraction logic, or you want to avoid the additional Dataverse licensing and storage costs. EPC Group evaluates both approaches based on your Dynamics 365 footprint and analytics requirements.

What format should I use for Azure Storage: Parquet or CSV?

Parquet is strongly recommended for analytics workloads. It is a columnar format that provides 3-10x compression over CSV, supports schema evolution, and enables predicate pushdown (column pruning) in Synapse, Databricks, and ADF Data Flows, resulting in significantly faster queries and lower compute costs. CSV should only be used when downstream systems explicitly require it or for simple file exchange scenarios. EPC Group configures ADF pipelines to write Parquet with Snappy compression as the default format for all Data Lake targets.
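
In ADF, the default described above is set on the sink dataset. An illustrative dataset definition for a Data Lake Storage Gen2 target might look like the following; the linked service name, file system, and folder path are placeholders for your environment.

```json
{
  "name": "BcCuratedParquet",
  "properties": {
    "type": "Parquet",
    "linkedServiceName": {
      "referenceName": "AdlsGen2LinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobFSLocation",
        "fileSystem": "curated",
        "folderPath": "businesscentral/glEntries"
      },
      "compressionCodec": "snappy"
    }
  }
}
```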

How do I handle historical data migration?

Initial historical data migration is typically a one-time full extraction of all relevant Business Central entities into Azure Storage. For large environments, this may take several hours due to API rate limits. EPC Group uses parallel extraction (multiple entities simultaneously), pagination optimization, and off-peak scheduling to minimize the migration window. After the initial load, incremental pipelines maintain ongoing synchronization. We validate record counts and key financial totals between Business Central and the Azure target to ensure data integrity after migration.
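
The post-migration validation mentioned above can be sketched as a reconciliation pass: compare per-entity row counts (and key financial totals, with a small float tolerance) between Business Central and the Azure target. The tolerance value is an illustrative assumption.

```python
def reconcile(source: dict, target: dict, tolerance: float = 0.01) -> list:
    """Entities whose row counts (int) or financial totals (float) disagree."""
    mismatched = []
    for entity, expected in source.items():
        actual = target.get(entity)
        if isinstance(expected, float):
            # Financial totals: allow for rounding within the tolerance
            if actual is None or abs(actual - expected) > tolerance:
                mismatched.append(entity)
        elif actual != expected:
            # Row counts must match exactly
            mismatched.append(entity)
    return sorted(mismatched)
```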