Elevate Your Business Central Reporting: Modern Data Architecture for Analytics


Business Central helps companies take a major leap in both the depth of processes they can automate and the data available for analysis. But what is the right reporting solution for your business?

“Over the last five to ten years, there has been a pretty massive level of innovation — which is great and that’s awesome because there’s lots of new technologies out there — but it’s also caused a lot of fragmentation of the modern data stack. There’s literally hundreds — if not thousands — of products and open source technologies and solutions that customers have to make sense of.”

~ Arun Ulag, Corporate VP for Azure Data, Microsoft

Due to the complexity of BC’s table structures, it can be difficult and time-intensive for accounting teams to generate both the financial and operational reporting they need, a problem that compounds as more data sources are added. While smaller companies with straightforward requirements may find native Business Central reporting sufficient, most BC users will look for additional tools to supplement their reporting capabilities, often by leveraging a data warehousing solution.

Of course, a data warehouse requires its own storage and compute resources beyond Business Central. For years, SQL Server has been the standard for data storage and processing, but thanks to Microsoft, newer technology is more accessible than ever. Azure Data Lake was built to address the growing need for scalable, cost-effective solutions for managing large volumes of data. While this problem initially existed only in the enterprise space, data demands have grown across companies of all sizes. In fact, the average SMB organization now has valuable data spread across at least seven different systems. Fortunately, Azure Data Lake Storage is accessible and cost-effective, even for smaller organizations.

Performance & Functionality

From a performance perspective, ADLS offers the unique ability to scale storage and compute independently, with no practical cap on the amount of storage available. Rather than the fixed cost of a SQL Server deployment, ADLS uses a pay-as-you-go model, which makes it easy to optimize cost and scale to match need. Compute resources can be accessed on demand through Synapse, giving you lightning-fast data processing at a fraction of the cost, since once again you pay only for the compute you use. ADLS also goes a step further than SQL Server in the types of data it supports: it can handle unstructured data of any type or size, allowing you to bring in streaming data (IoT and sensor feeds), social media data, image and video files, and more.
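
To make “compute on demand” concrete, here is a minimal sketch of querying Business Central data that has been exported to the lake as Parquet, using a Synapse serverless SQL pool from Python. The workspace name, storage account, container, folder layout, and credentials are hypothetical placeholders, not a prescribed setup:

```python
# A minimal sketch of querying lake data through a Synapse serverless
# ("on-demand") SQL pool. All names below are placeholders.
import pyodbc

# Serverless endpoint of a hypothetical Synapse workspace.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=contoso-synapse-ondemand.sql.azuresynapse.net;"
    "DATABASE=master;UID=bi_reader;PWD=<password>"
)

# OPENROWSET reads the Parquet files directly from ADLS; no database
# tables or always-on server are required.
sql = """
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://contosodatalake.dfs.core.windows.net/bc/GLEntry/*.parquet',
    FORMAT = 'PARQUET'
) AS gl_entries
"""

with conn:
    for row in conn.cursor().execute(sql):
        print(row)
```

Because the serverless pool bills per terabyte of data a query scans, a query like this incurs cost only while it runs; there is nothing to keep provisioned between report refreshes.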

Azure Data Lake Storage has ready-made integrations with other Azure services and the broader Microsoft technology stack, making it easy to implement alongside your Business Central deployment. Security integrates seamlessly with Entra ID (formerly Azure Active Directory), ensuring your data is protected at rest and when consumed through Power BI or Excel.
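
As an illustration of that security integration, the sketch below authenticates to a hypothetical ADLS Gen2 account with an Entra ID identity via the azure-identity and azure-storage-file-datalake SDKs; the account and container names are placeholders:

```python
# A minimal sketch of Entra ID-secured access to ADLS Gen2.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# DefaultAzureCredential resolves an Entra ID identity from the
# environment (managed identity, Azure CLI login, etc.), so no
# storage keys need to live in code or configuration.
credential = DefaultAzureCredential()
service = DataLakeServiceClient(
    account_url="https://contosodatalake.dfs.core.windows.net",
    credential=credential,
)

# List a hypothetical Business Central export folder; access is granted
# (or denied) by Entra ID roles and ACLs, not by a shared secret.
fs = service.get_file_system_client("bc")
for path in fs.get_paths(path="GLEntry"):
    print(path.name)
```

Because the credential is resolved at runtime, access to the lake is governed by the same Entra ID roles and groups you already use across the rest of your Microsoft estate.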

Cost & Scalability

Typically, you’d expect newer and better technology to come with a higher cost, but that’s not necessarily the case here. To compare the cost of running your organization’s business intelligence, we looked at what a typical deployment looks like for ADLS and for SQL Server.

When we deploy a business intelligence solution, we need to address the storage of data, the compute resources to ensure timely report refreshes, and a way for end users to interact with that data. For ADLS, that means the Data Lake storage itself, on-demand Synapse for compute, and Power BI Premium to put the data models in the hands of users. With Azure SQL, you need the server, an instance of Analysis Services to deploy the data models (usually OLAP or Tabular models), Power BI Pro licenses for consumption of the data, and often an Azure VM to host the software managing your data pipelines and transformations.

Here’s how the numbers stack up, based on the data volume and consumption needs of a typical Business Central customer:

Azure SQL deployment

Resource                   Cost Model                                        Approximate Monthly Total
Azure SQL                  300 eDTU, 300 GB storage                          $650.00
Azure Analysis Services    Standard S0: 40 QPU, 10 GB                        $700.00
Azure VM                   DS3 v2                                            $350.00
Power BI Pro               10 users                                          $100.00
Total                                                                        $1,800.00

ADLS deployment

Resource                   Cost Model                                        Approximate Monthly Total
ADLS                       Premium storage                                   $150.00
Azure Synapse              Serverless SQL pool & Spark pool: 4 vCore/32 GB   $450.00
Power BI Premium           10 users                                          $200.00
Total                                                                        $800.00

As you can see, ADLS represents tremendous value compared to the old SQL Server approach, providing a modernized data infrastructure at under half the cost. The increase in cost as you grow is marginal as well: our largest customers in the enterprise space pay only ~$x more per month, meaning you can count on stable pricing without any gotchas or surprises down the road.

Additional Benefits

For many organizations, the performance and cost improvements are reason enough to move to Azure Data Lake Storage, but the long-term benefits are compelling as well: unlimited scalability, compatibility with data in any format, and the ability to launch AI and ML initiatives. Additionally, with the release of Microsoft Fabric, your entire data platform, from data ingestion to storage and compute to end-user consumption, will be integrated, with simplified licensing, and can be managed in one place.
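
As a small illustration of that AI/ML readiness, the sketch below pulls a Parquet file from a hypothetical lake folder straight into a pandas DataFrame, the usual starting point for ML experiments; all account, container, and file names are placeholders:

```python
# A minimal sketch: lake files feeding data science tooling directly.
import io

import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Hypothetical storage account and container; authentication reuses the
# same Entra ID credential pattern shown earlier.
service = DataLakeServiceClient(
    account_url="https://contosodatalake.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
file_client = service.get_file_system_client("bc").get_file_client(
    "GLEntry/2024/entries.parquet"  # placeholder export path
)

# download_file() streams the file; readall() returns its bytes, which
# pandas can parse directly (requires pyarrow or fastparquet).
raw = file_client.download_file().readall()
df = pd.read_parquet(io.BytesIO(raw))
print(df.describe())
```

The same files that feed your Power BI models are immediately usable by data science tools, with no extra extract step in between.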

While there are some changes to how data flows through the data lake, when paired with a properly implemented data lakehouse and Power BI models, end users aren’t exposed to any additional complexity. This allows for a relatively seamless transition that doesn’t burden them.

Are you exploring a move to Azure Data Lake Storage? OmniData™ can help, whether through our turnkey reporting & analytics environment, OmniAnalytics™, or through a more custom engagement. Contact us today.