
BARC Perspective on Microsoft’s Announcement of “MS Fabric”

In its latest update, Microsoft has announced the bundling of its data & analytics product stack into Microsoft Fabric.

What happened?

Microsoft has bundled its data & analytics product stack into the newly announced Microsoft Fabric. The product is currently in preview status, although early adopters have been testing it as part of an ongoing pre-beta program.

Why is it important?

Microsoft has gained a significant share in the business intelligence market since the 1990s with SQL Server and, over the last decade, mainly Power BI. In recent years, Azure, Synapse and other components have been the focus of a renewal of its data & analytics strategy to implement modern data management and, in particular, to secure a substantial share of the AI market opportunity.

This latest announcement reconfigures these components into Microsoft Fabric with one small addition (Data Activator) and is sure to apply new pressure to the data & analytics market.

What is interesting about it?

  • The modernization of data management is progressing rapidly. Driven by an increasingly dynamic environment and the trend toward the cloud and AI, many major providers are rebuilding or modernizing their back ends with concepts such as data mesh, data lakehouse and data fabric. Microsoft is now finalizing its transformation with the newly branded Microsoft Fabric offering.
  • Through this move, Microsoft, one of the dominant vendors in the data & analytics market, has united its products in one stack – not only from a licensing point of view, but also with a unified UI and easier interaction between the individual components. In recent months, the acquisition of Talend by Qlik and SAP’s Datasphere announcement have already signalled the intentions of two larger providers to improve the integration of the front and back ends within their portfolios.
  • Microsoft Fabric is entering the market with a more aggressive licensing model, designed to offer access to a complete modern data & analytics stack for mid-market companies in particular. Until now, its rather ambitious pricing model, especially around the core Synapse component for modern data management and data warehousing, has been a barrier to widespread use. The new licensing model is purely consumption-based, which makes it harder to calculate the cost of ownership in advance.
  • It appears that after its positive experience with Power BI as an inexpensive entry-level product, Microsoft also wants to leverage its market position to bring more users to its platform on the cloud data management and data warehouse side. One hope is undoubtedly to leverage more use cases, especially in this current period of AI hype, through an initially more affordable, complete offering and to monetize them later through significant usage of Azure compute.
  • The ‘fabric’ concept has been hyped in the data & analytics market for some time. Microsoft is taking the bold step of using a potentially cyclical term in the product name. This adoption by Microsoft substantially upgrades the term ‘fabric’ in the data & analytics industry.
  • Compared to Synapse, the engine behind Microsoft Fabric has now been completely rebuilt with new cloud storage and compute separation paradigms. The goal is to compete and scale comparably with vendors like Databricks and Snowflake while also providing the same degree of flexibility.

Background and technological fit

MS Fabric brings together components that have already worked well together in the past:

  • Azure Data Factory for data integration
  • Synapse for cloud-based data warehousing and engineering
  • Different services for machine learning and AI
  • Purview for data governance
  • Power BI – the popular front-end for dashboarding and reporting

OneLake is a rebranded version of Azure Data Lake Storage (ADLS) with an enhanced user interface as well as new functionality for virtualization, such as integrating AWS S3 data. Known as “Shortcuts”, this functionality will likely be extended to other cloud technology providers including Google and SAP. It appears that Microsoft is driving OneLake in the direction of an overarching data virtualization engine.

The newly announced “Data Activator” component allows users to define rule-based event processing and automated workflows. This functionality was supported in previous Azure environments with scripting, but Data Activator offers a new, end-user-oriented, low-code toolset with greater flexibility.

MS Fabric could be seen primarily as a new commercial bundle of familiar components, but there are two other interesting aspects:

  • A user interface that will be unified step by step. This is intended to significantly simplify the interaction between the components, but a proper assessment will only be possible from the public beta phase, when the integration is largely complete. Integrating generative AI capabilities throughout the product portfolio has also been announced as an important part of the unified user interface (and beyond).
  • A new pricing model based on combined usage and billing with a Microsoft tenant. Once activated, all components of MS Fabric are available and ready to use.

Potential negative effects for customers

  • Bundling products has no added value in itself, and the deeper technical integration could even slow the development of individual components for some time. Microsoft has an excellent track record in the data & analytics space, especially from integrating Power Query, Power View and Power Pivot into Power BI a few years ago. On the other hand, some of the bundled components (e.g., Purview for data governance and data intelligence) have functional gaps in comparison to major competitors, which still need to be closed.
  • Microsoft is trying hard to demonstrate the continued openness of the platform, underpinned by press releases from partners such as Informatica. Nevertheless, customers with best-of-breed strategies could suffer from the fact that cooperation with third parties has potentially less focus in the future.
  • A stronger bundling of products in a cloud provider’s stack increases the lock-in effect and makes customers more susceptible to subsequent price increases.
  • Although MS Fabric follows a multi-cloud approach and includes functionalities to integrate data from AWS and Google Cloud, the level of integration is rather basic. Customers with multi-cloud strategies could be pushed to standardize their data in Azure.
  • For existing Synapse users, MS Fabric is a logical upgrade path, but the migration seems to require a lot of manual effort. It remains to be seen whether Microsoft will offer migration tools in the near future.

Potential positive effects for customers

  • Existing users of individual components like Power BI will have an easier path to evaluate and try out additional components and benefit from better integration of components and a user interface that feels familiar.
  • MS Fabric tries to simplify and push implementation tasks that used to be pure ‘data engineering’ tasks closer to the business user by integrating them into the Power BI workspace. This could help scale the data engineering process in companies.
  • The integrated approach of MS Fabric will help users move from simple data management applications to more complex data science scenarios from a technical point of view.
  • It appears that the new licensing model will be more affordable, especially for mid-size companies, although detailed pricing is yet to be announced.
  • Companies gain flexibility in using their existing Azure licenses for different MS Fabric components.
  • The administration of the components will be easier with a centralized approach within one tenant.

Strategic outlook

In general, the trend in the market is reverting to the standardization of stacks by the larger vendors. Since its inception, movements between best-of-breed approaches and vendor-specific stacks have characterized the data & analytics market. After a prolonged period with more combined and self-service approaches, it seems that the centralization of the data & analytics infrastructure is coming back. This is being driven mainly by the mega-vendors, who are trying to push as many workloads as possible into their single cloud environments.

Such a development has positive and negative effects for customers. Greater integration makes it easier to implement new and more complex use cases. On the other hand, the lock-in effect with individual providers becomes stronger. Nevertheless, vendors are striving to demonstrate openness and ensure interoperability. It will be interesting to observe how practicable multi-cloud concepts evolve in the data & analytics world in the future.

We believe that ‘data fabric’ solutions will become more functionally rich and widespread as they are more capable of dealing with multi-cloud environments than pure data lakehouse solutions. However, as long as the mega-vendors are doing very good business migrating customers into their “single” clouds, we expect new ‘fabric’ features to trickle in at a slow to medium pace.

In any case, the transformation of the data & analytics infrastructure from the good old data warehouse – which still has its place for some scenarios – to new data management concepts is in full progress.



BARC Fellow

Stefan Sexl is a BARC Fellow specializing in CPM, BI and reporting. With 30 years of industry experience with leading vendors such as MIS AG, pmOne and Tagetik, he advises vendors on marketing, sales and product strategy as well as end customers on the selection and implementation of solutions.

He has a particular focus on the market for CPM and group accounting solutions including planning, consolidation, ESG, account reconciliation and other components.


Shawn has over 28 years of international experience as an industry analyst, thought leader, speaker, author, and instructor on data, business intelligence (BI), analytics, artificial intelligence (AI), machine learning (ML), and cloud technologies. His former analyst roles and executive strategy positions with enterprise software firms give him a unique industry perspective.

Shawn is a published author and has co-written two industry-leading books, “Social Data Analytics” and the latest, “Analytics: How to Win with Intelligence.” He was recently named one of Top 50 Global Thought Leaders on Analytics 2023 by Thinkers360.

Senior Analyst Data & Analytics

As a BARC Fellow, Gernot Molin advises local and international organizations of all sizes and industries in defining strategy for advanced analytics and business intelligence, as well as data platforms for machine learning and artificial intelligence.

He is also a consultant for digital transformation, data architecture concepts and software selection, data modeling and solution design, as well as cloud strategy and cloud platform concepts.

He has been with BARC since 2019.

Senior Analyst Data & Analytics

Timm Grosser is a Senior Analyst Data & Analytics at BARC with a focus on data strategy, data governance and data management. His core expertise is the definition and implementation of data & analytics strategy, organization, architecture and software selection.

He is a popular speaker at conferences and seminars and has authored numerous BARC studies and articles.

Senior Analyst Data & Analytics

Jacqueline Bloemen is a Senior Analyst for Data & Analytics, focused on data & analytics strategy and culture, architecture & technology, governance and organization. She is an author and speaker, and has been advising companies of all sizes and industries for over 40 years.

Her current research and consulting activities are focused on the transformation towards data-driven business and data democracy. She has been with BARC since 2005.
