Most companies use Power BI for their reporting and analytics use cases. Behind it, some of them have data engineering teams running Azure Data Factory and Synapse Analytics to make data available for Power BI to consume. Not just for that purpose, of course. They have their Data Warehouses in Synapse, or maybe they have Lakehouses in Databricks, you name it. Fabric might be an upgrade to the Power BI game, but how do you move forward when you already have a massive investment in a data platform on your Azure cloud?
Firstly, do you have to move to Fabric when it becomes generally available? No, you don’t. You would need specific use cases and requirements to justify that move. If you’re spending £1M+ on Azure data platforms and have spent far more building them, you need hard evidence that tomorrow will be better with Fabric.
What is Fabric for? We’ve covered this in our Microsoft Fabric vs Synapse Analytics article, but let’s recap: Fabric is a turnkey and SaaS data platform solution that Microsoft manages for you. It’s built on top of Power Platform and its Workspaces, allowing you to bring data next to your reports and analytics instead of building bridges between two galaxies of Azure and Power Platform.
Don’t get me wrong. Fabric is a fantastic, all-in-one tool that will disrupt the industry. But it’s not the answer to your every question or a remedy for all your aches. It won’t save you from your data universe troubles, but it can help ease them.
Fabric’s Strengths
Let’s dive into a few examples that would demonstrate the strengths of Fabric.
1. Unleash The Power Of Raw Data
The ability to bring raw data to your Power Platform and transform it with modern data tools is excellent news for companies’ BI and reporting teams. Depending on your data size, Fabric can replace your existing data pipelines and transformation rules. The Power Platform and the M query language certainly shine when it comes to data transformation and scripting, but they may feel unfamiliar if you’re not used to them. M, specifically, is an acquired taste.
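To make that concrete, here’s a minimal sketch of what such a raw-to-curated transformation could look like in a Fabric Spark notebook rather than in M. It assumes the notebook is attached to a Lakehouse, and the folder and table names (Files/raw/sales, sales_clean) are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Fabric notebook

# Raw CSV drops landed in the Lakehouse "Files" area (hypothetical path).
raw = (
    spark.read
    .option("header", "true")
    .csv("Files/raw/sales")
)

# Typical clean-up you might otherwise write in M: typed columns,
# de-duplication and a basic sanity filter.
clean = (
    raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
)

# Persist as a Delta table so it's queryable from the SQL endpoint and Power BI.
clean.write.mode("overwrite").format("delta").saveAsTable("sales_clean")
```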
2. Start Streaming Real-time Data
Real-time data has always been a challenge in Power Platform, but not any more. KQL databases and Spark notebook support help you tap into streaming data and show it on your reports, or run it through trained ML models to gain real-time insights. One example is to have your sales orders flow into your Fabric Lakehouses, Warehouses or KQL databases and then into your sales reports on a big product launch day.
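As a rough illustration, here’s a minimal Structured Streaming sketch for a Fabric Spark notebook. The built-in rate source stands in for a real event feed (sales orders arriving via an Eventstream or Event Hubs, for instance), and the checkpoint path and table name are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Fabric notebook

# The "rate" source generates (timestamp, value) rows continuously;
# swap it for your real event source.
orders = (
    spark.readStream
    .format("rate")
    .option("rowsPerSecond", 10)
    .load()
    .withColumnRenamed("value", "order_id")
    .withColumn("amount", (F.rand() * 100).cast("double"))
)

# Append the stream into a Lakehouse Delta table that reports can read.
query = (
    orders.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "Files/checkpoints/orders")  # hypothetical path
    .toTable("streaming_orders")                               # hypothetical table
)
query.awaitTermination()
```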
3. Move Data Warehouses And Lakehouses Closer To Your Business
Data Warehouses have always been a sensitive topic for most companies. The more answers you want them to give you, the slower they get. And not only are the answers slow, you also can’t ask questions rapidly, because each one takes time to implement. Business teams will hear their IT teams pointing at the Warehouse and saying, “This is why we can’t have nice things”.
Now, you can build Warehouses and Lakehouses quickly and move them closer to your business cases. You can create more than one Warehouse, with pipelines publishing to multiple targets. You can tailor each Warehouse/Lakehouse to the needs of a department or a product. You don’t need a single, expensive Warehouse that’s always unsatisfactory. Now you can move them to the right-hand side of your architecture, closer to your businesses, and make them performant thanks to Direct Lake and Lakehouse SQL endpoints.
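As a sketch of that “publish to multiple targets” idea, the snippet below writes one curated dataset to two department-specific Lakehouses over OneLake. The workspace and Lakehouse names are made up, and it assumes the curated table already exists:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Fabric notebook

curated = spark.read.table("sales_clean")   # curated table from an earlier step (hypothetical)

# OneLake paths follow abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<item>/...
# The workspace and Lakehouse names here are placeholders.
targets = [
    "abfss://SalesAnalytics@onelake.dfs.fabric.microsoft.com/SalesLakehouse.Lakehouse/Tables/sales",
    "abfss://FinanceAnalytics@onelake.dfs.fabric.microsoft.com/FinanceLakehouse.Lakehouse/Tables/sales",
]

for path in targets:
    # Delta tables written here surface through each Lakehouse's SQL endpoint
    # and can be read by Power BI in Direct Lake mode.
    curated.write.mode("overwrite").format("delta").save(path)
```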
4. Let It Be Managed
You don’t need to manage a mountain of infrastructure to get answers to some burning business questions. You don’t need a DevOps Team bigger than your Sales Department to keep your platform upright.
As mentioned in other articles, consider Fabric a charming and managed hotel. Get yourself a suite, and make yourself comfortable.
Fabric’s Weaknesses
Let’s go through some examples where Fabric might not be the hero you need it to be.
1. Fabric’s Capabilities Are Too Rounded
Think of Fabric like a Rock and Roll Greatest Hits album. It contains 20 hit songs from favourite bands you can’t get enough of. Sounds wonderful, right? But let me ask: which era of rock and roll will that album cover? The 60s? The 70s? The 80s? Let’s say it has Guns N’ Roses but not Paradise City. Would you still consider that album great? Would it give you the same rush as listening to a full Guns N’ Roses studio album?
When you think about Databricks workspaces or Synapse Analytics Workspaces, you get a lot of functionality with those tools, and some of it you depend on. You may have invested in Databricks’ Delta Live Tables. Or you’re using Key Vault references in your Synapse Analytics Linked Services. Or maybe you rely on Synapse’s Managed VNet capability, so it’s integrated with your hub/spoke network architecture. None of these are supported in Fabric.
Long story short, Fabric has all the greatest hits, but they may not be enough to make everything work. Of course, it depends on your use cases, but it warrants a thorough investigation.
2. Limited Customisation Support
Fabric is a managed platform that comes with many fixed behaviours. For example, currently, the only authentication provider it supports is Azure AD (Microsoft Entra ID). You can’t use SAS tokens on OneLake or connect to the Data Warehouse with SQL credentials. If you primarily use Azure AD as your application authentication method, and all the tools you’re using support the OAuth protocol, this limitation won’t apply to you.
But if you want your Data Warehouse queryable from your on-premises apps, or if you have a multi-cloud strategy that includes AWS and GCP, you may hit a few roadblocks where you need to create custom apps/functions/APIs on Azure that talk to OneLake on behalf of the client. You can’t directly customise Fabric or run apps on it (except notebooks), so you would still need other means to accomplish that.
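Here’s a minimal sketch of what such a relay could look like, assuming a small Python API or Azure Function that reads a OneLake file on behalf of a non-Entra client. It relies on OneLake’s ADLS Gen2-compatible endpoint and the standard Azure Storage SDK; the workspace, Lakehouse and file names are hypothetical:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"

def read_onelake_file(workspace: str, path: str) -> bytes:
    """Return the raw bytes of a OneLake file, e.g. 'SalesLakehouse.Lakehouse/Files/export.csv'."""
    credential = DefaultAzureCredential()            # managed identity / service principal / dev login
    service = DataLakeServiceClient(account_url=ONELAKE_URL, credential=credential)
    filesystem = service.get_file_system_client(workspace)   # the workspace acts as the file system
    file_client = filesystem.get_file_client(path)
    return file_client.download_file().readall()

# A web app or Azure Function would wrap this and hand the bytes to an
# on-premises or multi-cloud caller using whatever auth that caller supports.
if __name__ == "__main__":
    data = read_onelake_file("SalesAnalytics", "SalesLakehouse.Lakehouse/Files/export.csv")
    print(f"Downloaded {len(data)} bytes")
```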
Of course, this doesn’t mean that the lack of customisation is a showstopper. You must be aware of the limitations and have the resources to circumvent or remedy them. For example, if you’ve been relying on SFTP support on ADLS Gen2, which isn’t available in OneLake, you’ll have to set up an SFTP server that talks to OneLake for you. You can do that with an SFTP container hosted on Azure Container Apps and mount your OneLake to the container. Or you can keep an ADLS Gen2 account on your Azure subscription with SFTP enabled and have Fabric’s pipelines pull the data from there.
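To illustrate that second option from the producer’s side, here’s a small sketch of an external system dropping a file over SFTP into the SFTP-enabled staging account, which a Fabric pipeline would later copy into OneLake. The account, container and user names are placeholders, and the username follows Azure Storage’s SFTP local-user convention:

```python
import paramiko

HOST = "mystagingaccount.blob.core.windows.net"   # SFTP endpoint of the staging storage account
USERNAME = "mystagingaccount.landing.partner01"   # <account>.<container>.<local user> (placeholder)
PASSWORD = "<sftp-local-user-password>"           # or use an SSH key instead

# Open the SFTP session against the ADLS Gen2 staging account.
transport = paramiko.Transport((HOST, 22))
transport.connect(username=USERNAME, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)

# Upload a daily extract; a Fabric pipeline's Copy activity picks it up from here
# and lands it in OneLake on a schedule.
sftp.put("daily_orders.csv", "incoming/daily_orders.csv")

sftp.close()
transport.close()
```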
3. Migration Is A Challenge
Fabric is built on Power Platform, which means its structure follows how Power BI works rather than Synapse Analytics, and there are limitations that would prevent you from doing a lift-and-shift migration. One example is that Dataflows Gen2 is based on Power Query / M, which is significantly different from ADF / Synapse Data Flows, which are Spark-based. You can’t migrate your dataflows directly into Fabric; you would need to rewrite them.
Another example: if you’ve been using Azure Databricks for your Lakehouse, you won’t be able to migrate your notebooks directly onto Fabric notebooks. True, you wouldn’t be able to lift and shift into Synapse either, but whilst Synapse Analytics is a peer to Databricks on the technology landscape, Fabric sits closer to the business. You could’ve unlocked some value if you could migrate easily, but that’s not the case here.
Let’s not forget that the learning curve will also be a challenge. If you have a team primarily working on Spark-based technologies and languages, you won’t hit the ground running with Power Query / M.
Conclusion
Whilst Fabric brings a lot to the table, it may not be able to replace your existing stack as a turnkey SaaS platform.
However, if you use Fabric side-by-side with your existing data platform and only migrate the things that make sense to move, you might get the best of both worlds: your data platform kneads the raw data into shape and serves it to Fabric, and Fabric refines that data further, making diamonds out of lumps of coal.
I plan to write an article on how to use Synapse and Fabric together soon. Subscribe and get notified when that article drops!