Microsoft Fabric – where is it headed, and is it for you?

Summary

Microsoft has been making waves in the data space with the arrival of Fabric, MS's be-all and end-all solution to all data problems. Should I adopt the new player in the field? Where is Fabric going, and what can we expect from the service? We went to Sweden to find out.

The first-ever European Microsoft Fabric Community Conference (FabCon Europe) was held from 24 to 27 September 2024 in cloudy Stockholm – a boon for anyone to whom Las Vegas is not readily accessible. It gathered some 3,000 delegates at Stockholmsmässan for a tutorial day followed by three days of keynotes and other sessions.

I chose to attend a tasty mix of Fabric feature presentations, case studies and sessions on higher-level considerations. During breaks, one could wander the expo area, which featured some 20-odd stands of Fabric-related product vendors and fellow consultants, as well as my favourite attraction, the Ask the Experts zone.

Three guys and no cubes

Self-service

A major highlight of my FabCon trip was the one-day pre-conference session Adopting Microsoft Fabric for self-service analysis, held by Microsoft's own Matthew Roche. I picked this session since self-service is a complex theme that is piquing the interest of many clients. I found that the session also set expectations for the future of Fabric: Microsoft is emphasising ease of use over advanced but quirkier features.

Self-service analytics concentrates on the vision of data democratisation and the citizen developer. The idea is to unload development work from dedicated developers and analysts and pass it to the business users who consume the insights. The motivation is clear: bring the work to where the business understanding is and eliminate the often-long wait between submitting your analysis need to a developer and getting it implemented. However, the obvious question is about the technical expertise and organisational framework required. How do you empower the enlightened layman while ensuring quality standards are met?

While the day was – obviously – named after Fabric, it mostly dealt with organisational processes, and the lessons learned could easily be applied to any combination of services, not just Fabric. That said, it seems clear that Microsoft is targeting exactly this market with many of the design decisions behind Fabric. It starts with the low barrier to entry: just buy a capacity and you're good to go! Beyond Spark notebooks, Fabric services are generally low-to-no-code, with drag-and-drop, mouse-driven interfaces and drop-down lists to guide the user through. And if you do need to code, MS is quick to bring out Copilot to prevent anyone from getting too intimidated (and equally quick to remind you that it needs human oversight).

Microsoft will copilot you to the future

Key takeaways

What, then, were the key takeaways from a day of diving into organisational acceptance of new technology?

In the presentation, Microsoft envisions a centre-of-excellence pattern where a core team of Fabric experts supports a much larger group of corporate citizen developers. The main role of the CoE is not so much to develop solutions itself as to create documentation and design patterns, educate and provide consultation. Maximising impact is the key!

There are two main considerations when introducing a Fabric CoE to an organisation:

The first is scale: a rule of thumb for the size of a centre of excellence is one member per one hundred Power BI users. If an organisation does not have at least a couple of hundred PBI users, it's not going to reap the benefits of a CoE, since the users it's supposed to empower are simply not there. That's not to say that a small-to-medium-sized organisation (in terms of analytics users) could not find value in Fabric. It can, but the priorities for its adoption will be elsewhere.

The second has to do with time: one has to be prepared to play the long game and resist the temptation of expecting short-term gains. It will take time to set up a centre of excellence, and even longer for it to make its presence felt and for self-service to truly start rolling. Once it does, though, the critical mass of citizen developers will – at least in Microsoft's view – be able to extract value from Fabric much more quickly than a developer-centric, less flexible organisation can. That's not to say that the former would not use dedicated developers, but those developers would be freed to deliver higher-value, more sophisticated projects without leaving business users with more specific needs hanging.

One final point that should be mentioned in relation to Fabric is that when leveraging the citizen developer, special care should be taken with workspace and capacity management. This is the unfortunate flip side of Fabric's "broad strokes" approach to billing. Fabric currently does not have very fine-grained guardrails in place for managing compute quotas, so a few badly written DAX queries can mean trouble for the entire capacity if left unchecked. One thing to note in particular is that when Fabric is under too heavy a load, it borrows capacity units from the future. This means that a solution may seem to be running smoothly while actually living on borrowed time, and it will hit the wall once throttling kicks in. Performance-related issues will obviously crop up more easily with citizen developers than with more technically oriented people, introducing a heightened need for checks and balances. At the very least, one should have a separate development capacity for Fabric, but the larger the organisation, the more sophisticated the capacity handling must be.
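As a toy illustration of that borrowing behaviour, the sketch below simulates minute-by-minute capacity consumption. The capacity size, the smoothing rule and the throttling threshold are all invented for illustration, and Fabric's real smoothing and throttling logic is more involved, but the shape of the problem is the same: a burst of heavy queries looks fine at first and only trips throttling once the accumulated debt catches up.

```python
# Toy simulation of the "borrowing from the future" behaviour described above.
# All numbers and the smoothing rule are invented for illustration only;
# Fabric's actual capacity smoothing and throttling logic is more involved.

CAPACITY_CU_PER_MIN = 64       # hypothetical capacity units available per minute
THROTTLE_DEBT_LIMIT = 10 * 64  # hypothetical debt level at which throttling starts

def simulate(load_per_minute):
    """Minute-by-minute simulation of consumption against a fixed capacity."""
    debt = 0.0
    for minute, load in enumerate(load_per_minute, start=1):
        # Consumption above the capacity line is carried forward as "debt",
        # i.e. capacity units borrowed from future minutes.
        debt = max(0.0, debt + load - CAPACITY_CU_PER_MIN)
        status = "THROTTLED" if debt > THROTTLE_DEBT_LIMIT else "ok"
        print(f"min {minute:>2}: load={load:>4} CU, carried-over debt={debt:>6.0f} CU, {status}")

# A burst of expensive queries looks fine at first; the debt catches up later.
simulate([40, 50, 300, 400, 60, 60, 60, 500, 500, 60, 60])
```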

Should I?

The most obvious question remains: should I adopt Fabric, or advise my clients to do so? The answer is equally obvious: it depends. With its ease of use and the low barriers between different workloads, there's a lot to like in Fabric. And yet, despite its general availability status, plenty of details are not quite there yet or behave quirkily under real-world usage.

So, if you already have a relatively modern data solution, it's probably good enough to stick with. However, if you're still running on-prem and want to jump on the cloud train, Fabric is worth serious consideration. The same goes for users of Microsoft's older cloud-based services, such as Synapse and Azure Data Factory. While Microsoft has committed to maintaining them, they will not be getting new features or enhancements. Fabric is where the future of the Microsoft data world lies.

While Fabric may be a new beast under the hood – it is built on open-source Delta format files – the user experience of its engines has been built to closely resemble their last-gen counterparts. This is a definite plus for anyone moving over from older MS solutions, as the amount of upskilling required for their teams is going to be significantly less than when jumping to another provider altogether.

One of the biggest pain points in Fabric is the lack of finer-grained security controls. Currently Fabric does not offer a unified governance layer comparable to Unity Catalog. Instead, security must be managed separately for each engine, meaning that you might set up row-level security in the Data Warehouse, yet bypass it entirely with Spark if you have access to the underlying data. This makes governing access to sensitive data that much more complicated and error-prone. Now, Microsoft is very much aware of this problem: the current promise for the availability of OneLake Security is the first quarter of 2025. The fact remains, though, that it is not in place now, and it remains an article of faith whether MS can have it running by the time a customer starting with Fabric would need it, and whether it will live up to its promises.
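To make the gap concrete, here is a hypothetical PySpark snippet of the kind you could run in a Fabric notebook. The workspace, lakehouse and table names are invented, and the scenario assumes row-level security has been set up on the Warehouse/SQL analytics endpoint; the point is simply that a Spark read of the underlying Delta table never passes through that endpoint, so the policy is not evaluated.

```python
# Hypothetical illustration; workspace, lakehouse and table names are placeholders.
# Assumption: row-level security has been configured on the SQL analytics endpoint,
# so a T-SQL query against the sales table would only return the caller's own rows.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # in a Fabric notebook, `spark` already exists

# Reading the same Delta table directly with Spark bypasses the SQL endpoint entirely,
# so the row-level security policy is never applied: anyone with read access to the
# workspace or Lakehouse sees every row.
all_rows = spark.read.format("delta").load(
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "SalesLakehouse.Lakehouse/Tables/sales"
)
print(all_rows.count())  # full row count, RLS filters not applied
```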

Some vendors that do believe in Fabric

There's no denying that the major alternatives to Fabric – Databricks and Snowflake are the obvious candidates – have been around for a while, and one would be hard-pressed to claim Fabric has the same level of maturity. Even so, at the analytics end of the data ecosystem, Microsoft's Power BI remains a heavy hitter even for otherwise non-MS users – and it is conveniently part of Fabric. While for now the added convenience of running Power BI on top of Fabric sources is not that large – Direct Lake, for instance, is really useful only in rather specific circumstances – it remains to be seen how the situation develops.

In any case, if your organisation is currently considering its options, it's a good time to run a limited proof-of-concept project. Microsoft still needs to build a critical mass of users behind Fabric, and likely in that interest gives out a rather generous two-month trial of a relatively hefty Fabric capacity, which should be enough for prototyping and testing. The major players are also quick to point out that there's no vendor lock-in for the data itself: Delta Lake is open source, so you can read your Fabric data with any tool that supports the Delta format, not just Fabric. So if you're weighing your options, do consider a comparative POC with Fabric as one of the candidates.
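As a minimal sketch of that openness, the snippet below reads a Fabric Delta table with the open-source delta-rs Python bindings (the deltalake package) instead of any Fabric engine. The OneLake URI and the authentication token are placeholders, and the exact storage_options keys depend on the library version you use, so treat this as an outline rather than a recipe.

```python
# Minimal sketch: reading a Fabric/OneLake Delta table without Fabric itself.
# Requires: pip install deltalake pandas
# The URI and the token below are illustrative placeholders.

from deltalake import DeltaTable

table_uri = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "SalesLakehouse.Lakehouse/Tables/sales"
)

# Authentication is passed via storage_options (here a bearer token is assumed);
# check the deltalake documentation for the keys supported by your version.
dt = DeltaTable(table_uri, storage_options={"bearer_token": "<your-entra-id-token>"})

df = dt.to_pandas()  # the same data, read with purely open-source tooling
print(df.head())
```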

Conclusion

FabCon Europe was a great venue to learn, connect and stay up to date on Microsoft Fabric. My chosen tutorial offered a valuable roadmap for guiding a larger organisation towards adopting Fabric – or any self-service-powered data and analytics solution. The sessions brought a great deal of clarity on the current state, prospects and best practices of Fabric. While Fabric is still rough around many of its edges, there's plenty of promise, and even in its current state it's definitely worth a look for anyone aiming to modernise their data platform – especially if you're coming from the MS ecosystem to start with.

About the author
Luke Jäppinen

Luke is a data professional with experience ranging from data warehousing and relational databases to analytics. He specialises in Databricks and Microsoft data solutions (Power BI, Fabric, etc.) and is constantly on the lookout for advanced methods to deliver ever higher quality data and analytics solutions to clients.
