How big is your Semantic Model in Power BI?
- Jeroen Dekker
- Jul 22
- 3 min read
During a recent Power BI training session, a participant asked
“"Our dataset has become too large for Power BI Pro. Where can I see what is causing this?"?”
Here’s how we tackled the problem—and how you can, too.
Why this question?

One participant’s semantic model had grown unexpectedly large, creating issues across their organization.
An external admin proposed upgrading everyone to Premium per User licenses—a costly solution.
But the team questioned whether this was really needed—the PBIX file was “only” 700 MB, well under the 1 GB limit for Power BI Pro. Was the error message even accurate?
Why model size matters—for licensing and performance
Model size matters not just for licensing, but also for report performance. Even if your file size fits within the allowed limits, a bloated model can slow down your dashboards.
For organizations using Microsoft Fabric, model size also influences the number of Capacity Units (CUs) consumed—directly impacting cost.
Power BI Pro allows a maximum model size of 1 GB. If your model exceeds this, a Premium per User license (roughly €10/month extra per user) may be required. But before reaching for the credit card, ask whether the model can be reduced instead. Often, the answer is yes, and the first step is measuring the true size of your semantic model.
But how do you know the real size of your model?
Power BI Desktop files (.PBIX) include visuals, images, and report pages—not just the data model.
When saving a PBIX file, Power BI Desktop applies strong compression, so the file size is not a reliable indicator of your model’s true size.
Unfortunately, Power BI Desktop itself doesn’t show the internal model size. For that, you’ll need an external tool like DAX Studio.
Example: Measuring with DAX Studio
To demonstrate, I downloaded a large CSV from the RDW (the Dutch vehicle authority): a dataset of all vehicles with license plates in the Netherlands.
I loaded the data unfiltered into Power BI. The CSV was 10.6 GB, but the resulting PBIX file was only 598 MB. Still, neither of those tells us the actual model size.

When we open it in DAX Studio, we see the real model size: 1.19 GB—well over the Power BI Pro limit.

So this model wouldn’t work under Power BI Pro. DAX Studio’s Tables tab shows the size of each column and table—perfect for identifying areas for optimization.
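DAX Studio gets these numbers from the dynamic management views (DMVs) exposed by the Analysis Services engine that powers Power BI. If you want to peek at the raw figures yourself, here is a rough sketch of a DMV query (note: DMV syntax, not DAX) that you can paste into DAX Studio’s query pane to list the dictionary size of every column:

```sql
SELECT
    DIMENSION_NAME,
    COLUMN_ID,
    DICTIONARY_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
```

DICTIONARY_SIZE is reported in bytes and covers only the column’s dictionary; the Tables tab in DAX Studio adds the data segments and hierarchies on top of it, so its totals are the numbers to rely on. The DMV SQL dialect is very limited, so it’s easiest to sort the result on DICTIONARY_SIZE in the results grid.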

Important: the Cardinality of the columns
A key metric to monitor is Cardinality—the number of unique values in a column. Power BI uses a columnar storage format, compressing each column individually. High cardinality means less compression and larger models.
For example, a “Yes/No” column is highly compressible, so even with 16 million rows its size stays manageable. A column with 16 million unique license plates? Not so much. That’s why the “License Plate” (kenteken) column is the largest in the example.
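If you want to check the cardinality of a suspect column yourself, a short DAX query in DAX Studio (or in Power BI’s DAX query view) will do. The table and column names below are only placeholders for the RDW example; substitute your own:

```dax
// Compare the total row count with the number of distinct values in a column.
// The more distinct values, the worse the compression.
// "Voertuigen" and [Kenteken] are placeholder names for the RDW example.
EVALUATE
ROW (
    "Total rows", COUNTROWS ( Voertuigen ),
    "Distinct license plates", DISTINCTCOUNT ( Voertuigen[Kenteken] )
)
```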
How we fixed the issue
When we connected the participant’s file to DAX Studio, we immediately spotted the issue. The model was indeed 1.1 GB, and the cause was easy to see:
There were many hidden date tables that had very high 'cardinality'.
One of the auto-generated date tables had over 500,000 unique dates—wildly unnecessary, considering a year has at most 366 days.
We identified three key issues causing model bloat—and all were easy fixes.
- Some date fields contained errors—one even showed dates from over 1,400 years ago. Simple data-entry mistakes that caused major issues.
- Auto Date/Time was enabled, inflating the model by more than 300 MB.
- The CALENDARAUTO() function generated a timeline spanning over a millennium because of the bad data.
Fixing these reduced the model by 500 MB. The participant also removed high-cardinality columns that weren’t needed. In less than 30 minutes, the model was down to under 500 MB—and could be optimized even further.
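If you run into the same pattern, the usual remedy is to switch off Auto Date/Time (Options > Data Load in Power BI Desktop) and replace CALENDARAUTO() with a single, explicitly bounded date table. A minimal sketch in DAX; the 2010 start date is purely an assumption, so adjust the range to the span your data actually needs:

```dax
// One explicit date table with a deliberately bounded range,
// instead of Auto Date/Time or an unbounded CALENDARAUTO().
// The start year (2010) is an assumption; match it to your real data.
DimDate =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2010, 1, 1 ), DATE ( YEAR ( TODAY () ), 12, 31 ) ),
    "Year", YEAR ( [Date] ),
    "Year-Month", FORMAT ( [Date], "YYYY-MM" )
)
```

Once Auto Date/Time is off, the hidden local date tables disappear from the model; mark the new table as a date table and relate it to your fact tables.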
Conclusion
There’s a Dutch saying: “Meten is weten”—measuring is knowing. That’s especially true in Power BI. Before you upgrade your license, measure and analyze your model.
Model bloat doesn’t just affect performance—it can drive up costs unnecessarily. Small data issues often have a big impact.
Optimization starts with accurately measuring your data model. For this, an external tool such as DAX Studio, Tabular Editor or Measurekiller is indispensable.

