Microsoft Fabric is an end-to-end, unified analytics platform that integrates data engineering, data warehousing, data science, real-time analytics, and business intelligence. It streamlines data workflows across services like Azure, Power BI, and Synapse in a single SaaS offering.
When dropping a table in Microsoft Fabric, we sometimes notice that the Delta files have not been completely removed. This can cause issues when recreating the table, as the location may still contain old metadata or non-Delta files. To resolve this, we first list the files in the target folder location, identify any remaining Delta files, and then delete them. Once the location is clean, we recreate the table to ensure a fresh, error-free setup. This article describes how to delete Delta files from a Fabric Lakehouse.
In the example below, we want to drop and recreate the table 'lh_gold.t_fact_day_sales' in a Fabric Lakehouse.
Error Message:
The typical error message we receive while dropping and recreating the table:
AnalysisException: [DELTA_CREATE_TABLE_WITH_NON_EMPTY_LOCATION] Cannot create table ('lh_gold.t_fact_day_sales'). The associated location ('abfss://7eaf3bc1-2d0c-43d9-bcbf-12b1195cb9a5@onelake.dfs.fabric.microsoft.com/cd923083-448c-43a9-bc86-94cfc95a7bc8/Tables/t_fact_day_sales') is not empty and also not a Delta table.
Why Microsoft Fabric Can’t Auto-Delete Old Delta Files Like Databricks
Databricks has a built-in mechanism in its runtime to automatically manage the files behind Delta tables, especially when you run DROP TABLE, TRUNCATE, or VACUUM. It integrates deeply with the storage layer (such as DBFS or direct Azure Data Lake) and uses dbutils and the Delta Lake transaction log (_delta_log/) to keep everything consistent.
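For comparison, a minimal sketch of that explicit cleanup step, using standard Delta Lake SQL from a PySpark cell (the table name is the one from our example; RETAIN 168 HOURS is the default 7-day retention threshold):

# Removes data files no longer referenced by the Delta transaction log
spark.sql("VACUUM lh_gold.t_fact_day_sales RETAIN 168 HOURS")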
However, Microsoft Fabric (as of 2024–2025):
- Uses a different storage abstraction: OneLake, not direct ADLS.
- Runs on Spark pools that don’t fully expose low-level file system utilities like dbutils.fs.
- Manages tables a little differently, especially when they are shortcuts or external tables, and sometimes does not automatically delete orphaned files when a table is dropped.
- Hardens and sandboxes its Spark runtime for multi-tenancy and governance, which limits direct file system operations unless they are explicitly triggered by the user (via mssparkutils, the Fabric UI, or the API).
In short, Fabric prioritizes safety and governance over automatic cleanup, and doesn’t yet implement the same auto-vacuuming behaviors Databricks does.
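You can see which explicit file operations Fabric does expose from a notebook (mssparkutils is pre-loaded in Fabric Spark notebooks):

# Prints the supported file system operations (ls, rm, cp, mv, mkdirs, ...)
mssparkutils.fs.help()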
Follow the steps below to delete Delta files from Fabric workspace programmatically.
1. Check if the Location Contains Files
# List the contents of the table folder (path taken from the error message above)
files = mssparkutils.fs.ls("abfss://7eaf3bc1-2d0c-43d9-bcbf-12b1195cb9a5@onelake.dfs.fabric.microsoft.com/cd923083-448c-43a9-bc86-94cfc95a7bc8/Tables/t_fact_day_sales")
display(files)
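If you want to flag only what should not be there, here is a small sketch (assuming we store the same path in a table_path variable for reuse) that lists everything except the Delta transaction log:

table_path = "abfss://7eaf3bc1-2d0c-43d9-bcbf-12b1195cb9a5@onelake.dfs.fabric.microsoft.com/cd923083-448c-43a9-bc86-94cfc95a7bc8/Tables/t_fact_day_sales"
# Anything other than _delta_log (or a folder with no _delta_log at all) explains the error above
leftovers = [f.path for f in mssparkutils.fs.ls(table_path) if f.name != "_delta_log"]
print(leftovers)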
2. Delete Non-Delta Files (If Any)
# Remove the folder and everything in it (recurse=True also deletes subfolders such as _delta_log)
mssparkutils.fs.rm("abfss://7eaf3bc1-2d0c-43d9-bcbf-12b1195cb9a5@onelake.dfs.fabric.microsoft.com/cd923083-448c-43a9-bc86-94cfc95a7bc8/Tables/t_fact_day_sales", recurse=True)
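If you would rather keep a valid transaction log and remove only the stray items, a hedged alternative (reusing the table_path variable from step 1):

# Delete only the leftovers instead of the whole folder
for f in mssparkutils.fs.ls(table_path):
    if f.name != "_delta_log":
        mssparkutils.fs.rm(f.path, recurse=True)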
3. Drop the Table Before Recreating
DROP TABLE IF EXISTS lh_gold.t_fact_day_sales;
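The same statement can be run from a PySpark cell if you are not working in a SQL cell:

spark.sql("DROP TABLE IF EXISTS lh_gold.t_fact_day_sales")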
4. Create the Table Again Using Delta Format
CREATE TABLE lh_gold.t_fact_day_sales
USING DELTA
LOCATION 'abfss://7eaf3bc1-2d0c-43d9-bcbf-12b1195cb9a5@onelake.dfs.fabric.microsoft.com/cd923083-448c-43a9-bc86-94cfc95a7bc8/Tables/t_fact_day_sales';
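To confirm the recreated table is healthy, a quick check using standard Delta Lake SQL:

# Shows the table's format, location, and file count
spark.sql("DESCRIBE DETAIL lh_gold.t_fact_day_sales").show(truncate=False)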
Alternative: Delete the Folder Manually
If you are unable to delete files programmatically, try these options:
- OneLake Explorer → Navigate to Tables/t_fact_day_sales and delete the folder.
- Azure Storage Explorer → Connect to the Fabric OneLake storage and manually delete the folder.
Once the folder is clean, retry the table creation.
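Before retrying, you can verify from a notebook that the location really is clean (a small sketch reusing table_path; mssparkutils.fs.ls raises an exception once the folder no longer exists):

try:
    print(mssparkutils.fs.ls(table_path))  # prints an empty list if the folder exists but is clean
except Exception as e:
    print("Folder already removed:", e)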
Pro tips:
1. When working with an external table, the manual options above are an easy way to drop the folder. With a managed table, however, you have to clean the location programmatically with mssparkutils, as shown above.
2. If you want to automate the deletion of old files from blob storage, follow this article.
Jagdish Wagh
Jagdish is a seasoned Solution and Cloud Architect with over a decade of experience in data engineering, BI, and analytics. He specialises in Microsoft cloud and Databricks solutions.