John Liu Blog

Every drop counts

Fabric Capacity

When we use a paid Fabric capacity, we may want to automatically pause and resume it to save cost. We can use an Azure Automation runbook to achieve this. We set up an Automation account with a managed identity. Step 1: Set up permissions. Go to your Azure Automation account. Under Account Settings, click Identity and ensure the "System assigned" toggle is On. Then go to your Fabric capacity resource in the Azure Portal.
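Once the managed identity has permission on the capacity, the runbook can call the Azure Resource Manager REST API to suspend or resume it. A minimal sketch in Python, assuming the Microsoft.Fabric resource provider's suspend/resume actions; the subscription, resource group, and capacity names are placeholders, and the api-version shown is an assumption to verify against the current provider documentation:

```python
# Hedged sketch: pause/resume a Fabric capacity via the ARM REST API.
# All identifiers below are placeholders; API_VERSION is an assumption.
import urllib.request

ARM_BASE = "https://management.azure.com"
API_VERSION = "2023-11-01"  # assumed; check the Microsoft.Fabric provider docs


def capacity_action_url(subscription_id: str, resource_group: str,
                        capacity_name: str, action: str) -> str:
    """Build the ARM URL for a capacity action ('suspend' or 'resume')."""
    if action not in ("suspend", "resume"):
        raise ValueError(f"unsupported action: {action}")
    return (f"{ARM_BASE}/subscriptions/{subscription_id}"
            f"/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Fabric/capacities/{capacity_name}"
            f"/{action}?api-version={API_VERSION}")


def post_capacity_action(url: str, token: str) -> int:
    """POST the action using a bearer token obtained from the managed identity."""
    req = urllib.request.Request(url, data=b"", method="POST")
    req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

In a runbook, the token would come from the Automation account's managed identity; two calls to `capacity_action_url` (one per schedule) give the pause and resume endpoints.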

Fabric Mirrored SQL Server Data Gateway Issue

When creating a mirrored SQL Server database in Fabric from an on-premises SQL database, we need to use the On-Premises Data Gateway. We might encounter an issue with the Use Encrypted Connection option when the server uses a self-signed certificate. To resolve the issue, try the following solutions. Solution 1: Add your SQL Server to the "Trusted Servers" list. There is a "hidden" configuration in the On-premises Data Gateway that allows you to bypass certificate validation for specific servers.
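A sketch of what that hidden configuration may look like. Both the file location (the gateway's Microsoft.PowerBI.DataMovement.Pipeline.GatewayCore.dll.config) and the SqlTrustedServers setting name are assumptions drawn from community guidance, so verify them against your gateway version before editing:

```xml
<!-- Assumed fragment of Microsoft.PowerBI.DataMovement.Pipeline.GatewayCore.dll.config.
     The element shape and the "SqlTrustedServers" name are assumptions; verify first. -->
<setting name="SqlTrustedServers" serializeAs="String">
  <value>MySqlServer01,MySqlServer02</value>
</setting>
```

After editing the configuration file, the gateway service typically needs to be restarted for the change to take effect.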

Fabric KQL Database

In a Fabric KQL database, we can copy the Query URI link and use it (without the https://) as the Server Name in SSMS, then connect with Microsoft Entra MFA authentication and query the data using T-SQL syntax. Alternatively, we can use T-SQL syntax in the Fabric UI query tab by adding the T-SQL comment marker "--" as the first line of the query. That indicates to the engine that the query syntax is T-SQL instead of KQL.
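For example, a query like the following works both from SSMS and in the Fabric UI query tab; in the UI, the leading "--" comment line is what switches the engine to T-SQL (the table and column names here are hypothetical):

```sql
-- T-SQL (this leading comment tells the Fabric query editor to use T-SQL, not KQL)
SELECT TOP 10 *
FROM MyKqlTable              -- hypothetical table name
ORDER BY IngestionTime DESC; -- hypothetical column name
```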

Fabric Study Notes

Load data into a Lakehouse.

%%python
# This creates a managed delta table; the parquet files are managed under the
# Tables folder. When the table is deleted, the associated parquet files are
# auto-deleted as well.
df = spark.read.load(path='Files/Data/sales.csv', format='csv', header=True)
df.write.format('delta').saveAsTable('test')

%%python
# We can create an external delta table; the parquet files are saved under the
# external location specified. When the table is deleted, the associated
# parquet files are not auto-deleted.
df = spark.read.load(path='Files/Data/sales.csv', format='csv', header=True)
# Illustrative completion: the original excerpt is truncated here, and the
# external path below is a placeholder.
df.write.format('delta').saveAsTable('test_external', path='Files/external/test_external')

Fabric References

Some reference articles about Microsoft Fabric. Moving Dataflows Gen 2 to different workspaces