John Liu Blog

Every drop counts

Fabric Capacity

When we use paid Fabric Capacity, we might need to auto pause and resume the capacity to save cost. We can use an Azure Automation Runbook to achieve this. We set up an Automation Account with a Managed Identity.

Step 1: Set Up Permissions

Go to your Azure Automation Account. Under Account Settings, click Identity and ensure the “System assigned” toggle is On. Then go to your Fabric Capacity resource in the Azure Portal.
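The same permission step can also be done from the command line. Below is a rough sketch with the Azure CLI: it grants the Automation Account's managed identity the Contributor role scoped to the Fabric capacity only, and then shows the ARM suspend/resume actions the runbook would call. The resource names are placeholders, the az automation command requires the "automation" CLI extension, and the api-version is an assumption, so check the Microsoft.Fabric REST reference for the current one.

# Placeholders: substitute your subscription, resource group, capacity and Automation Account names.
CAPACITY_ID="/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.Fabric/capacities/<CAPACITY_NAME>"

# Object ID of the Automation Account's system-assigned managed identity
# (requires the "automation" CLI extension; you can also copy it from the Identity blade).
MI_OBJECT_ID=$(az automation account show \
  --resource-group <RESOURCE_GROUP> \
  --name <AUTOMATION_ACCOUNT> \
  --query identity.principalId -o tsv)

# Grant the managed identity Contributor scoped to the Fabric capacity only.
az role assignment create \
  --assignee-object-id "$MI_OBJECT_ID" \
  --assignee-principal-type ServicePrincipal \
  --role Contributor \
  --scope "$CAPACITY_ID"

# The runbook can then pause/resume the capacity via the ARM actions
# (the api-version below is an assumption; verify against the current Microsoft.Fabric API).
az rest --method post --url "https://management.azure.com${CAPACITY_ID}/suspend?api-version=2023-11-01"
az rest --method post --url "https://management.azure.com${CAPACITY_ID}/resume?api-version=2023-11-01"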

Grant user access to Azure MSDN subscription

For old MSA-originated MSDN / Visual Studio subscriptions, we might not be able to grant other users access to the subscription through the portal. The subscription is linked to your personal account, and a portal-side RBAC picker defect causes this issue. To grant other users created in your directory access to the subscription, use the following Azure CLI command (for example, in Cloud Shell) to assign the Contributor role:

az role assignment create \
  --assignee-object-id <USER_OBJECT_ID> \
  --assignee-principal-type User \
  --role Contributor \
  --scope /subscriptions/<SUBSCRIPTION_ID>

Substitute <USER_OBJECT_ID> with the Object ID of the user in Microsoft Entra ID, and <SUBSCRIPTION_ID> with the ID of the subscription in question.
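If you do not have the user's Object ID at hand, you can look it up with the Azure CLI as well. A small sketch (the UPN is a placeholder; on recent CLI versions the property is id, while very old versions exposed it as objectId):

az ad user show --id someone@yourtenant.onmicrosoft.com --query id --output tsv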

Data API Builder notes

Docker Desktop

We can deploy Data API Builder (DAB) using Docker Compose in Docker Desktop. An example compose file:

version: '3'
services:
  DAB-AutoGen:
    image: "mcr.microsoft.com/azure-databases/data-api-builder:latest"
    container_name: DAB-AutoGen
    ports:
      - "5002:5000"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - c:\DataAPIBuilder\Samples\:/App/DAB-Configs
    command: ["--ConfigFileName", "/App/DAB-Configs/dab-config-AutoGen.json"]
  DAB-AutoGen2:
    image: "mcr.microsoft.com/azure-databases/data-api-builder:latest"
    container_name: DAB-AutoGen2
    ports:
      - "5003:5000"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - c:\DataAPIBuilder\Samples\dab-config-AutoGen.json:/App/dab-config.json

Azure Container Instance

We can also deploy the container in Azure Container Instances (ACI). The DAB configuration files need to be stored in an Azure File Share.
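For the ACI route, a rough sketch with the Azure CLI is below. All names are placeholders; it assumes the config file already exists locally, uploads it to a file share, and mounts the share into the container. Depending on where the DAB image expects its config (by default /App/dab-config.json), you may also need to override the container start command (for example with --command-line) to point at the mounted file; check the DAB documentation for the exact start command.

# Placeholder names throughout.
az storage share create \
  --account-name <STORAGE_ACCOUNT> \
  --account-key <STORAGE_KEY> \
  --name dab-configs

az storage file upload \
  --account-name <STORAGE_ACCOUNT> \
  --account-key <STORAGE_KEY> \
  --share-name dab-configs \
  --source ./dab-config-AutoGen.json

az container create \
  --resource-group <RESOURCE_GROUP> \
  --name dab-autogen \
  --image mcr.microsoft.com/azure-databases/data-api-builder:latest \
  --ports 5000 \
  --dns-name-label <UNIQUE_DNS_LABEL> \
  --azure-file-volume-account-name <STORAGE_ACCOUNT> \
  --azure-file-volume-account-key <STORAGE_KEY> \
  --azure-file-volume-share-name dab-configs \
  --azure-file-volume-mount-path /App/DAB-Configs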

Azure Cosmos DB study notes

TTL (time-to-live):
- max 2,147,483,647 seconds (~68 years)
- For TTL to work at either container or item level, it needs to be enabled/configured at the container level first.
- If TTL is only configured at the item level and the container default TTL is not configured, it will be ignored; it only takes effect once TTL is enabled/configured at the container level.
- max configurable TTL is ?

Provisioned throughput vs. serverless:
- Provisioned throughput is ideal for predictable traffic patterns that require sustained and predictable performance with minimal variance.
- Serverless can handle wildly varying traffic and low average-to-peak traffic ratios.
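A small sketch with the Azure CLI (account, resource group, database, and container names are placeholders): --ttl sets the container-level default TTL, which is the prerequisite for any item-level TTL; item-level expiry is then set per document through a ttl property in the item body, for which there is no CLI flag.

# Create a container whose items expire 1 hour after their last write by default.
az cosmosdb sql container create \
  --account-name <COSMOS_ACCOUNT> \
  --resource-group <RESOURCE_GROUP> \
  --database-name <DATABASE> \
  --name <CONTAINER> \
  --partition-key-path "/id" \
  --ttl 3600

# Set the default to -1: TTL is enabled at the container level but items do not
# expire by default; only items carrying their own "ttl" property will expire.
az cosmosdb sql container update \
  --account-name <COSMOS_ACCOUNT> \
  --resource-group <RESOURCE_GROUP> \
  --database-name <DATABASE> \
  --name <CONTAINER> \
  --ttl=-1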

Azure SQL DB call external API

With the introduction of sp_invoke_external_rest_endpoint in Azure SQL DB, it’s possible to directly call the Azure OpenAI service from within the database. Azure OpenAI is among the safe-listed Azure services. For other API services (like the public OpenAI API) that are not in the safe list, you will need to create a wrapper API in an Azure API Management instance. To invoke an external REST endpoint from Azure SQL DB, a few setup steps are required:
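Assuming that setup is in place, a call can look roughly like the sketch below, here wrapped in sqlcmd so it can be run from a shell. The server, database, login, Azure OpenAI resource, deployment name, API version, and key are all placeholders; the stored procedure parameters (@url, @method, @headers, @payload, @response) follow the sp_invoke_external_rest_endpoint documentation.

# All names and secrets below are placeholders; the same T-SQL can be run from any query editor.
sqlcmd -S <SERVER>.database.windows.net -d <DATABASE> -U <USER> -P <PASSWORD> -Q "
DECLARE @response nvarchar(max);
EXEC sp_invoke_external_rest_endpoint
    @url      = N'https://<OPENAI_RESOURCE>.openai.azure.com/openai/deployments/<DEPLOYMENT>/chat/completions?api-version=<API_VERSION>',
    @method   = N'POST',
    @headers  = N'{\"api-key\":\"<AZURE_OPENAI_KEY>\"}',
    @payload  = N'{\"messages\":[{\"role\":\"user\",\"content\":\"Say hello\"}]}',
    @response = @response OUTPUT;
SELECT @response;
"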