references/recipes/azd/errors.md
# AZD Errors

## Deployment Runtime Errors

These errors occur **during** `azd up` execution:

| Error | Cause | Resolution |
|-------|-------|------------|
| `unknown flag: --location` | `azd up` doesn't accept `--location` | Use `azd env set AZURE_LOCATION <region>` before `azd up` |
| Provision failed | Bicep template errors | Check the detailed error in the output |
| Deploy failed | Build or Docker errors | Check build logs |
| Package failed | Missing Dockerfile or deps | Verify the Dockerfile exists and dependencies are declared |
| Quota exceeded | Subscription limits | Request an increase or change region |
| `PrincipalId '...' has type 'ServicePrincipal', which is different from specified PrincipalType 'User'` | Base template RBAC assigns roles with `principalType: 'User'` but the deploying identity is a service principal (CI/CD) | Set `allowUserIdentityPrincipal: false` in the `storageEndpointConfig` variable in `infra/main.bicep`. Do NOT try clearing `AZURE_PRINCIPAL_ID` — azd repopulates it. See [Principal Type Mismatch](#principal-type-mismatch). |
| `ImagePullBackOff` or `azd up` hangs during provision for Container Apps | Container App references an image that doesn't exist in ACR yet | See [Container Apps Bootstrap Problem](#container-apps-bootstrap-problem) |
| `unauthorized: authentication required` on `docker push` to ACR | ACR auth token expired or scoped incorrectly | See [ACR Authentication Failures](#acr-authentication-failures) |
| `could not determine container registry endpoint` | Missing `AZURE_CONTAINER_REGISTRY_ENDPOINT` | See [Missing Container Registry Variables](#missing-container-registry-variables) |
| `map has no entry for key "AZURE_CONTAINER_REGISTRY_MANAGED_IDENTITY_ID"` | Missing managed identity env vars | See [Missing Container Registry Variables](#missing-container-registry-variables) |
| `map has no entry for key "MANAGED_IDENTITY_CLIENT_ID"` | Missing managed identity client ID | See [Missing Container Registry Variables](#missing-container-registry-variables) |
| `Operation expired` / revision creation timeout (900s) | RBAC propagation delay — the Container App's managed identity doesn't have `AcrPull` on ACR yet | See [Container App Revision Timeout](#container-app-revision-timeout) |
| `found '2' resources tagged with 'azd-service-name: <name>'` | A previous deployment left duplicate-tagged resources in the same resource group | **Preferred**: Create a fresh env with `azd env new <new-name> --no-prompt`, set subscription/location, redeploy. **Alternative**: Delete the conflicting resources (requires `ask_user`). |
| Literal `{{ .Env.* }}` in Terraform errors | azd does not interpolate template variables in `.tfvars.json` | See [Unresolved Terraform Template Variables](#unresolved-terraform-template-variables) |

> ℹ️ **Pre-flight validation**: Run `azure-validate` before deployment to catch configuration errors early. See [Pre-Deploy Checklist](../../pre-deploy-checklist.md).

## Container App Revision Timeout

**Symptom:** `azd up` provisions infrastructure successfully but the Container App revision creation times out after ~900 seconds. The Container App enters a `Failed` provisioning state with no active revision. The `azd` output shows `Operation expired` or `The operation did not complete within the permitted time`.

**Cause:** Azure RBAC propagation delay. When `azd up` runs both `azd provision` and `azd deploy` in a single step:

1. Bicep creates the Container App with a system-assigned managed identity and a public placeholder image
2. Bicep creates an `AcrPull` role assignment for that identity on ACR in a separate module using the two-phase deployment pattern
3. `azd deploy` immediately pushes the real image and creates a new Container App revision
4. The revision tries to pull the image from ACR, but the `AcrPull` role assignment hasn't propagated yet (this can take 1–5 minutes)
5. The image pull fails repeatedly until the 900-second timeout is reached

**Solution:**

1. **Verify the Container App state:**

```bash
az containerapp show --name <app-name> --resource-group <resource-group> \
  --query "{provisioningState:properties.provisioningState, latestRevision:properties.latestRevisionName}" -o json
```

2. **Confirm the AcrPull role exists (it may have propagated by now):**

```bash
PRINCIPAL_ID=$(az containerapp identity show --name <app-name> --resource-group <resource-group> --query principalId -o tsv)
az role assignment list --scope $(az acr show --name <acr-name> --resource-group <resource-group> --query id -o tsv) \
  --assignee-object-id "$PRINCIPAL_ID" --query "[].roleDefinitionName" -o tsv
```

**PowerShell:**

```powershell
$PrincipalId = az containerapp identity show --name <app-name> --resource-group <resource-group> --query principalId -o tsv
$AcrScope = az acr show --name <acr-name> --resource-group <resource-group> --query id -o tsv
az role assignment list --scope $AcrScope --assignee-object-id $PrincipalId --query "[].roleDefinitionName" -o tsv
```

3. **If AcrPull is missing, assign it:**

```bash
az role assignment create \
  --assignee-object-id "$PRINCIPAL_ID" \
  --assignee-principal-type ServicePrincipal \
  --role AcrPull \
  --scope $(az acr show --name <acr-name> --resource-group <resource-group> --query id -o tsv)
```

**PowerShell:**

```powershell
$AcrScope = az acr show --name <acr-name> --resource-group <resource-group> --query id -o tsv
az role assignment create `
  --assignee-object-id $PrincipalId `
  --assignee-principal-type ServicePrincipal `
  --role AcrPull `
  --scope $AcrScope
```

4. **Wait for propagation, then redeploy:**

```bash
azd env set AZURE_CONTAINER_REGISTRY_ENDPOINT $(az acr show --name <acr-name> --resource-group <resource-group> --query loginServer -o tsv)

for attempt in 1 2 3 4 5; do
  echo "Waiting for RBAC propagation (attempt $attempt/5)..."
  sleep 60

  if azd deploy --no-prompt; then
    echo "Deployment succeeded after RBAC propagation."
    break
  fi

  if [ "$attempt" -eq 5 ]; then
    echo "Deployment still failing after 5 minutes. Re-check the AcrPull assignment and Container App revision status."
    exit 1
  fi
done
```

**PowerShell:**

```powershell
azd env set AZURE_CONTAINER_REGISTRY_ENDPOINT (az acr show --name <acr-name> --resource-group <resource-group> --query loginServer -o tsv)

for ($attempt = 1; $attempt -le 5; $attempt++) {
    Write-Output "Waiting for RBAC propagation (attempt $attempt/5)..."
    Start-Sleep -Seconds 60

    azd deploy --no-prompt
    if ($LASTEXITCODE -eq 0) {
        Write-Output "Deployment succeeded after RBAC propagation."
        break
    }

    if ($attempt -eq 5) {
        Write-Output "Deployment still failing after 5 minutes. Re-check the AcrPull assignment and Container App revision status."
        exit 1
    }
}
```

> 💡 **Prevention:** To avoid this in future deployments, ensure the Bicep template includes the `AcrPull` role assignment with `principalType: 'ServicePrincipal'`, and consider running `azd provision` and `azd deploy` as separate steps instead of `azd up` to allow RBAC propagation time between infrastructure creation and app deployment.

## Container Apps Bootstrap Problem

**Symptom:** `azd up` hangs or fails during provisioning with `ImagePullBackOff`, or the Container App cannot start because the referenced image doesn't exist in ACR yet.

**Cause:** The Bicep template creates the Container App referencing an ACR image, but that image doesn't exist until `azd deploy` builds and pushes it. This chicken-and-egg problem blocks provisioning.

**Solution — use two-phase deployment:**

```bash
# Phase 1: Provision infrastructure (Container App uses placeholder image)
azd provision --no-prompt

# Phase 2: Build, push, and update Container App with real image
azd deploy --no-prompt
```

> ⚠️ This requires the Bicep template to use a placeholder image parameter (e.g., `mcr.microsoft.com/azuredocs/containerapps-helloworld:latest`) so provisioning succeeds without the app image. If the Bicep hardcodes the ACR image reference, update it to accept a `containerImageName` parameter with a placeholder default before provisioning.

> ⚠️ Do **NOT** repeatedly poll a hanging `azd up` — if there is no provisioning progress or you continue to see `ImagePullBackOff` events for several minutes during a Container Apps deployment, stop it and switch to the two-phase approach above.

## ACR Authentication Failures

**Symptom:** `docker push` fails with `unauthorized: authentication required` even after `az acr login` succeeds.

**Solution — try these methods in order:**

```bash
# Method 1: AAD-based login (preferred)
az acr login --name <acr-name>
docker push <acr-name>.azurecr.io/<image>:<tag>

# Method 2: Admin credentials (fallback)
ACR_USER=$(az acr credential show --name <acr-name> --query username -o tsv)
ACR_PASS=$(az acr credential show --name <acr-name> --query "passwords[0].value" -o tsv)
docker login <acr-name>.azurecr.io -u "$ACR_USER" -p "$ACR_PASS"
docker push <acr-name>.azurecr.io/<image>:<tag>
```

**PowerShell (Method 2):**

```powershell
$AcrUser = az acr credential show --name <acr-name> --query username -o tsv
$AcrPass = az acr credential show --name <acr-name> --query "passwords[0].value" -o tsv
docker login <acr-name>.azurecr.io -u $AcrUser -p $AcrPass
docker push <acr-name>.azurecr.io/<image>:<tag>
```

> 💡 **Tip:** Prefer `azd deploy` over manual `docker push` — azd handles ACR authentication automatically.

## Missing Container Registry Variables

**Symptom:** Errors during `azd deploy` about missing container registry or managed identity environment variables:

```
ERROR: could not determine container registry endpoint, ensure 'registry' has been set in the docker options or 'AZURE_CONTAINER_REGISTRY_ENDPOINT' environment variable has been set
```

Or:

```
ERROR: failed executing template file: template: manifest template:6:14: executing "manifest template" at <.Env.AZURE_CONTAINER_REGISTRY_MANAGED_IDENTITY_ID>: map has no entry for key "AZURE_CONTAINER_REGISTRY_MANAGED_IDENTITY_ID"
```

Or:

```
ERROR: failed executing template file: template: manifest template:39:26: executing "manifest template" at <.Env.MANAGED_IDENTITY_CLIENT_ID>: map has no entry for key "MANAGED_IDENTITY_CLIENT_ID"
```

**Cause:** This typically occurs with .NET Aspire projects using azd "limited mode" (in-memory infrastructure generation without an explicit `infra/` folder). The `azd provision` command creates the Azure Container Registry and Managed Identity resources but doesn't automatically populate the environment variables that `azd deploy` needs to reference them.

> ⚠️ **Prevention is better:** For .NET Aspire projects, address this PROACTIVELY by setting the environment variables after `azd init` but before `azd up`. This avoids the deployment failure entirely.

**Solution:**

After `azd provision` succeeds, manually set the missing environment variables by querying the provisioned resources:

```bash
# Get the resource group name (typically rg-{environment-name})
azd env get-values

# Set container registry endpoint
azd env set AZURE_CONTAINER_REGISTRY_ENDPOINT $(az acr list --resource-group <resource-group-name> --query "[0].loginServer" -o tsv)

# Set managed identity resource ID
azd env set AZURE_CONTAINER_REGISTRY_MANAGED_IDENTITY_ID $(az identity list --resource-group <resource-group-name> --query "[0].id" -o tsv)

# Set managed identity client ID
azd env set MANAGED_IDENTITY_CLIENT_ID $(az identity list --resource-group <resource-group-name> --query "[0].clientId" -o tsv)
```

**PowerShell:**

```powershell
# Set container registry endpoint
azd env set AZURE_CONTAINER_REGISTRY_ENDPOINT (az acr list --resource-group <resource-group-name> --query "[0].loginServer" -o tsv)

# Set managed identity resource ID
azd env set AZURE_CONTAINER_REGISTRY_MANAGED_IDENTITY_ID (az identity list --resource-group <resource-group-name> --query "[0].id" -o tsv)

# Set managed identity client ID
azd env set MANAGED_IDENTITY_CLIENT_ID (az identity list --resource-group <resource-group-name> --query "[0].clientId" -o tsv)
```

After setting these variables, retry the deployment:

```bash
azd deploy --no-prompt
```

> 💡 **Tip:** This issue is specific to Aspire limited mode. Manually setting these environment variables after `azd provision` is the recommended workaround.

## Unresolved Terraform Template Variables

**Symptom:** Terraform receives literal Go-style template strings instead of resolved values during `azd provision`:

```
Error: Invalid value for variable "environment_name"
The value "{{ .Env.AZURE_ENV_NAME }}" is not valid.
```

Or Terraform silently uses the literal string, causing resource naming failures, state conflicts, and cascading errors that lead to deployment timeouts.

**Cause:** azd reads `infra/main.tfvars.json`, substitutes `${VAR}` references using its built-in envsubst, and passes the resolved file to Terraform via `-var-file=`. Go-style `{{ .Env.* }}` variables are only processed in `azure.yaml` and service manifests — they are **NOT** interpolated in `.tfvars.json` or any other Terraform variable file. If `azure-prepare` generated a `main.tfvars.json` with Go-style template expressions, those literal strings are passed straight to Terraform.

**Solution:**

1. **Fix the syntax** in `infra/main.tfvars.json` — replace Go-style `{{ .Env.* }}` with `${VAR}`:

```json
{
  "environment_name": "${AZURE_ENV_NAME}",
  "location": "${AZURE_LOCATION}",
  "subscription_id": "${AZURE_SUBSCRIPTION_ID}"
}
```

2. **Or use `TF_VAR_*` environment variables** if you don't have `main.tfvars.json`:

```bash
azd env set TF_VAR_environment_name "$(azd env get-value AZURE_ENV_NAME)"
azd env set TF_VAR_location "$(azd env get-value AZURE_LOCATION)"
azd env set TF_VAR_subscription_id "$(azd env get-value AZURE_SUBSCRIPTION_ID)"
```

3. **Ensure `variables.tf`** declares all required variables:

```hcl
variable "environment_name" { type = string }
variable "location" { type = string }
```

4. **Re-run the deployment:**

```bash
azd up --no-prompt
```

> ⚠️ **Prevention:** This issue should be caught by `azure-validate` Step 10 (Template Variable Resolution Check) before deployment. If you encounter it, re-run validation after fixing.

## Retry

After fixing the issue:

```bash
azd up --no-prompt
```

## Principal Type Mismatch

**Symptom:** `azd up` fails during provisioning with:

```
PrincipalId '...' has type 'ServicePrincipal', which is different from specified PrincipalType 'User'
```

**Cause:** Many AZD templates (e.g., `functions-quickstart-python-http-azd`) include RBAC role assignments for the deploying user with a hardcoded `principalType: 'User'`, controlled by an `allowUserIdentityPrincipal` flag in `main.bicep`'s `storageEndpointConfig` variable. When deploying from CI/CD with a service principal, `azd` sets `AZURE_PRINCIPAL_ID` to that service principal's object ID, but the Bicep still tries to create a role assignment with `principalType: 'User'`, causing ARM to reject it.

**Solution:**

In `infra/main.bicep`, find the `storageEndpointConfig` variable and set `allowUserIdentityPrincipal` to `false`:

```bicep
var storageEndpointConfig = {
  enableBlob: true
  enableQueue: false
  enableTable: false
  enableFiles: false
  allowUserIdentityPrincipal: false // Set to false for service principal deployments
}
```

> ⚠️ **Do NOT** try to fix this by running `azd env set AZURE_PRINCIPAL_ID ""`. The `azd` CLI repopulates this value from the current auth context, so clearing it has no effect.

## Cleanup (DESTRUCTIVE)

```bash
azd down --force --purge
```

⚠️ Permanently deletes ALL resources including databases and Key Vaults.