Installation

**Install with CLI (recommended):**

```
gh skills-hub install azure-cloud-migrate
```

Don't have the extension? Run `gh extension install samueltauil/skills-hub` first.

**Or download and extract manually:**

Extract the ZIP to `.github/skills/` in your repository:

```
.github/skills/azure-cloud-migrate/
```

The folder name must match `azure-cloud-migrate` for Copilot to auto-discover it.

Skill Files (12)

SKILL.md 1.9 KB
---
name: azure-cloud-migrate
description: "Assess and migrate cross-cloud workloads to Azure with migration reports and code conversion guidance. Supports AWS, GCP, and other providers. WHEN: migrate Lambda to Azure Functions, migrate AWS to Azure, Lambda migration assessment, convert AWS serverless to Azure, migration readiness report, migrate from AWS, migrate from GCP, cross-cloud migration."
license: MIT
metadata:
  author: Microsoft
  version: "1.0.1"
---

# Azure Cloud Migrate

> This skill handles **assessment and code migration** of existing cloud workloads to Azure.

## Rules

1. Follow phases sequentially — do not skip
2. Generate assessment before any code migration
3. Load the scenario reference and follow its rules
4. Use `mcp_azure_mcp_get_bestpractices` and `mcp_azure_mcp_documentation` MCP tools
5. Use the latest supported runtime for the target service
6. Destructive actions require `ask_user` — [global-rules](references/services/functions/global-rules.md)

## Migration Scenarios

| Source | Target | Reference |
|--------|--------|-----------|
| AWS Lambda | Azure Functions | [lambda-to-functions.md](references/services/functions/lambda-to-functions.md) |

> No matching scenario? Use `mcp_azure_mcp_documentation` and `mcp_azure_mcp_get_bestpractices` tools.

## Output Directory

All output goes to `<source-folder>-azure/` at workspace root. Never modify the source directory.

## Steps

1. **Create** `<source-folder>-azure/` at workspace root
2. **Assess** — Analyze source, map services, generate report → [assessment.md](references/services/functions/assessment.md)
3. **Migrate** — Convert code using target programming model → [code-migration.md](references/services/functions/code-migration.md)
4. **Ask User** — "Migration complete. Test locally or deploy to Azure?"
5. **Hand off** to azure-prepare for infrastructure, testing, and deployment

Track progress in `migration-status.md` — see [workflow-details.md](references/workflow-details.md).
references/services/functions/
assessment.md 6.3 KB
# Assessment Phase

Generate a migration assessment report before any code changes.

## Prerequisites

- Workspace contains AWS Lambda functions, SAM templates, or CloudFormation templates
- Prompt user to upload relevant files if not present

## Assessment Steps

1. **Identify Functions** — List all Lambda functions with runtimes, triggers, and dependencies
2. **Map AWS Services** — Map AWS services to Azure equivalents (see [lambda-to-functions.md](lambda-to-functions.md))
3. **Map Properties** — Map Lambda properties to Azure Functions properties
4. **Check Dependencies** — List 3rd-party libraries and verify Azure compatibility
5. **Analyze Code** — Check language support and runtime differences
6. **Map Triggers** — Identify equivalent Azure Functions triggers
7. **Map Deployment** — Identify equivalent Azure deployment strategies (CLI, Bicep, azd)
8. **Review CI/CD** — Check pipeline compatibility with Azure DevOps or GitHub Actions
9. **Map Monitoring** — Map CloudWatch → Application Insights / Azure Monitor
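The inventory in step 1 can be partially automated when a SAM template is present. A minimal sketch (assumes two-space YAML indentation as in standard `template.yaml` files; this is a heuristic line scan, not a YAML parser):

```javascript
// Heuristic SAM template scan: list Lambda functions with their runtimes.
// Assumes resource logical IDs are indented two spaces under Resources:.
function listLambdaFunctions(templateYaml) {
  const functions = [];
  let current = null;
  for (const line of templateYaml.split('\n')) {
    const resource = line.match(/^  (\w+):\s*$/); // resource logical ID
    if (resource) {
      current = { name: resource[1] };
      continue;
    }
    if (current && /Type:\s*AWS::Serverless::Function/.test(line)) {
      functions.push(current); // only serverless functions are inventoried
      current = null;
    }
    const runtime = line.match(/Runtime:\s*(\S+)/);
    if (runtime && functions.length) {
      functions[functions.length - 1].runtime = runtime[1];
    }
  }
  return functions;
}
```

The output feeds directly into the Functions Inventory table of the assessment report.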

## Code Preview

During assessment, show a **sneak peek** of the migrated Azure Functions code for each function. Use bindings and triggers (not SDKs) in all previews, following Azure Functions best practices, and **always use the newest generally available (GA) language runtime** listed in the Azure Functions supported-languages documentation (for example, the latest supported Node.js LTS or the newest Python GA version). The preview helps the user understand the migration scope before committing to code migration.

> āš ļø **Binding-first rule**: Code previews MUST use `input.storageBlob()`, `output.storageBlob()`, `app.storageQueue()`, etc. instead of `BlobServiceClient`, `QueueClient`, or other SDK clients. Only use SDK for services that have no binding equivalent.

## Architecture Diagrams

Generate two diagrams:
1. **Current State** — AWS Lambda architecture with triggers and integrations
2. **Target State** — Azure Functions architecture showing equivalent structure

## Assessment Report Format

> āš ļø **MANDATORY**: Use these exact section headings in every assessment report. Do NOT rename, reorder, or omit sections.

The report MUST be saved as `migration-assessment-report.md` inside the output directory (`<aws-folder>-azure/`).

```markdown
# Migration Assessment Report

## 1. Executive Summary

| Property | Value |
|----------|-------|
| **Total Functions** | <count> |
| **Source Platform** | AWS Lambda |
| **Source Runtime** | <runtime and version> |
| **Target Platform** | Azure Functions |
| **Target Runtime** | <runtime and version> |
| **Migration Readiness** | <High / Medium / Low> |
| **Estimated Effort** | <Low / Medium / High> |
| **Assessment Date** | <date> |

## 2. Functions Inventory

| # | Function Name | Runtime | Trigger Type | Memory (MB) | Timeout (s) | Description |
|---|---------------|---------|--------------|-------------|-------------|-------------|
| 1 | | | | | | |

## 3. Service Mapping

| AWS Service | Azure Equivalent | Migration Complexity | Notes |
|-------------|------------------|----------------------|-------|
| Lambda | Azure Functions | | |
| API Gateway | Azure Functions HTTP Trigger / APIM | | |
| S3 | Azure Blob Storage | | |
| DynamoDB | Cosmos DB | | |
| SQS | Service Bus / Storage Queue | | |
| SNS | Event Grid | | |
| CloudWatch | Application Insights / Azure Monitor | | |
| IAM Roles | Managed Identity + RBAC | | |
| CloudFormation / SAM | Bicep / ARM Templates | | |

## 4. Trigger & Binding Mapping

| # | Function Name | AWS Trigger | Azure Trigger | AWS Inputs/Outputs | Azure Bindings | Notes |
|---|---------------|-------------|---------------|--------------------|----------------|-------|
| 1 | | | | | | |

## 5. Dependencies Analysis

| # | Package/Library | Version | AWS-Specific? | Azure Equivalent | Compatible? | Notes |
|---|----------------|---------|---------------|------------------|-------------|-------|
| 1 | | | | | | |

## 6. Environment Variables & Configuration

| # | AWS Variable | Purpose | Azure Equivalent | Auth Method | Notes |
|---|-------------|---------|------------------|-------------|-------|
| 1 | | | | Managed Identity / App Setting | |

## 7. Architecture Diagrams

### 7a. Current State (AWS)

<!-- Mermaid or ASCII diagram of AWS Lambda architecture -->

### 7b. Target State (Azure)

<!-- Mermaid or ASCII diagram of Azure Functions architecture -->

## 8. IAM & Security Mapping

| AWS IAM Role/Policy | Azure RBAC Role | Scope | Notes |
|---------------------|-----------------|-------|-------|
| | | | |

## 9. Monitoring & Observability Mapping

| AWS Service | Azure Equivalent | Migration Notes |
|-------------|------------------|-----------------|
| CloudWatch Logs | Application Insights | |
| CloudWatch Metrics | Azure Monitor Metrics | |
| CloudWatch Alarms | Azure Monitor Alerts | |
| X-Ray | Application Insights (distributed tracing) | |

## 10. CI/CD & Deployment Mapping

| AWS Tool | Azure Equivalent | Notes |
|----------|------------------|-------|
| SAM CLI | Azure Functions Core Tools / azd | |
| CloudFormation | Bicep / ARM Templates | |
| CodePipeline | Azure DevOps Pipelines / GitHub Actions | |
| CodeBuild | Azure DevOps Build / GitHub Actions | |

## 11. Project Structure Comparison

| AWS Lambda Structure | Azure Functions Structure |
|---------------------|--------------------------|
| `template.yaml` (SAM) | `host.json` |
| `handler.js / handler.py` | `src/app.js` / `src/function_app.py` |
| `requirements.txt` / `package.json` | `requirements.txt` / `package.json` |
| Per-function directories (optional) | Single entry point (v4 JS / v2 Python) |
| `event` object | Trigger-specific parameter |
| `context` object | `InvocationContext` |

## 12. Recommendations

1. **Runtime**: <recommended Azure Functions runtime and version>
2. **Hosting Plan**: <Flex Consumption / Premium>
3. **IaC Strategy**: <Bicep with azd / Terraform / ARM>
4. **Auth Strategy**: <Managed Identity for all service-to-service>
5. **Monitoring**: <Application Insights + Azure Monitor>

## 13. Next Steps

- [ ] Review and approve this assessment report
- [ ] Proceed to code migration (azure-cloud-migrate Phase 2)
- [ ] Hand off to azure-prepare for IaC generation
```

> šŸ’” **Tip:** Use `mcp_azure_mcp_get_bestpractices` tool to learn Azure Functions project structure best practices for the comparison.
code-migration.md 5.9 KB
# Code Migration Phase

Migrate AWS Lambda function code to Azure Functions.

## Prerequisites

- Assessment report completed
- Azure Functions extension installed in VS Code
- Best practices loaded via `mcp_azure_mcp_get_bestpractices` tool

## Rules

- If runtime is Python or Node.js: **do NOT create function.json files**
- If runtime is .NET (in-process or isolated) or Java: **do NOT hand-author function.json** — bindings metadata is generated from attributes/annotations at build time
- Use extension bundle version `[4.*, 5.0.0)` in host.json
- Use latest programming model (v4 for JavaScript, v2 for Python)
- **Always use bindings and triggers instead of SDKs** — For blob read/write, use `input.storageBlob()` / `output.storageBlob()` with `extraInputs`/`extraOutputs`. For queues, use `app.storageQueue()` or `app.serviceBusQueue()`. Only use SDK when there is no equivalent binding (e.g., Azure AI, custom HTTP calls)
- **Always use the latest supported language runtime** — Consult [supported languages](https://learn.microsoft.com/en-us/azure/azure-functions/supported-languages) and select the newest GA version. Do NOT default to an older LTS version when a newer version is available on Azure Functions.

## Steps

1. **Install Azure Functions Extension** — Ensure VS Code extension is installed
2. **Load Best Practices** — Use `mcp_azure_mcp_get_bestpractices` tool for code generation guidance
3. **Create Project Structure** — Set up the Azure Functions project inside the output directory (`<aws-folder>-azure/`). Do NOT create files inside the original AWS directory
4. **Migrate Functions** — Convert each Lambda function to Azure Functions equivalent
5. **Update Dependencies** — Replace AWS SDKs with Azure SDKs in package.json / requirements.txt
6. **Configure Bindings** — Set up triggers and bindings inline (v4 JS / v2 Python)
7. **Configure Environment** — Map Lambda env vars to Azure Functions app settings
8. **Add Error Handling** — Ensure proper error handling in all functions
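Step 7 can be sketched as a pure mapping. A minimal example (the variable names and the identity-based substitutions are illustrative, not a fixed contract; key-based settings are flagged for replacement with managed identity):

```javascript
// Map Lambda environment variables to Azure Functions app settings.
// Illustrative renames only — real mappings come from the assessment report.
const RENAMES = {
  BUCKET_NAME: 'SOURCE_CONTAINER_NAME',
  TABLE_NAME: 'COSMOS_CONTAINER_NAME',
  AWS_REGION: null, // no Azure equivalent needed; dropped
};

function mapEnvVars(lambdaEnv) {
  const appSettings = {};
  const flagged = [];
  for (const [key, value] of Object.entries(lambdaEnv)) {
    if (/CONNECTION_STRING|API_KEY|SECRET/.test(key)) {
      flagged.push(key); // replace with managed identity, not an app setting
      continue;
    }
    const mapped = key in RENAMES ? RENAMES[key] : key;
    if (mapped) appSettings[mapped] = value;
  }
  return { appSettings, flagged };
}
```

Flagged entries should be resolved via managed identity per the global rules rather than carried over as settings.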

## Key Configuration Files

### host.json

```json
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[4.*, 5.0.0)"
  },
  "extensions": {
    "queues": {
      "maxPollingInterval": "00:00:02",
      "visibilityTimeout": "00:00:30",
      "batchSize": 1,
      "maxDequeueCount": 5
    }
  },
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      }
    }
  }
}
```
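Alongside `host.json`, local development typically needs a `local.settings.json`. A minimal sketch (the storage value assumes the Azurite local emulator; adjust for identity-based connections):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "node"
  }
}
```

This file is for local runs only — do not commit or deploy it.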

## Critical Infrastructure Dependencies

### Blob Trigger with EventGrid Source — Additional Requirements

When migrating S3 event triggers to Azure blob triggers with `source: 'EventGrid'`, the following infrastructure must be configured **at the IaC level** (not code level). Failure to set these up results in silent trigger failures.

| Requirement | Why | Consequence of Missing |
|------------|-----|----------------------|
| **Queue endpoint** (`AzureWebJobsStorage__queueServiceUri`) | Blob extension uses queues internally for poison-message tracking with EventGrid source | Function fails to index: "Unable to find matching constructor...QueueServiceClient" |
| **Always-ready instances** (Flex Consumption only) | Blob trigger group must be running to register the Event Grid webhook | Trigger group never starts → webhook never registered → events never delivered |
| **Event Grid subscription via Bicep/ARM** | CLI-based webhook validation handshake times out on Flex Consumption | Use `listKeys()` in Bicep to obtain the `blobs_extension` system key at deployment time |
| **Storage Queue Data Contributor** RBAC | Identity-based queue access for poison messages | 403 errors during blob trigger indexing |

See [lambda-to-functions.md](lambda-to-functions.md#flex-consumption--blob-trigger-with-eventgrid-source) for Bicep patterns.

### UAMI Credential Pattern

When using User Assigned Managed Identity (UAMI), `DefaultAzureCredential()` without arguments tries System Assigned first and fails. Always pass the client ID:

```javascript
const credential = new DefaultAzureCredential({
  managedIdentityClientId: process.env.AZURE_CLIENT_ID
});
```

Add `AZURE_CLIENT_ID` as an app setting in Bicep pointing to the UAMI client ID.

### azd init Workaround for Non-Empty Directories

`azd init --template <template>` refuses to run in a non-empty directory. Use a temp-directory approach:

1. `azd init --template <template>` in an empty temp directory
2. Copy IaC files (`infra/`, `azure.yaml`, etc.) into the project root
3. Clean up the temp directory

### package.json (JavaScript)

```json
{
  "dependencies": {
    "@azure/functions": "^4.0.0",
    "@azure/identity": "<latest>"
  },
  "devDependencies": {
    "@azure/functions-core-tools": "^4",
    "jest": "<latest>"
  }
}
```

## Runtime-Specific Trigger & Binding Patterns

Load the appropriate runtime reference for the target language:

| Runtime | Reference |
|---------|----------|
| JavaScript (Node.js v4) | [runtimes/javascript.md](runtimes/javascript.md) |
| TypeScript (v4) | [runtimes/typescript.md](runtimes/typescript.md) |
| Python (v2) | [runtimes/python.md](runtimes/python.md) |
| C# (Isolated Worker) | [runtimes/csharp.md](runtimes/csharp.md) |
| Java | [runtimes/java.md](runtimes/java.md) |
| PowerShell | [runtimes/powershell.md](runtimes/powershell.md) |

## Scenario-Specific Guidance

See [lambda-to-functions.md](lambda-to-functions.md) for detailed trigger mapping, code patterns, and examples.

## Handoff to azure-prepare

After code migration is complete:

1. Update `migration-status.md` — mark Code Migration as āœ… Complete
2. Invoke **azure-prepare** — pass the assessment report context so it can:
   - Use the service mapping as requirements input (skips manual gather-requirements)
   - Generate IaC (Bicep/Terraform) for the mapped Azure services
   - Create `azure.yaml` and `.azure/preparation-manifest.md`
   - Apply security hardening
3. azure-prepare will then chain to **azure-validate** → **azure-deploy**
global-rules.md 3.3 KB
# Global Rules

These rules apply to ALL phases of the migration skill.

## Destructive Action Policy

ā›” **NEVER** perform destructive actions without explicit user confirmation via `ask_user`:
- Deleting files or directories
- Overwriting existing code
- Deploying to production environments
- Modifying existing Azure resources
- Removing AWS resources

## User Confirmation Required

Always use `ask_user` before:
- Selecting Azure subscription
- Selecting Azure region/location
- Deploying infrastructure
- Making breaking changes to existing code

## Best Practices

- Always use `mcp_azure_mcp_get_bestpractices` tool before generating Azure code
- Prefer managed identity over connection strings
- **Always use the latest supported language runtime** — check [supported languages](https://learn.microsoft.com/en-us/azure/azure-functions/supported-languages) for the newest GA version. Never default to older versions
- **Always prefer bindings over SDKs** — use `input.storageBlob()`, `output.storageBlob()`, `app.storageQueue()`, etc. instead of `BlobServiceClient`, `QueueClient`, or other SDK clients. Only use SDK when no binding exists for the service
- Follow Azure naming conventions
- Use Flex Consumption hosting plan for new Functions

## Identity-First Authentication (Zero API Keys)

> Enterprise subscriptions commonly enforce policies that block local auth. Always design for identity-based access from the start.

- **Storage accounts**: Set `allowSharedKeyAccess: false`. Use identity-based connections with `AzureWebJobsStorage__credential`, `__clientId`, and service-specific URIs (`__blobServiceUri`, `__queueServiceUri`, etc.)
- **Cognitive Services**: Set `disableLocalAuth: true`. Use UAMI + RBAC role (e.g., Cognitive Services User) instead of API keys
- **Application Insights**: Set `disableLocalAuth: true`. Use `APPLICATIONINSIGHTS_AUTHENTICATION_STRING` with `ClientId=<uamiClientId>;Authorization=AAD`
- **DefaultAzureCredential with UAMI**: When using User Assigned Managed Identity, always pass `managedIdentityClientId` explicitly:
  ```javascript
  const credential = new DefaultAzureCredential({
    managedIdentityClientId: process.env.AZURE_CLIENT_ID
  });
  ```
  Without this, `DefaultAzureCredential` tries SystemAssigned first and fails. Add `AZURE_CLIENT_ID` as an app setting mapped to the UAMI client ID.

## Flex Consumption Specifics

- **Always-ready for non-HTTP triggers**: Blob trigger groups on Flex Consumption require `alwaysReady: [{ name: "blob", instanceCount: 1 }]` to bootstrap the trigger listener. Without it, the trigger group never starts and Event Grid subscriptions are never auto-created (chicken-and-egg problem)
- **Blob trigger with EventGrid source requires queue endpoint**: The blob extension internally uses queues for poison-message tracking. Must include `AzureWebJobsStorage__queueServiceUri` even when using blob trigger (not queue trigger)
- **Event Grid subscriptions via Bicep/ARM only**: Do NOT create Event Grid event subscriptions via CLI — webhook validation fails on Flex Consumption with "response code Unknown". Deploy as Bicep resources using `listKeys()` to resolve the `blobs_extension` system key at deployment time
- **azd init on non-empty directories**: `azd init --template` refuses non-empty directories. Use temp directory approach: init in temp, copy template infrastructure files back
lambda-to-functions.md 10.2 KB
# AWS Lambda to Azure Functions Migration

Detailed guidance for migrating AWS Lambda functions to Azure Functions.

## Overview

| AWS Service | Azure Equivalent |
|-------------|------------------|
| Lambda | Azure Functions |
| API Gateway | Azure Functions HTTP Trigger / API Management |
| S3 | Azure Blob Storage |
| S3 Event | Azure Blob Storage + Event Grid |
| DynamoDB | Cosmos DB |
| SQS | Azure Service Bus / Storage Queue |
| SNS | Azure Event Grid |
| EventBridge | Azure Event Grid |
| CloudWatch | Application Insights / Azure Monitor |
| IAM Roles | Managed Identity + Azure RBAC |
| CloudFormation / SAM | Bicep / ARM Templates |
| Rekognition | Azure AI Computer Vision (Image Analysis) |

## Programming Model Mapping

| AWS Lambda | Azure Functions |
|------------|-----------------|
| `exports.handler` | `app.http()`, `app.storageBlob()`, etc. (v4) |
| `event` object | `request` / `blob` / trigger-specific param |
| `context` object | `context` (InvocationContext) |
| `callback` | Return value |
| `function.json` (v1-v3) | Inline bindings in code (v4 JS, v2 Python) |
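The model mapping above, reduced to handler shape (a sketch: the Lambda side follows the Node.js handler contract; the Azure side shows only the handler function, with its `app.http()` registration elided so the snippet stays self-contained):

```javascript
// AWS Lambda: exported handler, API Gateway event, result via return value.
const lambdaHandler = async (event) => ({
  statusCode: 200,
  body: JSON.stringify({ msg: `Hello ${event.queryStringParameters.name}` }),
});
exports.handler = lambdaHandler;

// Azure Functions v4 equivalent: a plain function registered with app.http();
// the request exposes query as a URLSearchParams-like collection.
async function helloHandler(request, context) {
  return {
    status: 200,
    jsonBody: { msg: `Hello ${request.query.get('name')}` },
  };
}
```

Note the return-value contract replaces Lambda's `callback`, and `statusCode`/`body` become `status`/`jsonBody`.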

## Trigger Mapping

| AWS Trigger | Azure Trigger | Notes |
|-------------|---------------|-------|
| API Gateway (REST/HTTP) | `app.http()` | Direct equivalent |
| S3 Event | `app.storageBlob()` | Use `source: 'EventGrid'` for reliability |
| SQS | `app.storageQueue()` or `app.serviceBusQueue()` | Service Bus for advanced scenarios |
| SNS | `app.eventGrid()` | Event Grid is push-based |
| EventBridge | `app.eventGrid()` | Map event patterns to filters |
| CloudWatch Events (Scheduled) | `app.timer()` | NCRONTAB expressions |
| DynamoDB Streams | Cosmos DB Change Feed trigger | Via `app.cosmosDB()` |
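The trigger mapping above can be expressed as a simple lookup for use during assessment (a sketch; the table remains the source of truth, and the keys are illustrative normalized names):

```javascript
// AWS trigger → Azure Functions v4 registration API, per the mapping table.
const TRIGGER_MAP = {
  apigateway: 'app.http',
  s3: 'app.storageBlob',       // use source: 'EventGrid' for reliability
  sqs: 'app.storageQueue',     // or app.serviceBusQueue for advanced scenarios
  sns: 'app.eventGrid',
  eventbridge: 'app.eventGrid',
  schedule: 'app.timer',       // NCRONTAB expressions
  dynamodbstream: 'app.cosmosDB',
};

function azureTriggerFor(awsTrigger) {
  const target = TRIGGER_MAP[awsTrigger.toLowerCase()];
  if (!target) {
    throw new Error(`No scenario mapping for trigger: ${awsTrigger}`);
  }
  return target;
}
```

Unmapped triggers should fall back to the MCP documentation tools rather than a guessed equivalent.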

## Runtime-Specific Migration Patterns

For language-specific migration rules, correct/incorrect patterns, and code examples, see the runtime reference for the target language:

| Runtime | Migration Patterns |
|---------|-------------------|
| JavaScript (Node.js v4) | [runtimes/javascript.md — Lambda Migration Rules](runtimes/javascript.md#lambda-migration-rules) |
| Python (v2) | [runtimes/python.md — Lambda Migration Rules](runtimes/python.md#lambda-migration-rules) |
| TypeScript (v4) | [runtimes/typescript.md](runtimes/typescript.md) |
| C# (Isolated Worker) | [runtimes/csharp.md](runtimes/csharp.md) |
| Java | [runtimes/java.md](runtimes/java.md) |
| PowerShell | [runtimes/powershell.md](runtimes/powershell.md) |

## Project Structure

```
REQUIRED for Azure Functions:
.
ā”œā”€ā”€ host.json                      # Function host configuration (project root)
ā”œā”€ā”€ local.settings.json            # Local development settings (project root)
ā”œā”€ā”€ package.json (or requirements.txt)
ā”œā”€ā”€ src/
│   ā”œā”€ā”€ app.js (or function_app.py)   # Main entry point
│   └── [helper-modules]              # Business logic
└── tests/                         # Test files

āŒ NEVER create:
ā”œā”€ā”€ [functionName]/                # No individual function directories
│   ā”œā”€ā”€ function.json              # No function.json (JS v4, Python v2)
│   └── index.js
```

## Environment Variables

```
āœ… Use managed identity connections for Azure Functions storage:
   AzureWebJobsStorage__blobServiceUri
   AzureWebJobsStorage__queueServiceUri
   AzureWebJobsStorage__tableServiceUri

āœ… Use specific endpoint variables for other services:
   COMPUTER_VISION_ENDPOINT
   STORAGE_ACCOUNT_URL
   SOURCE_CONTAINER_NAME

āŒ Avoid:
   CONNECTION_STRING (use managed identity)
   API_KEY (use managed identity)
```

## Reference Links

- [AWS Lambda vs Azure Functions comparison](https://aka.ms/AWSLambda)
- [AWS to Azure services comparison](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/)
- [Supported language runtimes](https://learn.microsoft.com/en-us/azure/azure-functions/supported-languages)
- [Triggers and bindings overview](https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings)
- [Functions quickstart JS (azd)](https://github.com/Azure-Samples/functions-quickstart-javascript-azd/tree/main/infra)
- [Functions quickstart .NET Event Grid (azd)](https://github.com/Azure-Samples/functions-quickstart-dotnet-azd-eventgrid-blob/tree/main/infra)

## Flex Consumption + Blob Trigger with EventGrid Source

> **āš ļø CRITICAL**: When deploying blob triggers with `source: 'EventGrid'` on Flex Consumption, there are three infrastructure requirements that are NOT automatically handled and will cause silent trigger failures if missed.

### 1. Always-Ready Instances (Bootstrap Problem)

On Flex Consumption, trigger groups only start when there's work to do. But the blob extension needs to be running to create the Event Grid subscription that would deliver work — a chicken-and-egg problem.

**Solution**: Configure `alwaysReady` for the blob trigger group in the function app's `scaleAndConcurrency`:

```bicep
// In api.bicep — functionAppConfig section
scaleAndConcurrency: {
  alwaysReady: [
    {
      name: 'blob'
      instanceCount: 1
    }
  ]
  instanceMemoryMB: 2048
  maximumInstanceCount: 100
}
```

Without this, the trigger group never starts → Event Grid subscription never gets created → no events are delivered → function never triggers.

### 2. Queue Endpoint Required

The blob extension internally uses Storage Queues for poison-message tracking when `source: 'EventGrid'` is configured. Without the queue endpoint, the function fails to index with:

```
Unable to find matching constructor while trying to create an instance of QueueServiceClient.
Expected: serviceUri. Found: credential, clientId, blobServiceUri
```

**Solution**: Always enable the queue endpoint alongside blob when using EventGrid source:

```bicep
// In identity-based storage configuration
AzureWebJobsStorage__blobServiceUri: storageAccount.properties.primaryEndpoints.blob
AzureWebJobsStorage__queueServiceUri: storageAccount.properties.primaryEndpoints.queue  // REQUIRED for EventGrid source
AzureWebJobsStorage__credential: 'managedidentity'
AzureWebJobsStorage__clientId: managedIdentityClientId
```

Also assign **Storage Queue Data Contributor** RBAC role to the UAMI.

### 3. Event Grid Subscription via Bicep (Not CLI)

Do **NOT** create Event Grid event subscriptions via CLI. The `az eventgrid system-topic event-subscription create` command requires a webhook validation handshake that consistently fails on Flex Consumption with "response code Unknown" (timeout during cold start).

**Solution**: Deploy the Event Grid system topic and event subscription as Bicep resources. ARM handles the webhook validation internally and reliably:

```bicep
// eventGrid.bicep
resource systemTopic 'Microsoft.EventGrid/systemTopics@2024-06-01-preview' = {
  name: 'evgt-${storageAccountName}'
  location: location
  properties: {
    source: storageAccount.id
    topicType: 'Microsoft.Storage.StorageAccounts'
  }
}

resource eventSubscription 'Microsoft.EventGrid/systemTopics/eventSubscriptions@2024-06-01-preview' = {
  parent: systemTopic
  name: 'blob-trigger-sub'
  properties: {
    destination: {
      endpointType: 'WebHook'
      properties: {
        // ARM resolves system key and handles validation at deployment time
        endpointUrl: 'https://${functionApp.properties.defaultHostName}/runtime/webhooks/blobs?functionName=${functionName}&code=${listKeys('${functionApp.id}/host/default', '2023-12-01').systemKeys.blobs_extension}'
      }
    }
    filter: {
      includedEventTypes: [ 'Microsoft.Storage.BlobCreated' ]
      subjectBeginsWith: '/blobServices/default/containers/${sourceContainerName}/'
    }
  }
}
```

**RBAC requirement**: Assign **EventGrid EventSubscription Contributor** role to the UAMI.

## User Assigned Managed Identity (UAMI) Auth Patterns

### DefaultAzureCredential with UAMI

When using UAMI, `DefaultAzureCredential()` without arguments tries SystemAssigned first and fails. Always pass the client ID:

```javascript
const { DefaultAzureCredential } = require('@azure/identity');

const credential = new DefaultAzureCredential({
  managedIdentityClientId: process.env.AZURE_CLIENT_ID
});
```

Add `AZURE_CLIENT_ID` as an app setting pointing to the UAMI client ID:

```bicep
appSettings: {
  AZURE_CLIENT_ID: managedIdentity.outputs.clientId
}
```

### Identity-Linked Storage Connection

```bicep
AzureWebJobsStorage__credential: 'managedidentity'
AzureWebJobsStorage__clientId: managedIdentityClientId
AzureWebJobsStorage__blobServiceUri: storageAccount.properties.primaryEndpoints.blob
AzureWebJobsStorage__queueServiceUri: storageAccount.properties.primaryEndpoints.queue
```

### Required RBAC Roles for Face Blur Pattern

| Role | Scope | Purpose |
|------|-------|---------|
| Storage Blob Data Owner | Storage Account | Read source blobs, write destination blobs |
| Storage Queue Data Contributor | Storage Account | Poison-message queue for blob extension |
| EventGrid EventSubscription Contributor | Resource Group | Create/manage Event Grid subscriptions |
| Cognitive Services User | Cognitive Services Account | Call Computer Vision API |
| Monitoring Metrics Publisher | Application Insights | Emit telemetry |

## AWS Rekognition → Azure AI Computer Vision

| AWS | Azure |
|-----|-------|
| `@aws-sdk/client-rekognition` | `@azure-rest/ai-vision-image-analysis` |
| `DetectFaces` | Image Analysis `People` feature |
| `FaceDetails[].BoundingBox` (relative 0-1) | `peopleResult.values[].boundingBox` (pixel coordinates) |

> **āš ļø Package version**: `@azure-rest/ai-vision-image-analysis` is still in beta. The `^1.0.0` semver does NOT resolve. Pin explicitly: `"@azure-rest/ai-vision-image-analysis": "1.0.0-beta.3"`

### Coordinate Conversion

AWS Rekognition returns relative coordinates (0-1). Azure AI returns pixel coordinates. Convert for consistent face processing:

```javascript
const sharp = require('sharp');

// `imageBuffer` holds the source image; `result` is the Image Analysis response
const metadata = await sharp(imageBuffer).metadata();

const faces = result.body.peopleResult.values.map(person => ({
  BoundingBox: {
    Width: person.boundingBox.width / metadata.width,
    Height: person.boundingBox.height / metadata.height,
    Left: person.boundingBox.x / metadata.width,
    Top: person.boundingBox.y / metadata.height
  }
}));
```

### Auth: Use UAMI (No API Keys)

```bicep
// computerVision.bicep
resource computerVision 'Microsoft.CognitiveServices/accounts@2024-10-01' = {
  kind: 'ComputerVision'
  properties: {
    disableLocalAuth: true  // Enterprise policy compliance — no API keys
  }
}
```
references/services/functions/runtimes/
csharp.md 5.5 KB
# C# — Azure Functions Isolated Worker Model Triggers & Bindings

> **Model**: .NET isolated worker model (recommended). Uses attributes on methods/parameters.
> Import: `using Microsoft.Azure.Functions.Worker;`

## HTTP Trigger

```csharp
[Function("HttpFunction")]
public static HttpResponseData Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestData req,
    FunctionContext context)
{
    var response = req.CreateResponse(HttpStatusCode.OK);
    response.WriteString("Hello!");
    return response;
}
```

## Blob Storage

```csharp
// Trigger (EventGrid source)
[Function("BlobTrigger")]
public static void Run(
    [BlobTrigger("samples-workitems/{name}", Source = BlobTriggerSource.EventGrid,
     Connection = "AzureWebJobsStorage")] string blobContent,
    string name, FunctionContext context)
{
    context.GetLogger("BlobTrigger").LogInformation($"Blob: {name}");
}

// Input
[BlobInput("input/{name}", Connection = "AzureWebJobsStorage")] string inputBlob

// Output
[BlobOutput("output/{name}", Connection = "AzureWebJobsStorage")] out string outputBlob
```

## Queue Storage

```csharp
// Trigger
[Function("QueueTrigger")]
public static void Run(
    [QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] string message,
    FunctionContext context)
{
    context.GetLogger("Queue").LogInformation($"Message: {message}");
}

// Output (via return type)
[Function("QueueOutput")]
[QueueOutput("outqueue", Connection = "AzureWebJobsStorage")]
public static string Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequestData req)
{
    return "queue message";
}
```

## Timer

```csharp
[Function("TimerFunction")]
public static void Run(
    [TimerTrigger("0 */5 * * * *")] TimerInfo timer,
    FunctionContext context)
{
    context.GetLogger("Timer").LogInformation($"Last: {timer.ScheduleStatus?.Last}");
}
```

## Event Grid

```csharp
// Trigger
[Function("EventGridTrigger")]
public static void Run(
    [EventGridTrigger] EventGridEvent eventGridEvent,
    FunctionContext context)
{
    context.GetLogger("EG").LogInformation($"Event: {eventGridEvent.Subject}");
}

// Output
[EventGridOutput(TopicEndpointUri = "MyTopicUri", TopicKeySetting = "MyTopicKey")]
```

## Cosmos DB

```csharp
// Trigger (Change Feed)
[Function("CosmosDBTrigger")]
public static void Run(
    [CosmosDBTrigger("mydb", "mycontainer",
     Connection = "CosmosDBConnection",
     CreateLeaseContainerIfNotExists = true)] IReadOnlyList<MyDocument> documents,
    FunctionContext context)
{
    foreach (var doc in documents)
        context.GetLogger("Cosmos").LogInformation($"Doc: {doc.Id}");
}

// Input
[CosmosDBInput("mydb", "mycontainer", Connection = "CosmosDBConnection",
 Id = "{id}", PartitionKey = "{partitionKey}")] MyDocument doc

// Output
[CosmosDBOutput("mydb", "mycontainer", Connection = "CosmosDBConnection")]
```

## Service Bus

```csharp
// Queue Trigger
[Function("SBQueueTrigger")]
public static void Run(
    [ServiceBusTrigger("myqueue", Connection = "ServiceBusConnection")] string message,
    FunctionContext context)
{
    context.GetLogger("SB").LogInformation($"Message: {message}");
}

// Topic Trigger
[Function("SBTopicTrigger")]
public static void Run(
    [ServiceBusTrigger("mytopic", "mysubscription",
     Connection = "ServiceBusConnection")] string message,
    FunctionContext context)
{
    context.GetLogger("SB").LogInformation($"Topic: {message}");
}

// Output
[ServiceBusOutput("outqueue", Connection = "ServiceBusConnection")]
```

## Event Hubs

```csharp
// Trigger
[Function("EventHubTrigger")]
public static void Run(
    [EventHubTrigger("myeventhub", Connection = "EventHubConnection")] EventData[] events,
    FunctionContext context)
{
    foreach (var e in events)
        context.GetLogger("EH").LogInformation($"Event: {e.EventBody}");
}

// Output
[EventHubOutput("outeventhub", Connection = "EventHubConnection")]
```

## Table Storage

```csharp
// Input
[TableInput("mytable", "{partitionKey}", "{rowKey}",
 Connection = "AzureWebJobsStorage")] TableEntity entity

// Output
[TableOutput("mytable", Connection = "AzureWebJobsStorage")]
```

## SQL

```csharp
// Trigger
[Function("SqlTrigger")]
public static void Run(
    [SqlTrigger("dbo.MyTable", "SqlConnection")] IReadOnlyList<SqlChange<MyItem>> changes,
    FunctionContext context)
{
    foreach (var change in changes)
        context.GetLogger("SQL").LogInformation($"Change: {change.Item.Id}");
}

// Input
[SqlInput("SELECT * FROM dbo.MyTable WHERE Id = @Id",
 "SqlConnection", Parameters = "@Id={id}")] IEnumerable<MyItem> items

// Output
[SqlOutput("dbo.MyTable", "SqlConnection")]
```

## SignalR

```csharp
// Output
[SignalROutput(HubName = "myhub", ConnectionStringSetting = "AzureSignalRConnectionString")]
```

## SendGrid

```csharp
[SendGridOutput(ApiKey = "SendGridApiKey", From = "noreply@example.com")]
```

## Multiple Outputs

```csharp
// Use a custom return type for multiple outputs
public class MultiOutput
{
    [QueueOutput("outqueue", Connection = "AzureWebJobsStorage")]
    public string QueueMessage { get; set; }

    public HttpResponseData HttpResponse { get; set; }
}

[Function("MultiOutput")]
public static MultiOutput Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequestData req)
{
    return new MultiOutput
    {
        QueueMessage = "new item",
        HttpResponse = req.CreateResponse(HttpStatusCode.OK)
    };
}
```

> Full reference: [Azure Functions C# isolated worker guide](https://learn.microsoft.com/en-us/azure/azure-functions/dotnet-isolated-process-guide)
java.md 6.4 KB
# Java — Azure Functions Triggers & Bindings

> **Model**: Java annotation-based model. Uses `@FunctionName` and trigger/binding annotations.
> Import: `com.microsoft.azure.functions.*` and `com.microsoft.azure.functions.annotation.*`

## HTTP Trigger

```java
@FunctionName("HttpFunction")
public HttpResponseMessage run(
    @HttpTrigger(name = "req", methods = {HttpMethod.GET, HttpMethod.POST},
                 authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Optional<String>> request,
    final ExecutionContext context) {
    String name = request.getQueryParameters().get("name");
    return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + name).build();
}
```

## Blob Storage

```java
// Trigger
@FunctionName("BlobTrigger")
public void run(
    @BlobTrigger(name = "blob", path = "samples-workitems/{name}",
                 connection = "AzureWebJobsStorage") byte[] content,
    @BindingName("name") String name,
    final ExecutionContext context) {
    context.getLogger().info("Blob: " + name + ", Size: " + content.length);
}

// Input
@BlobInput(name = "inputBlob", path = "input/{name}", connection = "AzureWebJobsStorage")

// Output
@BlobOutput(name = "outputBlob", path = "output/{name}-out", connection = "AzureWebJobsStorage")
```

## Queue Storage

```java
// Trigger
@FunctionName("QueueTrigger")
public void run(
    @QueueTrigger(name = "message", queueName = "myqueue-items",
                  connection = "AzureWebJobsStorage") String message,
    final ExecutionContext context) {
    context.getLogger().info("Queue message: " + message);
}

// Output
@QueueOutput(name = "output", queueName = "outqueue", connection = "AzureWebJobsStorage")
```

## Timer

```java
@FunctionName("TimerFunction")
public void run(
    @TimerTrigger(name = "timer", schedule = "0 */5 * * * *") String timerInfo,
    final ExecutionContext context) {
    context.getLogger().info("Timer triggered");
}
```

## Event Grid

```java
// Trigger
@FunctionName("EventGridTrigger")
public void run(
    @EventGridTrigger(name = "event") EventGridEvent event,
    final ExecutionContext context) {
    context.getLogger().info("Event: " + event.getSubject());
}

// Output
@EventGridOutput(name = "output", topicEndpointUri = "MyTopicUri",
                 topicKeySetting = "MyTopicKey")
```

## Cosmos DB

```java
// Trigger (Change Feed)
@FunctionName("CosmosDBTrigger")
public void run(
    @CosmosDBTrigger(name = "documents", databaseName = "mydb",
                     containerName = "mycontainer", connection = "CosmosDBConnection",
                     createLeaseContainerIfNotExists = true) String[] documents,
    final ExecutionContext context) {
    for (String doc : documents) context.getLogger().info("Doc: " + doc);
}

// Input
@CosmosDBInput(name = "doc", databaseName = "mydb", containerName = "mycontainer",
               connection = "CosmosDBConnection", id = "{id}", partitionKey = "{pk}")

// Output
@CosmosDBOutput(name = "newdoc", databaseName = "mydb", containerName = "mycontainer",
                connection = "CosmosDBConnection")
```

## Service Bus

```java
// Queue Trigger
@FunctionName("SBQueueTrigger")
public void run(
    @ServiceBusQueueTrigger(name = "message", queueName = "myqueue",
                            connection = "ServiceBusConnection") String message,
    final ExecutionContext context) {
    context.getLogger().info("Message: " + message);
}

// Topic Trigger
@FunctionName("SBTopicTrigger")
public void run(
    @ServiceBusTopicTrigger(name = "message", topicName = "mytopic",
                            subscriptionName = "mysubscription",
                            connection = "ServiceBusConnection") String message,
    final ExecutionContext context) {
    context.getLogger().info("Topic: " + message);
}

// Output
@ServiceBusQueueOutput(name = "output", queueName = "outqueue",
                       connection = "ServiceBusConnection")
```

## Event Hubs

```java
// Trigger
@FunctionName("EventHubTrigger")
public void run(
    @EventHubTrigger(name = "event", eventHubName = "myeventhub",
                     connection = "EventHubConnection", cardinality = Cardinality.MANY) List<String> events,
    final ExecutionContext context) {
    events.forEach(e -> context.getLogger().info("Event: " + e));
}

// Output
@EventHubOutput(name = "output", eventHubName = "outeventhub",
                connection = "EventHubConnection")
```

## Table Storage

```java
// Input
@TableInput(name = "entity", tableName = "mytable", partitionKey = "{pk}",
            rowKey = "{rk}", connection = "AzureWebJobsStorage")

// Output
@TableOutput(name = "output", tableName = "mytable", connection = "AzureWebJobsStorage")
```

## SQL

```java
// Trigger
@FunctionName("SqlTrigger")
public void run(
    @SqlTrigger(name = "changes", tableName = "dbo.MyTable",
                connectionStringSetting = "SqlConnection") SqlChange[] changes,
    final ExecutionContext context) {
    for (SqlChange change : changes) context.getLogger().info("Change: " + change);
}

// Input
@SqlInput(name = "items", commandText = "SELECT * FROM dbo.MyTable WHERE Id = @Id",
          commandType = "Text", parameters = "@Id={id}",
          connectionStringSetting = "SqlConnection")

// Output
@SqlOutput(name = "output", commandText = "dbo.MyTable",
           connectionStringSetting = "SqlConnection")
```

## SignalR

```java
@SignalROutput(name = "signalr", hubName = "myhub",
               connectionStringSetting = "AzureSignalRConnectionString")
```

## SendGrid

```java
@SendGridOutput(name = "mail", apiKey = "SendGridApiKey",
                from = "noreply@example.com", to = "{email}")
```

## Multiple Outputs

Java uses `OutputBinding<T>` for additional outputs:

```java
@FunctionName("MultiOutput")
public HttpResponseMessage run(
    @HttpTrigger(name = "req", methods = {HttpMethod.POST},
                 authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<String> request,
    @QueueOutput(name = "output", queueName = "outqueue",
                 connection = "AzureWebJobsStorage") OutputBinding<String> queueOutput,
    final ExecutionContext context) {
    queueOutput.setValue("new item");
    return request.createResponseBuilder(HttpStatus.OK).body("Processed").build();
}
```

> **Note**: Java's `function.json` files are generated at build time by the Maven plugin — you don't write them manually.
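
The build-time generation mentioned above is wired up through the Azure Functions Maven plugin in `pom.xml`. A minimal sketch (the `appName`/`resourceGroup` values are placeholders, and the version element is omitted — pin it to the latest released plugin version in a real project):

```xml
<!-- pom.xml sketch: azure-functions-maven-plugin scans @FunctionName
     annotations and emits function.json files during `mvn package` -->
<plugin>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-functions-maven-plugin</artifactId>
  <configuration>
    <appName>my-function-app</appName>       <!-- placeholder -->
    <resourceGroup>my-rg</resourceGroup>     <!-- placeholder -->
  </configuration>
</plugin>
```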

> Full reference: [Azure Functions Java developer guide](https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-java)
javascript.md 8.5 KB
# JavaScript (Node.js) — Azure Functions v4 Triggers & Bindings

> **Model**: JavaScript v4 programming model. **NO** `function.json` files.
> Import: `const { app, input, output } = require('@azure/functions');`

## Lambda Migration Rules

> Shared rules (bindings over SDKs, latest runtime, identity-first auth) → [global-rules.md](../global-rules.md)

JS-specific:
- Use `extraInputs` / `extraOutputs` with binding path expressions (e.g., `{queueTrigger}`) for dynamic blob I/O
- Access metadata via `context.triggerMetadata`
- `package.json`: `"@azure/functions": "^4.0.0"`

### Correct Migration Pattern

```javascript
const { app, input, output } = require('@azure/functions');

// Use bindings for blob I/O instead of BlobServiceClient SDK
const blobInput = input.storageBlob({
  path: 'source-container/{queueTrigger}',
  connection: 'AzureWebJobsStorage'
});

const blobOutput = output.storageBlob({
  path: 'destination-container/{queueTrigger}',
  connection: 'AzureWebJobsStorage'
});

app.storageQueue('processImage', {
  queueName: 'image-processing',
  connection: 'AzureWebJobsStorage',
  extraInputs: [blobInput],
  extraOutputs: [blobOutput],
  handler: async (queueItem, context) => {
    const sourceBlob = context.extraInputs.get(blobInput);
    context.log(`Processing blob: ${queueItem}`);
    // Process the blob...
    context.extraOutputs.set(blobOutput, processedBuffer);
  }
});
```

> āŒ Do NOT use legacy v1-v3 `module.exports` — always use `app.*()` registration.

## HTTP Trigger

```javascript
app.http('httpFunction', {
  methods: ['GET', 'POST'],
  authLevel: 'anonymous',
  handler: async (request, context) => {
    const name = request.query.get('name') || (await request.text());
    return { body: `Hello, ${name}!` };
  }
});
```

## Blob Storage

```javascript
// Trigger (use EventGrid source for reliability)
app.storageBlob('blobTrigger', {
  path: 'samples-workitems/{name}',
  connection: 'AzureWebJobsStorage',
  source: 'EventGrid',
  handler: async (blob, context) => {
    context.log(`Blob: ${context.triggerMetadata.name}, Size: ${blob.length}`);
  }
});

// Input binding
const blobInput = input.storageBlob({
  path: 'samples-workitems/{queueTrigger}',
  connection: 'AzureWebJobsStorage'
});

// Output binding
const blobOutput = output.storageBlob({
  path: 'samples-output/{name}-out',
  connection: 'AzureWebJobsStorage'
});
```

> **āš ļø Flex Consumption + EventGrid Source Requirements:**
> When using `source: 'EventGrid'` on a Flex Consumption plan, three infrastructure requirements MUST be met or the trigger will silently fail:
>
> 1. **Always-ready instances**: Configure `alwaysReady: [{ name: 'blob', instanceCount: 1 }]` in Bicep. Without this, the trigger group never starts and the Event Grid webhook endpoint is never registered.
> 2. **Queue endpoint**: Set `AzureWebJobsStorage__queueServiceUri` in app settings. The blob extension uses queues internally for poison-message tracking with EventGrid source, even though you're not using a queue trigger.
> 3. **Event Grid subscription via Bicep/ARM**: Do NOT create event subscriptions via CLI — webhook validation times out on Flex Consumption. Deploy as a Bicep resource using `listKeys()` to obtain the `blobs_extension` system key.
>
> See [lambda-to-functions.md](../lambda-to-functions.md#flex-consumption--blob-trigger-with-eventgrid-source) for full Bicep patterns.
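
Requirement 1 can be sketched in Bicep roughly as follows (resource name, API version, and the surrounding site configuration are illustrative — this is a fragment, not a deployable template):

```bicep
// Sketch: Flex Consumption site with an always-ready instance for the
// 'blob' trigger group, so the Event Grid webhook endpoint gets registered.
resource functionApp 'Microsoft.Web/sites@2023-12-01' = {
  name: 'my-func-app' // placeholder
  location: resourceGroup().location
  kind: 'functionapp,linux'
  properties: {
    functionAppConfig: {
      scaleAndConcurrency: {
        alwaysReady: [
          { name: 'blob', instanceCount: 1 }
        ]
      }
    }
  }
}
```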

### Using Azure AI Services with UAMI

When calling Azure AI services (Computer Vision, etc.) from a function, use `DefaultAzureCredential` with explicit UAMI client ID:

```javascript
const { DefaultAzureCredential } = require('@azure/identity');
const createClient = require('@azure-rest/ai-vision-image-analysis').default;

const credential = new DefaultAzureCredential({
  managedIdentityClientId: process.env.AZURE_CLIENT_ID  // Required for UAMI
});
const client = createClient(process.env.COMPUTER_VISION_ENDPOINT, credential);

const result = await client.path('/imageanalysis:analyze').post({
  body: { url: blobUrl },
  queryParameters: { features: ['People'] }  // Use 'People' for face detection
});
```

> **Note**: `@azure-rest/ai-vision-image-analysis` is still in beta. Pin it explicitly to `"1.0.0-beta.3"` — the `^1.0.0` semver range does not match prerelease versions, so it will NOT resolve.
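
Putting the pinning rule together with the packages used in these examples, the `package.json` dependencies would look like this (the `@azure/identity` version is illustrative; only the beta pin is mandated above):

```json
{
  "dependencies": {
    "@azure/functions": "^4.0.0",
    "@azure/identity": "^4.0.0",
    "@azure-rest/ai-vision-image-analysis": "1.0.0-beta.3"
  }
}
```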

## Queue Storage

```javascript
// Trigger
app.storageQueue('queueTrigger', {
  queueName: 'myqueue-items',
  connection: 'AzureWebJobsStorage',
  handler: async (queueItem, context) => {
    context.log('Queue item:', queueItem);
  }
});

// Output
const queueOutput = output.storageQueue({
  queueName: 'outqueue',
  connection: 'AzureWebJobsStorage'
});
```

## Timer

```javascript
app.timer('timerFunction', {
  schedule: '0 */5 * * * *', // Every 5 minutes (NCRONTAB)
  handler: async (myTimer, context) => {
    context.log('Timer fired at:', myTimer.scheduleStatus.last);
  }
});
```

## Event Grid

```javascript
// Trigger
app.eventGrid('eventGridTrigger', {
  handler: async (event, context) => {
    context.log('Event:', event.subject, event.eventType);
  }
});

// Output
const eventGridOutput = output.eventGrid({
  topicEndpointUri: 'MyEventGridTopicUriSetting',
  topicKeySetting: 'MyEventGridTopicKeySetting'
});
```

## Cosmos DB

```javascript
// Trigger (Change Feed)
app.cosmosDB('cosmosDBTrigger', {
  connection: 'CosmosDBConnection',
  databaseName: 'mydb',
  containerName: 'mycontainer',
  createLeaseContainerIfNotExists: true,
  handler: async (documents, context) => {
    documents.forEach(doc => context.log('Changed doc:', doc.id));
  }
});

// Input
const cosmosInput = input.cosmosDB({
  connection: 'CosmosDBConnection',
  databaseName: 'mydb',
  containerName: 'mycontainer',
  id: '{id}',
  partitionKey: '{partitionKey}'
});

// Output
const cosmosOutput = output.cosmosDB({
  connection: 'CosmosDBConnection',
  databaseName: 'mydb',
  containerName: 'mycontainer'
});
```

## Service Bus

```javascript
// Queue Trigger
app.serviceBusQueue('sbQueueTrigger', {
  queueName: 'myqueue',
  connection: 'ServiceBusConnection',
  handler: async (message, context) => {
    context.log('Message:', message);
  }
});

// Topic Trigger
app.serviceBusTopic('sbTopicTrigger', {
  topicName: 'mytopic',
  subscriptionName: 'mysubscription',
  connection: 'ServiceBusConnection',
  handler: async (message, context) => {
    context.log('Topic message:', message);
  }
});

// Output
const sbOutput = output.serviceBusQueue({
  queueName: 'outqueue',
  connection: 'ServiceBusConnection'
});
```

## Event Hubs

```javascript
// Trigger
app.eventHub('eventHubTrigger', {
  eventHubName: 'myeventhub',
  connection: 'EventHubConnection',
  cardinality: 'many',
  handler: async (events, context) => {
    events.forEach(event => context.log('Event:', event));
  }
});

// Output
const ehOutput = output.eventHub({
  eventHubName: 'outeventhub',
  connection: 'EventHubConnection'
});
```

## Table Storage

```javascript
// Input
const tableInput = input.table({
  tableName: 'mytable',
  partitionKey: '{partitionKey}',
  rowKey: '{rowKey}',
  connection: 'AzureWebJobsStorage'
});

// Output
const tableOutput = output.table({
  tableName: 'mytable',
  connection: 'AzureWebJobsStorage'
});
```

## SQL

```javascript
// Trigger
app.generic('sqlTrigger', {
  trigger: { type: 'sqlTrigger', tableName: 'dbo.MyTable', connectionStringSetting: 'SqlConnection' },
  handler: async (changes, context) => {
    changes.forEach(change => context.log('Change:', change));
  }
});

// Input
const sqlInput = input.sql({
  commandText: 'SELECT * FROM dbo.MyTable WHERE Id = @Id',
  commandType: 'Text',
  parameters: '@Id={id}',
  connectionStringSetting: 'SqlConnection'
});

// Output
const sqlOutput = output.sql({
  commandText: 'dbo.MyTable',
  connectionStringSetting: 'SqlConnection'
});
```

## SignalR

```javascript
// Output
const signalROutput = output.generic({
  type: 'signalR',
  hubName: 'myhub',
  connectionStringSetting: 'AzureSignalRConnectionString'
});
```

## SendGrid

```javascript
const sendGridOutput = output.generic({
  type: 'sendGrid',
  apiKey: 'SendGridApiKey',
  from: 'noreply@example.com',
  to: '{email}'
});
```

## Using Bindings with Functions

```javascript
// Combine trigger with input/output bindings
app.http('processItem', {
  methods: ['POST'],
  extraInputs: [cosmosInput],
  extraOutputs: [queueOutput],
  handler: async (request, context) => {
    const doc = context.extraInputs.get(cosmosInput);
    context.extraOutputs.set(queueOutput, JSON.stringify(doc));
    return { body: 'Processed' };
  }
});
```

> Full reference: [Azure Functions JavaScript developer guide](https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-node)
powershell.md 4.9 KB
# PowerShell — Azure Functions Triggers & Bindings

> **Model**: PowerShell uses `function.json` for binding definitions + `run.ps1` for handler code.
> Each function lives in its own directory: `<FunctionName>/function.json` + `run.ps1`

## Project Structure

```
src/
├── host.json
├── local.settings.json
├── requirements.psd1
├── profile.ps1
├── HttpFunction/
│   ├── function.json
│   └── run.ps1
└── TimerFunction/
    ├── function.json
    └── run.ps1
```

## HTTP Trigger

**function.json:**
```json
{
  "bindings": [
    { "authLevel": "anonymous", "type": "httpTrigger", "direction": "in",
      "name": "Request", "methods": ["get", "post"] },
    { "type": "http", "direction": "out", "name": "Response" }
  ]
}
```

**run.ps1:**
```powershell
using namespace System.Net

param($Request, $TriggerMetadata)
$name = $Request.Query.Name
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body = "Hello, $name!"
})
```

## Blob Storage

**function.json:**
```json
{
  "bindings": [
    { "name": "InputBlob", "type": "blobTrigger", "direction": "in",
      "path": "samples-workitems/{name}", "connection": "AzureWebJobsStorage" },
    { "name": "OutputBlob", "type": "blob", "direction": "out",
      "path": "output/{name}", "connection": "AzureWebJobsStorage" }
  ]
}
```

**run.ps1:**
```powershell
param($InputBlob, $TriggerMetadata)
Write-Host "Blob: $($TriggerMetadata.Name), Size: $($InputBlob.Length)"
Push-OutputBinding -Name OutputBlob -Value $InputBlob
```

## Queue Storage

**function.json:**
```json
{
  "bindings": [
    { "name": "QueueItem", "type": "queueTrigger", "direction": "in",
      "queueName": "myqueue-items", "connection": "AzureWebJobsStorage" },
    { "name": "OutputQueue", "type": "queue", "direction": "out",
      "queueName": "outqueue", "connection": "AzureWebJobsStorage" }
  ]
}
```

**run.ps1:**
```powershell
param($QueueItem, $TriggerMetadata)
Write-Host "Queue message: $QueueItem"
Push-OutputBinding -Name OutputQueue -Value "Processed: $QueueItem"
```

## Timer

**function.json:**
```json
{
  "bindings": [
    { "name": "Timer", "type": "timerTrigger", "direction": "in",
      "schedule": "0 */5 * * * *" }
  ]
}
```

**run.ps1:**
```powershell
param($Timer)
Write-Host "Timer triggered at: $(Get-Date)"
```

## Event Grid

**function.json:**
```json
{
  "bindings": [
    { "name": "EventGridEvent", "type": "eventGridTrigger", "direction": "in" }
  ]
}
```

**run.ps1:**
```powershell
param($EventGridEvent, $TriggerMetadata)
Write-Host "Event: $($EventGridEvent.subject) - $($EventGridEvent.eventType)"
```

## Cosmos DB

**function.json:**
```json
{
  "bindings": [
    { "name": "Documents", "type": "cosmosDBTrigger", "direction": "in",
      "connectionStringSetting": "CosmosDBConnection", "databaseName": "mydb",
      "containerName": "mycontainer", "createLeaseContainerIfNotExists": true },
    { "name": "OutputDoc", "type": "cosmosDB", "direction": "out",
      "connectionStringSetting": "CosmosDBConnection", "databaseName": "mydb",
      "containerName": "outcontainer" }
  ]
}
```

**run.ps1:**
```powershell
param($Documents, $TriggerMetadata)
foreach ($doc in $Documents) {
    Write-Host "Changed doc: $($doc.id)"
}
```

## Service Bus

**function.json:**
```json
{
  "bindings": [
    { "name": "Message", "type": "serviceBusTrigger", "direction": "in",
      "queueName": "myqueue", "connection": "ServiceBusConnection" },
    { "name": "OutputMessage", "type": "serviceBus", "direction": "out",
      "queueName": "outqueue", "connection": "ServiceBusConnection" }
  ]
}
```

**run.ps1:**
```powershell
param($Message, $TriggerMetadata)
Write-Host "Message: $Message"
Push-OutputBinding -Name OutputMessage -Value "Processed: $Message"
```

## Event Hubs

**function.json:**
```json
{
  "bindings": [
    { "name": "Events", "type": "eventHubTrigger", "direction": "in",
      "eventHubName": "myeventhub", "connection": "EventHubConnection",
      "cardinality": "many" }
  ]
}
```

## Table Storage

**function.json:**
```json
{
  "bindings": [
    { "name": "TableEntity", "type": "table", "direction": "in",
      "tableName": "mytable", "partitionKey": "{pk}", "rowKey": "{rk}",
      "connection": "AzureWebJobsStorage" },
    { "name": "TableOut", "type": "table", "direction": "out",
      "tableName": "mytable", "connection": "AzureWebJobsStorage" }
  ]
}
```

## SQL

**function.json:**
```json
{
  "bindings": [
    { "name": "Changes", "type": "sqlTrigger", "direction": "in",
      "tableName": "dbo.MyTable", "connectionStringSetting": "SqlConnection" }
  ]
}
```

## SendGrid

**function.json:**
```json
{
  "bindings": [
    { "name": "Mail", "type": "sendGrid", "direction": "out",
      "apiKey": "SendGridApiKey", "from": "noreply@example.com" }
  ]
}
```

> **Note**: PowerShell always requires `function.json` — it does not support inline binding definitions.

> Full reference: [Azure Functions PowerShell developer guide](https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-powershell)
python.md 6.4 KB
# Python — Azure Functions v2 Triggers & Bindings

> **Model**: Python v2 programming model. **NO** `function.json` files.
> Entry point: `function_app.py`
> Import: `import azure.functions as func`

## Lambda Migration Rules

> Shared rules (bindings over SDKs, latest runtime, identity-first auth) → [global-rules.md](../global-rules.md)

Python-specific: all functions use the v2 decorator model shown throughout this file. No additional migration rules beyond global.

## HTTP Trigger

```python
import azure.functions as func

app = func.FunctionApp()

@app.route(route="hello", methods=["GET", "POST"], auth_level=func.AuthLevel.ANONYMOUS)
def http_function(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get('name') or req.get_body().decode()
    return func.HttpResponse(f"Hello, {name}!")
```

## Blob Storage

```python
# Trigger (use EventGrid source)
@app.blob_trigger(arg_name="myblob", path="samples-workitems/{name}",
                  connection="AzureWebJobsStorage", source="EventGrid")
def blob_trigger(myblob: func.InputStream):
    logging.info(f"Blob: {myblob.name}, Size: {myblob.length}")

# Input
@app.blob_input(arg_name="inputblob", path="input/{name}", connection="AzureWebJobsStorage")

# Output
@app.blob_output(arg_name="outputblob", path="output/{name}", connection="AzureWebJobsStorage")
```

## Queue Storage

```python
# Trigger
@app.queue_trigger(arg_name="msg", queue_name="myqueue-items",
                   connection="AzureWebJobsStorage")
def queue_trigger(msg: func.QueueMessage):
    logging.info(f"Queue message: {msg.get_body().decode()}")

# Output
@app.queue_output(arg_name="outputmsg", queue_name="outqueue",
                  connection="AzureWebJobsStorage")
```

## Timer

```python
@app.timer_trigger(schedule="0 */5 * * * *", arg_name="mytimer",
                   run_on_startup=False)
def timer_function(mytimer: func.TimerRequest):
    logging.info("Timer triggered")
```

## Event Grid

```python
# Trigger
@app.event_grid_trigger(arg_name="event")
def eventgrid_trigger(event: func.EventGridEvent):
    logging.info(f"Event: {event.subject} - {event.event_type}")

# Output
@app.event_grid_output(arg_name="outputEvent",
                       topic_endpoint_uri="MyEventGridTopicUriSetting",
                       topic_key_setting="MyEventGridTopicKeySetting")
```

## Cosmos DB

```python
# Trigger (Change Feed)
@app.cosmos_db_trigger(arg_name="documents",
                       connection="CosmosDBConnection",
                       database_name="mydb",
                       container_name="mycontainer",
                       create_lease_container_if_not_exists=True)
def cosmosdb_trigger(documents: func.DocumentList):
    for doc in documents:
        logging.info(f"Changed doc: {doc['id']}")

# Input
@app.cosmos_db_input(arg_name="doc", connection="CosmosDBConnection",
                     database_name="mydb", container_name="mycontainer",
                     id="{id}", partition_key="{partitionKey}")

# Output
@app.cosmos_db_output(arg_name="newdoc", connection="CosmosDBConnection",
                      database_name="mydb", container_name="mycontainer")
```

## Service Bus

```python
# Queue Trigger
@app.service_bus_queue_trigger(arg_name="msg", queue_name="myqueue",
                               connection="ServiceBusConnection")
def sb_queue_trigger(msg: func.ServiceBusMessage):
    logging.info(f"Message: {msg.get_body().decode()}")

# Topic Trigger
@app.service_bus_topic_trigger(arg_name="msg", topic_name="mytopic",
                                subscription_name="mysubscription",
                                connection="ServiceBusConnection")
def sb_topic_trigger(msg: func.ServiceBusMessage):
    logging.info(f"Topic message: {msg.get_body().decode()}")

# Output
@app.service_bus_queue_output(arg_name="outmsg", queue_name="outqueue",
                              connection="ServiceBusConnection")
```

## Event Hubs

```python
# Trigger
@app.event_hub_message_trigger(arg_name="event", event_hub_name="myeventhub",
                                connection="EventHubConnection",
                                cardinality="many")
def eventhub_trigger(event: func.EventHubEvent):
    logging.info(f"Event: {event.get_body().decode()}")

# Output
@app.event_hub_output(arg_name="outevent", event_hub_name="outeventhub",
                      connection="EventHubConnection")
```

## Table Storage

```python
# Input
@app.table_input(arg_name="tableEntity", table_name="mytable",
                 partition_key="{partitionKey}", row_key="{rowKey}",
                 connection="AzureWebJobsStorage")

# Output
@app.table_output(arg_name="tableOut", table_name="mytable",
                  connection="AzureWebJobsStorage")
```

## SQL

```python
# Trigger
@app.sql_trigger(arg_name="changes", table_name="dbo.MyTable",
                 connection_string_setting="SqlConnection")
def sql_trigger(changes: func.SqlRowList):
    for change in changes:
        logging.info(f"Change: {change}")

# Input
@app.sql_input(arg_name="items",
               command_text="SELECT * FROM dbo.MyTable WHERE Id = @Id",
               command_type="Text", parameters="@Id={id}",
               connection_string_setting="SqlConnection")

# Output
@app.sql_output(arg_name="newitem", command_text="dbo.MyTable",
                connection_string_setting="SqlConnection")
```

## SignalR

```python
@app.generic_output_binding(arg_name="signalr", type="signalR",
                             hub_name="myhub",
                             connection_string_setting="AzureSignalRConnectionString")
```

## SendGrid

```python
@app.generic_output_binding(arg_name="mail", type="sendGrid",
                             api_key="SendGridApiKey",
                             from_address="noreply@example.com")
```

## Combining Decorators

```python
@app.route(route="process", methods=["POST"])
@app.cosmos_db_input(arg_name="doc", connection="CosmosDBConnection",
                     database_name="mydb", container_name="mycontainer",
                     id="{id}", partition_key="{pk}")
@app.queue_output(arg_name="outmsg", queue_name="outqueue",
                  connection="AzureWebJobsStorage")
def process_item(req: func.HttpRequest, doc: func.DocumentList,
                 outmsg: func.Out[str]) -> func.HttpResponse:
    outmsg.set(doc[0].to_json())
    return func.HttpResponse("Processed")
```

> Full reference: [Azure Functions Python developer guide](https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python)
typescript.md 3.5 KB
# TypeScript — Azure Functions v4 Triggers & Bindings

> **Model**: Node.js v4 programming model (TypeScript). Same as JavaScript v4 with type annotations.
> **NO** `function.json` files.
> Import: `import { app, HttpRequest, HttpResponseInit, InvocationContext, input, output } from '@azure/functions';`

TypeScript uses the same `app.*()` registration as JavaScript. See [javascript.md](javascript.md) for all trigger/binding patterns. Below are TypeScript-specific type signatures.

## HTTP Trigger

```typescript
import { app, HttpRequest, HttpResponseInit, InvocationContext } from '@azure/functions';

app.http('httpFunction', {
  methods: ['GET', 'POST'],
  authLevel: 'anonymous',
  handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    const name = request.query.get('name') || (await request.text());
    return { body: `Hello, ${name}!` };
  }
});
```

## Blob Storage Trigger

```typescript
import { app, InvocationContext } from '@azure/functions';

app.storageBlob('blobTrigger', {
  path: 'samples-workitems/{name}',
  connection: 'AzureWebJobsStorage',
  source: 'EventGrid',
  handler: async (blob: Buffer, context: InvocationContext): Promise<void> => {
    context.log(`Blob: ${context.triggerMetadata.name}, Size: ${blob.length}`);
  }
});
```

## Queue Storage Trigger

```typescript
app.storageQueue('queueTrigger', {
  queueName: 'myqueue-items',
  connection: 'AzureWebJobsStorage',
  handler: async (queueItem: unknown, context: InvocationContext): Promise<void> => {
    context.log('Queue item:', queueItem);
  }
});
```

## Timer Trigger

```typescript
import { app, InvocationContext, Timer } from '@azure/functions';

app.timer('timerFunction', {
  schedule: '0 */5 * * * *',
  handler: async (myTimer: Timer, context: InvocationContext): Promise<void> => {
    context.log('Timer fired at:', myTimer.scheduleStatus?.last);
  }
});
```

## Cosmos DB Trigger

```typescript
app.cosmosDB('cosmosDBTrigger', {
  connection: 'CosmosDBConnection',
  databaseName: 'mydb',
  containerName: 'mycontainer',
  createLeaseContainerIfNotExists: true,
  handler: async (documents: unknown[], context: InvocationContext): Promise<void> => {
    documents.forEach(doc => context.log('Changed doc:', doc));
  }
});
```

## Service Bus Queue Trigger

```typescript
app.serviceBusQueue('sbQueueTrigger', {
  queueName: 'myqueue',
  connection: 'ServiceBusConnection',
  handler: async (message: unknown, context: InvocationContext): Promise<void> => {
    context.log('Message:', message);
  }
});
```

## Event Grid Trigger

```typescript
import { app, EventGridEvent, InvocationContext } from '@azure/functions';

app.eventGrid('eventGridTrigger', {
  handler: async (event: EventGridEvent, context: InvocationContext): Promise<void> => {
    context.log('Event:', event.subject, event.eventType);
  }
});
```

## Event Hubs Trigger

```typescript
import { app, InvocationContext } from '@azure/functions';

app.eventHub('eventHubTrigger', {
  eventHubName: 'myeventhub',
  connection: 'EventHubConnection',
  cardinality: 'many',
  handler: async (events: unknown[], context: InvocationContext): Promise<void> => {
    events.forEach(event => context.log('Event:', event));
  }
});
```

## Input/Output Bindings

TypeScript uses the same `input.*()` and `output.*()` helpers as JavaScript. See [javascript.md](javascript.md) for full binding examples — all patterns are identical with added type annotations.
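As a sketch of the pattern (the queue name and connection setting are illustrative), an HTTP-triggered function that writes to a Storage queue through an extra output binding looks like this in the v4 model:

```typescript
import { app, HttpRequest, HttpResponseInit, InvocationContext, output } from '@azure/functions';

// Declare the output binding once, then reference it from the function registration.
const queueOutput = output.storageQueue({
  queueName: 'outqueue',               // illustrative queue name
  connection: 'AzureWebJobsStorage',
});

app.http('enqueueMessage', {
  methods: ['POST'],
  authLevel: 'function',
  extraOutputs: [queueOutput],
  handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    const body = await request.text();
    context.extraOutputs.set(queueOutput, body); // write the request body to the queue
    return { status: 202, body: 'queued' };
  },
});
```

The same `extraInputs`/`context.extraInputs.get()` shape applies to input bindings.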

> Full reference: [Azure Functions TypeScript developer guide](https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-node?tabs=typescript)
references/workflow-details.md 1.2 KB
# Workflow Details

## Status Tracking

Maintain a `migration-status.md` file in the output directory (`<source-folder>-azure/`):

```markdown
# Migration Status
| Phase | Status | Notes |
|-------|--------|-------|
| Assessment | ⬜ Not Started | |
| Code Migration | ⬜ Not Started | |
```

Update status: ⬜ Not Started → 🔄 In Progress → ✅ Complete → ❌ Failed
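If the migration is driven by a script, the status transition above can be applied mechanically. A minimal sketch (the helper name and the exact table layout are assumptions based on the template above):

```typescript
type Status = '⬜ Not Started' | '🔄 In Progress' | '✅ Complete' | '❌ Failed';

// Replace the Status cell of the row whose Phase cell matches `phase`.
// Assumes the three-column table shape from migration-status.md.
function setPhaseStatus(table: string, phase: string, status: Status): string {
  return table
    .split('\n')
    .map((line) => {
      const cells = line.split('|').map((c) => c.trim());
      if (cells[1] === phase) {
        cells[2] = status;
        return `| ${cells.slice(1, -1).join(' | ')} |`;
      }
      return line; // header, divider, and non-matching rows pass through unchanged
    })
    .join('\n');
}
```

A caller would read `migration-status.md`, apply `setPhaseStatus(content, 'Assessment', '🔄 In Progress')`, and write the result back.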

## Error Handling

| Error | Cause | Remediation |
|-------|-------|-------------|
| Unsupported runtime | Source runtime not available in target Azure service | Check target service's supported languages documentation |
| Missing service mapping | Source service has no direct Azure equivalent | Use closest Azure alternative, document in assessment |
| Code migration failure | Incompatible patterns or dependencies | Review scenario-specific guide in [lambda-to-functions.md](services/functions/lambda-to-functions.md) |
| `azd init` refuses non-empty directory | azd requires clean directory for template init | Use temp directory approach: init in empty dir, copy files back |
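The temp-directory workaround for `azd init` can be scripted from Node. This is a sketch, not a definitive recipe: the template name is a placeholder, and it assumes the `azd` CLI is on `PATH` (without `--no-prompt`-style flags, `azd init` may still ask interactive questions):

```typescript
import { execFileSync } from 'node:child_process';
import { cpSync, mkdtempSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

// 1. Run `azd init` in a fresh, empty temp directory.
const tempDir = mkdtempSync(join(tmpdir(), 'azd-init-'));
execFileSync('azd', ['init', '--template', '<template-name>'], {
  cwd: tempDir,      // placeholder template; pick one matching the workload
  stdio: 'inherit',
});

// 2. Copy the generated scaffolding back into the non-empty project root.
//    force: false keeps any files that already exist there.
cpSync(tempDir, process.cwd(), { recursive: true, force: false });
```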

> For scenario-specific errors (e.g., Azure Functions binding issues, trigger configuration), see the error table in the corresponding scenario reference.

License (MIT)

MIT License

Copyright (c) 2025 Microsoft Corporation.

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.