Logs
EDOT Cloud Forwarder for Azure forwards Azure resource logs to Elasticsearch. This page describes the supported sources, the collection methods on the Azure side, the encoding modes available in ECF, and what the resulting documents look like in Elasticsearch.
EDOT Cloud Forwarder for Azure supports the following log sources:
| Source | Description |
|---|---|
| Activity logs | Subscription-level resource logs that record operations performed on Azure resources, including categories such as Administrative, Security, Service Health, and Resource Health. |
| Resource logs | Per-resource diagnostic logs emitted by Azure services (for example, audit logs from Key Vault or query logs from a database). |
Through Diagnostic Settings, Azure streams logs to the `logs` event hub created by EDOT Cloud Forwarder for Azure. Set up a diagnostic setting on each subscription or resource you want to collect logs from, and point its destination at the `logs` event hub.
- In the Azure portal, open Subscriptions from the search bar.
- Select the Azure subscription from which you want to collect Activity logs.
- Go to Activity log → Export Activity Logs → Add diagnostic setting.
- In Diagnostic setting name, enter a name for the diagnostic setting.
- Select the categories you want to collect (for example, all categories under Logs → Administrative).
- In Destination details, select Stream to an event hub.
- In Event hub namespace, select the namespace created by EDOT Cloud Forwarder for Azure.
- In Event hub name (optional), select the `logs` event hub created by EDOT Cloud Forwarder for Azure.
- Select Save to create the diagnostic setting.
- After a few minutes, the diagnostic setting starts streaming the Activity logs to the `logs` event hub.
For per-resource logs, create a diagnostic setting on the Azure resource itself (for example, a Key Vault or a Storage Account) and point its destination at the same `logs` event hub. The available log categories depend on the resource type.
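Per-resource diagnostic settings can also be created with the Azure CLI instead of the portal. The following is a hedged sketch: the setting name, resource ID, event hub name, authorization rule ID, and log category are placeholders to replace with your own values.

```shell
# Create a diagnostic setting on a resource (placeholders throughout).
# --event-hub-rule is the ID of an authorization rule on the Event Hub
# namespace created by EDOT Cloud Forwarder for Azure.
az monitor diagnostic-settings create \
  --name "ecf-logs" \
  --resource "<resource-id>" \
  --event-hub "<logs-event-hub-name>" \
  --event-hub-rule "<event-hub-authorization-rule-id>" \
  --logs '[{"category": "<category>", "enabled": true}]'
```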
Once the diagnostic setting is active:
- Go to Discover in your Elastic Cloud deployment or Serverless project, and select the `logs-*` data view.
- Filter by `data_stream.dataset` to browse the logs streamed to Elasticsearch.
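For example, the following KQL filter in the Discover search bar narrows the view to Activity logs collected in the default OTel Native mode (`azure.activity.otel` is the dataset shown in the OTel example on this page):

```
data_stream.dataset : "azure.activity.otel"
```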
The encoding mode controls how Azure logs are transformed before being sent to Elasticsearch.
| Mode | Description | Data stream dataset | Prerequisites |
|---|---|---|---|
| OTel Native (default) | Parses logs into OpenTelemetry semantic conventions with structured attributes. | `azure.<service>.otel` | None |
| ECS | Preserves the original Azure log as a string and routes it through the Azure Logs integration for ECS processing. | `azure.events`, routed by category | Azure Logs integration |
The encoding mode is selected at deployment time through the `logsEncodingExtension` Bicep parameter. Refer to Configure EDOT Cloud Forwarder for Azure for details.
ECS mode requires the Azure Logs integration on your Elasticsearch cluster. The integration provides the ingest pipelines that parse the raw event, route it to the correct data stream based on the `category` field, and apply ECS field mappings.
To install the Azure Logs integration:
- In your Elastic Cloud deployment or Serverless project, go to Integrations.
- Search for Azure Logs.
- Select Install.
Without the integration, the documents remain unprocessed in `logs-azure.events-default` and might not be indexed correctly.
When an Azure resource log arrives at the Event Hub, EDOT Cloud Forwarder for Azure decodes and transforms it according to the selected encoding mode before sending it to Elasticsearch.
The following example shows how the same Azure Activity Log appears in Elasticsearch depending on the encoding mode.
This is a simplified Azure Activity Log as it arrives in the Event Hub:
Log sample
```json
{
  "time": "2025-06-10T14:30:00.1234567Z",
  "resourceId": "/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM",
  "operationName": "Microsoft.Compute/virtualMachines/write",
  "category": "Administrative",
  "resultType": "Success",
  "correlationId": "abc-123-def-456",
  "level": "Information",
  "location": "westeurope",
  "callerIpAddress": "203.0.113.50",
  "properties": {
    "message": "Virtual machine updated successfully"
  }
}
```
The log is parsed into OpenTelemetry semantic conventions. Azure-specific fields are mapped to structured attributes, and cloud resource metadata is placed in resource.attributes:
Log parsed to OTel semantic conventions
```json
{
  "@timestamp": "2025-06-10T14:30:00.123Z",
  "attributes": {
    "azure.category": "Administrative",
    "azure.correlation_id": "abc-123-def-456",
    "azure.operation.name": "Microsoft.Compute/virtualMachines/write",
    "azure.result.type": "Success",
    "network.peer.address": "203.0.113.50"
  },
  "body": {
    "text": "Virtual machine updated successfully"
  },
  "data_stream": {
    "dataset": "azure.activity.otel",
    "namespace": "default",
    "type": "logs"
  },
  "resource": {
    "attributes": {
      "cloud.provider": "azure",
      "cloud.region": "westeurope",
      "cloud.resource_id": "/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM"
    }
  },
  "severity_text": "Information"
}
```
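The field mapping above can be sketched in Python. This is illustrative only, using the field names from the documented input and output samples; the forwarder's actual implementation is not shown here.

```python
def to_otel_document(raw: dict) -> dict:
    """Map a raw Azure Activity Log event to the OTel-native document shape."""
    return {
        # Azure timestamps carry 7 fractional digits; the stored document
        # keeps millisecond precision.
        "@timestamp": raw["time"][:23] + "Z",
        "attributes": {
            "azure.category": raw["category"],
            "azure.correlation_id": raw["correlationId"],
            "azure.operation.name": raw["operationName"],
            "azure.result.type": raw["resultType"],
            "network.peer.address": raw["callerIpAddress"],
        },
        "body": {"text": raw["properties"]["message"]},
        "data_stream": {
            "dataset": "azure.activity.otel",
            "namespace": "default",
            "type": "logs",
        },
        "resource": {
            "attributes": {
                "cloud.provider": "azure",
                "cloud.region": raw["location"],
                "cloud.resource_id": raw["resourceId"],
            }
        },
        "severity_text": raw["level"],
    }

sample = {
    "time": "2025-06-10T14:30:00.1234567Z",
    "resourceId": "/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM",
    "operationName": "Microsoft.Compute/virtualMachines/write",
    "category": "Administrative",
    "resultType": "Success",
    "correlationId": "abc-123-def-456",
    "level": "Information",
    "location": "westeurope",
    "callerIpAddress": "203.0.113.50",
    "properties": {"message": "Virtual machine updated successfully"},
}
otel_doc = to_otel_document(sample)
```

Note how no parsing of the payload is deferred to Elasticsearch: the document arrives already structured, which is why OTel Native mode has no integration prerequisite.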
In ECS mode, EDOT Cloud Forwarder for Azure preserves the original Azure log as a string in the `message` field and sends it to the `logs-azure.events-default` data stream. What happens next depends on whether the Azure Logs integration is installed.
The document is stored in `logs-azure.events-default` with the original event as an unparsed string. Without the integration, documents remain unprocessed and might not be indexed correctly or might lack the expected field mappings.
Log sample
```json
{
  "message": "{\"time\":\"2025-06-10T14:30:00.1234567Z\",\"resourceId\":\"/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM\",\"operationName\":\"Microsoft.Compute/virtualMachines/write\",\"category\":\"Administrative\",\"resultType\":\"Success\",\"correlationId\":\"abc-123-def-456\",\"level\":\"Information\",\"location\":\"westeurope\",\"callerIpAddress\":\"203.0.113.50\",\"properties\":{\"message\":\"Virtual machine updated successfully\"}}",
  "data_stream": {
    "dataset": "azure.events",
    "namespace": "default",
    "type": "logs"
  },
  "event": {
    "dataset": "azure.events"
  },
  "input": {
    "type": "azure-eventhub"
  },
  "tags": ["forwarded", "preserve_original_event"]
}
```
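The ECS-mode envelope above can be sketched as follows: the original event is serialized unparsed into `message` and constant routing metadata is attached. This is a minimal illustration of the documented output shape, not the forwarder's actual code.

```python
import json

def to_ecs_envelope(raw: dict) -> dict:
    """Wrap the original Azure event as an unparsed string, ECS-mode style."""
    return {
        # The raw event survives verbatim; all parsing is deferred to the
        # Azure Logs integration's ingest pipelines in Elasticsearch.
        "message": json.dumps(raw, separators=(",", ":")),
        "data_stream": {"dataset": "azure.events", "namespace": "default", "type": "logs"},
        "event": {"dataset": "azure.events"},
        "input": {"type": "azure-eventhub"},
        "tags": ["forwarded", "preserve_original_event"],
    }

envelope = to_ecs_envelope({"category": "Administrative", "level": "Information"})
```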
The integration's ingest pipeline parses the `message` string, extracts the `category` field, and routes the document to the appropriate final data stream. Because this log has `"category": "Administrative"`, it is routed to `logs-azure.activitylogs-default` and processed into ECS fields:
Log sample
```json
{
  "@timestamp": "2025-06-10T14:30:00.123Z",
  "azure": {
    "activitylogs": {
      "category": "Administrative",
      "operation_name": "Microsoft.Compute/virtualMachines/write",
      "result_type": "Success"
    },
    "correlation_id": "abc-123-def-456",
    "resource": {
      "group": "TEST-RG",
      "id": "/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM",
      "provider": "MICROSOFT.COMPUTE"
    },
    "subscription_id": "TEST-SUB-001"
  },
  "client": {
    "ip": "203.0.113.50"
  },
  "cloud": {
    "provider": "azure"
  },
  "data_stream": {
    "dataset": "azure.activitylogs",
    "namespace": "default",
    "type": "logs"
  },
  "event": {
    "action": "Microsoft.Compute/virtualMachines/write",
    "kind": "event"
  },
  "log": {
    "level": "Information"
  },
  "tags": ["preserve_original_event"]
}
```
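The category-based routing described above can be sketched with a hypothetical lookup table. The real mapping lives in the Azure Logs integration's ingest pipelines and covers more categories than the single documented example shown here.

```python
import json

# Hypothetical routing table; only the category from the example above is
# listed. The integration's pipelines define the authoritative mapping.
CATEGORY_TO_DATASET = {
    "Administrative": "azure.activitylogs",
}

def route(envelope: dict) -> str:
    """Return the final data stream name for an ECS-mode envelope."""
    category = json.loads(envelope["message"]).get("category")
    # Events with an unrecognized category stay in the generic stream.
    dataset = CATEGORY_TO_DATASET.get(category, "azure.events")
    namespace = envelope["data_stream"]["namespace"]
    return f"logs-{dataset}-{namespace}"

target = route({
    "message": json.dumps({"category": "Administrative"}),
    "data_stream": {"dataset": "azure.events", "namespace": "default", "type": "logs"},
})
```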
- Logs are streamed exclusively through Diagnostic Settings. Data Collection Rules are not supported for logs.
- ECS mode requires the Azure Logs integration to be installed on the destination Elasticsearch cluster. Without it, documents remain unparsed.
- Failed deliveries are persisted as blobs in the `logs-error-container` storage container. Refer to Troubleshooting.