This repository hosts the Python code of an Azure Function App that runs Log Analytics queries and sends the results to Splunk Observability (formerly SignalFx).
- A Python 3.8 Azure Function App
- A Log Analytics Workspace with resource Diagnostic Settings linked to it
- A Table Storage containing the queries.
- A Function Managed Identity or an Azure Service Principal with at least the `Log Analytics Reader` role on the Log Analytics Workspace, and the `Reader and Data Access` role on the Storage Account if the storage key is not provided
- A Splunk Observability account and its associated ingest token
- `QUERIES_STORAGE_ACCOUNT_NAME` (optional): The name of the Storage Account containing the table with the queries. If not set, the `AzureWebJobsStorage` connection string is used.
- `QUERIES_STORAGE_ACCOUNT_KEY` (optional): The key to access the Storage Account containing the table with the queries; the function tries to fetch it if empty. If not set, the `AzureWebJobsStorage` connection string is used.
- `QUERIES_STORAGE_TABLE_NAME` (optional, defaults to `LogQueries`): The name of the table in the Storage Account containing the queries
- `SFX_TOKEN` (required): The Splunk Observability token used to send metrics
- `SFX_REALM` (optional, defaults to `eu0`): The Splunk realm (region) to use for metric sending
- `LOG_ANALYTICS_WORKSPACE_GUID` (required): ID of the Log Analytics Workspace
- `LOG_LEVEL` (optional, defaults to `INFO`): Logging level
- `SFX_EXTRA_DIMENSIONS` (optional): Extra dimensions to send to Splunk Observability. Example: `env=prod,sfx_monitored=true`
- `AZURE_CLIENT_ID` (optional): Azure Service Principal client ID if Service Principal authentication is used
- `AZURE_TENANT_ID` (optional): Azure Tenant ID if Service Principal authentication is used
- `AZURE_CLIENT_SECRET` (optional): Azure Service Principal secret if Service Principal authentication is used
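As a minimal sketch of how the optional settings above and their defaults could be resolved at startup (the `load_settings` helper and its dictionary keys are illustrative, not part of the app's actual code; the default values are the ones documented above):

```python
import os

def load_settings(env=os.environ):
    """Illustrative resolution of the optional settings with their defaults."""
    return {
        "table_name": env.get("QUERIES_STORAGE_TABLE_NAME", "LogQueries"),
        "realm": env.get("SFX_REALM", "eu0"),
        "log_level": env.get("LOG_LEVEL", "INFO"),
        # "env=prod,sfx_monitored=true" -> {"env": "prod", "sfx_monitored": "true"}
        "extra_dimensions": dict(
            pair.split("=", 1)
            for pair in env.get("SFX_EXTRA_DIMENSIONS", "").split(",")
            if "=" in pair
        ),
    }
```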
Every minute, the function runs all the queries stored in the associated Table Storage against the given Log Analytics Workspace and sends the results to Splunk Observability.
Each query specifies the value of the metric and its associated time. Every other column in the query result is sent as a metric dimension, along with the dimensions defined in the `SFX_EXTRA_DIMENSIONS` variable.
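The row-to-datapoint mapping described above can be sketched as follows. `rows_to_datapoints` is a hypothetical helper (not the app's actual code); the payload shape mirrors the SignalFx datapoint model of metric name, value, millisecond timestamp, and dimensions:

```python
from datetime import datetime, timezone

def rows_to_datapoints(metric_name, rows, extra_dimensions=None):
    """Turn query result rows into SignalFx-style datapoint dicts.

    Each row must carry `metric_value` and `timespan`; every other
    column becomes a dimension, merged with the extra dimensions.
    """
    datapoints = []
    for row in rows:
        dimensions = {k: str(v) for k, v in row.items()
                      if k not in ("metric_value", "timespan")}
        dimensions.update(extra_dimensions or {})
        datapoints.append({
            "metric": metric_name,
            "value": row["metric_value"],
            # SignalFx expects timestamps in milliseconds since the epoch
            "timestamp": int(datetime.fromisoformat(row["timespan"])
                             .replace(tzinfo=timezone.utc)
                             .timestamp() * 1000),
            "dimensions": dimensions,
        })
    return datapoints
```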
The records in the Table Storage must have the following columns:
- `MetricName`: Name of the metric to send to Splunk Observability
- `MetricType`: Type of metric, can be `gauge`, `counter` or `cumulative_counter` (see https://docs.signalfx.com/en/latest/metrics-metadata/metric-types.html)
- `Query`: Query to run on the Log Analytics Workspace (see https://docs.microsoft.com/en-us/azure/azure-monitor/logs/get-started-queries)

The query must contain a `metric_value` column with the metric value and a `timespan` column with the datetime of the metric to send. All other columns are treated as dimensions for the metric.
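For illustration, a record satisfying this schema might look like the following. The `PartitionKey`/`RowKey` values and the KQL query are made-up examples, not shipped with the repository; note how the query aliases its output columns to `metric_value` and `timespan`, leaving `Computer` to become a dimension:

```python
# Illustrative Table Storage entity for the queries table.
example_query_record = {
    "PartitionKey": "queries",       # arbitrary example value
    "RowKey": "vm-heartbeat",        # arbitrary example value
    "MetricName": "vm.heartbeat.count",
    "MetricType": "gauge",
    # Hypothetical KQL: count heartbeats per computer in 1-minute bins.
    "Query": (
        "Heartbeat "
        "| summarize metric_value = count() "
        "by timespan = bin(TimeGenerated, 1m), Computer"
    ),
}
```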
You can use Zip deployment, Azure Functions Core Tools or any other Azure deployment method to deploy this application.