I've recently been playing around with my own hosted ELK stack. When I looked at some of the third party SaaS solutions, I saw that they had plugins that would gather data from, say, Azure, and import it into their hosted stack. Looking through the different Beats available with Elastic.co's offering, I didn't see anything out of the box that would do that for me. It looks like they're trying to go down that path with the "Functionbeat"; however, that currently only has limited support for AWS logs and nothing for Azure.
So I did some research and came up with a way to get some of the Azure logs imported into my ELK stack. For the purposes of this walkthrough, I'm going to assume you already have your own ELK stack set up, that you're not using TLS or authentication for ELK, and that your Elasticsearch endpoint is accessible from Azure.
Our goal here is to get the Azure Activity Log data into the ELK stack. Here's a quick overview of what we're going to do to accomplish that:
- Set up a resource group, storage account, and event hub namespace in Azure
- Create a new Azure Functions project in Visual Studio
- Add code to the function to POST the data to the ELK stack
- Deploy the function to Azure with a publish profile from Visual Studio
- Set up Application Insights so we can monitor our Azure function
- Configure Azure Activity logs to export to an event hub
- View our results!
Let's get started! First, run the following PowerShell script to generate a resource group, storage account, and event hub namespace. You'll need to specify your own values for $subscriptionName and $resourceGroupName. Make sure to save the values that are output at the end:
#******************************************************************************
# Script parameters - Set these to your own values!
#******************************************************************************
$subscriptionName = "Your Subscription Name"
$resourceGroupName = "Your Resource Group Name"
$location = "eastus"
#******************************************************************************
# Script body
# Execution begins here
#******************************************************************************
Write-Host "Importing Azure Modules..."
Import-Module -Name Az
$ErrorActionPreference = "Stop"
Write-Host ("Script Started " + [System.Datetime]::Now.ToString()) -ForegroundColor Green
# Sign in to Azure account
Write-Host "Logging in..."
$currentContext = Get-AzContext
if ($null -eq $currentContext.Subscription) {
    $verboseMessage = Connect-AzAccount
    Write-Verbose $verboseMessage
    # reload context
    $currentContext = Get-AzContext
}
# Select subscription
Write-Host "Selecting subscription '$subscriptionName'"
$verboseMessage = Select-AzSubscription -SubscriptionName $subscriptionName
Write-Verbose $verboseMessage
# Create resource group
Write-Host ("Creating Resource Group '$resourceGroupName' " + [System.Datetime]::Now.ToString()) -ForegroundColor Green
$verboseMessage = New-AzResourceGroup -Name $resourceGroupName -Location $location
Write-Verbose $verboseMessage
# Get initials to prepend our resource names
$userInitials = Read-Host -Prompt 'Enter your initials'
if (!$userInitials) {
    Write-Host 'User initials were not supplied - script is aborting!' -ForegroundColor Red
    throw "Unable to continue - user initials not supplied"
}
$userInitials = $userInitials.ToLower()
# Create storage account
$storageAccountName = "{0}evthubstorage" -f $userInitials
Write-Host ("Creating Storage Account '$storageAccountName' " + [System.Datetime]::Now.ToString()) -ForegroundColor Green
$verboseMessage = New-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName -Location $location -SkuName Standard_LRS
Write-Host $verboseMessage
# Create event hub namespace
$eventHubNamespaceName = "{0}eventhub" -f $userInitials
Write-Host ("Creating Event Hub Namespace '$eventHubNamespaceName' " + [System.Datetime]::Now.ToString()) -ForegroundColor Green
$verboseMessage = New-AzEventHubNamespace -ResourceGroupName $resourceGroupName -Name $eventHubNamespaceName -Location $location -SkuName Standard
Write-Host $verboseMessage
# Get the primary key connection string for our newly created event hub
$key = Get-AzEventHubKey -ResourceGroupName $resourceGroupName -Namespace $eventHubNamespaceName -AuthorizationRuleName "RootManageSharedAccessKey"
Write-Host "Save off the following values for use later:"
Write-Host ("Resource Group: '{0}'" -f $resourceGroupName)
Write-Host ("Storage Account: '{0}'" -f $storageAccountName)
Write-Host ("Event Hub Namespace: '{0}'" -f $eventHubNamespaceName)
Write-Host ("Event Hub Connection String: '{0}'" -f $key.PrimaryConnectionString)
Write-Host ("Script Completed " + [System.Datetime]::Now.ToString()) -ForegroundColor Green
Now that we have our main resources set up in Azure, we can start creating our function code. I'm using Visual Studio 2019. Create a new project and choose the "Azure Functions" template:

For the options, choose the "Event Hub trigger" type, enter the name of the storage account you created earlier, enter "EventHubConnectionString" for the Connection string setting name, and enter "insights-operational-logs" for the event hub name. Note that at this point an event hub with this name does NOT exist in our namespace - but it will after a few more steps.

Once the project is created, you'll need to open up the "local.settings.json" file and add a new item to the "Values" object with a key of "EventHubConnectionString" and a value of the Event Hub connection string that was output by the initial PowerShell script.
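Your file should end up looking something like this - the other values may differ depending on what the template generated for you, and the connection string below is just a placeholder for the one your script printed out:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "EventHubConnectionString": "Endpoint=sb://YOUR-NAMESPACE.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=YOUR-KEY"
  }
}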

At this point, you could leave the code as-is (the 'template' code that VS puts in the Run(...) function) and follow the remaining steps, and it would just output each message to the logger for your function. However, we want to be able to take these messages and POST them into an Elasticsearch endpoint, so we can then view/query them in Kibana.
To do that, we'll use the Elasticsearch.Net low level client, which gives us direct, lower level access to an Elasticsearch endpoint. Remember - I'm not using TLS or any kind of authentication for this example, so if you have those enabled you'll need a bit more configuration than I show here, but those options do appear to be supported by this client.
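For reference, here's a rough (untested) sketch of what that extra configuration might look like - the HTTPS address, username, and password are just placeholders:

// Untested sketch: connecting to a TLS-protected Elasticsearch endpoint with basic auth.
// The address and credentials here are placeholders.
var settings = new ConnectionConfiguration(new Uri("https://your-elasticsearch-host:9200"))
    .BasicAuthentication("your-username", "your-password");

// If you're using a self-signed certificate, you may also need to supply your own
// certificate validation logic via settings.ServerCertificateValidationCallback(...).
var client = new ElasticLowLevelClient(settings);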
We'll install the needed NuGet package in the package manager console with the following command:
Install-Package Elasticsearch.Net
Add in a "using Elasticsearch.Net" statement in your usings section, and replace the entirety of the "Run" method with the following code:
[FunctionName("Function1")]
public static async Task Run([EventHubTrigger("insights-operational-logs", Connection = "EventHubConnectionString")] EventData[] events, ILogger log)
{
    var exceptions = new List<Exception>();
    var elasticsearchIndex = "azureactivitylog";

    // Replace this with the actual address of your Elasticsearch endpoint!
    var elasticsearchAddress = "http://1.2.3.4:9200";

    var settings = new ConnectionConfiguration(new Uri(elasticsearchAddress));
    var client = new ElasticLowLevelClient(settings);

    foreach (EventData eventData in events)
    {
        try
        {
            string messageBody = Encoding.UTF8.GetString(eventData.Body.Array, eventData.Body.Offset, eventData.Body.Count);

            // You probably wouldn't want this log message in a production instance, but we'll keep it here for testing purposes.
            log.LogInformation($"Raw Data From Function: {messageBody}");

            // Index the raw JSON message into Elasticsearch.
            var response = client.Index<StringResponse>(elasticsearchIndex, PostData.String(messageBody));
            if (!response.Success)
            {
                throw response.ApiCall.OriginalException ?? new Exception($"Indexing failed with HTTP status {response.ApiCall.HttpStatusCode}");
            }

            await Task.Yield();
        }
        catch (Exception e)
        {
            // We need to keep processing the rest of the batch - capture this exception and continue.
            // Also, consider capturing details of the message that failed processing so it can be processed again later.
            exceptions.Add(e);
        }
    }

    // Once processing of the batch is complete, if any messages in the batch failed processing,
    // throw an exception so that there is a record of the failure.
    if (exceptions.Count > 1)
        throw new AggregateException(exceptions);

    if (exceptions.Count == 1)
        throw exceptions.Single();
}
All we're really doing here is setting up a connection to our Elasticsearch endpoint near the top of the function, and then making a call to "client.Index(...)" for each event in the event hub. Since the data coming from Azure is already in JSON format, we don't have to do anything special to process it.
What happens now is that every time a message lands on the "insights-operational-logs" event hub in the namespace our connection string points at, this function will be triggered and run the code above.
Now let's go ahead and make a Publish Profile in Visual Studio to publish this as a new function. Right click on your project and select "Publish".
Choose "Azure Functions Consumption Plan", "Create New", and check the "Run from package file" checkbox, then click "Create Profile":

Give your app service a name, then select the appropriate subscription, resource group, location, and storage account. Note that these should match up with the values you used/created via the initial PowerShell script. When ready, click "Create":

Now that we've created our publish profile, we just need to do an actual deployment with it and supply the settings needed in production. Remember how we defined "EventHubConnectionString" in our local.settings.json file? That value is only used when you're running the function locally in debug mode, so we have to provide one for the deployment as well. Click on the "Edit Azure App Service settings" link:

Copy the value you put in the "Local" field for EventHubConnectionString into the Remote field as well, and hit OK.

Click "Publish" back on the main screen and wait for all your resources to publish to Azure. Once completed you should see a new App Service and App Service plan on your resource group, that correspond to the function you just published:

Open up the App Service record and expand "Functions", and you'll see the "Function1" we created. If you go to "Monitor", you can configure Application Insights to enable logging for the function. Click "Configure", and then on the next screen just select "Create new resource" and give a name for the Application Insights resource to be created:

Once that's set up, you can go back to the "Monitor" section and you should now see this:

We're basically all set up now! All that's left is for an event hub to be created in our namespace and start receiving messages, so our function can pick them up.
For this setup, we're going to stream Azure Activity Logs. These are the access/audit logs that Azure maintains to show who added/deleted/edited different resources in the Azure Portal or via the CLI. (For more detailed information, you can check out this link: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/activity-log-export)
To set this up, go to the "Activity Log" in the Azure portal (just search for 'Activity Log'), open it up, and click on the "Diagnostic settings" button. Then, click on the purple banner for the "Legacy experience" to get to this screen:


Select your subscription and whatever regions you want to monitor. Then, check the box for "Export to an event hub". For the service bus namespace, specify the subscription where your event hub namespace lives along with the namespace name itself, and select "RootManageSharedAccessKey" from the policy name drop-down. Click OK, then click Save.
Let's go load up our Event Hub namespace now and look at the Event Hub entities:

Look at that - Azure created a new hub for us in our namespace called "insights-operational-logs"!
Now let's do something to force an audit change. I'm just going to create a storage account, making sure I do it in the subscription/region that I specified when setting up the activity log export. After your resource is successfully created, wait 5 minutes or so (there is a slight lag before the event hub log data shows up in the portal).
Go back to the App Service for your function and load up the "Monitor" section again:

Click on the row there and you can see the actual logged raw data!

Since we're not seeing any error messages here, we can assume our function was able to successfully parse the records and send them to Elasticsearch. Let's go view the Elasticsearch Index Management page in the Kibana portal:

There's our new index! Next we can add an index pattern in Kibana:


And then go to the discover page to view the raw index data:

That's it - you're done! I've never really done anything with Azure Functions or the Event Hub, so this was definitely a fun learning experience. Some other things to eventually focus on for improvements might be:
- Figure out how to communicate with TLS using the Elasticsearch.Net client
- Figure out how to use Basic Auth with the Elasticsearch.Net client to communicate with a protected Elasticsearch endpoint
- Consider parsing the data out a little better in the Azure function and only sending certain elements to Elasticsearch, instead of the entire (giant) JSON message that Azure sends (there's a rough sketch of this idea below the list)
- Look into all the other types of Azure resources that allow streaming of diagnostic data to event hubs so we could consume those in ELK as well (see details on how here: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/diagnostic-settings). The following is a short list of what I found that looks like it would be available to set up in a similar way to what we just did with the Azure Activity Logs:

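As a rough illustration of that parsing idea: the activity log events arrive on the event hub wrapped in a "records" array, so a helper along these lines (using Newtonsoft.Json, which is already available to the function project - add "using Newtonsoft.Json.Linq;") could pull out just the pieces you care about before indexing. The field names here are only examples of what appears in activity log records, so adjust them to whatever you actually want to keep:

// Hypothetical helper: unwrap the "records" envelope that Azure uses for event hub
// exports and return one trimmed-down JSON document per record.
private static IEnumerable<string> ExtractRecords(string messageBody)
{
    var envelope = JObject.Parse(messageBody);
    foreach (var record in envelope["records"] ?? new JArray())
    {
        // Example fields only - keep whichever elements you want in Elasticsearch.
        var trimmed = new JObject
        {
            ["time"] = record["time"],
            ["operationName"] = record["operationName"],
            ["resourceId"] = record["resourceId"],
            ["level"] = record["level"]
        };
        yield return trimmed.ToString();
    }
}

Inside the Run method you would then loop over ExtractRecords(messageBody) and call client.Index(...) once per trimmed record instead of indexing the whole raw message.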
Hopefully you've found this useful - if you have any questions let me know!
Thanks,
Justin