Getting Started with KQL Part 3: Query and Structure Data

In the last post we went through the different KQL operators.

In this post we will be going through querying and structuring data, as well as creating some basic charts.

Log on to Azure, go to the Log Analytics workspace blade and select the workspace.

Click on Logs. The main logs we will be working with in these posts will be storage blob logs, but the same principles can be applied to any logs.

I used the below Learn article as a reference for the different columns:

https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/storagebloblogs

I also used the quick reference for what each operator does:

https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/kql-quick-reference

If we want to filter by a specific column and value, we can use the where operator to return only the matching rows.

The below query gets the requests that use SAS authentication.
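A query along these lines will return them (AuthenticationType is one of the columns listed in the StorageBlobLogs reference above):

StorageBlobLogs
| where AuthenticationType == "SAS"   // keep only SAS-authenticated requests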

If we only want to return certain columns we can use project.
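For example, keeping just a few columns from the same query (the column names are taken from the table reference above):

StorageBlobLogs
| where AuthenticationType == "SAS"
| project TimeGenerated, OperationName, AuthenticationType, Uri   // only these columns are returned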

If we want a count of each authentication type we can use summarize.
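Something like the below should do it:

StorageBlobLogs
| summarize count() by AuthenticationType   // one row per authentication type with its count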

Next we can create a pie chart grouped by a column; I used OperationName.
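A minimal version of that query, summarizing by OperationName and rendering the result:

StorageBlobLogs
| summarize count() by OperationName   // count of each operation
| render piechart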

We can also render the results as other chart types, such as a time chart.
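For example, a sketch that counts operations in hourly buckets over time (the 1 hour bin size is just an example):

StorageBlobLogs
| summarize count() by bin(TimeGenerated, 1h)   // bucket rows into hourly bins
| render timechart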

In the next post we will go through creating email reports and action groups.

Getting Started with KQL Part 2: Working with Operators

In the last post we went through setting up a log workspace and configuring a diagnostic setting to send data to the workspace.

In this post we will be going through using the different KQL operators.

First we will use the search operator to return all data in the log workspace. This can be useful when trying to find the table we want to query or to see a specific event type.

search *

If we want to return a specific number of rows, in no particular order, we can use take.

Table
| take 10

To return a list of unique values in a column we can use distinct.

Table 
| distinct AppDisplayName

To filter on multiple conditions we can use the or operator.

Table
| where column_name contains "value1" or column_name contains "value2"
| project column1, column2

To order the data we can use order by

Table
| where column_name contains "value1" or column_name contains "value2"
| order by column_name
| project column1, column2

To return the first set of rows we can use top.

Table
| top 10 by column_name
| project column1

To return data between a specific date and time we can use the between operator

Table
| where TimeGenerated between (datetime(2023-08-14T19:12:00) .. datetime(2023-08-15T19:12:00))

In the next post we will go through querying and structuring data, as well as visualizing data in charts using the render operator.

Getting Started with KQL Part 1: Azure Log Workspace and Diagnostic Settings

Kusto Query Language (KQL) is a query language developed by Microsoft for querying and analyzing data. KQL is specifically designed for working with structured, semi-structured, and unstructured data, and it offers a simple and intuitive syntax for expressing complex queries.

KQL is used in log analytics, security monitoring, and business intelligence. It allows users to extract insights from vast amounts of data by using a combination of filtering, aggregating, and transforming operations. With its rich set of operators, functions, and control flow statements, KQL enables users to perform advanced analytics and create sophisticated queries.

In the next few blog posts we will be going through how to send data to a log workspace and create KQL queries to show how we can visualize and gather data.

I have wanted to do more with KQL and I am using this series to improve my own KQL; hopefully it will be of use to others who are just starting out.

The first step is to create an Azure Log Analytics workspace.

Go to Log Analytics workspace blade in Azure and click create.

  • Select a subscription
  • Create or select an existing resource group
  • Give the workspace a name
  • Set the region

Add tags if needed, then review and create.

It will take a few minutes to finish deploying.

Once we have the log workspace created we can start to send data that we can then query.

First we will be sending storage account diagnostic data.

To enable this, go to the storage account and select diagnostic settings.

Enable the diagnostics for the required storage types; I am going to enable them for blob and file.

Click on the storage type and click add diagnostic settings.

Select the logs to send and the log analytics workspace.

After enabling for both file and blob the diagnostic status should now show as enabled.

We can generate some data by creating and deleting some blobs and Azure file shares.

Once we have some data we can start to write our KQL queries. First we can run against the full table by using StorageBlobLogs, as shown below the link. I used the Azure Monitor reference to find the correct table to use:

https://learn.microsoft.com/en-us/azure/azure-monitor/reference/
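Running just the table name returns every row in the table:

StorageBlobLogs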

If we want to select specific rows and filter the data that is returned we can use the where operator. In the below example I am selecting only the storage write results.

StorageBlobLogs
| where Category == 'StorageWrite'

In the next post we will go through using different operators like project, summarize, render…