Azure Log Analytics Query Quick Start

This post is an Azure Log Analytics query quick start to get you up and running with queries in a few minutes. It follows on from my previous post showing some of the common tasks performed in Azure Log Analytics. The official documentation can be found here.

Query Syntax

Queries in Azure Log Analytics start with a data table name, followed by query operators and optionally a rendering instruction. Data is piped from one operator to the next using the pipe character (|).

Event
 | where Computer contains "webserver" and EventID == 14
 | summarize count() by Computer
 | render barchart

Common Query Operators

Summarize: Similar to the SQL GROUP BY statement, it applies one or more aggregations, optionally with one or more grouping expressions. For example, to see how many events were logged and when the last one was logged, per computer and event level:

Event
| summarize count(), max(TimeGenerated) by Computer, EventLevelName

Distinct: Returns the distinct values from one or more columns.

Event | distinct Computer

Bin: Groups records into bins; it works on numbers and dates. For example, to see how many events were logged per hour:

Event | summarize count() by bin(TimeGenerated, 1h)

Or group events by their EventID in bins of 1000:

Event | summarize count() by bin(EventID, 1000)

Or count events grouped on EventLevelName per day, for example to see the failed versus successful entries:

Event | summarize count() by EventLevelName, bin(TimeGenerated, 1d)

Join: Just like your typical SQL join.

Event | join kind=inner (SecurityEvent) on Computer

Let: Stores a value in a variable; the value can be a tabular query result or any user-supplied value. The variable can then be used in subsequent queries, for example when you have to join the results of two very long queries or store constants used in your queries:

let errors = Event | ... very long query ... | summarize Count = count() by Computer, bin(TimeGenerated, 1h);

let memory = Perf | ... very long query ... | summarize avg(CounterValue) by Computer, bin(TimeGenerated, 1h);

errors | join kind=inner (memory) on TimeGenerated

Project: Selects the columns to include in the query result, just like the SQL SELECT statement.
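
For example, to return only the computer name, timestamp and event ID for each event:

Event | project Computer, TimeGenerated, EventID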

Parse: Parses text into columns; the full explanation is long, see the link here. It is really handy for extracting data into columns from free-form text like custom log files.

Event | where EventLog == "DPM Backup Events" | parse RenderedDescription with * "Backup job for datasource: " ServerName " on " *

In this example RenderedDescription looked like this:
“Backup job for datasource: SomeServerName on production server: SomeDPMServerName completed successfully…”.

The sections between the quotes are the “guide” strings to look for, including the spaces, and ServerName is the new column name to extract.

Render: Renders a chart; there are too many options to mention here, look at the documentation.

Event
| where Computer contains "local" and EventID == 14
| summarize count() by Computer
| render piechart

Extend: Creates calculated columns.

Event | extend RequiresAttention = iif(EventLevel == 4, "Yes", "No")

Scope: You can create queries that span multiple applications and workspaces. Use the app() and workspace() expressions to include other Application Insights applications and OMS workspaces in your query.
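
As a minimal sketch, assuming a second workspace named contoso-workspace exists (the name is just a placeholder), you can union its Event table with the one in the current workspace:

union Event, workspace("contoso-workspace").Event
| summarize count() by Computer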

Datetime: You can use basic arithmetic on datetime values: subtracting two datetime values gives a timespan value representing the difference between them, and you can add or subtract a timespan value from a datetime value. Use todatetime() to convert literals to datetime values; you can find the list of supported literals here. As per the documentation, try to stick to ISO 8601 date formats.
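
For example, to calculate how long ago each event was logged (the date literal is just a placeholder):

Event
| where TimeGenerated > todatetime("2018-06-01T00:00:00Z")
| extend Age = now() - TimeGenerated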

Case Sensitivity: String comparisons are case insensitive, except for ==. There are case-sensitive equivalents for most comparison operators, ending in “_cs”. Entity names and query operator names are case sensitive.
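
For example, assuming a computer named WebServer01 (a placeholder name), the first query below matches only that exact casing, while the second, using the case insensitive =~ operator, matches regardless of casing:

Event | where Computer == "WebServer01"
Event | where Computer =~ "webserver01"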


Francois Delport


2 thoughts on “Azure Log Analytics Query Quick Start”

  1. Good afternoon. I am familiar with Splunk, the ELK stack and Nagios Log Server. It looks like Azure Log Analytics was going to allow me to perform similar indexing and searching of log data from various sources. So far, I have not been able to get Log Analytics to wire up to a custom text file log data source. Screenshots in walk-throughs show a + (add) button to be able to point to a file in the file system, but many are unable to get that button to show up (including me). Also, I have a firewall successfully sending syslog data to an Azure Event Hub, but now I need a way to analyze the data and set up search patterns to alert on (I was hoping to use Azure Log Analytics for this). I am seeing ways to take event hub data and dump it into a SQL or CosmosDB database, but the missing link here is the ability to query/search and alert. Wondering if you have any ideas on how to accomplish this?

    Any help would be greatly appreciated. Thanks!

    1. Hi Pat,

      Are you using the Azure Portal or the OMS portal to manage logs? Custom Logs is a preview feature; you have to enable it in the OMS portal under Settings -> Preview Features. I can’t think of another reason for it not showing up.

      I think you might be better off sending your firewall syslog messages to OMS Log Analytics instead of Event Hubs. Assuming the firewall device can’t run the OMS Linux agent, you’ll have to install the agent on a VM to act as the collector and configure it to forward syslog messages to OMS. On your firewall device you have to forward syslog messages to the Linux collector VM hosting the OMS agent.
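
      Once the messages land in the Syslog table you can query and alert on them like any other Log Analytics data, for example (the severity value is just a placeholder):

      Syslog
      | where SeverityLevel == "err"
      | summarize count() by Computer, bin(TimeGenerated, 1h)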
