Read SMART Attributes Using PowerShell And Smartmontools

In this post I’ll show you how to read SMART attributes using PowerShell and Smartmontools. I chose PowerShell and Smartmontools because they work from the command line, which is great for Windows Server Core machines, and they can be scheduled to run in the background. I also needed a way to receive notifications if the SMART attributes predicted disk failure. I decided to use Azure Log Analytics and Azure Monitor Alerts since I already use them for other tasks.

DISCLAIMER: I used this for my lab machines; it is not a practical or scalable solution for production.

Checking Drives For Failure Based On SMART Attributes

Windows already exposes the predicted failure status of drives based on SMART attributes so you don’t have to interpret the attributes yourself.

Get-WmiObject -Namespace root\wmi -Class MSStorageDriver_FailurePredictStatus

But if you want to, you can retrieve more detailed SMART attribute data using PowerShell. There are various commands returning different levels of detail, for example:

Get-Disk | Get-StorageReliabilityCounter | fl
Get-WmiObject -Namespace root\wmi -Class MSStorageDriver_FailurePredictThresholds
Get-WmiObject -Namespace root\wmi -Class MSStorageDriver_FailurePredictData

If you feel brave you can read the raw SMART attributes, but you’ll have to manipulate them to get something in a human-readable form.

Get-WmiObject -Namespace root\wmi -Class MSStorageDriver_ATAPISmartData | Select-Object -ExpandProperty VendorSpecific
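
To give an idea of what that manipulation involves, here is a minimal sketch that unpacks the attribute table for the first drive. It assumes the common layout of a 2-byte version header followed by 12-byte attribute entries; the exact layout is vendor specific, so treat the offsets as an assumption.

# Minimal sketch, first drive only. Assumes the common SMART layout:
# 2 header bytes, then attribute entries of 12 bytes each
# (id, 2 flag bytes, current value, worst value, 6 raw bytes, reserved).
$raw = Get-WmiObject -Namespace root\wmi -Class MSStorageDriver_ATAPISmartData |
  Select-Object -First 1 -ExpandProperty VendorSpecific

for ($i = 2; $i -lt $raw.Length - 11; $i += 12) {
    $id = $raw[$i]
    if ($id -eq 0) { continue }   # skip unused entries
    [PSCustomObject]@{
        Id    = $id
        Value = $raw[$i + 3]      # normalised current value
        Worst = $raw[$i + 4]      # normalised worst value
    }
}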

For my purposes I retrieve the disks that have the PredictFailure property set to true.

$failures = Get-WmiObject -Namespace root\wmi -Class MSStorageDriver_FailurePredictStatus `
  -ErrorAction SilentlyContinue |
  Select-Object InstanceName, PredictFailure, Reason |
  Where-Object -Property PredictFailure -EQ $true

To receive notifications for the predicted failures I write an error event to the Windows System Event Log.

Write-EventLog -LogName 'System' -Source 'YourSourceName' -EntryType Error -EventId 0 -Category 0 -Message "SMART Failure Prediction"

Before you can write to the Event Log you have to register the source.

New-EventLog -LogName 'System' -Source 'YourSourceName' 

If you are not registered as an event source you will get “The description for Event ID 0 from source source cannot be found.” as part of your event message.
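
Putting it together, a minimal sketch of the whole check might look like this; 'YourSourceName' is a placeholder.

# Minimal sketch: write an error event for every drive predicting failure.
$failures = Get-WmiObject -Namespace root\wmi -Class MSStorageDriver_FailurePredictStatus `
  -ErrorAction SilentlyContinue |
  Where-Object -Property PredictFailure -EQ $true

if ($failures) {
    # Register the source once; ignore the error if it already exists.
    New-EventLog -LogName 'System' -Source 'YourSourceName' -ErrorAction SilentlyContinue
    $message = "SMART Failure Prediction: " + ($failures.InstanceName -join ', ')
    Write-EventLog -LogName 'System' -Source 'YourSourceName' -EntryType Error `
      -EventId 0 -Category 0 -Message $message
}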

The full script can be found here.

The error event will be picked up by Azure Log Analytics if you are collecting the System Event Log error entries.

Azure Log Analytics Data Sources

If you need more information on creating alerts read this post.

Retrieving SMART Attributes Using SmartMonTools

Apart from retrieving the failure status I also want to retrieve some extra SMART attributes from the disks. This data is for the next stage of my pet project to create some reports and track the degradation of the disks over time.

I’m using smartctl.exe from the Smartmontools package. It works from the command line and it can return output in JSON format. You can unzip the Windows installer to get the bin folder containing the exe files.

In short, I scan for drives, retrieve the SMART attributes for each one in JSON format and dump the attributes I need to a CSV file. The full script is too long to post but you can find it here.
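
The core of the script looks something like this sketch. The install path and JSON property names are assumptions based on the output of recent smartctl versions, so check them against your own output.

# Minimal sketch: scan for drives, pull SMART data as JSON, flatten to CSV.
# The install path and JSON property names are assumptions; adjust as needed.
$smartctl = 'C:\smartmontools\bin\smartctl.exe'

$devices = (& $smartctl --scan -j | Out-String | ConvertFrom-Json).devices

$rows = foreach ($device in $devices) {
    $data = & $smartctl -a -j $device.name | Out-String | ConvertFrom-Json
    foreach ($attr in $data.ata_smart_attributes.table) {
        [PSCustomObject]@{
            Device    = $device.name
            Timestamp = Get-Date
            Id        = $attr.id
            Attribute = $attr.name
            Value     = $attr.value
            Worst     = $attr.worst
            Threshold = $attr.thresh
            Raw       = $attr.raw.value
        }
    }
}

$rows | Export-Csv -Path 'smart-attributes.csv' -NoTypeInformation -Append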

Later on I will import this into a database for reporting. You could potentially leave it as JSON if you are going to use a document database but I’m leaning towards SQL at the moment.

Francois Delport

Adding Custom Log Files To OMS Log Analytics

In this post I will be adding custom log files to OMS Log Analytics. Custom logs give you the ability to ingest plain text logs into Log Analytics. Depending on your situation it might be easier to first explore structured logging options like the Windows Event Log, Syslog or Application Insights, since custom logs have a few limitations.

Configure Custom Logs

At the time of writing custom logs were still in preview; to use them you have to enable the feature in the OMS portal under Settings -> Preview Features. If you are using the Azure portal and the feature is not enabled you won’t see the + button to add a custom log. Once you have custom logs enabled you can use the OMS portal or the Azure portal to add a custom log. In the OMS portal open the settings menu by clicking the gear icon in the top right; under the Data -> Custom Logs menu you will see an Add button to add a custom log.

Adding Custom Log Files To OMS Log Analytics

It is a pretty simple process: follow the wizard to select a sample file, choose the record delimiter, which can be a timestamp or a newline, specify the paths to monitor and provide a name for the custom log. Make sure you give the custom log a sensible name since you will be using it as the identifier in queries.

Take note of the restrictions for custom logs, which can be found here. If your custom logs violate any of the criteria they won’t show up in Log Analytics. My custom logs took 30 minutes to show up in Log Analytics, but your mileage may vary.

Custom Fields

Log Analytics stores the data from custom log text files in a single field called RawData. To get anything useful out of the custom logs you have to create custom fields over the data. Custom fields are not unique to custom logs; you can extract custom fields from any existing field.

To create a custom field, execute a search query that displays the field you want to extract from. In the case of your custom log the table name will be the custom log name. Once you have the results, click on the ellipsis to the left of the field name and choose ‘Extract Fields From …’.
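
For example, assuming a hypothetical custom log named MyAppLog (Log Analytics appends the _CL suffix to custom log tables):

MyAppLog_CL
| take 10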

Adding Custom Log Files To OMS Log Analytics

On the next screen you can highlight the data you want to extract and match it against current records to refine the extraction. You can click on records in the search results to tune the extraction further.

Adding Custom Log Files To OMS Log Analytics

Once you are satisfied with the result, save the extraction; detailed instructions here.

Take note: if you create a new custom field your existing data won’t be updated with it. The custom field will only appear on records ingested by Log Analytics after the custom field was created.

Francois Delport

How To Run Console Applications On A Schedule On Azure Functions

In this post I will show you how to run console applications on a schedule on Azure Functions. This feature is experimental and Azure Functions doesn’t have a project type for scheduled console applications, but you can launch a console application from a PowerShell or Batch function using a Timer trigger.

Creating The Function

If this is your first function, follow these instructions to create the Function App. Once you have the Function App, create a new function inside it by clicking the plus button next to the Function App. You have to enable experimental language support to see the PowerShell and Batch project types.

How To Run Console Applications On A Schedule On Azure Functions

Choose PowerShell or Batch from the language dropdown and select Timer as your trigger.

Once the function is created you have to upload your console application. The easiest way I found is to use Kudu, especially when uploading multiple files. You can access Kudu from the Developer Tools menu in the Platform Features tab in the Functions blade, or you can browse to it directly using https://{YourFunctionApp}.scm.azurewebsites.net/.

Alternatively you can upload files from the Azure Portal by clicking View files in the Function app blade.

In Kudu click on the Debug console menu and select PowerShell or CMD. Browse to the /site/wwwroot/{YourFunction} folder, where you will see your function.json and run.ps1 files already there. Drag and drop any files required onto the browser window to upload them. You can create folders using the plus icon or the shell window. Make sure all the DLLs required to run the application are also copied.

Edit the run.ps1 file using Kudu or the Azure Portal to set the working directory to the folder where your application files are located and add a line to run your application, as shown in the sketch below.

How To Run Console Applications On A Schedule On Azure Functions
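
As a minimal sketch, run.ps1 could look like this, assuming your files were uploaded to a bin subfolder and the executable is called MyConsoleApp.exe (both placeholders):

# Minimal run.ps1 sketch. The folder and executable names are placeholders.
Set-Location "D:\home\site\wwwroot\MyTimerFunction\bin"
.\MyConsoleApp.exe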

Specify the schedule

You will find the schedule on the Integrate menu of the function. It uses a cron-like syntax with an extra field for seconds; you can read more here. The default is every 5 minutes.
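
For reference, the timer binding in function.json looks something like this sketch; the schedule below fires once a day at 02:00:

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 2 * * *"
    }
  ],
  "disabled": false
}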

Logging and exit codes

To output text to the Azure Functions log you can use the standard Console.Write/WriteLine methods in your code. If you need something more advanced you can also connect Application Insights: if you connect it at design time you can emit custom logging and telemetry, and if you add it at runtime you get the standard .NET telemetry. You should return specific exit codes from your application to make troubleshooting easier, more here.
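
For example, in run.ps1 you can check the native exit code and fail the function run when it is non-zero (a sketch; MyConsoleApp.exe is a placeholder):

.\MyConsoleApp.exe
# $LASTEXITCODE holds the exit code of the last native executable.
if ($LASTEXITCODE -ne 0) {
    throw "MyConsoleApp.exe failed with exit code $LASTEXITCODE"
}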

Advanced Scenarios

If your console app requires input parameters you can use an HTTP trigger with a C# function to call the console application after you have parsed the input parameters.

Francois Delport

Creating And Restoring Azure Virtual Machine Snapshots For UnManaged Disks

In this post I’m going to take a look at creating and restoring Azure Virtual Machine snapshots for unmanaged disks. If you are looking for instructions to create and restore snapshots for managed disks read this post. These instructions are for Azure Resource Manager (ARM) virtual machines, for Azure Service Manager (ASM) virtual machines read this post.

In the case of unmanaged disks the underlying VHD blob can be manipulated like any other blob. This means the blob backing an unmanaged OS disk can be overwritten, saving you the trouble of recreating the VM just to revert to a snapshot.

I’m not aware of a way to create snapshots for unmanaged disks in the Azure Portal, but some tools, for example Azure Storage Explorer, can do it.

Update: You can now create and restore snapshots for unmanaged disks in the Azure Portal in the Storage Account blade.

Creating And Restoring Azure Virtual Machine Snapshots For UnManaged Disks

Creating Unmanaged Disk Snapshots

To create a snapshot using PowerShell you retrieve the storage account context, reference the blob and create a snapshot. To ensure consistency, shut down your virtual machine beforehand.

$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName `
  -StorageAccountKey $StorageAccountKey
$blob = Get-AzureStorageBlob -Context $ctx -Container $ContainerName `
  -Blob $BlobName
$snapshot = $blob.ICloudBlob.CreateSnapshot()

You can also create incremental snapshots, but that is not available in the PowerShell cmdlets, only the Azure REST API, more info here.

Restore Unmanaged Disk Snapshots

As I pointed out earlier, in the case of unmanaged disks the underlying VHD blob is manipulated like any other blob. You can copy a snapshot over its base blob to restore the blob to the snapshot state. This means you can replace the OS disk of a VM without rebuilding it. You can also copy the snapshot to another blob to clone your disk.

Since snapshots have the same name as their base blob you can’t retrieve a specific snapshot by name. The easiest way seems to be retrieving all the snapshots for a blob using the Get-AzureStorageBlob cmdlet and filtering the results to get the specific snapshot you need. For example, to retrieve a snapshot by SnapshotTime:

$snap = Get-AzureStorageBlob -Context $ctx -Prefix $BlobName `
  -Container $ContainerName | Where-Object {$_.ICloudBlob.IsSnapshot `
  -and $_.SnapshotTime -eq $snapshottime }

Or to retrieve the latest snapshot.

$lastsnap = (Get-AzureStorageBlob -Context $ctx -Prefix $BlobName `
  -Container $ContainerName | Where-Object {$_.ICloudBlob.IsSnapshot `
  -and $_.SnapshotTime -ne $null} | `
  Sort-Object -Property SnapshotTime -Descending)[0]

To restore a snapshot you copy it to a new blob object, or you can overwrite an existing blob. This makes it very easy to roll back a VHD to a snapshot without recreating the VM.

$lastsnap = (Get-AzureStorageBlob -Context $ctx -Prefix $BlobName `
  -Container $ContainerName | Where-Object `
  {$_.ICloudBlob.IsSnapshot -and $_.SnapshotTime -ne $null} |
  Sort-Object -Property SnapshotTime -Descending)[0]
$snapshotblob = [Microsoft.WindowsAzure.Storage.Blob.CloudBlob] `
  $lastsnap.ICloudBlob
# Retrieve the base blob so its lease can be broken before the copy.
$blob = Get-AzureStorageBlob -Context $ctx -Container $ContainerName `
  -Blob $BlobName
$blob.ICloudBlob.BreakLease()
Start-AzureStorageBlobCopy -Context $ctx -ICloudBlob $snapshotblob `
  -DestBlob $BlobName -DestContainer $ContainerName
Get-AzureStorageBlobCopyState -Blob $BlobName -Container $ContainerName `
  -Context $ctx -WaitForComplete

Take note: if you try to overwrite a VHD that is attached to a VM you will receive an error message indicating there is an existing lease on the VHD. The BreakLease call removes the lease; it will be created again when you start the VM.

Francois Delport

Creating And Restoring Azure Virtual Machine Snapshots For Managed Disks

In this post I’m going to take a look at creating and restoring Azure Virtual Machine snapshots for managed disks. These instructions are for Azure Resource Manager (ARM) virtual machines, for Azure Service Manager (ASM) virtual machines read this post. If you are looking for instructions to create and restore snapshots for unmanaged disks read this post.

Why use snapshots?

Snapshots can be used to duplicate a VM relatively quickly, since taking a snapshot is faster than copying a blob. You still have to shut down the source VM to ensure consistency, but creating the snapshot only takes a few seconds. Once you have the snapshot, the source VM can be powered on again while you use the snapshot to create new VMs or copies of the VHD blob.

Snapshots can also be used as a backup strategy for VMs, although Azure Backup provides better functionality with a lot less effort, albeit a bit more expensive since you pay for the recovery vault.

The one advantage of snapshots is the ability to overwrite the OS disk without recreating the VM but only for unmanaged disks at the time of writing.

Managed Disk Snapshots

You can create snapshots in the Azure portal by selecting your managed disk from the disks main menu or the disks section on the VM blade.

Azure Virtual Machine Snapshots For Managed Disks

Take note of the export button, you can use it to export the managed disk to a VHD blob which can be used to create unmanaged disks.

To create a snapshot using PowerShell, call the New-AzureRmSnapshot cmdlet along with New-AzureRmSnapshotConfig to configure the snapshot options.

$mandisk = Get-AzureRmDisk -ResourceGroupName $rsgName -DiskName $diskname
$snapshot = New-AzureRmSnapshotConfig -SourceUri $mandisk.Id `
  -CreateOption Copy -Location $azureregion 
New-AzureRmSnapshot -Snapshot $snapshot -SnapshotName $Name `
  -ResourceGroupName $rsgName

Restoring Managed Disk Snapshots

At the time of writing you couldn’t overwrite or change the managed OS disk on an existing VM, but you can create a new managed disk from a snapshot and then create a new VM from the managed disk.

Update: You can now swap a Managed Disk on a VM by replacing the disk Id with another one. This still involves creating another disk from your snapshot and swapping it in, but at least you don’t have to recreate the VM anymore. Thanks to Ralph Herold for pointing it out to me.

$vm = Get-AzureRmVM -ResourceGroupName osrg -Name vm2 
$disk = Get-AzureRmDisk -ResourceGroupName osrg -Name osbackup 
Set-AzureRmVMOSDisk -VM $vm -ManagedDiskId $disk.Id -Name $disk.Name 
Update-AzureRmVM -ResourceGroupName osrg -VM $vm

I assume you can do the same for data disks using Set-AzureRmVMDataDisk but I haven’t tried it yet. Full documentation here.

You can create a new Managed Disk from a snapshot in the Azure Portal or PowerShell. In the Azure Portal select Create New Resource from the main portal menu and search for Managed Disks. In the create screen there is a Source Type dropdown where you can select Snapshot, and you will see a list of your snapshots to use as the source for the new Managed Disk.

Azure Virtual Machine Snapshots For Managed Disks

When the create disk operation is completed select the newly created managed disk and create a new VM from the disk.

Azure Virtual Machine Snapshots For Managed Disks

If you want to script it in PowerShell the steps are basically:

  1. Retrieve the snapshot.
  2. Create a new disk configuration specifying the snapshot as the source.
  3. Create a new managed disk from the disk configuration and attach it to a new virtual machine.

If this is a data disk you can attach it to an existing VM instead of creating a new VM but OS disks can only be attached to new VMs.

...

$snapshot = Get-AzureRmSnapshot -ResourceGroupName $rsg `
  -SnapshotName $yoursnapshotname 
 
$diskconf = New-AzureRmDiskConfig -AccountType $storagetype `
  -Location $snapshot.Location -SourceResourceId $snapshot.Id `
  -CreateOption Copy

$disk = New-AzureRmDisk -Disk $diskconf -ResourceGroupName $rsg `
  -DiskName $osDiskName
$vm = Get-AzureRmVM ...
$vm = Set-AzureRmVMOSDisk -VM $vm -ManagedDiskId $disk.Id `
  -CreateOption Attach -Windows

...

Full script can be found here.

Francois Delport

Azure Log Analytics Query Quick Start

This post is an Azure Log Analytics query quick start to get you up and running with queries in a few minutes. It follows on my previous post showing some of the common tasks performed in Azure Log Analytics. The official documentation can be found here.

Query Syntax

Queries in Azure Log Analytics start with a data table name followed by query operators and, optionally, rendering instructions. Data is piped from one operator to the next using the pipe character.

Event
 | where Computer like "webserver" and EventID == 14
 | summarize count() by Computer
 | render barchart

Common Query Operators

Summarize: Similar to the SQL GROUP BY statement, it applies one or more aggregations, optionally with one or more grouping expressions. For example, to see how many events were logged per computer and when the last one was logged:

Event
| summarize count(Computer), max(TimeGenerated) by Computer, EventLevelName

Distinct: Returns distinct values from one or more column(s)

Event | distinct Computer

Bin: Groups records into bins; it works on numbers and dates. For example, to see how many events were logged per hour:

Event | summarize count(EventID) by bin(TimeGenerated, 1h)

Or group events by their EventID in bins of 1000:

Event | summarize count(EventID) by bin(EventID, 1000)

Or aggregate events by EventID and group on EventLevelName per day, for example to see the failed versus successful entries per day:

Event | summarize count(EventID) by EventLevelName , bin(TimeGenerated, 1d)

Join: Just like your typical SQL join.

Event | join kind=inner (SecurityEvent) on Computer

Let: Stores a value in a variable; the value can be a tabular query result or any user-supplied value. The variable can then be used in queries, for example when you have to join the results of two very long queries or store constants used in your queries.

let errors = Event | ... very long query ...| summarize Count = count(EventID) by Computer, bin(TimeGenerated, 1h);

let memory = Perf | ... very long query ... | summarize avg(CounterValue) by Computer, bin(TimeGenerated,1h);

errors | join kind= inner (memory) on TimeGenerated

Project: Selects the columns to include in the query result, just like the SQL SELECT statement.
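
For example, to return only a few columns from the Event table:

Event | project Computer, TimeGenerated, EventLevelName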

Parse: Parses text into columns; it is a long explanation, link here. It is really handy for extracting data into columns from free-form text like custom log files.

Event | where EventLog == "DPM Backup Events" | parse RenderedDescription with * "Backup job for datasource: " ServerName " on " *

In this example RenderedDescription looked like this:
“Backup job for datasource: SomeServerName on production server: SomeDPMServerName completed successfully…”

The section between the quotes is the “guide” string to look for, including the spaces, and ServerName is the new column name to extract.

Render: Renders a chart; there are too many options to mention here, look at the documentation.

Event
| where Computer like "local" and EventID == 14
| summarize count() by Computer
| render piechart

Extend: Create calculated columns.

Event | extend RequiresAttention = iif(EventLevel == 4, "Yes", "No" )

Scope: You can create queries that span multiple applications and workspaces. Use the app() and workspace() expressions to include other Application Insights applications and OMS workspaces in your query.

Datetime: You can use basic arithmetic on datetime values. Subtracting two datetime values gives a timespan representing the difference between them, and you can add or subtract a timespan from a datetime value. You can use todatetime() to convert literals to datetime values; the list of supported literals is here. As per the documentation, try to stick to ISO 8601 date formats.
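
For example, to add a column with the age of each event:

Event | extend Age = now() - TimeGenerated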

Case Sensitivity: String comparisons are case insensitive except for ==. There are case-sensitive equivalents for most comparison operators, ending in “_cs”. Entity names and query operator names are case sensitive.

 

Francois Delport

OMS Log Analytics Common Tasks

In this post I’m going to give a quick overview of some of the common tasks you can perform in OMS using queries. If you are looking for an Azure Log Analytics query quick start you can find it here. You can also find the official documentation here.

Lookup Tables

To create your own lookup table you create a query that returns the desired results. Save the query and provide a function name for it. The function name will be the identifier you use to reference the lookup table in queries. In this example AllComputers is the lookup table/function:

Event | join kind=inner (
AllComputers
) on Computer

Computer Groups

Computer groups are basically a specialised lookup table. You can use them in queries or other OMS functionality that acts on a group of machines, like scheduling updates. To create a computer group follow the procedure to create a lookup table, but select the “Save this query as computer group” option to save it as a computer group instead of a plain lookup table. You can then use the group name directly in a query, as shown below.
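
For example, to filter events to a hypothetical computer group saved as MyWebServers:

Event | where Computer in (MyWebServers)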

OMS Log Analytics Common Tasks

Creating Custom Alerts

Alerts are based on queries that execute on a schedule; if the query returns any records the alert is triggered. To set up an alert you start with a query that checks for the alert condition. Click on the alert button on the top left to open the alert rule screen and configure your alert rules. Out of interest, take a look at the actions the alert rule can perform on the right hand side: you can execute Azure Automation runbooks or webhooks to create self-healing systems, or generate work items in your ITSM application.

UPDATE: Alerts are now created in the Monitor blade of the Portal in the Alerts menu -> Manage Alert Rules.

Create Custom OMS Dashboards

To create a custom dashboard you use the View Designer, which can be opened by clicking the green plus sign on the left panel. Double click the tile you want for your overview tile and fill in a query that will be used to populate it. This will be the tile you see on the home screen.

OMS Log Analytics Common Tasks

Add additional tiles to the view dashboard tab. These will be displayed when you click on the overview tile in the home screen.

Create Custom OMS Dashboards For Azure

To create custom dashboards for Azure from your OMS data you have to create a shared Azure dashboard first, more info here. The functionality to pin the dashboard is not in the OMS query screen; it is in the Azure Log Analytics screen. On the OMS query screen click on Advanced Analytics to open Azure Log Analytics in a new window.

OMS Log Analytics Common Tasks

Create your query in Azure Log Analytics and click on the pin on the right hand side to pin the chart to a shared Azure dashboard.

OMS Log Analytics Common Tasks

You can read more about OMS and Azure integration in this post.

It is a bit confusing having functionality split between OMS and Azure Log Analytics, but eventually all the query functionality will be in Azure Log Analytics.

OMS PowerBI Integration

There are two ways to use PowerBI with OMS. The first and simplest, but more manual, way is to export a query to PowerBI by clicking the PowerBI button in the OMS query screen.

OMS Log Analytics Common Tasks

This will download your current query as a query text file that you can then import in PowerBI.

The second and more streamlined method is to link your OMS account to PowerBI, but this requires an organisational/paid PowerBI account. In OMS, in the settings menu, click on Accounts and then Connect To PowerBI account.

OMS Log Analytics Common Tasks

Francois Delport

Azure Relay Service

In this post I’m going to take a quick look at the Azure Relay service and what it provides.

What Is Azure Relay

Azure Relay is a service that enables communication between applications in different networks, usually public cloud to on-premise, but in reality it can be any two networks with internet access. It supports listening for incoming connections as well as making outgoing connections without VPN, special network configuration or opening firewall ports.

How Does It Work

The Azure Relay service directs requests between different networks using a rendezvous service hosted in Azure. You can read the official documentation here, but in short both applications connect to the Service Bus rendezvous namespace and the service then relays communication between the connected parties. Azure Relay operates at the application level: you have to write your applications to specifically make use of the WCF Relay connections or the WebSocket-based Hybrid Connections. WCF Relay connections work with .NET only, via NuGet packages, while Hybrid Connections use WebSockets and can be used from any language. The service does have some smarts to determine the best way to create connections and will create a direct connection between the two parties if possible, for example when both applications are on the same network.

When To Use It

If you require point-to-point communication between applications on a specific port without using a VPN connection or opening firewall ports, Azure Relay is a good candidate. The service is not well suited to real-time communication due to the slight delay introduced by the rendezvous service. It is also not well suited to very high volume data transfer or a large number of connections; for example, it would not be a good idea to expose a high traffic website hosted on-premise to the internet using the Azure Relay service. If you use the Hybrid Connection integration provided by App Services there is a limit on the number of connections at a time based on your App Service Plan.

Technical Details

The Azure Relay service offers two connection options:

  • New Hybrid Connections using WebSockets, supported by multiple languages; most new applications or cross-platform applications will use this type.
  • Older WCF Relays using WCF Relay bindings and WCF Relay REST bindings, for .NET only; mostly legacy applications or applications leveraging WCF-specific features will use this type.

To use relays in your application you develop against the specific Azure Relay connections, in the form of the WCF Relay bindings or the HybridConnectionClient and HybridConnectionListener from the Microsoft Azure Relay NuGet package. When using Hybrid Connections your application is itself listening for and sending requests; in the case of WCF Relays most of the heavy lifting is done for you by the WCF Relay bindings. When using WebApp Hybrid Connections integration or PortBridge your application is not directly responsible for the relay communication, but you will be configuring selected ports to be forwarded to the relay.

The connections are encrypted using TLS and access to the Azure Relay Namespace is protected with access keys.

Generic Point To Point Connections With PortBridge

The PortBridge sample application uses Azure Relay Hybrid Connections to tunnel communications between two TCP ports without modifying the applications sending or receiving the requests. It uses a server-side application to forward requests from a specific port to the relay and a client-side application that responds back to the relay. This is handy for applications where you don’t have control over the source code, or if you just need a quick way for Azure to reach a service on-premise.

Azure WebApp Integration

Hybrid connections are exposed directly to Azure WebApps. You can access them under the Networking tab.

Azure Relay Service

To use WebApp Hybrid Connections you have to install a connection manager on-premise. The download link for the connection manager is on the Hybrid Connections blade.

Francois Delport

Managing Azure SQL Database And Azure Storage Costs

In this post I’ll highlight a few ideas around managing Azure SQL Database and Azure Storage costs. It follows on my previous post that looked at managing Azure costs when using Azure Virtual Machines and Azure Subscriptions in general.

Azure SQL Database

  • If you have multiple databases with occasional/spiky workloads you can reduce costs by using Elastic Pools to share the pooled DTUs among your databases. If you use Azure Advisor it will analyse your Azure SQL usage and recommend Elastic Pools if your database instances will benefit from it.
  • Keep an eye on Azure SQL service tier updates. I had some databases in the Premium tier purely based on the maximum DTUs per database required, but I was able to move some of them to the Standard tier after the recent service tier updates.
  • If you have high database performance requirements but you can tolerate a lower SLA and some downtime to recover from backups consider using the Premium RS tier which was in preview at the time of writing.
  • Investigate whether running SQL Server on Azure Virtual Machines will be more cost effective than Azure SQL Database. It depends greatly on your requirements, database size and utilisation but keep the following in mind:
    • If you are using SQL for dev/test you can use SQL Server Developer Edition and avoid SQL licensing costs.
    • If your production databases are very small you could get away with using SQL Express but keep the other limitations in mind.
    • You can bring your own licence for SQL running on Azure VMs.
    • If you have high availability requirements, using Azure SQL is much less effort and cost than running VMs. Azure SQL comes with a 99.99% uptime guarantee. VMs have a 99.95% uptime guarantee if you have multiple instances in an availability set, plus you have to replicate to another set of VMs in a second Azure region and pay for multiple SQL licences.
    • If your database is only in use for a portion of the day you can switch the virtual machine hosting it off after hours.

Azure Storage

  • Premium Disks and Managed Disk storage is charged per disk size not disk space used while Standard storage is charged by disk space used.
  • Standard disks incur an access cost measured in 10,000 IO transaction batches but Premium Disks and Managed Disks don’t.
  • You can combine disks to achieve higher IOPS. Weigh up the cost and performance of Premium Storage Disks versus multiple Standard Storage Disks especially if you need lots of space.
  • If your data will not be accessed very often consider using Azure Cool Storage.
  • The default data redundancy setting for Standard Storage is read-access geo-redundant, which means you pay for a second copy in another Azure region. For dev/test storage you may not need that level of redundancy, since Microsoft keeps 3 copies in the data center anyway; you can use locally redundant storage instead.
  • Delete unused VHD files and Managed Disks. When you delete a VM its disks and underlying VHD files are not deleted. The same goes for VMs using Managed Disks, although in that case there is no VHD file visible to you, just the Managed Disk to delete.
  • Don’t create a VM just to host a file share, rather use Azure Files.
 
Francois Delport

Managing Azure Costs

Managing Azure costs is an important but sometimes overlooked aspect of cloud utilisation. Many customers wrongly believe moving to the cloud will automatically be cheaper than hosting it themselves. The truth is the cloud can be cheaper, but it depends on your requirements and on managing your costs. There is a wide spectrum of remedies to lower your costs, ranging from simple non-technical steps to rearchitecting applications for serverless computing. In this post I’m going to cover some quick wins you can achieve when using virtual machines.

Managing Azure Costs

To reduce your Azure costs you have to measure them first. There are a few ways to view Azure costs depending on whether you are using an Enterprise Agreement, a CSP partner or a Visual Studio subscription. In the Azure Portal under the Subscriptions blade you will find some basic information for the current month. In the Cost Analysis section you can filter on different resource groups, types and time periods. In the new Cost Management + Billing blade you can sign up for Cloudyn cost management with more detailed analysis. Eventually it will be folded into the Azure Portal, but for the moment you are directed to an external website where you have to sign up for a trial that is free until June 2018. Enterprise Agreement customers can use the EA portal and PowerBI to analyse their bill.

First Step

The easiest first step you can take is to use the Azure Advisor blade to display any cost recommendations. It is very basic but can provide information about virtual machine under-utilisation and Azure SQL databases that can benefit from elastic pools. While you are there, also take a look at the security and performance recommendations.

Azure Virtual Machines

A few things to keep in mind to manage your Azure costs when using Azure Virtual Machines.

  • Newer generation virtual machines can sometimes be cheaper. Take for example D* v3 and D* v2 machines: taking into account that there is a small difference in RAM and temporary storage, v3 is cheaper. It was a similar situation when D* v1 was superseded by D* v2.

    Managing Azure Costs
  • If you have Azure Batch jobs that are not time critical and can safely be interrupted and resumed, Azure Batch low-priority virtual machines can offer a good discount, link.
  • If you are running workloads that occasionally consume high CPU cycles, Azure B-series virtual machines could be cost effective, link. In short, you build up credits when CPU utilisation is low which you then spend on bursts of high CPU utilisation.
  • Automatically shut down virtual machines when you don’t use them. It used to require a script and an automation account, but now it is available in the virtual machine blade.

    Managing Azure Costs
  • If you have Software Assurance you can use your existing Windows Server licenses in Azure and only pay for the base computer power you consume. You can read more about Azure Hybrid Benefit here.
  • If you are a Visual Studio subscriber using Azure for development and testing you can get a discount on various Azure services by creating Azure Dev/Test subscriptions. These subscriptions are limited to development and testing workloads, link here. Each active Visual Studio subscriber also qualifies for monthly Azure credits but you have to activate the benefit first, more info here.
  • At the time of writing Reserved Instances were not available yet, but they can also bring down costs by paying upfront for virtual machines, more info here.
  • Scale in and out by adding and removing VMs as needed rather than using larger VM instances.

Links to further posts covering Azure cost management
Managing costs when using Azure SQL Database and Azure Storage

 
Francois Delport