OMS Log Analytics Common Tasks

In this post I’m going to give a quick overview of some of the common tasks you can perform in OMS using queries. If you are looking for an Azure Log Analytics query quick start you can find it here. You can also find the official documentation here.

Lookup Tables

To create your own lookup table you create a query that returns the desired results. Save the query and provide a function name for it. The function name is the identifier you use to reference the lookup table in queries. In this example AllComputers is the lookup table/function:

Event
| join kind=inner (
    AllComputers
) on Computer

Computer Groups

Computer groups are basically specialised lookup tables. You can use them in queries or in other OMS functionality that acts on a group of machines, like scheduling updates. To create a computer group follow the procedure to create a lookup table but select the “Save this query as computer group” option to save it as a computer group instead of a plain lookup table.

Creating Custom Alerts

Alerts are based on queries that execute on a schedule; if the query returns any records, the alert is triggered. To set up an alert you start with a query that checks for the alert condition. Click on the alert button on the top left to open the alert rule screen and configure your alert rules. Out of interest, take a look at the actions the alert rule can perform on the right-hand side: you can execute Azure Automation runbooks or webhooks to create self-healing systems, or generate work items in your ITSM application.

UPDATE: Alerts are now created in the Monitor blade of the Portal in the Alerts menu -> Manage Alert Rules.

Create Custom OMS Dashboards

To create custom dashboards you use the View Designer, which is opened by clicking the green plus sign on the left panel. Double-click the tile you want to use as your overview tile and fill in the query that will populate it. This is the tile you will see on the home screen.

Add additional tiles to the dashboard tab of the view. These will be displayed when you click on the overview tile on the home screen.

Create Custom OMS Dashboards For Azure

To create custom dashboards for Azure from your OMS data you first have to create a shared Azure dashboard, more info here. The functionality to pin queries to the dashboard is not in the OMS query screen; it is in the Azure Log Analytics screen. On the OMS query screen, click on Advanced Analytics to open Azure Log Analytics in a new window.

Create your query in Azure Log Analytics and click on the pin on the right hand side to pin the chart to a shared Azure dashboard.

You can read more about OMS and Azure integration in this post.

It is a bit confusing having functionality split between OMS and Azure Log Analytics, but eventually all the querying functionality will be in Azure Log Analytics.

OMS PowerBI Integration

There are two ways to use PowerBI with OMS. The first and simplest, but more manual, way is to export a query to PowerBI by clicking the PowerBI button in the OMS query screen.

This will download your current query as a query text file that you can then import into PowerBI.

The second and more streamlined method is to link your OMS account to PowerBI, but this requires an organisational/paid PowerBI account. In OMS, open the Settings menu, click on Accounts and then Connect To PowerBI Account.

Francois Delport

Azure Relay Service

In this post I’m going to take a quick look at the Azure Relay service and what it provides.

What Is Azure Relay

Azure Relay is a service that enables communication between applications in different networks, usually public cloud to on-premise, but in reality it can be any two networks with internet access. It supports listening for incoming connections as well as making outgoing connections without using a VPN, special network configuration or opening firewall ports.

How Does It Work

The Azure Relay service routes requests between different networks using a rendezvous service hosted in Azure. You can read the official documentation here, but in short both applications connect to the Service Bus rendezvous namespace and the service then relays communication between the connected parties. The Azure Relay service operates at the application level: you have to write your applications to specifically make use of the WCF Relay connections or the WebSocket-based Hybrid Connections. The WCF Relay connections are .NET only, via NuGet packages, while Hybrid Connections use WebSockets and can be used from any language. The service does have some smarts to determine the best way to create connections and will create a direct connection between the two parties if possible, for example when both applications are on the same network.

When To Use It

If you require point-to-point communication between applications on a specific port without using a VPN connection or opening firewall ports, Azure Relay is a good candidate. The service is not well suited for real-time communication due to the slight delay introduced by the rendezvous service. It is also not well suited for very high volume data transfer or a large number of connections. For example, it would not be a good idea to expose a high traffic website hosted on-premise to the internet using the Azure Relay service. If you use the Hybrid Connection integration provided by App Services there is a limit on the number of connections at a time, based on your App Service Plan.

Technical Details

The Azure Relay service offers two connection options:

  • Newer Hybrid Connections using WebSockets, which are supported by multiple languages; most new applications or cross platform applications will use this type.
  • Older WCF Relays using WCF Relay bindings and WCF Relay REST bindings for .NET only; mostly legacy applications or applications leveraging WCF specific features will use this type.

To use relays in your application you have to develop against the specific Azure Relay connections, in the form of WCF Relay bindings or the HybridConnectionClient and HybridConnectionListener classes from the Microsoft.Azure.Relay NuGet package. When using Hybrid Connections your application is responsible for listening for requests and sending requests. In the case of WCF Relays most of the heavy lifting is done for you by the WCF Relay bindings. When using WebApp Hybrid Connections integration or PortBridge your application is not directly responsible for the relay communication; instead you configure selected ports that will be forwarded to the relay.
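
Side note: if you want to experiment with the plumbing before writing any listener code, newer versions of the Azure CLI can create the Relay resources for you. A minimal sketch with made-up resource names, assuming the az relay commands are available in your CLI version:

# Create a resource group and the Relay namespace that hosts the connection
az group create --name relay-rg --location westeurope
az relay namespace create --resource-group relay-rg --name myrelay-ns
# Create the Hybrid Connection endpoint that listeners and senders will connect to
az relay hyco create --resource-group relay-rg --namespace-name myrelay-ns --name myhyco
# List the access keys used to authorise clients against the namespace
az relay namespace authorization-rule keys list --resource-group relay-rg --namespace-name myrelay-ns --name RootManageSharedAccessKey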

The connections are encrypted using TLS and access to the Azure Relay Namespace is protected with access keys.

Generic Point To Point Connections With PortBridge

The PortBridge sample application uses Azure Relay Hybrid Connections to tunnel communications between two TCP ports without modifying the applications sending or receiving the requests. It uses a server side application to forward requests from a specific port to the relay and a client side application that responds back to the relay. This is handy for applications where you don’t have control over the source code or if you just need a quick way for Azure to reach a service on-premise.

Azure WebApp Integration

Hybrid Connections are exposed directly to Azure WebApps. You can access them under the Networking tab.

To use WebApp Hybrid Connections you have to install a connection manager on-premise. The download link for the connection manager is on the Hybrid Connections blade.
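
If you prefer scripting the WebApp side, newer versions of the Azure CLI can attach an existing Hybrid Connection to a WebApp as well. A rough sketch with hypothetical names:

# Attach a Hybrid Connection from a Relay namespace to a WebApp
az webapp hybrid-connection add --resource-group web-rg --name mywebapp --namespace myrelay-ns --hybrid-connection myhyco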

Francois Delport

Managing Azure SQL Database And Azure Storage Costs

In this post I’ll highlight a few ideas around managing Azure SQL Database and Azure Storage costs. It follows on my previous post that looked at managing Azure costs when using Azure Virtual Machines and Azure Subscriptions in general.

Azure SQL Database

  • If you have multiple databases with occasional/spiky workloads you can reduce costs by using Elastic Pools to share the pooled DTUs among your databases; see the sketch after this list. If you use Azure Advisor it will analyse your Azure SQL usage and recommend Elastic Pools if your database instances will benefit from it.
  • Keep an eye on Azure SQL service tier updates. I had some databases in the Premium tier purely based on the maximum DTUs per database required, but I was able to move some of them to the Standard tier after the recent service tier updates.
  • If you have high database performance requirements but you can tolerate a lower SLA and some downtime to recover from backups consider using the Premium RS tier which was in preview at the time of writing.
  • Investigate whether running SQL Server on Azure Virtual Machines will be more cost effective than Azure SQL Database. It depends greatly on your requirements, database size and utilisation but keep the following in mind:
    • If you are using SQL for dev/test you can use SQL Server Developer Edition and avoid SQL licensing costs.
    • If your production databases are very small you could get away with using SQL Express but keep the other limitations in mind.
    • You can bring your own licence for SQL running on Azure VMs.
    • If you have high availability requirements, using Azure SQL is much less effort and cost than running VMs. Azure SQL comes with a 99.99% uptime guarantee. VMs have a 99.95% uptime guarantee if you have multiple instances in an availability set, plus you have to replicate to another set of VMs in a second Azure region and pay for multiple SQL licences.
    • If your database is only in use for a portion of the day you can switch the virtual machine hosting it off after hours.
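
As a rough illustration of the Elastic Pool point, here is a hedged Azure CLI sketch that creates a pool on an existing server and moves a database into it. The names are made up, and the capacity flag differs between CLI versions (--dtu in older versions, --capacity in newer ones):

# Create an elastic pool sharing 100 DTUs among its databases
az sql elastic-pool create --resource-group sql-rg --server mysqlserver --name mypool --dtu 100
# Move an existing database into the pool so it draws from the shared DTUs
az sql db update --resource-group sql-rg --server mysqlserver --name mydb --elastic-pool mypool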

Azure Storage

  • Premium Disk and Managed Disk storage is charged per provisioned disk size, not disk space used, while Standard storage is charged by disk space used.
  • Standard disks incur an access cost measured in batches of 10,000 IO transactions, but Premium Disks and Managed Disks don’t.
  • You can combine disks to achieve higher IOPS. Weigh up the cost and performance of Premium Storage Disks versus multiple Standard Storage Disks especially if you need lots of space.
  • If your data will not be accessed very often consider using Azure Cool Storage.
  • The default data redundancy setting for Standard Storage is read-access geographically redundant, which means you pay for a second copy in another Azure region. For dev/test storage you may not need that level of redundancy since Microsoft keeps 3 copies in the data center anyway, and you can rather use locally redundant storage; see the sketch after this list.
  • Delete unused VHD files and Managed Disks. When you delete a VM its disks and underlying VHD files are not deleted. Same goes for VMs using Managed Disks although in this case there is no VHD file visible to you, just the Managed Disk to delete.
  • Don’t create a VM just to host a file share, rather use Azure Files.
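
To illustrate the redundancy and access tier points, here is a short Azure CLI sketch that creates a locally redundant, cool tier storage account for infrequently accessed dev/test data (the names are hypothetical):

# Standard_LRS avoids paying for the geo-redundant copy in a second region
# The Cool access tier trades higher access costs for cheaper storage of rarely read data
az storage account create --resource-group storage-rg --name mydevteststorage --location westeurope --sku Standard_LRS --kind BlobStorage --access-tier Cool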
 
Francois Delport

Managing Azure Costs

Managing Azure costs is an important but sometimes overlooked aspect of cloud utilisation. Many customers wrongly believe moving to the cloud will automatically be cheaper than hosting it yourself. The truth is the cloud can be cheaper, but it depends on your requirements and on managing your costs. There is a wide spectrum of remedies to lower your costs, ranging from simple non-technical steps to rearchitecting applications for serverless computing. In this post I’m going to cover some quick wins you can achieve when using virtual machines.

Managing Azure Costs

To reduce your Azure costs you have to measure them first. There are a few ways to view Azure costs depending on whether you are using an Enterprise Agreement, a CSP partner or a Visual Studio subscription. In the Azure Portal under the Subscriptions blade you will find some basic information for the current month, and in the Cost Analysis section you can filter on different resource groups, types and time periods. In the new Cost Management + Billing blade you can sign up for Cloudyn cost management with more detailed analysis. Eventually it will be folded into the Azure Portal, but for the moment you are directed to an external website where you have to sign up for a trial that is free until June 2018. Enterprise Agreement customers can use the EA portal and PowerBI to analyse their bill.

First Step

The easiest first step you can take is to use the Azure Advisor blade to display any cost recommendations. It is very basic but can provide information around virtual machine underutilisation and Azure SQL databases that can benefit from elastic pools. While you are there, also take a look at the security and performance recommendations.

Azure Virtual Machines

Here are a few things to keep in mind to manage your Azure costs when using Azure Virtual Machines:

  • Newer generation virtual machines can sometimes be cheaper. Take for example D* v3 and D* v2 machines: taking into account that there is a small difference in RAM and temporary storage, v3 is cheaper. It was a similar situation when D* v1 was superseded by D* v2.

  • If you have Azure Batch jobs that are not time critical and can safely be interrupted and resumed, Azure Batch low priority virtual machines can offer a good discount, link.
  • If you are running workloads that occasionally consume high CPU cycles, Azure B series virtual machines could be cost effective, link. In short you build up credits when CPU utilisation is low, which you then spend on bursts of high CPU utilisation.
  • Automatically shut down virtual machines when you don’t use them. It used to require a script and an automation account, but now it is available in the virtual machine blade; see the sketch after this list.

  • If you have Software Assurance you can use your existing Windows Server licenses in Azure and only pay for the base computer power you consume. You can read more about Azure Hybrid Benefit here.
  • If you are a Visual Studio subscriber using Azure for development and testing you can get a discount on various Azure services by creating Azure Dev/Test subscriptions. These subscriptions are limited to development and testing workloads, link here. Each active Visual Studio subscriber also qualifies for monthly Azure credits but you have to activate the benefit first, more info here.
  • At the time of writing Reserved Instances were not available yet, but they can also bring down costs by paying upfront for virtual machines, more info here.
  • Scale in and out by adding and removing VMs as needed rather than using larger VM instances.
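
For the shutdown point above, keep in mind a VM must be deallocated, not just stopped from inside the OS, before compute billing stops. A small Azure CLI sketch with hypothetical names:

# Deallocate after hours: releases the compute resources so you only pay for storage
az vm deallocate --resource-group dev-rg --name devvm01
# Start it again in the morning
az vm start --resource-group dev-rg --name devvm01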

Links to further posts covering Azure cost management:
Managing costs when using Azure SQL Database and Azure Storage

Francois Delport

Azure Event Grid Filters

In this post I’m taking a deeper look at Azure Event Grid filters and using them in Azure Logic Apps. Take note that Azure Event Grid was in preview at the time of writing and there were a few hiccups.

Event Data Schema

I used this quickstart from the Azure team as a base. It used this JSON file for the event data.

[
  {
    "id": "'"$RANDOM"'",
    "eventType": "recordInserted",
    "subject": "myapp/vehicles/motorcycles",
    "eventTime": "'`date +%Y-%m-%dT%H:%M:%S%z`'",
    "data": {
      "make": "Ducati",
      "model": "Monster"
    }
  }
]

You can read the full schema documentation here. The id, eventType, subject and eventTime properties are required and most of them are used internally by Azure Event Grid. The data object is for your custom data and can be any JSON object.

Filters

An event subscription can contain prefix, suffix and event type filters.

The event type filter will filter on the eventType property. You can add multiple event types separated by a semicolon; wildcards do not work. The prefix and suffix filters will filter on the subject property. You cannot add multiple values in a prefix or suffix filter and wildcards do not work.
Side Note: The Prefix filter was read-only in the Azure portal when I tested it but you could set it using the Azure CLI.

az eventgrid topic event-subscription create --name eventsubprefic --endpoint https://requestb.in/1fy6fab1 -g gridResourceGroup --topic-name testtopic5765 --subject-begins-with test

You can read the full event subscription schema here.
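
Following the same pattern, an event type filter can be added from the CLI as well. A sketch assuming the --included-event-types parameter available in the Azure CLI:

az eventgrid topic event-subscription create --name eventsubtype --endpoint https://requestb.in/1fy6fab1 -g gridResourceGroup --topic-name testtopic5765 --included-event-types recordInserted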

Using Azure Event Grid In A Logic App

I used the quick start example here as a base and used the SendGrid connector to notify me of events in a resource group. At the time of writing Azure Event Grid was still in preview, so there were some problems, which are pointed out in the documentation. You have to log in with an Azure Active Directory user or use a Service Principal connection for the Azure Event Grid connector in the Logic App designer. If you use a Microsoft Live account it won’t be able to connect to Azure.

I also had a problem accessing all the event properties in the dynamic content window; the body wasn’t showing, for instance. To work around it, switch to the expression editor and start typing the property names to see the full list. Then you can switch back to the dynamic content window and select the properties you want in the email body.

You can apply prefix and suffix filters by clicking on Show advanced options, but you cannot filter on the eventType.

If you have a requirement to filter on other properties you can do it by adding a condition statement and writing some code. The quick start I mentioned earlier shows exactly how to do this.

Francois Delport

Azure Event Grid Explained

In this post I’m going to have a look at Azure Event Grid which recently entered public preview. This is a summary I put together to wrap my head around it after reading the official documentation.

First Some Terminology

  • Event Publisher – The application/service/component raising the event.
  • Event Handler – The end consumer of the event.
  • PubSub pattern – Azure Event Grid implements the Publish-Subscribe message pattern, link here. The main benefit of this pattern is loose coupling between publishers of events and the clients consuming them. In other words the publisher of an event doesn’t have to know about each consumer of the event.
  • Topic – The endpoint for event publishers to send their events to and also the endpoint for event subscribers to subscribe to events. Also provides security, message filtering and reliable delivery.
  • Event Subscription – The mechanism used to distribute events to registered event handlers.

What does it do

Azure Event Grid is a managed event routing service. It uses a publish-subscribe model to register client web endpoints with a publisher to receive events. It is aimed at, but not limited to, serverless applications, with built-in support in Azure Functions and Azure Logic Apps. That said, it supports custom events as well and any web URL can be registered as a webhook to receive events. It is built on Service Fabric to provide scalability and resilience transparently to the end user.

How does it work

Event publishers and event handlers both connect to a Topic. The Topic provides the endpoint for event publishers to send their events to and for event handlers to register to receive events. The mechanism that connects event handlers to the specific events they want to handle is the Event Subscription. The Event Subscription contains the URL of the event handler, which it will invoke when a matching event occurs. Event Subscriptions can filter events by type, prefix filter or suffix filter. You can add multiple strings in the filter fields by separating them with a semicolon.
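
To make this concrete, here is a hedged sketch along the lines of the quick start: create a custom Topic with the Azure CLI, then post an event to it with curl. The names are made up; the aeg-sas-key header carries the topic access key:

# Create a custom topic, then grab its endpoint and one of its access keys
az eventgrid topic create --resource-group grid-rg --name mytopic --location westus2
endpoint=$(az eventgrid topic show --resource-group grid-rg --name mytopic --query endpoint --output tsv)
key=$(az eventgrid topic key list --resource-group grid-rg --name mytopic --query key1 --output tsv)
# Publish an event; registered subscribers with matching filters will receive it
curl -X POST "$endpoint" -H "aeg-sas-key: $key" -d '[{"id":"1","eventType":"recordInserted","subject":"myapp/vehicles/motorcycles","eventTime":"2017-10-01T10:00:00Z","data":{"make":"Ducati","model":"Monster"}}]'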

If you look at this sample event used in the quick start, you will see the eventType field used for the type filter.

Azure Event Support

At the time of writing Azure Event Grid already exposed some Azure events like Resource Group and Subscription CRUD events. You will find them in the Event Subscriptions blade in the Azure Portal.

Azure Logic Apps also comes with a connector for Azure Event Grid.

Francois Delport

Azure SQL Data Sync Best Practice And Tips

In this post I’m going to share some Azure SQL Data Sync best practices and tips. Most of the information comes from the official documentation, which is very comprehensive and quite a read. This is more of a summary for future me, to avoid reading through all the documentation again. You can find my two other posts on Azure SQL Data Sync here: part 1 and part 2.

Initial Synchronisation

It is best to start the sync group with only one database containing data and the others empty. If the other databases contain rows in the synced tables, every row will be treated as a potential conflict during the initial sync. This leads to back and forth traffic and extra processing to determine if there is a conflict. If rows exist with the same primary key it is a conflict, and the conflict policy will be applied to resolve it, which leads to even more back and forth traffic for every conflicting row.

Queuing Up Synchronisations

If you are using automatic sync, make sure your scheduled sync time window is greater than the time it takes to complete a sync. For example, if a sync takes 30 minutes to complete but you have it set to sync every 10 minutes, the syncs will queue up indefinitely.

Stale Databases And Sync Groups

If a database is offline for 45 days it will be considered stale and will be removed from the sync group. When you reconnect this database to the sync group it will be marked as “out of date”. To start syncing again you have to remove it from the sync group and add it again, which will trigger an initial sync that can take a very long time.

If a problem, for instance a schema mismatch, prevents the sync group from syncing successfully to all databases for 45 days, the sync group itself will become stale. There is no way to recover an out of date sync group; you’ll have to delete the sync group and create it again.

Synchronisation Loops

Sync loops are basically circular references within the same database or across databases. They usually happen when you have multiple sync groups configured to sync the same rows in a table. Every time a change from one sync group is synced to the hub, the other sync group will see it as a modified row. The second sync group will sync the same row and the first sync group will then see that as a row modification and start a sync of its own; this will go on indefinitely. The same can happen if a single database is registered with more than one agent. The tool is capable of handling databases and tables that are part of multiple sync groups, but a row should never take part in more than one sync group. If you have to use multiple sync groups, apply row filters to limit the rows for each sync group.

Azure SQL Firewall

It may seem obvious, but if you are syncing on-premise databases the sync agents have to connect to the hub and sync metadata databases in Azure. You have to configure the Azure SQL firewall on both of them to whitelist your agent IP addresses, as shown in the sketch below.
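
A quick Azure CLI sketch of whitelisting an agent IP on the logical server (the names and address are hypothetical):

# Allow the on-premise sync agent's public IP through the Azure SQL server firewall
az sql server firewall-rule create --resource-group sql-rg --server mysyncserver --name AllowSyncAgent --start-ip-address 203.0.113.10 --end-ip-address 203.0.113.10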

Troubleshooting

SqlException Error Code: -2146232060 – SqlError Number:3952 Message: Snapshot isolation transaction failed…

For Azure SQL Data Sync to work, ‘Allow Snapshot Isolation’ and ‘Read Committed Snapshot’ must be on. In my case ‘Allow Snapshot Isolation’ was off and setting it using ALTER DATABASE statements failed, but I was able to modify this setting in the properties window of the database in SQL Server Management Studio.

Francois Delport

How To Handle Schema Changes In Azure SQL Data Sync

In this post I will show you how to handle schema changes in Azure SQL Data Sync. It follows on my previous post which gave a quick overview of Azure SQL Data Sync.

How To Handle Schema Changes In Azure SQL Data Sync

Since schema changes are not handled automatically by Azure SQL Data Sync you have to perform some manual steps in the Azure Portal or automate the steps. I’m going to cover doing it manually, but the same sequence of steps can be automated using PowerShell cmdlets or direct calls to the REST API.

Refreshing The Schema

For Azure SQL Data Sync to pick up changes in the schema of Sync Group members you have to refresh the schema.

In the Azure Portal browse to your Sync Group.

Select properties and disable automatic sync to prevent errors while changes are taking place.

Select Tables and for each database in the Sync Group click on refresh schema to pick up any changes. Select the table and columns to sync and click on save.

Adding A New Column To Sync

There are two scenarios here:

  • Adding a new column that is empty or the same in all databases

In this case you create the column in all the databases with a null or default value that will result in the same rows across all the databases.

  • Adding an existing column with data that was not synced yet

In this case the column exists with possibly different data between databases. If the rows are different between databases your conflict resolution policy and client sync sequence will determine the end result. If the conflict resolution policy is set to “hub wins” the member databases will end up with the values from the hub. If the policy is set to “client wins” the last client to sync will set the values in the hub and those values will be synced to the other clients.

  1. Disable automatic sync.
  2. Add the column in all Sync Group databases.
  3. Refresh the schema for each database in the Sync Group.
  4. Enable automatic sync.

Changing the data type of a column

You can change the data type of a column if the change will not cause any data loss for instance changing int to bigint.

  1. Disable automatic sync.
  2. Change the column type in all Sync Group databases.
  3. Refresh the schema for each database in the Sync Group.
  4. Enable automatic sync.

If the change will lead to data loss you can treat it as removing and adding a new column to achieve the same outcome.

Changing the name of a column

Changing the name of a column basically involves removing the existing column and adding a new one.

  1. Disable automatic sync.
  2. Change the column name in all Sync Group databases.
  3. Refresh the schema for each database in the Sync Group and select the new column.
  4. Enable automatic sync.

NOTE: You cannot change a column if it is used in a filter.

Deleting a column

  1. Disable automatic sync.
  2. Delete the column from all databases in the Sync Group.
  3. Refresh the schema for each database in the Sync Group.
  4. Enable automatic sync.

NOTE: You cannot remove a column if it is used in a filter.

Francois Delport

Azure SQL Data Sync

In this post I’m going to take a quick look at Azure SQL Data Sync: what it does, how it works, scenarios where you would use it and recent updates.

What Does It Do

As the name implies, you use Azure SQL Data Sync to sync data between MS SQL databases via Azure. The databases can be on-premise MS SQL or Azure SQL databases. Take note that this is data only: no schema changes or other database objects, just data in tables. The data sync can be configured to run on a schedule, with the smallest recurrence being 5 minutes, or it can be a manual sync. Individual tables and columns can be selected for sync.

How Does It Do It

Azure SQL Data Sync uses a hub and spoke model. The hub in this case is an Azure SQL database, the spokes or member databases are connected to the hub. Changes will flow from a member database to the hub and then to other member databases. Azure SQL databases can connect directly to the hub but on-premise databases have to use Sync Agents which you install on-premise. When you connect a member database you can choose the sync direction:

  • bi-directional
  • to the hub
  • from the hub

To handle conflicts you select a conflict policy; the options are hub wins or client wins.

Usage Scenarios

By choosing different sync directions you can utilise Azure SQL Data Sync for a variety of scenarios:

  • Bi-directional sync for all members can be used to keep all connected databases and the hub in sync
  • Syncing to the hub for all members can be used to collect all records from member databases into the hub, for example creating a central aggregated database for reporting
  • Syncing from the hub for all members can be used to create a single source of truth in the hub. Member databases will have a local copy of the data
  • Setting one member to sync to the hub and another to sync from the hub will keep the read-only member database in sync with the transactional member database. This is handy when moving a database that is constantly being updated into Azure SQL. You can continue to use the on-prem database and make the switch with very little downtime and without a restore from backup or bulk copy procedure. UPDATE: Transactional Replication is now available for Azure SQL and is the preferred method to sync database changes in a migration scenario, link.

It is important to remember Azure SQL Data Sync will sync only data, schema changes are not handled automatically by the tool. In a future post I will dig deeper into handling schema changes.

Azure SQL Data Sync Update

Previously Azure SQL Data Sync was only available in the old Azure portal; after the update it is now available in the new Azure portal. It also received PowerShell cmdlets and REST API enhancements, where previously everything had to be done manually in the UI. The hub used to be a shared database maintained by the Azure SQL Data Sync service; users now have to provision their own database in their own subscription and use it as the hub.

Resources

The Azure SQL Data Sync team did a very good job providing comprehensive documentation, which can be downloaded here: link.

Francois Delport

Create Custom Virtual Machine Image In New Azure Portal

In this post I’ll show you how to create custom virtual machine images in the new Azure Portal for ARM Virtual Machines and Dev Test Labs. Creating custom virtual machine images from your existing virtual machines is a bit different in the new Azure portal compared to the old one. There are lots of resources showing how to do it in PowerShell, the Azure CLI and for ASM VMs, but not so much for ARM VMs in the new Azure Portal. For some reason it is not intuitive enough to just stumble upon by exploring the portal.

Create Custom Virtual Machine Image In The Portal

My first thought was to look for a capture image button on the VM blade but there isn’t one; images are now a separate resource on their own blade.

ImagesMenu

If you plan on creating multiple distinct VMs from this image you have to run sysprep before creating the image. If you really want to make exact clones of this instance you can skip sysprep. The images are created as Managed Disks and you can’t change this, so keep the associated cost in mind.
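
If you prefer scripting it, the portal steps map to a few Azure CLI calls. A sketch with hypothetical names, assuming the VM has already been sysprepped:

# Stop and release the VM, then mark it as generalised so it can serve as an image source
az vm deallocate --resource-group images-rg --name sourcevm
az vm generalize --resource-group images-rg --name sourcevm
# Capture a managed image from the generalised VM
az image create --resource-group images-rg --name myimage --source sourcevm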

Create Virtual Machine From Custom Image In The Portal

To create a VM from the custom image created earlier you have to go back to the Image blade.

CreateVMFromImage
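
The Azure CLI equivalent is passing the image name (or its resource ID) to az vm create. A sketch with hypothetical names and credentials:

# Create a new VM from the captured managed image in the same resource group
az vm create --resource-group images-rg --name newvm --image myimage --admin-username azureuser --admin-password 'MyP@ssw0rd1234'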

Create Custom Virtual Machine Image In Dev Test Labs

Firstly, Dev Test Labs are pretty awesome; have a look at them if you have to manage multiple VMs for development, testing or training labs. Creating custom images in Dev Test Labs is a bit easier: the “Create custom image” menu item is right on the VM blade.

CreateImageDevTest

You also have the option to run sysprep if you didn’t already or to skip it. Note that the VM will become unusable if you run sysprep.

RunSysPrep

To manage the existing custom images you have to open the “Configuration and policies” blade and you’ll see the “Custom Images” menu item.

ManageCustomImages

Create Virtual Machine From Custom Image In Dev Test Labs

Creating a VM from your custom image is very intuitive in Dev Test Labs: when you click on the Add button to create a VM, your custom images will be right there with the existing VM templates.

CreateFromCustomImageDevTest

The same applies when you create new Formulas, which is basically how you create new templates in Dev Test Labs.

Francois Delport