Octopus Tentacle Automated Deployment And Registration

In this post I’m going to cover Octopus Tentacle automated deployment and registration.

Recently I had a situation where I had to install and then register Octopus Tentacles in Azure using ARM templates. The Octopus server was only reachable over the public internet using its DNS name, but by default Octopus registers a tentacle using the local hostname, which wouldn’t work in this case. I couldn’t find a complete example that did exactly what I needed, so I’m posting the solution I came up with in case it is needed again.

There is some guidance here around ARM but it only covers PowerShell scripts, not ARM templates. I took this example anyway and modified it to be a DSC extension in my ARM template and it kind of worked. You can only use one instance of the DSC extension in a template since ARM will try to install the extension multiple times if you have more than one instance; I needed multiple instances and it failed. I uploaded the ARM example to GitHub anyway in case someone needs it for Octopus servers installed in Azure.

In the end I used the Azure Custom Script extension to execute a PowerShell script for the installation. To create the PowerShell script I tried to follow the examples here using Tentacle.exe but still had the problem with the wrong tentacle URL. Using the Octopus.Client dll, which was also part of the example, didn’t work either: it didn’t configure the Windows service and it assumed you knew the tentacle client thumbprint, which you don’t since this is a new installation and the tentacle will generate a new thumbprint. Eventually I got it working using a combination of the two and some extra code to retrieve the tentacle client thumbprint. The full sample is here but I will highlight the important parts.

To retrieve the tentacle client thumbprint run:

# Ask the tentacle for its thumbprint and strip the descriptive text from the output
$raw = .\Tentacle.exe show-thumbprint --nologo --console
$client_thumbprint = $raw -Replace 'The thumbprint of this Tentacle is: ', ''

It prints the thumbprint to the screen with some extra text you don’t need, so I remove the first part.

Secondly, don’t call the register-with command shown in the Tentacle.exe example since registration will be done using the Octopus.Client dll.

To add multiple roles you have to add them one at a time. To achieve this I pass the roles to PowerShell as a comma separated string:

"Web Server, Database Server"

And then I split it into an array and loop through it:

foreach($role in (New-Object -TypeName System.String -ArgumentList $roles).Split(','))
{
    # Trim removes the spaces left over from the comma separated list
    $tentacle.Roles.Add($role.Trim())
}
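
For completeness, this is roughly how the Octopus.Client part fits together. It is only a minimal sketch: it assumes a listening tentacle, that values such as $octopusServerUrl, $apiKey, $publicDnsName, $environmentName and $roles are passed into the script, and the class and property names should be checked against the version of Octopus.Client you download.

# Load the Octopus.Client assembly downloaded with the Octopus.Client package
Add-Type -Path ".\Octopus.Client.dll"

# Connect to the Octopus server over its public DNS name using an API key
$endpoint = New-Object Octopus.Client.OctopusServerEndpoint $octopusServerUrl, $apiKey
$repository = New-Object Octopus.Client.OctopusRepository $endpoint

# Describe the new machine, pointing the URI at the public DNS name instead of the local hostname
$tentacle = New-Object Octopus.Client.Model.MachineResource
$tentacle.Name = $env:COMPUTERNAME
$tentacle.Endpoint = New-Object Octopus.Client.Model.Endpoints.ListeningTentacleEndpointResource
$tentacle.Endpoint.Uri = "https://$($publicDnsName):10933"
$tentacle.Endpoint.Thumbprint = $client_thumbprint

# Add the environment and roles, then register the machine
$environment = $repository.Environments.FindByName($environmentName)
$tentacle.EnvironmentIds.Add($environment.Id)
foreach($role in (New-Object -TypeName System.String -ArgumentList $roles).Split(','))
{
    $tentacle.Roles.Add($role.Trim())
}
$repository.Machines.Create($tentacle)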

After it is all done you will end up with a new machine in the desired environment with the correct URL.


Francois Delport

Moving To Azure Active Directory

I recently moved from using Microsoft Live logins to Azure Active Directory at work and this is a brain dump of the process. But first some theory and history around Azure directories, accounts and subscriptions.

Microsoft Live Based Logins
Most people start exploring Azure using their Microsoft Live account to sign up for the service. This results in a subscription where you are both the account administrator and the service administrator, with a default directory for your account in Azure called <username>.onmicrosoft.com. You can add more subscriptions to your account and by default they will share the default directory and the same account and service administrator. Account administrators control billing settings and can assign administrator access; service administrators control resources in the subscription and can grant administrator access but not change billing settings; co-administrators can control resources but can’t grant access to other users or change billing settings. You can change the service administrator for a subscription, but to change the account administrator you have to transfer the subscription. When you add more users to the subscription they become co-administrators, which doesn’t give you a granular way to control access to resources since it is all or nothing. All resources are children of the subscription and when you delete the subscription all its resources are also deleted. This used to be the only way to access Azure.

Azure Active Directory Based Logins
The newer and better way to handle access to Azure is Azure Active Directory. You can create multiple directories and they are independent of a subscription, but a subscription trusts one and only one directory. You can add users from your organisation to the Azure Active Directory using their corporate credentials; this can be done manually or you can configure synchronisation with your on-premises Active Directory. You can also add users using their Microsoft Live account, or users from a third party or another Azure Active Directory. When you add users to the directory they are assigned a security role that controls what they can do in the directory, but not what they can do to resources in Azure. You can create your own groups and add your users to these groups in Azure. Using RBAC you assign users or groups to roles at resource group level to control access to Azure resources, and you can create your own custom roles to gain finer control over resources. The users you add to a directory won’t have a subscription of their own but will have access to the resources in subscriptions that trust the directory they are members of. You will not be able to log in to the old Azure portal with your Azure Active Directory credentials. Azure Active Directory Domain Services is currently in preview, enabling you to join machines to your Azure Active Directory and perform LDAP queries just like on-premises domains.

How To Move From Microsoft Live Accounts To Azure Active Directory
These are the steps I followed; they worked for me and didn’t disrupt any of the existing resources in Azure. At the moment I’m not syncing from an on-premises Active Directory.

  1. Change the name of your default directory to something meaningful since it will appear on every user’s subscription tab.
  2. In your directory create a new custom domain matching the company’s corporate domain and change this domain to be the primary domain for the directory. If you leave out this step your users will end up in your default domain. Note: you have to prove that your company owns the domain by adding TXT or MX records to your DNS entries and then verifying the domain in the Azure portal.
  3. Create users for yourself and other administrators in the directory and add them as global administrators to the directory.
  4. Change your subscriptions to trust this directory. You have to do this in the old portal under the Subscription -> Manage Subscriptions/Directory menu. If it is read-only you probably don’t have service administrator access on the subscription. When you change the directory your existing co-administrators will be removed from the subscription but you can easily add them back again afterwards. The co-administrators and service administrators will be automatically added to the owners group of the subscription in the new portal.
  5. Using the new portal you can add users to roles at the subscription level, for instance I added all the other administrators to the Owner role giving them access to all resources in the subscription. You can give users access to specific resource groups; you don’t have to add them to any roles at the subscription level before setting roles on a resource in the subscription.
  6. To assign access to resource groups, open the resource group blade, click on the users button and add users or groups to roles. Make sure you choose the correct groups since there are different groups for classic and ARM deployments. Most resources have a users button on their blade to control access, not just at the resource group level. The same assignments can also be scripted, see the sketch after this list.
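
As a side note, the same kind of role assignment can be scripted with the ARM PowerShell cmdlets. This is just a rough sketch; the user, role and resource group names are made up for illustration:

# Sign in with an account that is allowed to grant access
Login-AzureRmAccount

# List the built-in and custom roles available to assign
Get-AzureRmRoleDefinition | Select-Object Name, Description

# Give a directory user the Contributor role on a specific resource group
New-AzureRmRoleAssignment -SignInName "jane@yourcompany.com" -RoleDefinitionName "Contributor" -ResourceGroupName "MyResourceGroup"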


Francois Delport

Unit Testing PowerShell With Pester

In this post I’m going to cover unit testing PowerShell with Pester.

As your PowerShell scripts become more complicated it can be difficult and time consuming to test them manually. This is especially true when your script relies on external resources like Active Directory or Azure infrastructure. Pester is a framework specifically designed to unit test PowerShell scripts. It enables mocking, assertions and running tests using NUnit or the Visual Studio test explorer, among others.

How to install it
In PowerShell 5.0 and greater you can install the Pester module by running:

Install-Module Pester

If for some reason you are unable to do that, you can manually download it from GitHub and copy it to a folder that is in your $ENV:PSModulePath environment variable, or use Chocolatey:

Invoke-Expression ((New-Object Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))
choco install pester -y

To confirm that the module is installed run:

Get-Module -ListAvailable

Authoring Tests
I highly recommend using Visual Studio with PowerShell Tools installed for your PowerShell projects; this demo will be using it. In VS create a PowerShell Script Project and add a PowerShell test to it using the PowerShell Test template.


The VS test explorer will pick up your PowerShell tests, the same as it does for other unit tests. Pester has a naming convention to follow for the tests and scripts: the test file should have the same name as the script under test but with .Tests before the .ps1 extension. If you don’t follow the convention the tests will show as greyed out in the VS test explorer.


The structure of a test
At the top of each test file you have to dot source the script file that will be tested.

# note the leading dot and space used to dot source the script
. .\PathTo\ScriptFiletoTest.ps1

In the demo project this is done using logic based on the naming convention. The structure of the actual test looks like this:

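A minimal sketch of what such a test file could look like, assuming a hypothetical MyScript.ps1 that defines a Get-Answer function; the names are made up for illustration:

# MyScript.Tests.ps1 - dot source the script under test based on the naming convention
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.', '.'
. "$here\$sut"

Describe "Get-Answer" {
    Context "When called with no parameters" {
        It "Returns 42" {
            Get-Answer | Should Be 42
        }
    }
}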

The Describe block groups your tests together and its name is also what you will see in the VS test explorer. Context creates a scope for variables and mocked objects. The It block contains the code for your test and at least one assertion, or else the test will not run.

Mocking And Assertions
You can mock calls to your own functions as well as calls to PowerShell modules. For example if you want Get-Date to return a specific date in your test you can do this:

Mock Get-Date { New-Object DateTime (2000, 1, 1) }

$date = GetMyOwnDate
It "It Should Be The Year 2000"{
    ($date).year | Should be "2000"
}

If you want to verify that a method is called with specific parameters you can do this:

Mock Myfunction {} -Verifiable -ParameterFilter {$myparam -eq $folder}
...
Assert-VerifiableMocks

I have a sample project on GitHub to show some of the functionality of Pester, but read the wiki on their GitHub repo to get the full picture.

Some Errors I Encountered And Tips

  • The NuGet package didn’t work for me since NuGet doesn’t understand PowerShell projects.
  • If your tests don’t run and you get a yellow exclamation mark next to the test in test explorer, one cause could be that there are no assertions in your test “It” block.
  • I also tried mocking some ActiveDirectory calls on a Windows 10 machine. To get the PowerShell modules you have to install Remote Server Administration Tools for your client OS.
  • If you want to mock the calls inside another module instead of just mocking your calls to those functions follow this guide.

Francois Delport

Practical Tips And Tooling For Azure Resource Manager

Moving to Azure Resource Manager (ARM) can look daunting, especially when you are staring at a 1000+ line JSON ARM template that just failed deployment, but once you get into it you will see it is not so bad. In this post I’m going to share some practical tips and tooling for Azure Resource Manager.

To read more about ARM and why you should use it instead of the classic Azure API, take a look at this article for the official version. In my environment the main drivers for moving to ARM were RBAC and ARM deployment templates that make deployments to complex environments easier. Moving to ARM should not be an automatic decision just because it is new; you should evaluate the benefits first. At the time of writing this post there were no plans to discontinue the classic deployment model yet, so you still have time. That said, for new projects I would suggest you use ARM, and for existing ones look at whether you gain anything first before you migrate since it is not a trivial task for a large environment and a large script library.

Exploring The ARM API
If you use Azure PowerShell v1.0 or greater you will find the ARM cmdlets use the {verb}-AzureRM* naming convention to differentiate them from the classic cmdlets, and you can use Get-Help {cmdletname} -Detailed to explore what each one does. If you use the REST API directly you can use the ARM Explorer website to explore it against your own subscriptions; it is also useful for performing ad hoc tasks when scripting is not necessary. ARM uses providers to manage the resources you can deploy, and when you open ARM Explorer you can browse the list of providers registered with your subscription, for example there are providers for compute, networking, websites etc.
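
For example, to get a feel for what is available you can list the ARM cmdlets and read the help for one of them; the cmdlet named below is just an example:

# List the ARM cmdlets that follow the {verb}-AzureRM* naming convention
Get-Command -Name *-AzureRm* | Sort-Object ModuleName, Name

# Read the detailed help for a specific cmdlet
Get-Help New-AzureRmResourceGroupDeployment -Detailed

# See which resource providers are registered with your subscription
Get-AzureRmResourceProvider | Select-Object ProviderNamespace, RegistrationState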

Authoring ARM Templates
ARM templates make it easier to deploy complex environments using a declarative model to describe the resources that should be in the environment. The ARM deployment engine is clever enough to deploy the resources that are not there and to ignore the ones that already exist; it is like DSC for Azure. The templates are JSON and, as we know, a large JSON document is not the easiest thing to edit. Luckily you can use Visual Studio to make life easier: install the Azure SDK and add a new Azure Resource Group project to your solution. When you edit the JSON deployment templates and parameter files there are schemas to validate the documents, IntelliSense popups and the editor has some smarts around resources.


There is a large library of quick start templates on GitHub at this link. You can also export a template representing the resources currently in a resource group by opening the resource group in the portal and selecting export under the settings section, official blog post here. This can also be accomplished from PowerShell using the Save-AzureRmResourceGroupDeploymentTemplate cmdlet.
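
A rough sketch of the PowerShell route, assuming an existing resource group called MyResourceGroup with a previous deployment named MyDeployment; check the parameters against your version of the Azure PowerShell module:

# Save the template of a previous deployment in the resource group to a local file
Save-AzureRmResourceGroupDeploymentTemplate -ResourceGroupName "MyResourceGroup" -DeploymentName "MyDeployment" -Path "C:\Temp\MyDeployment.json"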

Side Note: You can also use the ARMViz webapp to visualise and edit ARM templates.

Deploying ARM Templates
Deploying your template is also easy: right click on the project in Visual Studio and choose Deploy. It will show a popup to choose the subscription, resource group, template and parameter files etc.

If you used the Visual Studio ARM project you will see it contains a PowerShell script called Deploy-AzureResourceGroup.ps1 that runs the ARM deployment. Under the hood the New-AzureRmResourceGroupDeployment cmdlet runs the deployment, and if you want to validate your template first you can use the Test-AzureRmResourceGroupDeployment cmdlet.
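
A short sketch of doing the same from PowerShell, with made up file and resource group names:

# Validate the template and parameter file without deploying anything
Test-AzureRmResourceGroupDeployment -ResourceGroupName "MyResourceGroup" -TemplateFile ".\Templates\azuredeploy.json" -TemplateParameterFile ".\Templates\azuredeploy.parameters.json"

# Run the actual deployment
New-AzureRmResourceGroupDeployment -Name "MyDeployment" -ResourceGroupName "MyResourceGroup" -TemplateFile ".\Templates\azuredeploy.json" -TemplateParameterFile ".\Templates\azuredeploy.parameters.json"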

TroubleShooting Tips:

  • You can set the log level of the deployment and retrieve the detailed logs afterwards, more here; there is a sketch of this after the list.
  • The Templates blade in the new portal can run ARM templates; it performs more validation before executing and gives more descriptive error messages when failures occur than PowerShell or Visual Studio deployments.
  • You can view the audit logs and errors for your deployments in the Audit Logs blade in the new portal.
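
Here is a rough sketch of the logging approach from the first tip, using made up deployment and resource group names:

# Deploy with detailed debug logging of the requests and responses
New-AzureRmResourceGroupDeployment -Name "MyDeployment" -ResourceGroupName "MyResourceGroup" -TemplateFile ".\Templates\azuredeploy.json" -DeploymentDebugLogLevel All

# Afterwards retrieve the individual operations and their status messages
Get-AzureRmResourceGroupDeploymentOperation -DeploymentName "MyDeployment" -ResourceGroupName "MyResourceGroup" | Select-Object -ExpandProperty Properties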

Francois Delport

Creating Reports And Formatting Output In PowerShell

In PowerShell there are a few ways to format your output; in this post I’m going to tie together a few ideas around creating reports and formatting output in PowerShell.

Formatting output for the console
The Format-* cmdlets are used to format output and work best for the console window or any display using a fixed width font. You can store the formatted output in a variable and, for instance, write it to a text file and keep the formatting. If you want to iterate over the lines of the formatted text you can use the Out-String cmdlet to convert it to an array of strings. For example, Format-Table produces neatly aligned, fixed width columns:

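Here is a small example using the current directory as sample input; the file and folder names are just placeholders:

# Format a directory listing as a fixed width table and keep the formatting in a variable
$report = dir | Format-Table -Property Name, Length, LastWriteTime -AutoSize | Out-String

# Write the formatted text to a file, the alignment is kept
$report | Out-File C:\Temp\report.txt

# Or convert the formatted output to an array of strings to iterate over the lines
dir | Format-Table -Property Name, Length, LastWriteTime | Out-String -Stream | Where-Object { $_.Trim() -ne '' }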

There is also Format-List to list item properties per line and Format-Wide, among others; play around with them to see how the output looks.

Converting output to HTML
If you need output fit for reporting or email purposes you have to convert it to HTML. Luckily this functionality is already in PowerShell with the ConvertTo-* cmdlets, specifically the ConvertTo-Html cmdlet for HTML. Obviously the default HTML has no styling and looks pretty dull, but you have some control over the HTML generated: you can add CSS to the HTML by specifying a custom head section. The script below uses an inline style sheet to keep it with the script but you can point to external CSS files as well.

$css = "<title>Files</title>
<style>
table { margin: auto; font-family: Arial; box-shadow: 10px 10px 5px #777; border: black; }
th { background: #0048e3; color: #eee; max-width: 400px; padding: 5px 10px; }
td { font-size: 10px; padding: 5px 20px; color: #111; }
tr { background: #e8d2f3; }
tr:nth-child(even) { background: #fae6f5; }
tr:nth-child(odd) { background: #e9e1f4; }
</style>"

dir | Select-Object -Property Name,LastWriteTime | ConvertTo-Html -Head $css | Out-File C:\Temp\report.html

The result is a styled HTML table listing the file names and last write times.


Convert output for data transfer purposes
If you need the output to be structured for data transfer or machine input purposes you can convert it using the ConvertTo-Json/Xml/Csv cmdlets. JSON and CSV are pretty straightforward but XML needs some explaining. By default the output will be an XML document object and to get to the actual XML you have to use the OuterXml property of the XML document. You can change the output mechanism by using the -As {format} parameter that takes the following parameter values:
String: Returns the XML as a single string.
Stream: Returns the XML as an array of strings.
Document: The default, which returns an XML document object.
You can control the inclusion of type information in the XML using the -NoTypeInformation parameter. If you have a requirement to keep the generated output as small as possible you can use JSON and pass in the -Compress parameter to compress the output.
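
A short sketch of the conversions described above, again using a directory listing as sample data:

# By default ConvertTo-Xml returns an XML document object, use OuterXml to get the markup itself
$doc = dir | Select-Object Name, LastWriteTime | ConvertTo-Xml -NoTypeInformation
$doc.OuterXml

# Return the XML as a single string instead of a document object
dir | Select-Object Name, LastWriteTime | ConvertTo-Xml -As String -NoTypeInformation

# Compact JSON for when the payload size matters
dir | Select-Object Name, LastWriteTime | ConvertTo-Json -Compress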

Francois Delport