How To Configure Docker On Windows

I recently started playing around with Docker on Windows and one of the first hurdles was figuring out how to configure it. Since I’m completely new to containers I’m going to share what I figured out in case it helps someone else.

I followed this quick start from Microsoft which actually works very well. I used the server version, but heads up if you want to use Docker with Windows 10: Docker requires Hyper-V, and if you use VMware Workstation you can’t have both installed at the same time.

There is a ton of information about Docker for Windows on their website but I wanted to highlight a few practical bits which most people would need to get started.

How to configure Docker on Windows

By default the Docker executables are located in “C:\Program Files\Docker” and its data is stored in %programdata%\docker. You’ll see the default container, image and configuration directories are here as well. When you install Docker it will be running with default values. To change the defaults you have to create a configuration file %programdata%\docker\config\daemon.json. The full configuration reference is here. The part I was interested in was changing the location where containers and images are stored, to get them off my OS disk. The setting turned out to be “data-root” (called “graph” in older Docker versions).

{"data-root": "D:\\Data\\Docker\\"}

Take note this changes the root directory for Docker data, not just the containers and images.
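As a minimal sketch, assuming the default install paths and an elevated PowerShell prompt, you could create the file and restart the Docker service like this (the target drive and path are just examples):

# Create the config directory and daemon.json (adjust the data-root path to taste)
New-Item -ItemType Directory -Path "$env:ProgramData\docker\config" -Force | Out-Null
'{"data-root": "D:\\Data\\Docker\\"}' | Set-Content "$env:ProgramData\docker\config\daemon.json"

# Restart the Docker service so it picks up the new configuration
Restart-Service docker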

How To Get Clean Windows Images

If you ran the quick start mentioned earlier, you’ll see it runs a container using the Nano Server image with a sample application. I firstly wanted an empty Windows image to play around with, and secondly I wanted to run full .NET applications, which require a Server Core image. The commands to pull the images from Docker Hub are:

docker pull mcr.microsoft.com/windows/servercore:1903

docker pull mcr.microsoft.com/windows/nanoserver:1903

You can explore the Docker Repository to see which other images are available, link.

Exploring The Container Environment

I wanted to see what you get in a container and play around a bit. To start a new container interactively run:

docker run -i mcr.microsoft.com/windows/nanoserver:1903

docker run -i mcr.microsoft.com/windows/nanoserver:1903 powershell.exe

In the second instance the container will run PowerShell.exe after it starts; in the first you can run it from the command prompt yourself. In a future post I’ll dive into the isolation and integration between the host and container in more detail. For now I could see you get your own filesystem, network, installed Windows features and registry, but you get the users from the host.

Note: These Docker Hub repos do not maintain a “latest” tag, so you have to specify the image tag, for example 1903. The host Windows OS version can’t be lower than the container version. When you run a container that is an older version than the host it will use Hyper-V isolation. I was on Windows 10 version 1909 and I was able to run Nano Server 1903 and Server Core ltsc2019 but not 20H2, more info here.
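If you want to force Hyper-V isolation yourself, docker run has an isolation switch; for example, to run an older Server Core image on a newer host:

docker run -it --isolation=hyperv mcr.microsoft.com/windows/servercore:ltsc2019 cmd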

Francois Delport

Azure Enterprise Agreement Billing API And Billing Updates

In this post I’m going to have a look at the Azure Enterprise Agreement Billing API and billing updates. The updates include the option to receive invoices via email and new PowerShell cmdlets to retrieve invoices.

Emailing Invoices

You can opt in to receive your monthly invoice via email, which beats logging into the portal to download them every time. You’ll find the options to configure recipients on the Invoices menu in the Subscriptions blade.

[Screenshot: Invoices blade]

This feature is only available to consumer Azure subscriptions.

PowerShell Invoice Cmdlet

The Get-AzureRmBillingInvoice cmdlet, which provides the ability to download invoices, is currently in preview. Before you can successfully use the cmdlet you have to enable access to your invoices in the Subscription blade of the Azure Portal.

[Screenshot: enabling invoice access for the cmdlets]

By default users with Subscription administration access will be able to retrieve invoices. You can grant other users access by assigning the Billing Reader role to users from the Access Control menu on the Subscription blade.

[Screenshot: assigning the Billing Reader role]

Calling the Get-AzureRmBillingInvoice cmdlet doesn’t return the actual invoice but rather an Invoice object that contains, amongst others, a DownloadUrl property that you can use to download the invoice. Note the URL is valid for 1 hour.

Login-AzureRmAccount
$inv = Get-AzureRmBillingInvoice -Latest
Invoke-WebRequest -Uri $inv.DownloadUrl -OutFile C:\Temp\Invoice.pdf

This feature is only available to consumer Azure subscriptions.

Invoice Retrieval For Enterprise Agreement Customers

If you are an EA customer the above method won’t work for you, but you can download your usage and charges using the Billing REST API for EA customers. In some respects this is easier for EA customers since the dataset returned contains both usage and charges, so you don’t have to calculate them separately. Before using the API you have to get your API key from the EA portal.

[Screenshot: API key in the EA portal]

You can find out more about using the API in this Channel 9 video, which also contains this link in the show notes that describes the API.
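As a rough sketch of calling the EA billing API from PowerShell (the enrollment number and API key are placeholders, and the exact endpoint and API version may differ for your enrollment, so verify against the documentation above):

$enrollment = "1234567"                      # your EA enrollment number (placeholder)
$apiKey = "<API key from the EA portal>"     # placeholder
$uri = "https://consumption.azure.com/v2/enrollments/$enrollment/usagedetails"

# The EA API key is passed as a bearer token
$usage = Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $apiKey" }
$usage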

PowerBI Integration With The EA Portal

You can also export your billing data to Power BI from the EA portal using the Power BI Reporting tab on the Reports menu. This functionality is provided by the Power BI Enterprise Pack. Although it is a manual process to export the data the first time, you can schedule the dataset to refresh automatically (more here) and you can subscribe to report emails. It is in preview so the functionality is still evolving, more here.

Retrieve Usage Using PowerShell Cmdlets

The Get-UsageAggregates cmdlet is not a new feature but I didn’t get a chance to cover it yet. You can use it to retrieve resource usage; take note that the cmdlet makes use of a continuation token since the dataset can be quite large if you download detailed usage. Your billing charges can be calculated by retrieving the rates separately using the RateCard API and matching them to your usage, I covered the RateCard API in a previous post. The Get-UsageAggregates cmdlet is available to consumer Azure subscriptions and Enterprise Agreement subscriptions but the RateCard API is for consumer Azure subscriptions only.
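A minimal sketch of paging through the results with the continuation token (the dates and granularity are just examples):

Login-AzureRmAccount

$params = @{
    ReportedStartTime      = "2017-03-01"
    ReportedEndTime        = "2017-03-31"
    AggregationGranularity = "Daily"
    ShowDetails            = $true
}

# First page, then keep requesting pages while a continuation token is returned
$page = Get-UsageAggregates @params
$usage = @($page.UsageAggregations)
while ($page.ContinuationToken) {
    $page = Get-UsageAggregates @params -ContinuationToken $page.ContinuationToken
    $usage += $page.UsageAggregations
}
$usage.Count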

Francois Delport

Visual Studio 2017 Installer New Features


At a high level the installation has been broken down into smaller chunks called workloads. You select which workloads you want to use and only the components required for the selected workloads will be installed. If you want to you can still fall back to the old method of installing individual components. The Visual Studio setup is available as an on-line installer only. You can’t download the ISO from MSDN or the Microsoft website any more. You can create your own off-line installer but more about that later in the post.

The Package Cache

For the installer to work and also perform maintenance tasks like updating and repairing your Visual Studio (VS) installation, the VS installer keeps downloaded packages in a package cache. By default the package cache is in:

"%ProgramData%\Microsoft\VisualStudio\Packages"

At the time of writing the ability to disable or move the package cache was in preview, instructions here. Using the VS installer you can disable the package cache, but that has some nasty side effects: every package the VS installer needs for updating or repairing your installation will be downloaded from the internet and deleted again after the installation is done. You can also specify the location to use for the package cache in the registry. If you create the registry key before you start the installation the VS installer will place the packages in the specified location. For existing installations you have to manually move the packages to the new location.
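A sketch of creating that registry value before installation (the key and value names are what the preview instructions describe, so double-check them against the linked docs; the target path is a placeholder and the commands need an elevated prompt):

# Point the VS installer at a different package cache location
New-Item -Path "HKLM:\SOFTWARE\Microsoft\VisualStudio\Setup" -Force | Out-Null
New-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\VisualStudio\Setup" -Name "CachePath" `
    -Value "D:\VS\Packages" -PropertyType String -Force | Out-Null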

Creating An Offline Installer

You can create an offline installer using the --layout switch of the VS setup exe:

vs_enterprise.exe --layout c:\vs2017offline [--lang en-US]

The full VS 2017 download is quite large but you can save some space by specifying which language you want to use. By default it will download all of them. Keep in mind the VS installer will download files to your AppData temp folder before moving them to the layout folder. I ran into a problem with one installation where the installer was caching the whole installation in the AppData temp folder on my C: drive even though I was creating the offline installer on a different drive. I was unable to determine the cause but keep that in mind if you don’t have enough free space on your C: drive. To install VS using the offline installer you have to install the certificates used to verify the downloaded packages, instructions here.

Updating The Offline Installer

By default VS will look for updates online even if it was installed from an offline installer but you can force VS to look for updates in the offline installer folder by changing its ChannelURI. You have to edit the response.json file in your offline installer directory and change the ChannelURI value to point to the channelManifest.json file in your offline installer folder, full instructions here.
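For reference, the relevant part of response.json ends up looking roughly like this; the file contains other settings as well and the exact property names may differ by VS version, so treat this as a sketch and follow the linked instructions:

{
  "channelUri": "C:\\vs2017offline\\ChannelManifest.json"
}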

To update the offline installation folder you can run the VS installer with the --layout parameter, pointing it to the existing offline folder. The VS installer will download any new packages to the folder and update the channelManifest.json file. VS installations whose ChannelURI is pointing to the offline installer will pick up the updates from the folder instead of downloading them. At the time of writing the VS installer didn’t purge old packages from the layout folder, it only added new ones, so the folder can grow significantly. I guess it makes sense since older VS installations would still need the older packages to repair installations or add features.

Setup Tools

There are also some new tools to detect Visual Studio installations instead of rummaging through the registry.

  • The VSSetup Powershell Module can be installed from the PowerShell gallery using:
Install-Module VSSetup
  • VSWhere exe can be downloaded from the GitHub page here.
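A quick sketch of using both to find the install path of the latest instance (switch and property names are from the tools’ documentation):

# VSSetup module: list instances and pick the latest
Install-Module VSSetup -Scope CurrentUser
Get-VSSetupInstance | Select-VSSetupInstance -Latest | Select-Object InstallationPath

# vswhere.exe: same idea from the command line
.\vswhere.exe -latest -property installationPath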

Francois Delport

Application Insights Customisation And OMS Integration

In this post I’ll be taking a look at Application Insights customisation and OMS Integration.

Custom Dashboards

You can pin the out-of-the-box charts to your Azure Portal dashboard by clicking on the pin in the top right corner. Analytics query charts can also be pinned using the same method but they are pinned to shared dashboards. Shared dashboards exist as an Azure resource in a resource group and you can use RBAC to control access to it.

[Screenshot: pinned charts on an Azure Portal dashboard]

Alerts

You can create alerts based on the telemetry in Application Insights; you will find the Alert Rule button in the Metrics Explorer blade.

[Screenshot: Alert Rule button in the Metrics Explorer blade]

Currently you can only create alert rules based on out-of-the-box telemetry, not from custom events or analytics queries. The good news is the feature is in preview so it should be available soon, link to uservoice.

Correlating Entries And Custom Properties

By default Application Insights populates the operation_id for every web request and propagates that operation_id to dependencies that it is able to trace out-of-the-box, for example SQL Server queries and WCF calls to HTTP endpoints. The example below is for SQL Server queries joined to web requests.

[Screenshot: Analytics query joining SQL Server dependency telemetry to web requests]

If you have a dependency that Application Insights can’t trace automatically, or you have multiple levels of dependencies, you have to provide your own solution to propagate the operation_id or your own contextual identifier, like a customer id. You can do this by creating a TelemetryInitializer to add your custom id or to grab the web request id and pass it along to the other services, example here.

OMS Integration

You can view and query your Application Insights telemetry in OMS by installing the OMS Application Insights solution from the OMS portal and configuring which applications you want to connect from Application Insights.

[Screenshot: connecting applications to the OMS Application Insights solution]

You can connect multiple applications from different subscriptions, which makes it easy to create a consolidated view. You can correlate Application Insights telemetry with other data sources in OMS, like infrastructure telemetry, making it easier to pinpoint the cause of slow application response.

VSTS Integration

You can install Application Insights widgets on your VSTS dashboards to surface Application Insights data in VSTS, link to marketplace.

PowerBI Integration

There is a content pack to create PowerBI dashboards with telemetry from Application Insights, link here. The content pack comes with a set of built-in dashboards. If you need a custom dashboard you can export Analytics Queries to PowerBI as explained here.

Custom Telemetry

Contextual logging and telemetry that makes sense in a business context, for instance orders completed per second or aborted shopping carts, can be a very powerful tool to get useful information out of logs and to track problems related to a specific customer interaction. To achieve this you can add your own telemetry to Application Insights by writing custom events and logging exceptions, link here. You can also have your favorite logging library write to Application Insights, examples here.
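Custom events are normally written from your application code using the SDK, but as a rough PowerShell sketch of the same TelemetryClient calls (the DLL path and instrumentation key are placeholders, and this assumes the Microsoft.ApplicationInsights NuGet package is available locally):

# Load the Application Insights SDK and create a client (path is a placeholder)
Add-Type -Path "C:\Packages\Microsoft.ApplicationInsights\lib\net45\Microsoft.ApplicationInsights.dll"
$client = New-Object Microsoft.ApplicationInsights.TelemetryClient
$client.InstrumentationKey = "<your instrumentation key>"

# Business-context telemetry: a custom event and an exception
$client.TrackEvent("OrderCompleted")
$client.TrackException((New-Object System.Exception "Shopping cart aborted"))
$client.Flush()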

Francois Delport

Adding Application Insights To Existing Web And Service Applications

In this post I’m going to take a quick look at adding Application Insights to existing web and service applications. I’m going to highlight a few things I discovered and explore the difference between the two approaches.

Adding Application Insights During Development Or Post Deployment

There are two ways to add Application Insights to your application:

  • During development you can use the Application Insight SDK and wizard in Visual Studio to add it to your web application.
  • After deployment you can enable Application Insights using the Status Monitor application or PowerShell.

The web application I’m working on is tied to Visual Studio 2012, and Visual Studio 2013 or higher is required for the wizard in Visual Studio, hence the investigation into adding Application Insights to deployed applications. Enabling Application Insights post-deployment can also be handy for web applications where you don’t have access to the source or you are not allowed to make any changes. I followed the instructions here.

Some Difference Between The Two Methods

  • When you add Application Insights to your project using the wizard in Visual Studio it will install additional NuGet packages, references and configuration settings to send telemetry data to Azure. For deployed applications you have to install the Status Monitor application to configure Application Insights, or use PowerShell. Application Insights will keep its configuration in an ApplicationInsights.config file in your web app root folder, some NuGet packages in your App_Data folder and additional dlls in the bin folder.

[Screenshot: ApplicationInsights.config and Application Insights binaries in the deployed site]

The configuration file, dlls and packages can be lost when you redeploy.

Side Note: You can use Application Insights locally to view telemetry data in Visual Studio without configuring Application Insights in Azure or sending any telemetry data to Azure.

  • Application Insights can monitor client side events if you add some JavaScript snippets to your pages; it can provide more detailed telemetry, and you can create custom telemetry entries using the SDK. For deployed applications you only get the telemetry that IIS and .NET provide by default.

How To Add Application Insights Post Deployment Using PowerShell

Obviously losing your Application Insights configuration after deploying is not an ideal situation; luckily you can easily script the configuration using PowerShell.

Import-Module 'C:\Program Files\Microsoft Application Insights\Status Monitor\PowerShell\Microsoft.Diagnostics.Agent.StatusMonitor.PowerShell.dll'
#optional
Update-ApplicationInsightsVersion    

Start-ApplicationInsightsMonitoring -Name appName -InstrumentationKey {YourInstrumentationKeyFoundInTheAzurePortal}

#optional
Get-ApplicationInsightsMonitoringStatus

The Update command is optional and updates the version of Application Insights installed before adding it to the application. The Get command is also optional, making it easy to see the status information in your VSTS deployment log.

Adding Application Insights To Service Or Desktop Applications

For non-web applications it can’t be done after deployment; you have to add the Microsoft.ApplicationInsights.WindowsServer NuGet package during development and configure the InstrumentationKey in your code, instructions here.

Side Note: Application Insights is billed by usage and the first GB of telemetry data is free, making it easy to experiment or even use on small applications permanently. You can use sampling to limit the telemetry data and configure a daily limit in the portal to make sure you stay in the free tier.

Francois Delport

Azure Resource Policies

Azure Resource Policies have been around for a while but they were a bit under the radar since they didn’t have portal support and they are aimed at larger environments where more control is required. Azure Resource Policies enable you to define rules that resources must comply with when you create or update them. For instance you can specify a naming convention, restrict locations where resources can be created or restrict which type of resources users can create. They differ from RBAC in that RBAC controls what users can do based on permissions: you could prevent a user from creating VMs or Storage Accounts, while Azure Resource Policies define finer grained rules based on conditional logic, for example allowing or blocking a specific VM series or type of storage account. Azure Resource Policies recently became available in the preview portal.

Tip: If you didn’t know it already you can see some of the upcoming features currently in preview by using the Azure Preview Portal link.

Azure Resource Policy Structure
Azure Resource Policies are authored using JSON and basically contain parameters and rules; there is some extra descriptive stuff, but you can read the comprehensive documentation here. Parameters make your policies re-usable as opposed to hard coding values. The policy rules are if..then blocks that can use a range of logical operators and can be nested.

{
  "if": {
    <condition> | <logical operator>
  },
  "then": {
    "effect": "deny | audit | append"
  }
}

In the documentation referenced earlier you will see a list of fields and resource properties you can access to build your rules for example the location of a resource or the SKU of a VM. The JSON document containing the parameters, rules etc is called a Policy Definition.

Deploying A Resource Policy
After you create the JSON Policy Definition you have to deploy it to your subscription, the detailed deployment documentation can be found here. Short version, you can use the Azure Rest API, PowerShell or Azure CLI but sadly not the portal at this point in time. Policy definitions are stored at subscription level but are not active until you assign them to a resource group or subscription scope. Make sure to update Azure PowerShell or the policy samples from the documentation won’t work. There are some predefined policies in Azure as well, if you run the Get-AzureRmPolicyDefinition PowerShell command you will see them.
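As a rough sketch of deploying a definition with PowerShell (this rule just restricts locations, and the names are placeholders):

# Policy rule as JSON: deny resources created outside the allowed locations
$rule = '{
  "if": {
    "not": {
      "field": "location",
      "in": [ "westeurope", "northeurope" ]
    }
  },
  "then": {
    "effect": "deny"
  }
}'

New-AzureRmPolicyDefinition -Name "allowed-locations" -Description "Restrict resource locations" -Policy $rule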

Assigning A Resource Policy
After the Policy Definition is deployed you can assign it to a resource group or subscription using the Azure Rest API, PowerShell, Azure CLI or preview portal. You will find the Policies menu item under your subscription.

[Screenshot: Policies menu under the subscription]

From here you can assign policies to a resource group or subscription and provide parameters for the policy. You will also see the predefined policies in the drop down list.

[Screenshot: assigning a policy and providing parameters]
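If you prefer to script the assignment, a rough PowerShell sketch (the scope and names are placeholders):

New-AzureRmPolicyAssignment -Name "allowed-locations-rg" `
    -PolicyDefinition (Get-AzureRmPolicyDefinition -Name "allowed-locations") `
    -Scope "/subscriptions/<subscription id>/resourceGroups/<resource group>"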

Francois Delport

Automating Azure VM Backups Using ARM Templates

In this post I will give a quick overview of automating Azure VM backups using ARM templates. There are quick start templates for backups when you search for them but the syntax for the resource names didn’t make sense to me, hence this post to explain it a bit more. I will also touch on using Azure Resource Explorer which is a great tool for understanding the ARM API.

Azure VM Backups Background

To back up Azure VMs with Azure Backup you have to create a Recovery Services vault, create a backup policy which contains the schedule and backup retention settings, and register your VMs for backups to the vault and selected policy. There are quick start templates to create vaults, policies and schedules here. Although you won’t see it in the portal there is also the notion of a container for storing backups. The type of container depends on the items that are backed up; there are containers for Azure VMs, SQL backups and Windows backups. This is the part that wasn’t clear to me when authoring ARM templates, but next I’ll show you how to shed some light on it.

Azure VM Backup Resource Syntax

One way to figure out the syntax for an ARM template is to look at existing resources and export the template from the Azure Portal using the Automation script blade but that will export your vault only, not your VM backups.

Next I tried Azure Resource Explorer. Drilling down to my recovery vault I didn’t see the VMs that are registered for backups or the policies, but I did manage to find the deployments for them in:

{resourcegroup}\Microsoft.Resources\deployments

The deployments to create a backup policy are named CreatePolicy* and the deployments to register a VM for backups are named ConfigureProtection*. In the deployment for VM protection I managed to find the syntax for the resourceName.

[Screenshot: ConfigureProtection deployment showing the resourceName syntax]

The ARM template to register multiple VMs for backups is in my GitHub repo here. The template assumes the VMs are not in the same resource group as the recovery vault since the backup vault was contained in a separate management resource group in this case.
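To give an idea of the shape, here is a trimmed sketch of the protected item resource with hypothetical vault, resource group and VM names; the full working template is in the repo linked above, so treat this as illustration only:

{
  "type": "Microsoft.RecoveryServices/vaults/backupFabrics/protectionContainers/protectedItems",
  "apiVersion": "2016-06-01",
  "name": "[concat('MyVault', '/Azure/iaasvmcontainer;iaasvmcontainerv2;MyVmResourceGroup;MyVm', '/vm;iaasvmcontainerv2;MyVmResourceGroup;MyVm')]",
  "properties": {
    "protectedItemType": "Microsoft.Compute/virtualMachines",
    "policyId": "[resourceId('Microsoft.RecoveryServices/vaults/backupPolicies', 'MyVault', 'DefaultPolicy')]",
    "sourceResourceId": "[resourceId('MyVmResourceGroup', 'Microsoft.Compute/virtualMachines', 'MyVm')]"
  }
}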

A Bit More On Azure Resource Explorer

I highly recommend taking a few minutes to look at the Azure Resource Explorer. It is a great way to explore the syntax of the Azure ARM REST API; there are tabs to execute some REST API commands directly from the explorer, PowerShell samples to create the currently selected resource and documentation for the selected resource.

Francois Delport

Azure Managed Disks

Azure Managed Disks became generally available recently and I decided to take the feature for a spin.

With Azure Managed Disks you don’t need a storage account anymore so you won’t be hitting I/O or size limits at the storage account level but the default limit is 2000 Managed Disks per subscription which can be increased if you contact support.

When you create a VM you can now choose to use Managed Disks or Unmanaged Disks; the default is Unmanaged Disks. The Managed Disks show up as resources directly in your resource group, no more drilling down in storage accounts to see what is happening.

[Screenshot: Managed Disks as resources in a resource group]

When you add additional Managed Disks you can choose to create the VHDs from a snapshot, custom image blob or create one from scratch.

To create a snapshot in the Azure Portal you have to click on New and search for Snapshot. I found this a bit weird, my instinct was to browse to the disk in a resource group or under a VM and look for a snapshot button there on the disk blade next to Export and Delete, but anyway.

[Screenshot: creating a snapshot in the Azure Portal]

You can export a disk to blob storage by using the Export button on the disk blade.

Another great feature of Managed Disks is the ability to change the size and type of a disk but you have to stop the VM first or else you get this error.

[Screenshot: error when changing a disk on a running VM]

Keep in mind you can’t change to a smaller disk size and if you want to use Premium Managed Disks you have to choose a VM family that supports Premium Storage like DS.
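A sketch of resizing a managed disk with PowerShell; the VM, disk and resource group names are placeholders, the VM must be stopped first as mentioned, and parameter names can vary between AzureRM versions:

Stop-AzureRmVM -ResourceGroupName "MyGroup" -Name "MyVM" -Force

# Grow the disk and switch the storage type (size can only go up)
$disk = Get-AzureRmDisk -ResourceGroupName "MyGroup" -DiskName "MyVM_OsDisk"
$update = New-AzureRmDiskUpdateConfig -DiskSizeGB 256 -AccountType PremiumLRS
Update-AzureRmDisk -ResourceGroupName "MyGroup" -DiskName $disk.Name -DiskUpdate $update

Start-AzureRmVM -ResourceGroupName "MyGroup" -Name "MyVM"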

The price of Premium Managed Disks is the same as Premium Unmanaged Disks but the price of Standard Managed Disks is different from Standard Unmanaged Disks. For Standard Managed Disks you pay based on the size of the disk provisioned, not the data that is used on the disk. At the time of writing Standard Managed Disks were on a promotional price.

[Screenshot: Standard Managed Disks promotional pricing]

I did a quick comparison on the normal price and, provided you use all the space on a disk, a full Standard Managed Disk is 20% cheaper than a full Standard Unmanaged Disk. Since it is not really practical to have a full disk you’ll have to weigh up the pros and cons for your scenario. Keep in mind you can start with a smaller disk and resize it later. You can also migrate Unmanaged Disks to Managed Disks but it looks like the only way to do it at the moment is PowerShell.
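The conversion itself is just a couple of cmdlets, as a sketch (the VM has to be deallocated first; names are placeholders):

Stop-AzureRmVM -ResourceGroupName "MyGroup" -Name "MyVM" -Force
ConvertTo-AzureRmVMManagedDisk -ResourceGroupName "MyGroup" -VMName "MyVM"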

Francois Delport

Azure Tooling For Orchestration And Hybrid Environments

In this post I’m going to have a quick look at Azure tooling for orchestration and hybrid environments. The Azure Portal and its automation options work great for developers and operations staff to manage resources but sometimes you need more. The two scenarios I come across very often are hybrid cloud and service catalogs with an accompanying self service portal.

With hybrid cloud you have to consider the integration with on-premise resources and APIs or even other cloud providers. On the service catalog side you have to consider the services you will provide, the orchestration for provisioning, integration with a service management tool and implementing business rules for approval and access.

At the moment the Microsoft tooling landscape is a bit of a minefield with the transition from ASM to ARM for Azure and the changes in System Center to cater for hybrid cloud environments. In this post I’ll be looking at tooling around System Center 2012 R2 and Azure Pack. Maybe later on I will take a look at System Center 2016 and Azure Stack.

Azure Pack Private Cloud
It supports an ASM style API that is different from the public Azure ASM API. The recommended automation solution for Azure Pack is Service Management Automation. It is a stand-alone component you install from the System Center Orchestrator installation media. It executes PowerShell workflows based on events or a schedule. It is aimed at administering the fabric of your cloud and is not suited for tenants launching workflows to provision resources.

If you don’t need complicated workflows you can create Virtual Machine Roles; they enable users to select different options during deployment, install extensions and deploy multiple VMs together that form a logical grouping.

Azure Pack In A Hybrid Environment
In a hybrid environment you can use the Windows Azure Pack Connector to provision VMs on premise and in public Azure using the Azure Pack portal and API. There are also 3rd party solutions that provide similar functionality for other resource types. You will be writing ASM style PowerShell scripts that can execute against public Azure ARM via the connector or on premise Azure Pack VMs using the same Azure Pack API.

System Center Orchestrator
If you want to have even more control over the provisioning process, or have integration requirements with non-Microsoft hybrid environments, you can use System Center Orchestrator to create complex workflows. There are a few options when it comes to Orchestrator, this article describes them in more detail. Orchestrator offers a graphical authoring experience for on-premise resources using integration packs. You can connect it to public Azure using the Azure Integration Pack for Orchestrator but this is only for ASM, not ARM. You can also use it to execute PowerShell to cater for any tasks not provided by integration packs, like calling Azure ARM PowerShell, deploying ARM templates or executing automation runbooks.

Self Service Portal
To create your own front end or self service portal for users you can use the System Center Service Manager Portal; it integrates with Orchestrator to run workflows. You can modify the front end to present users with prompts and you can control access and require change approval for actions.

Azure Automation
Azure Automation gives you the ability to run PowerShell workflows and scripts in the public cloud; the scripts are stored in Azure along with other assets like connection strings, certificates etc. By default it can’t access on premise resources that are not publicly accessible. You have the option to install hybrid runbook workers on premise to receive jobs from Azure Automation; these will obviously have access to any local resources visible to them.

App Controller
This is not really automation but it can give you more control over users’ ability to access resources in the cloud and on premise. It also hides details from users, for instance they don’t have to know the Azure subscription details or even have a subscription to provision Azure resources. You define connections in App Controller and grant users access to them. It can connect to various resources like Hyper-V, Azure and vCenter.

Application Roadmap
Keep in mind all the information in this blog relates to Azure Pack and System Center 2012, a lot changed in System Center 2016 and Azure Stack. Before you invest in any of the tools take a look at the road map for the products, for example App Controller has been deprecated in System Center 2016 and Orchestrator seems to be dead in the water, no new features were added in System Center 2016.

Francois Delport

Using Microsoft Report Viewer In PowerShell

In this post I’m going to give a quick example of using Microsoft Report Viewer in PowerShell. PowerShell makes it easy enough to slap together a simple report by converting data to HTML and adding some CSS, but every now and then you need something a bit more polished. For instance generating a chart and/or report in Word format that is ready for distribution or printing, laid out according to page size and orientation with headers, footers, logos etc. HTML doesn’t work that great for this, so I did a POC using Microsoft ReportViewer 2012; this should work with the 2015 version as well but I didn’t try it.

I’m not going to dig into creating reports with the report viewer in this post; I’ll be focusing on using it with PowerShell. If you are not familiar with the report viewer you can catch up over here and there are some very good resources and samples on the GotReportViewer website as well. Short version: you design reports using the report designer in Visual Studio, at run time you pass it data and render the report to a file or display it in the report viewer. If you don’t have the report viewer installed you can download it here. The whole solution is in my GitHub repo here but I will be highlighting/explaining some aspects of it as we go along.

Code Highlights
When you design the report you have to assign data sources to it. These provide the report with the fields it uses at design time, and the report expects the same objects, populated with data, at run time. You’ll have to create these in a .NET project, compile it and load the assembly along with the ReportViewer assembly into PowerShell.

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.ReportViewer.WinForms")
[System.Reflection.Assembly]::LoadWithPartialName("ReportPOC")

In this solution the data source is the ReportData class in the ReportPOC assembly. When you create the data source collection you have to create a strongly typed collection, in this case a generic List of the data source type.

$data = New-Object "System.Collections.Generic.List[ReportPOC.ReportData]"

Creating the records and adding them to the data source is pretty straight forward.

$item1 = New-Object ReportPOC.ReportData
$item1.ServerName = "Server1"
$item1.CPUAvail = "128"
$item1.CPUUsed = "103"
$data.Add($item1)

Then you create the report viewer instance, specify the path to your report file and add the data source.

# Create the report viewer and the data source, then wire them together
$rv = New-Object Microsoft.Reporting.WinForms.ReportViewer
$rep = New-Object Microsoft.Reporting.WinForms.ReportDataSource
$rep.Name = "ReportDS"
$rep.Value = $data
$rv.LocalReport.ReportPath = "C:\MySource\ReportPOC\POC.rdlc";
$rv.LocalReport.DataSources.Add($rep);

Next you render the report and write the output byte array to a file stream; remember to cast the render result to type [byte[]] or else it won’t work.

[byte[]]$bytes = $rv.LocalReport.Render("WORDOPENXML");
$fs = [System.IO.File]::Create("C:\MySource\ReportPOC\POC.docx")
$fs.Write( [byte[]]$bytes, 0, $bytes.Length);
$fs.Flush()
$fs.Close()

The result, a chart and report using the same data source with a page break between them as it displays in Word 2013:

[Screenshot: the rendered chart and report in Word 2013]

Next Steps
If you want a re-usable solution I would create a more generic data source class to avoid coding a new data source class for different charts. Also add some parameters to the report/chart to control the headings, legend and labels, page headers/footers etc. You can also export to other formats by passing different parameters to the render method, like “EXCEL”, “PDF”, “HTML” and “XML”. You can also create different series to group categories and apply logic to the report/chart to calculate values or control colors, for example if CPU usage is > 90% color the bar red; this is done in the report designer.
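For example, rendering the same report to PDF instead of Word is just a different format string, something like:

[byte[]]$pdfBytes = $rv.LocalReport.Render("PDF")
[System.IO.File]::WriteAllBytes("C:\MySource\ReportPOC\POC.pdf", $pdfBytes)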

Francois Delport