Microsoft Infrastructure as Code with PowerShell DSC

Desired State Configuration (DSC) is a PowerShell platform that provides resources for deploying, configuring, and managing Windows servers. When you run a DSC configuration against a target system, DSC first checks whether the target already matches the configured state. If it doesn't, DSC makes it so.
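That check-then-correct cycle can be driven from the console. A minimal sketch, assuming a configuration has already been applied to the local node (note that -UseExisting requires WMF 5):

```powershell
# Ask the Local Configuration Manager whether the node still matches its configuration
Test-DscConfiguration -Verbose

# Re-apply the most recently applied configuration to correct any drift (WMF 5)
Start-DscConfiguration -UseExisting -Wait -Verbose
```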

To use DSC, you author DSC configurations, stage them to make them available to target systems, and decide whether you will pull or push to your targets. DSC is installed out of the box with PowerShell (starting with 4.0). PowerShell 4.0 is already installed in the default configuration of Windows Server 2012 R2, so you don't have to jump through any major hoops to get going.

DSC Resources

DSC Resources are modules that DSC uses to actually do work on the server. DSC comes with basic resources out of the box, but the PowerShell team also provides a DSC Resource Kit, a collection of useful, yet experimental, DSC Resources. The Resource Kit simplifies using DSC because you won't have to create a ton of custom resources to configure your target systems.

You can also create your own custom resources, and it seems not too difficult to do. I even saw a post suggesting that you can code your resources in C#. I haven't tried this, but it would be a plus if you're not very comfortable with PowerShell.
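For the curious, a classic MOF-based custom resource is just a module that exports three functions: Get-, Test-, and Set-TargetResource. A minimal sketch, using a hypothetical resource that manages a marker file:

```powershell
# Skeleton of a custom DSC resource module (the marker-file example is hypothetical)
function Get-TargetResource {
    param ([Parameter(Mandatory)][string]$Path)
    @{ Path = $Path; Ensure = if (Test-Path $Path) { 'Present' } else { 'Absent' } }
}

# Return $true when the node already matches the desired state
function Test-TargetResource {
    param ([Parameter(Mandatory)][string]$Path, [string]$Ensure = 'Present')
    (Test-Path $Path) -eq ($Ensure -eq 'Present')
}

# Make it so when Test-TargetResource returns $false
function Set-TargetResource {
    param ([Parameter(Mandatory)][string]$Path, [string]$Ensure = 'Present')
    if ($Ensure -eq 'Present') { New-Item -ItemType File -Path $Path -Force | Out-Null }
    else { Remove-Item -Path $Path -Force }
}
```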

There is a preview of a resource gallery here. You can probably find additional resources produced by the community with a little searching.

Push or Pull

Do you push changes to the target system or allow the target to pull changes from a central server? This is a debate with merit on both sides, with advocates for each approach. I have seen arguments citing the scalability and maintainability benefits of one over the other. My opinion is that I don't really care right now (premature optimization).

One of the first DSC posts I read on TechNet said that pull will most likely be the preferred method, so I went with it, although there is more setup to get going with pull; push is basically a one-line command. In the end, the decision is yours, as you can do both, so don't lose any sleep trying to pick one over the other. Pick one, learn how DSC works, and make the push-or-pull decision after you get your feet wet.


This is a diversion, but a good one, IMHO. One problem with the pull model is that you need all of the resources (modules) downloaded on your target. Normally, you would have to devise a strategy to ensure all of your DSC Resource dependencies are available on the target, but PowerShellGet solves this. PowerShellGet brings the concept of dependency management (similar to NuGet) to the PowerShell ecosystem.

Basically, you are able to discover, install, and update modules from a central module gallery server. This is not just for DSC; you can install any PowerShell module available on the gallery (powerful stuff). PowerShellGet is part of the Windows Management Framework (WMF).
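The discover, install, and update workflow looks like this. A sketch, assuming PowerShellGet is available and using xWebAdministration (a real Resource Kit module) as the example:

```powershell
# Discover a DSC resource module on the gallery
Find-Module -Name xWebAdministration

# Install it machine-wide, then keep it current
Install-Module -Name xWebAdministration
Update-Module -Name xWebAdministration
```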

Infrastructure as Code

I was pleased at how simple it is to create DSC configurations, although the jury is still out on how maintainable it is for large infrastructures. After reading about it, I saw no reason to wait any longer to get my Windows infrastructure translated to code and stored along with my source code in a repository, as it should be. If you have multiple servers and environments, you don't have your infrastructure configuration automated, and you know it's possible to do, you're just plain dumb.

Infrastructure as code is a core principle of Continuous Delivery, and DSC gives me an easy way to score some points in this regard and stop being so dumb. Also, with the Chef team developing cookbooks that use DSC configurations as resources, I can plainly see a pathway to achieving all the cool stuff I have been reading about in the open source environment stacks in regards to infrastructure as code.

DSC Configuration

The DSC configuration is straightforward, and I won't bore you with a rehashing of the info found in the many resources on the interwebs (some links below).

Configuration MyApp
{
    Node $AllNodes.NodeName
    {
        # Install the IIS role
        WindowsFeature IIS
        {
            Ensure = "Present"
            Name   = "Web-Server"
        }

        # Install ASP.NET 4.5
        WindowsFeature ASP
        {
            Ensure = "Present"
            Name   = "Web-Asp-Net45"
        }
    }
}

$ConfigurationData = @{
    AllNodes = @(
        @{ NodeName = "Server1" },  # placeholder node names
        @{ NodeName = "Server2" }
    )
}

This simple configuration installs IIS and ASP.Net 4.5 on 2 target nodes.

Configuration Staging

To be consumed by DSC, the configuration needs to be transformed into a MOF (Managed Object Format) file. You could create this file by hand in a text editor, but why would you when it can be automated? And I am obsessed with automation.

MyApp -ConfigurationData $ConfigurationData

This calls our MyApp configuration function and creates the MOF file. I can customize this a bit further by defining the ConfigurationData hashtable in a separate file and specifying exactly where I want the MOF file created. This gives me good separation of logic and data, like a good coder should have.

MyApp -ConfigurationData c:\projects\myapp\deploy\MyAppConfigData.psd1 -OutputPath c:\projects\myapp\deploy\config

Above, the ConfigurationData is in a separate file named MyAppConfigData.psd1.
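A minimal sketch of what MyAppConfigData.psd1 might contain (the node names are hypothetical):

```powershell
# MyAppConfigData.psd1 -- configuration data kept separate from configuration logic
@{
    AllNodes = @(
        @{ NodeName = "WebServer1" }
        @{ NodeName = "WebServer2" }
    )
}
```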

If I want, I can lean towards the push model and push this config to the target nodes.

Start-DscConfiguration -Path c:\projects\myapp\deploy\config -Wait -Verbose

To use the pull model, you have to configure a pull server and deploy your configurations. The pull server is basically a site hosted in IIS. Setting it up is a little involved, so I won't cover it here, but you can get details here and, of course, you can Bing more on it.
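For reference, pointing a node at a pull server means configuring its Local Configuration Manager. A sketch in the WMF 4 style, with a hypothetical pull server URL and a GUID you would generate yourself:

```powershell
Configuration PullClient
{
    Node "localhost"
    {
        LocalConfigurationManager
        {
            ConfigurationID           = "11111111-2222-3333-4444-555555555555"  # hypothetical GUID
            RefreshMode               = "Pull"
            DownloadManagerName       = "WebDownloadManager"
            DownloadManagerCustomData = @{ ServerUrl = "http://pullserver:8080/PSDSCPullServer.svc" }
        }
    }
}

PullClient -OutputPath C:\dsc\pullclient
Set-DscLocalConfigurationManager -Path C:\dsc\pullclient -Verbose
```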


Well that’s enough for now. Hope it inspires someone to take the leap to infrastructure as code in Windows environments. I know I’m inspired and can finally stop being so scared or lazy, not sure which one, and code my infrastructure. Happy coding!

More Info

IIS 8 Configuration File

Note to self

The IIS 8 configuration file is located at %windir%\System32\inetsrv\config\applicationHost.config. It is just an XML file, and the schema is well known. You can open it, edit it (if you are brave), and otherwise do configuration stuff with it. You can diff it from system to system to find inconsistencies, or save it in a source code repository to standardize on a base configuration across web server nodes, if your project needs that kind of thing. Lastly, you can manage it with PowerShell… you can manage it with PowerShell… you can manage it with PowerShell DSC!
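The system-to-system diff is a one-liner once you can reach the files. A sketch, with hypothetical server names:

```powershell
# Compare applicationHost.config across two web servers (server names are hypothetical)
$relativePath = 'Windows\System32\inetsrv\config\applicationHost.config'
$reference  = Get-Content "\\webserver1\c$\$relativePath"
$difference = Get-Content "\\webserver2\c$\$relativePath"
Compare-Object -ReferenceObject $reference -DifferenceObject $difference
```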

The possibilities are endless, so stop depending so much on the IIS Server Manager UI like you are in dev preschool. You are a big boy now; remove the training wheels, but you might want to wear a helmet.

I don’t want to have this discussion again!

What is this CIM I keep running into in Powershell?

I keep having to use CIM in my scripts, but what is it? I understand how to use it, but where did it come from and what does it stand for? Like every developer I know, I figured a search engine was the best tool to solve this mystery.

There is an industry standards organization called the DMTF (Distributed Management Task Force) that defined a standard named the Common Information Model. By the way, this is the same group that defined MOF (Managed Object Format), the standard under the covers of DSC. CIM is defined using MOF and is a cross-platform, common definition of management information for systems, networks, applications, and services that allows for vendor extensions. How was that for acronym soup?

Mystery Solved.

Update PSModulePath for Custom PowerShell Module Development

I am in the process of a deep dive into DSC, and I want to store my custom modules and DSC Resources in source control. To make it easy to run PowerShell modules, you have to import them or have them on the PSModulePath environment variable. Since I don't want to point a source repository at the default PowerShell module path, I want to add my custom module path to PSModulePath. This will save me some time when it comes to importing modules and picking up module changes. It also means I will always be running the most recent version of my modules, even the buggy ones, so if you do this, understand the implications.

It’s actually pretty easy to automate this with PowerShell. Since I already have some experience updating environment variables with PowerShell I just created a new script to add my custom module path to PSModulePath.

$currentModulePath = [Environment]::GetEnvironmentVariable("PSModulePath", "Machine")
$customModulePath = "C:\_DSC\DSCResources"
$newModulePath = $currentModulePath + ";" + $customModulePath
[Environment]::SetEnvironmentVariable("PSModulePath", $newModulePath, "Machine")

I made this script a bit more verbose than necessary so it is more self-evident what is happening (code as documentation; no comments necessary).

I can envision someone also needing to remove a path from PSModulePath, but this is enough to get started, so I will leave that up to you, until I have a need for it :).
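If you do end up needing it, removal is just the reverse operation. A sketch, assuming the same custom path as above:

```powershell
# Remove a path from the machine-level PSModulePath
$removePath = "C:\_DSC\DSCResources"
$current = [Environment]::GetEnvironmentVariable("PSModulePath", "Machine")
$newModulePath = ($current -split ';' | Where-Object { $_ -ne $removePath }) -join ';'
[Environment]::SetEnvironmentVariable("PSModulePath", $newModulePath, "Machine")
```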


When running this script with Invoke-Command on a remote session, modules in the new path aren't immediately available because the path is not updated in the existing remote session. A quick workaround for me was to remove the session and recreate it.

Get-PSSession | Remove-PSSession;

This removes all sessions, so you may not want to do it exactly this way. Since I don't care about the sessions, I like it. It was just a one-line change in my workflow script and didn't add much latency to the script execution. I know there are other solutions that involve messing with the registry, but this is a one-time deal, so resetting the remote session works for me.
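Putting the workaround together in one place. A sketch, with a hypothetical server name:

```powershell
# Recreate the remote session so it picks up the updated PSModulePath
Get-PSSession | Remove-PSSession
$session = New-PSSession -ComputerName winbuildserver1   # hypothetical server name
Invoke-Command -Session $session -ScriptBlock { $env:PSModulePath }
```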

GoCD: Automate Agent Install with PowerShell

I have been setting up build servers, and I have been exploring automating the process, so I have been scripting every step I take to stand the servers up. In this post I will share some of the commands I use to create GoCD agents. If you decide to go down this road, you should think about creating reusable scripts and parameterizing the things that change (I didn't want to do all the work for you :). Also, it would make sense to use a configuration manager like DSC, Puppet, or Chef to actually run the scripts.
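As an example of that parameterization, the agent install below could be wrapped in a function like this sketch (the function name and parameters are mine, not part of GoCD):

```powershell
function Install-GoAgent {
    param (
        [Parameter(Mandatory)][string]$InstallerPath,  # e.g. the setup.exe copied to the server
        [Parameter(Mandatory)][string]$ServerIP,
        [Parameter(Mandatory)][string]$InstallDir
    )
    # /S = silent install; /D must be the last switch on an NSIS installer command line
    ([WMICLASS]"Win32_Process").Create("$InstallerPath /S /SERVERIP=$ServerIP /D=$InstallDir")
}
```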

I am using PowerShell remotely on the build servers, which is indicated by [winbuildserver1]: PS> in the command prompt. Check out my previous post to learn more about configuring remote servers with PowerShell.

Copy Install Files

The first thing I do is copy the install files from the artifact repository to the server.

[winbuildserver1]: PS> Copy-Item -Path \\artifactserver\d$\repository\Go-Agent\go-agent-14.1.0-18882\go-agent-14.1.0-18882-setup.exe -Destination "D:\install-temp\" -Force

Install Agent

[winbuildserver1]: PS>([WMICLASS]"Win32_Process").Create("D:\install-temp\go-agent-14.1.0-18882-setup.exe /S /SERVERIP=<ip of go server> /GO_AGENT_JAVA_HOME=<path to JRE> /D=D:\Go Agents\Internal\1\")

Here we get a reference to the static WMI class Win32_Process and call its Create method, passing the command line that installs an agent. In the command line we have:

  • the path to the install file
  • the /S switch for a silent install (no user prompts)
  • the /SERVERIP switch for the IP of the Go Server (optional)
  • the /GO_AGENT_JAVA_HOME switch for the path to the JRE (optional)
  • the /D switch for the path where you want to install the agent

Run Multiple Agents on Same Server

If I want to run multiple agents on the same server I do a little extra work to get the other agents installed.

[winbuildserver1]: PS> Copy-Item "D:\Go Agents\Internal\1\*" -Destination "D:\Go Agents\PCI\1"
[winbuildserver1]: PS> Remove-Item "D:\Go Agents\PCI\1\config\guid.txt"
[winbuildserver1]: PS> Remove-Item "D:\Go Agents\PCI\1\.agent-bootstrapper.running"

Here we are just copying an installed agent to a new location and removing a couple of files to force the agent to recreate them and register itself with the server.

Create Agent Service

Lastly, I create a service for the agent.

[winbuildserver1]: PS> New-Service -Name "Go Agent PCI 1" -Description "Go Agent PCI 1" -BinaryPathName "`"D:\Go Agents\PCI\1\cruisewrapper.exe`" -s `"D:\Go Agents\PCI\1\config\wrapper-agent.conf`""

Get more on using PowerShell to configure services in my previous post.


I use similar commands to install the server, plug-ins, and other tools and services (e.g. Git, SVN, NuGet…) that I need on the build server. I have to admit that this isn't totally automated yet. I still have to manually update the service account credentials and manually accept a certificate to get SVN working with the agent, but this got me 90% of the way there. I don't have to worry about my silly mistakes because the scripts do most of the work for me.

Setup Zurb Foundation for .Net Development

Zurb Foundation

My wife wanted a new website built for a project she is working on. Being the good developer husband I am, I decided to implement the site with a responsive design. I could research and build the CSS and JavaScript from scratch, but I was sure there was a project already out there to help get me started. I landed on a project named Foundation. Here is what they have to say about themselves:

Foundation is the most advanced responsive front-end framework in the world. You can quickly prototype and build sites or apps that work on any kind of device with Foundation, which includes layout constructs (like a fully responsive grid), elements and best practices.

Is it the most advanced? I have no idea, but it seems simple, has a decent community, and is in use on some major sites. Why not Bootstrap, you ask? Well, some respected front-end designers said Foundation is cool. I'm a developer, so I respect what the design community has to say about design. This is not to say that designers don't like Bootstrap; from what I can tell, they love it too. I just wanted to learn some of the dark arts, and Foundation is closer to the metal while Bootstrap hands everything to you on a platter. I am not qualified to compare them, but you can read about some differences between the two that Felippe Nardi posted on his blog. Actually, his post pushed me to Foundation. Even though he said I will have to get my hands dirty to use it, I think I will enjoy the control and the absence of unnecessary or accidental complexity.


First, in Visual Studio, I create an empty MVC project. I recently upgraded my VS 2012 to Web Tools 2013.1 for Visual Studio 2012 so I am using ASP.NET MVC 5.

Side note, this VS 2012 update includes some nice new features including the concept of round tripping so you can work with your MVC 5 projects in both VS 2012 and VS 2013 without having to change anything (sweet).

OK, I have gotten into the habit of checking NuGet before I attempt to bring new toys into my projects because it makes things so much easier. There is a NuGet package to set up Foundation, Foundation5.MVC.Sass, so pulling in the files I needed was a breeze. Setup, on the other hand, was a bear. For some reason I could not get the files to install correctly. Oh well, they downloaded to the solution package folder, so I just had to copy them manually.

Manual Setup

First, I created a Content folder in the root of my project. Then I opened the folder for the Foundation.Core.Sass package, copied the files from content/sass, and dropped them in the Content folder in my project. This framework uses Sass. To compile the Sass files, I install the Mindscape Web Workbench, which compiles a CSS file whenever I save an .scss file. So, I open and save the site.scss file to get site.css created (you will see it directly under the .scss file).

Next, I set up the JavaScript. I open the content folder in the Foundation.Core.Sass package and copy the Script folder to the root of my project. There are quite a few JavaScript files in there, so I have to get them combined and minified to improve performance.


To do this, I used NuGet to install Microsoft.AspNet.Web.Optimization, which provides JS and CSS bundling features for ASP.NET. Next, I create a BundleConfig.cs in the App_Start folder and add my bundles. Something like this (the script paths are the conventional ones; adjust them to where your files actually live):

using System.Web;
using System.Web.Optimization;

namespace ZurbFoundationDotNetBase
{
    public class BundleConfig
    {
        public static void RegisterBundles(BundleCollection bundles)
        {
            bundles.Add(new StyleBundle("~/Content/css").Include("~/Content/site.css"));

            bundles.Add(new ScriptBundle("~/bundles/jquery").Include(
                "~/Scripts/jquery-{version}.js"));

            bundles.Add(new ScriptBundle("~/bundles/modernizr").Include(
                "~/Scripts/modernizr-*"));

            bundles.Add(new ScriptBundle("~/bundles/foundation").Include(
                "~/Scripts/foundation/foundation.js"));
        }
    }
}

Next, I have to bootstrap this file by telling the application to load it. I add this line under the RouteConfig line in the Application_Start method of Global.asax.cs:

BundleConfig.RegisterBundles(BundleTable.Bundles);

Views and Controllers

The MVC 5 update for VS 2012 only allows us to create empty MVC 5 projects, so I have to manually create the controllers and views. The NuGet install for Foundation also missed my MVC files, so I have to move them from the package to the project. I open the content folder in the Foundation5.MVC.Sass package and copy the Views folder to the root of my project. The package doesn't include a _ViewStart.cshtml, so I create one and point the layout to the Shared/_Foundation.cshtml file.

 Layout = "~/Views/Shared/_Foundation.cshtml";

Next, I rename Foundation_Index.cshtml to Index.cshtml. Then the last change in the Views folder is to update the web.config so that the pages section looks something like this:

<pages pageBaseType="System.Web.Mvc.WebViewPage">
 <namespaces>
  <add namespace="System.Web.Helpers" />
  <add namespace="System.Web.Mvc" />
  <add namespace="System.Web.Mvc.Ajax" />
  <add namespace="System.Web.Mvc.Html" />
  <add namespace="System.Web.Optimization" />
  <add namespace="System.Web.Routing" />
  <add namespace="System.Web.WebPages" />
  <add namespace="ZurbFoundationDotNetBase" />
 </namespaces>
</pages>

Since there are no controllers, I have to add one to serve the Home/Index view. In the Controllers folder I create an empty MVC 5 controller named HomeController.cs.


Well, that's it. I was able to build and run the project and see the responsive index page. Now I have a base project for responsive web design using a less opinionated framework than Twitter Bootstrap. Although I will still advocate the use of Bootstrap, I believe Foundation will fit my personality and style of development a little better. If any of this interests you, I posted the solution on GitHub.