Mar 02

Build a Server 2016 using Desired State Configuration

Last night I gave a presentation for the Boston Area Windows Server User Group, of which I am Vice President. It was a good meeting with lots of great questions and good feedback. I promised I would post my files on Meetup, but Meetup no longer supports storing files there.

So I’m putting them here for now.

If you’re in the Boston area and are interested in Windows Boston, have a look at our website or sign up for a meeting on our Meetup page.

Jan 13

PowerShell warnings with ASCII art

I thought I would share a little fun I had with some scripts I’ve written. The situation was that the script I was running was going to do something which, if not done correctly, or at the wrong time, would have disastrous consequences in a production environment. I didn’t want to use the standard warning tools provided with PowerShell because I didn’t want people mindlessly clicking the OK button and creating a LOT more work than would be necessary. My thought was to have a giant warning which would take up the entire screen and force people to pay attention.

Searching the internet, I found a site which had ASCII art for text in a large array of different styles. For my purposes I liked “ANSI Shadow”; using the word ‘WARNING’, I felt it was striking enough. I copied the text and I had the first part of my script. To that, I added the instructions for what I want the user to do.

I want to write that out in bold RED with a BLACK background, so I break it up by new lines (“`n”) and Write-Host.
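The original script isn’t reproduced here, but a minimal sketch of that step looks like this (the here-string is a placeholder for the pasted ASCII art and instructions):

```powershell
# Placeholder for the pasted "ANSI Shadow" WARNING art plus instructions
$warning = @"
WARNING - ASCII art goes here
Type the confirmation phrase exactly as shown to continue.
"@

# Split on newlines and write each line in red on black
$warning -split "`n" | ForEach-Object {
    Write-Host $_ -ForegroundColor Red -BackgroundColor Black
}
```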

Next, I want the user to type an exact string to ensure they know what they are about to do. You can make this anything you want, but I like to keep it in line with what is about to happen. Something like “I WANT TO DELETE ALL USERS IN CONTOSO.COM”. If the user types this, they can’t say they didn’t know what they were doing.

To facilitate this, I use a while loop.
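A sketch of that loop, using the example phrase from above; `-cne` is PowerShell’s case-sensitive not-equal operator, so the loop only exits on an exact match:

```powershell
$phrase = 'I WANT TO DELETE ALL USERS IN CONTOSO.COM'

$reply = ''
while ($reply -cne $phrase) {
    $reply = Read-Host "Type `"$phrase`" to continue"
    # -eq is case-insensitive: the words matched but the capitalization didn't
    if (($reply -eq $phrase) -and ($reply -cne $phrase)) {
        Write-Host 'The reply must be typed in ALL CAPITAL LETTERS.' -ForegroundColor Yellow
    }
}
```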

The ‘if’ statement is a friendly reminder that the reply needs to be in all capital letters. As long as they type it exactly as it appears, the script will go on. Otherwise it won’t go past this point.

Make things a bit more interesting…

Some of the characters in that original warning text are of a certain type. There are solid characters making up the majority of the letters, and there is a collection of “border” characters. To identify these border characters, I want to get the int equivalent for the char we see. To do this, I use the following syntax:
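Something like this, using ╝ (one of the double-line border characters) as the example:

```powershell
# Cast the character to [char], then to its integer (code point) value
[int][char]'╝'    # 9565
```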

This identifies the character as type char and then converts it to its integer equivalent ([int]). This particular character gives us an integer value of 9565. Now I want to look at the other characters in that range. There are only so many “double-line border” characters, so I choose all the characters between 9500 and 9600. To look at these characters and their values, I run this:
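A sketch of that loop:

```powershell
# Walk the range, printing each number and its character equivalent
9500..9600 | ForEach-Object {
    "{0}{1}" -f $_.ToString().PadRight(5), [char][int]$_
}
```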

What I’m doing here is taking all the numbers between 9500 and 9600 and looping through them one at a time, writing the number and the char equivalent using string formatting (-f). In “{0}{1}”, the {0} is the first entry after the ‘-f’: the number passed in ($_), converted to a string (.ToString()) and padded on the right to 5 characters. I need to convert the number to a string in order to gain access to the ‘PadRight()’ method (all this is just for neatness; I could have easily used “{0} {1}” since all the numbers are 4 digits anyway). In the next index, {1}, I take the number, identify it as an integer ([int]), then convert it to a character ([char]): “$([char][int]$_)”.

All this tells me that the characters I want have integer values between 9552 and 9580.

Now that we know that, we can take each line, split it up by characters and, if a character’s int value falls between 9552 and 9580, apply a different color to it, like this:
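Roughly like this, with `$warning` holding the ASCII art text and the colors just as an example:

```powershell
foreach ($line in ($warning -split "`n")) {
    foreach ($char in $line.ToCharArray()) {
        if ([int]$char -ge 9552 -and [int]$char -le 9580) {
            # Border characters get one color...
            Write-Host $char -ForegroundColor DarkGray -NoNewline
        }
        else {
            # ...solid characters get another
            Write-Host $char -ForegroundColor Red -NoNewline
        }
    }
    Write-Host ''   # end the line
}
```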

gives us this:

or DarkRed and Red gives us:

Using a different font (Alligator2), some more colors, and this code:

You get the idea!

Oh, and if you want to get rid of the vertical lines (better seen in the first example), you’ll need to change your font to Lucida Console.

Jul 03

Back to blogging : PowerShell and Azure presentation

Hi All,

Been away for a while. I got a new job and I’ve been working like crazy. Also, I have a new son, so that’s been keeping me busy. I want to get back into blogging as I find it a great way to share helpful information and let people know what they can do with PowerShell.

I’ve got lots of ideas for posts, but for now, here is a video of a presentation I did for WindowsBoston on using PowerShell against Microsoft Azure. Enjoy!

Sep 18

IT/DevConnections Day 3

Well, day three has come and gone. It has been a fantastic ride; I’ve learned a lot.

I started my last day with Jeff Guillet and “Build a Super-Fast Exchange Lab for under $2,000.” I took copious amounts of notes for this; so much information. For Jeff’s presentation, he had a system which he outlines on his own blog. Jeff was very thorough in his talk. He went over the pieces and parts, buying resources, sites he liked and didn’t like, and what memory he liked and why. I’m not going to regurgitate it all, as I’m sure he has already talked about much of this on his own site.

We also had some great discussions about setting up the environment and the tools he uses (I may see about tackling the setup using DSC). If you want to set up your own environment, I’m going to suggest hitting his site for all the info you’ll need. Of all the presentations I’ve been to, this is probably the one I’m most interested in getting started with.

Next I went to “Rock Your .NET Coding” with David McCarter. Now, this presentation was a bit out of my league as I’m not a professional developer. I do, however, write a lot of PowerShell code and I’m always trying to improve it, so a presentation on standards is a great place to go. While much of what was discussed is not directly applicable to me, it does make me think a bit harder about the code I do write, and that is not a bad thing. I will say, if you are a professional coder, you should take a look at Dave’s books and DVDs. There was a room full of professional developers there, and I feel like he stumped a lot of them with his examples. Proof positive that we can all learn more.

Next I went to “Building Custom Tools Using PowerShell” by Kaido Jarvemets and Greg Ramsey. A good portion of this presentation was based around Configuration Manager which, unfortunately, was not mentioned in the description. That being said, there were some great topics discussed: adding right-click capabilities to Configuration Manager which call PowerShell scripts, and utilizing WPF to easily generate PowerShell GUIs, great for the not-so-PowerShell-friendly admin. Lastly, they talked about WMI events and creating actions based on those events (send an email, log the event…). I’ll probably need to go over this stuff myself as some pieces went very fast.

The last event was the most fun! “Ask the Exchange Experts,” with a panel of experts and some of the production team in the back of the room piping in as needed. Lots of questions and lots of level-headed answers. This was a lot of fun and I picked up a few things to think about.

And with that, I have to say good-bye to IT/DEV Connections for 2014. I learned a lot of stuff that my company and I will be able to benefit from. I’m glad I did it, and I wish I could do it more. In this industry you can never stop learning. Fortunately for me, I like the learning.

Thanks everyone and I hope to see you next year!


Sep 17

IT/DevConnections Day 2

Lots of stuff being crammed into my brain.

Started today with a Jeffrey Snover (@jsnover) presentation on Just Enough Admin (JEA), which I had seen in passing but hadn’t really delved into. Once the explanation got going, I realized the name really was a good identifier of what JEA is. JEA is, basically, not so much taking away admin privileges as only giving the admins what they need to fulfill their role. Just because someone should be able to patch a system, or reboot, or change an IP doesn’t mean they should be able to read all the (potentially confidential) files on that system. So the “Super-User” should probably go away in favor of role-based administration, and JEA is used to make that kind of configuration easily available.

JEA makes creating a server role for patching or setting up SMB shares easier, the same way Desired State Configuration (DSC) makes it easy to set up a farm of IIS servers with a specific configuration. In fact, JEA uses DSC for its implementation. Jeffrey was quick to point out that the JEA toolkit is in an experimental stage (denoted by the ‘x’ at the front of the module name, ‘xJEA’), so it may not be 100% for production environments, but the concept is solid and, I think, one that should at least be investigated.

The second half of the presentation was a 400-level breakdown of some of the pieces and parts that I’ll need to go over and experiment with before I really have it down. As a bonus, there were some segues into some of the new features of PowerShell v5. Again, more stuff to learn.

All in all, a great presentation. Just the time with Jeffrey is worth the money for the convention. I dare you to walk away from a Snover presentation on PowerShell and not get excited about it!

After the JEA presentation, I went into Rick Claus (@RicksterCDN) on Storage Spaces, Scale-Out File Server and SMB 3.0 (the “Fire-Breathing Dragon”). Lots of great insights here on the state of things. Also, I found out that, according to Rick, Amazon and Microsoft don’t use any SANs in their cloud solutions because they are cost-prohibitive at that scale. It is much easier, and easier on the wallet, to have these large JBODs (Just a Bunch Of Disks) and utilize the storage capabilities of Windows Server 2012.

Rick’s presentation had side-by-side feature comparisons for SANs and Windows Server 2012, as well as a good discussion on disk tiering, which uses SSDs for busy I/O (hot disk) and standard spinning disks for less busy I/O (cold disk). The system can move data from one set of disks to the other without the accessing system having any idea of what is going on. Best part: he gave us the scripts and requirements to set these environments up with a USB SSD on a regular ol’ laptop. I love takeaways like that!

Next, I went to the “Mary-Jo Foley and Paul Thurrott on the State of Microsoft” presentation. Now, I’m not really familiar with Mary-Jo or Paul, and I may have had preconceived notions of what their presentation was going to be, but in the first 10 minutes or so I found more “We don’t know, but…” statements than I wanted to. “We don’t know if Windows 9 is going to have feature X or not, but we have a screen shot from the [always trustworthy] Internet,” or “We’re looking forward to a Microsoft presentation on [datetime]. We don’t know what they’ll say, but…” I was less than interested in what they didn’t know, so I left early.

As a result, I found out where I should have been from the beginning, and that’s in Tim McMichael’s “Exchange 2013 Site Resilience” presentation. Boy, if you want to know about Exchange DAGs and clustering, Tim is the guy to follow. Unfortunately for me, I haven’t looked into Exchange 2013 very much, because there isn’t a plan at my company to move to it at this time, so I haven’t spent the cycles. Tim went through a whole host of scenarios for Exchange 2013 DAG and cluster failures, including the option for a third site <shock>.

I ended my day with Brian Desmond (AD MVP, @BrianDesmond) talking about all things ADFS and federation, and Microsoft’s new tool which will replace DirSync: AADST (Azure Active Directory Sync Tool). It was good to go over this stuff and to know where AADST isn’t as mature as DirSync and what kinds of things to expect.

Brian made a good point during his presentation: he said ADFS servers should be treated with the same level of security as domain controllers. After all, they hold potentially important information and, just like a DC, shouldn’t be available on the Internet.

Another great day and I still feel like I’ve gotten my money’s worth. One more day for me. I’ll have to try and get everything I can out of it.


Sep 16

IT/DevConnections Day 1

Today was day one (for me) of ITDevConnections 2014, held in Las Vegas. I wanted to do a quick post of some of the sessions I went through and some of the things I learned.

It started at breakfast, where I got talking to a couple of guys who work for a company doing a new form of marketing which I thought was interesting. I may not be saying this correctly, but the gist was they link banks with various companies such that when you use your ATM card at, in the example we talked about, Home Depot, you would automatically get a coupon applied to your order. Not exactly at that time, but some time down the road, the bank would apply the money you saved back to your account, like a refund. Couponless coupons, I think they called it. Anyways, interesting.

My first session was a presentation by Mark Minasi (@mminasi) called “Windows Clusters for Beginners: From Highly Fearful to Highly Reliable in 75 Minutes!” Now, I’ve used clusters before, but generally only as they pertain to Exchange. I went to this one hoping to get some new info I maybe didn’t know before. It’s always good to go over things, especially with an expert like Mark.

Mark has a great presentation style: very clear, very concise and very engaging. He took the time to talk with everyone before the presentation to get a little info on them and what they wanted to get out of it. The presentation was very much a starting point for learning about clusters. For me, it was good to go over things. Like I said, I don’t really live in clustering, and it was helpful to hear the history and how the bits and pieces worked. Mark is very good at presenting complex material in a straightforward way. If you get a chance to see one of his presentations, I would do it. He also presents on Pluralsight (if you don’t know what Pluralsight is, check it out).

Next was a presentation by Andy Malone (@AndyMalone). Andy is an MVP for Security and now a published sci-fi author (The Seventh Day). Andy’s presentation, “Office 365: Migrating Your Business to Office 365,” went through all the various ways in which mailboxes can be migrated, from using PST files to hybrid. There was only so much time and really a lot to cover, and Andy got it all in, complete with demos. Along the way Andy gave out some key pieces of intel which anyone doing a migration to Office 365 would like to have.

  • 9 out of 10 errors come from DNS issues (IMAP migrations)
  • OST files are recreated so be ready for that.
  • Where DirSync is needed and when it is not.
  • Dynamic Distribution Lists don’t migrate in a staged migration, nor do Send-As rights.
  • And more…

Lots of things to go over. There is a Hybrid migration presentation coming up that I’ll have to go to (If there isn’t something else I’m interested in more)

During lunch, the conversation was about landowners not having mineral rights in the North Dakota areas where they’re doing fracking, and how much COBOL coders are making because no one wants to code in COBOL! You meet interesting people at these conventions.

After lunch was a REAL treat, and one of the reasons I came to the ITDevConnections convention: Jeffrey Snover with Hemant Mahawar presenting on “PowerShell Desired State Configuration for Securing Systems.” Jeffrey called it “chewy,” as in lots of information to chew on, and boy was he right. The rough concept is: your environment is hacked <period, end of story>. Here is an easy way to create a secure, cocoon-like area where people can do their work. In short, you create a subdomain of the current domain and, using PowerShell and DSC, create a new environment that “the bad guys” can’t get into. Basically, strip out the domain admin permissions on systems and set up a “jumpbox” (a system that administrators need to go through) using PowerShell remoting that is stripped down to only the commands they need, where only the end users can read/edit/delete files. In the example, we were working with file systems. Here is a slide Jeffrey retweeted from someone in the audience: PowerShell DSC for securing systems slide.

I’m sure I’m not doing his presentation justice, so please don’t go by what I say alone. It was a great presentation plus we got to talk about some of the great new features in PowerShell v5 like classes! Such great stuff here. If you’re not using PowerShell you’re wasting your time.

The last presentation of the day, for me, was “MAPI/HTTP in Depth” with Bhargav Shukla, who works for Kemp Technologies. This may have been a bit too in-depth for me for the end of the day; I may still have been thinking about the DSC presentation before it. Bhargav did go over a lot of information about the transition from MAPI wrapped in RPC wrapped in HTTP to MAPI directly under HTTP, and what the pros and cons are. It seems as though you get better performance and a better end-user experience with MAPI over HTTP, but there is a higher processor cost on the Exchange CAS servers. In the long run, it may be worth it to make this change. I would speculate the change isn’t going away anytime soon.

So, it was a great day. I learned a ton of stuff and I feel like my first day alone was worth the trip. Did I mention I’m paying for all of this, not my employer or anyone else? I’m doing this for me, so I can be better at what I do, and it is totally worth it. I should have started doing this years ago.

Thanks for reading and stay tuned for day 2 & 3!


May 23

What’s in my profile

Recently the Scripting Guy! posted some MVPs’ and PFEs’ profiles. My own profile, generally fairly small, doubled after reading the PFEs’ profiles. So, I thought I would share my profile.

Initially, before reading the Scripting Guy blog, I had two functions: Get-TitleBar and Set-TitleBar. These two functions have been a lifesaver. I generally work with several PowerShell windows open at one time: one for handling all my message filtering in Outlook, one for “general” stuff (usually testing) and at least 3 PowerShell windows which are used exclusively for connecting to Exchange. Using Set-TitleBar makes my life much easier, as I set the title bar to whatever project that window is exclusively set for. So, my Outlook window is called “Outlook,” my general shell is called “General.” You get the idea.
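The actual functions aren’t shown here, but a minimal version of the pair might look like this, using the host’s RawUI window title:

```powershell
function Get-TitleBar {
    # Read the current console window title
    $Host.UI.RawUI.WindowTitle
}

function Set-TitleBar {
    param([Parameter(Mandatory)][string]$Title)
    # Set the console window title, e.g. Set-TitleBar 'Outlook'
    $Host.UI.RawUI.WindowTitle = $Title
}
```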

The rest of it is all stuff I added from the blogs mentioned above. I’m looking forward to using the PSProfiler!

Here is my profile:


Apr 24

Rename a distribution group in Exchange with PowerShell

Recently I was asked to rename some distribution groups in Exchange. Not so tough of a problem, but painstaking to do one at a time and there were a few of these to do. So I did what any good admin should do, I wrote a script to take care of this.

The issues
To properly rename a distribution group, you need to change not only the name of the group, but also the Alias, DisplayName and entries in the EmailAddresses field. The first two are easy, but if you’re like me, you need to add a handful of email addresses to these fields. In our environment, we tend to add records when a name is changed rather than swap the old for the new. This way, if someone uses the old address, it still goes to where they want it to. Probably a better answer would be to create a mailbox with the old address and set up an auto-reply that says “Hey, use the new address for this list.” That’ll be for another day.

Anyways, here is the script…
I should note that I’m keeping this function (along with some other Exchange functions) on GitHub. You can get the most recent version of this script here:


Some things worth noting

First of all, I’m trying to make every bit of code I write more of a tool for others rather than something I use in my environment only. As a result, I’m utilizing ShouldContinue and ShouldProcess more. In this script I use them twice: first when I change the Name, Alias and DisplayName fields.

The second time I use ShouldProcess and ShouldContinue is when I set the EmailAddresses field.
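The script itself is on GitHub, but the general shape of those two checks is the standard ShouldProcess/ShouldContinue pattern; the function and parameter names below are illustrative, not the actual script:

```powershell
function Rename-DistributionGroupName {
    [CmdletBinding(SupportsShouldProcess, ConfirmImpact = 'High')]
    param(
        [Parameter(Mandatory)][string]$Identity,
        [Parameter(Mandatory)][string]$NewName
    )

    # First use: honor -WhatIf/-Confirm before touching Name, Alias, DisplayName
    if ($PSCmdlet.ShouldProcess($Identity, "Rename to '$NewName'")) {
        Set-DistributionGroup -Identity $Identity -Name $NewName `
            -Alias ($NewName -replace '\s') -DisplayName $NewName
    }

    # Second use: an explicit yes/no prompt before rewriting EmailAddresses
    if ($PSCmdlet.ShouldContinue("Add new addresses to '$NewName'?", 'Update EmailAddresses')) {
        Set-DistributionGroup -Identity $NewName `
            -EmailAddresses @{ Add = "$($NewName -replace '\s')@contoso.com" }
    }
}
```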

Also, you’ll notice there isn’t much in the way of actual comments. This is a bit of a departure for me as I love my comments. But, rather than use comments, I’ve decided to use Write-Verbose so that everyone can share in “What should be happening.”

Let me know if you have any questions, or if you think there is a better way.


Mar 27

The 5 ways in which PowerShell can be used to make your work easier

PowerShell PowerShell PowerShell… It’s everywhere! You can’t read up on anything from Microsoft (or elsewhere) without it talking about how PowerShell is a major part of the subject, and yet there are still lots of admins and engineers out there not using PowerShell. One response I got as to why this is the case was “My way still works, why should I change?” This response will probably come from someone who thinks the cloud is just a fad (hint: it is not a fad).

I thought I would give a synopsis of the different ways in which you can use PowerShell.

#1 Automation

This you’ve probably heard, but it always bears repeating: PowerShell is an automation machine. Not only is it an automation machine, it does automation faster, easier and cleaner than anything previous (sorry, VBScripters… it’s true). Scripts I wrote in VBScript years ago I’ve since recreated in PowerShell in less than half the number of lines. Of course, you realize less code = fewer errors = less time spent working on those scripts. If you are not automating with PowerShell, you’re working too hard.

#2 Gathering Information

PowerShell is a fantastic tool for gathering information. Not only will it gather that information from multiple sources, it will put it all together and output it any way you want, often with little more than a single line of code. Your boss wants to work on something he or she thinks will be a huge undertaking? Gather the information with PowerShell and let your boss know in a few minutes whether it will or won’t be a huge undertaking. How many users have never logged on? How many Group Policies are not linked anywhere? How many Exchange mailboxes aren’t being used? Find this information easily with PowerShell. I don’t think your GUI will tell you this stuff.

#3 Reactive

Something just broke and it’s your job to find out what it was. I have a collection of little functions I call from a “parent” function any time something goes wrong in my Exchange environment. It checks disk space, mail queues and running services, to name a few. When someone comes into my office with a problem, the first thing I do is run my script. Any red flags are checked on, and in some cases we’ve gotten back up and running in no time. As issues (and their resolutions) come up, I add them to my script. PowerShell is a building process; you don’t have to have all the answers on day 1. As you build your library, your downtime will shrink, and everyone likes that.

#4 Proactive

Before anything goes wrong, before the red phone rings, be proactive with PowerShell. You can easily set up PowerShell to check your event logs on a nightly basis and send you (or your team) an email summary of events which may be of concern. Got System Center running? Add a PowerShell script which gathers Get-Process information when Operations Manager says a system is out of memory (OK, that might be reactive, but getting that little bit of info in System Center is proactive). Found a recurring problem? Write a script to check things on a regular interval. DSC, anyone?
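For instance, a nightly scheduled task along these lines (the addresses and SMTP server are placeholders):

```powershell
# Collect the last 24 hours of error-level events from the System log
$events = Get-WinEvent -FilterHashtable @{
    LogName   = 'System'
    Level     = 2                        # Error
    StartTime = (Get-Date).AddDays(-1)
} -ErrorAction SilentlyContinue

if ($events) {
    $body = $events |
        Select-Object TimeCreated, ProviderName, Id, Message |
        Out-String

    # Mail the summary to the team (placeholder addresses/server)
    Send-MailMessage -To 'team@contoso.com' -From 'monitor@contoso.com' `
        -Subject "Nightly event summary: $($events.Count) errors" `
        -Body $body -SmtpServer 'smtp.contoso.com'
}
```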

#5 PowerShell working for PowerShell

This is the BEST! This is breaking down the fourth wall of PowerShell! This is where you make PowerShell REALLY work for you and itself. Here is an example: a while back I needed to write a script which would run on a nightly basis and make regular changes to hundreds of groups. I won’t go into details, but think of it as: we gave instructions to hundreds of admins, and this was our insurance policy that things were done right. Anyway, I’m working on the script, and I add copious amounts of logging: one log for changes made, one for errors that occurred and one super-verbose log for me in case I needed to make a change. At the time, there was potential that the changes this script made would break users’ functionality. I started to think about that. What I did was this: when it came time to make the change, and the change was successful, I also wrote a command to a script file which reversed the change. Not only was I reactively automating the change, I was proactively writing the script which would reverse the change if there was ever a problem!
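The idea in miniature (the real script wasn’t necessarily about group membership; this is just to show the pattern of writing the reversing command as you go):

```powershell
$undoScript = 'C:\Logs\Undo-Changes.ps1'   # placeholder path

# Make the change...
Add-ADGroupMember -Identity $group -Members $user

# ...and if it succeeded, append the command that would reverse it
if ($?) {
    "Remove-ADGroupMember -Identity '$group' -Members '$user' -Confirm:`$false" |
        Add-Content -Path $undoScript
}
```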

Here is another way PowerShell can make your life easier… I often need to gather message tracking logs from Exchange. To do this with a certain amount of specificity, I need to write a single line of PowerShell which is generally fairly long. It’s not much use to write a function for it because calling the function wouldn’t be much shorter than the original. So, I wrote an “example module”: a module where I call various commands to get *my* examples for commands. All these functions do is write my examples to the screen (color-coordinated, of course). All I need to do then is copy the one most like what I need, paste it at the command line and make a few changes.
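One of those example functions might be as simple as this (the command text and server/address are illustrative, not the actual module):

```powershell
function Get-MessageTrackingExample {
    # Write a go-to message tracking one-liner to the screen so it can be
    # copied, pasted and tweaked at the command line
    Write-Host 'Get-MessageTrackingLog -Server EX01 -Start (Get-Date).AddHours(-4) ' `
        -ForegroundColor Cyan -NoNewline
    Write-Host '-Sender user@contoso.com -EventId FAIL' -ForegroundColor Yellow
}
```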

Windows PowerShell for Developers

There are dozens of other ways you can get PowerShell to work for you. Get “Windows PowerShell for Developers” by Douglas Finke (Microsoft MVP) if you really want to get an idea of the kinds of things you can do to make PowerShell work for you.

PowerShell is really an amazing tool. Its versatility lends itself to great creativity.

For those who haven’t started with PowerShell, don’t feel overwhelmed. PowerShell (like ANY language) is a building process. Don’t worry, you’ll get there, and you will be so happy when you do.

If you know of other ways in which PowerShell can be used, I’d love to hear them.

Thanks for reading

Mar 20

Why you should make your PowerShell scripts publicly available

In my environment, I am the “PowerShell guy.” Just about everyone around me still does things the old-fashioned way and sometimes, I think, they roll their eyes a little bit when I mention writing a script or doing it in PowerShell. Some people are coming around, and they are learning with a bit of help from me (future post: why you should teach at EVERY opportunity).

As a result of scripting in a bubble, I write a lot of scripts and functions for me and me alone. Because I know exactly how some piece of code works, I don’t really worry about all the help that should be in there, or about systems which are not set up the same as mine. You know… I get sloppy! This is a bad thing.

The other day I participated in a short GitHub workshop, which was just enough to get me interested, and now I’m hooked on the idea. Not only the idea of having versioning and branching and all that great stuff, but of putting *my* work out “there.” Anyone can see what I put together. As a test, I created a repository for my “RBACHelper” module. This is a small module of a few functions which I find very helpful when working with Exchange Role Based Access Controls. Shortly after putting up the initial version, I realized there were large sections of commented code, full of little tests I had in place while developing, and there wasn’t much, if any, help information.

I quickly made some changes and updated my module because I was embarrassed by the less than professional appearance of my code. Now, if you are a person who writes code, hopefully you are the kind of coder who strives to be better all the time. Maybe we’ll get there, maybe the journey will never end. For me, making my code publicly available on a wider stage (outside my little blog here) will force me to be a better coder and not so disheveled.

I welcome the challenge, and I recommend you do too… if you want to be a better coder.