Saturday, January 31, 2009

Listening to Dave Edmunds

Many, many years ago, I spent a bank holiday in Amsterdam with a lifelong friend. We had a pretty outrageous time. The highlight was seeing Dave Edmunds in the Paradiso. We had outstanding seats and the evening was magic. It was the early 1980s and Dave was in his prime. The sound was outstanding. The atmosphere was even better!

I was reminded of this when I came across a bootleg concert featuring Dave and The Refreshments (a Swedish band). The show features Billy Bremner. It’s so wonderful to hear tunes like “I Knew the Bride”, “Here Comes the Weekend” and “Standing at the Crossroads”. It’s a real shame Debra is not on their set list. If you can find this show (it’s on a few popular torrent sites), it’s worth the download.

Let it Roll!

Friday, January 30, 2009

The MCSE Is Dead – It’s Official!

In June 2007, I posted that MSL was planning on ditching the MCSE. It was not a popular post – within a few hours of it going up, I was asked by MS to take it down. Well, now Bill Chapman has confirmed: The MCSE IS DEAD.

This was disappointing a couple of years ago, and it is disappointing now. Microsoft has taken a hugely popular and well understood brand and replaced it with, well, what? Replacing a credential with “Engineer” in the title with one saying “Administrator”? How is that an improvement? Does MSL really expect anyone to get excited about “upgrading” an Engineer credential to an Administrator credential?

I was one of the first MCSEs in the world, and am still quite proud of that credential. I’m not sure I could ever get 1% as excited about being an Enterprise Administrator. It just sounds like a step down and backwards. As a result, I’m just not chasing the new credentials anywhere near as fast as I used to. I know I am not alone.

I think Bill and the rest of MSL have made a huge mistake in killing off the MCSE. The fact that he hears the question (“Where is the MCSE 2008?”) on a daily basis indicates the market is nowhere near clued up on MSL’s rebranding (and, like “New Coke”, doesn’t accept it). Despite several years of trying, the MCSE is still a better known credential than Enterprise Administrator.

I spent some time in the autumn of 2007 trying to convince MSL that they needed to invest in evangelising the new credentials, otherwise, in a year or two’s time, the new credentials would still be unknown. Two years on, MSL still hears ‘the question’ on a daily basis (or at least Bill does).

MSL made a mistake – it’s time to correct that mistake and bring back the MCSE. Learning from one’s mistakes is hard to do, but Coke did it! MSL should take a hint and do the right thing.


Thursday, January 29, 2009

Get-Scripting Podcast Episode 7 – Available Now

The latest edition of the Get-Scripting podcast is now available for your listening pleasure. The interview this month is with yours truly – Alan had his pod-mike with him at the last UK PowerShell user group meeting and we chatted about all things PowerShell. It was a lot of fun.

One of the things we chatted about was when I first saw PowerShell. I was in the back of a big hall in LA, attending PDC 2003, and watched Jim Truher and Jeffrey Snover give Monad, as it was then called, its initial airing. I got quite excited – and waved a US$20 bill in the air and proclaimed “I’ll buy it now!” Jeffrey wisely declined, but many years later (and a bit worse for wear), I finally presented the $20 to him at an event in Redmond. You can see the $20 here and here.

It was one of those events I’ll not forget – it began a passion that continues to this day. As I say on the interview, nothing that has happened since – the community, the wealth of cmdlets, etc – has been a surprise at all. About the only surprise is that it’s taken a bit longer than I’d have hoped.

Enjoy the interview!


Wednesday, January 28, 2009

PowerShell Virtual User Group Meeting

If you are around and on line on Thursday evening (12:00 EST, or 17:00 UK time), check out the Windows PowerShell Virtual User Group Meeting #8. This meeting will feature June Blender from Microsoft – no doubt she’ll be talking about the PowerShell Documentation.

Go to the site to get the credentials to the Live Meeting. No idea if the session is being recorded!


I just had mail from Marco – it turns out he records all the meetings. See his master page at: for all the back recordings!


QDB: Quote #244321 – The joys of social engineering

I’ve long enjoyed examples of social engineering, hoping I’d never fall for them. The IRC chat, at QDB: Quote #244321, is a great example of this. Take a read (I had to read it twice!).


Tuesday, January 27, 2009

PowerShell Code Repository - Get-Exception

I’ve been working with PowerShell’s error handling – trying to get a better handle on it. Yesterday, I looked at general error handling and discussed how to trap or catch an exception. By handling errors in your scripts, you make them more production-ready. And for many, this is a great thing!

One of the problems in trapping or catching an error is knowing what to look for. You can do a blanket catch, such as:

  try { 
     ... some dodgy bit of code ...
  }
  catch { 
     "code broke..." 
  }

At one level this works. You can catch an error and avoid your script dying – and maybe return some interesting info. But there’s often a need to do a better job in detecting different exceptions and handling them differently. As I noted yesterday, a simple script could result in a number of discrete errors, each of which might mean different actions in your script. Along these lines (with PowerShell V2 of course):

  try { 
     ... some dodgy bit of code ...
  }
  catch [... exception 1] { 
     "code broke due to exception 1..." 
  }
  catch [... exception 2] { 
     "code broke due to exception 2..." 
  }
  catch {
     "no idea why your code broke..."
  }

In this second example, you are explicitly testing for two specific exceptions as well as a more general one. The idea here is that for the first two exceptions, you might be able to find some specific information to help the caller (or you) work out what went wrong and deal with it.
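To make that concrete, here is a minimal, runnable sketch of the pattern (the file name is generated on the fly so that it cannot exist, meaning the FileNotFoundException block is the one that fires):

```powershell
# Sketch: catching a specific exception type vs. a general one (V2 syntax).
# The file name is generated so that it cannot exist.
$file = Join-Path ([System.IO.Path]::GetTempPath()) ("missing-" + [Guid]::NewGuid() + ".txt")

try {
    $text = [System.IO.File]::ReadAllText($file)
}
catch [System.IO.FileNotFoundException] {
    "The file '$file' does not exist"
}
catch [System.UnauthorizedAccessException] {
    "The file exists, but you are not allowed to read it"
}
catch {
    "Something else went wrong: $($_.Exception.Message)"
}
```

The first catch block handles the “file missing” case specifically, while the final catch mops up anything unexpected.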

The big problem is knowing what exceptions you can trap. I saw a cool script today online - PowerShell Code Repository - Get-Exception. This script looks at all the loaded assemblies and works out what possible exceptions exist and then outputs them. Makes interesting reading and is a great reference.


Monday, January 26, 2009

Error Handling with PowerShell

At last week’s UK PowerShell User Group meeting, we heard a great talk on Error handling. It got me thinking about the whole business of errors and error handling in PowerShell.

One of the great differences between casual and production scripts is the need to manage, control and handle errors. If I write a script to open a file, e.g. c:\foo\gd.txt, I know it exists: I created it, I regularly edit it, and I can see it in my folder. So why bother with error handling?

For casual scripting, the answer is probably that you don’t need to worry. But when you start moving scripts into production, you can’t always be so sure. Stuff happens in production and your scripts need to deal with that.

There are three sets of error handling statements included in the PowerShell language:

  • Trap – allows you to trap any errors that occur in your code. You define the trap block before any risky code is executed. Trap is part of V1 and included in V2.
  • Try/Catch/Finally – these three statements allow you to try some dodgy code, catch any errors that occur in that dodgy code then do any clean up (whether or not an error occurred). Try/catch/finally is an addition to V2 and is not supported in V1.
  • Throw – this allows you to detect an error (for example in a subordinate script or function) and throw an error for a higher level script or function to catch or trap. This statement is in both V1 and V2.

So which do you use? It depends, I suppose. For most scripts, you probably won’t need to use Throw. If you have some utility function, you might enable it to catch specific errors and then throw an exception to callers of that code. Typically, a throw sits inside the subordinate’s trap or catch block: you handle the error, perhaps do some clue generation (e.g. does the file you are trying to open exist, or is the drive just not accessible?), then throw a more specific error. Try/catch (and finally if needed) should surround any production script logic that could fail.
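As a sketch of that throw-from-a-subordinate pattern (the function name, path and error text are all mine, purely for illustration):

```powershell
# Hypothetical helper: catch a low-level .NET exception, add a clue,
# then throw a more specific, higher-level error for the caller.
function Get-ConfigFile {
    param([string] $Path)
    try {
        [System.IO.File]::ReadAllText($Path)
    }
    catch [System.IO.IOException] {
        # File or folder missing - turn it into a friendlier error
        throw "Config file '$Path' is missing - has the app been installed?"
    }
}

# The caller now traps/catches the higher-level error instead:
try   { Get-ConfigFile -Path 'C:\no\such\config.txt' }
catch { $_.Exception.Message }
```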

To illustrate this point, take a look at the Get-StockQuote script (see for full script). At the core of this script is the following code:

Process {  
   $s = New-WebServiceProxy -uri  
   foreach ($symbol in $ticker) {  
      $result = [xml]$s.GetQuote($symbol)

There are (at least!) two things that could go wrong with this fragment. First, the New-WebServiceProxy cmdlet could fail – in fact, of late, this call has been producing errors (the site appears down). Second, the GetQuote web service call could fail. There is a third source of error ($ticker being empty), but that would be dealt with via parameter declarations, which I omitted from this snippet!

As a casual script, just showing the basics of using this web service, I don’t need to worry about such things. But if this was a web service I depended on (and it is outside my direct control!), I should have either one (or possibly two) trap blocks, or two try/catch blocks. I could expand this code as follows:

process {  
   try { 
      $s = New-WebServiceProxy -uri 
   } 
   catch { 
      "StockQuote web service not available"; $error[0] 
   } 
   foreach ($symbol in $ticker) {  
      try { 
         $result = [xml]$s.GetQuote($symbol)  
      } 
      catch { 
         return ("Can not get stock quote for {0}" -f $symbol) 
      } 
      # return results 
   } 
}

This code is a bit longer, but it enables the script to run and not abort. When errors do occur, though, this script fragment won’t produce the results you might have hoped for. In both catch blocks, I’ve not added much in the way of error handling. In the first catch block, some additional detection might include:

  • Checking the host IP configuration to see if the host is on-line and host networking is enabled.  There might be a local issue with a cable being unplugged.
  • Checking the local IP gateway. Networking might be OK, but the gateway might be down.
  • Determining if the site name ( was resolving via DNS to an IP address. Networking and gateways might be OK, but the DNS resolution of the site might be down.
  • Checking policy to see if the site’s IP address is being blocked. You might be able to ask the firewall(s) if they’re blocking traffic to the service.
  • For SSL/TLS-protected web services – checking whether you can create the secure tunnel. Is the certificate protecting the site duff?
  • Pinging the site to see if it’s actually up. DNS may resolve, but the site might be down.
  • Opening the home page for the site to see if there’s a working web site at the root. The site might be up but the web service gone.
  • Etc. I suspect there are more things you could check for.
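A couple of those checks can be sketched in a few lines (the host name is just a placeholder, and note that Test-Connection is a V2 cmdlet):

```powershell
# Hypothetical diagnostics you might run inside the first catch block.
$site = 'www.example.com'   # placeholder for the web service host

# Does the name resolve in DNS?
try {
    $ips = [System.Net.Dns]::GetHostAddresses($site)
    "DNS OK: $site resolves to $($ips[0])"
}
catch {
    "DNS lookup for $site failed"
}

# Does the host answer pings? (Bear in mind ICMP may be firewalled.)
if (Test-Connection -ComputerName $site -Count 1 -Quiet) {
    "$site responds to ping"
}
else {
    "$site did not respond to ping"
}
```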

Moving on in the code: in the second catch block, you could have checked the error details to produce a more relevant exception. Simply returning an instance of $Error puts the onus on the caller to handle the error. Also, in both catch blocks, you could have thrown an exception for the caller to handle rather than just continuing.

As I think you can see, a pretty simple script that just creates a web proxy then calls it (effectively 2 lines of code) could end up with 10 times that many lines of error checking/handling code. This is one aspect of writing production-oriented PowerShell scripts.

Something related to this, of course, is the $Error variable. This is a subject worthy of an entire book chapter!

I’m Going to TechEd US!

Every year, like many of you, I submit proposals for TechEd. In recent years, they’ve been less than successful – the competition is very fierce and there is so much great content to choose from. I almost have to feel sorry for the Microsoft folks having to wade through hundreds of proposals.

This year, I submitted three talks, two around PowerShell and one around OCS. To my great surprise and delight, one has thus far been accepted! YEAH

The accepted breakout session, part of the Unified Communications track, is entitled “SIP - Naked In All Its Glory”. The abstract for the talk is:

This session will look at the key protocols behind Microsoft's OCS product, in particular Session Initiation Protocol (SIP) as well as Real-time Transport Protocol (RTP) and Session Description Protocol. A short discussion on TCP, TLS, and PSOM will also be given. The talk will focus on looking at SIP and related protocols "on the wire". We'll dive in to using NetMon as well as OCS's Snooper tool. Troubleshooting SIP will be the final part of this talk.

I’m still waiting on the other two talk proposals. The other proposed talks are PowerShell related and are titled “PowerShell and WMI” and “Writing Production Quality PowerShell Scripts with PowerShell V2 and Windows Server 2008 R2”. I await the verdict on those two talks, but would like to think that at least one will get accepted.

Irrespective of the PowerShell talks, I’m really excited that I’ll be a speaker again. LA, here I come!


Sunday, January 25, 2009

Pismo File Mount Audit Package

I’ve just got a new laptop (Dell Latitude E6500) and am running Win7 Beta. The combination is awesome – my old laptop (running XP) looks so old and dated.

One thing I’ve relied on for many years is Daemon Tools. I use it to mount ISO images of software and then use them. However, Daemon Tools does not install on Win7 (you get a known-incompatibility error message when you try to install it). From reading their forums, it looks like Win7 support will come, but not any time soon. In the meanwhile, I needed a solution.

So I sent out a twitter message and within a few seconds, I had a reply from Chris Randall pointing me to Pismo Technic Inc. - Pismo File Mount Audit Package.

This is a seriously nice package. Not only can it mount ISO images, it also mounts VHDs and ZIP files. Another cool thing about this tool is that it’s free! I’ve downloaded and installed it, and have used it to install Office on my new laptop.

Thanks Chris, and thanks Pismo!


Saturday, January 24, 2009

More on PowerShell’s #Requires Statement

I recall the moment of amusement when Jeffrey Snover revealed the #Requires statement at the first release of PowerShell V2. It had been in PowerShell V1, but was a closely held secret. As it turns out, with V2’s beta, the #Requires statement gained additional functionality, as I pointed out in a recent blog post. With the release of V2 coming sooner rather than later, the question is whether this statement is rich enough (and whether it can be improved in V2). Let me explain what I mean by that.

The concept that a script or even a function/cmdlet requires some set of pre-requisites is obvious. If, for example, I have a script that relies on the Quest AD cmdlets, then I’d like to impose a rule that says the script needs those cmdlets. There’s no point in running a script that requires something not present on my system. I believe I should be able both to describe this requirement and to have autohelp surface that information.

The #Requires statement comes some way towards meeting this general requirement, but I think it needs more than that. For me, the question becomes: what does a script need to ‘require’ and, more importantly, how does it signal that need? As I see it, those are two separate questions: what you NEED, distinct from how to express that requirement.
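For reference, this is roughly what the statement can already demand; a two-line sketch (the snap-in name is the Quest one, quoted from memory, so treat it as illustrative):

```powershell
#Requires -Version 2
#Requires -PSSnapin Quest.ActiveRoles.ADManagement

# If either requirement is not met, PowerShell refuses to run the
# script at all, rather than letting it fail part-way through.
```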

Taking the second question first – I am not sure that just using the #Requires statement is enough. As I think about it, the need should be expressed in PowerShell autohelp. Autohelp is that set of instructions you can put at the start of a function, module or script that can later provide help for users of your code. I realise that the intention of autohelp was to, oddly enough, help. But it seems to me it could also be used to express the need a chunk of functionality (a module/script/function) might have for some pre-requisite. So I think there should be a .REQUIRES statement in autohelp, for example:

  <# 
  ... 
  .REQUIRES 
      VERSION 2 
      MODULE Quest-ADTools 
      DLL tfl.reallycoolDLL.dll
      ELEVATION
      ASSEMBLY globalknowledge.powershell.ocs2007r2
  ... 
  #> 

You should be able not only to document the requirements but to demand them in this autohelp block (i.e. the autohelp block becomes more than just documentation). This would be a very elegant solution.

But separate from HOW you express the pre-reqs, I think V2 needs to expand what #Requires provides. Specifically, I’d like to see the #REQUIRES statement expanded to allow you to specify:

  • A specific module
  • Elevation (should it need to run elevated)
  • A specific named DLL
  • A specific assembly (i.e. in the GAC)
  • etc

I’d like to see the advanced function’s autohelp block be really useful!


Blog Posting This Week

It looks like some of my daily blog posts did not quite make it up to their sites on time. The posts (for both this blog and my PowerShell Scripts blog) were written offline and I thought they had been successfully posted. My Blogger-fu must be weak this week, but the latest posts and PowerShell scripts are now up and running.

I’d love to have the time to spend to write some PowerShell scripts against Live Writer and Blogger to check that posts actually hit the blog properly! Maybe in another life when days are 40+ hours long!

Friday, January 23, 2009

Get-Scripting Podcast

I recently wrote about a cool weekly podcast, The PowerScripting Podcast, delivered by Jonathan Walz and Hal Rottenberg. This is a great resource, and you can watch it in progress every Friday morning at 02:00 UK time. For most normal folks, this may be a bit much to ask, but you can of course download it later. One interesting aspect – the podcast is broadcast with a live chat going on. Sadly, this week, that failed, as noted over on the podcast’s blog site.

More recently, I discovered a UK podcast, the Get-Scripting Podcast, presented by Jonathan Medd and recently co-hosted by Alan Renouf. This is a monthly podcast and is prepared partly in advance (the interviews are pre-recorded). You can download this podcast from the blog site or via iTunes.

At last week’s PowerShell user group meeting, I was interviewed for the Get-Scripting podcast. If I understand it correctly, the interview will be included in the next podcast, due for release this week. It was a lot of fun to record. I spent some time talking about my first experience with PowerShell (over 5 years ago, when it was still called Monad) – I hope the $20 story will be interesting to some. I also spent some time talking about the Microsoft PowerShell class which I helped develop.

Thursday, January 22, 2009

Interview for the Get-Scripting PodCast

As I noted yesterday, I attended a great PowerShell user group meeting on Friday night. One highlight that I forgot to mention was that I was interviewed for the Get-Scripting podcast. This will be broadcast sometime next week.



Wednesday, January 21, 2009

Group Policy PowerShell Cmdlets in Windows 7

I’ve been writing this week about using GPOs with PowerShell. I noticed today, over on the Group Policy Team blog, a neat article: Introduction to Windows PowerShell Cmdlets in Windows 7. With Windows 7 and Server 2008 R2, we’ll have decent cmdlet support for Group Policy – no more hard-core COM objects.

The group policy support is meant to be implemented by a module (grouppolicy), although that module does not exist in the beta versions of either Windows 7 or Server 2008 R2.

I’m looking forward to seeing this module when it gets completed!


Tuesday, January 20, 2009

The Dead to Perform at Obama Inauguration

On several levels, this is pretty cool. What’s left of the Grateful Dead will play today at the inauguration of Barack Obama. See Relix Magazine’s article - The Dead to Perform at Obama Inauguration. I’ll be in class all day, but I sure hope to catch this!

It turns out this story is a good excuse to update all the PowerShell scripts I use to manage my Grateful Dead/Jerry Garcia archive. I’ve got getting on for 1.5 terabytes of live shows by the Grateful Dead and Jerry Garcia (in his many incarnations as a solo act apart from the Dead). I developed a few scripts in the .MSH days and I need to dust them off and update them.

Time for a DEADHEAD Module? :-)


Monday, January 19, 2009

More on Arrays vs Scalars

In a post yesterday, I looked at telling the difference between scalars and arrays. The difference matters when working with PowerShell – a scalar has properties and methods, while an array has members that have properties and methods. You can directly reach into a scalar, whereas you need to process array members (e.g. using ForEach). If you assign the output of a cmdlet (or a pipeline of cmdlets) to a variable, you may end up with zero, one or more objects inside that variable. When you go to process that variable in your script/function, you need to know the difference.
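A quick illustration of the difference, using a couple of cmdlets purely for demonstration:

```powershell
# A scalar: one object whose properties you can reach into directly
$scalar = Get-Item $PSHOME
$scalar.FullName            # the install folder's full path

# An array: a collection whose members you process, e.g. with foreach
$array = Get-ChildItem $PSHOME
foreach ($item in $array) { $item.Name }
```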

In yesterday’s post, I noted that with GPO objects, what you get back is a collection, even if the collection has only one object. A comment left on the blog made a great point: this is a feature of the particular object model (i.e. the GPMC objects). It’s not a feature of COM as such.

And this just underlines my conclusion – there’s no easy way to tell what you have without knowing the object and the underlying object model.

Sunday, January 18, 2009

Managing a Music archive with PowerShell

I have a lot of digital music, as many of my friends, and readers of this blog, know. I’ve been slowly getting around to writing some PowerShell scripts to manage this archive. This afternoon, someone asked me how many Jerry Garcia shows I had on my system. I could not recall off the top of my head, so I wrote a short script to do the counting, and published it over on my PowerShell Scripts blog. The Get-CountOfJerryShows.ps1 script may not, perhaps, be the most elegant of PowerShell scripts. And I know I could do it in fewer lines. But it works and it’s something I am likely to be able to read and understand in the future, if/when I want to update it.

The answer, thanks to PowerShell: I currently have 734 live Jerry Shows! This is in addition to my 1144 Grateful Dead shows.

The latter is a lot simpler, as I just store all the Grateful Dead Shows in a single folder. To get a count of Dead shows, I just type:

   cd m:\gd; (ls | where {$_.Name -match "^gd"}).count

Determining If An Object Is An Array

Over on Hal Rottenberg’s blog (TechProsaic), he recently posted an interesting article about how to tell if an object in PowerShell is an array or a scalar. His solution was to use the GetType method and look at the BaseType property of the object, like this:

PS C:\foo> $array = 1,2,3
PS C:\foo> $scalar = 1
PS C:\foo> $array.GetType()

IsPublic IsSerial Name                                     BaseType
-------- -------- ----                                     --------
True     True     Object[]                                 System.Array

PS C:\foo> $scalar.GetType()

IsPublic IsSerial Name                                     BaseType
-------- -------- ----                                     --------
True     True     Int32                                    System.ValueType

Looking carefully at what the GetType method returns, there’s an “IsArray” property, as follows:

PS C:\foo> $array.GetType().IsArray
True
PS C:\foo> $scalar.GetType().IsArray
False

Pretty cool and to some degree more .NET-ish than another approach I’ve used. In some scripts, I’ve just checked for the existence of the Count property: an array has one, but a scalar doesn’t. I do something like this:

PS C:\foo> if ($array.count) {"array"} else {"scalar"}
array
PS C:\foo> if ($scalar.count) {"array"} else {"scalar"}
scalar

These three approaches (BaseType, IsArray, and checking for Count) all work – they can help determine whether an object (i.e. a PowerShell variable) is an array or just a scalar. However, they don’t always work. In particular, they do not give proper results if you are using COM objects. To show this problem, let’s take a look at the Group Policy Management Console COM interface. In this example, I search for all the GPOs in my domain. There are currently just 4 GPOs – and these are returned. However, although checking the Count works, the BaseType is not so helpful. Here’s a small script I wrote to demonstrate this:

# Setup GPMC
$gpm  = new-object -com GPmgmt.Gpm
$k    = $gpm.getconstants()
$dom  = $gpm.getdomain("", "","")   
# Create an empty search criteria object (without this, $sc is undefined)
$sc   = $gpm.CreateSearchCriteria()
# Search for all GPOs
$gpos = $dom.SearchGPOs($sc)
# Now - result time
"Type         :{0}" -f $gpos.gettype()
"Base Type    :{0}" -f $gpos.gettype().basetype
If ($gpos.Count) {"Seems to be an array"} else {"Seems to be a scalar"}
"Count        :{0}" -f $gpos.Count
"An array?    :{0}" -f $gpos.gettype().IsArray

The results of this were a little surprising:

PS C:\foo> . 'C:\Users\tfl\AppData\Local\Temp\Untitled4.ps1'
Type         :System.__ComObject
Base Type    :System.MarshalByRefObject
Seems to be an array
Count        :4
An array?    :False

As you can see, the $gpos object appears to be an array and has a Count – but GetType says it’s not an array. Also, the Type and BaseType don’t actually help all that much.

If I change the code above marginally (to search for just one GPO) as follows:

# Setup GPMC
$gpm  = new-object -com GPmgmt.Gpm
$k    = $gpm.getconstants()
$dom  = $gpm.getdomain("", "","")   
# Create a search criteria object, then add a criterion
# so as to search for just ONE GPO
$sc   = $gpm.CreateSearchCriteria()
$sc.add($k.SearchPropertyGPODisplayName, $k.SearchOpEquals, "GPO1")
# Now search
$gpos = $dom.SearchGPOs($sc)
# Now - result time
"Type         :{0}" -f $gpos.gettype()
"Base Type    :{0}" -f $gpos.gettype().basetype
If ($gpos.Count) {"Seems to be an array"} else {"Seems to be a scalar"}
"Count        :{0}" -f $gpos.Count
"An array?    :{0}" -f $gpos.gettype().IsArray

The results of this were even more surprising to me:

PS C:\foo> . 'C:\Users\tfl\AppData\Local\Temp\Untitled4.ps1'
Type         :System.__ComObject
Base Type    :System.MarshalByRefObject
Seems to be an array
Count        :1
An array?    :False

This time, although only one object was returned, the Count property exists and the Count check still says it’s an array! I wonder if I was the only person to be marginally confused!

While the three techniques above work great for .NET objects, they don’t work well for COM objects. You just have to know what the underlying API (COM, .NET, etc.) returns. In the case of the GPMC, the APIs seem to always return a collection, even when there’s only one object occurrence returned. For most harder-core developers, this really is not an issue, but for admins using PowerShell it can be very confusing (it took me several hours to work this out). It’s one of those things that, if you are to use PowerShell richly, you just have to know.
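One defensive technique – not from the discussion above, but common PowerShell practice – is to wrap a result in the array sub-expression operator @(), which always yields an array no matter what came back:

```powershell
# @() returns its contents as an array: a scalar becomes a one-element
# array, an empty result becomes an empty array, and an existing
# array keeps its elements.
$one  = @( 1 )
$many = @( 1, 2, 3 )
$none = @( )

$one.GetType().IsArray      # True
$one.Count                  # 1
$many.Count                 # 3
$none.Count                 # 0
```

Wrapping a call such as $gpos = @($dom.SearchGPOs($sc)) should mean the result can be treated uniformly as a collection, whether zero, one or many GPOs come back.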


Saturday, January 17, 2009

Train Signal – Exchange PowerShell Training

Train Signal has released a new course: Microsoft Exchange Server 2007 PowerShell. This course includes 5 hours of video instruction and is delivered on two DVDs.

The course is designed to teach the basic concepts and capabilities of PowerShell. Key topics include:

  • Variable Types, Manipulation, and Scope
  • Functions and File Manipulation in PowerShell
  • Active Directory and Exchange Objects
  • Database Creation
  • Creation and Automation of Recurring Administrative Tasks
  • Creation and Management of Distribution Lists

The cost of this package is US$995, although there’s a special limited-time offer price of US$197.


Friday, January 16, 2009

Jeffrey Snover and Bruce Payette on the PowerScripting Podcast

I love downloading podcasts to my Zune and listening to them as I travel. I’ve got a bit of a backlog, but one I’ve just downloaded and will be listening to shortly (possibly tomorrow as I head from Milan back to London) is the PowerScripting Podcast, Episode 53, which features Jeffrey Snover and Bruce Payette.

You can listen to the podcast on your PC, or download it to your MP3 player. The file, PSPodcast-053.mp3, is just over 38 MB.


Thursday, January 15, 2009

Running PowerShell Scripts via Email

One of the things I love about the PowerShell community is the innovative ideas that come up. Dmitry Sotnikov, from Quest, posted a really neat blog article that shows how to run PowerShell scripts from Outlook email.

Basically, this is done by setting up a rule in Outlook that detects a specially coded email (e.g. a subject header containing “PowerShell” or similar). The rule then runs a VBScript script which in turn runs the PowerShell script.

Just when you think you’ve seen it all, the community delights and surprises.


Wednesday, January 14, 2009

Date and Time in PowerShell (and WMI)

In yesterday’s PowerShell script of the day entry, posted over on my PowerShell Scripts blog, I re-implemented an MSDN sample (originally written in VBScript) that calculates uptime. The script first uses a WMI class (Win32_OperatingSystem) to determine when a computer started. It then gets the current time, works out the difference, and displays it (i.e. the current uptime).

The .NET Framework contains a class, System.DateTime, that provides a variety of time/date-related features. The PowerShell cmdlet Get-Date returns a System.DateTime object containing the current date and time. You can use the methods and properties of the class to get at aspects of that date and time as you need. For example:

PSH [D:\foo]: $d=Get-Date
PSH [D:\foo]: $d.Date

14 January 2009 00:00:00

PSH [D:\foo]: $d.Year
PSH [D:\foo]: $d.IsDaylightSavingTime()

You can, of course, pipe your DateTime object to Get-member to see the other properties and methods on this class (or refer to the MSDN documentation). There’s a lot of pretty rich date and time handling available.

The problem I had with yesterday’s script is that WMI uses a different format for date and time. The Win32_OperatingSystem WMI object has a property, LastBootUpTime, which returns the date and time the OS last booted. However, WMI returns this as a string formatted rather differently from System.DateTime, as you can see here:

PSH [D:\foo]: $os=Get-WmiObject Win32_OperatingSystem
PSH [D:\foo]: $os.LastBootUpTime

This demonstrates that .NET (and therefore PowerShell) and WMI use native date/time formats that differ. For many admins (and for most native-level developers) this is not a big deal. But if you need to inter-operate, you have a small issue of converting between the two formats.

As it turns out, solving this issue is simple: the developers of .NET created a simple solution. There’s a .NET class, System.Management.ManagementDateTimeConverter, which does date and time conversion. This class has a method, ToDateTime, which converts a WMI date string into a .NET DateTime object. I used this method in the WMIDateStringToDate function in Get-Uptime.ps1.

You can either implement such a function (i.e. WMIDateStringToDate) in your profile or cut/paste it into any WMI script you write. Naturally, you could also just use the .NET method directly, as follows:

PSH [D:\foo]: $os=get-wmiobject win32_operatingsystem
PSH [D:\foo]: $time = $os.lastbootuptime
PSH [D:\foo]: [System.Management.ManagementDateTimeconverter]::ToDateTime($time)

12 January 2009 14:24:58

PSH [D:\foo]: ([System.Management.ManagementDateTimeConverter]::ToDateTime($time)).Hour
14

As with most things PowerShell, easy stuff is very easy while complex stuff is often just a method call away.
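A small wrapper along these lines (the name matches the function mentioned above; the body is a minimal sketch) keeps scripts readable, and the same class converts in the other direction too:

```powershell
# Minimal sketch of a WMI-date conversion helper (assumes the
# System.Management assembly is available, as it is in Windows PowerShell)
function WMIDateStringToDate ($wmiDate) {
    [System.Management.ManagementDateTimeConverter]::ToDateTime($wmiDate)
}

# The class also converts the other way - from a DateTime to a DMTF string
$dmtf = [System.Management.ManagementDateTimeConverter]::ToDmtfDateTime([datetime]::Now)
$dmtf                       # a WMI-style date string
WMIDateStringToDate $dmtf   # round-trips back to a DateTime
```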

Monday, January 12, 2009

Blogging in 2008

In 2008, I spent a lot of time blogging, both here in Under The Stairs and on my PowerShell Scripts blog, as well as on my corporate blog. The first two of these had a lot of PowerShell content (the PowerShell Scripts blog, especially, was all PowerShell).

Traffic varied a bit over the year, but really picked up in the autumn, as follows:



[Table: monthly unique visitors and page views for Under The Stairs and for the PowerShell Scripts blog – the figures themselves have not survived.]
As this table shows, I had more traffic towards the end of the year, which I attribute to posting more often. I was motivated by what I saw at TechEd to up my rate of posting, and that has paid off in more traffic.

The PowerShell Scripts blog was started in July. However, from August to September it did not see much traffic, mainly because I did not post during those months (I was pretty busy with my day job).

With the release of PowerShell V2 CTP3, interest in the new features has driven a lot of traffic to both blogs. And with the newly released Windows 7 betas, I hope this interest will continue.


Saturday, January 10, 2009

Windows 7 Beta Has Arrived – But Not For Everyone

The Windows 7 and Windows Server 2008 R2 beta versions were released this week. I got the ISOs myself during the week, and today finished loading R2, Win7 Ultimate and Win7 Home Premium as VMware virtual machines. But it looks like Microsoft totally underestimated the demand. In a post over on The Windows Blog, Brandon LeBlanc explains that they are delaying the introduction of the public beta while they add extra infrastructure to cope with the demand. In another post, Windows 7 Beta Downloads Will be Available Soon - Microsoft's Servers Already Can't Handle Demand, Frederic Lardinois gives more detail.

I guess it was inevitable that Win7 would be popular once it was officially released. The beta had already leaked out, and at one point the torrents had thousands of folks downloading. Looking at The Pirate Bay today, the Win7 torrent (“directly from Microsoft”) has 1700 leechers. It should have been obvious that this was going to be popular. Of course, these problems just make great press stories, and stoke the enthusiasm of the industry. Perhaps Microsoft should have considered deploying BitTorrent to enable the download?

Irrespective of Microsoft’s infrastructure woes, Win7 and Server 2008 R2 look like real winners. Win7 looks really good, although thus far I’m only running it in a VM. And Server 2008 R2 also looks pretty good! For me the biggest feature of both, thus far, is the inclusion of PowerShell in the OS. As I’m running these betas in VMware, I am not yet getting the benefit of the updates to the Aero UI – my new laptop comes very soon and I’m really looking forward to running Win7 natively.

If you have the Windows 7 and/or Windows Server 2008 R2 betas, be sure to check out PowerShell V2. The OS betas incorporate what is more or less PowerShell CTP3 (which itself was released just before Christmas). This new version has some pretty exciting features – I’ve been blogging about them ever since I got the beta. The version of PowerShell contained in the OS betas is just slightly earlier than the released CTP3, but there should be no real difference in functionality (aside from the full CTP3 having a few fewer bugs). While remoting and eventing are pretty cool technology, the module functionality, including auto-help, is awesome.

So if you don’t have the Win7 or R2 betas yet – be patient. The infrastructure will soon be in place, and I suspect most users will feel the wait worthwhile. Hopefully, their reaction will be a little more positive than xkcd’s.





Friday, January 09, 2009

More on PowerShell Best Practice

I read a very interesting post over on James O'Neill's blog. It was interesting both because it talked about PowerShell, and because it talked about OCS and the neat PowerShell code James has written to support OCS 2007.

As part of the Server 2008 Resource Kit, there’s a large script that contains a number of function definitions. If you dot-source this script file, you can then use the functions much like cmdlets and administer OCS using PowerShell. I demo this script in my OCS Voice Ignite classes and the reaction is good!

When I first saw these cmdlets, my eyes lit up, since it meant I could use these functions. However, I quickly noticed that some important best practices had not entirely been adhered to. Nothing major, and certainly nothing that would break the functions – but they did not leverage PowerShell’s discoverability model.

In the latest post, James describes the re-write and the lessons he learned. The lessons are great ones that all PowerShell users should employ as they implement PowerShell into their environment. These are:

  • PowerShell nouns are written in the singular. So a cmdlet/function to get all users would be Get-User, not Get-Users (even though the latter is more likely to be the result of the get).
  • Be consistent with nouns. Avoid using “Usage” in one function and “PhoneUsage” in another.
  • Avoid creating new verbs. While adding verbs like List is tempting, using Get- is more discoverable.
  • Make use of the pipeline when writing cmdlets. This enables the user to pipe things into commands, passing either an object or a name with which to fetch the object.
  • Assume users want to use wildcards, and allow them wherever possible.
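Put into practice, a function following these rules might look something like this (a hypothetical sketch using Get-Process as the data source – this is not anything from James’ library):

```powershell
# Singular noun, standard Get- verb, pipeline input, wildcard support
function Get-Proc {
    param (
        [Parameter(Position=0, ValueFromPipeline=$true)]
        [string] $Name = '*'        # wildcards allowed; default returns everything
    )
    process {
        # Get-Process's -Name parameter accepts wildcards natively
        Get-Process -Name $Name
    }
}

Get-Proc          # all processes
'*' | Get-Proc    # the name can come in via the pipeline too
```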

These are great lessons we all should learn. Personally, I have some trouble with the third one when writing scripts for my PowerShell Scripts blog.

For those of you who will be buying the OCS 2007 R2 Resource Kit book (something I’ll surely be doing!), the function library is a real opportunity to do some repackaging using PowerShell V2 CTP3. James’ .PS1 script contains none of the V2 stuff (cmdlet binding and parameter attributes, auto-help content, using manifests to update type data, etc.).

Wednesday, January 07, 2009

PowerShellers: The "#Requires" statement

Another interesting PowerShell tidbit: Alexandair has documented a bit more information about the #Requires statement in PowerShell. As he describes in PowerShellers: The "#requires" statement, the #Requires statement has been in PowerShell since Version 1 (but was undocumented and not widely communicated!).

What’s neat is what else #Requires can do. The #Requires statement can also mandate a particular shell ID (i.e. the script only runs under that particular shell) or require that a particular snap-in is loaded (e.g. the Quest ActiveRoles tools).
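For instance, a script might open with directives along these lines (the exact set of supported parameters varies between versions, so treat this as a sketch; the snap-in named is the Quest one mentioned above):

```powershell
#Requires -Version 2.0
#Requires -ShellId Microsoft.PowerShell
#Requires -PSSnapin Quest.ActiveRoles.ADManagement

# If any requirement is unmet, PowerShell refuses to run the script
# at all, rather than letting it fail part-way through.
Get-QADUser -SizeLimit 5
```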

This is more goodness in terms of enterprise-readiness. I’d like to see what else could be added to the #Requires statement. Any thoughts?


Tuesday, January 06, 2009

PowerShell Audit Reports – Turning Great Scripts Into a Module

I just read a neat post entitled PowerShelling Audit Reports over on the TenBrink Tech blog. It sets out three scripts: GetRecursiveGroupMembership.ps1, Audit-QuickGroup.ps1 and Audit-MultipleGroups.ps1, where the second and third make use of the first. The scripts use Quest’s AD cmdlets to create CSV file(s) containing details of the members of an AD group.

Scripts to Modules

Dillon (the post’s author) has implemented all three of these as separate PS1 files. Which is so Version 1. :-)!!  As I read through his scripts, I could not escape the feeling that a much better approach would be to implement them as a single module with PowerShell V2. So I did! It took a bit more time than I’d hoped, but it helped me learn a bit more about modules in PowerShell V2.

Creating a Module

I first created a module file (audit.psm1) containing the three functions Dillon created. These were slightly modified to be functions rather than script files, but the resulting module is essentially identical to the original. I also created a module manifest, which describes the module and indicates the functions the module should expose once it is imported. I felt the first function was a helper function, so the module only exports two functions, not all three. Of course, I could be wrong, but it did make an interesting challenge – exporting just two of the three functions using a manifest.

Here’s the PSM1 Module itself:

  1. # 
  2. Write-Host "Importing Module Audit.psm1" 
  3. function Get-RecursiveGroupMembership { 
  4. <# 
  5. .SYNOPSIS 
  6.     Gets membership of a group.   
  7. .DESCRIPTION 
  8.     Uses recursion to handle nested groups 
  9. .NOTES 
  10.     File Name  : Audit.psm1 
  11.     Author     : Dillon (@tealshark on Twitter) 
  12.     Updated by : Thomas Lee 
  13.     Requires   : PowerShell V2 CTP3 
  14.     This is a helper function and not exported by the module. 
  15. .LINK 
  18. .PARAMETER DistinguishedName 
  19.     This parameter is the DN of the group you want to expand 
  20. .PARAMETER AddOtherTypes 
  21.     This parameter adds other types to the search 
  22. #> 
  24. param ( 
  25. [Parameter(Position=0, Mandatory=$TRUE, ValueFromPipeline=$TRUE)]   
  26. [string] $distinguishedname, 
  27. [Parameter(Position=1, Mandatory=$FALSE, ValueFromPipeline=$FALSE)]     
  28. [bool] $addOtherTypes = $false 
  29. ) 
  30. # Start of function 
  31.  $members = @() 
  33.  $this = (Get-QADGroup $distinguishedname).member | Get-QADObject 
  34.  $this | foreach { 
  35.       if ($_.type -eq 'user') { 
  36.           $members += $_ 
  37.       } 
  38.       elseif ($_.type -eq 'group') { 
  39.           Write-Host "Adding sub group $_" 
  40.           $members += Get-RecursiveGroupMembership $_.dn $addOtherTypes 
  41.       } 
  42.       else { 
  43.           if ($addOtherTypes -eq $true) { 
  44.               $members += $_ 
  45.            } 
  46.            else { 
  47.               Write-Host "Non user/group member detected. Not added. Use -addOtherTypes flag to add." 
  48.            } 
  49.        } 
  50.     } 
  51.     return $members 
  52. } 
  53. function Audit-QuickGroup { 
  54. <# 
  55. .SYNOPSIS 
  56.     This function takes the distinguishedName of a group in any domain and writes 
  57.     the results of that group membership to a csv file of the same name. 
  58. .DESCRIPTION 
  59.     This script uses the Get-RecursiveGroupMembership function to get the group membership 
  60. .NOTES 
  61.     File Name  : audit.psm1 
  62.     Author     : Dillon (@tealshark on Twitter) 
  63.     Updated by : Thomas Lee 
  64.     Requires   : PowerShell V2 CTP3 
  65.     This function is exported 
  66. .LINK 
  69. .EXAMPLE 
  70. .PARAMETER Name 
  71.     Distinguished name of a group whose membership the script will ascertain. 
  72. #> 
  74. param ( 
  75. [Parameter(Position=0, Mandatory=$TRUE, ValueFromPipeline=$TRUE)]   
  76. [string] $Name 
  77. ) 
  78. # Start of Function 
  79. $Csvdata = Get-RecursiveGroupMembership $name | select name,type,dn,title,office,description | convertto-csv -NoTypeInformation 
  80. $Filename = $Name + ".csv" 
  81. [String]$Reportdate = "Report Generated: " + [datetime]::Now 
  82. $f = new-item -itemtype file $filename 
  83. add-content $f "Audit Report - Active Directory Group - $name" 
  84. add-content $f $reportdate 
  85. add-content $f $csvdata 
  87. } 
  88. function Audit-MultipleGroups { 
  89. <# 
  90. .SYNOPSIS 
  91.     Gets membership of multiple groups 
  92. .DESCRIPTION 
  93.     This function uses the filtering abilities of the Quest Get-QADGroup cmdlet to  
  94.     get the membership of multiple groups. These are then written to multiple files. 
  95. .NOTES 
  96.     File Name  : audit.psm1 
  97.     Author     : Dillon (@tealshark on Twitter) 
  98.     Updated by : Thomas Lee 
  99.     Requires   : PowerShell V2 CTP3 
  100.     This function is exported 
  101. .LINK 
  103. .EXAMPLE 
  104.     Left as an exercise for the reader 
  105. .PARAMETER GroupInput 
  106.     The groups you want to audit. 
  107. #> 
  109. param ( 
  110. [Parameter(Position=0, Mandatory=$TRUE, ValueFromPipeline=$TRUE)]   
  111. [string] $GroupInput 
  112. ) 
  113. # Start of function 
  114. # First get groups 
  115. $GroupList = get-qadgroup $groupinput 
  117. # Iterate through groups, creating output 
  118.  foreach ($Group in $GroupList) { 
  119.      Write-Host $group.dn 
  120.     $GroupMembers = Get-RecursiveGroupMembership $group.DN | select name,type,dn,title,office,description | convertto-csv -NoTypeInformation 
  121.     #now create file 
  122.     $filename = $Group.Name + ".csv" 
  123.     [String]$reportdate = "Report Generated: " + [datetime]::Now 
  124.     $file = New-Item -ItemType file $filename -Force 
  125.     Add-Content $file "Audit Report - Active Directory Group Membership" 
  126.     Add-Content $file $reportDate 
  127.     Add-Content $file $groupMembers 
  128.   } 
  129. } 
  130. # End of Module 

Creating the Manifest

I used the New-ModuleManifest cmdlet to produce the basic module manifest, the .psd1 file. I then did a bit of editing with PowerShell Plus to achieve this final manifest:

  1. # Module manifest for module 'audit' 
  2. # Generated by: Thomas Lee 
  4. @{ 
  5. # These modules will be processed when the module manifest is loaded. 
  6. NestedModules = 'Audit.psm1' 
  7. # This GUID is used to uniquely identify this module. 
  8. GUID = '5eed72f9-5f1d-4819-973c-63f80ccee415' 
  9. # The author of this module. 
  10. Author = 'Thomas Lee, with functions by Dillon.' 
  11. # The company or vendor for this module. 
  12. CompanyName = 'PS Partnership' 
  13. # The copyright statement for this module. 
  14. Copyright = '(c) PS Partnership 2009' 
  15. # The version of this module. 
  16. ModuleVersion = '1.0' 
  17. # A description of this module. 
  18. Description = 'This module is a packaging of audit scripts by Dillon into a single module.' 
  19. # The minimum version of PowerShell needed to use this module. 
  20. PowerShellVersion = '2.0' 
  21. # The CLR version required to use this module. 
  22. CLRVersion = '2.0' 
  23. # Functions to export from this manifest. 
  24. ExportedFunctions = ('Audit-QuickGroup', 'Audit-MultipleGroups') 
  25. # Aliases to export from this manifest. 
  26. ExportedAliases = '*' 
  27. # Variables to export from this manifest. 
  28. ExportedVariables = '*' 
  29. # Cmdlets to export from this manifest. 
  30. ExportedCmdlets = '*' 
  31. # This is a list of other modules that must be loaded before this module. 
  32. RequiredModules = @() 
  33. # The script files (.ps1) that are loaded before this module. 
  34. ScriptsToProcess = @() 
  35. # The type files (.ps1xml) loaded by this module. 
  36. TypesToProcess = @() 
  37. # The format files (.ps1xml) loaded by this module. 
  38. FormatsToProcess = @() 
  39. # A list of assemblies that must be loaded before this module can work. 
  40. RequiredAssemblies = @() 
  41. # Lists additional items like icons, etc. that the module will use. 
  42. OtherItems = @() 
  43. # Module specific private data can be passed via this member. 
  44. PrivateData = '' 
  45. } 

The Results

It turns out that converting a set of script files, like those Dillon created initially, into a module (as above) is easy. The syntax of the manifest file turns out to be a bit tricky, though, and the error messages and help text in CTP3 are woefully inadequate.

Here’s what this module looks like at runtime:

PS MyMod:\> # Note the module is not yet loaded so you get no output from Get-module

PS MyMod:\> Get-Module audit
PS MyMod:\> # So import the module, then look at module details

PS MyMod:\> Import-Module audit
Importing Module Audit.psm1
PS MyMod:\> Get-Module audit

Name              : audit
Path              : C:\Users\tfl\Documents\WindowsPowerShell\Modules\audit\audit.psd1
Description       : This module is a packaging of audit scripts by Dillon into a single module.
Guid              : 5eed72f9-5f1d-4819-973c-63f80ccee415
Version           : 1.0
ModuleBase        : C:\Users\tfl\Documents\WindowsPowerShell\Modules\audit
ModuleType        : Manifest
PrivateData       :
AccessMode        : ReadWrite
ExportedAliases   : {}
ExportedCmdlets   : {}
ExportedFunctions : {[Audit-QuickGroup, Audit-QuickGroup], [Audit-MultipleGroups, Audit-MultipleGroups]}
ExportedVariables : {}
NestedModules     : {Audit.psm1}

PS MyMod:\> # Here – use the AutoHelp feature to get help on an exported function
PS MyMod:\> Get-Help Audit-QuickGroup


    This function takes the distinguishedName of a group in any domain and writes
    the results of that group membership to a csv file of the same name.

    Audit-QuickGroup [-Name] [<String>] [-Verbose] [-Debug] [-ErrorAction [<ActionPreference>]] [-WarningAction [<ActionPreference>]] [-ErrorVariable [<String>]] [-WarningVariable [<String>]] [-OutVariable [<String>]] [-OutBuffer [<Int32>]] [<CommonParameters>]

    This script uses get-recursivegroupmembership function to get the group membership


    To see the examples, type: "get-help Audit-QuickGroup -examples".
    For more information, type: "get-help Audit-QuickGroup -detailed".
    For technical information, type: "get-help Audit-QuickGroup -full".

What I learned

This was an interesting exercise. It was pretty easy, but I did stumble a bit with the module (how I wish there had been better documentation on modules in CTP3!). Here are some of my take-aways relating to PowerShell modules in CTP3:

  1. Turning a set of inter-related scripts into a module is both easy, and a good thing!
  2. If you want a module to also have a manifest, the two can share the same file name with different extensions (the module itself in a .psm1 file and the manifest in a .psd1 file).
  3. To export only a subset of the functions in the .psm1 file, you use the ExportedFunctions feature in the manifest.
  4. To export multiple functions, you enclose the set of function names as a string array inside parentheses. See this in line 24 above. This format was not all that obvious at first.
  5. You can add statements into the module that are executed when you import the module. See Line 2 in the module – this prints out a short message when you import the module. This has some great potential – thanks to Jeffrey Snover for the tip!
  6. If you import a module (using Import-Module as above) you can generally remove it using Remove-Module. This aids in testing!
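As an aside, a manifest is not the only way to restrict what a module exposes – a .psm1 file can also do it itself with Export-ModuleMember. A sketch, demonstrated with a throwaway module written to a temp file (the function names here are made up for illustration):

```powershell
# Alternative to a manifest: the .psm1 names its public functions with
# Export-ModuleMember; anything unlisted stays private to the module.
$psm1 = Join-Path ([IO.Path]::GetTempPath()) 'demo.psm1'
@'
function Get-Helper { 'private' }   # not exported - module-internal
function Get-Public { 'public'  }
Export-ModuleMember -Function Get-Public
'@ | Set-Content $psm1

Import-Module $psm1
Get-Public                                            # works
Get-Command Get-Helper -ErrorAction SilentlyContinue  # nothing - helper is private
Remove-Module demo
```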

There’s certainly room for improvement in this module. One thing would be to check whether the Quest QAD tools are installed and issue a warning if not (even better, if the tools are not found, the script could go get them and install them for you auto-magically!). There should also be some trap or try/catch statements in the functions to handle errors better. Some auditing of the functions’ usage could also be implemented.
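That first improvement might look something like this at the top of the .psm1 (a sketch for Windows PowerShell – I have deliberately left out the auto-download part, since I’d want to verify the download location first):

```powershell
# Warn if the Quest ActiveRoles snap-in is not available; load it if it
# is installed but not yet added to the session.
$questSnapin = 'Quest.ActiveRoles.ADManagement'
if (-not (Get-PSSnapin -Name $questSnapin -ErrorAction SilentlyContinue)) {
    if (Get-PSSnapin -Name $questSnapin -Registered -ErrorAction SilentlyContinue) {
        Add-PSSnapin $questSnapin   # installed but not loaded - load it now
    }
    else {
        Write-Warning "The Quest AD cmdlets ($questSnapin) are not installed; the Audit-* functions will not work."
    }
}
```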

Modules are pretty cool – I hope this helps you understand them a bit better!