Tuesday, December 29, 2009

Is PowerShell only for the Nerds?

An interesting question, to say the least. This blog carries a lot of PowerShell content – and much of that content is aimed at IT Pro types. I think it’s clear that IT Pro nerds working in the Microsoft ecosystem need to know PowerShell. PowerShell is central to Microsoft’s current and future management roadmap – so, yes, PowerShell is for nerds.

That raises the question, though: should home users have to learn it? Well – yes and no. There are certainly a number of PC enthusiasts out there who want to learn more about how to manage their home computers – and PowerShell can help, much as Cmd.exe once did (and may still do). PowerShell does involve a learning curve, and much of the documentation – and a fair share of blog posts – tends to be on the dense side until you become more familiar with the language. So persevere.

But should your grandmother have to learn PowerShell? No. No more than she’d need, or even want, to learn C, the .NET Framework, DNS or TCP/IP! Those technologies, and PowerShell too, are built into the “system” so she doesn’t have to use them directly. Of course, developers may well begin to use PowerShell as part of building their GUIs – and I hope they do.

So if you’re a nerd, take a moment to download PowerShell V2 (assuming you haven’t already done so) and get cracking!

 


Saturday, December 26, 2009

Exchange 2007 Updated Help File

For some products, suggesting someone read a help file is like suggesting they take a long walk on a short pier (albeit less wet). But for Exchange 2007, this seems good advice. Microsoft has published an updated version of the help file for Exchange, available from the Microsoft Downloads site.

I hope they continue this approach with the advent of Exchange 2010.

 


Unified Communications Developer Portal

Microsoft's push into the world of Unified Communications continues with the launch of the Unified Communications Developer Portal.  This MSDN portal is part of the Office Development Center, and is designed to feature links and resources for developers wanting to get into UC.

As I write this updated post, there are some interesting feature articles, including Detecting the State of the Office Communications Server and Building UCMA Installers.

This is a great site for developers (and admins that secretly want to be developers!) to learn more about developing in the UC space.

Friday, December 25, 2009

Merry Christmas – One and All

It’s time to rest and hang out with my family. For those of you on line today: Merry Christmas – now get back to the folks that matter. For the rest of you, I hope you had a great day.

Thursday, December 24, 2009

PowerShell V2 Download

If you are running Windows 7 or Windows Server 2008 R2, then you have PowerShell installed by default. On Windows 7, both the PowerShell console and the PowerShell ISE are installed by default. On Server 2008 R2, the PowerShell console is installed by default, whilst the PowerShell ISE is a feature you can add using Server Manager. And of course, on any Server 2008 R2 Server Core installation, you would need to add both the .NET Framework and PowerShell – virtually nothing extra gets added to Server Core!

For downlevel (aka older) versions of Windows, Microsoft has released a KB article which directs you to versions you can download. Go to the page http://support.microsoft.com/kb/968929 where you can find PowerShell V2 for Windows XP, Vista, Server 2003 and Server 2008 (RTM) – in both 32-bit and 64-bit versions. Note that in Server 2008 RTM, PowerShell is not supported in Server Core.
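Not sure which version you already have before going to the download page? A minimal check from the console – this sketch relies on the fact that the $PSVersionTable automatic variable only exists from V2 onwards, so its absence indicates V1:

```powershell
# $PSVersionTable was introduced with PowerShell V2; on V1 it is simply $null.
if ($PSVersionTable) {
    "PowerShell $($PSVersionTable.PSVersion) is installed"
}
else {
    "PowerShell V1 - the KB article has your V2 download"
}
```

On a V2 machine this reports the full version number, including the build details.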

In addition to PowerShell V2, the downloads also include the RTM version of Windows Remote Management (WinRM). WinRM is Microsoft’s implementation of the standards-based WS-Management protocol. WinRM is based on the Simple Object Access Protocol (SOAP) and provides a firewall-friendly protocol enabling hardware and operating systems from different vendors to interoperate.

If you are using an older OS, i.e. Windows 2000 or earlier, then there is no supported version of PowerShell for you. You can probably hack some, or even most, of PowerShell V2 onto those older OSs. But things may not work – and it’s never going to be supported.

Want to REALLY Understand What PowerShell Remoting Does On The Wire?

Well, maybe not everything, but the document entitled PowerShell Remoting Protocol Specification contains 179 pages of detail on what PowerShell Remoting does! It’s not for the faint of heart, to say the least, as it is very detailed – just the sort of hard-core techie material that the true PowerShell addict will lap up!

This document is one of the many Communications Protocols that Microsoft has been publishing. You can download the specific PDF for the PowerShell Remoting Specification, or download a zip file with all the PDF files for all the published specifications. The latter might be overkill even for the most hardened network geek.

Wednesday, December 23, 2009

Microsoft Releases the PowerShell v2.0 SDK

The Software Development Kit, or SDK, is a download of the stuff you need to develop applications around a particular technology area. The Windows SDK is huge, and there are SDKs for almost every technology Microsoft produces. Other vendors produce SDKs too.

Initially, Microsoft shipped a standalone PowerShell SDK. Then, some time ago, this changed and the PowerShell SDK was subsumed into the Windows SDK – which meant finding just the PowerShell material was challenging. I saw any number of complaints about this in the newsgroups and elsewhere.

Well – the PowerShell team have just released the Version 2.0 SDK as an independent (and, at 2.4 MB, small!) download. The SDK contains the reference assemblies and 45 samples that help you better understand how to use PowerShell. There appears to be no documentation (and the installation process doesn’t seem to add any shortcuts or Start menu items).

More holiday reading!

OCS R2 Training Materials

Earlier this year, Microsoft released an OCS R2 Learning Portal – you can find it at http://www.microsoft.com/learning/ocs2007/r2/default.mspx. It contains some training resources for OCS R2. However, the page is both out of date and short on links. I have some more details to add:

  • As noted, the OCS R2 Resource Kit is available – a must-have book! The book is excellent, although it appears rushed and needed better technical editing. Hopefully that will happen in the next release!
  • The Portal discusses the OCS 2007 R2 exam. The exam number is the same as for the RTM exam, although the contents have been updated to reflect the new features in R2. If you have already passed the earlier RTM exam, there’s no need to re-sit it. There is also a voice exam, but it’s not listed in the OCS Learning Portal (yet).
  • The portal describes the training available for OCS (on a linked page). This page lists courses 5177/8/9 (and clinic 6447A), which were RTM courseware (and fairly poor) and should probably be avoided. An updated version of the official courseware, released as CWL course 50214, was of such poor quality that it has been withdrawn until remediation can be completed and properly tested. When the updated course is available, I’ll post here – I am anxious to see it, and the updated labs look good.
  • The portal also lists the OCS Ignite course (50024A), but this material is both RTM-only and not run very often (although I can certainly offer it if clients really want it). A much better course is the OCS 2007 R2 Ignite content – course 50232A. This focuses on the R2 release, although it can also be used to teach folks who are new to the product. I teach this a lot and love it – but beware trying to do it in 3 days. With the right instructor, this course can easily fill 4 days, or 5 if the delegates are new to OCS. The labs are great too.
  • Finally, the page does not mention the updated Voice Ignite workshop. Voice Ignite for OCS 2007 R2 has now been created and is available.
  • Note that all the courseware discussed here is Courseware Library (CWL) content – content authored by a third party, with Microsoft just acting as a reseller. The quality of CWL material has been variable, but MSL and the UC team are ensuring that all the courseware is good and fit for purpose. So you can book this training with confidence that the material is good.

OCS 2007 R2 is now in the field and customers should start to evaluate it for their organisations. If you have not yet deployed OCS at all, R2 is a natural next step. A new version of OCS, OCS “wave 14”, will be going into beta some time in the new year and is scheduled for release later in 2010 – dates are not yet firm for either the beta or RTM. Once I get more information, I’ll post it here.

OCS is a rich and complex application – if you are planning to deploy it, you could usefully do with some training. But before booking a course, make sure you get a good trainer – someone who has been working with the product for a while, can explain the product and its value, and can dive deep into deployment, configuration and support. Investing in some training would be a good thing!


Tuesday, December 22, 2009

Office Communications Server 2007 Virtualisation Support

Earlier this year, Microsoft confirmed that it would now support OCS 2007 in a virtualised environment. This has been a big ask from clients pretty much ever since OCS was launched. Just about every time I’ve taught OCS in the past couple of years, as well as in the OCS newsgroups, the question of why it’s not supported has come up over and over again – along with anecdotal evidence that it worked, at least in a VMware ESX environment. And in the classroom, we see OCS (well, at least the server components) working just fine.
The support Microsoft is providing covers OCS server roles running in VMs hosted either on a single server (a typical classroom scenario) or on a number of servers (a scenario more likely in a corporate deployment). Support is limited to a subset of OCS server roles, i.e. Front-End Servers, the Back-End SQL Server 2008 (64-bit), Group Chat Channel Servers, Group Chat Channel and Compliance Servers, and Access Edge Servers. This means no support for the other server roles – Microsoft says these roles are not supported due to “possible quality issues” with real-time media.
It took Microsoft a while after R2 was released to support this – I understand that part of the delay in announcing full support was testing. Microsoft always wants (needs!) to test anything it offers to support, and OCS is no different. In announcing VM support, Microsoft tested a fully distributed topology with 40,000 users and 10,000 group chat users. This means, says Microsoft, that: “audio/video/web conferencing servers, audio/video/web edge conferencing servers, dial-in conferencing, Communicator Web Access, enterprise voice, or Remote Call Control may not be deployed as part of the virtualized pool.” The impact is that you cannot (at least in a supported fashion) virtualise a Standard Edition pool (or, for that matter, an EE consolidated pool), a consolidated edge server or a CWA server.
Microsoft also published an interesting whitepaper detailing the tested architecture. The white paper also looks at OCS performance and how you can use the Capacity Planning Tool.
This is a great move forward, but it comes with some strings. Namely, the roles that are supported only work in an administratively complex environment. Or, to put it another way, all the easy installations of OCS (SE, consolidated EE pool, consolidated edge) cannot be virtualised in a supported way. And since, with R2, Microsoft has de-emphasised the distributed architecture, to deploy OCS in a virtualised environment you would need to use the command-line tools, which makes the deployment potentially more work. I am hoping we’ll see a better story with the next wave of the product.

Monday, December 21, 2009

Office Communications Server 2007 R2 Ignite

I’ve been running a lot of OCS training over the years – and Office Communications Server 2007 R2 Ignite is one cool course! I’ve taught this several times around EMEA and find it to be well received. I will be teaching it again this coming week. The technical level as advertised is also pretty accurate – although one can go deeper as needed!

The content is OCS 2007 R2 focused, with fairly limited marketing fluff! The labs are rich and complex – and the troubleshooting opportunities abound. In my view, this is a 4-day class – especially given the richness of the labs. AND – if you are new to the product, it can be a great 5-day class with the right instructor.

 


Sunday, December 20, 2009

Some Thoughts on the Microsoft Certified Learning Consultant Programme

I was in New Orleans over the summer of 2009 for the Microsoft Worldwide Partner Conference (WPC). At the conference, I got to ask Microsoft Learning’s leaders what was up with the Certified Learning Consultant programme. This programme was launched with much fanfare a few years ago, and I was asked to sit on the MCLC Review Board, which reviewed MCLC applications and passed or failed them. For the first few years, there was quite a lot of work. Of course, this always seemed to coincide with other work-related complications – but such is life.

I enjoyed the opportunity to look at the applications, but frankly was quite disappointed in many. All too often, the application form was filled in with information just plain missing. One simple example I cited was the request for 100 words about a project – and to me that means around 100 words. I’d be happy if the word count was anywhere between 85 and 115 – and a bit longer if there was anything particularly complicated. But entries of 43 or 252 words do not make it to 100. There were, though, some good applications from folks who clearly demonstrated their abilities. It was a pleasure to approve those.

I like the idea behind the MCLC certificate and the current application process. An MCLC basically reviews the current competency level of a group of people, typically some sort of project team, to determine the gaps in their skills. The MCLC then designs and rolls out a training plan designed to address the key skill gaps and to achieve some sort of return on investment. During the roll-out, the MCLC is expected to look at how the training is going and adjust accordingly. Finally, the MCLC needs to analyse the results and confirm whether the ROI has been achieved (or not). A hurdle in some cases was that the candidate was required to have the client acknowledge the ROI achievement in writing.

One example was where the MCLC was working with a technical support team on what would be required to support the next versions of Windows, Office and Exchange in their organisation. The MCLC looked at all the available Microsoft learning products, as well as non-Microsoft products, to determine what would be best for that team. This included some in-class work, remote labs, e-learning and some other reading. During the execution of the programme, issues – such as redundant modules in the training, or difficulties with attendance at events – would be looked at and the programme adjusted to meet reality!

The issue of ROI on the training was one that confused a lot of us – both candidates and review board members. Initially, I understood ROI to be in terms of pounds/euros/dollars. I was never very comfortable with this, and was constantly reminded of the anecdote about the accountant who, when asked what two plus two should equal, replied: whatever you need it to be. But then I saw a number of really good projects that cited non-monetary ROI.

The key point was, for me at least, that if you can’t measure it, it’s not important. Leading on from that, basing ROI on less tangible things was OK – as long as it was measured, and either the measurement “improved” or there was a good explanation why not.

In the case of the company I noted above, the project was performed to train the company’s small IT group. That company was in the process of a major corporate restructuring deal and really wanted to keep the IT group intact. The company had unique home-grown applications that needed the trained staff currently in post. So the ROI measurement was based simply on staff satisfaction and how likely staff were to leave. Each employee was sent a questionnaire and had an interview with their employer and the MCLC before and after the training, and the satisfaction levels were measured. A very interesting project. I suspect one could have put some dollar amounts into an equation and “measured” it in cash terms, but I liked the simplicity of it – and it was something that could be (and was) measured.

But I thought this programme was dead – hence my question at WPC. It turns out the programme still lives. I understand some consideration was given to closing the programme down, but thankfully the axe has been spared. The question now is how to breathe some life back into it.

Some years ago, I gave a talk at an MSL event on how to prepare a successful MCLC application. If there’s any real interest, I’ll ask MSL to organise a Live Meeting and repeat the talk. If you are interested, mail your MSL contact(s), or post a comment here. I’d love to hear of any enthusiasm from the community around this programme.

Saturday, December 19, 2009

PowerShell’s Popularity and Search Engines

I was searching for information on how to add custom menus to the PowerShell ISE. I’d been meaning to play with this for a while and had some time last night. So I went searching and came across what looked at first sight to be the perfect answer: a blog post by Jeffrey Snover entitled My PowerShell_ISE Profile. It did just what I wanted – so I added it to my ISE profile and restarted the ISE. WHOOPS – there were some really weird errors.

To make a long story short, PowerShell changed post-CTP3 – in this case, the $psISE.CustomMenu.Submenus.Add calls at the bottom of Jeffrey’s post no longer work. After tweeting my confusion, I got pointed to http://powershellers.blogspot.com/2009/05/what-happened-to-custommenu-property.html, which resolved the issue for me. Sadly, that page had not come up at all in the searching I’d done. While it’s not the point of this post, here are the erroneous and corrected lines of Jeffrey’s script:

At the end of his script, the lines:

$null = $psISE.CustomMenu.Submenus.Add("Edit Selected", {Edit-Selected}, 'Ctrl+E')
$null = $psISE.CustomMenu.Submenus.Add("Export Session Files", {Export-SessionFiles}, 'Ctrl+SHIFT+E')
$null = $psISE.CustomMenu.Submenus.Add("Import Session Files", {Import-SessionFiles}, 'Ctrl+SHIFT+I')

Should read:

$null = $psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.Add("Edit Selected", {Edit-Selected}, 'Ctrl+E')
$null = $psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.Add("Export Session Files", {Export-SessionFiles}, 'Ctrl+Alt+E')
$null = $psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.Add("Import Session Files", {Import-SessionFiles}, 'Ctrl+Alt+I')

But that’s not the real problem – and I sure do not want to criticise the PowerShell Team! The problem is wider than this one post: there’s been so much blog and web traffic around PowerShell that the search engines are promoting old content. For example, try searching for “PowerShell V2 download” – on Google, the first three hits return links to pre-RTM downloads. Bing is no better at present!

I think the solution is twofold. To some degree, the issue will tend to go away – as new content is created, indexed, referenced and used, the search engines will ‘learn’ the new content and ‘forget’ the older stuff. So let’s get going and start pointing to the updated key content! At the same time, it’ll be useful for bloggers to update their older content – either dropping it totally, or updating the content somehow. I’ll do what I can on that front!

In some respects, it’s a nice problem and one almost worth having! The search engines are just a reflection of the creators and consumers of content – if PowerShell wasn’t so popular, the links would be even worse!


Friday, December 18, 2009

PowerShell For Visio

I was looking tonight for add-ins for PowerShell, and I came across a pretty neat tool: PowerShell for Visio. This is a free tool that provides a visual designer for PowerShell – a small download for Visio 2007 (so far as I know, it’s not yet supported on Visio 2010). The download comes with source code, which is a nice touch.

After installation, you’ll find some new PowerShell Templates included:

[screenshot: the new PowerShell templates in Visio]

 

You select a template, then create a workflow using it. Here’s a really simple example:

 

[screenshot: a simple workflow built from the template]

Then look at the tab and you’ll see a script:

[screenshot: the generated PowerShell script]

It’s pretty limited, but a great start. What would be really cool is if this tool was taken to the max and included (with FULL support for all of PowerShell) in Visual Studio.


Saturday, December 05, 2009

Yet another Microsoft MCT Courseware Download Site Outage

As a Microsoft Certified Trainer, I am able to download courseware – both for study purposes and for the courses I deliver. Sadly, the site has been somewhat less than perfect of late. The latest incident is today: I simply can’t get to it at all:

[screenshot: the courseware download site failing to load]

Now, another great benefit is the Regional Support Centre. There’s a nice web form for reporting errors. But when I do, I get a canned reply, typically 2 days later, saying “we need a screen shot” – and without screen shots they just close the case. If a screen shot is needed, then why don’t they a) say so, and b) provide a mechanism for attaching one when I report the issue? I suppose this is more proof, as if it were needed, that MS’s internal/outsourced support is sub-optimal.

This is not the first time this site has failed – it’s doing its best to emulate a yo-yo. As a paying customer (MCTs have to pay for the privilege), I find these continuing issues unacceptable. Having complained privately, things just seem to get worse, so it looks like the only way to get this fixed is to go public. I will include a link to this blog post in my report to the support centre. I await their response.

If you have problems with this site, please post a comment here!

[Later] The problem resolved itself. After a couple of reboots, I tried again and it worked fine. I got a call 2 days later from the RSC saying they could not reproduce the issue.

Google DNS

Last week, Google announced a new service, Google DNS. The idea is simple – instead of using your own, or your ISP’s, DNS servers, you use theirs. Most readers of this blog will know what DNS is, why it’s really important, and how to configure and manage it – at least I sure hope so! But that leaves a great percentage of the Internet-using population who both don’t know and probably don’t care, as it all just works.

The idea of an independent DNS network is not new. Several years ago, I wrote about ORSN, the European Open Root Server Network. I sometimes set systems up to use it but, like most folks, the DNS most of us use is part of the huge distributed network that comes as part of the Internet. Personally, I have my own internal DNS server, which serves as a resolver for the systems in my network as well as hosting the records for my AD domain.

The Google DNS service is just a resolving DNS cache – they host no zones (other than Google’s own) and they do not host any root domains. They just resolve the names you send them, and cache the results for others to use. Google says its goal is “to benefit users worldwide while also helping the tens of thousands of DNS resolvers improve their services, ultimately making the web faster for everyone.” A laudable goal, but is it actually a useful service? On Google’s code blog, they cite three main advantages: speed, security and validity. I’ve spent some time this morning looking at the speed claim.

Google DNS is hosted on several servers. The configuration instructions list two addresses (8.8.8.8 and 8.8.4.4), although another document shows four servers (ns[1-4].google.com). I have been doing some testing with PathPing.exe this morning to look at performance. Access to these servers from the UK (where this article is being written) is quick and shows no packet loss. The path to the servers goes first via my ISP to Google-Lon.Google.Com, and then via a few un-named routers to the DNS server. The RTT to 8.8.8.8 was 38 ms.

I then ran PathPing against my own ISP’s DNS server and another DNS server run by Demon Internet. For both UK servers, I see a shorter route (10 hops vs 12) – not surprising, as these servers are in the UK and Google’s, I suspect, are not. I also see a faster path (31 ms to Demon and 28 ms to Clara.Net, vs 38 ms to Google). Packet loss to both UK servers is also zero, although the route to Demon’s name server goes via tele-service-22-s267.router.demon.net, which drops 100% of the ICMP packets (no such issue with Clara.Net). So on path grounds, using Google does not seem to provide much benefit – it’s slower and has more hops.
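If you want to repeat the comparison yourself, the final step is trivial to script – a minimal sketch using the round-trip times measured above (the figures are from my PathPing runs; substitute your own measurements):

```powershell
# RTTs (in ms) to each resolver, taken from the PathPing tests above.
$rtt = @{
    'Google (8.8.8.8)' = 38
    'Demon'            = 31
    'Clara.Net'        = 28
}

# Sort by latency and report the winner.
$fastest = $rtt.GetEnumerator() | Sort-Object Value | Select-Object -First 1
"Fastest resolver: $($fastest.Key) at $($fastest.Value) ms"
```

For my measurements, that reports Clara.Net at 28 ms – i.e. my local resolvers beat Google from here.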

As for security, Google makes the valid point that DNS is vulnerable to attack – it can be spoofed, routing users to malicious sites rather than the intended site. The DNSSEC protocol is meant to address that problem through cryptography, but DNSSEC is not yet in widespread use (and is a lot more work to set up). I’m not sure I entirely buy the argument that Google is any better at protecting DNS than my ISP. We’ll have to see.

The validity advantage that Google cites seems even less interesting. They say their DNS servers comply with the DNS standards. Well, I’d sure hope so – a non-standards-compliant DNS server would be a bit of a waste of time. They also avoid blocking, filtering and redirection. Doing some testing with NSLookup, it looks like they don’t do wildcard DNS – trying to resolve (hopefully!) non-existent domains gives the expected error. I tried resolving x.microsoft.com and foo.foo.foo – both queries were “refused”, since ns1.google.net could not find those domains.

All in all, this looks to be a reasonably fast and competent DNS solution. For many users, using this might benefit their web surfing. For me, it looks a tad slower – so ‘buyer’ beware. As for the security argument, I accept they intend to make their service secure, but experience will show whether this is a good argument, or not.


Saturday, November 28, 2009

PowerShell Plus 3.1 Beta

With PowerShell V2 now released, those nice folks over at Idera have released a Beta of their upcoming 3.1 version of this popular PowerShell tool. You can see more information, and snag a copy of the beta, over at the PowerShell.Com site!

Personally, I like PowerShell Plus and use it a lot, so I was keen to see the beta. You can take a look at some of the new features in a post by the product manager, Richard Gilles, also over on the PowerShell.Com site. Naturally, once you install the beta, there’s a readme file with more information.

One neat thing – there’s now a 64-bit version, which installs neatly side-by-side with the released version. There are also some nice script-sharing features, which further encourage the PowerShell community.

I’ve got the beta downloaded and am using it – I look forward to the release!

Wednesday, November 18, 2009

PowerShell – WMI presentation tonight

The things I agree to do…

A good MVP buddy of mine, Joel 'Jaykul' Bennett, has asked me to speak tonight to the Upstate New York PowerShell user group. The title of the talk is WMI and PowerShell. I’m aiming at the basics but will go into a bit of detail. Plus there are demos. If you want to join the Live Meeting, here’s the URL: https://www.livemeeting.com/cc/mvp/join?id=UPNYPUG&role=attend

However, I’m not planning on starting till 11:30 PM (23:30) UK time – which is thankfully earlier for the audience in New York.

Hope to see you there…

[later]

Yes – I will post my slides and demos tomorrow…


Thursday, November 12, 2009

PowerGui.Org’s PowerPack Challenge – Closing Soon

PowerGUI.org is holding the PowerPack Challenge contest. Basically, the idea is that you create a PowerGUI add-on (an admin console, based on PowerShell cmdlets/scripts, for a particular platform you manage), and then submit it to the site. By entering the contest, you can win one of the prizes (the top prizes are $1000 in Amazon certificates). This is easy, and the site has tutorials on how to do it.

The contest will run for 3 more days, i.e. until the end of Nov 15 2009. You can get full details at PowerGUI.Org. Get coding and best of luck!


Friday, November 06, 2009

Leaving Global Knowledge for Pastures New

Many of you will already know this, but I am shortly leaving Global Knowledge. After 3+ years of working for GK’s EMEA group, I have been made redundant and will be out of the company by the end of November or thereabouts. The redundancy process is swift, and pretty brutal – but that’s the nature of the beast. Global Knowledge are treating me pretty well under the circumstances, for which I am genuinely grateful.

In the short term, I’m going back to contracting – doing training, consultancy, or whatever turns up. I would hope to get enough work to tide me over until I can figure out the longer term plan. I am favouring returning to full time employment but we’ll see what happens over the next few months. I’ve already had some brilliant leads, which are great, so things are not totally bleak. I also have a lot of really good friends in the industry who are lending a helping hand. I could not ask for more.

It’s a sad day, on one hand, but a great opportunity on the other. From the people I’ve spoken to, the broad consensus is that I’ll make out just fine. I just hope those voices are right.

Monday, November 02, 2009

Disk to VHD – Another Cool Tool from the Sysinternals Guys

Mark Russinovich and Bryce Cogswell have released yet another cool tool – Disk2VHD (version 1.21), which you can download from here. As it turns out, I’m looking to convert one of my physical boxes to a VM – I use the system rarely but don’t want to chuck it. This tool would be just the business – I could easily run a VM of this system on my laptop when I actually need it.

The VHD this tool creates can be used as a VM with either Microsoft Virtual PC or Hyper-V. One important limitation – with Virtual PC, the largest supported volume is 127 GB. And the tool is command-line driven – but you could, of course, run it from PowerShell!


OCS 2007 R2 Documentation

If you are working with Microsoft’s Office Communications Server 2007 product, you may know about the great documentation produced by the product team. This documentation has been updated to cover the R2 release that hit the streets earlier this year.

There are three ways you can get the documentation:

  • A single CHM file containing all the documents
  • A zip file containing all the other documents as Word docs
  • Individual Word documents.

Navigate to http://tinyurl.com/ocsdoc and you can get all three of these formats!

If you get the .CHM file, you will need to remove the protection from the file (use Sysinternals’ streams.exe) in order to see the contents. From the .CHM file, you can also send the writers email. They are highly responsive and are happy to incorporate any and all good ideas. Great job, guys!
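For the unblocking step, the one-liner below is a sketch (the CHM filename is hypothetical): streams.exe is the Sysinternals tool, and its -d switch deletes the file’s alternate data streams, including the Zone.Identifier marker that makes Windows block downloaded CHM content.

```shell
streams.exe -d OCS_2007_R2_Documentation.chm
```

Run it from the folder where you saved the download; reopen the CHM afterwards and the topics should display.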


250,000 Visitors To This Blog!

I started this blog in May 2003. Just over 5 years ago, I added a traffic counter, and started keeping more detailed traffic counts. Over the weekend, the hit count got to 250,000 – or a quarter of a million visitors! Not as much traffic as some blogs get, but it’s been nice to see the traffic grow here. I am certainly pleased at the slow and steady readership growth.

Looking at my stats today, they show an average of 377 visitors a day and 504 unique page hits a day. Traffic in October hit an all-time record with over 10,000 visits (and over 13,000 page hits). Wow!

For long time visitors – thanks for reading this blog. And for new visitors – welcome and let me know what else I can post here that would keep you visiting.

Sunday, November 01, 2009

Connecting OCS To Other PIC Suppliers

Office Communications Server 2007 implements a feature called Public Internet Connectivity (PIC). Basically, PIC enables you to federate with AOL, Yahoo and MSN/Live Messenger. Thus a user using an AOL IM client can connect and use IM with someone inside your organisation. PIC was cool, but was limited to just the three suppliers (i.e. no Google Talk) and it was licensed separately.

In the past month Microsoft has released some important news. First, PIC licensing has changed. Secondly, Microsoft has announced the release of an XMPP Gateway which facilitates presence and IM interoperability between OCS and both Jabber (now owned by Cisco) and Google’s Google Talk.

There are some key PIC licensing changes. The additional PIC license will no longer be required for federation with AOL. Federation with AOL is free for customers with the OCS R2 Standard CAL, or Software Assurance on their current OCS license. But if you wish to federate with Yahoo, you continue to need a PIC license, although the cost of this drops by 50% (effective 1 October 2009). Additionally, as from June 2009, you no longer need a PIC license to federate with Windows Live (same requirements as noted above for AOL federation).

The release of an OCS 2007 XMPP Gateway means you can federate with both Google’s Google Talk and with Jabber. And the licensing calls the gateway “Additional Software”, meaning there are no additional Microsoft licensing costs associated with deploying the Gateway.

The OCS Team have published a couple of articles to explain how to get the XMPP gateway up and running. The first blog post discusses Configuring XMPP connectivity to Google Talk. The second blog post looks at how to configure the XMPP Gateway with Jabber XCP 5.4. Both articles are detailed and well illustrated.

For OCS 2007 R2 users, PIC connectivity got a whole lot better!

What Happened To The Post Counts on the MSDN and TechNet Wikis?

I’m a fairly heavy contributor to the MSDN and TechNet Wikis – also known as MSDN and TechNet Community Content. I started posting there pretty much as soon as Microsoft set up this feature a couple of years ago. My contribution has included over 10,000 posts (just over 7500 to MSDN and over 2800 to TechNet). I wrote about the MSDN wiki in August.

I do not know if it’s a short-term glitch or a more major change – but the post counts have been revised significantly downward. TechNet shows just 1297 posts, while MSDN shows just 3927 – thus I’ve lost around half my post count. At the time I wrote the August post, I had over 6500 posts credited, but now it’s a LOT lower.

MSDN/TechNet: what’s happened??

Wednesday, October 21, 2009

MSDN Has A New Look and Feel

I spend time on the MSDN site, particularly the MSDN Library sub-site, where I’ve added a few PowerShell scripts as well as editing the content that is added. Just recently, the site has had a bit of a makeover. The “MSDN-RED” logo is replaced with a more stylish blue colour, along with the opportunity to change your view of the site.

From FireFox, I have a new pop up at the bottom right hand corner of my browser window:

image

The “old” view, Classic, is what you are used to, although with new colours. It is the view I will use going forward. Lightweight beta provides what it says: a much more lightweight feel. ScriptFree is even nicer (IMHO) to look at. And with smaller pages, download times are much snappier.

But what both of these new views omit is all the community content (i.e. Community Content) as contained in Classic view. From the ScriptFree page, community-added page tags are R/O, and there appears to be no way to see or edit Community Content (from both the Lightweight and ScriptFree skins). And the big orange Switch View button in Classic view is pretty ugly and distracting – worse, there appears to be no way to tell it: I’m happy with what I see, please go away. Or at least offer a more subtle control, perhaps in the title bars as in the other views.

For casual users, or those on lightweight (aka cellular) networks, it’s a nice touch. Shame about losing the community content.

Wednesday, October 07, 2009

Ensuring PowerShell Is Loaded Onto a System

I saw a cool tip over at PowerShell.com for working out if PowerShell is available on a system. This tip points out that if PowerShell is installed on a machine, then the registry key HKEY_LOCAL_MACHINE\SOFTWARE\MICROSOFT\PowerShell\1 will exist and will contain key configuration information.

From my main workstation, I see:

PSH [C:\]: cd hklm:\\SOFTWARE\MICROSOFT\PowerShell\1\
HKEY_LOCAL_MACHINE\SOFTWARE\MICROSOFT\PowerShell\1
PSH [HKLM:\SOFTWARE\MICROSOFT\PowerShell\1]: ls

    Hive: HKEY_LOCAL_MACHINE\SOFTWARE\MICROSOFT\PowerShell\1

SKC  VC Name                           Property
---  -- ----                           --------
  0   1 1033                           {Install}
  0   6 PowerShellEngine               {ApplicationBase, ConsoleHostAssemblyName,PowerShellVersion,
                                        RuntimeVersion...}
  3   0 PowerShellSnapIns              {}
  1   1 PSConfigurationProviders       {(default)}
  1   0 ShellIds                       {}

PSH [HKLM:\SOFTWARE\MICROSOFT\PowerShell\1]: ls .\PowerShellSnapIns

    Hive: HKEY_LOCAL_MACHINE\SOFTWARE\MICROSOFT\PowerShell\1\PowerShellSnapIns

SKC  VC Name                           Property
---  -- ----                           --------
  0   7 PowerGUI                       {Version, PowerShellVersion, AssemblyName,ApplicationBase...}
  0   9 Pscx                           {PowerShellVersion, Vendor, Description, Version...}
  0  11 Quest.ActiveRoles.ADManagement {AssemblyName, Description, ModuleName, PowerShellVersion...}

This is pretty cool. And looking at the PowerShellSnapIns – this information is useful for managing snap-ins on clients and servers. You could test the existence of this Registry key as follows:

PSH [C:\foo]: Test-Path hklm:\SOFTWARE\MICROSOFT\PowerShell\1
True
PSH [C:\foo]: $PSVersionTable

Name                           Value
----                           -----
CLRVersion                     2.0.50727.3074
BuildVersion                   6.1.6949.0
PSVersion                      2.0
PSCompatibleVersions           {1.0, 2.0}

But testing the existence of this registry path from within PowerShell is somewhat meaningless. If PowerShell exists on a given system, then the key will exist, so the test will of course succeed. But if PowerShell does NOT exist, you’d never be able to run the script in the first place – so being able to test from within a PowerShell script is unhelpful. I suppose you could write a C# program or a VBScript script to do the detection.
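One PowerShell-flavoured workaround is to test another machine’s registry from a system that does have PowerShell, via the WMI StdRegProv class. A sketch (the machine name is a placeholder, and remote WMI access must be allowed):

```powershell
# Sketch: check whether PowerShell is installed on a remote machine by
# testing for its registry key over WMI. 'RemotePC' is a placeholder.
$HKLM = [uint32]2147483650   # HKEY_LOCAL_MACHINE
$reg  = [wmiclass]"\\RemotePC\root\default:StdRegProv"
$r    = $reg.EnumKey($HKLM, 'SOFTWARE\Microsoft\PowerShell\1')
if ($r.ReturnValue -eq 0) { 'PowerShell is installed' }
else                      { 'PowerShell not found'     }
```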

But wouldn’t a better way be to just get PowerShell deployed? I’d base wide deployment to down-level systems on the RTM version of the Windows Management Framework. The WMF Release Candidate was posted to the web on Aug 13th, so it is getting very close to release. I do not have any specific knowledge, but given past experience with the PowerShell team, I’d expect to see this by the end of the year, maybe at TechEd Berlin.

Technorati Tags:

Tuesday, September 08, 2009

Sharkfest 2009 – The videos

I’m setting up a few new machines, and needed to check out some networking issues – so I went off to Wireshark.Org to download a copy when I discovered a treasure trove of videos and slides from the recent SharkFest conference held in June in Palo Alto.  It sounds like it was an interesting conference – the videos I’ve watched look OK.

Technorati Tags:

Saturday, August 29, 2009

The MSDN Wiki – a look after nearly 6500 edits

Last summer, I posted a blog article about the MSDN wiki, better known as MSDN Community Content (there’s also an equivalent set of content around IT Pro type information, namely TechNet Community Content). This morning I got a comment on that entry which complained about a) not being able to find stuff and b) that the community content idea had been killed. I posted a response to that comment – the Community Content is still alive and kicking. In the past year, I’ve added over 6000 updates, the latest of which was a few minutes ago! I’ve also added around 2500 edits to the TechNet equivalent, or nearly 9000 edits in total – and I’m not even an MVP!

The Community Content represents some really great information – and not a few criticisms where the content (or the product!) is at fault. Sadly, as I noted last year, there is a degree of vandalism on this site, which has grown somewhat over the past year. I am fairly ruthless (although not as up to date as I’d want!) about reviewing new comments and removing what I call “non-content”, as well as duplicate posts which sometimes get made. One trend that has accelerated is for posters to see the MSDN community content as a place to ask questions – a couple of posters, including myself, point them to the community forums and the Microsoft newsgroups. I also try to ensure the tags on each piece of community content are relevant.

As ever in publishing, the MSDN content contains errors – usually minor typos, etc. While these are regrettable, given the sheer scale of the MSDN (and TechNet) library, they are probably inevitable. Thanks to the sharp eyes in the community, these are found and tagged “Contentbug”. Microsoft are slowly working through these and updating the content, proving that the community review process is working, albeit much more slowly than I’d like.

All in all, the MSDN and TechNet Community Content are fantastic resources, and are growing daily. Thanks to the MSDN/TechNet content teams for providing the platform and working with the community to improve the content.

Monday, August 24, 2009

Installing Windows 7 From USB Drive

Surfing around this morning, I found an interesting blog article: Installing Windows 7 From USB Drive 3 Steps Easy Process. Basically, this article shows how to prepare a USB stick from which you can install Windows 7. This is specifically useful for netbooks or other systems that come without a DVD drive. The method uses the UltraISO trial package. You do need a big USB stick – 4GB or more – and make sure you have the latest version of UltraISO.

Technorati Tags: ,

Sunday, August 09, 2009

Windows 7 and Virtual Server

With Windows 7 now at RTM and in my hands, the time has come to start to migrate over. I got the ISO images I needed last week and immediately started to get things running. One specific thing I needed was to get Virtual Server running on Win 7. I am teaching a class on Aug 13/14 and need to prepare – the class is a Windows 7 upgrade class for Microsoft Partners.

I thought, given we’d achieved RTM, that it would be cool to have the Labs running under Windows 7 RTM. To do that, I just needed to get Virtual Server running and install the course files on my laptop. Now I know it’s not officially supported, but I wanted to see if I could make it work. If so, I can then run and demo the labs with Win7 as the host.

There were only two problems with that. The first is that the labs that come from Microsoft Learning require Virtual Server – but Virtual Server is not supported under Win 7, and is actively blocked. The second is that, even if I did get VS running, MSL assume that you will be using earlier OSs to run the labs – and installing these on Windows 7 is specifically blocked by the MSI installer.

After a bit of Googling using my favourite search engine and perusing the MCT private newsgroups, it appears that there are two solutions. However, both involve some hacking (and are probably NOT supported).  The solutions were either to get the labs working under Virtual PC 7, or install VS into Win7 and get it running. While both are possible, I chose to do the latter since this seemed to provide an easier to use solution (if it worked of course!!!).

Naturally, a third (and preferable!) option would be to wait for Microsoft Learning to come up with a supported solution. But that is not likely to happen any time soon and I have a course to teach now. So down the road I went with two main tasks: first, installing Virtual Server on Windows 7 and then installing the Microsoft Learning labs on Windows 7.

Installing Virtual Server under Windows 7

To install Virtual Server under Windows 7 is straightforward, although it involves registry editing (and is NOT supported). The steps are as follows:

  1. Disable the application compatibility fixes. You first need to disable all the application compatibility fixes that would stop you from installing Virtual Server. In Windows 7, bring up the local Group Policy editor, navigate to Computer Configuration\Administrative Templates\Windows Components\Application Compatibility\, and enable all of the “Turn off” entries, as here: image
  2. Then go to the Computer Configuration\Administrative Templates\System\Troubleshooting and Diagnostics\Application Compatibility Diagnostics\ folder and disable all of the entries there. Like here:
    image
  3. Reboot to ensure the settings take effect. You could, I suppose, just do a gpupdate /force, but I prefer to reboot.
  4. Install Virtual Server. Download the binaries from Microsoft.com and install them.
  5. Install VS SP1 – depending on which download you get, you may need to install SP1. NB: installing SP1 is more difficult once you have completed this process, so why not do it now?
  6. Change the application file name. To enable the next step to work properly, you need to rename the Virtual Server application file by going to the Virtual Server directory (typically C:\Program Files\Microsoft Virtual Server\ ) and renaming Vssrvc.Exe – I used Vssrvc_Win7.exe.
  7. Change the registry settings for this new file name. Run regedit, then find and replace all occurrences of "vssrvc.exe" with "vssrvc_Win7.exe". There are around 5 or 6 entries to fix.
  8. Undo the group policy changes you made at step 1 above. This turns all those application compatibility fixes back on (which is probably what you want!).
  9. Reboot.
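To double-check step 7, a quick PowerShell sketch can list registry values that still reference the old file name (the subtree searched here is illustrative – regedit’s Find covers the whole registry):

```powershell
# Sketch: find registry values under a subtree that still mention
# vssrvc.exe. The CLSID subtree is illustrative; adjust as needed.
Get-ChildItem 'HKLM:\SOFTWARE\Classes\CLSID' -Recurse -ErrorAction SilentlyContinue |
  ForEach-Object {
    $key = $_
    foreach ($name in $key.GetValueNames()) {
      if ($key.GetValue($name) -like '*vssrvc.exe*') {
        '{0} : {1}' -f $key.Name, $name
      }
    }
  }
```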

Once the reboot is completed, you should have (the renamed) Virtual Server running on your Windows 7 box. If you want to just run VMs, off you go – you can create and use VMs just as you did on XP or Server 2003. However, if you want to install an MSL lab onto the system, you have just a bit more work (hacking) to do.

Install course files for Lab Launcher into Win7

Once you have Virtual Server running, you need to fix the installation file for the lab you want to install, e.g. 6291.MSI. Normally, you just run this MSI file on your master system (and use Ghost et al to then blast it to the rest of the systems in the classroom). The only problem with this approach is that Microsoft Learning have put in an installation block on Win7. That means when you try to run the installation MSI, it fails with an error message (”the operating system version does not meet the minimum version required”). Turns out this is really easy to fix – but you do need an MSI editor.

There are two MSI editors you can use (and both are free):

  • ORCA – this is a free MSI editor from Microsoft and comes as part of the Windows Installer SDK. It’s a bit of a pain to get as you first have to download and install the SDK, and then install Orca.Msi. You can find copies of orca.msi out on the internet, but downloading it from Microsoft is probably safest even if it is a bit tedious.
  • InstEd – you can also use InstEd, a free tool you can get at http://www.instedit.com/. I like InstEd and blogged about it a while ago. It’s still cool!

Once you have your MSI editor loaded, you need to remove one line from the Launch Condition table. If you are using Orca, you’ll see something like this:

image

Just delete the first line of this table, then save the MSI (File/Save As) to a separate MSI (just in case). Then exit out of Orca and you can run your newly saved MSI. You should be good to go.

Minor issues

I found that, after playing around a bit, Virtual Server was complaining about access permissions on the VHDs on the host. I just gave all users full access to C:\ on the host and the problem went away.

An important issue to consider is that this is not supported. Yes, it works, and works reasonably well. But it also involves editing registry entries and using an MSI editor. From personal experience, both can be dangerous. So use this information at your own risk.

Kudos

Thanks to a couple of posters for helping me to do all this. First, the details of how to get VS running were obtained from a TechNet forum post. Hacking the MSI to get around version limitations is an old trick, but thanks to MCT super-star Michael Buchardt who pointed me to InstEd! And finally, thanks to Chris Randall for the motivation to write this up.

Thursday, August 06, 2009

The Flood Gates are Open – Windows 7 has hit TechNet and MSDN

Well, after a long wait, it’s finally there – the Windows 7 RTM release is on TechNet Plus for download. As I type this, I’m getting speeds varying between 100kbps and bursting to 200kbps. Not stunning, but OK. Hopefully the download will finish tonight and I can spend tomorrow installing Win7 onto my laptop. Hooray!

Technorati Tags:

Static vs Dynamic WMI Methods with PowerShell

In yesterday’s PowerShell and .NET Framework – Similarities and Differences! article, I wrote about how there is stuff you have to know in order to use PowerShell efficiently. In a recent Hey, Scripting Guy! blog article, the Scripting Guys talk about the differences between a static method on a class and an instance-based method. They use the WMI Win32_Process class as an example. In this WMI class, there is an instance-based method, Terminate, that lets you terminate a specific process. As the Scripting Guys point out, there’s no Create method on an instance. That’s because the Create method is a static method, which you invoke in a slightly different way.

To create a process, you’d do the following:

PSH [C:\foo]: ([wmiclass]"Win32_Process").Create("NotePad")

__GENUS          : 2
__CLASS          : __PARAMETERS
__SUPERCLASS     :
__DYNASTY        : __PARAMETERS
__RELPATH        :
__PROPERTY_COUNT : 2
__DERIVATION     : {}
__SERVER         :
__NAMESPACE      :
__PATH           :
ProcessId        : 8828
ReturnValue      : 0

This is yet another one of those things you need to know about using WMI with PowerShell: how to access static vs. instance based methods.
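And going the other way, Terminate being an instance method, you get hold of the instance(s) first and then call the method on each one – for example:

```powershell
# Terminate is an instance method: retrieve the process instance(s),
# then invoke the method on each (a ReturnValue of 0 means success).
Get-WmiObject Win32_Process -Filter "Name = 'notepad.exe'" |
  ForEach-Object { $_.Terminate() }
```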

Technorati Tags: ,,

Wednesday, August 05, 2009

PowerShell and .NET Framework – Similarities and Differences!

When I teach PowerShell, I point out the consistency within and across the product. When you learn something, it can be broadly used in other circumstances. This is the power of knowledge transfer – learning things once and using that knowledge to solve other problems.

Of course, despite the consistency that’s there, there is a level of inconsistency to manage. There are things that are just not particularly intuitive – and you just have to learn those differences. There is definitely a learning curve when it comes to learning PowerShell! I’ve published what I learn as I go along – I posted an article about .NET and PowerShell last December where I looked at how you access .NET from PowerShell.

With a bit of learning, via posts like mine or from newsgroup/forum posts, most PowerShell users can look at a bit of documentation on .NET or WMI and work out how to access the relevant class, method, etc. But every so often, stuff that looks obvious isn’t! This fact (feature???) bit me over the weekend when I was working on the Get-System.Environment script that I published over on my PowerShell Scripts blog.

I was playing around with the System.Environment class, and in particular the Environment.SpecialFolder enum. Just looking at it, I felt that if I can use one enum like this,

PSH [C:\foo]: [system.Enum]::GetValues([system.dayofweek])
Sunday
Monday
Tuesday
Wednesday
Thursday
Friday
Saturday

then I should be able to do something like this (which, as you can see, I can’t):

PSH [C:\foo]: [system.Enum]::GetValues([System.Environment.SpecialFolder])
Unable to find type [system.Environment.SpecialFolder]: make sure that the assembly containing this type is loaded.
At line:1 char:60
+ [system.Enum]::GetValues([system.Environment.SpecialFolder] <<<< )
    + CategoryInfo          : InvalidOperation: (system.Environment.SpecialFolder:String) [], RuntimeException
    + FullyQualifiedErrorId : TypeNotFound

Then, after some searching, I discovered my mistake! Instead of [System.Environment.SpecialFolder], I needed to specify [System.Environment+SpecialFolder]. As shown here:

PSH [C:\foo]: [system.Enum]::GetValues([system.Environment+SpecialFolder])
Desktop
Programs
Personal
Personal
Favorites
Startup
Recent
SendTo
StartMenu
MyMusic
DesktopDirectory
MyComputer
Templates
ApplicationData
LocalApplicationData
InternetCache
Cookies
History
CommonApplicationData
System
ProgramFiles
MyPictures
CommonProgramFiles

Sure – that was obvious, NOT! But another hurdle on my learning curve!
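Once you have the nested-enum syntax right, you can put it to work – for example, System.Environment’s GetFolderPath method takes one of these enum values:

```powershell
# Use the nested enum with Environment.GetFolderPath to resolve a
# special folder to its actual path on this system.
[System.Environment]::GetFolderPath([System.Environment+SpecialFolder]::Desktop)
```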

PowerGui Version 1.9 is Released

As Dmitry discusses over on his blog, the most popular PowerShell freeware tool has just got an update. He and his team have reached into the “I want” list and produced a stunning new version, just in time for the release of PowerShell V2 (effectively tomorrow or thereabouts). You can download the new version from the PowerGUI website at: http://powergui.org/downloads.jspa.

If you are already running an older version of PowerGui, you should be able to check for updates (Help/Check for Updates). But on my main workstation, that gives an error, so I am downloading the new version manually. The download (PowerGUI.1.9.0.900.msi) is 8.1MB – so not a massive download! In my case, I also needed to install the Active Roles cmdlets (another 8.45MB) before I could finish installing PowerGui.

While I prefer PowerShell Plus for my day-to-day editing and development tasks, I have to say PowerGui is a nice product – especially as it’s free. The new version has a number of neat new features, including Version 2 module support and more IntelliSense. Perhaps the biggest feature is full Windows 7 and Server 2008 R2 support. It might be tempting fate to release 1.9 the day before Microsoft opens the floodgates that will be the waves of Windows 7 and Server 2008 R2. I suspect, like many people, I have a browser window open at the TechNet Plus download site just waiting for the downloads to begin!!

Technorati Tags: ,,

Sunday, June 07, 2009

Bing is making a splash!

In more ways than one! By Bing, I mean Microsoft’s new search engine (http://www.bing.com or http://www.bing.co.uk). I’ve been playing with it a little (and love the nice scenery pictures on the landing page). The results seem OK so far, and it seems a lot better than the old Live Search engine. And there are a lot of references to Wikipedia.

One interesting thing though, I’m starting to see referrals in my blog hits. Looking at the most recent 20 hits, 5 are from Bing.com! And if I’m seeing hits like this, then that means other folks are using Bing.

One other related Bing thing: I notice that there’s already a PowerShell project on CodePlex (http://poshbing.codeplex.com). This project has created a library of Bing-related PowerShell scripts, so you can do things like Get-BingSpell (to query alternative spellings for a word) or Get-BingTranslation. Like so:

PSH [C:\foo]: Get-BingTranslation "Beer is good, free beer is better" en de

Query                               SourceType     TranslatedTerm
-----                               ----------     --------------
Beer is good, free beer is better   Translation    Bier ist gut, freie Bier ist besser

I could get to like Bing.

Technorati Tags:

Sunday, May 24, 2009

Useful Yet Little Known Features in PowerShell

Over on the Stack Overflow site, there’s a fascinating question and answer(s) about little known features of PowerShell. See What are some of the most useful yet little known features in the PowerShell language for this article. If you are a PowerShell geek, you probably know these, but there are one or two that you might have missed!

Technorati Tags: ,

Saturday, May 23, 2009

TechNet Virtual Conference 2009

Where are you going to be on June 19th? If you are a techie and into Microsoft technology, then perhaps you should be attending TechNet’s free Online Conference. This is an all day, free, on-line event aimed at delivering technical information that the community requested. And did I mention, unlike other events you have to pay for, this conference is free!

The conference is divided up into two tracks: Technology and IT Manager, with content to match. See the web page: TechNet Virtual Conference 2009 for more details on the agenda.

This is an excellent idea and I am definitely looking forward to hearing Richard Siddaway and Brent Johnson in particular. I’m sure glad the sessions are recorded so I can later view the sessions I missed!

See you on-line.

Technorati Tags:

Friday, May 22, 2009

PowerShell and WMI Namespaces

Over on Tim Benninghoff’s blog, there is an interesting post: PowerShell and WMI namespaces. With WMI, the classes and instances are organised under a hierarchical namespace starting at the appropriately named “root”. Individual nodes can have children, which can in turn have children, and so on. To some degree, namespaces are just defined by a product team and there is little consistency across software products (such is life!). But where to start?

Despite what Tim says about using the GUI, MOW’s most excellent WMI Explorer script is one fantastic tool. Not only is it a really good browser, but as Tim notes, it’s written in PowerShell which is even more cool. I use this script in most of my training courses to add value!

His post then goes on to describe two methods of obtaining the namespaces within WMI. There are two small problems with his examples. In his first example he has a minor typo – it should read as follows:

gwmi -namespace "root" -class "__Namespace" | Select Name

In Tim’s post, he spelt the class with just a single underscore (“_Namespace”) rather than two (“__Namespace”). WMI is sadly very picky! Tim’s other method works fine and, as he says, produces the same output as his first (well, once corrected!). On my system, this produces the following output:

PSH [C:\foo]: gwmi -namespace "root" -class "__Namespace" | Select Name

Name
----
subscription
DEFAULT
MicrosoftDfs
CIMV2
Cli
nap
MicrosoftIISv2
SECURITY
RSOP
MicrosoftDNS
WebAdministration
WMI
directory
Policy
virtualization
Hardware
ServiceModel
MSAPPS12
Microsoft
aspnet

The second point is that his two methods just produce a list of namespaces under the root. Since each node in the namespace can have children, his two methods do not list all the namespaces in which you can find classes. This, IMHO, is one case where the GUI is a better tool – visualising the hierarchy in a tree control is a whole lot easier than trying to do it from the command line. And you get the namespaces in alphabetical order (although you could pipe the output above to Sort-Object easily enough). On my workstation I see a number of subsidiary namespaces below root, as you can see here:

image

 

WMI namespaces are a good thing to understand, since many of the classes you might want to access using Get-WMIObject (et al) rely on the –Namespace parameter (and the appropriate namespace name!).
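If you do want the full tree from the command line after all, a small recursive function does the trick – a sketch (the function name is my own invention, not a built-in cmdlet):

```powershell
# Sketch: recursively walk the WMI namespace tree starting at root.
function Get-WmiNamespace {
  param ([string] $Namespace = 'root')
  Get-WmiObject -Namespace $Namespace -Class __Namespace |
    ForEach-Object {
      $child = '{0}\{1}' -f $Namespace, $_.Name
      $child                      # emit this namespace
      Get-WmiNamespace $child     # then recurse into its children
    }
}

Get-WmiNamespace | Sort-Object
```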

Technorati Tags: ,,,

Thursday, May 21, 2009

Windows Vista for XP Professionals

I’ve been carrying this book around (see http://www.amazon.co.uk/Windows-Vista-XP-Professionals-Updating/dp/9072389018 to buy it!). It’s written by Dutch MCT superstar Raymond Comvalius. In summary, this is a great book – simple and to the point. Unlike some books, there are very few screen shots – just lots of good, straightforward text!

The book contains 8 chapters:

  • Chapter 1 – Introduction
  • Chapter 2 – What’s new in Vista and is not discussed in the book – a nice touch!
  • Chapter 3 – Deploying Vista – a good look at the deployment tools which are all new in Vista.
  • Chapter 4 – Managing Vista – includes details on group policies and a look at WinRM.
  • Chapter 5 – Securing Vista – explains the key new security features of Vista including UAC, file/registry virtualization and BitLocker.
  • Chapter 6 – Networking – Vista includes a bunch of new networking features, in effect a new TCP/IP stack, which are described in this chapter.
  • Chapter 7 – Mobility – a look at the mobile features of Vista.
  • Chapter 8 – Migration to Vista – this final chapter examines how to plan your Vista migration.

This is an excellent summary of what an IT Pro needs to know moving forward to Vista. I hope Raymond writes an update for Windows 7!

Technorati Tags: ,

Wednesday, May 20, 2009

Pscx 1.2 Beta Released

The beta of the next version of the PowerShell Community extensions has been released (see Nivot Ink - Pscx 1.2 Beta Released for more details). The code itself can be downloaded from: http://pscx.codeplex.com/Release/ProjectReleases.aspx?ReleaseId=1615.

For me, the cool thing is that this is released as a module, so I can import the module when I want it – or not. Nice touch. If you are using Win 7, note this beta requires Win 7 RC!
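Loading it on demand is then just one line (assuming the module has been installed where PowerShell can find it):

```powershell
# Import the Pscx module only when you want its extra cmdlets, then
# confirm it loaded. Assumes Pscx is installed on the module path.
Import-Module Pscx
Get-Module Pscx
```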

Tuesday, May 19, 2009

2009 Summer Scripting Games

As in years past, Microsoft is hosting the 2009 Summer Scripting Games. This year, the games are co-promoted by the Microsoft Scripting Guys and PoshCode.org. The games will run from June 15-26 and should be a lot of fun. As happened last year, I’ve been asked to submit a sample script and am hard at work on it already!

If you are a novice, an expert, or anywhere in between with scripting (PowerShell or even VBScript), then visit the site, sign up and take part.

 

Wednesday, May 06, 2009

Updates for Communications Server 2007 R2 (almost SP1!)

Microsoft has released a set of 13 patches for OCS 2007 R2. You might think of this as SP1 for OCS 2007 R2 (but that’s not the way MS is marketing it). Irrespective of what they call it, this set of patches is probably worth adding if you are deploying R2. As ever with patches, check the details to see whether, and how, your systems are affected by each of these patches.

Due to the complex nature of OCS Deployments, patching is hard as there’s not just a single patch you can apply to all systems – you have to apply some patches on some systems and other patches on other systems. For example, an OCS 2007 R2 SE system needs 9 patches, while the Edge Server needs three. So plan this carefully as, at the least, you’ll need some service outages to apply the fixes.

The KB article List of available updates for Communications Server 2007 R2: April 2, 2009 lists all 13 patches and describes which patch needs to be installed on which server role. You can drill down into each of the 13 individual patches – each has its own KB article. Most of these KB articles explain the issues resolved by the patch. KB 967675, which describes the fix to the Mediation Server, does not contain details of the fixes, but that’s probably just a doc error that will get fixed soon. For each individual issue, there’s a link to (another!) KB article describing the specific issue in more detail, including the symptoms of the (resolved) issue.

All in all, this is a useful update and well packaged. Next time though, couldn’t we have a mondo-patch (R2PatchAug09.exe for example) that you can apply to every related OCS system? That would help with deployment and could reduce support calls, especially from organisations deploying distributed Enterprise pools.


Friday, April 24, 2009

Windows Management Infrastructure Blog

The WMI team are now blogging in the Windows Management Infrastructure Blog. A most useful blog for those wanting to understand WMI (and BITS, and WinRM), especially those coming to this from a PowerShell point of view. One interesting post examines the WMI story pre-Windows 7 (with lots of great links to more background). Another post looks at what is coming with WMI in Windows 7 (i.e. with PowerShell V2).
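If you’re coming at WMI from the PowerShell side, here’s a taste of the sort of thing that blog covers. This is a minimal sketch of my own (not from the blog itself); Get-WmiObject and the Win32_OperatingSystem and Win32_Service classes are standard in PowerShell V1/V2 on Windows:

```powershell
# Query the local OS through WMI and pick out a few properties
Get-WmiObject -Class Win32_OperatingSystem |
    Select-Object Caption, Version, LastBootUpTime

# WQL filters are evaluated by WMI itself, which matters
# when you query remote machines with -ComputerName
Get-WmiObject -Class Win32_Service -Filter "State = 'Running'" |
    Select-Object Name, StartMode
```

The second form is worth getting into the habit of early: filtering in WQL, rather than piping everything to Where-Object, keeps the data returned from a remote box to a minimum.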

This blog is well written! And any blog that can mix PowerShell, reindeer, Jedis and grasshoppers can’t be all bad!


PowerShell and Cmdlet Design Guidelines

Dan Harman, a PM in the Windows PowerShell team, has published an excellent blog article: Increasing visibility of cmdlet design guidelines. The article talks about cmdlet naming conventions and the importance of good naming around the verbs and nouns used in the names of cmdlets (and, by implication, advanced functions too). The article also talks about the problem of a proliferation of uses of PowerShell that deviate from the standards currently being set by the PowerShell team.

With the increasing adoption of PowerShell across MS and non-MS products, standards are vital and enforcement probably mandatory. PowerShell aims to help organisations break down their knowledge silos. To achieve this, there needs to be a standard set of verbs and nouns you can use and some good way of enforcing them.
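By way of illustration (my example, not one from Dan’s article), a well-named advanced function pairs an approved verb with a singular noun, and in V2 you can check the approved list with the Get-Verb cmdlet. The function name and body here are purely hypothetical:

```powershell
# Good: approved verb (Get), singular noun, standard parameter name
function Get-DiskInfo {
    [CmdletBinding()]
    param (
        [string] $ComputerName = $env:COMPUTERNAME
    )
    Get-WmiObject -Class Win32_LogicalDisk -ComputerName $ComputerName
}

# A quick way to check whether a verb is on the approved list;
# "Retrieve" is not, so a Retrieve-DiskInfo function would draw a warning
Get-Verb | Where-Object { $_.Verb -eq 'Retrieve' }
```

Sticking to Get-Verb’s output is what makes commands discoverable: an admin who has never seen your module can still guess that the retrieval command starts with Get-.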

Microsoft proposes some changes in V2 to resolve the issue. Specifically, when you call Import-Module to import cmdlets, PowerShell evaluates the module and warns against violations of the standards. The end user sees a warning message, but the cmdlets are still imported – by default, a module with badly named cmdlets loads and works, just with a warning attached. For scripts where you want to suppress the warning, just add the –DisableNameChecking switch to the call to Import-Module.
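A quick sketch of the behaviour (the module name TestModule is hypothetical, and the warning text is paraphrased rather than quoted verbatim):

```powershell
# Importing a module whose exported commands use unapproved verbs
# produces a warning, but the import still succeeds
Import-Module TestModule
# WARNING: some imported command names include unapproved verbs,
# which might make them less discoverable

# In scripts where the naming is a known quantity, suppress the warning
Import-Module TestModule -DisableNameChecking
```

Note that –DisableNameChecking only silences the message; it doesn’t rename anything, so the discoverability problem the warning is flagging remains.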

This is a pretty neat solution. We’ve already seen a number of interesting approaches to the use of PowerShell by various teams, and MS is right to want to stop inconsistent naming before it becomes a real problem.

One addition I’d like to see: a GPO setting that disables name checking by default. The scenario I envisage is an organisation that has, for whatever reason, invested in cmdlets that end up being badly named. Rather than have their admins constantly reminded or, worse, confused, I’d like to see a GPO setting that simply suppresses the message for my organisation (or part of an organisation).
