Tuesday, September 14, 2010

Microsoft Lync 2010/Microsoft Lync Server 2010 - Resources

Here are the resources I’ve found so far. I am updating this regularly as I find more stuff!

Last Update: 13:15, 18 September 2010

 

Background Information

Here is some background information on Lync Server 2010 and Lync 2010:

Lync Software Components

Microsoft has released some bits of the product as separate downloads:

Microsoft Blogs

Non-MS Blog Posts

Podcasts

Planning Tools

Product Documentation

Microsoft has released several white papers on the http://technet.microsoft.com/en-us/lync/default.aspx site, including:

  • Determining Your Infrastructure Requirements for Lync Server 2010 (RC).doc - Download
  • Planning for Archiving Lync Server 2010 (RC).doc - Download
  • Planning for Clients and Devices Lync Server 2010 (RC).doc - Download
  • Planning for Enterprise Voice Lync Server 2010 (RC).doc - Download
  • Planning for External User Access Lync Server 2010 (RC).doc - Download
  • Planning for IM and Conferencing Lync Server 2010 (RC).doc - Download
  • Planning for Other Features Lync Server 2010 (RC).doc - Download
  • Planning for Your Organization Lync Server 2010 (RC).doc - Download

Pricing and Licensing

Lync Server 2010 follows a Server/Client Access License (CAL) model whereby a Lync Server 2010 license is required for each operating system environment running Lync Server 2010 and a CAL is required for each user or device accessing the Lync Server.

  • Pricing for Lync Server and Client – this page sets out the details of licensing for Lync Server and the client. Pricing on the page is ‘estimated’ – in other words, see your reseller, as actual prices will vary from the ‘official’ costs shown here.

Support

At present, there’s no formal support for Lync 2010. For now, the best places to find more information are Microsoft’s OCS 2007 Forums:

Webcasts – CS14 at TechEd

Microsoft presented CS14 topics at TechEd North America earlier in 2010. The presentations and slide decks are all available for download and use. These presentations talk about CS’14’ – but aside from the branding, the details are the same!

Press and PR Coverage

As often happens, much of the industry found out about Lync Server’s public debut from non-Microsoft sources, quickly followed by the MS presentation. Here is some of the press background if you are interested.

Email me any changes or updates and I’ll try to keep this list up to date!

[UPDATES TO ORIGINAL POSTING]

  • 15 Sept 2010 – Added CS14 webcasts, added a Podcast section and did some minor re-org of the list itself. Also added this update list.
  • 17 Sept 2010 – Separated out the Press/PR stuff from basic tech info. Added Lab deployment guide reference.
  • 18 Sept 2010 – fixed missing link to Thomas/Cezar's podcast, added Worked Deployment Guide from Jeff Schertz.
  • 4 Oct 2010 – added details on licensing and links to planning documentation.

 


Monday, September 13, 2010

PowerShell Master Classes

I’ve been teaching my PowerShell Master Classes both in Europe and in the US over the past 6 months. After two classes in Stockholm, I taught in the US (teaching Microsoft’s Hotmail Engineering team), and in Copenhagen – plus a one-day session for a client in the City of London. 

There are now two classes – both three days. The PowerShell Basics class covers the basics of PowerShell, both from the command line and as a scripting tool. The PowerShell Advanced class looks at more advanced features of PowerShell, particularly its use in the OS, as well as key applications including SQL Server, Exchange 2010 and IIS.

The next runs of the PowerShell Master Classes are:

I’ll post more details on the agenda for these classes shortly. I am also working on a Weekend 2-day Script Camp – watch this space for more details once we have them finalised.

Microsoft Ships RC of Lync (renamed from Communications Server)

Thanks to Mary Jo Foley’s piece in ZDNet, I see Microsoft has just shipped a release candidate for Lync Server 2010. Lync, formerly known as Communications Server ‘14’, is the successor to Office Communications Server. The name, said to combine ‘link’ and ‘sync’, is important in that it simplifies the branding, with much shorter names for the constituent components.

Lync combines presence, instant messaging, conferencing and telephony – allowing you to simplify communications and, in due course, reduce the costs of running your legacy phone system. Presence lets your users see what their colleagues are doing, which simplifies communications: if I want to chat to someone and I see they are in a meeting, I know the call would be wasted. Combined with presence, IM enables fast peer-to-peer (and sometimes peer-to-multiple-peer) communication. Lync 2010’s voice capabilities should enable many companies to adopt the product as their main PBX.

At the time of writing, Microsoft has put up the RC for download here: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=29366ba5-498f-4d21-bc3e-0b4e8ba58fb1&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+MicrosoftDownloadCenter+%28Microsoft+Download+Center%29#tm (don’t you love the snappy urls?). This page also has links to more Lync Server 2010 information, but these links are not yet live. No doubt they will be shortly.

The download is just over 1.5GB and contains both the Standard and Enterprise Editions of Lync 2010. In keeping with recent platform changes, this RC (and the final product in due course) ships as 64-bit only which means you need to be running a 64-bit OS (and here I recommend running Server 2008 R2!).

Over the coming weeks, I’ll blog more about the functions and features of this cool product – not least of which is the PowerShell interface.

Saturday, September 04, 2010

Calling Functions in PowerShell

I spent quite a bit of time earlier this week with a problem in calling worker functions. In a PowerShell script, or a PowerShell session, if there is a set of commands you run more than once or twice, one thing you can do is to put them into a function and then just call the function rather than typing out all the individual commands. Worker functions can make scripts a lot shorter. You can see an example of a worker function over here: http://pshscripts.blogspot.com/2010/09/get-umalquracalendarps1.html. In that example, you can see a worker function, DisplayValues (begins at line 45), which I then call several times in the script.

The problem I had this week was that, for some reason, the worker function was spewing out nothing like what I was expecting. I stared at the code for some time before spotting the problem – I’d not used the right calling syntax to invoke the function. A typical newbie mistake.

In .NET method calls, you state the name of the object, a “.” followed by the method name, and a parameter list enclosed in parentheses: $object.method($a, $b, $c). But a function is called without the parentheses, and the parameters are space delimited, not comma delimited – for example: wf1 $a $b $c.

To illustrate this problem, I’ve written a little script that defines a worker function then calls it several ways:

function wf1 {
param ($a, $b, $c)
"`$a:" ; $a; ""
"`$b:" ; $b; ""
"`$c:" ; $c; ""
}

wf1("Foo",'Bar',"foobar")
wf1  "Foo",'Bar',"foobar"
wf1  "Foo" 'Bar' "foobar"
wf1 -c "foobar" -a "Foo" -b "Bar"

The output from this is left as an exercise for the reader!

Now, the first two times the script calls the worker function, PowerShell assigns the array (i.e. the stuff delimited by commas) to the first parameter, leaving the other two parameters empty. Not what you want. The last two examples call the worker function correctly. Arguably the final calling sequence, with named parameters, is better from a production-orientated point of view.

One interesting thing – all four of those calling sequences ‘work’ and, by default, present no apparent errors. However, if you set strict mode with Set-StrictMode -Version 2, PowerShell warns that the first call is in error, like this:

PSH [C:\foo]: . 'C:\Users\tfl\AppData\Local\Temp\Untitled5.ps1'
The function or command was called as if it were a method. Parameters should be separated by spaces. For information about parameters, see the about_Parameters Help topic.
At C:\Users\tfl\AppData\Local\Temp\Untitled5.ps1:8 char:4
+ wf1 <<<< ("Foo",'Bar',"foobar")
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : StrictModeFunctionCallWithParens

Now had I had the relevant line set in my profile, I’d not have wasted an hour wondering why things were not working.
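
For reference, here is the line in question – a minimal sketch, assuming you want strict mode in every session:

# In your profile: catch method-style function calls (and other sloppiness) early
Set-StrictMode -Version 2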

Thursday, September 02, 2010

Paper.Li – Organising Twitter Information

Although newspapers of the physically printed variety are dying all around us, the metaphor that is a newspaper fails to die. I’ve been playing a bit today with Paper.Li – one of those hundreds of sites that leverage Twitter and add value to your Twitter stream. I’ve created my own 'newspaper’ - http://paper.li/doctordns. This page is updated once every 24 hours and features links that Paper.Li has gleaned from my twitter feed. It’s nicely laid out and features my own tweets, plus those from folks I follow. In my case, the paper also includes tweets with the hash-tag #PowerShell. I’m kind of surprised I don’t see more Grateful Dead links, but we’ll see.

The papers you can create with Paper.Li come in three broad types: your own tweet stream, a hash tag, or a list of @people. The first is the model I noted above – Paper.Li parses your tweet feed/stream (tweets you get from those you follow and those you make) and makes a paper. The second makes the newspaper from tweets containing a particular hash tag. For example, the PowerShell hash tag is #PowerShell, and there’s a related newspaper: http://paper.li/tag/PowerShell. The @people paper is based on a list of folks you create on Twitter. http://paper.li/jkavanagh58/powershell, for example, is a paper based on @jkavanagh58’s PowerShell twitter list.

 

 


Wednesday, August 04, 2010

PSHScripts.Blogspot.Com – 2 Years On!

Just over 2 years ago, I created a new blog, The PowerShell Scripts blog, over at http://pshscripts.blogspot.com. The idea was simple – a blog with single-function PowerShell scripts. Scripts that demonstrated one (or at least a very small number) of things to do with PowerShell. I had in mind that since the blog was hosted by Google, they’d do a good job of indexing it, and providing links to it. Which is exactly what happened.

In the two years, I’ve had over 65,000 visitors to the blog. At present, I’m getting around 170 hits per day and just under 300 page views per day on average over a week. However, this is very much a weekday blog, with Monday–Friday tending to be closer to 200+ hits/day.

But the interesting, and gratifying, thing is the percentage of hits coming from the key search engines (Google and, more recently, Bing) and the search terms they are using. I’ve used a free traffic counter from SiteMeter to measure the traffic, but it only shows the last 1000 visitors and I had not added a more permanent tracking system – so I’ve recently added Google Analytics to the site. So far, the results from both tools show the same trends.

Looking at the Analytics output for the past month, around 70% of all the traffic to the blog comes from Google and Bing, with a bit more from PowerShell.com, TechNet and this blog. There are a few other search engines that send traffic, but Google and Bing are dominant. Also, around 13% of the traffic comes from direct hits on http://pshscripts.blogspot.com.

Looking at the search terms used is also interesting. PowerShell Scripts (and PowerShell script) make up 18% of all hits. Below those two, there’s a very long tail of relatively low numbers of hits over the past month. There were around 1000 separate search terms used – and all but a handful appeared only once or twice. That shows that the narrow focus of each post has proven useful – you can search with a fairly narrow term, such as “PowerShell ipaddress wmi” or “powershell send udp”, and see the blog at the top or near the top of the first page.

There were two discoveries that were curious. First, looking at the referring sites, I noticed one site had a relatively high number of pages per visit and a long average time on site. The bounce rate for this traffic was also very, very low. It turns out that an IT professor in Viet Nam has put up a link to my blog along with all the slides from the upcoming PowerShell V2 class. Not sure about the legality of putting the slides up – but see for yourself at http://hoanguyen40.ecoles.officelive.com/OSScripting.aspx.

The other amusing thing I found, when looking at the networks that send traffic (think ISPs), is that the top network listed was Microsoft!
All in all, a good first two years for this blog – getting several hundred hits a day is more than I expected.

Tuesday, August 03, 2010

Using Later Versions of the .NET Framework Remotely

I’ve posted a couple of articles recently regarding using later versions of the .NET Framework. Through the magic of .NET, you can just create a simple .CONFIG file to tell the relevant executable to use a later version of the framework by default. A .CONFIG file is just a very simple bit of XML as I demonstrated here, where I show how to create the XML and save it as PowerShell.Exe.CONFIG. This worked fine to enable the PowerShell console to use later versions of the framework. And I showed some of those results in a separate post here.

To enable this to work with PowerShell Plus, I just went and found the executable for PowerShell Plus and created the relevant .CONFIG file (PowerShellPlus.Exe.CONFIG). Now PowerShell Plus brings up .NET 4 too! Yeah! Or so I thought.
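
For anyone wanting to repeat the trick, here is a rough sketch of what I mean – note that the process name 'PowerShellPlus' and the idea of locating the executable via Get-Process are my assumptions rather than Idera’s documented procedure; the XML is the same as for PowerShell.Exe.CONFIG (see the 1 August post below):

# Find the PowerShell Plus executable via its running process (assumes one instance,
# named 'PowerShellPlus'); writing under Program Files needs an elevated shell
$exe = (Get-Process PowerShellPlus | Select-Object -First 1).Path

$config = @'
<?xml version="1.0"?>
<configuration>
    <startup useLegacyV2RuntimeActivationPolicy="true">
        <supportedRuntime version="v4.0.30319"/>
        <supportedRuntime version="v2.0.50727"/>
    </startup>
</configuration>
'@

# Write it alongside the executable as PowerShellPlus.Exe.CONFIG, then restart the tool
Set-Content -Path "$exe.config" -Value $config -Encoding Ascii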

After posting these articles, I saw on Twitter that @qa_warrior was having trouble doing this remotely. The upshot of his problem was that even though he’d configured PowerShell to use .NET 4.0 on both client and server, when he remoted, his remote session was still based on 2.0. It turns out that this too was simple to fix – if you understand how PowerShell remoting is implemented on the remote host.

When you create a new PSSession with a remote server, what you are doing is instantiating a PowerShell runspace on that server and sending it commands to execute (which return their results, albeit serialised). In order for the remote machine to create (and later use) the runspace, Windows needs to house that runspace inside a process. PowerShell 2.0 creates that runspace inside an executable, wsmprovhost.exe. You can see this in action here:

[Screenshot: two remote PSSessions and the two corresponding wsmprovhost.exe processes]

In this screenshot, I created two remote sessions (admittedly to myself – but ‘remote’ nonetheless). You can see the two occurrences of wsmprovhost.exe and the two remote sessions. After removing these remote sessions, there are no occurrences of either wsmprovhost or the remote sessions. Then, after you create a new PSSession, you can see the new session and a new occurrence of wsmprovhost.
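
If you want to see this for yourself, and then apply the same .CONFIG trick to the remoting host, the sketch below shows the idea. Dropping a wsmprovhost.exe.config next to wsmprovhost.exe (under System32) is my reading of the fix rather than a documented procedure, so test it before relying on it:

# Watch the remoting host process come and go (assumes remoting is enabled locally)
$s = New-PSSession -ComputerName localhost
Get-Process wsmprovhost                                   # one instance per remote session
Invoke-Command -Session $s { $PSVersionTable.CLRVersion } # the CLR the remote end is using
Remove-PSSession $s

# To get the remote side onto .NET 4, give wsmprovhost.exe its own config file on the
# remote machine, with the same XML as PowerShell.Exe.CONFIG:
#   C:\Windows\System32\wsmprovhost.exe.config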


Monday, August 02, 2010

More on Using Different Versions of the .NET Framework

In yesterday’s blog post, I wrote about being able to use later versions of the .NET Framework with PowerShell. The trick was simple: create a config file, a small file of XML, to enable the use of, in my case, the .NET Framework Version 4 and the new namespace System.Numerics. That XML file tells PowerShell what version of the .NET Framework to load. You can see this if you display the $PSVersionTable variable. By default, you’ll see something like this:

PSH [C:\foo]: $PSVersionTable

Name                           Value
----                           -----
CLRVersion                     2.0.50727.4200
BuildVersion                   6.0.6002.18111
PSVersion                      2.0
WSManStackVersion              2.0
PSCompatibleVersions           {1.0, 2.0}
SerializationVersion           1.1.0.1
PSRemotingProtocolVersion      2.1

But with the .Config file in place, you’d now see this:

PSH [C:\foo]: $PSVersionTable

Name                           Value
----                           -----
PSVersion                      2.0
PSCompatibleVersions           {1.0, 2.0}
BuildVersion                   6.0.6002.18111
PSRemotingProtocolVersion      2.1
WSManStackVersion              2.0
CLRVersion                     4.0.30319.1
SerializationVersion           1.1.0.1

Another key difference is that when you now load additional .NET assemblies using LoadWithPartialName, you get the later version of the DLL (the exact version loaded depends on what you put into the config file). So loading Windows Forms now looks like this:

 

PSH [C:\foo]: [system.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")

GAC    Version        Location
---    -------        --------
True   v4.0.30319    C:\Windows\Microsoft.Net\assembly\GAC_MSIL\System.Windows.Forms\v4.0_4.0.0.0__b77a5c…

By default, you got this:

PSH [C:\foo]: [system.Reflection.Assembly]::LoadWithPartialName("system.windows.forms")

GAC    Version        Location
---    -------        --------
True   v2.0.50727     C:\Windows\assembly\GAC_MSIL\System.Windows.Forms\2.0.0.0__b77a5c…

It’s pretty easy to use later versions of the .NET Framework. These later versions provide new classes you may find useful.


But beware of doing this in a data centre without testing that your existing scripts are fully compatible with the later versions of the Framework. You shouldn’t have any problems as MS is pretty good about forward compatibility, but it never hurts to be careful.  Just ensure you do thorough testing before rolling this out across the board.


Sunday, August 01, 2010

Using Newer Version(s) of .NET with PowerShell

I’ve been playing around a bit with the latest version of the .NET Framework. There are some pretty cool new classes and namespaces, but one that caught my eye was System.Numerics. This namespace has two neat classes: System.Numerics.BigInteger and System.Numerics.Complex – these represent big integers and complex numbers respectively. To demonstrate the BigInteger class, I’ve written two small scripts: New-BigInteger.ps1 and Get-BigIntegerProperties.ps1 (with more to come!). You can get these and several hundred more PowerShell scripts from my PowerShell Scripts blog. When I first started to develop these scripts – beginning by translating an MSDN sample from C# into PowerShell – I came across some curious errors. In the end, I learned how to call updated versions of the .NET Framework.

By default, PowerShell uses .NET version 2.0. But if you want to use classes implemented in later versions of the Framework (in namespaces that are not loaded by default of course), you first need to load the relevant dll. You can do this as follows:

Add-Type -Path "C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Numerics.dll"

But there’s one problem (by default!) – when you do this, you get the following runtime error:

Add-Type : Could not load file or assembly 'file:///C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Numerics.dll' or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded.
At line:1 char:9
+ Add-Type <<<<  -Path "C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Numerics.dll"
    + CategoryInfo          : NotSpecified: (:) [Add-Type], BadImageFormatException
    + FullyQualifiedErrorId : System.BadImageFormatException,Microsoft.PowerShell.Commands.AddTypeCommand

The solution is pretty simple – just tell PowerShell to use a later version of the CLR. To do this, you need to create a config file, named PowerShell.Exe.Config, located in the same folder as PowerShell.Exe (and another one, PowerShellISE.Exe.Config, for PowerShellISE.Exe). These config files contain a small bit of XML to tell the system which version of the CLR to use. To access the .NET 4.0 versions, use the following:

<?xml version="1.0"?>
<configuration>
    <startup useLegacyV2RuntimeActivationPolicy="true">
        <supportedRuntime version="v4.0.30319"/>
        <supportedRuntime version="v2.0.50727"/>
    </startup>
</configuration>

With this XML created, just restart PowerShell and you can add the System.Numerics.Dll and use the classes in that namespace!
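
As a quick taste of what this unlocks, here is a sketch of my own (not the New-BigInteger.ps1 script itself) using the two new classes:

# Load the .NET 4 assembly - this only works once the CLR 4 config file above is in place
Add-Type -Path "C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Numerics.dll"

# A number far bigger than [int64] can hold
$big = [System.Numerics.BigInteger]::Pow(2, 128)
"$big"

# Arithmetic via the class's static methods
$square = [System.Numerics.BigInteger]::Multiply($big, $big)
"$square"

# And a complex number for good measure
$c = New-Object System.Numerics.Complex 3, 4
$c.Magnitude    # 5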

Tuesday, July 13, 2010

A Quiet Word to the Chinese Comment Spammer

Hi.  Thanks for all your comments, especially all those that contain URLs to adult sites. I appreciate how you follow up nearly every post to this blog with more comment spam. You will notice that none of the comments actually get published – that’s because I review every comment and am rejecting yours. I will continue to reject spam like this, so you might consider not wasting your, and my, time with comments that will never get published (not now and not ever).

[later]

Almost as predicted, you (婉婷) did indeed try to leave a spam comment message. :-( 

Monday, July 12, 2010

PowerShell Needs a New Approved Verb

I’ve been playing around a bit with the System.Speech namespace, in particular the System.Speech.Synthesis.SpeechSynthesizer class. This class allows you to get the speech synthesis engine to speak for you. On my workstation, I have just one voice installed, called Anna. If you look over on my PSHScripts blog, there’s a script to get all of the voices installed on your system.

As you can see from the script, the class is pretty simple, although its assembly is not one that PowerShell loads by default. Once you create a SpeechSynthesizer object, you can then get the installed voices, as the script shows. As you can see, on my system there’s only one installed voice (Anna). The SpeechSynthesizer has another useful method – Speak (well, two, the second being SpeakAsync). These methods enable the SpeechSynthesizer object to speak some text.

I’ve written a couple of scripts that demonstrate these APIs, and I’ll publish them shortly. But in doing so, I realised that the PowerShell Approved Verb list needs a new verb: Speak, which mirrors the Speak method. If you look closely at the approved verb list, there are no verbs relating to a voice modality, which, with the benefit of hindsight, is unfortunate. For Version 3, I think a new verb is needed, which could be Speak (my favourite) or perhaps Say.
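
Until those scripts are up, here is a minimal sketch of the API in use (my own illustration, assuming System.Speech is available – i.e. .NET 3.0 or later is installed):

# System.Speech is not loaded by default, so add the assembly first
Add-Type -AssemblyName System.Speech

$synth = New-Object System.Speech.Synthesis.SpeechSynthesizer

# List the installed voices (just Anna on my system)
$synth.GetInstalledVoices() | ForEach-Object { $_.VoiceInfo.Name }

# And make it talk
$synth.Speak("PowerShell really does need a Speak verb")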

 

Friday, July 02, 2010

PowerShell and XML Element Attributes

I’ve been playing a bit this week with XML and PowerShell. As you no doubt know, PowerShell has first-class XML support built in. To see more about that, see Tobias’s Ebook Chapter on XML and PowerShell. My task this week was to work with attributes that can appear inside an XML tag. I was using the .NET XML class System.Xml.XmlElement and its various attribute-related methods.

An XML Element, as  noted in MSDN, is a node in a DOM (XML) document. These elements can have attributes which you can associate with the element. For example, consider the following XML element:

<book genre='novel' ISBN='1-861001-57-5'>
<title>Pride And Prejudice</title>
</book>

Such an element would normally be part of a much larger collection (e.g. <books></books>), but for the purposes of playing with element attributes, you can load it and then treat it as an XML document with elements (albeit not many). You can load this document like this (and yes, there are a bunch more ways!):

$Doc = New-Object System.Xml.XmlDocument
$Doc.LoadXml("<book genre='novel' ISBN='1-861001-57-5'>" +
             "   <title>Pride And Prejudice</title>" +
             "</book>")

In the XML document, the book element has two attributes, genre and ISBN. Each attribute has the simple format (in the XML) of <attribute name>=<attributevalue>.

Once you load the document, you can do things like:

  • Check whether an element has a particular named attribute
  • Get the value of an attribute
  • Remove an attribute
  • Set an attribute

To do this in PowerShell you would do something like this, e.g. to set an attribute:

$Root = $Doc.DocumentElement
$Root.SetAttribute("attributename","value")

In richer XML scripts the attributename and the value would be held in a variable (that you in turn might have obtained from another XML document).
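
For completeness, here is a quick sketch of the other operations in the list above, run against the same $Doc (these are the standard System.Xml.XmlElement methods):

$Root = $Doc.DocumentElement
$Root.HasAttribute("genre")       # does the element have this attribute? (True)
$Root.GetAttribute("ISBN")        # get its value (1-861001-57-5)
$Root.RemoveAttribute("genre")    # remove it
$Doc.OuterXml                     # the book element, now without the genre attribute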

I’ve written several sample scripts over on the PowerShell scripts blog, which re-implement a number of MSDN attribute handling C# samples:

  • Get-XMLAttribute.ps1 – this script loads the XML then checks to see if the element has an attribute and if so, the code prints out the value of the attribute.
  • Remove-XMLAttribute.ps1 and Remove-XMLAttributeAt.ps1 – these scripts load the XML and then remove the attribute, using different .NET methods (i.e. RemoveAttribute and RemoveAttributeAt). Using RemoveAttributeAt, where you specify the position of the attribute rather than its name, is potentially dangerous. I have the t-shirt on that one! 
  • Set-XMLAttribute.ps1 – this script, as the name might imply, loads the XML and adds an attribute to the element.

Fun stuff!


Wednesday, June 30, 2010

They Called Their Web Site WHAT????

Those nice folks at Angel Internet Press sent me a copy of a new book, “Slurls – They Called Their Website WHAT?!”, which has been keeping me amused. A slurl is a made-up term – joining ‘slur’ and ‘url’. A slurl is a web site URL that can be read in ways other than what the owner probably intended. Slurls come about when someone creates a website name based on their company, but where the words can be read in a much more amusing, and often embarrassing, way! Some of the more amusing slurls are:

The book lists a number more slurls – most of them quite amusing. I still can’t quite work out just how someone really did create all these web sites and didn’t notice the potential gaffe! The author’s website, www.slurls.com, shows the slurl of the day (today it’s google.co.ck, which still has me chuckling) and has discussion forums where you can suggest a slurl and learn of those that have ceased to be. The web site also has screen shots of each site to show it’s not just a made-up URL – for example, this page pointing to an MP3 site: www.mp3shits.com (MP3’s Hits).

A light read, but highly amusing!

Network Monitor V3.4 Ships

Microsoft has just shipped a new version of Network Monitor, one of my favourite network tools. The new version has a slew of new features. MS has reworked the capturing engine to capture on faster networks without losing frames. The parser logic has been updated, giving you the ability to do deeper/slower parsing or shallower/faster parsing – a great feature for fast networks where shallower parsing is acceptable. The UI is also more customisable, something no doubt of value to those who use tools like this every day. For a fuller list of features in the new version, see the beta blog announcement here.

Microsoft also provides ongoing information via the Network Monitor blog: here. Additionally, there’s the Network Monitor support forum. The forums enable you to ask questions about the UI, NMCap, API, parsers and troubleshooting scenarios (and even get answers!).

Sunday, June 20, 2010

PowerShell Plus 3.5 Beta

I use PowerShell Plus pretty much all the time for the development of PowerShell scripts. I have it on both my desktop workstation and my laptop, and I regularly demonstrate it in my classroom teaching. Those nice folks at Idera have just released the beta of the next version, V3.5, of this cool tool; the beta is a free download (although the final product will be commercial and is not free).

The beta shows off the new features in V3.5:

  • Remoting Support
  • Improved Script Sharing
  • Enhancements to the code editor
  • Enhancements to the Learning Center

I have a long boring weekend ahead stuck in a hotel – I will be playing!

Saturday, June 19, 2010

Another Free PowerShell Book

I see the Swiss MS IT Pro Team (i.e. Frank Koch) are at it again, this time with another free PowerShell book. The latest book is entitled Administrative Tasks Using Windows PowerShell, and it’s now available in English (along with a copy of the first e-book, Windows PowerShell). Both are great introductions to PowerShell!
You can get the English versions of both books here. This is a large-ish ZIP file with both PDF and XPS versions of the book, along with a set of sample scripts. For the German speakers, or at least those who can read German, you can get the original versions here.

Later
I read this second document on the plane to the US, and as the comment below says, this turns out not to be such a new book – it is mainly V1 based. There are also a number of errors in translation.

Friday, June 18, 2010

PowerShell Script Provider

Just when you think we’ve seen all that the (awesome) PowerShell community can do, along comes another cool development. The latest coolness comes from Oisin Grehan in the form of a PowerShell Script provider. This is a tool, which you can download from Codeplex, that enables you to write a provider purely in script without the need to do stuff in C#. This is pretty cool!

The project is at version 0.1, with at least 4 more versions planned. As it says on Codeplex, this code is alpha – but knowing Oisin those versions will come quickly. 

Thursday, June 17, 2010

Signing PowerShell Scripts – A Gotcha with ISE!

In some enterprise environments, signing PowerShell scripts and setting an execution policy to only run signed scripts is a useful control mechanism. It can stop less skilled admins ‘fixing’ a script almost correctly and can stop untested scripts from running. Of course, the malign admin can still cut/paste the scripts into the command line and do damage – but that same admin can nuke the registry, reformat a volume, etc. Script signing is just another layer of defence.

The Scripting Guys team (well, actually superstar MVP Ragnar Harper) has written a two-part blog post on the subject of how to do script signing. Part 1 is a useful tutorial on how to set up your own PKI using Windows Server’s built-in certificate services feature (AD CS, as MSFT calls it). With Part 1, you learn how to get your code-signing digital certificate. Part 2 then talks you through how to use that certificate to generate a signed script.
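
The signing step itself boils down to something like the following sketch (it assumes you already have a code-signing certificate in your personal store, per Part 1, and the script path is just an example):

# Pick up a code-signing cert from the current user's store and sign the script
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
Set-AuthenticodeSignature -FilePath C:\foo\MyScript.ps1 -Certificate $cert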

The demo is good and the instructions work well, however there is one small gotcha. If you use PowerShell ISE to edit and save your scripts, the technique shown in Part 2 will fail. Here’s what you will see (from the ISE).

[Screenshot: Set-AuthenticodeSignature in the ISE returning a status of UnknownError]

As you can see, this results in a rather less than helpful “UnknownError”. It turns out the reason is simple: by default, ISE saves scripts in Unicode big-endian format – which Set-AuthenticodeSignature does not cater for. And worse, the ISE Save As dialog does not give you any option to save in a more friendly encoding (i.e. ASCII!).

There are three solutions to this:

1. Use Notepad to re-save the file as ASCII, then sign it. This is suboptimal but it works.

2.  You can get ISE to save as Unicode (not BigEndian) using a small script, unfortunately not from the menus. Just run the following bit of code:

$psise.CurrentFile.Save([system.Text.Encoding]::Unicode)

3. You can get ISE to always save as ASCII. To do this, add the following bit of code to your PowerShell ISE session:

register-objectevent $psise.CurrentPowerShellTab.Files collectionchanged -action {
    # iterate ISEFile objects
    $event.sender | % {
        # set private field which holds default encoding to ASCII
        $_.gettype().getfield("encoding","nonpublic,instance").setvalue($_, [text.encoding]::ascii)
     }
}

Once this fragment is executed, probably by adding it to your profile, scripts get saved as ASCII and can be signed just fine.

Note that, in the above, you can either save the current file as Unicode or auto-save as ASCII. Set-AuthenticodeSignature works with both encodings, just not ISE’s normal default of Unicode big-endian (just change ::ASCII to ::Unicode or vice versa!). Personally, I now save as ASCII. But if you have non-ASCII characters in your scripts, set the default to Unicode!

And a tip of the hat to Oisin Grehan who posted about this on the MVP list and who has posted some of the above code on his blog here.

 

Thursday, June 10, 2010

Communications Server ‘14’ Powershell Blog Up And Running

The Communications Server ‘14’ team have put up a CS and PowerShell blog. You can read it at http://blogs.technet.com/b/csps/. It’s early days, as CS14 was finally revealed to the world (with no NDA!) here at TechEd North America in New Orleans. Speaking to the PowerShell PM, the intention is for the blog to be a one-stop shop for all things CS14 and PowerShell. I look forward to seeing how this evolves!

Friday, June 04, 2010

Training and the Cloud

I’ve been having some discussions with a client about the impact of cloud computing on the IT training business, particularly IT Professional training. If you believe the hype, cloud computing will make all the problems of running your IT suite a thing of the past – so you won’t need any more IT Pros, and therefore no training. Of course, it’s nowhere near as simple as that! I’ve been reading an interesting blog piece by Alan Le Marquand titled From Servers to Services: the Role for the IT Pro in the Cloud. It’s provided some good input into the training question.

Alan first makes the point that 'the cloud’ is actually many things. Cloud computing has evolved into distinct layers, each with its own approaches and IT Pro needs. Le Marquand breaks “the cloud” down into three main layers:

  • Software as a Service (SaaS) – here you buy the software hosted by a supplier. For example, an organisation can use one of many hosted Exchange solutions for email. For this layer, the hardware, OS layers and application pieces are, for the most part, gone, so IT Pros no longer need to worry about them. Of course, at the application layer there is still a need to perform management and provisioning functions, such as adding or removing email accounts as employees come and go. And since you are paying for the application, you need to monitor your SLA, possibly differently than today, as well as manage your supplier.
  • Platform as a Service (PaaS) – here the hardware and OS are provided by a supplier but you provide the application – Microsoft’s Azure is an example of this cloud layer. With PaaS, the OS and hardware are primarily managed by the supplier, but you manage the application that sits on top. This layer still requires IT Pros to handle all the application management functions, plus the ability to manage the platform in the cloud. 
  • Infrastructure as a Service (IaaS) – here you just get the hardware provided by the supplier and you deal with the OS, application and everything above. To some degree, this is really not much different from today in terms of the skills that IT Pros need – the key difference being where you put the hardware and how you scale it out (or not!). IT Pros still need to know how to deploy the OS and the applications and manage them. Only the scale is different – you still need to patch, troubleshoot, etc.

The role that the IT Professional plays in each of these layers differs, in some cases significantly, from today.  And of course, this means training needs differ too. All three layers require you to consider end-user/administration training in how to use and get the most out of the applications. PaaS and IaaS also require you to know how to develop and deploy your application solution. And finally, IaaS requires you to have broadly the same deployment skills as today since most IT Pros start their tasks once the hardware is in place.

The key first step all organisations considering the cloud need to take is to get a good understanding of how to buy, as well as deploy and manage, cloud services. Organisations are being led to believe by the suppliers that the cloud is the answer. That may ultimately be the case, but not all suppliers are equal and not all offerings are the same. So companies need to understand how to go about buying the services and then how to deploy them. For IT Pros, that deployment may be a challenge, as there are new issues to consider. Deploying applications to hundreds or thousands of servers requires automation skills beyond those you might have needed if you just use the GUI to manage one server.

The suppliers of PaaS and IaaS in particular also need to deliver or facilitate training in their offerings. While SQL on Azure may be almost like SQL on your own server, there are differences. Also, deploying and managing larger numbers of servers will require new approaches to the tasks.

For organisations that are considering cloud computing, I’d recommend spending some time to think through just which cloud layers you are considering and the impact those have on your IT staffing. Then you should start to do some training needs analysis. The cloud offers organisations of all sizes some advantages and, as Alan points out, IT Pros, and hence the training, will certainly adapt to fit the cloud model. But there is still some thinking and planning to be done before leaping off into cloud-land!


SQLIse – A PowerShell SQL Server Query Tool

I am doing some work at the moment building SQL PowerShell training for an upcoming PowerShell Master Class. In my searching, I came across a small project being done by Chad Miller, called SQLIse. As Chad describes in his blog, SQLIse is an ISE add-on that provides “a basic IDE for T-SQL that includes the ability to edit, execute, parse and format SQL code from within PowerShell ISE”.

SQLIse is part of a larger SQL PowerShell project called SQL Extensions for PowerShell, or SQLPSX. Having played a bit with both extensions, they are pretty cool and can certainly help IT Pros who have to deal with SQL. Chad has even created a short video to demonstrate SQLIse – get this at YouTube: http://www.youtube.com/v/1KcNSHn7oTA&hl=en.

SQLIse has two pre-requisites: you need to have the PowerShell Pack installed, and you need the SQLPSX extensions loaded. You can get the PowerShellPack from the MSDN Code Gallery.

One issue I faced was that the SQLIse installation process was not seamless or easy. First, in order to get this running, I needed two uber-modules (SQLPSX and PowerShellPack), and these come from different places. Second, the installation process requires you to run two installer programs, SQLPSX_Install and PowerShellPack.MSI.

I find this somewhat contrary to the spirit of modules in PowerShell V2, in that modules should be deployable using only Xcopy. In my case, I did not want either uber-module installed in the personal modules folder, but in the system modules folder – and the installer(s) gave me no option. More importantly, while the SQLPSX installation program seemed to run, it left the module folder empty. A bit of hacking (and running streams.exe across the expanded file set!) enabled me to get the module installed. But sadly the hacking did not work well – some of the features do not work. More hacking, I suspect, is needed.

In summary, a great feature let down by the complex installation process.

Wednesday, June 02, 2010

PowerShell Admin Module – A Follow Up

In a recent blog article, I wrote about a new Codeplex project called PowerShell Admin Modules (PSAM), being developed by super-star MVP Richard Siddaway, and noted two small things about the module. Well, today Richard wrote to say he’d fixed Get-Share to accept wildcards and had changed the order of the parameters of New-Share to match those of the venerable net share command.

He also mentions that the latest version, 0.2, has a bunch of functions for dealing with binary and hex numbers, including:

  • ConvertTo-Binary
  • ConvertTo-Decimal
  • ConvertTo-Hex
  • Get-BinaryAND
  • Get-BinaryDifference
  • Get-BinaryOR
  • Get-BinarySum
  • Get-BinaryXOR
  • Get-HexDifference
  • Get-HexSum
  • Test-Binary
  • Test-Hex

This is nice work – although one thought would be to merge the functions of PSAM into the PowerShell Community Extensions.

Tuesday, June 01, 2010

The PowerShell Guy Has Returned

Marc, aka The PowerShell Guy, is back. In his blog (now back on air), Marc recounts a tale that is all too familiar to many of us: IT problems combined with the demands of a real life. But thanks to those very nice folks over at OrcsWeb, Marc’s site and his many outstanding contributions are back online.

Welcome back to the online world Marc!

Monday, May 31, 2010

Another Reason Why PowerShell Matters

I’ve just been reading an interesting blog article on PeetersOnline about fixing a DCOM issue that affects some machines. The issue results in event log entries with DCOM Event 10005 – ‘Service cannot be started’. At least one common cause of this is disabling the Remote Storage Manager. The solution involves some manual registry editing, and that is something I really do NOT like doing on production servers. 

In PeetersOnline, the author presents a nice PowerShell script to fix the issue. The script opens the remote registry and makes the necessary fixes. The script is parameterised, and therefore suited to larger organisations, but more importantly it seems to handle the key error situations (keys not being found, not being able to connect to the remote registry). You could go further and create a troubleshooter using the Windows SDK to enable remote admins to do the job even more simply (and perhaps both avoid doing it where not needed and ensure it’s done properly where it is).
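
The underlying technique is worth knowing in its own right. Here is a bare-bones sketch of remote registry access from PowerShell – not the PeetersOnline script itself; the computer name is a placeholder and I am reading a harmless value rather than applying the DCOM fix:

$computer = 'Server1'    # placeholder - needs the Remote Registry service running

# Open HKLM on the remote machine and read a value
$reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $computer)
$key = $reg.OpenSubKey('SOFTWARE\Microsoft\Windows NT\CurrentVersion')
$key.GetValue('ProductName')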

This script shows how PowerShell can be used both to automate a solution and, at the same time, to document exactly what the fix is. It also provides a measure of protection from the admin who can’t type well! The reliable repetition of accurate instructions is key to improving availability. It is things like this that make PowerShell so important.


Sunday, May 30, 2010

Awesome script

I just saw a neat post over on twitter regarding Beyond Export-Csv: Export-Xls 

This is an awesome script, and one I've put into my profile already!! It might be a good addition to PowerShell Community Extensions.

Sunday, May 23, 2010

PowerShell Admin Module

I’ve been playing a bit with a Codeplex project called PowerShell Admin Modules (PSAM). Developed by Richard Siddaway, PSAM is intended to supply a number of PowerShell modules for use by IT Pros. The first of these modules contains 6 functions that work with shares, as follows:

  • Get-Share
  • Get-ShareAccessMask
  • Get-ShareSecurity
  • New-Share
  • Remove-Share
  • Set-Share

Using this module is easy. First download the zip file from Codeplex and expand the contents into your modules folder. Then import the module using Import-Module. It looks like this:

PSH [C:\foo]: import-module PAMSHARES
PSH [C:\foo]: get-command * -module pamshares

CommandType     Name                                                Definition
-----------     ----                                                ----------
Function        Get-Share                                           ...
Function        Get-ShareAccessMask                                 ...
Function        Get-ShareSecurity                                   ...
Function        New-Share                                           ...
Function        Remove-Share                                        ...
Function        Set-Share                                           ...

PSH [C:\foo]: new-share c:\foo foo2
Share foo2 was created
PSH [C:\foo]: get-share foo2

Name                                    Path                                    Description
----                                    ----                                    -----------
foo2                                    c:\foo

PSH [C:\foo]: remove-share foo2


The module works well enough, although Get-Share does not seem to support wildcards and the order of the parameters in New-Share is different from that of net share (for those of us who even remember net share!). I’ve posted a fix to the wildcard issue on CodePlex. For the other issue, one could just change the order of the parameters in New-Share. This is the joy of open source projects like this – you can fix ‘bugs’ and make your own changes as you see fit.

Friday, May 14, 2010

PowerShell Community Extensions (PSCX) 2.0 Released

A new version of the PowerShell Community Extensions (PSCX), version 2.0, has been released and is available from Codeplex (http://pscx.codeplex.com/releases/45101/download/121340). This free set of PowerShell extensions provides a number of very useful enhancements, particularly new/added cmdlets.

The main purpose behind the 2.0 release was to migrate the code from being a snap-in to being a module. As it’s a module, deployment is really simple – just Xcopy the files into your module folder and use Import-Module to load the module (typically you just put the Import-Module call into your $profile). You can, of course, put the module somewhere else and specify the full path when using the Import-Module cmdlet.
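
Deployment is therefore just a copy and an import – something like the sketch below (the download path, and the assumption that the expanded folder is called Pscx, are mine):

# Copy the expanded PSCX folder into your per-user modules folder, then load it
Copy-Item -Recurse C:\Downloads\Pscx "$home\Documents\WindowsPowerShell\Modules\Pscx"
Import-Module Pscx
Get-Command -Module Pscx | Measure-Object    # see how many commands you just gained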

PSCX 2.0 also includes some new cmdlets, including:

  • ConvertTo-Metric
  • Get-AlternateDataStream
  • Test-AlternateDataStream
  • Remove-AlternateDataStream
  • Unblock-File
  • Get-LoremIpsum (lorem)
  • Get-TypeName (gtn)
  • Get-Uptime
  • Get-FileTail (tail)

This package is a great set of very useful extensions. Being module based, deployment is much simplified. I commend this release to all PowerShell users!

Wednesday, May 05, 2010

MSDN Webcast: Powershell for Data Professionals Tomorrow (May 5th)

I just got news of an interesting SQL PowerShell webcast coming up tomorrow. In the webcast, Microsoft SQL Server guru Aaron Nelson is going to look at how you perform everyday database administrator tasks using PowerShell. He’ll cover backing up user databases, scripting table objects, and evaluating disk space usage. The idea is to demonstrate how PowerShell scripts can be used to automate SQL Server activities.

Sign up for the web cast here: MSDN Webcast: geekSpeak: Powershell for Data Professionals (Level 200). I’ll try to post details of the recordings after the event for those that can’t make it.


Thursday, April 22, 2010

Digicert - Utility for Managing SSL Certs

Most of the readers of this blog will have used, and possibly set up, SSL on a web site. If you really, really know what you are doing and do it all the time, managing digital certs is relatively straightforward. But for over-worked admins who DON'T do this daily, dealing with things like intermediate CAs, keys, etc. is just a mind-bending experience. It's also easy to make mistakes (for example, adding a root CA cert to your personal store instead of the computer store).

Digicert, who sell digital certificates, have created a new and free certificate management utility. This utility provides a number of useful features including:

  • See all the SSL certificates installed on a server.
  • View details for all certificates.
  • Import and export certificates, either as a backup or to copy/move certs between servers.
  • Test a certificate.

You can download this utility from Digicert’s site for free – and you run it on the server you wish to test. Here’s what the UI looks like:

[Screenshot: the DigiCert certificate utility UI]

If you are using services that require digital certs, such as Communications Server (aka OCS), then this tool may well be very useful in helping to resolve certificate issues on your OCS/CS systems.

Wednesday, April 21, 2010

PowerShell Cmdlets Search via Bing

Microsoft this week released a new feature in the Bing search engine – a visual search of PowerShell cmdlets. This is part of the Visual Search feature of Bing, which allows you to search using visual images rather than just text. When you fire up Bing, you will now see a “Visual Search” link; you can then click through to see the PowerShell cmdlets. Sadly, this is currently available only in the ‘en-us’ version of Bing, but it looks like this:

[Screenshot: the PowerShell cmdlets visual search gallery in Bing]

 

Along the left, you can see a number of categories (Top 12 Cmdlets, WMI Cmdlets, etc.) as well as some ways to narrow down your search – I especially like the ‘Introduced in Version’ and ‘Remoting Uses’ options. As you can see in the above graphic, when you click on a cmdlet you get more help information along the right, including basic cmdlet information and several links to more detailed help documentation. Although you can’t see it from the graphic above, this page seems to be generated using WPF, so you get some pretty neat effects when you click on the left-hand pane of this page!

The PowerShell cmdlets search feature is available on the US version of Bing only. Unlike Google, there seems to be no way to invoke country-specific versions of Bing – you get what Bing works out to be your local variant. But with a little hacking, you can indeed get to the above page. The rather unfriendly URL is: http://www.bing.com/visualsearch?mkt=en-us&g=powershell_cmdlets&FORM=SGEWEB&qpvt=powershell#remoting=1&r=1. Alternatively, you can start at the Bing US home page and drill down from there (http://www.bing.com/visualsearch?mkt=en-us).

There will also be more cmdlets documented this way in future. I have no idea what the timetable is, either for wider distribution of this pretty cool feature or for when we’ll see more cmdlets being documented.


Saturday, April 17, 2010

PowerShell Profile Files

PowerShell defines some special script files, called profiles, that you can use to customise and configure a PowerShell session, whether you are using PowerShell.exe, PowerShellISE.exe or a customised host (e.g. the Exchange Management Shell). The neat thing about a profile is that it runs, at start-up of every PowerShell host, in dot-sourced mode. Thus functions and variables defined in the profile persist in your shell.

For each PowerShell Host, you have up to four potential profiles:

  • AllUsersAllHosts
    • C:\Windows\System32\WindowsPowerShell\v1.0\profile.ps1
  • AllUsersCurrentHost
    • C:\Windows\System32\WindowsPowerShell\v1.0\Microsoft.PowerShellISE_profile.ps1 (PowerShellISE.exe), or
    • C:\Windows\System32\WindowsPowerShell\v1.0\Microsoft.PowerShell_profile.ps1 (PowerShell.exe)
  • CurrentUserAllHosts
    • C:\Users\<username>\Documents\WindowsPowerShell\profile.ps1
  • CurrentUserCurrentHost
    • C:\Users\tfl\Documents\WindowsPowerShell\Microsoft.PowerShellISE_profile.ps1 (PowerShellISE.exe), or
    • C:\Users\tfl\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1 (PowerShell.exe).

The two CurrentHost files vary depending on which host you are running; the list above shows the profile files for both PowerShell and the PowerShell ISE. The profile files (obviously, one of each of the four!) run in the order noted above. This means an admin could, for example, define some functions in the AllUsersAllHosts profile that you can then override in your CurrentUser profiles.
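
To illustrate the ordering, here is a trivial sketch – the function name is invented, but the point is that when the same function is defined in two profiles, the later, more specific profile wins because it runs last:

# In the AllUsersAllHosts profile (C:\Windows\System32\WindowsPowerShell\v1.0\profile.ps1):
function Get-Greeting { 'Hello from the all-users profile' }

# In the CurrentUserCurrentHost profile (Microsoft.PowerShell_profile.ps1 in your
# WindowsPowerShell folder) - this runs last, so this is the definition you get:
function Get-Greeting { 'Hello from my own profile' }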

As it turns out, there’s a nice PowerShell one-liner you can run to return your host’s profile files (and it can even tell you whether each file exists!). This one-liner looks like this:

$profile | gm *Host* | % { $_.Name } | % { $p = @{}; $p.Name = $_; $p.Path = $profile.$_; $p.Exists = (Test-Path $profile.$_); New-Object PSObject -Property $p } | Format-Table -AutoSize

I’ve posted a similar script over on my PowerShell Scripts blog. That version uses Format-List (to look a bit better on the blog); you might prefer the Format-Table in the one-liner above.

 


Wednesday, March 24, 2010

PowerShell Master Class – More Sessions

I’ve had a gratifying response to my first PowerShell Master Class event. Like all first-time runs, you learn some things by running the event – one of which is that three days is not long enough, so future events are now four days. Not that I needed to learn it, but the last run shows that, given how vast V2 is, there is stuff that even with four days we still can’t cover. So I am planning an advanced seminar for later in the year – watch this space.

In the mean time, we have two more sessions currently planned:

I’m also discussing a further session in Stockholm in September, along with a more advanced workshop there in the same month.

Master Class outline

The class outline is as follows:

Day 1 – The Basics of PowerShell

  • PowerShell Fundamentals – the key elements of PowerShell, including installation, setup and profiles
  • Discovery – finding your way and learning how to discover more
  • Formatting – how to format output nicely

Day 2 – From the command line to the script

  • Remoting – working with remote systems and PowerShell’s remoting feature.
  • Providers – getting into OS data stores
  • Scripting Concepts – automating everyday tasks including language constructs, error handling and debugging

Day 3 – Practical PowerShell

  • Modules – managing PowerShell in the enterprise
  • .NET/WMI/COM Objects – working with objects of all kinds, including WMI, COM, .NET and your own custom objects
  • PowerShell and Windows Client/Server – how you can use built in PowerShell cmdlets and providers included with Windows 7 and Windows Server 2008 R2

Day 4 – Applying PowerShell 

  • PowerShell in Key Microsoft Servers - a look at PowerShell today in SQL, Exchange, SharePoint 2010, SCVMM/HyperV and CS 2010.
  • Taking it to the Next Level – stuff to do later!