Thursday, October 30, 2008

Creating an NLB Cluster using Windows Server 2008

As noted in an earlier blog post, I’m writing an article on Server Clustering for Server Management Magazine. In the article I discuss setting up an NLB cluster. This blog post shows the steps in more detail, including screen shots of each step.

Creating an NLB Cluster

There are three key steps to creating an NLB cluster in Windows Server 2008:

  1. Install the NLB Feature into Server 2008 on each host that you will add to an NLB cluster.
  2. Use the New Cluster wizard to create the cluster and add the first host.
  3. Use the Add Host wizard to add one or more nodes to your cluster.

You also have to set up the application that you want to cluster. For the article, I created a very simple ASP.NET application, which I documented in an earlier blog post.

Installing NLB in Server 2008

This is extremely easy. First, run Server Manager, select Features in the tree, and click Add Features:

[screenshot]

From the Add Features Wizard, just select Network Load Balancing and click Next:

[screenshot]

Finally, from the Confirm Installation Selections page, click Install:

[screenshot]

After a few seconds, NLB will be installed on the first host. You need to repeat this installation on each host you plan to add to your NLB Cluster. Installing the NLB feature does NOT require a reboot, but removing it does.
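Incidentally, if you have several hosts to prepare, you don’t have to click through Server Manager on each one – Server 2008 also lets you install features from the command line. A minimal sketch (run from an elevated prompt on each host; I believe the feature ID is nlb, but check the output of servermanagercmd.exe -query if not):

# Install the NLB feature from the command line
# (feature ID assumed to be nlb - verify with: servermanagercmd.exe -query)
servermanagercmd.exe -install nlb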

Creating your NLB Cluster

Once NLB is installed, you create your cluster by first bringing up the NLB Cluster Manager:

[screenshot]

Then right-click the Network Load Balancing Clusters node in the left pane and select New Cluster:

[screenshot]

This brings up the first page of the New Cluster Wizard – here you select the first member of your cluster by entering its IP address or DNS name in the Host box:

[screenshot]

This shows you the interfaces on this host that you can use for the new cluster. Choose the interface and click Next:

[screenshot]

This brings up the Host Parameters page, where you set this host’s priority and its dedicated IP address, then click Next:

 

[screenshot]

This brings up the Cluster IP Addresses page. Here you add the address used by clients to connect to the cluster:

[screenshot]

After adding the IP address and clicking Next, the New Cluster Wizard displays the Cluster Parameters page, where you specify the cluster IP configuration and the cluster operation mode (and click Next):

[screenshot]

Finally, the wizard displays the port rules. In this case, we specify that the cluster should handle TCP port 80 (and click Finish):

[screenshot]

The wizard then does the necessary configuration, resulting in a single-node NLB cluster, shown in NLB Manager like this:

[screenshot]

At this point, you can navigate to the cluster to see the application running:

[screenshot]

 
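You don’t have to use a browser to test this, either – a couple of lines of PowerShell on any client will fetch the page too. This is just a sketch; the URL below is a placeholder for your own cluster’s IP address or DNS name:

# Placeholder address - substitute your cluster's IP address or DNS name
$url = "http://192.168.1.100/default.aspx"
$wc  = New-Object System.Net.WebClient
# Fetch the page a few times; the returned HTML shows which node answered
1..5 | ForEach-Object { $wc.DownloadString($url) }

(Note that with the default affinity setting, repeated requests from the same client may well stick to the same node.)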

Adding Additional Nodes

A single-node cluster is not of much value, so you next need to add an additional node (or nodes). To add a node, right-click your newly created cluster and select Add Host To Cluster:

[screenshot]

This brings up the Add Host to Cluster: Connect page, where you specify the host to add and the interface to be used in the cluster:

[screenshot]

Next you specify the new host’s parameters:

[screenshot]

Then you can update, if needed, the cluster’s port rules:

[screenshot]

Clicking Next completes the wizard, resulting in a second host in the cluster, as seen in NLB Manager:

[screenshot]

Navigating to the cluster again in your browser may (or may not) result in a different page. While running the wizard to create the screen shots shown here, it did change:

[screenshot]

 
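Once the cluster is up, you can also peek at it from PowerShell via NLB’s WMI provider. Treat this as a sketch – the namespace is, I believe, root\MicrosoftNLB, but verify the class names on your own hosts:

# NLB's WMI provider - namespace and class names assumed; verify locally
Get-WmiObject -Namespace root\MicrosoftNLB -Class MicrosoftNLB_Cluster
Get-WmiObject -Namespace root\MicrosoftNLB -Class MicrosoftNLB_Node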

So – it’s simple and easy to create an NLB cluster in Server 2008!

Testing Server 2008 NLB clusters

I am working on an article for Server Management magazine, due out in December, focusing on clustering. One aspect of this article covers NLB clusters in Server 2008. To test this, I wrote a very simple ASP.NET application.

Here’s the default.aspx web page:

<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="myip._Default" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server"><title>NLB Test Tool on Node 1</title></head>
<body>
  <form id="form1" runat="server">
    <div></div>
    <p>NLB Test Tool</p>
    <p></p>
    <p>The IP address of this server is:</p>
    <asp:Label ID="Label1" runat="server"></asp:Label>
  </form>
</body>
</html>

As you can see, this ASP.NET page has some code-behind:

using System;
using System.Net;

namespace myip
{
    public partial class _Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Get this host's name, then resolve it to its IP address list.
            // (Dns.GetHostByName is deprecated in later .NET versions;
            // Dns.GetHostEntry is the modern equivalent.)
            string strHostName = System.Net.Dns.GetHostName();
            IPHostEntry ipEntry = System.Net.Dns.GetHostByName(strHostName);
            IPAddress[] addr = ipEntry.AddressList;

            // Display the first address - on an NLB node, this identifies the host
            Label1.Text = addr[0].ToString();
        }
    }
}

I loaded this onto two NLB cluster hosts, changing the HTML title on each host to reflect the host name (the HTML above is the version used on Node 1). When I then navigated to the page, I saw this:

[screenshot]

As you can see, in this case the cluster served up the page from Node 2. Depending on which underlying host serviced the request, I saw a different title and a different IP address.

Thanks to http://www.codeproject.com/KB/cs/network.aspx and Naveen K Kohli for the important parts of the C# code.


574 Reasons for MS to be Proud and Optimistic About W7 and WS08R2

In a really neat blog post, PowerShell Team Blog: 574 Reasons Why We Are So Proud and Optimistic About W7 and WS08R2, Jeffrey Snover continues the PowerShell 2.0 tease. In this post, he looks at the wide range of cmdlets that Microsoft plan to ship with PowerShell V2.
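If you want to see how that number compares with what’s on your own machine, a quick one-liner counts the cmdlets available in the current session:

# How many cmdlets does this PowerShell session know about?
Get-Command -CommandType Cmdlet | Measure-Object

A vanilla PowerShell V1 install ships 129 cmdlets, which puts the 574 figure into perspective.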

To some degree, this is not very exciting news. But only because this just meets MY expectations. However, in saying this, you should know that my expectations were set to a very high bar years ago – this latest blog post merely confirms what I’ve always believed. Namely, that with a good team, PowerShell could become a truly great product.

For IT professionals, Windows admins, and general power users, this post is another in a long line of hints that you really should get to know PowerShell much better. If you can’t pick it up and learn it on your own (like some of us old timers who had to since there was no training in those earlier days), then come on a training course. In EMEA, I run a nice selection of PowerShell classes, including the Microsoft 6434 course.

The 6434 is an excellent course, but I suppose that, as I helped to write it, I would say that. It’s three days long and packed. Day 1 looks at what I call the holy trinity – cmdlets, objects and the pipeline – and a whole lot more. Day 2 moves into how you use these things to write production scripts. Day 3 then looks at three key technologies (WMI, ADSI/.NET and COM) and how PowerShell can use them to manage Windows Server 2008 and other servers.
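To give a flavour of that holy trinity, here is the sort of one-liner the course builds towards – cmdlets emit objects, and those objects flow along the pipeline to be filtered, sorted and selected:

# Cmdlets, objects and the pipeline: the five busiest processes by handle count
Get-Process |
    Where-Object { $_.Handles -gt 500 } |
    Sort-Object Handles -Descending |
    Select-Object -First 5

# And a small taste of Day 3: WMI from PowerShell
Get-WmiObject Win32_OperatingSystem | Select-Object Caption, Version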

In closing, I just wish Jeffrey would stop teasing, and just release some code. We’ve not had a new PowerShell build for what seems like forever and I’m just anxious to get my hands on all this wonderfulness. If the PowerShell team is true to form, there just may be a nice surprise next week in Barcelona. I sure hope so!

Tuesday, October 21, 2008

Hyper-V on Server Core vs. VMware ESXi

With Microsoft having released Hyper-V (for free) and VMware having done the same, it was always a matter of time before someone did head-to-head tests on the two free hypervisors. And in fact, the ICTFreak blog has done just that. In the post VMware: Hyper-V on Server Core vs ESXi, we get a comparison of how easy it is to install VMware ESXi and Hyper-V. The post also has a couple of videos that show the process.

Before I read the piece, I would have expected the results to favour Redmond. But no – VMware required 1/7th the number of reboots, significantly fewer keystrokes (85 vs. 684) and fewer mouse clicks (14 vs. 30). Installing VMware ESXi required 7 steps and took just over 10 minutes, whereas Hyper-V took 37:27 and required 13 steps. The hands-down winner was VMware.

To be fair to Redmond and Hyper-V, part of this gaping difference is due more to the weaknesses of Windows 2008 Server Core’s UI than to the hypervisor itself. The bare UI offered by Server Core, plus the absence of decent tools such as PowerShell, means that the whole configuration process was a lot harder. As the article says, with Server Core you would need to “key in a long sequence of obscure commands to configure iSCSI initiators and targets, partitions and file systems.” A fairer test might be how long it takes on a full install of Windows 2008.
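To give a flavour of what those “obscure commands” look like, here is the sort of thing you end up typing at the bare Server Core command prompt just to set a static IP address and then enable the Hyper-V role (the values are placeholders, and note the ocsetup package name is case sensitive):

netsh interface ipv4 set address name="Local Area Connection" source=static address=192.168.1.10 mask=255.255.255.0 gateway=192.168.1.1
start /w ocsetup Microsoft-Hyper-V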

Of course, installation is only one part of the puzzle and hopefully you do it just once (per server). The extra half an hour per server to install Hyper-V may not be a huge problem, but it sure doesn’t help in the hearts and minds battle.

This result shows Redmond has a lot of work to do to make the installation of Hyper-V better, as well as work to do to make Server Core more admin-friendly. Both are, I would hope, work items for the next version of Windows, due out in a few years (if the four-year gap between versions holds, that would be 2011). Maybe some of this might be done in time for the interim release (Server 2008 R2), which should hit the shelves in 2009 (or more likely 2010) based on the two-year gap between major and minor versions. We’ll see.


Sunday, October 12, 2008

Under The Stairs – Access by Browser (IE’s Share Drops, Firefox’s Gains)

As I have mentioned in the past, I watch the stats for this blog using Site Meter, which can show me access by browser type. Looking at the browser share report today, I saw this very interesting graph:

[screenshot of the browser share graph]

Firefox now scores higher than IE!

I suspect this may have more to do with my audience, but this is the first time I’ve seen Firefox beating IE. I also suspect that there may be errors in the way Site Meter counts – there are no numbers for Firefox 2 or 3, or for Chrome, but there are (small) numbers for IE8.

Interesting.

 

Symantec’s Vista User Account Control Tool – UAC done right

UAC is a feature of Vista that I long ago learned to dislike. Recently, I’ve been playing around a bit with Symantec’s Vista User Account Control tool. Currently in beta (and available as a free download), it effectively replaces Vista’s UAC prompting so as to make your system both more secure and more user-friendly.

UAC, as a concept, is a great idea. In most cases, you really don’t need high privileges to do most day-to-day jobs. Of course, it depends on your job, but for most of the folks out there for whom their PC is just a business tool, UAC makes sense. While the concept is great, the implementation by Microsoft is so awful that many users (myself included) have simply turned it off. I recognise the risk – but I do enough “privileged” things that UAC was more of a hindrance than a help. Like a lot of IT pros who know what they are doing better than some mindless individual back at Redmond, I just said no to UAC.

Symantec’s tool, on the other hand, offers some nice features that would make UAC tolerable for many – it’s good enough that I might actually buy it for my home machines. You can compare Vista’s and Symantec’s UAC features over on the ZD site: http://content.zdnet.com/2346-12554_22-240379.html

The key feature that, for me, makes Symantec’s product acceptable (and MS’s version so unacceptable) is the “Don’t ask me again” feature – it stops the UAC prompt coming up when you repeatedly do things that would otherwise trigger UAC. If you are constantly doing some action that UAC doesn’t like, Vista prompts you each time, and there’s no way around it. In other words, Symantec’s UAC learns what is ‘normal’ and simply prompts less – making the prompts that do come up more useful in the long term.

The Symantec tool also provides more information about what triggered the alert. The way MS has implemented UAC, most users just get conditioned to click Yes and allow the action. All in all, with Symantec’s tool, users can make more informed decisions on those few(er) occasions when they get nagged, and are less conditioned to just click Yes!

The product is currently in beta. One aspect to be aware of is that the beta will send information back to Symantec’s labs about the tool’s usage. A FAQ on the Symantec web site says this information “contains file name and file hashes for the EXE that caused the prompt and the EXE that is to be the recipient of the elevated privileges. In addition, the meta information contains file name and file hashes for DLLs that were active in either of the two EXEs, response information (what option did the user choose, how quickly, and did they choose "do not ask me again"), and date/time info.”

So well done, Symantec! Hopefully MS will try to catch up in this area in time for Windows 8. In the meantime, if you like the UAC concept but hate how MS have done it – take a look at Symantec’s offering.


Sunday, October 05, 2008

MSDN/TechNet Sites are back up!

Yesterday, I noted that Microsoft’s TechNet and MSDN sites were unreachable for me. Today, they seem to be back up. I’m not sure what the outage was due to, but they’re live now.

I hate long (e.g. one day or more) outages where there seems to be no warning and no explanation. MSDN/TechNet site management needs a Microsoft Operations Framework course in Change Management.


Saturday, October 04, 2008

PowerShell Security – why MS do it the way they do

Early on in the life of PowerShell, the PowerShell development team “got” security. I don’t mean to suggest that the team members themselves didn’t get security; what I mean is that the team, and the product, got it. And while I hated some of the implications of that, it was the right decision.

When I was first confronted with what has become PowerShell’s security model, I was not impressed – it slowed me down and, so I felt at the time, was little more than a speed bump. But the more I thought about it, and listened to some pretty smart guys explain it in the private newsgroups, the more I realised they got it and got it well. It’s needed, and the team have done well.

As described over on the PowerShell Team Blog, the product’s security model will not prevent all risks – it won’t stop 3rd world hunger, it won’t make you attractive to the opposite sex, and it won’t stop us all getting older. But what it does do, and does well, is to mitigate the risks it can – read the article to see what I mean. And what the security model does, to my way of thinking, is to show MS’s security vision, Trustworthy Computing, in a great light.
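To pick one concrete example of that mitigation: out of the box, PowerShell will not run scripts at all, and you have to make a conscious decision to loosen the policy:

# The default execution policy is Restricted - no scripts run at all
Get-ExecutionPolicy

# A common compromise: local scripts run freely, downloaded ones must be signed
Set-ExecutionPolicy RemoteSigned

That default is mildly irritating the first time you hit it, but it means a script can never run on a machine simply because someone double-clicked it.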

Of course, no software product can do the things I mentioned above. But a tool as powerful as PowerShell can, in the wrong hands, do big damage. The first step in designing any new product (e.g. PowerShell) is to have a very clear idea of the threat model: who will do bad things with the product. Not only does the post explain PowerShell’s threat model, it also explains how the team have tackled it.

For anyone looking at PowerShell, I urge you to read the blog post carefully. And perhaps read it several times. There is incredibly good thinking there, IMHO. And if you can find either additions to the threat model, or a better idea of how to address it, I’d sure like to hear it! No doubt the team would too!

Thanks to Lee Holmes for yet another great blog post.

[later]

And thanks, Steve, for pointing out who the author of the post really was – my bad! Oh, and I managed to accidentally nuke your comment too – sorry.


The PowerPoint world lost a good guy today

The PowerPoint world lost a good guy today (well a few days ago actually). Brian was a friend of mine. He was one of the first MVPs I ever met. He was outstanding in his field, and a real gentleman. The MVP programme is poorer for his loss.

RIP Brian Reilly.

Technet/MSDN sites seem down

It may be me, but I can’t get to http://msdn.microsoft.com or to http://technet.microsoft.com. Trying the former sends me to: http://msdn.microsoft.com/Message-Error.htm?aspxerrorpath=/en-gb/default.aspx, whereas the latter sends me to: http://technet.microsoft.com/Message-Error.htm?aspxerrorpath=/en-gb/default.aspx

I’ve pinged a UK contact, but it’s Saturday so she’s probably doing sensible things like not working <grin>.

This may just be another maintenance window, but I needed something from the site today and it’s down. :-(

[later]

Apparently, these pages are OK when accessed from Germany (thanks for the comment) but are still broken from here. I just tried on another system and got the same results. :-(

Microsoft Doesn't Matter Anymore

The title is not my idea – it comes from a Salon article: Microsoft doesn't matter anymore. The article talks about Microsoft’s latest attempt to improve the popularity of its Live Search site. The article, overall, is pretty scathing about MS, and personally I don’t agree with at least some of its conclusions. But I suppose that’s to be expected, as I make a living working in the MS eco-sphere and I truly like many of the products MS makes and sells.

I do agree with the article’s suggestion that this is a really lame marketing campaign. Or perhaps I should say another lame marketing campaign (Seinfeld/Gates, etc.). While Salon is right to point out the lameness, there’s another, even lamer aspect: you have to use IE6. I use Firefox as my browser for some good reasons and am certainly not going to change just to win cheap, tacky prizes.

I have used both Google and Live Search for many years – and have both of them installed in my browser. Pretty much every time I change my default to Live Search, I end up going back to Google. The reason is really simple: Live Search in general produces less relevant results. As a writer and trainer, I rely on web searching – relevance really matters to me. I’d switch over to using Live Search in a heartbeat if its results were better.

There is one area, however, where I do like Live Search – using it to search for typos. I am a moderator for the MSDN and TechNet wikis, and from time to time I find a typo that I want to get cleared up. If I search for “Micorsoft”, for example, Live Search does just that, whereas Google has a tendency to assume I meant Microsoft and goes on to search for that.

Something I disagree with the article over is the Zune. Now, maybe it’s because I’ve never owned an iPod, but I like my Zune. I have the 80GB red Zune and I love it. I’ve got a great set of Shure earbuds and I take the device everywhere. I subscribe to several podcasts (including The Deadpod, at http://deadshow.blogspot.com), and I carry my Zune with me whenever I travel!

As a small aside, and as another example of Google vs. Live Search relevance, try searching for “dead pod” in Live Search vs. Google (I wanted to find the URL to the site in the above paragraph). Google’s first hit was exactly what I wanted to find – after three pages of Live Search results, I gave up. Thanks MS, but free t-shirts can’t top relevance.

In summary, I think Salon is wrong to say Microsoft doesn’t matter any more – and I like my Zune. But I sure agree that Live Search is sub-optimal (and the latest marketing gimmick is pretty lame).

Friday, October 03, 2008

Culminis (Mark II)

We finally released the news last night: the Culminis restructure is now official. I’m super-excited to be part of this fine group and highly honoured to be the first EMEA Chairman. I feel this is both a huge opportunity and a huge responsibility. I can’t wait to get stuck in!

If you are going to be at TechEd EMEA, then stop by our booth. I’m doing what I can to arrange comfy chairs, power leads and wired network drops. With a bit of luck, we’ll also have some freebies to give away and some special guests – no promises, but I’m on the scrounge for goodies!


Thursday, October 02, 2008

Wireshark Training Courses

If you are into networking, then you probably know about Wireshark (or Ethereal, as it used to be known). It’s a first-class packet capture and analysis tool, which many networking professionals rely on. It has a rich set of protocol decodes and other features, and is invaluable as a troubleshooting tool, as well as being very useful for getting to understand network protocols.

Our press release (New Wireshark Training Courses) explains this new training. I can’t wait to teach these classes.

If you want to know what’s happening on the wire – come along and learn!