Monday, April 08, 2019

PowerShell 7 - Coming Soon!

The news is now out and, to me, it is clear: the next release of PowerShell, to be known as PowerShell 7, is going to be a huge deal. Steve Lee released the news in a blog post about the next release. There are several big points in Steve's blog post:

First, the next release is going to be called PowerShell 7. They are dropping 'core' from the product name, and the team are taking the opportunity to move the major version number up to 7.0. I think many folks were confused by the term "core", so this is a great time to tidy up the product name. The change in version number was also surprising but most welcome - by calling it PowerShell 7, it signals to Windows IT Pros that what you now know as PowerShell Core 6.2 is pretty good on Windows (all my Grateful Dead curation scripts work - so that's good!). But it's not 'there' enough: no WPF, no WinForms, and a bunch of cmdlets that are not supported.

Since PowerShell is based on .NET, it's hard to do things that are not in the version of .NET in use. Because there is no WinForms support in the .NET version integrated with PowerShell Core 6.2, that version of PowerShell cannot run WinForms programs. Once the underlying version of .NET adds the necessary classes - hey presto.
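As a taster, here is a minimal sketch of the kind of thing that becomes possible once the WinForms assemblies are present in the underlying .NET - this runs today in Windows PowerShell 5.1 and is the sort of code PowerShell 7 should be able to run on Windows:

# Load the WinForms assembly and pop up a simple message box
Add-Type -AssemblyName System.Windows.Forms
[System.Windows.Forms.MessageBox]::Show('Hello from Windows Forms', 'PowerShell')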

.NET Core 3.0 is the big game changer here. It is intended to have all the underlying APIs that should allow upwards of 90% compatibility with Windows PowerShell 5.1. In my discussions with Steve Lee, I came away with the distinct view that .NET Core 3.0 would both solve a lot of the problems many Windows IT Pros have had in adopting the newer versions and be an awesome tool in the Linux world. Imagine being able to write mini-GUIs that work in Linux. Utterly cool in my view, and potentially very, very useful for Linux users. When we chatted, the plan was to call it PowerShell Core 6.3, but I really like the name PowerShell 7. It not only simplifies the name, it calls out for adoption.

I am looking forward to updating my PowerShell books for PowerShell 7. Can't wait to get stuck in! I'll certainly be blogging more about this new release!

Monday, March 18, 2019

Moving from PowerShell Journeyman to PowerShell Master

I’ve just finished writing another book on PowerShell. The book looks at a number of core Windows features and components: AD, DHCP/DNS, SMB file sharing and FSRM, Hyper-V, and more. Having gone through the process of writing over 125 scripts covering a dozen Windows Server 2019 features, I have gained a perspective on both PowerShell in use today and on what you really need to know in order to progress from a Google-Engineering-assisted journeyman into a PowerShell master. There is absolutely NOTHING wrong with using Google to get the job done. And for many, their career path possibly precludes acquiring deep skills. But if you are one of the folks who aspire, please read on!

So what do you need to know, and know how to do? The following list is not in any order:

  • Understand and be able to use the PowerShell language - the starting point, for me, is that you know the PowerShell language and can use it. You should understand the core concepts of cmdlets, objects, and the pipeline. You should be familiar with the core approach to discovery (Get-Module, Get-Command, Get-Help, and Get-Member). You need to know how each language feature works and be able to leverage it in your scripts. Knowing the internal architecture of PowerShell is also almost expected.
  • Understand the .NET Framework. PowerShell is built on top of .NET – cmdlets work by using .NET. Get-Process, for example, just calls [System.Diagnostics.Process]::GetProcesses() - a static method (GetProcesses()) on a .NET class (System.Diagnostics.Process); see the short example after this list. You should understand the architecture of .NET (CLR, BCL, IL and JIT compilation, .NET security, and more). The .NET Framework can often provide functionality for which there are no cmdlets. For example, there are a number of .NET classes useful for localisation. Time zones, clock types, DST/ST, etc. are all a method call away and therefore easy to use if you know how.
  • Understand how to read C# and be able to convert C# to PowerShell. There is a feast of wonderful examples of more obscure tasks, often written in C#. You should be able to read the C# well enough to see how the code does things, and be able to convert simple snippets into working PowerShell code. Knowing enough VB.NET to convert it into PowerShell is also a useful skill.
  • Understand COM and COM objects. There are a number of features that make use of COM. The Microsoft Office products can each be automated using PowerShell’s COM interop features. The Performance Logging and Alerting subsystem makes use of COM. You use PowerShell's New-Object cmdlet to instantiate a COM object to specify PLA data collector sets.
  • Know how to use XML. Some features, such as the Task Scheduler, make use of XML. You should know how to use the XML emitted by Windows as well as how to manage XML documents and the DOM. XPath is also a useful skill. The FSRM, for example, produces reports. The report format is fixed and cannot be modified. But the FSRM also produces XML files containing the report’s raw data for you to format to your own needs. PowerShell also makes use of XML for default object formatting, which you can customise to change how PowerShell formats objects.
  • Know how PowerShell modules work. Cmdlets are delivered in modules and you can write your own. Both DSC and JEA leverage modules. You should know where modules are stored, how PowerShell finds them and builds the module cache, and what a manifest is.
  • JEA – also known as Just Enough Administration. It’s a neat feature that enables you to provide delegated permissions to do just those things necessary for a person’s job and nothing more. This is a very useful security feature of PowerShell that appeals to large and distributed organisations.
  • Understand how to implement DSC. DSC is a great way to configure hosts and to ensure they stay configured. You should know about DSC resources, setting up DSC pull servers (SMB and web), and DSC reporting and error logging. You should also know how DSC resources work, as well as how to write your own DSC resources.
  • Master remoting – this is a rich topic area. You should understand the PowerShell remoting stack (including PSRP, SOAP, and WinRM), how endpoints work, and how to create a constrained endpoint.
  • Understand core Windows features at depth. I suppose it’s obvious, but to be a PowerShell master you have to be able to apply your skills to Windows. You should really understand AD (and GPO), SMB (SMB3, SOFS, clustering and hyper-converged S2D), containers and Docker, Hyper-V (and maybe VMware too!), TCP/IP networking, disk/file storage, PLA, the Task Scheduler, and probably more.
  • Leverage Azure – organisations are increasingly moving to the cloud, and knowing Azure (or AWS) is also an important skill. With Azure, you should be able to build IaaS objects in the cloud, including web sites, VMs, and virtual networks. You should also be able to manage Azure storage and content distribution.
  • Be competent at Windows troubleshooting – there are a variety of good PowerShell tools that assist in troubleshooting, particularly network troubleshooting. These tools really are second nature to a PowerShell master. And to be a good troubleshooter, you really need to understand what you are troubleshooting. Knowing how to leverage the information in the event logs is also critical. You should become very familiar with docs.microsoft.com. And if you ever work out how to fully automate the Windows troubleshooters – let me know.
  • Use PowerShell Core and VS Code – PowerShell Core 6 is almost a re-invention of PowerShell: cross-platform, open source, based on .NET Core (which is also open source), along with a totally new development tool (VS Code). Arguably, 6.1 and 6.2 are not quite ready for hard-core usage across all features. But it’s close – I am now using the developing 6.2 and VS Code in preference to the ISE and Windows PowerShell. My Grateful Dead scripts even work in PS Core! There are a number of features that do not work with PowerShell Core. Today, for example, DSC and Windows Forms are not supported (although 6.3 should support Windows Forms!).
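To illustrate the .NET point above, here is a minimal sketch showing that a cmdlet and the underlying .NET static method return the same kind of objects, and that .NET classes cover ground for which no cmdlet exists (the time zone call is just one example):

# Get-Process is, under the covers, a call to a .NET static method
Get-Process | Select-Object -First 3
[System.Diagnostics.Process]::GetProcesses() | Select-Object -First 3
# And .NET fills gaps where no cmdlet exists - for example, localisation details
[System.TimeZoneInfo]::Local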

So a baker’s dozen of things you really need to know, and know how to do with PowerShell.

PowerShell Core and Experimental Features

In testing any new feature, one technique for getting users to use (and test) the feature is known as feature flags. Essentially, these are settings (flags) that signal that you should gain access to experimental features. This allows a user to opt in to testing the new features. Thus, in a big application such as PowerShell, most users just use the published feature set; but if you know how, you can turn on some interesting new features. And, of course, if you do not like a given experimental feature, you can turn it off. Let’s look at how to access these features. In this blog post, I am using PowerShell Core 6.2.0-rc.1. If you install a different version, your mileage is going to vary!

Finding Experimental features

Finding experimental features is pretty easy. Hey – this is PowerShell, and you should know what to do. Like this:

PS [C:\foo> ]> Get-ExperimentalFeature
Name                        Enabled Source   Description
----                        ------- ------   -----------
PSCommandNotFoundSuggestion   False PSEngine Recommend potential commands based on fuzzy search on a CommandNotFoundException
PSImplicitRemotingBatching    False PSEngine Batch implicit remoting proxy commands to improve performance
PSTempDrive                   False PSEngine Create TEMP: PS Drive mapped to user's temporary directory path
PSUseAbbreviationExpansion    False PSEngine Allow tab completion of cmdlets and functions by abbreviation

So these four experimental features present in 6.2.0-rc.1 all look pretty cool to me, although their usefulness may vary. For me, tab completion of cmdlet names using abbreviations could be interesting. Potentially really useful, so worth looking at - even though, as I think about it, if the alias is any good it’s wired into my fingers, so it may be a feature I never use. Creating the TEMP: drive is something I would take advantage of. PSCommandNotFoundSuggestion just helps IT admins find what they need. And the batching of implicit remoting commands could be very useful if you are, for example, managing Exchange Online using local PowerShell and importing the session.

Enabling Experimental Features

Again – this is PowerShell so: Simples:

PS [C:\foo> ]> Get-ExperimentalFeature | Enable-ExperimentalFeature
WARNING: Enabling and disabling experimental features do not take effect until next start of PowerShell.
WARNING: Enabling and disabling experimental features do not take effect until next start of PowerShell.
WARNING: Enabling and disabling experimental features do not take effect until next start of PowerShell.
WARNING: Enabling and disabling experimental features do not take effect until next start of PowerShell.

So it is very simple to add them in. Due to how these features are implemented, you need to restart pwsh before you can access the features.


Using Experimental Features

The command-not-found suggestion feature is nice. After enabling it, PowerShell does a better job of handling typos, like this:

PS [C:\foo> ]> get-chliditem
get-chliditem : The term 'get-chliditem' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1
+ get-chliditem
+ ~~~~~~~~~~~~~
+ CategoryInfo          : ObjectNotFound: (get-chliditem:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException


Suggestion [4,General]: The most similar commands are: Get-ChildItem, Get-ChildItem2.

Nice. It could be really useful where you have long cmdlet names.


Disabling Experimental Features

Needless to say, disabling them is simple too: use the Disable-ExperimentalFeature cmdlet to turn off any feature you no longer want.
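For example, a minimal sketch (like enabling, disabling only takes effect the next time you start pwsh):

# Disable a single feature by name...
Disable-ExperimentalFeature -Name PSTempDrive
# ...or turn them all off again
Get-ExperimentalFeature | Disable-ExperimentalFeature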


Summary

Experimental features are PowerShell Core features that may or may not be added to future versions of PowerShell Core. They are easy to enable, consume, and disable as you wish but use at your own risk.

Thursday, March 07, 2019

Setting up DHCP in Windows Server 2019

In my recently published book (Windows Server 2019 Automation with PowerShell), I included several recipes to show how to install DHCP, how to set up a scope, and how to set up DHCP load balancing and failover. This blog post looks at the steps involved.

In this walk-through, I assume you have two servers on which you wish to run DHCP. Many smaller organisations install DHCP on their DCs for simplicity – I’m assuming the two servers are DC1.Reskit.Org and DC2.Reskit.Org. You also need to determine what IP addresses to hand out (this sample uses the range 10.10.10.150-10.10.10.199), the subnet mask (255.255.255.0), and the option values to serve, such as the DNS server’s IP address.

Here are the steps:

1. Install DHCP Service on DC1, DC2

# Create a script block
$SB1 = {
  # Install the service
  Install-WindowsFeature -Name DHCP -IncludeManagementTools
  # Add the DHCP server's security groups
  Add-DHCPServerSecurityGroup
  # Let DHCP know it's all configured
  $RegHT = @{
    Path  = 'HKLM:\SOFTWARE\Microsoft\ServerManager\Roles\12'
    Name  = 'ConfigurationState'
    Value = 2  }
  Set-ItemProperty @RegHT
}
# Run the script block on DC1, DC2
Invoke-Command -ComputerName DC1 -ScriptBlock $SB1
Invoke-Command -ComputerName DC2 -ScriptBlock $SB1
# Authorise the DHCP servers in AD
Add-DhcpServerInDC -DnsName DC1.Reskit.Org
Add-DhcpServerInDC -DnsName DC2.Reskit.Org
# Restart the DHCP Service on DC1, DC2
$SB2 = {
  Restart-Service -Name DHCPServer -Force
}
Invoke-Command -ComputerName DC1 -ScriptBlock $SB2
Invoke-Command -ComputerName DC2 -ScriptBlock $SB2


2. Create a DHCP Scope

# Add new DHCP Scope to DC1
$SHT = @{
  Name = 'Reskit'
  StartRange   = '10.10.10.150'
  EndRange     = '10.10.10.199'
  SubnetMask   = '255.255.255.0'
  ComputerName = 'DC1.Reskit.Org'
}
Add-DhcpServerV4Scope @SHT

3. Configure DHCP Server Options

# Set DHCP V4 Server Option Values
$OHT = @{
  ComputerName = 'DC1.Reskit.Org'
  DnsDomain    = 'Reskit.Org'
  DnsServer    = '10.10.10.10'
}
Set-DhcpServerV4OptionValue @OHT

4. Configure Fail Over/Load Balancing between DC1 and DC2

$FHT = @{
  ComputerName       = 'DC1.Reskit.Org'
  PartnerServer      = 'DC2.Reskit.Org'
  Name               = 'DC1-DC2'
  ScopeID            = '10.10.10.0'
  LoadBalancePercent = 60
  SharedSecret       = 'j3RryIsG0d!'
  Force              = $true
}
Add-DhcpServerv4Failover @FHT
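As a quick sanity check, you can view the scope, the option values, and the failover relationship from DC1 - a sketch, assuming the DhcpServer module (from RSAT) is available where you run it:

# Confirm the scope, server options and failover relationship on DC1
Get-DhcpServerv4Scope       -ComputerName DC1.Reskit.Org
Get-DhcpServerv4OptionValue -ComputerName DC1.Reskit.Org
Get-DhcpServerv4Failover    -ComputerName DC1.Reskit.Org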

After you complete these steps, both DC1 and DC2 are able to satisfy DHCP configuration requests on the 10.10.10.0/24 subnet, can provide option values to DHCP clients along with the offered IP address and subnet mask, and are in a failover/load-balancing relationship.

Monday, March 04, 2019

It's Done! So now what??

Last week, I completed work on a new PowerShell book. The book is entitled Windows Server 2019 Automation with PowerShell. The front cover looks like this:

One of the more interesting features the more eagle-eyed viewers might notice is that Jeffrey Snover wrote the foreword. I've known Jeffrey for a very long time and admire him enormously. The draft of his foreword reads like this:


Of course, I love it and am immensely flattered. I just hope the book lives up to his introduction.

The book contains over 120 'recipes' - small scripts that do useful things. I have tried to strip out some of the overhead normally associated with production scripting (error handling, logging, firewalls, etc.) to concentrate on the essence of the various features and how to manage them with PowerShell. The scripts are also published on Github at: https://github.com/doctordns/PowerShellCookBook2019. 

I learned a lot in the process of writing this book. It was fun delving down into various aspects of Windows Server 2019 and playing with various features. Being able to log performance data then graph it all in a few lines of code was neat.

A major resource in writing this latest book was https://docs.microsoft.com. This is Microsoft's (largely) open source documentation platform. The documents are, in the main, of an excellent standard. And when I found a page that had an error or was unclear - I just fixed it. Highly satisfying to improve the documentation.

The book is now available on Amazon: https://smile.amazon.com/Windows-Server-Automation-PowerShell-Cookbook/dp/1789808537/ref=sr_1_3?ie=UTF8&qid=1551353410&sr=8-3

Enjoy!!

Saturday, December 15, 2018

Calling a PowerShell Function like a Method is a Bad Idea

As a frequent contributor to Spiceworks’s PowerShell forum (come visit at: https://community.spiceworks.com/programming/powershell), from time to time I see a post that contains bad practice - and sometimes worse. One thing I see more often than I’d like is a post that calls a PowerShell function like a .NET method. This, as the subject of this article suggests, is a very bad idea. Let me explain why.

First, let’s create a really simple function Add, like this:

Function ADD {
   Param ($A, $B)
   $A + $B
}

This function just adds two values and returns the result. Now, go and use the function first as a proper function call, then as a method call:

Psh[C:\foo]> Add 1 2
3

Psh[C:\foo]> Add(1,2)
1
2

As you can see, the two uses of the Add function return decidedly different results. But why? At first sight, it appears quite illogical. To work out what is going on under the covers, create a richer Add function that describes what is being passed to and from the function, like this:

Function WhatAdd {
   Param ($A, $B)
   # Set to avoid error if GetType fails
   $ErrorActionPreference = 'SilentlyContinue'
   # Display the type and value of the first parameter
   "A is type   : [$($A.GetType())]"
   "A has value : [$A]"
   # Display the type and value of the second parameter
   "B is type   : [$($B.GetType())]"
   "B has value : [$B]"
   # Now calculate and describe the sum:
   $Result = $A + $B
   "Result has type  : [$($Result.GetType())]"
   "Result has value : [$Result]"
   # Return the result
   Return $Result
}

This function performs the same calculation and returns a result. It also displays the type and value of each input parameter and of the result. First, try it out by calling the function like a cmdlet:

Psh[Cookham24:C:\foo]> WhatAdd 1 2
A is type   : [int]
A has value : [1]
B is type   : [int]
B has value : [2]
Result has type  : [int]
Result has value : [3]
3

Next, try calling WhatAdd like a method:

Psh[Cookham24:C:\foo]> WhatAdd(1,2)
A is type   : [System.Object[]]
A has value : [1 2]
B is type   : []
B has value : []
Result has type  : [System.Object[]]
Result has value : [1 2 ]
1
2

As you can see, the results of calling a function like a method are not what you might have expected. What is happening is that when you call the Add function like a method, PowerShell assumes that the stuff inside the parentheses is an array of values to be passed to the first parameter of the function, not two separate parameter values. This means PowerShell passes an untyped array of two numbers as the value of the first parameter, and $Null as the value of the second. The result of adding $Null to an array of values is just an array of values (the input values), not the addition of anything.

Of course, had you indicated the exact types of the parameters in the function’s parameter block, PowerShell would have flagged the mismatch (passing an array into a parameter with a type of [int]).
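For instance, here is a minimal sketch (AddTyped is just a hypothetical name) showing how typed parameters catch the mistake:

Function AddTyped {
   Param ([int] $A, [int] $B)
   $A + $B
}
AddTyped 1 2     # Called like a cmdlet - returns 3
AddTyped(1,2)    # Called like a method - PowerShell cannot bind the object array to [int] $A and throws an error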

So two best practices in a single article.

  • First, avoid calling functions like a method. The results can be very much not what you were expecting or wanting. As a user of someone else’s function, you don’t know how well, or badly, they wrote it. Just call a function like a cmdlet.
  • Second, if you are writing functions, assume ALL user input is evil until you prove otherwise, and assume your user will pass you data you were not expecting (like an array rather than a single integer). At a minimum, ensure that ALL parameters in your functions are explicitly typed. And consider ensuring that what you return is properly typed. It all cuts down on bad input ruining your day.

Saturday, October 27, 2018

There's always a way...

I teach my classes that there's always more than one way to do almost anything. This week was a nice reminder of how true that is.

The client had a legacy script that scans their client's AD attached computers. One small thing this script does is to determine the Domain Functional Level and whether the domain is still in mixed mode.

The 'old' script we were looking at was a very long, comment-free VBScript using LDAP to get the domain object in the AD and read the values of the two attributes that determine the domain level. You might think you could convert that horrid VB code into a simple Get-ADDomain and pick out the two properties.

Sadly, that does not work. The Get-ADDomain cmdlet returns a number of properties that represent the attribute values inside the AD - but not all of them!

Here is a screenshot of LDP.EXE showing the domain object and two properties inside my AD:



As you can see, the domain object has two attributes/properties (msDS-Behavior-Version and ntMixedDomain). But Get-ADDomain does not return those properties, as you can see here:


The solution, however, is pretty easy. Get-ADDomain returns an object with the Distinguished Name of the domainDNS object. That object represents the domain inside the AD and it is that object which has the needed properties. So, use Get-ADDomain to get the Domain object's Distinguished Name (DN), then use Get-ADObject to get the object whose identity is that DN. It looks like this:
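The original screenshot is not reproduced here, but a minimal sketch of that approach (assuming the ActiveDirectory module from RSAT) looks like this:

# Get the domain's distinguished name, then fetch the object with the extra attributes
$DomainDN = (Get-ADDomain).DistinguishedName
$Domain   = Get-ADObject -Identity $DomainDN -Properties 'msDS-Behavior-Version', 'ntMixedDomain'
$Domain.{msDS-Behavior-Version}   # Domain functional level
$Domain.{ntMixedDomain}           # 1 = mixed mode, 0 = native mode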



An interesting thing is that some AD objects, like the domainDNS object, have attribute names that contain a hyphen, e.g. the msDS-Behavior-Version attribute. This can confuse PowerShell's parser. The solution is to enclose the property names in braces, as you can see in the final lines of the snippet above. An obscure feature of PowerShell that, in this case, is useful.

All in all, I'd like to think Jimmy Andersen would be impressed. Well, a bit anyway!




Thursday, October 04, 2018

Installing RSAT tools

I've been installing 1803 (and, for reasons I cannot explain, 1709) on a bunch of workstations - but needed the RSAT tools. I HATE having to use the GUI - so here's a PowerShell script:

# Install-RSATTools.PS1
# Thomas Lee - doctordns@gmail.com
# 1. Get Windows Client Version and Hardware platform
$Key = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion'
$CliVer = (Get-ItemProperty -Path $Key).ReleaseId
$Platform = $ENV:PROCESSOR_ARCHITECTURE
"Windows Client Version : $CliVer"
"Hardware Platform      : $Platform"
# 2. Create URL for download file
# NB: only works with 1709 and 1803.
$LP1 = 'https://download.microsoft.com/download/1/D/8/' +
       '1D8B5022-5477-4B9A-8104-6A71FF9D98AB/'
$Lp180364 = 'WindowsTH-RSAT_WS_1803-x64.msu'
$Lp170964 = 'WindowsTH-RSAT_WS_1709-x64.msu'
$Lp180332 = 'WindowsTH-RSAT_WS_1803-x86.msu'
$Lp170932 = 'WindowsTH-RSAT_WS_1709-x86.msu'
If     ($CliVer -eq 1803 -and $Platform -eq 'AMD64') { $DLPath = $Lp1 + $Lp180364 }
ElseIf ($CliVer -eq 1709 -and $Platform -eq 'AMD64') { $DLPath = $Lp1 + $Lp170964 }
ElseIf ($CliVer -eq 1803 -and $Platform -eq 'X86')   { $DLPath = $Lp1 + $Lp180332 }
ElseIf ($CliVer -eq 1709 -and $Platform -eq 'x86')   { $DLPath = $Lp1 + $Lp170932 }
Else   { "Version $CliVer - unknown"; Return }
# 3. Display the download details
"RSAT MSU file to be downloaded:"
$DLPath
# 4. Use BITS to download the file
$DLFile = 'C:\foo\Rsat.msu'
Start-BitsTransfer -Source $DLPath -Destination $DLFile
# 5. Check Authenticode signature
$Authenticatefile = Get-AuthenticodeSignature $DLFile
If ($Authenticatefile.Status -ne 'Valid')
  {'File downloaded fails Authenticode check'}
Else
  {'Downloaded file passes Authenticode check'}
# 6. Install the RSAT tools
$WusaArguments = $DLFile + " /quiet"
'Installing RSAT for Windows 10 - Please Wait...'
$Path = 'C:\Windows\System32\wusa.exe'
Start-Process -FilePath $Path -ArgumentList $WusaArguments -Wait
# 7. Get RSAT Modules
Get-Module -ListAvailable | Where-Object Name -like '*rsat*'

Friday, August 03, 2018

PSReadline V2 Can Break PowerShell Profile Scripts in Latest Insider's RS5 Preview

The Windows Insider programme provides access to the latest pre-release versions of Windows 10 and Windows Server 2019. Earlier this week, Microsoft released a new preview of what is coming with Windows 10 RS5. This build, 17728.1000, contains an updated version of the module PSReadLine. PowerShell uses this module to enable you to, inter alia, set the colours of tokens in the PowerShell command console.

I love this feature, as it allows me to override the default colours that PowerShell uses. One particular issue for me is that the colours can look less good when displayed on some of the lower quality projectors I get stuck with! One example: the parameter name token is set, by default, to gray. Displaying from my laptop, this has been hard to see at some sites. So I just added this line to my profile:
Set-PSReadlineOption -TokenKind Parameter -ForegroundColor Cyan
Simple, and highly useful for me in some places I teach. However, with the latest release of the RS5 preview, Microsoft has now included an updated version of PSReadLine (2.0 vs 1.2). Taking the new version was done to enable a fix to an accessibility issue. After the upgrade, you might see this error when you run PowerShell:


This is both a breaking change and one not described in the release notes. And, if you use Bing or Google to search for Set-PSReadLineOption, the page at docs.microsoft.com shows the old version of the cmdlet - you need to dive into the actual PSReadLine repository (i.e. the module's 'source' code!) to discover the answer.

The solution is simple - change that line to:
Set-PSReadLineOption -Colors @{"parameter" = [ConsoleColor]::Cyan}
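One way to keep a single profile working on machines that have either version is to branch on the module version - a minimal sketch:

# Set the parameter token colour in a way that works with PSReadLine 1.x and 2.x
$PSRL = Get-Module -Name PSReadLine
If ($PSRL.Version.Major -ge 2) {
  Set-PSReadLineOption -Colors @{'Parameter' = [ConsoleColor]::Cyan}
}
Else {
  Set-PSReadLineOption -TokenKind Parameter -ForegroundColor Cyan
}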
I've opened issues at docs.microsoft.com, the PSReadLine repo, and with the Insiders programme, so hopefully this will soon no longer be an issue.


Friday, July 20, 2018

My Book Scripts

I saw a comment over on Amazon relating to my last PowerShell book (see the Amazon book listing) from 'Long Time Programmer'. His comment is that he couldn’t download the code. I wish there was a way I could reply to his comment, because there is a way to get the code: I’ve put all my scripts up on GitHub! If you go to https://github.com/doctordns/PowerShellCookbook you can see the scripts there. Although I have to say, getting all of them onto GitHub is still a work in progress.

As an aside, and for background: the publication process at the time this book was published had a few issues. The web site used to enter content was literally eating content, so we switched over to Word at the last hour. In doing so, the longer-line scripts ended up broken in the final PDF and thus in the published version.

To be helpful to my readers, I’ve slowly retrieved the scripts from inside the VMs and have published them to GitHub - which is less simple than it sounds and is not yet complete. At present, I’ve ported over the recipe scripts for chapters 1-4, 8, 10, and 11 (7 out of 13). And I hope to have the rest up soon.
Please download any of the scripts – and file issues if you find problems.

[Later]
The conversion has been completed and I've created a version 1.0 release of the scripts on the GitHub repository. You can consume the scripts by navigating the GitHub UI, or you can take a fork and clone it to a local machine. The release mechanism in GitHub also provides a zipped-up archive: go to the GitHub release page and download a .gz or .zip file, depending on your tastes. I may do some further tidying up.
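If you prefer the command line, a minimal sketch of cloning the repository to a local folder (assuming git is installed):

# Clone the book's script repository to a local folder
git clone https://github.com/doctordns/PowerShellCookbook C:\foo\PowerShellCookbook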

Wednesday, July 18, 2018

Please Buy This PowerShell Conference In A Book


A couple of months ago, Mike Robbins (the tech editor of my last PowerShell book: https://www.amazon.co.uk/dp/B073RP2SNZ/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1) sent me mail about a project that he and a few others were planning. The idea was to develop a PowerShell conference in a book. Mike invited some of the big PowerShell community members to each contribute one chapter. The proceeds are intended to help the DevOps community: any royalties go towards OnRamp scholarships with the DevOps Collective. Of course I said yes.

Well – now that book, to which I have contributed a chapter, is published. You can read about the book and buy it here: https://leanpub.com/powershell-conference-book. The book has an impressive list of contributors and is pretty reasonably priced! Of course, if you are feeling generous, Leanpub is happy to enable you to pay more.

The OnRamp Scholarship is a great cause, bringing new people into the DevOps field. You can read about the scholarship here: https://powershell.org/summit/summit-onramp/onramp-scholarship/

My chapter in the book is entitled ‘A Lap Around .NET’, in which I look at what .NET is and some of the things you can do with it.

Saturday, July 14, 2018

Keeping PowerShell Modules Up To Date

At one time, many (most?) IT pros did not worry too much about updating PowerShell modules, relying for the most part on whatever cmdlets came with the application, PowerShell, and/or the OS. But times have changed. With the release of Windows PowerShell 5.1, work on modules both inside and outside Microsoft has not stopped. And a number of modules now have updates that might be useful.

Another big change in the PowerShell world is the movement towards holding these new versions in repositories that you can access via built-in commands (i.e. Find-Module, Update-Module, etc.). The question then becomes: which modules need updating? Turns out there is a script or two for that!

The first question is whether you care enough about module updates to worry about dealing with updates, and the side effects of module changes.

As I was writing this article, I encountered an interesting example of this issue. The PSReadLine module has been updated to version 2.0. This version has a breaking change in how you set UI token colours. It took a while to sort out, but it drove home to me the need to test new versions and not just update because you can. Unless, I suppose, you like living on the edge.

So how do you tell which modules need updating? As ever, there’s a script for that ™.

Here is a simple function that looks at all the modules you have on your system (well, those pointed to via $env:PSModulePath) and checks to see whether each module is available via the PSGallery. For every module, the function returns an object with three properties: the module name, the version on the local system, and the version on the PSGallery. For modules that do not exist on the PSGallery, the function returns a gallery version of zero. Here’s the function:

Function Get-ModuleVersionInformation {

[Cmdletbinding()]
Param()

# Startup
$Start = Get-Date
Write-Verbose 'Get-ModuleVersionInformation'
Write-Verbose "Started at: [$Start]"

# Get the modules on the local system
$Modules = Get-Module -ListAvailable -Verbose:$False
Write-Verbose ("{0} modules locally" -f $Modules.Count)

# For each module, see if it exists on PSGallery
# Create/emit an object for each module with the name,
# and the version number of local and remote versions
Foreach ($Module in $Modules) {
  Write-Verbose "Processing $($Module.Name)"
  $UpdateHt         = [ordered] @{}    # Create the hash table
  $UpdateHt.Name    = $Module.Name     # Add name
  $UpdateHt.Version = $Module.Version  # And local version

  Try {
    # Find the module, and add the gallery version number to the hash table
    $GalMod = Find-Module $Module.Name -ErrorAction Stop
    $UpdateHt.GalVersion = $GalMod.Version
  }
  # Here - Find-Module could not find the module in the gallery
  Catch {
    # The module isn't in the gallery, so record a version of zero
    $UpdateHt.GalVersion = [System.Version]::new(0,0)
  }

  # Now emit the object
  New-Object -TypeName PSObject -Property $UpdateHt

} # End foreach

$End = Get-Date
Write-Verbose "Stopped at: [$End]"
Write-Verbose "Took $(($End-$Start).TotalSeconds) seconds"

} # End Function

On my system, the output looks like this:

Psh[Cookham24:C:\foo]> $mods = Get-ModuleVersionInformation
Psh[Cookham24:C:\foo]> $mods

Name                     Version     GalVersion
----                     -------     ----------
Buildlab                 1.0         0.0       
NetCmdlets               16.0.6446.0 16.0.6592.0
ScriptBrowser            1.3.2.0     1.3.1.0   
tfl                      1.0         0.0       
Azure.AnalysisServices   0.5.0       0.5.2     
Azure.Storage            4.1.0       4.3.1     
AzureRM                  5.2.0       6.4.0                      
AzureRM.RedisCache       4.1.0       0.0       
AzureRM.Relay            0.3.1       0.0
 

… Etc.

Armed with this function, you can then do this:

$Zero = [System.Version]::new(0,0)
Foreach ($Mod in $Mods) {
  If ($Mod.GalVersion -eq $Zero) {
    $Msg = "Module [$($Mod.Name)] does not exist in PSGallery"
    $Msg
    Continue
  }
  If ($Mod.GalVersion -gt $Mod.Version) {
    $Msg = "Module [$($Mod.Name)] should be updated from v$($Mod.Version) to v$($Mod.GalVersion)"
  }
  Else {
    $Msg = "Module [$($Mod.Name)] does not need to be updated - current version $($Mod.Version)"
  }

  $Msg
} # End foreach

The output, again on my main workstation, looks like this:

Module [Buildlab] does not exist in PSGallery
Module [NetCmdlets] should be updated from v16.0.6446.0 to v16.0.6592.0
Module [ScriptBrowser] does not need to be updated - current version 1.3.2.0
Module [tfl] does not exist in PSGallery
Module [Azure.AnalysisServices] should be updated from v0.5.0 to v0.5.2
Module [Azure.Storage] should be updated from v4.1.0 to v4.3.1
Module [AzureRM] should be updated from v5.2.0 to v6.4.0
Module [AzureRM.AnalysisServices] should be updated from v0.6.2 to v0.6.9
Module [AzureRM.ApiManagement] should be updated from v5.1.0 to v6.1.1
Module [AzureRM.ApplicationInsights] should be updated from v0.1.1 to v0.1.4
Module [AzureRM.Automation] should be updated from v4.2.0 to v5.0.1
Module [AzureRM.Backup] should be updated from v4.0.2 to v4.0.6
Module [AzureRM.Batch] should be updated from v4.0.4 to v4.1.1
Module [AzureRM.Billing] should be updated from v0.14.0 to v0.14.3

This leaves me with a bit of work to do – updating PSReadLine, as I discovered, did have some unfortunate side effects (and I suspect I am not the only one to have fallen over this issue!).

If I was brave, I could have updated that last script fragment to actually update each of the older modules. Not sure I’m ready for that, just yet…
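For the brave, a minimal sketch of what that might look like (note that Update-Module only works for modules that were originally installed with Install-Module):

# Update every module for which the gallery holds a newer version
$Zero = [System.Version]::new(0,0)
Foreach ($Mod in $Mods) {
  If ($Mod.GalVersion -ne $Zero -and $Mod.GalVersion -gt $Mod.Version) {
    Update-Module -Name $Mod.Name -Force
  }
}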





Saturday, May 19, 2018

Free Cheat Sheets, Revision Aids and Quick References

A cheat sheet is a simple, short document, typically 1-2 pages of notes designed to aid your memory. I discovered an interesting site today: https://www.cheatography.com/. This site has over 2,500 cheat sheets on all manner of topics, including Office, business, technology, kitchen and garden, travel, and a whole lot more. The site has 17 cheat sheets relating to PowerShell as well as 35 relating to PHP. The most viewed cheat sheet relates to regular expressions (a technology I have yet to master).

The site also enables you to create your own cheat sheets.

Friday, April 06, 2018

Method Chaining–A Neat PowerShell Trick

I discovered an interesting trick with PowerShell. It’s known as method chaining. Before explaining it, look at an example:

# c:\foo\testit.ps1
# Test of method chaining
$StringBuilder = [System.Text.StringBuilder]::New()
# Build up string contents
[string] $S1 = Get-ChildItem -Path c:\*.pdf | out-string
# Now build the string
$Null = $StringBuilder.
   AppendLine('Using String Builder as opposed to string concatenation').
   AppendFormat('This uses a cool feature: {0}', 'Method Chaining').
   Append([System.Environment]::NewLine).
   AppendLine('.NET has other "builder" classes operate the same way.').
   Append($S1)
# Output the resultant string
$StringBuilder.tostring()

Running this produces the text added by the chained calls, followed by the directory listing captured in $S1.



With method chaining, you have an object with methods. You specify the object name (i.e. $StringBuilder) followed by a ‘.’ character. On subsequent lines, you specify a method call followed by another ‘.’ character. In this case, I created the $StringBuilder object, then chained several method calls. Each method call added lines to the $StringBuilder object. You end the method chaining by omitting the ‘.’ after the final method in the chain (the one that appends the directory listing).

I’m not too sure I’d advocate using this. It is an obscure trick of the PowerShell parser and as such might fail my “3:00 in the Morning Test” (if woken at 3:00, could you figure this out before a cup of coffee?). But it is pretty cool. Yet another example of how awesome the PowerShell parser is!

Wednesday, March 21, 2018

Code Signing Certificates and PowerShell

When creating PowerShell scripts for distribution, it can be useful to digitally sign the script. This ensures the person getting your code knows it has not changed since you sent it, and can verify you as the person signing the code. If you put code on GitHub, for example, a signature might be a great idea. To do this, you need to have a code signing certificate.

In terms of certificates, you have two main options. You can use your own CA to issue the certificate(s) – or use a public CA. The benefit of the public CA is that their root CAs tend to be fully trusted making it more useful. If you issue your own certs, you may have trouble with other people trusting your code signing certificates.

I have recently obtained a new code signing certificate from DigiCert (https://www.digicert.com). It was really very easy:

1. Open an account and order your cert.

2. Validate you are who you say you are. This involves sending DigiCert some documentation (e.g. your passport) to prove you are who you say you are.

3. Do a Skype call, where they watch you sign their Identify Verification and Authorization document.

4. Generate, download, and install the certificate.

The validation process was easy, although I had issues with the Skype call initially - mainly because I was flat out ill for weeks. Then, when I was better, I had some difficulty getting the Skype call going. Entirely my issue, although it has to be said, DigiCert support are really very, very, very busy. Between my being ill and their overload, it took a bit longer to organise - but today it’s done. I did the call, they saw me sign the form, and within an hour or so the cert was working.

To use the cert to sign something is pretty easy. First you start with the script you want to sign:

# C:\Foo\Certs\Cert1.ps1
Write-Host "I got my cert from DigiCert"

A simple script, saved as C:\Foo\Certs\Cert1.ps1. Signing it is simple:

$Cert = Get-ChildItem -Path Cert:\CurrentUser\My\ -CodeSigningCert
Set-AuthenticodeSignature -Certificate $Cert -FilePath C:\Foo\Certs\Cert1.ps1

Once signed, you can verify the signature by using Get-AuthenticodeSignature, like this:
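The original screenshot showed the resulting signature details; a minimal check looks something like this (for a good signature, the Status property should read Valid):

# Check the signature on the newly signed script
Get-AuthenticodeSignature -FilePath C:\Foo\Certs\Cert1.ps1 |
  Format-List Path, Status, SignerCertificate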


Very simple and very straightforward. If, for some reason, you have multiple signing certificates, then you’d need to adjust the call to Get-ChildItem to ensure you get the right certificate.


Friday, February 23, 2018

PowerShell’s $OFS built-in Variable And What It Does

The other day, I was working on converting some C# to PowerShell. 95% of it was trivial and almost muscle memory. Then I came to this block of C#:

char[] chars = { 'w', 'o', 'r', 'd' };
string string1 = new string(chars);
Console.WriteLine(string1);

The output is:

word

So I initially translated it as:

[char[]] $Chars = ('w', 'o', 'r', 'd')
[String] $String1 = $Chars
Write-Host $String1

But that produced:

w o r d

The same characters, but with spaces between them - which seemed illogical (at first!). I scratched my head and did a bit of digging over at Spiceworks, where I was introduced to the $OFS PowerShell variable. $OFS holds a string known as the Output Field Separator. PowerShell uses this string to separate array elements when it converts an array to a string. It has a default value of " ", but you can change it at the command line, in a script, or in your profile. The issue here is that PowerShell is doing the array-to-string conversion and separates each character in the char array with the separator (" "). You can read a bit more about this in an old blog post by Jeffrey Snover: https://blogs.msdn.microsoft.com/powershell/2006/07/15/psmdtagfaq-what-is-ofs/

This gave rise to two solutions: the first is to let .NET do the conversion, and the second is to leverage $OFS. Here are two ways to do it:


# Leveraging $OFS
[char[]] $Chars =  ('w', 'o', 'r', 'd' )
$OFS = ''  # Set separator to null
[String] $String1 = $Chars
Write-Host $String1
# Or using .NET directly
[char[]] $chars =  ('w', 'o', 'r', 'd' )
$String2 = [System.String]::New($Chars)
Write-Host $string2

This is interesting, but there is also a highly practical application - a solution to an issue I’ve seen brought up in several PowerShell support places: how to construct a comma-separated string of words. So if you had an array of several words, such as ('X423q420', 'JG75-01-27', 'PCNY'), you could easily concatenate them as follows:

$OFS = ','
$Array = ('X423q420', 'JG75-01-27', 'PCNY')
[string] $Array

Which produces:


X423q420,JG75-01-27,PCNY

You could also create the array from properties then force it to be separated by $OFS, like so:

$F = Get-ChildItem -Path X.XML
$OFS = ','
$String3 = [string] ($F.FullName, ($F.Length/1kb).ToString('n3'))
Write-Host $String3

Which produces this:

C:\foo\X.XML,0.317

You learn something nearly every day!




Sunday, February 18, 2018

Using Azure PowerShell and PowerShell 3 or 4? You need to update PowerShell SOON (probably).

I recently saw a GitHub pull request (https://github.com/Azure/azure-powershell/issues/5149) for the Azure PowerShell cmdlets (the PR was merged into version 5.3.0 of the Azure cmdlets). Aside from the continuing improvements that each version brings, I noted one very interesting sentence: 'PowerShell version 3 and 4 will no longer be supported starting in May 2018. Please update to the latest version of PowerShell 5.1'

What does this mean? Well – it means that after May this year, if you are running either PowerShell V3 or V4, new versions of the Azure cmdlets may no longer work - and are not supported in any case. That is not to say the sky is going to fall in! Older versions of the cmdlets should continue to work, and most of the newer cmdlets should work too. Note the 'should'. But why take the risk? You have several months before the lack of support begins, but you should start to plan now if you are still using PowerShell V3 or V4 to manage Azure.
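A quick way to see which version a given management host is running:

# Display the PowerShell engine version on this host
$PSVersionTable.PSVersion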

So what should you do? The answer should be fairly obvious - if you are using Azure and the Azure cmdlets, just upgrade to the latest version of PowerShell (i.e. 5.1). This version of PowerShell should work just fine on all supported versions of Windows. Of course, if you are still using XP to manage Azure, then you may have some issues trying to upgrade, although an OS upgrade to Windows 10 would fix this problem.

The upgrade of PowerShell should be a no-brainer. I suspect many (most?) readers here are already running later versions! There should be no issue, but if you are using Exchange, tread carefully to ensure that the version of PowerShell you are thinking of upgrading to is going to work with, and is supported by, your version of Exchange. This is probably not going to be an issue if you are using hosted Exchange (O365).

It seems to me that this is the start of removing all support for PowerShell V3 and V4. V5 and V5.1 are sufficiently better to make the upgrade most welcome. Loads more cmdlets, improvements in workflows, etc. are all goodness that comes from the upgrade.

What is your take?

Saturday, January 20, 2018

Power-User–a great productivity add-in for PowerPoint

I do a lot of PowerPoint presentations - I train on a variety of technical subjects and PowerPoint has been my ‘friend’ for as long as I can remember! Over the years, PowerPoint has evolved, but I still prefer the older menu structure vs the ribbon. But such is life - with the menu or the ribbon, PowerPoint is an awesome product. With that said, I do find it hard to remember where the key features are to be found.

I’ve just started using a new tool, Power-User (https://www.powerusersoftwares.com/), which is an add-in for both PowerPoint and Excel. Once this add-in is installed, when you open PowerPoint you get a new ribbon item with great tools, which looks like this:


Thus, you see all the useful Power-User features in one place. Power-User helps with creating a variety of effects including icons, diagrams, colour management, etc., etc., etc. And as if that were not enough, Power-User also has some cool Excel tools as well, like this:


I love this tool and would not be without it!  For more details on this cool product, see the presentation video at: https://youtu.be/xtFGSnQnWpE.

Power-User is, however, a commercial product. Students and teachers can get this for free, but for commercial use individual licenses are €198/year. For larger companies, a bulk license can reduce the cost per user. So not free, but well worth the fee if you are doing a lot of PowerPoint. My only regret is that I did not find this tool years ago!

Friday, November 10, 2017

Major updates to PowerShell Azure Module

Like the rest of Azure, the Azure PowerShell cmdlets are truly a work in progress. I've written many times before about the cmdlets, including details of the great cmdlet renaming (https://tfl09.blogspot.co.uk/2015/09/azure-powershell-some-changes-and-some.html). I've also demoed these cmdlets on a variety of occasions.

Microsoft has just released a major new revision to these cmdlets, version 5.0.0. The details can be found over on GitHub at: https://github.com/Azure/azure-powershell/releases/tag/v5.0.0-November2017.

This is a major update, as evidenced by the major version number change (from 4.x.x to 5.x.x). Additionally, there are several breaking changes - changes that could break your code if you update just the module itself.  You can see a list of updates at: https://github.com/Azure/azure-powershell/blob/release-5.0.0/documentation/release-notes/migration-guide.5.0.0.md.
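Before updating, it is worth checking which version you currently have; a minimal sketch (Update-Module assumes the module was originally installed from the PowerShell Gallery):

# See which AzureRM versions are installed locally, then update to the latest
Get-Module -Name AzureRM -ListAvailable | Select-Object Name, Version
Update-Module -Name AzureRM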

If you are using Azure cmdlets - then you probably have some work to do to update to the new cmdlets. But all in all, such is the price of progress.

Monday, October 16, 2017

A Cool Azure Resource

I spent the weekend attending a Train-The-Trainer event at Microsoft UK (Thanks Ed Baker!). The event was focused on Azure and both what it was and how to teach it. A cool event which led to a lot of sharing of tips, tricks, and cool links.

One particularly cool link I discovered was to https://azureplatform.azurewebsites.net/. When you navigate there, you see a page like this:


Each tile on the page represents one Azure service. Click on a service and a neat pop-up appears providing more details of that service. Clicking on Virtual Machines, for example, shows this:

A great launching pad for discovering more about Azure.


Thursday, October 12, 2017

Events in the Security Event Log

I was answering a question in the Spiceworks PowerShell Forum concerning the event log. The poster was looking for how to find out who had logged on to a particular computer. The answer was to use Get-WinEvent and search the Security log for the relevant event. Easy.

But how do you know which event to look for? There are so many events! Well, there's a PowerShell script for that. Finding the different event IDs and roughly what they mean looks like this:

# Get all the security events
$e = Get-WinEvent -LogName Security
# Get the different message kinds:
$ids = $e | Sort-Object -Property Id | Group-Object -Property Id
# And print the event types
Foreach ($id in $ids) {
  $m = ($id.Group[0].Message).Split("`n")[0]
  " {0:N5}    {1}" -f ($id.Name), $m
}
This code first gets all the security events, then sorts and groups them by event ID. The code then extracts the first line of each event's message and displays the event ID and that first line. On my system, the output looks like this:
 4624    An account was successfully logged on.
 4672    Special privileges assigned to new logon.
 4634    An account was logged off.
 4648    A logon was attempted using explicit credentials.
 5058    Key file operation.
 5061    Cryptographic operation.
 4798    A user's local group membership was enumerated.
 4799    A security-enabled local group membership was enumerated.
 4904    An attempt was made to register a security event source.
 4905    An attempt was made to unregister a security event source.
 4907    Auditing settings on object were changed.
 5059    Key migration operation.
 4688    A new process has been created.
 4608    Windows is starting up.
 4902    The Per-user audit policy table was created.
 1100    The event logging service has shut down.
 4616    The system time was changed.
 4826    Boot Configuration Data loaded.
 5033    The Windows Firewall Driver started successfully.
 5024    The Windows Firewall service started successfully.
 4647    User initiated logoff:
So knowing this, finding out who logged on is simple, right? Well, you might think so. It takes a bit of tinkering with the objects, but here's my code:

# Get logon events
$le = $e | Where-Object Id -eq 4624
$x = Foreach ($event in $le) {
  $time     = $event.TimeCreated
  $username = $event.Properties[5].Value
  $domain   = $event.Properties[6].Value
  If (($username -ne '') -or ($username -ne 'System')) {
    $ht      = @{}
    $ht.Time = $time
    $ht.User = "$domain/$username"
    New-Object -TypeName PSObject -Property $ht
  }
}
# And display the results:
$x | Group-Object -Property User |
  Sort-Object -Property Count -Descending |
    Format-Table -Property Name, Count
This code creates a simple object for each event log entry with the relevant ID. That object just holds the time, username, and domain name from the event log entry. At the end, I group and then sort the logon events by user. The result looks like this:
Name                                                    Count
----                                                    -----
COOKHAM.NET/JerryGarcia                                  7576
NT AUTHORITY/SYSTEM                                       746
COOKHAM.NET/COOKHAM24$                                     73
COOKHAM/BobWeir                                            36
NT VIRTUAL MACHINE/27A96661-D855-4286-81D6-BBB32172CCED     6
COOKHAM.NET/MickyHart                                       5
Window Manager/DWM-1                                        2
NT AUTHORITY/NETWORK SERVICE                                1
NT VIRTUAL MACHINE/55C8EC55-6D2B-421D-A454-28FCF4680366     1
NT VIRTUAL MACHINE/53EC57B5-BAB2-4A29-A34B-19A8BB857C42     1
NT VIRTUAL MACHINE/45FF27A5-C133-4213-9A4A-DBF4317D55D0     1
NT VIRTUAL MACHINE/4459B92D-0476-4815-B2DE-C3243CD2D82B     1
NT VIRTUAL MACHINE/3D886F3D-B8BD-41A0-8B05-B82AEB2FE99D     1
NT VIRTUAL MACHINE/370E6442-86B5-4310-BDAB-1882DAE4E5C6     1
NT VIRTUAL MACHINE/33872EB0-2259-4312-83F4-AE783B9D817C     1
NT VIRTUAL MACHINE/2FF8C7E0-CB1E-46E1-9C53-7DEFF18FB488     1
NT VIRTUAL MACHINE/289DD95C-0454-4D51-93FC-F4D7502D894B     1
NT VIRTUAL MACHINE/596834C2-6B40-47E6-9EC5-3231BAD2C01B     1
NT VIRTUAL MACHINE/125EFD6E-2F88-4E2E-A0F2-BDA9516B2B59     1
NT VIRTUAL MACHINE/0C77EC57-8A20-4533-A4E1-5CDB93CB1DC2     1
NT AUTHORITY/ANONYMOUS LOGON                                1
NT AUTHORITY/LOCAL SERVICE                                  1
NT VIRTUAL MACHINE/2FAD3305-C65D-4304-AFF1-F4CFC0C96381     1
NT VIRTUAL MACHINE/64D69931-57FE-491F-96C8-215DE6B3D3FC     1
NT VIRTUAL MACHINE/880CD2FD-7304-4CE1-B831-87ED01DD0BD7     1
NT VIRTUAL MACHINE/7A4205E9-D2C6-466C-82BE-80CFF9947738     1
NT VIRTUAL MACHINE/FA3ADF88-EA85-43A4-AE49-5551186977DB     1
NT VIRTUAL MACHINE/EBF0AAF0-2300-4CD3-9B92-BCA29896DD90     1
NT VIRTUAL MACHINE/E4B8AA47-B256-4918-9098-A80C09DC91ED     1
NT VIRTUAL MACHINE/DD8F6DE3-5F65-4990-B0DD-BF328BFB47BE     1
NT VIRTUAL MACHINE/DC824601-E4F9-445D-BFE4-44FB83D7B733     1
NT VIRTUAL MACHINE/DA85B909-A42E-400F-96CB-340BBB6E0DC0     1
NT VIRTUAL MACHINE/D5420357-DF18-4140-B986-B85CF25D8FF1     1
NT VIRTUAL MACHINE/C66F22AD-DF26-4ED3-A555-9FDDE0588EE4     1
NT VIRTUAL MACHINE/6A8984FD-8774-447A-9F35-4FD97766E303     1
NT VIRTUAL MACHINE/BFDAC935-60ED-4CF9-BE1C-FC12DC47EBB2     1
NT VIRTUAL MACHINE/BE427F4F-C3AC-4086-B58D-8B5B8B8C7863     1
NT VIRTUAL MACHINE/A20EA3B5-7926-4AE4-96D7-4AFE2E34D80A     1
NT VIRTUAL MACHINE/9C00DC59-E565-4B88-88D0-CEE2AC08E870     1
NT VIRTUAL MACHINE/95D96D7E-9A2F-46D8-8E02-0FC0B2F9E594     1
NT VIRTUAL MACHINE/953BFF3A-A3EA-4567-ABAD-2A7337CE3B26     1
NT VIRTUAL MACHINE/9353F711-F39C-47E0-B41A-9E85D70997D8     1
NT VIRTUAL MACHINE/88536766-EF5D-4AE9-A343-B3713EA912DF     1
Font Driver Host/UMFD-1                                     1
NT VIRTUAL MACHINE/BFAFEC2C-5565-4458-A359-A4EC6F62079C     1
Font Driver Host/UMFD-0                                     1
Fun and games with the event log!

Wednesday, October 04, 2017

Free Microsoft Azure Symbol/Icon Set

Over the years, many Microsoft groups have created sets of downloadable symbols and icons so you can create nice diagrams to represent your Azure service architecture. The latest set includes the Azure services and can be used in PowerPoint or Visio, making it easy to create professional looking content.

You can get this icon pack at: https://www.microsoft.com/en-us/download/details.aspx?id=41937. Note the download is 23.3MB.


Wednesday, September 20, 2017

The PowerShell Book Is Complete

Since late last year, I have been working on a book. Titled Windows Server 2016 Automation with PowerShell Cookbook (ISBN: 978-1-78712-204-8), it is finally complete and has been sent to the printers. The book is published by Packt Publishing.

It's taken 10 months to complete, and we faced some serious issues during the writing. We had been using a neat web portal, but it suddenly started 'eating' bits of code. It was a nightmare and cost weeks of extra effort to fix (and we'd fix it only for the portal to eat the code again).

Then disaster struck in the form of my co-author having to drop out for personal reasons. This is never nice. I ended up picking up the extra chapters but, due to time, space, and personal commitments, we did have to drop some of the chapters. And during the final chapter reviews, we found that code that looked great in the Word documents had been badly borked in the final PDFs (so HOURS of detailed proofreading). I just hope I picked up all the errors.

But it's done. The printers are doing their best now to print it, and Amazon is now selling the book. I have to say I get a kick out of seeing that page.

The code is planned to be uploaded to a GitHub repository, so you can download and leverage it. This should happen soon!

So with this book done, my fifth, I am never going to do this again. Way too much work, way too many late nights and early mornings, way too much stress. Never again. But having said that, my super-star tech reviewer, Mike Robbins, has suggested another book. Hmmm.

In any event, let me know if you get this book and what you think!