Info Share about PSDayUK 2018 – Call for Speakers, Ticket availability & Upcoming Call for Sponsors.

This year we, the collective behind the UK PowerShell & DevOps User Groups, are running the second PSDayUK event, the only conference in the UK dedicated entirely to PowerShell. It will be held on October 10th at CodeNode, London. You can find more information at PSDay.UK, including ticket purchases and, once published in the coming weeks, the schedule.

We will be releasing information about PSDay regularly as the event approaches via social media, including our Twitter account @psdayuk, which I highly recommend following if you want to be kept in the loop on what is coming to PSDay.

To give a little background, PSDay is the conference brand of the UK PowerShell & DevOps User Groups (for more information on the groups, see PowerShell.org.uk). PSDay is currently planned as an annual event, much like the bigger PSConf EU, PSConf Asia and US PowerShell & DevOps Summit events. Whilst each of those is a multi-day event, PSDay is at present a single-day event, although the format may change in future – this is something the organising team are keeping in mind for future events.

With this in mind, this year we are running PSDay as a dual parallel-track conference, where we have a solid idea of what we intend the tracks to contain to cater for all skill sets, based on what we’ve learnt running consistent monthly user groups in London. Whilst there is a HUGE variety of topics that could be delved into with PowerShell, we have seen recurring themes at the user group over recent months.

This means that we have been more selective about the sort of topics we are looking for this year, with a view to having sessions along the following lines:

Track: The many components of the PowerShell Language
Focus: All things related to the PowerShell language
Suggested topics: Debugging, Classes, Remoting, Performance, WMI/CIM, Pester, PSScriptAnalyzer, DSC, Workflow, using .NET in PowerShell & anything centred around the core PowerShell language that can be useful for all skill sets.

Track: Using PowerShell as the Glue of Automation
Focus: All things automation
Suggested topics: Automating any technology from any device installed anywhere – Azure, AWS, GCloud, Office 365, Microsoft Graph, VSTS, GitHub, PowerShell Gallery, SharePoint, Exchange, SQL Server & more.

 

The idea behind the first track, ‘The many components of the PowerShell Language’, is that those new to PowerShell, and even those of us who have been using it for years, can take away a wide variety of knowledge about the core parts of the language from a general-use perspective. It is expected to be the more generalist track: attendees can expand on what they learn here in their own time, and the skills learned can be taken and used across an enormous number of technologies.

 

The idea behind the second track, ‘Using PowerShell as the Glue of Automation’, is to be much more centred on using PowerShell with specific technologies. It is more likely to be the track for those who are well into their DevOps journeys, already using many differing DevOps practices and perhaps looking to further expand their skill set, or who are looking at replacing existing technologies or embedding additional ones within their organisations.

The call for speakers form is located at PSDay Session Submissions. We are looking for 60-minute sessions on the topics listed above. We currently have a cut-off date of July 31st, so I would highly suggest submitting any potential sessions quickly. Your abstract does not need to be perfect, but it does need to give us, as organisers, enough to pick and choose topics from all the submissions. We will come back to chosen speakers, based on topic and technology, to confirm and polish their abstracts in the early weeks of August, prior to publishing the schedule by the beginning of September.

We are also currently working out what sponsorship packages for the event will look like. Until we release a further post detailing these, if your organisation would be interested in sponsoring PSDay, please reach out to me in the interim whilst we iron out the details. We have already been approached by a few sponsors, so this will be coming along very soon.

I am looking forward to PSDay and to seeing you there, whether as an attendee, a speaker or a sponsor.

If you have any questions at all, please reach out and I’d be happy to answer them.

 

 

#PowerShell Side by Side #ProTip

Today I’m going to share a small but simple tip to enable you to do more side-by-side testing of PowerShell v6 with your currently installed version, in a simpler and less error-prone manner.

 

Firstly we will create a new environment variable, which we can do in a number of ways, but I quite like doing it this way as it’s easy enough to script:

Function Update-PS6Path {
    # Find the most recently created version folder under the default
    # PowerShell Core install root
    $PS6LatestPath = Get-ChildItem 'C:\Program Files\PowerShell' -Directory |
                     Sort-Object CreationTime -Descending |
                     Select-Object -First 1 -ExpandProperty FullName

    # Writing a Machine-scoped environment variable requires an elevated session
    [Environment]::SetEnvironmentVariable('PS6', $PS6LatestPath, 'Machine')
}

 

This then means that to launch PowerShell v6 (the latest installed version, anyway) you can run the following in the console. In this case we are passing some of the available arguments to the powershell.exe application, as noted at https://msdn.microsoft.com/en-us/powershell/scripting/core-powershell/console/powershell.exe-command-line-help

& $env:PS6 -NoProfile -NoLogo -NoExit -Command { $PSVersionTable }
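One caveat worth noting: [Environment]::SetEnvironmentVariable with the 'Machine' scope writes to the registry but does not update the current process, so $env:PS6 will be empty until you open a new console. A small sketch of a session-local workaround:

```powershell
# Pull the Machine-scoped value into this session's process environment
# so the launch command works without restarting the console
$env:PS6 = [Environment]::GetEnvironmentVariable('PS6', 'Machine')
```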

So hopefully this little snippet will help you out in doing some more Side by Side testing as time goes on.

1 Small thing about running PowerShell Core and Windows PowerShell side by side on Windows

*Updated August 23rd 2016, as there was a change to PSModulePath between 6.0.0.8 & 6.0.0.9 that I had missed – I will blog about this in more detail in a future post, but for now check the updated section at the bottom of this post!*

 

If you’re like me and want to test out PowerShell Core on your Windows machines as well as other *nix machines, then you may get caught out by this like I did in the upgrade from 6.0.0.8 to 6.0.0.9.

 

You can grab the MSI installer for 6.0.0.9 at https://github.com/PowerShell/PowerShell/releases/tag/v6.0.0-alpha.9. However, do note that there are no Windows 7 or Windows 8 installers, due to the requirement for WMF 4 to be installed prior to WMF 5, as noted in https://github.com/PowerShell/PowerShell/issues/1931, which links to https://github.com/PowerShell/PowerShell/issues/1705
So let’s get into the side-by-side stuff!

 

Once you’ve installed the MSI, you can run PowerShell 6.0.0.x alongside the version already installed on your machine, like so:

 

[Screenshot: PowerShell Core running side by side with Windows PowerShell]

 

This is because PowerShell 6.x installs to C:\Program Files\PowerShell\, and as you can see below, I have both 6.0.0.8 and 6.0.0.9 installed on my machine.

[Screenshot: C:\Program Files\PowerShell containing 6.0.0.8 and 6.0.0.9 folders]

This also means that if we look in the Start menu we can see the following new options:

 

[Screenshot: Start menu entries for PowerShell 6.0.0.8 and 6.0.0.9]

 

*Note* This will not change your default version of PowerShell from the one at C:\Windows\System32\WindowsPowerShell\v1.0\, so if you’re running Windows 10 on the Insider Fast ring like me, that will still run 5.1.1405.1000.

 

To run one of these alpha versions you have to do so explicitly from the Start menu (or a desktop shortcut, if you create one), so you can be sure that this will not cause you any issues with day-to-day PowerShell use.

 

Hopefully that clears up any potential confusion!

 

In 6.0.0.8 the $profile variable referenced the Windows PowerShell Documents location, as can be seen below:

[Screenshot: $profile in 6.0.0.8 pointing at the WindowsPowerShell Documents folder]

 

Whereas in 6.0.0.9 we have a new location as shown below

[Screenshot: $profile in 6.0.0.9 pointing at the new PowerShell profile location]

 

So when we load 6.0.0.9, our profile won’t load, as it doesn’t exist at the new location.

To get our profile to load in 6.0.0.9, we can do what we would normally do and just use New-Item, as I’ve shown below:

[Screenshot: creating the 6.0.0.9 profile file with New-Item]
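For reference, since the original screenshot may not survive, this is a sketch of the idea rather than the exact command I ran – run it from the 6.0.0.9 console so $profile resolves to the new location (-Force also creates any missing parent folders):

```powershell
# Create the profile file at the path this session expects,
# creating any missing parent folders along the way
if (-not (Test-Path -Path $profile)) {
    New-Item -Path $profile -ItemType File -Force | Out-Null
}
```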

 

This seems only to have been needed in 6.0.0.8 and not in 6.0.0.9, as the default values for PSModulePath in 6.0.0.9 are not what we have set in the environment variable. I’m not sure how this works, but I will dig in and post about it at a later date.

 

The next time we load 6.0.0.9 we will have a working profile, but the issue is that this now enables loading of all the modules in our PSModulePath environment variable.

 

However, we can get round this with a few simple lines in our profile:

if ($PSVersionTable.PSEdition -ne 'Desktop') {
    if ($IsWindows -eq $true) {
        # The console window title contains the version (e.g. "..._6.0.0.9"),
        # so use it to point PSModulePath at only the shipped Core modules
        $version = $host.UI.RawUI.WindowTitle.Split('_')[1]
        $env:PSModulePath = "C:\Program Files\PowerShell\$version\Modules"
        Write-Output 'Removed all but the shipped Core modules'
    }
}

This is a Windows-only, forwards-compatible inclusion in your profile, and it will only affect the local session of PowerShell that is running.

So you can be sure this will work across your Windows machines. Ideally, though, we will get some amendments to PSVersionTable, as noted in https://github.com/PowerShell/PowerShell/issues/1997 & https://github.com/PowerShell/PowerShell/issues/1936, to be able to determine the OS more easily and dynamically.

 

The $IsWindows variable is only available in PS Core, along with $IsOSX, $IsLinux & $IsCoreCLR, so you cannot currently use them in the full version of PowerShell. Currently I don’t think you can build the full version of PowerShell from the repository as a 6.x version, though this may change in future.
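A quick way to see which of these automatic variables your current session defines (a sketch; on the full, Desktop edition of PowerShell none of them will exist):

```powershell
# Enumerate the cross-platform automatic variables; Get-Variable with
# -ErrorAction SilentlyContinue returns nothing where they are undefined
foreach ($name in 'IsWindows', 'IsLinux', 'IsOSX', 'IsCoreCLR') {
    $var = Get-Variable -Name $name -ErrorAction SilentlyContinue
    if ($var) { '{0,-9} = {1}' -f $name, $var.Value }
    else      { '{0,-9} is not defined (Desktop edition)' -f $name }
}
```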

So with 6.0.0.9 and above you can actually ignore the above section completely and comment it out of your profile (or delete it).

 

This is also a good example of the rate of change between the different alpha versions. I’ve checked the commit notes and can’t see this change mentioned in a concise and easy-to-understand manner, so I will feed this back to the team to see if release notes can be improved in future.

My Opinion on Open Source PowerShell and what this means to the PowerShell community

If you’ve been under a rock for the last few days (or days/weeks/months, depending on when you’re reading this blog post), then you would have missed that on Thursday, August 18th 2016, Microsoft open-sourced PowerShell!

Not only did they open-source PowerShell, they also released a cross-platform alpha version that can be installed on a variety of Linux distros, as well as a Mac OS X version.

 

You can read about it in more detail from Jeffrey Snover himself over at https://azure.microsoft.com/en-us/blog/powershell-is-open-sourced-and-is-available-on-linux/

You can also read the PowerShell Team’s blog post on this (which has some great links too) at https://blogs.msdn.microsoft.com/powershell/2016/08/18/powershell-on-linux-and-open-source-2/

 

But what does this really mean to you, me, and all the other PowerShellers across the globe?

 

Firstly

 

  • Well done – you picked a technology to learn that celebrates its 10th anniversary later this year (November 14th), that was considered a “Windows only” tool, and now it’s not. This means you could start working with other platforms, increasing your value to your employer *cough, maybe it’s time for that pay rise*
  • PowerShell is likely to start to change even quicker (I’m speculating past the Server 2016 launch here)
  • If you don’t want to get involved in helping build PowerShell v6 (and fixing any bugs you find), then you can let this announcement pass you by a little and await a full release in the future

However if you do

 

  • You need to start learning some more new things, not just PowerShell, as there is now an even bigger ecosystem of tools you need to learn to be really efficient and successful in IT going forward.
  • You need to learn how to work with source control, and I will recommend git like I do to everyone else. Check out https://help.github.com/articles/good-resources-for-learning-git-and-github/ for some resources; a Google search will get you others too.
  • You need to learn how to work with GitHub Issues: how to search them, and how to file an issue with enough information to be useful to the person reading it. Some of this is helped by issue templates, but these don’t always capture all the required information, so be prepared to be asked for more.
  • You need to start attending user groups and conferences and train for your IT future. The world of IT is undergoing massive change; this isn’t going to stop anytime soon, and if anything the rate of change is only going to get quicker.

 

 

So where are we right now?

 

Currently the release of PowerShell on GitHub is an alpha release – this means it is not supported for any production use at all! Basically it’s out there for you to “kick the tyres”, so to speak.

It also means that, at least for now and the near future, you may think you have two places to raise issues: GitHub and UserVoice.

However, the guidance from the PowerShell team at present is this:

Customers and enterprise users should still raise issues on UserVoice, as this remains the place for issues relating to the PowerShell engine as shipped in Windows client and server systems, including WMF releases.

Basically, this means issues relating to PowerShell v5.1 and below should be raised on UserVoice.

My understanding is that this is because those versions of PowerShell haven’t been released to GitHub (what we have are the changes that have occurred since PowerShell 5.1 was rolled up for WMF 5.1), so changes to them can only be made by the PowerShell team. Plus, we need to remember that Server 2016 is yet to RTM, and its source code will have been sealed in preparation for launch. So any fixes to the PowerShell engine included in Server 2016 or the RTM version of WMF 5.1 will come either as hotfixes or as a recommendation to upgrade to a stable version of PowerShell 6 once released; we currently only have alpha releases available on GitHub.

Developers, however, and those who feel comfortable doing so, can raise issues on GitHub.

This is where the current guidance from the PowerShell team could easily cause a little confusion, but we have to remember that this is new ground for the team, so they will need time to sort out how they work with the different streams. It is likely (and I’m just speculating here) that the team has an internal consolidated issue tracker covering UserVoice and all of the PowerShell repos. Be on the lookout for a blog post from the PowerShell team at https://blogs.msdn.microsoft.com/powershell in the next few weeks detailing how they interact with the community across these mediums.

 

So what does the future hold for PowerShell?

 

Over the course of the upcoming months we will see a number of further alpha releases as well as a stronger emphasis on making use of the PowerShell RFC Process for any changes to how the PowerShell Engine works. The PowerShell RFC Process can be found at https://github.com/PowerShell/PowerShell-RFC and there are a few new additions to this already from outside of the PowerShell Team.

 

But the interesting thing from this point on is that more and more of the PowerShell ecosystem will be open sourced, including one module that I’ve been waiting to tear apart – PowerShellGet – which Jason Shirk confirmed is planned in this issue: https://github.com/PowerShell/PowerShell/issues/1979. It is also worth noting that a number of the modules we have in-box on Windows 10 machines are not written by the PowerShell team, so the module, cmdlet or function you have ideas to improve (New-Item is one I’d like to see be a bit more intelligent with folder creation) may not be open sourced. However, I think it is only a matter of time before there is demand for these to be open sourced as well; there are already calls for modules from other teams, including the SQLServer module (formerly SQLPS), which shows where the ecosystem has been going for some time now.

 

Overall I’m incredibly proud to be working with such an amazing product, one that has now opened even more doors than before. You never know what the future holds, but I now have a skill that can be used cross-platform, which means the possibilities in the upcoming months and years of my IT career are even more prosperous than they were last week.

 

If you haven’t yet picked up PowerShell I would seriously urge you to do so!

If you’re struggling with how to pick up this language, want to understand the benefits it can bring, and are interested in learning more, here are a number of resources you may find useful:

https://www.pluralsight.com/search?q=PowerShell&categories=course
https://stackoverflow.com/questions/tagged/powershell
https://powershell.org/
https://aka.ms/psslack
https://aka.ms/psdiscord
http://poshcode.org/
https://powershell.org/summit/
http://www.psconf.eu/
https://psday.uk/
https://www.meetup.com/pro/uk-devopscollective/
https://www.youtube.com/user/powershellorg/videos
https://www.youtube.com/channel/UCxgrI58XiKnDDByjhRJs5fg/videos
https://www.youtube.com/channel/UCplHMQDHTYH3wA7A1YcaVTw/videos
https://www.youtube.com/channel/UCnWPTrwSmO5xvjOiqqLTYpg/videos
https://mva.microsoft.com/training-topics/powershell#!jobf=IT%20Pros&lang=1033
https://channel9.msdn.com/Search?term=PowerShell&lang-en=true
https://www.powershellgallery.com/
https://github.com/PowerShell?utf8=%E2%9C%93&q=DSC&type=&language=

I’m looking forward to seeing how the future pans out with xplat PowerShell – what are you looking forward to the most with this?

Functional / Non-Functional Pester Tests and why I think you really should have a form of both.

So in this blog post I’m going to cover why there is a need to create functional and non-functional Pester tests for your PowerShell modules. But before I get into the nitty-gritty of the whys behind creating both, let me explain the real differences between the two, because it may not be something you have previously thought about or considered in your journey up to this point.

 

Functional

  • Used to test the code’s different use cases
  • Can be either a form of unit or integration test
  • Where we “mock” functionality to confirm it works as expected
  • Determines the level of code coverage that your tests actually hit
  • Makes functionality changes simpler and easier going forward, as long as you keep writing functional tests
  • Should save headaches as code moves between environments as part of a build/release pipeline
  • Provides a mechanism to catch bugs so that they can be fixed
  • Provides a mechanism to highlight where you may be able to make improvements
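To make the “mock” bullet concrete, here’s a hedged sketch of a functional (unit-style) test – the function Get-ServiceState and the test itself are invented for illustration, and the syntax assumes Pester 4 or later:

```powershell
# Hypothetical function under test: wraps a dependency we don't want
# to really invoke during a unit test
function Get-ServiceState {
    param([Parameter(Mandatory)][string]$Name)
    (Get-Service -Name $Name).Status
}

Describe 'Get-ServiceState (functional)' {
    It 'returns the status reported by Get-Service' {
        # Replace the real cmdlet with a fast, deterministic fake
        Mock Get-Service { [pscustomobject]@{ Status = 'Running' } }
        Get-ServiceState -Name 'Spooler' | Should -Be 'Running'
    }
}
```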

 

Non-Functional

  • Can be thought of as a form of “traditional documentation”
  • Aids newcomers to the code base by nudging you towards providing some useful help documentation
  • Can also aid newcomers in learning to understand some of the more advanced functionality
  • Validates the function’s parameter types – i.e. should the parameter be a string for input?
  • Confirms whether a parameter is mandatory or not
  • Gives us a basic form of ParameterSet validation
  • Gives us a basic form of parameter position validation
  • Does the parameter accept pipeline input?
  • Does the parameter accept pipeline input by property name?
  • Does the parameter use advanced validation at all?
  • Does the parameter have at least some help text defined?
  • Does the function have at least a basic level of comment-based help? – let’s leave the pros & cons of that for another topic, shall we.
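As a hedged sketch of what a few of those checks look like in practice (the function Get-Something is invented for illustration; everything is driven from Get-Command metadata, and the syntax assumes Pester 4 or later):

```powershell
# Hypothetical function whose "shape" we want to pin down
function Get-Something {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [ValidateNotNullOrEmpty()]
        [string]$Name
    )
    process { "Got $Name" }
}

Describe 'Get-Something (non-functional)' {
    $param = (Get-Command -Name Get-Something).Parameters['Name']
    $attr  = $param.Attributes |
             Where-Object { $_ -is [System.Management.Automation.ParameterAttribute] }

    It 'Name is a [string]'           { $param.ParameterType | Should -Be ([string]) }
    It 'Name is mandatory'            { $attr.Mandatory | Should -Be $true }
    It 'Name is at position 0'        { $attr.Position | Should -Be 0 }
    It 'Name accepts pipeline input'  { $attr.ValueFromPipeline | Should -Be $true }
    It 'Name has advanced validation' {
        $param.Attributes |
            Where-Object { $_ -is [System.Management.Automation.ValidateNotNullOrEmptyAttribute] } |
            Should -Not -BeNullOrEmpty
    }
}
```

Because the tests only interrogate metadata, they double as a machine-checked description of the function’s public surface.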

 

So, given the additional number of tests we may have to write from looking at the above, why should we spend the time writing them?

This is where the story for non-functional tests becomes a little hazy in some ways, but it really depends on how you’ve ended up with the module in question.

 

These possibilities can include

You’ve inherited or downloaded someone else’s code and you have no clue what it’s doing because it’s:

  • Not well documented with little or no help
  • Difficult to read because of the formatting
  • Uses a number of privately scoped functions
  • All the functions are either in a single ps1 or psm1 file
  • Just needs to be refactored to make it easier to manage, maintain & update going forward

Or it may just be that

  • It almost does what you need but you need to extend the functionality
  • You want to dig a little deeper into how it works
  • You are possibly continuing a discontinued open source project
  • Or you are looking at your own older code and want to give it a much needed update considering you’ve become a more experienced scripter than you were when you originally wrote it

 

If you were to create all the non-functional tests I’ve listed above, you would have a lot of additional tests (and I mean a lot) available to give you more trust in your code whilst you refactor it, or just work out how all the bolts fit together.

However, I will point out that this is really meant to provide you with a single baseline of what is included in the module, not of how the module actually functions – that’s the role of the functional tests.

 

In my next post I will show you how we can automagically create these Non-Functional Tests for each function included in an existing Script Module, including those functions that are defined as private/internal functions to give us a better chance of being able to manage, maintain & update it going forward.

Recap of a Long February, March, April and May – Events Events Events!

I had intended to do a recap-type post at the end of every month; however, I’ve been very busy and haven’t been able to for a number of months – that, and I had an issue with my blog being offline for a few weeks.

Let’s start with a recap of the events I managed to attend; I think you’ll see I did a lot of travelling and attended a number of different user groups.

I attended the following events

  • Get-PSUGUK – Manchester – Feb 1st
  • SharePoint User Group – Manchester – Feb 2nd
  • Azure Security Training Event – London – Feb 3rd
  • SQL User Group – Manchester – Feb 3rd
  • Get-PSUGUK – London – Feb 4th 
  • Mississippi PowerShell User Group – Feb 10th – Online
  • What’s New in Server 2016 – Microsoft Training Event – London – Feb 17th
  • What’s New in Windows 10 – Microsoft Training Event – London – Feb 18th
  • WinOps Meetup – London – Feb 23rd
  • Chef Meetup – London – Feb 24th
  • Cloud Roadshow – London – Feb 29th – Mar 1st
  • Azure User Group – London – Mar 1st
  • Manchester Geek Nights – Agile and Tech in Local Government – Mar 3rd
  • SQL Sat Exeter – Mar 12th
  • Lean Agile Manchester – Mar 16th
  • SQL User Group Manchester – Mar 17th
  • Manchester .Net – .Net Core recap – Mar 22nd
  • SQL User Group Cardiff – March 30th
  • MCR Tech Event Organisers meet – Apr 7th
  • SharePoint User Group – Nottingham – Apr 12th
  • PSConfEU – Hanover, Germany Apr 19th – 22nd
  • Get-PSUGUK Manchester – Apr 25th
  • Get-PSUGUK London – Apr 27th
  • MVP Open Day – Apr 28th – 29th
  • SQLBits Sat – May 7th
  • Get-PSUGUK Manchester – May 23rd
  • WinOps Conf London – May 24th
  • UKITCamp London – May 25th
  • SQL London User Group – May 25th
  • Get-PSUGUK London – May 26th

So between the beginning of February and the end of May I attended 30 different user groups, training days or conferences – and that wasn’t even all the ones I had planned, due to some unfortunate illnesses along the way.

Now, those that know me will know I attend these events because I’m genuinely interested in the topics, or in catching up with the people there – after all, events are all about the community and the networking opportunities they bring us.

In future I intend to post ahead of time where you can catch me in the following months, via the Find Me At page, and then at the end of each month detail more about what I learned at the events.

Before I go into detail on the events and what happened at them just take a moment to look at the types of events that they are and the breadth of technology that they span. This may give you an insight into the differing technologies that excite and interest me going forward. 

To start Get-PSUGUK Manchester on Monday Feb 1st which seems a long time ago but is still an event that I can vaguely remember enough to post about. I presented the initial version of my “Teaching the IT Pro how to Dev” Session where I introduced my ISE_Cew Module to the Audience for helping with getting to grips with using source control with Git and unit testing with Pester. We also had our first community speaker Tim Hynes @railroadmanuk who presented on Automating Infrastructure using PowerShell with various Infrastructure API’s that he’s been working with including VMWare, Cisco & NetAPP devices. You can find his presentation at https://github.com/railroadmanuk/presentations and not long after Tim was awarded VMWare vExpert. I know he’s presented at other events since and I’m looking forward to seeing what the future holds for Tim.

Then on Tuesday Feb 2nd it was the SharePoint User Group in Manchester, which will always be a group close to me, as it was the first user group to give me the opportunity to present – you can read more about that here. This night was about “What you need to know about SharePoint 2016” by Heath Groves @Heath_Groves and Building Enterprise Platforms by Andy Talbot @SharePointAndy – you can find Andy’s slide deck at http://www.sharepointandy.com/?p=550

Heath gave us a rundown of all the things coming in SharePoint 2016 and even prepared some take-me-homes, which included the new and removed PowerShell cmdlets in SharePoint 2016. Andy’s session was a good, thought-provoking one for those that have dealt with SharePoint in the past, and there are some really good points in the slide deck that are applicable to a number of different areas of IT. You can tell the deck was put together with the pains Andy has personally felt working with many different IT departments over the years – a number of which I have felt too, as will a number of you. Even if you’re not a SharePoint person, go and have a look at the deck and see if it resonates with what you feel in your day-to-day IT lives.

Next up, on Wednesday 3rd Feb, was an early morning with a 5:15am train from Manchester to London for an Azure Security morning at Microsoft’s offices in Victoria. This is an area that more people need to put time into, and I’m looking forward to seeing further work here, mainly from Microsoft. That said, Microsoft recently released the Azure security information site at https://azure.microsoft.com/en-us/documentation/security/ – go and have a look, as there is a lot of good information there. The security morning was a good event, although I felt it would have been better as a full-day event, especially as there were a number of issues getting the interactive demos/labs up and running with the Barracuda security devices, mainly due to issues in the scripts that had been provided to set everything up. They should have written Pester tests for those scripts, as I got the impression they had been recently updated for a new release of the Barracuda devices. Some of the attendees managed to get things set up; however, I was unable to, which was not ideal.

I then had to leave London around 14:30 in order to get back to Manchester in time for the SQL Server User Group that evening. Now, everyone that knows me knows my SQL knowledge isn’t close to on par with those that live and breathe SQL every day; however, one thing all platforms require is a data backend of sorts. So I’ve pushed myself to attend more and more SQL events where possible (as you’ll gather from the rest of this post) so that I can learn more about this crucial technology and be able to implement and use it in my own adventures going forward. One of the areas that has piqued my interest is PowerBI, and I was glad to get what was a real beginner’s crash course in PowerBI from what I can only describe as an awesome instructor – Adam Aspin. We also had a session on SQL Server wait stats by Rainer Unwin, which was interesting, although perhaps a bit too technically in-depth for me to fully follow at this stage of my interaction with SQL Server – though I’m sure it’s something I’ll come back to in future.

Then the next day, Thursday Feb 4th, I had to travel back down to London from Manchester for the London PowerShell User Group at Rackspace, just outside Hayes and Harlington, where I also presented my Teaching the IT Pro how to Dev session, with a bit of an update from the Manchester session. We also had Rudolf Vesely @RudolfVesely from Rackspace give an introduction to Pester, which was a great session for the audience – Rudolf will be presenting to the London group again in future with a more in-depth session on Pester, so look out for this.

On Feb 10th I was lucky to present to the virtual Mississippi PowerShell User Group, where I presented the Teaching the IT Pro how to Dev session – this was recorded, and I’ve blogged about it in a bit more detail here.

I then attended the UKITCamps in London on Feb 17th & 18th, covering What’s New in Server 2016 and What’s New in Windows 10. Although these are camps I’ve previously attended, there are a number of labs that are good to have the chance to run over and replay. I also enjoy the UKITCamps because they are Microsoft-delivered training days, meaning there are a number of people there to network with, along with the chance to catch up with the guys running them, namely Ed Baker, Marcus Robinson and Andrew Fryer. I was also very lucky to get the chance to head out for a meal with Ed, Marcus and the other members of the DX team who work behind the scenes to put on these events. I for one look forward to these events being put on by the DX team, and I now know how difficult it is to arrange events like these – and that’s before you include preparing the slide decks and labs used in them. Hopefully we will see more of these events; however, there aren’t any currently planned, so we will have to wait and see.

I then had just under a week until my next event, which was decided last minute: presenting my Teaching the IT Pro how to Dev session to the WinOps group in London on Feb 23rd. It went well, except that I suffered a failed Micro HDMI to HDMI adaptor, so I had to try to move my demo and deck to Stephen Thair of DevOpsGuys’ laptop, and, as per the standard developer line ‘Well, it worked on my machine’, I was unable to show the demos working. This has led me to build a VM in Azure and a second Hyper-V VM for any demos I want to run in future, to ensure they work. I’m also planning to get a dedicated presentation-only device, which I’ll wipe between events to ensure all runs as expected, along with a few backup cables and adaptors to have with me.

The next night I attended the Chef Meetup, where I was introduced to GoCD, Terraform & Kubernetes – all look like interesting technologies, but I need a reason to get in deep with any of them, so look out for me possibly blogging on these in future.

I then attended the London leg of the Microsoft Cloud Roadshow on Feb 29th & March 1st, where there were a number of different sessions throughout the event, with tracks covering most of Microsoft’s technologies, many of them focused on the SharePoint/Office 365 and Azure ecosystems. The highlight of the event was being able to go and have a few drinks with Joey Aiello, one of the PowerShell PM team, who was over from the US for the Cloud Roadshow. It was good to have a face to face chat, and I’m sure there will be more chances in future, including the MVP Summit. Joey is younger than I am and is rocking a very good role at Microsoft – imagine being part of the PowerShell Team. That is a number of people’s dream job, and I would be lying if I said I wouldn’t find it amazing to spend my day working even more with PowerShell than I already do. As an MVP I do get that luxury already, although it would be a very different role to the one I’m doing. Who knows what the future holds, but I know that for me it will likely involve PowerShell for a number of years, if not decades, to come.

I also dragged a few people to the London Azure User Group on the evening of March 1st, where we were introduced to Boris Devouge, Director of Open Source Strategy at Microsoft. I can only describe him as a ‘Fluently Funny Frenchman’, which makes his presentations engaging. As this was on the new Azure Container Service (it’s an Azure User Group after all), it was interesting to hear of the partnerships Microsoft have recently been making in this area, with the push to make Azure the most open source friendly cloud. The Azure Container Service was in public preview (I think) at the time of the presentation, however it has since been made Generally Available, and you can learn more about ACS in this post on the Azure Blog: https://azure.microsoft.com/en-us/blog/azure-container-service-is-now-generally-available/

I next attended a talk in Manchester on March 3rd at Manchester Geek Nights on Agile and Tech in Local Government, delivered by Stockport Council, where I was lucky to bump into my good friend Ethar, who always has a good story to tell. I must catch up with him again when I’m next in Manchester and not just there on a flitting visit. The talk left me realising why our governments, local & national, get a lot of stick for being poor at delivery and execution of their IT projects (& projects in general): there is so much fragmentation in the IT systems used across the differing councils, due to them all having separate and diminishing IT budgets for any projects. I personally think that centralisation of all UK council & local government IT into a single pool would work much better for the public, and my reasons are pretty simple: enhanced governance, lower boundaries to sharing data between the departments that need to share it nationally (think social care, housing etc.) and generally a simpler to manage infrastructure and workforce. Though perhaps I’m biased, being from a Microsoft background, which means I can see opportunities to scale similar services nationally, which would be massively more cost efficient. Almost all the banks have done this and realised the benefits, and to me it makes sense for the public services sector to do the same too! It was however interesting to hear how Stockport Council are embracing open source technologies and essentially building out their own products, which they are in turn open sourcing for other councils to take advantage of. It’s an interesting journey for them to take, and I hope the effort doesn’t end up being completely canned in a few years’ time if a nationalisation of council IT services were to occur.
In my opinion it is a logical step for this country to take, though I’m not sure politicians and logic can go together. We will have to wait and see.

 

SQL Sat Exeter – March 12th. Well, I’m not really sure I need to say any more than that. However it was a great event and my first doing a back to back, demo-heavy session on PowerShell DSC. Even more scary, it was DSC for SQL Server. I hadn’t realised how much of a headache the SQL Server DSC resources were until I spent the majority of the week leading up to it getting annoyed with little things like hardcoded values for where the resource expected the install media to be. I got that frustrated with it that I began to rewrite the resources so that they would work how I expected, which meant I spent more time writing DSC resources from scratch than actually doing anything useful. Especially as a week or two after SQL Sat Exeter I wiped the drive with the resources on it. Yes, they were in source control, but only on that machine – lesson learned – DOH!!!

SQL Sat Exeter was my first real foray into SQL Community events beyond user groups, and after the fun I had at Exeter I can see why they call themselves SQLFamily. In the lead up to my sessions there was a run around to get some bacon sandwiches and a fair amount of drama with my demos having decided to kill themselves that morning. However, I managed to get them working before my session, and some good reviews came from it. I know where I need to improve the content, and I’m looking forward to SQL Sat Paris in a few weeks, where I will need to cram all of the information from 2 hours into 45 minutes. #ChallengeAccepted

It was also on the Saturday night, at the after-event curry and the drinks that followed, that the discussion about SQL Sat Manchester having a PowerShell track came to fruition. I was lucky enough to have ended up out with Chris Testa-O’Neill and the other organisers at SQL Sat Manchester the year before (my first SQL Sat event, which I went to as an attendee), so it all felt natural to be there along with a number of other familiar faces like Rob Sewell and Steff & Oz Locke. It’s like a reunion, and I’m looking forward to what will be a kick ass SQL Sat Manchester this year. The PowerShell track shaped up nicely. One thing I’ve learnt about the SQL Community is that it really does kick ass – but then again, all the IT communities I’m a part of do. Our passion brings us all together, and with it we make sure to have a bloody good time when we get together. Else why bother?

On the Sunday morning I had an interesting email come in as I sat having breakfast, which led me to question it a little with Chris & Alex Whittles, and well, history has been written since that morning. I also got chance to help Rob out with a DSC issue he was having and gave him the guidance he needed to resolve it in the right way as things currently stand. In future we will have a feature complete PowerShell DSC resource for SQL Server – though this will require some community help, and you can help out by voting on / adding items to the Trello board at http://sqlps.io/vote

Next up on my events (and half way through the 30 events I’d attended) was LeanAgile Manchester on March 16th – a firm favourite of mine as it’s a great community (like they all are) – where we were treated to a talk by Jon Terry (but not that Jon Terry!) from LeanKit about how they deal with working in a Lean/Agile way with their FSGD (Frequent, Small, Good, Decoupled – said ‘FizzGood’) approach. It’s another example of where the software/manufacturing world brings good things to the rest of IT, and generally other areas too, and I would highly recommend that you go and read their blog on FizzGood at http://leankit.com/blog/2015/07/does-this-fizz-good/ and take away from it what you can.

Next up on my user groups was the Manchester SQL User Group, where we walked through Cortana Analytics – something I was looking forward to, as at SQL Sat Exeter Chris Testa-O’Neill & Cortana essentially got a divorce whilst he was prepping in the speaker room. I’m sure with a decent set of data I’ll be able to find a good use case for Cortana Analytics, and I have some ideas in the pipeline, so keep an eye out for future posts on this.

As a non-dev admin who realised that I am really a dev but just wasn’t ready to admit it to myself, I find the .NET User Group in Manchester a useful group to attend, especially when the topic is .NET Core, which it was on March 22nd. Even more so, as with .NET Core there is a real possibility that the PowerShell engine will eventually be open sourced, especially as we are seeing a refactor of the existing cmdlets to be able to run on Nano Server, with more coming in each new TP and more to come for Server 2016 GA. We were treated to a history lesson on .NET Core by Matt Ellis @citizenmatt, with the slide deck at http://www.slideshare.net/citizenmatt/net-core-blimey-windows-platform-user-group-manchester – again, well worth the read.

Next up, just after I had moved from Manchester to Derby and still had the hire car, I had an itching to go and see some of my SQL friends in Cardiff – especially as it was an epic event: Return of the Beards! This meant that not only did I get chance to catch up with Steff Locke again, but also with Rob (again – it seems like that guy gets everywhere), another of my SQL friends, Tobiasz Koprowski, and lastly the other bearded SQL guy of the night, Terry McCann. I got to learn a bit more about T-SQL from Terry and securing SQL in Azure from Tobiasz, but also to see Rob’s session on the pains of context switching and how PowerShell & Power BI help him not get mithered for information that can easily be made available and searchable with a little effort. This is for me a great example of a real world use of PowerShell and Power BI together, and it is well worth watching Rob deliver this if you get the chance.

I then attended my first Tech Organisers Meetup in Manchester on April 7th – it was good to meet the other tech user group organisers in the Manchester/NW area and have the discussions that were needed as a collective to help strengthen the view that Manchester is a blossoming tech hub in its own right – something that Londoners seem to miss. Manchester is ace because it’s cheaper than London, is actually more lively at night than London (I’ve found), and you can literally walk from one end of the main city centre to the other in about 20 minutes or so, and within that you have the Northern Quarter. So you are pretty much sorted!

Next up was another event I presented at – the SharePoint User Group in Nottingham on April 12th. I presented on PowerShell DSC for SharePoint as I did at the SharePoint User Group in Leeds in January, but this was a special one for me, as it was the first user group I presented to after being awarded MVP. Being awarded on April Fools’ Day led me to post Congratulations 2016 Microsoft MVP at 15:31, about 10 minutes after getting the email, and then Fooled Ya – Today I became a MVP at 15:55. I also blogged Awarded the MVP Award – What it means to me and the future for the Community. We also had a talk from Garry Trinder @garrytrinder on Require.JS, which can be used in conjunction with MDS (Minimal Download Strategy) in SharePoint 2013 and Online sites to help bundle up and control your page load and transition times. JavaScript is one of those dark arts that I’ve not had much need to work with, but I would certainly look to use Require.JS in any of my future web projects.

My next event was PSConfEU, and this was the event I had been looking forward to because of the sheer work that went into it by all involved, including Tobias Weltner and myself, to make it a success. Due to the size of this event I will put together another post in the coming days that really captures what an amazing event it was, as I don’t think a few sentences will do it any real justice. Plus I want to relive the experience in as much detail as I can so that I can share it with you as well – so that if you weren’t able to make it, then hopefully you’ll do what you can to make PSConfEU 2017. Planning for PSConfEU 2017 will most likely begin in early August, so there will be small announcements some point after then, though it’s still all to be determined.

From the spill over from PSConfEU I managed to bribe June Blender to agree to come and present at the Manchester & London PowerShell User Groups – though to be honest there wasn’t much bribing involved, as June had wanted to come to Manchester anyway and timing wise it just worked out great. June gave her Thinking in Events hands on lab at both groups, both had some great questions, and I’ve had some fantastic feedback from the sessions, which has led me to start preparing my own hands on events for the future. These are “in the works” so to speak, and details will start to appear in the next few months.

Next up was my first MVP event, where we went to Bletchley Park – a fantastic historical site that I’m planning to head back to in future. The event was good for me as it allowed me to meet up with other UK MVPs, including fellow PowerShell MVP Jonathan Noble. There is a good story behind how we ended up meeting on the train up from London to Bletchley Park, and it starts with me forgetting to charge my laptop and phone the night before. When I got to Euston I was frantically trying to make sure that I got on the right train to Bletchley. I had messaged Jonathan whilst on my way and found out that we were catching the same train. However, phone signal is pretty poor when you are travelling out of London, and just before my phone died I managed to send him a message letting him know I was about half way up the train. About 20 minutes passed, and then all of a sudden this guy two rows in front of me got up, came over and said “Hello – it’s Ryan isn’t it? I’m Jonathan, only just got your message”, and from that moment we just continued chatting. When we got to Bletchley, Jonathan was able to lend me a power bank to charge my phone – not that I really needed it, but having charge on your phone is a comfort thing these days, isn’t it? We had an afternoon of talks and then a really nice drinks and dinner, where I got chance to meet some more of the MVPs. The next day we had some presentations in the morning, and in the afternoon we made rocket cars. It was great fun to do something less techy but still something that most enjoyed. I was lucky to get a lift with Alex Whittles and Steff Locke from Bletchley to Birmingham New Street Station, which allowed for a number of good conversations about SQLBits & SQLRelay – both events I may get more involved in, in future, if I can manage to stretch that far.
Once Alex dropped me and Steff off, we worked out that we either had half an hour to grab something quick to eat before running for our respective trains, or we could get something decent to eat and a drink afterwards before catching the train after that. Naturally, decent food and drink was always going to be the winner.

 

Nearly finished with the recap, with just 6 events left to cover – so if you’ve read this far, well done, you can make it to the end!

 

I then attended the SQLBits Saturday event on May 7th in Liverpool, and although I got there not long before lunch I was still able to get to the sessions I wanted – mainly the SQL Tools session, as SSMS has now been decoupled from the SQL Server install, which is 100% the right thing to have done. Like other SQL events, I bumped into Alex, Steff, Rob (he is literally everywhere), Tobiasz & a number of other SQL people, including Mark Broadbent, Niko Neugebauer, André Kamman, John Martin, Mladin Prajdic & Neil Hambley, to name just a few. As with all these events, once the curtains have closed, that is when the food and drinks appear, and I’ve realised I have a soft spot which stops me saying no to going for a curry & drinks with all these amazing people. At future events I’ll be planning to stick around for the almost guaranteed after-curry and the ensuing drinks and conversations.

I then had the amazing opportunity to meet and spend a few hours with Ed & Teresa Wilson – The Scripting Guy & Scripting Wife – where I took them for a wander down to the University of Manchester campus and to KRO, a nice Danish place, for some food, right round the corner from where I used to work when I was at UoM. We then strolled leisurely around the campus on the way back towards the venue for the user group, where Ed talked us through OMS & Azure Automation DSC, now that he is a part of the OMS team at Microsoft. Because we had to get a train to London at 21:15, the user group was an hour shorter than it normally would be, so we didn’t have time for pizza and the usual after-drinks, but the turnout was still one of the best we’ve had, and there are more events like it planned, with an aim to hold the next Manchester user group in July.

As I mentioned, Ed, Teresa and I all had a train to catch to London for WinOps and, much like PSConfEU, I am planning to blog about that event separately to really capture its spirit. Look out for that post in the next week or two.

 

We then had the UKITCamp, where Marcus Robinson & Ed went over the feature sets of Azure & OMS. I unfortunately missed the morning of this event due to being called onto a customer production issue conference call – 3 hours of my morning I couldn’t get back, but sometimes that is how these things go. As I was leaving the venue I found out that the London SQL User Group was on that evening, and I decided to stick around for it, as the topic was “Common SQL Server Mistakes and How to Avoid Them” – the kind of SQL topic I enjoy because it isn’t deeply technical but lets me understand the product that little bit better than I did beforehand.

Lastly, the London PowerShell User Group, where we had Ed again and the highest turnout so far. Ed was again talking about OMS & Azure Automation DSC, but there were also a number of opportunities for open questions from the audience, which is always an added bonus of having more & more people turn up to the group. We overran a little with the conversations that were flowing, mainly due to having an excess of beer and pizza – something that hasn’t happened before at the user groups. Then, as per usual with the user groups, we ended up finding somewhere else to go for another drink or two and continued the conversations.

 

So that’s most of my last 3 months summarised – what have you done in the last 3 months?

Future posts like this will be much shorter, contain some pictures and be completed on a monthly basis.

Thanks for reading – Hope you have a great day!

Building A Lab using Hyper-V and Lability – The End to End Example

Warning – this post is over 3800 words long and perhaps should have been split into a series – however I felt it best to keep it together. Make sure you have a brew (or 2) to keep you going throughout reading this.

In this post we will be looking at how you can build a VM lab environment from pretty much scratch. This may be for testing SharePoint applications, SQL Server or Exchange, or for additional peace of mind when deploying troublesome patches.

Our requirements for this include:

  • A machine capable of running Client Hyper-V – it needs SLAT support (most machines released in the last 3 years have this)
  • Windows 8.1 / 10 / Server 2012R2 / Server 2016 TP* – in this post I will be using Windows 10 build 14925 – ISO download is available from here
  • If using Windows 8.1 then you will need to install PowerShell PackageManagement – you can use the script in my previous post to do this as detailed here
  • A secondary/external hard drive or shared drive – this is to store all Lability files including ISOs, hotfixes & VHDX files

Where do we begin?

Obviously you need to install your chosen version of Windows as detailed above, and once you have done this you can crack on!

Time Taken – ??? Minutes

However, as mentioned, I’m going to use Windows 10 – this is just personal preference and for my ease of use.

As you hopefully know by now, Windows 10 comes with WMF5 and therefore has PackageManagement installed by default. We will use this to grab any PowerShell modules we need from the Gallery. I personally have a machine setup script that lives in my OneDrive, as you can see below. As this is a Windows 10 machine I am logging into it with my Hotmail credentials – this means that I can straight away pick the folders I want to sync to this machine (joys of the integrated ecosystem).

This takes about 5 minutes for OneDrive to finish syncing and then we are ready to go onto the next step.

Time Taken – 5 Minutes

[Screenshot: machine setup script syncing via OneDrive]

In this stage I will open ISE with administrator privileges – this is required as I need to change the execution policy from Restricted to RemoteSigned, as well as run other scripts that require elevation.

Once I have done this I can move onto the next step. This includes setting up my PowerShell Profile and Environment Variables and then setting up all the required functionality for me to continue working on this new machine.

This includes setting up the ability to install programs via Chocolatey, like VS Code & Git, and installing modules from the PowerShell Gallery – a few examples being ISE_Cew, ISESteroids and, importantly for this post, Lability. It is also worthwhile to note that at this point I am not downloading any DSC resources as part of my setup script – this is because we will cover that later on as part of the workings of Lability.
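As a rough sketch of the setup steps above – assuming Chocolatey is already installed, and noting that the package ids and exact module list are my assumptions rather than the author’s actual script – it might look like:

```powershell
# Hedged sketch of the machine setup described above; run from an elevated session.
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine -Force

# Install tooling via Chocolatey (package ids assumed)
choco install vscode git -y

# Install the PowerShell Gallery modules mentioned in the post
Install-Module -Name ISE_Cew, ISESteroids, Lability -Force
```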

As an additional note, it is worth mentioning that the version of Lability at the time of writing is 0.9.8 – however this is likely to change in future with more features being added as required. If you have a thought, suggestion (or issue) then head over to the GitHub repo and add it there.

In this script I am also enabling the Hyper-V Windows feature to let me carry on with this lab, and I then initiate a system shutdown. Overall this whole section takes maybe about 10 minutes to complete, & yes, I intend to build this as a DSC resource in the near future. It is worthwhile to note, though, that Lability has a function that will ensure the Hyper-V feature is enabled & that you are not awaiting a system reboot – more on this a little later on.
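The feature-enable and reboot step, sketched using the DISM cmdlets (this is a standard approach rather than the author’s exact script):

```powershell
# Enable Hyper-V and reboot to finish the install; requires an elevated session.
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All -NoRestart
Restart-Computer -Force
```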

Time Taken – 15 minutes

Once the reboot has completed we can get on with the Lability bits – the really interesting part of this post.

Lability Functions

Lability has 38 public functions and 6 Aliases as can be seen below.

[Screenshots: Lability’s exported functions and aliases]
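If you want to check these counts for yourself, something like the below works (the numbers will vary with the module version):

```powershell
# List what the Lability module exports
Get-Command -Module Lability -CommandType Function | Measure-Object
Get-Command -Module Lability -CommandType Alias | Measure-Object
```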

I wouldn’t worry too much about the aliases, as these are built in for continued support from prior versions of the Lability module and will likely be removed in the 1.0 release.

We will be using a number of these functions throughout, and as always it is best practice to read the help for them – and yes, they do include some great comment based help.

There are a number of additional private functions in the Lability module that have comment based help too, but again I wouldn’t worry about these too much unless you need to do a lot of debugging or want to help add to the module.

The key Lability functions that you will need, likely in the order below, are:

  • Get-LabHostDefault
  • Set-LabHostDefault
  • Reset-LabHostDefault
  • Get-LabVMDefault
  • Set-LabVMDefault
  • Reset-LabVMDefault
  • Start-LabHostConfiguration
  • Get-LabHostConfiguration
  • Test-LabHostConfiguration
  • Invoke-LabResourceDownload
  • Start-LabConfiguration
  • Start-Lab
  • Stop-Lab
  • Get-LabVM
  • Remove-LabConfiguration
  • Test-LabConfiguration
  • Import-LabHostConfiguration
  • Export-LabHostConfiguration

These are just a few of the functions available in Lability, and we will cover most of them in greater detail as we head through this article.

Lability Media Files

Lability has a number of different configuration files, all in JSON format: HostDefaults, VMDefaults & Media. All of these files are in the Config folder of the Lability module, which on your new machine will be C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config.
The HostDefaults file contains all the settings that we associate with the Lability host machine. These include the paths where we will be looking for any ISOs, VHDX files, hotfixes and any additionally required resource files for our lab.
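To give a feel for its shape, a HostDefaults file looks roughly like this – the property names and paths here are illustrative and may not match your Lability version exactly:

```json
{
    "IsoPath": "C:\\Lability\\ISOs",
    "ParentVhdPath": "C:\\Lability\\MasterVirtualHardDisks",
    "DifferencingVhdPath": "C:\\Lability\\VMVirtualHardDisks",
    "HotfixPath": "C:\\Lability\\Hotfixes",
    "ResourcePath": "C:\\Lability\\Resources",
    "ConfigurationPath": "C:\\Lability\\Configurations"
}
```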

The VMDefaults file contains all the default settings that we associate with the created VMs. This includes the media used to create the machine, the startup RAM, the number of processors and which virtual switch we can expect the VMs to use. This can be useful to have, just like HostDefaults, but as we will see later on in this post we are most likely to override this in our configurations.

The Media file contains the settings for any media that we want to use. As Lability by its nature was built for building labs, it uses the evaluation licensed media for the VMs.

The benefit of this is that the items already in this file allow you to get on with building labs almost straight away on a brand new machine.

This file has some included hotfix download links for getting the DSC updates on WMF4 for Server 2012R2 & Windows 8.1 – but don’t worry, Lability uses these to download the hotfixes and embed them into the VHD files for you. One less job to worry about!

LabHost Defaults

Firstly we need to get the LabHost defaults set up correctly for our environment – this is important and also great for being able to move labs between machines if required (I’ve had to do this a fair amount myself), and is why I recommend that all the core Lability bits are installed on a separate drive.

Personally I’m using an External Hard Drive but that is because my Lab is portable. I have not tried this with a Shared Drive however there shouldn’t be much that needs to change to get it working that way.

On my external drive I have the following setup: a folder called Lability, containing all the folders required by Lability as detailed in LabHost defaults, as we will see below. I also have another folder, Lability-Dev, which came from the zip of the repository that you can download from GitHub, from before Lability was made available on the PowerShell Gallery. In essence this means I have a copy of Lability that I can edit as required – especially the 3 Lability configuration files detailed in the previous section – which also allows me to do additional debugging as required.

Firstly we will run Get-LabHostDefault, and this should return the below by default – this is because the file HostDefault.json is stored in the C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config location (remember, 0.9.8 is the current version – yours may vary).

[Screenshot: default Get-LabHostDefault output]

As this is the default, and I’ve been using Lability on a few different machines, I have a copy of it on my external HDD in the Lability folder. Let’s see what that file says it should be.

[Screenshot: HostDefaults file from the external drive, still pointing at the D: drive]

Well – that’s not good! As you can see, on my last machine the external drive had been the D drive, but on this machine it’s the E drive. A simple (yet annoying) thing that we can easily change. Now, this could be done manually, but I decided that I wanted to wrap it all together so that I don’t have to think about it again. It is simple enough, so I just wrapped it in a very simple function, as seen below.

[Screenshot: simple wrapper function to update the drive letter]

This allows me to update it as I move between machines quite easily. It isn’t an ideal scenario, but it works at least.
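A rough reconstruction of what such a wrapper might look like – the function name, its parameter and the Set-LabHostDefault parameter names are my assumptions, not the author’s exact code:

```powershell
# Hypothetical sketch: re-point the Lability host paths at a new drive letter.
function Set-LabilityDriveLetter {
    param(
        [Parameter(Mandatory)]
        [ValidatePattern('^[A-Za-z]$')]
        [string] $DriveLetter
    )
    $defaults = Get-LabHostDefault
    # Swap the drive letter at the start of each stored path
    Set-LabHostDefault `
        -IsoPath      ($defaults.IsoPath      -replace '^[A-Za-z]:', "${DriveLetter}:") `
        -HotfixPath   ($defaults.HotfixPath   -replace '^[A-Za-z]:', "${DriveLetter}:") `
        -ResourcePath ($defaults.ResourcePath -replace '^[A-Za-z]:', "${DriveLetter}:")
}
```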

The benefit of this is that it will update the HostDefaults file on both my C: drive and the external drive at the same time, which makes the setup even more portable.

We can then run the function Reset-LabHostDefault and we should get something similar to the below

[Screenshot: Reset-LabHostDefault output]

We can also do the same thing for the VMDefaults file, however I find this is less likely to be a requirement, as we can override the defaults in the configuration data files that we will work with – this is my preferred method.

Once we have done this we are ready to run Start-LabHostConfiguration – on a new machine this will go and create the required directories as specified in the HostDefaults.json file that I have shown you how to amend. The output from Start-LabHostConfiguration is below.

[Screenshot: Start-LabHostConfiguration output]

We would then use Test-LabHostConfiguration to confirm that this is all correct, and we can see that this is the case below.

[Screenshot: Test-LabHostConfiguration output]
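Put together, the host setup described above is just these two calls, run from an elevated session:

```powershell
# Create the Lability directory structure defined in HostDefaults.json…
Start-LabHostConfiguration
# …then verify the host is correctly configured; returns True when it is.
Test-LabHostConfiguration
```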

Building your First Actual Lab

Wow, that was a fair bit of setup required, though a lot of it can be completely ignored depending on your own setup, or if you’re re-visiting this post.

Now we move onto the real meaty part of the post, and I’m going to use 2 examples for this – the bundled TestLabGuide and one of my own for a SQL Server install.

So, starting with the TestLabGuide.ps1 file, there is only 1 small modification that I have made, and this is at the end of the file: to include the following 2 lines.

[Screenshot: the two lines appended to TestLabGuide.ps1]
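The two appended lines would follow the usual pattern for compiling a DSC configuration when the file is run as a script – the paths here are illustrative, not the author’s exact lines:

```powershell
# Compile the TestLabGuide configuration to mof files using its matching
# configuration data file (paths are examples).
$ConfigurationData = '.\TestLabGuide.psd1'
TestLabGuide -ConfigurationData $ConfigurationData -OutputPath 'C:\Lability\Configurations'
```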

This allows me to build the configuration for these VMs as if it were a script, and this is how I am personally doing it.

However, on a machine with no DSC resources we have an issue if we are building VMs that are dependent on those DSC resources.

Well, within Lability there is a function called Invoke-LabResourceDownload, and this has the ability to download all the required resources as defined in our configuration data file.

Within the configuration data file shown below, the key section for us to look at at this point is NonNodeData, where we have a subsection for Lability configuration items; this can include EnvironmentPrefix, Media, Network, Resources & most importantly for us, DSC resources.

So far I have found that we only need to run this for pulling the DSC resources defined in our configuration data file, as shown below – this is because we require them to be on the machine before we can build the mof files.

[Screenshot: DSCResource section of the configuration data file]
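The shape of that section is roughly as follows – the resource names and versions are examples, not the exact TestLabGuide.psd1 contents:

```powershell
@{
    AllNodes    = @(
        @{ NodeName = '*'; Lability_SwitchName = 'Internal' }
    )
    NonNodeData = @{
        Lability = @{
            EnvironmentPrefix = 'TLG-'
            DSCResource = @(
                @{ Name = 'xActiveDirectory'; RequiredVersion = '2.9.0.0' }
                @{ Name = 'xDhcpServer';      RequiredVersion = '1.3.0.0' }
            )
        }
    }
}
```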

I found it best to have the DSC resources pinned as RequiredVersion and not MinimumVersion, as it is by default in the TestLabGuide.psd1 file. This is a preference, but with the amount of changes happening to the DSC resources it’s worthwhile being extra cautious here.

The output from Invoke-LabResourceDownload can be seen below, and as we can see it has downloaded only the DSC resources that we specified in the configuration data file (TestLabGuide.psd1).

[Screenshot: Invoke-LabResourceDownload output]
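The call itself is a one-liner – the -DSCResources switch shown here is my assumption of the parameter used to limit the download to DSC resources only:

```powershell
# Download only the DSC resources listed in the configuration data file.
Invoke-LabResourceDownload -ConfigurationData .\TestLabGuide.psd1 -DSCResources
```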
This also means that on a clean machine you can be sure you have the right required versions – especially useful when building labs, in my opinion.

However if you have multiple Labs running concurrently then the next bit may be an unfortunate blow to you.

Within the Configuration keyword we have a dynamic keyword defined – Import-DSCResource – which you may have thought was a function.

With it being a dynamic keyword it works a little differently to a normal function/cmdlet, and therefore we are limited in what we can do with it – for example we cannot use splatting with it, and we also cannot pass the required DSC resource modules to it from outside the current file. This is required for the syntax highlighting that we get as part of the parser. If you want to learn more about the Import-DSCResource dynamic keyword then read this article by the PowerShell Team – be aware it is from 2014 and there hasn’t really been any better content come out on this since (that I can find, anyway).

My thoughts on this are that we should be able to pass the required DSC resources through from the configuration data file, as detailed above – however this isn't currently possible. To me it would be beneficial (and logical) to be able to abstract this away from the configuration, as it is really part of the configuration data – especially since we already have to pass configuration data to our compiled configuration, in this case TestLabGuide. However, this is where we are, and for now we will need to mirror the DSC resources between the configuration itself and the configuration data file.

That aside, let's look at the node data, and especially the AllNodes section, which is where NodeName = '*' lives.

Lability16

As we can see, in here we have a few settings that all the nodes in this configuration will share. These include the items we had available in the VMDefaults file, as well as some other items that we want shared between the VMs, like DomainName.
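A sketch of what that AllNodes block tends to look like – the property values here are illustrative rather than lifted from TestLabGuide.psd1:

```powershell
AllNodes = @(
    @{
        NodeName                    = '*'            # settings shared by every node
        Lability_SwitchName         = 'Corpnet'      # same keys we saw in VMDefaults
        Lability_ProcessorCount     = 1
        Lability_StartupMemory      = 2GB
        DomainName                  = 'corp.contoso.com'
        PSDscAllowPlainTextPassword = $true
    }
    @{ NodeName = 'DC1' }   # per-node entries override/extend the '*' defaults
)
```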

Further down we can see that for the client VMs in this lab we are specifying a different Lability_Media value for each – so it looks like we will have both a Windows 8.1 and a Windows 10 client machine in this lab.

Lability17
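Per-node media selection is just another key on the node entry. The media IDs below are my best recollection of the built-in evaluation media names, so treat them as examples and check Get-LabMedia for the exact IDs available:

```powershell
# Two client nodes, each built from different Lability media
@{ NodeName = 'CLIENT1'; Lability_Media = 'WIN81_x64_Enterprise_EN_Eval' }
@{ NodeName = 'CLIENT2'; Lability_Media = 'WIN10_x64_Enterprise_EN_Eval' }
```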

That’s enough about the configuration and configuration data side of things – lets go and build our Lab.

At this point what we want to do is just do the below.

Lability12

At this point you will be prompted for an Administrator password and, once that has been given, as we can see above it will go and create all the mof files that we need for this lab. The next step is to kick off the actual build of the lab, which can be done as shown below.
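In script form, the mof compilation step looks roughly like this – the paths are illustrative, and I'm assuming the configuration exposes a -Credential parameter for the Administrator password prompt:

```powershell
# Dot-source the configuration so the TestLabGuide keyword is available,
# then compile the mof/meta.mof files for every node.
. .\TestLabGuide.ps1
$cred = Get-Credential -UserName 'Administrator' -Message 'Lab Administrator password'
TestLabGuide -ConfigurationData .\TestLabGuide.psd1 -Credential $cred `
             -OutputPath C:\Lability\Configurations
```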

Lability13

The function Start-LabConfiguration is the key function of this module, as it will go and:

  • check that the Lability host is correctly set up – by calling Test-LabHostConfiguration – and if not, throw an error (a possible area for improvement)
  • download any ISOs that are required, as expressed in the configuration data, if the image matches one listed in the Media file, verifying each against the Checksum value given in the Media file for the image
  • download any hotfixes that are detailed in the Hotfix section of the matched media in the Media.json file
  • build a master VHDX file from the ISO and hotfixes for each media type used by the lab VMs – it is worth pointing out that this is built from lots of smaller functions that are essentially based on the Convert-WindowsImage script
  • build a lab-specific VHDX file for each VM – currently set up as a 127GB dynamic differencing disk
  • build and inject a VM-specific unattend.xml file into each lab VM VHDX
  • inject all required certificates into each lab VM VHDX
  • download and inject any resources defined in the Lability section of NonNodeData in the configuration data file – I will show more on this in the SQL example later on; these are injected into the lab-specific VHDX file
  • inject all required DSC resources into the resulting lab VM-specific VHDX file
  • inject the mof and meta.mof files for each lab VM into the corresponding VHDX file
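All of that comes from a single call – a sketch, assuming the mofs were output to an illustrative path and that the -Path parameter points Start-LabConfiguration at them:

```powershell
# Kick off the whole host-side build: media download, master VHDX creation,
# per-VM differencing disks, and all the injection steps listed above.
Start-LabConfiguration -ConfigurationData .\TestLabGuide.psd1 `
                       -Path C:\Lability\Configurations -Verbose
```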

Seriously though – wow – that one function is doing a lot of what I would call tedious work for us, and depending on your internet connection speed it can take anywhere from maybe 30 minutes to a day to complete. The first time I ran it I think it took about 7 hours due to a slow internet connection – although I was also watching Netflix at the time.

You can see the final output from this Function below

Lability20

Note – if you have your own media you could always create new entries in Media.json for it to save the download time – especially if you have an MSDN licence.

Now this is where the fun bit really starts and it also involves more waiting but hopefully not as long as the last bit took you.

All we need to do at this point is run Start-Lab as shown below and let DSC do its thing – note that I've used Get-VM and not Get-LabVM; this is a small issue that I have faced and have reported on the GitHub repo.

Lability21
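In other words, something along these lines:

```powershell
Start-Lab -ConfigurationData .\TestLabGuide.psd1   # powers on every VM in the lab
Get-VM                                             # Hyper-V cmdlet, used instead of Get-LabVM
```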

And here is an image of all the VMs running and getting started.

Lability22

This part can take anywhere from 10 minutes to a few hours, depending on your VM rig setup, the amount of RAM allocated to each VM in your configuration data, whether machines have to wait for other machines to reach their desired configuration, and the complexity of the configurations being deployed.

Under the hood, Lability has injected the DSC configuration into the VM VHDX and set up a bootstrap process which in turn calls Start-DscConfiguration, passing it the path of the mof files. You can have a look at how this is set up inside a VM in the folder C:\Bootstrap\ if you are interested.
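Conceptually, the in-guest bootstrap boils down to something like the sketch below (the real script under C:\Bootstrap\ is more involved):

```powershell
# Rough sketch of the in-guest bootstrap: apply the injected mofs with DSC.
Start-DscConfiguration -Path 'C:\Bootstrap\' -Wait -Verbose -Force
```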

Once that is done you'll have your first fully deployed set of VMs using DSC and Lability – pretty amazing, isn't it!

SQL Server install – showing some of the other features of Lability

In this section I'll try to keep the content to a minimum but still add in some additional useful screenshots.

My configuration data file for the SQL Server node is as below; notice how we have the required properties to be able to install SQL – SourcePath, InstanceName, Features – and the Lability_Resource entry.

Lability24

As this was taken from a previous configuration it is using the xSQLServer DSC resource – take a look at cSQLServer here, as this will likely be the version that gets ported to replace the xSQLServer and xSQLPS resources, as it is relatively close to being usable in place of the two. Expect news on this after PSConfEU.

Also note that in the configuration data we are specifying an additional item in the NonNodeData section – Resource.

Lability25

This allows us to specify further resources that are stored in the E:\Lability\Resources\ folder (E: being my drive letter, of course).
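An illustrative Resource entry, paired with the node-level Lability_Resource reference that points at it – the Id, filename and URI here are made up for the example:

```powershell
NonNodeData = @{
    Lability = @{
        Resource = @(
            @{
                Id       = 'SQL2014Media'                          # referenced from the node entry
                Filename = 'SQLServer2014.iso'
                Uri      = 'https://example.com/SQLServer2014.iso' # hypothetical source
            }
        )
    }
}
# ...and on the SQL node itself:
# @{ NodeName = 'SQL1'; Lability_Resource = @('SQL2014Media') }
```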

I’ll let you decide what you want to put in that folder but any items for installation from the VM could be candidates, things like SharePoint Media or SQL media or other installable programs etc. You could always add your personal script library in a zip file and then get this Lability to unzip it into the right directory. Choices are up to you on this one – so be creative Winking smile

For this Lab I didn’t have the installation media already downloaded so this has had to be downloaded as part of the Start-LabConfiguration Function – however if your remember there was a Invoke-LabResourceDownload Function.

This has some additional parameters that allow you to download any of the items required for the lab configuration to succeed. This can be useful if, for example, you happen to have a few hours somewhere the internet connection is much better than your own – especially if you are using this for personal testing rather than the professional lab testing it was originally designed for.
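For example, to pre-fetch everything in one go while you have the bandwidth – assuming the -All switch covers media, hotfixes, resources and DSC modules together:

```powershell
# Warm the local cache so Start-LabConfiguration has nothing left to download.
Invoke-LabResourceDownload -ConfigurationData .\TestLabGuide.psd1 -All
```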

One of the other great things with this module is that you can make use of it for your lab environments regardless of whether your shop is using WMF5 or not. If you're still running WMF4 (with the essential DSC updates) then you can still build labs using this.

Wrap up

Well I hope you’ve enjoyed reading this 3800+ word post of mine and this helps you get to grips with building out Labs in an easy and repeatable way whilst having the chance to play with DSC to do it.

Remember that this Module DOES A LOT behind the scenes – if it didn’t there wouldn’t be the need for this post – and there is more functionality being introduced as appropriate all the time.

Lability is built for building labs – however, you could easily use it for building production-like environments, if you dare, and I can see the benefit in doing so; I mean, why re-invent the wheel when Lability will do a lot (most) of the work for you?

As with getting to grips with most new modules, always start with the help files. This module has a number of about_* help files, and almost all the functions (even the internal ones) have comment-based help.

This is a module where you need to RTFM to really understand all the workings of it. Spend a few hours looking through it and understanding it as best you can. It will be so worth it in the long term, even after reading this post a decent number of times.

I do, however, take my hat off to Iain Brighton (@iainbrighton) for creating this module; for me it is the only module to use when building lab environments – so let's gather some momentum as a community to suggest enhancements and improve it even more over on the GitHub repo.

My example files that I have used (especially the SQL one) will be made available in due course once Iain has decided on a scalable way forward for being able to share Lab Configurations. We have discussed a number of options (yet to be added to this issue) and if you have an Idea please add it via the Lab Sharing Issue on Github.

This is just the first in a series of posts that I intend to do on Lability – future ones will be much shorter but will focus in depth on the functions that I haven't covered in this post, along with some of the more interesting parts. However, I expect this will be a good starting point for you to get to grips with the Lability module and start building and running test labs.

As usual, please let me know your thoughts on this post, whether via Twitter or the comments section below – I hope you have enjoyed the read.

Pulling the Community Together to Improve the Quality of PowerShell Modules

In a discussion that started on Twitter a while back with June Blender about the quality of the modules being posted to the PowerShell Gallery, I had an idea on a way that we as a community could help improve this – using the tools we have available to us and, more importantly, the expertise of the rest of the community to help shape and guide the direction of modules.

The idea starts off with a simple GitHub organisation – in this case PowerShellModules – in which we set up a number of teams. Some of the teams I have in mind include, but are not limited to: documentation, unit testing, practice guidance and module maintainers.

This means there will be a much more open way of directing the development of modules for the community, whilst still allowing modules to be developed in a way that lets others add to them instead of branching out and creating a second module. It also means there is a stronger focus on modules being kept updated, as the overhead isn't on a single maintainer but can be shared between multiple maintainers at any given time.

My thought is to start by porting my current 'in progress' modules to the organisation and building out the teams mentioned above, with a suggestions/RFC-like repository that would allow us to drive this for the better of the community.

The end goal from my perspective would be to have one community-recommended module per technology (examples being GitHub, Trello, VSTS, Slack etc.) that has been widely community-developed and covers as much of the functionality as it can without the need for a separate module.

We have a great community of people, and I think this is the right point in time to start the drive to improve the quality of what we all output to the community in a targeted and effort-efficient way.

If you're interested in getting involved, please comment on this post with your GitHub username and I'll get you added to the organisation in due time – but please keep an eye out for the email from GitHub requesting you to join the organisation, especially in the junk folder.

Travelling to London – yet again!

Today I’m off to London for the 5/6 time already this year. This time I’m off to present at the WinOps Meetup and then attend the Chef Users Meetup the following day.
My Presentation at WinOps will be the one that I gave for the Mississippi PowerShell User Group.

Looking forward to seeing old faces and new ones there as well!

Presentation to Mississippi PowerShell User Group now Available on Youtube

I recently presented to the Mississippi PowerShell User Group via Skype for Business on Tuesday 9th (well, Wednesday really, as it was 2:30am when I presented for them).

The video from that session is now online at https://youtu.be/z3CmI73LnyI

My session was about my script and module creation workflow and the tight integration with Git and Pester that I have included in a module that is an add-on to the PowerShell ISE, called ISE_Cew.

ISE_Cew can be installed from the PowerShell Gallery in one line:

Install-Module ISE_Cew

This only works if you are running PowerShell v3+ and have installed the PackageManagement additional install – details at PackageManagement on v3/v4.

Otherwise you can install it from GitHub at https://github.com/kilasuit/ISE_CEW

Hopefully this will be helpful for you, and I look forward to gathering your feedback in future.

#PSConfAsia – What an amazing experience!

So, this post is a fair bit overdue – that's because I've been very, very busy of late.

However let’s get right into it!

#PSConfAsia was an important event for me, mainly because it was my first venture out of the UK in 20 years – yes, 20 – which meant that I needed to sort out my passport, something that had been on my to-do list for about 6 years. It was also my first real holiday EVER! And it happened to be a very well-needed and well-timed venture abroad – those that know me well will understand why the timing of this was crucial to me, and for those still getting to know me, this may be something I share with you in time.

I had planned to be in Singapore for a few days before the conference, which ended up being 6 days in total – for my first dose of international travel I think that was more than enough. My previous "holidays" over the last 5-7 years have been a few days off at a time with almost no travel from the house, so this was going to be a really big change for me.

The actual travel there was something I was a little anxious about, as I didn't know whether I would enjoy flying or not. Luckily it seems that I'm not fazed by it at all, and this has since led to me looking at other events to attend or present at (more on this in a later blog post).

But #PSConfAsia was more than just a conference with a few days of holiday in front of it for me – it was a chance to meet with an ex-colleague and co-organiser of the event, Matt Hitchcock, which was a key driver for me to attend the event, let alone put together a presentation for it.

My travel to Singapore started off after #SPSCambridge, as I was flying from Heathrow the next morning. On arrival at Heathrow and after getting checked in, I had the fun of being the guy randomly picked for explosive swab checks – you can imagine what was going through my mind at that point, it being my first ever flight abroad.

The journey consisted of two 7-hour legs with a short layover in Abu Dhabi in between – luckily only a little under 3 hours, which allowed me to relax a little, have a wander round the airport and look through the shopping areas. The free wifi at Abu Dhabi was really slow, so instead of catching up on social media I watched some videos that I had prepped for the journey.

The flights with Etihad were comfortable and had some useful amenities on board, including in-flight wifi (unfortunately not free), a plug socket at your seat, and an in-seat entertainment system that also let you chat with someone else based on their seat number – this could be a very useful addition to all flights, and I would hope that the likes of EasyJet will install these on their flights soon.

Once I got to Singapore I had the conundrum of being without data or calls (I wasn't paying the silly roaming charges), but luckily I had researched this before I left: Singapore has a great selection of tourist mobile SIM offers that typically include up to 100GB of 4G data with local calls and texts for up to a 10-day stay, for the equivalent of £15. That's just amazing value and is indicative of how advanced Singapore is in terms of its infrastructure.

There was another great thing in Singapore that anyone visiting should take advantage of: the tourist travel pass, which at roughly £15 gave you unlimited travel on buses and the Singapore MRT (like the London Underground/Overground systems), and could be replaced after 3 days with another one. I made good use of it to see as much of Singapore as I could whilst I was there – and I did manage to see a fair amount, but still have other places to see (for the next visit).

On the Monday I decided that I would head to the hotel, drop off my luggage, have a nap and then make my way into the centre to see where the Microsoft buildings were and have a general touristy nose around. I also decided that day that I needed some comfort food – what better than Singapore McDonald's, who will even deliver, which is a little mad considering they won't in the UK.

Most of the fun started to build from the Tuesday evening, when I got to meet Matt and Milton (another of the organisers) for some food and drinks in the city area. It was a great experience, and made me realise how expensive any kind of alcoholic drink is in Singapore, with prices ranging anywhere up to £13 a pint. Ouch!

102015_0006_PSConfAsiaW1

Wednesday started early, as another of the organisers, Ben, had arrived late the previous evening and was in Singapore for work on both the Wednesday and Thursday. He mentioned that he was heading into the office early that morning, so I tagged along with him and he showed me what a typical breakfast was like in a local restaurant in the centre. I have to say the kaya toast was really tasty, and I'll look forward to it when I next get the chance to be over there. Whilst at breakfast I also got the chance to try the local coffee, and again was not disappointed at all. I then spent most of Wednesday just being a tourist and seeing some more areas of Singapore (whilst riding the MRT and downloading Windows 10 as I did so – gotta make use of my 4G data somehow), and then made my way back to the hotel mid-afternoon to meet up with the organisers and some of the other speakers that had arrived – including Ferdinand Rios from SAPIEN, Narayanan Lakshmanan from the PowerShell product team, Benjamin Hodge from Kemp (also an organiser), Milton Goh (organiser), Matt Hitchcock (organiser) and Jaap Brasser. Below is a picture of us all on that night.

102015_0006_PSConfAsiaW2

As you can see, we had a great time!

This was just the Wednesday night!

On the Thursday I met up with Gokan Ozcifci, a SharePoint MVP whom I'd briefly met at SPSLondon and who was also speaking at PSConf, and did some further sightseeing of Singapore. This was good, as we managed to get around the gardens at the Marina Bay Sands, which was great fun to walk round and see everything there. We also went to the SkyPark at Marina Bay, where you can see some amazing views, including this cracker below.

101915_2356_PSConfAsiaW3

We then had a small gathering that evening with some of the Attendees and a few of the other speakers too.

101915_2046_PSConfAsiaW3

And then came the Friday (and the beginning of the conference) – and what an amazing day that was. I decided that I would go and meet Ravikanth Chaganti and Deepak Dhami at the airport, as they were flying in early that morning. I must say I am glad I did, as it was a lot of fun meeting them both and having some time to engage with them on a more casual basis than would have been possible at the conference. You can see below the fun that we had – I even set up my laptop to be an interactive display board, similar to the ones that the taxi/chauffeur drivers have.

101915_2356_PSConfAsiaW5

Though Deepak didn’t know that I was coming to meet him so walked right past without even realising – Luckily Ravi Had arrived a little earlier so we managed to between us get his attention and then we headed to the hotel to let Ravi & Deepak check in, get ready and then we arranged to head to the Venue.

We arrived a little way into the keynote by Jeffrey Snover, but I was still able to get a quick chance to give him a personal thanks (even if his session was via a Skype call), as can be seen below (thanks Jaap for the picture!). It is without a doubt that, without the invention of PowerShell, I would not be finding a career in IT as fun as I currently do – and likely will for many more years to come!

101915_2356_PSConfAsiaW6

Throughout the day I had the chance to chat to most of the presenters and some of the attendees, and sat in on a few sessions where I could. After the day wrapped up there was an organised speakers' dinner, which was just fantastic and set in a great location with a great view over Singapore. However, I had to leave the dinner early, having not slept very well in the few days leading up to the conference.

As per usual for me when I've not slept very well (it's been like this for well over 12 years!), I ended up having a ridiculously long sleep – in this case almost 14 hours – and it's typical that it happened on the Saturday as well! This meant that I missed all of the morning's sessions and arrived at the venue just in time to give my own session, in which I unfortunately had a bad case of the demo gods' wrath! I've yet to revisit and restructure my session, but this is on my to-do list for November! After my session there were a few other great sessions whose topics I need to get back into in more depth – including Pester testing; my annoying phrase from PSConfAsia for the following week was definitely 'Let me Pester you about that later', or some other Pester-related variant.

We then had a great after-event with food, drinks and prize-giving at truly the most English pub possible outside of England. This below has to be my favourite picture from the after-event (other than the picture-in-picture-in-picture-in-picture that we took), with Jason Brown from Domain.com.au (attendee / guest DevOps panelist) and Sebastian (the attendee that won the F1 tickets).

101915_2356_PSConfAsiaW7

After that night a few of us met up for a final lunch at a great little Indian restaurant in the Little India district of Singapore, where Ravi made sure that we had some great food whilst we had the chance.

101915_2356_PSConfAsiaW8

After we left, we had a relaxing few drinks in the Singapore sun before making our way to the airport to depart. I was lucky enough to get even more time with Ravi to discuss a certain upcoming DSC project of mine – it was great to get some more insights from one of the best-known people in the field – and I even got a selfie with him (and I'm not a photo-taking person) at the airport before we had to board our separate flights.

101915_2356_PSConfAsiaW9

All in all, the 6 days I spent in Singapore were an amazing adventure, and one that I hope to partake in again next year – hopefully it will be as fun-filled as this year. I have been lucky to make some amazing friends from it, and I cannot wait to catch up with them all again in the near future.