5 out of 6 in 6 Days = A busy week

This week has been a busy one for me with the SQLRelay and SQLSat Munich events. It has been full of fun, especially as for SQL Relay we had the fun bus for travel between the different venues all across the UK.

The week started off as most other weeks do, with me at home in Derby on Monday morning. This was followed by me jumping on the train to Birmingham around 11am for the first leg of the SQL Relay tour, where I presented a completely new and fully non-technical session, something that is a little out of my comfort zone compared with the more heavily technical sessions that I'm used to delivering.

This was a session that I've put together based upon my own career experiences, about the need to really spend time on developing and taking ownership of your career. It was pointed out on a few occasions throughout the week that at 26 I've still yet to really "have" a career, and although in some ways that can be seen as very true, there is also the other side of the coin: I've had the opportunity to see first-hand with other colleagues how not owning your career can lead you down a path that doesn't leave you with a role that you enjoy and feel sustained and secure in.

 

I really do feel that it is essential that you keep up with your training and take control of your own career – after all, it's your career, and how well it goes is down to you as an individual and how determined you are to achieve the salary and work-life balance that you wish to have. Troy Hunt has blogged about his experience of optimising his life to make his job redundant at https://www.troyhunt.com/how-i-optimised-my-life-to-make-my-job/ and this is something that scares most people when they think about it in any real depth. I would also recommend reading the follow-up post that Troy has done recently as well, https://www.troyhunt.com/7-years-of-blogging-and-a-lifetime-later/, as both of these are similar to my way of thinking around work-life balance.

 

Not only did I do a new presentation, but I have also been busy enjoying being on the SQL Relay FunBus, and although I knew a number of my fellow travellers it was good to be able to spend some more concentrated time with them. SQL Relay is a great idea and I'm already looking forward to the 'tour de UK' again next year.

 

To top the week off I have also been at SQLSat Munich this weekend, where there have been even more fun times with the extended #SQLFamily, which has been great. I seem to have a thing for Munich and the first real weekend of October, as I was also here last year for SPSMunich – you can read about that experience in my recap post.

 

However, I am looking forward to getting back home after pretty much a week on the road and getting ahead with some of my prep for PSConfAsia in just under 2 weeks.

PSConfEU call for Speakers is now Open!

I'm proud to announce that speaker submissions are being accepted for PSConfEU 2017 – you can submit your session proposals via the following form.

A few things to note about this year's submission and selection process:

 

  • We have a hard cut-off date of the end of Sunday December 1st – submissions must be in by this time or they will not be accepted.
  • This is because we will have a selection committee gathering during the week commencing Monday 2nd December.
  • The members of the selection committee will all vote for their favourite sessions.
  • This will begin to form a preliminary schedule.
  • We will then send out confirmation emails to the selected speakers.
  • By December 31st we expect to have confirmation from all speakers and the schedule ready to launch, hopefully posted by Jan 1st.

 

Once again I am very proud to have the opportunity to be working alongside fellow MVP Tobias Weltner and the rest of the organisation team to bring you the 2017 flavour of PSConfEU, and I look forward to seeing you at PSConfEU 2017!

Speaking at SQL Saturday Munich October 8th!

So last October I attended the first SharePoint Saturday in Munich, which was a great event, and you can read about my experience in this previous post.

 

However it seems that in October this year I’ll be returning to Munich for the SQL Saturday event where I’ll be delivering my Why & how to implement PowerShell DSC for SQL Server session.

 

There have been a number of changes to the xSQLServer resource over at https://github.com/powershell/xSQLServer in the last few months (yay), so there will be some cutting edge new insights in this session – it looks like I know what I'll be spending my time on soon enough.

 

It's also a session where I cram in a lot of information within the hour (originally it was a 2 hour session at SQL Saturday Exeter), so if you attend make sure that you are ready to jot down a fair amount of notes.

 

There are a number of other familiar faces from the SQL Community speaking at the event so I’m looking forward to being able to catch up with them all and meet even more of the amazing #SQLFamily.

#PowerShell Side by Side #ProTip

Today I'm going to share with you a little but simple tip to enable you to do more side by side testing of PowerShell v6 with your currently installed version, in a simpler and less error prone manner.

 

Firstly we will create a new environment variable, which we can do in a number of ways, but I quite like doing it this way as it's easy enough to script:

Function Update-PS6Path {
    $PS6LatestPath = Get-ChildItem 'C:\Program Files\PowerShell' -Directory |
                     Sort-Object CreationTime -Descending |
                     Select-Object -ExpandProperty FullName -First 1

    [Environment]::SetEnvironmentVariable('PS6', $PS6LatestPath, 'Machine')
}
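One caveat worth flagging, as an assumption on my part about how you'll run this rather than anything shown above: writing an environment variable at Machine scope needs an elevated session and doesn't update the process you're already in, so a rough sketch of using the function straight away looks like this

# Run from an elevated session - Machine scoped environment variables need admin rights to set.
Update-PS6Path

# The current process doesn't automatically pick up Machine scope changes,
# so mirror the value into this session before relying on $env:PS6.
$env:PS6 = [Environment]::GetEnvironmentVariable('PS6', 'Machine')
$env:PS6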

 

This then means that to launch PowerShell v6 (the latest installed version, anyway) you can run the following in the console. In this case we are passing some of the available arguments to the powershell.exe application, as noted at https://msdn.microsoft.com/en-us/powershell/scripting/core-powershell/console/powershell.exe-command-line-help

& "$env:PS6\powershell.exe" -NoProfile -NoLogo -NoExit -Command { $PSVersionTable }

So hopefully this little snippet will help you out in doing some more Side by Side testing as time goes on.

1 Small thing about running PowerShell Core and Windows PowerShell side by side on Windows

*Updated August 23rd 2016 as there was a change between 6.0.0.8 & 6.0.0.9 to PSModulePath that I had missed – I will be blogging about this in more detail in a future post but for now check the updated section at the bottom of this post! *

 

If you're like me and you want to test out PowerShell Core on your Windows machines as well as other *nix machines, then you may get caught out by this like I did in the upgrade from 6.0.0.8 to 6.0.0.9.

 

You can grab the MSI installer for 6.0.0.9 at https://github.com/PowerShell/PowerShell/releases/tag/v6.0.0-alpha.9 – however do note that there are no Windows 7 or Windows 8 installers, due to the requirement for WMF 4 to have been installed prior to WMF 5, as noted in this issue https://github.com/PowerShell/PowerShell/issues/1931 which links to this issue https://github.com/PowerShell/PowerShell/issues/1705
So let's get into the Side by Side stuff :)

 

Once you've installed the MSI, you can run PowerShell 6.0.0.x alongside the installed version on your machine like so

 

[Image: PS-SBS]

 

This is because PowerShell 6.x installs in the following location, C:\Program Files\PowerShell\, and as you can see below I have both 6.0.0.8 & 6.0.0.9 installed on my machine.

[Image: PS-SBS2]
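If you'd rather confirm this from the console than from Explorer, a quick sketch along these lines should show the same folders (assuming the default install location mentioned above):

# List the side-by-side PowerShell Core installs under the default install folder.
Get-ChildItem -Path 'C:\Program Files\PowerShell' -Directory |
    Select-Object Name, CreationTime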

This also means that if we look in the Start Menu we can see the following new options

 

[Image: PS-SBS3]

 

*Note* This will not change your default version of PowerShell from the one that is at C:\Windows\System32\WindowsPowerShell\v1.0\, so if you're running Windows 10 on the Insider Fast ring like me then it will still run 5.1.1405.1000

 

To run one of these alpha versions you have to explicitly do so from the Start menu (or a desktop link if you create one), so you can be sure that this will not cause you any issues with day to day PowerShell use.

 

Hopefully that clears up any potential confusion!

 

In 6.0.0.8 the $profile Variable referenced the Windows PowerShell Documents location as can be seen below

[Image: PScore-6.0.0.8]

 

Whereas in 6.0.0.9 we have a new location as shown below

[Image: PScore-6.0.0.9]

 

So when we load 6.0.0.9 we won't get our profile to load, as it doesn't exist.

So that we can get our current profile to load in 6.0.0.9, we can do what we would normally do and just use New-Item like I've shown below

[Image: PScore-6.0.0.9-2]
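For reference, a rough sketch of that New-Item approach is below; copying the existing Windows PowerShell profile content across is my assumption of the obvious next step rather than something prescribed here, and the paths will differ per machine.

# Create the profile file for this host if it doesn't already exist.
if (-not (Test-Path -Path $PROFILE)) {
    New-Item -Path $PROFILE -ItemType File -Force | Out-Null
}

# Assumption: carry the existing Windows PowerShell profile content across to the new location.
$oldProfile = Join-Path ([Environment]::GetFolderPath('MyDocuments')) 'WindowsPowerShell\Microsoft.PowerShell_profile.ps1'
if (Test-Path -Path $oldProfile) {
    Get-Content -Path $oldProfile | Set-Content -Path $PROFILE
}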

 

This seems only to have been needed in 6.0.0.8 & not 6.0.0.9, as the default values in 6.0.0.9 for the PSModulePath are not what we have set in the ENV variable. I'm not sure how this works, but I will dig in and post about this at a later date!

 

Then the next time we load 6.0.0.9 we will have a working profile, but the issue is that this will now enable loading of all the modules that we have in our PSModulePath environment variable.

 

However we can get round this with one simple addition to our profile

if ($PSVersionTable.PSEdition -ne 'Desktop') {
    if ($IsWindows -eq $true) {
        $version = $host.UI.RawUI.WindowTitle.Split('_')[1]
        $env:PSModulePath = "C:\Program Files\PowerShell\$version\Modules"
        Write-Output 'Removed all but the shipped Core modules'
    }
}

This is a Windows only forwards compatible inclusion in your profile & will only affect the local session of PowerShell that is running.

So you can be sure that this will work across your Windows machines, however ideally we will get some amendments to PSVersionTable, as noted in https://github.com/PowerShell/PowerShell/issues/1997 & https://github.com/PowerShell/PowerShell/issues/1936, to be able to tell the OS more easily and dynamically.

 

The $IsWindows variable is only available in PSCore, along with $IsOSX, $IsLinux & $IsCoreCLR, so you cannot currently use them in the full version of PowerShell, and I don't think that you can currently build the full version of PowerShell from the repository to a 6.x version. However this may change in future.

So with 6.0.0.9 and above you can actually ignore the above section completely and comment it out of your profile (or delete it).

 

This is also a good example of the rate of change between the different alpha versions, although I've checked the commit notes and can't see this change mentioned in a concise and easy to understand manner, so I will feed this back to them to see if release notes can be improved in future.

My Opinion on Open Source PowerShell and what this means to the PowerShell community

If you've been under a rock the last few days (or for days/weeks/months, depending on when you're reading this blog post) then you may have missed that on Thursday August 18th 2016, Microsoft open sourced PowerShell!

Not only did they open source PowerShell, they have also released a cross-platform alpha version that can be installed on a variety of Linux distros as well as a Mac OSX version.

 

You can read about it in more detail from Jeffrey Snover himself over at https://azure.microsoft.com/en-us/blog/powershell-is-open-sourced-and-is-available-on-linux/

You can also read the PowerShell Team's blog post (which has some great links too) on this at https://blogs.msdn.microsoft.com/powershell/2016/08/18/powershell-on-linux-and-open-source-2/

 

But what does this really mean to you, me, & all the other PowerShellers across the globe?

 

Firstly

 

  • Well done – you picked to learn a technology which celebrates its 10 year anniversary later this year (November 14th) and that was considered a "Windows only" tool. Now it isn't, which means that you could start working with other platforms – increasing your value to your employer *cough, maybe it's time for that pay rise*
  • PowerShell is likely to start to change even quicker (I'm speculating past the Server 2016 launch here)
  • If you don't want to get involved and help with building PowerShell v6 (& fixing any bugs you find) then you can let this announcement pass you by a little and await an actual release in the future

However if you do

 

  • You need to start learning some more new things, not just PowerShell, as there is now an even bigger ecosystem of tools that you need to learn to be really efficient and successful in IT going forward.
  • You need to learn how to work with source control, and I will recommend git like I do to everyone else. Check out https://help.github.com/articles/good-resources-for-learning-git-and-github/ for some resources, but a Google search will get you some others too.
  • You need to learn how to work with GitHub Issues, how to search them, and how to file an issue that has enough information to be useful for the person reading it. Some of this is helped by issue templates, but these don't always capture all the possible required information, so be prepared to be asked for more info.
  • You need to start attending user groups & conferences and train for your IT future, as the world of IT is undergoing a massive change. This isn't going to stop anytime soon, and if anything the rate of change is going to keep getting quicker and quicker.

 

 

So where are we right now?

 

Currently the release of PowerShell to GitHub is an alpha release – this means that it is not supported for any production use at all! Basically it's out there for you to "kick the tyres" so to speak.

It also means that at least for now and the near future you may think that you have 2 places to raise issues. Github & UserVoice.

However, the guidance from the PowerShell team at present is this:

Customers and enterprise users should still raise issues on UserVoice, as this remains the place for issues relating to the PowerShell engine contained within Windows client and server systems, including WMF releases.

Basically this means that issues relating to anything PowerShell v5.1 and below should be raised on UserVoice.

My understanding is that this is because those versions of PowerShell haven't been released to GitHub (we only have the changes that have occurred since PowerShell 5.1 was rolled up for WMF 5.1), so changes to them can only be made by the PowerShell team. Plus we need to remember that Server 2016 is still yet to RTM, and the source code for that will have been sealed in preparation for launch. So any fixes to the PowerShell engine included in Server 2016 or the RTM version of WMF 5.1 will come either via hotfixes or via a recommendation to upgrade to a stable version of PowerShell 6 once released, as we currently only have alpha releases available on GitHub.

However, developers and those that feel comfortable doing so can raise issues on GitHub.

This is where the current guidance from the PowerShell team could easily bring a little confusion to some, but we have to remember that this is new ground for the PowerShell team, so they will need some time to sort out how they work with the different streams. It is likely (& I'm just speculating here) that the team has an internal consolidated issue tracker that tracks UserVoice and all of the PowerShell repos; however, be on the lookout for a blog post from the PowerShell team at https://blogs.msdn.microsoft.com/powershell in the next few weeks where they will be detailing how they interact with the community across these mediums.

 

So What does the future hold for PowerShell?

 

Over the course of the upcoming months we will see a number of further alpha releases as well as a stronger emphasis on making use of the PowerShell RFC Process for any changes to how the PowerShell Engine works. The PowerShell RFC Process can be found at https://github.com/PowerShell/PowerShell-RFC and there are a few new additions to this already from outside of the PowerShell Team.

 

But the interesting thing from this point on is that more and more of the PowerShell ecosystem will be open sourced, including one module that I've been waiting to tear apart – PowerShellGet – which Jason Shirk confirmed is planned to happen in the future in this issue https://github.com/PowerShell/PowerShell/issues/1979. It is also worth noting that a number of the modules that we have inbox on Windows 10 machines are not written by the PowerShell team, so there is a chance that the module, cmdlet or function that you have ideas to improve (New-Item is one I'd like to see be a bit more intelligent with folder creation) may not be open sourced. However I think it is only a matter of time before we see demand for these to be open sourced as well, and there are already calls for other modules from other teams to be open sourced, including the SQLServer module (previously SQLPS), which shows where the ecosystem has been heading for some time now.

 

Overall I'm incredibly proud to be working with such an amazing product that has now opened even more doors to me than it had available before. You never know what the future will hold, but now I have a skill that can be used cross platform, which means to me that the possibilities in the upcoming months & years of my IT career are even more prosperous than they were last week.

 

If you haven’t yet picked up PowerShell I would seriously urge you to do so!

If you're struggling with how to pick up this language and understand the benefits that it can bring you in your organisation (or even personally), and are interested in getting some training that is tailored to your needs or your organisation's needs, then check out my company Re-Digitise at https://www.re-digitise.org

 

I’m looking forward to seeing how the future pans out with xplat PowerShell – what are you looking forward to the most with this?

My Company Re-Digitise website gets a much needed lick of paint

Also, over the course of this weekend I have rebuilt the Re-Digitise site from the shell that I threw together on GitHub Pages to a more modern site using WordPress as the backend.

 

Please take a few minutes to have a look at the site at https://www.re-digitise.org and get in contact with me if you feel that Re-Digitise could help your company out at all.

 

As per my Previous blog post the site is fully https where it wasn’t before although I am yet to get it set up with Cloudflare for the CDN side of things.

 

Now to start building some promo materials for Re-Digitise :)

Minor Blog Update – now HTTPS by default!

This Sunday I set out to force my blog hosted on Azure to be HTTPS by default, and I mainly made use of the following article by Troy Hunt on the underlying implementation, which makes use of Cloudflare. I've also decided to get it set up ready in case I want to move away from Cloudflare to Azure CDN in future.

 

It really isn't too difficult to do this, especially if you follow Troy's post. It is something that can be completed in a matter of hours, and best of all it is a free service from Cloudflare to enforce HTTPS and you get the power of a CDN built in too.

 

This means that the core items of my blog site load much quicker than they used to which is good for everyone that visits in future.

 

There are a few little amendments that you need to make on the Azure website side with the Web.Config file (the rule below lives inside the <system.webServer><rewrite><rules> section), but with it being an addition as simple as the below I'm sure that it won't be something that trips people up in future.

 

<rule name="Force HTTPS" enabled="true">
  <match url="(.*)" ignoreCase="false" />
  <conditions>
    <add input="{HTTPS}" pattern="off" />
  </conditions>
  <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" appendQueryString="true" redirectType="Permanent" />
</rule>

 

You do also need to get your own SSL certificate if you are using your own domain name, as by default there is only a wildcard SSL cert for the azurewebsites.net domain. I decided to go with DigiCert – http://digicert.com/ – for this, as opposed to the Cloudflare cert that you could go with.

 

Hopefully now my blog will load a little quicker for you all :)

Functional / Non-Functional Pester Tests and why I think you really should have a form of both.

So in this blog post I'm going to cover why there is a need to create Functional & Non-Functional Pester tests for your PowerShell modules. But before I get into the nitty gritty of the whys behind creating both, let me explain what the real differences are between the two, because it may not be something that you have previously thought about or considered in your journey up until this point.

 

Functional

  • Used to test the code's different use cases
  • Can be either a form of Unit or Integration test
  • Where we "Mock" the functionality to confirm it works as expected (see the sketch after this list)
  • Used to determine the level of code coverage that your tests actually hit
  • Makes functionality changes simpler and easier going forward, as long as you write more Functional tests
  • Should save headaches as code moves between environments as part of a Build/Release pipeline
  • Provides a documentation mechanism to catch bugs so these can be fixed
  • Provides a documentation mechanism to highlight where you may be able to make possible improvements
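To make the "Mock" point a little more concrete, here is a minimal sketch of a functional test using Pester's Mock. Get-ServiceStatus is a hypothetical wrapper around Get-Service that I've made up purely for illustration; it isn't from any particular module.

# Hypothetical function under test - a thin wrapper around Get-Service.
function Get-ServiceStatus {
    param(
        [Parameter(Mandatory = $true)]
        [string]$Name
    )
    (Get-Service -Name $Name).Status
}

Describe 'Get-ServiceStatus' {
    # Mock the underlying cmdlet so the tests never touch a real service.
    Mock Get-Service { [pscustomobject]@{ Name = 'Spooler'; Status = 'Running' } }

    It 'returns the status reported by Get-Service' {
        Get-ServiceStatus -Name 'Spooler' | Should Be 'Running'
    }

    It 'calls Get-Service exactly once' {
        Get-ServiceStatus -Name 'Spooler' | Out-Null
        Assert-MockCalled Get-Service -Times 1 -Exactly -Scope It
    }
}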

 

Non-Functional

  • Can be thought of more as "Traditional Documentation" (a sketch of a few of these checks follows this list)
  • Aids newcomers to the code base by nudging you to provide some useful help documentation
  • This can also aid newcomers in learning how to understand some of the more advanced functionality
  • We get validation of the function's parameter types – i.e. should the parameter be a String for input
  • Confirmation of whether the parameter is a Mandatory parameter or not
  • Gives us a basic form of ParameterSet validation
  • Gives us a basic form of parameter Position validation
  • Does the parameter accept pipeline input?
  • Does the parameter accept pipeline input by property name?
  • Does the parameter use advanced validation at all?
  • Does the parameter have at least some help text defined?
  • Does the function have at least a basic level of Comment Based Help? – let's leave the pros & cons of that for another topic shall we.
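And a minimal sketch of what a few of those non-functional checks can look like, again against the hypothetical Get-ServiceStatus function from the earlier sketch (made up for illustration, with a mandatory -Name string parameter); the last test also assumes the function carries comment based help.

Describe 'Get-ServiceStatus - non-functional' {
    $command = Get-Command -Name Get-ServiceStatus

    It 'has a Name parameter of type String' {
        $command.Parameters['Name'].ParameterType.Name | Should Be 'String'
    }

    It 'has Name marked as a mandatory parameter' {
        $parameterAttribute = $command.Parameters['Name'].Attributes |
            Where-Object { $_ -is [System.Management.Automation.ParameterAttribute] }
        $parameterAttribute.Mandatory | Should Be $true
    }

    It 'has comment based help with a synopsis' {
        (Get-Help -Name Get-ServiceStatus).Synopsis | Should Not BeNullOrEmpty
    }
}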

 

So, with the additional number of tests that we may have to write from looking at the above, why should we spend the time writing them?

This is where the story for Non-Functional tests becomes a little hazy in some ways, but it really depends on how you've ended up with this module.

 

These possibilities can include

You've inherited or downloaded someone else's code and you have no clue what it's doing because it's:

  • Not well documented with little or no help
  • Difficult to read because of the formatting
  • Uses a number of privately scoped functions
  • All the functions are either in a single ps1 or psm1 file
  • Just needs to be refactored to make it easier to manage, maintain & update going forward

Or it may just be that

  • It almost does what you need but you need to extend the functionality
  • You want to dig a little deeper into how it works
  • You are possibly continuing a discontinued open source project
  • Or you are looking at your own older code and want to give it a much needed update considering you’ve become a more experienced scripter than you were when you originally wrote it

 

If you were to go and create all the Non-Functional tests that I've listed above then this will give you a lot of additional tests (& I mean a lot) that you would then have available to provide you with some more trust in your code whilst you refactor, or just to understand how all the bolts fit together.

However I will point out that this is really meant to provide you with a single baseline of what is included in the module, and not how the module actually functions, as that's the role of the Functional tests.

 

In my next post I will show you how we can automagically create these Non-Functional Tests for each function included in an existing Script Module, including those functions that are defined as private/internal functions to give us a better chance of being able to manage, maintain & update it going forward.

Recap of a Long February, March, April and May – Events Events Events!

I had intended that I would be doing a recap type post at the end of every month however I’ve been very busy so haven’t been able to do so for a number of months – that and I had an issue with my blog being offline for a few weeks.

Let us start with a recap of the number of events that I managed to attend, and I think you can see that I did a lot of travelling and attended a number of different user groups.

I attended the following events

  • Get-PSUGUK – Manchester – Feb 1st
  • SharePoint User Group – Manchester – Feb 2nd
  • Azure Security Training Event – London – Feb 3rd
  • SQL User Group – Manchester – Feb 3rd
  • Get-PSUGUK – London – Feb 4th 
  • Mississippi PowerShell User Group – Feb 10th – Online
  • What’s New in Server 2016 – Microsoft Training Event – London – Feb 17th
  • What’s New in Windows 10 – Microsoft Training Event – London – Feb 18th
  • WinOps Meetup – London – Feb 23rd
  • Chef Meetup – London – Feb 24th
  • Cloud Roadshow – London – Feb 29th – Mar 1st
  • Azure User Group – London – Mar 1st
  • Manchester Geek Nights – Agile and Tech in Local Government – Mar 3rd
  • SQL Sat Exeter – Mar 12th
  • Lean Agile Manchester – Mar 16th
  • SQL User Group Manchester – Mar 17th
  • Manchester .Net – .Net Core recap – Mar 22nd
  • SQL User Group Cardiff – March 30th
  • MCR Tech Event Organisers meet – Apr 7th
  • SharePoint User Group – Nottingham – Apr 12th
  • PSConfEU – Hanover, Germany Apr 19th – 22nd
  • Get-PSUGUK Manchester – Apr 25th
  • Get-PSUGUK London – Apr 27th
  • MVP Open Day – Apr 28th – 29th
  • SQLBits Sat – May 7th
  • Get-PSUGUK Manchester – May 23rd
  • WinOps Conf London – May 24th
  • UKITCamp London – May 25th
  • SQL London User Group – May 25th
  • Get-PSUGUK London – May 26th

So in the space of the beginning of February to the end of May I attended 30 different User Groups, Training days or Conferences and that wasn’t all the ones that I had planned either due to some unfortunate illnesses that occurred as well.

Now those that know me will know that I attend the events because I'm genuinely interested in the topics at the events or in catching up with the people that are there; after all, the events are all about the community and the networking opportunities that they bring to us.

I intend in future to post ahead of time where you can catch me in the following months via the Find Me At page, and then at the end of the month detail more about what I learned at the events.

Before I go into detail on the events and what happened at them just take a moment to look at the types of events that they are and the breadth of technology that they span. This may give you an insight into the differing technologies that excite and interest me going forward. 

To start, Get-PSUGUK Manchester on Monday Feb 1st, which seems a long time ago but is still an event that I can vaguely remember enough to post about. I presented the initial version of my "Teaching the IT Pro how to Dev" session, where I introduced my ISE_Cew module to the audience to help with getting to grips with source control with Git and unit testing with Pester. We also had our first community speaker, Tim Hynes @railroadmanuk, who presented on automating infrastructure using PowerShell with the various infrastructure APIs that he's been working with, including VMWare, Cisco & NetAPP devices. You can find his presentation at https://github.com/railroadmanuk/presentations and not long after Tim was awarded VMWare vExpert. I know he's presented at other events since and I'm looking forward to seeing what the future holds for Tim.

Then on Tuesday Feb 2nd it was the SharePoint User Group in Manchester, which will always be a group that is close to me as it was the first user group to give me the opportunity to present, which you can read more about here. This was a night about "What you need to know about SharePoint 2016" by Heath Groves @Heath_Groves and Building Enterprise Platforms by Andy Talbot @SharePointAndy – you can find Andy's slide deck at http://www.sharepointandy.com/?p=550

Heath gave us a rundown on all the things coming in SharePoint 2016 and even prepared some take-me-homes, which included the new and removed PowerShell cmdlets in SharePoint 2016. Andy's session was a good, thought provoking session for those that have dealt with SharePoint in the past, and there are some really good points in the slide deck that are applicable to a number of different areas of IT. You can tell this deck was put together from the pains that Andy will have personally felt working with a number of different IT departments over the years; a number of them I have felt as well, as will a number of you too. Even if you're not a SharePoint person, go and have a look at the deck and see if it resonates with things you feel in your day to day IT lives.

Next up, on Wednesday 3rd Feb, it was an early morning with a 5:15am train from Manchester to London for an Azure Security Morning at Microsoft's offices at Victoria – this is an area that more people need to put time into and I'm looking forward to seeing some further work here, mainly from Microsoft. Saying that, Microsoft recently released the Azure Security Information site at https://azure.microsoft.com/en-us/documentation/security/ so go and have a look at it as there is a lot of good information in there. The Security Morning was a good event, although I felt it would have been better as a full day event, especially as there were a number of issues with getting the interactive demos/labs up and running with the Barracuda security devices, mainly due to issues in the scripts that had been provided to set everything up. They should have written Pester tests for these scripts, as I had gotten the impression that the scripts were only recently updated for a recent release of the Barracuda security devices. Some of the attendees managed to get things set up, however I was unable to, which was not ideal.

I then had to leave London around 14:30 in order to get back to Manchester in time for the SQL Server User Group that evening. Now everyone that knows me knows my SQL knowledge isn't close to being on par with those that live and breathe SQL every day, however one thing all platforms require is a data backend of sorts. So I've pushed myself to attend more and more SQL events where possible (as you'll gather from the rest of this post as well) so that I can learn more about this crucial technology and be able to implement and use it in my own adventures going forward. One of the areas that has piqued my interest is PowerBI, and I was glad to get what was a real beginner's crash course in PowerBI from what I can only describe as an awesome instructor – Adam Aspin. We also had a session on SQL Server Wait Stats by Rainer Unwin, which was interesting although perhaps a bit too technically in depth for me to fully follow at this stage of my interaction with SQL Server – though I'm sure it will be something that I come back to in future.

Then the next day Thursday Feb 4th, I had to travel back down to London from Manchester for the London PowerShell User Group at RackSpace just out of Hayes and Harlington, where I also presented my Teaching the IT Pro how to Dev session with a bit of an update to it from the Manchester session. We also had Rudolf Vesely @RudolfVesely from Rackspace give an Introduction to Pester which was a great session for the audience – Rudolf will be presenting to the London group again in future on a more in depth session on Pester so look out for this.

On Feb 10th I was lucky to present to the virtual Mississippi PowerShell User Group where I Presented the Teaching the IT Pro how to Dev session – this was recorded and I’ve blogged about it in a bit more detail here.

I then attended the UKITCamps in London on Feb 17th & 18th on the What's New in Server 2016 & What's New in Windows 10 topics, and although these are camps that I've previously attended there are a number of labs in there that are good to have the chance to run over and replay. I also enjoy the UKITCamps as they are Microsoft delivered training days, meaning there are a number of others there that I get the chance to network with, along with getting the chance to catch up with the guys running them, namely Ed Baker, Marcus Robinson and Andrew Fryer. I was also very lucky to get the chance to head out for a meal with Ed, Marcus & the other members of the DX team that work behind the scenes to put on these events. I for one look forward to the events being put on by the guys in the DX team, and I know how difficult it is to arrange events like these – and that is before you include preparing the slide decks and the labs that are to be used at them. Hopefully we will see more of these events in future, however there aren't any currently planned so we will have to wait and see if more of them appear.

I then had just under a week until my next event, which was decided last minute, where I was to present my Teaching the IT Pro how to Dev session to the WinOps group in London on Feb 23rd. That was great, however I suffered a failed MicroHDMI to HDMI adaptor so I had to try and move my demo and deck to Stephen Thair from DevOpsGuys' laptop, and as per the standard developer line 'Well, it worked on my machine' I was unable to show the demos working. This has led me to build a VM in Azure and a second Hyper-V VM for any demos that I want to run in future to ensure that demos work. I'm also planning on getting a dedicated presentation-only device which I'll wipe between events to ensure that all runs as expected, along with a few backup cables & adaptors to have with me.

Then the next night I attended the Chef Meetup where I was introduced to GoCD, Terraform & Kubernetes – they all look like interesting technology but I need a reason to get in deep with any of them, so look out for me possibly blogging on these technologies in future.

I then attended the London leg of the Microsoft Cloud Roadshow on Feb 29th & March 1st, where there were a number of different sessions on throughout the event, with tracks covering most of Microsoft's technologies and a number of them focused on the SharePoint/Office365 ecosystem and the Azure ecosystem. The highlight of the event was being able to go and have a few drinks with Joey Aiello, one of the PowerShell PM team, who was over from the US for the Cloud Roadshow. It was good to be able to have a face to face chat and I'm sure in future there will be more chances to chat, including the MVP Summit. Joey is younger than I am and is rocking a very good role at Microsoft – imagine being part of the PowerShell Team – that is a number of people's dream job and I would be lying if I said I wouldn't find it amazing to spend my day working even more with PowerShell than I already do. However as an MVP I do get that luxury already, although it would be a very different role to the one that I'm doing. Who knows what the future holds, but I know that for me it will likely involve PowerShell for a number of years if not decades to come.

I also dragged a few people to the London Azure User Group that was happening on the evening of March 1st, where we were introduced to Boris Devouge, Director of Open Source Strategy at Microsoft. I can only describe him as a 'Fluently Funny Frenchman', which makes his presentations engaging, and as this was on the new Azure Container Service (it's an Azure User Group after all) it was interesting to hear of the partnerships that Microsoft have recently been making in this area with the push to make Azure the most open source friendly cloud. The Azure Container Service was in public preview (I think) at the time of the presentation, however it has since been made Generally Available and you can learn more on ACS in this post on the Azure blog https://azure.microsoft.com/en-us/blog/azure-container-service-is-now-generally-available/

I next attended a talk in Manchester on March 3rd at Manchester Geek Nights on Agile and Tech in Local Government, delivered by Stockport Council, where I was lucky to bump into my good friend Ethar, who always has a good story to tell. I must get the chance to catch up with him again when I'm next in Manchester and not just there on a flitting visit. The talk by Stockport Council left me realising why our governments, local & national, get a lot of stick for being poor at delivery and execution of their IT projects (& projects in general): there is so much fragmentation in the IT systems being used across the differing councils, due to them all having separate and diminishing IT budgets to do any projects. I personally think that centralisation of all of the UK council & local government IT into a single pool would work much better for the public, and my reasons for this are pretty simple: enhanced governance, lower boundaries to sharing data between the different departments that need to share data nationally (think social care departments, housing departments etc) and generally a simpler to manage infrastructure and workforce. Though perhaps I'm biased being from a Microsoft background, which means that I can see some opportunities to scale similar services nationally, which would be massively more cost efficient. Almost all the banks have done this and realised the benefits, and to me it makes sense for the public services sector to do the same too! It was however interesting to hear about how Stockport Council are embracing open source technologies and essentially building out their own products, which they are in turn open sourcing for other councils to take advantage of too. It's an interesting journey for them to take and I hope that the effort doesn't end up being completely canned in a few years' time if a nationalisation of IT services to councils were to occur. It is, in my opinion, a logical step for this country to take, though I'm not sure politicians and logic can go together. We will have to wait and see.

 

SQL Sat Exeter – March 12th. Well, I'm not really sure I need to say any more than that really. However it was a great event and my first event doing a back to back, demo heavy session on PowerShell DSC. Even more scary, it was DSC but for SQL Server. I hadn't realised how much of a headache the SQL Server DSC resources were until I spent the majority of the week leading up to it getting annoyed with little things like hardcoded values for where the resource expected the install media to be. I got so frustrated with it that I began to rewrite the resources so that they would work how I expected them to, which meant that I spent more time writing DSC resources from scratch than actually doing anything useful. Especially as a week or two after SQL Sat Exeter I wiped the drive with the resources on it. Yes, they were in source control, but only on that machine – lesson learned – DOH!!!

SQL Sat Exeter was my first real foray into SQL Community events other than user groups, and after the fun I had with them at Exeter I can see why it is they call themselves SQLFamily. In the lead up to my sessions there was a run around to get some bacon sandwiches and a fair amount of drama with my demos having decided to kill themselves that morning – however I managed to get them working before my session and there were some good reviews to come from it. I know where I need to improve the content and am looking forward to SQL Sat Paris in a few weeks, where I will need to cram all of the information from 2 hours into 45 minutes. #ChallengeAccepted

It was also on the Saturday night, at the after-event curry & following drinks, that the discussion about SQL Sat Manchester having a PowerShell track came to fruition. I was lucky enough to have ended up out with Chris Testa-O'Neill and the other organisers at SQL Sat Manchester the year before (my first SQL Sat event, and I went as an attendee) so it all felt natural to be there along with a number of other familiar faces like Rob Sewell and Steff & Oz Locke. It's like a reunion and I'm looking forward to what will be a kick ass SQL Sat Manchester this year. The PowerShell track shaped up nicely :). One thing I've learnt about the SQL Community is that it really does kick ass, but then again all the IT communities I'm a part of do. Our passion brings us all together and with it we make sure to have a bloody good time when we get together. Else why bother?

On the Sunday morning I had an interesting email come in as I was sat having breakfast, which led me to question it a little with Chris & Alex Whittles, and well, history has been written since that morning. I also got the chance to help Rob out with a DSC issue he was having and gave him the guidance that he needed to resolve his issue in the right way as things currently stand. In future we will have a feature complete PowerShell DSC resource for SQL Server – though this will require some community help, and you can help out by voting on / adding items to the Trello board at http://sqlps.io/vote

Next up on my events (and half way through the 30 events I'd attended) was LeanAgile Manchester on March 16th – a firm favourite of mine as it's a great community (like they all are) – where we were treated to a talk by Jon Terry – but not that Jon Terry! – from LeanKit about how they deal with working in a Lean\Agile way with their FSGD (Frequent Small Good Decoupled – said FizzGood) approach. It's another example of where the software/manufacturing world brings good things to the rest of IT and generally other areas too, and I would highly recommend that you go and read their blog on FizzGood at http://leankit.com/blog/2015/07/does-this-fizz-good/ and take away from it what you can.

Next up on the user groups that I attended was the Manchester SQL User Group, where we would be walking through Cortana Analytics, which I was looking forward to as Chris Testa-O'Neill & Cortana essentially got a divorce whilst he was in the speaker room prepping at SQL Sat Exeter. I'm sure with a decent set of data I'll be able to find a good use case for Cortana Analytics, and I have some ideas in the pipeline so keep an eye out for future posts on this.

As a Non-Dev Admin who realised that I am really a Dev but just wasn't ready to admit it to myself, I find that the .NET User Group in Manchester is a useful group to attend, especially when the topic is .NET Core, which it was on March 22nd. Even more so as, with .NET Core, there is a real possibility that the PowerShell engine will eventually be open sourced, especially as we are seeing a refactor of the existing cmdlets to be able to run on Nano Server, with more and more coming with each new TP and more to come for Server 2016 GA. We were treated to a history lesson on .NET Core by Matt Ellis @citizenmatt, with the slide deck at http://www.slideshare.net/citizenmatt/net-core-blimey-windows-platform-user-group-manchester, which again is well worth the read.

Next up was just after I had moved from Manchester to Derby and still had the hire car – and I had an itching to go and see some of my SQL friends in Cardiff, especially as it was an epic event – Return of the Beards! This means that not only did I get the chance to catch up with Steff Locke again, but also with Rob (again – it seems like that guy gets everywhere ;)), another one of my SQL friends Tobiasz Koprowski, and lastly the other bearded SQL guy of the night, Terry McCann. This was where I got to learn a bit more about TSQL from Terry and Securing SQL in Azure from Tobiasz, but also to see Rob's session on the pains of context switching and how PowerShell & PowerBI help him not get mithered for information that can be easily made available and easily searchable with a little effort. This is for me a great example of real world use of PowerShell and PowerBI together, and it is well worth watching Rob deliver this if you can get the chance.

I then attended my first Tech Organisers Meetup in Manchester on April 7th – it was good to meet the other tech user group organisers in the Manchester/NW area and have the discussions that were needed as a collective to help strengthen the view that Manchester is a blossoming tech hub in its own right – something that Londoners seem to miss out on. Manchester is ace because it's cheaper than London and is actually more lively at night than London (I've found), and you can literally walk from one end of the main city centre to the other in about 20 minutes or so, and within that you have the Northern Quarter. So you are pretty much sorted!

Next up I had another event I presented at – the SharePoint User Group in Nottingham on April 12th. I presented on PowerShell DSC for SharePoint like I did at the SharePoint User Group in Leeds in January, but this was a special one for me as it was the first user group that I presented to after being awarded MVP. Being awarded on April Fools' Day led me to post Congratulations 2016 Microsoft MVP at 15:31, about 10 min after getting the email, and then Fooled Ya – Today I became an MVP at 15:55 – I also blogged Awarded the MVP Award – What it means to me and the future for the Community. We also had a talk from Garry Trinder @garrytrinder on Require.JS, which can be used in conjunction with MDS (Minimal Download Strategy) in SharePoint 2013 and Online sites to help bundle up and control your page load and transition times. Javascript is one of those dark arts that I've not had much need to do a lot with – but I certainly would look to use Require.JS in any of my future web projects.

My next event was PSConfEU, and this was the event that I had been looking forward to because of the sheer work that went into making it a success by all involved, including Tobias Weltner and myself. Due to the size of this event I will put together another post in the coming days that really captures the details of what an amazing event it was, as I don't think that a few sentences will do it any real justice. Plus I want to relive the experience in as much detail as I can so that I can share it with you as well – so that if you weren't able to make it then hopefully you'll do what you can to make PSConfEU 2017. Planning for PSConfEU 2017 will most likely begin early August, so there will be small announcements at some point after then, though it's still all to be determined.

As a spill over from PSConfEU I had managed to bribe June Blender into agreeing to come and present at the Manchester & London PowerShell User Groups – though to be honest there wasn't much bribing involved, as June had wanted to come to Manchester anyway and timing wise it just worked out great. June gave her Thinking in Events hands on lab at both groups, both groups had some great questions, and I've had some fantastic feedback from the sessions, which has led me to start working on preparing my own hands on events for the future. These are "in the works" so to speak and details on them will start to appear in the next few months.

Next up was my first MVP event, where we went to Bletchley Park – a fantastic historical site and I'm planning to head back there again in future. The event was good for me as it allowed me to meet up with other UK MVPs, including fellow PowerShell MVP Jonathan Noble. There is a good story behind how we ended up meeting on the train up from London to Bletchley Park, and it starts with me forgetting to charge my laptop and phone the night before. When I got to Euston I was frantically trying to make sure that I got on the right train to get to Bletchley. I had messaged Jonathan whilst on my way and had found out that we were catching the same train. However, phone signal is pretty poor when you are travelling out of London, and just before my phone died I managed to send him a message letting him know I was about half way up the train. About 20 minutes passed and then all of a sudden this guy two rows in front of me got up, came over and said "Hello – it's Ryan isn't it? I'm Jonathan, only just got your message", and from that moment we just continued chatting. When we got to Bletchley Jonathan was able to lend me a power bank to charge my phone; not that I really needed it, but having charge on your phone is a comfort thing now isn't it. We had an afternoon of talks and then really nice drinks and dinner where I got the chance to meet some more of the MVPs, which was good. The next day we had some presentations in the morning and then we had to make some rocket cars in the afternoon. It was great fun to do something less techy but still something that most enjoyed. I was lucky to be able to get a lift with Alex Whittles from Bletchley, along with Steff Locke, to Birmingham New Street Station, which allowed for a number of good conversations about SQLBits & SQLRelay – both being events that in future I may get more involved in, if I can manage to stretch that far that is. Once Alex dropped me and Steff off we worked out that we either had half an hour to try and get something quick to eat before running for our respective trains, or we could get something decent to eat and then get a drink afterwards before catching the train after that. Naturally, decent food and drink was always going to be the winner :).

 

Nearly finished with the recap, with just 6 events left to cover, so if you've read this far, well done – you've nearly made it to the end :)

 

I then attended the SQLBits Saturday event on May 7th in Liverpool, and although I got there not long before lunch I was still able to get to the sessions that I wanted to get to – mainly the SQLTools session, seeing that SSMS has been decoupled from the SQL Server install, which is 100% the right thing to have done. Like at other SQL events I bumped into Alex, Steff, Rob (he is literally everywhere ;)), Tobiasz & a number of other SQL people including Mark Broadbent, Niko Neugebauer, André Kamman, John Martin, Mladin Prajdic & Neil Hambley to name just a few. As per all these events, once the curtains for the event have closed that is when the food and drinks appear, and I've realised that I have a soft spot which stops me saying no to going for a curry & drinks with all these amazing people. This means that at future events I'll be planning to stick around for the almost guaranteed after-event curry and the ensuing drinks and conversations that happen around them.

I then had the amazing opportunity to meet and spend a few hours with Ed & Teresa Wilson – The Scripting Guy & Scripting Wife – where I took them for a wander down to the University of Manchester campus and took them to KRO, a nice Dutch place for some food which was right round the corner from where I used to work when I was at UoM. We then strolled leisurely around the campus on the way back towards the venue for the user group, where we had Ed talking us through OMS & Azure Automation DSC now that Ed is a part of the OMS team at Microsoft. Due to the fact that we had to get a train to London at 21:15, the user group was an hour shorter than it normally would be, so we didn't have time for pizza and the after drinks that we would normally have done, but the turnout was still one of the best we've had and there will be more events like it planned in future as well, with an aim to make the next Manchester user group occur in July.

As I mentioned Ed, Teresa and I all had a Train to catch to get to London for WinOps, and much like PSConfEU, I am planning to blog about this event separately to really capture the spirit of the event. Look out for that post in the next week or two.

 

We then had the UKITCamp, at which Marcus Robinson & Ed went over the feature sets of Azure & OMS. I unfortunately missed the morning of this event due to being called onto a customer production issue conference call – 3 hours of my morning I couldn't get back, however sometimes that is how these things go. As I was leaving the venue I found out that the London SQL User Group was on that evening, and I decided to stick around for it as the topic was "Common SQL Server Mistakes and How to Avoid them", which is the kind of SQL topic that I enjoy because it isn't deeply technical but allows me to understand the product just that little bit better than I did beforehand.

Lastly, the London PowerShell User Group, at which we had Ed again and which had the highest turnout so far. Ed was again talking about OMS & Azure Automation DSC, but there were also a number of opportunities for some open, directed questions from the audience, which is always an added bonus of having more & more people turn up to the group. We overran a little with the conversations that were flowing, mainly due to having an excess of beer and pizza – something that we haven't had happen before at the user groups. Then, as per usual with the user groups, we ended up finding somewhere else to go for another drink or two and continue the conversations.

 

So that's most of my last few months summarised – what have you done in that time?

Future posts like this will be much shorter, contain some pictures and be completed on a monthly basis.

Thanks for reading – Hope you have a great day!

Creating a set of simple Pester Tests for existing or old PowerShell Modules & making them easier to update in future.

I have long thought of a way to Automagically create some Pester Tests for the Functions contained in a module that perhaps was developed before Pester was really well known.

At that point we may have been creating psm1 files that contained a number of nested functions within them. I know for one that I did this / added to existing modules that were built this way – have a look at SPCSPS on GitHub (aka SharePointPowerShell on CodePlex), one of the first projects that I got involved with in the Open Source world.

*Please note I would highly advise checking out the OfficeDevPnP team's work at PnP-PowerShell for any real SharePoint PowerShell work, instead of the example I have given.*

However this is where we are at with a number of older modules, and as noted above with SPCSPS, it was an exceptionally common pattern to run into.

However this isn't a very scalable way of working with existing codebases, and as is very frequently found in the PowerShell community, an author will not always be able to spend the time reviewing the code and accepting pull requests from others. I have previously blogged about the need to "Pull The Community Together" to remove this totally unneeded & actually quite ridiculous barrier to better modules for the benefit of the community. From a personal standpoint, a PR to an open source repository that sits open with no input at all for more than a month shows that there is little value in adding to that repository, as it shows the repo owner has little/no time to do the needed code reviews etc.

Now one of the ways that we as a community can negate this issue is to build a stronger collaborative platform for these modules and build teams of people that can be relied on to perform cohesive reviews on various aspects of the code being added. By a platform I mean a collective of ALL of the community, working out who the right people are to get involved in the differing areas of the PowerShell language.

Funnily enough GitHub within Organisations has this model already defined – called Teams. This allows us as a community to have an overarching organisation that will allow us to add the right people to get involved in discussions about certain semantics as we move on in time.

This essentially is a massive change to how we as a community do things, however at this point in time it really is the best way forward to minimize duplicated effort across multiple codebases and to ensure that we have the best & fully functional modules out there for the others in the community to work with.

Again please read my previous post “Pull The Community Together” on my thoughts on this.

Anyway, back to the actual topic of this post. From this point on I will be using the SPCSPS module as a good example to work with, as it is typical of how modules can currently be spread across repos etc.

So with SPCSPS I have 103 Functions that are put together in 11 psm1 files. Here are a few Screenshots just to back this up.

Although this "works", it's not great when there may be a number of additions to one file (new functions, removing existing functions or rewriting functions completely), and this can be an easy way for merge conflicts to occur – which we do not want.

So to get round this I realised that the only way was to write a function that will export all the functions (not cmdlets) from a module and, whilst doing so, create a basic Pester test for each of the exported functions in a user-specified folder. My reason for choosing to do it this way was to allow users to check the exported code before merging it into their existing codebases, even though I am actually quite confident that it will work as expected.

This will allow users to refactor any of their existing code much easier going forward and will allow them to also benefit from having some basic pester tests that they can then expand upon.

 

The key component of this is the Export-Function function which has 2 parameters

  • Function – As a String
  • OutPath – As a String

Under the hood, when passed the function name & the OutPath, the Export-Function function will get the function definition and all the parameters from the function and will then create the below files based on the following structure.

OutFilePath\FunctionVerb\FunctionName.ps1

OutFilePath\FunctionVerb\FunctionName.tests.ps1

Technically this isn’t actually difficult for us to do at all (hey, we are using PowerShell, right?), but it allows us to quickly and easily add tests to existing (or new) code with very little effort – and as a PowerShell enthusiast, this is exactly why I started working with PowerShell in 2013.
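To give a feel for what “export a function plus a basic test” means in practice, here is a heavily simplified, hedged sketch of the idea – the real PesterHelpers code does more (parameter handling, nicer test scaffolding), and everything below other than the Function/OutPath parameter names is an assumption:

function Export-Function {
    param (
        [string]$Function,
        [string]$OutPath
    )
    # Grab the (public) function from the current session
    $cmd = Get-Command -Name $Function -CommandType Function
    $verb = $Function.Split('-')[0]
    $folder = Join-Path $OutPath $verb
    New-Item -Path $folder -ItemType Directory -Force | Out-Null

    # Write the function definition out to its own ps1 file
    "function $Function {`r`n$($cmd.Definition)`r`n}" |
        Out-File -FilePath (Join-Path $folder "$Function.ps1")

    # Write a very basic Pester test alongside it for the user to expand upon
    @"
`$here = Split-Path -Parent `$MyInvocation.MyCommand.Path
. "`$here\$Function.ps1"
Describe '$Function' {
    It 'exists after dot-sourcing' {
        Get-Command -Name $Function | Should Not BeNullOrEmpty
    }
}
"@ | Out-File -FilePath (Join-Path $folder "$Function.tests.ps1")
}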

As a small note, this will only work with public functions – though if you were to explicitly load private functions into the current session so that they become public, then you could use this to do the same for those as well.

The module is available on the PSGallery called PesterHelpers and is available on Github under https://github.com/PowerShellModules/PesterHelpers
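Installing and running it is straightforward – a hedged usage example (the function name and output folder below are just placeholders):

# Grab PesterHelpers from the PowerShell Gallery
Install-Module -Name PesterHelpers -Scope CurrentUser
Import-Module PesterHelpers

# Export one of your own public functions plus a basic test for it
Export-Function -Function 'Get-SPSite' -OutPath 'C:\Temp\ExportedFunctions'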

The benefit of this module is that it can offer a quicker way to move away from modules that contain multiple functions in one psm1 file (or nested ps1 files), and, when used with the accompanying PesterHelpers.psm1 & PesterHelpers.basic.Tests.ps1 files, it can be used to help start building a suite of Pester tests for other modules. This is made possible by modularising as much of the code in both of these files as possible.

A shoutout must go to Dave Wyatt for a section of code that was contributed to ISE_Cew a while back; reviewing it whilst looking to expand that module led me on to creating this module.

 

 

How to find Local User Groups & events – My Experience

I had a discussion last night via twitter with one of the attendees that I met at the Microsoft Cloud Roadshow in London earlier this year and the outcome of the conversation was that although I find it easy to find out about events – this isn’t all that common for others.

 

So I decided that I would quickly jot down some of the places that can be useful to search to find events that are going on around you.

  • Word of Mouth – If you know a number of people in the area ask them if they know of any events going as they will likely be closest to the events.
  • Twitter – There are a number of Twitter accounts out there that are just setup to serve what’s happening in your area. A good example is the @TechNWUK twitter account which lists all the events around the North West that the group knows about.
  • Eventbrite – www.eventbrite.co.uk is another good place to find tech events – especially those that are full day events or conferences – Just do a quick search for a specific Technology and you’ll get some results back on upcoming events around you.
  • Meetup – www.meetup.com is another and increasingly more common area for User Groups to promote themselves on. Similar to Eventbrite but a much more social feel to event listings. You can also find many more non-techy events listed there which can be very interesting and useful. My only gripe with Meetup is the admin cost for setting up a Meetup group which at $89.94 per 6 months for the unlimited subscription isn’t really what I would call reasonable for a user group marketing channel though this does allow multiple groups under the 1 subscription so can be shared as part of a collective – like Get-PSUGUK
  • Facebook & LinkedIn Groups – Both of these can also be an avenue for finding out about User Groups or events.
  • MSDN Events – http://events.msdn.microsoft.com/ – this can have a number of the Microsoft focused events on there as there is the ability to register as a Technical Event lead on https://www.technicalcommunity.com/ and this allows you to get the event posted to the MSDN events pages

 

If you still can’t find any events around you then I would suggest trying the following

  • Speak with those that you recognise from the community – this could be a Twitter DM etc but is normally a good starting point as they may know of events that already exist or are in the initial starting up period
  • Try to reach out to organisers of similar events, as they may well know of one starting up soon in that area, or it may just be advertised in a manner other than the above; this is especially common with more broadly focused technologies like the various JavaScript frameworks.
  • Broaden your search area as some user groups will try not to have meetings too close together. Examples of this would include having groups in Birmingham & Wolverhampton.

 

Lastly, good luck in your search, and if you still haven’t found a user group in your area then why not think about setting one up? If there are already similar communities out there in other areas then reach out to the organisers of those events and see if they can provide any guidance.

The Pains of Poor/Missing Documentation

There will be a time when you are attempting a new task, whether personally or professionally, and you find yourself having to resort to the documentation of the product to get to the end goal – whether that be putting together a new piece of furniture, preparing an exquisite meal or bashing different bits of software together from different companies or, more commonly, the same company.

One thing that is common in all these scenarios is that if the documentation is completely missing then you are forced down the road where you take the “pot luck”/”educated” guess to get to the desired end result and sometimes that can lead to some hilarious results, especially if it is in relation to cooking or building furniture.

In personal experience this has been most common with second-hand furniture and this is because there are few people that keep their assembly instructions once the furniture has been assembled. I think this is due to the “I’ll never need to take this apart and build this again” thoughts that we like to have.

This mentality, as it were, is rather similar in the IT world as well, and it is because of this that we have seen lots of undocumented software features. Anyone who has worked with the SharePoint object models in much depth will be more than familiar with the idea of missing documentation.

 

In the IT world this is something that we have all understood and realised was an issue; at some point in our careers we’ve all been on the receiving end of a lack of documentation or poor documentation, and when it happens we’ve either had to turn to technical forums or write it ourselves.

Over the years this has started to get better and I for one am glad to see the initiatives that Technology Organisations are taking to start Open Sourcing product documentation. A number of Teams at Microsoft are doing this now via Github and this to me reinforces the need for all IT Pro’s & Developers to understand how to use Github & the underlying Git software as a part of the core tools within their tool belts. In 3 years time I wouldn’t be surprised if other Source Control mechanisms like SVN & Mercurial have almost been fully replaced by Git. It says something that Microsoft have fully adopted Git into both the Hosted and On-Premises versions of TFS.

So if you read this blog and you haven’t learnt Git yet but are writing PowerShell – go and watch this Session that I did for the Mississippi PowerShell UserGroup as detailed in this previous post and read up on the “My Workflow With Git” Series starting with this post

 

We are at a good point in time where the people behind the products we love and use each day are listening to us in a much more open way than previously and over the coming weeks I’ll be updating the following site with all the Microsoft UserVoice / Connect links and in a nicer format than they currently are.

If you want to help and get involved then drop me a message and I’ll get you added to the Organisation to be able to add commits

Building A Lab using Hyper-V and Lability – The End to End Example

Warning – this post is over 3800 words long and perhaps should have been split into a series – however I felt it best to keep it together – Make sure you have a brew (or 2) to keep you going throughout reading this

In this post we will be looking at how you can build a VM lab environment from pretty much scratch. This may be for testing SharePoint applications, SQL Server or Exchange, or it could be for additional peace of mind when deploying troublesome patches.

Our requirements for this include

  • Machine capable to Run Client Hyper-V – Needs SLAT addressing (most machines released in last 3 years are capable of this)
  • Windows 8.1 / 10 / Server 2012R2 / Server 2016 TP* – In this post I will be using Windows 10 build 14925 – ISO download is available from here
  • If using Windows 8.1 then you will need to install PowerShell PackageManagement – you can use the script in my previous post to do this as detailed in here
  • A Secondary/External Hard Drive or Shared Drive – this is to store all Lability Files including ISO’s, Hotfixes & VHDX files

Where do we begin?

Obviously you need to install your version of Windows as detailed above and once you have done this you can crack on!

Time Taken – ??? Minutes

However as mentioned I’m going to Use Windows 10 – This is just personal preference and is for my ease of use.

As you hopefully know by now Windows 10 comes with WMF5 and therefore we have PackageManagement installed by default. We will use this to grab any PowerShell Modules that we need from the Gallery. I personally have a Machine Setup Script that lives in my Onedrive as you can see below. As this is a Windows 10 Machine I am logging into it with my Hotmail credentials – this then means that I am able to straight away pick the folders that I want to sync to this machine (joys of the integrated ecosystem)

This takes about 5 minutes for OneDrive to finish syncing and then we are ready to go onto the next step.

Time Taken – 5 Minutes

[Screenshot: my machine setup script syncing down via OneDrive]

In this stage I will Open ISE with Administrator Privileges – this is required as I need to change the Execution Policy from Restricted to RemoteSigned as well as run other scripts that require elevation.

Once I have done this I can move onto the next step. This includes setting up my PowerShell Profile and Environment Variables and then setting up all the required functionality for me to continue working on this new machine.

This includes setting up the ability to install programs like VS Code & Git via Chocolatey and installing modules from the PowerShell Gallery – a few examples being ISE_Cew, ISESteroids and, importantly for this post, Lability. It is also worth noting that at this point I am not downloading any DSC Resources as part of my setup script – this is because we will cover this later on as part of the workings of Lability.

As an additional note, it is worth mentioning that the version of Lability at the time of writing this article is 0.9.8 – however this is likely to change in future with more features being added as required. If you have a thought, suggestion or issue then head over to the GitHub repo and add it there.

In this script I am also enabling the Hyper-V Windows feature to enable me to carry on with this lab, and I then initiate a system shutdown. Overall this whole section takes maybe about 10 minutes to complete & yes, I intend to build this as a DSC resource in the near future. However, it is worthwhile to note that Lability has a function that will ensure that the Hyper-V feature is enabled & that you are not awaiting a system reboot – more on this a little later on.

Time Taken – 15 minutes
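For reference, the relevant chunk of that setup script looks roughly like the following hedged sketch – the package list, module list and feature name are assumptions based on what I’ve described above, and it assumes Chocolatey is already installed:

# Allow local scripts to run
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine -Force

# Programs via Chocolatey
choco install vscode git -y

# Modules from the PowerShell Gallery
Install-Module -Name ISE_Cew, Lability -Force

# Enable Hyper-V and restart to finish the job
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-All -NoRestart
Restart-Computer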

Once the reboot has completed we can then get on with the Lability bits and that is the real interesting part of this post.

Lability Functions

Lability has 38 public functions and 6 Aliases as can be seen below.

[Screenshots: the Lability public functions and aliases]

I wouldn’t worry too much on the aliases as these are built in for continued support from prior versions of the Lability Module and will likely be removed on the 1.0 release.

We will be using a number of these functions throughout, and as is always best practice, have a read of the help for the functions – and yes, they do include some great comment-based help.
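A quick way to explore what’s there before going any further – nothing clever, just the standard discovery cmdlets:

Get-Command -Module Lability -CommandType Function
Get-Command -Module Lability -CommandType Alias
Get-Help Start-LabConfiguration -Full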

There are a number of additional private functions in the Lability module that have comment based help too but again I wouldn’t be worrying about these too much, unless you need to do a lot of debugging or want to help add to the module.

The key Lability functions that you will need, likely in the below order, are:

  • Get-LabHostDefault
  • Set-LabHostDefault
  • Reset-LabHostDefault
  • Get-LabVMDefault
  • Set-LabVMDefault
  • Reset-LabVMDefault
  • Start-LabHostConfiguration
  • Get-LabHostConfiguration
  • Test-LabHostConfiguration
  • Invoke-LabResourceDownload
  • Start-LabConfiguration
  • Start-Lab
  • Stop-Lab
  • Get-LabVM
  • Remove-LabConfiguration
  • Test-LabConfiguration
  • Import-LabHostConfiguration
  • Export-LabHostConfiguration

These are just a few of the Functions available in Lability and we will cover most of these functions in greater detail as we head through this article.

Lability Media Files

Lability has a number of different configuration files all in JSON format, and these are HostDefaults, VMDefaults & Media. All of these files are in the Config folder of the Lability Module which on your new Machine will be C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config
The HostDefaults file contains all the settings that we associate with the Lability host machine. These include the paths where we will be looking for any ISOs, VHDXs, hotfixes and any additionally required resource files for our lab.

The VMDefaults file contains all the default settings that we associate with the created VM’s. This includes Media used to create the Machine, Startup RAM, Number of Processors and which virtual switch we can expect the VM’s to use. This can be useful to have just like the HostDefaults but as we will see later on in this post we are most likely to override this in our configurations.

The Media file contains the settings for any media that we want to use. As Lability by its nature was built for building labs, it uses the evaluation-licensed media for the VMs.

The benefit of this is that the items already in this file allow you to get on with building labs almost straight away on a brand new machine.

This file has some included hotfix download links for getting the DSC updates on WMF4 for Server 2012 R2 & Windows 8.1 – but don’t worry, Lability uses these to download the hotfixes and embed them into the VHD files for you. One less job to worry about ;)
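You rarely need to open the JSON files by hand, as the same information is surfaced by the module itself – a quick hedged example (check the function list from Get-Command above if any of these names differ in your version):

Get-LabHostDefault   # host paths from HostDefaults.json
Get-LabVMDefault     # VM defaults from VMDefaults.json
Get-LabMedia         # the registered evaluation media entries from Media.json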

LabHost Defaults

Firstly we need to get the LabHost Defaults setup correctly for our environment – this is important and also is great for being able to move Labs between machines if required ( I’ve had to do this a fair amount myself ) and is why I recommend that all the core Lability bits are installed on a Separate Drive.

Personally I’m using an External Hard Drive but that is because my Lab is portable. I have not tried this with a Shared Drive however there shouldn’t be much that needs to change to get it working that way.

On my external drive I have the following setup – a folder called Lability, containing all the folders required by Lability as detailed in the LabHost defaults that we will see below. However, I also have another folder, Lability-Dev, which came from the zip download of the repository on GitHub, from before Lability was made available on the PowerShell Gallery. In essence this means that I have a copy of Lability that I can edit as required – especially the 3 Lability configuration files detailed in the previous section – and it also allows me to do additional debugging as required.

Firstly we will run Get-LabHostDefault, and this should return the below by default – this is because the HostDefaults.json file is stored in the C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config location (remember 0.9.8 is the current version – yours may vary).

[Screenshot: default Get-LabHostDefault output]

As this is the default and I’ve been using Lability on a few different machines I have a copy of it on my External HDD in the Lability Folder. Lets see what that file says it should be.

[Screenshot: the copy of the HostDefaults file from my external drive]

Well – that’s not good! As you can see, on my last machine the external drive had been the D drive, but on this machine it’s the E drive. A simple (yet annoying) thing that we can easily change. Now this could be done manually, but I decided that I wanted to wrap this all together so that I don’t have to think about it again. This is simple enough, so I just wrapped it in a very simple function as seen below.

[Screenshot: wrapper function that updates the Lability host default paths]
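The function in that screenshot isn’t reproduced in text here, but a rough sketch of the idea is below – the Set-LabHostDefault parameter names and folder names are from memory, so double-check them against Get-Help Set-LabHostDefault, and my actual version also copies the updated HostDefaults.json back to the external drive, which this sketch skips:

function Update-LabHostDrive {
    param (
        [string]$DriveLetter = 'E'
    )
    $root = "${DriveLetter}:\Lability"
    # Point all the Lability host paths at the external drive in one go
    $paths = @{
        IsoPath             = "$root\ISOs"
        ParentVhdPath       = "$root\MasterVirtualHardDisks"
        DifferencingVhdPath = "$root\VMVirtualHardDisks"
        HotfixPath          = "$root\Hotfixes"
        ResourcePath        = "$root\Resources"
        ConfigurationPath   = "$root\Configurations"
    }
    Set-LabHostDefault @paths
}

Update-LabHostDrive -DriveLetter 'E'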

This allows me to update the paths as I move between machines quite easily. It isn’t an ideal scenario but it works at least.

The benefit of this is that it will update the HostDefaults file on both my C: drive and the external drive at the same time – which further means that this setup is easier to keep portable.

We can then run the function Reset-LabHostDefault and we should get something similar to the below

[Screenshot: Reset-LabHostDefault output]

We can also do the same thing for the VMDefaults file however I find this is less likely to be a requirement as we can override the defaults in the configuration data files that we will work with and this is my preferred method.

Once we have done this we are ready to run the function Start-LabHostConfiguration – on a new machine this will go and create the required directories as specified in the HostDefaults.json file that I have shown you how to amend. The output from Start-LabHostConfiguration is below.

[Screenshot: Start-LabHostConfiguration output]

We would then use Test-LabHostConfiguration to confirm that this is all correct and we can see that this is the case below

[Screenshot: Test-LabHostConfiguration confirming the host is configured correctly]

Building your First Actual Lab

Wow, that was a fair bit of setup required, though a lot of it may be completely ignored depending on your own setup or if you’re revisiting this post.

Now we move onto the real meaty part of the post and I’m going to use 2 examples for this – The Bundled TestLabGuide and one of my own for a SQLServer install.

So, starting with the TestLabGuide.ps1 file, there is only one small modification that I have made, and this is at the end of the file: to include the following 2 lines.

[Screenshot: the two lines added to the end of TestLabGuide.ps1]

This allows me to build the configuration for these VMs as if it was a script and this is how I am personally doing it.
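The two lines themselves aren’t reproduced in text here, but they are essentially a “compile this configuration now” call; a hedged approximation (assuming the configuration takes a Credential parameter, and with the output path pointed at your Lability ConfigurationPath) would be:

$credential = Get-Credential -UserName 'Administrator' -Message 'Lab administrator password'
TestLabGuide -ConfigurationData .\TestLabGuide.psd1 -Credential $credential -OutputPath 'C:\Lability\Configurations'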

However, on a machine with no DSC resources we have an issue if we are building VMs that are dependent on those DSC resources.

Well within Lability there is a Function called Invoke-LabResourceDownload and this has the ability to download all the required resources that we need as defined in our configuration data file.

Within the configuration data file shown below, the key section for us to look at at this point is the NonNodeData section, where we have a subsection for Lability configuration items; this can include EnvironmentPrefix, Media, Network, Resources & most importantly for us, DSC Resources.

So far I have found that we only need to run this for pulling the DSCResources as defined in our configuration data file as shown below – this is because we require them to be on the machine before we can build the mof files.

[Screenshot: the DSCResource entries in the configuration data file]

I found it best to have the DSC resources pinned as RequiredVersion and not MinimumVersion, as it is by default in the TestLabGuide.psd1 file – this is a preference, but with the amount of change happening to the DSC resources it’s worthwhile being extra cautious here.
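For illustration, that part of the configuration data looks roughly like this hedged sketch (module names and version numbers are purely illustrative):

NonNodeData = @{
    Lability = @{
        DSCResource = @(
            # RequiredVersion pins the exact resource version, keeping the lab reproducible
            @{ Name = 'xComputerManagement'; RequiredVersion = '1.8.0.0'; Provider = 'PSGallery' }
            @{ Name = 'xNetworking'; RequiredVersion = '2.12.0.0'; Provider = 'PSGallery' }
        )
    }
}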

The output from Invoke-LabResourceDownload can be seen below and this as we can see has downloaded only the DSC Resources that we specified in the Configuration data file (TestLabGuide.psd1)

[Screenshot: Invoke-LabResourceDownload output]
This also means on a clean machine you will be sure that you have the right required versions. This is especially useful when building Labs in my opinion.
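The call itself is a one-liner – a hedged example (I believe the switch is -DSCResources, but check the help as parameter names may differ between versions):

# Pull down just the DSC resources declared in the configuration data
Invoke-LabResourceDownload -ConfigurationData .\TestLabGuide.psd1 -DSCResources -Verbose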

However if you have multiple Labs running concurrently then the next bit may be an unfortunate blow to you.

Within the Configuration keyword we have a Dynamic Keyword defined – this is Import-DSCResource – which you may have thought was a function.

With it being a dynamic keyword it works a little differently to a normal function/cmdlet, and therefore we are limited as to what we can do with it – for example we cannot use splatting with it, and we also cannot pass the required DSC resource modules to it from outside the current file. This is required for the syntax highlighting that we get as part of the parser. If you want to learn more about the Import-DscResource dynamic keyword then read this article by the PowerShell Team – be wary, it is from 2014 and there hasn’t really been any better content come out on this since (that I can find, anyway).

My thought on this is that we should be able to pass the required DSC resources through from the configuration data file, as we have already detailed – however this isn’t currently possible. To me it would be beneficial (& logical) to be able to abstract this away from the configuration, as it is really a part of the configuration data, especially seeing as we already have to pass configuration data to our outputted configuration keyword – in this case TestLabGuide. However, this is where we are, and at this time we will need to mirror the DSC resources between both the configuration itself and the configuration data file.
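For anyone that hasn’t seen it, the duplication looks like this – a minimal hedged sketch where the module names/versions are illustrative and must match what is in the configuration data (the -ModuleVersion parameter needs WMF5):

Configuration TestLabGuide {
    param ([PSCredential]$Credential)

    # Dynamic keyword: module names & versions have to be hard-coded here
    # and mirrored in the Lability DSCResource entries of the configuration data
    Import-DscResource -ModuleName xComputerManagement -ModuleVersion '1.8.0.0'
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node $AllNodes.NodeName {
        # ...resources go here...
    }
}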

However, that aside, let’s look at the node data and especially the AllNodes section, which is where NodeName = '*'.

[Screenshot: the AllNodes section of the configuration data]

As we can see in here we have a few settings for the all the nodes in this configuration that will share items from Lability and these include the items we had available in the VMDefaults file as well as some other items too that we would want shared between the VM’s like DomainName etc

Further down we can see that for the client VMs in this lab we are specifying different Lability_Media values for each – so it looks like we will have both a Windows 8.1 & a Windows 10 client machine in this lab.

[Screenshot: the client node entries with their different Lability_Media values]
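To show the shape of that node data, a hedged sketch is below – the Lability_* property names are as I understand them and the media IDs are illustrative (Get-LabMedia will list the real ones):

AllNodes = @(
    @{
        # Settings shared by every VM in the lab
        NodeName                = '*'
        DomainName              = 'corp.contoso.com'
        Lability_SwitchName     = 'Corpnet'
        Lability_ProcessorCount = 1
        Lability_StartupMemory  = 2GB
    }
    @{
        NodeName       = 'CLIENT1'
        Lability_Media = 'WIN81_x64_Enterprise_EN_Eval'
    }
    @{
        NodeName       = 'CLIENT2'
        Lability_Media = 'WIN10_x64_Enterprise_EN_Eval'
    }
)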

 

That’s enough about the configuration and configuration data side of things – let’s go and build our lab.

At this point what we want to do is just do the below.

[Screenshot: running the configuration script to generate the mof files]

At this point you will be prompted for an administrator password and, once that has been given, as we can see above it will go and create all the mof files that we need for this lab. The next step is to go and kick off the actual build of the lab, which can be done as shown below.

[Screenshot: kicking off the build with Start-LabConfiguration]
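Again a hedged sketch of the call behind that screenshot (I’m reasonably sure of -ConfigurationData and -Credential, but check Get-Help Start-LabConfiguration for the full parameter set):

# Downloads media & hotfixes, builds the VHDXs and injects everything each VM needs
Start-LabConfiguration -ConfigurationData .\TestLabGuide.psd1 -Credential $credential -Verbose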

This function, Start-LabConfiguration, is the key function of this module, as it will go and

  • check that the Lability host is correctly set up – by calling Test-LabHostConfiguration – if not it will throw an error (possible update here)
  • download any ISOs that are required, as expressed in the configuration data, if the image matches one listed in the Media file. It will match these to the checksum value given in the Media file for the image
  • download any hotfixes that are detailed in the Hotfix section of the matched media in the media.json file
  • build a master VHDX file from the ISO & hotfixes as detailed for the media type of the lab VMs downloaded above – it is worthwhile to point out that this is built from lots of smaller functions that are essentially based off the Convert-WindowsImage script
  • build a lab-specific VHDX file – this is currently set up as a 127GB dynamic differencing disk
  • build and inject a lab VM specific unattend.xml file into each lab VM VHDX
  • inject all required certificates into each lab VM VHDX
  • download & inject any resources that are defined in the Lability section of the NonNodeData section of the configuration data file – I will show more on this in the SQL example later on. These are injected into the lab-specific VHDX file
  • inject all required DSC resources into the resulting lab VM specific VHDX file
  • inject the mof and meta.mof files for each lab VM into the corresponding VHDX file.

Seriously though – wow – that one function is doing a lot of what I would call tedious work for us, and depending on your internet connection speed it can take anywhere between maybe 30 minutes and a day to complete. The first time I ran it I think it took about 7 hours to complete due to slow internet – & I was also watching Netflix at the time ;)

You can see the final output from this Function below

[Screenshot: the final output from Start-LabConfiguration]

Note – If you have your own Media you could always create new entries in Media.json for these to save the download time – Especially if you have a MSDN License

Now this is where the fun bit really starts and it also involves more waiting but hopefully not as long as the last bit took you.

All we need to do at this Point is run Start-Lab like shown below and let DSC do its thing – note that I’ve used Get-VM and not Get-LabVM – this is a small issue that I have faced and have reported it on the Github Repo

[Screenshot: Start-Lab output]
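The commands behind that are simply the below – Start-Lab reads the same configuration data, and Get-VM is the standard Hyper-V cmdlet:

# Boot every VM defined in the configuration data, then check on them
Start-Lab -ConfigurationData .\TestLabGuide.psd1
Get-VM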

And here is an image of all the VM’s running and getting started

[Screenshot: all the lab VMs running in Hyper-V]

 

This part can take anywhere from 10 minutes to a few hours depending on your VM rig setup, the amount of RAM allocated to each VM as part of your configuration data, whether there is a requirement to wait for other machines to be in their desired configuration, and the complexity of the configurations being deployed.

Under the hood, Lability has injected the DSC configuration into the VM VHDX and has set up a bootstrap process which in turn calls Start-DscConfiguration and passes the path of the mof files to it. You can have a look at how this is set up inside a VM in the folder C:\Bootstrap\ if you are interested.
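In other words, inside each VM something roughly equivalent to the line below runs at first boot – a simplified sketch, assuming the mof files sit alongside the bootstrap script (the real bootstrap does a fair bit more):

Start-DscConfiguration -Path 'C:\Bootstrap' -Wait -Verbose -Force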

Once that is done you’ll have your first fully deployed set of VM’s using DSC & Lability – Pretty amazing isn’t it!

 

SQL Server install – Showing  some of the other features of Lability

In this section I’ll try and keep the content to a minimum but still add in some additionally useful screenshots.

My configuration data file for the SQL Server node is shown below; notice how we have the required properties to be able to install SQL – SourcePath, InstanceName, Features – and the Lability_Resource entry.

[Screenshot: the SQL Server node in the configuration data file]

As this was taken from a previous configuration, it is using the xSQLServer DSC resource – take a look at cSQLServer here, as this will likely be the version that gets ported to replace the xSQLServer & xSQLPS resources, as it is relatively close to being usable in place of the two. Expect news on this after PSConfEU.

Also note that in the Configuration Document we are specifying an additional item in the NonNodeData Section – Resource

[Screenshot: the Resource entry in the Lability NonNodeData section]

This allows us to specify further resources that are stored in the E:\Lability\Resources\ folder (E being in my case of course)

I’ll let you decide what you want to put in that folder, but any items for installation from inside the VM could be candidates – things like SharePoint media or SQL media or other installable programs, etc. You could always add your personal script library in a zip file and then get Lability to unzip it into the right directory. The choices are up to you on this one – so be creative ;)
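To tie the two screenshots together, here is a hedged sketch of the node entry and the matching Resource entry – every value shown is illustrative, and you should check the Lability help for the exact keys supported by Resource entries (Uri, Checksum, DestinationPath, etc.):

# Node entry in AllNodes
@{
    NodeName          = 'SQL1'
    Lability_Media    = '2012R2_x64_Standard_EN_Eval'
    Lability_Resource = @('SQLServer2014')
    SourcePath        = 'C:\Resources\SQLServer2014'
    InstanceName      = 'MSSQLSERVER'
    Features          = 'SQLENGINE'
}

# Matching entry under NonNodeData -> Lability -> Resource
Resource = @(
    @{
        Id       = 'SQLServer2014'
        Filename = 'SQLServer2014.zip'
        Expand   = $true   # unzip into the VM's resource folder
    }
)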

For this lab I didn’t have the installation media already downloaded, so this had to be downloaded as part of the Start-LabConfiguration function – however, if you remember, there was an Invoke-LabResourceDownload function.

This has some additional Parameters that allow you to download any of the required items for the LabConfiguration to succeed. This can be useful for example if you happen to have a few hours where the Internet Connection is much better than that of your own – especially if you are using this for personal testing and not professional lab testing as it was originally designed to be for.

One of the other great things with this module is that you can make use of it for your lab environments regardless of whether your shop is using WMF5 or not. If you’re still running WMF4 (with the essential DSC updates) then you can still build labs using this.

Wrap up

Well I hope you’ve enjoyed reading this 3800+ word post of mine and this helps you get to grips with building out Labs in an easy and repeatable way whilst having the chance to play with DSC to do it.

Remember that this Module DOES A LOT behind the scenes – if it didn’t there wouldn’t be the need for this post – and there is more functionality being introduced as appropriate all the time.

Lability is built for building labs – however you could easily use this for building production-like environments, if you dare, and I can see the benefit of doing so; I mean, why re-invent the wheel when Lability will do a lot (most) of the work for you?

Like with getting to grips with most new modules always start with the Help files. This Module has a number of about_* help files and almost all the functions (even the internal ones) have Comment Based Help.

This is a module where you need to RTFM to really understand all the workings of it. Spend a few hours looking through it and understanding it as best as you can. It will be so worth it in the long term, even after reading this post a decent number of times.

I do, however, take my hat off to Iain Brighton (@iainbrighton) for creating this module, and for me it is the only module to use when building lab environments – so let’s gather some momentum as a community to suggest enhancements and improve it even more over on the GitHub repo.

My example files that I have used (especially the SQL one) will be made available in due course once Iain has decided on a scalable way forward for being able to share Lab Configurations. We have discussed a number of options (yet to be added to this issue) and if you have an Idea please add it via the Lab Sharing Issue on Github.

This is just the first in a series of posts that I intend on doing on Lability – although future ones will be much shorter but will focus in depth around the functions that I haven’t covered in this post along with some of the more interesting parts in more depth. However I expect that this will be a good starting point for you to get to grips with the Lability Module and start building and running test labs.

As per usual please let me know your thoughts on this post whether it’s via Twitter or via the below comments section and I hope you have enjoyed the read.

Awarded the MVP Award – What this means to me and the future for the community

The MVP Award is defined by Microsoft as the below

Microsoft Most Valuable Professionals, or MVPs, are community leaders who’ve demonstrated an exemplary commitment to helping others get the most out of their experience with Microsoft technologies. They share their exceptional passion, real-world knowledge, and technical expertise with the community and with Microsoft.

This means that within the different areas of the Microsoft Stack there are those out there that really believe that the world can be a better place when we come together as a united front and share the knowledge that we have.

This can be knowledge that we have gained through personal experience of working with the products that we find the most interesting and beneficial to our personal & professional lives or though being there as a point of call for other members of the community to reach out to.

One thing about the MVP Program that has always struck me as amazing is the willingness of the MVPs to do what they can to help you, even if it doesn’t immediately help them achieve anything, often giving away a decent-sized proportion of their own time to do so. On reflection, on receiving this award, over the last year I’ve been doing the same, although completely unaware that I had been doing so.

I have attended a number of different events in the last year (for more details check out the Where I Have Been page) and have met a tremendous number of amazing people at all these events. It was the framework for the SharePoint & SQL user groups within the UK that led me to start thinking about reviving the PowerShell user groups; I have blogged about this in this post and I have enjoyed every minute of it.

The future for the UK PowerShell User Groups looked good however with being Awarded MVP last week the connections that I will make from being part of the UK MVPs will hopefully allow for the User Groups to grow in the coming months/years so expect there to be news of new User Groups forming in the coming months across the UK.

To help the groups grow, I’ll be putting together an “Organisers Pack” which contain useful information and a collection of the tools, contacts and general tips required  which will help those interested in running a local group get it off the ground – however if in doubt get in contact with me.

 

However, there is another aspect to receiving the MVP Award that I want to touch on briefly. As part of the MVP Program, MVPs get the opportunity to help out in more community-focused events, some run by Microsoft, others run by the community and others run by non-profit organisations or the education sector. Giving back to the immediate communities is always going to be high up on my list of priorities, however I am really looking forward to working with some of the bigger and more personally touching social opportunities over the next year.

 

This does mean that my calendar will be much busier but for me the end result is always going to be worth it.

Finally – A small shoutout to those that have supported me over the years and especially the last year and although I will not name anyone in particular, I’m sure that those people already know who they are!

2016 – 1 Quarter Down and 3 more to go and the Fun has only just begun!

Fooled Ya! Today I became an MVP!

 

Well only if you read this post

[Image: MVP 2016 award]

This is an exceptional honour to have been awarded the MVP for Cloud and DataCentre Management and to me this kinda feels like an early birthday present from Microsoft (my birthday is on Monday)

This isn’t something that I ever expected to achieve however it is a recognition from Microsoft themselves of the work that I have previously done for the community.

I started off down the community path only last year; in that time I have made some amazing friends and met a number of other MVPs along the way.

For the remainder of 2016 I have a lot planned to help further enhance the community and hopefully break down some of the barriers between the IT pro world and the development world that PowerShell has found itself right in the middle of, to make this technology more accessible to all that need to use it.

With that in mind over the next few months there will be some further announcements about Get-PSUGUK – the UK PowerShell Community and its evolution.

Through the friends I’ve made in the MVP community, it has been agreed with Chris Testa-O’Neill (Data Platform MVP) that this year at SQL Saturday Manchester there will be a dedicated PowerShell track. This will consist of mainly introduction sessions for those that have little or no PowerShell experience, but there will also be some sessions on using PowerShell with a SQL focus. This is an amazing FREE event and it is as much an honour for me to be working on that as it is to receive the MVP Award – so if you’re interested in attending check out http://www.sqlsaturday.com/543 – announcements on sessions will be coming in the coming months.

Stay tuned for more details in future and as always – Keep Learning, Networking & Finishing what you Start.

 

Now for the weekend of celebrations to begin :)

Thank you Microsoft and an even bigger thanks to you – the people reading this Post, keep doing what you do and helping make the community as great as it is.

Invoking PSScriptAnalyzer in Pester Tests for each Rule

This is a quick walkthrough on how you can get output from PSScriptAnalyzer rules in your Pester tests.

So you’ll need

  • Pester ( Version 3.4.0 or above )
  • PSScriptAnalyzer ( Version 1.4.0 or above )

Please note this is shown running on PowerShell  v5 as part of Windows 10 Build 14295 – results may vary on other PowerShell Versions

Given the way we want to work, we may have new ScriptAnalyzer rules in the near future (a new version, additional community additions, your own custom ScriptAnalyzer rules, etc.) and we want to ensure that we test for them all without having to change much of the below code, which handles this dynamically within our Context block.

 

So our example code in our Pester Test would look like this

#Script#PesterScriptAnalzyerExample#

$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$Scripts = Get-ChildItem "$here\" -Filter '*.ps1' -Recurse | Where-Object {$_.Name -NotMatch 'Tests.ps1'}
$Modules = Get-ChildItem "$here\" -Filter '*.psm1' -Recurse
$Rules = Get-ScriptAnalyzerRule

Describe 'Testing all Modules in this Repo to be correctly formatted' {
    foreach ($module in $Modules) {
        Context "Testing Module  – $($module.BaseName) for Standard Processing" {
            foreach ($rule in $Rules) {
                It "passes the PSScriptAnalyzer Rule $rule" {
                    (Invoke-ScriptAnalyzer -Path $module.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                }
            }
        }
    }
}

Describe 'Testing all Scripts in this Repo to be correctly formatted' {
    foreach ($script in $Scripts) {
        Context "Testing Module  – $($script.BaseName) for Standard Processing" {
            foreach ($rule in $Rules) {
                It "passes the PSScriptAnalyzer Rule $rule" {
                    (Invoke-ScriptAnalyzer -Path $script.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                }
            }
        }
    }
}

And the result that we would get would be the below

 

PS C:\Users\Ryan\OneDrive\GitHub\kilasuit\Scripts-WIP\PesterScriptAnalzyerExample> .\PesterScriptAnalzyerExample.ps1

Describing Testing all Modules in this Repo to be correctly formatted

Describing Testing all Scripts in this Repo to be correctly formatted

   Context Testing Module  – PesterScriptAnalzyerExample for Standard Processing

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingCmdletAliases 233ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidDefaultValueSwitchParameter 124ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingEmptyCatchBlock 134ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidGlobalVars 87ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidInvokingEmptyMembers 104ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidNullOrEmptyHelpMessageAttribute 70ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingPositionalParameters 879ms

    [+] passes the PSScriptAnalyzer Rule PSReservedCmdletChar 75ms

    [+] passes the PSScriptAnalyzer Rule PSReservedParams 81ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidShouldContinueWithoutForce 85ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingDeprecatedManifestFields 117ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidDefaultValueForMandatoryParameter 123ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingUserNameAndPassWordParams 95ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingComputerNameHardcoded 113ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingConvertToSecureStringWithPlainText 98ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingInvokeExpression 75ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingPlainTextForPassword 103ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingWMICmdlet 138ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingWriteHost 91ms

    [+] passes the PSScriptAnalyzer Rule PSMisleadingBacktick 100ms

    [+] passes the PSScriptAnalyzer Rule PSUseBOMForUnicodeEncodedFile 100ms

    [+] passes the PSScriptAnalyzer Rule PSUseToExportFieldsInManifest 87ms

    [+] passes the PSScriptAnalyzer Rule PSUseOutputTypeCorrectly 128ms

    [+] passes the PSScriptAnalyzer Rule PSMissingModuleManifestField 84ms

    [+] passes the PSScriptAnalyzer Rule PSPossibleIncorrectComparisonWithNull 99ms

    [+] passes the PSScriptAnalyzer Rule PSProvideCommentHelp 98ms

    [+] passes the PSScriptAnalyzer Rule PSUseApprovedVerbs 75ms

    [+] passes the PSScriptAnalyzer Rule PSUseCmdletCorrectly 867ms

    [+] passes the PSScriptAnalyzer Rule PSUseDeclaredVarsMoreThanAssigments 82ms

    [+] passes the PSScriptAnalyzer Rule PSUsePSCredentialType 91ms

    [+] passes the PSScriptAnalyzer Rule PSShouldProcess 160ms

    [+] passes the PSScriptAnalyzer Rule PSUseShouldProcessForStateChangingFunctions 86ms

    [+] passes the PSScriptAnalyzer Rule PSUseSingularNouns 177ms

    [+] passes the PSScriptAnalyzer Rule PSUseUTF8EncodingForHelpFile 176ms

    [+] passes the PSScriptAnalyzer Rule PSDSCDscTestsPresent 98ms

    [+] passes the PSScriptAnalyzer Rule PSDSCDscExamplesPresent 102ms

    [+] passes the PSScriptAnalyzer Rule PSDSCUseVerboseMessageInDSCResource 81ms

    [+] passes the PSScriptAnalyzer Rule PSDSCUseIdenticalMandatoryParametersForDSC 110ms

    [+] passes the PSScriptAnalyzer Rule PSDSCUseIdenticalParametersForDSC 74ms

    [+] passes the PSScriptAnalyzer Rule PSDSCStandardDSCFunctionsInResource 122ms

    [+] passes the PSScriptAnalyzer Rule PSDSCReturnCorrectTypesForDSCFunctions 101ms

 

This allows you to see from your test if it fails or not and as shown is able to be used for scripts and modules.

This is also a good example of getting Pester to test your Pester tests ;)

This example is being added into ISE_Cew (see post) in the next feature release (some point next week), though you can also just copy and paste it from this blog post, thanks to a PowerShell ISE add-on called CopyToHtml by Gary Lapointe; you can find more about it and download it at http://blog.falchionconsulting.com/index.php/2012/10/Windows-PowerShell-V3-ISE-Copy-As-HTML-Add-On/

 

Please note that although the above works fine, I don’t see the point in running the Describe block if the tests within it won’t run, so I’m adding what I think is the better version below – this will only run the Describe blocks if there are any scripts or modules.

#Script#PesterScriptAnalzyerExample#

$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$Scripts = Get-ChildItem "$here\" -Filter '*.ps1' -Recurse | Where-Object {$_.Name -NotMatch 'Tests.ps1'}
$Modules = Get-ChildItem "$here\" -Filter '*.psm1' -Recurse
$Rules = Get-ScriptAnalyzerRule

if ($Modules.Count -gt 0) {
    Describe 'Testing all Modules in this Repo to be correctly formatted' {
        foreach ($module in $Modules) {
            Context "Testing Module  – $($module.BaseName) for Standard Processing" {
                foreach ($rule in $Rules) {
                    It "passes the PSScriptAnalyzer Rule $rule" {
                        (Invoke-ScriptAnalyzer -Path $module.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                    }
                }
            }
        }
    }
}

if ($Scripts.Count -gt 0) {
    Describe 'Testing all Scripts in this Repo to be correctly formatted' {
        foreach ($script in $Scripts) {
            Context "Testing Module  – $($script.BaseName) for Standard Processing" {
                foreach ($rule in $Rules) {
                    It "passes the PSScriptAnalyzer Rule $rule" {
                        (Invoke-ScriptAnalyzer -Path $script.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                    }
                }
            }
        }
    }
}

C’Ya Manchester – Hello Derby!!

After yet more changes in my housing (see my previous post for a bit of background), I have decided to settle in Derby, and this has been for a few very good reasons.

  • I’ve got a good group of friends here.
  • Manchester, London, Nottingham, Birmingham & Wolverhampton are all short train journeys away and I’m expecting to spend more time between them in the coming months ahead.
  • I almost moved here back in January – but I decided to try and give Manchester another go from the work perspective and this seemingly wasn’t to be the case

However I made a good group of friends in Manchester at the various events I’ve been to there over the last 2 years (more so the last year – see this page for more details) and I’ll still be trying to attend some of the amazing events there when it is feasible.

This is yet another new chapter in the book of my life and it is one that really should allow me to lay the foundations for the future a bit easier.

3 and a half months into 2016 and it feels like it’s the beginning of yet another new year.

Pulling the Community Together to Improve the Quality of PowerShell Modules

In a discussion that started on Twitter a while back with June Blender about the quality of the modules being posted to the PowerShell Gallery, I had an idea of a way that we could help improve this from the community – using the tools that we have available to us and, more importantly, the expertise of the rest of the community to help shape and guide the direction of modules.

The idea starts off with a simple GitHub organisation, in this case this one – PowerShellModules – in which we set up a number of teams. Some of the teams that I have had in mind include, but are not limited to: documentation, unit testing, practice guidance and module maintainers.

This means that there will be a much more open way of directing the development of Modules for the community but still having the ability to allow modules to be developed in a way that allows others to add to them instead of branching out and creating a second module. This also will mean that there is a stronger focus on modules being updated as the overhead isn’t on a single maintainer but can be given to a number of multiple maintainers at any given time.

 

My thoughts around this would start with me porting my current ‘in progress’ modules to the organisation and building out the teams as mentioned above with a suggestions/RFC like repository that would allow us to drive this for the better of the community.

The end goal from my perspective would be to have 1 community recommended Module for a technology (examples being GitHub, Trello, VSTS, Slack etc) that has been widely community developed that covers as much of the functionality as it can without the need for a separate module.

We have a great community of people and I think that this is the right point in time to start the drive to improve the quality of what we all output to the community in a targeted and effort efficient method.

If you’re interested in getting involved please comment on this post with your GitHub user name and I’ll get you added to the organisation in due time – but please keep an eye out for the email from GitHub requesting you to join the organisation, especially in the junk folder ;)

The Power of the Humble Pint and the Community when things are difficult!

Disclaimer This isn’t a fun post (to read or to write) and nor is it a technical post – this is a reflection on the last few years and is in its very nature quite a personal post. I expect that there will be some kick backs about this post in future and I would humbly ask that you try and imagine yourself having been in my shoes at the time of these events happening and also at the time of writing this post.

If this isn’t the sort of thing that you wish to read then I would advise that you don’t carry on from this point and find something else more fitting to your taste.

 

Sometimes I have those moments where my thoughts can be expressed along the lines of “Fuck this I’m getting out of IT” or in other cases it can be put forward even simpler “Fuck it – I’m Done with all of this”

Now I know that likely reads in a way that is different from the actual truth behind it, but this is the sort of emotion that we in IT (and generally in adult life) tend to be expected to sweep under the carpet as if it doesn’t actually exist. For me this is no longer acceptable and I will not hide my emotions away like this, especially when I am struggling to deal with it all; I, like many others in similar situations, have to find my own ways of coping with the daily struggles and the tipping points that we each have.

My tipping points have always been centred around home-life stability – something that most people take for granted, however for me this has been something that has been in almost constant flux since I was 16. I’ve had some good spells where things have been stable for almost 2 full years but I’ve also had the other extreme where things have been extremely volatile for a time which could typically be anywhere up to 6 months or so.

This being an area that I’m frequently reminded of, I decided in the spirit of writing this post that I would do some data digging and work out a few things around my home life that could be determined as statistically interesting – or at least it was to me at the time, and it has been something that I have been thinking of doing for quite some time now, so when better to do it than when I want to write about it.

So I’ve only worked out places I’ve lived since (just before) I turned 16 (I’m 26 in 2 weeks and think that 10 years is a good measure point) and I haven’t split all hotels out into separate rows in my Spreadsheet – if I did that then I feel that it would skew the data but for my own amusement I will likely do this at some point in future.

However the crunch really comes down to the numbers and I have found the following facts – some of which shock me now by looking at them

  • I’ve lived in roughly 24 different places in 10 years – this includes adding spells of stays in hotels, hostels and even a spell of time spent in my car/s
    • I’ve rented 11 places – these I had actual tenancy agreements in place
    • I’ve lived in 8 different forms of shared housing – this doesn’t include any spells of living with family or those that would be classified as family
    • I’ve lived in 4 places that were the direct result of being made homeless – although this is technically skewed as the only time this happened was when my ex partner, son and I were all made homeless – Other than this I have received NO HELP from any UK Council Housing department as I am seen as “Not in Priority Need” according to their data charts
    • I’ve spent 4 different spells where I’ve basically lived in hotels prior to this weekend (as it will be spell number 5 now)

The duration of the average spell is roughly as follows

    • Days per place = 152
    • Weeks per place = 21
    • Months per place = 5
    • Years per place = 0.42

 

The Longest spell is as follows

    • Days = 663
    • Weeks = 94
    • Months  = 21
    • Years  = 1.82

 

So where does this fit in with the Title of this blog post?

Well I suppose it really all started for me in September 2013 when it became the right time to make the decision to move away from everyone that I knew and try and start afresh closer to work and part of that was to work out how to end up meeting new people. Thankfully I learned of the SharePoint User Group, of which I’ve become a regular attendee at, as well as a few other groups thanks to them having Meetup groups, these included LeanAgile and DigiCurry to name a few.

This was the type of surrounding where I realised I felt comfortable meeting new people, and I strongly feel that through these groups (& the resulting conferences I’ve attended too) I’ve made some amazing friends along the way. At some point I came to the conclusion that I would most likely still feel comfortable on the presenting side of the groups and not just as an attendee, and that started me off on the journey as stated in this post and followed up in this post, this post and this post.

However it’s not all been fantastic throughout the years, but having found various communities that I enjoy attending, I have somehow managed to scrape through all the difficult moments and made it this far. However, it is now getting to the point where the number of “Fuck this I’m getting out of IT” or “Fuck it – I’m Done with all of this” days is getting somewhat out of balance with what I’m able to maintain. A key part of this is again due to my current housing situation, aka a downright mess.

With this in mind I suppose what I’m getting at is that without the communities that I’ve become a part of I’m not sure I would be able to write this post.

Also over the last few weeks I’ve been asked around the why’s & how’s I manage to juggle it all. The short and simple answer is that all these communities are essentially a part of the extended family to me and with this I feel that at times I want to see and help direct the communities growth for the benefit of the community members much akin to how a parent would do the same for their children, something that I currently cannot do for my own children.

As I see the communities as extended family I plan things around “the next event” which has been a major driving force keeping me going over the last year.

So with all that in mind there may be times ahead where I’m struggling and especially more so with the Core items in life but that wont ever stop me from continuing the work I put into the community where it is still feasibly possible. There may be times ahead where I may need to unfortunately let conferences/user groups down that I’ve promised my time to speak at but this as things stands is an unfortunate bi-product of the situation I currently find myself in and if I can in anyway mitigate having to do so then I will, but currently it is looking like I may have to cancel everything that I have planned ahead, which is frustrating and infuriating especially when I’ve been looking so forward to the events I’ve got planned over the coming weeks/months.

 

2016 was supposed to be the beginning of an amazing year where things fell into place, however 3 (almost 4) months in and its feeling like 2013, 2014 & 2015 all over again.

 

It’s 4:12 and I really shouldn’t hit publish on this article, but I feel that it needs doing – yes, I’m 25 and yes, I suffer from depression, but that hasn’t stopped me achieving a hell of a lot in the last few years and I don’t expect that to change. I can and will get through it, no matter how difficult the roads ahead are for me.

Updated! Quick Win – Install PowerShell Package Management on systems running PowerShell v3 / v4

**Update 9th March 2016 PowerShell Team released an updated version of the PackageManagement modules today and I’ve updated the Script accordingly and will install the latest PackageManagement modules for you with a little verbose output

Updated Microsoft blog is at https://blogs.msdn.microsoft.com/powershell/2016/03/08/package-management-preview-march-2016-for-powershell-4-3-is-now-available/ **

 

This is a very very quick post about the latest feature being made available downlevel from Powershell v5.

Microsoft have released PackageManagement (formerly OneGet), which is now available for PowerShell v3 & v4 as detailed in this link http://blogs.msdn.com/b/powershell/archive/2015/10/09/package-management-preview-for-powershell-4-amp-3-is-now-available.aspx

That’s right the ability to pull directly from the PowerShell Gallery but you need to install the Package Management release which I’ve Scripted for you here.

And what better way than a simple 1 liner to grab and run the script

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/PSPackManInstaller')

And if you want to look at the Script then direct link is http://bit.ly/PSPackManInstaller – this takes you to the RAW version of the file on Github so will not download or execute – but will allow you to read it

Hope this is useful for you

PS credit goes to @lee_holmes for the idea from following tweet https://twitter.com/lee_holmes/status/318799702869041153

Beware where you place Comment based Help

In working on the PSWordPress module that Stephen Owen (@foxdeploy) has started, I came across an interesting issue after running my Pester tests, which call $ModuleFunction.Definition.Contains('.Synopsis') | Should Be True to check for comment-based help – it was failing even though I had comment-based help in there. The problem was that the help was above the function keyword, which means that it wasn’t carried through to the $ModuleFunction.Definition property. So this is a good example of why comment-based help for a function should be held within the initial curly bracket of the function.
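To make that concrete, here is a small example – the function names are made up, but the placement is the point:

# Help placed above the function keyword – it is NOT part of the function definition,
# so $ModuleFunction.Definition.Contains('.Synopsis') comes back $false
<#
.Synopsis
   Gets a thing
#>
function Get-Thing {
    param ()
}

# Help placed inside the opening curly bracket – it IS carried through with the definition
function Get-Widget {
    <#
    .Synopsis
       Gets a widget
    #>
    param ()
}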

Updated! Quick Win – Install-WMF5 (again)

 

** UPDATE 25/12/2015** Due to WMF5 Install issues the InstallWMF5.ps1 Script has been removed from GitHub until the PowerShell Product Team re-release the WMF5 installers. Once re-released I will re-release the InstallWMF5.ps1 script **

 

**UPDATE 24/02/2016** WMF5 was re-released today and the below scripts should still work**

 

This is a very very quick post about installing WMF5 on Windows 8.1 or Server 2012 / Server 2012 R2 via this function script I created

And what better way than a simple 1 liner to grab and run the script

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/InstallWMF5')

And if you want to look at the Script then direct link is http://bit.ly/InstallWMF5

Hope this is useful for you

PS credit (again) goes to @lee_holmes for the idea from following tweet https://twitter.com/lee_holmes/status/318799702869041153

PPS – There is an issue with running this on Windows 7 / Server 2008R2 machines due to the need for WMF4 to be installed.

I am working on this but this is low priority for me as really we shouldn’t be deploying Server 2008 or Windows 7 Systems any more

PPPS – you will see a theme here – I am intending to build functions like this and release them as a full module in the PowerShell Gallery to help automate these tasks – so look out for some more like this in future