Recap of a Long February, March, April and May – Events Events Events!

I had intended to write a recap post at the end of every month; however, I've been so busy that I haven't managed it for a number of months – that, and my blog was offline for a few weeks.

Let's start with a recap of the events I managed to attend – as you can see, I did a lot of travelling and took in a number of different user groups.

I attended the following events:

  • Get-PSUGUK – Manchester – Feb 1st
  • SharePoint User Group – Manchester – Feb 2nd
  • Azure Security Training Event – London – Feb 3rd
  • SQL User Group – Manchester – Feb 3rd
  • Get-PSUGUK – London – Feb 4th 
  • Mississippi PowerShell User Group – Feb 10th – Online
  • What’s New in Server 2016 – Microsoft Training Event – London – Feb 17th
  • What’s New in Windows 10 – Microsoft Training Event – London – Feb 18th
  • WinOps Meetup – London – Feb 23rd
  • Chef Meetup – London – Feb 24th
  • Cloud Roadshow – London – Feb 29th – Mar 1st
  • Azure User Group – London – Mar 1st
  • Manchester Geek Nights – Agile and Tech in Local Government – Mar 3rd
  • SQL Sat Exeter – Mar 12th
  • Lean Agile Manchester – Mar 16th
  • SQL User Group Manchester – Mar 17th
  • Manchester .Net – .Net Core recap – Mar 22nd
  • SQL User Group Cardiff – March 30th
  • MCR Tech Event Organisers meet – Apr 7th
  • SharePoint User Group – Nottingham – Apr 12th
  • PSConfEU – Hanover, Germany Apr 19th – 22nd
  • Get-PSUGUK Manchester – Apr 25th
  • Get-PSUGUK London – Apr 27th
  • MVP Open Day – Apr 28th – 29th
  • SQLBits Sat – May 7th
  • Get-PSUGUK Manchester – May 23rd
  • WinOps Conf London – May 24th
  • UKITCamp London – May 25th
  • SQL London User Group – May 25th
  • Get-PSUGUK London – May 26th

So between the beginning of February and the end of May I attended 30 different user groups, training days and conferences – and that wasn't even all the ones I had planned, due to some unfortunate illnesses along the way.

Now, those that know me will know that I attend events because I'm genuinely interested in the topics and in catching up with the people there. After all, events are all about the community and the networking opportunities they bring us.

In future I intend to post ahead of time, via the Find Me At page, where you can catch me in the coming months, and then at the end of each month detail more about what I learned at the events.

Before I go into detail on the events and what happened at them, just take a moment to look at the types of events and the breadth of technology they span. This may give you an insight into the differing technologies that excite and interest me going forward.

To start: Get-PSUGUK Manchester on Monday Feb 1st, which seems a long time ago but is still an event I can remember enough to post about. I presented the initial version of my “Teaching the IT Pro how to Dev” session, where I introduced my ISE_Cew module to the audience to help with getting to grips with source control using Git and unit testing with Pester. We also had our first community speaker, Tim Hynes (@railroadmanuk), who presented on automating infrastructure using PowerShell with the various infrastructure APIs he has been working with, including VMware, Cisco & NetApp devices. You can find his presentation at https://github.com/railroadmanuk/presentations. Not long after, Tim was awarded VMware vExpert; I know he's presented at other events since, and I'm looking forward to seeing what the future holds for him.

Then on Tuesday Feb 2nd it was the SharePoint User Group in Manchester, which will always be close to me as it was the first user group to give me the chance to present (you can read more about that here). This night was about “What you need to know about SharePoint 2016” by Heath Groves (@Heath_Groves) and Building Enterprise Platforms by Andy Talbot (@SharePointAndy) – you can find Andy's slide deck at http://www.sharepointandy.com/?p=550

Heath gave us a rundown of all the things coming in SharePoint 2016 and even prepared some take-me-homes, which included the new and removed PowerShell cmdlets in SharePoint 2016. Andy's session was a good, thought-provoking one for those that have dealt with SharePoint in the past, and there are some really good points in the slide deck that are applicable to a number of different areas of IT. You can tell the deck was put together from the pains Andy has personally felt working with different IT departments over the years – a number of which I have felt too, as will a number of you. Even if you're not a SharePoint person, go and have a look at the deck and see if it resonates with things you feel in your day-to-day IT life.

Next up, on Wednesday Feb 3rd, it was an early morning with a 5:15am train from Manchester to London for an Azure Security morning at Microsoft's offices in Victoria. This is an area that more people need to put time into, and I'm looking forward to seeing further work here, mainly from Microsoft. That said, Microsoft recently released the Azure Security Information site at https://azure.microsoft.com/en-us/documentation/security/ – go and have a look at it, as there is a lot of good information in there. The Security morning was a good event, although I felt it would have been better as a full day, especially as there were a number of issues getting the interactive demos/labs up and running with the Barracuda security devices, mainly due to issues in the scripts that had been provided to set everything up. They should have written Pester tests for these scripts, as I got the impression the scripts had recently been updated for a new release of the Barracuda devices. Some of the attendees managed to get things set up; I was unable to, which was not ideal.
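To illustrate the point, here's a minimal sketch of the kind of Pester test I mean – the script name is entirely hypothetical, and even something this basic would have caught the setup issues early:

    # A minimal sketch only - 'New-LabEnvironment.ps1' is a hypothetical stand-in
    # for the provided setup scripts (Pester v3 syntax, current at the time).
    Describe 'New-LabEnvironment.ps1' {
        It 'exists' {
            Test-Path .\New-LabEnvironment.ps1 | Should Be $true
        }
        It 'parses without syntax errors' {
            $parseErrors = $null
            [System.Management.Automation.PSParser]::Tokenize(
                (Get-Content .\New-LabEnvironment.ps1 -Raw), [ref]$parseErrors) | Out-Null
            $parseErrors.Count | Should Be 0
        }
    }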

I then had to leave London around 14:30 to get back to Manchester in time for the SQL Server User Group that evening. Everyone that knows me knows my SQL knowledge isn't close to on par with those that live and breathe SQL every day, but one thing all platforms require is a data backend of sorts. So I've pushed myself to attend more and more SQL events where possible (as you'll gather from the rest of this post) so that I can learn more about this crucial technology and implement and use it in my own adventures going forward. One of the areas that has piqued my interest is PowerBI, and I was glad to get what was a real beginner's crash course in PowerBI from what I can only describe as an awesome instructor – Adam Aspin. We also had a session on SQL Server wait stats by Rainer Unwin, which was interesting, although perhaps a bit too technically in depth for me to fully follow at this stage of my interaction with SQL Server – though I'm sure it's something I'll come back to in future.

Then the next day, Thursday Feb 4th, I travelled back down from Manchester to London for the London PowerShell User Group at Rackspace, just outside Hayes and Harlington, where I also presented my Teaching the IT Pro how to Dev session with a bit of an update from the Manchester version. We also had Rudolf Vesely (@RudolfVesely) from Rackspace give an introduction to Pester, which was a great session for the audience – Rudolf will be presenting to the London group again in future with a more in-depth session on Pester, so look out for this.

On Feb 10th I was lucky enough to present to the virtual Mississippi PowerShell User Group, where I gave the Teaching the IT Pro how to Dev session again – this was recorded, and I've blogged about it in a bit more detail here.

I then attended the UKITCamps in London on Feb 17th & 18th, covering What's New in Server 2016 and What's New in Windows 10. Although these are camps I've previously attended, there are a number of labs in them that are good to have the chance to run through again. I also enjoy the UKITCamps because they are Microsoft-delivered training days, meaning there are plenty of people there to network with, along with the chance to catch up with the guys running them, namely Ed Baker, Marcus Robinson and Andrew Fryer. I was also very lucky to get the chance to head out for a meal with Ed, Marcus & the other members of the DX team that work behind the scenes to put on these events. I for one look forward to these events, and I now know how difficult it is to arrange them – and that's before you include preparing the slide decks and the labs to be used. Hopefully we will see more of these events in future, however there aren't any currently planned, so we will have to wait and see if more appear.

I then had just under a week until my next event, which was decided last minute: presenting my Teaching the IT Pro how to Dev session to the WinOps group in London on Feb 23rd. That was great, however I suffered a failed MicroHDMI-to-HDMI adaptor, so I had to try to move my demo and deck to Stephen Thair of DevOpsGuys' laptop, and as per the standard developer line – 'Well, it worked on my machine' – I was unable to show the demos working. This has led me to build a VM in Azure and a second Hyper-V VM for any demos I want to run in future, to ensure that demos work. I'm also planning to get a dedicated presentation-only device which I'll wipe between events to ensure that all runs as expected, along with a few backup cables & adaptors to have with me.

Then the next night I attended the Chef Meetup, where I was introduced to GoCD, Terraform & Kubernetes – all look like interesting technology, but I need a reason to get in deep with any of them, so look out for me possibly blogging on these technologies in future.

I then attended the London leg of the Microsoft Cloud Roadshow on Feb 29th & March 1st, where there were a number of different sessions throughout the event, with tracks covering most of Microsoft's technologies – a number of them focused on the SharePoint/Office 365 ecosystem and the Azure ecosystem. The highlight of the event was being able to go and have a few drinks with Joey Aiello, one of the PowerShell PM team, who was over from the US for the Cloud Roadshow. It was good to have a face-to-face chat, and I'm sure in future there will be more chances to talk, including at the MVP Summit. Joey is younger than I am and is rocking a very good role at Microsoft – imagine being part of the PowerShell team. That is a number of people's dream job, and I would be lying if I said I wouldn't find it amazing to spend my day working even more with PowerShell than I already do. As an MVP I do get that luxury already, although it would be a very different role to the one I'm doing. Who knows what the future holds, but I know that for me it will likely involve PowerShell for a number of years, if not decades, to come.

I also dragged a few people to the London Azure User Group on the evening of March 1st, where we were introduced to Boris Devouge, Director of Open Source Strategy at Microsoft. I can only describe him as a 'fluently funny Frenchman', which makes his presentations engaging, and as this one was on the new Azure Container Service (it's an Azure User Group, after all) it was interesting to hear of the partnerships Microsoft have recently been making in this area, with the push to make Azure the most open source friendly cloud. The Azure Container Service was in public preview (I think) at the time of the presentation, however it has since been made generally available, and you can learn more about ACS in this post on the Azure blog: https://azure.microsoft.com/en-us/blog/azure-container-service-is-now-generally-available/

I next attended a talk in Manchester on March 3rd at Manchester Geek Nights, on Agile and Tech in Local Government, delivered by Stockport Council, where I was lucky to bump into my good friend Ethar, who always has a good story to tell. I must catch up with him again when I'm next in Manchester and not just there on a flitting visit. The talk left me realising why our governments, local and national, get a lot of stick for being poor at delivery and execution of their IT projects (and projects in general): there is so much fragmentation in the IT systems used across the different councils, because they all have separate and diminishing IT budgets for any projects. I personally think that centralisation of all UK council and local government IT into a single pool would work much better for the public, and my reasons are pretty simple: enhanced governance; lower boundaries to sharing data between the departments that need to share it nationally (think social care and housing departments, etc.); and generally a simpler-to-manage infrastructure and workforce. Though perhaps I'm biased, being from a Microsoft background, which means I can see opportunities to scale similar services nationally, which would be massively more cost efficient. Almost all the banks have done this and realised the benefits, and to me it makes sense for the public services sector to do the same! It was, however, interesting to hear how Stockport Council are embracing open source technologies and essentially building out their own products, which they are in turn open sourcing for other councils to take advantage of. It's an interesting journey for them, and I hope the effort doesn't end up completely canned in a few years' time if a nationalisation of council IT services were to occur. In my opinion it is a logical step for this country to take, though I'm not sure politicians and logic go together. We will have to wait and see.

 

SQL Sat Exeter – March 12th. Well, I'm not sure I need to say much more than that. It was a great event and my first doing a back-to-back, demo-heavy session on PowerShell DSC. Even more scary: it was DSC for SQL Server. I hadn't realised how much of a headache the SQL Server DSC resources were until I spent the majority of the week leading up to it getting annoyed with little things like hardcoded values for where the resource expected the install media to be. I got so frustrated with it that I began to rewrite the resources so they would work how I expected them to, which meant I spent more time writing DSC resources from scratch than actually doing anything useful. Especially as, a week or two after SQL Sat Exeter, I wiped the drive with the resources on it. Yes, they were in source control – but only on that machine. Lesson learned – DOH!!!

SQL Sat Exeter was my first real foray into SQL community events beyond user groups, and after the fun I had at Exeter I can see why they call themselves SQLFamily. In the lead-up to my sessions there was a run around to get some bacon sandwiches and a fair amount of drama, with my demos having decided to kill themselves that morning – however I managed to get them working before my session, and some good reviews came from it. I know where I need to improve the content, and I'm looking forward to SQL Sat Paris in a few weeks, where I will need to cram all of the information from two hours into 45 minutes. #ChallengeAccepted

It was also on the Saturday night, at the after-event curry and the drinks that followed, that the discussion about SQL Sat Manchester having a PowerShell track came to fruition. I had been lucky enough to end up out with Chris Testa-O'Neill and the other organisers at SQL Sat Manchester the year before (my first SQL Sat event, which I attended as an attendee), so it all felt natural to be there along with a number of other familiar faces like Rob Sewell and Steff & Oz Locke. It's like a reunion, and I'm looking forward to what will be a kick-ass SQL Sat Manchester this year. The PowerShell track shaped up nicely :) One thing I've learnt about the SQL community is that it really does kick ass – but then again, all the IT communities I'm a part of do. Our passion brings us together, and with it we make sure to have a bloody good time when we meet. Else why bother?

On the Sunday morning an interesting email came in as I was sat having breakfast, which led me to question it a little with Chris & Alex Whittles – and well, history has been written since that morning. I also got the chance to help Rob out with a DSC issue he was having, and gave him the guidance he needed to resolve it in the right way as things currently stand. In future we will have a feature-complete PowerShell DSC resource for SQL Server – though this will require some community help, and you can help out by voting on / adding items to the Trello board at http://sqlps.io/vote

Next up on my events (and halfway through the 30 I attended) was Lean Agile Manchester on March 16th – a firm favourite of mine, as it's a great community (like they all are) – where we were treated to a talk by Jon Terry (but not that Jon Terry!) from LeanKit about how they deal with working in a Lean/Agile way with their FSGD (Frequent Small Good Decoupled – said 'FizzGood') approach. It's another example of the software/manufacturing world bringing good things to the rest of IT, and generally other areas too, and I would highly recommend you go and read their blog on FizzGood at http://leankit.com/blog/2015/07/does-this-fizz-good/ and take away from it what you can.

Next on my list of user groups was the Manchester SQL User Group, where we walked through Cortana Analytics – something I was looking forward to, as at SQL Sat Exeter Chris Testa-O'Neill & Cortana had essentially got a divorce whilst he was prepping in the speaker room. I'm sure with a decent set of data I'll be able to find a good use case for Cortana Analytics, and I have some ideas in the pipeline, so keep an eye out for future posts on this.

As a non-dev admin who realised I am really a dev but just wasn't ready to admit it to myself, I find the .NET User Group in Manchester a useful group to attend, especially when the topic is .NET Core, which it was on March 22nd. Even more so as, with .NET Core, there is a real possibility that the PowerShell engine will eventually be open sourced – especially as we are seeing a refactor of the existing cmdlets to run on Nano Server, with more coming in each new TP and more still for Server 2016 GA. We were treated to a history lesson on .NET Core by Matt Ellis (@citizenmatt), with the slide deck at http://www.slideshare.net/citizenmatt/net-core-blimey-windows-platform-user-group-manchester – again, well worth a read.

Next up was just after I had moved from Manchester to Derby, while I still had the hire car and an itch to go and see some of my SQL friends in Cardiff – especially as it was an epic event: Return of the Beards! This meant that not only did I get the chance to catch up with Steff Locke again, but also with Rob (again – it seems that guy gets everywhere ;)), another of my SQL friends, Tobiasz Koprowski, and the other bearded SQL guy of the night, Terry McCann. I got to learn a bit more about T-SQL from Terry and Securing SQL in Azure from Tobiasz, and also to see Rob's session on the pains of context switching and how PowerShell & PowerBI help him avoid getting mithered for information that can be easily made available and searchable with a little effort. This is, for me, a great example of real-world use of PowerShell and PowerBI together, and well worth watching Rob deliver if you get the chance.

I then attended my first Tech Organisers Meetup in Manchester on April 7th – it was good to meet the other tech user group organisers in the Manchester/NW area and have the discussions that were needed as a collective to help strengthen the view that Manchester is a blossoming tech hub in its own right – something that Londoners seem to miss out on. Manchester is ace: it's cheaper than London, it's actually livelier at night than London (I've found), and you can literally walk from one end of the main city centre to the other in about 20 minutes, taking in the Northern Quarter along the way. So you are pretty much sorted!

Next up was another event I presented at – the SharePoint User Group in Nottingham on April 12th. I presented on PowerShell DSC for SharePoint, as I did at the SharePoint User Group in Leeds in January, but this was a special one for me as it was the first user group I presented to after being awarded MVP. Being awarded on April Fools' Day led me to post Congratulations 2016 Microsoft MVP at 15:31, about 10 minutes after getting the email, and then Fooled Ya – Today I became a MVP at 15:55 – I also blogged Awarded the MVP Award – What it means to me and the future for the Community. We also had a talk from Garry Trinder (@garrytrinder) on Require.JS, which can be used in conjunction with MDS (Minimal Download Strategy) in SharePoint 2013 and Online sites to help bundle up and control your page load and transition times. JavaScript is one of those dark arts that I've not needed to do much with, but I would certainly look to use Require.JS in any of my future web projects.

My next event was PSConfEU, and this was the one I had been looking forward to most because of the sheer work that went into it by all involved, including Tobias Weltner and myself, to make it a success. Due to the size of this event I will put together another post in the coming days that really captures what an amazing event it was, as I don't think a few sentences will do it justice. Plus, I want to relive the experience in as much detail as I can so that I can share it with you – so that if you weren't able to make it, hopefully you'll do what you can to make PSConfEU 2017. Planning for PSConfEU 2017 will most likely begin in early August, so there will be small announcements at some point after that, though it's all still to be determined.

In the spill-over from PSConfEU I managed to bribe June Blender into agreeing to come and present at the Manchester & London PowerShell User Groups – though to be honest there wasn't much bribing involved, as June had wanted to come to Manchester anyway and timing-wise it just worked out great. June gave her Thinking in Events hands-on lab at both groups; both had some great questions, and I've had some fantastic feedback from the sessions, which has led me to start preparing my own hands-on events for the future. These are 'in the works', so to speak, and details will start to appear in the next few months.

Next up was my first MVP event, where we went to Bletchley Park – a fantastic historical site, and I'm planning to head back there again in future. The event was good for me as it allowed me to meet other UK MVPs, including fellow PowerShell MVP Jonathan Noble. There is a good story behind how we ended up meeting on the train up from London to Bletchley Park, and it starts with me forgetting to charge my laptop and phone the night before. When I got to Euston I was frantically trying to make sure I got on the right train. I had messaged Jonathan on my way and found out we were catching the same train. However, phone signal is pretty poor when you are travelling out of London, and just before my phone died I managed to send him a message letting him know I was about halfway up the train. About 20 minutes passed, and then all of a sudden a guy two rows in front of me got up, came over and said, “Hello – it's Ryan, isn't it? I'm Jonathan, only just got your message,” and from that moment we just continued chatting. When we got to Bletchley, Jonathan was able to lend me a power bank to charge my phone – not that I really needed it, but having charge on your phone is a comfort thing now, isn't it? We had an afternoon of talks and then a really nice drinks and dinner, where I got the chance to meet some more of the MVPs. The next day we had presentations in the morning and then made rocket cars in the afternoon. It was great fun to do something less techy but still something most enjoyed. I was lucky to get a lift with Alex Whittles, along with Steff Locke, from Bletchley to Birmingham New Street station, which allowed for a number of good conversations about SQLBits & SQLRelay – both events that I may get more involved in, if I can stretch that far. Once Alex dropped Steff and me off, we worked out that we either had half an hour to grab something quick to eat before running for our respective trains, or we could get something decent to eat, have a drink afterwards, and catch the train after that. Naturally, decent food and drink was always going to win :)

 

Nearly finished with the recap, with just six events left to cover – so if you've read this far, well done, you can make it to the end :)

 

I then attended the SQLBits Saturday event on May 7th in Liverpool, and although I got there not long before lunch I was still able to get to the sessions I wanted – mainly the SQL Tools session, seeing as SSMS has been decoupled from the SQL Server install, which is 100% the right thing to have done. As with other SQL events, I bumped into Alex, Steff, Rob (he is literally everywhere ;)), Tobiasz & a number of other SQL people, including Mark Broadbent, Niko Neugebauer, André Kamman, John Martin, Mladen Prajdić & Neil Hambly, to name just a few. As with all these events, once the curtains have closed, that is when the food and drinks appear, and I've realised I have a soft spot that stops me saying no to a curry & drinks with all these amazing people. This means that at future events I'll be planning to stick around for the almost guaranteed after-curry and the ensuing drinks and conversations that happen around it.

I then had the amazing opportunity to meet and spend a few hours with Ed & Teresa Wilson – the Scripting Guy & Scripting Wife – where I took them for a wander down to the University of Manchester campus and on to KRO, a nice Dutch place for some food which was right around the corner from where I used to work when I was at UoM. We then strolled leisurely around the campus on the way back to the venue for the user group, where Ed talked us through OMS & Azure Automation DSC, now that he is part of the OMS team at Microsoft. Because we had to get a train to London at 21:15, the user group was an hour shorter than normal, so we didn't have time for pizza and the usual after-drinks, but the turnout was still one of the best we've had, and there will be more events like it planned, with an aim to hold the next Manchester user group in July.

As I mentioned, Ed, Teresa and I all had a train to catch to get to London for WinOps, and much like PSConfEU, I am planning to blog about this event separately to really capture its spirit. Look out for that post in the next week or two.

 

We then had the UKITCamp, where Marcus Robinson & Ed went over the feature sets of Azure & OMS. I unfortunately missed the morning of this event due to being called onto a customer production-issue conference call – three hours of my morning I couldn't get back, but sometimes that is how these things go. As I was leaving the venue I found out that the London SQL User Group was on that evening, and I decided to stick around for it, as the topic was “Common SQL Server Mistakes and How to Avoid Them” – the kind of SQL topic I enjoy because it isn't deeply technical but lets me understand the product that little bit better than I did beforehand.

Lastly, the London PowerShell User Group, where we had Ed again and the highest turnout so far. Ed again talked about OMS & Azure Automation DSC, but there were also a number of opportunities for open, directed questions from the audience, which is always an added bonus of having more & more people turn up to the group. We overran a little with the conversations that were flowing, mainly due to an excess of beer and pizza – something that hasn't happened before at the user groups. Then, as usual with the user groups, we ended up finding somewhere else to go for another drink or two and continued the conversations.

 

So that's most of my last three months summarised – what have you done in the last three months?

Future posts like this will be much shorter, contain some pictures and be completed on a monthly basis.

Thanks for reading – Hope you have a great day!

Creating a set of simple Pester Tests for existing or old PowerShell Modules & making them easier to update in future.

I have long thought about a way to automagically create some Pester tests for the functions contained in a module that was perhaps developed before Pester was really well known.

At that point we may have been creating psm1 files that contained a number of nested functions within them. I for one did this, and added to existing modules that were built this way – have a look at SPCSPS on GitHub (aka SharePointPowerShell on CodePlex), one of the first projects I got involved with in the open source world.

*Please note: I would highly advise checking out the OfficeDevPnP team's work for any real SharePoint PowerShell work instead of the example I have given – see PnP-PowerShell.*

However, this is where we are at with a number of older modules, and as noted with SPCSPS, this was an exceptionally common structure to run into.

However, this isn't a very scalable way of working with existing codebases, and as is very frequently found in the PowerShell community, an author will often not be able to spend the time reviewing code and accepting pull requests from others. I have previously blogged about the need to “Pull The Community Together” to remove this totally unneeded & actually quite ridiculous barrier to better modules for the benefit of the community. From a personal standpoint, a PR to an open source repository that sits open with no input at all for more than a month shows there is little value in adding to that repository, as the owner clearly has little or no time to do the needed code reviews.

Now, one of the ways that we as a community can negate this issue is to build a stronger collaborative platform for these modules and build teams of people that can be relied on to perform cohesive reviews on various aspects of the code being added. By a platform I mean a collective of ALL of the community, to work out who the right people are to get involved in the differing areas of the PowerShell language.

Funnily enough, GitHub already has this model defined within Organisations – Teams. This allows us as a community to have an overarching organisation through which we can add the right people to discussions about certain semantics as we move on in time.

This is essentially a massive change to how we as a community do things, however at this point in time it really is the best way forward to minimise duplicated effort across multiple codebases and to ensure that we have the best, fully functional modules out there for others in the community to work with.

Again, please read my previous post “Pull The Community Together” for my thoughts on this.

Anyway, back to the actual topic of this post. From this point on I will be using the SPCSPS module as a good example to work with, as it reflects how modules can currently be spread across repos.

So, with SPCSPS I have 103 functions put together in 11 psm1 files. Here are a few screenshots just to back this up.

Although this “works”, it's not great when there may be a number of additions to one file (new functions, removal of existing functions, or complete rewrites), and it's an easy way for merge conflicts to occur – which we do not want.

So, to get round this, I realised the only way was to write a function that will export all the functions (not cmdlets) from a module and, whilst doing so, create a basic Pester test for each exported function in a user-specified folder. My reason for doing it this way was to allow users to check the exported code before merging it into their existing codebases, even though I am actually quite confident that it will work as expected.

This will allow users to refactor their existing code much more easily going forward, and to benefit from having some basic Pester tests that they can then expand upon.

 

The key component of this is the Export-Function function, which has two parameters:

  • Function – a string
  • OutPath – a string

Under the hood, when passed the function name and the OutPath, Export-Function will get the function definition and all the parameters from the function, and will then create the files below, based on the following structure:

OutFilePath\FunctionVerb\FunctionName.ps1

OutFilePath\FunctionVerb\FunctionName.tests.ps1
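As a sketch of the intended usage – the module here is SPCSPS from earlier, and the output path and function name are just examples:

    # Export every public function from SPCSPS, each with a matching basic
    # Pester test - the output path is just an example.
    Import-Module SPCSPS
    Get-Command -Module SPCSPS -CommandType Function |
        ForEach-Object { Export-Function -Function $_.Name -OutPath 'C:\Export\SPCSPS' }

    # For a hypothetical function named Get-SPCWeb this would produce:
    #   C:\Export\SPCSPS\Get\Get-SPCWeb.ps1
    #   C:\Export\SPCSPS\Get\Get-SPCWeb.tests.ps1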

Technically this isn't difficult for us to do at all (hey, we are using PowerShell, right?), but it allows us to quickly and easily add tests to existing (or new) code with little effort – and as a PowerShell enthusiast, this is exactly why I started working with PowerShell in 2013.

As a small note, this will only work with public functions – though if you were to explicitly load private functions into the current session so that they become visible, you could use this to do the same for those as well.
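For example, something along these lines – the file and function names are hypothetical:

    # Dot-source a private function file so the function becomes visible in the
    # current session, then export it like any public function.
    . .\MyModule\Private\Get-InternalSetting.ps1
    Export-Function -Function 'Get-InternalSetting' -OutPath 'C:\Export\MyModule'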

The module, called PesterHelpers, is available on the PSGallery and on GitHub at https://github.com/PowerShellModules/PesterHelpers

The benefit of this module is that it offers a quicker way to move away from modules that contain multiple functions in one psm1 file (or nested ps1 files), and it can be used to help start building a suite of Pester tests for other modules when used with the accompanying PesterHelpers.psm1 & PesterHelpers.basic.Tests.ps1 files. This is made possible by modularising as much of the code in both of these files as possible.

A shoutout must go to Dave Wyatt for a section of code contributed to ISE_Cew a while back which, on review whilst looking to expand that module, led me to create this one.

 

 

How to find Local User Groups & events – My Experience

I had a discussion last night via Twitter with one of the attendees I met at the Microsoft Cloud Roadshow in London earlier this year, and the outcome of the conversation was that although I find it easy to find out about events, this isn't all that common for others.

 

So I decided I would quickly jot down some of the places that can be useful for finding events going on around you.

  • Word of mouth – if you know a number of people in the area, ask them if they know of any events going on, as they will likely be closest to them.
  • Twitter – there are a number of Twitter accounts out there set up just to serve what's happening in your area. A good example is the @TechNWUK account, which lists all the events around the North West that the group knows about.
  • Eventbrite – www.eventbrite.co.uk is another good place to find tech events – especially full-day events or conferences. Just do a quick search for a specific technology and you'll get some results back on upcoming events around you.
  • Meetup – www.meetup.com is another, increasingly common place for user groups to promote themselves. Similar to Eventbrite, but with a much more social feel to the event listings. You can also find many non-techy events listed there, which can be very interesting and useful. My only gripe with Meetup is the admin cost of setting up a group: at $89.94 per six months for the unlimited subscription, it isn't really what I would call reasonable for a user group marketing channel, though it does allow multiple groups under the one subscription, so it can be shared as part of a collective – like Get-PSUGUK.
  • Facebook & LinkedIn groups – both of these can also be an avenue for finding out about user groups or events.
  • MSDN Events – http://events.msdn.microsoft.com/ – this can have a number of the Microsoft-focused events on there, as there is the ability to register as a Technical Event Lead at https://www.technicalcommunity.com/, which allows you to get your event posted to the MSDN events pages.

 

If you still can't find any events around you, then I would suggest trying the following:

  • Speak with those that you recognise from the community – this could be a Twitter DM, etc., but it's normally a good starting point, as they may know of events that already exist or are in the initial start-up period.
  • Try to reach out to organisers of similar events, as they may well know of one starting up soon in that area, or it may just be advertised somewhere other than the above – this is especially common with more broadly focused technologies like the various JavaScript frameworks.
  • Broaden your search area, as some user groups will try not to have meetings too close together. An example of this would be groups in Birmingham & Wolverhampton.

 

Lastly, good luck in your search – and if you still haven't found a user group in your area, then why not think about setting one up? If there are already similar communities out there in other areas, reach out to the organisers of those events and see if they can provide any guidance.

The Pains of Poor/Missing Documentation

There will be a time when you are attempting a new task, whether personally or professionally, and you find yourself having to resort to the product documentation to get to the end goal – whether that's putting together a new piece of furniture, preparing an exquisite meal, or bashing together different bits of software from different companies or, more commonly, the same company.

One thing common to all these scenarios is that if the documentation is completely missing, you are forced down the road of taking the “pot luck”/“educated” guess to get to the desired end result – and sometimes that can lead to some hilarious results, especially in relation to cooking or building furniture.

In my personal experience this has been most common with second-hand furniture, because few people keep their assembly instructions once the furniture has been assembled. I think this is down to the “I'll never need to take this apart and build it again” thoughts that we like to have.

This mentality is rather similar in the IT world as well, and it is because of this that we have seen lots of undocumented software features. Anyone who has worked with the SharePoint object models in much depth will be more than familiar with the idea of missing documentation.

 

In the IT world this is something we have all understood and realised is an issue; at some point in our careers we've all been on the receiving end of poor or missing documentation, and when it happens we've either had to turn to technical forums or write it ourselves.

Over the years this has started to get better, and I for one am glad to see the initiatives that technology organisations are taking to start open sourcing product documentation. A number of teams at Microsoft are doing this now via GitHub, and to me this reinforces the need for all IT pros & developers to understand how to use GitHub & the underlying Git software as core tools within their tool belts. In three years' time I wouldn't be surprised if other source control mechanisms like SVN & Mercurial have been almost fully replaced by Git. It says something that Microsoft have fully adopted Git into both the hosted and on-premises versions of TFS.

So if you read this blog and you haven't learnt Git yet but are writing PowerShell – go and watch the session I did for the Mississippi PowerShell User Group, as detailed in this previous post, and read up on the “My Workflow With Git” series, starting with this post.
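If you want a flavour of it before watching, the bare-bones workflow is only a handful of commands (run from any PowerShell prompt once Git is installed):

    # The bare minimum to start versioning a folder of scripts with Git
    git init                          # turn the current folder into a repository
    git add .                         # stage everything in it
    git commit -m 'Initial commit'    # record the first snapshot
    git log --oneline                 # review the history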

 

We are at a good point in time where the people behind the products we love and use each day are listening to us in a much more open way than previously, and over the coming weeks I'll be updating the following site with all the Microsoft UserVoice / Connect links, in a nicer format than they are currently in.

If you want to help and get involved, then drop me a message and I'll get you added to the organisation so you can add commits.

Building A Lab using Hyper-V and Lability – The End to End Example

Warning – this post is over 3,800 words long and perhaps should have been split into a series; however, I felt it best to keep it together. Make sure you have a brew (or two) to keep you going while reading it.

In this post we will look at how you can build a VM lab environment from pretty much scratch. This may be for testing SharePoint applications, SQL Server or Exchange, or for additional peace of mind when deploying troublesome patches.

Our requirements for this include:

  • A machine capable of running Client Hyper-V – it needs SLAT addressing (most machines released in the last three years are capable of this)
  • Windows 8.1 / 10 / Server 2012R2 / Server 2016 TP* – in this post I will be using Windows 10 build 14925 – the ISO download is available from here
  • If using Windows 8.1, you will need to install PowerShell PackageManagement – you can use the script in my previous post to do this, as detailed here
  • A secondary/external hard drive or shared drive – this is to store all Lability files, including ISOs, hotfixes & VHDX files

Where do we begin?

Obviously you need to install your version of Windows as detailed above, and once you have done this you can crack on!

Time Taken – ??? Minutes

However, as mentioned, I'm going to use Windows 10 – this is just personal preference and for my ease of use.

As you hopefully know by now, Windows 10 comes with WMF5, and therefore we have PackageManagement installed by default. We will use this to grab any PowerShell modules that we need from the Gallery. I personally have a machine setup script that lives in my OneDrive, as you can see below. As this is a Windows 10 machine I am logging into it with my Hotmail credentials – this means that I can straight away pick the folders I want to sync to this machine (the joys of the integrated ecosystem).

This takes about 5 minutes for OneDrive to finish syncing and then we are ready to go onto the next step.

Time Taken – 5 Minutes

[Screenshot: Lability1]

In this stage I will open ISE with administrator privileges – this is required as I need to change the execution policy from Restricted to RemoteSigned, as well as run other scripts that require elevation.

Once I have done this I can move onto the next step. This includes setting up my PowerShell Profile and Environment Variables and then setting up all the required functionality for me to continue working on this new machine.

This includes setting up the ability to install programs via Chocolatey, like VSCode & Git, and installing modules from the PowerShell Gallery – a few examples being ISE_Cew, ISESteroids and, importantly for this post, Lability. It is also worth noting that at this point I am not downloading any DSC resources as part of my setup script – this is because we will cover it later on as part of the workings of Lability.

As an additional note, it is worth mentioning that the version of Lability at the time of writing this article is 0.9.8 – however this is likely to change in future, with more features added as required. If you have a thought, suggestion or issue, then head over to the GitHub repo and add it there.

In this script I am also enabling the Hyper-V Windows feature to enable me to carry on with this lab, and I then initiate a system shutdown. Overall this whole section takes about 10 minutes to complete, and yes, I intend to build this as a DSC resource in the near future. It is worth noting, though, that Lability has a function that will ensure the Hyper-V feature is enabled and that you are not awaiting a system reboot – more on this a little later on.
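For reference, a condensed sketch of the kind of setup script described above – the package and module names are the ones mentioned in this post, while the ordering and exact commands are assumptions about my actual script:

    # A condensed sketch of the machine setup script - package/module names are
    # from this post; the rest is an approximation.
    Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine -Force

    # Install Chocolatey, then the programs I want via it
    Invoke-Expression ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))
    choco install visualstudiocode git -y

    # Grab the modules I need from the PowerShell Gallery via PackageManagement
    Install-Module -Name ISE_Cew, ISESteroids, Lability -Force

    # Enable the Hyper-V feature, then reboot to finish it off
    Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All -NoRestart
    Restart-Computer -Force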

Time Taken – 15 minutes

Once the reboot has completed we can get on with the Lability bits – the really interesting part of this post.

Lability Functions

Lability has 38 public functions and 6 Aliases as can be seen below.

[Screenshots: Lability8, Lability9]
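If you want to check this on your own machine, a quick way is:

    # List what Lability exposes publicly
    Get-Command -Module Lability -CommandType Function
    Get-Command -Module Lability -CommandType Alias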

I wouldn't worry too much about the aliases, as these are built in for backwards compatibility with prior versions of the Lability module and will likely be removed in the 1.0 release.

We will be using a number of these functions throughout, and, as is always best practice, have a read of the help for them – and yes, they do include some great comment-based help.

There are a number of additional private functions in the Lability module that have comment-based help too, but again I wouldn't worry about these too much unless you need to do a lot of debugging or want to help add to the module.

The key Lability functions that you will need, likely in the below order, are:

  • Get-LabHostDefault
  • Set-LabHostDefault
  • Reset-LabHostDefault
  • Get-LabVMDefault
  • Set-LabVMDefault
  • Reset-LabVMDefault
  • Start-LabHostConfiguration
  • Get-LabHostConfiguration
  • Test-LabHostConfiguration
  • Invoke-LabResourceDownload
  • Start-LabConfiguration
  • Start-Lab
  • Stop-Lab
  • Get-LabVM
  • Remove-LabConfiguration
  • Test-LabConfiguration
  • Import-LabHostConfiguration
  • Export-LabHostConfiguration

These are just some of the functions available in Lability, and we will cover most of them in greater detail as we head through this article.

Lability Media Files

Lability has a number of different configuration files, all in JSON format: HostDefaults, VMDefaults & Media. All of these files are in the Config folder of the Lability module, which on your new machine will be C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config

The HostDefaults file contains all the settings associated with the Lability host machine. These include the paths where Lability will look for any ISOs, VHDXs, hotfixes and any additionally required resource files for our lab.

The VMDefaults file contains all the default settings associated with the created VMs. This includes the media used to create the machine, the startup RAM, the number of processors, and which virtual switch the VMs will use. This can be useful to have, just like HostDefaults, but as we will see later in this post, we are most likely to override it in our configurations.

The Media file contains the settings for any media that we want to use. As Lability by its nature was built for building labs, it uses evaluation-licensed media for the VMs.

The benefit of this is that the items already in this file allow you to get on with building labs almost straight away on a brand new machine.

This file includes some hotfix download links for getting the DSC updates on WMF4 for Server 2012R2 & Windows 8.1 – but don't worry, Lability uses these to download the hotfixes and embed them into the VHD files for you. One less job to worry about ;)

LabHost Defaults

Firstly we need to get the LabHost defaults set up correctly for our environment – this is important, and is also great for being able to move labs between machines if required (I've had to do this a fair amount myself), which is why I recommend that all the core Lability bits are installed on a separate drive.

Personally I'm using an external hard drive, but that is because my lab is portable. I have not tried this with a shared drive, however there shouldn't be much that needs to change to get it working that way.

On my external drive I have the following setup: a folder called Lability, containing all the folders required by Lability as detailed in the LabHost defaults (as we will see below), and another folder, Lability-Dev, from the zip of the GitHub repository that you could download before Lability was made available on the PowerShell Gallery. In essence this means I have a copy of Lability that I can edit as required – especially the three Lability configuration files detailed in the previous section – which also allows me to do additional debugging as required.

Firstly we will run Get-LabHostDefault, and this should return the below by default – this is because the HostDefault.json file is stored in the C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config location (remember 0.9.8 is the current version – yours may vary).

[Screenshot: Lability4]

As this is the default, and I've been using Lability on a few different machines, I have a copy of the file on my external HDD in the Lability folder. Let's see what that file says it should be.

[Screenshot: Lability5]

Well – that's not good! As you can see, on my last machine the external drive was the D drive, but on this machine it's the E drive. A simple (yet annoying) thing that we can easily change. Now, this could be done manually, but I decided I wanted to wrap it all together so that I don't have to think about it again. It is simple enough, so I just wrapped it in a very simple function, as seen below.

[Screenshot: Lability6.1]
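For those without the screenshot to hand, the wrapper is a sketch along these lines – the function name, and the Set-LabHostDefault parameter names, are assumptions based on the keys found in HostDefaults.json:

    # A sketch of the wrapper described above - the Set-LabHostDefault parameter
    # names are assumptions based on the keys in HostDefaults.json.
    function Update-LabHostDrive {
        param (
            # The drive letter the external drive has on the current machine
            [Parameter(Mandatory)]
            [ValidatePattern('^[A-Za-z]$')]
            [string] $DriveLetter
        )

        # Repoint every Lability host path at the new drive letter
        Set-LabHostDefault `
            -ConfigurationPath   "$($DriveLetter):\Lability\Configurations" `
            -IsoPath             "$($DriveLetter):\Lability\ISOs" `
            -ParentVhdPath       "$($DriveLetter):\Lability\MasterVirtualHardDisks" `
            -DifferencingVhdPath "$($DriveLetter):\Lability\VMVirtualHardDisks" `
            -HotfixPath          "$($DriveLetter):\Lability\Hotfixes" `
            -ResourcePath        "$($DriveLetter):\Lability\Resources"
    }

    # Usage on this machine, where the external drive is now E:
    Update-LabHostDrive -DriveLetter 'E'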

This allows me to update the defaults quite easily as I move between machines. It isn't an ideal scenario, but it works, at least.

The benefit of this is that it updates the HostDefaults file on both my C: drive and the external drive at the same time – which further means it is easier to keep portable.

We can then run the Reset-LabHostDefault function, and we should get something similar to the below:

[Screenshot: Lability7]

We can also do the same thing for the VMDefaults file, however I find this is less likely to be a requirement, as we can override the defaults in the configuration data files that we will work with – and this is my preferred method.

Once we have done this, we are ready to run the Start-LabHostConfiguration function – on a new machine this will go and create the required directories as specified in the HostDefaults.json file that I have shown you how to amend. The output from Start-LabHostConfiguration is below:

[Screenshot: Lability18]

We would then use Test-LabHostConfiguration to confirm that this is all correct, and we can see that this is the case below:

[Screenshot: Lability19]

Building your First Actual Lab

Wow, that was a fair bit of setup required, though a lot of it can be completely ignored depending on your own setup, or if you're revisiting this post.

Now we move onto the real meaty part of the post, and I'm going to use two examples for this – the bundled TestLabGuide, and one of my own for a SQL Server install.

So, starting with the TestLabGuide.ps1 file, there is only one small modification that I have made, at the end of the file: including the following two lines.

[Screenshot: Lability10]
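The two lines are along these lines – the exact output path, and the variable name, are assumptions:

    # The two lines appended to TestLabGuide.ps1 - compile the configuration
    # into mof files when the file is run as a script (output path assumed).
    $labConfigData = '.\TestLabGuide.psd1'
    TestLabGuide -ConfigurationData $labConfigData -OutputPath 'E:\Lability\Configurations'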

This allows me to build the configuration for these VMs by running the file as a script, and this is how I personally do it.

However, on a machine with no DSC resources, we have an issue if we are building VMs that are dependent on those DSC resources.

Well, within Lability there is a function called Invoke-LabResourceDownload, which has the ability to download all the required resources as defined in our configuration data file.
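A sketch of the call – I'm assuming here that the -DSCResources switch limits the download to just the DSC resources; check Get-Help Invoke-LabResourceDownload for the full parameter sets:

    # Download only the DSC resources declared in the configuration data file
    Invoke-LabResourceDownload -ConfigurationData .\TestLabGuide.psd1 -DSCResources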

Within the configuration data file shown below, the key section to look at at this point is the NonNodeData section, where we have a subsection for Lability configuration items. This can include EnvironmentPrefix, Media, Network, Resources &, most importantly for us, DSC resources.

So far I have found that we only need to run this for pulling the DSC resources defined in our configuration data file, as shown below – this is because we require them to be on the machine before we can build the mof files.

[Screenshot: Lability15]

I have found it best to pin the DSC resources with RequiredVersion rather than MinimumVersion, as it is by default in the TestLabGuide.psd1 file – this is a preference, but with the amount of change happening to the DSC resources, it's worthwhile being extra cautious here.
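As a sketch, the relevant part of the configuration data file then ends up looking something like this – the module names and version numbers are examples only:

    # Example NonNodeData section of a Lability configuration data (.psd1) file;
    # module names and version numbers are examples only.
    NonNodeData = @{
        Lability = @{
            DSCResource = @(
                @{ Name = 'xComputerManagement'; RequiredVersion = '1.4.0.0' }
                @{ Name = 'xNetworking';         RequiredVersion = '2.7.0.0' }
            )
        }
    }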

The output from Invoke-LabResourceDownload can be seen below, and as we can see, it has downloaded only the DSC resources that we specified in the configuration data file (TestLabGuide.psd1).

[Screenshot: Lability11]

This also means that on a clean machine you can be sure you have the right required versions – especially useful when building labs, in my opinion.

However, if you have multiple labs running concurrently, then the next bit may be an unfortunate blow to you.

Within the Configuration keyword we have a dynamic keyword defined – Import-DscResource – which you may have thought was a function.

Being a dynamic keyword, it works a little differently to a normal function/cmdlet, and we are therefore limited in what we can do with it – for example, we cannot use splatting with it, and we cannot pass the required DSC resource modules to it from outside the current file. This is required for the syntax highlighting that we get as part of the parser. If you want to learn more about the Import-DscResource dynamic keyword, then read this article by the PowerShell Team – be wary, it is from 2014 and there hasn't really been any better content on the subject since (that I can find, anyway).

My thought on this is that we should be able to pass the required DSC resources through from the configuration data file, as we have already detailed – however this isn't currently possible. To me it would be beneficial (& logical) to abstract this away from the configuration, as it is really part of the configuration data – especially seeing as we already have to pass configuration data to our outputted configuration keyword, in this case TestLabGuide. However, this is where we are at, and for the time being we need to mirror the DSC resources between the configuration itself and the configuration data file.

That aside, let's look at the node data, especially the AllNodes section, where NodeName = *.

[Screenshot: Lability16]

As we can see, in here we have a few settings for all the nodes in this configuration that share items from Lability – these include the items we had available in the VMDefaults file, as well as some other items that we want shared between the VMs, like DomainName.

Further down we can see that for the client VMs in this lab we are specifying different Lability_Media values for each – so it looks like we will have both a Windows 8.1 and a Windows 10 client machine in this lab.

[Screenshot: Lability17]
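Pulling those two screenshots together, the node data follows a shape like this – the node names and Lability_Media IDs are examples based on the bundled sample:

    # Example AllNodes section; node names and media IDs are examples based on
    # the bundled TestLabGuide sample.
    AllNodes = @(
        @{
            NodeName                = '*'
            Lability_SwitchName     = 'Corpnet'
            Lability_ProcessorCount = 1
            DomainName              = 'corp.contoso.com'
        }
        @{ NodeName = 'CLIENT1'; Lability_Media = 'WIN81_x64_Enterprise_EN_Eval' }
        @{ NodeName = 'CLIENT2'; Lability_Media = 'WIN10_x64_Enterprise_EN_Eval' }
    )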

 

That's enough about the configuration and configuration data side of things – let's go and build our lab.

At this point, all we want to do is the below.

[Screenshot: Lability12]

At this point you will be prompted for an Administrator password, and once that has been given, as we can see above, it will go and create all the mof files that we need for this lab. The next step is to kick off the actual build of the lab, which can be done as shown below.

[Screenshot: Lability13]
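In command form, the two steps are roughly this (paths assumed):

    # Step 1: compile the mof files - running the configuration script prompts
    # for the Administrator password used for the lab VMs.
    . .\TestLabGuide.ps1

    # Step 2: kick off the actual lab build from the same configuration data
    Start-LabConfiguration -ConfigurationData .\TestLabGuide.psd1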

This function, Start-LabConfiguration, is the key function of the module, as it will:

  • check that the Lability host is correctly set up, by calling Test-LabHostConfiguration – if not, it will throw an error (a possible update here)
  • download any required ISOs, as expressed in the configuration data, if the image matches one listed in the Media file – matching against the checksum value given in the Media file for the image
  • download any hotfixes that are detailed in the hotfix section of the matched media in the Media.json file
  • build a master VHDX file from the ISO & hotfixes as detailed for the media type for the lab VMs – it is worth pointing out that this is built of lots of smaller functions that are essentially based on the Convert-WindowsImage script
  • build a lab-specific VHDX file – currently set up as a 127GB dynamic differencing disk
  • build and inject a lab-VM-specific unattend.xml file into each lab VM VHDX
  • inject all required certificates into each lab VM VHDX
  • download & inject any resources defined in the Lability section of the NonNodeData section of the configuration data file – I will show more on this in the SQL example later on. These are injected into the lab-specific VHDX file
  • inject all required DSC resources into the resulting lab-VM-specific VHDX file
  • inject the mof and meta.mof files for each lab VM into the corresponding VHDX file

Seriously though – wow – that one function is doing a lot of what I would call tedious work for us, and depending on your internet connection speed it can take anywhere between maybe 30 minutes and a day to complete. The first time I ran it, it took about 7 hours for me due to slow internet – & I was also watching Netflix at the time 😉

You can see the final output from this Function below

[Image: Lability20]

Note – if you have your own media you could always create new entries in media.json for these to save the download time – especially if you have an MSDN license.

Now this is where the fun bit really starts and it also involves more waiting but hopefully not as long as the last bit took you.

All we need to do at this point is run Start-Lab as shown below and let DSC do its thing – note that I've used Get-VM and not Get-LabVM – this is a small issue that I have faced and have reported on the Github repo.

[Image: Lability21]
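For reference, that step is essentially just the below – a sketch, assuming the same configuration data file as before:

# Boot every VM in the lab, then list them with the Hyper-V cmdlet
Start-Lab -ConfigurationData .\TestLabGuide.psd1
Get-VM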

And here is an image of all the VMs running and getting started.

[Image: Lability22]

 

This part can take anywhere from 10 minutes to a few hours, depending on your VM rig setup, the amount of RAM allocated to each VM as part of your configuration data, whether there is a requirement to wait for other machines to be in their desired configuration, and the complexity of the configurations being deployed.

Under the hood, Lability has injected the DSC configuration into the VM VHDX and has set up a bootstrap process which in turn calls Start-DSCConfiguration, passing it the path of the mof files. You can have a look at how this is set up in a VM in the folder C:\Bootstrap\ if you are interested.
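In other words, inside each VM the bootstrap boils down to something like the below – a simplification on my part, so look at the real files in that folder for the full picture:

# Apply the injected mof files to the local machine
Start-DscConfiguration -Path 'C:\Bootstrap' -Wait -Verbose -Force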

Once that is done you'll have your first fully deployed set of VMs using DSC & Lability – pretty amazing isn't it!

 

SQL Server install – showing some of the other features of Lability

In this section I'll try and keep the content to a minimum, but still add in some additional useful screenshots.

My configuration data file is as below for the SQL Server node. Notice how we have the required properties to be able to install SQL: SourcePath, InstanceName, Features and the Lability_Resource.

[Image: Lability24]
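Roughly, that node entry has the following shape – the values here are illustrative, not the exact ones from my file:

@{
    NodeName          = 'SQL1'
    SourcePath        = 'C:\Resources\SQLServer2014\'  # where the setup media lands inside the VM
    InstanceName      = 'MSSQLSERVER'
    Features          = 'SQLENGINE,SSMS'
    Lability_Resource = @('SQL2014Media')              # points at a Resource entry in NonNodeData
}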

As this was taken from a previous configuration, it is using the xSQLServer DSC Resource – take a look at cSQLServer here, as this will likely be the version that gets ported to replace the xSQLServer & xSQLPS resources, as it is relatively close to being usable in place of the two. Expect news on this after PSConfEU.

Also note that in the Configuration Document we are specifying an additional item in the NonNodeData Section – Resource

[Image: Lability25]

This allows us to specify further resources that are stored in the E:\Lability\Resources\ folder (E: being my drive in this case, of course).

I'll let you decide what you want to put in that folder, but any items for installation from within the VM could be candidates – things like SharePoint media or SQL media or other installable programs etc. You could always add your personal script library in a zip file and then get Lability to unzip it into the right directory. The choice is up to you on this one – so be creative 😉
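For completeness, a sketch of how that Resource entry sits in the configuration data – the Id, Filename and Uri values are placeholders of mine:

$ConfigData = @{
    AllNodes    = @()  # node entries elided for brevity
    NonNodeData = @{
        Lability = @{
            Resource = @(
                @{
                    Id       = 'SQL2014Media'  # what a node's Lability_Resource points at
                    Filename = 'SQLServer2014-Eval.iso'
                    Uri      = 'https://download.example.com/SQLServer2014-Eval.iso'
                }
            )
        }
    }
}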

For this lab I didn't have the installation media already downloaded, so it had to be downloaded as part of the Start-LabConfiguration function – however, if you remember, there was an Invoke-LabResourceDownload function.

This has some additional parameters that allow you to download any of the required items for the lab configuration to succeed ahead of time. This can be useful, for example, if you happen to have a few hours where the available internet connection is much better than your own – especially if you are using this for personal testing rather than the professional lab testing it was originally designed for.
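Usage is as simple as the below – there are also switches to scope the download, but check Get-Help Invoke-LabResourceDownload -Full rather than taking my word on the exact parameter names:

# Pre-stage the media and resources while you have a decent connection
Invoke-LabResourceDownload -ConfigurationData .\TestLabGuide.psd1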

One of the other great things with this module is that you can make use of it for your lab environments regardless of whether your shop is using WMF5 or not. If you're still running WMF4 (with the essential DSC updates) then you can still build labs using this.

Wrap up

Well I hope you’ve enjoyed reading this 3800+ word post of mine and this helps you get to grips with building out Labs in an easy and repeatable way whilst having the chance to play with DSC to do it.

Remember that this Module DOES A LOT behind the scenes – if it didn’t there wouldn’t be the need for this post – and there is more functionality being introduced as appropriate all the time.

Lability is built for building labs – however you could easily use it for building production-like environments, if you dare, and I can see the benefit of doing so – I mean, why re-invent the wheel when Lability will do a lot (most) of the work for you?

As with getting to grips with most new modules, always start with the help files. This module has a number of about_* help files, and almost all the functions (even the internal ones) have comment-based help.

This is a module where you need to RTFM to really understand all the workings of it. Spend a few hours looking through it and understanding it as best as you can – it will be so worth it in the long term, even after reading this post a decent number of times.

I do however take my hat off to Iain Brighton (@iainbrighton) for creating this module, and for me it is the only module to use when building lab environments – so let's gather some momentum as a community to suggest enhancements and improve it even more over on the Github repo.

My example files that I have used (especially the SQL one) will be made available in due course once Iain has decided on a scalable way forward for being able to share Lab Configurations. We have discussed a number of options (yet to be added to this issue) and if you have an Idea please add it via the Lab Sharing Issue on Github.

This is just the first in a series of posts that I intend on doing on Lability – future ones will be much shorter but will focus on the functions that I haven't covered in this post, along with some of the more interesting parts in more depth. However, I expect that this will be a good starting point for you to get to grips with the Lability module and start building and running test labs.

As per usual please let me know your thoughts on this post whether it’s via Twitter or via the below comments section and I hope you have enjoyed the read.

Awarded the MVP Award – What this means to me and the future for the community

The MVP Award is defined by Microsoft as the below

Microsoft Most Valuable Professionals, or MVPs, are community leaders who’ve demonstrated an exemplary commitment to helping others get the most out of their experience with Microsoft technologies. They share their exceptional passion, real-world knowledge, and technical expertise with the community and with Microsoft.

This means that within the different areas of the Microsoft Stack there are those out there that really believe that the world can be a better place when we come together as a united front and share the knowledge that we have.

This can be knowledge that we have gained through personal experience of working with the products that we find the most interesting and beneficial to our personal & professional lives, or through being there as a point of call for other members of the community to reach out to.

One thing about the MVP Program that has always struck me as amazing is the willingness of the MVPs to do what they can to help you, even if it doesn't immediately help them achieve anything, often giving away a decent-sized proportion of their own time to do so. On reflection, having received this award, over the last year I've been doing the same – although completely unaware that I had been doing so.

I have attended a number of different events in the last year (for more details check out the Where I Have Been page) and have met a tremendous number of amazing people at all these events. It was the framework of the SharePoint & SQL User Groups within the UK that led me to start thinking about reviving the PowerShell User Groups – I have blogged about this in this post, and I have enjoyed every minute of it.

The future for the UK PowerShell User Groups already looked good; with being awarded MVP last week, the connections that I will make from being part of the UK MVPs will hopefully allow the User Groups to grow in the coming months/years – so expect news of new User Groups forming across the UK.

To help the groups grow, I'll be putting together an "Organisers Pack" which will contain useful information and a collection of the tools, contacts and general tips required to help those interested in running a local group get it off the ground – however, if in doubt, get in contact with me.

 

However, there is another aspect to receiving the MVP Award that I want to touch on briefly. As part of the MVP Program, MVPs get the opportunity to help out in more community-focused events – some run by Microsoft, others run by the community, and others run by non-profit organisations or the education sector. Giving back to the immediate communities is always going to be high up on my list of priorities, and I am really looking forward to working with some of the bigger and more personally touching social opportunities over the next year.

 

This does mean that my calendar will be much busier but for me the end result is always going to be worth it.

Finally – A small shoutout to those that have supported me over the years and especially the last year and although I will not name anyone in particular, I’m sure that those people already know who they are!

2016 – 1 Quarter Down and 3 more to go and the Fun has only just begun!

Fooled Ya! Today I became a MVP!

 

Well only if you read this post

[Image: MVP2016]

This is an exceptional honour to have been awarded the MVP for Cloud and DataCentre Management and to me this kinda feels like an early birthday present from Microsoft (my birthday is on Monday)

This isn’t something that I ever expected to achieve however it is a recognition from Microsoft themselves of the work that I have previously done for the community.

I started off down the community path only last year, and in that time I have made some amazing friends and met a number of other MVPs along the way.

For the remainder of 2016 I have a lot planned to help further enhance the community and hopefully break down some of the barriers between the IT Pro world and the development world that PowerShell has found itself right in the middle of, to make this technology more accessible to all that need to use it.

With that in mind over the next few months there will be some further announcements about Get-PSUGUK – the UK PowerShell Community and its evolution.

As part of the friends I've made in the MVP community, it has been agreed with Chris Testa-O'Neill, MVP Data Platform, that this year at SQL Saturday Manchester there will be a dedicated PowerShell track. This will consist mainly of introduction sessions for those that have no/little PowerShell experience, but there will also be some sessions on using PowerShell with a SQL focus. This is an amazing FREE event, and it is as much an honour for me to be working on that as it is to receive the MVP Award – so if you're interested in attending, check out http://www.sqlsaturday.com/543 – session announcements will follow in the coming months.

Stay tuned for more details in future and as always – Keep Learning, Networking & Finishing what you Start.

 

Now for the weekend of celebrations to begin 🙂

Thank you Microsoft and an even bigger thanks to you – the people reading this Post, keep doing what you do and helping make the community as great as it is.

Invoking PSScriptAnalyzer in Pester Tests for each Rule

This is a quick walkthrough on how you can get output from PSScriptAnalyzer rules in your Pester tests.

So you’ll need

  • Pester ( Version 3.4.0 or above )
  • PSScriptAnalyzer ( Version 1.4.0 or above )

Please note this is shown running on PowerShell v5 as part of Windows 10 Build 14295 – results may vary on other PowerShell versions.

Given the way we want to work, we may have new ScriptAnalyzer rules in the near future (a new version / additional community additions / your own custom ScriptAnalyzer rules etc), and we would want to ensure that we test for them all without having to change much of the below code – so we enumerate the rules dynamically within our Context block.

 

So our example code in our Pester Test would look like this

#Script#PesterScriptAnalzyerExample#
$Here = Split-Path -Parent $MyInvocation.MyCommand.Path
$Scripts = Get-ChildItem "$Here\" -Filter '*.ps1' -Recurse | Where-Object {$_.Name -NotMatch 'Tests.ps1'}
$Modules = Get-ChildItem "$Here\" -Filter '*.psm1' -Recurse
$Rules = Get-ScriptAnalyzerRule

Describe 'Testing all Modules in this Repo to be correctly formatted' {
    foreach ($module in $Modules) {
        Context "Testing Module - $($module.BaseName) for Standard Processing" {
            foreach ($rule in $Rules) {
                It "passes the PSScriptAnalyzer Rule $rule" {
                    (Invoke-ScriptAnalyzer -Path $module.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                }
            }
        }
    }
}

Describe 'Testing all Scripts in this Repo to be correctly formatted' {
    foreach ($script in $Scripts) {
        Context "Testing Script - $($script.BaseName) for Standard Processing" {
            foreach ($rule in $Rules) {
                It "passes the PSScriptAnalyzer Rule $rule" {
                    (Invoke-ScriptAnalyzer -Path $script.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                }
            }
        }
    }
}

And the result that we would get would be the below

 

PS C:\Users\Ryan\OneDrive\GitHub\kilasuit\Scripts-WIP\PesterScriptAnalzyerExample> .\PesterScriptAnalzyerExample.ps1
Describing Testing all Modules in this Repo to be correctly formatted
Describing Testing all Scripts in this Repo to be correctly formatted
   Context Testing Script - PesterScriptAnalzyerExample for Standard Processing
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingCmdletAliases 233ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidDefaultValueSwitchParameter 124ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingEmptyCatchBlock 134ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidGlobalVars 87ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidInvokingEmptyMembers 104ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidNullOrEmptyHelpMessageAttribute 70ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingPositionalParameters 879ms
    [+] passes the PSScriptAnalyzer Rule PSReservedCmdletChar 75ms
    [+] passes the PSScriptAnalyzer Rule PSReservedParams 81ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidShouldContinueWithoutForce 85ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingDeprecatedManifestFields 117ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidDefaultValueForMandatoryParameter 123ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingUserNameAndPassWordParams 95ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingComputerNameHardcoded 113ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingConvertToSecureStringWithPlainText 98ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingInvokeExpression 75ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingPlainTextForPassword 103ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingWMICmdlet 138ms
    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingWriteHost 91ms
    [+] passes the PSScriptAnalyzer Rule PSMisleadingBacktick 100ms
    [+] passes the PSScriptAnalyzer Rule PSUseBOMForUnicodeEncodedFile 100ms
    [+] passes the PSScriptAnalyzer Rule PSUseToExportFieldsInManifest 87ms
    [+] passes the PSScriptAnalyzer Rule PSUseOutputTypeCorrectly 128ms
    [+] passes the PSScriptAnalyzer Rule PSMissingModuleManifestField 84ms
    [+] passes the PSScriptAnalyzer Rule PSPossibleIncorrectComparisonWithNull 99ms
    [+] passes the PSScriptAnalyzer Rule PSProvideCommentHelp 98ms
    [+] passes the PSScriptAnalyzer Rule PSUseApprovedVerbs 75ms
    [+] passes the PSScriptAnalyzer Rule PSUseCmdletCorrectly 867ms
    [+] passes the PSScriptAnalyzer Rule PSUseDeclaredVarsMoreThanAssigments 82ms
    [+] passes the PSScriptAnalyzer Rule PSUsePSCredentialType 91ms
    [+] passes the PSScriptAnalyzer Rule PSShouldProcess 160ms
    [+] passes the PSScriptAnalyzer Rule PSUseShouldProcessForStateChangingFunctions 86ms
    [+] passes the PSScriptAnalyzer Rule PSUseSingularNouns 177ms
    [+] passes the PSScriptAnalyzer Rule PSUseUTF8EncodingForHelpFile 176ms
    [+] passes the PSScriptAnalyzer Rule PSDSCDscTestsPresent 98ms
    [+] passes the PSScriptAnalyzer Rule PSDSCDscExamplesPresent 102ms
    [+] passes the PSScriptAnalyzer Rule PSDSCUseVerboseMessageInDSCResource 81ms
    [+] passes the PSScriptAnalyzer Rule PSDSCUseIdenticalMandatoryParametersForDSC 110ms
    [+] passes the PSScriptAnalyzer Rule PSDSCUseIdenticalParametersForDSC 74ms
    [+] passes the PSScriptAnalyzer Rule PSDSCStandardDSCFunctionsInResource 122ms
    [+] passes the PSScriptAnalyzer Rule PSDSCReturnCorrectTypesForDSCFunctions 101ms

 

This allows you to see from your test output exactly which rule fails, and as shown it can be used for both scripts and modules.

It's also a nice example of getting Pester to test your Pester tests 😉

This example is being added into ISE_Cew (see post) in the next feature release (some point next week), though you can just copy and paste it from this blog post as well, thanks to a PowerShell ISE add-on called Copy As HTML by Gary Lapointe – you can find more about it and download it at http://blog.falchionconsulting.com/index.php/2012/10/Windows-PowerShell-V3-ISE-Copy-As-HTML-Add-On/

 

Please note that although the above works fine, I don't see the point in running a Describe block if the tests within it won't run, so I'm adding what I think to be the better version below – this will only run the Describe blocks if there are any scripts or modules.

#Script#PesterScriptAnalzyerExample#
$Here = Split-Path -Parent $MyInvocation.MyCommand.Path
$Scripts = Get-ChildItem "$Here\" -Filter '*.ps1' -Recurse | Where-Object {$_.Name -NotMatch 'Tests.ps1'}
$Modules = Get-ChildItem "$Here\" -Filter '*.psm1' -Recurse
$Rules = Get-ScriptAnalyzerRule

if ($Modules.Count -gt 0) {
    Describe 'Testing all Modules in this Repo to be correctly formatted' {
        foreach ($module in $Modules) {
            Context "Testing Module - $($module.BaseName) for Standard Processing" {
                foreach ($rule in $Rules) {
                    It "passes the PSScriptAnalyzer Rule $rule" {
                        (Invoke-ScriptAnalyzer -Path $module.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                    }
                }
            }
        }
    }
}

if ($Scripts.Count -gt 0) {
    Describe 'Testing all Scripts in this Repo to be correctly formatted' {
        foreach ($script in $Scripts) {
            Context "Testing Script - $($script.BaseName) for Standard Processing" {
                foreach ($rule in $Rules) {
                    It "passes the PSScriptAnalyzer Rule $rule" {
                        (Invoke-ScriptAnalyzer -Path $script.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                    }
                }
            }
        }
    }
}

C’Ya Manchester – Hello Derby!!

After yet more changes in my housing (see my previous post for a bit of background), I have decided to settle in Derby, and this has been for a few very good reasons:

  • I’ve got a good group of friends here.
  • Manchester, London, Nottingham, Birmingham & Wolverhampton are all short train journeys away and I’m expecting to spend more time between them in the coming months ahead.
  • I almost moved here back in January – but I decided to try and give Manchester another go from the work perspective, and that seemingly wasn't to be

However I made a good group of friends in Manchester at the various events I’ve been to there over the last 2 years (more so the last year – see this page for more details) and I’ll still be trying to attend some of the amazing events there when it is feasible.

This is yet another new chapter in the book of my life and it is one that really should allow me to lay the foundations for the future a bit easier.

3 & half Months into 2016 and it feels like its the beginning of yet another new year.

Pulling the Community Together to Improve the Quality of PowerShell Modules

In a discussion that started on Twitter a while back with June Blender about the quality of the modules being posted to the PowerShell Gallery, I had an idea for a way that we as a community could help improve this – using the tools that we have available to us and, more importantly, the expertise of the rest of the community to help shape and guide the direction of modules.

The idea starts off with a simple GitHub organisation, in this case this one – PowerShellModules – in which we set up a number of teams. Some of the teams that I have had in mind include, but are not limited to: Documentation, Unit Testing, Practice Guidance, and Module Maintainers.

This means that there will be a much more open way of directing the development of modules for the community, whilst still allowing modules to be developed in a way that lets others add to them instead of branching out and creating a second module. It also means there is a stronger focus on modules being kept up to date, as the overhead isn't on a single maintainer but can be spread across multiple maintainers at any given time.

 

My thoughts around this would start with me porting my current ‘in progress’ modules to the organisation and building out the teams as mentioned above with a suggestions/RFC like repository that would allow us to drive this for the better of the community.

The end goal from my perspective would be to have one community-recommended module for a technology (examples being GitHub, Trello, VSTS, Slack etc) that has been widely community-developed and covers as much of the functionality as it can without the need for a separate module.

We have a great community of people and I think that this is the right point in time to start the drive to improve the quality of what we all output to the community in a targeted and effort efficient method.

If you're interested in getting involved, please comment on this post with your GitHub username and I'll get you added to the organisation in due time – but please keep an eye out for the email from GitHub requesting you to join the organisation, especially in the junk folder 😉

The Power of the Humble Pint and the Community when things are difficult!

Disclaimer: this isn't a fun post (to read or to write), nor is it a technical post – it is a reflection on the last few years and is by its very nature quite a personal post. I expect that there will be some pushback about this post in future, and I would humbly ask that you try and imagine yourself having been in my shoes, both at the time of these events happening and at the time of writing this post.

If this isn’t the sort of thing that you wish to read then I would advise that you don’t carry on from this point and find something else more fitting to your taste.

 

Sometimes I have those moments where my thoughts can be expressed along the lines of “Fuck this I’m getting out of IT” or in other cases it can be put forward even simpler “Fuck it – I’m Done with all of this”

Now I know that likely read in a way that is different from the actual truth behind it, but this is something that we in IT (and in adult life generally) tend to be expected to sweep under the carpet as if it doesn't actually exist. For me this is no longer acceptable, and I will not hide my emotions away like this, especially when I am struggling to deal with it all. I, like many others in similar situations, have to find my own ways of coping with the daily struggles and the tipping points that we each have.

My tipping points have always been centred around home-life stability – something that most people take for granted, however for me this has been something that has been in almost constant flux since I was 16. I’ve had some good spells where things have been stable for almost 2 full years but I’ve also had the other extreme where things have been extremely volatile for a time which could typically be anywhere up to 6 months or so.

This being an area that I'm frequently reminded of, I decided in the spirit of writing this post that I would do some data digging and work out a few things around my home life that could be considered statistically interesting – or at least it was to me at the time. It has been something that I have been thinking of doing for quite some time now, so when better to do it than when I want to write about it.

I've only worked out the places I've lived since (just before) I turned 16 (I'm 26 in 2 weeks and think that 10 years is a good measure point), and I haven't split all hotels out into separate rows in my spreadsheet – if I did that then I feel it would skew the data, but for my own amusement I will likely do this at some point in future.

However the crunch really comes down to the numbers and I have found the following facts – some of which shock me now by looking at them

  • I’ve lived in roughly 24 different places in 10 years – this includes adding spells of stays in hotels, hostels and even a spell of time spent in my car/s
    • I’ve rented 11 places – these I had actual tenancy agreements in place
    • I’ve lived in 8 different forms of shared housing – this doesn’t include any spells of living with family or those that would be classified as family
    • I’ve lived in 4 places that were the direct result of being made homeless – although this is technically skewed as the only time this happened was when my ex partner, son and I were all made homeless – Other than this I have received NO HELP from any UK Council Housing department as I am seen as “Not in Priority Need” according to their data charts
    • I’ve spent 4 different spells where I’ve basically lived in hotels prior to this weekend (as it will be spell number 5 now)

The duration of the average spell is roughly as follows

    • Days per place = 152
    • Weeks per place = 21
    • Months per place = 5
    • Years per place = 0.42

 

The Longest spell is as follows

    • Days = 663
    • Weeks = 94
    • Months  = 21
    • Years  = 1.82

 

So where does this fit in with the Title of this blog post?

Well, I suppose it really all started for me in September 2013, when it became the right time to make the decision to move away from everyone that I knew and try to start afresh closer to work – and part of that was working out how to end up meeting new people. Thankfully I learned of the SharePoint User Group, of which I've become a regular attendee, as well as a few other groups thanks to them having Meetup pages – these included LeanAgile and DigiCurry, to name a few.

This was the type of surrounding where I realised I felt comfortable meeting new people, and I strongly feel that through these groups (& the resulting conferences I've attended too) I've made some amazing friends along the way. At some point I came to the conclusion that I would most likely feel just as comfortable on the presenting side of the groups, not just as an attendee, and that started me off on the journey as stated in this post and followed up in this post, this post and this post.

However, it's not all been fantastic throughout the years, but having found various communities that I enjoy attending, I have somehow managed to scrape through all the difficult moments and made it this far. It is now getting to the point, though, where the number of "Fuck this I'm getting out of IT" or "Fuck it – I'm done with all of this" days is getting somewhat out of balance with what I'm able to maintain. A key part of this is again due to my current housing situation, aka a downright mess.

With this in mind I suppose what I’m getting at is that without the communities that I’ve become a part of I’m not sure I would be able to write this post.

Also, over the last few weeks I've been asked about the whys & hows of how I manage to juggle it all. The short and simple answer is that all these communities are essentially part of the extended family to me, and with this I feel that at times I want to see and help direct the communities' growth for the benefit of the community members, much akin to how a parent would do the same for their children – something that I currently cannot do for my own children.

As I see the communities as extended family I plan things around “the next event” which has been a major driving force keeping me going over the last year.

So with all that in mind, there may be times ahead where I'm struggling, especially more so with the core items in life, but that won't ever stop me from continuing the work I put into the community where it is still feasibly possible. There may be times ahead where I may unfortunately need to let down conferences/user groups that I've promised my time to speak at – this, as things stand, is an unfortunate by-product of the situation I currently find myself in, and if I can in any way mitigate having to do so then I will. Currently, though, it is looking like I may have to cancel everything that I have planned ahead, which is frustrating and infuriating, especially when I've been so looking forward to the events I've got planned over the coming weeks/months.

 

2016 was supposed to be the beginning of an amazing year where things fell into place, however 3 (almost 4) months in and its feeling like 2013, 2014 & 2015 all over again.

 

It's 4:12 and I really shouldn't hit publish on this article, but I feel that it needs doing – yes I'm 25 and yes I suffer from depression, but that hasn't stopped me achieving a hell of a lot in the last few years and I don't expect that to change. I can and will get through it, no matter how difficult the roads ahead are for me.

Updated! Quick Win – Install PowerShell Package Management on systems running PowerShell v3 / v4

**Update 9th March 2016** The PowerShell Team released an updated version of the PackageManagement modules today; I've updated the script accordingly and it will install the latest PackageManagement modules for you, with a little verbose output.

Updated Microsoft blog is at https://blogs.msdn.microsoft.com/powershell/2016/03/08/package-management-preview-march-2016-for-powershell-4-3-is-now-available/ **

 

This is a very, very quick post about the latest feature being made available downlevel from PowerShell v5.

Microsoft have released PackageManagement (formerly OneGet), which is now available for PowerShell v3 & v4 as detailed in this link http://blogs.msdn.com/b/powershell/archive/2015/10/09/package-management-preview-for-powershell-4-amp-3-is-now-available.aspx

That's right – the ability to pull directly from the PowerShell Gallery – but you need to install the PackageManagement release, which I've scripted for you here.

And what better way than a simple 1 liner to grab and run the script

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/PSPackManInstaller')

And if you want to look at the script then the direct link is http://bit.ly/PSPackManInstaller – this takes you to the RAW version of the file on Github, so it will not download or execute anything, but will allow you to read it.

Hope this is useful for you

PS credit goes to @lee_holmes for the idea from following tweet https://twitter.com/lee_holmes/status/318799702869041153

Beware where you place Comment based Help

Whilst working on the PSWordPress module that Stephen Owen (@foxdeploy) has started, I came across an interesting issue after running my Pester tests, which call $ModuleFunction.Definition.Contains('.Synopsis') | Should Be True to check for comment-based help – it was failing even though I had comment-based help in there. The problem was that the help was above the function keyword, which means it wasn't carried through to the $ModuleFunction.Definition property. So this is a good example of why comment-based help for a module's functions should be held within the function's opening curly bracket.
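A minimal illustration of the gotcha – both functions get help, but only the second carries the synopsis in its Definition property:

# Help above the function keyword still works for Get-Help, but it is NOT part
# of the function body, so a Definition-based test will miss it:
<#
.Synopsis
   Gets a greeting.
#>
function Get-GreetingOutside { 'Hello' }

# Help inside the braces IS part of the function body:
function Get-GreetingInside {
    <#
    .Synopsis
       Gets a greeting.
    #>
    'Hello'
}

(Get-Command Get-GreetingOutside).Definition.Contains('.Synopsis') # False
(Get-Command Get-GreetingInside).Definition.Contains('.Synopsis')  # True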

Updated! Quick Win – Install-WMF5 (again)

 

** UPDATE 25/12/2015** Due to WMF5 Install issues the InstallWMF5.ps1 Script has been removed from GitHub until the PowerShell Product Team re-release the WMF5 installers. Once re-released I will re-release the InstallWMF5.ps1 script **

 

**UPDATE 24/02/2016** WMF5 was re-released today and the below scripts should still work**

 

This is a very, very quick post about installing WMF5 on Windows 8.1 or Server 2012 / Server 2012 R2 via this script I created.

And what better way than a simple 1 liner to grab and run the script

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/InstallWMF5')

And if you want to look at the Script then direct link is http://bit.ly/InstallWMF5

Hope this is useful for you

PS credit (again) goes to @lee_holmes for the idea from following tweet https://twitter.com/lee_holmes/status/318799702869041153

PPS – There is an issue with running this on Windows 7 / Server 2008R2 machines due to the need for WMF4 to be installed.

I am working on this but this is low priority for me as really we shouldn’t be deploying Server 2008 or Windows 7 Systems any more

PPPS – you will see a theme here – I am intending to build functions like this and release them as a full module in the PowerShell Gallery to help automate these tasks – so look out for some more like this in future.

Travelling to London – yet again!

Today I'm off to London for the 5th/6th time already this year. This time I'm off to present at the WinOps Meetup and then attend the Chef Users Meetup the following day.
My Presentation at WinOps will be the one that I gave for the Mississippi PowerShell User Group.

Looking forward to seeing old faces and new ones there as well!

Presentation to Mississippi PowerShell User Group now Available on Youtube

I recently presented to the Mississippi PowerShell User Group via Skype for Business on Tuesday 9th (well, Wednesday, as it was 2:30am when I presented for them).

The video from that session is now online at https://youtu.be/z3CmI73LnyI

My session was around my Script & Module Creation Workflow and the Tight integration with Git & Pester that I have included in a module that is an Addon to the PowerShell ISE – Called ISE_Cew

ISE_Cew can be downloaded from the PowerShell Gallery in 1 line

Install-Module ISE_Cew

This is only if you are running PowerShell v3+ and have installed the PackageManagement Additional install – details at PackageManagement on v3/v4

Otherwise you can install it from GitHub at https://github.com/kilasuit/ISE_CEW

Hopefully this will be helpful for you and I’ll look forward to gathering feedback from you in future.

Recapping a fun filled January

January was a month where I did a lot of travelling and attending different user groups.

I attended the following events

  • 12th – PASS SQL London Chapter
  • 14th – Microsoft UKITCAMP event – What’s New in Windows 10 Enterprise
  • 14th – WinOps Meetup – London
  • 19th – SharePoint User Groups – Leeds
  • 20th – Microsoft UKITCAMP event – What’s New in Server 2016 – Manchester
  • 20th – LeanAgile Meetup – Manchester
  • 26th – .Net / Windows Platform User Group – Manchester
  • 28th – SQL User Group – Ipswich

Now those that know me will know that I attend the events because I’m genuinely interested in the topics at the events or catching up with the people that are there after all the events are all about the community and the networking opportunities that they bring to us.

I intend to in future post ahead of where you can catch me in the following month and then at the end of the month detail more about what I learned at the events.

So what did I get up to at these events?

SQLPass London – we were treated to 2 great presentations – one on Azure Machine Learning by Peter Myers who was over from Australia for the NDC conference and one on the power of Pyramid Analytics for use in SQL Server. Both left me wanting to learn more when I get chance to.

What's New in Windows 10 – this event was great for the insights into how the future of device deployment can be done in the mobile workforce. This is an ever-changing and evolving story, and one that I'm hopeful organisations will start to work with and combine with the power of the cloud. Stay tuned for more on this in a later post.

WinOps – Well this was definitely one of the highlights of the month for me – mainly because it was a great event and from it I’ve made some great contacts and will be looking forward to seeing where the WinOps group expands to in the coming months as well as the upcoming WinOps conference.

SharePoint User Group – I always enjoy attending this User Group as the organisers always make sure to make the event great. I also learned about Node.js from Garry Tinder and this event was the only one where I gave a presentation this month and it was on PowerShell DSC for SharePoint. It spawned off some good conversations and I know I’m not the only one that is interested in seeing where this will lead. We were also treated to some lightning sessions from Simon Hudson & Simon Doy.

What's New in Server 2016 – Server 2016 is again an evolving story, and it's important now more than ever to understand how this is the case and how it can affect business decisions going forward. Nano Server will be a big change to how IT teams manage their infrastructure – and guess what technology will be an even bigger part of the IT admin's life? Yes, that's PowerShell, which is now pretty much 10 years old.

LeanAgile – We had a great talk from Amy Lynch about the diversification challenges in IT and although it was more geared to the challenges of recruiting more women in IT the conversations that came out of it were certainly what made the night worthwhile.

Windows Platform / .NET – this was actually just a really good casual social event, as there wasn't a topic for the evening – and one that lasted until 5am (yes, on a Tuesday).

SQL East Anglia – Well I must have been a tad bit mad to attend this. 8 hours of driving to get there and back but it was certainly worth it. Being from a SharePoint background I’ve always had the mindset that it’s a sensible idea to learn more about the underlying Data Platform and I’m certainly not wrong in that thinking. Perhaps it’s the statistician in me coming through but data analysis and manipulation is definitely one of those dark arts that keeps me entertained. Perhaps that’s why I see a fantastic future in Azure Machine Learning and PowerBI. The session was one that took my interest as understanding what’s new in SQL 2016 for the BI professional is something that certainly ticks the boxes for me in the future of IT technology. Also being able to have a good chinwag with Mark Broadbent and congratulate him in person on his MVP Award also made the trip worthwhile for me. I’m a big advocate of the community and rewarding those that make the community a better place and I’m glad that Mark has got this Award under his belt.

Well, that was what I got up to in January – February has been busy so far and I'll post about it in due course.

#PSTweetChat – The EU Edition


If you have been involved in the #PSTweetChat events that have been running with Adam Bertram (@adbertram) & Jeffery Hicks (@JeffHicks) and a number of others, then you would be aware of just how awesome these 1-hour open discussion sessions truly are.

A number of PowerShell questions get asked and answered by members of the PowerShell community worldwide, so these sessions can become a valuable resource for getting the right answer to an issue quickly – or even just for learning more about the people that make up this awesome community and what they are currently up to that week.

So after a discussion about this with Adam Bertram & Jeffery Hicks on a previous #PSTweetChat I had said that I would co-ordinate a #PSTweetChat that was more EU time friendly – Well I am announcing that the EU #PSTweetChat will be monthly on the 3rd Friday of the month starting on February 19th at 10am UTC 0

This will be the case for February & March and then in April (due to the Time changes to the Clocks) we will move to 10am UTC +1

So the dates will be (mark them in your calendar)

  • February 19th – 10am UTC 0
  • March 18th – 10am UTC 0
  • April 15th – 10am UTC +1
  • May 20th – 10am UTC +1
  • June 17th – 10am UTC +1
  • July 15th – 10am UTC +1
  • August 19th – 10am UTC +1
  • September 16th – 10am UTC +1
  • October 21st – 10am UTC +1
  • November 18th – 10am UTC 0
  • December 16th – 10am UTC 0

I will look forward to the future #PSTweetChat conversations.



Get-PSUGUK – Call for Speakers

The UK PowerShell User Groups (Get-PSUGUK) are undergoing an expansion, with new User Groups springing up across the UK over the upcoming months.

If you have been able to attend any of the previous events (Manchester & London) then you will know that I’m a big advocate for making a real community out of the User Group Meet ups – one where there is the opportunity for those from all differing IT backgrounds to rise up and present a topic to their local User Group.

With the number of differing PowerShell related Topics that there are available there should be no shortage of possible topics and there will be availability for a variety of different formats including short 15-minute Lightning presentations, 45-minute Presentations and even possibility for a full evening presentation.

With this in mind we are putting forward a Call for Speakers for the year ahead – if you are interested in presenting a topic then we have an Excel survey that you can fill in, located at http://1drv.ms/1OVuqul – please note that we are not currently looking at sessions delivered remotely.

Myself and the fellow organisers, Corey Burke (@cburke007), Iain Brighton (@IainBrighton) & Richard Siddaway (@RSiddaway) will look forward to seeing you at future User Group Events and would like to invite you to follow @GetPSUGUK on Twitter for updates on the PowerShell User Group Events in Future.

To Sign up for the Manchester User Group on Feb 1st please follow this link – http://www.eventbrite.co.uk/e/get-psuguk-manchester-february-tickets-20117867082

To Sign up for the London User Group on Feb 4th please follow this link – http://www.eventbrite.co.uk/e/get-psuguk-february-london-tickets-20727283864

To see future events (including new cities as they occur) please bookmark this link – http://get-psuguk.eventbrite.com/

 

My Workflow for Using Git with Github – pt3

So this is Part 3 of a series of Blog Posts on my (currently ever changing) Workflow with Git, Github & PowerShell.

Hopefully you have had chance to look at the previous posts in this series if not they are

Part 1

Part 2

However, for this post we will be concentrating on Script & Module Creation and how we can make the overall experience more efficient with the PSISE_Addons module that I’m releasing on Github https://github.com/kilasuit/ISE_Cew

We will cover the following items today

  • Use of PSDrives for the functions & why you should use them in this case
  • Use of Git for the source control in this module – Simple and hopefully clear to follow and not too in depth
  • The Functions used in this module
  • Creating compliant PSD1 files for the PowerShell Gallery – Because it’s annoying to have to do this manually and that’s why we automate right – Again added in this Module is an example!
  • Creating some basic Pester Tests – again without even thinking about it as I am giving this to you as part of the ISE_Cew Module!

 

So firstly – Using PSDrives within the Function and why to use them.

PSDrives are a good and simple way of having a location that you can reach only in PowerShell, and they can use a variety of different providers – FileSystem, ActiveDirectory etc.

We will be using the FileSystem Provider in this example for our functions.

So I begin with a few PSDrives created in my PowerShell profile, as you can see below – I use one profile and wrap ISE-only functions (like the 3 I will be showing you today) in an if statement that checks whether the host is the PowerShell ISE.
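For those without the screenshot, a cut-down sketch of that profile section – the drive names mirror mine, but the roots are illustrative, so point them at your own locations:

if ($Host.Name -eq 'Windows PowerShell ISE Host') {
    # ISE-only setup lives in here
    New-PSDrive -Name GitHub      -PSProvider FileSystem -Root "$env:UserProfile\OneDrive\GitHub"      | Out-Null
    New-PSDrive -Name Scripts-WIP -PSProvider FileSystem -Root "$env:UserProfile\OneDrive\Scripts-WIP" | Out-Null
    New-PSDrive -Name Modules-WIP -PSProvider FileSystem -Root "$env:UserProfile\OneDrive\Modules-WIP" | Out-Null
}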

[Image: 011416_1355_MyWorkflowf1]

As you can see, I have a PSDrive for each of the following: OneDrive, Github, Scripts, Scripts-WIP, Modules & Modules-WIP.

The important bit here is that all my Github repos are actually stored in my personal OneDrive, as you can see from the above image – this means that I can switch between devices very, very quickly once things are saved 😉 – it's probably key to point out this could be your OneDrive for Business location as well, or a shared drive if you are in an organisation that uses home drive locations. The possibilities are endless – save your imagination.

So from here we have our PSDrives set up, and the beauty of this is that it allows very simple navigation between repos, as you have them all centralised. In my next post I will be showing you how you can populate this Github location with all the repos that you have forked, and how you can ensure that all repos are up to date, with the latest updates pulled into them or commits pushed from them, in just a few functions! So stay tuned for that!

Hopefully this will leave you with a reason to adopt PSDrives into your workflow and we can move onto the next section.

Use of Git for Source Control in this module

Quick intro – Git can be used with your own offline repos – it doesn't need to be linked to a Github repo, however this is most common and I would recommend that you use Github – you can get 5 private repos for only $7 USD a month (about £4 odd).

For more information on Git for Source Control if you are new to it I would recommend having a look at this series on PowerShellMagazine http://www.powershellmagazine.com/2015/07/13/git-for-it-professionals-getting-started-2/ – that was how I got started and also have a play with the “Learn Git in your Browser” on http://try.github.com/ – it’s definitely a useful starting point and will help you out in your future endeavours.

So the Key Git commands used in this module are

  • git add – stages files (new or changed) so they are included in the next commit
  • git commit – commits the staged changes as a new version in the repository

Other Key Git commands

  • git push – pushes new commits to the remote repository (this could be hosted on Github)
  • git pull – pulls changes from the remote repository (this could be hosted on Github)
  • git clone – clones the remote repository to your own machine (this could be hosted on Github)
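Put together, the flow that the module automates for you on every save is roughly:

# Stage the saved file and record the change locally
git add .\Get-UptimeInfo.ps1
git commit -m 'Testing Save-CurrentISEFile Function'
# And when you're ready to sync with GitHub:
git push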

So that’s the key commands out of the way but why and when will we want to use them or in our case not think about using them.

The Functions used in this Module

 

For me, I'm a bit data-centric (aka a data hoarder) – I prefer to have too much data than not enough. So to cover this I wanted a way to auto-commit any changes to scripts and modules every time I saved them.

So this is where creating this module came in – and the functions contained within.

I have created 3 Core functions

  • Save-CurrentISEFile – saves the current file that is open in ISE, whether it has been previously saved or not
  • Save-AllNamedFiles – saves all files that have previously been saved
  • Save-AllUnnamedFiles – saves all files that have not been previously saved

And also 2 helper Functions

  • Request-YesOrNo (amended from the one included in SPPS – thanks to @Jpaarhuis)
  • Get-CustomCommitMessage – basic VB popup box for custom commit message

Now I must note that currently this is only compatible with v4 and above, though that can change if I get enough time and requests to do so – or you could always add this in with your own updates to the module.

So let’s look at the Process to be used with the following functions.

Imagine we are creating a script called Get-UptimeInfo – we could easily create this and then save it using the default handlers in ISE, however there are some issues that I've found:

  • The file path defaults to the last saved location – for example, if you are working on a script in C:\MyAwesomeScript then when you click Save it will save it there, and each time you reopen ISE it will default there – not ideal
  • I like things Centralised – that way I know where things are!

So to overcome this we put the following at the beginning of the script: #Script#Get-UptimeInfo# – this tells the Save-CurrentISEFile or Save-AllUnnamedFiles functions that we want to create a PS1 file called Get-UptimeInfo in the Scripts-WIP PSDrive location.

This would look like the below before running either Function
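In other words, the (still untitled) file starts with the tag on its first line – the body here is illustrative:

#Script#Get-UptimeInfo#
function Get-UptimeInfo {
    [CmdletBinding()]
    param()
    # Report the computer name and when it last booted
    Get-CimInstance -ClassName Win32_OperatingSystem |
        Select-Object -Property CSName, LastBootUpTime
}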

[Image: 011416_1355_MyWorkflowf2]

And then we can run either function in the Command Pane like any other function.

[Image: 011416_1355_MyWorkflowf3]

Oooh – look at that: the file has been saved and named Get-UptimeInfo – it is a ps1 file, and we are being prompted as to whether we want to add a custom commit message – so we'll click Yes and see what we get.

[Image: 011416_1355_MyWorkflowf4]

Here we get a popup box (it currently uses VB for this, but it works and is only a few lines) asking us to provide our commit message – I'll add the commit message "Testing PSISE_Addons Save-CurrentISEFile Function".

The result can be seen below – note that a Get-UptimeInfo.tests.ps1 file has been created as well – this is set by what you include in your profile, as suggested in the PSISE_Addons.psm1 file.

[Image: 011416_1355_MyWorkflowf5]

If we wanted to do the same with modules then it would be something like this: #Module#FindSystemInfo# – that would tell the Save-CurrentISEFile or Save-AllUnnamedFiles functions that we want to create a folder in the Modules-WIP PSDrive location called FindSystemInfo, and in there save a PSM1 file called FindSystemInfo, whilst also creating a Gallery-compliant psd1 file & a FindSystemInfo.tests.ps1 file containing some default Pester tests.

[Image: 011416_1355_MyWorkflowf6]

When we run the Save-CurrentISEFile function we get the same as before.

[Image: 011416_1355_MyWorkflowf7]

Again we will Click Yes here and in the next popup we will add the message “Adding New Module FindSystemInfo” and we can see this has happened below

[Image: 011416_1355_MyWorkflowf8]

But we can see here that there are 3 files added – a PSD1, a PSM1 and a tests.ps1 file have all been added to a new folder based on the module name FindSystemInfo – but we didn't specify these. That's because the functions Save-CurrentISEFile & Save-AllUnnamedFiles do the hard work for us, creating a psd1 file fully compliant with the PowerShell Gallery and also a default Pester test, as long as you have them specified in your profile. BONUS – I provide you sample versions of both of these with the module. How generous is that!

But the most important thing is being able to avoid calling the actual functions at all, instead using simple keyboard combinations – so as part of the ISE_Cew.psm1 file there is a sample section at the bottom to add into your PowerShell profiles – again, another easy freebie!

So you can now download this from the PowerShell Gallery using Install-Module ISE_Cew – so go and get it and give me some feedback via the GitHub repo – https://github.com/kilasuit/ISE_Cew/

ThePSGallery AutoBot – Some Issues I’ve Found.

Ok so If you didn’t already know then this happened

And although it has been interesting, it has also brought up some issues with the PowerShell Gallery (mainly data issues, which are one of my biggest bugbears in all things IT) – these include, but are not limited to:

  • Publish-Module is partially broken – it requires you to pass LicenseUri & ProjectUri when run; however, the issue is that this information doesn't make it to the Gallery pages, nor does it make it to the Gallery items when using Find-Module. This means that there is a seemingly large number of modules that don't include this *mandatory* information.

    There is a workaround, and this should be what you are doing anyway: ensure that this information is included in the PSD1 file for the module, as then it gets populated to the Gallery correctly. It is thanks to an impromptu chat with Doug Finke (@Dfinke) that this came out of the woodwork and was confirmed as being the resolution – so thanks Doug!

Also, I decided to confirm this by uploading 2 different modules, ThePSGallery-Working & ThePSGallery-Broken – conclusive results show that the only method to get the LicenseURI & ProjectURI to show in the Gallery (either via the website or via Find-Module) is to include them in the psd1 file.

[Image: PSgalleryissue]

So go and update your psd1 files to include this (see the fragment below), and please upvote this issue on UserVoice to either force this to be fixed or to drop the LicenseURI & ProjectURI parameters from Publish-Module – http://windowsserver.uservoice.com/forums/301869-powershell/suggestions/11439807-gallery-issue-licenseuri-projecturi-aren-t-add
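For reference, the relevant fragment of a module manifest (.psd1) looks like this – the URLs are placeholders for your own project:

PrivateData = @{
    PSData = @{
        # These are what the Gallery actually reads for the links
        LicenseUri = 'https://github.com/you/YourModule/blob/master/LICENSE'
        ProjectUri = 'https://github.com/you/YourModule'
    }
}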

  • Author details – again, this is massively broken due to the nature of the Gallery, with simple things like spelling mistakes in author names meaning there are 3 or 4 authors that are actually one person. See the Author Details file in the GitHub repo for ThePSGallery (https://github.com/kilasuit/ThePSGallery/) for more details, which was built using the below logic

 

Find-Module * | Select-Object Author | Sort-Object -Unique | Out-File Authors.txt

I would suggest that the Gallery also allows a profile to link to other social networks like Twitter, and exposes the Twitter handle (it would be great for ThePSGallery AutoBot to be able to include the author in the tweets, thus increasing visibility for those that submit work there).

I would also suggest that any Authors include an additional Hashtable for the PrivateData section that includes any additional Contact info – Like Twitter or Blog Urls etc and sets this as a default psd1 variable – Will be posting about this shortly.

  • Additional metadata – I for one would like the AutoBot to be able to tweet on 100, 1,000 or 10,000 downloads of a module to congratulate the authors. However, this isn't made available at the present time via Find-Module – it can be gotten via web-scraping methods, but that's not particularly resource friendly and is time-consuming too. I have raised this with the powers that be via UserVoice and you can upvote it via http://windowsserver.uservoice.com/forums/301869-powershell/suggestions/11279160-add-additonal-properties-to-powershell-gallery-ite
  • Lastly – Find-Module isn’t very User friendly for cmdlet, function, workflow or DSCResource searching if tags aren’t used.

    This is a bit simpler to get around, but the logic for doing so is rather hidden, as you would have to call Find-Module * and then pipe this to Select-Object -ExpandProperty Includes and then to Where-Object.

    So for SharePoint it may look like this, which isn't very graceful at all, but it does return the 2 modules that have SharePoint in function names – the problem being, what if they aren't functions but cmdlets?

    Find-Module * | Select-Object Name -ExpandProperty Includes | Where-Object {$_.Function -like '*SharePoint*'} | Select-Object Name

    Again there is a UserVoice Suggestion for this at http://windowsserver.uservoice.com/forums/301869-powershell/suggestions/11088855-find-module-needs-improvements-to-enable-better-di

Hopefully that’s a small insight into ThePSGallery AutoBot, and in a future blog post I will detail the inner workings of the AutoBot, including the actual script that runs it (not on GitHub as of yet).

2015 – Challenging but it’s only the beginning!


This is just a post of my own recent reflections on the events of 2015.

Each month in 2015 came with increasingly difficult obstacles to overcome, and for the first 6 months of the year a lot of those obstacles had built up over the previous year or two, predominantly centred around my children.

Yes Children – Plural.

At 19 I became a father to my now 6 year old son, and then at 23 I became a father to my now 2 year old daughter. So relatively early on in life I was blessed (and, as the last few years have shown, in some ways cursed) to become a parent so young.

But anyway, to set the scene entering 2015 – my daughter had been in care for the majority of her life (since she was around 3 months old) and by the end of January my son was back in care whilst going through court proceedings – and at this point it was not possible, let alone practical, for me to have them back – something that still really irritates the absolute bejesus out of me.

January – April was pretty much all over the place because of all the goings-on with the children and trying to get settled into my then relatively new job at The University of Manchester, which means I can’t really remember much of what happened in those months, in part due to the vast amounts of ridiculously unnecessary documents that come with the Family & Children’s Courts processes.

May – Now if I could delete 1 month (well, 1 week if I’m truly honest) then this would be the one. This was the month when the decisions on future contact with my children would be made, and although I fought and fought and fought throughout the whole 76 weeks that we were under court proceedings, it was essentially all in vain, as the ruling went completely against my arguments based on supposed “professional opinions”.

I have made it more than clear that the odds were against me because of a few basic facts:

  • I’m male – and regardless of the push for equal rights, men are still widely considered less capable when it comes to family
  • I was brought up to believe that you struggle and juggle – but you get through it in the end – and this was perceived as me being “less willing to accept help if needed”, which is utter bullshit!
  • Other than myself, the only males present in court were my barrister, the children’s lawyer and my ex-partner’s lawyer. The social worker, the lawyer for the Local Authority (not my local authority either), the Children’s Guardian & the Judge were all female.

But May was also the month that started the ball rolling for speaking at and attending #PSConfAsia – so it wasn’t all doom and gloom, although I didn’t commit until mid-June when I had the outcome from the court case. Needless to say, from that point onwards I made a conscious decision that I needed to really start the ball rolling towards a better, more flexible and more enjoyable future – so you could say that in June I made the decision that at some point in the following 6 months I would leave the University of Manchester in pursuit of something more fitting to what I want to be doing.

So part of this involves a move into self-employment – what could be a life-changing and seriously difficult time ahead – but it is something that I have thought about doing for almost 3 years now.

So that will be one big challenge of 2016 – however, that is only the beginning, as the first challenge is to find somewhere permanent to live. These last 2 months have been expensive, although comfortable, as I’ve spent most of the time in hotels. I dread to think how much this has cost me personally, with no real tangible gain from it at all.

2016 will see me continue the work that I started with the PowerShell User Groups here in the UK, and I am looking to massively expand this where possible. This is mainly because I love presenting and meeting the community, but also because there is, in my opinion, a massive gap in the skills base around real understanding of PowerShell, and this can be partially alleviated by increasing the number of User Groups across the UK. So I’ve already put it out there that if anyone thinks that they could co-organise, I will work with them to get these off the ground and running. I will also provide content and help get the community growing – the end goal is to be in a similar position to the SharePoint & SQL User Groups, where there is a decent local User Group community, and then we can look at localised PowerShell Saturdays at some point in 2017. Ambitious – but that is the way I am, and with the help of those out there that want to get these things off the ground we will achieve it. Plus, hopefully by this time next week I should have some good news about the future for these events – so hold on tight.

Also, 2016 is the year when I will really Start-Contributing to the wider community. I’ve been promising a PSISE_Addons module for about a month now, and the reason for the delay is that I keep adding more and more features to make it better – that, and I’m already refactoring its codebase. This will be one of the topics that I will be covering at the Manchester & London User Groups, and if I’ve hit it right then it should be a major help to all that use it. I’m not going to give much more away than that until it’s released (and blogged about, of course).

Also, 2016 will be the year that involves lots more presenting. As it stands I have already been accepted to speak at the PowerShell & DevOps Summit in Bellevue, WA, over my 26th birthday, so that will be an interesting and amazing event to attend – one I would have been looking to attend even if I hadn’t been selected to present, just because of the sheer number of the PowerShell community (and Product Group) that will be there.

I’m also waiting to hear back from at least another 7 events on whether I’ll be presenting at them – a variety of SharePoint, SQL & DevOps type events.

Then there is also #PSConfEU – which I am co-organising with Tobias Weltner, and this looks to be another fantastic event – we already have a great line-up of speakers and still a few slots to fill. Details will be posted in the next few days and I would urge you to register at www.psconf.eu as soon as you can.

Then late in the year I’ll be returning to Singapore for the follow-on #PSConfAsia event, and I can’t wait for that one either – hopefully there should be some good news in the upcoming weeks about this event too. So again, keep your eyes & ears open for updates.

That’s a brief overview of 2015 and an outlook on what is to come in 2016.

But one final thing to remember – there is always a story behind every person, and most of the time that story stays behind a firmly locked door. I’m happy to be open about mine, as being open about it all helps me remember that no matter how hard it’s been (and it’s been torture at times) I’ve got through it all and will continue to do so for years and years to come. One day the wrongs of 2015 will be corrected, but the journey there is longer than I had originally anticipated and forms a solid core of my plan for the next 5 years.

 

So as we enter 2016 – be happy you got through 2015 and look forward to the beginning of yet another journey. This one already looks and feels like it will be amazing, and the people that I meet along the way will be a fundamental part of that becoming a reality.

Published: 31/12/2015 17:35


Quick Win – Install WMF5 via PowerShell 1 Liner



** UPDATE 25/12/2015 ** Due to WMF5 install issues, the InstallWMF5.ps1 script has been removed from GitHub until the PowerShell Product Team re-release the WMF5 installers. Once they are re-released, I will re-release the InstallWMF5.ps1 script. **

This is a very, very quick post about installing WMF5 on Windows 8.1 or Server 2012 / Server 2012 R2 via a script I created.

And what better way than a simple 1-liner to grab and run the script:

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/InstallWMF5')

And if you want to look at the script, the direct link is http://bit.ly/InstallWMF5
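If you’d rather eyeball the script before executing it (a sensible habit with any download-and-run one-liner), the same thing can be done in two steps:

# Download the script content first so it can be reviewed before execution
$script = (New-Object Net.WebClient).DownloadString('http://bit.ly/InstallWMF5')
$script                   # review the contents
Invoke-Expression $script # then run it once happy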

Hope this is useful for you

PS – credit (again) goes to @lee_holmes for the idea, from the following tweet: https://twitter.com/lee_holmes/status/318799702869041153

 

PPS – there is an issue with running this on Windows 7 / Server 2008 R2 machines, due to the need for WMF4 to be installed first.

I am working on this but may not have this ready until the new year.

 

PPPS – you will see a theme here – I intend to build functions like this and release them as a full module in the PowerShell Gallery to help automate these tasks – so look out for more like this in future.

Published: 23/12/2015 17:45


My Workflow for using Git with GitHub, OneDrive, PowerShell & Exchange – optimising efficiency across devices. Pt 2

So this is the second of a series of blog posts about how my workflow with Git, GitHub, PowerShell & Exchange currently works.

In this series we will cover a number of different applications and a number of custom PowerShell Functions that will really help you to optimise efficiency across devices when these processes are put in place.

Where we left off in the last part, I had opened up the thoughts on the next stages of the process – the actual automation of the workloads – and in this post I will go into detail about the integration between GitHub & Exchange to create new Inbox folders and Inbox rules for any newly followed repos.

So I’ve just followed a bunch of new GitHub repos and now I want the Inbox rules & folders set up for me. Manually this is not that much of a task – but why do it manually when there is a way to automate it?

[Image: 121815_1235_MyWorkflowf1 – workflow diagram]

So to do this we need to query the GitHub API – utilising Invoke-WebRequest – to see all the watched repos, which is accessible for any user via the following URL; just replace kilasuit (my GitHub alias) with the username that you want to query: https://api.github.com/users/kilasuit/subscriptions

Now this is a paginated and non-authenticated URL, which means that it returns only 30 results at a time, and those results will only be public repos. So if you’re following more than 30 repos you will need to rerun the query with a small addition to get the complete output, and if you’re following private repos you will need to wait for a later post on how to handle those.

To do this we check the response to see if it includes a Link header. If it does, then we know that the user is watching over 30 public repos, and from this header we can also get the number of pages (batches of 30) that we need to iterate through to collect all of the user’s watched public repos.
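For context, the Link header that GitHub returns looks something like the below (the URLs and page numbers are illustrative) – the snippet that follows pulls the page count out of the rel="last" entry:

Link: <https://api.github.com/user/12345/subscriptions?page=2>; rel="next", <https://api.github.com/user/12345/subscriptions?page=4>; rel="last"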

Below we have a small snippet of the overall function that does the iteration of the pages (this could be tidied up – but for now it works):

$repos = @()

# Invoke-WebRequest gives us the response headers (needed for pagination);
# Invoke-RestMethod gives us the body already converted from JSON
$web   = Invoke-WebRequest -Uri "https://api.github.com/users/$githubuser/subscriptions"
$page1 = Invoke-RestMethod -Uri "https://api.github.com/users/$githubuser/subscriptions"

$page1 | ForEach-Object { $repos += $_.name }

if ($web.Headers.Keys.Contains('Link'))
{
    # Strip the rel="last" half of the Link header down to just its URL
    $LastLink = $web.Headers.Link.Split(',')[1].Replace('<','').Replace('>','').Replace(' ','').Replace('rel="last"','').Replace(';','')

    # The final character of that URL is the last page number
    # (note: this assumes a single-digit page count)
    [int]$last = $LastLink[$LastLink.Length - 1].ToString()

    $pages = 2..$last

    foreach ($page in $pages)
    {
        Invoke-RestMethod -Uri "https://api.github.com/users/$githubuser/subscriptions?page=$page" | ForEach-Object { $repos += $_.name }
    }
}

So as you can see above, we have queried the GitHub API and stored the result in a local variable. This then allows us to manipulate the data stored in the corresponding object and add all the watched repos (from this initial batch of up to 30 results) into an array – Invoke-RestMethod has already converted the JSON response for us, so we simply pipe its output to the ForEach-Object command.

After this we query the web object to see if there is a Link header. If we find one, we generate the needed array of pages and store it in another local variable called pages. We then loop through that array to fetch the remaining pages and do exactly the same as above to add their repo names into the repos array.

At this point we have the names of all of the watched repos in the repos variable, and it is the name property that we make use of for the rest of the function.

However, there are a lot more properties collected as part of the Invoke-WebRequest call, as can be seen below for the PowerShellEditorServices repo.

Please note I’ve excluded a fair number of properties in the below (using the *_url string) because they are almost useless unless you are going to do further queries.

[Image: 121815_1232_MyWorkflowf2 – repo properties returned for the PowerShellEditorServices repo]

As we can see, there are some interesting and useful details given here, including open_issues, forks, watchers, pushed_at (when the last local commit was pushed to the GitHub repo), updated_at (when the last commit was made – not when it was pushed), name & full_name.

So from the data that we can get programmatically for repos, we can really do quite a lot – especially with all the different url properties that are included, as shown below. Notice that the majority of them are the relevant API URLs – this makes building out wrapping functions that call Invoke-WebRequest against those API endpoints very much easier, so kudos to GitHub for making this sort of thing easier for developers.

[Image: 121815_1232_MyWorkflowf3 – the *_url properties on a repo object]

As it is, there is certainly a lot of flexibility that comes with using GitHub and PowerShell for automating smaller tasks like this.
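As a minimal sketch of what one of those wrappers could look like – the function name here is my own invention, and it assumes a public repo so needs no authentication:

# Hypothetical wrapper for the issues endpoint that each repo object advertises via issues_url
function Get-GitHubRepoIssues {
    param (
        [string]$Owner,
        [string]$Repo
    )
    Invoke-RestMethod -Uri "https://api.github.com/repos/$Owner/$Repo/issues"
}

# Example usage: Get-GitHubRepoIssues -Owner PowerShell -Repo PowerShellEditorServices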

The below is the “meat” of this function, where we connect to Exchange Online using a wrapper function, Connect-EXOSession (see my PoshFunctions repo for where this lives), to which all we need to pass is a PSCredential.
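For those that don’t want to dig into the repo just yet, the wrapper is essentially the standard Exchange Online implicit remoting pattern – this is a sketch only, see the repo for the real implementation:

function Connect-EXOSession {
    param (
        [PSCredential]$EXOCredential
    )
    # Create a remote session to Exchange Online and import its cmdlets locally
    $session = New-PSSession -Name EXOSession -ConfigurationName Microsoft.Exchange -ConnectionUri 'https://outlook.office365.com/powershell-liveid/' -Credential $EXOCredential -Authentication Basic -AllowRedirection
    Import-PSSession $session
}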

Once we have connected to the PSSession to Exchange Online, we gather all the current folders that live under the GitHub Stuff folder in my Inbox into another array variable. This means that we can then check whether a folder already exists for each repo; if it doesn’t, we create one, and then create a new Inbox rule to automatically move any emails that come from GitHub for that repo into the correct folder.

 

We then write a small amount of text to the screen to tell us which folders and rules have been set up – useful if, like me, you are following almost 50 repos already, a number that is only likely to increase over time.

Connect-EXOSession -EXOCredential $EXOCredential

# Names of the folders that already exist under the parent folder
$folders = Get-MailboxFolder $MailboxFolderParent -GetChildren | Select-Object -ExpandProperty Name

foreach ($repo in $repos) {
    if ($folders -notcontains $repo) {
        # Create the folder, then a rule that moves GitHub notification mail
        # (subjects are prefixed with "[repo-name]") into that folder
        New-MailboxFolder -Parent $MailboxFolderParent -Name $repo | Out-Null
        New-InboxRule -SubjectContainsWords "[$repo]" -MoveToFolder "$MailboxFolderParent\$repo" -Name $repo -Force | Out-Null
        Write-Output "Folder & rule for $repo have been created"
    }
}

Then, as you can see, we remove the PSSession, EXOSession, to clean up the PowerShell session, as the EXOSession would otherwise stay in memory unless Remove-PSSession is run.

Remove-PSSession -Name EXOSession

The full version of this function can be found in my PoshFunctions Repo on GitHub located at https://github.com/kilasuit/PoshFunctions
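As a hypothetical end-to-end invocation (the function and parameter names here are illustrative – check the repo for the real ones):

# Sync Inbox folders & rules with all of a user's watched GitHub repos
New-GitHubInboxSetup -GitHubUser 'kilasuit' -EXOCredential (Get-Credential) -MailboxFolderParent ':\Inbox\GitHub Stuff'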

Stay tuned for part 3, where I will go into the details of how and where to structure your GitHub repos to allow you to automatically check the status of all of them in 1 function. This will be a less intensive post, but it is still worth reading for the efficiency that can be gained from this method, and it will help you understand the more detailed aspects of the rest of my workflow.