2015 – Challenging but it’s only the beginning!


This is just a post with my own recent reflections on the events of 2015.

Each month in 2015 came with ever more difficult obstacles to overcome. For the first 6 months of the year, a lot of those obstacles had come about in the previous year or two, and most of them centred around my children.

Yes Children – Plural.

At 19 I became a father to my now 6 year old son, and then at 23 I became a father to my now 2 year old daughter. So relatively early on in life I was blessed (and, as of the last few years, in some ways cursed) to have become a parent so young.

But anyway, to set the scene entering 2015 – my daughter had been in care for the majority of her life (since she was around 3 months old), and by the end of January my son was back in care whilst going through court proceedings. At that point it was not possible for me to have them back, let alone practical – something that still really irritates the absolute bejeasus out of me.

January to April was pretty much all over the place because of all the goings-on with the children and trying to get settled into my then relatively new job at The University of Manchester. It meant that I can't really remember much of what happened in those months, due in no small part to the vast amounts of ridiculously unnecessary documents that form part of the Family & Children's Courts processes.

May – Now if I could delete 1 month (well, 1 week if I'm truly honest) then this would be the one. This was the month when the decisions on future contact with my children would be made, and although I fought and fought and fought throughout the whole 76 weeks that we were under court proceedings, it was essentially all in vain, as the ruling went completely against my arguments based on supposed "professional opinions".

I have made it more than clear that the odds were against me because of a few basic facts:

  • I'm male – and regardless of the push for same sex rights, men are still widely considered to be the less capable when it comes to family
  • I was brought up to believe that you struggle and juggle, but you get through it in the end – this was perceived as me being "less willing to accept help if needed", and that is utter bullshit!
  • Other than myself, the only males present in court were my barrister, the children's lawyer and my ex-partner's lawyer. The social worker, the lawyer for the Local Authority (not my Local Authority either), the Children's Guardian and the Judge were all female.

But May was also the month that started the ball rolling for speaking at and attending #PSConfAsia – so it wasn't all doom and gloom, although I didn't commit until mid-June when I had the outcome from the court case. Needless to say, from that point onwards I made a conscious decision that I needed to really start the ball rolling for a better, more flexible and more enjoyable future. So you could say that in June I made the decision that I would, at some point in the following 6 months, leave the University of Manchester in pursuit of something more fitting to what I want to be doing.

Part of this involves making what could be a life-changing and seriously difficult move into self-employment, but it is something that I have thought about doing for almost 3 years now.

So that will be one big challenge of 2016. However, that is only the beginning, as the first challenge is to find somewhere permanent to live. These last 2 months have been expensive, although comfortable, as I've spent most of the time in hotels. I dread to think how much this has cost me personally, with no real tangible gain from it at all.

2016 will see me continue the work that I started with the PowerShell User Groups here in the UK, and I am looking to massively expand this where possible. This is mainly because I love presenting and meeting the community, but also because there is, in my opinion, a massive gap in the skills base around real understanding of PowerShell, and this can be partially alleviated by increasing the number of User Groups across the UK. So I've already put it out there that if anyone thinks that they could co-organise, then I will work with them to get these off the ground and running. I will also provide content to them and help get the community growing.

The end goal is to be in a similar position to the SharePoint & SQL User Groups, where there is a decent local User Group community, and then we can look at localised PowerShell Saturdays at some point in 2017. Ambitious – but that is the way I am, and with the help of those out there that want to get these things off the ground, we will achieve it. Plus, hopefully by this time next week I should have some good news about the future for these events – so hold on tight.

Also, 2016 is the year when I will really Start-Contributing to the wider community. I've been promising a PSISE_Addons module for about a month now, and the reason for the delay is that I keep adding more and more features to make it better – that, and I'm actually already refactoring the codebase for it. This will be one of the topics that I will be covering at the Manchester & London User Groups, and I'm hoping that if I've hit it right then it should be a major help to all that use it. I'm not going to give much more away than that until it's released (and blogged about, of course).

Also, 2016 will be the year that involves lots more presenting. As it stands, I have already been accepted to present at the PowerShell & DevOps Summit in Bellevue, WA over my 26th birthday, so that will be an interesting and amazing event to attend – one I would have been looking to attend even if I hadn't been selected to present, just because of the sheer number of the PowerShell Community (and Product Group) that will be there.

I'm also waiting to hear back from at least another 7 events on whether I'll be presenting at them – a variety of SharePoint, SQL & DevOps type events.

Then there is also #PSConfEU, which I am co-organising with Tobias Weltner, and this looks to be another fantastic event – we already have a great line-up of speakers and still a few slots to fill. Details will be posted in the next few days, and I would urge you to register at www.psconf.eu as soon as you can.

Then late in the year I'll be returning to Singapore for the follow-on #PSConfAsia event. I can't wait for that one either, and hopefully there should be some good news in the upcoming weeks about this event too. So again, keep your eyes & ears open for updates.

That's a brief overview of 2015 and an outlook on what is to come in 2016.

But one final thing to remember – there is always a story behind every person, and most of the time that story stays behind a firmly locked door. I'm happy to be open about mine, as being open about it all helps me remember that no matter how hard it's been (and it's been torture at times) I've got through it all and will continue to do so for years and years to come. One day the wrongs of 2015 will be corrected, but the journey there is longer than I had originally anticipated and forms a solid core of my plan for the next 5 years.


So as we enter 2016 – be happy you got through 2015 and look forward to the beginning of yet another journey. This one already looks and feels like it will be amazing and the people that I meet along the way will be a fundamental core to that becoming a reality.

Published: 31/12/2015 17:35

Quick Win – Install WMF5 via PowerShell 1 Liner



** UPDATE 25/12/2015 ** Due to WMF5 install issues, the InstallWMF5.ps1 script has been removed from GitHub until the PowerShell Product Team re-release the WMF5 installers. Once re-released, I will re-release the InstallWMF5.ps1 script. **

This is a very, very quick post about installing WMF5 on Windows 8.1 or Server 2012 / Server 2012 R2 via a function script I created.

And what better way than a simple one-liner to grab and run the script:

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/InstallWMF5')

And if you want to look at the Script then direct link is http://bit.ly/InstallWMF5

Hope this is useful for you

PS credit (again) goes to @lee_holmes for the idea, from the following tweet: https://twitter.com/lee_holmes/status/318799702869041153


PPS – There is an issue with running this on Windows 7 / Server 2008 R2 machines, because WMF4 needs to be installed first.

I am working on this but may not have this ready until the new year.
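In the meantime, a minimal pre-flight sketch (the version gate below is my assumption based on the WMF4 prerequisite mentioned above, not part of the original script):

```powershell
# Hedged sketch: warn if WMF4 (PowerShell 4.0) isn't present before trying the WMF5 package.
# The Major -lt 4 threshold is an assumption based on the WMF4 prerequisite.
$psv = $PSVersionTable.PSVersion
if ($psv.Major -lt 4) {
    Write-Warning "Install WMF4 first on Windows 7 / Server 2008 R2 (current PowerShell version: $psv)"
}
else {
    "PowerShell $psv detected - OK to attempt the WMF5 install"
}
```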


PPPS – you will see a theme here – I am intending to build functions like this and release them as a full module in the PowerShell Gallery to help automate these tasks – so look out for more like this in the future.

Published: 23/12/2015 17:45

My Workflow for using Git with GitHub, OneDrive, PowerShell & Exchange – optimising efficiency across devices. Pt 2

So this is the Second of a series of blog posts about how my workflow with Git, GitHub, PowerShell & Exchange currently works.

In this series we will cover a number of different applications and a number of custom PowerShell Functions that will really help you to optimise efficiency across devices when these processes are put in place.

As we left off in the last part, I had opened up thoughts on the next stages of the process – the actual automation of the workloads. In this post I will go into detail about the integration between GitHub & Exchange to create new Inbox folders and Inbox rules for any newly followed repos.

So I've just followed a bunch of new GitHub repos and now I want the Inbox rules & folders set up for me. Manually this is not that much of a task – but why do it manually when there is a way to automate it?

So to do this we need to query the GitHub API – we will be utilising Invoke-WebRequest for this – to see all the watched repos, which are accessible for any user via the following URL. Just replace kilasuit (my GitHub alias) with the username that you want to query – https://api.github.com/users/kilasuit/subscriptions

Now this is a paginated and non-authenticated URL, which means that it returns only 30 results at a time and these 30 results will only be public repos. So if you're following more than 30 repos then you will need to rerun the query with a small addition to get the correct outputs; however, if you're following private repos then you will need to wait for a later post on how to accomplish this.
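The "small addition" is just a query-string parameter. As a side note, the GitHub API also accepts a per_page parameter (up to 100), which cuts down the number of requests needed. A minimal sketch of building the paged URL (the username here is just an example):

```powershell
# Build the paginated subscriptions URL; page and per_page are plain query-string parameters.
$githubuser = 'kilasuit'   # example username - swap in your own
$page = 2
$uri = "https://api.github.com/users/$githubuser/subscriptions?page=$page&per_page=100"
$uri
```

Each unauthenticated call still only returns public repos, exactly as described above.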

To do this we will check the response to see if there is a Link header included – if there is, then we know that the user is watching over 30 public repos, and from this header we will also get the number of pages (batches of 30) that we need to iterate through to get all of the user's watched public repos.

Below is a small snippet of the overall function that does the iteration of the pages (this could be tidied up – but for now it works):

$repos = @()

$web = Invoke-WebRequest -Uri "https://api.github.com/users/$githubuser/subscriptions"

$page1 = Invoke-RestMethod -Uri "https://api.github.com/users/$githubuser/subscriptions"

$page1 | ForEach-Object { $repos += $_.name }

if ($web.Headers.Keys.Contains('Link'))
{
    $LastLink = $web.Headers.Link.Split(',')[1].Replace('<','').Replace('>','').Replace(' ','').Replace('rel="last"','').Replace(';','')

    [int]$last = $($LastLink[($LastLink.ToCharArray().Count - 1)]).ToString()

    $pages = 2..$last

    foreach ($page in $pages)
    {
        Invoke-RestMethod -Uri "https://api.github.com/users/$githubuser/subscriptions?page=$page" | ForEach-Object { $repos += $_.name }
    }
}
So as you can see above, we have queried the GitHub API and stored the result in a local variable. This then allows us to manipulate the data stored in the corresponding object and add all the watched repos (from the initial 30 results) into an array – Invoke-RestMethod handles the conversion from JSON for us, and as you can see we pipe its output to the ForEach-Object command.

After this we then query the Web Object to see if there is a Link header. If we find that there is, then we generate the needed array of pages and store it in another local variable called pages. We then loop through the array of pages to get all the repos, doing exactly the same as above to add them into the repos array.
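The string-replace chain in the snippet works, but as its comment says it could be tidied. A hedged alternative sketch that pulls the last page number out of the Link header with a single regex (the header value below mirrors the rel="next" / rel="last" format GitHub returns for this endpoint):

```powershell
# Example Link header value for a user with 4 pages' worth of watched public repos
$linkHeader = '<https://api.github.com/users/kilasuit/subscriptions?page=2>; rel="next", <https://api.github.com/users/kilasuit/subscriptions?page=4>; rel="last"'

# Capture the page number attached to rel="last" in one step
if ($linkHeader -match 'page=(\d+)>; rel="last"') {
    [int]$last = $Matches[1]
}
$last  # 4
```

This also copes with double-digit page counts, which the single-character indexing in the original snippet would not.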

At this point we then have the name of all of the watched repos in the repos variable and we are making use of the name property for the rest of the function.

However, there are a lot more properties that have been collected as part of the Invoke-WebRequest call as can be seen below for the PowerShellEditorServices Repo.

Please note I've excluded a fair number of properties below (those matching the *_url string), because they are almost useless unless you are going to do further queries.

As we can see there are some interesting and useful details given here including open_issues, forks, watchers, pushed_at (when the last local commit was pushed to the GitHub Repo), updated_at (when the last commit was made – not when it was pushed), name & full_name.

So from the data that we can get programmatically for repos, we can really do quite a lot – especially with all the different URL options that are included, as shown below. Notice that the majority of them are the relevant API URLs – this makes building out wrapping functions that call Invoke-WebRequest for those API endpoints much easier. So kudos to GitHub for making things easier for developers to do this sort of thing.
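One wrinkle worth noting when building those wrappers (my observation, not covered above): several of the *_url properties are URI templates rather than plain URLs, so a wrapper needs to strip the template segment before calling the endpoint. A minimal sketch:

```powershell
# issues_url as returned by the GitHub API is a URI template, e.g.:
$issuesUrl = 'https://api.github.com/repos/PowerShell/PowerShellEditorServices/issues{/number}'

# Strip the {/number} template segment to leave an endpoint Invoke-RestMethod can call directly
$endpoint = $issuesUrl -replace '\{/number\}$', ''
$endpoint  # https://api.github.com/repos/PowerShell/PowerShellEditorServices/issues
```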

As it is there is certainly a lot of flexibility that comes with using GitHub and PowerShell for some automation of smaller tasks like this.

Below is the "meat" of this function, where we connect to Exchange Online using a wrapper function, Connect-EXOSession (see my PoshFunctions repo for where this lives), to which all we need to pass is a PSCredential.

Once we have connected to the PSSession to Exchange Online, we gather all the current folders that live in the GitHub Stuff folder in my Inbox into another array variable. This means that we can then compare whether there is already a folder for a repo or not; if the folder doesn't exist, we create one and then create a new Inbox rule for it to automatically move any emails that come from GitHub to the correct folder for that repo.


We then write a small amount of text to the screen to say which folders and rules have been set up – useful if, like me, you are following almost 50 repos already, and this is likely to increase over time as well.

Connect-EXOSession -EXOCredential $EXOCredential

$folders = Get-MailboxFolder $MailboxFolderParent -GetChildren | Select-Object -ExpandProperty Name

foreach ($repo in $repos) {

    if ($folders -notcontains $repo)

        { New-MailboxFolder -Parent $MailboxFolderParent -Name $repo | Out-Null

          New-InboxRule -SubjectContainsWords "[$repo]" -MoveToFolder "$MailboxFolderParent\$repo" -Name $repo -Force | Out-Null

          Write-Output "Folder & rule for $repo have been created"
        }
}
Then, as you can see, we remove the EXOSession PSSession to clean up the PowerShell session, as the EXOSession will stay in memory unless Remove-PSSession is run.

Remove-PSSession -Name EXOSession

The full version of this function can be found in my PoshFunctions Repo on GitHub located at https://github.com/kilasuit/PoshFunctions

Stay tuned for part 3, where I will go into the details of how and where to structure your GitHub repos so that you can automatically check the status of all of them in one function. This will be a less intensive post, but it is still useful to read for the efficiency that can be gained from using this method, and it will help you understand the more detailed aspects of the rest of my workflow.

My Workflow for using Git with GitHub, OneDrive, PowerShell & Exchange – optimising efficiency across devices. Pt 1

So this is the First of a series of blog posts about how my workflow with Git, GitHub, PowerShell & Exchange currently works.

In this series we will cover a number of different applications and a number of custom PowerShell Functions that will really help you to optimise efficiency across devices when these processes are put in place.

Ok so I am going to share with you my current end to end workflow for ensuring that I commit myself to making use of Source Control for all of my Scripts (and other files when required).

So imagine this scenario – I have come across a public GitHub Repo (I’ll be using the recently published PowerShellEditorServices Repo for this) which I feel would benefit me in either my current or future workloads. So at this point I will decide to make sure to “watch” the repo as shown below.

Notice that the default is to not be watching the repo – I will click Watching, and that will add this repo to the list of other repos that I am watching. Now you may be thinking at this point, "Ok, seems reasonable enough", but this is where it becomes interesting – well, at least it does for me and my workflow – and that's why you're reading, right?

So now when I access GitHub it will show me, in my Watching section (as seen below), any new Issues or Pull Requests raised in any watched repos, and by default you also get an email notification. This is the important bit, as until GitHub release a mobile client this is important for me – I prefer official apps from a vendor to a community-developed app, especially ones done by a single developer (but that's just me).

Ok, so at this point we should have a hunch as to where the rest of this series may be going.

I make no secret of the fact that I run my own Office 365 tenant, which at under £4 per month gives me SharePoint, Exchange, Azure Active Directory & Skype for Business. This allows me to focus on using the Microsoft ecosystem that I have to hand, and I can then use it for much more than simple use cases.

However, at this point I will warn that the next stage will require you to either have your own Office 365 tenant (and therefore Exchange Online) or be an Exchange Administrator, as the PowerShell scripts I've currently built run against Exchange Online (though I will be looking to port this to be usable for those that use Outlook and aren't Exchange Admins).

So first off, from this point we have to create a new folder & Inbox rule in my Exchange mailbox – but we don't really want to have to add any information in for it manually now, do we? I mean, that's why we are automating the process to the nth degree.

Stay tuned for part 2, where I will go into the details of how to query the GitHub API and then create new Inbox folders & rules based on that, and then start to show you the rest of my workflow.