All posts by Ryan Yates

Fooled Ya! Today I became an MVP!

 

Well only if you read this post

[Image: MVP2016]

It is an exceptional honour to have been awarded the MVP for Cloud and DataCentre Management, and to me it kinda feels like an early birthday present from Microsoft (my birthday is on Monday).

This isn’t something that I ever expected to achieve; however, it is recognition from Microsoft themselves of the work that I have previously done for the community.

I started down the community path only last year, and in that time I have made some amazing friends and met a number of other MVPs along the way.

For the remainder of 2016 I have a lot planned to help further enhance the community and hopefully break down some of the barriers between the IT Pro world and the development world that PowerShell finds itself right in the middle of, making this technology more accessible to all that need to use it.

With that in mind over the next few months there will be some further announcements about Get-PSUGUK – the UK PowerShell Community and its evolution.

Thanks to the friends I’ve made in the MVP community, it has been agreed with Chris Testa-O’Neill (Data Platform MVP) that there will be a dedicated PowerShell track at SQL Saturday Manchester this year. This will consist mainly of introductory sessions for those with little or no PowerShell experience, but there will also be some sessions on using PowerShell with a SQL focus. This is an amazing FREE event, and it is as much an honour for me to be working on it as it is to receive the MVP Award – so if you’re interested in attending, check out http://www.sqlsaturday.com/543 – session announcements will follow in the coming months.

Stay tuned for more details in future and as always – Keep Learning, Networking & Finishing what you Start.

 

Now for the weekend of celebrations to begin 🙂

Thank you Microsoft and an even bigger thanks to you – the people reading this Post, keep doing what you do and helping make the community as great as it is.

Invoking PSScriptAnalyzer in Pester Tests for each Rule

This is a quick walkthrough on how you can get output from PSScriptAnalyzer rules in your Pester tests.

So you’ll need

  • Pester ( Version 3.4.0 or above )
  • PSScriptAnalyzer ( Version 1.4.0 or above )

Please note this is shown running on PowerShell v5 as part of Windows 10 Build 14295 – results may vary on other PowerShell versions.

Given the way we want to work, we may well have new ScriptAnalyzer rules to deal with in the near future (a new version, additional community contributions, your own custom ScriptAnalyzer rules, etc.), and we want to ensure that we test against all of them without having to change much of the code below – so we get the rule list dynamically and loop through it within our Context block.

 

So our example code in our Pester Test would look like this

#Script#PesterScriptAnalzyerExample#

$Here = Split-Path -Parent $MyInvocation.MyCommand.Path
$Scripts = Get-ChildItem "$Here\" -Filter '*.ps1' -Recurse | Where-Object { $_.Name -notmatch 'Tests.ps1' }
$Modules = Get-ChildItem "$Here\" -Filter '*.psm1' -Recurse
$Rules = Get-ScriptAnalyzerRule

Describe 'Testing all Modules in this Repo to be correctly formatted' {
    foreach ($module in $Modules) {
        Context "Testing Module – $($module.BaseName) for Standard Processing" {
            foreach ($rule in $Rules) {
                It "passes the PSScriptAnalyzer Rule $rule" {
                    (Invoke-ScriptAnalyzer -Path $module.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                }
            }
        }
    }
}

Describe 'Testing all Scripts in this Repo to be correctly formatted' {
    foreach ($script in $Scripts) {
        Context "Testing Script – $($script.BaseName) for Standard Processing" {
            foreach ($rule in $Rules) {
                It "passes the PSScriptAnalyzer Rule $rule" {
                    (Invoke-ScriptAnalyzer -Path $script.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                }
            }
        }
    }
}

And the result that we would get would be the below

 

PS C:\Users\Ryan\OneDrive\GitHub\kilasuit\Scripts-WIP\PesterScriptAnalzyerExample> .\PesterScriptAnalzyerExample.ps1

Describing Testing all Modules in this Repo to be correctly formatted

Describing Testing all Scripts in this Repo to be correctly formatted

   Context Testing Script – PesterScriptAnalzyerExample for Standard Processing

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingCmdletAliases 233ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidDefaultValueSwitchParameter 124ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingEmptyCatchBlock 134ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidGlobalVars 87ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidInvokingEmptyMembers 104ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidNullOrEmptyHelpMessageAttribute 70ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingPositionalParameters 879ms

    [+] passes the PSScriptAnalyzer Rule PSReservedCmdletChar 75ms

    [+] passes the PSScriptAnalyzer Rule PSReservedParams 81ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidShouldContinueWithoutForce 85ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingDeprecatedManifestFields 117ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidDefaultValueForMandatoryParameter 123ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingUserNameAndPassWordParams 95ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingComputerNameHardcoded 113ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingConvertToSecureStringWithPlainText 98ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingInvokeExpression 75ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingPlainTextForPassword 103ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingWMICmdlet 138ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingWriteHost 91ms

    [+] passes the PSScriptAnalyzer Rule PSMisleadingBacktick 100ms

    [+] passes the PSScriptAnalyzer Rule PSUseBOMForUnicodeEncodedFile 100ms

    [+] passes the PSScriptAnalyzer Rule PSUseToExportFieldsInManifest 87ms

    [+] passes the PSScriptAnalyzer Rule PSUseOutputTypeCorrectly 128ms

    [+] passes the PSScriptAnalyzer Rule PSMissingModuleManifestField 84ms

    [+] passes the PSScriptAnalyzer Rule PSPossibleIncorrectComparisonWithNull 99ms

    [+] passes the PSScriptAnalyzer Rule PSProvideCommentHelp 98ms

    [+] passes the PSScriptAnalyzer Rule PSUseApprovedVerbs 75ms

    [+] passes the PSScriptAnalyzer Rule PSUseCmdletCorrectly 867ms

    [+] passes the PSScriptAnalyzer Rule PSUseDeclaredVarsMoreThanAssigments 82ms

    [+] passes the PSScriptAnalyzer Rule PSUsePSCredentialType 91ms

    [+] passes the PSScriptAnalyzer Rule PSShouldProcess 160ms

    [+] passes the PSScriptAnalyzer Rule PSUseShouldProcessForStateChangingFunctions 86ms

    [+] passes the PSScriptAnalyzer Rule PSUseSingularNouns 177ms

    [+] passes the PSScriptAnalyzer Rule PSUseUTF8EncodingForHelpFile 176ms

    [+] passes the PSScriptAnalyzer Rule PSDSCDscTestsPresent 98ms

    [+] passes the PSScriptAnalyzer Rule PSDSCDscExamplesPresent 102ms

    [+] passes the PSScriptAnalyzer Rule PSDSCUseVerboseMessageInDSCResource 81ms

    [+] passes the PSScriptAnalyzer Rule PSDSCUseIdenticalMandatoryParametersForDSC 110ms

    [+] passes the PSScriptAnalyzer Rule PSDSCUseIdenticalParametersForDSC 74ms

    [+] passes the PSScriptAnalyzer Rule PSDSCStandardDSCFunctionsInResource 122ms

    [+] passes the PSScriptAnalyzer Rule PSDSCReturnCorrectTypesForDSCFunctions 101ms

 

This lets you see from your test output exactly which rule fails, and as shown it can be used for both scripts and modules.

It is also a good example of getting Pester to test your Pester tests 😉

This example is being added into ISE_Cew (see post) in the next feature release (at some point next week), though you can also just copy and paste it from this blog post, thanks to a PowerShell ISE add-on called CopytoHtml by Gary Lapointe – you can find out more about it and download it at http://blog.falchionconsulting.com/index.php/2012/10/Windows-PowerShell-V3-ISE-Copy-As-HTML-Add-On/

 

Please note that although the above works fine, I don’t see the point in running a Describe block if the tests inside it won’t run, so below I’m adding what I think is the better version – this will only run the Describe blocks if there are any scripts or modules to test.

#Script#PesterScriptAnalzyerExample#

$Here = Split-Path -Parent $MyInvocation.MyCommand.Path
$Scripts = Get-ChildItem "$Here\" -Filter '*.ps1' -Recurse | Where-Object { $_.Name -notmatch 'Tests.ps1' }
$Modules = Get-ChildItem "$Here\" -Filter '*.psm1' -Recurse
$Rules = Get-ScriptAnalyzerRule

if ($Modules.Count -gt 0) {
    Describe 'Testing all Modules in this Repo to be correctly formatted' {
        foreach ($module in $Modules) {
            Context "Testing Module – $($module.BaseName) for Standard Processing" {
                foreach ($rule in $Rules) {
                    It "passes the PSScriptAnalyzer Rule $rule" {
                        (Invoke-ScriptAnalyzer -Path $module.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                    }
                }
            }
        }
    }
}

if ($Scripts.Count -gt 0) {
    Describe 'Testing all Scripts in this Repo to be correctly formatted' {
        foreach ($script in $Scripts) {
            Context "Testing Script – $($script.BaseName) for Standard Processing" {
                foreach ($rule in $Rules) {
                    It "passes the PSScriptAnalyzer Rule $rule" {
                        (Invoke-ScriptAnalyzer -Path $script.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                    }
                }
            }
        }
    }
}

C’Ya Manchester – Hello Derby!!

After yet more changes in my housing situation (see my previous post for a bit of background), I have decided to settle in Derby, and this is for a few very good reasons:

  • I’ve got a good group of friends here.
  • Manchester, London, Nottingham, Birmingham & Wolverhampton are all short train journeys away and I’m expecting to spend more time between them in the coming months ahead.
  • I almost moved here back in January, but I decided to try and give Manchester another go from the work perspective, and that seemingly wasn’t to be

However I made a good group of friends in Manchester at the various events I’ve been to there over the last 2 years (more so the last year – see this page for more details) and I’ll still be trying to attend some of the amazing events there when it is feasible.

This is yet another new chapter in the book of my life and it is one that really should allow me to lay the foundations for the future a bit easier.

3 & half Months into 2016 and it feels like its the beginning of yet another new year.

Pulling the Community Together to Improve the Quality of PowerShell Modules

In a discussion that started on Twitter a while back with June Blender about the quality of the modules being posted to the PowerShell Gallery, I had an idea for a way that we, as a community, could help improve this – using the tools that we have available to us and, more importantly, the expertise of the rest of the community to help shape and guide the direction of modules.

The idea starts with a simple GitHub organisation, in this case PowerShellModules, in which we set up a number of teams. Some of the teams that I have in mind include, but are not limited to, Documentation, Unit Testing, Practice Guidance and Module Maintainers.

This means there will be a much more open way of directing the development of modules for the community, while still allowing others to add to a module instead of branching out and creating a second one. It also means there is a stronger focus on modules being kept up to date, as the overhead isn’t on a single maintainer but can be shared between multiple maintainers at any given time.

 

My thoughts around this would start with me porting my current ‘in progress’ modules to the organisation and building out the teams as mentioned above with a suggestions/RFC like repository that would allow us to drive this for the better of the community.

The end goal from my perspective would be to have one community-recommended module per technology (examples being GitHub, Trello, VSTS, Slack etc.) that has been widely community-developed and covers as much of the functionality as it can without the need for a separate module.

We have a great community of people and I think that this is the right point in time to start the drive to improve the quality of what we all output to the community in a targeted and effort efficient method.

If you’re interested in getting involved, please comment on this post with your GitHub username and I’ll get you added to the organisation in due time – but please keep an eye out for the email from GitHub requesting you to join the organisation, especially in your Junk folder 😉

The Power of the Humble Pint and the Community when things are difficult!

Disclaimer: This isn’t a fun post (to read or to write), nor is it a technical post – it is a reflection on the last few years and is by its very nature quite a personal post. I expect that there will be some kickback about this post in future, and I would humbly ask that you try to imagine yourself having been in my shoes, both at the time these events happened and at the time of writing this post.

If this isn’t the sort of thing that you wish to read then I would advise that you don’t carry on from this point and find something else more fitting to your taste.

 

Sometimes I have those moments where my thoughts can be expressed along the lines of “Fuck this I’m getting out of IT” or in other cases it can be put forward even simpler “Fuck it – I’m Done with all of this”

Now I know that likely reads in a way that is different from the actual truth behind it, but this is the sort of emotion that we in IT (and in adult life generally) tend to be expected to sweep under the carpet as if it doesn’t actually exist. For me this is no longer acceptable, and I will not hide my emotions away like this, especially when I am struggling to deal with it all. I, like many others in similar situations, have to find my own ways of coping with the daily struggles and the tipping points that we each have.

My tipping points have always been centred around home-life stability – something that most people take for granted, however for me this has been something that has been in almost constant flux since I was 16. I’ve had some good spells where things have been stable for almost 2 full years but I’ve also had the other extreme where things have been extremely volatile for a time which could typically be anywhere up to 6 months or so.

This being an area that I’m frequently reminded of, I decided, in the spirit of writing this post, to do some data digging and work out a few things about my home life that could be considered statistically interesting – or at least they were to me at the time. It is something I have been thinking of doing for quite a while now, so when better to do it than when I want to write about it.

So I’ve only worked out places I’ve lived since (just before) I turned 16 (I’m 26 in 2 weeks and think that 10 years is a good measure point) and I haven’t split all hotels out into separate rows in my Spreadsheet – if I did that then I feel that it would skew the data but for my own amusement I will likely do this at some point in future.

However, the crunch really comes down to the numbers, and I have found the following facts – some of which shock me now that I look at them:

  • I’ve lived in roughly 24 different places in 10 years – this includes adding spells of stays in hotels, hostels and even a spell of time spent in my car/s
    • I’ve rented 11 places – these I had actual tenancy agreements in place
    • I’ve lived in 8 different forms of shared housing – this doesn’t include any spells of living with family or those that would be classified as family
    • I’ve lived in 4 places that were the direct result of being made homeless – although this is technically skewed as the only time this happened was when my ex partner, son and I were all made homeless – Other than this I have received NO HELP from any UK Council Housing department as I am seen as “Not in Priority Need” according to their data charts
    • I’ve spent 4 different spells where I’ve basically lived in hotels prior to this weekend (as it will be spell number 5 now)

The duration of the average spell is roughly as follows

    • Days per place = 152
    • Weeks per place = 21
    • Months per place = 5
    • Years per place = 0.42

 

The Longest spell is as follows

    • Days = 663
    • Weeks = 94
    • Months  = 21
    • Years  = 1.82

 

So where does this fit in with the Title of this blog post?

Well, I suppose it really all started for me in September 2013, when it became the right time to move away from everyone I knew and start afresh closer to work – and part of that was working out how to meet new people. Thankfully I learned of the SharePoint User Group, of which I’ve become a regular attendee, as well as a few other groups that have Meetup pages, including LeanAgile and DigiCurry to name a few.

This was the type of surrounding where I realised I was comfortable meeting new people, and I strongly feel that through these groups (and the resulting conferences I’ve attended) I’ve made some amazing friends along the way. At some point I came to the conclusion that I would most likely feel just as comfortable on the presenting side of the groups, not just as an attendee, and that started me off on the journey as stated in this post and followed up in this post, this post and this post.

However, it’s not all been fantastic throughout the years, but having found various communities that I enjoy attending, I have somehow managed to scrape through all the difficult moments and made it this far. It is now getting to the point where the number of “Fuck this I’m getting out of IT” or “Fuck it – I’m done with all of this” days is getting somewhat out of balance with what I’m able to maintain. A key part of this is again my current housing situation, aka a downright mess.

With this in mind I suppose what I’m getting at is that without the communities that I’ve become a part of I’m not sure I would be able to write this post.

Also, over the last few weeks I’ve been asked about the whys and hows of how I manage to juggle it all. The short and simple answer is that all these communities are essentially an extended family to me, and with this I feel that at times I want to see and help direct the communities’ growth for the benefit of the community members, much akin to how a parent would do the same for their children – something that I currently cannot do for my own children.

As I see the communities as extended family I plan things around “the next event” which has been a major driving force keeping me going over the last year.

So with all that in mind, there may be times ahead where I’m struggling, especially with the core items in life, but that won’t ever stop me from continuing the work I put into the community where it is still feasibly possible. There may be times where I unfortunately need to let down conferences/user groups that I’ve promised my time to speak at, but as things stand this is an unfortunate by-product of the situation I currently find myself in, and if I can in any way mitigate having to do so then I will. Currently, though, it is looking like I may have to cancel everything that I have planned, which is frustrating and infuriating, especially when I’ve been looking forward to the events I’ve got planned over the coming weeks/months.

 

2016 was supposed to be the beginning of an amazing year where things fell into place; however, 3 (almost 4) months in, it’s feeling like 2013, 2014 & 2015 all over again.

 

It’s 4:12 and I really shouldn’t hit publish on this article, but I feel that it needs doing – yes, I’m 25 and yes, I suffer from depression, but that hasn’t stopped me achieving a hell of a lot in the last few years and I don’t expect that to change. I can and will get through it, no matter how difficult the roads ahead are for me.

Updated! Quick Win – Install PowerShell Package Management on systems running PowerShell v3 / v4

**Update 9th March 2016: The PowerShell Team released an updated version of the PackageManagement modules today. I’ve updated the script accordingly; it will install the latest PackageManagement modules for you, with a little verbose output.

Updated Microsoft blog is at https://blogs.msdn.microsoft.com/powershell/2016/03/08/package-management-preview-march-2016-for-powershell-4-3-is-now-available/ **

 

This is a very, very quick post about the latest feature being made available downlevel from PowerShell v5.

Microsoft have released PackageManagement (formerly OneGet), which is now available for PowerShell v3 & v4, as detailed in this link: http://blogs.msdn.com/b/powershell/archive/2015/10/09/package-management-preview-for-powershell-4-amp-3-is-now-available.aspx

That’s right – the ability to pull directly from the PowerShell Gallery – but you need to install the PackageManagement release, which I’ve scripted for you here.

And what better way than a simple 1 liner to grab and run the script

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/PSPackManInstaller')

And if you want to look at the Script then direct link is http://bit.ly/PSPackManInstaller – this takes you to the RAW version of the file on Github so will not download or execute – but will allow you to read it
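Once the preview is installed, the Gallery cmdlets should be available to use straight away; a quick hedged example (the module name is just an illustration):

# Confirm the modules that the preview installs are available
Get-Module -ListAvailable PowerShellGet, PackageManagement

# Example: find and install a module from the PowerShell Gallery
Find-Module -Name Pester | Install-Module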

Hope this is useful for you

PS credit goes to @lee_holmes for the idea from following tweet https://twitter.com/lee_holmes/status/318799702869041153

Beware where you place Comment based Help

While working on the PSWordPress module that Stephen Owen (@foxdeploy) has started, I came across an interesting issue after running my Pester tests, which call $ModuleFunction.Definition.Contains('.Synopsis') | Should Be True to check for comment-based help – the test was failing even though I had comment-based help in there. The problem was that the help was above the function keyword, which means it isn’t carried through to the $ModuleFunction.Definition property. So this is a good example of why comment-based help for a module function should be placed inside the opening curly bracket of the function.
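To illustrate the difference (a minimal sketch – the function names and bodies are made up for this example), only the second form keeps the help inside the Definition property:

# Help placed ABOVE the function keyword – works for Get-Help, but is NOT part of Definition
<#
.Synopsis
   Does a thing.
#>
function Get-ThingOutside { 'thing' }

# Help placed INSIDE the function body – included in (Get-Command Get-ThingInside).Definition
function Get-ThingInside {
    <#
    .Synopsis
       Does a thing.
    #>
    'thing'
}

(Get-Command Get-ThingInside).Definition.Contains('.Synopsis')   # True
(Get-Command Get-ThingOutside).Definition.Contains('.Synopsis')  # False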

Updated! Quick Win – Install-WMF5 (again)

 

** UPDATE 25/12/2015** Due to WMF5 Install issues the InstallWMF5.ps1 Script has been removed from GitHub until the PowerShell Product Team re-release the WMF5 installers. Once re-released I will re-release the InstallWMF5.ps1 script **

 

**UPDATE 24/02/2016** WMF5 was re-released today and the below scripts should still work**

 

This is a very very quick post about installing WMF5 on Windows 8.1 or Server 2012 / Server 2012 R2 via this function script I created

And what better way than a simple 1 liner to grab and run the script

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/InstallWMF5')

And if you want to look at the Script then direct link is http://bit.ly/InstallWMF5

Hope this is useful for you

PS credit (again) goes to @lee_holmes for the idea from following tweet https://twitter.com/lee_holmes/status/318799702869041153

PPS – There is an issue with running this on Windows 7 / Server 2008R2 machines due to the need for WMF4 to be installed.

I am working on this but this is low priority for me as really we shouldn’t be deploying Server 2008 or Windows 7 Systems any more

PPPS – you will see a theme here – I am intending to build functions like this and release them as a full module in the PowerShell Gallery to help automate these tasks – so look out for some more like this in future

Presentation to Mississippi PowerShell User Group now Available on Youtube

I recently presented to the Mississippi PowerShell User Group via Skype for Business on Tuesday 9th (well Wednesday as it was 2:30am that I presented for them)

The video from that session is now online at https://youtu.be/z3CmI73LnyI

My session was around my Script & Module Creation Workflow and the Tight integration with Git & Pester that I have included in a module that is an Addon to the PowerShell ISE – Called ISE_Cew

ISE_Cew can be downloaded from the PowerShell Gallery in 1 line

Install-Module ISE_Cew

This is only if you are running PowerShell v3+ and have installed the PackageManagement Additional install – details at PackageManagement on v3/v4

Otherwise you can install it from GitHub at https://github.com/kilasuit/ISE_CEW

Hopefully this will be helpful for you and I’ll look forward to gathering feedback from you in future.

Recapping a fun filled January

January was a month where I did a lot of travelling and attending different user groups.

I attended the following events

  • 12th – PASS SQL London Chapter
  • 14th – Microsoft UKITCAMP event – What’s New in Windows 10 Enterprise
  • 14th – WinOps Meetup – London
  • 19th – SharePoint User Groups – Leeds
  • 20th – Microsoft UKITCAMP event – What’s New in Server 2016 – Manchester
  • 20th – LeanAgile Meetup – Manchester
  • 26th – .Net / Windows Platform User Group – Manchester
  • 28th – SQL User Group – Ipswich

Now, those that know me will know that I attend these events because I’m genuinely interested in the topics or in catching up with the people that are there – after all, the events are all about the community and the networking opportunities that they bring to us.

I intend in future to post ahead of time about where you can catch me in the following month, and then at the end of the month to detail more about what I learned at the events.

So what did I get up to at these events?

SQLPass London – we were treated to 2 great presentations – one on Azure Machine Learning by Peter Myers who was over from Australia for the NDC conference and one on the power of Pyramid Analytics for use in SQL Server. Both left me wanting to learn more when I get chance to.

What’s New in Windows 10 – this event was great for the insights into how the future of device deployment can be handled for the mobile workforce. This is an ever-changing and evolving story, and one that I’m hopeful organisations will start to work with and combine with the power of the cloud. Stay tuned for more on this in a later post.

WinOps – Well this was definitely one of the highlights of the month for me – mainly because it was a great event and from it I’ve made some great contacts and will be looking forward to seeing where the WinOps group expands to in the coming months as well as the upcoming WinOps conference.

SharePoint User Group – I always enjoy attending this user group as the organisers always make sure the event is great. I also learned about Node.js from Garry Tinder, and this was the only event this month where I gave a presentation, which was on PowerShell DSC for SharePoint. It spawned some good conversations and I know I’m not the only one interested in seeing where this will lead. We were also treated to some lightning sessions from Simon Hudson & Simon Doy.

What’s New in Server 2016 – Server 2016 is again an evolving story, and it’s important now more than ever to understand how this is the case and how it can affect business decisions going forward. Nano Server will be a big change to how IT teams manage their infrastructure, and guess which technology will be an even bigger part of IT admins’ lives? Yes, that’s PowerShell, which is now pretty much 10 years old.

LeanAgile – We had a great talk from Amy Lynch about the diversification challenges in IT and although it was more geared to the challenges of recruiting more women in IT the conversations that came out of it were certainly what made the night worthwhile.

Windows Platform / .NET – this was actually just a really good casual social event, as there wasn’t a topic for the evening, and one that lasted until 5am (yes, on a Tuesday).

SQL East Anglia – Well I must have been a tad bit mad to attend this. 8 hours of driving to get there and back but it was certainly worth it. Being from a SharePoint background I’ve always had the mindset that it’s a sensible idea to learn more about the underlying Data Platform and I’m certainly not wrong in that thinking. Perhaps it’s the statistician in me coming through but data analysis and manipulation is definitely one of those dark arts that keeps me entertained. Perhaps that’s why I see a fantastic future in Azure Machine Learning and PowerBI. The session was one that took my interest as understanding what’s new in SQL 2016 for the BI professional is something that certainly ticks the boxes for me in the future of IT technology. Also being able to have a good chinwag with Mark Broadbent and congratulate him in person on his MVP Award also made the trip worthwhile for me. I’m a big advocate of the community and rewarding those that make the community a better place and I’m glad that Mark has got this Award under his belt.

Well, that was what I got up to in January – February has been busy so far and I’ll post about it in due course.

#PSTweetChat – The EU Edition


If you have been involved in the #PSTweetChat events that have been running with Adam Bertram (@adbertram) & Jeffery Hicks (@JeffHicks) and a number of others, then you would be aware of just how awesome these 1-hour open discussion sessions truly are.

A number of PowerShell questions get asked and answered by members of the PowerShell community worldwide, so these sessions can become a valuable resource for getting the right answer to an issue quickly, or even just for learning more about the people that make up this awesome community and what they are currently up to that week.

So, after a discussion about this with Adam Bertram & Jeffery Hicks on a previous #PSTweetChat, I said that I would co-ordinate a #PSTweetChat that was more EU-time friendly. Well, I am announcing that the EU #PSTweetChat will run monthly on the 3rd Friday of the month, starting on February 19th at 10am UTC 0.

This will be the case for February & March and then in April (due to the Time changes to the Clocks) we will move to 10am UTC +1

So the dates will be (mark them in your calendar)

  • February 19th – 10am UTC 0
  • March 18th – 10am UTC 0
  • April 15th – 10am UTC +1
  • May 20th – 10am UTC +1
  • June 17th – 10am UTC +1
  • July 15th – 10am UTC +1
  • August 19th – 10am UTC +1
  • September 16th – 10am UTC +1
  • October 21st – 10am UTC +1
  • November 18th – 10am UTC 0
  • December 16th – 10am UTC 0

I will look forward to the future #PSTweetChat conversations.

Published: 27/01/2016 15:14


Get-PSUGUK – Call for Speakers

The UK PowerShell User Groups (Get-PSUGUK) are undergoing an expansion with some new User Groups being sprung up across the UK over the upcoming months.

If you have been able to attend any of the previous events (Manchester & London) then you will know that I’m a big advocate for making a real community out of the User Group Meet ups – one where there is the opportunity for those from all differing IT backgrounds to rise up and present a topic to their local User Group.

With the number of differing PowerShell related Topics that there are available there should be no shortage of possible topics and there will be availability for a variety of different formats including short 15-minute Lightning presentations, 45-minute Presentations and even possibility for a full evening presentation.

With this in mind, we are putting forward a Call for Speakers for the year ahead. If you are interested in presenting a topic, we have an Excel survey that you can fill in, located at http://1drv.ms/1OVuqul – please note that we are not currently looking at having sessions delivered remotely.

Myself and the fellow organisers, Corey Burke (@cburke007), Iain Brighton (@IainBrighton) & Richard Siddaway (@RSiddaway) will look forward to seeing you at future User Group Events and would like to invite you to follow @GetPSUGUK on Twitter for updates on the PowerShell User Group Events in Future.

To Sign up for the Manchester User Group on Feb 1st please follow this link – http://www.eventbrite.co.uk/e/get-psuguk-manchester-february-tickets-20117867082

To Sign up for the London User Group on Feb 4th please follow this link – http://www.eventbrite.co.uk/e/get-psuguk-february-london-tickets-20727283864

To see future events (including new cities as they occur) please bookmark this link – http://get-psuguk.eventbrite.com/

 

My Workflow for Using Git with Github – pt3

So this is Part 3 of a series of Blog Posts on my (currently ever changing) Workflow with Git, Github & PowerShell.

Hopefully you have had chance to look at the previous posts in this series if not they are

Part 1

Part 2

However, for this post we will be concentrating on Script & Module Creation and how we can make the overall experience more efficient with the PSISE_Addons module that I’m releasing on Github https://github.com/kilasuit/ISE_Cew

We will cover the following items today

  • Use of PSDrives for the functions & why you should use them in this case
  • Use of Git for the source control in this module – Simple and hopefully clear to follow and not too in depth
  • The Functions used in this module
  • Creating compliant PSD1 files for the PowerShell Gallery – Because it’s annoying to have to do this manually and that’s why we automate right – Again added in this Module is an example!
  • Creating some basic Pester Tests – again without even thinking about it as I am giving this to you as part of the ISE_Cew Module!

 

So firstly – Using PSDrives within the Function and why to use them.

PSDrives are a good and simple way of having a location that you can reach from within PowerShell, and they can use a variety of different providers – FileSystem, ActiveDirectory, etc.

We will be using the FileSystem Provider in this example for our functions.

So I begin with a few PSDrives created in my PowerShell profile, as you can see below – I use one profile and wrap an if statement around it to check whether the host is the PowerShell ISE, for ISE-only functions like the three I will be showing you today.

[Image: 011416_1355_MyWorkflowf1 – PSDrives defined in my PowerShell profile]

As you can see, I have a PSDrive for each of the following: OneDrive, GitHub, Scripts, Scripts-WIP, Modules & Modules-WIP.

The important bit here is that all my GitHub repos are actually stored in my personal OneDrive, as you can see from the above image – this means that I can switch between devices very, very quickly once things are saved 😉 – It’s probably key to point out this could be your OneDrive for Business location as well, or a shared drive if you are in an organisation that uses HomeDrive locations. The possibilities are endless, limited only by your imagination.
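For reference, a minimal sketch of what such profile entries might look like (the drive names match those above, but the paths are assumptions – substitute your own OneDrive location):

# In $PROFILE – centralised locations exposed as PSDrives
New-PSDrive -Name 'OneDrive' -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive" | Out-Null
New-PSDrive -Name 'GitHub' -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive\GitHub" | Out-Null
New-PSDrive -Name 'Scripts-WIP' -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive\GitHub\Scripts-WIP" | Out-Null
New-PSDrive -Name 'Modules-WIP' -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive\GitHub\Modules-WIP" | Out-Null

if ($Host.Name -eq 'Windows PowerShell ISE Host') {
    # ISE-only functions (like the three covered in this post) get loaded here
}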

So from here we have our PSDrives set up, and the beauty of this is that it allows very simple navigation between repos as you have them all centralised. In my next post I will show you how to populate this GitHub location with all the repos that you have forked, and how to ensure that all repos are up to date and have the latest changes pulled into them, or commits pushed from them, in just a few functions – so stay tuned for that!

Hopefully this will leave you with a reason to adopt PSDrives into your workflow and we can move onto the next section.

Use of Git for Source Control in this module

Quick intro – Git can be used with your own offline repos; it doesn’t need to be linked to a GitHub repo, though this is the most common setup and I would recommend that you use GitHub – you can get 5 private repos for only $7 USD a month (about £4-odd).

For more information on Git for source control, if you are new to it, I would recommend having a look at this series on PowerShell Magazine: http://www.powershellmagazine.com/2015/07/13/git-for-it-professionals-getting-started-2/ – that is how I got started – and also having a play with “Learn Git in your Browser” at http://try.github.com/ – it’s definitely a useful starting point and will help you out in your future endeavours.

So the Key Git commands used in this module are

  • git add – stages new or changed files so they are tracked by Git and included in the next commit
  • git commit – records a snapshot (commit) of the staged files in the repository

Other key Git commands (a short end-to-end sketch follows this list):

  • git push – pushes new commits to the remote repository (this could be hosted on Github)
  • git pull – pulls changes from the remote repository (this could be hosted on GitHub)
  • git clone – clones the remote repository to your own machine (this could be hosted on Github)
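
As a very short end-to-end illustration of how these commands fit together (a hedged sketch – the repository URL and file name are placeholders):

git clone https://github.com/<you>/<repo>.git    # take a local copy of the remote repository
cd <repo>
git add Get-UptimeInfo.ps1                       # stage the new or changed script
git commit -m "Adding Get-UptimeInfo script"     # record the change locally
git push                                         # publish the commit to the remote
git pull                                         # bring down anyone else's changes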

So that’s the key commands out of the way – but why and when will we want to use them, or in our case, not even have to think about using them?

The Functions used in this Module

 

For me, I’m a bit data-centric (aka a data hoarder) – I prefer to have too much data rather than not enough. So to cover this I wanted a way to auto-commit any changes to scripts and modules every time I saved them.

So this is where creating this module came in – and the functions contained within.

I have created 3 Core functions

  • Save-CurrentISEFile – saves the current file that is open in ISE, whether it has been previously saved or not
  • Save-AllNamedFiles – saves all files that have previously been saved
  • Save-AllUnnamedFiles – saves all files that have not been previously saved

And also 2 helper Functions

  • Request-YesOrNo (amended from the one included in SPPS – thanks to @Jpaarhuis)
  • Get-CustomCommitMessage – basic VB popup box for custom commit message

Now I must note that currently this is only compatible with v4 and above though that can change – if I get enough time and requests to do so – though you could always add this in with your own updates to the module.

So let’s look at the Process to be used with the following functions.

Imagine we are creating a script called Get-UptimeInfo – we could easily create this and then save using the default handlers in ISE however there are some issues that I’ve found

  • The file path defaults to the last saved location – for example, if you are working on a script in C:\MyAwesomeScript, then when you click Save it will save there, and each time you reopen ISE it will default there – not ideal
  • I like things Centralised – that way I know where things are!

So to overcome this we put the following at the beginning of the script: #Script#Get-UptimeInfo# – this tells the Save-CurrentISEFile or Save-AllUnnamedFiles functions that we want to create a PS1 file called Get-UptimeInfo in the Scripts-WIP PSDrive location.

This would look like the below before running either Function
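In text form, the unsaved file is simply something like this (a sketch – the function body here is made up purely for illustration):

#Script#Get-UptimeInfo#
function Get-UptimeInfo {
    # whatever the script actually does, for example:
    Get-CimInstance -ClassName Win32_OperatingSystem |
        Select-Object -Property CSName, LastBootUpTime
}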

[Image: 011416_1355_MyWorkflowf2 – the unsaved file in ISE with the #Script#Get-UptimeInfo# directive at the top]

And then we can run either function in the Command Pane like any other function:

[Image: 011416_1355_MyWorkflowf3 – running Save-CurrentISEFile from the Command Pane]

Oooh – look at that: the file has been saved and named Get-UptimeInfo, it is a ps1 file, and we are being prompted about whether we want to add a custom commit message – so we’ll click Yes and see what we get.

[Image: 011416_1355_MyWorkflowf4 – prompt asking whether to add a custom commit message]

Here we get a popup box (it currently uses VB for this, but it works and is only a few lines) asking us to provide our commit message – I’ll add the commit message “Testing PSISE_Addons Save-CurrentISEFile Function”.

The result can be seen below – note that a Get-UptimeInfo.tests.ps1 file has been created as well. This is driven by what you include in your profile, as suggested in the PSISE_Addons.psm1 file.

[Image: 011416_1355_MyWorkflowf5 – the saved Get-UptimeInfo.ps1 alongside the generated Get-UptimeInfo.tests.ps1]

If we wanted to do the same with modules, then it would be something like this: #Module#FindSystemInfo#. That tells the Save-CurrentISEFile or Save-AllUnnamedFiles functions that we want to create a folder called FindSystemInfo in the Modules-WIP PSDrive location, and in there save a PSM1 file called FindSystemInfo, whilst also creating a Gallery-compliant psd1 file and a FindSystemInfo.tests.ps1 file containing some default Pester tests.

[Image: 011416_1355_MyWorkflowf6 – the unsaved module file with the #Module#FindSystemInfo# directive]

When we run the Save-CurrentISEFile function we get the same prompt as before:

[Image: 011416_1355_MyWorkflowf7 – the custom commit message prompt]

Again we will click Yes here, and in the next popup we will add the message “Adding New Module FindSystemInfo” – we can see this has happened below.

[Image: 011416_1355_MyWorkflowf8 – the FindSystemInfo folder with its psd1, psm1 and tests.ps1 files]

We can see here that three files have been added – a PSD1, a PSM1 and a tests.ps1 file have all been created in a new folder based on the module name FindSystemInfo – but we didn’t specify these. That’s because the Save-CurrentISEFile & Save-AllUnnamedFiles functions do the hard work for us and create a PowerShell-Gallery-compliant psd1 file and a default Pester test, as long as you have them specified in your profile. BONUS – I provide sample versions of both of these with the module. How generous is that!

But the most important thing is not having to call the actual functions at all, and instead using simple keyboard combinations – so as part of the ISE_Cew.psm1 file there is a sample section at the bottom to add into your PowerShell profile – again, another easy freebie!
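For reference, the ISE exposes an add-ons menu API that lets you bind a function to a keyboard shortcut; a minimal sketch of the mechanism (the menu text and shortcut here are assumptions, not necessarily the bindings ISE_Cew ships with):

# In your PowerShell profile, inside the ISE host check
$psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.Add(
    'Save-CurrentISEFile',     # menu text
    { Save-CurrentISEFile },   # script block to run
    'Ctrl+Alt+S'               # keyboard shortcut (assumed binding)
) | Out-Null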

So you can now download this from the PowerShell Gallery using Install-Module ISE_Cew – go and get it and give me some feedback via the GitHub repo – https://github.com/kilasuit/ISE_Cew/

ThePSGallery AutoBot – Some Issues I’ve Found.

Ok so If you didn’t already know then this happened

And although it has been interesting, it has also brought up some issues with the PowerShell Gallery (mainly around data, which is one of my biggest bugbears in all things IT). These include, but are not limited to:

  • Publish-Module is partially broken – it requires you to supply LicenseUri & ProjectUri when run, however this information doesn’t make it to the Gallery pages, nor to the Gallery items when using Find-Module. This means there is a seemingly large number of modules that don’t appear to include this *mandatory* information.

    There is a workaround (and it should be what you are doing anyway): ensure that this information is included in the module’s PSD1 file, as it then gets populated to the Gallery correctly, as sketched below. It was thanks to an impromptu chat with Doug Finke (@Dfinke) that this came out of the woodwork and was confirmed as the resolution – so thanks Doug!

Also I decided to confirm this via uploading 2 different modules ThePSGallery-Working & ThePSGallery-Broken – conclusive results show that the only method to get the LicenseURI & ProjectURI to show in the Gallery (either via the Website or via Find-Module) is to include it in the psd1 file.
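For reference, in a module manifest these URIs live under PrivateData.PSData; a minimal sketch (module name and URLs are placeholders):

# MyModule.psd1 (relevant fragment only)
@{
    RootModule    = 'MyModule.psm1'
    ModuleVersion = '1.0.0'
    Author        = 'Your Name'
    PrivateData   = @{
        PSData = @{
            LicenseUri = 'https://github.com/you/MyModule/blob/master/LICENSE'
            ProjectUri = 'https://github.com/you/MyModule'
            Tags       = @('Example')
        }
    }
}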

[Image: PSgalleryissue]

So go and update your psd1 files to include this, and please upvote this issue on UserVoice to either force this to be fixed or to drop LicenseUri & ProjectUri from Publish-Module – http://windowsserver.uservoice.com/forums/301869-powershell/suggestions/11439807-gallery-issue-licenseuri-projecturi-aren-t-add

  • Author details – again this is massively broken due to the nature of the Gallery: simple things like spelling mistakes in author names mean there are 3 or 4 “authors” that are actually one person. See the Author Details file in the GitHub repo for ThePSGallery (https://github.com/kilasuit/ThePSGallery/) for more details, which was built using the logic below:

 

Find-Module * | Select-Object Author | Sort-Object Author -Unique | Out-File Authors.txt

I would suggest that the Gallery also allows profiles to link to other social networks like Twitter, and exposes the Twitter handle (it would be great for ThePSGallery AutoBot to be able to include the author in its tweets, thus increasing visibility for those that submit work there).

I would also suggest that authors include an additional hashtable in the PrivateData section with any additional contact info – like Twitter or blog URLs – and set this as a default in their psd1 template. I will be posting about this shortly.
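A hedged sketch of what such an extra contact block might look like (the key names and values are purely illustrative – nothing in the Gallery consumes them today):

PrivateData = @{
    PSData      = @{ LicenseUri = 'https://example.com/LICENSE'; ProjectUri = 'https://example.com/project' }
    ContactInfo = @{   # illustrative, non-standard keys
        Twitter = '@yourhandle'
        Blog    = 'https://example.com/blog'
    }
}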

  • Additional metadata – I, for one, would like the AutoBot to be able to tweet at 100, 1,000 and 10,000 downloads of a module to congratulate the authors. However, this information isn’t available at present via Find-Module, though it can be obtained via web-scraping methods – not particularly resource-friendly, and time-consuming too. I have raised this with the powers that be via UserVoice and you can upvote it as well via http://windowsserver.uservoice.com/forums/301869-powershell/suggestions/11279160-add-additonal-properties-to-powershell-gallery-ite
  • Lastly – Find-Module isn’t very User friendly for cmdlet, function, workflow or DSCResource searching if tags aren’t used.

    This is a bit simpler to get around but the logic is rather hidden in how to do so as you would have to call Find-Module * and then Pipe this to Select-Object -ExpandProperty Includes and then to Where-Object

    So for SharePoint it may look like this, which isn’t very graceful at all, but it does return the 2 modules that have SharePoint in their function names – the problem being: what if they aren’t functions but cmdlets?

    Find-Module * | Select-Object Name -ExpandProperty Includes | Where-Object {$_.Function -like '*SharePoint*'} | Select-Object Name

    Again there is a UserVoice Suggestion for this at http://windowsserver.uservoice.com/forums/301869-powershell/suggestions/11088855-find-module-needs-improvements-to-enable-better-di

Hopefully that’s a small insight into ThePSGallery AutoBot; in a future blog post I will detail the inner workings of the AutoBot, including the actual script that runs it (not on GitHub as of yet).

2015 – Challenging but it’s only the beginning!


This is just a post on my own recent reflections of the events throughout 2015.

Each month in 2015 came with increasingly difficult obstacles to overcome, and for the first 6 months of the year a lot of those obstacles had come about in the previous year or two, predominantly centred around my children.

Yes Children – Plural.

At 19 I became a father to my now 6 year old son and then at 23 I became a father to my now 2 year old daughter. So relatively early on in life I was blessed (and as of the last few years in some ways to be cursed) for becoming a parent so young.

But anyway, to set the scene entering 2015 – my daughter had been in care for the majority of her life (since she was around 3 months old), and by the end of January my son was back in care whilst going through court proceedings – and at this point it was not possible for me to have them back, let alone practical – something that still really irritates the absolute bejesus out of me.

January to April was pretty much all over the place because of all the goings-on with the children and trying to get settled into my then relatively new job at The University of Manchester, which means I can barely remember what happened during much of that time, largely due to the vast amounts of ridiculously unnecessary documents required as part of the Family & Children’s Courts processes.

May – Now if I could delete 1 month (well 1 week if I’m truly honest) then this would be the one. This was the month when the decisions on future contact with my children would be made and although I fought and fought and fought throughout the whole 76 weeks that we were under court proceedings, it was essentially all in vain as the ruling was completely against my arguments based on supposed “professional opinions”.

I have made it more than clear that the odds were against me because of a few basic facts

  • I’m Male – and regardless of the push of same sex rights, men are still widely considered as being the less capable when it comes to family
  • I was brought up to believe that you struggle and juggle – but you get through it in the end – this was perceived as me being “less willing to accept help if needed” and that is utter bullshit!
  • Other than myself, the only males present in court were my barrister, the children’s lawyer and my ex-partner’s lawyer; the social worker, the lawyer for the local authority (not my local authority either), the children’s guardian and the judge were all female

But Also May was the month that started the ball rolling for speaking and attending #PSConfAsia – so it wasn’t all doom and gloom. Although I didn’t commit until Mid-June when I had the outcome from the Court case. Needless to say from that point onwards I made a conscious decision that I needed to really start the ball rolling for a better, more flexible and more enjoyable future – so you could say that in June I made the decision that I would at some point in the following 6 months leave the University of Manchester in pursuit of something more fitting to what I want to be doing.

So part of this involves what could be a life-changing and seriously difficult time ahead as I move into self-employment, but it is something that I have thought about doing for almost 3 years now.

So that will be one big challenge of 2016 – however that is only the beginning as the first challenge is to find somewhere permanent to live. These last 2 months have been expensive although comfortable as I’ve spent most of the time in hotels. I dread to think how much this has cost me personally and with no real tangible gain from it at all.

2016 will see me continue the work that I started with the PowerShell User Groups here in the UK and I am looking to massively expand this where possible. This is mainly in part with the fact that I love presenting and meeting the community but also there is, in my opinion, a massive gap in the skills base of real understanding of PowerShell and in part this can be partially alleviated by increasing the number of User Groups across the UK. So I’ve already put it out there that if anyone thinks that they could co-organise then I will work with them to get these off the ground and running. I will also provide content to them and help get the community growing – the end goal is to be in a similar position to the SharePoint & SQL User Groups where there is a decent local User Group Community and then we can look at localised PowerShell Saturday’s at some point in 2017. Ambitious – but that is the way I am and with the help of those out there that want to get these things off the ground then we will achieve it – plus hopefully by this time next week I should have some good news about the future for these events – so hold on tight.

Also 2016 is the year when I will really Start-Contributing to the wider community, I’ve been promising a PSISE_Addons module for about a month now and the reason for it being delayed is because I’m just adding more and more features to it to make it better, that and I’m actually refactoring the codebase for it already. This will be one of the topics that I will be covering at the Manchester & London User Groups and I’m hoping if I’ve hit it right then it should be a major help to all that use it. Not going to give much more away than that until released (and blogged about of course)

Also, 2016 will be the year that involves lots more presenting. As it stands, I have already been accepted to present at the PowerShell & DevOps Summit in Bellevue, WA, over my 26th birthday, so that will be an interesting and amazing event – one I would have been looking to attend even if I hadn’t been selected to present, just because of the sheer number of the PowerShell community (and product group) who will be there.

I’m also waiting to hear back from at least another 7 events on whether I’ll be presenting at them – a Variety of SharePoint, SQL & DevOps type events.

Then there is also #PSConfEU – which I am co-organising with Tobias Weltner and this looks to be another fantastic event – we already have a great line up of speakers and still a few slots to fill. Details about this will be posted in the next few days and I would urge you to Register at www.psconf.eu as soon as you can.

Then late on in the year I’ll be returning to Singapore for the follow on #PSConfAsia Event. And I can’t wait for that one either and hopefully there should be some good news in the upcoming weeks about this event. So again keep your eyes & ears open for updates.

That’s a brief overview of 2015 and overlook of what is to come in 2016.

But one final thing to remember – there is always a story behind every person, and most of the time that story stays behind a firmly locked door. I’m happy to be open about mine, as being open about it all helps me remember that no matter how hard it’s been (and it’s been torture at times) I’ve got through it all and will continue to do so for years and years to come. One day the wrongs of 2015 will be corrected, but the journey there is longer than I had originally anticipated and forms a solid core of my plan for the next 5 years.

 

So as we enter 2016 – be happy you got through 2015 and look forward to the beginning of yet another journey. This one already looks and feels like it will be amazing and the people that I meet along the way will be a fundamental core to that becoming a reality.

Published: 31/12/2015 17:35


Quick Win – Install WMF5 via PowerShell 1 Liner



** UPDATE 25/12/2015** Due to WMF5 Install issues the InstallWMF5.ps1 Script has been removed from GitHub until the PowerShell Product Team re-release the WMF5 installers. Once re-released I will re-release the InstallWMF5.ps1 script **

This is a very very quick post about installing WMF5 on Windows 8.1 or Server 2012 / Server 2012 R2 via this function script I created

And what better way than a simple 1 liner to grab and run the script

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/InstallWMF5')

And if you want to look at the Script then direct link is http://bit.ly/InstallWMF5

Hope this is useful for you

PS credit (again) goes to @lee_holmes for the idea from following tweet https://twitter.com/lee_holmes/status/318799702869041153

 

PPS – There is an issue with running this on Windows 7 / Server 2008R2 machines due to the need for WMF4 to be installed.

I am working on this but may not have this ready until the new year.

 

PPPS – you will see a theme here – I am intending to build functions like this and release them as a full module in the PowerShell Gallery to help Automate these tasks – so look out for some more like this in future.

Published: 23/12/2015 17:45


My Workflow for using Git with GitHub, OneDrive, PowerShell & Exchange – optimising efficiency across devices. Pt 2

So this is the Second of a series of blog posts about how my workflow with Git, GitHub, PowerShell & Exchange currently works.

In this series we will cover a number of different applications and a number of custom PowerShell Functions that will really help you to optimise efficiency across devices when these processes are put in place.

Where we left off in the last part, I had opened up thoughts on the next stages of the process – the actual automation of the workloads – and in this post I will go into detail about the integration between GitHub & Exchange to create new inbox folders and inbox rules for any newly followed repos.

So I’ve just followed a bunch of new GitHub repos and now I want to have the inbox rules & folders set up for me – manually this is not that much of a task, but why do it manually when there is a way to automate it?

[Image: 121815_1235_MyWorkflowf1]

So to do this we need to query the GitHub API – we will be utilising Invoke-WebRequest for this – to see all the watched repos, which is accessible for any user via the following URL; just replace kilasuit (my GitHub alias) with the username that you want to query: https://api.github.com/users/kilasuit/subscriptions

Now, this is a paginated, non-authenticated URL, which means that it returns only 30 results at a time and those results will only include public repos. So if you’re following more than 30 repos you will need to rerun the query with a small addition to get the full output, and if you’re following private repos you will need to wait for a later post on how to handle those.

To do this we will check the response to see if it includes a Link header – if it does then we know that the user is watching over 30 public repos, and from this header we also get the number of pages (batches of 30) that we will need to iterate through to get all of the user's watched public repos.

Below we have a small snippet of the overall function to do the iteration of the pages (this could be tidied up – but for now it works)

$repos = @()

$web = Invoke-WebRequest -Uri "https://api.github.com/users/$githubuser/subscriptions"

$page1 = Invoke-RestMethod -Uri "https://api.github.com/users/$githubuser/subscriptions"

$page1 | ForEach-Object { $repos += $_.name }

if ($web.Headers.Keys.Contains('Link'))
{
    # The Link header looks like <...?page=2>; rel="next", <...?page=N>; rel="last"
    # so strip the second entry down to just the URL of the last page
    $LastLink = $web.Headers.Link.Split(',')[1].Replace('<','').Replace('>','').Replace(' ','').Replace('rel="last"','').Replace(';','')

    # The final character of that URL is the last page number
    [int]$last = $LastLink[($LastLink.ToCharArray().Count - 1)].ToString()

    $pages = 2..$last

    foreach ($page in $pages)
    {
        Invoke-RestMethod -Uri "https://api.github.com/users/$githubuser/subscriptions?page=$page" | ForEach-Object { $repos += $_.name }
    }
}

So as you can see above, we have queried the GitHub API and stored the result in a local variable. We then work through the data in the returned objects to add all the watched repos (from the initial 30 responses) into an array – Invoke-RestMethod converts the JSON response into objects for us, and we pipe those to the ForEach-Object command to collect each repo name.

After this we query the Web object to see if there is a Link header. If we find one, then we generate the needed range of page numbers and store it in another local variable called pages. We then loop through those pages to get all the remaining repos and do exactly the same as above to add them into the repos array.

At this point we then have the name of all of the watched repos in the repos variable and we are making use of the name property for the rest of the function.

However, there are a lot more properties returned by the API call, as can be seen below for the PowerShellEditorServices repo.

Please note I've excluded a fair number of properties in the output below (anything matching the *_url pattern) because they are of little use unless you are going to make further queries.

[Image: 121815_1232_MyWorkflowf2]

As we can see there are some interesting and useful details given here including open_issues, forks, watchers, pushed_at (when the last local commit was pushed to the GitHub Repo), updated_at (when the last commit was made – not when it was pushed), name & full_name.

So from the data that we can get programmatically for repos, we can really do quite a lot – especially with all the different URL properties that are included, as shown in the screenshot below. Notice that the majority of them point at the relevant API URL, which makes building out wrapping functions that call Invoke-WebRequest against those API endpoints very much easier – so kudos to GitHub for making this sort of thing easier for developers.
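As a rough illustration of that idea (the function name here is hypothetical, and this assumes the same public, unauthenticated API used above), a tiny wrapper might look something like this:

# Hypothetical wrapper - queries a single repo and drops the noisy *_url properties
function Get-GitHubRepoInfo {
    param(
        [Parameter(Mandatory)]
        [string]$Owner,

        [Parameter(Mandatory)]
        [string]$Name
    )
    Invoke-RestMethod -Uri "https://api.github.com/repos/$Owner/$Name" |
        Select-Object -Property * -ExcludeProperty *_url
}

# e.g. Get-GitHubRepoInfo -Owner PowerShell -Name PowerShellEditorServices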

[Image: 121815_1232_MyWorkflowf3]

As it is, there is certainly a lot of flexibility that comes with using GitHub and PowerShell for automating smaller tasks like this.

Below is the "meat" of this function, where we connect to Exchange Online using a wrapper function, Connect-EXOSession (see my PoshFunctions repo for where this lives), to which all we need to pass is a PSCredential.

Once we have connected the PSSession to Exchange Online, we gather all the current folders that live under the GitHub Stuff folder in my Inbox into another array variable. This lets us check whether a folder already exists for each repo; if it doesn't, we create one and then create a new Inbox rule for it to automatically move any emails that come from GitHub into the correct folder for that repo.

 

We then write a small amount of text to the screen to show which folders and rules have been set up – useful if, like me, you are already following almost 50 repos, a number that is likely to increase over time.

Connect-EXOSession -EXOCredential $EXOCredential

$folders = Get-MailboxFolder $MailboxFolderParent -GetChildren | Select-Object -ExpandProperty Name

foreach ($repo in $repos) {

    if ($folders -notcontains $repo)
    {
        New-MailboxFolder -Parent $MailboxFolderParent -Name $repo | Out-Null

        New-InboxRule -SubjectContainsWords "[$repo]" -MoveToFolder "$MailboxFolderParent\$repo" -Name $repo -Force | Out-Null

        Write-Output "Folder & rule for $repo have been created"
    }
}

Then, as you can see, we remove the EXOSession PSSession to clean up the PowerShell session, as it will otherwise stay in memory unless Remove-PSSession is run.

Remove-PSSession -Name EXOSession

The full version of this function can be found in my PoshFunctions Repo on GitHub located at https://github.com/kilasuit/PoshFunctions

Stay tuned for part 3, where I will go into the details of how and where to structure your GitHub repos so that you can automatically check the status of all of them in one function. This will be a less intensive post but is still useful to read for the efficiency that can be gained from this method, and it will help you understand the more detailed aspects of the rest of my workflow.

My Workflow for using Git with GitHub, OneDrive, PowerShell & Exchange – optimising efficiency across devices. Pt 1

So this is the first of a series of blog posts about how my workflow with Git, GitHub, PowerShell & Exchange currently works.

In this series we will cover a number of different applications and a number of custom PowerShell Functions that will really help you to optimise efficiency across devices when these processes are put in place.

Ok so I am going to share with you my current end to end workflow for ensuring that I commit myself to making use of Source Control for all of my Scripts (and other files when required).

So imagine this scenario – I have come across a public GitHub Repo (I’ll be using the recently published PowerShellEditorServices Repo for this) which I feel would benefit me in either my current or future workloads. So at this point I will decide to make sure to “watch” the repo as shown below.

[Image: 121715_1539_MyWorkflowf1]

Notice that the default is Not watching the repo – I will click Watching and that will add this repo to the list of other repos that I am watching. Now you may be thinking at this point "Ok, seems reasonable enough", but this is the point where it becomes interesting – well, at least it does for me and my workflow – and that's why you're reading, right??

So now when I access GitHub it will show me, in my Watching section (as seen below), any new Issues or Pull Requests raised on any watched repos, and by "default" you also get an email notification – this is the important bit, as until GitHub release a mobile client email is important for me – I prefer official apps from a vendor to a community-developed app, especially ones maintained by a single developer (but that's just me).

[Image: 121715_1539_MyWorkflowf2]

Ok so at this point we should have a hunch of perhaps where the rest of this series may be going.

I make no secret of the fact that I run my own Office 365 tenant, which at under £4 per month gives me SharePoint, Exchange, Azure Active Directory & Skype for Business. This allows me to focus on using the Microsoft ecosystem that I have to hand and to use it for much more than simple use cases.

However, at this point I will warn that the next stage requires you to either have your own Office 365 tenant (and therefore Exchange Online) or be an Exchange Administrator, as the PowerShell scripts I've built currently run against Exchange Online (though I will be looking to port this to be usable for those using Outlook who aren't Exchange Admins).

So first off, from this point we have to create a new folder & Inbox rule in my Exchange mailbox – but we don't really want to have to type any information in for it now, do we? I mean, that's why we are automating the process to the nth degree.

Stay tuned for part 2 where I will go into the details as to how to Query the GitHub API & then create new Inbox Folders & Rules based on that and then start to show you the rest of my workflow.

Quick Win – Install Windows 10 RSAT Tools via PowerShell

 

Body:

This is a very, very quick post about installing RSAT for Windows 10 via a script I created.

And what better way than a simple one-liner to grab and run the script:

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/Win10RSATinstall2')

And if you want to look at the script itself, the direct link is http://bit.ly/Win10RSATinstall2

Hope this is useful for you

PS credit (again) goes to @lee_holmes for the idea, from the following tweet: https://twitter.com/lee_holmes/status/318799702869041153

I plan on turning this into a DSC Resource (or adding it to an existing resource) when I get the chance to – unless someone beats me to it.

Published: 16/11/2015 22:49

 

 

Get-PSUGUK & PSConfEU – How I got involved in Organising them

Body:

Well, this blog post is really like the PSConfEU website and the organisation that goes into a conference like this – a real work in progress. By this I mean that at the time of writing it is correct, however in a week it will be partially obsolete, but only partially.

So let's go right back to the beginning of the story, and for me this story goes back to July, just after I had agreed that I would be setting off for the PowerShell Conference in Singapore (see my post on just how amazing that was here), when I had really been finding that there was an issue for me with the fundamental lack of a UK PowerShell movement. From my trawls of the internet I concluded that there was really only the US Summit (at this point I didn't even know about the EU Summit, and I'm not sure how I missed it either, which just goes to show how out of the loop you can be if you aren't on the ball about finding where this information lives in the depths of the internet), and I felt frustrated with this. So I started to do my homework and found out about the PowerShell User Group in the UK – Get-PSUGUK.

At this point all that was still really available on the internet was a dead link to the old webpage and a few blog posts that I could find mentioning it. Some of these were from 2006/2007/2008, so were quite old, but luckily there was an email address on one of them, so after about a week of arguing with myself about whether or not I should send a query email to find out the status of the user group, I said to myself JFDI! I sent off an email on July 19th just after 15:30 and a little over 2 hours later I had a response – and I'll tell you, I was shocked that I got one at all. I only say that because over the years I must have had at least 25 different email addresses and had most of them running concurrently (now down to 7 – not including work email addresses), so I kind of expected not to get a response at all. Luckily the response came from none other than UK PowerShell MVP Richard Siddaway (@RSiddaway), who advised me that the user group had become inactive and that, although he wanted to resurrect it, time was not on his side given his other commitments.

Well, this is where my determination (or lack of patience – still to be decided) kicked in and I started to push out on a variety of strands to gather further information – where I could look to host a user group being one of the big ones to tackle, along with the need to make sure that I could post on PowerShell.org to help promote the event/s. Luckily Manchester is pretty good for venues in which you can host user groups, so there was plenty of choice, and there is a great bunch of organisers in Manchester who co-ordinate together (using Slack – more on Slack later) on the direction of user groups and the events that run in Manchester, which is great to be involved in as there can be some really great crossover between user groups – as an example, I presented on PowerShell DSC at the DevOps User Group on Monday 2nd November, and this was to a mainly *nix-filled room.

So from there I made some plans with Richard to put together a Get-PSUGUK Manchester user group and get a date arranged as well. By the end of August I had arranged for the first meetup to take place at SpacePortX in Manchester on October 13th and had already factored in scaling to other user groups in the UK for the administrative overheads these user groups could potentially create. By this all I really mean is that I had created an Eventbrite page, written a function to interact with the Eventbrite API to add members into a distribution list for emails advertising other possible events, and worked out that I would be the sole speaker at the initial event in Manchester.

Then the next thing I know I'm at PSConfAsia, and whilst there I was discussing with the other speakers how much I had enjoyed the event and that I would be interested in running something similar myself back in the UK, especially seeing as Don Jones had blogged that the PowerShell.org team would not be up for organising another EU Summit due to the enormous stress placed on them, being mainly based in the US. Time zones are a big issue when organising an event like this, and having to traverse multiple time zones cannot be fun at all, so I can totally understand the decision by the PowerShell.org team not to continue to attempt to organise the EU event – in many ways this allows others, like myself and the rest of the team that have pulled together to organise PSConfEU, to step up.

So now into the real meat of this post, in my opinion – it's not every day that you can say you are working with a cross-continent collective to put together an event like this, all of us doing it in our spare time, and we have had to make sure that we collaborate as effectively as possible. Now, we could have continued with the email list that was set up by PowerShell.org to bring together people who would be interested in helping pull this event off, however that wasn't going to be optimal, as conversation tracking becomes a nightmare in email (personally, if email were to disappear from the workplace that would not bother me) and we needed to segregate topics without splitting all the eyes away from the information being discussed. Slack became a critical tool for this, in my personal opinion, and without it we would not have had the success that we have had so far.

The PowerShell EU conference website, and the development practices behind how we maintain and update it, are crucial to understanding where IT SysAdmin work is heading, and with that in mind we chose the best-fitting platform for what we wanted to achieve – a website with easy version control and the ability to integrate a Continuous Integration toolset to determine whether new builds of the website function as expected, plus it gives us all peace of mind as well.

So PSConf.eu is hosted using GitHub Pages, which means that the site is just a GitHub repo under the organising team's GitHub organisation. With this we are making use of Jekyll, the static site generator built in Ruby – which means that just to get the site to where it is, we have all had a bit of a learning curve in how to build out the site to take advantage of the way Jekyll works, and have learned a bit of Ruby in the process. This for me has been fantastic, as I love to learn something new, especially if it is something completely out of the norm for me. Just see the tweet I have pinned on my twitter profile at @ryanyates1990 and that should be enough of an insight into how much I've enjoyed this so far 😉 – and yet there is still so much more for me (and the team) to get completed before the end of this year, including selecting the speakers.

One of the other interesting parts of this is that it has made me realise not only that a lot of hard work goes into conferences and events, but that they can be managed efficiently with the right mix of technology. There is no one-stop shop to getting an event like this right, and it truly is a collaboration of efforts between all of us involved.

So to end this rather long winded blog post I’d like to just mention the following

Get-PSUGUK have meet ups in Manchester and in London this November – you can sign up at www.get-psuguk.eventbrite.co.uk

The PowerShell Conference EU is still accepting speaker submissions. We have a closing date in mind, and we will likely confirm this via @PSConfEU in the next few weeks. If you are in two minds about submitting a session then please reach out to me directly as soon as possible.

For more details on the PowerShell EU Conference please visit www.psconf.eu

Published: 15/11/2015 14:18


SPS Munich – What an event!

Body:

Ok so I’ve neglected the blog recently and that is because I’ve genuinely not had time to think about it.

This has consisted of 2 different SharePoint Saturdays – one as an attendee (Munich), which I'll cover in this post, & one as a "presenter" (Oslo), and I'll come back to SPS Oslo in another post.

Firstly, SPSMunich was the first SharePoint Saturday that I've attended as just an attendee and I must say that we were very much spoiled in many ways. It was a great event with an amazing line-up of speakers. I also submitted sessions, but on seeing the final line-up I genuinely understood that my submitted sessions could not compete with it – and to be totally frank that was a good thing, as I thoroughly enjoyed being there as an attendee, and who can blame me when it was immaculately delivered.

So firstly, the keynote at SPSMunich was delivered by Jeremy Thake (@jthake), in which he shared some amazing stats, including that Office 365 is using 470+ petabytes of storage, which is 470,000+ TB – that's a lot of storage!

After that I went to the session on building FAQ functionality in SharePoint, delivered by Paul Hunt (or @cimares if you are on twitter), and as has come to be expected the session was informative and captured all of the key points that you would need to walk away and implement this in your organisation – something that I would recommend you really investigate, as SharePoint really is fantastic for that type of information store.

After that we had a coffee break, which between sessions really is needed to give attendees a chance to digest the information delivered to them, and in which I had the opportunity to briefly catch up with some of the speakers that I had met at SPSLondon – Maarten Eekels, Thomas Votchen, Elio Struyf, Hans Brender and a few others too throughout the day.

Next up I attended the "Scaling SharePoint 2016 farms with MinRole and other tools" session delivered by Spencer Harbar (@harbars) – now this was an interesting session that left me thinking that the implementation of MinRole is just another example of how there are teams at Microsoft that just "do it our way", and the SharePoint team seem to have been really good at doing this over the years – or at least so it seems from my view.

Next up we had an amazing lunch, and if you view the SPCAF Blog – SPSMunich Recap you will see there is a Flickr album which goes to show how well the event was orchestrated – you can also find all of the speakers' slide decks there as well.

After the truly amazing lunch, the next session I attended was none other than that of Erwin van Hunen (@erwinvanhunen) on getting started with the Office Dev PnP Provisioning Engine. It was a tough call between that and the session by Thomas Votchen & Elio Struyf, "An IT Pro & a Developer walk into a bar – SharePoint Search Happy Hour" (a session whose slides I still need to look through), although I feel that on the day I made the right decision, as the PnP Engine and some of the related PowerShell cmdlets are proving to be very interesting for me – this is an opportunity for me to diverge from writing cmdlets/functions in actual PowerShell and write them in C#, so a good learning opportunity, and I would expect to see more from me on this in the upcoming months.

After this I went to Wictor Wilén's (@wictor) session on PowerShell DSC & Azure Resource Manager for deploying SharePoint farms like a champ. This was a packed session and there were some great questions being asked by the audience, which for me, as a more PowerShell-inclined guy, was great to hear – this is clearly a subject of real interest for those with a SharePoint bias. One of my key takeaways from this session was that there is too big a gap in knowledge between someone who spends their time learning PowerShell as a dynamic language for administering multiple applications and those who just want to use it for the one application that they are managing. To me this means there is a real need for simpler and more structured training for those who will likely only use PowerShell for a very small subset of their work. However, that was my takeaway from the questions being asked by the audience; Wictor, on the other hand, demonstrated some really good and useful points on using DSC to build out your SharePoint farm, and I can only hope that more organisations take this on board in future and look at this as a real possibility (once the xSharePoint module gets all of the core features built into it).

Next up for me was going to be a difficult choice, because again there were a few sessions that I wanted to attend – SharePoint Performance Monitoring & Troubleshooting and How to Secure your data in Office 365 – however I ended up going to Search Queries Explained – A Deep Dive into Query Rules, Query Variables and Search Orchestration by Mikael Svenson (@mikaelsvenson), as I felt that I needed to diversify my knowledge, and everything covered in Mikael's session would be useful to me in a more development-focused role, where understanding the mechanics behind it matters. I certainly left the session wondering when I could try putting some of this into practice, and unfortunately, like most things lately, it has just not happened.

And lastly, the final session of the day that I attended was Jeremy Thake's session on building slick Office Add-ins to impress your boss, where we were shown a number of add-ins that are already available and will boost productivity for those that use them, as well as a very quick demo of creating an Office Add-in, which was made more entertaining by the fact that the whole demonstration was not being presented from Jeremy's device.

Published: 15/11/2015 11:56


HaveIBeenPwnd – PowerShell Function

Body:

I recently became aware of a site called HaveIBeenPwnd – https://haveibeenpwned.com – run by @troyhunt, and wanted to create a simple PowerShell function for checking against this database whether your email address (or, like me, addresses) is likely to have been pwned!

The function lives in a module called Check-HaveIBeenPwndStatus & the current function is also called Check-HaveIBeenPwndStatus, with only the one parameter, -Account, to which you pass your email address.

So you would call it via Check-HaveIBeenPwndStatus -Account '[email protected]' and you'll get back one of two messages – a warning if it is found, like the below:

WARNING: We have your account [email protected] marked as having been pwnd on the following sites 000webhost Adobe Black Hat World Flashback Gawker MPGH Stratfor – Please Check and change your passwords across other sites as soon as you can!

Or you’ll get back the following (obviously with your account details)

Although [email protected] has not been found in this database of PwndSites we advise that you change passwords regularly for any other accounts that may be linked to [email protected] for your own protection

Hopefully adding this function into a profile script that runs once a month (or week, or even daily if you need it) will save you some possible headaches in future.

You can find the module on GitHub at http://bit.ly/1MbnFTD

Published: 04/11/2015 03:48


Scripts, Functions, Modules, DSC resources – Pick the right option!

Body:

This may become a bit of a controversial post, and that is in some parts fully intentional!

This really should give PowerShell users (looking at those of us admins who just use PowerShell as if it were VBScript, with no real community or practices) the opportunity to understand the problems with what they are currently producing just to "get it done".

My issue is that over the last 9 years PowerShell has become a crucial tool for managing at scale (and singular machines too), and yet there are still some truly awful examples being posted each day on how to solve what is most times a simple issue which has already been resolved by others.

This needs to change, and change very quickly, otherwise we will be battling inefficient and quite frankly useless code for many, many years to come (look at examples regarding server-side SharePoint code for SharePoint 2007 on either CodePlex or StackOverflow – which can in some cases still be used in SharePoint 2013 & even SharePoint 2016).

Ok, so perhaps I have some biased views, and this is all with very good reason I may add. In 10 years' time my son will be at an age where he will be looking at making his decision on where he wants to take his career (hoping IT). Now I may have a very narrow view, but this is mainly because we are at a point where we can ensure that we leave only the best examples, ensuring that we can be as modular as possible in future so as to allow for the highest reusability across platforms.

In comes the need for the community to step up and work on making sure that we extend only the best modules, which we ensure are hosted on the PowerShell Gallery for ease of installation and discoverability, and that the modules are all developed open source via repos hosted on GitHub. This ensures consistency in their development and really allows the wider community to contribute back to the items that affect them day to day.

In 5 years' time I would love to see multiple communities around the differing technologies that we have in the Microsoft world reaching out to the PowerShell community to help ensure that the code they output is fully reusable and as modular as possible, whilst ensuring that it is as performant as possible. I suppose my reason for thinking this is that I can see the IT Pro/DevOps engineers of the future being fully OS & application independent.

We are at a golden age to do this, now that there is supporting technology out there to enable us – although it could be argued that the technology has been around for well over 10 years, it is now much more accessible to all: items like multi-person video & voice conferencing with applications like Skype, Skype for Business, Cisco WebEx & GoToMeeting; instant messaging across platforms, including on mobile, with Skype for Business, WhatsApp, Telegram, Slack etc; and code sharing platforms including GitHub, GitLab, BitBucket etc.

So firstly I would suggest that if you are a user of PowerShell for applications like SharePoint, Exchange or SQL, to name a few, and you find that you are writing lots of PowerShell to get a task done, then the first thing you should do is search the PowerShell Gallery to see if there is an existing module out there with the functionality that you require. There is the ability to search by function name, but not a detailed search – I have raised an issue on Connect for this, so hopefully it will be fixed soon; you can vote for it at http://bit.ly/PSFindModule
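As a quick pointer (a sketch assuming the PowerShellGet module that ships with WMF5; the command name being searched for is just a placeholder), searching the Gallery from the console looks something like this:

# Search the PowerShell Gallery by module name
Find-Module -Name *SharePoint*

# Or search for modules that export a particular command (placeholder name)
Find-Module -Command Get-SomeTask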

But back to the point of the blog post, and this is where it is likely to get a bit more controversial.

Firstly, we need to go back to basics and understand what we are scripting and why we are scripting it before we determine which route we go down.

No matter how many times we intend to run a script, we should be looking to modularise it into small reusable sections that we call in the correct way for later use. By this I mean that if we aren't ensuring a configuration then we shouldn't be creating DSC resources – we should be creating functions within modules. Examples of this would be anything where there is no real need for Get, Set & Test functionality – this is especially true with services exposing web APIs where we invoke REST methods that don't really return a corresponding object back to the user; some examples of this for me include posting to social media like Twitter & Slack. These should be written as functions within a module, as there is no benefit in my opinion to creating a DSC resource just because you can create one.
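To illustrate the "function within a module" side of that argument, here is a hypothetical sketch – Send-SlackMessage and its parameters are placeholders, and it assumes a Slack incoming webhook URL you would create yourself:

# Hypothetical illustration - a fire-and-forget REST call fits a function in a
# module, not a DSC resource with Get/Set/Test semantics.
function Send-SlackMessage {
    param(
        [Parameter(Mandatory)]
        [string]$WebhookUri,

        [Parameter(Mandatory)]
        [string]$Text
    )
    # Slack incoming webhooks accept a simple JSON payload with a text property
    $body = @{ text = $Text } | ConvertTo-Json
    Invoke-RestMethod -Uri $WebhookUri -Method Post -Body $body -ContentType 'application/json'
}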

So for items that you need to ensure are consistent, like server configurations, you should be making DSC resources; if not, then please don't make a resource, as in my opinion that is making a resource for the sake of making a resource, which is not really useful or beneficial to the community as a whole. It will start to muddy the waters on when to use the functionality that we have available to us, and we need to simplify the way we write the scripts of the future for the better – that means the best-performing scripts, built using reusable blocks that do Just Enough Action with Just Enough Output, which can then be reused with other functions and therefore other technologies.
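For contrast, this is roughly the Get/Set/Test shape a MOF-based DSC resource expects – a bare skeleton only, and only worth writing when there genuinely is configuration state to enforce:

# Sketch of the three functions a MOF-based DSC resource module exports
function Get-TargetResource {
    param([Parameter(Mandatory)][string]$Name)
    # Return the current state as a hashtable
    @{ Name = $Name }
}

function Test-TargetResource {
    param([Parameter(Mandatory)][string]$Name)
    # Return $true when the system already matches the desired state
    $false
}

function Set-TargetResource {
    param([Parameter(Mandatory)][string]$Name)
    # Make the change needed to reach the desired state
}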

 

 

 

 

Published: 03/11/2015 15:15