BCC, Distribution Lists & General Email Etiquette

If you work in a big organisation, or, like me, are on a number of cross-organisation distribution lists or listservs, then it's likely that you have a myriad of email rules that route all emails sent to a specific distribution list into a specific folder. This is good email management practice, as you can then filter your searches to a single folder that should contain only conversations related to that distribution list. It's how I've managed being on something like 40 very chatty distribution lists in recent years.
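
As an illustration, that kind of rule doesn't have to be clicked together in Outlook; it can also be created from PowerShell. The below is only a minimal sketch, assuming the Exchange Online PowerShell module is installed and you are connected to your mailbox; the distribution list address, mailbox and folder names are placeholders, and the target folder is assumed to already exist.

# Route everything sent to a (hypothetical) distribution list into its own folder.
# Assumes Connect-ExchangeOnline has already been run; the mailbox-prefix-and-path
# format for -MoveToFolder may need adjusting for your own tenant.
New-InboxRule -Name 'Route Contoso Announcements DL' `
    -SentTo 'contoso-announcements@contoso.com' `
    -MoveToFolder 'me@contoso.com:\Inbox\DL - Contoso Announcements'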

However, the point of this post is a recurring theme that I am seeing in some of the distribution lists that I am on.

That theme is as below:

Moving DL to BCC

I will say that this is the single most frustrating thing to see in an email that has landed directly in my inbox, because:

  • It should not have landed in my inbox in the first place
  • It's disconnected from the original thread in an organisational sense, making searching less effective.

So much so that I have started to send the following in return (highlighting the "Moving DL to BCC" in the sender's email & highlighting the "this" in the first line).

Hi <sendersName>,

 

Doing this throws the email you just sent into many people's inboxes unnecessarily, as it often bypasses any mail rules that people like me have added to organise our mailboxes, & it will ping on all their devices too.

 

It is frustrating when this happens, and it happens frequently on the many DLs I am on. However, there is a better and more etiquette-friendly way to do what you have just done, and that would be to send not 1 email but 2.

1 to the DL saying you are forking this thread

1 starting the fork of the thread

 

Yes, it takes a small amount of extra time to send 2 emails instead of 1, but doing so makes things better for the x number of people on the DLs.

 

I didn't need this dropping into my inbox & alerting me on all my devices – nor did anyone else that got this pushed into their inboxes. Please think about the disruption that doing this can cause to people's work in future.

 

Regards

Ryan

It’s a really simple thing, but as someone that gets alerts on

  • My Work Laptop’s Desktop
  • My Browser, if I am in SharePoint or anywhere else in Office 365
  • My Phone
  • My Fitbit

That’s a lot of noise & distraction for something that was not critical for me to read, and in many cases, wasn’t even that useful for me to read either.

I also add this line every time this happens more than once a day (often from different senders):

This is the second time I've had this happen today.

I wonder if I'll ever get a day where I change the second time to the tenth time, though I hope this post will stop that being the case.

So please, don't do that in future, & feel free to make use of the draft above to send to others that you see doing it (& point them to this post too).

Azure Advent Calendar

Background

The Azure Advent Calendar is a collaboration in which members of the international #AzureFamily each contribute a video and blog post on a topic of their choice related to the Azure public cloud. It was brought to life by Richard Hooper (aka Pixel Robots) & Gregor Suttie, two recent UK Azure MVPs, who are among the 23 Azure MVPs out of a total of 164 UK MVPs, of which I am one in the Cloud and Datacenter Management category.

It is fair to say that over the course of the 25 days of Advent there are a number of great sessions lined up, with 3 taking place each day, totalling 75 sessions ranging from 30 minutes to 1 hour each and covering a wide range of Azure-specific content.

For more details and direct links to each of the sessions taking place, see the handy table on the Azure Advent Calendar website, or subscribe directly to the YouTube channel. You can also find them via a Twitter search for #AzureAdventCalendar.

My Joint Contribution

This year my contribution is a joint one with Rik Hepworth, Chief Consulting Officer at Black Marble ( Azure MVP | @rikhepworth | Blog ), discussing some of the wider aspects of governance from a more general perspective than just the Azure one, although we will cover some of the components that have been covered in prior sessions of the Advent Calendar. Rik & I are 2 of the 6 Microsoft MVPs at Black Marble, covering areas including Azure, Cloud & Datacenter Management, Developer Technologies, Office Apps & Services & Windows Development.

Between us both we have seen how organisations have adopted the cloud in a number of very different ways, often depending on their size as an organisation and their maturity when it comes to developing, building, testing, deploying and managing their various forms of software.

The video will be available on the Azure Advent Calendar YouTube Channel and the direct link to the video is as below (which you’ll be able to watch sometime today December 24th)

 

We mentioned a number of items in this session, and you can find many of them via the links below:

Azure Governance Docs – https://aka.ms/governancedocs
Azure Cloud Adoption Framework – https://azure.microsoft.com/en-gb/cloud-adoption-framework/
https://azureadventcalendar.com/
Jack Tracey – Azure Governance – https://jacktracey.co.uk/community/azure-advent-calendar-2019/
Wesley Haakman – Azure Blueprints – https://www.wesleyhaakman.org/azure-advent-calendar-azure-blueprints/
Ignite Governance Sessions – https://myignite.techcommunity.microsoft.com/sessions?t=%257B%2522from%2522%253A%25222019-11-03T08%253A00%253A00-05%253A00%2522%252C%2522to%2522%253A%25222019-11-08T19%253A00%253A00-05%253A00%2522%257D&g=%255B%2522on-demand%2522%255D&q=governance&s=%257B%2522name%2522%253A%2522translate.refine.label.sort.relevance%2522%252C%2522type%2522%253A0%257D
Satya Vel Session on Azure Governance – #BRK2021 – https://myignite.techcommunity.microsoft.com/sessions/81583?source=sessions
Rik Hepworth’s Session on Azure Governance – #BRK1035 – https://myignite.techcommunity.microsoft.com/sessions/78682?source=sessions
Azure Arc – #BRK3327 – https://myignite.techcommunity.microsoft.com/sessions/83989?source=sessions
Azure User Groups – https://techielass.com/azure-meetups
Sarah Lean – @techielass – https://twitter.com/TechieLass/

 

Summary

So far this has been a great event for all those involved and for the wider community to gain insight into just how big the Azure cloud really is. I hope that those that watch mine & Rik's contribution take away the importance of having at least some governance in the cloud.

Please spread the word to everyone you know in the Azure space using the hashtags #AzureAdventCalendar & #AzureFamily.

My First Ignite, Speaking and Attending – Done & Dusted – Pt1

This post is going to be all about Microsoft Ignite and my experience as both a 1st time attendee & a 1st time speaker. There's going to be a lot to read, so I highly recommend you go get a brew sorted (tea, coffee, beer, wine etc.) before sitting down to read this post, which at over 2600 words is quite a read. This is just part 1 of 2, focusing on the lead-up to Ignite & not Ignite itself, which will come in Pt2 of this series.

What is Ignite

Microsoft Ignite is Microsoft's flagship conference, which hit Orlando, Florida, USA from November 4th to 8th (with an optional pre-day on the 3rd). As with any big conference, it never really starts on the first day; for those of us in the wider technical communities travelling from all over the world to attend, it can start as early as the week beforehand.

Microsoft Ignite is big (& then some) at over 30 thousand attendees! Many of the partners, attendees & speakers fly in from every single part of the planet, so you can be guaranteed to run into an increasingly diverse population of attendees, though there's certainly room for diversity at conferences like Ignite to grow.

With over 1500 sessions (which you can search for in the Session Catalog), covering almost all things in the world of Microsoft technologies, there's something for almost every techie. However, there are also a number of sessions spread out across the week in the Diversity & Inclusion track, a track I hope many of those reading will watch sessions from, as there were a number of fantastic sessions in it, details of which I will go into a little later on.

In the Beginning

In my case however, it started all the way back in July when the MVP & RD Call for Content opened on July 9th and whilst I had a vague idea after a few days of umm’ing & arr’ing on what I wanted to submit on, I gave myself time to mull over it before deciding I would submit.

Once I had decided I was going to submit, I firmed up what my not 1, not 2 but 5 different session submissions would be, split across a variety of technical and non-technical sessions & over the coming weeks I polished them off before finally submitting them on the night before the finalised extended closing date for submission on August 6th.

Over the course of the following weeks I moved on with day to day life and waited patiently until acceptance emails started to go out on Tuesday August 20th.

Whilst acceptance emails started to go out around 6 PM UK Time (UTC +1 at the time) & lots of people were already announcing they’d been accepted for Ignite, I received the below stating I’d been accepted for 2 sessions 🎉

So 2 of those 5 submitted sessions had been accepted. However, I decided to hold back on announcing that I had been accepted until at least the following day, for a few reasons:

  • I had to come to terms that I had been accepted to speak at Ignite, Microsoft’s biggest Technical Conference.
    • Whilst this seems like a given considering I submitted, it still was a huge shock to be accepted, more so on the topics that I had been accepted for.
    • It’s worth pointing out that submitting a topic is not the same as committing to present said topic, you only really can commit once you’ve been accepted & you have come to terms with being accepted and all that comes with presenting a session. Once you’ve done that you can then confirm your acceptance to present that session. Which is why it’s important that organisers don’t just put you on the line up as soon as they send out the acceptance of your session.
    • However I personally would also have liked a bit more time to mull over session acceptance, as the acceptance email indicated that we had until Friday 23rd August to pull out. I would seriously recommend that event organisers give at least a full weekend in future for speakers to be able to work out all the logistics around the session, including whether it's actually feasible or not, when they aren't dealing with their day to day workloads, so they can properly relax and think things through (or in my case whiteboard it).
  • I had to be confident in my ability to deliver these sessions.
    • Whilst this again seems like a given, I was very much feeling the onset of Imposter Syndrome, because whilst I think I know a fair amount on the accepted sessions, there’s always this pit of doubt within me about my knowledge, something that I find incredibly difficult to shake off.
  • I had to feel comfortable with the idea of presenting at such a big conference
    • Prior to this I had presented at Techorama & the London leg of Ignite the Tour, which in comparison are what I would classify as medium sized conferences.
    • Both events had drained me a little due to their size & I was worried about how this would transpire to an Ignite sized event.
  • I had to feel comfortable attending such a big conference
    • This in itself was the biggest of the factors for me as over the years I’ve started finding larger crowds less and less comfortable to be a part of, so I had to feel comfortable in myself that just being at Ignite would not over stress and over stretch me, at least not too much.
  • How would agreeing to this, and the stresses of putting together sessions, impact me & my mental health in the coming weeks & months?
    • Did I feel I could cope with this on top of my day to day workload?
    • Did I feel that in accepting I would stretch myself in a way that I would be overall comfortable with & not overstretch myself?
    • At the end of the day my own mental health will always come first & whilst I know my own warning signs, I am not afraid to pull out of sessions because of my own mental health. Something I’ve only ever done once.

Once I had mulled over things, I then had a quick chat with my boss and fellow MVP Rik Hepworth about my concerns, mainly to voice them out loud to someone other than myself & gain some additional feedback on them, particularly as he’s a seasoned pro when it comes to the bigger events and I wanted to hear about the experience from someone close to me that I both know & trust. After that conversation I felt more comfortable and confident & decided that yes I would go forward with my sessions. That’s when I then announced I was going to speak, as you can see below.

I had Anna Chu (Community Lead for Microsoft Ignite) respond (I was lucky enough to meet her and have a good long chat at Ignite)

and also Shona Bang (ex-lead of Diversity & Inclusion, now rebranded #HumansOfIT, & lead of the D&I track at this year's Ignite) comment – with Shona giving away what one of my sessions was going to be on (again, I was lucky enough to meet Shona and have a good long chat at Ignite too!)

Looking back at this point, I am really glad that only 2 were accepted. You can read more about the accepted sessions (including direct links to the videos) in my prior post Speaking at Microsoft Ignite, but not about tech. There is a lot of work that goes into a deck, & in a future post I will detail just how much work I put into my decks.

However, in the run-up to Ignite, one of the other sessions I had submitted for Ignite, Making Operational Azure Management a Breeze, was selected on September 27th for a UK-based conference called Evolve, which was happening on Monday October 21st 2019.

Now I had 3 sessions to pull together with only 3 full weeks until Evolve & 5 weeks until Ignite.

Oh well this will be fun, I started to think to myself. #ChallengeAccepted

However, I had also come to terms in the prior few weeks with the fact that my sleep cycle was becoming more worrisome than it had been previously. So I took myself to the doctors and explained the situation in detail (always tell your doctor the full story!), particularly as my sleep hasn't been fantastic for the best part of a decade, thanks to 100+ hour working weeks at McDonald's in my pre-IT career days in the early 201x's (yes, only this current decade) & working 2 jobs (days & nights) into late 2014, and I was prescribed some sleep aid medications. I can safely say that these have helped my sleep no end, & whilst I expect my sleep cycle will take some time (& some additional lifestyle changes) to properly kick back into a more regular rhythm, I can already tell there has been some improvement in it. However, this could be entirely psychological, something I am currently pondering in more detail & intend to discuss with my doctor over the coming months.

Roll forward to the evening of Friday 18th October. After a particularly bad week with sleep (I'd run out of my sleep meds), I get home after what was a reasonably long day and open up my three decks & their corresponding OneNotes to work out how much I had left to do. That's when panic set in. I realised I had much more left to do than I had thought, and had somehow miscalculated how long was left until Evolve, thinking I still had another week left and not just the weekend.

So here comes panic mode, and whilst in panic mode (& whilst I know I shouldn't) I end up telling myself things like:

  • I can’t do this
  • I should pull out of doing this
    • I actually started writing emails to pull out of both Evolve and Ignite. Emails I later deleted!
  • I was stupid to agree to do this
  • I won't enjoy myself doing this
  • I won't agree to do this again

However, all that is happening is that the side of me that thinks I am destined to fail is coming to the surface, & this is common when levels of stress are high and end up inducing panic. Luckily, I know what this looks like and, more importantly, feels like, because I've regularly put myself into that kind of situation and over the years have come to realise ways that help me ride out the panic, with activities like walking to the shops, having a long bath/shower, playing a frame or 5 of snooker (on the table I have at home) & chucking on one of my many calming tracks on Spotify being just a small selection of the things that help me de-escalate the stress of the situation. Heck, finding a toilet and locking the door for 2-35 minutes can be the simplest way to let the situation de-escalate when I am out and about.

Once de-stressed, I revert to my more typical growth mindset of

  • I can and will do this
  • I just need to focus on what I need to do now to succeed in doing this
  • I agreed to do this because I know I can do this
  • I have previously enjoyed doing this, so I will do so again
  • Therefore I will agree to do this again
  • I will likely go through this cycle again, but it’s an important cycle to stretch myself to grow as an individual

 

If I had slept well that week, I wouldn't have had a (in the grand scheme of things) short 2 hour meltdown about it. Thankfully a nice long soak in the bath and a new album on Spotify did wonders and set me back onto the happier path I needed to be on to get shit done. This is just a simple example of what bad sleep can do to you (well, me), making it much easier to slip into less rational thinking than usual.

Rolling on from Friday evening to Sunday morning, the deck for Evolve was completed. Monday came along and I delivered the session at Evolve, which I think went down well, & I was able to crack on with more of my Ignite decks and get myself back on track to where I wanted to be.

At this point there were just 13 days until Ignite, with a fair amount of work travel in between it all, & I still had so much more that I wanted to get into my decks, particularly my Mental Health deck, so I was now looking at my OneNote and moving things from the "must talk about" pile to the "maybe next time" pile so that I could fit it all in.

There’s so much more that I would have liked to talk about but by this point I had already spent many hours in rehearsals, edits, amendments and far too much time in MS Paint too! I couldn’t realistically afford to add in another run of heavy content amendments so started on polishing the session and really getting to know the session flow properly, particularly as this had been the deck I was most concerned about, because whilst I can write and talk about tech, building a deck and talking about a non-technical subject like Mental Health felt 100x harder, perhaps because overall this is a more important topic, so I felt additional stress to “get it right”.

Travel Time

You would think that after a fair amount of travel over the years I would have my routine down to a tee: be 100% ready and sorted out for my travel, all packed the night before with the alarms set as needed to make the hour-long drive to the airport to catch a 10am Saturday morning flight. I, however, crashed on the Friday night when I walked through the door, before I had set any alarms. I had been smart earlier in the week and done extra washes throughout the week, so I was ahead of myself and could easily pack when I woke up at 5am that morning. Rushing around at 5am internally yelling at myself "shit did I pack that", "shit where is that" and "I'm just going to miss the plane" is certainly one way to wake yourself up in a hurry. Then again, so is a can of Red Bull and a cold shower.

Roll on a few hours, it’s past 7am, I have dumped the car off at airport parking, and I’m now sat in Manchester Airport with a pint (of cider) in hand and doing a run through of my Mental Health deck whilst trying to pass the time. I am now at the stage where I am genuinely happy with the deck but need to iron out a few niggles with it, and what better than almost 3 hours to do so at the airport.

But maybe the trusty Surface Book 2 has other ideas

Luckily it has lasted beyond Ignite, but it's not far off needing to be replaced now, I think, which is a shame as I really like the keyboard on this device; it's so easy to type on.

However, travelling to Orlando was nice and simple with a direct flight, and I have to call out Virgin Atlantic for such a lovely flight. It certainly was one of the easier and more comfortable flights that I have done over the years.

Once I got out at Orlando it was pretty smooth and simple getting through and out of airport security. I registered for Ignite at the airport and then went about arranging an Uber to my hotel. Once I had dumped my bags off, I went off in search of the convention centre, which was a good 40 minute walk from my hotel and a good way to stretch the legs after the long flight.

Anyway, I think that is enough for now, and I hope you will join me in part 2 of this post, where I will go into a day by day recap of what the event was like, which I intend to try and get out early next week.

Speaking at Evolve Conference

You can tell that the end of year conference season is coming up as we’ve just wrapped up PSDayUK, which was a resounding success!

Then this week there are a number of events going on, and with some of the Black Marble team at Future Decoded and more of the team at Techorama Netherlands, the office has seemed somewhat quiet.

Then next Saturday (12th October) there is DDD @ Microsoft Reading (which I am attending) and then in just a short few weeks is Evolve Conference, where I will be speaking on Making Operational Azure Management a breeze.

So with just under 3 weeks to prep for this session, alongside preparations for my 2 sessions that were selected for Ignite (as blogged about in Speaking at Microsoft Ignite, but not about tech), there's a lot to do and not a lot of time to do it. Add the additional element of some work for Hacktoberfest and I can see this being a VERY busy month, and that's without factoring in the working week.

But I do love it and I will be looking forward to (maybe) a quiet last 6 weeks of the year.

Time for a Lick of Paint

After a good few years with the last rather dated look of this blog as per below

I felt that it was time for a bit of an update.

Though this also coincided with some software updates that I needed to implement, so I thought I'd do a rolling update of theme, software and plugins, and luckily most of it went OK (this time round).

Let me know what you think in the comments

Speaking at TechoramaBE

Looking through my emails & Twitter on Sunday, I was pleasantly surprised to see these two items pop up

Followed closely with the following email

TechoramaBE

From everything I have heard about Techorama, it looks to be a good conference and I am certainly looking forward to visiting Belgium 🙂

This is the first of the many submissions that I have made that I can accept and attend. I was also accepted for DevOps Pro Europe in Lithuania (another on the to-visit list), but that unfortunately clashes with the MVP Summit, so I've had to turn it down 🙁

Looking forward to getting back into the swing of technical presentations yet again.

Here's to a fun-filled 2019.

For more info on where I am presenting / attending you can see Find Me At & you can see Where I’ve Presented or Where have I been for a historic overview

 

PSDay 2019 – Call for Speakers

As promised in my last post – I wanted to let you know that the Call for Speakers for PSDay 2019, to be held on September 30th, at etcVenues in Birmingham, is now open.

The Call for Speakers form is located over on Sessionize. We are looking for sessions on topics as listed in this post from last year's PSDay Call for Speakers, of 60 minutes in length for the main conference sessions. However, unlike last year we are also accepting submissions for the post-conference Workshop Training day that we intend to put on on Tuesday 1st October 2019, also at etcVenues. More details on the Workshop Training day will come in a following post.

We currently have a cut-off date for sessions of the beginning of May 1st (UTC), with a view to announcing the speaker schedule around the week of PSConfEU. I would also suggest that the abstract you submit does not need to be perfect, at least on initial submission, as we have allowed for sessions to be amended after initial submission.

Part of the reason for this is that we will come back to submitted sessions in the early weeks of May and start to reach out to speakers to either confirm or polish off their abstract in those coming weeks, prior to us announcing the full line up of speakers by mid June.

We are also currently in the process of working out what sponsorship packages for the event will look like. Until we release a further post on what these look like, if your organisation would be interested in sponsoring PSDay then please reach out to me in this interim period whilst we iron out what the sponsorship packages will look like.

I am looking forward to PSDay & I am looking forward to seeing you there as either an attendee, a speaker or a sponsor.

If you have any questions at all please reach out and I’d be happy to answer.

Update

Some things to note, as I am expecting questions on this to crop up: we currently do not think that we will be able to help with speaker costs; however, this may change. So if you submit something and may be looking for assistance with costs, please add this as an additional note so that we at least have some awareness of it and can try to plan to accommodate it where possible. This is far from a guarantee that we will be able to assist, but it allows us to gauge how things pan out and whether we may need to adjust costs if/when we feel we are able to support those speakers that need additional assistance. I envision that this will be looked at on a case by case basis alongside the session submissions.

Tickets should go on sale in stages in the upcoming few weeks – more on this once we have managed to have a further organisers discussion in the next week or so and I will blog about it when they go live.

Update 2

Regarding travel into the UK after Brexit – this is something that I have not looked into, and therefore I would suggest to all that may ask about it to keep an eye on what happens and consult your own government's guidance, as this will likely be the most up to date and have the right information as to how it may or may not end up affecting you.

Info Share about PSDayUK 2018 – Call for Speakers, Ticket availability & Upcoming Call for Sponsors.

This year, we as the collective behind the UK PowerShell & DevOps User Groups are running the second PSDayUK event, the only conference totally dedicated to PowerShell here in the UK, which will be held on October 10th at CodeNode, London. You can find more info at PSDay.UK, including the ability to purchase tickets & the eventual schedule once published in the upcoming weeks.

We will be regularly releasing information about PSDay as we approach the event via various social media channels, including our Twitter account @psdayuk, which I would highly recommend you follow if you want to be kept in the loop on what is coming to PSDay.

To give a little bit of background, PSDay is the conference brand of the UK PowerShell & DevOps User Groups (for more information on the groups, please see PowerShell.org.uk). PSDay is currently planned to be an annual event, much like the bigger PSConf EU, PSConf Asia and US PowerShell & DevOps Summit events. Whilst each of those bigger events are multi-day events, PSDay at present is a single-day event, although the format may change in future; this is something the organising team are keeping in mind for future events.

With this in mind, this year we are running PSDay as a dual parallel-track conference. We have a solid idea of what we intend the tracks to contain to cater for all skill sets, based on what we've learnt from running consistent monthly user groups in London, and whilst there is a HUGE variety of topics that could be delved into with PowerShell, we have seen a recurring theme with the user group over recent months.

This means that we have been more selective about the sort of topics we are looking for this year, with a view to having topics along the following lines:

Track Name: The many components of the PowerShell Language
Track Focus: All things related to the PowerShell Language
Suggested topics: Debugging, Classes, Remoting, Performance, WMI/CIM, Pester, PSScriptAnalyzer, DSC, Workflow, Using .NET in PowerShell & anything centred around the core PowerShell language that can be useful for all skill sets.

Track Name: Using PowerShell as the Glue of Automation
Track Focus: All things Automation
Suggested topics: Automating any technology from any device installed anywhere: Azure, AWS, GCloud, Office 365, Microsoft Graph, VSTS, GitHub, PowerShell Gallery, SharePoint, Exchange, SQL Server & more

 

The idea behind the first track, 'The many components of the PowerShell Language', is that those that are new to PowerShell, and even those of us that have been using PowerShell for years, can come to this track and take away a wide variety of knowledge about the core parts of the PowerShell language from a more general-use perspective. Attendees will be able to take away and expand on what they learn in this track in their own time; it is expected to be the more generalist track, where the skills learned can then be taken and used across an enormous number of technologies.

 

The idea behind the second track, 'Using PowerShell as the Glue of Automation', is to be much more centred around using PowerShell with specific technologies. It is more likely to be the track for those that want the more technical content: people that are either well into their DevOps journeys & already using many differing DevOps practices, perhaps looking to further expand their skill set, or those looking at replacing existing technologies or embedding additional ones within their organisations.

The Call for Speakers form is located at PSDay Session Submissions. We are looking for sessions on the topics listed above, of 60 minutes in length. We currently have a cut-off date of July 31st for sessions, and I would highly suggest that any potential sessions you may want to submit be submitted quickly. I would also suggest that the abstract you submit does not need to be perfect, but it does need to give us as organisers the ability to pick and choose topics from all the submissions. The reason for this is that we will come back to chosen speakers, based on topic & technology, to confirm/polish off their abstract in the early weeks of August, prior to publishing the schedule by the beginning of September.

We are also currently in the process of working out what sponsorship packages for the event will look like. Until we release a further post on what these look like, if your organisation would be interested in sponsoring PSDay then please reach out to me in this interim period whilst we iron out the details. We have already been approached by a few sponsors, so this will be coming along very soon.

I am looking forward to PSDay & I am looking forward to seeing you there as either an attendee, a speaker or a sponsor.

If you have any questions at all please reach out and I'd be happy to answer.

 

 

MVP award renewal time, my Renewal & New recent MVPs

So this year the MVP program had a bit of a change to how it handled the renewal and awarding process, a change that I fully supported, as it allows the program to become, in my own opinion, more agile and therefore recognise more and more members of this amazing community on a more reasonable, and sustainable, cycle.

My only gripe, and this is so minor that I can look at it and laugh, as it only really affected me in a personal manner, was that the change to the renewal cycle meant that this year I got a free extension of 3 months from April till July (yay), but I lost receiving the award as an early birthday present from Microsoft like I had for my 26th last year.

I instead got a much better reward from Microsoft in the longer and grander scheme of things, and that was seeing 6 of the 8 people I nominated also being awarded and recognised in the last year. I continue to notice and nominate others as I see their impact on the communities that I'm a part of.
I’m looking forward to the rest of this year and a number of interesting user groups and conferences in the near future, including a first with a joint Presentation at SQL Saturday Manchester with my buddy Mr “SQLDBAWithABeard” & newly minted MVP, Rob Sewell. 

If you haven’t already signed up for the free SQL Saturday Manchester Event then please do at http://www.sqlsaturday.com/645/EventHome.aspx

Once SQL Sat Manchester has passed I will be starting to get myself back to blogging more regularly again but until then you can catch me on LinkedIn or Twitter.

*EDIT – I seemingly missed that I was Renewed as an MVP – happy to be part of the growing community and looking forward to the year ahead! 

5 out of 6 in 6 Days = A busy week

This week has been a busy week for me with the SQLRelay and SQLSat Munich events. It has been full of fun especially seeing as for SQL Relay we had the fun bus for travels between the different venues all across the UK.

The week started off as most other weeks do, with me at home in Derby on Monday morning. This was followed by me jumping on the train to Birmingham around 11am Monday morning for the first leg of the SQL Relay tour, where I presented a completely new and fully non-technical session, something that is a little bit outside my comfort zone of the typically more heavily technically focused sessions that I'm used to delivering.

This was a session that I've put together based upon my own career experiences, about the need to really spend time on developing and taking ownership of your career. It was pointed out on a few occasions throughout the week that at 26 I've still yet to really "have" a career, and although in some ways that can be seen as very true, there is also the other side of the coin: I've had the opportunity to see first hand with other colleagues how not owning your career can lead you down a path that doesn't leave you with a role that you enjoy and feel sustained and secure in.

 

I really do feel that it is essential that you keep up with your training and take control of your own career – after all, it's your career, and how well it goes is down to you as an individual and how determined you are to achieve the salary and work-life balance that you wish to have. Troy Hunt has blogged about his experience of making his job redundant at https://www.troyhunt.com/how-i-optimised-my-life-to-make-my-job/ and this is something that scares most people when they think about it at any real depth. I would also recommend reading the follow-up post that Troy has done on this recently as well, https://www.troyhunt.com/7-years-of-blogging-and-a-lifetime-later/ as both of these are similar to my way of thinking around work and life balance.

 

Not only did I do a new presentation but I have also been busy enjoying being on the SQL Relay FunBus as well and although I knew a number of the other fellow travellers it was good to be able to spend some more concentrated time with them. The SQL Relay is a great idea and I’m already looking forward to the ‘tour de UK’ again next year.

 

To top the week off I have also been at SQLSat Munich this weekend, where there have been even more fun times with the extended #SQLFamily, which has been great. I seem to have a thing for Munich and the first real weekend of October, as I was also here last year for SPSMunich, which you can read about in my recap post.

 

I however am looking forward to getting back home after pretty much a week on the road and getting ahead with some of my prep for PSConfAsia in just under 2 weeks.

PSConfEU call for Speakers is now Open!

Proud to announce that Speaker Submissions are being accepted for PSConfEU 2017 – you can submit your session proposals via the following form

A few things to note about this year’s submission and selection process

 

  • We have a hard cut-off date of the end of Sunday December 1st – submissions must be in by this time or they will not be accepted.
  • This is because we will have a selection committee gathering during the week commencing Monday 2nd December.
  • The members of the selection committee will all vote for their favourite sessions.
  • This will begin to form a preliminary schedule.
  • We will then send out confirmation emails to the selected speakers.
  • By December 31st we expect to have confirmation from all speakers and the schedule ready to launch, hopefully posted by Jan 1st.

 

Once again I am very proud to have the opportunity to be working alongside fellow MVP Tobias Weltner and the rest of the organisation team to bring you the 2017 flavour of PSConfEU, and I will look forward to seeing you at PSConfEU 2017.

My Company Re-Digitise website gets a much needed lick of paint

Also over the course of this weekend I have rebuilt the Re-Digitise site from the shell that I threw together on Github Pages to a more modern Site using WordPress as the Backend

 

Please take a few minutes to have a look at the site at https://www.re-digitise.org and get in contact with me if you feel that Re-Digitise could help your company out at all.

 

As per my previous blog post, the site is fully HTTPS where it wasn't before, although I am yet to get it set up with Cloudflare for the CDN side of things.

 

Now to start building some promo materials for Re-Digitise 🙂

Minor Blog Update – now HTTPS by default!

This Sunday I set out to force my blog, hosted on Azure, to be HTTPS by default. I mainly made use of the following article by Troy Hunt on the underlying implementation, which makes use of Cloudflare, but I've also decided to get it set up ready in case I want to move away from Cloudflare to Azure CDN in future.

 

It really isn't too difficult to do this, especially if you follow Troy's post. It is something that can be completed in a matter of hours, and best of all, enforcing HTTPS is a free service from Cloudflare and you get the power of a CDN built in too.

 

This means that the core items of my blog site load much quicker than they used to which is good for everyone that visits in future.

 

There are a few little amendments that you need to make on the Azure website side in the Web.config file, but with it being an addition as simple as the below, I'm sure it won't be something that trips people up in future.

 

<rule name="Force HTTPS" enabled="true">
  <match url="(.*)" ignoreCase="false" />
  <conditions>
    <add input="{HTTPS}" pattern="off" />
  </conditions>
  <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" appendQueryString="true" redirectType="Permanent" />
</rule>

 

You also need to get your own SSL certificate if you are using your own domain name, as by default there is only a wildcard SSL cert for the azurewebsites.net domain. I decided to go with DigiCert – http://digicert.com/ – for this, as opposed to the Cloudflare cert that you could go with.
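
For reference, once you have the PFX for your custom domain, it can be uploaded and bound to the Web App from PowerShell rather than through the portal. This is only a minimal sketch, assuming the AzureRM module and an already-logged-in session; the resource group, app name, hostname, certificate path and password below are placeholders, not the values I used.

# Upload a PFX and bind it to a custom hostname on an Azure Web App (SNI based)
# All names and paths here are placeholders for illustration only
New-AzureRmWebAppSSLBinding -ResourceGroupName 'MyBlogRG' `
    -WebAppName 'myblogapp' `
    -Name 'blog.example.com' `
    -CertificateFilePath 'C:\Certs\blog-example-com.pfx' `
    -CertificatePassword 'PfxPasswordHere' `
    -SslState SniEnabled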

 

Hopefully now my blog will load a little quicker for you all Smile

Awarded the MVP Award – What this means to me and the future for the community

The MVP Award is defined by Microsoft as the below

Microsoft Most Valuable Professionals, or MVPs, are community leaders who’ve demonstrated an exemplary commitment to helping others get the most out of their experience with Microsoft technologies. They share their exceptional passion, real-world knowledge, and technical expertise with the community and with Microsoft.

This means that within the different areas of the Microsoft Stack there are those out there that really believe that the world can be a better place when we come together as a united front and share the knowledge that we have.

This can be knowledge that we have gained through personal experience of working with the products that we find the most interesting and beneficial to our personal & professional lives or though being there as a point of call for other members of the community to reach out to.

One thing about the MVP Program that has always struck me as amazing is the willingness of the MVPs to do what they can to help you, even if it doesn't immediately help them achieve anything, often giving away a decent-sized proportion of their own time to do so. Reflecting on receiving this award, over the last year I've been doing the same, although completely unaware that I had been doing so.

I have attended a number of different events in the last year (for more details check out the Where I Have Been page) and have met a tremendous number of amazing people at all these events. It was the framework of the SharePoint & SQL user groups within the UK that led me to start thinking about reviving the PowerShell user groups, which I have blogged about in this post, and I have enjoyed every minute of it.

The future for the UK PowerShell User Groups already looked good; however, with being awarded MVP last week, the connections I will make from being part of the UK MVPs will hopefully allow the user groups to grow in the coming months/years, so expect news of new user groups forming across the UK in the coming months.

To help the groups grow, I'll be putting together an "Organisers Pack" which will contain useful information and a collection of the tools, contacts and general tips required to help those interested in running a local group get it off the ground – however, if in doubt, get in contact with me.

 

However, there is another aspect to receiving the MVP Award that I want to touch on briefly. As part of the MVP Program, the MVPs get the opportunity to help out in more community focused events, some run by Microsoft, others run by the community and others run by non-profit organisations or the education sector. Giving back to my immediate communities is always going to be high up on my list of priorities; however, I am really looking forward to working with some of the bigger and more personally touching social opportunities over the next year.

 

This does mean that my calendar will be much busier but for me the end result is always going to be worth it.

Finally – A small shoutout to those that have supported me over the years and especially the last year and although I will not name anyone in particular, I’m sure that those people already know who they are!

2016 – 1 Quarter Down and 3 more to go and the Fun has only just begun!

Fooled Ya! Today I became an MVP!

 

Well only if you read this post

MVP2016

It is an exceptional honour to have been awarded the MVP for Cloud and Datacenter Management, and to me this kinda feels like an early birthday present from Microsoft (my birthday is on Monday).

This isn't something that I ever expected to achieve; however, it is recognition from Microsoft themselves of the work that I have previously done for the community.

I only started off down the community path last year, and in that time I have made some amazing friends and met a number of other MVPs along the way.

For the remainder of 2016 I have a lot planned to help further enhance the community and hopefully break down some of the barriers between the IT Pro world and the development world that PowerShell has found itself right in the middle of, to make this technology more accessible to all that need to use it.

With that in mind over the next few months there will be some further announcements about Get-PSUGUK – the UK PowerShell Community and its evolution.

Thanks to the friends I've made in the MVP community, it has been agreed with Chris Testa-O'Neill (MVP, Data Platform) that there will be a dedicated PowerShell track at SQL Saturday Manchester this year. This will consist of mainly introductory sessions for those that have no/little PowerShell experience, but there will also be some sessions on using PowerShell with a SQL focus. This is an amazing FREE event, and it is as much an honour for me to be working on that as it is to receive the MVP Award – so if you're interested in attending, check out http://www.sqlsaturday.com/543 – announcements on sessions will be coming in the coming months.

Stay tuned for more details in future and as always – Keep Learning, Networking & Finishing what you Start.

 

Now for the weekend of celebrations to begin 🙂

Thank you Microsoft and an even bigger thanks to you – the people reading this Post, keep doing what you do and helping make the community as great as it is.

Invoking PSScriptAnalyzer in Pester Tests for each Rule

This is a quick walkthrough on how you can get output from PSScriptAnalyzer rules in your Pester tests.

So you'll need the following (a quick install sketch follows the list):

  • Pester ( Version 3.4.0 or above )
  • PSScriptAnalyzer ( Version 1.4.0 or above )
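
Both modules can be pulled straight from the PowerShell Gallery; a quick sketch, assuming PowerShellGet is available on the machine:

# Install the two modules used in the examples below for the current user
Install-Module -Name Pester, PSScriptAnalyzer -Scope CurrentUser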

Please note this is shown running on PowerShell  v5 as part of Windows 10 Build 14295 – results may vary on other PowerShell Versions

Given the way we want to work, we may have new ScriptAnalyzer rules in the near future (a new version, additional community additions, your own custom ScriptAnalyzer rules etc.), and we want to ensure that we test against them all without having to change much of the below code, so we enumerate the rules dynamically within our Context block.

 

So our example code in our Pester Test would look like this

#Script#PesterScriptAnalzyerExample#

$Here = Split-Path -Parent $MyInvocation.MyCommand.Path

$Scripts = Get-ChildItem "$Here\" -Filter '*.ps1' -Recurse | Where-Object { $_.Name -NotMatch 'Tests.ps1' }

$Modules = Get-ChildItem "$Here\" -Filter '*.psm1' -Recurse

$Rules = Get-ScriptAnalyzerRule

Describe 'Testing all Modules in this Repo to be correctly formatted' {

    foreach ($module in $Modules) {

        Context "Testing Module - $($module.BaseName) for Standard Processing" {

            foreach ($rule in $Rules) {

                It "passes the PSScriptAnalyzer Rule $rule" {
                    (Invoke-ScriptAnalyzer -Path $module.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                }
            }
        }
    }
}

Describe 'Testing all Scripts in this Repo to be correctly formatted' {

    foreach ($script in $Scripts) {

        Context "Testing Script - $($script.BaseName) for Standard Processing" {

            foreach ($rule in $Rules) {

                It "passes the PSScriptAnalyzer Rule $rule" {
                    (Invoke-ScriptAnalyzer -Path $script.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                }
            }
        }
    }
}

Describe ‘Testing all Scripts in this Repo to be be correctly formatted’ {

    foreach ($Script in $scripts) {

        Context “Testing Module  – $($script.BaseName) for Standard Processing” {

            foreach ($rule in $rules) {

                It “passes the PSScriptAnalyzer Rule $rule {

                    (Invoke-ScriptAnalyzer -Path $script.FullName -IncludeRule $rule.RuleName ).Count | Should Be 0

                }

            }

        }

    }

}

And the result that we would get would be the below

 

PS C:\Users\Ryan\OneDrive\GitHub\kilasuit\Scripts-WIP\PesterScriptAnalzyerExample> .\PesterScriptAnalzyerExample.ps1

Describing Testing all Modules in this Repo to be correctly formatted

Describing Testing all Scripts in this Repo to be correctly formatted

   Context Testing Script - PesterScriptAnalzyerExample for Standard Processing

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingCmdletAliases 233ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidDefaultValueSwitchParameter 124ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingEmptyCatchBlock 134ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidGlobalVars 87ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidInvokingEmptyMembers 104ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidNullOrEmptyHelpMessageAttribute 70ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingPositionalParameters 879ms

    [+] passes the PSScriptAnalyzer Rule PSReservedCmdletChar 75ms

    [+] passes the PSScriptAnalyzer Rule PSReservedParams 81ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidShouldContinueWithoutForce 85ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingDeprecatedManifestFields 117ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidDefaultValueForMandatoryParameter 123ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingUserNameAndPassWordParams 95ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingComputerNameHardcoded 113ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingConvertToSecureStringWithPlainText 98ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingInvokeExpression 75ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingPlainTextForPassword 103ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingWMICmdlet 138ms

    [+] passes the PSScriptAnalyzer Rule PSAvoidUsingWriteHost 91ms

    [+] passes the PSScriptAnalyzer Rule PSMisleadingBacktick 100ms

    [+] passes the PSScriptAnalyzer Rule PSUseBOMForUnicodeEncodedFile 100ms

    [+] passes the PSScriptAnalyzer Rule PSUseToExportFieldsInManifest 87ms

    [+] passes the PSScriptAnalyzer Rule PSUseOutputTypeCorrectly 128ms

    [+] passes the PSScriptAnalyzer Rule PSMissingModuleManifestField 84ms

    [+] passes the PSScriptAnalyzer Rule PSPossibleIncorrectComparisonWithNull 99ms

    [+] passes the PSScriptAnalyzer Rule PSProvideCommentHelp 98ms

    [+] passes the PSScriptAnalyzer Rule PSUseApprovedVerbs 75ms

    [+] passes the PSScriptAnalyzer Rule PSUseCmdletCorrectly 867ms

    [+] passes the PSScriptAnalyzer Rule PSUseDeclaredVarsMoreThanAssigments 82ms

    [+] passes the PSScriptAnalyzer Rule PSUsePSCredentialType 91ms

    [+] passes the PSScriptAnalyzer Rule PSShouldProcess 160ms

    [+] passes the PSScriptAnalyzer Rule PSUseShouldProcessForStateChangingFunctions 86ms

    [+] passes the PSScriptAnalyzer Rule PSUseSingularNouns 177ms

    [+] passes the PSScriptAnalyzer Rule PSUseUTF8EncodingForHelpFile 176ms

    [+] passes the PSScriptAnalyzer Rule PSDSCDscTestsPresent 98ms

    [+] passes the PSScriptAnalyzer Rule PSDSCDscExamplesPresent 102ms

    [+] passes the PSScriptAnalyzer Rule PSDSCUseVerboseMessageInDSCResource 81ms

    [+] passes the PSScriptAnalyzer Rule PSDSCUseIdenticalMandatoryParametersForDSC 110ms

    [+] passes the PSScriptAnalyzer Rule PSDSCUseIdenticalParametersForDSC 74ms

    [+] passes the PSScriptAnalyzer Rule PSDSCStandardDSCFunctionsInResource 122ms

    [+] passes the PSScriptAnalyzer Rule PSDSCReturnCorrectTypesForDSCFunctions 101ms

 

This allows you to see from your test output whether each rule passes or fails, and as shown it can be used for both scripts and modules.
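
When a rule does fail, the Pester assertion only tells you that the count wasn't 0, so it can help to run that rule on its own to see the full diagnostic records. A small sketch (the script path and rule name here are just examples):

# Show the detail behind a single rule for a single script
Invoke-ScriptAnalyzer -Path .\MyScript.ps1 -IncludeRule PSAvoidUsingWriteHost |
    Format-List RuleName, Severity, Line, Message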

The example also shows how you can get Pester to test your Pester tests 😉

This example is being added into ISE_Cew (see post) in the next feature release (some point next week), though you can just copy and paste it from this blog post as well, thanks to a PowerShell ISE add-on called CopytoHtml by Gary Lapointe, which you can find out more about and download at http://blog.falchionconsulting.com/index.php/2012/10/Windows-PowerShell-V3-ISE-Copy-As-HTML-Add-On/

 

Please note that although the above works fine, I don't see the point in running a Describe block if the tests within it won't run, so I'm adding what I think is the better version below – this will only run the Describe blocks if there are any scripts or modules to test.

#Script#PesterScriptAnalzyerExample#

$Here = Split-Path -Parent $MyInvocation.MyCommand.Path

$Scripts = Get-ChildItem "$Here\" -Filter '*.ps1' -Recurse | Where-Object { $_.Name -NotMatch 'Tests.ps1' }

$Modules = Get-ChildItem "$Here\" -Filter '*.psm1' -Recurse

$Rules = Get-ScriptAnalyzerRule

if ($Modules.Count -gt 0) {

    Describe 'Testing all Modules in this Repo to be correctly formatted' {

        foreach ($module in $Modules) {

            Context "Testing Module - $($module.BaseName) for Standard Processing" {

                foreach ($rule in $Rules) {

                    It "passes the PSScriptAnalyzer Rule $rule" {
                        (Invoke-ScriptAnalyzer -Path $module.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                    }
                }
            }
        }
    }
}

if ($Scripts.Count -gt 0) {

    Describe 'Testing all Scripts in this Repo to be correctly formatted' {

        foreach ($script in $Scripts) {

            Context "Testing Script - $($script.BaseName) for Standard Processing" {

                foreach ($rule in $Rules) {

                    It "passes the PSScriptAnalyzer Rule $rule" {
                        (Invoke-ScriptAnalyzer -Path $script.FullName -IncludeRule $rule.RuleName).Count | Should Be 0
                    }
                }
            }
        }
    }
}

Updated! Quick Win – Install PowerShell Package Management on systems running PowerShell v3 / v4

**Update 9th March 2016: The PowerShell Team released an updated version of the PackageManagement modules today. I've updated the script accordingly, and it will install the latest PackageManagement modules for you with a little verbose output.

Updated Microsoft blog is at https://blogs.msdn.microsoft.com/powershell/2016/03/08/package-management-preview-march-2016-for-powershell-4-3-is-now-available/ **

 

This is a very, very quick post about the latest feature being made available downlevel from PowerShell v5.

Microsoft have released PackageManagement (formerly OneGet), which is now available for PowerShell v3 & v4, as detailed in this link: http://blogs.msdn.com/b/powershell/archive/2015/10/09/package-management-preview-for-powershell-4-amp-3-is-now-available.aspx

That's right, the ability to pull directly from the PowerShell Gallery – but you need to install the PackageManagement release, which I've scripted for you here.

And what better way than a simple one-liner to grab and run the script:

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/PSPackManInstaller')

And if you want to look at the script, the direct link is http://bit.ly/PSPackManInstaller – this takes you to the RAW version of the file on GitHub, so it will not download or execute, but it will allow you to read it.
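
Once PackageManagement is installed on those downlevel versions, the PowerShellGet cmdlets let you pull modules straight from the PowerShell Gallery. A quick example of what that then enables (Pester here is just an example module):

# Search the Gallery for a module, then install it for the current user
Find-Module -Name Pester
Install-Module -Name Pester -Scope CurrentUser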

Hope this is useful for you

PS credit goes to @lee_holmes for the idea from following tweet https://twitter.com/lee_holmes/status/318799702869041153

Updated! Quick Win – Install-WMF5 (again)

 

** UPDATE 25/12/2015** Due to WMF5 Install issues the InstallWMF5.ps1 Script has been removed from GitHub until the PowerShell Product Team re-release the WMF5 installers. Once re-released I will re-release the InstallWMF5.ps1 script **

 

**UPDATE 24/02/2016** WMF5 was re-released today and the below scripts should still work**

 

This is a very very quick post about installing WMF5 on Windows 8.1 or Server 2012 / Server 2012 R2 via this function script I created

And what better way than a simple one-liner to grab and run the script:

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/InstallWMF5')

And if you want to look at the script, the direct link is http://bit.ly/InstallWMF5
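
Once the install and the reboot it requires have completed, a quick way to confirm that WMF5 is in place is to check the engine version:

# Should report a 5.0.x version after the post-install reboot
$PSVersionTable.PSVersion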

Hope this is useful for you

PS credit (again) goes to @lee_holmes for the idea from following tweet https://twitter.com/lee_holmes/status/318799702869041153

PPS – There is an issue with running this on Windows 7 / Server 2008R2 machines due to the need for WMF4 to be installed.

I am working on this but this is low priority for me as really we shouldn’t be deploying Server 2008 or Windows 7 Systems any more

PPPS – you will see a theme here – I am intending to build functions like this and release them as a full module in the PowerShell Gallery to help automate these tasks – so look out for some more like this in future.

#PSTweetChat – The EU Edition


If you have been involved in the #PSTweetChat events that have been running with Adam Bertram (@adbertram) & Jeffery Hicks (@JeffHicks) and a number of others, then you would be aware of just how awesome these 1-hour open discussion sessions truly are.

A number of PowerShell questions get asked and answered by members of the PowerShell community worldwide, so these chats can become a valuable resource for getting the right answer to an issue quickly, or even just for learning more about the people that make up this awesome community and what they are currently up to.

After a discussion about this with Adam Bertram & Jeffery Hicks on a previous #PSTweetChat, I said that I would co-ordinate a #PSTweetChat that was more EU-time friendly. Well, I am announcing that the EU #PSTweetChat will run monthly on the 3rd Friday of the month, starting on February 19th at 10am UTC 0.

This will be the case for February & March, and then from April (due to the clocks changing) we will move to 10am UTC +1.

So the dates will be (mark them in your calendar)

  • February 19th – 10am UTC 0
  • March 18th – 10am UTC 0
  • April 15th – 10am UTC +1
  • May 20th – 10am UTC +1
  • June 17th – 10am UTC +1
  • July 15th – 10am UTC +1
  • August 19th – 10am UTC +1
  • September 16th – 10am UTC +1
  • October 21st – 10am UTC +1
  • November 18th – 10am UTC 0
  • December 16th – 10am UTC 0

I will look forward to the future #PSTweetChat conversations.

Published: 27/01/2016 15:14


Get-PSUGUK – Call for Speakers

The UK PowerShell User Groups (Get-PSUGUK) are undergoing an expansion, with new User Groups springing up across the UK over the upcoming months.

If you have been able to attend any of the previous events (Manchester & London) then you will know that I’m a big advocate for making a real community out of the User Group Meet ups – one where there is the opportunity for those from all differing IT backgrounds to rise up and present a topic to their local User Group.

With the number of different PowerShell-related topics available, there should be no shortage of ideas, and there will be room for a variety of formats, including short 15-minute lightning presentations, 45-minute presentations and even the possibility of a full-evening presentation.

With this in mind, we are putting forward a Call for Speakers for the year ahead. If you are interested in presenting a topic, there is an Excel survey that you can fill in at http://1drv.ms/1OVuqul – please note that we are not currently looking at sessions delivered remotely.

My fellow organisers, Corey Burke (@cburke007), Iain Brighton (@IainBrighton) & Richard Siddaway (@RSiddaway), and I look forward to seeing you at future User Group events, and we would like to invite you to follow @GetPSUGUK on Twitter for updates on future PowerShell User Group events.

To Sign up for the Manchester User Group on Feb 1st please follow this link – http://www.eventbrite.co.uk/e/get-psuguk-manchester-february-tickets-20117867082

To Sign up for the London User Group on Feb 4th please follow this link – http://www.eventbrite.co.uk/e/get-psuguk-february-london-tickets-20727283864

To see future events (including new cities as they occur) please bookmark this link – http://get-psuguk.eventbrite.com/

 

My Workflow for Using Git with Github – pt3

So this is Part 3 of a series of Blog Posts on my (currently ever changing) Workflow with Git, Github & PowerShell.

Hopefully you have had a chance to look at the previous posts in this series; if not, they are:

Part 1

Part 2

However, for this post we will be concentrating on Script & Module Creation and how we can make the overall experience more efficient with the PSISE_Addons module that I’m releasing on Github https://github.com/kilasuit/ISE_Cew

We will cover the following items today

  • Use of PSDrives for the functions & why you should use them in this case
  • Use of Git for the source control in this module – Simple and hopefully clear to follow and not too in depth
  • The Functions used in this module
  • Creating compliant PSD1 files for the PowerShell Gallery – Because it’s annoying to have to do this manually and that’s why we automate right – Again added in this Module is an example!
  • Creating some basic Pester Tests – again without even thinking about it as I am giving this to you as part of the ISE_Cew Module!

 

So firstly – using PSDrives within the functions, and why to use them.

PSDrives are a good and simple way of having a location that you can reach from within PowerShell (they only exist inside your PowerShell session), and they can use a variety of different providers – FileSystem, ActiveDirectory etc.

We will be using the FileSystem Provider in this example for our functions.

So I begin with a few PSDrives created in my PowerShell profile, as you can see below. I use one profile and wrap ISE-only functions – like the three I will be showing you today – in an if statement that checks whether the host is the PowerShell ISE.

[Screenshot: my PowerShell profile creating the PSDrives, with the ISE-only section wrapped in an if statement]

As you can see, I have a PSDrive for each of the following: OneDrive, GitHub, Scripts, Scripts-WIP, Modules & Modules-WIP.

The important bit here is that all my GitHub repos are actually stored in my personal OneDrive, as you can see from the above image – this means that I can switch between devices very quickly once things are saved 😉. It's probably worth pointing out that this could just as easily be your OneDrive for Business location, or a shared drive if you are in an organisation that uses home-drive locations. The possibilities are endless – limited only by your imagination.
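To give a flavour, a minimal sketch of that part of my profile might look like the below (the drive names match the ones above, but the root paths are assumptions – point them at your own locations):

# FileSystem PSDrives pointing at my central locations (roots are examples)
New-PSDrive -Name OneDrive    -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive"             | Out-Null
New-PSDrive -Name GitHub      -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive\GitHub"      | Out-Null
New-PSDrive -Name Scripts     -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive\Scripts"     | Out-Null
New-PSDrive -Name Scripts-WIP -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive\Scripts-WIP" | Out-Null
New-PSDrive -Name Modules     -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive\Modules"     | Out-Null
New-PSDrive -Name Modules-WIP -PSProvider FileSystem -Root "$env:USERPROFILE\OneDrive\Modules-WIP" | Out-Null

# ISE-only functions (like the ones shown later in this post) get loaded inside a host check
if ($Host.Name -eq 'Windows PowerShell ISE Host') {
    # Import ISE-specific helpers here
}

From then on, Set-Location Scripts-WIP: (or just cd Scripts-WIP:) gets me straight to my work-in-progress scripts on whatever device I'm signed into.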

So from here we have our PSDrives set up, and the beauty of this is that it allows very simple navigation between repos as you have them all centralised. In my next post I will show you how to populate this GitHub location with all the repos that you have forked, and how to ensure that every repo is up to date, with the latest updates pulled in or commits pushed out, using just a few functions – so stay tuned for that!

Hopefully this will leave you with a reason to adopt PSDrives into your workflow and we can move onto the next section.

Use of Git for Source Control in this module

Quick intro – Git can be used with your own offline repos; it doesn't need to be linked to a GitHub repo, although that is the most common setup and I would recommend that you use GitHub – at the time of writing you can get 5 private repos for only $7 USD a month (about £4 or so).

If you are new to Git for source control, I would recommend having a look at this series on PowerShell Magazine: http://www.powershellmagazine.com/2015/07/13/git-for-it-professionals-getting-started-2/ – that was how I got started. Also have a play with "Learn Git in your Browser" at http://try.github.com/ – it's definitely a useful starting point and will help you out in your future endeavours.

So the key Git commands used in this module are as follows (there's a short worked example after the two lists below):

  • git add – stages files (new or changed) so that they are included in the next commit
  • git commit – records the staged changes as a new commit in the local repository

Other Key Git commands

  • git push – pushes new commits to the remote repository (this could be hosted on Github)
  • git pull – pulls changes from the remote repository (this could be hosted on GitHub)
  • git clone – clones the remote repository to your own machine (this could be hosted on Github)
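For context, if you were doing this by hand from a PowerShell session, a typical cycle with those commands would look something like the below (the folder and commit message are just examples; the GitHub: drive is the PSDrive from the previous section). The whole point of the functions coming up is that the add/commit part happens for you whenever you save:

cd GitHub:\MyAwesomeScript           # the GitHub PSDrive from earlier; folder name is an example
git add Get-UptimeInfo.ps1           # stage the new or changed file
git commit -m "Add Get-UptimeInfo"   # record the change in the local repository
git push                             # publish the commit to the remote, e.g. GitHub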

So that's the key commands out of the way – but why and when will we want to use them, or in our case, not have to think about using them at all?

The Functions used in this Module

 

I'm a bit data-centric (aka a data hoarder) – I prefer to have too much data rather than not enough. So to cover this, I wanted a way to auto-commit any changes to scripts and modules every time I saved them.

So this is where creating this module came in – and the functions contained within.

I have created 3 Core functions

  • Save-CurrentISEFile – saves the file currently open in ISE, whether it has been previously saved or not
  • Save-AllNamedFiles – Saves all Files that have previously been saved
  • Save-AllUnnamedFiles – saves all files that have not previously been saved

And also 2 helper Functions

  • Request-YesOrNo (amended from the one included in SPPS – thanks to @Jpaarhuis)
  • Get-CustomCommitMessage – basic VB popup box for custom commit message
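I won't reproduce the module's exact code here, but the VB popup idea really is only a few lines; a minimal sketch of Get-CustomCommitMessage along those lines might be (the prompt text and default value here are my own placeholders):

Add-Type -AssemblyName Microsoft.VisualBasic
function Get-CustomCommitMessage {
    # Pops a simple VB InputBox and returns whatever the user types as the commit message
    [Microsoft.VisualBasic.Interaction]::InputBox('Enter your commit message', 'Custom Commit Message', 'Updated files')
}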

Now I must note that currently this is only compatible with PowerShell v4 and above, though that can change if I get enough time and requests to do so – or you could always add it in with your own updates to the module.

So let’s look at the Process to be used with the following functions.

Imagine we are creating a script called Get-UptimeInfo – we could easily create this and then save it using the default handlers in ISE; however, there are some issues that I've found:

  • The file path defaults to the last saved location – for example, if you are working on a script in C:\MyAwesomeScript, then when you click Save it will save it there, and each time you reopen ISE it will default there again – not ideal
  • I like things Centralised – that way I know where things are!

So to overcome this, we put the following at the beginning of the script: #Script#Get-UptimeInfo#. This tells the Save-CurrentISEFile or Save-AllUnnamedFiles functions that we want to create a ps1 file called Get-UptimeInfo in the Scripts-WIP PSDrive location.
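So the very first line of the otherwise-untitled file is just the tag itself; for example (the function body here is purely illustrative):

#Script#Get-UptimeInfo#
function Get-UptimeInfo {
    # Illustrative content only – reports the last boot time of the local machine
    Get-CimInstance Win32_OperatingSystem | Select-Object CSName, LastBootUpTime
}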

In the ISE, this looks like the below before running either function:

[Screenshot: an untitled ISE file starting with the #Script#Get-UptimeInfo# tag, before saving]

And then we can run either function in the Command Pane like any other function:

[Screenshot: running Save-CurrentISEFile from the ISE Command Pane]

Oooh – look at that: the file has been saved and named Get-UptimeInfo, it is a .ps1 file, and we are being prompted about whether we want to add a custom commit message. So we'll click Yes and see what we get.

[Screenshot: the Request-YesOrNo prompt asking whether to add a custom commit message]

Here we get a popup box (it currently uses VB for this, but it works and is only a few lines) asking us to provide our commit message – I'll add the commit message "Testing PSISE_Addons Save-CurrentISEFile Function".

The result can be seen below – note that a Get-UptimeInfo.tests.ps1 file has been created as well. This is driven by what you include in your profile, as suggested in the PSISE_Addons.psm1 file.

[Screenshot: Get-UptimeInfo.ps1 saved and committed in the Scripts-WIP location, with a Get-UptimeInfo.tests.ps1 file created alongside it]

If we wanted to do the same with modules, then it would be something like this: #Module#FindSystemInfo#. That tells the Save-CurrentISEFile or Save-AllUnnamedFiles functions that we want to create a folder called FindSystemInfo in the Modules-WIP PSDrive location, and in there save a psm1 file called FindSystemInfo, whilst also creating a Gallery-compliant psd1 file and a FindSystemInfo.tests.ps1 file containing some default Pester tests.

[Screenshot: an untitled ISE file starting with the #Module#FindSystemInfo# tag, before saving]

When we run the Save-CurrentISEFile function, we get the same prompt as before:

[Screenshot: the commit-message prompt appearing again after saving the FindSystemInfo module file]

Again we will click Yes here, and in the next popup we will add the message "Adding New Module FindSystemInfo" – we can see the result below.

[Screenshot: the FindSystemInfo folder committed with its psd1, psm1 and tests.ps1 files]

We can see here that 3 files have been added – a psd1, a psm1 and a tests.ps1 file have all been created in a new folder based on the module name, FindSystemInfo – but we didn't specify any of these. That's because the Save-CurrentISEFile & Save-AllUnnamedFiles functions do the hard work for us and create a psd1 file that is fully compliant with the PowerShell Gallery, plus a default Pester test, as long as you have them specified in your profile. BONUS – I provide sample versions of both of these with the module. How generous is that!
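Purely as an illustration of what you end up with (following the example names above and the Modules-WIP PSDrive from earlier), the new folder looks roughly like this:

Modules-WIP:\FindSystemInfo\
    FindSystemInfo.psd1        # Gallery-compliant manifest, generated for you
    FindSystemInfo.psm1        # the module file you were editing
    FindSystemInfo.tests.ps1   # default Pester tests, created from the sample in your profile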

But the most important thing is not having to call the actual functions at all, and instead using simple keyboard combinations – so at the bottom of the ISE_Cew.psm1 file there is a sample section to add into your PowerShell profile. Again, another easy freebie!
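I won't copy the sample here verbatim, but conceptually it wires the functions up via the ISE add-ons menu, along these lines (the menu text and key combination are assumptions – use whatever the sample in the psm1 suggests):

if ($Host.Name -eq 'Windows PowerShell ISE Host') {
    # Bind Save-CurrentISEFile to a key combination via the ISE Add-ons menu
    $psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.Add(
        'Save & Commit Current File', { Save-CurrentISEFile }, 'Ctrl+Alt+S') | Out-Null
}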

So you can now download this from the PowerShell Gallery using Install-Module ISE_Cew – go and get it and give me some feedback via the GitHub repo – https://github.com/kilasuit/ISE_Cew/

ThePSGallery AutoBot – Some Issues I’ve Found.

Ok so If you didn’t already know then this happened

And although it has been interesting, it has also brought up some issues with the PowerShell Gallery (mainly around data, which is one of my biggest bugbears in all things IT). These include, but are not limited to:

  • Publish-Module is partially broken – it requires you to supply LicenseUri & ProjectUri when run; however, that information then doesn't make it to the Gallery pages, nor to the Gallery items returned by Find-Module. This means that a seemingly large number of modules appear not to include this *mandatory* information.

    There is a workaround, and it should be what you are doing anyway: ensure that this information is included in the module's psd1 file, as it then gets populated to the Gallery correctly. It was thanks to an impromptu chat with Doug Finke (@Dfinke) that this came out of the woodwork and was confirmed as the resolution – so thanks Doug!

I also decided to confirm this by uploading 2 different modules, ThePSGallery-Working & ThePSGallery-Broken – the results conclusively show that the only way to get the LicenseUri & ProjectUri to show in the Gallery (either via the website or via Find-Module) is to include them in the psd1 file.

[Screenshot: Gallery results for ThePSGallery-Working & ThePSGallery-Broken, showing LicenseUri & ProjectUri only populated when set in the psd1]

So go and update your psd1 files to include this, and please upvote this issue on UserVoice to either get this fixed or to drop LicenseUri & ProjectUri from Publish-Module – http://windowsserver.uservoice.com/forums/301869-powershell/suggestions/11439807-gallery-issue-licenseuri-projecturi-aren-t-add

  • Author details – again, this is massively broken due to the nature of the Gallery, with simple things like spelling mistakes in author names meaning there are 3 or 4 'authors' that are actually 1 person. See the Author Details file in the GitHub repo for ThePSGallery (https://github.com/kilasuit/ThePSGallery/) for more details; it was built using the logic below.

 

Find-Module * | Select-Object -ExpandProperty Author | Sort-Object -Unique | Out-File Authors.txt

I would suggest that the Gallery also allows profiles to link to other social networks like Twitter and capture the Twitter handle (it would be great for ThePSGallery AutoBot to be able to include the author in its tweets, increasing visibility for those that submit work there).

I would also suggest that authors include an additional hashtable in the PrivateData section of their psd1 with any additional contact info – like Twitter or blog URLs – and set this as a default in their psd1 files. I will be posting about this shortly.
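To make that concrete, the relevant part of a module manifest might look something like the below – the PSData keys are what the Gallery actually reads, and the extra contact hashtable is the kind of addition I'm suggesting (all values here are placeholders):

PrivateData = @{
    PSData = @{
        # These are what surface on the Gallery page and via Find-Module
        LicenseUri = 'https://github.com/SomeAuthor/SomeModule/blob/master/LICENSE'
        ProjectUri = 'https://github.com/SomeAuthor/SomeModule'
        Tags       = @('SharePoint','Automation')
    }
    # Suggested extra contact details – not used by the Gallery today, but easy to read back out of the psd1
    AuthorContact = @{
        Twitter = '@SomeAuthor'
        Blog    = 'https://example.com/'
    }
}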

  • Additional metadata – I for one would like the AutoBot to be able to tweet when a module hits 100, 1,000 or 10,000 downloads, to congratulate the authors. However, this information isn't available via Find-Module at present; it can be obtained via web-scraping methods, but that is neither resource-friendly nor quick. I have raised this with the powers that be via UserVoice, and you can upvote it as well via http://windowsserver.uservoice.com/forums/301869-powershell/suggestions/11279160-add-additonal-properties-to-powershell-gallery-ite
  • Lastly – Find-Module isn’t very User friendly for cmdlet, function, workflow or DSCResource searching if tags aren’t used.

    This is a bit simpler to get around, but how to do so is rather hidden: you have to call Find-Module * and then pipe the results to Select-Object -ExpandProperty Includes and then to Where-Object.

    So for SharePoint it may look like this, which isn't very graceful at all, but it does return the 2 modules that have SharePoint in their function names – the problem being, what if they aren't functions but cmdlets?

    Find-Module * | Select-Object Name -ExpandProperty Includes | Where-Object {$_.Function -like '*SharePoint*'} | Select-Object Name

    Again there is a UserVoice Suggestion for this at http://windowsserver.uservoice.com/forums/301869-powershell/suggestions/11088855-find-module-needs-improvements-to-enable-better-di

Hopefully that's a small insight into ThePSGallery AutoBot; in a future blog post I will detail the inner workings of the AutoBot, including the actual script that runs it (not on GitHub as of yet).

2015 – Challenging but it’s only the beginning!


This is just a post on my own recent reflections of the events throughout 2015.

Each month in 2015 came with increasingly difficult obstacles to overcome, and for the first 6 months of the year a lot of those obstacles had their roots in the previous year or two, predominantly centred around my children.

Yes Children – Plural.

At 19 I became a father to my now 6 year old son and then at 23 I became a father to my now 2 year old daughter. So relatively early on in life I was blessed (and as of the last few years in some ways to be cursed) for becoming a parent so young.

But anyway, to set the scene entering 2015: my daughter had been in care for the majority of her life (since she was around 3 months old), and by the end of January my son was back in care whilst going through court proceedings – at this point it was neither possible nor practical for me to have them back, something that still really irritates the absolute bejesus out of me.

January to April was pretty much all over the place because of everything going on with the children and trying to get settled into my then relatively new job at The University of Manchester. I can barely remember what happened for much of that time, in part due to the vast amounts of ridiculously unnecessary documents involved in the Family & Children's Courts processes.

May – Now if I could delete 1 month (well 1 week if I’m truly honest) then this would be the one. This was the month when the decisions on future contact with my children would be made and although I fought and fought and fought throughout the whole 76 weeks that we were under court proceedings, it was essentially all in vain as the ruling was completely against my arguments based on supposed “professional opinions”.

I have made it more than clear that the odds were against me because of a few basic facts

  • I’m Male – and regardless of the push of same sex rights, men are still widely considered as being the less capable when it comes to family
  • I was brought up to believe that you struggle and juggle – but you get through it in the end – this was perceived as me being “less willing to accept help if needed” and that is utter bullshit!
  • Other than myself, the only males present in court were my barrister, the children's lawyer and my ex-partner's lawyer; the social worker, the lawyer for the Local Authority (not my local authority either), the Children's Guardian & the judge were all female.

But Also May was the month that started the ball rolling for speaking and attending #PSConfAsia – so it wasn’t all doom and gloom. Although I didn’t commit until Mid-June when I had the outcome from the Court case. Needless to say from that point onwards I made a conscious decision that I needed to really start the ball rolling for a better, more flexible and more enjoyable future – so you could say that in June I made the decision that I would at some point in the following 6 months leave the University of Manchester in pursuit of something more fitting to what I want to be doing.

So part of this involves what could be a life-changing and seriously difficult move into self-employment, but it is something that I have been thinking about doing for almost 3 years now.

So that will be one big challenge of 2016 – however that is only the beginning as the first challenge is to find somewhere permanent to live. These last 2 months have been expensive although comfortable as I’ve spent most of the time in hotels. I dread to think how much this has cost me personally and with no real tangible gain from it at all.

2016 will see me continue the work that I started with the PowerShell User Groups here in the UK, and I am looking to massively expand this where possible. This is mainly because I love presenting and meeting the community, but also because there is, in my opinion, a massive gap in the skills base around a real understanding of PowerShell, and this can be partially alleviated by increasing the number of User Groups across the UK. So I've already put it out there that if anyone thinks they could co-organise, I will work with them to get these off the ground and running. I will also provide content and help get the community growing – the end goal is to be in a similar position to the SharePoint & SQL User Groups, where there is a decent local User Group community, and then we can look at localised PowerShell Saturdays at some point in 2017. Ambitious – but that is the way I am, and with the help of those out there that want to get these things off the ground, we will achieve it. Plus, hopefully by this time next week I should have some good news about the future of these events – so hold on tight.

Also, 2016 is the year when I will really Start-Contributing to the wider community. I've been promising a PSISE_Addons module for about a month now, and the reason it has been delayed is that I keep adding more and more features to make it better – that, and I'm already refactoring its codebase. This will be one of the topics I will be covering at the Manchester & London User Groups, and if I've hit it right then it should be a major help to all that use it. I'm not going to give much more away than that until it's released (and blogged about, of course).

2016 will also be a year that involves lots more presenting. As it stands, I have already been accepted for the PowerShell & DevOps Summit in Bellevue, WA, which falls over my 26th birthday, so that will be an interesting and amazing event to attend – one I would have been looking to attend even if I hadn't been selected to present, just because of the sheer number of the PowerShell community (and Product Group) who will be there.

I’m also waiting to hear back from at least another 7 events on whether I’ll be presenting at them – a Variety of SharePoint, SQL & DevOps type events.

Then there is also #PSConfEU – which I am co-organising with Tobias Weltner and this looks to be another fantastic event – we already have a great line up of speakers and still a few slots to fill. Details about this will be posted in the next few days and I would urge you to Register at www.psconf.eu as soon as you can.

Then late on in the year I’ll be returning to Singapore for the follow on #PSConfAsia Event. And I can’t wait for that one either and hopefully there should be some good news in the upcoming weeks about this event. So again keep your eyes & ears open for updates.

That's a brief overview of 2015 and an outlook on what is to come in 2016.

But one final thing to remember – there is always a story behind every person, and most of the time that story stays behind a firmly locked door. I'm happy to be open about it, as being open about it all helps me remember that no matter how hard it's been (and it's been torture at times) I've got through it all and will continue to do so for years and years to come. One day the wrongs of 2015 will be corrected, but the journey there for me is longer than I had originally anticipated and forms a solid core of my plan for the next 5 years.

 

So as we enter 2016 – be happy you got through 2015 and look forward to the beginning of yet another journey. This one already looks and feels like it will be amazing and the people that I meet along the way will be a fundamental core to that becoming a reality.

Published: 31/12/2015 17:35


Quick Win – Install WMF5 via PowerShell 1 Liner



** UPDATE 25/12/2015 ** Due to WMF5 install issues, the InstallWMF5.ps1 script has been removed from GitHub until the PowerShell Product Team re-release the WMF5 installers. Once they do, I will re-release the InstallWMF5.ps1 script. **

This is a very quick post about installing WMF5 on Windows 8.1 or Server 2012 / Server 2012 R2 via a function script I created.

And what better way than a simple one-liner to grab and run the script:

Invoke-Expression (New-Object Net.WebClient).DownloadString('http://bit.ly/InstallWMF5')

And if you want to look at the script first, the direct link is http://bit.ly/InstallWMF5

Hope this is useful for you

PS – credit (again) goes to @lee_holmes for the idea, from the following tweet: https://twitter.com/lee_holmes/status/318799702869041153

 

PPS – there is an issue with running this on Windows 7 / Server 2008 R2 machines because WMF4 needs to be installed first.

I am working on this but may not have this ready until the new year.

 

PPPS – you will see a theme here – I am intending to build functions like this and release them as a full module in the PowerShell Gallery to help automate these tasks – so look out for more like this in future.

Published: 23/12/2015 17:45