Speaking at Microsoft Ignite 2019, but not about tech

I've had another session acceptance, this time at Microsoft's flagship conference, Ignite, taking place November 4th – 8th 2019 in Orlando, Florida.

 

I've been accepted for 2 sessions, a 45-minute breakout session and a 20-minute theatre session. Both are brand new sessions for me, and whilst both are completely non-technical, they cover incredibly important topics in the Diversity & Inclusion track, something that I have been trying more and more to be better involved with.

The first session, "Mental Health: It's time to talk", is going to be an interesting and fun talk to plan, and planning has already started, including working out what tech I need to use to pull together some statistics to share in the session.

The session description is available at Microsoft Ignite Session BRK1033.

The second session, "Mentoring: Is it your most powerful networking tool?", is a short theatre session covering my own experience with mentoring and why I am excited about the new mentoring app, as detailed in this blog post on Tech Community.

The session description is available at Microsoft Ignite Session THR1044.

 

As I mentioned in a previous blog post, Reflective thoughts on 2018 and a look ahead at 2019, I intend to be doing more technical and non-technical writing, and also, where possible, more non-technical presenting, as part of a personal long-term goal to reduce the stigma of talking and writing about Mental Health, because like many millions out there, Mental Health Affects Someone Like Me.

 

You can follow my Mental Health blog site via @mhasl_me, and if you are interested in helping write content then please head to the site at https://mhasl.me and fill in the form linked there.

I’m looking forward to getting these 2 presentations put together for a number of reasons, including the following

  • These are both non-technical topics that are really close to my heart (and, well, one you could say is closer to my head than my heart), but they are areas where I want to use my technical and non-technical skills for good where I can, and this feels like a good extension of where I can start.

Whilst I am looking forward to the next 2 months of planning and presentation refinement, I will also be looking forward to taking some much needed downtime afterwards to decompress properly, as events like Microsoft Ignite and other conferences can be draining, especially with the international travel and the change of body clock due to the time zone difference.

 

That’s all for now, but I have a few more posts in the works, so expect another new post from me soon enough

Speaking at Ignite on Tour London

You can tell that it's the beginning of conference session confirmation season, as I've had another acceptance for a session I submitted, this time to the London leg of Microsoft's Ignite on Tour conference on Feb 26th – 27th at ExCeL London.

The session that I will be delivering is new to me, called "Using Azure for Good – the story behind mhasl.me". It is a "from the trenches" story of the technological and architectural decisions made along the way in delivering a completely new blog site that I am in the process of building and hope to finally release in the upcoming weeks.

The session description is below.

Description

In this session I will cover the technical background of a new site that I have developed for sharing stories related to many different aspects of Mental Health and how it is something that affects someone just like you or me.

In this session I will cover

  • Architectural decisions and why I made the choices I did.
  • The technical build of the site and the components used to build, maintain and develop it, from Azure DevOps to Azure resources, and how I built a repeatable deployment of the resources that made the site possible from the get go (see the sketch just after this list).
  • Security decisions that I made along the way and why I chose them.
  • Lessons learned along the way.
  • And lastly, why I think this is a topic that we in tech need to talk about more.
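To give a flavour of what I mean by a repeatable deployment, here is a minimal sketch using the Az PowerShell module. This is purely illustrative and not the actual code behind mhasl.me; the resource group name and template file names are assumptions on my part:

# Illustrative only - the names and template files here are hypothetical
$resourceGroup = 'rg-mhasl-demo'

# Creating the resource group is idempotent, so this can run on every deployment
New-AzResourceGroup -Name $resourceGroup -Location 'westeurope' -Force

# Deploying the same ARM template each time means the whole environment
# can be rebuilt from source control whenever needed
New-AzResourceGroupDeployment -ResourceGroupName $resourceGroup `
    -TemplateFile './site.template.json' `
    -TemplateParameterFile './site.parameters.json'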

 

As I mentioned in a previous blog post, Reflective thoughts on 2018 and a look ahead at 2019, I intend to be doing more technical and non-technical writing with this new blog site, as part of a personal long-term goal to reduce the stigma of talking and writing about Mental Health, because like many millions out there, Mental Health Affects Someone Like Me. You can follow the site's progress via Twitter over at @mhasl_me, and if you are interested in helping write content then please head to the currently deployed placeholder site at https://mhasl.me and fill in the form linked there.

I’m looking forward to getting this presentation put together for a number of reasons, including the following

  • This is my first conference talk at a Microsoft-run conference. This in itself brings a number of interesting challenges, not least ones related to my own Mental Health, especially around anxiety and managing my nerves. However, this is a (mostly) fun challenge, including setting a tightly time-boxed deadline to actually get this up and running, which means I have to start kicking in PDD – Presentation Driven Development – as coined by @daviwil and noted in https://twitter.com/BrucePayette/status/723513244342751232
  • The technical side of this is filled with a number of interesting and complex challenges, with a number of interesting decision points along the way, several of them already made, which make for a number of different choices, not least IaaS vs PaaS vs a mix of the two.
  • The non-technical side is a topic that's really close to my heart (and, well, one you could say is closer to my head than my heart), but it is something where I want to use my technical and non-technical skills for good where I can, and this feels like a good place to start.

On a side note, but still semi-related, I have signed up as a #MindCampaigner and you can too at https://action.mind.org.uk/be-mind-campaigner

 

[Image: Mind logo]

 

[Image: mental health words icon – copied from https://www.staffordshire.police.uk/article/2051/Personal-Safety ]

 

I'm looking forward to the next 4 weeks ahead, but I am also looking forward to a few days of downtime afterwards to decompress properly too.

Speaking at TechoramaBE

Looking through my emails and Twitter on Sunday, I was pleasantly surprised to see these two items pop up

Followed closely by the following email

[Image: TechoramaBE acceptance email]

From everything I have heard about Techorama, it looks to be a good conference and I am certainly looking forward to visiting Belgium 🙂

This is the first of many submissions that I have made, but the first that I can accept and attend, as I was also accepted for DevOps Pro Europe in Lithuania (another on the to-visit list) but that unfortunately clashes with the MVP Summit, so I've had to turn it down 🙁

Looking forward to getting back into the swing of technical presentations yet again.

Here's to a fun-filled 2019.

For more info on where I am presenting or attending, you can see Find Me At, and you can see Where I've Presented or Where Have I Been for a historic overview.

 

Info Share about PSDayUK 2019 – What’s to come?

This year, we, the collective behind the UK PowerShell & DevOps User Groups, are running the third PSDayUK event, the only conference totally dedicated to PowerShell here in the UK. It will be held on Monday September 30th at etcVenues, Birmingham.

I have already been asked by a few people why we have decided to move the event this year, and a small part of that comes down to the overall event financials. Running an event like PSDay, or even the larger PowerShell conferences like PSConf Asia, PSConf EU and the US PowerShell & DevOps Summit, is not cheap, and that is not including any time that myself and the other organisers put in behind the scenes, above and beyond our day jobs, to bring events like this and the PowerShell User Groups to you all around the UK.

One of the other reasons for the move is that we as organisers of the collective are dedicated to bringing more year-on-year events like PSDayUK, but also other smaller events like hands-on workshops as well, and looking ahead, the partnership that we will have with etcVenues should open up further doors for us as a collective to do so in other cities too. We had already come to the decision that over the upcoming years we would rotate where the event is hosted, with a view to having it take place in some of the other larger cities in the UK, with London, Birmingham and Manchester being the three main cities we had in mind.

It is of no surprise, at least to me and the other organisers, that I suggested we look to move the event to Manchester, as after all Manchester is where the revival of the UK PowerShell User Groups first occurred in 2015, as I blogged about at https://blog.kilasuit.org/2015/08/19/coming-to-manchester-october-13th-get-psugukman. However, upon further discussion with the other organisers, we all settled on Birmingham this year, for two main reasons.

Firstly, Birmingham is the second largest city in the UK, and with that comes a fantastic set of options for travelling there: it is relatively easy to get to by road for those that wish to drive, there is a very good rail network for those that don't, and there is also Birmingham International Airport, which allows visitors from outside the UK (or even within it) to get to Birmingham relatively easily. The venue itself is only a 5-10 minute walk from Birmingham New Street Station, which makes it extremely accessible.

Secondly, as part of the collective's plans for growth, we are taking the opportunity to expand the UK PowerShell Community by bringing it to Birmingham. In the last few days we have had some conversations about when we will see a PowerShell Birmingham group spin up, and we are trying to see how soon we can get one organised, with a view to the first week in February. For future details please follow the PowerShell Birmingham Meetup Group, where we will announce the group (a Twitter account will follow).

This doesn't mean that PSDay will not return to London in future, far from it, especially as London has been a huge driving force for the UK PowerShell User Group community over the last few years, and especially the last year with them managing to secure a recurring host and sponsor in Dotmailer. As I mentioned in my Reflective thoughts on 2018 and a look ahead at 2019 blog post, one of my personal goals for 2019 with the UK PowerShell Community is to see it grow even further, and there are a number of cities where we will be looking at expanding the User Group community where we can. If you would like an initial chat about potentially standing up a new UK PowerShell community in your city then please feel free to get in touch with me directly (easiest is via Twitter) and we can go from there, but also have a read of the article on PowerShell.org that I wrote some time ago – So you want to form a User Group? – as it gives a lot of good pointers from when I was initially setting up the UK PowerShell Communities. It does need an update, and that is something I will be revisiting in the very near future.

However, back to PSDayUK 2019. In the coming weeks, most likely around the time of the first PowerShell Cardiff Meetup and my appearance on The PowerScripting Podcast Returns on Jan 16th (technically the 17th at 02:00 UTC, but I knew what I was signing up for), I will be posting more information about the 2019 event including, but not limited to, the opening of the call for speakers for breakout sessions and post-conference workshop sessions, information about when ticket availability will open, and details of our call for sponsors. I am sure that there will be other things that we will announce over time as well.

In the meantime you can find a recap of the 2018 and 2017 events on PSDay.UK and see all the recorded sessions from previous years on the YouTube channel, which also has videos from some of our other events as well.

If you have any questions about PSDay 2019 or the UK PowerShell Community, please feel free to tweet the @psdayuk account for PSDay items, tweet the @getpsuguk account for UK items, tweet me directly, or leave a comment on this post and I'll get back to you.

I’m looking forward to what 2019 brings the UK PowerShell Community, stay tuned for future updates.

Reflective thoughts on 2018 and a look ahead at 2019

With 2018 now basically over, it's time to reflect a bit, and, well, let's say that overall, in the grand scheme of things, it's been a pretty good year.

Like any year there’s been ups, there’s been downs, there’s been lots of travel, laughter, tears, sun, rain etc etc. All in all, it’s safe to say that there’s been a bit of everything this year.

 

2018 will for many reasons stick out to me for years to come as the year that started building the foundations for my upcoming thirties, and yes you did read that right, upcoming thirties, and here are some of the reasons why (in no specific order, mainly because I could not decide on how I should order them)

  1. I've cut down my alcohol intake by an amount that I am happy with and also started improving other areas of life, including my diet, and hopefully other areas of health will improve over time; time will tell, as they say.
  2. Got my own place. Although I am still renting, I currently don't share with anyone else, which has already been immensely beneficial for my long-term mental health. Don't get me wrong, living with someone is fantastic if they are the right person, something that my last house share really hit home in the weeks prior to leaving, living with someone who had less than basic levels of hygiene and cleanliness and left things like pubic hair on toilet seats and in the shower, and not just little bits but a serious amount of it; that is something I never want to share with someone I don't already know again, if I don't have to. It does mean I am the only one doing the dishes and all the other cleaning, but that's something I can happily live with, as I find cleaning rather therapeutic. It also means I can start getting back into enjoying cooking and baking, something I have not felt overly comfortable with over the years whilst living in shared housing, for the above reasons, and that is something that will hopefully change in 2019. Also, not many other than those close to me will have known this, but I was unintentionally homeless at the beginning of 2018, deemed as being without need by 2 different councils, so getting my own place signifies a major milestone.
  3. Got myself a car again, which makes life 100x easier and gives me the ability to improve other aspects of life in 2019, including taking up some form of martial arts and going swimming regularly. How well that goes remains to be seen, but that's part of the plan, at least for now. I wonder how many miles I will travel in 2019; I am sure there's some form of tracking app for that out there, but it isn't something I have actually looked into further, so suggestions would be appreciated.
  4. I've been lucky to work with some amazing people at Dotmailer and look forward to seeing how the future unfolds for the team there, as they have a fantastic product and a lot of fantastic staff, and I am so appreciative of how they continue to support the UK PowerShell Community. Also, through some of the employee schemes, I managed to get myself a basic but functional electric guitar and some amazing Sennheiser Momentum over-ear Bluetooth headphones at a sizeable amount off the total cost. Both have been getting good use, but that will only increase in future.
  5. The UK PowerShell Community has gone from strength to strength and has an extremely positive outlook for 2019, with more and more events springing up. Follow @getpsuguk for further updates and join our community at https://slofile.com/slack/get-psuguk to get chatting with like-minded UK PowerShellers, especially as we have planning ongoing for PSDay, which should be an amazing day. More is coming on PSDay in early/mid January and you can stay up to date with it at https://twitter.com/psdayuk
  6. I've joined an awesome team at Black Marble, and whilst it's only been a very short time in the grand scheme of things, I can see the role I am in at Black Marble entailing a lot of fun times as well as many technical challenges. As I've already found out (and really already knew), the team here is filled with awesome individuals who have really helped me feel at home, at least in a professional sense. We have an awesome set of highly intellectual individuals, which makes for a lot of fun and interesting conversations in the office, conversations that often leave me thinking I need to pick up one of the many books we have on the shelves, because you never know when you need to know all the innards of XML from the biggest book we have in the office. That thing is HUGE.
  7. I feel I have managed to kick the bit of writer's block that I had previously (can you tell from this post, lol?) and I expect that in the coming months I will be doing a lot more blog writing, some of it on this blog and also on another blog that I am spinning up, called mhasl.me, which stands for Mental Health Affects Someone Like Me. It will be a place for those that suffer with any form of Mental Health issue, or support those that do, to come and write about it from their personal experience. This is part of a personal long-term goal to break down the stigma of talking and writing about #mentalhealth, and I envision that this site could end up becoming a go-to for anyone suffering with various ailments of mental health to see that you can have less than perfect mental health and still be extremely successful in life. Expect to see me blog on here about the technical side of the site, including the reasons for the technical choices I made, as well as the running and management of it going forward, and also to see me blogging over there for the long foreseeable future. I am in the process of getting the site set up at the moment, but you can follow the Twitter account I have set up – http://twitter.com/mhasl_me – for updates when the site goes live and more details about what I expect the site to be all about in future, including, but not limited to, contributing, moderating and providing input into the direction of this project.
  8. This year I finally gave Android another go in September when I upgraded to a Huawei P20 Pro. My last Android, the Sony X10 Mini, was far from impressive, but that could have been down to the form factor of the phone more than the OS. I am in a mixed camp between liking it and potentially moving back to iOS, or, as is actually more likely to happen, I might go for a multi-device solution once again, something I will ponder further in the upcoming months.
  9. I have further diversified my taste in music, and this year has seen me more into house/trance than ever before. I'm a big believer that diversifying the media you consume can be immensely beneficial, and I will be looking forward to seeing what Spotify starts to suggest for me in the New Year. I already have a large Discover Weekly Archive playlist, built using IFTTT to copy each weekly playlist into a single archive playlist, which will no doubt be something I enjoy listening to; I now have over 3 days of potential content to consume and will give it my best to get through it in a week or two.
  10. Some other areas of my personal life are starting to have a better outlook ahead, and fingers crossed for how 2019 will pan out in these. Those close to me know that paperwork takes time to be completed, but it does and will get completed.
  11. Probably the most important of all points from 2018 is that I'm starting to feel more genuinely content with where I am at this stage of my life. Whilst this doesn't explicitly mean I am genuinely happier in all aspects of life, in the majority of them I am starting to feel at ease with where I am versus where I want to be, and a lot of that has been down to accepting and understanding that for certain things to change takes time, and that I need to be more open to planning out the actual timelines sensibly than I had been previously. This doesn't mean there aren't things I want to improve on that would help me feel more content, but those are all things I can work on, at least in time. Therapy may or may not be something that appears in my future; it is something I am still on the fence about.

So that's enough about 2018; what about 2019, and what do I envision for the year ahead?

  1. 2019 will be filled with writing, more writing and more writing. Part of this will be down to the blogging that I intend to do, but in my role we also end up doing write-ups for our clients, which means I will be doing much more with my writing and looking at improving how I write as we go further into 2019. Perhaps that might be something I end up writing further about, fancy that eh? Nonetheless, I am looking forward to smashing the writer's block and properly getting back into the swing of writing once again.
  2. 2019 will be filled with more coding, which also equals more writing. This time I will likely be learning new languages and reacquainting myself with prior ones, for example PHP, which is fitting considering the technical work I am doing with mhasl, which I will write up more on in future as well.
  3. 2019 will be filled with more tech, and more innovative use of tech. I can say that this will likely be a big year for new forms of hardware and software, with the likes of HoloLens now getting more traction, and Q#, the quantum computing language, getting more traction as well. There are also a number of other areas gaining traction in tech, including Terraform and the other technologies from HashiCorp.
  4. 2019 will be filled with travel. Again, this is not just because in my role I travel to client sites, but also because I will be doing more and more travel for the community work that I do as well. I expect I will be travelling around the UK a large amount for the PowerShell Community, but also for other community endeavours; good thing I recently got a car again, eh 😉
  5. 2019 will be filled with learning, mainly technical, but also learning more about myself and in general just learning more about lots of non-technical things as well, because life is a long journey all about learning, and it's not something I ever intend on stopping.
  6. 2019 will be driven by Data and AI, and a significant amount of my time will be put into attending more Data events, including Data In Devon (formerly SQLSaturday Exeter), DataScotland and also DataGrillen, which I am really looking forward to. It will be nice to just attend events like these without either presenting at or organising them, and I look forward to catching up with a number of great friends in the #SQLFamily at these events.
  7. 2019 will be much more filled with reading, and by reading I don't mean technical reading, but actual book reading. I've recently started reading The Idiot Brain by Dean Burnett, and I will likely be reading a lot more of his books, as they are very well written and touch on an area that really interests me, the human brain. Dean is a neuroscientist, so these books come from someone who really understands the science behind how our brains work. I intend in future to post small articles recommending books that I have read and enjoyed as well. I am also taking suggestions on what to read, and if you feel you really want me to read something, especially if it's technical, then feel free to send a copy to me directly at the Black Marble office, where I am sure we can also add it to the bookshelf; for non-technical books, please send me a suggestion either via a comment on here or via a DM on Twitter.
  8. 2019 will be filled with more physical activities, like swimming, running and also a lot more countryside exploring. I mean, I live in Yorkshire, and within a 50 mile radius of me there is the Peak District and the Yorkshire Dales, so there's plenty to see, visit and explore, and now that I have a car that is something I intend on doing more and more.
  9. 2019 will be filled with laughter and also tears. Like any year there will be ups and there will be downs. That is something that will not ever change, don’t kid yourself if you think you can have 1 without the other, at least not for too long, especially as tears can be both positive and negative, and could come from laughing too much, which is almost always a good thing.
  10. 2019 will be filled with more gigs – I did not do much gigging in 2018, or even 2017/2016/2015, and that's something I will be looking to change. I have already bought tickets to Metallica in Manchester (I do have a spare at the moment, but that's more for planning reasons than needing more than the one, as you never know what will happen between now and then) and I have others lined up that I intend on getting tickets for, including Leeds and Download, to name a few.
  11. 2019 will include more music, both as a consumer and also as a creator. I envision that I will be spending some considerable time getting back to grips with the guitar and getting lessons so that I can play more of my favourite tracks and perhaps start creating some of my own. Who knows whether that'll lead anywhere more than just a bit more diversity in my hobbies, which is never a bad thing.
  12. 2019 will be difficult; there will be more and more upset in the world due to everything that is going on politically. Some of this will affect us in ways that we could not have envisioned, but one thing will stay the same: the fact that we can get through the turbulence and come out the other side. Please don't let any political differences change how you interact with others, especially friends and family; views can and should be different, which leaves room for the possibility of healthy debate.

All in all, I think that's about all for now, but I know that with 2018 almost over, 2019 is looking to be a more prosperous year and the beginning of a very interesting future ahead.

Leave a comment or tweet me on twitter at @ryanyates1990

Re-Awarded MVP for the Third Year – now time for some reflective thoughts

I nearly didn't start writing this, mainly as I've found it difficult to get back into writing, let alone be in the right mindset to want to write at all. This partially stemmed from struggling with injury last year. Thankfully, 2018 has been much, much kinder to me, and I'm glad to say I am no longer struggling with the physical injury at least. That aside, I intend to start getting back into the rhythm of writing once again, something that should come more and more frequently over the upcoming weeks and months.

 

That being said, this time around I want to reflect on how I came into the MVP Program, and specifically on how this has positively affected my life and those around me, both those close to me and those perhaps not so close, both in physical proximity and in the ever-growing reach of my networks as they have grown over the last 3 MVP Award cycles.

 

My journey to becoming a Microsoft MVP came early in my IT career, which, in comparison to many around me, is relatively short: I have only been in active IT-based roles since December 2012, a little over 5 and a half years. In that time I have been extremely lucky in some of the roles I have been able to work in, whether that be working for organisations like Barclays and the University of Manchester, being part of the organisation team for the 2016 PSConf EU event, or being able to revive the UK PowerShell User Group Community. In recognition of the community work I have done, I was awarded the MVP Award in April 2016, as you can read about in the preceding posts https://blog.kilasuit.org/2016/04/01/congratulations-2016-microsoft-mvp/, https://blog.kilasuit.org/2016/04/01/fooled-ya-today-i-became-a-mvp/ and perhaps the more serious https://blog.kilasuit.org/2016/04/07/awarded-the-mvp-award-what-this-means-to-me-and-the-future-for-the-community/, as well as my renewal thoughts last year at https://blog.kilasuit.org/2017/07/04/mvp-award-renewal-time-my-renewal-new-recent-mvps, so it was only natural that at some point I would write something similar this year.

 

However, the journey in the years leading to becoming a Microsoft MVP came at a time when things were chaotic, to say the least, in my personal life, and I knew from my interest in understanding psychology and my own Mental Health, something I've been focused on learning more and more about since my early teens, that I needed something that was not only consistent in life, but something I could really push at and find passion in. That something has consistently been centred around my working life, with a huge part of my working life requiring more in-depth knowledge of a service or services, which really boils down to my thirst for expanding my knowledge in a wide variety of areas; technology is just one of many of these areas, as mentioned above, but it is the one I have centred my career around.

 

My desire to learn, a desire that hasn't disappeared and likely never will, pushed me down a path that forced me, or more accurately actively encouraged me, to apply a fair amount of time and effort into not only getting into the IT communities centred around the area I was in at the time (2012-2014 meant predominantly SharePoint) but also to expand outside of those communities into areas including, but not limited to, SQL Server, web development and more, and later on, approaching the 2015 time frame, the PowerShell Community.

It was at this point I made one single choice, a choice that at the time was essentially a huge gamble for me, played in part due to the fact that I was struggling at the time. As I said previously, things were personally chaotic, and due to the chaos I was undergoing I ended up suffering with a case of social anxiety, brought on by the immense stress I was under at the time. This would occasionally lead to me having small outbreaks of mild but distressing panic attacks. These would typically happen in crowded places like shopping centres, train stations, and later on at user groups and conferences. Initially they happened so sporadically that I took very little notice of them, but as time progressed they became more and more common, and with that they also got more intense, though I quickly learnt coping mechanisms to manage them. I could have taken myself to the doctors and got some form of prescription for anti-anxiety medications, which mainly tend to be of the benzodiazepine family, or referred myself for some counselling, which, looking back, I could really have done with at times over the years, even if it was just a further helping hand to get me through everything.

Let this be the beginning of what will be a long reflective look at the troubles of the past, and of the troubles that lie ahead, with more to come in the #TalesToDewsbury series.

New-Job -Role Consultant -Company ‘Black Marble’ -Start September

Well, the title says it all in as few words as possible, but I wanted to expand briefly on this new chapter.

As of the beginning of September I will be leaving dotmailer after a fantastic 6 months here. This was not an easy decision to make, as there are many interesting and exciting things happening at dotmailer in the upcoming few years, and as such they are currently undergoing a huge increase in their tech team. If you are looking for a new role in the London/Croydon area and are interested in DevOps and working with a predominantly all-Azure environment, I would recommend having a look at the roles they have advertised at http://careers.dotmailer.com for further information. I will miss the team at dotmailer, as it's a very good working environment, has some interesting benefits and has a very interesting future ahead of it.

That may lead you to ask, why leave? Well, the answer to that comes down to only a few factors, with the main one being that, as much as I have enjoyed London and Croydon, I am more comfortable being based in more country-like surroundings. With this in mind, the move to Black Marble made a lot of personal sense in that regard, as well as a lot of professional sense, as I will be working alongside some other familiar UK MVP faces, including Rik Hepworth (twitter), Richard Fennell (twitter), Andy Dawson (twitter), James Croft (twitter), James Mann (twitter) and MVP & RD Robert Hogg (twitter). These are just a few of the people at Black Marble, and I am looking forward to working very closely with Chris Gardner (twitter) on more and more PowerShell for the community in the upcoming months.

Alas this also means I will be moving back up north which means that the Manchester PowerShell User Group will at some point in the upcoming months be totally resurrected.

I am extremely excited for the future and what it brings. Although there are still some things that are not solidified yet, like where exactly I am moving to, what I am looking forward to is getting stuck into this exciting challenge.

There were a few people I had conversations with about this prior to making the decision; the benefit of having a trusted network is, as always, critical to making good, sensible decisions like this one, and those conversations were immensely beneficial to me in coming to the final decision.

I will miss the team at dotmailer as it has been a fantastic time here, including (attempted) weekly games of football, which will be one of many things that I will miss, though I don’t expect to be too much of a stranger to the team in the upcoming months with PSDay and the other London User Groups ahead.

So without much further to add, let the (sarcastically) immense fun of moving commence; hopefully this will be the last move for some time to come.

Info Share about PSDayUK 2018 – Call for Speakers, Ticket availability & Upcoming Call for Sponsors.

This year, we, the collective behind the UK PowerShell & DevOps User Groups, are running the second PSDayUK event, the only conference totally dedicated to PowerShell here in the UK. It will be held on October 10th at CodeNode, London. You can find more info at PSDay.UK, including being able to purchase tickets and the eventual schedule once it is published in the upcoming weeks.

We will constantly be releasing information about PSDay as we approach the event via various social media channels, including our Twitter account @psdayuk, which I would highly recommend you follow if you want to be kept in the loop about what is coming to PSDay.

To give a little bit of background, PSDay is the conference brand of the UK PowerShell & DevOps User Groups (for more information see PowerShell.org.uk). PSDay is currently planned as an annual event, much like the bigger PSConf EU, PSConf Asia and US PowerShell & DevOps Summit events, and whilst each of those is a multi-day event, PSDay is at present a single-day event, although the format may change in future; this is something the organising team are keeping in mind for future events.

With this in mind, this year we are running PSDay as a dual parallel-track conference, where we have a solid idea of what we intend the tracks to contain to cater for all skill sets, based on what we've learnt from running consistent monthly user groups in London, and whilst there is a HUGE variety of topics that could be delved into with PowerShell, we have seen a recurring theme with the user group over recent months.

This means that we have been more selective about the sort of topics we are looking for this year, with a view to having topics along the following lines:

Track: The many components of the PowerShell Language
Focus: All things related to the PowerShell Language
Suggested topics: Debugging, Classes, Remoting, Performance, WMI/CIM, Pester, PSScriptAnalyzer, DSC, Workflow, Using .NET in PowerShell & anything centred around the core PowerShell language that can be useful for all skill sets.

Track: Using PowerShell as the Glue of Automation
Focus: All things Automation
Suggested topics: Automating any technology from any device installed anywhere – Azure, AWS, GCloud, Office 365, Microsoft Graph, VSTS, GitHub, PowerShell Gallery, SharePoint, Exchange, SQL Server & more.

 

The idea behind the first track, 'The many components of the PowerShell Language', is that those who are new to PowerShell, and even those of us who have been using PowerShell for years, can come into this track and take away a wide variety of knowledge about the core parts of the PowerShell language from a more general-use perspective. This is expected to be the more generalist track, where attendees can expand on what they learn in their own time and the skills learned can then be taken and used across an enormous number of technologies.

 

The idea behind the second track, 'Using PowerShell as the Glue of Automation', is to be much more centred around using PowerShell with specific technologies, and it is more likely to be the track for those who want the more technical content: people who are either well into their DevOps journeys, already using many differing DevOps practices and perhaps looking at further expanding their skill set, or who are looking at replacing existing or embedding additional technologies within their organisations.

The call for speakers form is located at PSDay Session Submissions, and we are looking for sessions on the topics listed above, to be 60 minutes in length. We currently have a cut-off date of July 31st for sessions, and I would highly suggest that any potential sessions you may want to submit be submitted quickly. I would also suggest that the abstract you submit does not need to be perfect, but it does need to give us as organisers the ability to pick and choose topics from all the submissions. The reason for this is that we will come back to chosen speakers, based on topic and technology, to confirm and polish off their abstracts in the early weeks of August, prior to publishing the schedule by the beginning of September.

We are also currently in the process of working out what sponsorship packages for the event will look like. Until we release a further post on these, if your organisation would be interested in sponsoring PSDay then please reach out to me in the interim whilst we iron out what the sponsorship package will look like; we have already been approached by a few sponsors, so this will be coming along very soon.

I am looking forward to PSDay and I am looking forward to seeing you there, whether as an attendee, a speaker or a sponsor.

If you have any questions at all please reach out and I’d be happy to answer

 

 

It’s been a while!

I have been absent from the technical world for a short while, and in that time I have had a few people reach out and advise me that this blog has been down, due to an expired SSL cert.

 

Simple enough fix & well ‘I’m BACK’

 

Be on the watch for a series of blog articles to follow, over the course of the next few weeks, with a number of fun & technical items to be revealed, as well as some hints at some plans to be revealed, which is starting to make it look like it’s going to be a very fun filled 2018.

 

But for now, I cannot say more, except that I am excited 😉

5 out of 6 in 6 Days = A busy week

This week has been a busy week for me with the SQLRelay and SQLSat Munich events. It has been full of fun especially seeing as for SQL Relay we had the fun bus for travels between the different venues all across the UK.

The week started off as most other weeks do, with me at home in Derby on Monday morning. This was followed by me jumping on the train to Birmingham around 11am for the first leg of the SQL Relay tour, where I presented a completely new and fully non-technical session, something that is a little bit out of my comfort zone compared to the typically more heavily technically focused sessions that I'm used to delivering.

This was a session I've put together based on my own career experiences about the need to really spend time developing and taking ownership of your career. It was pointed out on a few occasions throughout the week that at 26 I've yet to really "have" a career, and although in some ways that can be seen as very true, there is also the other side of the coin, in which I've had the opportunity to see first hand with other colleagues how not owning your career can lead you down a path that doesn't leave you with a role that you enjoy and feel sustained and secure in.

 

I really do feel it is essential that you keep up with your training and take control of your own career – after all, it's your career, and how well it goes is down to you as an individual and how determined you are to achieve the salary and work-life balance you wish to have. Troy Hunt has blogged about his experience of making his job redundant at https://www.troyhunt.com/how-i-optimised-my-life-to-make-my-job/ and this is something that scares most people when they think about it in any real depth. I would also recommend reading the follow-up post Troy has written recently, https://www.troyhunt.com/7-years-of-blogging-and-a-lifetime-later/, as both of these are similar to my way of thinking about work and life balance.

 

Not only did I do a new presentation, but I have also been busy enjoying being on the SQL Relay FunBus, and although I knew a number of the other fellow travellers, it was good to be able to spend some more concentrated time with them. SQL Relay is a great idea and I'm already looking forward to the 'tour de UK' again next year.

 

To top the week off I have also been at SQLSat Munich this weekend, where there have been even more fun times with the extended #SQLFamily, which has been great. I seem to have a thing for Munich and the first real weekend of October, as I was also here last year for SPSMunich, which you can read about in my recap post.

 

I however am looking forward to getting back home after pretty much a week on the road and getting ahead with some of my prep for PSConfAsia in just under 2 weeks.

Speaking at SQL Saturday Munich October 8th!

So last October I attended the first SharePoint Saturday in Munich, which was a great event, and if you want to you can read about my experience in this previous post.

 

However it seems that in October this year I’ll be returning to Munich for the SQL Saturday event where I’ll be delivering my Why & how to implement PowerShell DSC for SQL Server session.

 

There have been a number of changes to the xSQLServer resource over at https://github.com/powershell/xSQLServer in the last few months (yay), so there will be some cutting-edge new insights in this session, which means it looks like I know what I'll be spending my time on soon enough.

 

It's also a session where I cram in a lot of information within the hour (originally it was a 2-hour session at SQL Saturday Exeter), so if you attend, make sure that you are ready to jot down a fair amount of notes.

 

There are a number of other familiar faces from the SQL Community speaking at the event so I’m looking forward to being able to catch up with them all and meet even more of the amazing #SQLFamily.

#PowerShell Side by Side #ProTip

Today I'm going to share with you a small but simple tip to enable you to do more side-by-side testing of PowerShell v6 alongside your currently installed version, in a simpler and less error-prone manner.

 

Firstly we will create a new environment variable, which we can do in a number of ways, but I quite like doing it this way as it's easy enough to script:

Function Update-PS6Path {
    # Find the most recently installed PowerShell 6 folder under Program Files
    $PS6LatestPath = Get-ChildItem 'C:\Program Files\PowerShell' -Directory |
                     Sort-Object CreationTime -Descending |
                     Select-Object -ExpandProperty FullName -First 1

    # Store it in a machine-level environment variable called PS6
    [Environment]::SetEnvironmentVariable('PS6', $PS6LatestPath, 'Machine')
}
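As a quick usage note (this part is my assumption rather than anything special about the function): writing a Machine-scope variable needs an elevated session, and a new console before $env:PS6 becomes visible, so you may also want to mirror it into the current session:

# Run from an elevated console, then copy the value into the current session
Update-PS6Path
$env:PS6 = [Environment]::GetEnvironmentVariable('PS6', 'Machine')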

 

This then means that to launch PowerShell v6 (the latest installed version, anyway) you can run the following from the console; in this case we are passing some of the available arguments to the powershell.exe application, as noted at https://msdn.microsoft.com/en-us/powershell/scripting/core-powershell/console/powershell.exe-command-line-help

& $env:ps6 -NoProfile -NoLogo -ScriptBlock { $PsVersionTable } -NoExit

So hopefully this little snippet will help you out in doing some more Side by Side testing as time goes on.

1 Small thing about running PowerShell Core and Windows PowerShell side by side on Windows

*Updated August 23rd 2016, as there was a change between 6.0.0.8 & 6.0.0.9 to PSModulePath that I had missed – I will be blogging about this in more detail in a future post, but for now check the updated section at the bottom of this post!*

 

If you're like me and you want to test out PowerShell Core on your Windows machines as well as other *nix machines, then you may get caught out by this like I did in the upgrade from 6.0.0.8 to 6.0.0.9.

 

You can grab the MSI installer for 6.0.0.9 at https://github.com/PowerShell/PowerShell/releases/tag/v6.0.0-alpha.9, however do note that there are no Windows 7 or Windows 8 installers, due to the requirement for WMF 4 to have been installed prior to WMF 5, as noted in this issue https://github.com/PowerShell/PowerShell/issues/1931 which links to this issue https://github.com/PowerShell/PowerShell/issues/1705

So let's get into the side-by-side stuff 🙂

 

Once you've installed the MSI, you can run PowerShell 6.0.0.x alongside the installed version on your machine, like so:

 

[Screenshot: PS-SBS]

 

This is because PowerShell 6.x installs in the following location, C:\Program Files\PowerShell\, and as you can see below, I have installed both 6.0.0.8 and 6.0.0.9 on my machine.

[Screenshot: PS-SBS2]

This also means that if we look in the Start Menu we can see the following new options:

 

[Screenshot: PS-SBS3]

 

*Note* This will not change your default version of PowerShell from the one at C:\Windows\System32\WindowsPowerShell\v1.0\, so if you're running Windows 10 on the Insider Fast ring like me, it will still run 5.1.1405.1000.

 

To run one of these alpha versions you have to explicitly do so from the Start menu (or a desktop link if you create one), so you can be sure that this will not cause you any issues with day-to-day PowerShell use.

 

Hopefully that clears up any potential confusion!

 

In 6.0.0.8 the $profile Variable referenced the Windows PowerShell Documents location as can be seen below

[Screenshot: PScore-6.0.0.8]

 

Whereas in 6.0.0.9 we have a new location as shown below

[Screenshot: PScore-6.0.0.9]

 

So when we load 6.0.0.9 we won't get our profile to load, as it doesn't exist.

So that we can get our current profile to load in 6.0.0.9, we can do what we would normally do and just use New-Item, like I've shown below.

[Screenshot: PScore-6.0.0.9-2]
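For anyone who can't make out the screenshot, the gist is something along these lines (a sketch rather than the exact commands I ran; the source path assumes the default Windows PowerShell profile location):

# Create the new profile file (and its parent folder) for PowerShell Core
New-Item -Path $PROFILE -ItemType File -Force

# Copy the existing Windows PowerShell profile contents across
Copy-Item -Path "$HOME\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1" -Destination $PROFILE -Force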

 

This seems to have only been needed in 6.0.0.8 and not 6.0.0.9, as the default values in 6.0.0.9 for PSModulePath are not what we have set in the ENV variable. I'm not sure how this works yet, but I will dig in and post about it at a later date!

 

Then the next time we load 6.0.0.9 we will have a working profile, but the issue is that this will now enable loading of all the modules that we have in our PSModulePath environment variable.

 

However, we can get around this with a few simple lines in our profile:

If ($PSVersionTable.PSEdition -ne 'Desktop') {
    if ($IsWindows -eq $true) {
        # Derive the installed version from the console window title (split on '_')
        $version = $host.UI.RawUI.WindowTitle.Split('_')[1]

        # Point PSModulePath at only the modules that ship with this PowerShell Core install
        $env:PSModulePath = "C:\Program Files\PowerShell\$version\Modules"
        Write-Output 'Removed all but the shipped Core modules'
    }
}

This is a Windows-only, forwards-compatible inclusion in your profile and will only affect the local session of PowerShell that is running.

So you can be sure that this will work across your Windows machines; however, ideally we will get some amendments to PSVersionTable, as noted in https://github.com/PowerShell/PowerShell/issues/1997 & https://github.com/PowerShell/PowerShell/issues/1936, to be able to tell which OS we are on more easily and dynamically.
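Until something like that lands, a rough way to hedge in a profile (my own assumption about how you might structure it, not anything official) is to treat a missing $IsWindows as meaning Windows PowerShell:

# $IsWindows only exists on PowerShell Core; on Windows PowerShell it evaluates to $null
$runningOnWindows = if ($null -eq $IsWindows) { $true } else { $IsWindows }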

 

The $IsWindows variable is only available in PS Core, along with $IsOSX, $IsLinux & $IsCoreCLR, so you cannot currently use them in the full version of PowerShell, and currently I don't think you can build the full version of PowerShell from the repository to a 6.x version. However, this may change in future.

So with 6.0.0.9 and above you can actually ignore the above section completely and comment it out of your profile (or delete it).

 

This is also a good example of the rate of change between the different alpha versions, although I've checked the commit notes and can't see this change mentioned in a concise and easy-to-understand manner, so I will feed this back to them to see if release notes can be improved in future.

My Opinion on Open Source PowerShell and what this means to the PowerShell community

If you've been under a rock the last few days (or for days/weeks/months, depending on when you're reading this blog post) then you would have missed that on Thursday August 18th 2016, Microsoft open sourced PowerShell!

Not only did they Open Source PowerShell they have released a Cross-Platform alpha version that can be installed on a variety of Linux Distros as well as a Mac OSX version.

 

You can read about it in more detail from Jeffrey Snover himself over at https://azure.microsoft.com/en-us/blog/powershell-is-open-sourced-and-is-available-on-linux/

You can also read the PowerShell team's blog post (which has some great links too) on this at https://blogs.msdn.microsoft.com/powershell/2016/08/18/powershell-on-linux-and-open-source-2/

 

But what does this really mean to you, me, and all the other PowerShellers across the globe?

 

Firstly

 

  • Well done – you picked a technology to learn that celebrates its 10-year anniversary later this year (November 14th) and that was considered a "Windows only" tool; now it isn't, which means you could start working with other platforms, increasing your value to your employer *cough, maybe it's time for that pay rise*
  • PowerShell is likely to start changing even more quickly (I'm speculating past the Server 2016 launch here)
  • If you don't want to get involved and help with building PowerShell v6 (and fixing any bugs you find) then you can let this announcement pass you by a little and await an actual release in the future

However, if you do:

 

  • You need to start learning some more new things, not just PowerShell, as there is now an even bigger ecosystem of tools you need to learn to be really efficient and successful in IT going forward.
  • You need to learn how to work with source control, and I will recommend git, like I do to everyone else. Check out https://help.github.com/articles/good-resources-for-learning-git-and-github/ for some resources, but a Google search will get you some others too.
  • You need to learn how to work with GitHub Issues, to search them, and to be able to file an issue that has enough information to be useful for the person reading it. Some of this is helped by issue templates, but these don't always capture all the possible required information, so be prepared to be asked for more info.
  • You need to start attending user groups & conferences and train for your IT future, as the world of IT is undergoing a massive change, and this isn't going to stop anytime soon; if anything, the rate of change is going to get quicker and quicker.

 

 

So where are we right now?

 

Currently the release of PowerShell to GitHub is an alpha release – this means that it is not supported in any case for any production use at all! Basically, it's out there for you to "kick the tyres", so to speak.

It also means that, at least for now and the near future, you may think that you have 2 places to raise issues: GitHub & UserVoice.

However, the guidance from the PowerShell team at present is this:

Customers and enterprise users should still raise issues on UserVoice, as this is still where issues relating to the PowerShell engine contained within Windows client and server systems, including WMF releases, should be raised.

Basically, this means that issues relating to anything PowerShell v5.1 and below should be raised on UserVoice.

My understanding is that this is because these versions of PowerShell haven't been released to GitHub (we have the changes that have occurred since PowerShell 5.1 was rolled up for WMF 5.1), so changes to them can only be made by the PowerShell team – plus we do need to remember that Server 2016 is still yet to RTM, and the source code for that will have been sealed in preparation for launch. So any fixes to the PowerShell engine included in Server 2016 or the RTM version of WMF 5.1 will come either via hotfixes or a recommendation to upgrade to a stable version of PowerShell 6 once released, as we currently only have alpha releases available on GitHub.

However, developers and those that feel comfortable doing so can raise issues on GitHub.

This is where the current guidance from the PowerShell team could easily bring a little confusion to some, but we have to remember that this is new ground for the PowerShell team, so they will need some time to sort out how they work with the different streams. It is likely (and I'm just speculating here) that the team has an internal consolidated issue tracker that tracks UserVoice and all of the PowerShell repos; however, be on the lookout for a blog post from the PowerShell team at https://blogs.msdn.microsoft.com/powershell in the next few weeks where they will detail how they interact with the community across these mediums.

 

So What does the future hold for PowerShell?

 

Over the course of the upcoming months we will see a number of further alpha releases as well as a stronger emphasis on making use of the PowerShell RFC Process for any changes to how the PowerShell Engine works. The PowerShell RFC Process can be found at https://github.com/PowerShell/PowerShell-RFC and there are a few new additions to this already from outside of the PowerShell Team.

 

But the interesting thing from this point on is that more and more of the PowerShell ecosystem will be open sourced, including one module that I've been waiting to tear apart, PowerShellGet, which Jason Shirk confirmed is planned to happen in the future in this issue: https://github.com/PowerShell/PowerShell/issues/1979. It is also worth noting that a number of the modules we have inbox on Windows 10 machines are not written by the PowerShell team, so there is a chance that the module, cmdlet or function that you have ideas to improve (New-Item is one I'd like to see be a bit more intelligent with folder creation) may not be open sourced. However, I think it is only a matter of time before we see demand for these to be open sourced as well, and there are already calls for other modules from other teams to be open sourced, including the SQLServer module (formerly SQLPS), which shows where the ecosystem has been going for some time now.

 

Overall I'm incredibly proud to be working with such an amazing product that has now opened even more doors to me than it had available before. You never know what the future will hold, but now I have a skill that can be used cross-platform, which means that the possibilities in the upcoming months and years of my IT career are even more prosperous than they were last week.

 

If you haven’t yet picked up PowerShell I would seriously urge you to do so!

If you’re struggling with how to pick up this language and understand the benefits it can bring to you or your organisation, and are interested in getting some training that is tailored to your needs or your organisation’s needs, then check out my company Re-Digitise at https://www.re-digitise.org

 

I’m looking forward to seeing how the future pans out with xplat PowerShell – what are you looking forward to the most with this?

Functional / Non-Functional Pester Tests and why I think you really should have a form of both.

So in this blog post I’m going to cover why there is a need to create Functional and Non-Functional Pester tests for your PowerShell modules. Before I get into the nitty gritty of the whys behind creating both, let me explain the real differences between the two, because it may not be something that you have previously thought about or considered on your journey up until this point.

 

Functional

  • Used to test the code’s different use cases
  • Can be either a form of Unit or Integration test
  • Where we “Mock” the functionality to confirm it works as expected
  • Determines the level of code coverage that your tests actually hit
  • Makes functionality changes simpler and easier going forward, as long as you keep writing more Functional tests
  • Should save headaches as code moves between environments as part of a Build/Release pipeline
  • Provides a documentation mechanism to catch bugs so these can be fixed
  • Provides a documentation mechanism to highlight where you may be able to make improvements
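
To make the Functional side a little more concrete, below is a minimal sketch of what such a test might look like for a hypothetical Get-Widget function that wraps a REST call – the function, the call it wraps and the shape of the response are all made up purely for illustration.

```powershell
# Functional test sketch for a hypothetical Get-Widget function (all names are illustrative)
Describe 'Get-Widget' {

    # Mock the underlying call so the test exercises our logic rather than a live service
    Mock Invoke-RestMethod { [PSCustomObject]@{ Name = 'Widget01'; Status = 'OK' } }

    It 'returns the object provided by the service and only calls it once' {
        $result = Get-Widget -Name 'Widget01'

        $result.Status | Should Be 'OK'
        Assert-MockCalled Invoke-RestMethod -Exactly 1
    }
}
```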

 

Non-Functional

  • Can be thought of more as “Traditional Documentation”
  • Aids newcomers to the code base by nudging you to provide some useful help documentation
  • This can also aid newcomers in learning how to understand some of the more advanced functionality
  • We get validation of the function’s parameter types – i.e. should the parameter be a String for input
  • Confirmation of whether a parameter is mandatory or not
  • Gives us a basic form of ParameterSet validation
  • Gives us a basic form of parameter position validation
  • Does the parameter accept pipeline input?
  • Does the parameter accept pipeline input by property name?
  • Does the parameter use advanced validation at all?
  • Does the parameter have at least some help text defined?
  • Does the function have at least a basic level of comment-based help? – let’s leave the pros and cons of that for another topic, shall we.
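
And for the Non-Functional side, a sketch of the same idea – again using the made-up Get-Widget function – where we test the shape of the function rather than what it does, by inspecting the parameter metadata that Get-Command exposes.

```powershell
# Non-functional test sketch - checks the shape of a hypothetical Get-Widget function
Describe 'Get-Widget - Non-Functional' {

    $command = Get-Command -Name Get-Widget

    It 'has a mandatory string parameter called Name' {
        $command.Parameters['Name'].ParameterType.Name | Should Be 'String'
        $command.Parameters['Name'].Attributes.Mandatory | Should Be $true
    }

    It 'accepts pipeline input by property name for Name' {
        $command.Parameters['Name'].Attributes.ValueFromPipelineByPropertyName | Should Be $true
    }

    It 'has some comment based help defined' {
        (Get-Help Get-Widget).Description | Should Not BeNullOrEmpty
    }
}
```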

 

So, with the additional number of tests that we may have to write from looking at the above, why should we spend the time writing them?

This is where the story for Non-Functional tests becomes a little hazy in some ways, but it really depends on how you’ve ended up with the module in question.

 

These possibilities can include

You’ve Inherited or downloaded someone else’s code and you have no clue what its doing because it’s

  • Not well documented with little or no help
  • Difficult to read because of the formatting
  • Uses a number of privately scoped functions
  • All the functions are either in a single ps1 or psm1 file
  • Just needs to be refactored to make it easier to manage, maintain & update going forward

Or it may just be that

  • It almost does what you need but you need to extend the functionality
  • You want to dig a little deeper into how it works
  • You are possibly continuing a discontinued open source project
  • Or you are looking at your own older code and want to give it a much needed update considering you’ve become a more experienced scripter than you were when you originally wrote it

 

If you were to go and create all the Non-Functional tests that I’ve listed above, this would give you a lot of additional tests (and I mean a lot) that would give you more confidence in your code whilst you refactor it, or while you just work out how all the bolts fit together.

However, I will point out that this is really meant to provide you with a single baseline of what is included in the module, and not how the module actually functions – that’s the role of the Functional tests.

 

In my next post I will show you how we can automagically create these Non-Functional Tests for each function included in an existing Script Module, including those functions that are defined as private/internal functions to give us a better chance of being able to manage, maintain & update it going forward.

Recap of a Long February, March, April and May – Events Events Events!

I had intended to do a recap-type post at the end of every month; however, I’ve been very busy so haven’t been able to do so for a number of months – that, and I had an issue with my blog being offline for a few weeks.

Let us start with a recap of the number of events that I managed to attend; I think you can see that I did a lot of travelling and attended a number of different user groups.

I attended the following events

  • Get-PSUGUK – Manchester – Feb 1st
  • SharePoint User Group – Manchester – Feb 2nd
  • Azure Security Training Event – London – Feb 3rd
  • SQL User Group – Manchester – Feb 3rd
  • Get-PSUGUK – London – Feb 4th 
  • Mississippi PowerShell User Group – Feb 10th – Online
  • What’s New in Server 2016 – Microsoft Training Event – London – Feb 17th
  • What’s New in Windows 10 – Microsoft Training Event – London – Feb 18th
  • WinOps Meetup – London – Feb 23rd
  • Chef Meetup – London – Feb 24th
  • Cloud Roadshow – London – Feb 29th – Mar 1st
  • Azure User Group – London – Mar 1st
  • Manchester Geek Nights – Agile and Tech in Local Government – Mar 3rd
  • SQL Sat Exeter – Mar 12th
  • Lean Agile Manchester – Mar 16th
  • SQL User Group Manchester – Mar 17th
  • Manchester .Net – .Net Core recap – Mar 22nd
  • SQL User Group Cardiff – March 30th
  • MCR Tech Event Organisers meet – Apr 7th
  • SharePoint User Group – Nottingham – Apr 12th
  • PSConfEU – Hanover, Germany Apr 19th – 22nd
  • Get-PSUGUK Manchester – Apr 25th
  • Get-PSUGUK London – Apr 27th
  • MVP Open Day – Apr 28th – 29th
  • SQLBits Sat – May 7th
  • Get-PSUGUK Manchester – May 23rd
  • WinOps Conf London – May 24th
  • UKITCamp London – May 25th
  • SQL London User Group – May 25th
  • Get-PSUGUK London – May 26th

So between the beginning of February and the end of May I attended 30 different user groups, training days or conferences – and that wasn’t even all the ones I had planned, due to some unfortunate illnesses that occurred as well.

Now, those that know me will know that I attend these events because I’m genuinely interested in the topics or in catching up with the people that are there; after all, the events are all about the community and the networking opportunities that they bring us.

In future I intend to post ahead of time about where you can catch me in the following months via the Find Me At page, and then at the end of the month detail more about what I learned at the events.

Before I go into detail on the events and what happened at them just take a moment to look at the types of events that they are and the breadth of technology that they span. This may give you an insight into the differing technologies that excite and interest me going forward. 

To start, Get-PSUGUK Manchester on Monday Feb 1st, which seems a long time ago but is still an event that I can vaguely remember enough about to post on. I presented the initial version of my “Teaching the IT Pro how to Dev” session, where I introduced my ISE_Cew module to the audience for helping with getting to grips with using source control with Git and unit testing with Pester. We also had our first community speaker, Tim Hynes @railroadmanuk, who presented on automating infrastructure using PowerShell with the various infrastructure APIs that he’s been working with, including VMware, Cisco & NetApp devices. You can find his presentation at https://github.com/railroadmanuk/presentations and not long after, Tim was awarded VMware vExpert. I know he’s presented at other events since and I’m looking forward to seeing what the future holds for Tim.

Then on Tuesday Feb 2nd it was the SharePoint User Group in Manchester, which will always be a group that is close to me as it was the first user group to give me the opportunity to present (which you can read more about here) – though this was a night about “What you need to know about SharePoint 2016” by Heath Groves @Heath_Groves and Building Enterprise Platforms by Andy Talbot @SharePointAndy – you can find Andy’s slide deck at http://www.sharepointandy.com/?p=550

Heath gave us a rundown of all the things coming in SharePoint 2016 and even prepared some take-me-homes, which included the new and removed PowerShell cmdlets in SharePoint 2016. Andy’s session was a good, thought-provoking one for those that have dealt with SharePoint in the past, and there are some really good points in the slide deck that are applicable to a number of different areas of IT. You can tell this deck was put together with the pains that Andy has personally felt working with a number of different IT departments over the years – a number of which I have felt too, as will a number of you. Even if you’re not a SharePoint person, go and have a look at the deck and see if it resonates with things you feel in your day-to-day IT lives.

Next up, on Wednesday 3rd Feb, it was an early morning with a 5:15am train from Manchester to London for an Azure Security morning at Microsoft’s offices at Victoria – this is an area that more people need to put time into, and I’m looking forward to seeing some further work in this space, mainly from Microsoft. That said, Microsoft recently released the Azure Security Information site at https://azure.microsoft.com/en-us/documentation/security/ so go and have a look at it, as there is a lot of good information in there. The security morning was a good event, although I felt it would have been better as a full-day event, especially as there were a number of issues with getting the interactive demos/labs up and running with the Barracuda security devices, mainly due to issues in the scripts that had been provided to set everything up. They should have written Pester tests for those scripts, as I got the impression they had recently been updated for a recent release of the Barracuda security devices. Some of the attendees managed to get things set up; however, I was unable to, which was not ideal.

I then had to leave London around 14:30 in order to get back to Manchester in time for the SQL Server User Group that evening. Now, everyone that knows me knows my SQL knowledge isn’t close to being on par with those that live and breathe SQL every day; however, one thing all platforms require is a data backend of sorts. So I’ve pushed myself to attend more and more SQL events where possible (as you’ll gather from the rest of this post as well) so that I can learn more about this crucial technology and be able to implement and use it in my own adventures going forward. One of the areas that has piqued my interest is PowerBI, and I was glad to get what was a real beginner’s crash course in PowerBI from what I can only describe as an awesome instructor – Adam Aspin. We also had a session on SQL Server Wait Stats by Rainer Unwin, which was interesting, although perhaps a bit too technically in-depth for me to fully follow at this stage of my interaction with SQL Server – though I’m sure it will be something that I come back to in future.

Then the next day Thursday Feb 4th, I had to travel back down to London from Manchester for the London PowerShell User Group at RackSpace just out of Hayes and Harlington, where I also presented my Teaching the IT Pro how to Dev session with a bit of an update to it from the Manchester session. We also had Rudolf Vesely @RudolfVesely from Rackspace give an Introduction to Pester which was a great session for the audience – Rudolf will be presenting to the London group again in future on a more in depth session on Pester so look out for this.

On Feb 10th I was lucky enough to present to the virtual Mississippi PowerShell User Group, where I presented the Teaching the IT Pro how to Dev session – this was recorded and I’ve blogged about it in a bit more detail here.

I then attended the UKITCamps in London on Feb 17th & 18th on the What’s New in Server 2016 and What’s New in Windows 10 topics, and although these are camps that I’ve previously attended, there are a number of labs in there that are good to get the chance to run over and replay. I also enjoy the UKITCamps as these are Microsoft-delivered training days, meaning that there are a number of others there that I get the chance to network with, along with getting the chance to catch up with the guys running them, namely Ed Baker, Marcus Robinson and Andrew Fryer. I was also very lucky to get the chance to head out for a meal with Ed, Marcus and the other members of the DX team that work behind the scenes to put on these events. I for one look forward to these events being put on by the guys in the DX team, and I know how difficult it is to arrange events like these – and that’s before you include preparing the slide decks and the labs to be used in them. Hopefully we will see more of these events in future; however, there aren’t any currently planned, so we will have to wait and see if more of them appear.

I then had just under a week until my next event, which was decided last minute: presenting my Teaching the IT Pro how to Dev session to the WinOps group in London on Feb 23rd. This was great; however, I suffered a failed MicroHDMI to HDMI adaptor, so I had to try and move my demo and deck to Stephen Thair from DevOpsGuys’ laptop, and as per the standard developer line ‘Well, it worked on my machine’, I was unable to show the demos working. This has led me to build a VM in Azure and a second Hyper-V VM for any demos that I want to run in future, to ensure that the demos work. I’m also planning on getting a dedicated presentation-only device which I’ll wipe between events to ensure that all runs as expected, along with a few backup cables and adaptors to have with me.

Then the next night I attended the Chef Meetup, where I was introduced to GoCD, Terraform & Kubernetes – they all look like interesting technologies, but I need a reason to get in deep with any of them, so look out for me possibly blogging on them in future.

I then attended the London leg of the Microsoft Cloud Roadshow on Feb 29th & March 1st, where there were a number of different sessions running throughout the event, with tracks covering most of Microsoft’s technologies and a number of them focused on the SharePoint/Office 365 ecosystem and the Azure ecosystem. The highlight of the event was being able to go and have a few drinks with Joey Aiello, one of the PowerShell PM team, who was over from the US for the Cloud Roadshow. It was good to have a face-to-face chat and I’m sure there will be more chances to chat in future, including the MVP Summit. Joey is younger than I am and is rocking a very good role at Microsoft – imagine being part of the PowerShell team – that is a number of people’s dream job, and I would be lying if I said I wouldn’t find it amazing to spend my day working even more with PowerShell than I already do. However, as an MVP I do get that luxury already, although it would be a very different role to the one that I’m doing. Who knows what the future holds, but I know that for me it will likely involve PowerShell for a number of years, if not decades, to come.

I also dragged a few people to the London Azure User Group happening on the evening of March 1st, where we were introduced to Boris Devouge, Director of Open Source Strategy at Microsoft. I can only describe him as a ‘Fluently Funny Frenchman’, which makes his presentations engaging, and as this was on the new Azure Container Service (it’s an Azure User Group after all) it was interesting to hear of the partnerships that Microsoft have recently been making in this area with the push to make Azure the most open-source-friendly cloud. The Azure Container Service was in public preview (I think) at the time of the presentation; however, it has since been made generally available, and you can learn more about ACS in this post on the Azure blog site https://azure.microsoft.com/en-us/blog/azure-container-service-is-now-generally-available/

I next attended a talk in Manchester on March 3rd at Manchester Geek Nights on Agile and Tech in Local Government, delivered by Stockport Council, where I was lucky to bump into my good friend Ethar, who always has a good story to tell. I must get the chance to catch up with him again when I’m next in Manchester and not just there on a flitting visit. The talk by Stockport Council left me realising why our governments, local and national, get a lot of stick for being poor at the delivery and execution of their IT projects (and projects in general): there is so much fragmentation in the IT systems being used across all the differing councils, due to them all having separate and diminishing IT budgets for any projects. I personally think that centralisation of all UK council and local government IT into a single pool would work much better for the public, and my reasons for this are pretty simple: enhanced governance, lower boundaries to sharing data between the different departments that need to share data nationally (think social care departments, housing departments etc.), and generally a simpler-to-manage infrastructure and workforce. Though perhaps I’m biased, being from a Microsoft background, which means that I can see some opportunities to scale similar services nationally, which would be massively more cost efficient. Almost all the banks have done this and realised the benefits, and to me it makes sense for the public services sector to do the same too! It was, however, interesting to hear about how Stockport Council are embracing open source technologies and essentially building out their own products, which they are in turn open sourcing for other councils to take advantage of too. It’s an interesting journey for them to take and I hope that the effort doesn’t end up being completely canned in a few years’ time if a nationalisation of IT services to councils were to occur. In my opinion it is a logical step for this country to take, though I’m not sure politicians and logic can go together. We will have to wait and see.

 

SQL Sat Exeter – March 12th. Well, I’m not really sure I need to say much more than that. However, it was a great event and my first time doing a back-to-back, demo-heavy session on PowerShell DSC. Even more scary, it was DSC for SQL Server. I hadn’t realised how much of a headache the SQL Server DSC resources were until I spent the majority of the week leading up to it getting annoyed with little things like hardcoded values for where the resource expected the install media to be. I got so frustrated with it that I began to rewrite the resources so that they would work how I expected them to, which meant that I spent more time writing DSC resources from scratch than actually doing anything useful. Especially as a week or two after SQL Sat Exeter I wiped the drive with the resources on it. Yes, they were in source control, but only on that machine – lesson learned – DOH!!!

SQL Sat Exeter was my first real foray into SQL community events other than user groups, and after the fun I had at Exeter I can see why it is they call themselves SQLFamily. In the lead-up to my sessions there was a run around to get some bacon sandwiches and a fair amount of drama with my demos having decided to kill themselves that morning – however, I managed to get them working before my session and there were some good reviews that came from it. I know where I need to improve the content and am looking forward to SQL Sat Paris in a few weeks, where I will need to cram all of the information from 2 hours into 45 minutes. #ChallengeAccepted

It was also on the Saturday night, at the after-event curry and the drinks that followed, that the discussion about SQL Sat Manchester having a PowerShell track came to fruition. I was lucky enough to have ended up out with Chris Testa-O’Neill and the other organisers at SQL Sat Manchester the year before (my first SQL Sat event, which I went to as an attendee), so it all felt natural to be there along with a number of other familiar faces like Rob Sewell and Steff & Oz Locke. It’s like a reunion, and I’m looking forward to what will be a kick-ass SQL Sat Manchester this year. The PowerShell track shaped up nicely Smile. One thing I’ve learnt about the SQL community is that it really does kick ass, but then again all the IT communities I’m a part of do. Our passion brings us all together, and with it we make sure to have a bloody good time when we get together. Else why bother?

On the Sunday morning I had an interesting email come in as I sat having breakfast, which led me to question it a little with Chris & Alex Whittles, and well, history has been written since that morning. I also got the chance to help Rob out with a DSC issue he was having and gave him the guidance he needed to resolve it in the right way as things currently stand; in future we will have a feature-complete PowerShell DSC resource for SQL Server – though this will require some community help, and you can help out by voting on / adding items to the Trello board at http://sqlps.io/vote

Next up on my events (and halfway through the 30 events I attended) was Lean Agile Manchester on March 16th – a firm favourite of mine as it’s a great community (like they all are) – where we were treated to a talk by Jon Terry (but not that Jon Terry!) from LeanKit about how they deal with working in a Lean/Agile way with their FSGD (Frequent Small Good Decoupled – said “FizzGood”) approach. It’s another example of where the software/manufacturing world brings good things to the rest of IT, and generally other areas too, and I would highly recommend that you go and read their blog on FizzGood at http://leankit.com/blog/2015/07/does-this-fizz-good/ and take away from it what you can.

Next up on the user groups that I attended was the Manchester SQL User Group, where we would be walking through Cortana Analytics – something I was looking forward to, as at SQL Sat Exeter Chris Testa-O’Neill and Cortana essentially got a divorce whilst he was prepping in the speaker room. I’m sure with a decent set of data I’ll be able to find a good use case for Cortana Analytics, and I have some ideas in the pipeline, so keep an eye out for future posts on this.

As a Non-Dev Admin who realised that I am really a Dev but just wasn’t ready to admit it to myself, I find that the .NET User Group in Manchester is a useful group to attend, especially when the topic is .NET Core, which it was on March 22nd. Even more so as, with .NET Core, there is a real possibility that the PowerShell engine will eventually be open sourced, especially as we are seeing a refactor of the existing cmdlets to be able to run on Nano Server, with more coming with each new TP and more still to come for Server 2016 GA. We were treated to a history lesson on .NET Core by Matt Ellis @citizenmatt, with the slide deck at http://www.slideshare.net/citizenmatt/net-core-blimey-windows-platform-user-group-manchester, which again is well worth a read.

Next up was just after I had moved from Manchester to Derby and still had the hire car – and I had an itching to go and see some of my SQL friends in Cardiff, especially as it was an epic event – Return of the Beards! This meant that not only did I get the chance to catch up with Steff Locke again, but also with Rob (again – it seems like that guy gets everywhere Winking smile), another one of my SQL friends Tobiasz Koprowski, and lastly the other bearded SQL guy of the night, Terry McCann. This was where I got to learn a bit more about T-SQL from Terry and Securing SQL in Azure from Tobiasz, but also see Rob’s session on the pains of context switching and how PowerShell & PowerBI help him not get mithered for information that can be easily made available and easily searchable with a little effort. This is for me a great example of real-world use of PowerShell and PowerBI being useful together, and it is well worth watching Rob deliver this if you get the chance.

I then attended my first Tech Organisers Meetup in Manchester on April 7th – it was good to meet the other tech user group organisers in the Manchester/NW area and have the discussions that were needed as a collective to help strengthen the view that Manchester is a blossoming tech hub in its own right – something that Londoners seem to miss out on. Manchester is ace because it’s cheaper than London and is actually more lively at night than London (I’ve found), and you can literally walk from one end of the main city centre to the other in about 20 minutes or so, and within that you have the Northern Quarter. So you are pretty much sorted!

Next up I had another event I presented at – the SharePoint User Group in Nottingham on April 12th. I presented on PowerShell DSC for SharePoint like I did at the SharePoint User Group in Leeds in January, but this was a special one for me as it was the first user group that I presented to after being awarded MVP. Being awarded on April Fools’ Day led me to post Congratulations 2016 Microsoft MVP at 15:31, about 10 minutes after getting the email, and then Fooled Ya – Today I became a MVP at 15:55 – I also blogged Awarded the MVP Award – What it means to me and the future for the Community. We also had a talk from Garry Trinder @garrytrinder on Require.JS, which can be used in conjunction with MDS (Minimal Download Strategy) in SharePoint 2013 and Online sites to help bundle up and control your page load and transition times. JavaScript is one of those dark arts that I’ve not had to do much with – but I certainly would look to use Require.JS in any of my future web projects.

My next event was PSConfEU, and this was the event that I had been looking forward to because of the sheer work that went into it by all involved, including Tobias Weltner and myself, to make it a success. Due to the size of this event I will put together another post in the coming days that really captures the details of what an amazing event it was, as I don’t think a few sentences will do it any real justice. Plus, I want to relive the experience in as much detail as I can so that I can share it with you as well – so that if you weren’t able to make it, hopefully you’ll do what you can to make PSConfEU 2017. Planning for PSConfEU 2017 will most likely begin in early August, so there will be small announcements at some point after then, though it’s still all to be determined.

As a spill-over from PSConfEU, I had managed to bribe June Blender into agreeing to come and present at the Manchester & London PowerShell User Groups – though to be honest there wasn’t much bribing involved, as June had wanted to come to Manchester anyway, and timing-wise it just worked out great. June gave her Thinking in Events hands-on lab at both groups; both groups had some great questions and I’ve had some fantastic feedback from the sessions, which has led me to start working on preparing my own hands-on events for the future. These are “in the works”, so to speak, and details will start to appear in the next few months.

Next up was my first MVP event, where we went to Bletchley Park – a fantastic historical site, and I’m planning to head back there again in future. The event was good for me as it allowed me to meet up with other UK MVPs, including fellow PowerShell MVP Jonathan Noble. There is a good story behind how we ended up meeting on the train up from London to Bletchley Park, and it starts with me forgetting to charge my laptop and phone the night before. When I got to Euston I was frantically trying to make sure that I got on the right train to get to Bletchley. I had messaged Jonathan whilst on my way and had found out that we were catching the same train. However, phone signal is pretty poor when you are travelling out of London, and just before my phone died I managed to send him a message letting him know I was about halfway up the train. About 20 minutes passed and then all of a sudden this guy two rows in front of me got up, came over to me and said “Hello – it’s Ryan isn’t it? I’m Jonathan, only just got your message”, and from that moment we just continued chatting. When we got to Bletchley, Jonathan was able to lend me a power bank to charge my phone – not that I really needed it, but having charge on your phone is a comfort thing now, isn’t it. We had an afternoon of talks and then a really nice drinks and dinner where I got the chance to meet some more of the MVPs, which was good. The next day we had some presentations in the morning, and then we had to make some rocket cars in the afternoon. It was great fun to do something less techy but still something that most enjoyed. I was lucky to be able to get a lift with Alex Whittles from Bletchley, along with Steff Locke, to Birmingham New Street Station, which allowed for a number of good conversations about SQLBits & SQLRelay – both being events that I may get more involved in in future, if I can manage to stretch that far that is. Once Alex dropped me and Steff off, we worked out that we either had half an hour to try and get something quick to eat before running for our respective trains, or we could get something decent to eat and then get a drink afterwards before catching the train after that. Naturally, decent food and drink was always going to be the winner Smile.

 

Nearly finished with the recap, with just 6 events left to cover – so if you’ve read this far, well done, you can make it to the end Smile

 

I then attended the SQLBits Saturday event on May 7th in Liverpool, and although I got there not long before lunch I was still able to get to the sessions that I wanted to – mainly the SQL Tools session, seeing as SSMS has been decoupled from the SQL Server install, which is 100% the right thing to have done. Like at other SQL events I bumped into Alex, Steff, Rob (he is literally everywhere Winking smile), Tobiasz and a number of other SQL people including Mark Broadbent, Niko Neugebauer, André Kamman, John Martin, Mladin Prajdic & Neil Hambley, to name just a few. As per all these events, once the curtain for the event has closed, that is when the food and drinks appear, and I’ve realised that I have a soft spot which stops me saying no to going for a curry and drinks with all these amazing people. This means that at future events I’ll be planning to stick around for the almost-guaranteed after-event curry and the ensuing drinks and conversations that happen around it.

I then had the amazing opportunity to meet and spend a few hours with Ed & Teresa Wilson – The Scripting Guy & Scripting Wife – where I took them for a wander down to the University of Manchester campus and took them to KRO – a nice Dutch place for some food which was right around the corner from where I used to work when I was at UoM. We then strolled leisurely around the campus on the way back towards the venue for the user group, where we had Ed talking us through OMS & Azure Automation DSC, now that Ed is part of the OMS team at Microsoft. Because we had to get a train to London at 21:15, the user group was an hour shorter than it normally would be, so we didn’t have time for the pizza and after-drinks that we would normally have had, but the turnout was still one of the best we’ve had, and there will be more events like it planned in future, with an aim to have the next Manchester user group occur in July.

As I mentioned Ed, Teresa and I all had a Train to catch to get to London for WinOps, and much like PSConfEU, I am planning to blog about this event separately to really capture the spirit of the event. Look out for that post in the next week or two.

 

We then had the UKITCamp, where Marcus Robinson & Ed were going over the feature sets of Azure & OMS. I unfortunately missed the morning of this event due to being called onto a customer production issue conference call – 3 hours of my morning I couldn’t get back; however, sometimes that is how these things go. As I was leaving the venue I found out that the London SQL User Group was on that evening, and I decided to stick around for it as the topic was “Common SQL Server Mistakes and How to Avoid them”, which is the kind of SQL topic that I enjoy because it isn’t deeply technical but allows me to understand the product just that little bit better than I did beforehand.

Lastly, the London PowerShell User Group, where we had Ed again and had the highest turnout so far. Ed was again talking about OMS & Azure Automation DSC, but there were also a number of opportunities for some open, directed questions from the audience, which is always an added bonus of having more and more people turn up to the group. We overran a little with the conversations that were flowing, mainly due to having an excess of beer and pizza – something that we haven’t had happen before at the user groups. Then, as per usual with the user groups, we ended up finding somewhere else to go for another drink or two and continued the conversations.

 

So that’s most of my last 3 months summarised – what have you done in the last 3 months?

Future posts like this will be much shorter, contain some pictures and be completed on a monthly basis.

Thanks for reading – Hope you have a great day!

Creating a set of simple Pester Tests for existing or old PowerShell Modules & making them easier to update in future.

I have long thought of a way to Automagically create some Pester Tests for the Functions contained in a module that perhaps was developed before Pester was really well known.

Back then we may have been creating psm1 files that contained a number of nested functions within them. I for one know that I did this / added to existing modules that were built this way – have a look at SPCSPS on GitHub (aka SharePointPowerShell on CodePlex) as one of the first projects that I got involved with in the open source world.

*Please note: for any real SharePoint PowerShell work I would highly advise checking out the OfficeDevPnP team’s work at PnP-PowerShell instead of the example I have given*

However, this is where we are at with a number of older modules, and as mentioned above with SPCSPS, it was an exceptionally common pattern to run into.

However this isn’t a very scalable way of working with existing codebases and as very frequently found in the PowerShell community an author will not be able to spend the time reviewing the code and accepting pull requests from others. I have previously blogged about the need to “Pull The Community Together” to remove this totally unneeded & actually quite ridiculous barrier to better Modules for the benefit of the community. From a personal stand point –  A PR to an Open Source repository that is open with no input at all for more than a month shows that there is no value in adding to that prior Repository as it shows the Repo Owner has little/no time to do the needed Code Reviews etc.

Now, one of the ways that we as a community can negate this issue is to build a stronger collaborative platform for these modules and build teams of people that can be relied upon to perform cohesive reviews on various aspects of the code being added. By a platform I mean a collective of ALL of the community, working out who the right people are to get involved in the differing areas of the PowerShell language.

Funnily enough GitHub within Organisations has this model already defined – called Teams. This allows us as a community to have an overarching organisation that will allow us to add the right people to get involved in discussions about certain semantics as we move on in time.

This is essentially a massive change to how we as a community do things; however, at this point in time it really is the best way forward to minimise duplicated effort across multiple codebases and to ensure that we have the best, fully functional modules out there for others in the community to work with.

Again please read my previous post “Pull The Community Together” on my thoughts on this.

Anyway, back to the actual topic of this post. From this point on I will be using the SPCSPS module as a good example to work with, as it reflects how we currently find modules spread across various repos.

So with SPCSPS I have 103 Functions that are put together in 11 psm1 files. Here are a few Screenshots just to back this up.

Although this “works”, it’s not great when there may be a number of additions to one file (new functions, removing existing functions or rewriting functions completely), and this can be an easy way for merge conflicts to occur – which we do not want.

So, to get around this, I realised that the only way was to write a function that will export all the functions (not cmdlets) from a module and, whilst doing so, create a basic Pester test for each of the exported functions in a user-specified folder. My reason for choosing to do it this way was to allow users to check the exported code before merging it into their existing codebases, even though I am actually quite confident that it will work as expected.

This will allow users to refactor any of their existing code much easier going forward and will allow them to also benefit from having some basic pester tests that they can then expand upon.

 

The key component of this is the Export-Function function which has 2 parameters

  • Function – As a String
  • OutPath – As a String

Under the hood, when passed the function name and the OutPath, the Export-Function function will get the function definition and all the parameters from the function, and will then create the files below based on the following structure.

OutFilePath\FunctionVerb\FunctionName.ps1

OutFilePath\FunctionVerb\FunctionName.tests.ps1
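
As an illustration, a call might look something like the below – the function name and output path are purely hypothetical, picked just to show the folder layout that results.

```powershell
# Hypothetical usage - exports a single public function from the loaded module
# and creates a matching basic Pester test alongside it
Export-Function -Function 'Get-SPCList' -OutPath 'C:\Source\SPCSPS'

# Based on the structure above, this would produce:
#   C:\Source\SPCSPS\Get\Get-SPCList.ps1
#   C:\Source\SPCSPS\Get\Get-SPCList.tests.ps1
```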

Technically this isn’t actually difficult for us to do at all (hey we are using PowerShell right) but will allow us to quickly and easily add tests to existing (or new) code with little amount of effort and as a PowerShell Enthusiast this is exactly why I started working with PowerShell in 2013.

As a small note this will only work with public functions – though if you were to explicitly load private functions into the current session in a way they become public then you could use this to do the same for those as well.

The module is available on the PSGallery called PesterHelpers and is available on Github under https://github.com/PowerShellModules/PesterHelpers
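
If you want to give it a spin, the Gallery install is the quickest route (assuming you have PowerShellGet / WMF 5 available):

```powershell
# Install from the PowerShell Gallery and take a look at what is exported
Install-Module -Name PesterHelpers
Import-Module -Name PesterHelpers
Get-Command -Module PesterHelpers
```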

The benefit of this module is that it allows a quicker way to move away from modules that contain multiple functions in one psm1 file (or nested ps1 files), and it can be used to help start building a suite of Pester tests for other modules when used with the accompanying PesterHelpers.psm1 & PesterHelpers.basic.Tests.ps1 files. This is made possible by modularising as much of the code in both of these files as possible.

A shoutout must go to Dave Wyatt for a section of code contributed to ISE_Cew a while back which, on review whilst looking to expand that module, led me on to creating this module.

 

 

How to find Local User Groups & events – My Experience

I had a discussion last night via twitter with one of the attendees that I met at the Microsoft Cloud Roadshow in London earlier this year and the outcome of the conversation was that although I find it easy to find out about events – this isn’t all that common for others.

 

So I decided that I would quickly jot down some of the places that can be useful to search to find events that are going on around you.

  • Word of Mouth – If you know a number of people in the area ask them if they know of any events going as they will likely be closest to the events.
  • Twitter – There are a number of Twitter accounts out there that are just setup to serve what’s happening in your area. A good example is the @TechNWUK twitter account which lists all the events around the North West that the group knows about.
  • Eventbrite – www.eventbrite.co.uk is another good place to find tech events – especially those that are full day events or conferences – Just do a quick search for a specific Technology and you’ll get some results back on upcoming events around you.
  • Meetup – www.meetup.com is another, and increasingly common, place for user groups to promote themselves. Similar to Eventbrite, but with a much more social feel to the event listings. You can also find many more non-techy events listed there, which can be very interesting and useful. My only gripe with Meetup is the admin cost of setting up a group, which at $89.94 per 6 months for the unlimited subscription isn’t really what I would call reasonable for a user group marketing channel – though this does allow multiple groups under the one subscription, so it can be shared as part of a collective, like Get-PSUGUK.
  • Facebook & LinkedIn Groups – Both of these can also be an avenue for finding out about User Groups or events.
  • MSDN Events – http://events.msdn.microsoft.com/ – this can have a number of the Microsoft focused events on there as there is the ability to register as a Technical Event lead on https://www.technicalcommunity.com/ and this allows you to get the event posted to the MSDN events pages

 

If you still can’t find any events around you then I would suggest to try the following

  • Speak with those that you recognise from the community – this could be a Twitter DM etc but is normally a good starting point as they may know of events that already exist or are in the initial starting up period
  • Try to reach out to organisers of similar events, as they may well know of one starting up soon in that area, or it may just be advertised somewhere other than the above; this is especially common with more broadly focused technologies like the various JavaScript frameworks.
  • Broaden your search area as some user groups will try not to have meetings too close together. Examples of this would include having groups in Birmingham & Wolverhampton.

 

Lastly, good luck in your search, and if you still haven’t found a user group in your area then why not think about setting one up? If there are already similar communities out there in other areas, then reach out to the organisers of those events and see if they can provide any guidance.

The Pains of Poor/Missing Documentation

There will be a time when you are attempting a new task, whether personally or professionally, and you find yourself having to resort to the product’s documentation to get to the end goal – whether that be putting together a new piece of furniture, preparing an exquisite meal or bashing together different bits of software from different companies, or more commonly the same company.

One thing that is common in all these scenarios is that if the documentation is completely missing then you are forced down the road where you take the “pot luck”/”educated” guess to get to the desired end result and sometimes that can lead to some hilarious results, especially if it is in relation to cooking or building furniture.

In personal experience this has been most common with second-hand furniture and this is because there are few people that keep their assembly instructions once the furniture has been assembled. I think this is due to the “I’ll never need to take this apart and build this again” thoughts that we like to have.

This mentality, as it were, is rather similar in the IT world as well, and it is because of this that we have seen lots of undocumented software features. Anyone who has worked with the SharePoint object models in much depth will be more than familiar with the idea of missing documentation.

 

In the IT world this is something that we have all understood and realised is an issue; at some point in our careers we’ve all been on the receiving end of missing or poor documentation, and when it happens we’ve either had to turn to technical forums or write it ourselves.

Over the years this has started to get better, and I for one am glad to see the initiatives that technology organisations are taking to start open sourcing product documentation. A number of teams at Microsoft are doing this now via GitHub, and to me this reinforces the need for all IT Pros and developers to understand how to use GitHub and the underlying Git software as part of the core tools within their tool belts. In 3 years’ time I wouldn’t be surprised if other source control mechanisms like SVN and Mercurial have almost been fully replaced by Git. It says something that Microsoft have fully adopted Git into both the hosted and on-premises versions of TFS.

So if you read this blog and you haven’t learnt Git yet but are writing PowerShell – go and watch this Session that I did for the Mississippi PowerShell UserGroup as detailed in this previous post and read up on the “My Workflow With Git” Series starting with this post

 

We are at a good point in time where the people behind the products we love and use each day are listening to us in a much more open way than previously, and over the coming weeks I’ll be updating the following site with all the Microsoft UserVoice / Connect links, in a nicer format than they are currently.

If you want to help and get involved then drop me a message and I’ll get you added to the Organisation to be able to add commits

Building A Lab using Hyper-V and Lability – The End to End Example

Warning – this post is over 3800 words long and perhaps should have been split into a series – however I felt it best to keep it together – Make sure you have a brew (or 2) to keep you going throughout reading this

In this post we will be looking at how you can build a VM lab environment from pretty much scratch. This may be for testing SharePoint applications, SQL Server or Exchange, or it could be for additional peace of mind when deploying troublesome patches.

Our requirements for this include

  • A machine capable of running Client Hyper-V – needs SLAT support (most machines released in the last 3 years are capable of this; see the quick check after this list)
  • Windows 8.1 / 10 / Server 2012R2 / Server 2016 TP* – In this post I will be using Windows 10 build 14925 – ISO download is available from here
  • If using Windows 8.1 then you will need to install PowerShell PackageManagement – you can use the script in my previous post to do this as detailed in here
  • A Secondary/External Hard Drive or Shared Drive – this is to store all Lability Files including ISO’s, Hotfixes & VHDX files
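
If you’re not sure whether your machine supports SLAT, one quick way to check is via the Hyper-V requirements section of systeminfo, as below – note that if Hyper-V is already enabled, systeminfo will instead report that a hypervisor has been detected.

```powershell
# Quick SLAT check from an elevated PowerShell prompt
systeminfo.exe | Select-String 'Second Level Address Translation'
```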

Where do we begin?

Obviously you need to install your version of Windows as detailed above and once you have done this you can crack on!

Time Taken – ??? Minutes

However as mentioned I’m going to Use Windows 10 – This is just personal preference and is for my ease of use.

As you hopefully know by now Windows 10 comes with WMF5 and therefore we have PackageManagement installed by default. We will use this to grab any PowerShell Modules that we need from the Gallery. I personally have a Machine Setup Script that lives in my Onedrive as you can see below. As this is a Windows 10 Machine I am logging into it with my Hotmail credentials – this then means that I am able to straight away pick the folders that I want to sync to this machine (joys of the integrated ecosystem)

This takes about 5 minutes for OneDrive to finish syncing and then we are ready to go onto the next step.

Time Taken – 5 Minutes

Lability1

In this stage I will Open ISE with Administrator Privileges – this is required as I need to change the Execution Policy from Restricted to RemoteSigned as well as run other scripts that require elevation.

Once I have done this I can move onto the next step. This includes setting up my PowerShell Profile and Environment Variables and then setting up all the required functionality for me to continue working on this new machine.

This includes setting up the ability to install programs via Chocolatey, like VSCode & Git, and installing modules from the PowerShell Gallery, a few examples being ISE_Cew, ISESteroids and, importantly for this post, Lability. It is also worthwhile noting that at this point I am not downloading any DSC resources as part of my setup script – this is because we will cover that later on as part of the workings of Lability.

As an additional note, it is worth mentioning that the version of Lability at the time of writing this article is 0.9.8 – however, this is likely to change in future with more features being added as required. If you have a thought, suggestion or issue, then head over to the GitHub repo and add it there.

In this script I am also enabling the Hyper-V Windows feature so that I can carry on with this lab, and I then initiate a system shutdown. Overall this whole section takes maybe about 10 minutes to complete, and yes, I intend to build this as a DSC resource in the near future. However, it is worthwhile noting that Lability has a function that will check the Hyper-V feature is enabled and that you are not awaiting a system reboot – more on this a little later on.
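
For anyone wanting to follow along, a stripped-down sketch of what that kind of setup script can look like is below – the exact modules and Chocolatey packages are down to personal preference, and the package names shown are just examples.

```powershell
# Minimal machine setup sketch - run from an elevated PowerShell session
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine -Force

# Pull the modules we need from the PowerShell Gallery
Install-Module -Name Lability, Pester -Force

# A couple of tools via Chocolatey (assumes Chocolatey is already installed;
# package names may differ)
choco install git visualstudiocode -y

# Enable the Hyper-V feature and then reboot to finish it off
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All -NoRestart
Restart-Computer
```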

Time Taken – 15 minutes

Once the reboot has completed we can then get on with the Lability bits, and that is the really interesting part of this post.

Lability Functions

Lability has 38 public functions and 6 Aliases as can be seen below.

Lability8 Lability9

I wouldn’t worry too much on the aliases as these are built in for continued support from prior versions of the Lability Module and will likely be removed on the 1.0 release.

We will be using a number of these functions throughout, and as is always best practice, have a read of the help for the functions – and yes, they do include some great comment-based help.

There are a number of additional private functions in the Lability module that have comment based help too but again I wouldn’t be worrying about these too much, unless you need to do a lot of debugging or want to help add to the module.
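
If you want to see the full list on your own machine, Get-Command will give you the same breakdown as the screenshots above, and Get-Help will get you to that comment-based help:

```powershell
# List the public functions and aliases exported by Lability
Get-Command -Module Lability -CommandType Function
Get-Command -Module Lability -CommandType Alias

# And read the help for any of them, for example
Get-Help Start-LabConfiguration -Full
```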

The key Lability functions that you will need, likely in the below order, are:

  • Get-LabHostDefault
  • Set-LabHostDefault
  • Reset-LabHostDefault
  • Get-LabVMDefault
  • Set-LabVMDefault
  • Reset-LabVMDefault
  • Start-LabHostConfiguration
  • Get-LabHostConfiguration
  • Test-LabHostConfiguration
  • Invoke-LabResourceDownload
  • Start-LabConfiguration
  • Start-Lab
  • Stop-Lab
  • Get-LabVM
  • Remove-LabConfiguration
  • Test-LabConfiguration
  • Import-LabHostConfiguration
  • Export-LabHostConfiguration

These are just a few of the Functions available in Lability and we will cover most of these functions in greater detail as we head through this article.

Lability Media Files

Lability has a number of different configuration files, all in JSON format: HostDefaults, VMDefaults and Media. All of these files are in the Config folder of the Lability module, which on your new machine will be C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config

The HostDefaults file contains all the settings that we associate with the Lability host machine. These include the paths where we will be looking for any ISOs, VHDXs, hotfixes and any additionally required resource files for our lab.

The VMDefaults file contains all the default settings that we associate with the created VM’s. This includes Media used to create the Machine, Startup RAM, Number of Processors and which virtual switch we can expect the VM’s to use. This can be useful to have just like the HostDefaults but as we will see later on in this post we are most likely to override this in our configurations.

The Media file contains the settings for any media that we want to use. As Lability by its nature was built for building labs, it uses the evaluation-licensed media for the VMs.

The benefit of this is that the items already in this file allow you to get on with building labs almost straight away on a brand new machine.

This file also includes hotfix download links for getting the DSC updates on WMF4 for Server 2012 R2 & Windows 8.1 – but don’t worry, Lability uses these to download the hotfixes and embed them into the VHD files for you. One less job to worry about Winking smile
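
As these are plain JSON files you can have a poke around in them straight from PowerShell – a quick sketch below, using the module path and version mentioned in this post (adjust both to match your machine):

```powershell
# Peek at the Lability defaults and media definitions
$configPath = 'C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config'

Get-ChildItem -Path $configPath -Filter *.json | ForEach-Object {
    $_.BaseName
    Get-Content -Path $_.FullName -Raw | ConvertFrom-Json
}
```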

LabHost Defaults

Firstly we need to get the LabHost defaults set up correctly for our environment – this is important, and it is also great for being able to move labs between machines if required (I’ve had to do this a fair amount myself), which is why I recommend that all the core Lability bits are kept on a separate drive.

Personally I’m using an External Hard Drive but that is because my Lab is portable. I have not tried this with a Shared Drive however there shouldn’t be much that needs to change to get it working that way.

On my external drive I have the following setup: a folder called Lability, and in this I have all the folders required by Lability as detailed in the LabHost defaults, as we will see below. However, I also have another folder, Lability-Dev, which came from the zip of the repository that you can download from GitHub, as this was prior to Lability being made available on the PowerShell Gallery. In essence, this means that I have a copy of Lability that I can edit as required – especially the 3 Lability configuration files detailed in the previous section – and it also allows me to do additional debugging as required.

Firstly we will run Get-LabHostDefault, and this should return the below by default – this is because the HostDefaults.json file is stored in the C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config location (remember, 0.9.8 is the current version – yours may vary).

Lability4

As this is the default and I’ve been using Lability on a few different machines, I have a copy of it on my external HDD in the Lability folder. Let’s see what that file says it should be.

Lability5

Well – that’s not good! As you can see, on my last machine the external drive had been the D drive, but on this machine it’s the E drive. A simple (yet annoying) thing that we can easily change. Now, this could be done manually, but I decided that I wanted to wrap it all together so that I don’t have to think about it again. It is simple enough, so I just wrapped it in a very simple function, as seen below.

Lability6.1

This allows me to update it quite easily as I move between machines. This isn’t an ideal scenario, but it works at least.

The benefit of this is that it will update the HostDefaults file on both my C: drive and the external drive at the same time – which further means it is easier to keep things portable.
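
I won’t reproduce my exact function here, but a minimal sketch of the idea – swapping the drive letter in both copies of the JSON file – could look something like the below; the file locations are the ones used in this post and the function and parameter names are purely illustrative.

```powershell
# Illustrative sketch only - updates the drive letter in both copies of the host defaults
function Update-LabHostDriveLetter {
    param (
        [string]$OldDrive = 'D:',
        [string]$NewDrive = 'E:',
        [string[]]$Path = @(
            'C:\Program Files\WindowsPowerShell\Modules\Lability\0.9.8\Config\HostDefaults.json',
            'E:\Lability\HostDefaults.json'
        )
    )

    foreach ($file in $Path) {
        (Get-Content -Path $file -Raw) -replace [regex]::Escape($OldDrive), $NewDrive |
            Set-Content -Path $file
    }
}
```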

We can then run the function Reset-LabHostDefault and we should get something similar to the below

Lability7

We can also do the same thing for the VMDefaults file; however, I find this is less likely to be a requirement, as we can override the defaults in the configuration data files that we will work with, and this is my preferred method.

Once we have done this, we are ready to run the Start-LabHostConfiguration function – on a new machine this will go and create the required directories as specified in the HostDefaults.json file that I have shown you how to amend. The output from Start-LabHostConfiguration is below.

Lability18

We would then use Test-LabHostConfiguration to confirm that this is all correct and we can see that this is the case below

Lability19

Building your First Actual Lab

Wow, that was a fair bit of setup, though a lot of it may be completely ignored depending on your own setup or if you’re re-visiting this post.

Now we move on to the real meaty part of the post, and I’m going to use 2 examples for this – the bundled TestLabGuide and one of my own for a SQL Server install.

So, starting with the TestLabGuide.ps1 file, there is only one small modification that I have made, at the end of the file, and that is to include the following 2 lines.

Lability10

This allows me to build the configuration for these VMs as if it was a script and this is how I am personally doing it.

However, on a machine with no DSC resources, we have an issue if we are building VMs that are dependent on those DSC resources.

Well within Lability there is a Function called Invoke-LabResourceDownload and this has the ability to download all the required resources that we need as defined in our configuration data file.

Within the configuration data file shown below, the key section to look at at this point is the NonNodeData section, where we have a subsection for Lability configuration items. This can include EnvironmentPrefix, Media, Network, Resources and, most importantly for us, DSC resources.

So far I have found that we only need to run this to pull down the DSC resources defined in our configuration data file, as shown below; this is because they need to be on the machine before we can build the MOF files.

Lability15

I found it best to pin the DSC resources with RequiredVersion rather than MinimumVersion, which is the default in the TestLabGuide.psd1 file. This is personal preference, but with the amount of change happening in the DSC resources it's worthwhile being extra cautious here.
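
For illustration, the DSC resource entries in the Lability section of the configuration data look something like the sketch below – the module names and version numbers are placeholders rather than the exact contents of TestLabGuide.psd1, and the key names can vary slightly between Lability versions.

```powershell
# Inside the configuration data (.psd1) file
NonNodeData = @{
    Lability = @{
        DSCResource = @(
            # Pinning with RequiredVersion rather than MinimumVersion keeps the
            # lab reproducible across machines (names/versions are placeholders)
            @{ Name = 'xComputerManagement'; RequiredVersion = '1.4.0.0' }
            @{ Name = 'xNetworking';         RequiredVersion = '2.7.0.0' }
        )
    }
}
```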

The output from Invoke-LabResourceDownload can be seen below; as we can see, it has downloaded only the DSC resources that we specified in the configuration data file (TestLabGuide.psd1).

Lability11
This also means that on a clean machine you can be sure you have the right required versions, which in my opinion is especially useful when building labs.
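
The call itself is a one-liner; this is roughly how I run it, though the switch name is from memory, so check Get-Help Invoke-LabResourceDownload for your version.

```powershell
# Pull down only the DSC resources declared in the configuration data file so
# that the MOF files can be compiled locally
Invoke-LabResourceDownload -ConfigurationData .\TestLabGuide.psd1 -DSCResources
```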

However if you have multiple Labs running concurrently then the next bit may be an unfortunate blow to you.

Within the Configuration keyword we have a dynamic keyword defined – Import-DSCResource – which you may have thought was a function.

Being a dynamic keyword, it works a little differently to a normal function or cmdlet, and therefore we are limited in what we can do with it – for example, we cannot use splatting with it, and we cannot pass the required DSC resource modules to it from outside the current file. This is what enables the syntax highlighting we get as part of the parser. If you want to learn more about the Import-DSCResource dynamic keyword then read this article by the PowerShell team – be aware it is from 2014 and there hasn't really been any better content on this since (that I can find, anyway).

My view is that we should be able to pass the required DSC resources through from the configuration data file, as we have already detailed; however, this isn't currently possible. To me it would be beneficial (and logical) to extract this away from the configuration, as it really is part of the configuration data, especially as we already have to pass configuration data to our outputted configuration keyword – in this case TestLabGuide. However, this is where we are, and for now we will need to mirror the DSC resources between the configuration itself and the configuration data file.

That aside, let's look at the node data, and especially the AllNodes section where NodeName = '*'.

Lability16

As we can see, here we have a few settings shared by all the nodes in this configuration, including Lability items; these cover the items we had available in the VMDefaults file as well as some other items we would want shared between the VMs, like DomainName.

Further down we can see that the client VMs in this lab each specify a different Lability_Media value, so it looks like we will have both a Windows 8.1 and a Windows 10 client machine in this lab.

Lability17
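
Pulling those two screenshots together, a minimal sketch of the node data might look like the below – the node names, switch name, domain name and media IDs are illustrative assumptions rather than the exact TestLabGuide values.

```powershell
AllNodes = @(
    @{
        # Settings shared by every node in the lab
        NodeName                    = '*'
        Lability_SwitchName         = 'Corpnet'           # assumed switch name
        Lability_ProcessorCount     = 1
        DomainName                  = 'corp.contoso.com'  # assumed domain
        PSDscAllowPlainTextPassword = $true
    }
    @{
        NodeName       = 'CLIENT1'
        Lability_Media = 'WIN81_x64_Enterprise_EN_Eval'   # placeholder media Id
    }
    @{
        NodeName       = 'CLIENT2'
        Lability_Media = 'WIN10_x64_Enterprise_EN_Eval'   # placeholder media Id
    }
)
```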

 

That's enough about the configuration and configuration data side of things – let's go and build our lab.

At this point, all we need to do is run the below.

Lability12

You will be prompted for an administrator password and, once that has been given, as we can see above it will create all the MOF files that we need for this lab. The next step is to kick off the actual build of the lab, which can be done as shown below.

Lability13
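
Put together, those two steps look roughly like this – the exact parameters may differ slightly from the screenshots, and Start-LabConfiguration will prompt for the lab administrator credential if it isn't supplied.

```powershell
# 1. Compile the MOF files - running the modified script prompts for the
#    administrator password that will be used inside the lab VMs
.\TestLabGuide.ps1

# 2. Kick off the lab build: download media and hotfixes, create the master and
#    per-VM differencing VHDX files, then inject certificates, resources and MOFs
Start-LabConfiguration -ConfigurationData .\TestLabGuide.psd1 -Verbose
```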

This function, Start-LabConfiguration, is the key function of the module, as it will:

  • check that the Lability host is correctly set up by calling Test-LabHostConfiguration – if it isn't, an error is thrown (a possible area for improvement)
  • download any ISOs required by the configuration data, provided the image matches one listed in the media file; the download is validated against the checksum value given in the media file for that image
  • download any hotfixes detailed in the Hotfix section of the matched media in the media.json file
  • build a master VHDX file from the ISO and hotfixes detailed for that media type – it's worth pointing out that this is built from lots of smaller functions that are essentially based on the Convert-WindowsImage script
  • build a lab-specific VHDX file for each VM – currently set up as a 127 GB dynamic differencing disk
  • build and inject a lab-VM-specific unattend.xml file into each lab VM VHDX
  • inject all required certificates into each lab VM VHDX
  • download and inject any resources defined in the Lability section of the NonNodeData section of the configuration data file (I will show more on this in the SQL example later on) – these are injected into the lab-specific VHDX file
  • inject all required DSC resources into the resulting lab-VM-specific VHDX file
  • inject the .mof and .meta.mof files for each lab VM into the corresponding VHDX file

Seriously though – wow – that one function is doing a lot of what I would call tedious work for us, and depending on your internet connection speed it can take anywhere from maybe 30 minutes to a day to complete. The first time I ran it, it took about 7 hours to complete for me, due to slow internet and the fact that I was also watching Netflix at the time.

You can see the final output from this Function below

Lability20

Note – if you have your own media you could always create new entries in Media.json for it to save the download time, especially if you have an MSDN licence.

Now this is where the fun bit really starts and it also involves more waiting but hopefully not as long as the last bit took you.

All we need to do at this point is run Start-Lab as shown below and let DSC do its thing. Note that I've used Get-VM and not Get-LabVM – this relates to a small issue that I have faced and have reported on the GitHub repo.

Lability21
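
For reference, the commands in the screenshot amount to something like this:

```powershell
# Boot every VM defined in the lab and let DSC converge each one
Start-Lab -ConfigurationData .\TestLabGuide.psd1

# Check on the VMs with the Hyper-V cmdlet (Get-LabVM had an issue at the time)
Get-VM
```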

And here is an image of all the VMs running and getting started.

Lability22

 

This part can take anywhere from 10 minutes to a few hours, depending on your VM rig setup, the amount of RAM allocated to each VM in your configuration data, whether machines have to wait for other machines to reach their desired configuration, and the complexity of the configurations being deployed.

Under the hood, Lability has injected the DSC configuration into the VM's VHDX and set up a bootstrap process which in turn calls Start-DscConfiguration, passing it the path of the MOF files. You can have a look at how this is set up inside a VM in the C:\Bootstrap\ folder if you are interested.
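
Conceptually, that bootstrap boils down to something like the single call below – a simplification of what Lability actually generates, and the MOF path here is an assumption.

```powershell
# Rough equivalent of what the injected bootstrap does inside each lab VM:
# apply the pre-staged MOF files that Lability copied into the VHDX
Start-DscConfiguration -Path 'C:\Bootstrap' -Wait -Verbose -Force
```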

Once that is done, you'll have your first fully deployed set of VMs using DSC and Lability – pretty amazing, isn't it!

 

SQL Server install – showing some of the other features of Lability

In this section I'll try to keep the content to a minimum, but still add in some additional useful screenshots.

My configuration data file for the SQL Server node is below; notice how we have the properties required to be able to install SQL Server – SourcePath, InstanceName, Features – plus the Lability_Resource entry.

Lability24
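
A minimal sketch of what that node might look like is below – the property names are simply whatever your SQL configuration expects as parameters, and the media and resource IDs are placeholders rather than a copy of my exact file.

```powershell
@{
    NodeName          = 'SQL1'
    Lability_Media    = '2012R2_x64_Standard_EN_Eval'  # placeholder media Id
    Lability_Resource = @('SQLServer2014Media')        # assumed resource Id

    # Values consumed by the SQL install configuration (names are illustrative)
    InstanceName      = 'MSSQLSERVER'
    Features          = 'SQLENGINE,SSMS'
    SourcePath        = 'C:\Resources\SQLServer2014'   # where the injected media lands
}
```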

As this was taken from a previous configuration, it is using the xSQLServer DSC resource – take a look at cSQLServer here, as this will likely be the version that gets ported to replace the xSQLServer and xSQLPS resources; it is relatively close to being usable in place of the two. Expect news on this after PSConfEU.

Also note that in the configuration document we are specifying an additional item in the NonNodeData section – Resource.

Lability25

This allows us to specify further resources that are stored in the E:\Lability\Resources\ folder (E: being the drive letter in my case, of course).
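
As an illustration, a resource entry in that section might look something like this – the Id, filename and URI are placeholders, and the exact key names may vary between Lability versions.

```powershell
NonNodeData = @{
    Lability = @{
        Resource = @(
            @{
                # Id referenced by the node's Lability_Resource entry
                Id       = 'SQLServer2014Media'
                # File Lability looks for in the Resources folder on the host
                Filename = 'SQLServer2014-x64-ENU.iso'
                # Optional download location if the file isn't already present
                Uri      = 'https://example.com/SQLServer2014-x64-ENU.iso'
            }
        )
    }
}
```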

I'll let you decide what you want to put in that folder, but any items you'll install from inside the VM could be candidates – things like SharePoint media, SQL media or other installable programs. You could always add your personal script library in a zip file and have Lability unzip it into the right directory. The choice is yours on this one, so be creative.

For this lab I didn't have the installation media already downloaded, so it had to be downloaded as part of the Start-LabConfiguration function – however, if you remember, there was the Invoke-LabResourceDownload function.

This has some additional parameters that allow you to download in advance any of the items required for the lab configuration to succeed. This can be useful, for example, if you happen to have a few hours on an internet connection much better than your own – especially if you are using this for personal testing rather than the professional lab testing it was originally designed for.
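
For example, to pre-stage everything while you have the faster connection, something along these lines should do it – the -All switch is how I understand the module exposes this, and SqlLab.psd1 is a placeholder filename, so check Get-Help Invoke-LabResourceDownload for your version.

```powershell
# Pre-download all media, hotfixes, custom resources and DSC resources declared
# in the configuration data, so Start-LabConfiguration can run offline later
Invoke-LabResourceDownload -ConfigurationData .\SqlLab.psd1 -All -Verbose
```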

One of the other great things about this module is that you can use it for your lab environments regardless of whether your shop is on WMF 5 or not. If you're still running WMF 4 (with the essential DSC updates), you can still build labs with it.

Wrap up

Well, I hope you've enjoyed reading this 3,800+ word post of mine, and that it helps you get to grips with building out labs in an easy and repeatable way whilst having the chance to play with DSC to do it.

Remember that this Module DOES A LOT behind the scenes – if it didn’t there wouldn’t be the need for this post – and there is more functionality being introduced as appropriate all the time.

Lability is built for building labs; however, you could easily use it for building production-like environments – if you dare, that is – and I can see the benefit of doing so. I mean, why re-invent the wheel when Lability will do a lot (most) of the work for you?

As with getting to grips with most new modules, always start with the help files. This module has a number of about_* help files, and almost all the functions (even the internal ones) have comment-based help.

This is a module where you need to RTFM to really understand all of its workings. Spend a few hours looking through it and understanding it as best you can; it will be well worth it in the long term, even after reading this post a decent number of times.

I do, however, take my hat off to Iain Brighton (@iainbrighton) for creating this module, and for me it is the only module to use when building lab environments – so let's gather some momentum as a community to suggest enhancements and improve it even more over on the GitHub repo.

The example files that I have used (especially the SQL one) will be made available in due course, once Iain has decided on a scalable way forward for sharing lab configurations. We have discussed a number of options (yet to be added to the issue), and if you have an idea please add it via the Lab Sharing issue on GitHub.

This is just the first in a series of posts that I intend to do on Lability. Future ones will be much shorter, but will focus in depth on the functions that I haven't covered in this post, along with some of the more interesting parts. However, I expect this will be a good starting point for you to get to grips with the Lability module and start building and running test labs.

As usual, please let me know your thoughts on this post, whether via Twitter or the comments section below – I hope you have enjoyed the read.

Awarded the MVP Award – What this means to me and the future for the community

The MVP Award is defined by Microsoft as below:

Microsoft Most Valuable Professionals, or MVPs, are community leaders who’ve demonstrated an exemplary commitment to helping others get the most out of their experience with Microsoft technologies. They share their exceptional passion, real-world knowledge, and technical expertise with the community and with Microsoft.

This means that within the different areas of the Microsoft Stack there are those out there that really believe that the world can be a better place when we come together as a united front and share the knowledge that we have.

This can be knowledge that we have gained through personal experience of working with the products that we find the most interesting and beneficial to our personal and professional lives, or through being there as a point of contact for other members of the community to reach out to.

One thing about the MVP Program that has always struck me as amazing is the willingness of MVPs to do what they can to help you, even if it doesn't immediately help them achieve anything, often giving away a decent-sized proportion of their own time to do so. Reflecting on receiving this award, over the last year I've been doing the same, although completely unaware that I had been doing so.

I have attended a number of different events in the last year (for more details check out the Where I Have Been page) and have met a tremendous number of amazing people at them all. It was the framework of the SharePoint and SQL user groups within the UK that led me to start thinking about reviving the PowerShell user groups, which I have blogged about in this post, and I have enjoyed every minute of it.

The future for the UK PowerShell user groups already looked good; however, having been awarded MVP last week, the connections I will make from being part of the UK MVPs will hopefully allow the user groups to grow in the coming months and years, so expect news of new user groups forming across the UK in the coming months.

To help the groups grow, I'll be putting together an "Organisers Pack" containing useful information and a collection of the tools, contacts and general tips required to help those interested in running a local group get it off the ground – however, if in doubt, get in contact with me.

 

However, there is another aspect to receiving the MVP Award that I want to touch on briefly. As part of the MVP Program, MVPs get the opportunity to help out at more community-focused events, some run by Microsoft, others run by the community, and others run by non-profit organisations or the education sector. Giving back to my immediate communities is always going to be high up on my list of priorities, but I am really looking forward to working with some of the bigger and more personally touching social opportunities over the next year.

 

This does mean that my calendar will be much busier but for me the end result is always going to be worth it.

Finally, a small shoutout to those that have supported me over the years, and especially the last year; although I will not name anyone in particular, I'm sure those people already know who they are!

2016 – 1 Quarter Down and 3 more to go and the Fun has only just begun!

Pulling the Community Together to Improve the Quality of PowerShell Modules

In a discussion that started on Twitter a while back with June Blender about the quality of the modules being posted to the PowerShell Gallery, I had an idea for a way we could help improve this as a community – using the tools that we have available to us and, more importantly, the expertise of the rest of the community to help shape and guide the direction of modules.

The idea starts with a simple GitHub organisation, in this case this one – PowerShellModules – in which we set up a number of teams. Some of the teams that I have in mind include, but are not limited to, documentation, unit testing, practice guidance and module maintainers.

This means there will be a much more open way of directing the development of modules for the community, while still allowing modules to be developed in a way that lets others add to them instead of branching off and creating a second module. It also means a stronger focus on modules being kept up to date, as the overhead isn't on a single maintainer but can be shared among multiple maintainers at any given time.

 

My thinking is that this would start with me porting my current 'in progress' modules to the organisation and building out the teams mentioned above, along with a suggestions/RFC-like repository that would allow us to drive this for the good of the community.

The end goal, from my perspective, would be to have one community-recommended module per technology (examples being GitHub, Trello, VSTS, Slack etc.) that has been widely developed by the community and covers as much of the functionality as it can without the need for a separate module.

We have a great community of people, and I think this is the right point in time to start the drive to improve the quality of what we all put out to the community, in a targeted and effort-efficient way.

If you're interested in getting involved, please comment on this post with your GitHub username and I'll get you added to the organisation in due course – but please keep an eye out for the email from GitHub requesting you to join the organisation, especially in your junk folder.

#PSConfAsia – What an amazing experience!

So this post is a fair bit overdue and that’s because I’ve been very very busy as of late.

However let’s get right into it!

#PSConfAsia was an important event for me, mainly because it was my first venture out of the UK in 20 years – yes, 20 – which meant that I needed to sort out my passport, something that had been on my to-do list for about 6 years, and it was my first real holiday EVER! It also happened to be a much-needed and well-timed venture abroad; those that know me well will understand why the timing of this was crucial to me, and for those that are still getting to know me, this may be something that I share with you in time.

I had planned to be in Singapore for a few days before the conference, and that ended up being 6 days in total, which for my first dose of international travel I think was more than enough – my previous "holidays" over the last 5-7 years have been a few days off at a time with almost no travel from the house, so this was going to be a really big change for me.

The actual travel there was something I was a little anxious about, not knowing whether I would enjoy flying or not. Luckily it seems that I'm not fazed by it at all, and this has since led to me looking at other events to attend or present at (more on this in a later blog post).

But #PSConfAsia was more than just a conference with a few days of holiday in front of it for me – it was a chance to meet up with an ex-colleague and co-organiser of the event, Matt Hitchcock, which was a key driver for me to attend the event, let alone put together a presentation for it.

My travel to Singapore started off after #SPSCambridge, as I was flying from Heathrow the next morning. On arrival at Heathrow, and after getting checked in, I had the fun of being the guy randomly picked for explosive swab checks – you can imagine what was going through my mind at that point, it being my first ever flight abroad.

The journey consisted of two 7-hour legs with a short layover in Abu Dhabi in between, luckily only a little under 3 hours, which allowed me to relax a little, have a wander round the airport and look through the shopping areas. The free WiFi at Abu Dhabi was really slow, so instead of catching up on social media I watched some videos that I had prepped for the journey.

The flights with Etihad were comfortable and had some useful amenities on board and in seat, including on-board WiFi (unfortunately not free), a plug socket at your seat, and an in-seat entertainment system that also allowed you to chat with someone else based on their seat number – this could be a very useful addition to all flights, and I would hope that the likes of EasyJet will install these on their flights soon.

Once I got to Singapore I had the conundrum of being without data or calls (I wasn't paying the silly roaming charges), but luckily I had researched this before I left: Singapore has a great selection of tourist mobile SIM offers that typically include up to 100 GB of 4G data with local calls and texts for up to a 10-day stay, for the equivalent of £15. That's just amazing value, and is indicative of how advanced Singapore is in terms of its infrastructure.

There was also another great thing in Singapore that anyone visiting should take advantage of: the tourist travel pass, which at roughly £15 gave you unlimited travel on buses and the Singapore MRT (like the London Underground/Overground systems) and could be replaced after 3 days with another one. I made good use of it to see as much of Singapore as I could whilst I was there, and I did manage to see a fair amount, but still have other places to see too (for the next visit).

On the Monday I decided that I would head to the hotel and drop off my luggage, have a nap and then make my way into the centre to see where the Microsoft buildings were and have a general touristy nose around. I also decided that day that I needed some comfort food – what better than Singapore McDonald's, who will deliver as well, which is a little mad considering that they won't in the UK.

Most of the fun started to slowly build from the Tuesday evening, when I got to meet Matt and Milton (another of the organisers) for some food and drinks in the city area. It was a great experience, and made me realise how expensive any kind of alcoholic drink is in Singapore, with prices ranging anywhere up to £13 a pint. Ouch!

102015_0006_PSConfAsiaW1

Wednesday started early, as another one of the organisers, Ben, had arrived late the previous evening and was in Singapore for work on both the Wednesday and Thursday. He mentioned that he was heading into the office early that morning, so I tagged along with him and he showed me what a typical breakfast was like in a local restaurant in the centre. I have to say the kaya toast was really tasty, and I'll look forward to it when I next get the chance to be over there. Whilst at breakfast I also got the chance to try the local coffee, and again was not disappointed at all. I then spent most of Wednesday just being a tourist and seeing some more areas of Singapore (whilst riding the MRT and downloading Windows 10 as I did so – gotta make use of my 4G data somehow), and then made my way back to the hotel mid-afternoon to meet up with the organisers and some of the other speakers that had arrived – these included Ferdinand Rios from SAPIEN, Narayanan Lakshmanan from the PowerShell product team, Benjamin Hodge from Kemp (also an organiser), Milton Goh (organiser), Matt Hitchcock (organiser) and Jaap Brasser – and below is a picture of us all on that night.

102015_0006_PSConfAsiaW2

As you can see, we had a great time!

This was just the Wednesday night!

On the Thursday I met up with Gokan Ozcifci, a SharePoint MVP who I'd briefly met at SPSLondon and who was also speaking at PSConf, and went and did some further sightseeing of Singapore. This was good, as we managed to get around the gardens at the Marina Bay Sands, which were great fun to go round and see everything. We also went to the SkyPark at Marina Bay, where you can see some amazing views, including this cracker below.

101915_2356_PSConfAsiaW3

We then had a small gathering that evening with some of the Attendees and a few of the other speakers too.

101915_2046_PSConfAsiaW3

And then came the Friday (and the beginning of the conference), and what an amazing day that was. I decided that I would go and meet Ravikanth Chaganti and Deepak Dhami at the airport, as they were flying in early that morning. I must say I am glad I did, as it was a lot of fun meeting both of them and having some time to engage with them on a more casual basis than would have been possible at the conference. You can see below the fun that we had – I even set up my laptop to be an interactive display board, similar to the ones that the taxi/chauffeur drivers have.

101915_2356_PSConfAsiaW5

Deepak didn't know that I was coming to meet him, though, so he walked right past without even realising. Luckily Ravi had arrived a little earlier, so between us we managed to get his attention, and we then headed to the hotel to let Ravi and Deepak check in and get ready before we made our way to the venue.

We arrived a little way into the keynote by Jeffrey Snover, but I was still able to get a quick chance to give him a personal thanks (even if the session was via a Skype call), as can be seen below (thanks, Jaap, for the picture!). Without a doubt, without the invention of PowerShell I would not be finding a career in IT as fun as I currently do, and will likely continue to for many more years to come!

101915_2356_PSConfAsiaW6

Throughout the day I had the chance to chat to most of the presenters and some of the attendees, and sat in on a few sessions where I could. After the day wrapped up there was an organised speakers' dinner, which was just fantastic and set in a great location with a great view over Singapore. However, I had to leave the dinner early, having not slept very well in the few days leading up to the conference.

As per usual for me when I've not slept very well (it's been like this for well over 12 years!), I ended up having a ridiculously long sleep – in this case almost 14 hours – and it's typical that it happened on a Saturday as well! This meant that I missed all of the morning's sessions and arrived at the venue just in time to give my session, during which I unfortunately had a bad case of the demo gods' wrath! I've yet to revisit and restructure my session, but this is on my to-do list for November! After my session there were a few other great sessions whose topics I need to get back into in more depth – including Pester testing; my annoying phrase from PSConfAsia for the following week was definitely 'Let me Pester you about that later', or some other Pester-related variant.

We then had a great after-event with food, drinks and prize-giving at the truly most English pub possible outside of England, and it was a great event. The picture below has to be my favourite from the after-event (other than the picture-in-picture-in-picture-in-picture that we took), with Jason Brown from Domain.com.au (attendee / guest DevOps panellist) and Sebastian (the attendee that won the F1 tickets).

101915_2356_PSConfAsiaW7

After that night, a few of us met up for a final lunch at a great little Indian restaurant in the Little India district of Singapore, where Ravi made sure that we had some great food whilst we had the chance to.

101915_2356_PSConfAsiaW8

After we left, we went and had a relaxing few drinks in the Singapore sun before making our way to the airport, as it was time to get ready to depart. I was lucky enough to get even more time with Ravi to discuss a certain upcoming DSC project of mine, and it was great to get some more insights from one of the best known in the field. I was also lucky enough to get a selfie with him (and I'm not a photo-taking person) at the airport before we had to depart to board our separate flights.

101915_2356_PSConfAsiaW9

All in all, the 6 days I spent in Singapore were an amazing adventure, and one that I am hoping to be able to partake in again next year – hopefully it will be as fun-filled as it was this year. I have been lucky to have made some amazing friends from it, and I cannot wait to catch up with them all again in the near future.