Server Management Tools in the cloud

There are a number of things about my job that keep me awake at night, or wake me up early, and none of them are bad. It’s all the excitement of what is coming our way in the management of our servers.

One of those excitement points, as I call them, came in October last year, when Jeffrey Snover demonstrated the ability to manage the awesome Nano Server technology (another excitement point) from a cloud-based console. No date was given, and there was no hint of the true goodness that was about to be delivered.

Well, on February 9th the Server Management Tools (SMT) went into public preview as an Azure service.


This new service allows you to use the Microsoft Azure Portal to create multiple instances of Server Management Tools, one instance for each server you would like to manage.

You need an SMT gateway, which you can install on one of your Azure VMs or in your own datacentre.

Once that is done you are free to use this portal to carry out a wide range of management functions.

The service really comes into its own when you have a number of hosts running the new headless Nano Server deployment. There is no way to manage these other than by remote PowerShell; you cannot RDP into the console (there isn’t one). You can connect directly to make a couple of firewall changes and check out the networking, but that is all.

The following screenshots, taken from a Nano Server VM (which is also a container host), show the limitations of the direct management features.


Nano Server is featured widely on Channel 9 and Microsoft Virtual Academy, so why not brush up on your Nano Server skills before reading on?

The process of setting up SMT is not a difficult one, but you do need an Azure subscription. To make it super-easy, here is a link to a 30-day free trial with enough credit to get you going (£125).

Rather than take you on a complete walkthrough, which is already available here, I have produced a @Ch9 video which lasts around 30 minutes and takes you through the whole process of setting up the VMs, the SMT service, the gateway and the management options.

You can find that here. This is the first entry in the new @TechNetUK How-to series on Channel 9, please feel free to send in requests for How-to guides to

Having deployed the full set of SMT tools to your Nano server, it is then possible to run a remote PowerShell session directly on the server. In addition, the PowerShell console has the full command window experience and a Script Editor to boot.
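Outside the portal you can open the same kind of remote session from any management machine; here is a minimal sketch, assuming WinRM is reachable (the computer name and credential are placeholders for your own environment):

```powershell
# Hypothetical example: open a remote PowerShell session to a Nano server.
# 'nano01' is a placeholder name; supply the local administrator credential.
$cred = Get-Credential
Enter-PSSession -ComputerName nano01 -Credential $cred

# ...run your management commands in the session, then leave it:
Exit-PSSession
```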


And, there’s more

You can view, edit and add to the registry of the machine


You can also carry out all of the management functions below.


A whole array of powerful functions that are just not available to you on the Nano interface (as it is).

It is particularly useful to be able to join a domain, rename the machine and check on roles and services available or installed.

The one gotcha you will see in the @Ch9 video at 22 minutes 40 seconds is the failure to connect to the Nano server from SMT.

Regular PowerShell users will know instantly that this is because the machine you want to manage is neither in the same domain as the management server nor in its TrustedHosts list.

A simple one-liner, run in an elevated PowerShell session on the management server, will fix this.



Set-Item WSMan:\localhost\Client\TrustedHosts -Value '*' -Force

For a production system, please replace the * with the IP address of the server you want to manage. For a lab system, adding the * wildcard to open up all options is acceptable.
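A sketch of that production variant, with the IP addresses as placeholders for your own servers (note that -Concatenate appends to the list rather than overwriting it):

```powershell
# Hypothetical example: trust only the specific servers you manage.
Set-Item WSMan:\localhost\Client\TrustedHosts -Value '10.0.0.10' -Force

# Append a second server and check the resulting list:
Set-Item WSMan:\localhost\Client\TrustedHosts -Value '10.0.0.11' -Concatenate -Force
Get-Item WSMan:\localhost\Client\TrustedHosts
```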


Once done, this error will disappear.

The flexibility of this service to manage your cloud or on-premises servers is what I like most. I can now be anywhere, and as long as I have web access I can use the Azure Portal to manage my entire Nano Server (and full Server 2016 TP4) infrastructure (albeit one server at a time).

I hope you enjoy the video and do keep checking this blog and the @Ch9 page for new content.

Just how mobile is the Windows 10 experience?


I have been cooped up at home for the last couple of weeks, unable to drive or travel very far after shoulder surgery. This lack of mobility got me thinking about my normal working life, just how mobile I am, and how Windows 10 lives up to one of its main aims: to make the Windows experience more mobile.

I should add that even though I will be talking about Windows 10 on a phone in this post, I am not using the word ‘mobile’ to mean smartphone. It is the mobility of the experience that is key here, not working whilst moving; there are key differences.


The question I will try to answer is:

Is there anything I can do at the office on a Windows 10 desktop device that I cannot do with Windows 10 because of my location or the device on which I am trying to work?

I deliberately exclude internet availability from this question; internet access is assumed.

So I thought the best way for me to answer this question is by looking at all the activities I engage in, at work, as an IT Pro Technical Evangelist.

So let’s start with where I work and what I use to do that work.

The Office (Work Premises)


I have added the parentheses as I am a home worker and I use my own machine at home, as you will see later on.

So Microsoft have provided me with a device to use. I currently have, as my main domain-joined device, a Surface Pro 3 (i7, 8GB RAM, 256GB SSD). Yes, I count myself very lucky, as it is a super piece of kit. I do covet a Surface Pro 4 or Surface Book, and maybe my nice business manager Phil will be kind to me later on this year. If not, I will get myself a Surface Pro 4 keyboard, as they seem to make the whole experience so much better. Still, I digress.

When I arrive at work in Building 2, floor 2, section H (the hot desk section indicated by the discreet red arrow above), I plug this machine into one of the lovely Dell monitors (U2515) that the aforementioned Phil bought for us, which gives me some real space to work with.

From here I am able to print through the location-aware printing we use (controlled with smartcard access at the printer), and I can access all the internal file shares, which seem to be disappearing in favour of cloud storage such as OneDrive and OneDrive for Business.

I have wired and Wi-Fi access to the corporate network and all resources to which I am authenticated and authorised.

I have access to Cortana and all of my Store apps: I am signed into the Store with my Microsoft account for personal apps, and I add an organisational account for the Business Store apps.

So, that is my aim: the same level of connectivity, functionality and productivity that I have from that desk.

The Home Office


At home, due to age and infirmity, I use an electrically adjustable standing desk, and I have a Dell 1650 workstation (i7, 16GB RAM, 1TB SSD and other local storage from various NAS devices and local disks) with a four-monitor setup. This may sound like overkill, but when I bought it and set it up I was a full-time trainer and a lot of my work was done from home via online remote training. For this I needed to make sure that I had enough screen real estate and grunt to manage many VMs and applications. Suffice it to say I do not need all this now.

I have an Enterprise version of Windows 10 running the February 1511 update. I sign in with my Microsoft account and I have used Work Access to enrol my PC into device management. Our IT department (MSIT – this is worth looking at we have a huge network) use Microsoft Intune  (part of the Enterprise Mobility Suite) for this.

This is a prerequisite for being allowed onto the MSIT Virtual Private Network (VPN). Since my machine is my own and is not domain-joined, that is the only way I can access all the resources and applications I want in the style I want. Having done this, I also use my Microsoft employee badge, which is a smart card, as a source for Multi-Factor Authentication (MFA). It gets better: each time I connect to the VPN, I now receive an MFA request on my Lumia 950 smartphone, which is either a phone call or an app authentication request.


The reason for this is that Microsoft have made a decision to ensure all external access to the corporate network relies on phone-based MFA.

If I want access to certain internal websites or resources without connecting to the VPN, I can use my smart card or a fairly new feature, a virtual smart card. The Trusted Platform Module on my PC stores a certificate which is enabled by my physical smart card. I still have to enter a complex PIN, but I don’t have to plug in my card every time.

So I can still print to my work printers as well as my home printers, and I have full access to all the work resources and applications I have when in the office. It is my machine, but I have policies enforced on it from Microsoft Intune, like a requirement for a work PIN to be created (part of the Microsoft Passport system). The Intune admin could also prevent me from copying any work-related data into applications that are not controlled, a feature called conditional access.

I ought to mention that Microsoft Intune and the Enterprise Mobility Suite are additional options with additional licensing costs.

If I don’t want to spend any extra money I still have the option to use a VPN without the additional device management provided by Intune.

I could use my Surface Pro 3 at home, as I have a dock and can use two monitors with that as well. Finally, I could use Windows To Go. If you are an enterprise user with a copy of Windows 10 Enterprise, you can use a wizard to create a USB-stick-based version to carry around with you. This can then be plugged into any device that will run Windows 10 and can boot from USB. The device owner is reassured that you cannot mess around with their data or hardware, you are safe in the knowledge that this BitLocker-protected device is safe and secure, and you can remain fully functional on the road, at home or anywhere else there is a device waiting for you.

At Customer Sites or events

Now, I can dream that I will get to work in such idyllic locations, but it really doesn’t matter where I am or what I want to do. Armed with the Surface Pro 3 or the Windows To Go stick, I am pretty much bulletproof as far as devices go.

If, however, I am caught short and only have my phone with me, I have scored there as well. I have a Microsoft Lumia 950 as my issued work smartphone. You might have gathered by now that I am a bit of a gadget geek, and this device is pure magic for me. I was lucky enough to be given a prototype and a trial version, so I have used this for the best part of a year now.

As a phone it does what it should; as a smartphone it equally holds up to most of the competition. It is fast, the camera is excellent, and it runs Windows 10, so all my Universal Windows Platform (UWP) apps work seamlessly as well. It has more RAM and a faster processor (with six cores) than any PC I had owned up until at least 2008 or so.

The big news now, though, is that this phone will turn itself into a PC. Just add a Bluetooth keyboard and mouse and a large screen with Miracast, or add the small dock, and you can use a wired keyboard and mouse and up to two additional screens.

I now use this as my presentation device at almost all events and meetings. It is also under Intune management and has access to the VPN; indeed, it has app-activated VPN as well, so any application that requires VPN access automatically connects.

This phenomenon is known as Continuum for phone, and it is pretty much game-changing.

The only downside is that until developers have written more UWP apps, the number of apps that will run on the large screen is small. I am glad to say that all my productivity apps, as well as Microsoft Edge, do run full screen.

So from playing films (sorry, movies) and videos whilst away from home to presenting awesome slides and demonstrations, it can do it all. All the other brands are now playing catch-up in turning the phone into a single device for everything!

While travelling

I added this as the only really mobile part, as in moving while using technology. I often work when I am travelling on a train or plane. All the above applies, although I don’t often use the phone for anything other than sharing my 4G EE connection with my Surface Pro 3 to get some reasonable connectivity.


I appreciate that I haven’t covered many of the Windows ecosystem devices in this post, simply because I don’t use them day to day.

I haven’t seen a HoloLens up close, and I don’t have access to a Surface Hub at home or in my working day very often. I do have an Xbox One but rarely use it for work, other than watching great Channel 9 and Microsoft Virtual Academy content. Indeed, there is a load of very good content up on the Enterprise Mobility Suite, so why not take a look at some lab walkthrough videos I recorded showcasing its abilities?


I have all I need to work anywhere on any device (even non-Windows devices, using Office 365 and the Enterprise Mobility Suite).

Windows 10 fits the bill for everything I need to do.

Now to find that nice man Phil and get hold of the Surface Book I truly deserve.

The Life of an IT Pro: Past, present and future


In my day to day role I am constantly reminded of the fast pace of our new mobile-first, cloud-first world. You only have to look at the Microsoft Azure cloud services or the Office 365 products and services to see just how rapidly things change.

This constant refreshing of our IT Pro world started me thinking about the old times, the old products and what life was like way back when. I also started to get a little uneasy about what is going to happen in the future to the traditional IT Pro.

So why don’t I take you on a journey down memory lane, through to the modern day and what might be around the corner for us all in the future.


The Past

My first IT Pro role was back in 1994, when I was serving as a Police Officer in West Midlands Police. There were the usual political wrangles of procuring kit, software and resources to provide a new network, isolated from the main corporate network, on which we would carry out a project to provide Crime Mapping, Human Resources planning and so on.

My role was that of server hugger, sorry, Server Administrator. We faced several big issues with hardware and software but eventually went with the following:

A server with a Pentium processor, not much RAM and a tiny hard disk, running Microsoft Windows NT 3.1, and five client PCs running Windows for Workgroups 3.11 over a thinnet coax network.

The majority of you reading this will not have seen ANY of these things let alone used them. The interesting thing is that despite the tiny RAM and tiny hard disk it was plenty for what we needed at that time.

The operating system came on 3.5-inch HD floppy disks, or as one young delegate at a recent event said to me, “ah yes, isn’t that the save icon?!”


To install the OS you had to hand-feed the machine a huge number of these floppy disks in order, and if one was wrong or corrupted, or an error occurred, you had to start again.

Eventually we had the system up and running and we managed to use the MapInfo product (now part of Pitney Bowes) and its Map Basic development language to produce a very interesting crime mapping product, the first of its kind in the UK and the forerunner to a lot of today’s more intelligent policing strategies based on data analysis.

I need to add here that I had nothing to do with the development other than some rather neat Excel Macros to feed in the data. That was all down to a PC colleague John Kelly and the boss Supt Gil Oxley. (Gil went on to run IT in West Midlands Police and to form his own company providing HR resourcing software for rostering shift patterns. I don’t know what John is up to now).

What problems arose then that we still see today?



DevOps

– The term DevOps didn’t exist, but the common-sense processes between Developers and Operations staff did. How often were they used? Well, the team was three strong, Devs and Ops worked together, and there was only one application, so we didn’t have much of a problem. There was no cloud, no real internet and many fewer threats too. The one challenge I faced most often was nosey police officers taking off one of the Ethernet coax terminators and sending the whole network down.



Networking

– There was no TCP/IP stack for Windows, so it ran on NetBEUI, and because that didn’t follow the OSI seven-layer model it wasn’t allowed on our corporate network, so we had to build our own. It was slow and clunky; see above for people taking it down at will.



Installations

– Installations took forever. CD-ROMs had just come in, but our software was distributed on floppy disks. Hard disks were slow and backup took forever.

Remote Access / Enterprise Mobility


– This was only possible by modem, which made it very, very slow, and it was only available for administrators on the server; there was no Terminal Services or RDS back then. There was no concept of mobile computing, especially not in a Police environment.



Updates

– These were released as service packs once in a blue moon and were essentially a whole load more floppy disks. If there were bugs, and you had an account and a support contract, you had access to download hotfixes by modem from the FTP site (slowly).



Users

– This was one of the hardest challenges, and to some degree it is still the same today. The difference here is that we were training from scratch with Windows; almost no Police systems were running on GUI operating systems. The average user was initially actively against the extra work needed to use the systems, and didn’t like having to use the HR system for rostering either. The majority did not have PCs at home and almost none of them had any form of mobile communications, although some had new-fangled message pagers.


The Present

Even though I am a Technical Evangelist and I don’t have a regular IT Pro day job, I do help out family members and friends with their business IT. Since this is current and real, I will obfuscate identifiable parts of this section.

The scenario is a small business with around 20 users. The users and management are not hugely IT friendly or literate (by today’s standards), but they all have smartphones and laptops / PCs at home.

The network in use has Cat 6 cabling and Gigabit managed switches for each segment. There are three servers: one running Windows Server 2012 R2 Essentials, one the old Windows Server 2008 Small Business Server, and the other a Server 2008 R2 RDS session host for remote desktop access. The new Essentials server uses Azure Backup and is integrated into the Office 365 Business Premium subscription. The network uses IPv4 and IPv6, and the phone system has just been replaced with a fully Ethernet VoIP system. The card payment system has also been integrated and no longer uses the telephone network, but is connected through the cable internet system running at 100Mbps.

There are 15 Windows PCs, 3 different WiFi networks and all users have full remote access both to the RDS Server and through Essentials to their desktop PC should they need it.

There is LOTS of RAM, large fast processors and masses of RAID 0 and RAID 5 hard disk arrays.

There is also an LTO-4 tape drive for local backup, which is removed from site daily.

The administration of this system is all completed out of work hours and mostly remotely.



DevOps

– All applications in this network are commercial, either off-the-shelf or LOB applications, and the providers require some form of remote access to perform updates. The OS is Windows and is managed by online updates. There is no concept of DevOps in this business, as there are no in-house or remote developers.



Networking

– A fast, Gigabit Ethernet, fully-switched network with a small business router connecting to the outside world through cable internet and VPN.



Installations

– Installations are often carried out through remote access, take a few minutes, and the bits are always downloaded through a seriously fast internet connection. Media (if used) is either USB or DVD, and the backups are rapid whether to tape or Azure.

Remote Access


– The business cannot survive in its current model without remote access. Several people work from home, the bookkeeper is often remote only. All users regularly connect to access data which has to be stored on site for regulatory reasons. Office 365 is the email solution.



Updates

– These are automated, online and fast. They are no longer managed by WSUS and will soon be using the Windows 10 Current Branch model for client updates. Servers are updated by Windows Update on a regular basis. No more FTP or disk updates; third-party suppliers often connect to update their applications directly.



Users

– From resenting PCs, users now demand faster, better PCs and applications. Downtime is not accepted, and the users have a hundred ways to communicate the smallest upset to service availability. All users connect to email remotely, either by PC or smartphone. OneDrive keeps them connected to work documents, as does Office 365.


The Future

This is the bit I get to fantasise about, such as how easy being an admin will be in the near future.

Obviously hardware will never be the limiting factor, and applications are continuously integrated and deployed so they are always up-to-date and functioning.

The local datacentre is running on Windows Server 2016 with Azure stack deployed. Developers are using a mixture of Visual Studio with Team Services to manage deployment.

So what will I do as an IT Pro 10 or 15 years from now?



DevOps

– This is the area I expect to change most in business IT systems in the future. I would imagine all IT Pros will need to understand the mind and processes of a Developer, and ensure that software-defined infrastructure, networking and storage are all configured to provide what they need in Dev, Test and Production at all times.



Networking

– Obviously all networking will be at 50GbE from server to client, and faster for the core services. The big change for the IT Pro (other than the cable chap and the rack installation guy) will be that everything is software-defined. There will be one console to ensure that the whole stack, from storage through networking to compute, is scalable, flexible and self-service within set parameters.



Storage

– Windows Server 2016 advances mean that all storage is directly attached and running in Storage Spaces Direct, with failover clusters hosting the necessary Nano-based virtual machines. All high-value data is stored on shielded VMs protected by Active Directory-based attestation, backed by HSM and TPM.
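For a flavour of what this looks like, here is a minimal Storage Spaces Direct lab sketch using the Server 2016 cmdlets (the node and volume names are placeholders, and exact cmdlet names have shifted between preview builds):

```powershell
# Hypothetical sketch: cluster four nodes and pool their local disks
# with Storage Spaces Direct.
New-Cluster -Name S2D-Cluster -Node Node1,Node2,Node3,Node4 -NoStorage
Enable-ClusterStorageSpacesDirect

# Carve a resilient volume out of the pooled disks:
New-Volume -FriendlyName 'VMStore' -FileSystem CSVFS_ReFS -Size 1TB `
  -StoragePoolFriendlyName 'S2D*'
```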

Remote Access


– Enterprise Mobility is the new normal. All users have access to thousands of SaaS apps, with SSO provided by Azure AD integration. Microsoft Intune and Azure AD Join enable cradle-to-grave management of Windows and other devices.



Updates

– These will be delivered in secure packages through online updates, perhaps staged in cached areas of your own network, and not just for Windows but for all technology that requires updates to software and firmware.



Users

– If the More Personal Computing vision of Satya Nadella has been successful, the methods of interaction with your PC will not stop at holograms. The guys and girls at MS Research will be seeking out new ways of interacting with and controlling our technology.

You know what, that sounds a lot like NOW, just faster, and everyone will have adopted it!

In all seriousness, the IT Pro needs to evolve with each generation of Developer, User and Operating System, to ensure that his or her skills are kept up to date and relevant.

Don’t get left behind – make sure you know what’s coming by checking out the Microsoft Virtual Academy and Channel 9.

After all, we live in a world where PowerShell now works with Linux and network speeds to end users have moved from 5Mbps to 10Gbps.

In 1994, when I built my first network, neither PowerShell nor Linux existed (Ed: Linux did, but Linux 1.0 had not actually been released at the time of the network install).

So what will happen in the future? One thing is for sure: it isn’t going to be the fastest or the fittest that survive. Rather, it will be the one who is most willing to adapt to change.

If you are an IT Pro in today’s world just ask yourself – am I skilled and qualified enough to move into the future? Why not start by coming along to one of our #InnovateIT events and see what’s changing in Enterprise Mobility, Cloud Infrastructure and Datacentre Infrastructure, not forgetting of course the Operations Management Suite. You can register for an upcoming session near you here.

#InnovateIT- UK IT Camps, new content, new events

The @TechNetUK #InnovateIT events are changing – Find out how and why.


One of the most noticeable differences between the ‘old world’ and the new mobile-first, cloud-first world of Microsoft is that the content rarely stays the same for more than a few months.

This rapidly changing theme has an impact everywhere: the products, the training courses, exam content, and now our very own @TechNetUK #InnovateIT IT Camps.

For the last six months we have been running a series of events with hands-on labs and audience participation covering What’s new in Windows 10 Enterprise, What’s new in Windows Server 2016 and What’s new in Cloud Infrastructure based on Azure Resource Manager templates.

Well, out with the old and in with the new: we have three new areas of content with right-up-to-date material and labs to use. Please note you still need to bring your own device to use the labs.

This device only needs internet access and a browser. A keyboard is highly recommended, as the iPad/tablet-without-keyboard experience is hard to use.

A screen size in excess of 8” is also recommended, especially if your eyes are as old and tired as mine are.

The new areas we are covering between March and July are:


Empowering IT to protect and enable productivity

This one-day event covers the following topics:

1. Identity and Access Management

2. Increase productivity and reduce helpdesk costs with self-service and SSO experiences

3. Manage and control access to corporate resources

4. Mobile Device and Application Management

5. Information Protection


Increasing efficiency and scalability with Windows Server 2016 and Microsoft Azure

This one-day event covers the following topics:

1. Datacenter Infrastructure

2. Azure Infrastructure


This one-day event covers the following topics:

1. Introducing the Microsoft Azure Stack

2. Visibility, Insights and Security Analytics with the Operations Management Suite

4. Cloud-Enabled Protection with the Operations Management Suite

5. Hybrid Automation with the Operations Management Suite

The day takes the form of a mixture of lab exercises, demonstrations, presentations and useful group chats about areas of technology that will be key to most, if not all, businesses over the next few months and years.

This obviously keeps us in the @TechNetUK office on our toes technically, which is good. It also means we can come around the country and pass on the latest and greatest advances in the Microsoft world to you all.

Register HERE to attend. The site is being updated to reflect these changes over the next few days, so remember to come back if they are not yet available.

Come along, meet your friendly neighbourhood Technical Evangelists and Microsoft MVPs, and get learning the future of IT.

You can also keep up with the topics at the Channel 9 portal and the Microsoft Virtual Academy.

Finally, you can also practise the skills we demonstrate in the #InnovateIT events by using the TechNet Virtual Labs.

Microsoft Azure Stack: What is it?

Over recent weeks, whilst running our #InnovateIT IT Camp events around the country, I have been asked one question more than most: Microsoft Azure Stack, what is it?

Well, here goes for an Azure Stack 101 explanation.

For several years now, Microsoft has been growing its public cloud offering, Microsoft Azure. The services are hosted in datacenters around the world and provide access to scalable compute, storage and networking for customers to deploy their virtual machines, websites and many other business services.

On February 3rd this year the Microsoft Azure Stack Technical Preview (TP1) was made available for download.

Microsoft Azure Stack takes the power of Azure and makes it available in your own Datacenter.

I recommend taking a look at the many resources available on MVA and Channel9 to get up to speed.


MVA Courses



Channel 9 resources

The rest of this post will be a short introduction to what is available in the recently released Technical Preview of the Microsoft Azure Stack and what you will need to try it out.

Be warned: the single-node Proof of Concept preview has some fairly hefty hardware requirements (it is meant to run in a datacenter, after all).


I am unable to show an install, configure and run post as I don’t have the hardware available to do this.

The hardware above is in itself an issue that a lot of small businesses and individuals will face when trying to skill up on Azure Stack.

So what does TP1 look like?

Well, once you have installed the Azure Stack bits on your 12 cores with 96GB RAM and five hard disks, you have a number of services available. As far as compute is concerned, we are limited to Windows Server 2012 R2 for now; the number and type of services are being added bit by bit.


Watch Jeffrey Snover demonstrate this here.

In addition to compute you also get the same Storage solutions as Azure provides, Blob storage, table storage etc. with the same features and properties.

The benefit of using Azure Stack is that you can now deploy your services anywhere: Microsoft Azure, Azure Stack in your datacentre, Azure Stack in your hosting partner’s datacenter, or spread across all three.

Azure Stack allows the IT Pro to provide a plan to developers that limits compute, storage and other resources to prevent sprawl in your datacenter. This is similar to the current Windows Azure Pack offering. The difference is that Azure Pack relies on an underlying System Center deployment to provide tenancies, whereas Azure Stack uses the changes in Windows Server 2016 and the Azure Stack code to manage the virtualization and networking directly, without the added overhead of System Center.

Azure Stack also uses Azure Resource Manager (ARM) templates. These allow your users to deploy resources within their own Azure Stack subscriptions, which you provide to them as part of the plans you create.

There are a whole bunch of ARM templates available on GitHub which you can deploy as-is or amend to fit your own infrastructure. You can deploy these to Azure or Azure Stack, because the same code is running on both platforms.
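As a rough sketch of what such a deployment looks like from PowerShell with the AzureRM cmdlets of the day (the resource group name and template URL are illustrative placeholders, and on an Azure Stack POC you would first register the Stack endpoints as an AzureRM environment):

```powershell
# Hypothetical example: deploy a quickstart ARM template.
# Add -EnvironmentName <YourStackEnvironment> to target an Azure Stack POC.
Login-AzureRmAccount

# Create a resource group, then deploy the template into it.
New-AzureRmResourceGroup -Name 'DemoRG' -Location 'local'
New-AzureRmResourceGroupDeployment -ResourceGroupName 'DemoRG' `
  -TemplateUri 'https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/101-vm-simple-windows/azuredeploy.json'
```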

Azure Stack will allow you to make your infrastructure available to the development community within your organisation without losing control of it. The ability to allow continuous deployment and integration using Visual Studio, PowerShell or GitHub makes this a perfect DevOps tool.

The Azure Stack will evolve and develop as the months go by and I will post more when the next TP is available.

Time to get the Preview, or at least watch Mark Russinovich and Jeffrey Snover launch the Preview here.

What should I study? Which Exam should I take?

In the TechNet UK office we have an open email address for people to ask us all manner of questions, and even sometimes to complain about us or praise us… but mostly we like it when people tell us what they want from us; that way we know our audience is getting what they want. In this case, it was information about Azure exams.

The email address is, please use it. Tell us what you want from us!

This post is in answer to a very interesting question, and one I come across all the time in my role as both a Technical Evangelist and a Microsoft Certified Trainer (MCT).

The two questions I want to address today are:

What should I study?

Which exam should I take?

This is in response to one of our IT Pros out there who asked:

Hi, my name is xxxx and i want ‘CCNA’ OR ‘Microsoft Azure Specialist certifications’ which 1 is best for me please tel me and tell me online certification i want both of one, which is best online university?

There are a number of variables in this question which prevent me giving direct advice to the keen learner.

Everyone I talk to is in a different career stage and on a different career path.

So I will assume that the question is being posed from the perspective of a fairly new entrant to the industry and that the questioner is keen on Infrastructure, networking and cloud services.

So to the certifications.

CCNA, or Cisco Certified Network Associate, is not an easy ask and requires access to either a test lab of Cisco networking kit such as routers and switches, or to simulation software (both is better). The certification is all about configuring switches and routers, as well as networking fundamentals around TCP/IP v4 and v6 and the TCP multi-layered model of networking, right down to what makes up a TCP packet.

Cisco have a network academy and many universities offer this course, including the Open University.

The certification can be obtained in two ways: passing a single big exam, or taking two smaller separate exams. Neither contains easier questions, but people often run out of time in the larger exam. You do get the CCENT certification with the first of the two smaller exams.

The main difference between Microsoft tests and Cisco ones is that once a question has been answered in a Cisco exam you cannot go back.

This certification is widely valued and is very useful to get a job in a company supporting a physical network.

For study and training you would need to go to the Cisco website.

The second option asked about was an Azure specialist certification. There are three of these, and if you pass all three then you gain the MCSD, or Microsoft Certified Solutions Developer. (This is the shortest route to that certification, which is the developer equivalent of the MCSE.)


The three Azure specialist certifications are shown above. NOTE: the MCSD, like the MCSE, requires re-certification every three years.

Which one of these you take very much depends upon your skill set and focus at work.

70-532 is the Developer exam

70-533 is the Infrastructure exam and

70-534 is the Architect exam, which actually contains bits of each.

From this page you can access the exam objective domains and details of training courses.

If you want less formal training, then you can try the Microsoft Virtual Academy or Channel 9, both of which have a mass of content on the topic. The links above will take you to Azure-specific topics on both platforms. I thoroughly recommend them to you.

Finally, Microsoft run a learning community where you can join study groups, hear others' stories and generally feel part of the IT Pro and Dev communities studying for the Microsoft certifications.

I moderate a couple of forums on here including the Office 365 study groups.

For any Microsoft Azure exam you will need access to a subscription. These are available in lots of ways.

You can sign up for a free trial, or use your MSDN subscription Azure credits, but make sure when signing up for a subscription that it is one that includes the credits.

Or you can come on one of our Windows or Hybrid Cloud InnovateIT events, register here and you will receive a 30 day free Azure pass.

The bottom line is that you must have a detailed knowledge of all the areas in the exam topic lists.

These are not easy exams. Bear in mind that the content of the exam does not change as often as the Azure services do, so knowledge of how the services looked a few months ago will help when sitting these.

As for which you take, Cisco or Azure: it depends what job you want or are doing. Personally I would do the Azure ones and then the Cisco one, since the Azure networking knowledge is not directly linked to knowing your Cisco binary ANDing and ORing; Azure takes care of most of the networking for you.

You can sign up for an exam here.

You can download the latest technical certification roadmap here as well.

Good luck

Windows Containers: What they are and how they work


I thought a brand New Year warranted a post on some brand new technology that I predict will have a huge impact on us this year.

The IT industry is a maze of Three Letter Acronyms (TLAs) and buzzwords; they seem to proliferate at an ever-increasing rate. One of the latest ‘fads’, ‘trends’ or buzzwords that has gained a great deal of momentum over the last 12 months is the wonderful world of containers.

Hopefully, if you currently don’t know what containers are or why we should be diving feet first into this new(ish) technology, then after reading this post you should have a clearer idea. Why not get testing the Microsoft implementation with Windows Server Technical Preview 4 (TP4) (get it here) or on your Windows 10 client Hyper-V using PowerShell by following the steps here.

There are also a bunch of free MVA courses here and to finally quench your thirst for knowledge, Channel 9 has a containers channel here.

So having prepped you for the journey into Containers why not take a quick look at an intro video on Channel 9.

Microsoft has been in partnership with Docker for some time, integrating their container technology into Microsoft Azure and Windows Server 2016. With TP4 Microsoft have also introduced both Windows Containers and Hyper-V Containers, both of which carry out the same function of providing a platform for lightweight, portable applications to be hosted, scaled and reworked in seconds, with differing levels of isolation.

At its most fundamental level, container technology is a way of packing many identical, similar or completely different applications in isolation on the same host (or ported to the cloud).

I know, we can already do that with Hyper-V and other hypervisors. But this technology only requires one base operating system to be in use on the container host; all the containers use links to that operating system. A container host can be a physical host or a virtual machine. The container analogy is similar to the way a differencing disk contains only changes to the parent installation: the base OS image is never changed.

Microsoft container technology now provides two base images, Nano Server and Windows Server Core. There are also two types of container, which provide the same functionality but different levels of container isolation, as shown below.


A Hyper-V container will be run in a very lightweight virtual machine, ensuring that the isolation is total. This will allow Microsoft Container technology to support a full public multi-tenancy solution.

The clever part of the solution is that it’s quite easy to create, amend, copy and recreate container images and then deploy them as containers running modern applications, web browsers etc. on your server.

Normally my posts include a whole bunch of how-to steps and advice, but all of these are already documented in a step-by-step way, so I will limit myself to your first step.

From either a Windows 10 machine running client Hyper-V or a Windows Server running Hyper-V, fire up your best friend (PowerShell) in an administrative window. Then enter this one command.

wget -uri -OutFile C:\New-ContainerHost.ps1

This downloads a new script which will then set your system up as a container host, so run this command straight afterwards.

New-ContainerHost.ps1 -VmName Tp4ContainerHost -WindowsImage ServerDatacenterCore -HyperV

Once the script completes you will have a running container host, ready to start experimenting.

Be warned though, this script takes a long time to run as it downloads the base OS images.
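When the host is up, your first container is only a few commands away. This is a sketch only: the cmdlet names come from the TP4 Containers PowerShell module and may change in later previews, and the container and switch names are illustrative.

```powershell
# Sketch – cmdlet names are from the TP4 Containers module and may change;
# 'Demo01' and 'Virtual Switch' are illustrative names.

# See which base OS images the new host knows about
Get-ContainerImage

# Create and start a container from the Windows Server Core base image
$container = New-Container -Name 'Demo01' `
    -ContainerImageName 'WindowsServerCore' `
    -SwitchName 'Virtual Switch'
Start-Container -Name 'Demo01'

# Open an interactive PowerShell session inside the container
Enter-PSSession -ContainerId $container.ContainerId -RunAsAdministrator
```

From inside that session you can install features and applications, then capture the result as a new container image to redeploy in seconds.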

Happy Containerising!

DevOps Practices – Infrastructure as Code

This post is provided by PowerShell MVP Richard Siddaway

One of the biggest issues in IT is the apparent split between development teams and operations teams. I deliberately say apparent because, as with many things in IT, the hype doesn’t always match reality. DevOps is the new IT ‘wunderkind’ that will cure all your IT woes at the wave of a wand.  OK, cynical view put aside for a while – the DevOps movement does bring benefits, and in many organizations there is a dichotomy between development, operations and purchasing. The last one isn’t talked about much, but how many times have operations been the last to find out that a new application has been purchased? Oh, and it has to be installed by the end of the week!

One of the biggest stumbling blocks in IT is the movement of applications into production. The development team have an environment where everything works, and when the application is passed over the fence to operations everything falls apart. There are a number of reasons for this, including:

·         Development infrastructure doesn’t match production

·         Development machine configurations don’t match production

·         Load in production greater than development

·         Permissions different in development and production

·         Application configurations aren’t documented

Infrastructure as Code (IaC) is all about applying software development techniques, processes and tools to manage the deployment and configuration of your servers and applications.

Why do you need to do this? And what exactly is IaC?

Let’s start by viewing what it’s not. Since PowerShell’s introduction 9 years ago many people have written code to configure their servers. You might use Desired State Configuration:

Configuration ODataSetup {

  param (
    [string]$node
  )

  Node $node {

    WindowsFeature IISbasic {
       Ensure = ‘Present’
       Name = ‘Web-Server’
    }

    WindowsFeature HTTPtracing {
       Ensure = ‘Present’
       Name = ‘Web-Http-Tracing’
       DependsOn = “[WindowsFeature]IISbasic”
    }

    WindowsFeature BasicAuth {
       Ensure = ‘Present’
       Name = ‘Web-Basic-Auth’
       DependsOn = “[WindowsFeature]HTTPtracing”
    }

    WindowsFeature WinAuth {
       Ensure = ‘Present’
       Name = ‘Web-Windows-Auth’
       DependsOn = “[WindowsFeature]BasicAuth”
    }

    WindowsFeature ManCon {
       Ensure = ‘Present’
       Name = ‘Web-Mgmt-Console’
       DependsOn = “[WindowsFeature]WinAuth”
    }

    WindowsFeature Modata {
       Ensure = ‘Present’
       Name = ‘ManagementOdata’
       DependsOn = “[WindowsFeature]ManCon”
    }

    WindowsFeature Desk {
       Ensure = ‘Present’
       Name = ‘Desktop-Experience’
       DependsOn = “[WindowsFeature]Modata”
    }

    # LCM setting so the configuration can reboot the node when required
    LocalConfigurationManager {
       RebootNodeIfNeeded = $true
    }
  }
}

This is a configuration I used to create a demo machine for a talk on OData that I presented at the PowerShell Summit NA 2015.  Its purpose is to ensure that a number of Windows features are installed on the machine, including the web server (IIS), the OData management extensions and the Desktop Experience. The configuration handles the reboot.

The configuration is run:

$node = “W12R2OD01”

$path = ‘C:\Scripts\Mof’

$params = @{
 Node = $node
 OutputPath = $path
}

ODataSetup @params


This creates a MOF file which is pushed to the remote machine to install the configuration:

$cs = New-CimSession -ComputerName $node

Start-DscConfiguration -CimSession $cs -Path $path -Verbose -Wait -Force
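Once the configuration has been pushed, DSC can also tell you whether the node still matches it – a first taste of the testing theme later in this post. A quick sketch, reusing the $cs session created above:

```powershell
# Returns True if the node still matches the pushed configuration
Test-DscConfiguration -CimSession $cs

# Inspect the current state of each resource in the configuration
Get-DscConfiguration -CimSession $cs
```

Running Test-DscConfiguration on a schedule is a cheap way to spot configuration drift before it bites.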


Other configuration tools such as Puppet, Chef, Ansible or Salt may be used instead of, or in conjunction with, DSC.  You might just use a script to configure parts of your new server:

Get-NetAdapter -Name Ethernet | Rename-NetAdapter -NewName ‘LAN’


## set network adapter and IP address

$index = Get-NetAdapter -Name LAN | select -ExpandProperty ifIndex



$ipv4address = ‘’

New-NetIPAddress -InterfaceIndex $index -AddressFamily IPv4 -IPAddress $ipv4address -PrefixLength 24


Set-DnsClientServerAddress -InterfaceIndex $index -ServerAddresses “”

Set-DnsClient -InterfaceIndex $index -ConnectionSpecificSuffix “”



##  join domain


$newname = ‘NewServer’

$cred = Get-Credential

Add-Computer -Credential $cred -DomainName manticore -NewName $newname -Force -Restart


Or you might use a combination of scripts and configuration.

So, you’ve automated your build process to a greater, or lesser, extent. You have a reproducible process but it’s still not IaC – you’re using software to manage your infrastructure but you need to take a few more steps into the development world.

So what do we need to do?

At the minimum you need to:

·         Apply source control to your infrastructure code

·         Put together a process that takes a new application build, creates the required infrastructure and deploys the application

·         Build tests into your infrastructure code – and keep those tests up to date as you make changes

·         Make sure that infrastructure used for development, test and production is created using the same code

Source control at its simplest is a matter of tracking the changes that are made to your code and who made them, and most importantly it gives you an easy way to roll back changes. Think of the scenario where you have a number of web farms, each supporting an important web application. You need the server configuration in each web farm to be identical across the farm. Assume that your configuration code is changed by an overzealous junior administrator. No one will know until you need to deploy a new server into the farm to cope with the Christmas rush. Oops. The deployment works, but the difference in configuration brings the application to its knees.

Source control means that you can restrict who can change the code. It means you know that a particular version will work and that you know where it is. One piece of grit removed from the machine and ultimately less work firefighting.

Creating a build process that will create infrastructure and deploy the application is one endpoint of IaC. Your developers commit their changes, and the build process spins up a new server and deploys the application. This means that testing is always performed on a clean machine with a known configuration. It’s possible with a source control product such as Team Foundation Server (TFS) to create a number of builds – development, test and production for instance – and choose which one is targeted.

Don’t forget the change control! Deployments into development, and possibly test, should be standard changes; moving a new version into production is where your organisation needs to make a call on how much risk it can accept. There are organisations that roll out several new versions of an application per day – and this includes building new infrastructure for the application each time.

Testing is critical to any build process. You need to develop a set of tests that can be run, preferably automatically, when your build process is changed. If you’re using PowerShell for your code you can use the Pester module (it ships in Windows 10 and Server 2016, or can be downloaded from the PowerShell Gallery). A simple example of creating tests with Pester:

function divisiontest {
    param (
        $a,
        $b
    )
    return $a / $b
}

Describe -Tags ‘BasicTests’ ‘divisiontest’ {

    It ‘divides positive numbers’ {
        divisiontest 6 3 | Should Be 2
    }

    It ‘divides negative numbers’ {
        divisiontest -6 -3 | Should Be 2
    }

    It ‘throws error for divide by zero’ {
        {divisiontest 6 0} | Should Throw
    }
}
The function divides the first parameter by the second. The tests are wrapped in the Describe block and in this case test for correct division of positive and negative numbers. The third test is to determine if the function correctly throws an error when attempting to divide by zero. The results look like this:

Describing divisiontest

 [+] divides positive numbers 70ms

 [+] divides negative numbers 25ms

 [+] throws error for divide by zero 45ms


When testing your IaC routines you’ll need to look at testing things like installed Windows features and roles, network configuration (IP addresses and subnet masks) or the existence of particular folders (use Test-Path).
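Infrastructure checks of that kind might look like this in Pester – a sketch only, with the interface alias, feature name and folder path as illustrative assumptions:

```powershell
# Sketch of infrastructure tests – 'LAN', Web-Server and the MyApp path
# are illustrative; substitute the values your own build expects.
Describe ‘Web server node’ {

    It ‘has IIS installed’ {
        (Get-WindowsFeature -Name Web-Server).Installed | Should Be $true
    }

    It ‘has the expected IPv4 subnet mask’ {
        $ip = Get-NetIPAddress -InterfaceAlias ‘LAN’ -AddressFamily IPv4
        $ip.PrefixLength | Should Be 24
    }

    It ‘has the application folder’ {
        Test-Path ‘C:\inetpub\wwwroot\MyApp’ | Should Be $true
    }
}
```

Run against a freshly built server, a suite like this tells you immediately whether the build process produced the machine you think it did.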

The last point about making sure you use the same code to create the infrastructure for development, test and production is important, as it ensures that you have consistency across the environments. This prevents any assumptions made in development or testing causing problems in production. Also, you now know the configuration you need, its creation has been tested and you’ll have a smooth roll-out into production (hopefully).

Put all of the above into play and you’ve taken the most important steps into the brave new world of DevOps – you’ve started.  Implementing IaC is a great first step, but it will all fall apart on you if you don’t have good change control. I’ve seen many organisations that claim to be using change control but in reality have created a bureaucratic overhead that doesn’t achieve its goals because:

·         It doesn’t have the authority to stop dangerous changes

·         It’s viewed as an overhead rather than a protection

·         Its purpose isn’t understood – it’s viewed as a rubber stamp for change

Make sure your change control process is robust and works. It’s quite telling that in The Phoenix Project the first thing that is implemented is a change control process that works. If you’ve not read The Phoenix Project, it’s highly recommended as an introduction to DevOps. The book spends a bit too long on the scene setting and problem description as opposed to the solutions, but it’s well worth a read.

Finally, should you use Infrastructure as Code for all applications? Ideally yes but (strange how there’s always a but) you will derive most benefit from using IaC with applications that change frequently, or are being developed and rolled out in a phased manner. The one-off deployment of a COTS application isn’t going to be the best place to start though ultimately your IaC processes should support that as well.



Demystifying DevOps Behaviours

In an effort to demystify the new and exciting world of DevOps and DevOps culture, I am hosting a series of guest posts on this blog over the next few months.

Firstly we need to have a look at just what those behaviours are. To do this, I have enlisted the help of a number of Microsoft Most Valuable Professionals (MVPs).

When I ask attendees at our InnovateIT UKITCamps what DevOps is, the answers I get are less than confident and often conflicting, as shown in the Elephant slide below.


The discussions often lead on to what behaviours and practices make up the modern culture of DevOps.

I have listed them below:

•Infrastructure as Code (IaC)

•Continuous Integration

•Automated Testing

•Continuous Deployment

•Release Management

•App Performance Monitoring

•Load Testing & Auto-Scale

•Availability Monitoring

•Change/Configuration Management

•Feature Flags

•Automated Environment De-Provisioning

•Self Service Environments

•Automated Recovery (Rollback & Roll-Forward)

•Hypothesis Driven Development

•Testing in Production

•Fault Injection

•Usage Monitoring/User Telemetry

Rather than try and address these practices in a single post, I thought it would be really interesting to get the thoughts of some industry experts to help out. What better way of doing that than asking our MVPs to pick a practice and explain it from their own experience?

First up is Richard Siddaway, a PowerShell MVP who blogs here; you can follow him on Twitter here.

You can find his post on Infrastructure as Code here.

The UK IT Camp Roadshow travels to Cardiff and Birmingham.

In December we travel to Cardiff and Birmingham to deliver the next phase in our IT Innovation series of IT Camps.

We have four UK IT Camp events in December.

1st December – Radisson Blu Hotel, Cardiff – Azure (Cloud Infrastructure)

2nd December – Radisson Blu Hotel, Cardiff – Windows 10 Enterprise

9th December – The Studio, Birmingham – Windows 10 Enterprise

10th December – The Studio, Birmingham – Azure (Cloud Infrastructure)

(Register by clicking the links above)

These are full-day events with at least 60-70% hands-on labs – so bring at least one device that you can use comfortably for labs all day (12-15 inch screens are good).

If you have two, all the better, as you can have the PDF instructions on a separate screen.

For the Windows events we will cover:

Application compatibility

Identity and security

Configuration management

Modern provisioning practices

There is a whole list of demos and labs to do, as well as some top-rate instruction and discussion.

For the Azure events we will cover:

Deploying Azure Infrastructure using ARM templates (GitHub, JSON, VS Code)

Designing Azure Compute and Storage for best performance

Designing Azure networking for advanced security

Designing Site Recovery and Migration (ASR)

Designing Identity Solutions (Azure AD)

You definitely don’t want to miss this. We are covering many new topics, techniques and skills that will be invaluable in this modern cloudy devopsy world we are entering.

Register at the links above.

Come and see us – you won’t regret it.