DevOps – a word or a Brave New World?

  1. DevOps – A Journey into Open Source Tools

Once a year the team of Evangelists I work with gather together and spend two days engaged in a variety of different pursuits. This year we made our way to the Microsoft Research Labs in Cambridge.
Whenever I make the journey to this lab I do so with trepidation. I work in a large team of very clever people: there are many degrees, several doctorates and a number of patent holders amongst them. But when we enter the hallowed ground that is the MSR Lab, we all feel a little awed in the presence of a large number of people who on the surface appear normal, ordinary types but actually have brains the size of a small planet, and what's more use those brains to develop useful technology for us all. These technologies include:

Bing Translator
Clutter (For Office 365 Outlook)
And many more.
So for our humble team to turn up and expect to produce something comparable in the inaugural UK DX DevOps Hackathon was always a stretch too far. One thing was for sure though: the two days were going to be fun, noisy and full of problems just dying to be solved by the wonderful world of DevOps.

First – What is DevOps, why do we need it and how do we use it?

Wikipedia defines DevOps as:

DevOps is a software development method that emphasizes communication, collaboration (information sharing and web service usage), integration, automation, and measurement of cooperation between software developers and other IT professionals. The method acknowledges the interdependence of software development, quality assurance (QA), and IT operations, and aims to help an organization rapidly produce software products and services and to improve operations performance.

Well that is a mouthful of long words.

I describe DevOps as common sense and the way that all organisations should always have been managing their business. Rather than spend many of my words (in a self-imposed, but small all the same, word limit) explaining the theory, have a read of my colleague Susan Smith's blog post on TechNet from January this year.


So what was the aim of the two-day event?

The department I work in is primarily staffed with Developer Evangelists, with the aim of assisting developers to use the Windows platform for their applications. There are a few of us who are IT infrastructure evangelists, and the exercise was devised to skill us up as a team to help our customers embrace the culture that is DevOps, allowing for more agile and robust application releases and development.

Essentially we were to have an idea, create a solution and make sure the process used a number of the DevOps practices, with the aim of making the solution publicly available in a GitHub repository.


As a team we can then visit customers and help them to embrace the culture and practices in their own organisations.

That was the theory.

The first hurdle was getting to the Lab in Cambridge. If you have ever driven in Cambridge you will understand just how dysfunctional the transport system can be if you are unlucky enough to travel on four wheels. I rode in on my trusty BMW motorcycle but others used trains, park and ride and even taxis from our hotel on the outskirts of the town. Huge queues and lots of roadworks led to timings being rather flexible!

Not Cambridge - But it feels like this

After an excellent briefing from our external and very enthusiastic facilitators we got to choose our preferred project and team. This was surprisingly painless and I didn’t suffer the ignominy of being the last to be chosen, but only because each team had to have a mix of Developer and IT Pro evangelists.

I chose to work on a project to help our resident education/gaming Evangelist Lee Stott produce a solution to enable greater use of a student DreamSpark Azure subscription. This would allow students to step through a manual process for creating a website and a MySQL (ClearDB managed) database. Once the lessons had been learned, the student could then take advantage of an automated system to deploy future such solutions, learning the DevOps practices whilst doing so.

(DreamSpark is Microsoft's free offering to students of all ages to help them
gain access to software services and help to learn to use and develop with the 
Microsoft platform tools).


Other projects included a mixture of IoT gadgets from our resident IoT gurus Paul Foster and Bianca Furtuna and a project to allow easy updating of an e-commerce website by Martin Kearn. The final project was a Microsoft Band integration solution cunningly entitled Band on the Run.

I was slightly nervous working in the company of such awesome developer characters as Martin Beeby, Amy Nicholson and Lee Stott, and this was borne out as the first suggestion in our team huddle came from Martin and went something like this:

“Why don’t we use as many non-Microsoft technologies as are available in delivering this as we can?”


So there we have it, the rest of the two days became filled with acronyms and products I had never heard of before and had almost no likelihood of understanding even if I had heard of them.

So what was our proposed solution and how did we plan on getting there?

Well, we all (our team consisted of six evangelists) used a Visual Studio Online account to share the team room for our Student Continuous Integration project. We used VSO to manage the tasks and backlog for the project, and we integrated it with our chosen tool for team communication: Slack. I had never heard of Slack, and the phrase “Slack is the new black” was soon the team tagline. In short, it is an instant messaging tool (which we already have plenty of, right?) whose big selling point (besides being free) is the quantity and quality of the integration options available for your other tools, such as Git and VSO.
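Most of those Slack integrations boil down to webhooks: a tool POSTs a small JSON payload to a Slack incoming-webhook URL and the message appears in a channel. As a rough sketch of the idea (the webhook URL is a placeholder, and the channel/username values here are made-up examples):

```python
import json
import urllib.request

def build_payload(text, channel="#devops", username="ci-bot"):
    """Build the JSON body that Slack incoming webhooks expect."""
    return json.dumps({
        "text": text,
        "channel": channel,
        "username": username,
    }).encode("utf-8")

def notify(webhook_url, text):
    """POST a message to a Slack incoming webhook (URL is hypothetical)."""
    req = urllib.request.Request(
        webhook_url,
        data=build_payload(text),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

# Example (would need a real webhook URL from Slack):
# notify("https://hooks.slack.com/services/T000/B000/XXXX", "Build passed")
```

This is why a build server, a Git repository and half a dozen other tools can all chatter away in one team channel with almost no setup.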

We allocated tasks and did some whiteboarding of the architecture, and since I am a OneNote fan I created a shared OneNote notebook so that the team had somewhere to dump screenshots, links and other great resources, to be able to write this up and prove our DevOps chops to the facilitators at the end!

We then placed all our code and scripts and other interesting things in a public Git Hub repository.

This post was designed to be process-based rather than technical, but it will help to explain that we did have a few difficulties due to the limitations placed on the free student Azure account.

A student is given access to the new Azure portal and can only use the Azure Resource Manager features. This means that a number of our chosen methods and solutions simply would not work.

Martin chose to use a whole bunch of open tools to develop, deploy and amend the Website part of our solution.

These included

  • Gulp – described as the streaming build system to automate your workflow
  • Grunt – described as a JavaScript task runner; in one word, automation
  • Yeoman – described as the web's scaffolding tool for modern web apps
  • Slack – an instant messenger that integrates with everything

There are probably many more, but I didn't pick up on them due to my own tasks. I was asked to deliver the PowerShell scripts to enable the automation of the solution. This was a pleasing task without a pleasing outcome: I failed, in the time allocated, to deliver anything that would work within our restricted environment.

At the end of day one we all set off across Cambridge on foot (the quickest way around town, I reckon) to find the allocated restaurant. This was achieved, much like the day's tasks, almost on time. After a great meal, great company and a comfortable night's rest we carried on with our DevOps hacking the next morning.

Our facilitators Damien and Alex who had travelled from deepest Europe to help us were seemingly impressed with our efforts and that we had come back to have another go! We even started half an hour early.

Sadly, for our team, our restricted environment meant that we were not able to win the competition or to produce a solution we were proud of. What we did do was fully examine the tools available and the processes we used to manage a short, fast project.

Just before lunchtime we had an opportunity to present our solution and listen to the others as well. My initial thoughts were that the four teams had delivered some really great results, all of which are still works in progress but will assist us and other customers to consider the DevOps culture in their future deployments.

I ought to mention that my colleague Andrew Fryer was part of the winning team and has written, or will write, up their ‘victorious’ project in detail.

Also expect to see many more DevOps focussed posts here and elsewhere.

Watch this space.

UK IT Camps are back.

After our Summer break the UK IT Pro team are back with a bang.

Next week we are running our first two IT Camps

3rd November Manchester City Centre

4th November Leeds City Centre

These two camps focus on Microsoft Azure but we have a whole list of exciting things coming up between now and May next year.

So first, what is an IT Camp?

An IT Camp is a free hands-on training experience hosted by our IT Evangelists. These camps run between 0900 and 1700 and are at least 60% hands-on labs.

To take part you need to bring with you a device that can connect to the internet and run a modern browser. It does help if you have local admin privileges but that is not always necessary.

Now, what is the agenda for these camps?

The day starts with coffee and refreshments and launches fairly fast into the content, followed by a series of hands-on labs to demonstrate and test the technologies being taught.

The aim is to give you a better idea of the capabilities of the specific technologies involved.

What is the content?

Between now and Christmas we are covering two main topics

Microsoft Azure and Windows 10.

Azure – What’s new in Cloud Infrastructure

What’s new in Cloud Infrastructure: Improving Datacenter Flexibility with Microsoft, Open Source and other technologies

Embrace infrastructure in the cloud while maximizing current resources and improving datacenter flexibility to deploy new technologies. Attend this free one-day training event to get the knowledge needed to integrate cloud solutions with your existing on-premises datacenter without sacrificing security, control, reliability or scalability. Experience how advanced Azure infrastructure services enable you to provide reliable data access while maximizing productivity across platforms: Open Source (OSS), Microsoft and others.

Join Microsoft experts and learn to:

* Leverage a public cloud solution to increase reliability of disaster recovery using Azure Site Recovery

* Enhance virtualization performance in the cloud for different workloads

* Enable faster and easier deployment using Azure Resource Manager templates and GIT

* Design compute and storage infrastructure to improve performance and enhance security through Azure Networking infrastructure

* Boost secure data access with identity solutions via Azure Active Directory

* Minimize errors and save time with advanced automation using PowerShell and DSC extensions in your infrastructure

Join this IT Innovation Series event for first-hand experiences with real-world scenarios around Microsoft Azure. Browse more events on additional topics.

Audience: IT operations/infrastructure professionals

Prerequisites:  Cloud management, virtualization, security, and storage experience.  An interest in understanding Azure Infrastructure solutions.

Windows 10 – in the Enterprise

What’s new in Windows 10 Enterprise: Increasing Security, Predictability, and Compatibility

Experience the most innovative and reliable Windows yet! Windows 10 brings increased stability and predictability to your organization, while minimizing risk. Attend this free one-day training event to explore new servicing, security, and management features that enable corporate data access across devices and platforms while allowing you to maintain control over those devices.

Join Microsoft experts and learn to: 

  • Help ensure application compatibility with new and legacy LOB apps with Microsoft Edge and IE 11.
  • Implement security and identity capabilities through Microsoft Azure Active Directory, Hello/Passport, Device Guard, Enterprise Data Protection, as well as Multi-Factor Authentication.
  • Get hands-on with Windows as a Service by managing Current Branch and Long Term Servicing Branch scenarios.
  • Create and configure deployment and management packages using Microsoft Mobile Device Management and Microsoft Intune.

Join this IT Innovation Series event for first-hand experiences with real-world scenarios around Windows 10.

Audience: IT operations/infrastructure professionals

Prerequisites: Windows devices, apps and users security, management, and provisioning experience. An interest in the latest innovative capabilities in Windows 10, Azure AD, and Intune.

Register for Manchester here and Leeds here


Cortana in the UK – Get it here!

Having dutifully installed the upgrade to Windows 10, several of my friends and colleagues encountered a small but annoying issue.  Their very own personal assistant was absent without leave.

As shown below.


This short post is a step-by-step guide (with lots of pictures and not many words) to getting Cortana up and running. It is simple and is all about language. Cortana is available in quite a few languages, and more are coming all the time. It's no mean feat to create the environment for so many languages for the system to function effectively.

Ian Moulster, a colleague of mine on the Windows team at Microsoft UK, created this video to show us what we need to do; watch it here. There are some detailed steps behind this video, which is why I first published this step-by-step guide internally on our Microsoft Yammer group. It seems like a good idea to get it out into the public domain as well.

So here we go.

By default you may not have the English (United Kingdom) language pack installed, and some of the screenshots you see may be different to mine. Persevere, and if you need any extra advice just drop me a line on Twitter @edbaker1965 or via email at

First, open the new Settings screen (there are a few ways to do this); I use the Windows key and then click the Settings icon, as below.


From there select Time and Language followed by the Region and Language sub menu.

This screenshot is taken from the 64-bit Enterprise ISO, although yours shouldn't look too different if you used English (United Kingdom) as the language and region.

Click on English (United Kingdom) Language (language pack available) and get this (even if it doesn't say language pack available, still click it).

Click Options and you get this


Click Download


Also click download alongside all the possible choices offered. You get this


OK, so I have done all that, Ed; so why does Cortana still say this?


That's because you have currently only done half a job, Harry! Follow on further down the path to Cortana cuteness.

It will take some time to install all these bits, as shown below.



Click on settings under Speech


Under speech language select the drop down and choose English (United Kingdom)


Go back to the main Region and Language section (it may still be installing; I said it would take some time).


Choose additional date, time and regional settings, you get this (the old control panel dialog)


Select Make this the primary language under Windows Display Language. Next you need to log off (I would always reboot at this stage, but technically you just need to log off and back on).

Select Cortana and she MAY still not be there so choose settings in the Cortana menu.


Hooray we can turn her on now, so do so!

If you already have a Microsoft account (Hotmail, Live, Outlook) or other such account associated with your Enterprise account, you are now finished and can start teaching Cortana how you speak.

To use Cortana you must either have a Microsoft account associated with your domain account (if this is an Enterprise domain-joined work machine) or log on with a Microsoft account directly.

And even if you are down with the kids and already use an Azure AD account to log in, you will still need to associate a Microsoft account to use Cortana.

So dive in and get her going – ask her some interesting questions.

My geek favourite is to check her Klingon is accurate!

How Windows 10 will change your workplace

Bold claim indeed.

But let us just examine what will be arriving in a couple of days, when Microsoft releases Windows 10 into the wild world outside the rarefied existence of the Windows Insider Programme.

Windows 10 has been in Preview with over 5 million insiders for many months and each ‘flight’ has introduced more and more of what will be available within the workplace very soon.

This is probably the most talked about, written about and widely commented on Operating System release ever and for that reason I don’t want to repeat things here.

So, I will limit myself to just one radical change that can and probably will change the face of your workplace, from the perspective of the IT Professional (that's you!).

Users will have a great and far more productive experience and Windows 10 goes a very long way to Microsoft’s goal of more personal computing, especially as the operating system will be the same from tiny devices right up to the large conferencing system that is Surface Hub.

But the poor soul who will have to test, configure and deploy this magical new world for the users is the IT Professional: implementers need to skill up quickly to take advantage of all the new leaps in deployment, management and connectivity that Windows 10 brings.

So for my part I will pick the ability to utilise the Azure Active Directory features and an Azure AD account as your primary logon method and the new updating / deploying models, or Windows as a service.


Microsoft Azure is the all-encompassing name for Microsoft's vast network of datacentres that provide public and private services to its customers: Office 365, Intune, Azure IaaS, PaaS and many more, including Azure RemoteApp, which I have written about extensively, and Machine Learning, which is the remit of my colleague and fellow Technical Evangelist Andrew Fryer.

The backbone of this broad range of services is, of course, identity: ensuring security through authentication, authorisation and accounting. This is achieved in Azure by the use of Azure Active Directory. This article is not intended as a lesson in Azure AD, since there are already many resources to assist you with that (MSDN, TechNet, Azure help files).

Currently Azure AD can connect you to your Microsoft Azure services and other online services as well as provide directory synchronisation and full federation including Single Sign On (SSO) and password write back into your on-premises Active Directory Domain Services (AD DS) Domains.

This is quite literally huge. One of the great ways this now manifests itself is in the ability to use these identities as the sole method for connecting to a Windows 10 device.

From the Azure AD section of the management portal it is possible to track usage by user / device / application.


In addition, the Azure AD Premium reports allow in-depth reporting, utilising Microsoft's machine learning skills and many years of security experience to forewarn you of any potential security lapses or attempts to breach your carefully crafted user security.


To be honest there are so many ways that Windows 10 will change the workplace that I could not cover them all in one post here.

A short list would include Windows Hello, Device Guard, Microsoft Edge, Windows Update for Business, Cortana, the list is long and exciting.

So during the next few weeks I will be dropping the odd post as to what and why Windows 10 is an absolute must-get for consumers, IT Pros and enterprise users.

This is without doubt the best Windows ever, and IT IS free for the first year for Windows 7 and Windows 8.1 users!

Make sure you reserve your copy – Windows 10 arrives on Wednesday 29th July. Quite simply the best, most secure Windows ever!



Azure RemoteApp – the Final Part (3) ….. at last

And Finally…..

Not my final post but the final part in my Azure RemoteApp series. First, many apologies for the delay, the day job is getting hectic. I have been evangelizing, demonstrating and using Azure RemoteApp for almost a year now and like most Azure services this one has developed and blossomed into a seriously powerful Enterprise Mobility Tool.

The first two posts covered why and how you set up and use an Azure RemoteApp collection. This post will cover some recent enhancements and a description of what I think is its most powerful use case: using your organisational credentials to authenticate against your on-premises Active Directory, and then using resources inside your corporate network from a non-domain-joined, possibly non-Windows device with complete safety and security.

In a recent IT camp at our London Headquarters, Cardinal Place, Victoria, there was an evacuation alarm just as I was about to demonstrate these features. Luckily I had my handy Lumia 1520 phone with me and found a spot to carry on the demonstration outside, which then seamlessly continued on a different device when we regained our cosy auditorium.

So to recap, I introduced Azure RemoteApp in this post and developed the theme by uploading my own image with an installed application in this post.

Now it's time to finish off. To help set the scene, below is a diagram of the infrastructure I am using.

It is fairly self-explanatory, but in this instance all these servers are IaaS virtual machines running in Microsoft Azure (cunningly masquerading as my on-premises datacentre), as shown below.


So what is different about this RemoteApp collection?

Well, in post no. 2 you learned how to create and upload a new RemoteApp image. The prerequisites are:

  • Windows Server 2012 R2 image
  • Remote Desktop Session Host installed
  • Required applications installed and updated
  • Operating system fully updated
  • VHD, not VHDX (use dynamic)
  • Sysprep the image
  • Upload it


The difference is in how you create the RemoteApp collection. This time we are going to join it to the AD DS domain and use the create with VNet option.

There are Basic and Standard plans available, which simply dictate the size and power of the VM that is created. Pricing is available here.

Having created my collection, I have a number of tasks to complete. The collection I am using for this post is joined to the domain (on-premises AD DS), runs a line-of-business application and also accesses data stored in an on-premises SQL Server (the AdventureWorks sample database) using Microsoft Excel.

Configuring this has become much simpler than it used to be. Despite Azure RemoteApp being a new service, it has undergone frequent revisions and improvements; the latest and greatest of these are the release of the PowerShell cmdlets and the ability to use a virtual network in Azure that is already present and in use for other functions. Previously you had to create a special RemoteApp network and then join it to other networks; this is no longer required.

As shown below


We are now left with a new but empty collection.


Clicking on the right arrow takes you to the quick start page. Where you should start the configuration of your new collection.

It really is as simple as 1,2,3,4 GO!

In my demonstration setup I use a Virtual Network I created in Azure and use for the virtual datacentre shown above.

I made sure the network was large enough for my local and remote users. You really don't want to run out of IP addresses; if you have a small network and lots of users, this is a possibility. You are also warned about this.


To link the network you simply complete a short wizard.
To link the network you simply complete a short wizard.

The next step is to join a local domain (on-premises AD DS); again this is a simple step of clicking the button and adding the necessary credentials. This user MUST have domain join rights (computer accounts); it is best practice to create a new user (service account) specifically for this rather than use domain admin credentials.

The organizational unit is an optional field allowing you to join the Remote Desktop Session Host servers to this OU.

Below is a screenshot of the Computers container after a while using this collection.


My on-premises servers are all present, as are the many instances of RDSH servers; it would be tidier to create an OU for these and state it in the wizard so that all future RDSH servers are created in that OU.

So we now have an empty collection attached to a network and set to join its instances to my domain. The next step is to link this collection to my uploaded image. For the sake of brevity (this is already a chunky post) I will refer you to post 2 in the series which shows the process for uploading your own image.

Just remember the prerequisites for the image, such as VHD rather than VHDX and dynamic rather than fixed size, as well as the details for the operating system and applications above.

Having got this far, you can now start publishing the applications you would like to deploy through this service to your remote users.

There are two ways of publishing programs, by start menu or by path.


The first option lists out all the applications that are available on the start menu (installed) of the image you deployed. The second allows you to choose a program that you have placed in a particular location.



The image I uploaded has the following published programs, a mixture of Path and Start Menu programs.


The final step is to choose the users you want to deploy this service to. Azure RemoteApp requires a default Azure Active Directory to be associated with the service when using a hybrid collection.


To add users you simply enter them line by line.

Or bulk-add them using a .CSV file with a single column containing the UPN of each user you wish to add.
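If you have the user list somewhere else already, a few lines of script will produce that single-column file for you. A minimal sketch (the UPNs are made-up examples):

```python
import csv
import io

def upns_to_csv(upns):
    """Write a single-column CSV of user principal names, one per row."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for upn in upns:
        writer.writerow([upn])
    return buf.getvalue()

users = ["alice@contoso.com", "bob@contoso.com"]
print(upns_to_csv(users))
```

Save the output to a file (for example users.csv) and upload it through the portal.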

All the users added MUST be directory-synced users from your on-premises Active Directory, because you have joined this collection to your domain and authentication will take place there. You can see below the failure report when I try to add an Azure AD user not originating or synced from the on-premises domain.

Your collection is now complete. A few pointers here though about your environment back on-premises.

The users with Azure RemoteApp access must also have AD DS access to the machines and resources you want them to reach. Also, things like port 1433 for SQL Server (thank you @deepfat) must be open.
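A quick way to sanity-check that a port such as 1433 is actually reachable is a simple TCP connection test. A minimal sketch (the host name below is a placeholder for your own SQL Server):

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("sqlserver.corp.contoso.com", 1433)
```

If this returns False from the RemoteApp side, look at firewalls and network security rules before blaming the application.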

The dashboard, sessions and scale tabs are all self-explanatory and allow you to manage your collection from now on.

Users require access to the same client as before on any of the supported platforms. (almost all of them)

When you connect with your on-premises users, the login screen will authenticate you against the on-premises domain controller.

You then have access to all the published applications and since you are operating from a server joined to the domain and have domain credentials, you can access the internal resources, just as if you were within the corporate network.

The beauty of this is that you aren't: you could be on any device, and the data never lands on that device. It is stored safely on-premises and in the application in Azure.

You can see below a screenshot of the end result.


For a full recorded demonstration of the user experience for this collection, I shall publish a short video in the next few days.

When Azure RemoteApp was first released in preview last year I was enthusiastic; now that it has become generally available at an affordable price, in different flavours of size and licensing options, and has been updated and refreshed, I firmly believe this is a killer app, and the take-up from customers is proving the point.

Why not set up a 90-day trial now? There are also useful resources in Microsoft Virtual Academy to help you. If you haven't been there for a while, take a look now; it has a new look and feel to it.

Finally, as you might imagine, I would like to end this series by introducing you to the new PowerShell cmdlets for Azure RemoteApp, as described in this MSDN blog post from March.

This makes the automation of these processes just a PowerShell session away. Happy Days.

A busy year ahead with nano server.


I have spent the last two weeks doing my normal day job(s) and also pretty much working double shifts, watching along with our Build and Ignite conferences. The several keynote sessions were as polished, impressive and exciting as we have come to expect. What has surprised me is just how many of our flagship products are due for a full version release in 2016.

Windows 10 is obviously coming in the near future and is to be followed by Windows Server 2016, System Center 2016, SQL Server 2016, SharePoint 2016, Office 2016 and probably a good few more. This also does not include all the new Microsoft Azure updates and releases such as Azure Stack for your very own Azure Datacentre and the release of new services and features as well.

This realisation left me with more than a few questions, the foremost amongst them being: how on earth am I going to find the time to learn all of these? And where will I find out about them all?

The first is a difficult question and I am going to have to fit it in somewhere. The second is easier: there are three prime areas at present where you can learn, for free, online about our products.


MSDN and

Microsoft Virtual Academy

Armed with these fantastic resources you should be in a fit state to download the new versions, install them either in your on-premises lab or in Azure IaaS virtual machines, and ‘play’ to your heart's content.

If I were to pick one announcement or product that I think will have the greatest impact on my working life, it would be very hard. I would have a shortlist of:

Nano server

Azure Stack

Windows 10.

There has already been a great deal written about Windows 10 and it will continue to be the focus of my attention for the summer. Azure Stack is not here yet and will come later on, so there is plenty of time for me to worry about that further down the track. So I am going to concentrate on ‘nano’ server.

There are several reasons for this.

  1. It is here now in Windows Server 2016 TP 2
  2. There are plenty of resources to learn how to use it
  3. I just know it is going to be the biggest thing (irony intended) to happen to Windows Server for a very long time.
  4. It is not a simple thing to create, install and manage so I am no doubt going to be called upon to know this very well indeed.

So, before I wander off into the sunset muttering about DISM, WIMs and all things nano, let's have a look at what it is, what it does and why we need it.

Well Windows Server 2016 will have three deployment models rather than two.

  1. Full fat GUI (which is actually a server with a client on top)
  2. Server Core (no GUI, but fatter than nano)
  3. nano (no GUI, no way of logging in to a console; it can ONLY be remotely managed)

nano will eventually be the server of choice for all workloads, but it is being introduced for two specific scenarios

  1. Born-in-the-cloud applications – support for multiple programming languages and runtimes. (e.g. C#, Java, Node.js, Python, etc.) running in containers, virtual machines, or on physical servers.
  2. Microsoft Cloud Platform infrastructure – support for compute clusters running Hyper-V and storage clusters running Scale-out File Server.

In future there will be many more workloads which will run on Nano Server. When you install the Windows Server 2016 Technical Preview 2 you will not see Nano as an option. To deploy Nano Server you run DISM (part of the Windows Assessment and Deployment Kit, and built into Windows) to create a Windows Image File (WIM) containing just those roles and drivers that your hardware requires. This reduces the disk footprint of the binaries to around 410MB.
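As a rough sketch of that DISM workflow (the paths here are illustrative, and the package name comes from the TP2 media, so both may well change between previews):

```shell
:: Run from an elevated command prompt. Copy NanoServer.wim from the
:: TP2 media first; C:\nano and the D: drive letter are illustrative.
md C:\nano\mountdir
dism /Mount-Image /ImageFile:C:\nano\NanoServer.wim /Index:1 /MountDir:C:\nano\mountdir

:: Add only the role packages your hardware and workload require,
:: for example the Hyper-V (Compute) package from the media
dism /Image:C:\nano\mountdir /Add-Package /PackagePath:D:\NanoServer\Packages\Microsoft-NanoServer-Compute-Package.cab

:: Commit the changes and unmount - the image is now ready to deploy
dism /Unmount-Image /MountDir:C:\nano\mountdir /Commit
```

The point of the exercise is that nothing you did not explicitly add ends up in the image, which is where that tiny footprint comes from.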

To configure the Nano server you will need to learn PowerShell Desired State Configuration (DSC), and you can only run native 64-bit applications. To manage the server you will need to use the Azure portal and the new web-based Remote Server Administration Tools (RSAT). There is NO Remote Desktop or remote console administration. You can, of course, use PowerShell remoting to manage the Nano server directly.
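A minimal sketch of that remote management, assuming the Nano server is reachable over WinRM (the IP address is, of course, made up):

```powershell
# From the admin workstation, trust the Nano server
# (needed in workgroup scenarios where Kerberos is unavailable)
Set-Item WSMan:\localhost\Client\TrustedHosts -Value '192.168.0.50' -Force

# Open an interactive remote session...
$cred = Get-Credential
Enter-PSSession -ComputerName 192.168.0.50 -Credential $cred

# ...or run one-off commands against it without an interactive session
Invoke-Command -ComputerName 192.168.0.50 -Credential $cred -ScriptBlock { Get-Service }
```

Everything else, DSC configurations included, is pushed to the box over this same remoting channel.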

These new RSAT tools are simply amazing: they give you a pretty fully featured PowerShell console window, a device manager and even a registry editor, all in the web-based Azure portal.

These changes mean that you can install, configure and run nano in minutes rather than hours.

If you want to see a very quick explanation of the core features watch this short video from Ignite where Jeffrey Snover explains all.

So Nano Server is small and fast, but what other benefits are there?

Nano Server will allow customers to install just the components they require and nothing more. The initial results are promising. Based on the current builds, compared to a full Server installation, Nano Server has:

  • 93 percent lower VHD size
  • 92 percent fewer critical bulletins
  • 80 percent fewer reboots


For more from the Nano Server team, visit their blog.

Right – much to do and little time to do it.

Next up here will be my final part of the Azure RemoteApp series (finally, I hear you say) followed by some cool PowerShell and Nano Server stuff too.

Watch this space.



Microsoft Azure – Changing the face of certification. (Or why Second Shot is Essential)

Followers of this blog will know that I am pretty quick to take new certification exams produced by Microsoft. There are several reasons for this, ranging from occupational hazard as an MCT to the fact that it is often, in my opinion, the best time to take and pass these tests.

Nothing supports the latter assertion more than the email I received this morning from the Microsoft Azure Team with the title March Azure Newsletter. This contains upwards of 2600 words, of which well over half are dedicated to describing new or updated services.

I have already posted about my early attempts at the three Microsoft Specialist certifications for Azure. I passed the Infrastructure exam (70-533) at the second attempt (and, as it happens, after a re-mark, at the first attempt as well). As you might imagine, I was more than a little frustrated at the four weeks of cramming, including over the Christmas break, that I spent on the retake, all to become one of the few who have passed the exam twice! I also mentioned I had taken the beta test for 70-534, the Architecting Azure Solutions specialist exam. I was not expecting a pass and was not disappointed, although missing by so little (665/700) was also frustrating.

As a technical evangelist at Microsoft UK I don’t get to teach very much, but I do get to stand up in front of audiences of varying sizes and hopefully pass on the art of the possible for their businesses with Microsoft technologies. It truly is a fantastic job, so much so that it still doesn’t feel like going to work. I do miss the formal setting of the classroom, though, and the buzz of helping people reach their certification goals.

I was very lucky to be given the chance this month to teach the Microsoft Official Curriculum course 20533B, Implementing Microsoft Azure Infrastructure Solutions, which maps directly onto the 70-533 certification exam. This was part of my personal development, to assist in maintaining my MCT status, and to prepare me to help our internal Microsoft technical staff to certify in Azure.

Whenever I talk about Azure to customers, delegates or students I always mention just how rich the Azure, MSDN and TechNet documentation is, and what a valuable resource it is for learning and troubleshooting. I ALWAYS add the caveat that before reading the pages you find, please check the date of publication. If it is more than three months old, check around for newer resources. The one constant in the world of the Microsoft Cloud is that three months is a very long time, and you can guarantee that something will have changed.

This brings me to the exams. I took the beta exam in November, I received the result last February, and I retook the exam whilst teaching this class in early March. In the normal scheme of things, such as a Windows 8 or Server 2012 exam, this would not be of any concern or consequence; however, in the world of the Microsoft Cloud, this is a big deal.

In fact, taking an exam which you have recently narrowly missed passing, whilst teaching the topic in front of your peers, can be quite stressful, and that’s without the added concern of the moving goalposts of product versus syllabus. However, I was successful, and I can report that the GA version of the 70-534 exam is a very testing one but a fair test of the knowledge and skills required.

This brings us back to the March Azure Newsletter and its long list of services added or updated in March.

I don’t have the space to discuss or describe any of these, but I do recommend you sign up for the Azure newsletter. The easiest way is to sign up for a free trial here: 30 days of usage with up to £125 to spend.

Which raises the question, what to study?

Do I study what was current back when the exam was released?

Do I study all the new stuff each month?

Someone help please?

Well my advice is quite simple.

Currently there is a Second Shot promotion on every Microsoft certification exam. This allows you two attempts at every exam for the price of one voucher.

I recommend doing some base study of the product and the exam criteria, and practising building out solutions as well; Microsoft Virtual Academy is a good way to help with this, through the various Jump Starts and courses available. Then take the exam. If you pass, all well and good; if not, you at least have a guide as to what level of test this is and what is covered.

Then use your free second shot to retake the test having studied the areas you are weakest in. The Microsoft Learning team have spent a long time redesigning the score report and you now receive a great deal of useful information. For those who did not achieve a passing score, the most useful is the list of areas you were weakest in. An ideal list to go and study for next time.

The caveat here is that the exam can change at any time, as can the product. So be aware that questions can be altered, added to, removed or generally remodelled to fit the needs of the programme.

As an example, let’s take a look at the new Azure App Service.

The interface you are used to will change, and the names of the services will change, even though Web Apps still contains Websites.

Azure App Service is an enterprise-grade cloud-app service that can help you more quickly and easily create powerful enterprise web and mobile apps for multiple platforms and devices. App Service integrates the best of Azure Websites, Mobile Services, API Management, and Azure BizTalk Services, in addition to new capabilities in Logic and API apps to provide you with a unified set of capabilities for everything from the billing model to the development and management experience. App Service and the Web Apps capability are generally available today. The Mobile, Logic, and API Apps capabilities and the BizTalk Connectors are in public preview.

This is a real change in the services tested in all three Azure exams, and a whole new way of delivering applications, but it will take a while for the exam, and the courseware, to catch up.

So in short – study early, study well and use your second shot.

My Favourite Band….

When I was growing up and enjoying being a spotty teenager I could never settle on one favourite band. I finally compromised with two from completely opposite ends of the spectrum.

Dr Feelgood and Blondie

Both these bands gave me several important benefits as a teenager.  First and most important they made each day a little bit easier to manage and left me feeling more productive and well balanced. Secondly they gave me kudos with my peer group for being so hip and trendy and up to date with modern music. Finally they did the job they were supposed to do, made damn good music and kept me entertained.

Let’s zoom forward around 33 years to the modern day. I am now in a position where I have to choose my next favourite band, and this choice is altogether more difficult. The market is full of bands promising me the world, full of hope and anticipation. I can be fitter, sleep better, keep in contact with the world. You name it, these bands can do it.

For several years I have been on the cusp of something big in the personal fitness monitor space. I always held off: as a long-distance runner, I had always relied on my Garmin Forerunner to map out distance, speed and heart rate for marathons and training runs.

Now two unthinkable things have happened: I can no longer run long distances due to doctor’s orders, AND my trusty Garmin has given up the ghost and gone to that gadget drawer in the sky.

Yes, I am in the market for a more consumer-based fitness tracker. But being an IT Pro, a gadget lover and a Technical Evangelist means that I will not put up with just any band. I made enquiries of my friends and colleagues, who use all manner of bands. The one thing that kept coming back to me was that the new Microsoft Band is just super cool, super-efficient and does everything it says on the tin.

The one big drawback is that it is not available in the UK yet and I have no real way of getting one in the short term. Unlike my MVP friends, who all seem to have scored one at the MVP Summit in November last year. One of my MVP friends came back with this quote:

 “There are some things in this world that you just simply don’t appreciate. Especially until you have had a chance to experience the value of that thing and then when you own that thing, you simply can’t imagine what life was like before. This is how I feel about my Microsoft Band”


Powerful claims indeed. This got me thinking: a band that simply does fitness stuff is of little use to me, so what can the Microsoft Band do for me, a humble IT Pro? How can it make my working day shorter and my leisure time longer, and still not lose me my job?

So I am going to put aside all the things that other bands do: sleep tracking, step measuring and so on. I want to know how I can use the band I am obviously going to buy to manage my datacenter.

To do this I need some form of monitoring and alerting software for servers, clients and other general IT stuff. Luckily, with System Center 2012 R2, and specifically System Center Operations Manager (lovingly known as SCOM), I can make sure that anything happening, or likely to happen, that will cause me a headache at work is immediately transmitted to me. I can filter out the various day-to-day events that are not critical and only receive the ones that matter, through the use of VIP lists of email addresses.

So there I am, lying on the beach, wearing my band, watching some Microsoft Virtual Academy training videos on my trusty Lumia 1520 4G smartphone, whilst sipping a tasty beverage.

Suddenly my Microsoft Band vibrates to tell me I have a VIP email from my SCOM server. I glance at the screen and, using Quick Read on the band, I can see that there is a potential issue with one of my Microsoft Azure website front-end VMs, which will also impact one of my back-end on-premises SQL Server VMs. These run the e-commerce system which pays for me to lie on the beach in the first place.


This calls for drastic action. I stop watching the training video and switch to the Remote Desktop app on Windows Phone 8.1. From there I can remediate both virtual machines within 10 minutes, revert to the training video and continue with my Vodka Martini, shaken not stirred.

Not forgetting to ask Cortana to remind me to check the machines again in the morning.

Not exactly James Bond, but you have to admit that the scenario above is a very tempting one indeed.

How long into the future must we wait for such a scenario to become reality?

Well the answer is only until I get my hands on a Microsoft Band.

All of this is available NOW.

I can now roll over and take a look at my Health app on the Lumia (or on Android or iPhone) and make sure that the UV rays are not too high, or that my shares are doing what I want them to, or even that I have burned enough calories this week to have another drink. All from the band, or from the dashboard built into the app.

Decision made, money saved; all I need now is to get hold of that Band.


Whoever said the life of an IT pro was boring?

I just hope the Microsoft Band keeps going as long as my previous favourites have. I saw both Dr. Feelgood and Blondie at the Glastonbury Festival last year.

Unlocking the value of System Center.

I have just returned from another series of UK IT Camps, this time at our London headquarters in Victoria.


They were well attended and it was a pleasure to meet so many @TechNetUK Twitter followers as well as some old friends, MVPs and members of the UK Microsoft Technology Community Council.

So far this year we have taken our Enterprise Mobility and Extending the Datacenter (Azure) IT Camps to a good few locations, including Cardiff, Manchester, Birmingham and London. We are always keen to travel to wherever there is a need, so do get in touch and let us know where you want to see us, just drop us an email. Of course you can sign up for one of them here.

When I deliver the Extending your Datacenter with Microsoft Azure camp I am often asked about the Windows Azure Pack, the integration of System Center and the holistic approach to a hybrid cloud solution. Oh, and since TechEd Europe I am also asked at every camp about the Cloud Platform System, which has been hailed as an on-premises Azure in a box. Having seen this impressive piece of kit in Barcelona, it would be nice to get my hands on this platform that was designed and built to host the Windows Azure Pack.

Since that is not at all likely in the medium to long term, I shall content myself with deepening my knowledge and understanding of how Windows Azure Pack builds on top of Microsoft System Center 2012 R2.

The story behind the camp we run is that an organisation is struggling to contend with the scaling demands of modern business and data requirements. This struggle has led to over-provisioning in their datacenter, with the attendant under-usage and cost impacts: essentially a waste of time, money, space, energy and human resources. The camp walks the delegate through a series of exercises in building out an infrastructure in Azure and then connecting it up to an on-premises network.

The natural extension to this is to use the full force of System Center to manage, control, monitor and automate the workloads that you run.

I have been ‘away’ from the whole suite of products for some time, other than System Center Configuration Manager which we use in the Enterprise Mobility scenario to work with Microsoft Intune to effectively manage mobile devices.

Obviously eyes are now being peeled and horizons scanned for any signs of vNext System Center and the future of the on-premises half of the Cloud OS suite. Until such time as any announcements are made, we can use the current suite of products to do some pretty special things.

Within Microsoft UK we are blessed with a group of Microsoft Most Valuable Professionals (MVPs) who get their hands metaphorically dirty with our products every day and earn their daily bread by helping our customers squeeze every last ounce of performance from their investments. Luckily for me I get a chance to work with them and sometimes to learn a great deal from them.

Just around the corner we have one such opportunity. On Friday 20th March our UK MVPs are running an online event. I for one will be attending and learning how to tame the Windows Azure Pack and implement it in my own home lab environment.

You can register here. It is a whole day of free training and guidance on how the professionals plan, deploy and use all aspects of System Center 2012 R2 to make the difference for the customer.

So if you want to learn about the hybrid management capabilities of System Center and how they deliver on the Cloud Platform vision, take a look at the mind map of the event below, created by System Center MVP Paul Keely.


The integrated Microsoft System Center suite covers the full range of management needs, spanning the physical, virtual and application layer. System Center also allows you to manage heterogeneous infrastructure, providing a single view of the datacenter and the cloud, including management of Linux workloads and multi-hypervisor management.

So it matters not if you run Hyper-V, VMware or a mixture within your estate: System Center can help you.

The agenda is below.


And don’t forget to get hold of the Windows Server 2012 R2 and System Center 2012 R2 products to evaluate, as well as signing up for our next IT Camps.

I look forward to seeing you there. Feel free to drop me an email, follow me on Twitter @edbaker1965, or comment below.

Docker Containers in Windows Server vNext


When I read this news in October 2014 I had to do a double-take. If you are anything like me (i.e. nearly 50 and somewhat techy) then the word Docker conjures up more than one image. The predominant one in my mind is that of Dr. Martens boots, affectionately known in my peer group as Dockers.









Enough reminiscing; the Docker I need to move on to is the containerization system that allows applications to be completely portable.


Here’s the description from the Docker website.

“Docker is an open platform for developers and sysadmins to build, ship, and run distributed applications. Consisting of Docker Engine, a portable, lightweight runtime and packaging tool, and Docker Hub, a cloud service for sharing applications and automating workflows, Docker enables apps to be quickly assembled from components and eliminates the friction between development, QA, and production environments. As a result, IT can ship faster and run the same app, unchanged, on laptops, data center VMs, and any cloud.”
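To make that concrete, here is what the basic workflow looks like on a machine with Docker Engine installed (nginx is just a convenient example image from the public Docker Hub):

```shell
# Pull a pre-built image from Docker Hub
docker pull nginx

# Run it as a detached container, mapping host port 8080 to container port 80
docker run -d --name web -p 8080:80 nginx

# See what is running, then tidy up
docker ps
docker stop web
docker rm web
```

The container starts in seconds rather than minutes, because there is no guest operating system to boot.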

The founder and CTO of Docker, Solomon Hykes, does a pretty good job of explaining the concepts in this short video.

Those of us who routinely use virtual machines to overcome compatibility and deployment difficulties would suggest that this is a good way of handling such things. So how does Docker differ from VMs?

Well, with a VM there is always a guest operating system running on top of the host operating system AND the hypervisor, as shown in this image from the Docker website.


Once deployed, the Docker Engine sits above the host operating system, providing all the necessary muscle to allow the individual Docker applications to maintain resource isolation and allocation much as VMs do, but without the huge overhead.

The closest thing in the Microsoft stack at present is App-V the Application virtualization system that comes as part of MDOP, the Microsoft Desktop Optimization Pack, available only to Windows Enterprise customers.



App-V is currently at version 5.0 and provides a way of streaming apps to a desktop when in online mode, and also of running those apps when isolated from the network or in standalone mode. It is a different system and has provided many years of service. App-V applications, however, will only run on client operating systems or on RDS servers.

So the Windows Server vNext edition will have support for Docker containerization. This shouldn’t come as much of a surprise to anyone who routinely uses Microsoft Azure: since June 2014 Docker has been available in Linux VMs on that platform. All part of Microsoft embracing open technologies.


So back in October 2014, Microsoft and Docker announced that

“Under the terms of the agreement announced today, the Docker Engine open source runtime for building, running and orchestrating containers will work with the next version of Windows Server. The Docker Engine for Windows Server will be developed as a Docker open source project, with Microsoft participating as an active community member. Docker Engine images for Windows Server will be available in the Docker Hub. The Docker Hub will also be integrated directly into Azure so that it is accessible through the Azure Management Portal and Azure Gallery. Microsoft also will be contributing to Docker’s open orchestration application programming interfaces (APIs).”

What makes Docker unique is that instead of maintaining configuration files (as is the case with tools such as Puppet and Chef), developers can create an image of their system and share it directly with their team. Any changes to local environments produce a new image that can be re-shared. Importantly, these images should not be confused with heavyweight virtual machine images, which contain everything needed, including the application, any dependencies and the operating system. In contrast, Docker containers include the application and some libraries, but the OS and common dependencies remain shared assets. Consequently, Docker containers are extremely lightweight in comparison to virtual machine images.
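A hypothetical Dockerfile makes the layering point clear: the base OS is a shared, cached layer pulled from Docker Hub, and only the thin application layers on top belong to the team (app.py and the myteam/myapp name below are made up for illustration):

```dockerfile
# Base OS layer: shared and cached across images, not duplicated per app
FROM ubuntu:14.04

# The app's own dependencies and code are thin layers on top
RUN apt-get update && apt-get install -y python
COPY app.py /opt/app/app.py
CMD ["python", "/opt/app/app.py"]
```

The team shares the result with `docker build -t myteam/myapp .` followed by `docker push myteam/myapp`; colleagues then pull an image that differs from their own only in those top layers.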

This is all relatively old news, but with the push towards DevOps, this news brings the developers and sysadmins much closer together on a platform that they know and love. There are industry rumours that the next release of the Windows Server Technical Preview is scheduled soon and that it will contain this technology. I have no insight into this, but am looking forward to getting my hands on the next release, of course once I have digested the new Windows 10 Technical preview released this week.

You can get the Server Technical Preview here now and the Windows 10 Technical Preview here (soon).

You can find out much much more about all this from my fellow Technical Evangelist Susan Smith when she expands on Docker during day 2 of the upcoming TechDays Online extravaganza – specifically Wednesday 4th February at 1330Hrs.

In the meantime if you want to hear the latest news on Azure and Docker – flip forward to the 16 minute mark of this video where Rick Claus (@RicksterCDN) interviews members of the Azure product team on the Edge Show 132.

Busy and interesting times in the wonderful world of Windows. Watch this space for more.