What should I study? Which Exam should I take?

In the TechNet UK office we have an open email address where people can ask us all manner of questions, and even sometimes complain about us or praise us… but mostly we like it when people tell us what they want from us; that way we know our audience is getting what it wants. In this case, information about Azure exams.

The email address is ukitpro@microsoft.com, please use it. Tell us what you want from us!

This post is in answer to a very interesting question, and one I come across all the time in my role as both a Technical Evangelist and a Microsoft Certified Trainer (MCT).

The two questions I want to address today are:

What should I study?

Which exam should I take?

This is in response to one of our IT Pros out there, who asked:

Hi, my name is xxxx and i want ‘CCNA’ OR ‘Microsoft Azure Specialist certifications’ which 1 is best for me please tel me and tell me online certification i want both of one, which is best online university?

There are a number of variables in this question which prevent me giving direct advice to the keen learner.

Everyone I talk to is in a different career stage and on a different career path.

So I will assume that the question is being posed from the perspective of a fairly new entrant to the industry and that the questioner is keen on Infrastructure, networking and cloud services.

So to the certifications.

CCNA, or Cisco Certified Network Associate, is not an easy ask, and requires access either to a test lab of Cisco networking kit such as routers and switches, or to simulation software (both is better). The certification is all about configuring switches and routers, as well as networking fundamentals around TCP/IP v4 and v6, the multi-layered TCP/IP networking model and, right down at the bottom, what makes up a TCP packet.

Cisco have a Networking Academy, and many universities offer this course, including the Open University.

The certification can be obtained in two ways: passing a single big exam, or taking two smaller separate exams. Neither route contains easier questions, but people often run out of time in the larger exam. Passing the first of the two smaller exams earns you the CCENT certification.

The main difference between Microsoft tests and Cisco ones is that once a question has been answered in a Cisco exam you cannot go back.

This certification is widely valued and is very useful for getting a job in a company supporting a physical network.

For study and training you would need to go to the Cisco website.

The second option asked about was an Azure Specialist certification. There are three of these, and if you pass all three then you gain the MCSD, or Microsoft Certified Solutions Developer. (This is the shortest route to that certification, which is the developer equivalent of the MCSE.)


The three Azure specialist certifications are shown above. NOTE: the MCSD, like the MCSE, requires re-certification every three years.

Which one of these you take very much depends upon your skill set and focus at work.

70-532 is the Developer exam.

70-533 is the Infrastructure exam, and

70-534 is the Architect exam, which actually contains bits of each.

From this page you can access the exam objective domains and details of training courses.

If you want less formal training, then you can try the Microsoft Virtual Academy or Channel 9, both of which have a mass of content on the topic. The links above will take you to Azure-specific topics on both platforms. I thoroughly recommend them to you.

Finally, Microsoft run a learning community, borntolearn.mslearn.net, where you can join study groups, hear other people’s stories and generally feel part of the IT Pro and Dev communities studying for the Microsoft certifications.

I moderate a couple of forums there, including the Office 365 study groups.

For any Microsoft Azure exam you will need access to a subscription. These are available in lots of ways.

You can sign up for a free trial, or use your MSDN subscription Azure credits, but make sure when signing up for a subscription that it is one that includes the credits.

Or you can come to one of our Windows or Hybrid Cloud InnovateIT events; register here and you will receive a 30-day free Azure pass.

The bottom line is that you must have a detailed knowledge of all the areas in the exam topic lists.

These are not easy exams. Bear in mind that the content of the exam does not change as often as the Azure services do, so knowledge of the services as they stood a few months ago will help when sitting these.

As for which you take, Cisco or Azure: it depends what job you want or are doing. Personally I would do the Azure ones and then the Cisco one, since the Azure networking knowledge is not directly linked to knowing your Cisco binary ANDing and ORing; Azure takes care of most of the networking for you.

You can sign up for an exam here.

You can download the latest technical certification roadmap here as well.

Good luck

Windows Containers: What they are and how they work


I thought a brand New Year warranted a post on some brand new technology that I predict will have a huge impact on us this year.

The IT industry is a maze of Three Letter Acronyms (TLAs) and buzzwords, which seem to proliferate at an ever-increasing rate. One of the latest ‘fads’, ‘trends’ or buzzwords to have gained a great deal of momentum over the last 12 months is the wonderful world of containers.

Hopefully, if you currently don’t know what they are, or why we should be diving feet first into this new(ish) technology, then after reading this post you should have a clearer idea. Why not test the Microsoft implementation with Windows Server Technical Preview 4 (TP4) (get it here), or on your Windows 10 client Hyper-V using PowerShell by following the steps here.

There are also a bunch of free MVA courses here and to finally quench your thirst for knowledge, Channel 9 has a containers channel here.

So having prepped you for the journey into Containers why not take a quick look at an intro video on Channel 9.

Microsoft has been in partnership with Docker for some time, integrating their container technology into Microsoft Azure and Windows Server 2016. With TP4 Microsoft has also introduced both Windows Containers and Hyper-V Containers, both of which carry out the same function of providing a platform for lightweight, portable applications to be hosted, scaled and reworked in seconds, with differing levels of isolation.

At its most fundamental level, container technology is a way of packing many identical, similar or completely different applications in isolation on the same host (or ported to the cloud).

I know, we can already do that with Hyper-V and other hypervisors. But this technology requires only one base operating system to be in use on the container host; all the containers use links to that operating system. A container host can be a physical host or a virtual machine. The analogy is similar to the way a differencing disk contains only changes to the parent installation: the base OS image is never changed.

Microsoft container technology now provides two base images, Nano Server and Windows Server Core. There are also two types of container, which provide the same functionality but different levels of isolation, as shown below.


A Hyper-V container runs in a very lightweight virtual machine, ensuring that the isolation is total. This allows Microsoft container technology to support a full public multi-tenancy solution.

The clever part of the solution is that it’s quite easy to create, amend, copy and recreate container images, and then deploy them as containers running modern applications, web servers etc. on your server.

Normally my posts include a whole bunch of how-to steps and advice, but all of these are already listed in a step-by-step way on http://aka.ms/windowscontainers so I will limit myself to your first step.

From either a Windows 10 machine running client Hyper-V, or from a Windows Server running Hyper-V, fire up your best friend (PowerShell) in an administrative window. Then enter this one command:

 wget -uri https://aka.ms/tp4/Install-ContainerHost -OutFile C:\Install-ContainerHost.ps1

This downloads a script which will set your system up as a container host, so run this command straight afterwards:

C:\Install-ContainerHost.ps1 -VmName Tp4ContainerHost -WindowsImage ServerDatacenterCore -HyperV

Once the script completes you will have a running container host, ready to start experimenting.

Be warned though, this script takes a long time to run as it downloads the base OS images.
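Once the host is built you can create and enter your first container. The sketch below uses the container cmdlets as they appear in the TP4 preview; being preview bits, the cmdlet names and parameters may well change, and the image and switch names are just examples from a lab setup:

```powershell
# list the base OS images installed by the setup script
Get-ContainerImage

# create and start a container from the Server Core base image
$container = New-Container -Name Demo1 `
    -ContainerImageName WindowsServerCore `
    -SwitchName 'Virtual Switch'
Start-Container -Name Demo1

# step inside the container for an interactive session
Enter-PSSession -ContainerId $container.ContainerId -RunAsAdministrator
```

From inside that session anything you install or change stays in the container, not on the host, which is the whole point of the exercise.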

Happy Containerising!

Devops Practices – Infrastructure as Code

This post is provided by PowerShell MVP Richard Siddaway

One of the biggest issues in IT is the apparent split between development teams and operations teams. I deliberately say apparent because, as with many things in IT, the hype doesn’t always match reality. DevOps is the new IT ‘wunderkind’ that will cure all your IT woes at the wave of a wand. OK, cynical view put aside for a while: the DevOps movement does bring benefits, and in many organizations there is a dichotomy between development, operations and purchasing. The last one isn’t talked about much, but how many times have operations been the last to find out that a new application has been purchased? Oh, and it has to be installed by the end of the week!

One of the biggest stumbling blocks in IT is the movement of applications into production. The development team have an environment where everything works, but when the application is passed over the fence to operations everything falls apart. There are a number of reasons for this, including:

·         Development infrastructure doesn’t match production

·         Development machine configurations don’t match production

·         Load in production greater than development

·         Permissions different in development and production

·         Application configurations aren’t documented

Infrastructure as Code (IaC) is all about applying software development techniques, processes and tools to manage the deployment and configuration of your servers and applications.

Why do you need to do this? And what exactly is IaC?

Let’s start by viewing what it’s not. Since PowerShell’s introduction nine years ago many people have written code to configure their servers. You might use Desired State Configuration:

Configuration ODataSetup {

  param (
    [string]$node
  )

  Node $node {

    WindowsFeature IISbasic {
       Ensure = 'Present'
       Name = 'Web-Server'
    }

    WindowsFeature HTTPtracing {
       Ensure = 'Present'
       Name = 'Web-Http-Tracing'
       DependsOn = '[WindowsFeature]IISbasic'
    }

    WindowsFeature BasicAuth {
       Ensure = 'Present'
       Name = 'Web-Basic-Auth'
       DependsOn = '[WindowsFeature]HTTPtracing'
    }

    WindowsFeature WinAuth {
       Ensure = 'Present'
       Name = 'Web-Windows-Auth'
       DependsOn = '[WindowsFeature]BasicAuth'
    }

    WindowsFeature ManCon {
       Ensure = 'Present'
       Name = 'Web-Mgmt-Console'
       DependsOn = '[WindowsFeature]WinAuth'
    }

    WindowsFeature Modata {
       Ensure = 'Present'
       Name = 'ManagementOdata'
       DependsOn = '[WindowsFeature]ManCon'
    }

    WindowsFeature Desk {
       Ensure = 'Present'
       Name = 'Desktop-Experience'
       DependsOn = '[WindowsFeature]Modata'
    }

    LocalConfigurationManager {
       RebootNodeIfNeeded = $true
    }
  }
}

This is a configuration I used to create a demo machine for a talk on OData that I presented at the PowerShell Summit NA 2015. Its purpose is to ensure that a number of Windows features are installed on the machine, including the web server (IIS), the OData management extensions and the Desktop Experience. The configuration handles the reboot.

The configuration is run:

$node = 'W12R2OD01'

$path = 'C:\Scripts\Mof'

$params = @{
  Node = $node
  OutputPath = $path
}

ODataSetup @params


This creates a MOF file which is pushed to the remote machine to install the configuration:

$cs = New-CimSession -ComputerName $node

Start-DscConfiguration -CimSession $cs -Path $path -Verbose -Wait -Force
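Once the configuration has been applied you can confirm the node really is in the desired state over the same CIM session. A quick sketch, using the DSC cmdlets that ship with WMF 4 and later:

```powershell
# returns True when the node matches the pushed configuration
Test-DscConfiguration -CimSession $cs

# enumerates each resource and its current state on the node
Get-DscConfiguration -CimSession $cs
```

Running these after every change is a cheap sanity check before anything is deployed on top of the server.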


Other configuration tools, such as Puppet, Chef, Ansible or Salt, may be used instead of, or in conjunction with, DSC. You might just use a script to configure parts of your new server:

Get-NetAdapter -Name Ethernet | Rename-NetAdapter -NewName 'LAN'

## set network adapter and IP address
$index = Get-NetAdapter -Name LAN | select -ExpandProperty ifIndex

$ipv4address = ''   # supply the address for the new server
New-NetIPAddress -InterfaceIndex $index -AddressFamily IPv4 -IPAddress $ipv4address -PrefixLength 24

Set-DnsClientServerAddress -InterfaceIndex $index -ServerAddresses ''   # supply your DNS server address
Set-DnsClient -InterfaceIndex $index -ConnectionSpecificSuffix 'manticore.org'

## join domain
$newname = 'NewServer'
$cred = Get-Credential
Add-Computer -Credential $cred -DomainName manticore -NewName $newname -Force -Restart


Or you might use a combination of scripts and configuration.

So, you’ve automated your build process to a greater, or lesser, extent. You have a reproducible process but it’s still not IaC – you’re using software to manage your infrastructure but you need to take a few more steps into the development world.

So what do we need to do?

At the minimum you need to:

·         Apply source control to your infrastructure code

·         Put together a process that takes a new application build, creates the required infrastructure and deploys the application

·         Build tests into your infrastructure code – and keep those tests up to date as you make changes

·         Make sure that infrastructure used for development, test and production is created using the same code

Source control at its simplest is a matter of tracking the changes that are made to your code and who made them, and most importantly it gives you an easy way to roll back changes. Think of the scenario where you have a number of web farms, each supporting an important web application. You need the server configuration to be identical across each farm. Assume that your configuration code is changed by an overzealous junior administrator. No one will know until you need to deploy a new server into the farm to cope with the Christmas rush. Oops. The deployment works, but the difference in configuration brings the application to its knees.

Source control means that you can restrict who can change the code. It means you know that a particular version will work and that you know where it is. One piece of grit removed from the machine and ultimately less work firefighting.
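As a minimal sketch of the idea (the repository, file and tag names here are invented for illustration): put the configuration under Git, tag the version you know works, and an unwanted change becomes visible and reversible in a single command.

```shell
# create a repository for the infrastructure code
mkdir -p infra
git -C infra init -q
git -C infra config user.name "Ops Team"
git -C infra config user.email "ops@example.com"

# commit and tag the known-good web farm configuration
echo "Configuration WebFarm { }" > infra/WebFarm.ps1
git -C infra add WebFarm.ps1
git -C infra commit -q -m "Known-good web farm configuration"
git -C infra tag v1.0

# an overzealous edit...
echo "# untested tweak" >> infra/WebFarm.ps1

# ...shows up immediately, and can be rolled back from the tag
git -C infra diff --stat
git -C infra checkout v1.0 -- WebFarm.ps1
```

The same history tells you who changed what and when, which is exactly the audit trail you want when a farm misbehaves.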

Creating a build process that will create infrastructure and deploy the application is one endpoint of IaC. Your developers commit their changes, and the build process spins up a new server and deploys the application. This means that testing is always performed on a clean machine with a known configuration. It’s possible with a source control product such as Team Foundation Server (TFS) to create a number of builds – development, test and production, for instance – and choose which one is targeted.

Don’t forget the change control! Deployments into development, and possibly test, should be standard changes; moving a new version into production is another matter, and your organisation needs to make a call on how much risk it can accept. There are organisations that roll out several new versions of an application per day – and this includes building new infrastructure for the application each time.

Testing is critical to any build process. You need to develop a set of tests that can be run, preferably automatically, whenever your build process is changed. If you’re using PowerShell for your code you can use the Pester module (it ships in Windows 10 and Server 2016, or can be downloaded from the PowerShell Gallery). A simple example of creating tests with Pester:

function divisiontest {
    param (
        $a,
        $b
    )
    return $a / $b
}

Describe -Tags 'BasicTests' 'divisiontest' {

    It 'divides positive numbers' {
        divisiontest 6 3 | Should Be 2
    }

    It 'divides negative numbers' {
        divisiontest -6 -3 | Should Be 2
    }

    It 'throws error for divide by zero' {
        {divisiontest 6 0} | Should Throw
    }
}
The function divides the first parameter by the second. The tests are wrapped in the Describe block and in this case test for correct division of positive and negative numbers. The third test is to determine if the function correctly throws an error when attempting to divide by zero. The results look like this:

Describing divisiontest

 [+] divides positive numbers 70ms

 [+] divides negative numbers 25ms

 [+] throws error for divide by zero 45ms


When testing your IaC routines you’ll need to look at testing things like installed Windows features and roles, network configuration (IP addresses and subnet masks) or the existence of particular folders (use Test-Path).
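For example, a Pester file for a web server node might look like this sketch; the feature name, adapter alias, folder path and IP address are placeholders for your own values:

```powershell
Describe 'Web server node' {

    It 'has the IIS web server feature installed' {
        (Get-WindowsFeature -Name Web-Server).Installed | Should Be $true
    }

    It 'has the expected IPv4 address on the LAN adapter' {
        (Get-NetIPAddress -InterfaceAlias LAN -AddressFamily IPv4).IPAddress |
            Should Be '10.10.54.100'
    }

    It 'has the application folder' {
        Test-Path 'C:\inetpub\wwwroot\MyApp' | Should Be $true
    }
}
```

Run as part of the build, a file like this catches a mis-built server before any application code ever lands on it.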

The last point, about making sure you use the same code to create the infrastructure for development, test and production, is important as it ensures that you have consistency across the environments. This prevents any assumptions made in development or testing causing problems in production. Also, you now know the configuration you need, its creation has been tested, and you’ll have a smooth roll-out into production (hopefully).

Put all of the above into play and you’ve taken the most important steps into the brave new world of DevOps – you’ve started. Implementing IaC is a great first step, but it will all fall apart on you if you don’t have good change control. I’ve seen many organisations that claim to be using change control but have in reality created a bureaucratic overhead that doesn’t achieve its goals because:

·         It doesn’t have the authority to stop dangerous changes

·         It’s viewed as an overhead rather than a protection

·         Its purpose isn’t understood: it’s viewed as a rubber stamp for change

Make sure your change control process is robust and works. It’s quite telling that in The Phoenix Project (http://www.amazon.co.uk/Phoenix-Project-DevOps-Helping-Business-ebook/dp/B00AZRBLHO/ref=sr_1_1?s=books&ie=UTF8&qid=1450281900&sr=1-1&keywords=phoenix+project) the first thing implemented is a change control process that works. If you’ve not read The Phoenix Project, it’s highly recommended as an introduction to DevOps. The book spends a bit too long on scene setting and problem description as opposed to the solutions, but it’s well worth a read.

Finally, should you use Infrastructure as Code for all applications? Ideally yes, but (strange how there’s always a but) you will derive most benefit from using IaC with applications that change frequently, or are being developed and rolled out in a phased manner. The one-off deployment of a COTS application isn’t going to be the best place to start, though ultimately your IaC processes should support that as well.



Demystifying DevOps Behaviours

In an effort to demystify the new and exciting world of DevOps and DevOps culture, I am hosting a series of guest posts on this blog over the next few months.

Firstly we need to have a look at just what those behaviours are. To do this, I have enlisted the help of a number of Microsoft Most Valuable Professionals (MVPs).

When I ask attendees at our InnovateIT UKITCamps what DevOps is, the answers I get are less than confident and often conflicting, as shown in the Elephant slide below.


The discussions often lead on to what behaviours and practices make up the modern culture of DevOps.

I have listed them below

• Infrastructure as Code (IaC)

• Continuous Integration

• Automated Testing

• Continuous Deployment

• Release Management

• App Performance Monitoring

• Load Testing & Auto-Scale

• Availability Monitoring

• Change/Configuration Management

• Feature Flags

• Automated Environment De-Provisioning

• Self-Service Environments

• Automated Recovery (Rollback & Roll-Forward)

• Hypothesis-Driven Development

• Testing in Production

• Fault Injection

• Usage Monitoring/User Telemetry

Rather than try and address these practices in a single post, I thought it would be really interesting to get the thoughts of some industry experts to help out. What better way of doing that than asking our MVPs to pick a practice and explain it from their own experience?

First up is Richard Siddaway, a PowerShell MVP who blogs here; you can follow him on Twitter here.

You can find his post on Infrastructure as Code here.

The UK IT Camp Roadshow travels to Cardiff and Birmingham.

In December we travel to Cardiff and Birmingham to deliver the next phase in our IT Innovation series of IT Camps.

We have four UK IT Camp events in December.

1st December – Radisson Blu hotel, Cardiff – Azure (Cloud Infrastructure)

2nd December – Radisson Blu hotel, Cardiff – Windows 10 Enterprise

9th December – The Studio, Birmingham – Windows 10 Enterprise

10th December – The Studio, Birmingham – Azure (Cloud Infrastructure)

(Register by clicking the links above)

These are full-day events with at least 60-70% hands-on labs – so bring at least one device that you can use comfortably for the labs all day (12-15 inch screens are good).

If you have two, all the better, as you can have the PDF instructions on a separate screen.

For the Windows events we will cover:

Application Compatibility

Identity and Security

Configuration Management

Modern Provisioning Practices

There is a whole list of demos and labs to do, as well as some top-rate instruction and discussion.

For the Azure events we will cover

Deploying Azure Infrastructure using ARM templates (GitHub, JSON, VS Code)

Designing Azure Compute and Storage for best performance

Designing Azure networking for advanced security

Designing Site Recovery and Migration (ASR)

Designing Identity Solutions (Azure AD)

 You definitely don’t want to miss this. We are covering many new topics, techniques and skills that will be invaluable in this modern cloudy devopsy world we are entering.

Register at the links above.

Come and see us; you won’t regret it.

DevOps – a word or a Brave New World?

  1. DevOps – A Journey into Open Source Tools

Once a year the team of Evangelists I work with gather together and spend two days engaged in a variety of different pursuits. This year we made our way to the Microsoft Research Labs in Cambridge.
Whenever I make the journey to this lab I do so with trepidation. I work in a large team of very clever people; there are many degrees, several doctorates and a number of patent holders amongst them. But when we enter the hallowed ground that is the MSR Lab, we all feel a little awed in the presence of a large number of people who on the surface appear normal, ordinary types, but actually have brains the size of a small planet and, what’s more, use those brains to develop useful technology for us all. These technologies include:

Bing Translator
Clutter (For Office 365 Outlook)
And many more.
So for our humble team to turn up and expect to produce something comparable in the inaugural UK DX DevOps Hackathon was always a stretch too far. One thing was for sure though: the two days were going to be fun, noisy and full of problems just dying to be solved by the wonderful world of DevOps.

First – What is DevOps, why do we need it and how do we use it?

DevOps has a Wikipedia definition of

DevOps is a software development method that emphasizes communication, collaboration (information sharing and web service usage), integration, automation, and measurement of cooperation between software developers and other IT professionals. The method acknowledges the interdependence of software development, quality assurance (QA), and IT operations, and aims to help an organization rapidly produce software products and services and to improve operations performance.

Well that is a mouthful of long words.

I describe DevOps as common sense, and the way that all organisations should always have been managing their business. Rather than spend many of my words (in a self-imposed, but small all the same, word limit) explaining the theory, have a read of my colleague Susan Smith’s blog post on TechNet from January this year.


So what was the aim of the two-day event?

The department I work in is primarily staffed with Developer Evangelists with the aim of assisting Developers to use the Windows platform for their applications. There are a few of us that are IT Infrastructure evangelists and the exercise was devised to skill us up as a team to be able to help our customers to embrace the culture that is DevOps to allow for more agile and robust application releases and development.

Essentially, we were to have an idea, create a solution and make sure the process used a number of the DevOps practices, with the aim of the solution being publicly available in a GitHub repository.


As a team we can then visit customers and help them to embrace the culture and practices in their own organisations.

That was the theory.

The first hurdle was getting to the Lab in Cambridge. If you have ever driven in Cambridge you will understand just how dysfunctional the transport system can be if you are unlucky enough to travel on four wheels. I rode in on my trusty BMW motorcycle, but others used trains, park and ride, and even taxis from our hotel on the outskirts of the town. Huge queues and lots of roadworks led to timings being rather flexible!

Not Cambridge – But it feels like this
After an excellent briefing from our external and very enthusiastic facilitators we got to choose our preferred project and team. This was surprisingly painless and I didn’t suffer the ignominy of being the last to be chosen, but only because each team had to have a mix of Developer and IT Pro evangelists.

I chose to work on a project to help our resident education/gaming evangelist Lee Stott to produce a solution enabling greater use of a student DreamSpark Azure subscription. This would allow students to step through a manual process for the creation of a website and a MySQL (ClearDB managed) database. Once this was done and the lessons had been learned, the student could then take advantage of an automated system to deploy future such solutions, learning the DevOps practices whilst doing so.

(DreamSpark is Microsoft’s free offering to students of all ages, helping them gain access to software and services and to learn to use and develop with the Microsoft platform tools.)


Other projects included a mixture of IoT gadgets from our resident IoT gurus Paul Foster and Bianca Furtuna, and a project to allow easy updating of an e-commerce website by Martin Kearn. The final project was a Microsoft Band integration solution cunningly entitled Band on the Run.

I was slightly nervous working in the company of such awesome developer characters as Martin Beeby, Amy Nicholson and Lee Stott, and this was borne out as the first suggestion in our team huddle came from Martin and went something like this:

“Why don’t we use as many non-Microsoft technologies as are available in delivering this as we can?”

So there we have it, the rest of the two days became filled with acronyms and products I had never heard of before and had almost no likelihood of understanding even if I had heard of them.

So what was our proposed solution, and how did we plan on getting there?

Well, we all (our team consisted of six evangelists) used a Visual Studio Online account to share the team room for our Student Continuous Integration project. We used VSO to manage the tasks and backlog for the project, and also integrated it into our chosen method for team communication, which in turn integrated with Git, VSO and other development tools. This was Slack. I had never heard of Slack, and the phrase “Slack is the new black” was soon the team tagline. In short, it is an instant messaging tool (of which we already have plenty, right?) whose big selling point (it is free) is the quantity and quality of the integration options available for your other tools.

We allocated tasks and did some whiteboarding of the architecture, and since I am a OneNote fan I created a shared OneNote notebook so that the team had somewhere to dump screenshots, links and other great resources, to be able to write this up and prove our DevOps chops to the facilitators at the end!

We then placed all our code, scripts and other interesting things in a public GitHub repository.

This post was designed to be process-based rather than technical, but it will help to explain that we did have a few difficulties due to the limitations placed on the free student Azure account.

A student is given access to the new Azure portal and can only use the Azure Resource Manager features. This means that a number of our chosen methods and solutions simply would not work.

Martin chose to use a whole bunch of open tools to develop, deploy and amend the Website part of our solution.

These included

Gulp described as the streaming build system to automate your workflow

Grunt described as a JavaScript task runner – in one word automation

Yeoman described as the web’s scaffolding tool for modern web apps.

Slack – an instant messenger that integrates with everything.

There are probably many more, but I didn’t pick up on them due to my tasks. I was asked to deliver the PowerShell scripts to enable the automation of the solution. This was a pleasing task without a pleasing outcome: I failed, in the time allocated, to deliver anything that would work within our restricted environment.

At the end of day one we all set off across Cambridge on foot (the quickest way around town, I reckon) to find the allocated restaurant. This was achieved, much like the day’s tasks, almost on time. After a great meal, great company and a comfortable night’s rest, we carried on with our DevOps hacking the next morning.

Our facilitators, Damien and Alex, who had travelled from deepest Europe to help us, were seemingly impressed with our efforts, and that we had come back to have another go! We even started half an hour early.

Sadly, for our team our restricted environment meant that we were not able to win the competition or to reproduce a solution we were proud of. What we did do was fully examine the tools available and the processes we used to manage a short fast project.

Just before lunchtime we had an opportunity to present our solution and listen to the others as well. My initial thoughts were that the four teams had delivered some really great results, all of which are still works in progress but will assist us and other customers to consider the DevOps culture in their future deployments.

I ought to mention that my colleague Andrew Fryer was part of the winning team and has (or soon will have) written up their ‘victorious’ project in detail.

Also expect to see many more DevOps focussed posts here and elsewhere.

Watch this space.

UK IT Camps are back.

After our Summer break the UK IT Pro team are back with a bang.

Next week we are running our first two IT Camps:

3rd November Manchester City Centre

4th November Leeds City Centre

These two camps focus on Microsoft Azure but we have a whole list of exciting things coming up between now and May next year.

So first, what is an IT Camp?

An IT Camp is a free hands-on training experience hosted by our IT Evangelists. These camps run between 0900 and 1700 and are at least 60% hands-on labs.

To take part you need to bring with you a device that can connect to the internet and run a modern browser. It does help if you have local admin privileges but that is not always necessary.

Now, what is the agenda for these camps?

The day starts with coffee and refreshments and launches fairly fast into the content, followed by a series of hands-on labs to demonstrate and test the technologies being taught.

The aim is to give you a better idea of the capabilities of the specific technologies involved.

What is the content?

Between now and Christmas we are covering two main topics:

Microsoft Azure and Windows 10.

Azure – What’s new in Cloud Infrastructure

What’s new in Cloud Infrastructure: Improving Datacenter Flexibility with Microsoft, Open Source and other technologies

Embrace infrastructure in the cloud while maximizing current resources and improving datacenter flexibility to deploy new technologies. Attend this free one-day training event to get the knowledge needed to integrate cloud solutions with an existing on-premises datacenter without sacrificing security, control, reliability and scalability. Experience how advanced Azure infrastructure services enable you to provide reliable data access while maximizing productivity across platforms: Open Source (OSS), Microsoft and others.

Join Microsoft experts and learn to:

* Leverage a public cloud solution to increase reliability of disaster recovery using Azure Site Recovery

* Enhance virtualization performance in the cloud for different workloads

* Enable faster and easier deployment using Azure Resource Manager templates and Git

* Design compute and storage infrastructure to improve performance and enhance security through Azure Networking infrastructure

* Boost secure data access with identity solutions via Azure Active Directory

* Minimize errors and save time with advanced automation using PowerShell and DSC extensions in your infrastructure

Join this IT Innovation Series event for first-hand experiences with real-world scenarios around Microsoft Azure. Browse more events on additional topics.

Audience: IT operations/infrastructure professionals

Prerequisites:  Cloud management, virtualization, security, and storage experience.  An interest in understanding Azure Infrastructure solutions.

Windows 10 – in the Enterprise

What’s new in Windows 10 Enterprise: Increasing Security, Predictability, and Compatibility

Experience the most innovative and reliable Windows yet! Windows 10 brings increased stability and predictability to your organization, while minimizing risk. Attend this free one-day training event to explore new servicing, security, and management features that enable corporate data access across devices and platforms while allowing you to maintain control over those devices.

Join Microsoft experts and learn to: 

  • Help ensure application compatibility with new and legacy LOB apps with Microsoft Edge and IE 11.
  • Implement security and identity capabilities through Microsoft Azure Active Directory, Hello/Passport, Device Guard, Enterprise Data Protection, as well as Multi-Factor Authentication.
  • Get hands-on with Windows as a Service by managing Current Branch and Long-Term Servicing Branch scenarios.
  • Create and configure deployment and management packages using Microsoft Mobile Device Management and Microsoft Intune.

Join this IT Innovation Series event for first-hand experiences with real-world scenarios around Windows 10.

Audience: IT operations/infrastructure professionals

Prerequisites: Experience with security, management and provisioning of Windows devices, apps and users. An interest in the latest innovative capabilities in Windows 10, Azure AD, and Intune.

Register for Manchester here and Leeds here


Cortana in the UK – Get it here!

Having dutifully installed the upgrade to Windows 10, several of my friends and colleagues encountered a small but annoying issue.  Their very own personal assistant was absent without leave.

As shown below.


This short post is a step-by-step guide (with lots of pictures and not many words) to getting Cortana up and running. It is simple, and it is all about language. Cortana is available in quite a few languages, with more coming all the time. It’s no mean feat to create the environment for so many languages for the system to function effectively.

Ian Moulster, a colleague of mine on the Windows team at Microsoft UK, created this video to show us what we need to do; watch it here. There are some detailed steps behind this video, which is why I published this step-by-step guide internally on our Microsoft Yammer group. It seemed like a good idea to get it out into the public domain as well.

So here we go.

By default you may not have the English (United Kingdom) language pack installed, and some of the screenshots you see may be different to mine. Persevere, and if you need any extra advice just drop me a line on Twitter @edbaker1965 or via email at edbaker@microsoft.com.

First, open the new Settings screen (there are a few ways to do this); I press the Windows key and then click the Settings icon, as below.


From there select Time and Language followed by the Region and Language sub menu.

This screenshot is taken from the 64-bit Enterprise ISO, although yours shouldn’t look too different if you used English (United Kingdom) as the language and region.

Click on English (United Kingdom) Language (language pack available) and you get this (even if it doesn’t say language pack available, still click it).












Click Options and you get this


Click Download


Also click download alongside all the possible choices offered. You get this


Ok so I have done all that Ed, so why does Cortana still say this


That’s because you have currently only done half the job! Follow on further down the path to Cortana cuteness.

Installing all these bits will take some time, as shown below.



Click on settings under Speech


Under speech language select the drop down and choose English (United Kingdom)


Go back to the main Region and Language section (it may still be installing – I said it would take some time).


Choose additional date, time and regional settings, you get this (the old control panel dialog)


Select ‘Make this the primary language’ under Windows Display Language. Next you need to log off (I would always reboot at this stage, but technically you just need to log off and back on).

Select Cortana and she MAY still not be there, so choose Settings in the Cortana menu.


Hooray we can turn her on now, so do so!
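If you would rather script these language steps than click through Settings, most of them can be done with the International PowerShell module that ships with Windows 8 and later. This is only a sketch: it covers display language, regional format and home location, the speech language still needs setting in the Settings app, and you still need to log off afterwards.

```powershell
# Sketch: make English (United Kingdom) the user's language and region.
# Run as the user concerned, then log off and back on.
$languages = New-WinUserLanguageList -Language en-GB
Set-WinUserLanguageList -LanguageList $languages -Force

Set-WinUILanguageOverride -Language en-GB   # Windows display language
Set-Culture -CultureInfo en-GB              # regional formats (date, time, currency)
Set-WinHomeLocation -GeoId 242              # 242 = United Kingdom
```

The language pack download itself still happens in the background, exactly as in the screenshots above.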

If you already have a Microsoft account (Hotmail, Live, Outlook) or other such account associated with your Enterprise microsoft.com account, you are now finished and can start teaching Cortana how you speak.

Finally, to use Cortana you must either have a Microsoft account associated with your domain account (if this is an Enterprise domain-joined work machine) or log on with a Microsoft account (live.co.uk, Hotmail etc.).

And if you are down with the kids and already use an Azure AD account to log in, you will still need to associate a Microsoft account to use Cortana.

So dive in and get her going – ask her some interesting questions.

My geek favourite is to check her Klingon is accurate!

How Windows 10 will change your workplace

Bold claim indeed.

But let us just examine what will be arriving in a couple of days, when Microsoft releases Windows 10 into the wild world outside the rarefied existence of the Windows Insider Programme.

Windows 10 has been in Preview with over 5 million insiders for many months and each ‘flight’ has introduced more and more of what will be available within the workplace very soon.

This is probably the most talked about, written about and widely commented on Operating System release ever and for that reason I don’t want to repeat things here.

So I will limit myself to just one radical change that can, and probably will, change the face of your workplace, from the perspective of the IT Professional (that’s you!).

Users will have a great and far more productive experience, and Windows 10 goes a very long way towards Microsoft’s goal of more personal computing, especially as the operating system will be the same from tiny devices right up to the large conferencing system that is Surface Hub.

But the poor soul who has to test, configure and deploy this magical new world for the users is the IT Professional: implementers need to skill up quickly to take advantage of all the new leaps in deployment, management and connectivity that Windows 10 brings.

So for my part I will pick the ability to utilise Azure Active Directory features, with an Azure AD account as your primary logon method, and the new updating and deployment models, or Windows as a service.


Microsoft Azure is the all-encompassing name for Microsoft’s vast network of datacentres that provide public and private services to its customers: Office 365, Intune, Azure IaaS, PaaS and many more, including Azure RemoteApp, which I have written extensively about, and Machine Learning, which is the remit of my colleague and fellow Technical Evangelist Andrew Fryer.

The backbone of this broad range of services is, of course, identity: ensuring security through authentication, authorisation and accounting. This is achieved in Azure by the use of Azure Active Directory. This article is not intended as a lesson in Azure AD, since there are already many resources to assist you with that (MSDN, TechNet, the Azure help files).

Currently Azure AD can connect you to your Microsoft Azure services and other online services as well as provide directory synchronisation and full federation including Single Sign On (SSO) and password write back into your on-premises Active Directory Domain Services (AD DS) Domains.

This is huge. One of the great ways it now manifests itself is in the ability to use these identities as the sole method for connecting to a Windows 10 device.

From the Azure AD section of the management portal it is possible to track usage by user / device / application.


In addition, the Azure AD Premium reports allow in-depth reporting, utilising Microsoft’s Machine Learning expertise and many years of security experience to forewarn you of any potential security lapses or attempts to breach your carefully crafted user security.


To be honest there are so many ways that Windows 10 will change the workplace that I could not cover them all in one post here.

A short list would include Windows Hello, Device Guard, Microsoft Edge, Windows Update for Business, Cortana, the list is long and exciting.

So during the next few weeks I will be dropping the odd post on what makes Windows 10 an absolute must-get for consumers, IT Pros and enterprise users.

This is without doubt the best Windows ever, and it IS free for the first year for Windows 7 and Windows 8.1 users!

Make sure you reserve your copy – Windows 10 arrives on Wednesday 29th July. Quite simply the best, most secure Windows ever!



Azure RemoteApp – the Final Part (3) ….. at last

And Finally…..

Not my final post but the final part in my Azure RemoteApp series. First, many apologies for the delay, the day job is getting hectic. I have been evangelizing, demonstrating and using Azure RemoteApp for almost a year now and like most Azure services this one has developed and blossomed into a seriously powerful Enterprise Mobility Tool.

The first two posts covered why and how you set up and use an Azure RemoteApp collection. This post will cover some recent enhancements and a description of what I think is its most powerful use case.

To use your organisational credentials to authenticate against your on-premises Active Directory, and then to use resources inside your corporate network from a non-domain-joined, possibly non-Windows device, with complete safety and security.

In a recent IT camp at our London Headquarters, Cardinal Place, Victoria, there was an evacuation alarm just as I was about to demonstrate these features. Luckily I had my handy Lumia 1520 phone with me and found a spot to carry on the demonstration outside, which then seamlessly continued on a different device when we regained our cosy auditorium.

So to recap, I introduced Azure RemoteApp in this post and developed the theme by uploading my own image with an installed application in this post.

Now it’s time to finish off. To help set the scene, below is a diagram of the infrastructure I am using.

It is fairly self-explanatory, but in this instance all these servers are IaaS virtual machines running in Microsoft Azure (cunningly masquerading as my on-premises datacentre), as shown below.


So what is different about this RemoteApp collection?

Well, in post no. 2 you learned how to create and upload a new RemoteApp image. The prerequisites are:

Windows Server 2012 R2 image

Remote Desktop Session Host role installed

Required applications installed and updated

Operating system fully updated

VHD, not VHDX (use a dynamic disk)

Sysprep the image

Upload it.
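Assuming you build the template on a Hyper-V host, the last three steps can be scripted. The sketch below uses placeholder file names and a placeholder storage URL, none of which come from the original posts:

```powershell
# Generalise the image by running Sysprep INSIDE the template VM,
# then shut it down before touching the disk from the host.
& "$env:windir\System32\Sysprep\sysprep.exe" /generalize /oobe /shutdown

# On the Hyper-V host: convert the template disk to a dynamic VHD
# (Azure RemoteApp needs VHD, not VHDX).
Convert-VHD -Path 'D:\Templates\RemoteApp.vhdx' `
            -DestinationPath 'D:\Templates\RemoteApp.vhd' -VHDType Dynamic

# Upload the generalised VHD to Azure storage (Azure PowerShell module).
Add-AzureVhd -Destination 'https://mystorage.blob.core.windows.net/vhds/RemoteApp.vhd' `
             -LocalFilePath 'D:\Templates\RemoteApp.vhd'
```

Add-AzureVhd only uploads the used blocks of the dynamic disk, which is why a dynamic VHD is kinder on your upload bandwidth than a fixed one.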


The difference is in how you create the RemoteApp collection. This time we are going to join it to the AD DS domain (in this case corp.ed-baker.co.uk) and use the ‘create with VNet’ option.

There are Basic and Standard plans available, which simply dictate the size and power of the VM that is created. Pricing is available here.

Having created my collection, I have a number of tasks to complete. The collection I am using for this post is joined to the corp.ed-baker.co.uk domain (on-premises AD DS), runs a line-of-business application, and also accesses data stored in an on-premises SQL Server (the AdventureWorks sample database) by using Microsoft Excel.

Configuring this has become much simpler than it used to be; despite Azure RemoteApp being a new service, it has undergone frequent revisions and improvements. The latest and greatest of these are the release of the PowerShell cmdlets and the ability to use a virtual network in Azure that is already present and in use for other functions. Previously you had to create a special RemoteApp network and then join it to other networks; this is no longer required.

As shown below


We are now left with a new but empty collection.


Clicking on the right arrow takes you to the quick start page, where you should start the configuration of your new collection.













It really is as simple as 1,2,3,4 GO!

In my demonstration setup I use a Virtual Network I created in Azure and use for the virtual datacentre shown above.

I made sure the network was large enough for my local and remote users. You really don’t want to run out of IP addresses, which is a real possibility if you have a small network and lots of users. You are also warned about this.


















To link the network you simply complete a short wizard.










The next step is to join a local domain (on-premises AD DS); again this is a simple step of clicking the button and adding the necessary credentials. This user MUST have domain-join rights (computer accounts); it is best practice to create a new user (service account) specifically for this rather than use domain admin credentials.













The organizational unit is an optional field allowing you to join the Remote Desktop Session Host servers to that OU.

Below is a screenshot of the Computers container in corp.ed-baker.co.uk after a while using this collection.


My on-premises servers are all present, as are the many instances of RDSH servers; it would be tidier to create an OU for these and state it in the wizard, so all future RDSH servers are created in that OU.

So we now have an empty collection attached to a network and set to join its instances to my domain. The next step is to link this collection to my uploaded image. For the sake of brevity (this is already a chunky post) I will refer you to post 2 in the series, which shows the process for uploading your own image.

Just remember the prerequisites for the image, such as VHD rather than VHDX, and dynamic rather than fixed size, as well as the details for the operating system and applications above.

Having got this far, you can now start publishing the applications you would like to deploy through this service to your remote users.

There are two ways of publishing programs: by start menu or by path.






The first option lists out all the applications that are available on the start menu (installed) of the image you deployed. The second allows you to choose a program that you have placed in a particular location.



The image I uploaded has the following published programs, a mixture of Path and Start Menu programs.


The final step is to choose the users you want to deploy this service to. Azure RemoteApp requires a default Azure Active Directory to be associated with the service when using a hybrid collection.


To add users you simply enter them line by line.

Or bulk add them using a .CSV file with a single column containing the UPN of the users you wish to add.
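If the users you want already live in an AD group, a short PowerShell sketch can build that CSV for you. The group name here is made up for illustration; the output file is simply one UPN per line:

```powershell
# Sketch: write a single-column CSV of UPNs for the bulk-add dialog.
# 'RemoteApp Users' is an illustrative group name.
Import-Module ActiveDirectory

Get-ADGroupMember -Identity 'RemoteApp Users' |
    Get-ADUser -Properties UserPrincipalName |
    Select-Object -ExpandProperty UserPrincipalName |
    Set-Content -Path '.\remoteapp-users.csv'
```

Run this on a machine with the Active Directory module (RSAT or a domain controller), then upload the resulting file in the portal.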







All the users added MUST be directory-synced users from your on-premises Active Directory, because you have joined this collection to your domain and authentication will take place there. You can see below the failure report when I try to add an Azure AD user not originating or synced from the on-premises domain.






Your collection is now complete. A few pointers here, though, about your environment back on-premises.

The users with Azure RemoteApp access must also have AD DS access to the machines and resources you want them to reach. Also, firewall ports such as 1433 for SQL Server (thank you @deepfat) must be open.
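On Windows 8.1 / Server 2012 R2 and later you can check that reachability with Test-NetConnection; a quick sketch, using an illustrative server name:

```powershell
# Sketch: confirm the session hosts can reach SQL Server on port 1433.
# 'sql01.corp.ed-baker.co.uk' is an illustrative host name.
Test-NetConnection -ComputerName 'sql01.corp.ed-baker.co.uk' -Port 1433 |
    Select-Object ComputerName, RemotePort, TcpTestSucceeded
```

If TcpTestSucceeded comes back False, look at the firewall between the RemoteApp VNet and your on-premises network before blaming the application.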

The dashboard, sessions and scale tabs are all self-explanatory and allow you to manage your collection from now on.

Users require access to the same client as before on any of the supported platforms. (almost all of them)

When you connect with your on-premises users, the login screen will authenticate you against the on-premises Domain Controller.

You then have access to all the published applications and since you are operating from a server joined to the domain and have domain credentials, you can access the internal resources, just as if you were within the corporate network.

The beauty of this is that you aren’t: you could be on any device, and the data never lands on that device; it is stored safely on-premises and in the application in Azure.

You can see below a screenshot of the end result.


For a full recorded demonstration of the user experience for this collection, I shall publish a short video in the next few days.

When Azure RemoteApp was first released in preview last year I was enthusiastic; now that it has become generally available at an affordable price, in different flavours of size and licensing options, and has been updated and refreshed, I firmly believe this is a killer app, and the take-up from customers is proving the point.

Why not set up a 90-day trial now? There are also useful resources in Microsoft Virtual Academy to help you. If you haven’t been there for a while, take a look now – it has a new look and feel to it.

Finally, as you might imagine, I would like to end this series by introducing you to the new PowerShell cmdlets for Azure RemoteApp, as described in this MSDN blog post from March.

This puts the automation of these processes just a PowerShell session away. Happy days.
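As a taster, here is a sketch of what a session might look like. The collection name and file path are illustrative, and cmdlet names changed as the Azure module evolved, so check what your own installation offers with `Get-Command *AzureRemoteApp*` before relying on any of them:

```powershell
# Sketch: inspect and manage a collection from PowerShell.
# Verify cmdlet names in your Azure module: Get-Command *AzureRemoteApp*
Get-AzureRemoteAppCollection

# Publish a program by path and grant a directory-synced user access
# ('EdCollection' and the path are placeholders).
Publish-AzureRemoteAppProgram -CollectionName 'EdCollection' `
    -FileVirtualPath '%SYSTEMDRIVE%\Program Files\MyLOBApp\App.exe'

Add-AzureRemoteAppUser -CollectionName 'EdCollection' -Type OrgId `
    -UserUpn 'anne@corp.ed-baker.co.uk'
```

Everything this series did in the portal, you could now wrap in a script and repeat on demand.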