Azure certification: The view from the street

Stop Press – All change for Azure Certification

Azure Certification Badge

When the original Azure certification and training programme was released back in 2014, the cloud world was a very different place. Adoption was low, cloud skills were lower, and cloud job roles were as yet undefined. The landscape in 2018 is very different.

In 2014 the first two Azure courses and exams were released, focussed on developer and infrastructure skills. Early in 2015 Microsoft introduced the final part of its initial Azure trilogy, the architecture exam, and if you were lucky or clever enough to pass all three exams, then you achieved the heady heights of a Microsoft Certified Solutions Developer. Currently, pass two of the three and you become a Microsoft Certified Solutions Associate. A single additional pass of an elective exam gains you the title of Microsoft Certified Solutions Expert. All these exams sit within the Cloud Platform and Infrastructure track of the Microsoft Certified Solutions Expert programme. Finally, if any of these is the first Microsoft exam you have passed, you also add Microsoft Certified Professional to your list of certifications.

Confused? Stay with me.

This new job role–based programme should clear that up.

I have been a Microsoft Certified Trainer (MCT) for longer than I care to remember, my first Microsoft certification being way back in 1998 when Windows NT4 Server was all the rage. In all that time, the one constant has been the rate of change and complexity in the certification programmes on offer.

Well, the time has come for another change. Stick with me! In mid-June this year Microsoft Principal Psychometrician Liberty Munson announced some radical changes to the Azure certification and training programme.

As an MCT Regional Lead for the UK, I can provide feedback directly to the MCT programme manager and the Microsoft Learning leadership team. Almost without exception, the Regional Leads pleaded for meaningful change rather than change for change’s sake.

At this point I need to come clean. I write exam items for Microsoft certifications and I write courseware for Microsoft Official Courses, I have taken well over 150 technical certification exams, and I regularly teach a wide range of Microsoft technologies. I thought it might be helpful to hear an overview of the new Azure certifications from someone who writes the exams, teaches the courses, and takes the exams (at least, the ones he doesn’t write). I ought to add that I am not a Microsoft employee and am free to pass on constructive criticism in public and private.

If you read the blog post from Liberty referenced above, you will know that there are currently three exams in wide-scale beta-test phase. All three of these exams relate to the newly announced Azure Administrator certification. (Note no MCSA, MCSE, or other titles.) The release of these exams heralds a complete change in the way Microsoft is offering certification and is definitely not change for the sake of change.

So, what, why, and how?

For several years it has been obvious to me that the content of all the Microsoft Azure certification exams does not reflect the actual job that any individual will be asked to perform. Even the new Azure Data Management and Analytics Exams have some pretty broad objective domain lists. The head honchos at Microsoft Learning have been building this new programme for many months and released the first job role certification path at the recent Inspire conference.

So I introduce to you the Azure Administrator certification. Unless you have already certified in Azure Infrastructure (Exam 70-533), two exams are required to reach this level of certification: AZ-100 and AZ-101. On successful completion of both exams, you become a certified Azure Administrator.

Azure 100 Certification

If you have already passed Exam 70-533, you can take a transition exam (AZ-102), but not for much longer—only for a three- or four-month period, after which you would need to start again with AZ-100 and AZ-101.

To get to this point, there have been many sessions to identify exactly what an Azure Administrator does in their day-to-day job. The result is a much more refined and narrow exam objective domain.

I was involved in the authoring of Exam AZ-101, so I haven’t taken that one, but I did take AZ-100 and AZ-102 a couple of days after they were released. I won’t know my results for a few weeks but that wasn’t why I took them.

Prior to taking these new exams, I retook 70-533 and 70-535 to recertify. (After four years, the new Azure looks and works nothing like the old one.) Retaking these exams also helped me to identify the differences in the two new exams.

When I take a certification exam, I always take advantage of the online proctored version. This means I can sit at home and take the exam when and where I want (for the same price).

AZ-100: Microsoft Azure Infrastructure and Deployment focusses on the following major objectives:

  • Manage Azure subscriptions and resources (15-20%)
  • Implement and manage storage (20-25%)
  • Deploy and manage virtual machines (VMs) (20-25%)
  • Configure and manage virtual networks (20-25%)
  • Manage identities (15-20%)

Note how much slimmer this exam is. It drills down into specific, basic tasks necessary to deploy infrastructure in Azure. In addition, the deep dive on Azure AD and how to manage subscriptions and resources provides a rounded set of jobs that an administrator needs to know.

AZ-101: Microsoft Azure Integration and Security covers a number of different technologies and concepts:

  • Evaluate and perform server migration to Azure (15-20%)
  • Implement and manage application services (20-25%)
  • Implement advanced virtual networking (30-35%)
  • Secure identities (25-30%)
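The percentages above are bands rather than exact weights. As a quick sanity check (a sketch, with the band values transcribed from the two lists above), the low and high ends of each exam's bands should bracket 100%:

```python
# Objective-domain weight bands, as (low, high) percentages, transcribed
# from the AZ-100 and AZ-101 lists above.
az100 = [(15, 20), (20, 25), (20, 25), (20, 25), (15, 20)]
az101 = [(15, 20), (20, 25), (30, 35), (25, 30)]

def band_totals(bands):
    """Sum the low ends and the high ends of a list of (low, high) bands."""
    return sum(lo for lo, _ in bands), sum(hi for _, hi in bands)

for name, bands in (("AZ-100", az100), ("AZ-101", az101)):
    lo, hi = band_totals(bands)
    # The real exam weights must total 100%, so 100 must fall inside the band.
    assert lo <= 100 <= hi
    print(f"{name}: objective weights total between {lo}% and {hi}%")
```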

When added together, these sets of skills enable the successful candidate to declare himself or herself a certified Azure Administrator. The difficulty with all cloud-based technologies though is that two days after qualifying, things will have changed. It is a constant quest for current knowledge.

I cannot comment on Exam AZ-101, as I authored some of it, but I can comment on the other two and I found them to be a much more sensible, balanced, and focussed set of questions that have real-world application in a specific job role.

I am looking forward to teaching the AZ-100 syllabus when the courses are released in the autumn (which is “UK-speak” for fall). This change of focus from technology to job role means that it is much easier to decide which course to attend. I have often found the need to customise courses, since a lot of delegates used to say they would never need technology X but hoped to instead cover technology Y.

I look forward to hearing much more about the new certifications and processes at Ignite this year. If you are attending, come and see me at one of my sessions. This year I am speaking on Windows Server 2016 and how to stay relevant in a cloud-based world.

Intune goes to the Cloud

First I should apologise for being dead to the world in blog terms since January, but lots has happened and continues to happen.

First and most importantly your humble correspondent has been awarded the honour of being a Microsoft Most Valuable Professional (MVP) in the Enterprise Mobility arena.

Rather than fill the post with explanations, read about the MVP Programme and my MVP exploits at these links.

Suffice to say it is an honour and a privilege and one I will try to continue to deserve.

So on to what else has been happening.

Well, in November 2016 I left Microsoft as a full-time employee (as reported here) and restarted my old company, Excalibur Services (UK) Ltd. We provide training and IT consultancy.

But what has been happening in the world of Microsoft and Enterprise Mobility?

Plenty, but I will limit this post to the HUGE changes in Microsoft Intune. You will remember that my last few posts have been about how easy it is to buy, configure, and deploy the EM+S suite of products to manage BYOD, CYOD, and other devices for your organization.

Well, now Microsoft has gone one stage further. The Intune console was always a bit clunky and quirky, having grown organically over the years. What's more, it required Silverlight to run, and that technology has been deprecated (Microsoft-speak for binned!).

So a new console was required. Luckily Microsoft already had the perfect answer. Microsoft Azure.

Intune uses Azure Active Directory as its identity management solution and integrates with other services such as Office 365, Dynamics 365, etc. For some time the Intune and Intune App Protection workflows have been available in the Azure portal, but recently they have gone GA (Generally Available) and are no longer a preview version.


Almost all Intune features can now be enabled, configured and monitored from the Azure Portal.


It is even possible to set the Mobile Device Management authority from Azure now (one of the last remaining options that required Silverlight).

Some features, such as policies created in the Silverlight console, will not be visible in Azure, and vice versa.

Why not create yourself a trial tenant and dive in, or schedule a 1:1 demo?

The next post will be a deep dive into some of the new features introduced and the ways of managing them.

Compliance – Enterprise Mobility and Security (EMS) – How to Secure your Devices in 15 minutes (Part 2)

EMS – the Easy Way (continued)

We left off at that part of our story with a fully fledged E3 – Office 365 tenant and an E5 Enterprise Mobility & Security subscription. We also have a custom domain connected to the tenant. Now to enforce device compliance.

The first step in this post is to show the necessary DNS entries to successfully connect all the Office 365 services to your custom domain. The next stage is to examine all the portals you will be using.

I will then walk through the steps needed to configure the Microsoft Intune console for Mobile Device Management (MDM) and finally enrol an iOS client and a Windows 10 client to the Intune tenant for MDM.

DNS for Office 365 and Microsoft Intune

To successfully connect a custom domain to an Azure Active Directory and use the Office 365 applications, several Domain Name System (DNS) records need to be created. This requires either letting Office 365 manage the DNS zone for the domain or someone with the ability to update the DNS records themselves.

The table below shows the types of records that are required. You can see in more detail what they are, and how to set them, here.

Record     | Application / Service           | Purpose
CNAME      | Office 365 Suite                | Identity platform redirection
TXT        | Office 365 Suite                | Domain ownership verification
CNAME      | Exchange Online (Autodiscover)  | Directs Outlook clients to Exchange Online
MX         | Exchange Online                 | Points all email toward the Exchange Online service
SPF (TXT)  | Exchange Online                 | Spam prevention tool
SRV        | Skype for Business              | Required by SfB IM
CNAME      | Skype for Business              | Used by the SfB client to find the SfB online service

Other records are required for Single Sign-On and other federation services (hybrid Exchange Online deployments).
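If you manage DNS yourself, it can help to treat the table above as data. The sketch below is my own illustration (the helper and the zone representation are assumptions, not an official tool): it reports which required record types are still missing from your zone.

```python
# Required DNS record types per service, per the table above.
REQUIRED_RECORDS = {
    ("CNAME", "Office 365 Suite"): "Identity platform redirection",
    ("TXT", "Office 365 Suite"): "Domain ownership verification",
    ("CNAME", "Exchange Online"): "Autodiscover for Outlook clients",
    ("MX", "Exchange Online"): "Routes email to Exchange Online",
    ("TXT", "Exchange Online"): "SPF spam prevention",
    ("SRV", "Skype for Business"): "Required by SfB IM",
    ("CNAME", "Skype for Business"): "Locates the SfB online service",
}

def missing_records(zone_types):
    """Given the set of record types already present in your zone, report
    which required (type, service) pairs are not yet covered."""
    return sorted(
        f"{rtype} ({service}): {purpose}"
        for (rtype, service), purpose in REQUIRED_RECORDS.items()
        if rtype not in zone_types
    )

# Example: a zone that only has the verification TXT and the MX so far.
for item in missing_records({"TXT", "MX"}):
    print(item)
```

A real check would query the zone with a DNS library rather than take a hand-written set, but the bookkeeping is the same.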

A portal, a portal, my kingdom for a portal!

The one thing that is NOT lacking in the Microsoft online services world is a portal. Well, actually, there is one portal that is lacking: there are so many portals in use across Microsoft Azure, Microsoft Intune, Office 365, and Enterprise Mobility & Security that what is missing is a single portal providing access to all your relevant portals by function.

So, if you are listening Microsoft!

This short section is a list of portals, their function and a screenshot to help you identify where you are and what you can do.


The Office 365 Admin Center


All top level functions and access to all other Office 365 service portals.



The Microsoft Intune Portal
requires Silverlight


The setup, configuration and management portal for Microsoft Intune



The Microsoft Azure Portal for Active Directory management (IN PREVIEW)

A preview portal for managing all aspects of Azure Active Directory



The Microsoft Azure Classic Portal for Active Directory management

Azure service manager portal for managing Azure Active Directory



Intune Portal in Azure (Preview)


Mobile Device Management preview in Azure



Intune Portal in Azure for MAM


Mobile Application Management for iOS and Android in Azure Portal



Office 365 Product Admin Centers (Portals)

Each Office 365 service has its own portal or Admin Center, linked from the main Office 365 Admin Portal.



The above is a small sample of the portals available, and as with all cloud services, they will change on a regular basis. You have been warned.

Corporate Branding

Having created the tenant and the custom domain link, the next stage is to add a corporate brand to the applications and sign-in pages of your tenant. This is a simple process which can be carried out in a number of places; I used the Office 365 portal and uploaded a couple of graphics.

Microsoft Intune

The Enterprise Mobility and Security product suite is led by the Microsoft Intune service. This service uses an Azure Active Directory (AAD) to store the users and computers in your organization that you want managed.

MDM Authority

At present Intune can work in two distinct Mobile Device Management (MDM) modes: hybrid or cloud only. The selection of this mode is made before you configure any other settings. The two modes are mutually exclusive and currently very hard to reverse if you change your mind.

The Hybrid solution uses Microsoft System Center Configuration Manager (SCCM) to manage all MDM activities. Whilst this necessarily cannot be as ‘up to date’ as a pure cloud service, the SCCM team have added a number of new features to allow fast service updates.

I do not use SCCM for my deployment and this series of posts will not consider it further.

To set the MDM authority, select the Admin menu of the Intune Portal, then select Mobile Device Management. On this page you can configure the authority to be either Intune or SCCM; the screenshot below shows a tenant configured for Intune MDM.

mdm authority

This completes in seconds and your tenant is now ready to be configured with Compliance Policies, Configuration Policies, Device Groups and User Groups.

To make things simpler later on, before starting on the policies you can connect your tenant to the Exchange Online system. This is done by navigating to the Admin section of the Office 365 Portal, expanding the Microsoft Exchange menu, clicking Set up Exchange Connections, and finally clicking Set up Service to Service Connector. You require admin credentials for an Office 365 account with an Exchange Online service to make this connection (so Office 365 ProPlus, Office 365 Business, and all Office 365 Home subscriptions are not suitable).


Having chosen your Mobile Device Management authority, you can now continue with configuring and deploying Microsoft Intune.

Intune Policies

Microsoft Intune has, at first glance, a plethora of different types of policy to implement. To cut through the confusion, and to keep this post to the steps I actually took in my own deployment, I shall stick to a single compliance policy and two configuration policies, one for iOS and one for Windows 10.


The first policy to create is a compliance policy. I run a small business with fewer than 30 devices. These devices are limited to Windows 10 desktops, laptops, and phones, plus a few iOS devices (iPhones and iPads). If your business is different, you may choose a different plan.

Planning is essential to implement this solution effectively and to prevent duplicate policies or restrictions.

To create a compliance policy, select the Policy icon on the left of the dashboard view.


Then click on the Compliance policies link


Finally click the Add… text at the top of the window


This opens the Create Policy screen. Here you can require certain properties to be present or absent on a device before it is in compliance. You can then set conditional access policies that only allow compliant devices access to apps and data, as well as monitor and remediate devices that may be out of compliance. Check out the docs on compliance policies here. Compliance policies are deployed to a user, so all the user's devices must comply with that policy to gain access to data and apps.

The Create policy screen contains settings such as password length and complexity, device encryption, Windows device health attestation, security settings, operating system version and whether or not the device is jailbroken.

As an example, I have only one compliance policy and it is applied to all employees. It states that all devices must have an xx-character password of a complex nature (security prevents me revealing any more).

Windows devices have to be healthy. To be healthy, a Windows 10 device must have code integrity, BitLocker encryption, and Secure Boot enabled. In addition, the Early Launch Anti-Malware driver must be loaded on desktop devices. Windows devices must also be at a minimum version of Windows 10.

iOS devices cannot be jailbroken.
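Abstracting away the portal, the compliance rules above boil down to a simple predicate per device. The sketch below uses invented field names and an illustrative minimum version — Intune's real policy schema differs:

```python
from dataclasses import dataclass

@dataclass
class Device:
    platform: str            # "windows" or "ios"
    password_ok: bool        # meets the length/complexity requirement
    code_integrity: bool = False
    bitlocker: bool = False
    secure_boot: bool = False
    elam_loaded: bool = False  # Early Launch Anti-Malware driver (desktops)
    os_version: tuple = (0,)
    jailbroken: bool = False
    is_desktop: bool = False

MIN_WINDOWS = (10, 0)  # illustrative "minimum version of Windows 10"

def is_compliant(d: Device) -> bool:
    """Apply the compliance rules described above to a single device."""
    if not d.password_ok:
        return False
    if d.platform == "windows":
        healthy = d.code_integrity and d.bitlocker and d.secure_boot
        if d.is_desktop:
            healthy = healthy and d.elam_loaded
        return healthy and d.os_version >= MIN_WINDOWS
    if d.platform == "ios":
        return not d.jailbroken
    return False  # unknown platforms are non-compliant by default

ipad = Device(platform="ios", password_ok=True, jailbroken=False)
print(is_compliant(ipad))  # True
```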


Configuration policies can be set on devices and/or users. These policies are specific to device types. Having clicked the Configuration policies link and Add… at the top of the screen, you see the Create a New Policy window.


Within each device type listed are a number of policy templates. The iOS choices are below.


I set up a General Configuration policy for both Windows and iOS.

My policy settings cover the use of a password to unlock the device and a whole raft of device settings for screenshots, camera usage, and applications. It is possible to tie down the device in a very restrictive manner if you so choose.

Whereas a compliance policy monitors the device and reports its compliance to conditional access, a configuration policy will actually force a device to comply with the settings laid down within the policy.

When the device is enrolled into Intune, compliance is checked and the configuration is applied (if deployed). If a configuration policy conflicts with a compliance policy, the compliance policy will always win. The most restrictive setting in a compliance policy will also take precedence.
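For "minimum value" style settings, that precedence rule can be sketched numerically: the most restrictive value wins, so a stricter compliance minimum always overrides a laxer configuration one. A toy resolver (the setting names are illustrative, not Intune's):

```python
def effective_minimums(compliance: dict, configuration: dict) -> dict:
    """Resolve conflicting 'minimum value' settings between a compliance
    policy and a configuration policy. The most restrictive (largest)
    minimum wins, which means a stricter compliance policy always
    overrides a laxer configuration policy."""
    merged = dict(configuration)
    for setting, value in compliance.items():
        merged[setting] = max(value, configuration.get(setting, value))
    return merged

compliance = {"min_password_length": 8}
configuration = {"min_password_length": 6, "min_os_build": 14393}
print(effective_minimums(compliance, configuration))
# {'min_password_length': 8, 'min_os_build': 14393}
```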

Device and User Groups

This brings me to groupings. You can create device groups and user groups. I have set up one user group, since the company is small and does not require more. I have also created a Windows Desktop PC group, an iOS Devices group, and two Windows 10 mobile device groups: one for phones and one for Surfaces.

This covers all the devices I am likely to have for a couple of years.

Having created policies, you can then choose to deploy them to users (or users and devices for configuration policies).

All these steps are simple wizard driven actions.

Conditional Access

Two steps remain. The first is to enable conditional access policies for enrolled devices. I chose to enable only the Exchange Online policy, shown here.


I could also have selected Dynamics CRM, Exchange on-premises, SharePoint Online, and Skype for Business Online.

All of these policies, with step-by-step instructions, can be found at the new home of Microsoft documentation. The specific Intune section is here.

Secondly, I need to enrol my client devices with Intune, and there are a number of ways to do that. I will walk you through a Windows 10 desktop and an iOS iPad.

Client Connectivity

Windows Devices

There are a number of ways to enrol a Windows device. If using a Windows version prior to Windows 10, you will need to download the Intune client. I chose to use the Azure AD Premium feature that auto-enrols devices into Intune when you join them to an Azure AD.

You could also use the Company Portal app, or force the join through Windows 10 Settings, as shown below.


iOS devices

To connect an iOS device to Intune, you need to install the Company Portal app and enrol the device. To be able to do this, you need to prepare Intune for iOS device enrolment. This takes the form of an Apple Push Notification (APN) certificate. The process is quick, simple, and free, but does require an Apple ID.

The steps are: from the Admin menu, select iOS and Mac OS X, then select Upload an APNs Certificate.

Here you can download the certificate request from Intune. You then submit that request to the APNs website and download the certificate that is produced. The final step is to upload that certificate back to Intune.

This is acknowledged with a green tick and "ready for enrolment" on the iOS and Mac OS X page.


To enrol the iPad into Intune MDM, simply download the Company Portal app from the App Store and sign in with your Office 365 / Intune / Azure AD credentials. If those credentials relate to a user that is enabled for a compliance policy, on a device type that has a configuration policy, the enrolment will also configure the device, or at least force the user to configure it.

A short while later, the user and their devices appear in the Intune management portal.


Here you can see that one of the users already has 8 devices under management. You can drill down into the user to see all their devices and their state of compliance, and even obtain a full inventory from each device.


Here you can see the hardware details of a Windows 10 Desktop device under management.


and how it conforms (or not) to the policies


The depth and richness of data that Intune can provide is quite staggering.

There are many, many more features, which I will cover in later posts. One of the most exciting and useful is Mobile Application Management without enrolment, which is carried out through the Azure Portal.

Watch this space for the next exciting instalment.

Enterprise Mobility and Security (EMS) –How to Secure your Devices in 15 minutes (Part 1)

EMS – the Easy Way

Since I deployed Microsoft's Enterprise Mobility and Security (EMS) products (and blogged about it), I have been inundated with requests for a how-to guide (well, a few people were curious anyway).

So here it is. This post is part of a series that will show and tell how to sign up for, enable, and use Office 365 and Enterprise Mobility and Security E3. I didn't capture the steps when I first did it (it was literally too fast, and I didn't think about it), so I will sign up for trials of both services and show those. The only steps not shown live will be the linking of my custom domain, which relies on DNS records and does not slow down progress in any other way.

Buy your Licences

The first step is to buy the products. Depending upon your needs, size, and status, the process varies. I am assuming you are a small business like me and can sign up for 5 users at the relevant Microsoft websites.

I will do this through trial sign ups.

First, for later use, make sure you have a real custom domain. Without one you will be tied to an 'onmicrosoft.com' domain, which doesn't look good for your business. You can sign up for a domain almost anywhere; one registrar is prominent in web searches and provides a cheap service, and other registrars are available.

The next step is to sign up for a trial of Office 365 E3. The E3 subscription provides me with all the services I need; make sure you pick the right one for you – it can be confusing, as there are so many to choose from, as the Enterprise Plans and Business Plans tables show.

Sign up here. This will provide a trial period of an Office 365 E3 subscription.


Complete the data entry above and click the arrow to the right of Just one more step.


Again, complete the data entry required. (TIP: keep the company name shown by the arrow very short – it is your login ID until you add a custom domain, and it can get quite long and tedious.)

Once you have completed the "prove you are not a pesky bot" verification by text,



you will have a freshly minted Office 365 subscription, and underpinning it an Azure Active Directory where all your users, groups, and device information is stored (more on that later).


Notice that although you can immediately download one of your 5 allocated copies of Office 2016 ProPlus, some services are not yet complete. But you do have full access to the Admin portal.

Having created an Office 365 tenant, we now need to add the Enterprise Mobility and Security E5 SKU to it as a free trial. (The trial version available is E5, not E3, but this is good, as you get access to all the goodies in the cupboard, such as Cloud App Security.)

To do that, open a browser InPrivate or Incognito (or their equivalent) and log in to your Office 365 tenant as an administrator (the email address you used to create the trial above).

Then add a new tab to the browser, navigate to the EMS trial page, and click Try Now. If all goes well – the Office tenant was created properly and you are logged in correctly – you will see this message:


Click the arrow to the right of Yes, add it to my account and the Enterprise Mobility and Security (EMS) Licences will be added to your tenant Azure Active Directory (AAD). You will see the following screen before the addition is completed.

EMS Sign upEMS Sign up

In the first one click Try Now, and in the second Continue. (Note you have 100 EM&S licences and only 5 for Office 365.)

After a few minutes, open another browser window (NOT Edge, as Edge cannot support Silverlight, and the Intune console currently runs on Silverlight). (UPDATE – there is also a preview of Intune in the Azure Portal.)

In that browser, navigate to the Intune portal and log in as the Office 365 / EMS global administrator (the email address you used earlier).

You will be greeted with the following Intune EMS screen


This is the Microsoft Intune portal, where you carry out all setup and maintenance of your Mobile Device Management (MDM) policies and configuration.

NOTE: this how-to DOES NOT cover using Intune with System Center Configuration Manager (SCCM) – a separate post will deal with that. Note that currently you can use EITHER Intune OR SCCM to do the management in a single tenant, but NOT BOTH.

So, this will have taken you longer than 15 minutes. I have done this many, many times, and when I purchased the licences it probably took me 2 or 3 minutes to get to this point. I then attached my tenant to a custom domain, as that is my corporate identity.

I will show you the steps to do this, but this too will take longer than 15 minutes, as it needs the DNS entries to propagate – and that is if you have control of the DNS zone for your domain.

Adding a Custom Domain

The easiest and best way to achieve this is through the Office 365 Portal; indeed, the portal will guide you through all your setup steps. Navigate to the Admin portal.

Click on Go to setup and follow the instructions for adding a TXT record to your domain's DNS zone. Once this is complete, setup will give you all the other records required for Exchange Online, Lync Online, and the enterprise registration records for Intune.

You can even get Office 365 to manage all your DNS records if you want to.

Part 1 of this marathon is now complete.

What have we achieved?

Well, you have signed up for Intune and Office 365 trials and allocated a custom domain to your tenant.

In the next post we will concentrate on deploying the Intune policies necessary to manage your devices, and on installing the Company Portal application to allow enrolment of iOS devices.

We will also cover some of the other EMS products.

Enterprise Mobility and Security – The easy way as Excalibur rises from the ashes.

Microsoft Enterprise Mobility and Security – The Easy Way

On Friday (4/11/16) I ceased to be a full-time employee of Microsoft UK Ltd. In most circumstances I would have found this a rather disconcerting, even sad, day. But NO!

One thing never changes at Microsoft, and that is the speed of change! "Rapid" hardly describes it. The department I worked for changed its focus in July, and I had been looking around for my next role, either in or out of Microsoft.

The ideal opportunity arose: to deliver IT Camps and other training and event services to Microsoft as a vendor, and to carry on my Microsoft training and consultancy business, which had to stop when I joined full time. And training is my real passion.

Having made the decision, resigned, and agreed a mutually convenient date for this to happen, I had to restart my old business, Excalibur Services (UK) Ltd.

Excalibur Services Deploying Microsoft Enterprise Mobility

Excalibur Logo

The obvious path for me to go in the productivity and security world was Office 365 and the Microsoft Enterprise Mobility and Security portfolio.

I have been using, training and evangelising all of the products for a long, long time. I have been telling people just how easy it is to setup, deploy and use.

But the startling truth hit me this week when I was faced with buying it all and starting from scratch.

I do not have a big company (YET! A growth mind-set is key in all business), so I bought 5 seats of Office 365 E3 and 5 seats of the Enterprise Mobility and Security E3 suite as well.

I have also converted my training and testing lab into a small business data centre but that is for a different post.

Having handed back my excellent Microsoft Lumia 950 phone, I also bought a much smaller iPhone SE and a nice large 4G data plan. The next step was to set up the accounts and the policies to protect my mobile devices.

I allocated a day to do all this.

I hope my estimation skills are honed a little better when estimating work for clients: it took me less than 15 minutes to sign up for the Office 365 subscription and add the Enterprise Mobility and Security licences. In the same time I also set up the compliance and conditional access rules, joined my Windows 10 devices to Azure AD, and added my iOS devices to the party through the Intune Company Portal.

The last job was to ensure that my conditional access and Exchange ActiveSync rules were working.

Enterprise Mobility Deployed

The Company Portal Showing devices deployed

15 minutes start to finish (getting an Apple certificate to allow iOS devices in took the longest).

A short exercise deserves a short post! My shortest ever, even.

The next post will show the steps I took and help you to join the @MSIntune party.

Sign up here.

Azure AD administration reaches the new Portal

What better birthday present could I have from Microsoft than to read Alex Simons' blog post? Azure AD administration is in preview in the new portal. Hooray!

To explain why this is so important and such a big thing, I need to outline a little bit of Microsoft Azure history.

Azure V1.0

First there was Windows Azure (let’s call it V1)

This was based on Azure Service Management (ASM) and used the Azure portal shown below. (This is now known as the Classic or Management Portal.) The URL points to the legacy nature of this portal.

azure ad administration

The services available in this portal grew over time from a few to lots! This portal was designed to be used and accessed by subscription administrators and co-owners. Everyone who had access to the portal had full admin rights over the objects, which in most day-to-day usage is a severe disadvantage. For lower-level management it meant that bulk actions became tricky, and starting and stopping multiple virtual machines (VMs) was also a problem.

Azure V2.0

This led to the introduction of a brave new world of Microsoft Azure (or v2), which is based around Azure Resource Manager (ARM). This is presented to the world through the new portal (which has had names such as Preview Portal, Ibiza Portal, and now just the Azure Portal). The URL is now as shown below.

azure ad administration

This new portal opened up the possibility of full Role-Based Access Control (RBAC) and was aimed at a range of users with various levels of rights and permissions. It also introduced resource groups, which are buckets that can contain multiple objects of multiple classes – VMs, networks, etc. This allows administrators to create a resource group and provide a whole environment for a number of users or developers to make use of. All the resources in that group can be started, stopped, and deleted as one.
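The resource-group idea is easy to picture as a container with bulk lifecycle operations. A toy model (not the Azure SDK — the class and method names here are my own):

```python
class Resource:
    """A single cloud resource of any kind, e.g. a VM or a network."""
    def __init__(self, name, kind):
        self.name, self.kind = name, kind
        self.state = "stopped"

class ResourceGroup:
    """Toy model of an ARM resource group: a bucket of mixed resource
    types that can be started, stopped, or deleted as one unit."""
    def __init__(self, name):
        self.name = name
        self.resources = []

    def add(self, resource):
        self.resources.append(resource)
        return resource

    def start_all(self):
        for r in self.resources:
            r.state = "running"

    def stop_all(self):
        for r in self.resources:
            r.state = "stopped"

    def delete(self):
        # Deleting the group removes every resource it contains.
        self.resources.clear()

dev = ResourceGroup("dev-environment")
dev.add(Resource("web-vm", "VM"))
dev.add(Resource("db-vm", "VM"))
dev.add(Resource("vnet-1", "Network"))
dev.start_all()  # one call brings the whole environment up
print([(r.name, r.state) for r in dev.resources])
```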

azure ad administration



Each view of the portal is limited to one Azure Active Directory, but users can scroll through each Azure AD to which they have access. This new ARM model also allows objects to be created using templates; in this instance Azure uses JSON (JavaScript Object Notation) templates. This allows for scripting and automating just about everything that Azure can do.
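To give a flavour of that scripting story, deploying a JSON template from PowerShell looks something like this (AzureRM module assumed; the file names and group are placeholders of mine):

```powershell
# Deploy an ARM (JSON) template, with its parameter file, into a resource group
New-AzureRmResourceGroupDeployment -ResourceGroupName "DevTeamRG" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterFile ".\azuredeploy.parameters.json"
```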


Due to the nature of the beast, there is currently a need to use both portals to have complete control of your Azure subscription, and one of the most important features missing from the ARM portal was Azure Active Directory administration. As of 12/9/2016 this has been put right. The preview is live.



Azure AD Team news

The blog post from the Azure AD team does an excellent job of outlining some of the features, so it's not worth repeating here.

azure ad administration

The interface is logical and works well for me; it's easy to see many more things at a glance than you could in the classic portal.

Here you can see the steps to configure resetting passwords.

azure ad administration

To me this is the coming of age of the new portal, and very soon I will have no need at all (other than for top-level subscription functions) to visit the classic site.

There is a quick video tutorial here.

Your Azure AD Administration

My top tip is to add the Azure Active Directory administration tab to your permanent menu of items.

To do that, from the main portal dashboard, click More Services and type Azure in the search box at the top. To the right of Azure Active Directory PREVIEW is a star; click the star.

azure ad administration

The Azure Active Directory tab will then be pinned to the dashboard services list.

Over the next few months this will become the home of Azure AD administration, an excellent start with some seriously good features. I will be investigating further and reporting back!

Bare metal Nano

Last time I went through the quick and easy way of creating and connecting to a Nano server VM using a VHD.

This is great for all your virtual needs, but with the ability of a Nano Server deployment image to act as a Hyper-V host, a failover cluster node and a Scale-Out File Server using Storage Spaces, it makes sense to deploy Nano to a bare metal device. This allows far more of the device's resources to be used for services rather than for infrastructure.

Deploying Nano on a bare metal device is a very different ball game; this post is designed to show the process from concept to reality.

Deploying Nano to bare metal

There are three stages:

1. Build a Nano Server image

2. Build a WinPE boot device

3. Deploy the image to the bare metal machine


Build an image

The first stage, as before, is to create a Nano image file. In this case, since we are deploying an image rather than booting from a VHD, we need to create a .WIM (Windows Imaging Format) file. The process is identical to that in the previous post. Below is a PowerShell command that will create a .WIM file and add to the image the ability to act as a Hyper-V host, a failover cluster node, and also the storage features.

New-NanoServerImage -Edition Datacenter -DeploymentType Host -MediaPath e:\ -BasePath c:\nanoserver\base -TargetPath C:\NanoServer\target\bootnano.wim -ComputerName baremetal -Compute -Storage -Defender -Clustering -OEMDrivers

The detailed breakdown of the key switches is this:

-Edition Datacenter  (Options are Standard or Datacenter)

-DeploymentType Host (Options are Host or Guest)

-OEMDrivers (loads bare metal drivers rather than guest VM drivers)


Build a WinPE bootable USB device

Having obtained your .WIM file, it is time to build a WinPE boot device (USB). To do that, download the Windows Assessment and Deployment Kit (ADK) and then run the Deployment and Imaging Tools Environment. This is essentially a command-line environment where the Windows Preinstallation Environment tools run, enabling the installation of WinPE to other devices.

Note: This must be run as an administrator (indicated in the window title bar)


Two commands are required. First, copy the WinPE environment to a staging folder on your hard drive:

copype amd64 c:\WinPE_amd64

Then, to install that on a USB device:

MakeWinPEMedia /UFD c:\WinPE_amd64 E:

(where E: is the drive letter of the USB device; that device is then reformatted and made bootable)

The next step is to copy the newly created .WIM file to the root of this new USB device so it is available for deploying to the hard disk of your new bare metal device.
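The copy itself can be done from the same Deployment and Imaging Tools prompt; a one-liner, assuming the paths from the earlier example and E: as the USB drive:

```cmd
copy C:\NanoServer\target\bootnano.wim E:\
```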


Deploy to Bare metal


The final stage is to deploy this image to the hard disk on the bare metal device. I chose to do this on a Gigabyte BRiX i7 device. This makes screenshots quite difficult, but I did video the process, which will be posted here when it has been tidied up.

The steps to do this are also listed below.

I plugged the USB device in and booted the BRiX from USB (on my device this involves pressing the DEL key until the BIOS screen loads, then setting the boot device to USB).

When the system boots into WinPE from the USB device, the WPEInit command runs and sets the system up for action (this can take a couple of minutes). Once the command prompt appears, there is a series of commands you need to run.

The utility which carries out most of the work is Diskpart.exe (the command line version of the Disk Management program in Windows)

The short version is:

1. Clean the disk (wipe it and remove all partitions)

2. Convert it to a GPT rather than an MBR disk

3. Create three partitions:

EFI (S:, FAT32 system volume, 100MB)

MSR (Microsoft Reserved partition)

Primary (N:, NTFS Windows volume)

The next stage is to run DISM (Deployment Image Servicing and Management), which allows all sorts of magic with online and offline images (WIM files). Here, the DISM command simply applies the image to the primary partition.

The final stage is to update the boot database to point to our image and reboot, using WPEUtil, into a brand spanking new Nano Server.

The long version, with all the commands, is shown below.



Diskpart

Select disk 0

Clean

Convert GPT

Create partition efi size=100

Format quick FS=FAT32 label="System"

Assign letter="s"

Create partition msr size=128

Create partition primary

Format quick FS=NTFS label="NanoServer"

Assign letter="n"

List volume

Exit



Dism.exe /apply-image /imagefile:.\<yourimagefilename>.wim /index:1 /applydir:n:\ 


Bcdboot.exe n:\Windows /s s:


Wpeutil.exe reboot

This is obviously a manual process, which could be automated using WDS and either a VHD or a WIM.

But for a one-off setup it takes around 5 minutes. Not bad for a system capable of Hyper-V, clustering and storage services.

How to deploy Nano Server


So, if you are an IT Pro you will have heard that Microsoft is releasing a new server operating system in the autumn of 2016 which contains something called Nano Server. Cunningly branded as Microsoft Windows Server 2016. No waste of marketing creative brand-design dollars there then, which is a very good thing. Why?

Well, it leaves the whole budget for creating what is, in my opinion, the biggest thing to hit the server operating system world in 20 years. To see what is new in Windows Server 2016, check out this link on TechNet, and if you have more time available there is a series of really good Microsoft Virtual Academy resources here.

Currently the Technical Preview of the new Server OS is at TP release 5. This post will concentrate exclusively on this release.

Over the last few years the feedback about Operating systems in general and server operating systems specifically has been very direct and vocal.

Why do I need to install everything if I only want to run a file server?

Why do I have to reboot so often?

Why is the image so big that it becomes hard to store, move about and install?

The first attempt to resolve these issues was Server Core, released as an installation option in Windows Server 2008: a command-line-only version of the server OS that can be managed remotely and, to a limited degree, from a direct console. Server Core did away with a lot of extraneous 'stuff' and meant fewer updates, smaller images and smaller, quicker installations.

But this was not enough, so the Server product team at Microsoft went back to the drawing board and produced a deployment option now known as Nano Server. This cannot be installed from the DVD or ISO; it has to be installed using PowerShell, with each individual image built up to contain only the roles and services required for that particular server.

If you fancy trying it now and don't want to step through it in this post with me, then head off to the landing page for the Getting Started with Nano Server walkthroughs here. You could also do worse than to head over to Channel 9 here for the Nano Server channel, or here for the Windows Server channel.

If, however, you would like to step through how to deploy this innovative new server, which can run in as little as 150MB of RAM and on a VHD as small as 450MB, then read on.

Deploying a Nano Server

There are several ways to deploy a Nano Server: as a bare metal bootable image, as a boot-to-VHD physical host, or as a VM image. All three require different tasks and commands. This post will concentrate on the VM method, as this is the easiest for a new user to get up to speed with. Future posts will cover the other scenarios.

The first step is to download the Windows Server 2016 TP5 ISO. If you have an MSDN subscription you know where to get it from; if not, you can sign up to evaluate the Preview here. Just sign in with a Microsoft account and download the ISO.


You can also see that a pre-created Nano VHD has been uploaded for you. I would recommend downloading both. This post will not use that VHD, but will show you the steps to go through to create your own. It is, however, useful to have one sitting there ready to use.

The final way to evaluate a Nano server is by deploying one to Microsoft Azure, the Microsoft Public cloud infrastructure. You can sign up for a free trial here. But it is a much better idea to take advantage of the new IT Pro cloud essentials offer here, which gives a longer trial with more money to spend.

So, to be able to follow these steps, the minimum you require is:

  1. The ISO for Windows Server 2016 TP5
  2. A PC or server with an operating system and a hypervisor:
    1. Windows 8, 8.1 or 10 with Hyper-V installed
    2. Windows Server 2008, 2012 or 2016 TP5 with the Hyper-V role installed

The instructions will show screenshots from Hyper-V running on Windows 10. Hyper-V is not installed by default, so you may need to enable it first; you can follow these instructions on Windows 10.

The first step is to mount the TP5 ISO. To do this, copy the ISO to a folder, right-click the file, then click Mount (alternatively, double-clicking the file mounts and opens the ISO; in my case on drive letter R:).
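If you prefer to stay in PowerShell, the ISO can be mounted with a cmdlet instead; a sketch, with the path being my example:

```powershell
# Mount the ISO, then ask which drive letter it was given
Mount-DiskImage -ImagePath "C:\ISO\WindowsServer2016TP5.iso"
(Get-DiskImage -ImagePath "C:\ISO\WindowsServer2016TP5.iso" | Get-Volume).DriveLetter
```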

You will see a folder named NanoServer


Double click that folder to see the contents


The next step is to copy the NanoServerImageGenerator folder to your hard disk.

The result is shown below


Now for the good stuff. We need to run PowerShell to create a VHD containing a Nano Server deployment image. Make sure you run PowerShell as an administrator (right-click the icon and choose Run as administrator).


I would always run the PowerShell ISE (Integrated Scripting Environment) as it provides a better environment to learn and understand the magic that is PowerShell.

If you do not use PowerShell often, you may well have to set the system to allow you to run local scripts that are not signed.

Do that with this command:

Set-ExecutionPolicy RemoteSigned

Type that in the white script area (if you only see blue, click the View menu and make sure the script pane has a tick by it).


Then click the green play symbol or press F5 to run the script pane (to run a single line, select that line and click the run-selection icon, one to the right, or press F8), and accept any warnings or offers to save.

You will not get a result, just a new prompt line (PowerShell does not give any feedback unless it is asked to, or unless there are errors or warnings).

It is good practice to type all your commands in the white script pane, each on a separate line, and run them line by line. This means you can save the commands as a script when you are done, saving you retyping next time and even helping you learn.

Now change directory (folder) to where you placed the ImageGenerator folder; in my case that is the root of the L:\ drive. Then type:

Import-Module .\NanoServerImageGenerator -Verbose

I have added the -Verbose switch so that you can see the commands (or cmdlets) that are imported.


You are now able to use the CmdLets above to create your Nano Server VHD. An awful lot of work goes into creating this VHD, but you can do it with one very simple command.

Type (all on one line)

New-NanoServerImage -Edition Standard -DeploymentType Guest -MediaPath R:\ -BasePath .\Base -TargetPath .\NanoServerVM\NanoServerVM.vhd -ComputerName Nano1 -Compute -Storage -Clustering

Breaking this command down will help, I am sure:

New-NanoServerImage (This calls the CmdLet you imported)

-Edition Standard (This switch sets either standard or datacenter edition (new in TP5))

-DeploymentType Guest (This switch defines whether the VHD is for a physical host or a guest VM)

-MediaPath R:\ (This is where you mounted your Server TP5 ISO file)

-BasePath .\Base (This is where you are going to copy the installation files and packages, .\ signifies the current folder)

-TargetPath .\NanoServerVM\NanoServerVM.vhd (This is the full path including filename to your output VHD or VHDX)

-ComputerName Nano1 (This is the internal computer name)

There are many, many more switches; these allow you to install roles and features into your image. This can be done at creation time, or by using PowerShell or DISM after the image has been built.
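For the after-the-build route, the NanoServerImageGenerator module also provides an Edit-NanoServerImage CmdLet (assuming your TP5 copy of the module includes it; the package name below is an example):

```powershell
# Add the DNS package to an existing Nano image, offline
Edit-NanoServerImage -BasePath .\Base -TargetPath .\NanoServerVM\NanoServerVM.vhd `
    -Packages 'Microsoft-NanoServer-DNS-Package'
```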

I want this VHD to be a file server, a failover cluster node and a Hyper-V host, so I need to add the following switches to the end of the command:

-Storage -Clustering -Compute

Then run that command by selecting it all and pressing F8. This will take some time, and longer the first time you run it, as the CmdLet copies the media files to your hard disk and creates a base Nano VHD as well. Not all these tasks are required for future image creations.

The script will run and will ask you to enter an Administrator password. I recommend P@ssword! so that you don't forget it (as I have written it down). This will be the local administrator password. You can join a Nano Server to an Active Directory domain (although Group Policy will not apply to Nano Servers).


When finished, you will have two new folders as shown


Base contains all the Nano Server software (a .wim file and the packages to install inside it), and the NanoServerVM folder contains your new VHD.

This one is 674MB in size, not bad for a Hyper-V host, file server and failover cluster node.


Now that we have the VHD, we need to use Hyper-V (or PowerShell) to create a VM. I will use PowerShell.

I am going to assume that you already have a virtual switch in Hyper-V Manager (if not, use this link to create one now).

The PowerShell to create a VM for my circumstances is below, change it to suit your drive letters and paths.

New-VM -Name EdsNANO -SwitchName Internet -Path L:\NanoServerVM -VHDPath L:\NanoServerVM\NanoServerVM.vhd

Once this has completed with a result as shown


You can then head on over to Hyper-V Manager, start the VM and connect to it.

The VM takes about 6 seconds to start and connect and will show this screen


Enter the Administrator username and password and you will see the Nano Server Recovery Console, as below.


The Nano Server is not designed to be administered locally; instead it uses any or all of the traditional remote server management tools (and can also be managed from Azure; more on that in a different post).

From this point I suggest you explore the local configuration possibilities.

To connect to the Nano Server remotely we need to get back to PowerShell; we can either use the new Windows 10 PowerShell Direct feature or connect to the IP address of the server.

So from PowerShell use this command

Enter-PSSession -VMName EdsNANO

resulting in this output


You can see from the revised command prompt, that we are now working directly in the EdsNANO VM.

Check this by typing a number of commands to see what you have:



Get-Process
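A few more stock cmdlets are worth trying inside the session, to get a feel for what is (and is not) there:

```powershell
Get-Process                                        # what is actually running
Get-Service | Where-Object Status -eq 'Running'    # running services only
Get-NetIPAddress | Format-Table                    # the server's IP configuration
Get-Command | Measure-Object                       # how many commands are available
```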

You can also look in Hyper-V Manager at the Memory tab of the VM


A total of 220MB for a running server.

And then inspect the virtual disk from the VM settings menu.


A full running server in a virtual hard disk of 606MB.

All pretty staggering.

Future posts will show what we can do with this great new technology, but it is only part of the full plan. Nano Server will NOT run all workloads, and anything that won't run on Nano will run on Server Core.

Happy practicing

PowerShell: My top 10 CmdLets to build a travelling lab

So it doesn’t take much scanning of my Blog and other ramblings to realise I am a great fan of PowerShell. The ever demanding Harry (the TechNet UK Editor) suggested that I ought to write about my top 10 PowerShell tools that I use in creating my lab environments.


Now, the awesomely talented Andrew Fryer has already created a well-read and long-lasting Lab Ops series on his TechNet blog page, so there is no need for me to repeat any of that.

I use a mixture of platforms for my labs and demos.

My on-premises 'datacenter' consists of:

An HP Z600 workstation with twin 2.4GHz Xeon CPUs (E5620), each with four cores and hyper-threading enabled, 24GB of RAM, 1TB of OS disk and 2TB of data disk (with lots of iSCSI available from the various NAS devices around the place). This system runs Windows Server 2012 R2 in a workgroup with Hyper-V.

Plus I have two HP Gen8 MicroServers. These have twin-core, non-hyper-threaded 2.5GHz Pentium G2020T CPUs with 16GB of RAM and 2 x 500GB disks. These systems are Windows Server 2016 TP4 member servers running Hyper-V, IIS, and File and Storage Services.

Cabling is Cat 5e; the switch is a web-managed Gigabit model (changed often, so the make is unimportant).

This allows me to run most scenarios to test and demonstrate Windows Server technologies.

I tend to manage the deployment of these physical hosts in a manual way to test the various setup options.

All of this is stored neatly in a secure underground bunker, aka Ed's garage (see pic).



When I want to test WDS, MDT, the ADK etc., I use my Gigabyte BRIX systems. I have four of these, to test the various storage and clustering options to destruction! (See below.)


This is one machine short of ideal, as I have to use the main DC as a cluster node as well. (Currently Storage Spaces Direct requires 4 nodes.)

My cloud datacenter is, of course, Microsoft Azure; sign up for a free trial here.

But this post is about preparing and deploying a repeatable lab setup on a single smallish portable workstation (Dell Precision M6700); it is not about automating the repeat (that will come later in a separate post, or possibly a number of them).

So, to the PowerShell. NOTE: this post will not give you a block of code you can use, but it will give you the ideas needed to start learning how to do this yourself.

The aim of this post is NOT to show a completely automated lab creation system, but to detail the ten CmdLets I recommend you use when building your own lab.

With the advent of Windows 10 (1511) and Server 2016 Technical Preview 4, Microsoft has introduced an additional virtual switch into the mix: the NAT (Network Address Translation) switch. This allows you to segment your VMs on any IP scheme you choose without interfering with your physical host, while still having external network/internet access.

1. New-VMSwitch is the CmdLet I use to create this new switch. There are many useful blog posts about how to set this up; this one is very easy to follow.

2. New-NetNat is the other CmdLet you need to finish the job, see the screenshot below


This pair of commands creates a new VMSwitch and a host based NetNat object to allow communication through. See the commands required above.

These are fairly innocuous commands but they then allow me to isolate my lab setup completely and place it on any IP scheme without worrying about external connectivity.
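For reference, the pattern behind the screenshots is sketched below; the switch name and the 172.16.0.0/24 range are my own choices, so adjust to suit:

```powershell
# Create an internal switch, give the host an IP on it, then add the NAT object
New-VMSwitch -Name "NATSwitch" -SwitchType Internal
New-NetIPAddress -IPAddress 172.16.0.1 -PrefixLength 24 `
    -InterfaceAlias "vEthernet (NATSwitch)"
New-NetNat -Name "LabNAT" -InternalIPInterfaceAddressPrefix "172.16.0.0/24"
```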

The Hyper-V Virtual Switch Manager GUI hasn't yet caught up with the game; as you can see, it is shown as an internal switch.


But if you head off to Network Connections, you can see a new vEthernet switch has been added, and it has the correct IP settings.


2 down, 8 to go.

Having set up the networking to allow my lab infrastructure to operate in isolation using Network Address Translation I now need to stick some Virtual Machines into the mix.

I need to keep my storage requirements quite low on a travelling lab. I also want to ensure I am using the most up-to-date versions of software (Windows Server 2016 TP4 currently, and Windows 10 Insider builds for clients).

I have 32GB of RAM to play with, so I should be able to host a number of chunky VMs in that memory space using Dynamic Memory, but the first consideration is my virtual disks.

I am choosing to forego performance in the name of space. I have decided to use differencing disks for my servers, with all of the main server instances running a full GUI version of Windows Server 2016 TP4. (I know Jeffrey Snover would not be at all impressed, but when demonstrating it is often useful to show the native tools without needing to fire up a separate VM and the RSAT tools.)

So I need to use some PowerShell to get hold of a VHD image and then make some changes to set it up as a parent disk and create some differencing disks.

My lab will need the following servers

  • Domain Controller (with all the main plumbing roles: AD DS, AD CS, AD FS, DNS, DHCP)
  • File Server
  • Exchange Server
  • SharePoint Server
  • Web Application Proxy

This will then allow me to use this setup as a good ‘On-premises’ demonstration of integration with Azure Active Directory and Office 365 / Enterprise Mobility Suite.

So, to the VHD image I need. The best way to get this is to download the super-cool Convert-WindowsImage script from the TechNet Gallery. This allows you to create a fully sysprepped VHD or VHDX from either a WIM or an ISO.

The Windows Server 2016 TP4 ISO can be obtained either from MSDN, if you have a subscription, or from the Evaluation Center here.

3. .\Convert-WindowsImage.ps1 also has a UI you can call as shown below (but you cannot use all switches and parameters if you use the UI)


The full command you need to run is something like this

.\Convert-WindowsImage.ps1 -SourcePath $imagepath -VHDPath $vhd -VHDFormat VHDX -Edition ServerDataCenter -VHDPartitionStyle GPT -Verbose

There are many more options and uses. I am cheating here, as this is a script and not a single CmdLet, but I set the rules… so I suppose I can break them.

Having created the file and put it where you want it, nice and safe from tampering, you need to make it read-only.

4. Set-ItemProperty is a simple file-system CmdLet that can turn on the IsReadOnly attribute of your VHDX.

So a command such as this would do the job:

Set-ItemProperty -Path $vhd -Name IsReadOnly -Value $true


Note the r in the attributes (Mode) column, signifying read-only.

We now have a base network and switch and a base VHDX to start our super portable lab.

So we need to create five differencing disks from the parent disk we just made read-only.

This is a simple job for the New-VHD CmdLet.

5. New-VHD has a whole load of parameters to assist you to define your disk.

Something like

New-VHD -Path $dcpath -ParentPath $vhd -Differencing

and you would need five of these, with the path variable changing for each VM.
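Rather than typing five near-identical lines, the whole set can be created in a loop; a sketch, with my own server names and paths as placeholders:

```powershell
$vhd = 'L:\Base\WS2016TP4.vhdx'            # the read-only parent disk
$servers = 'DC1','FS1','EX1','SP1','WAP1'
foreach ($name in $servers) {
    # one differencing disk per server, all sharing the same parent
    New-VHD -Path "L:\Lab\$name.vhdx" -ParentPath $vhd -Differencing
}
```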

So we have a network and disks; all we need to do now is create the VMs.

This is a simple job for the New-VM CmdLet.

6. New-VM the standard VM creation CmdLet

New-VM -Name $VMName -Generation 2 -VHDPath $dcpath -Path $dclocation -SwitchName $natswitchname -BootDevice VHD

And again, we would need five of these, one for each VM.

Having now built the vanilla machines we need to add all the roles and features to the servers and then install the software we want on them.

Now if you want to discover all sorts of ways to do this automagically then head on over to @deepfat’s LabOps series of blog posts.

I will just outline the necessary CmdLets to set up your Domain Controller with all the roles and features.

For this reason, if you are not going to be automating all this in one script (as I am not), then my No. 7 pick is the CmdLet Install-WindowsFeature.

7. Install-WindowsFeature allows you to install Windows roles and features, much like Server Manager does. The difference is that by default the CmdLet does not install any management tools, so you will need to include the -IncludeManagementTools parameter in your commands. The really cool thing about this CmdLet is that you can install the features either in a running VM or to an offline VHD.

The CmdLet requires administrative credentials.

To find out what is available, you can run the Get-WindowsFeature CmdLet first to see the Roles and Features on offer.

I will be installing

            • Active Directory Domain Services
            • Active Directory Certificate Services
            • Active Directory Federation Services
            • Domain Name System
            • Dynamic Host Configuration Protocol

So the CmdLet will take a variable containing the names of those features:

$features = 'AD-Certificate', 'AD-Domain-Services', 'ADFS-Federation', 'DHCP', 'DNS'

Install-WindowsFeature -Name $features -Vhd $vhdpath -IncludeAllSubFeature -IncludeManagementTools

If you have a thin image and the features are not available within the image, you need to add a -Source parameter with a path to where the .WIM file has been mounted; this parameter is only used if the features cannot be found in the image itself.

With this achieved, you can safely start your DC VM either with Start-VM or through Hyper-V manager.

Once you have done that, you can use PowerShell Direct on your Windows 10 or Windows Server host machine to Enter-PSSession using the -VMName parameter. This is a new Windows 10 and Server 2016 PowerShell feature and allows connecting to VMs from the host without the need for a working network connection.


8. Enter-PSSession -VMName $VMName is my eighth choice, very useful at all times.


The arrow indicates I am in a session with my TP4HOST2 VM simply by connecting directly using the VMName.

Into the final stretch now; just two to go.

Simply installing the Windows features above only installs the binaries necessary to configure and manage the roles and features. It is still a requirement to actually get those roles and features into a state where they provide a useful service to the network and its users.

I only have 2 CmdLets left if I limit myself to 10, so I choose the AD DS CmdLet that installs a new AD DS forest. This will set up a domain able to imitate any real-world enterprise situation.

9. Install-ADDSForest

And here Windows Server allows you to build this command with absolutely NO experience or knowledge of PowerShell.

Simply run the wizard to promote your newly installed server to a Domain Controller in a new Active Directory forest. At the very last screen before you finish the wizard, you are given the opportunity to save the PowerShell that was generated to actually perform the promotion.

The screenshots below show you how cool this is.



The script above was generated on a Windows Server 2016 TP4 machine, so the Domain and Forest modes are described as Threshold. It can simply be added to a script to make your server a fully functional AD DS Domain Controller.
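For illustration only, the saved script typically looks like the sketch below; the domain name and paths are placeholders of mine, while the Threshold modes are as generated on TP4:

```powershell
Import-Module ADDSDeployment
Install-ADDSForest `
    -CreateDnsDelegation:$false `
    -DatabasePath "C:\Windows\NTDS" `
    -DomainMode "Threshold" `
    -DomainName "lab.local" `
    -DomainNetbiosName "LAB" `
    -ForestMode "Threshold" `
    -InstallDns:$true `
    -LogPath "C:\Windows\NTDS" `
    -NoRebootOnCompletion:$false `
    -SysvolPath "C:\Windows\SYSVOL" `
    -Force:$true
```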

My final CmdLet could be any one of a number of choices, from configuring Certificate Services to DHCP to AD FS. But since all your networks will need all machines to have IP addresses, I will opt for the DHCP CmdLet that authorises the newly installed DHCP server in your newly created Active Directory. AD DS will not allow a DHCP server to be integrated and take advantage of all the great integration services unless it is authorised in the directory.

The CmdLet is

10. Add-DhcpServerInDC. There are many DHCP CmdLets in the module, and we will cover more of those in a different post, as we still have to set up a scope, scope options, reservations and other cool stuff like filters and policies.
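As a taster of what is to come, authorising the server and creating a first scope might look like this (the DNS name and address range are my own examples):

```powershell
# Authorise the DHCP server in Active Directory
Add-DhcpServerInDC -DnsName "dc1.lab.local" -IPAddress 172.16.0.10

# Create and activate a first scope for the lab subnet
Add-DhcpServerv4Scope -Name "Lab clients" -StartRange 172.16.0.100 `
    -EndRange 172.16.0.200 -SubnetMask 255.255.255.0 -State Active
```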


Indeed, as you can see below, there are 121 DHCP commands in the DhcpServer module, so the choice is very wide!


I am sure that any PowerShell guru reading this post will call it lightweight, or half a job. If the aim was to provide a series of chunks of code to allow an automated, repeatable lab to be built by simply passing a few parameters to a function or two, then I would agree.

My aim is to help a newcomer to PowerShell understand that there is so much you can do, and to encourage them to go out and experiment with these 10 CmdLets. There are many online examples of the turnkey solutions I mentioned, but the new user is unlikely to learn unless they at least try to build their own lab, piece by piece.

Enjoy trying. Below are a few resources to help you on your way.

Microsoft Virtual Academy – PowerShell Courses

Channel 9 – PowerShell Resources

Microsoft Learning – PowerShell Course

Microsoft Bashes Windows 10 at //Build

I never thought I would post something with that headline. Yesterday, at Microsoft's annual //Build conference for developers in San Francisco, we were treated to the usual high-quality, glitzy round of announcements and demonstrations.

From new Cortana powers to some great innovative technology helping people who cannot see to understand the environment around them. All this was very much focussed on developers, and on getting all manner of developers to write programs (I still don't like saying apps) for the Windows ecosystem using the Universal Windows Platform. This enables us to use one application on ANY type of Windows 10 device, whether it be a Raspberry Pi, an 80" Surface Hub, or even the purely magical HoloLens (which started shipping yesterday).

Having listened carefully to industry insiders over the last few weeks, the rumour and speculation about what was going to be announced included all sorts of wild things, like PowerShell on Linux.

One thing Terry Myerson did announce is that there will be an Anniversary Update for Windows 10 this summer, and it will be free for all users (all 270 million of them so far).

This update will include a whole host of innovative and quite frankly excellent new features, from more Cortana integration to biometric login to websites using Edge. Imagine that: NO MORE remembering all those Amazon, Netflix or other online site passwords and usernames. Simply swipe your finger or look at the screen and Windows Hello will log you in.

All of this is cool and exciting, but the real biggie is that Microsoft did not announce PowerShell on Linux (yet; who knows if that is in the pipeline).

But they did announce Linux on Windows!

Well, actually, to be more accurate, the Anniversary Update will include a Linux subsystem that allows Linux command-line tools to run natively inside a Windows 10 PC. This sparked a number of Twitter spikes, for and against.

I am confident that the new Microsoft, led by Satya Nadella, is about as open as it's possible to be, both in what we do well and in what goes wrong (see Tay, our new AI bot for Twitter), but more importantly open in the sense that if something is blocking you from moving to the Windows ecosystem as a developer, then it seems we will move heaven and earth to help you.

The Bash shell will be fully functional on Windows 10, to the delight of all those developers who know and love a command language that has been on the scene since 1989. For those interested, the name stands for Bourne-Again SHell (the Bourne shell being the one it replaced).

This will allow many more developers and IT Pros to use commands they know from Linux tools directly on their Windows 10 PC. Read all about it here.

There is a 30 minute recording all about it on Channel 9 here and a great build interview with Channel 9 live here

Scott Hanselman also has a great blog post on the topic here

And if you want the Skinny direct from Ubuntu, Dustin Kirkland from Canonical writes about it here

For real geeks: this is based on Ubuntu 14.04, should be available to Insiders within a couple of weeks, and will then be upgraded to Ubuntu 16.04.

Now, PowerShell fans may argue (and they do) that we don't need Bash, we need a more developed PowerShell. I simply direct those people to Microsoft's mission under Satya:

“Empower every person and every organization on the planet to achieve more”

IT Pros and developers absolutely know what they want, when they want it, and how they want it implemented. If there is a blocker to doing business, and they can carry on using what they are currently using without detriment to themselves or their employer's business, then that is exactly what they will do.

By integrating Bash into Windows, and working with Canonical (who distribute Ubuntu) to make it work, Microsoft has shown that it will consider, and often implement, whatever it takes to help people use the tools they want to use to deliver their hard work on the Windows platform.

Another example of this is the use of GitHub, the open source code repository, to distribute PowerShell module updates.


Yesterday, for the first time, I installed an Azure module (1.3.0) directly from GitHub. Even the source code is there.

Now that is open for you. So if you are a Linux guru and want to start using Bash on Windows, watch this space for the Anniversary update, coming your way soon.


If you aren’t an insider already – sign up here

For more of the announcements from Day 1 of //Build, check here.

Follow the live stream of Build keynotes here