How to deploy Nano Server


So, if you are an IT Pro you will have heard that Microsoft is releasing a new server operating system in the autumn of 2016, cunningly branded as Microsoft Windows Server 2016. No marketing dollars wasted on creative brand design there, which is a very good thing. Why?

Well, it leaves the whole budget for creating what is, in my opinion, the biggest thing to hit the server operating system world in 20 years. To see what is new in Windows Server 2016 check out this link on TechNet, and if you have more time available there is a series of really good Microsoft Virtual Academy resources here.

Currently the Technical Preview of the new Server OS is at TP release 5. This post will concentrate exclusively on this release.

Over the last few years the feedback about operating systems in general, and server operating systems specifically, has been very direct and vocal.

Why do I need to install everything if I only want to run a file server?

Why do I have to reboot so often?

Why is the image so big that it becomes hard to store, move about and install?

The first attempt to resolve these issues was Server Core, released as an installation option in Windows Server 2008: a command-line-only version of the server OS that can be managed remotely and, to a limited degree, from a direct console. Server Core did away with a lot of extraneous 'stuff', which meant fewer updates, smaller images and smaller, quicker installations.

But this was not enough, and so the Server product team at Microsoft went back to the drawing board and produced a deployment option now known as Nano Server. This cannot be installed from the DVD or ISO; it has to be installed using PowerShell, with each individual image built up to contain only the roles and services that are required for that particular server.

If you fancy trying it now and don't want to step through it in this post with me, then head off to http://aka.ms/nanoserver, the landing page for the Getting Started with Nano Server walkthroughs. You could also do a lot worse than to head over to Channel 9 here for the Nano Server channel, or here for the Windows Server channel.

If, however, you would like to step through how to deploy this innovative new server that can sit in as little as 150MB of RAM and on a VHD as small as 450MB, then read on.

Deploying a Nano Server

There are several ways to deploy a Nano Server: a bare-metal bootable image, a boot-to-VHD physical host, or a VM image. All three require different tasks and commands. This post will concentrate on the VM method, as this is the easiest for a new user to get up to speed with. Future posts will cover the other scenarios.

The first step is to download the Windows Server 2016 TP5 ISO. If you have an MSDN subscription you know where to get it from; if not, you can sign up to evaluate the preview here. Just sign in with a Microsoft Account and download the ISO.

image

You will also see that a pre-created Nano VHD has been uploaded for you. I would recommend downloading both. This post will not use that VHD but will show you the steps to go through to create your own. It is, however, useful to have one sitting there ready to use.

The final way to evaluate a Nano server is by deploying one to Microsoft Azure, the Microsoft Public cloud infrastructure. You can sign up for a free trial here. But it is a much better idea to take advantage of the new IT Pro cloud essentials offer here, which gives a longer trial with more money to spend.

So, to be able to follow these steps, the minimum you require is:

  1. ISO for Windows Server 2016 TP5
  2. A PC or server with an operating system and a hypervisor.
    1. Windows 8, 8.1 or 10 with Hyper-V installed.
    2. Windows Server 2008, 2012 or 2016 TP5 with the Hyper-V role installed.

The instructions I will use will show screenshots from Hyper-V running on Windows 10. Hyper-V does not come installed by default, so you may need to follow these instructions to enable it on Windows 10.
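If you prefer PowerShell, the Hyper-V feature can also be enabled with a single command run from an elevated prompt. A minimal sketch, assuming a Windows 10 machine whose edition supports Hyper-V:

```powershell
# Enable Hyper-V and its management tools on Windows 10
# (requires an elevated prompt and a reboot to finish)
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All
```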

The first step is to mount the TP5 ISO. To do this, copy the ISO to a folder, right-click the file, then click Mount (alternatively, double-clicking the file mounts and opens the ISO; in my case on drive letter R:).
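You can also mount the ISO from PowerShell. The path below is an assumption based on my setup, so adjust it to wherever you saved the download:

```powershell
# Mount the ISO and report the drive letter it was given
Mount-DiskImage -ImagePath 'C:\ISOs\WindowsServer2016TP5.iso'
(Get-DiskImage -ImagePath 'C:\ISOs\WindowsServer2016TP5.iso' | Get-Volume).DriveLetter
```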

You will see a folder named NanoServer

image

Double click that folder to see the contents

image

The next step is to copy the NanoServerImageGenerator folder to your hard disk.
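A hedged one-liner for this copy, assuming the ISO mounted as R: and that you want the folder in the root of C::

```powershell
# Copy the image generator module from the mounted ISO to the local disk
Copy-Item -Path 'R:\NanoServer\NanoServerImageGenerator' -Destination 'C:\NanoServerImageGenerator' -Recurse
```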

The result is shown below

image

Now for the good stuff. We need to run PowerShell to be able to create a VHD containing a Nano Server deployment image. Make sure you run PowerShell as an administrator (right-click the icon and choose Run as administrator).

image

I would always run the PowerShell ISE (Integrated Scripting Environment) as it provides a better environment to learn and understand the magic that is PowerShell.

If you do not use PowerShell often, you may well have to set the system to allow you to run local scripts that are not signed.

Do that with this command.

Type Set-ExecutionPolicy RemoteSigned in the white script area (if you only see blue, click the View menu and make sure Show Script Pane has a tick by it).

image

Then click the green play symbol or press F5 to run the script pane (to run a single line, select that line and click the Run Selection icon, one to the right, or press F8), and accept any warnings or offers to save.

You will not get a result, just a new prompt line (PowerShell does not give any feedback unless it is asked to, or unless there are errors or warnings).

It is good practice to type all your commands in the white script pane, each on a separate line, and run them line by line. This means you can save the commands as a script when you are done, saving you retyping next time and helping you learn.

Now change directory (folder) to where you placed the ImageGenerator folder; in my case that is the root of the L:\ drive. Then type

Import-Module .\NanoServerImageGenerator -Verbose

I have added the Verbose switch so that you can see the commands (or CmdLets) that are imported.

image

You are now able to use the CmdLets above to create your Nano Server VHD. There is an awful lot of work that goes into creating this VHD, but you can do it with one very simple command.

Type (all on one line)

New-NanoServerImage -Edition Standard -DeploymentType Guest -MediaPath R:\ -BasePath .\Base -TargetPath .\NanoServerVM\NanoServerVM.vhd -ComputerName Nano1 -Compute -Storage -Clustering

Breaking this command down will help, I am sure:

New-NanoServerImage (This calls the CmdLet you imported)

-Edition Standard (This switch sets either standard or datacenter edition (new in TP5))

-DeploymentType Guest (This switch defines whether the VHD is for a physical host or a guest VM)

-MediaPath R:\ (This is where you mounted your Server TP5 ISO file)

-BasePath .\Base (This is where you are going to copy the installation files and packages, .\ signifies the current folder)

-TargetPath .\NanoServerVM\NanoServerVM.vhd (This is the full path including filename to your output VHD or VHDX)

-ComputerName Nano1 (This is the internal computer name)

There are many, many more switches; these allow you to install roles and features into your image. This can be done at creation time, or by using PowerShell or DISM after the image has been built.
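As a sketch of the after-the-fact route, the same module provides an Edit-NanoServerImage CmdLet that can add packages to an existing VHD. Assumptions here: the paths match the New-NanoServerImage command above, and the DNS package name is the standard one shipped on the TP5 media.

```powershell
# Add the DNS package to an already-built Nano Server VHD
Edit-NanoServerImage -BasePath .\Base -TargetPath .\NanoServerVM\NanoServerVM.vhd `
    -Packages 'Microsoft-NanoServer-DNS-Package'
```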

I want this VHD to be a file server, a failover cluster node and a Hyper-V host, so I need to add the following switches to the end of the command.

-Storage -Clustering -Compute

Then run that command by selecting it all and pressing F8. This will take some time, and will take longer the first time you run it, as the CmdLet copies the media files to your hard disk and creates a base Nano VHD as well. Not all these tasks are required for future image creations.

The script will run and will ask you to enter an administrator password. I recommend P@ssword! so that you don't forget it (as I have written it down). This will be the local administrator password. You can join a Nano Server to an Active Directory domain (although Group Policy will not be applicable to Nano Servers).

image

When finished, you will have two new folders as shown

image

Base contains all the Nano Server software, a .wim file and the packages to install, and the NanoServerVM folder contains your new VHD.

This one is 674MB in size, not bad for a Hyper-V host, file server and failover cluster node.

image

Now that we have the VHD, we need to use Hyper-V Manager or PowerShell to create a VM. I will use PowerShell.

I am going to assume that you already have a virtual switch in your Hyper-V Manager (if not, use this link to create one now).

The PowerShell to create a VM for my circumstances is below, change it to suit your drive letters and paths.

New-VM -Name EdsNANO -SwitchName Internet -Path L:\NanoServerVM -VHDPath L:\NanoServerVM\NanoServerVM.vhd

Once this has completed with a result as shown

image

You can then head on over to Hyper-V Manager, start the VM and connect to it.
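You can also start and connect from PowerShell. The VM name matches the New-VM command above, and vmconnect is the standard Hyper-V console connection tool:

```powershell
# Start the Nano Server VM and open a console connection to it
Start-VM -Name EdsNANO
vmconnect.exe localhost EdsNANO
```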

The VM takes about 6 seconds to start and connect and will show this screen

image

Enter the administrator username and password and you will see the Nano Server Recovery Console, as below.

image

The Nano Server is not designed to be administered locally, but it can be managed using any or all of the traditional server management tools (and can also be managed from Azure; more on that in a different post).

From this point I suggest you explore the local configuration possibilities.

To connect to the Nano Server remotely we need to get back to PowerShell. We can either use the new Windows 10 PowerShell Direct feature or connect to the IP address of the server.

So from PowerShell use this command

Enter-PSSession -VMName EdsNANO

resulting in this output

image

You can see from the revised command prompt, that we are now working directly in the EdsNANO VM.

Check this by typing a few commands to see what you have:

ipconfig

hostname

Get-Process

You can also look in Hyper-V Manager at the Memory tab of the VM.

image

A total of 220MB for a running server.

And then inspect the virtual disk from the VM settings menu.

image

A full running server in a hard disk of 606MB

All pretty staggering.

Future posts will show what we can do with this great new technology, but it is only part of the full plan. Nano Server will NOT run all workloads, and anything that won't run on Nano will run on Server Core.

Happy practicing

PowerShell: My top 10 CmdLets to build a travelling lab

It doesn't take much scanning of my blog and other ramblings to realise I am a great fan of PowerShell. The ever-demanding Harry (the TechNet UK editor) suggested that I ought to write about the top 10 PowerShell tools that I use in creating my lab environments.


Now, the awesomely talented Andrew Fryer has already created a well-read and long-lasting Lab Ops series on his TechNet blog page, so there is no need for me to repeat any of that.

I use a mixture of platforms for my labs and demos.

My on-premises 'datacenter' consists of:

An HP Z600 Workstation with twin 2.4GHz Xeon E5620 CPUs, each with four cores and hyper-threading enabled, 24GB of RAM, 1TB of OS disk and 2TB of data disk (with lots of iSCSI available from the various NAS devices around the place). This system runs Windows Server 2012 R2 with Hyper-V in a workgroup.

Plus I have two HP Gen 8 MicroServers. These have twin-core, non-hyper-threaded Pentium G2020T CPUs at 2.5GHz, 16GB of RAM and 2 x 500GB disks. These systems are Windows Server 2016 TP4 member servers running Hyper-V, IIS, and File and Storage Services.

Cabling is Cat 5e and the switch is a web-managed gigabit model (changed often, so the make is unimportant).

This allows me to run most scenarios to test and demonstrate Windows Server technologies.

I tend to manage the deployment of these physical hosts in a manual way to test the various setup options.

All of this is stored neatly in a secure underground bunker AKA – Ed’s garage – See pic

image

When I want to test WDS, MDT, ADK etc., I use my Gigabyte BRIX systems. I have four of these, to test the various storage and clustering options to destruction! (See below)

image

This is one machine short of ideal, as I have to use the main DC as a cluster node as well (currently Storage Spaces Direct requires four nodes).

My Cloud data centre is, of course Microsoft Azure, sign up for a free trial here

But this post is about preparing and deploying a repeatable lab setup on a single smallish portable workstation (a Dell Precision M6700). It is not about automating the repeat; that will come later in a separate post, or possibly a number of them.

So, to the PowerShell. Note: this post will not give you a block of code you can use wholesale, but it will give you the ideas needed to start learning how to do this yourself.

The aim of this post is NOT to show a completely automated lab creation system but to detail the ten CmdLets I recommend you to use when building your own lab.

With the advent of Windows 10 (1511) and Server 2016 Technical Preview 4, Microsoft has introduced an additional virtual switch into the mix: the NAT (Network Address Translation) switch. This will allow you to segment your VMs on any IP scheme you choose without interfering with your physical host, while still having external network / internet access.

1. New-VMSwitch is the CmdLet I use to create this new switch. There are many useful blog posts about how to set this up; this one is very easy to follow.

2. New-NetNat is the other CmdLet you need to finish the job, see the screenshot below

image

This pair of commands creates a new VM switch and a host-based NetNat object to allow communication through it; see the commands in the screenshot above.
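A hedged reconstruction of that setup, assuming an internal switch named NATSwitch and a 192.168.100.0/24 lab subnet (both names and addresses are my choices; pick your own):

```powershell
# Create the internal switch, give the host an IP on it, then add the NAT object
New-VMSwitch -Name 'NATSwitch' -SwitchType Internal
New-NetIPAddress -IPAddress 192.168.100.1 -PrefixLength 24 -InterfaceAlias 'vEthernet (NATSwitch)'
New-NetNat -Name 'LabNAT' -InternalIPInterfaceAddressPrefix '192.168.100.0/24'
```

The host's IP on the switch (192.168.100.1 here) becomes the default gateway you give your lab VMs.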

These are fairly innocuous commands but they then allow me to isolate my lab setup completely and place it on any IP scheme without worrying about external connectivity.

The Hyper-V Virtual Switch Manager GUI hasn't yet caught up with the game; as you can see, the new switch is shown as an internal switch.

image

But if you head off to Network Connections you can see a new vEthernet switch has been added, and it has the correct IP settings.

image

Two down, eight to go.

Having set up the networking to allow my lab infrastructure to operate in isolation using Network Address Translation I now need to stick some Virtual Machines into the mix.

I need to keep my storage requirements quite low on a travelling lab. I also want to ensure I am using the most up-to-date versions of software (Windows Server 2016 TP4 currently, and Windows 10 Insider builds for clients).

I have 32GB of RAM to play with, so I should be able to host a number of chunky VMs in that memory space using Dynamic Memory, but the first consideration is my virtual disks.

I am choosing to forego performance in the name of space. I have decided to use differencing disks for my servers, with all of the main server instances running a full GUI version of Windows Server 2016 TP4. (I know Jeffrey Snover would not be at all impressed, but when demonstrating it is often useful to show the native tools without needing to fire up a separate VM and the RSAT tools.)

So I need to use some PowerShell to get hold of a VHD image and then make some changes to set it up as a parent disk and create some differencing disks.

My lab will need the following servers

  • Domain Controller (with all the main plumbing roles: AD DS, AD CS, AD FS, DNS, DHCP)
  • File Server
  • Exchange Server
  • SharePoint Server
  • Web Application Proxy

This will then allow me to use this setup as a good ‘On-premises’ demonstration of integration with Azure Active Directory and Office 365 / Enterprise Mobility Suite.

So to the VHD image I need. The best way to do this is to download the super-cool Convert-WindowsImage script from the TechNet gallery. This allows you to create a fully sysprepped VHD or VHDX from either a WIM or an ISO.

The  Windows Server 2016 TP4 ISO can be obtained either from MSDN if you have a subscription, or from the evaluation centre here

3. .\Convert-WindowsImage.ps1 also has a UI you can call as shown below (but you cannot use all switches and parameters if you use the UI)

image

The full command you need to run is something like this

.\Convert-WindowsImage.ps1 -SourcePath $imagepath -VHDPath $vhd -VHDFormat VHDX -Edition ServerDataCenter -VHDPartitionStyle GPT -Verbose

there are many more options and uses. I am cheating here as this is a script and not a single CmdLet, but I set the rules…so I suppose I can break them.

Having created the file and put it where you want it, nice and safe from tampering, you need to make it read-only.

4. Set-ItemProperty is a simple file-system CmdLet that can turn on the IsReadOnly attribute of your VHDX.

So a command such as

Set-ItemProperty -Path $vhd -Name IsReadOnly -Value $true

would do the job.

image

Note the r in the attributes (Mode) column signifies read only.

We now have a base network and switch and a base VHDX to start our super portable lab.

So we need to create five differencing disks from the Parent disk we just made read only.

This is a simple New-VHD CmdLet

5. New-VHD has a whole load of parameters to assist you to define your disk.

Something like

New-VHD -Path $dcpath -ParentPath $vhd -Differencing

and you would need five of these with the path variable changing for each VM
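Sketching that out for all five servers (the server names and the L:\Lab path are my assumptions, not a prescription):

```powershell
# One differencing disk per lab server, all sharing the read-only parent
$vhd = 'L:\Lab\Base\WS2016TP4.vhdx'   # the parent disk created earlier
foreach ($server in 'DC1','FS1','EX1','SP1','WAP1') {
    New-VHD -Path "L:\Lab\$server.vhdx" -ParentPath $vhd -Differencing
}
```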

So we have a network and disks; all we need to do now is create the VMs.

This is a simple job for the New-VM CmdLet.

6. New-VM the standard VM creation CmdLet

New-VM -Name $VMName -Generation 2 -VHDPath $dcpath -Path $dclocation -SwitchName $natswitchname -BootDevice VHD

And again, we would need five of these, one for each VM.
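Again as a sketch, under the same naming assumptions as the differencing disks above:

```powershell
# Create a generation 2 VM for each differencing disk, attached to the NAT switch
foreach ($server in 'DC1','FS1','EX1','SP1','WAP1') {
    New-VM -Name $server -Generation 2 -VHDPath "L:\Lab\$server.vhdx" `
        -Path 'L:\Lab\VMs' -SwitchName 'NATSwitch' -BootDevice VHD
}
```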

Having now built the vanilla machines we need to add all the roles and features to the servers and then install the software we want on them.

Now if you want to discover all sorts of ways to do this automagically then head on over to @deepfat’s LabOps series of blog posts.

I will just outline the necessary CmdLets to set up your Domain Controller with all the roles and features.

For this reason, if you are not going to automate all this in one script (as I am not), my number 7 pick is the CmdLet Install-WindowsFeature.

7. Install-WindowsFeature allows you to install Windows roles and features, much like Server Manager does. The difference is that by default the CmdLet does not install any management tools, so you will need to add the -IncludeManagementTools parameter to your commands. The really cool thing about this CmdLet is that you can install the features either in a running VM or to an offline VHD.

The CmdLet requires administrative credentials.

To find out what is available, run the Get-WindowsFeature CmdLet first to see the roles and features you can install.

I will be installing

            • Active Directory Domain Services
            • Active Directory Certificate Services
            • Active Directory Federation Services
            • Domain Naming Service
            • Dynamic Host Configuration Protocol

So we put the names of those features in a variable and pass it to the CmdLet:

$features = 'AD-Certificate', 'AD-Domain-Services', 'ADFS-Federation', 'DHCP', 'DNS'

Install-WindowsFeature -Name $features -Vhd $vhdpath -IncludeAllSubFeature -IncludeManagementTools

If you have a thin image and the features are not available within the image, you need to add a -Source parameter with a path to where the .WIM file has been mounted. This parameter is only used if the features cannot be found in the image itself.

With this achieved, you can safely start your DC VM, either with Start-VM or through Hyper-V Manager.

Once you have done that, you can use PowerShell Direct on your Windows 10 or Windows Server host machine to Enter-PSSession using the -VMName parameter. This is a new Windows 10 and Server 2016 PowerShell feature that allows connecting to VMs from the host without the need for a working network connection.

image

8. Enter-PSSession -VMName $VMName is my eighth choice, very useful at all times.

image

The arrow indicates I am in a session with my TP4HOST2 VM simply by connecting directly using the VMName.

Into the final stretch now; just two to go.

Simply installing the Windows features above only installs the binaries necessary to be able to configure and manage the roles and features. It is still a requirement to actually get those roles and features into a state where they provide a useful service to the network and users.

I only have two CmdLets left if I limit myself to 10, so I choose the AD DS CmdLet that installs a new AD DS forest. This will set up a domain able to imitate any real-world enterprise situation.

9. Install-ADDSForest

and here Windows Server allows you to build this command up with absolutely NO experience or knowledge of PowerShell.

Simply run the wizard to promote your newly installed server to a domain controller in a new Active Directory forest. At the very last screen, before you finish the wizard, you are given the opportunity to save the PowerShell that was generated to actually perform the promotion.

The screenshots below show you how cool this is.

image

image

The script above was generated on a Windows Server 2016 TP4 machine, so the Domain and Forest modes are described as Threshold. It can simply be added to a script to make your server a fully functional AD DS Domain Controller.
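For reference, the wizard-generated script typically looks something like the sketch below. The domain name is made up, and I have trimmed the optional path parameters; treat this as illustrative rather than a copy-paste promotion script:

```powershell
# Promote this server to the first DC of a new forest (illustrative values)
Import-Module ADDSDeployment
Install-ADDSForest `
    -DomainName 'lab.local' `
    -DomainNetbiosName 'LAB' `
    -InstallDns:$true `
    -SafeModeAdministratorPassword (Read-Host -AsSecureString 'DSRM password') `
    -Force:$true
```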

My final CmdLet could be any one of a number of choices, from configuring Certificate Services to DHCP to AD FS. But since all your networks will need all machines to have IP addresses, I will opt for the DHCP CmdLet that authorises the newly installed DHCP server in your newly created Active Directory. AD DS will not allow a DHCP server to serve the domain and take advantage of all the great integration services unless it is authorised in the directory.

The CmdLet is

10. Add-DhcpServerInDC. There are many DHCP CmdLets in the module, and we will cover more of those in a different post, as we still have to set up a scope, scope options, reservations and other cool stuff like filters and policies.
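A minimal, hedged example; the DNS name and IP address are placeholders for your own new DC:

```powershell
# Authorise the DHCP server in Active Directory so it can begin serving leases
Add-DhcpServerInDC -DnsName 'dc1.lab.local' -IPAddress 192.168.100.10
```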

image

Indeed, as you can see below, there are 121 commands in the DhcpServer module, so the choice is very wide!

image

I am sure that any PowerShell guru reading this post will call it lightweight, or half a job. If the aim was to provide a series of chunks of code to allow an automated, repeatable lab to be built by simply passing a few parameters to a function or two, then I would agree.

My aim is to help a newcomer to PowerShell understand that there is so much you can do, and to encourage them to go out and experiment with these 10 CmdLets. There are many online examples of the turnkey solutions I mentioned, but a new user is unlikely to learn unless they at least try to build their own lab, piece by piece.

Enjoy trying. Below are a few resources to help you on your way.

Microsoft Virtual Academy – PowerShell Courses

Channel 9 – PowerShell Resources

Microsoft Learning – PowerShell Course

Microsoft Bashes Windows 10 at //Build

I never thought I would post something with that headline. Yesterday at Microsoft’s annual //Build conference for Developers in San Francisco we were treated to the usual high quality glitzy round of announcements and demonstrations.

From new Cortana powers to some great innovative technology helping people who cannot see to understand the environment around them. All of this was very much focussed on developers, and on getting all manner of developers to write programmes (I still don't like saying apps) for the Windows ecosystem using the Universal Windows Platform. This enables us to use one application on ANY type of Windows 10 device, whether it be a Raspberry Pi, an 84” Surface Hub, or even the purely magical HoloLens (which started shipping yesterday).

Having listened carefully to industry insiders over the last few weeks, the rumour and speculation about what was going to be announced included all sorts of wild things, like PowerShell on Linux.

One thing Terry Myerson did announce is that there will be an Anniversary update for Windows 10 this summer and it will be free for all users (all 270 Million of them so far).

This update will include a whole host of innovative and quite frankly excellent new features, from more Cortana integration to biometric login to websites using Edge. Imagine that: NO MORE remembering all those Amazon, Netflix or other online usernames and passwords. Simply swipe your finger or look at the screen and Windows Hello will log you in.

All of this is cool and exciting but the real biggie is that Microsoft did not announce PowerShell on Linux (yet, who knows if that is in the pipeline).

But they did announce Linux on Windows!

Well, to be more accurate, the Anniversary Update will include a Linux subsystem that will allow Linux command-line tools to run natively inside a Windows 10 PC. This sparked a number of Twitter spikes, both for and against.

I am confident that the new Microsoft, led by Satya Nadella, is about as open as it is possible to be, both in what we do well and in what goes wrong (see Tay, our new AI bot for Twitter). More importantly, it is open in the sense that if something is blocking you from moving to the Windows ecosystem as a developer, it seems we will move heaven and earth to help you.

The Bash shell will be fully functional on Windows 10, serving all those developers who know and love a command language that has been on the scene since 1989. For those interested, the name stands for Bourne-Again SHell (Bourne was the shell it replaced).

This will allow many more developers and IT Pros to use commands they know from Linux tools directly on their Windows 10 PC. Read all about it here.

There is a 30 minute recording all about it on Channel 9 here and a great build interview with Channel 9 live here

Scott Hanselman also has a great blog post on the topic here

And if you want the Skinny direct from Ubuntu, Dustin Kirkland from Canonical writes about it here

For real geeks: this is based on Ubuntu 14.04, should be available to Insiders within a couple of weeks, and will then be upgraded to Ubuntu 16.04.

Now PowerShell fans may argue (and they do) that we don’t need Bash, we need a more developed PowerShell, I simply direct those to the mission of Microsoft under Satya.

“Empower every person and every organization on the planet to achieve more”

IT Pros and developers absolutely know what they want, when they want it, and how they want it implemented. If there is a blocker to doing business, and they can carry on using what they are currently using without detriment to themselves or their employer's business, then that is exactly what they will do.

By integrating Bash into Windows, and by working with Canonical (who distribute Ubuntu) to make it work, Microsoft has shown that it will consider, and often implement, whatever it takes to help people use the tools they want to use to deliver their hard work on the Windows platform.

Another example of this is the use of GitHub, the open-source code repository, to distribute PowerShell module updates.

image

For the first time yesterday I installed an Azure Module (1.3.0) from GitHub directly. Even the source code is there.

Now that is open for you. So if you are a Linux guru and want to start using Bash on Windows, watch this space for the Anniversary update, coming your way soon.

image

If you aren’t an insider already – sign up here

For more of the announcements at Day 1 of /Build check here.

Follow the live stream of Build keynotes here

Server Management Tools in the cloud

There are a number of things about my job that keep me awake at night or wake me up early and none of them are bad. It’s all about the excitement of what is coming our way in the management of our servers.

One of those excitement points, as I call them, came in October last year, when Jeffrey Snover demonstrated the ability to manage the awesome Nano Server technology (another excitement point) from a cloud-based console. No date was given, and there was no expectation of the true goodness that was about to be delivered.

Well, on February 9th the Server Management Tools (SMT) went into public preview as an Azure service.


This new service allows you to use the Microsoft Azure Portal to create multiple instances of Server Management Tools, one instance for each server you would like to manage.

You need an SMT gateway, which you can install on one of your Azure VMs or in your datacentre.

Once that is done you are free to use this portal to carry out a wide range of management functions.

The service really comes into its own when you have a number of hosts running the new headless Nano Server deployment. There is no way to manage these other than by remote PowerShell; you cannot RDP into the console (there isn't one). You can connect directly to make a couple of firewall changes and check the networking, but that is all.

The following screenshots taken from a Nano VM (which is also a container host) show the limitations of the direct management features

image

Nano server is featured widely on Channel 9 and Microsoft Virtual Academy, why not brush up on your Nano server skills before reading on.

The process of setting up SMT is not difficult, but you do need an Azure subscription. To make it super easy, here is a link to a 30-day free trial with enough credit to get you going (£125).

Rather than take you on a complete walkthrough, which is already available here, I have produced a @Ch9 video which lasts around 30 minutes and takes you through the whole process of setting up the VMs, the SMT service, the gateway and the management options.

You can find that here. This is the first entry in the new @TechNetUK How-to series on Channel 9; please feel free to send requests for How-to guides to ukitpro@microsoft.com

Having deployed the full set of SMT tools to your Nano Server, it is then possible to run a remote PowerShell session directly on the server. In addition, the PowerShell console has the full command-window experience and a script editor to boot.

image

And, there’s more

You can view, edit and add to the registry of the machine

image

you can also carry out all the below management functions.

image

A whole array of powerful functions that are just not available to you on the Nano interface (as it is).

It is particularly useful to be able to join a domain, rename the machine and check on roles and services available or installed.

The one gotcha you will see in the @Ch9 video at 22 minutes 40 seconds is the failure to connect to the Nano server from SMT.

Regular PowerShell users will know instantly that this is because the machine you want to manage is neither in the same domain as the management server nor in the TrustedHosts list.

A simple one liner run in an elevated PowerShell session on the management server will fix this.

 

 

Set-Item WSMan:\localhost\Client\TrustedHosts -Value '*' -Force

Now, for a production system, please replace the * with the IP address of the server you want to manage. For a lab system, using the * wildcard to open up all options is acceptable.
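For the production case, a hedged example; the IP address below is a placeholder for your Nano Server, and -Concatenate appends to any existing TrustedHosts entries rather than overwriting them:

```powershell
# Trust only the one Nano Server, preserving existing TrustedHosts entries
Set-Item WSMan:\localhost\Client\TrustedHosts -Value '192.168.100.20' -Concatenate -Force
```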

image

Once done, this error will disappear.

The flexibility of this service to manage your cloud or on-premises servers is what I like most. I can now be anywhere, and as long as I have web access I can use the Azure Portal to manage my entire Nano Server (and full Server 2016 TP4) infrastructure (albeit one server at a time).

I hope you enjoy the video and do keep checking this blog and the @Ch9 page for new content.

Just how mobile is the Windows 10 experience?


I have been cooped up at home for the last couple of weeks, unable to drive or travel very far after shoulder surgery. This lack of mobility got me thinking about my normal working life: just how mobile am I, and how does Windows 10 live up to one of its main aims, to make the Windows experience more mobile?

I should add that even though I will be talking about Windows 10 on a phone in this post, I am not using the word ‘mobile’ to mean smartphone. It is the mobility of the experience that is key here, not working whilst moving; there are key differences.


The question I will try and answer is;

Is there anything I can do at the office on a Windows 10 desktop device that I cannot do with Windows 10 elsewhere, because of my location or the device on which I am trying to work?

I deliberately exclude the availability of the internet from this question; internet access is assumed.

So I thought the best way for me to answer this question is by looking at all the activities I engage in, at work, as an IT Pro Technical Evangelist.

So let's start with where I work and what I use to do that work.

The Office (Work Premises)


I have added the parentheses as I am a home worker and I use my own machine at home, as you will see later on.

So, Microsoft have provided me with a device to use. I currently have, as my main domain-joined device, a Surface Pro 3 (i7, 8GB RAM, 256GB SSD). Yes, I count myself very lucky, as it is a super piece of kit. I do covet a Surface Pro 4 or Surface Book, and maybe my nice business manager Phil will be kind to me later on this year. If not, I will get myself a Surface Pro 4 keyboard, as they seem to make the whole experience so much better. Still, I digress.

When I arrive at work in Building 2, floor 2, section H (the hot desk section indicated by the discreet red arrow above), I plug this machine into one of the lovely Dell monitors (U2515) that the aforementioned Phil bought for us, which gives me some real space to work with.

From here I am able to print through the location-aware printing we use (controlled with smartcard access at the printer), and I can access all the internal file shares, which seem to be disappearing in favour of cloud storage such as OneDrive and OneDrive for Business.

I have wired and Wi-Fi access to the corporate network and all resources I am authenticated and authorised to access.

I have access to Cortana and all of my store apps which I am signed into with my Microsoft Account for personal apps and the Business Store Apps when I add an organisational account to the store.

So – that is my aim. The same level of connectivity and functionality or productivity that I have from that desk.

The Home Office


At home, due to age and infirmity, I use an electrically adjustable standing desk, and I have a Dell 1650 workstation (i7, 16GB RAM, 1TB SSD and other local storage from various NAS devices and local disks) with a four-monitor setup. This may sound like overkill, but when I bought it and set it up I was a full-time trainer, and a lot of my work was done from home delivering online remote training. For this I needed enough screen real estate and grunt to manage many VMs and applications. Suffice it to say I do not need all this now.

I have the Enterprise version of Windows 10 running the February release of the 1511 update. I sign in with my Microsoft account xxxxx@hotmail.com, and I have used Work Access to enrol my PC into device management. Our IT department (MSIT – this is worth looking at, as we have a huge network) uses Microsoft Intune (part of the Enterprise Mobility Suite) for this.

This is a prerequisite for being allowed onto the MSIT Virtual Private Network (VPN). Since the machine is my own and not domain-joined, that is the only way I can access all the resources and applications I want in the style I want. Having done this, I also use my Microsoft employee badge, which is a smart card, as a source for Multi-Factor Authentication (MFA). It gets better: each time I connect to the VPN, I now receive an MFA request on my Lumia 950 smartphone, which is either a phone call or an app authentication request.


The reason for this is that Microsoft have made a decision to ensure all external access to the corporate network relies on phone based MFA.

If I want access to certain internal websites or resources without connecting to the VPN, I can use my smart card or, a fairly new feature, a virtual smart card. The Trusted Platform Module on my PC stores a certificate which is enabled by my physical smart card. I still have to enter a complex PIN, but I don't have to plug in my card every time.

So I can still print to my work printers as well as my home printers, and I have full access to all the work resources and applications I have when in the office. It is my machine, but I have policies enforced on it from Microsoft Intune, like a requirement for a work PIN to be created (part of the Microsoft Passport system). The Intune admin could also prevent me from copying any work-related data into applications that are not controlled, a feature called conditional access.

I ought to mention that the Microsoft Intune and the Enterprise Mobility Suite are additional options that have additional licensing costs.

If I don’t want to spend any extra money I still have the option to use a VPN without the additional device management provided by Intune.

I could use my Surface Pro 3 at home, as I have a dock and can use two monitors with that as well. Finally, I could use Windows To Go. If you are an enterprise user with a copy of Windows 10 Enterprise, you can use a wizard to create a USB-stick-based version to carry around with you. This can then be plugged into any device that will run Windows 10 and can boot from USB. The device owner is reassured that you cannot mess around with their data or hardware, and you are safe in the knowledge that this BitLocker-protected device is secure and keeps you fully functional on the road, at home or anywhere else there is a device waiting for you.

At Customer Sites or events

Now, I can dream that I will get to work in such idyllic locations, but it really doesn't matter where I am or what I want to do. Armed with the Surface Pro 3 or the Windows To Go stick I am pretty much bulletproof as far as devices go.

If, however, I am caught short and only have my phone with me, I am covered there as well. I have a Microsoft Lumia 950 as my issued work smartphone. You might have gathered by now that I am a bit of a gadget geek. This device is pure magic for me. I was lucky enough to be given a prototype and a trial version, so I have used this for the best part of a year now.

As a phone it does what it should; as a smartphone it equally holds up to most of the competition. It is fast, the camera is excellent, and it runs Windows 10, so all my Universal Windows Platform (UWP) apps work seamlessly as well. It has more RAM and a faster processor (with six cores) than any PC I had up until at least 2008 or so.

The big news now, though, is that this phone will turn itself into a PC just by adding a Bluetooth keyboard and mouse and a large screen over Miracast; or add the small dock and you can use a wired keyboard and mouse and up to two additional screens.

I now use this as my presentation device at almost all events and meetings. It is also under Intune management and has access to the VPN, indeed it has app activated VPN as well so any application that requires VPN access automatically connects.

This phenomenon is known as Continuum for phone and is pretty much game-changing.

The only downside is that until developers have written more UWP apps, the number of apps that will run on the large screen is small. I am glad to say that all my productivity apps, as well as Microsoft Edge, do run in full screen.

So now, from playing films (sorry, movies) and videos whilst away from home, to presenting awesome slides and demonstrations, it can do it all. All the other brands are now playing catch-up in turning the phone into a single device on which to do everything!

While travelling

I added this as the only really mobile part, as in moving while using technology. I often work when I am travelling on a train or plane. All the above applies, although I don't often use the phone for anything other than sharing my 4G EE connection with my Surface Pro 3 to get some reasonable connectivity.

Summary

I appreciate that I haven't covered many of the Windows ecosystem devices in this post, simply because I don't use them day to day.

I haven't seen a HoloLens up close, and I don't have access to a Surface Hub at home or in my working day very often. I do have an Xbox One but rarely use it for work, other than watching great Channel 9 and Microsoft Virtual Academy content. Indeed, there is a load of very good content on the Enterprise Mobility Suite; why not take a look at some lab walkthrough videos I recorded showcasing its abilities?

Conclusion

I have all I need to work anywhere on any device (even non-Windows devices, using Office 365 and the Enterprise Mobility Suite).

Windows 10 fits the bill for everything I need to do.

Now to find that nice man Phil and get hold of the Surface Book I truly deserve.

The Life of an IT Pro: Past, present and future


In my day to day role I am constantly reminded of the fast pace of our new mobile-first, cloud-first world. You only have to look at the Microsoft Azure cloud services or the Office 365 products and services to see just how rapidly things change.

This constant refreshing of our IT Pro world started me thinking about the old times, the old products and what life was like way back when. I also started to get a little uneasy about what is going to happen in the future to the traditional IT Pro.

So why don’t I take you on a journey down memory lane, through to the modern day and what might be around the corner for us all in the future.

THE PAST

My first IT Pro role was back in 1994 when I was serving as a Police Officer in West Midlands Police. There were the usual political wrangles of procuring kit, software and resources to provide a new network, isolated from the main corporate network on which we would carry out a project to provide Crime Mapping, Human Resources planning etc.

My role was that of server hugger, sorry, Server Administrator. We faced several big issues with hardware and software but eventually went with the following:

A Pentium processor, not much RAM and a tiny hard disk running Microsoft Windows NT 3.1, and five client PCs running Windows for Workgroups 3.11 over a thinnet coax network.

The majority of you reading this will not have seen ANY of these things let alone used them. The interesting thing is that despite the tiny RAM and tiny hard disk it was plenty for what we needed at that time.

The operating system came on 3.5″ 1.44MB HD floppy disks, or as one young delegate at a recent event said to me, “ah yes, isn't that the save icon?!”


To install the OS you had to hand feed the machine a huge number of these floppy disks in order, and if one was wrong, corrupted or an error occurred you had to start again.

Eventually we had the system up and running and we managed to use the MapInfo product (now part of Pitney Bowes) and its Map Basic development language to produce a very interesting crime mapping product, the first of its kind in the UK and the forerunner to a lot of today’s more intelligent policing strategies based on data analysis.

I need to add here that I had nothing to do with the development other than some rather neat Excel Macros to feed in the data. That was all down to a PC colleague John Kelly and the boss Supt Gil Oxley. (Gil went on to run IT in West Midlands Police and to form his own company providing HR resourcing software for rostering shift patterns. I don’t know what John is up to now).

What problems arose then that we still see today?

DevOps


– The term DevOps didn't exist, but the common-sense processes between Developers and Operations staff did. But how often were they used? Well, the team was three strong, Devs and Ops worked together and there was only one application, so we didn't have much of a problem. There was no cloud, no real internet and many fewer threats too. The one challenge I faced most often was nosey police officers taking off one of the Ethernet coax terminators and bringing the whole network down.

Networking


– There was no TCP/IP stack for Windows, so it ran on NetBEUI, and because that didn't follow the OSI seven-layer model it wasn't allowed to run on our corporate network, so we had to build our own. It was slow and clunky – see above for people taking it down at will.

Hardware


– Installations took forever; CD-ROMs had just come in, but our software was distributed on floppy disks. Hard disks were slow and backups took forever.

Remote Access / Enterprise Mobility


– This was only possible by modem, which made it very, very slow, and was only available for administrators on the server; there were no Terminal Services or RDS back then. There was no concept of mobile computing, especially not in a Police environment.

Updates


– These were released as service packs once in a blue moon and were essentially a whole load more floppy disks. If there were bugs, and you had an account and a support contract, you could download hotfixes by modem from the FTP site (slowly).

Users


– This was one of the hardest challenges and to some degree is still the same today. The difference here is that we were training from scratch with Windows; almost no Police systems ran on GUI operating systems. The average user was initially actively against the extra work needed to use the systems, and didn't like having to use the HR system for rostering either. The majority did not have PCs at home, and almost none of them had any form of mobile communications, although some had new-fangled message pagers.

PRESENT

Even though I am a Technical Evangelist and I don't have a regular IT Pro day job, I do help out family members and friends with their business IT. Since this is current and real, I will obfuscate identifiable details in this section.

The scenario is a small business with around 20 users. The users and management are not hugely IT friendly or literate (by today’s standards), but they all have smartphones and laptops / PCs at home.

The network in use has Cat 6 cabling and Gigabit managed switches for each segment. There are three servers; one running Windows Server 2012 R2 Essentials, one is the old Windows Server 2008 Small Business Server and the other is a Server 2008 R2 RDS Session host for remote desktop access. The new Essentials server uses Azure Backup and is integrated into the Office 365 business premium subscription. The network uses IPv4 and IPv6 and the phone system has just been replaced with a fully Ethernet VoIP system. The card payment system has also been integrated and no longer uses the telephone network but is connected through the Cable internet system running at 100Mbps.

There are 15 Windows PCs, 3 different WiFi networks and all users have full remote access both to the RDS Server and through Essentials to their desktop PC should they need it.

There is LOTS of RAM, large fast processors and masses of RAID 0 and RAID 5 hard disk arrays.

There is also a LTO 4 tape drive for backup locally which is removed from site daily.

The administration of this system is all completed out of work hours and mostly remotely.

DevOps


– All applications in this network are commercial, either off-the-shelf or LOB applications, and the providers require some form of remote access to perform updates. The OS is Windows and is managed by online updates. There is no concept of DevOps in this business as there are no in-house or remote developers.

Networking


– Fast, Gigabit Ethernet fully-switched network with a small business router connecting to the outside world through cable internet and VPN.

Hardware


– Installations are often carried out through remote access, take a few minutes, and the bits are always downloaded through a seriously fast internet connection. Media (if used) is either USB or DVD, and backups are rapid whether to tape or Azure.

Remote Access


– The business cannot survive in its current model without remote access. Several people work from home, the bookkeeper is often remote only. All users regularly connect to access data which has to be stored on site for regulatory reasons. Office 365 is the email solution.

Updates


– These are automated, online and fast. They are no longer managed by WSUS, and will soon be using the Windows 10 ‘current branch’ model for client updates. Servers are updated by Windows Update on a regular basis. No more FTP or disk updates, and third-party suppliers often connect to update their applications directly.

Users


– From users resenting PCs, they now require faster, better PCs and applications. Downtime is not accepted and the users have a hundred ways to communicate the smallest upset to service availability. All users connect to email remotely either by PCs or smartphones. OneDrive keeps them connected to work documents as does Office 365.

FUTURE

This is the bit I get to fantasise about: how easy being an admin will be in the near future.

Obviously hardware will never be the limiting factor, and applications are continuously integrated and deployed so they are always up-to-date and functioning.

The local datacentre is running on Windows Server 2016 with Azure stack deployed. Developers are using a mixture of Visual Studio with Team Services to manage deployment.

So what will I do as an IT Pro 10 or 15 years from now?

DevOps


– This is the area I expect to change most in business IT systems in the future. I would imagine all IT Pros will need to understand the mind and processes of a Developer and ensure that software defined infrastructure, software defined networking and storage are all configured to provide what they need in Dev, Test and Production at all times.

Networking


– Obviously all networking will be at 50GbE from server to client, and faster for the core services. The big change for the IT Pro (other than the cable chap and the rack installation guy) will be that everything is software defined. There will be one console to ensure that the whole stack, from storage through networking to compute, is scalable, flexible and self-service within set parameters.

Hardware


– Windows Server 2016 advances mean that all storage is directly attached and running in Storage Spaces Direct, with failover clusters hosting the necessary Nano-based virtual machines. All high-value data is stored on shielded VMs protected by Active Directory-based attestation, backed by HSM and TPM.

Remote Access


– Enterprise Mobility is the new normal. All users have access to thousands of SaaS apps with SSO provided by Azure AD integration. Microsoft Intune and Azure AD join enable cradle to grave management of Windows and other devices.

Updates


– These will be delivered in secure packages, through online updates, perhaps staged in cached areas of your own network and not just for Windows but for all technology that requires updates to software and firmware.

Users


– If the More Personal Computing vision of Satya Nadella has been successful, the methods of interaction with your PC will not stop at Holograms. The guys and girls at MS Research will be seeking out new ways of interacting with and controlling our technology.

You know what, that sounds a lot like NOW, just faster and with everyone having adopted it!

In all seriousness, the IT Pro needs to evolve with each generation of Developer, User and Operating System, to ensure that his or her skills are kept up to date and relevant.

Don’t get left behind – make sure you know what’s coming by checking out the Microsoft Virtual Academy and Channel 9.

After all, we live in a world where PowerShell now works with Linux and network speeds to end users have moved from 5Mbps to 10Gbps.

In 1994, when I built my first network, neither PowerShell nor Linux existed (Ed: Linux did, but Linux 1.0 had not actually been released at the time of the network install).

So what will happen in the future? One thing is for sure: it isn't going to be the fastest or the fittest that survive. Rather, it will be those most willing to adapt to change.

If you are an IT Pro in today’s world just ask yourself – am I skilled and qualified enough to move into the future? Why not start by coming along to one of our #InnovateIT events and see what’s changing in Enterprise Mobility, Cloud Infrastructure and Datacentre Infrastructure, not forgetting of course the Operations Management Suite. You can register for an upcoming session near you here.

#InnovateIT- UK IT Camps, new content, new events

The @TechNetUK #InnovateIT events are changing – Find out how and why.


One of the most noticeable differences between the ‘Old World’ and the new Mobile first, cloud first world of Microsoft is that the content rarely stays the same for more than a few months.

This rapidly changing theme has an impact everywhere, the products, the training courses, exam content and now to our very own @TechNetUK #InnovateIT IT Camps.

For the last six months we have been running a series of events with hands-on labs and audience participation covering What’s new in Windows 10 Enterprise, What’s new in Windows Server 2016 and What’s new in Cloud Infrastructure based on Azure Resource Manager templates.

Well, out with the old and in with the new: we have three new areas of content with right-up-to-date material and labs to use. Please note you still need to bring your own device to use the labs.

This device only needs to have internet access and a browser. A keyboard is highly recommended as the iPad / Tablet without keyboard experience is hard to use.

A screen size in excess of 8” is also recommended, especially if your eyes are as old and tired as mine are.

The new areas we are covering between March and July are:


Empowering IT to protect and enable productivity

This one day event covers the following topics

1. Identity and Access Management

2. Increase productivity and reduce helpdesk costs with self-service and SSO experiences

3. Manage and control access to corporate resources

4. Mobile Device and Application Management

5. Information Protection

image

Increasing efficiency and scalability with Windows Server 2016 and Microsoft Azure

This one day event covers the following topics

1. Datacenter Infrastructure

2. Azure Infrastructure

image

This one day event covers the following topics

1. Introducing the Microsoft Azure Stack

2. Visibility, Insights and Security Analytics with the Operations Management Suite

3. Cloud-Enabled Protection with the Operations Management Suite

4. Hybrid Automation with the Operations Management Suite

The days take the form of a mixture of lab exercises, demonstrations, presentations and useful group chats about areas of technology that will be key to most, if not all, businesses over the next few months and years.

This obviously keeps us in the @TechNetUK office on our toes, technically speaking, which is good. It also means we can come around the country and pass on the latest and greatest advances in the Microsoft world to you all.

Register HERE to attend. The site is being updated to reflect these changes over the next few days, so remember to come back if they are not yet available.

Come along, meet your friendly neighbourhood Technical Evangelists and Microsoft MVPs, and get learning the future of IT.

You can also keep up with the topics at the Channel 9 portal and the Microsoft Virtual Academy.

Finally, you can also practise the skills we demonstrate in the #InnovateIT events by using the TechNet Virtual Labs.

Microsoft Azure Stack: What is it?

Over recent weeks, whilst running our #InnovateIT IT Camp events around the country, I have been asked one question more than most: Microsoft Azure Stack – what is it?

Well, here goes for an Azure Stack 101 explanation.

For several years now, Microsoft has been growing its public cloud offering, Microsoft Azure. The services are hosted in datacenters around the world and provide access to scalable compute, storage and networking for customers to deploy their virtual machines, websites and many other business services.

On February 3rd this year the Microsoft Azure Stack Technical Preview (TP1) was made available for download.

Microsoft Azure Stack takes the power of Azure and makes it available in your own Datacenter.

I recommend taking a look at the many resources available on MVA and Channel9 to get up to speed.

MVA Courses

Channel 9 resources

The rest of this post will be a short introduction to what is available in the recently released Technical Preview of the Microsoft Azure Stack and what you will need to try it out.

Be warned: the single-node Proof of Concept preview has some fairly hefty hardware requirements (it is meant to run in a datacenter, after all).

image

I am unable to show an install, configure and run post as I don’t have the hardware available to do this.

The Hardware above is in itself an issue that a lot of small businesses and individuals will face when trying to skill up with Azure Stack.

So what does TP1 look like?

Well, once you have installed the Azure Stack bits on your 12 cores with 96GB RAM and five hard disks, you have a number of services available. As far as compute is concerned, we are limited to Windows Server 2012 R2 for now; the number and type of services are being added bit by bit.


Watch Jeffrey Snover demonstrate this here.

In addition to compute, you also get the same storage solutions as Azure provides (Blob storage, Table storage etc.) with the same features and properties.

The benefit in using Azure Stack is that you can now deploy your services anywhere – Microsoft Azure, Azure Stack in your datacentre or Azure Stack in your hosting partner's datacenter – or spread the services across all three.

Azure Stack allows the IT Pro to provide a plan to developers that limits compute, storage and other resources to prevent sprawl in your datacenter. This is similar to the current Windows Azure Pack offering. The difference is that Azure Pack relies on an underlying System Center deployment to provide tenancies; Azure Stack uses the changes in Windows Server 2016 and the Azure Stack code to manage the virtualisation and networking directly, without the added overhead of System Center.

Azure Stack also uses Azure Resource Manager (ARM) templates; these allow your users to deploy resources within their own Azure Stack subscriptions, which you provide to them as part of the plans you create.

There are a whole bunch of ARM templates available on GitHub, which you can deploy as-is or amend to fit your own infrastructure. You can deploy these to Azure or Azure Stack because the same code is running on both platforms.
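As a rough sketch of what that looks like from PowerShell, using the AzureRM cmdlets of this era (the resource group name, location, template URI and parameter values are purely illustrative):

```powershell
# Sign in. For Azure Stack you would first register its endpoints with
# Add-AzureRmEnvironment and sign in to that environment instead of public Azure;
# the same template then deploys unchanged against your Azure Stack subscription.
Login-AzureRmAccount

# Create a resource group to hold the deployment
New-AzureRmResourceGroup -Name 'DemoRG' -Location 'West Europe'

# Deploy an ARM template straight from a quick-start repository on GitHub
New-AzureRmResourceGroupDeployment `
    -ResourceGroupName 'DemoRG' `
    -TemplateUri 'https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/101-vm-simple-windows/azuredeploy.json' `
    -TemplateParameterObject @{ adminUsername = 'demoadmin' }
```

Because the template is just declarative JSON, the only thing that changes between the public cloud and your own datacentre is which environment you are signed in to.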

Azure Stack will allow you to make your infrastructure available to the development community within your organisation without losing control of it. The ability to allow continuous deployment and integration using Visual Studio, PowerShell or GitHub makes this a perfect DevOps tool.

The Azure Stack will evolve and develop as the months go by and I will post more when the next TP is available.

Time to get the Preview, or at least watch Mark Russinovich and Jeffrey Snover launch the Preview here.

What should I study? Which Exam should I take?

In the TechNet UK office we have an open email address for people to ask us all manner of questions, and even sometimes to complain about us or praise us… but mostly we like it when people tell us what they want from us; that way we know our audience is getting what they want. In this case, it was information about Azure exams.

The email address is ukitpro@microsoft.com, please use it. Tell us what you want from us!

This post is in answer to a very interesting question, and one I come across all the time in my role as both a Technical Evangelist and a Microsoft Certified Trainer (MCT).

The two questions I want to address today are:

What should I study?

Which exam should I take?

This is in response to one of our IT Pros out there who asked.

Hi, my name is xxxx and i want ‘CCNA’ OR ‘Microsoft Azure Specialist certifications’ which 1 is best for me please tel me and tell me online certification i want both of one, which is best online university?

There are a number of variables in this question which prevent me giving direct advice to the keen learner.

Everyone I talk to is in a different career stage and on a different career path.

So I will assume that the question is being posed from the perspective of a fairly new entrant to the industry and that the questioner is keen on Infrastructure, networking and cloud services.

So to the certifications.

CCNA, or Cisco Certified Network Associate, is not an easy ask and requires access to either a test lab of Cisco networking kit, such as routers and switches, or simulation software (both is better). The certification is all about configuring switches and routers, as well as networking fundamentals around TCP/IP v4 and v6 and the TCP/IP multi-layered model of networking, right down to what makes up a TCP packet.

Cisco have a network academy and many universities offer this course, including the Open University.

The certification can be obtained in two ways: passing a single big exam, or taking two smaller separate exams. Neither contains easier questions, but people often run out of time in the larger exam. You do get the CCENT certification with the first of the two exams.

The main difference between Microsoft tests and Cisco ones is that once a question has been answered in a Cisco exam you cannot go back.

This certification is widely valued and is very useful to get a job in a company supporting a physical network.

For study and training you would need to go to the Cisco website.

The second option asked about was an Azure Specialist certification. There are three of these, and if you pass all three you gain the MCSD, or Microsoft Certified Solutions Developer. (This is the shortest route to that certification, which is the developer equivalent of the MCSE.)

image

The three Azure specialist certifications are shown above. NOTE: the MCSD, like the MCSE, requires re-certification every three years.

Which one of these you take very much depends upon your skill set and focus at work.

70-532 is the Developer exam,

70-533 is the Infrastructure exam, and

70-534 is the Architect exam, which actually contains bits of each.

From this page you can access the exam objective domains and details of training courses.

If you want less formal training, then you can try the Microsoft Virtual Academy or Channel 9, both of which have a mass of content on the topic. The links above will take you to Azure-specific topics on both platforms. I thoroughly recommend them to you.

Finally, Microsoft runs a learning community, borntolearn.mslearn.net, where you can join study groups, hear other people's stories and generally feel part of the IT Pro and Dev communities studying for the Microsoft certifications.

I moderate a couple of forums on there, including the Office 365 study groups.

For any Microsoft Azure exam you will need access to a subscription. These are available in lots of ways.

You can sign up for a free trial, or use your MSDN subscription Azure credits, but make sure when signing up for a subscription that it is one that includes the credits.

Or you can come on one of our Windows or Hybrid Cloud InnovateIT events, register here and you will receive a 30 day free Azure pass.

The bottom line is that you must have a detailed knowledge of all the areas in the exam topic lists.

These are not easy exams, and bear in mind that the content of the exam does not change as often as the Azure services do, so knowledge of the services as they stood a few months ago will help when sitting these.

As for whether you take the Cisco or the Azure exams, it depends on the job you want or are doing. Personally, I would do the Azure ones first and then the Cisco one, since the Azure networking knowledge is not directly tied to knowing your Cisco binary ANDing and ORing; Azure takes care of most of the networking for you.
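If your binary ANDing is rusty, here is a quick refresher: the network a host belongs to is found by ANDing each octet of its IP address with the corresponding octet of the subnet mask. A minimal PowerShell sketch (the address and mask here are just made-up examples):

```powershell
# Hypothetical example: find the network address for 192.168.10.37
# with a mask of 255.255.255.0 by ANDing the octets together.
$ip   = 192, 168, 10, 37
$mask = 255, 255, 255, 0
$network = for ($i = 0; $i -lt 4; $i++) { $ip[$i] -band $mask[$i] }
$network -join '.'    # 192.168.10.0
```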

You can sign up for an exam here.

You can download the latest technical certification roadmap here as well.

Good luck

Windows Containers: What they are and how they work

I thought a brand new year warranted a post on some brand-new technology that I predict will have a huge impact on us this year.

The IT industry is a maze of Three Letter Acronyms (TLAs) and buzzwords, and they seem to proliferate at an ever-increasing rate. One of the latest ‘fads’, ‘trends’ or buzzwords to have gained a great deal of momentum over the last 12 months is the wonderful world of containers.

Hopefully, if you don’t currently know what containers are or why we should be diving feet first into this new(ish) technology, then after reading this post you will have a clearer idea. Why not try out the Microsoft implementation with Windows Server Technical Preview 4 (TP4) (get it here) or on your Windows 10 client Hyper-V using PowerShell by following the steps here.

There are also a bunch of free MVA courses here, and to finally quench your thirst for knowledge, Channel 9 has a containers channel here.

So, having prepped you for the journey into containers, why not take a quick look at an intro video on Channel 9?

Microsoft has been in partnership with Docker for some time, integrating Docker’s container technology into Microsoft Azure and Windows Server 2016. With TP4, Microsoft has also introduced both Windows Containers and Hyper-V Containers, both of which carry out the same function of providing a platform for lightweight, portable applications to be hosted, scaled and reworked in seconds, but with differing levels of isolation.

At its most fundamental level, container technology is a way of packing many identical, similar or completely different applications in isolation on the same host (or ported to the cloud).

I know, we can already do that with Hyper-V and other hypervisors. But container technology only requires one base operating system on the container host; all the containers link to that operating system. A container host can be a physical host or a virtual machine. The analogy is with a differencing disk: just as a differencing disk contains only the changes from its parent installation, each container holds only its own changes, and the base OS image is never modified.

Microsoft container technology now provides two base images, Nano Server and Windows Server Core. There are also two types of container, which provide the same functionality but different levels of isolation, as shown below.

image

A Hyper-V container runs in a very lightweight virtual machine, ensuring that the isolation is total. This will allow Microsoft container technology to support a full public multi-tenancy solution.
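In the TP4 PowerShell cmdlets the difference between the two types is expressed when the container is created. A sketch (the image and switch names here are placeholders, and parameter names may differ between preview builds):

```powershell
# Sketch only - the key difference is the RuntimeType parameter;
# the rest of the workflow is identical for both container types.

# Windows Server container (shares the host's kernel):
New-Container -Name WinSvrDemo -ContainerImageName WindowsServerCore -SwitchName "Virtual Switch"

# Hyper-V container (runs inside its own lightweight utility VM):
New-Container -Name HvDemo -ContainerImageName NanoServer -SwitchName "Virtual Switch" -RuntimeType HyperV
```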

The clever part of the solution is that it is quite easy to create, amend, copy and recreate container images, and then deploy them as containers running modern applications, web browsers etc. on your server.
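That create-amend-recreate loop looks something like this with the TP4 cmdlets (a sketch; the container and image names are placeholders, and exact parameters may vary between preview builds): make your changes inside a running container, stop it, then capture it as a new image from which further containers can be created.

```powershell
# Sketch only - assumes the TP4 container cmdlets; names are placeholders.
$container = Get-Container -Name DemoContainer
Stop-Container $container

# Capture the container's changes as a new, reusable image
$newImage = New-ContainerImage -Container $container -Publisher Demo -Name DemoImage -Version 1.0

# New containers can now be stamped out from that image in seconds
New-Container -Name DemoContainer2 -ContainerImageName DemoImage -SwitchName "Virtual Switch"
```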

Normally my posts include a whole bunch of how-to steps and advice, but all of these are already listed step by step at http://aka.ms/windowscontainers, so I will limit myself to your first step.

From either a Windows 10 machine running client Hyper-V or a Windows Server running Hyper-V, fire up your best friend (PowerShell) in an administrative window, then enter this one command.

 wget -uri https://aka.ms/tp4/New-ContainerHost -OutFile C:\New-ContainerHost.ps1

This downloads a script that will set your system up as a container host, so run this command straight afterwards.

C:\New-ContainerHost.ps1 -VmName Tp4ContainerHost -WindowsImage ServerDatacenterCore -HyperV

Once the script completes you will have a running container host, ready to start experimenting.

Be warned though, this script takes a long time to run as it downloads the base OS images.
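When it does finish, a first experiment might look something like this (again a sketch using the TP4 cmdlets, run on the new container host; the image and switch names will vary on your system):

```powershell
# Sketch only - assumes the TP4 container cmdlets on the new host.
Get-ContainerImage    # list the available base OS images

$c = New-Container -Name TP4Demo -ContainerImageName WindowsServerCore -SwitchName "Virtual Switch"
Start-Container $c

# Open an interactive PowerShell session inside the running container
Enter-PSSession -ContainerId $c.ContainerId -RunAsAdministrator
```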

Happy Containerising!