Windows Server Technical Preview – My Favourite Features – Part 1

Microsoft released the first Technical Preview of Windows 10 to much acclaim back in October. There have been three releases so far and we currently sit on the ‘last release of the Calendar year’ – Build 9879.

The Technical Preview is intended primarily for the enterprise to evaluate the changes and inform the development of new and evolved features of the client operating system. This is a brave and intelligent step. Most followers of Windows in an enterprise will know that Microsoft traditionally release their client and server platforms in pairs: XP/2003, Vista/2008, Win 7/2008 R2, Win 8/2012 and most recently Win 8.1/2012 R2.

The dramatic changes inside Microsoft have not led to a change in this pattern, and there is a new server platform being developed alongside Windows 10. This server is as yet unnamed but is also in Technical Preview.

If you have an MSDN subscription you can find it there in both ISO and VHD formats (the new Hyper-V Server is there too). If you do not subscribe then you can find it here. The new Remote Server Administration Tools for Windows 10 Technical Preview have also been released to allow you to remotely manage your new server from your new client. The RSAT can be found here; they are available in 32-bit and 64-bit flavours.

For anyone interested in the Server Technical Preview, just about everything you could want to know can be accessed from this blog site. It belongs to Jose Barreto, a member of the File Server team within Microsoft, who has put together this invaluable survival guide. As you might imagine, it is storage-focussed but does cover most other areas too.

There is one final way you can have a look at and run the Server Technical Preview, and that is as a virtual machine in Microsoft Azure. If you do not have an Azure subscription, again this is part of your MSDN benefit (MSDN is sounding more and more like good value). Otherwise you can sign up for a free trial here.

[Screenshot: the Server Technical Preview in Azure]

Windows Server 2012 was a huge leap in performance and function for the Windows Server family, and despite the familiar look and feel of the new server and most of its tools, there have been significant new features and improvements to old ones. BUT please remember one thing when looking at and playing with this new server operating system.

THIS IS A TECHNICAL PREVIEW – do not use it in production, and do not rely on it for any tasks you cannot afford to lose. Having said that, I have found it stable and reliable, as I have the Windows 10 client. The difference is that I use the Windows 10 client on my main work machine and just about every other machine I use (with a couple of exceptions), whereas the server version is very definitely a test-rig setup for me at present.

So, what is new, and of those new things, what are my favourite features and why? This is the first post in a series examining major new functionality in the Technical Preview.

In Server 2012 one of the big five features for me was Hyper-V Replica. The first new feature of the Technical Preview I want to describe is called Storage Replica.

To quote the TechNet site, Storage Replica (SR) is a new feature that enables storage-agnostic, block-level, synchronous replication between servers for disaster recovery, as well as stretching of a failover cluster for high availability. Synchronous replication enables mirroring of data in physical sites with crash-consistent volumes ensuring zero data loss at the file system level. Asynchronous replication allows site extension beyond metropolitan ranges with the possibility of data loss.

OK, that sounds a) like a lot of technical stuff and b) pretty exciting and revolutionary for an out-of-the-box, no-cost inclusion in a server operating system. So what exactly does it do, and how does it do it?

Well, Server 2012 introduced the next version of SMB (SMB 3.0), which brought a vast number of performance and reliability improvements to file servers and storage, as well as to normal communications using the SMB protocol. Storage Replica builds on that foundation.

In short, the feature enables an all-Microsoft DR solution for both planned and unplanned outages of your mission-critical workloads. It also allows you to stretch your clusters to metropolitan scale.

What is it NOT?

  • Hyper-V Replica
  • DFSR
  • SQL AlwaysOn
  • Backup

Many people use DFSR as a disaster recovery solution; it can be pressed into service that way, but it is not well suited to the task. Storage Replica is true DR replication, in either synchronous or asynchronous fashion.

Microsoft have implemented synchronous replication in a different fashion to most other providers: it does not rely on snapshot technology but continuously replicates instead. This leads to a lower RPO (recovery point objective – meaning less data could be lost), but it also means that SR relies on the applications rather than snapshots to provide consistency guarantees. SR does guarantee consistency in all of its replication modes.

There is a step-by-step guide available here, but I have included some notes below for those who don’t want to read it all now (all 38 pages of it). (Images are taken from that guide, plus some live screenshots.)

The Technical Preview does not currently allow cluster to cluster replication.


Storage Replica is capable of BOTH synchronous and asynchronous replication, as shown below, and anyone who knows anything about replication will know that this brings some significant hardware and networking requirements.

[Diagrams: synchronous and asynchronous replication]

So what are the pre-requisites to be able to use Storage Replica in a stretch cluster?

The diagram below represents such a stretch cluster.

[Diagram: a stretch cluster replicating between two sites]

There must be a Windows Active Directory domain (it is not necessary to host this on the Technical Preview).

Four servers running the Technical Preview; all must be able to run Hyper-V and have a minimum of 4 cores and 8GB of RAM. (Note: physical servers are needed for this scenario. You can use VMs to test server-to-server replication, but not a stretch cluster with Hyper-V.)

There need to be two sets of shared storage, each one available to one pair of servers.

Each server MUST have at least one 10GbE connection.

Ports must be open for ICMP, SMB (445) and WS-Man (5985) in both directions between all four servers.

The test network MUST have at LEAST 8Gbps throughput and, importantly, round-trip latency of less than or equal to 5ms. (This is measured using 1472-byte ICMP packets for at least 5 minutes; you can do that with the simple ping command below.)

[Screenshot: measuring round-trip latency with ping]
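For reference, the command is along these lines (the server name is a placeholder of mine; -f prevents fragmentation, -l sets the 1472-byte payload and -n 300 sends one packet per second for roughly five minutes):

ping -f -l 1472 -n 300 sr-srv03.contoso.com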

Finally, membership in the built-in Administrators group on all server nodes is required.

This is no small list of needs.

The step-by-step guide demonstrates the setup in two ways, and is a total of 38 pages long.

All scenarios are achievable using PowerShell 5.0 as available in the Technical Preview. Once the cluster is built, it takes just a single command to configure the replication that stretches it.

[Screenshot: stretching the cluster with a single PowerShell command]

You could of course choose to do it in stages using the New-SRGroup and New-SRPartnership CmdLets.
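To give a flavour, the one-shot version looks something like the sketch below, run from the source node. The computer, group and volume names are my own placeholders; the parameter names are those the guide uses.

# Create the replication group and partnership in one step
New-SRPartnership -SourceComputerName sr-srv01 -SourceRGName rg01 `
    -SourceVolumeName d: -SourceLogVolumeName e: `
    -DestinationComputerName sr-srv03 -DestinationRGName rg02 `
    -DestinationVolumeName d: -DestinationLogVolumeName e: -LogSizeInBytes 8GB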

If, like me, you do not have the hardware resources lying around to build such a test rig, you may want to test the server-to-server replica instead.

This requires:

A Windows Server Active Directory domain (it does not need to run the Windows Server Technical Preview).

Two servers with the Windows Server Technical Preview installed. Each server should be capable of running Hyper-V, have at least 4 cores, and have at least 4GB of RAM. (Physical or VM is OK for this scenario.)

Two sets of storage. The storage should contain a mix of HDD and SSD media.

(Note: USB and system drives are not eligible for SR, and no disk that contains a Windows page file can be used either.)

At least one 10GbE connection on each file server.

The test network MUST have at LEAST 8Gbps throughput and, importantly, round-trip latency of less than or equal to 5ms. (Again, this is measured using 1472-byte ICMP packets for at least 5 minutes, with the same ping command as above.)


Ports open for ICMP, SMB (445) and WS-Man (5985) in both directions between both servers.

Membership in the built-in Administrators group on all server nodes.

NOTE – the PowerShell CmdLets for the server-to-server scenario work remotely and locally, but only for the creation of the replica; the Remove and Set CmdLets do not. Make sure you run those CmdLets locally ON the server that you are targeting for the group and partnership tasks.
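As a sketch, tearing a replica down would look something like this, run locally on the server concerned (the group name is my own placeholder):

# Run ON the server that owns the partnership and group, per the note above
Get-SRPartnership | Remove-SRPartnership
Remove-SRGroup -Name rg01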

I do urge you to go off and read more about this solution and test it if you can, but remember things are not yet fully baked and will change with each release, AND do not use them in production yet. Read the guide for known issues as well; there are a few.

Finally – why do I love this feature? No one likes to think of a disaster, but if you don’t plan for it, when it does happen it truly will be a disaster in every respect. This feature allows a much cheaper but effective way of maintaining a current, accurate replica of data, either on a separate server or on a separate site within a stretch cluster.

It is still pricey on hardware and networking, BUT much cheaper than a full hot-site DR centre with old-style full synchronous replication.

Watch this space for more Server Technical Preview hot features.

Future Decoded – 12th November – Enterprise Mobility


If you are at all interested in Enterprise Mobility then Wednesday 12th November at ExCeL London is the place to be. This FREE event has lots going for it.

Enterprise mobility is not all that is on offer. The morning is dedicated to some big-hitting keynote speakers, including Professor Brian Cox, Sir Nigel Shadbolt and Michael Taylor.


The afternoon agenda is packed with excellent enterprise mobility sessions, from customer case studies to demonstrations of the latest Windows 8.1 gadgets, plus deep-dive sessions including demonstrations of the journey into mobile device management at an enterprise level.

There is no better time to get yourself up to speed on the Enterprise Mobility Suite from Microsoft, which includes Microsoft Azure Active Directory Premium, Windows Intune and Azure RMS, all of which work together with Windows Server to provide an end-to-end mobility and identity solution, irrespective of your end users’ device choice.


Some of Microsoft UK’s top technical specialists will be presenting and answering your EMS related questions.

Devices, devices everywhere – Jamie Burgess, Mobility Lead, Microsoft UK

Hybrid Identity Management – Daniel Kenyon-Smith, Solution Architect, Microsoft Consulting Services

Desktop and Application Virtualisation – Doug Elsley, Application and Desktop as a Service Lead, Microsoft UK

The session is being kicked off with a case study from Andy Turner of Mitchells and Butlers – a customer story well worth following.

You can register here; I do hope you have time to spare to attend this fantastic event. I will be there hosting the mobility track, so do come and find me and introduce yourself.

Oh and did I mention it is completely FREE!

New release to manage? Top tips on communicating, educating and activating users.


As IT professionals, it is all too easy to lose sight of the impact our daily job can have on the poor, unsuspecting species known as the ‘user’. It could be said that, having spent the last 20-plus years entirely working in or around the IT industry, I am both long in the tooth and set in my ways. The former is probably true; I sincerely hope the latter is not. Those of us not willing to change will indeed wither and fade.

Twenty years ago the needs and wants of the user were a secondary consideration. What equipment we used, what the applications and devices were capable of, how they looked and indeed the user experience as a whole were not foremost in the minds of the designer or implementer. This may not be true across the board, but I certainly have been subjected to some seriously poor implementations in both the public and private sectors.

Today the needs and wants of the user seem to be a primary consideration, which can only be a good thing. Not only should applications be designed to make the human-computer interaction as easy and intuitive as possible, but the actual deployment and updating of the solution should also take users into consideration.

This is not a new idea. The IT Infrastructure Library (ITIL) has a section covering release management:

6.3.4 Release and deployment management

The purpose of the release and deployment management process is to plan, schedule and control the building, testing and deployment of releases, and to deliver new functionality required by the business while protecting the integrity of existing services.

Definition: release – One or more changes to an IT service that are built, tested and deployed together. A single release may include changes to hardware, software, documentation, processes and other components.

Effective release and deployment delivers significant business value by delivering changes at optimized speed, risk and cost, and offering a consistent, appropriate and auditable implementation of usable and useful services.

Release and deployment management covers the whole build, test and implementation of new or changed services, from planning through to early life support.

As a headline set of principles this is laudable; however, the communication and training process is critical to the acceptance of a system by its users. No matter how good the product, if it does not win the ‘hearts and minds’ it is unlikely to succeed.

For that reason the IT pro not only has to be marketing-savvy but should actually be a proficient marketer too.

It may not be possible to outline to all users the detailed roadmap and dates of new products, or even of significant updates to current products, for a number of reasons, ranging from the political and financial to the business-critical.

It should, though, be possible to provide a comprehensive outline of what to expect as a user. Once the roadmap is revealed, the communication of any significant outages, product limitations and even of required training or activation processes becomes critical to the success of a release.

It is useless to have a sophisticated IT service if the users are not able to make the most of it due to a lack of communication or training, in addition to all the other ITIL processes shown below.

[Diagram: the ITIL release management processes]

Producing a fully rounded release requires all of those processes. Marketing it well and delivering it successfully rely upon communication and training.

So if you are an IT pro considering deploying something new to users, have you thought about:

  • A SharePoint site developed purely for this release, where all users can access documentation, FAQs and even e-learning courses.
  • Working with the developers to make sure that it is the users who matter, not just the potential ‘throw it over the wall’ type of deployment – from Dev to Ops to user. Visual Studio Online and Visual Studio work very well together to provide a rounded development solution.
  • Providing an email alias for users to submit questions and, heaven forbid, bugs.
  • Providing regular feedback to the users, either through the SharePoint site or a website (Microsoft Azure is good for hosting this, of course).

With the speed of releases increasing for all types of deployment, from minor updates to completely new solutions, the user could easily get forgotten. This is not a good idea.

Remember the old adage ‘hell hath no fury like a user scorned’, or something like that.

Azure RemoteApp Part 2 – More RemoteAppyness from Azure


Having published a post on Azure RemoteApp recently, I have been absolutely inundated with requests (well, three people have asked me) for information on how to use Azure RemoteApp with an application that is not published by default and that I DON’T want to run on my own premises.

So here is part 2 of 3 in the Azure RemoteApp story of goodness. Publishing my own apps to Azure RemoteApp.

As a keen photographer I rely heavily on Adobe Lightroom, so I decided I would deploy a 30-day trial of this to Azure RemoteApp.

To be honest, I had thought this post would be difficult to describe and deliver, but at least 60% of the tasks are identical to those in the first post. The ONLY difference is that you must create and upload your own image, in the form of a Virtual Hard Disk (VHD), to Azure blob storage (no, I’m not being insulting to Azure storage – a blob is a Binary Large OBject).

The tricky part is that there are a bunch of pre-requisites for this image, its internal setup and the applications on it. So some patience is required.

To follow these steps, you must already have an Azure subscription (a trial will do) and you must have an Azure RemoteApp preview service enabled. Both of these steps are described and links included in my previous post, here.

Step 1 – Create your Image.

There are a number of ways to do this. Of course PowerShell is the best way if you want to repeat the process, and I will publish a handy script to do the basics another time. For now, just use either Hyper-V or Disk Management to create a dynamically expanding VHD with a Master Boot Record (MBR) partition type. All three of those factors are mandatory.

Azure is not yet able to handle the newer VHDX format, so make sure it’s a VHD. A dynamically expanding disk is much more efficient for the uploading and blob storage process. And the Azure system only handles the MBR partition style rather than the GPT (GUID Partition Table) type.
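If you would rather script the disk creation, a minimal sketch using the Hyper-V PowerShell module looks like this (the path and size are my own choices):

# Create a dynamically expanding VHD – the .vhd extension selects the older format Azure needs
New-VHD -Path C:\Images\gold.vhd -SizeBytes 40GB -Dynamic
# The MBR requirement is met when you initialise the disk inside the VM, for example:
# Initialize-Disk -Number 1 -PartitionStyle MBR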

Once you have this created, you can go ahead and create a VM based on that disk. The next pre-requisite is that you MUST install Windows Server 2012 R2 on the disk. The machine can have several volumes, BUT only one instance of Windows Server can be installed. This caters for the scenario where the application you wish to use as a RemoteApp cannot co-exist on a system partition.

Having found a Windows Server 2012 R2 ISO or DVD and installed it to the VM (you will need to enter a valid licence key, which will be stripped out later on), there are several more tasks to complete before you can upload your VHD as a ‘gold’ image to the Azure storage platform in your subscription.

First you should use either Server Manager or PowerShell to add the Desktop Experience feature, which is hidden under the User Interfaces and Infrastructure section.

[Screenshot: adding the Desktop Experience feature in Server Manager]

This will almost certainly require a reboot.
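If you prefer PowerShell, this is a one-liner – something like the following should do it (the -Restart switch takes care of the reboot):

Install-WindowsFeature Desktop-Experience -Restart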

Having rebooted the VM, again use Server Manager or PowerShell to add the Remote Desktop Services role, and during the setup stage of the wizard add only the RD Session Host option. The normal process for installing RDS is to choose the Remote Desktop Services installation option, as most of the work of setting up your systems and servers is then carried out for you.

That will not work here for a number of reasons, not least because it requires a domain structure. So for this install you will need to choose Role-based or feature-based installation.

[Screenshot: choosing Role-based or feature-based installation]
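If PowerShell is your preference here too, adding just the session host is a single line (a sketch; RDS-RD-Server is, as far as I know, the feature name for the RD Session Host):

Install-WindowsFeature RDS-RD-Server -Restart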

There are now a few settings to make to ensure that the Azure upload works OK.

First, disable the Encrypting File System (EFS); this requires the following command entered in an elevated command prompt.

Fsutil behavior set disableencryption 1

If you know what you are doing and are used to hacking around in your registry, you can instead set the following DWORD value:

HKLM\System\CurrentControlSet\Control\FileSystem\NtfsDisableEncryption = 1

(The usual caveats apply around backing up the registry and not doing this unless you know what you are doing.)
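For PowerShell fans, the equivalent of that registry edit would be something like this (the same caveats apply):

Set-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\FileSystem' -Name NtfsDisableEncryption -Value 1 -Type DWord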

So far I have assumed that you are creating this image on-premises, which in this ‘cloud-first, mobile-first’ world is an error on my part. There is an additional step to take if you are creating your image within an Azure VM.

There is an XML file stored at \Windows\Panther\unattend.xml. This needs to be renamed or deleted; if this step is not carried out, the upload script (which also assumes an on-premises image source) will fail.
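A quick sketch of the rename in PowerShell (the new name is my own arbitrary choice):

Rename-Item -Path C:\Windows\Panther\unattend.xml -NewName unattend.xml.old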

The final three steps are:

to fully update the operating system using Windows Update;

to install the applications which you intend to use as RemoteApps (my demonstration uses the Adobe Lightroom trial – a digital image importing, cataloguing and editing application);

and to generalize your system, so that when Azure starts the image during the process of uploading and provisioning, the system does not find keys, users or passwords that would block the process.

The command required for generalization is

  C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown

This will place the VHD in the same state as immediately after installation, before entering the product key and administrator password (OOBE). Do remember that even though Sysprep (one of my favourite tools) has a switch designed for use with VMs, you should not use the /mode:vm switch, as this would cause Azure to reject the image.

Even though this is a very simple set of instructions, make sure that all of the steps are completed AND in the order stated. If you do not, expect a lot of red in your PowerShell results! (Hint: red is bad. I saw a lot of red in creating this post.)

Step 2 – upload your image.

Provided that you have completed step 1 correctly this is the easiest but longest part of the process.

First, in the Azure portal go to the RemoteApp section, select Template Images and then Upload, and answer the wizard options for name and location. (BIG NOTE – remember the location and make sure it is the same as the location of your RemoteApp service when you create it; if not, you will not see this image.)

[Screenshot: the template image upload wizard]

Having entered the template image name, the wizard will automatically start downloading the Upload-AzureRemoteAppTemplateImage.ps1 script (remember where you save this).

The wizard also contains a link to download the Azure PowerShell module and the details of the command to run; in this instance the command is below.

[Screenshot: the upload command generated by the wizard]

Here is the command in a more readable form. Simply copy and paste it into a Notepad document so you can reuse it (in case your first few tries fail):

.\Upload-AzureRemoteAppTemplateImage.ps1 -SAS “?sv=2012-02-12&sr=b&si=428348c2-81f9-4ce6-9021-635289465915&sig=DMAia%2F0qa%2B4pXoDad5mRyJEjx%2BTyWVNEJW1Ah%2BDXjnY%3D” -URI https://cdvwu110638459rdcm.blob.core.windows.net/goldimages/428348c2-81f9-4ce6-9021-635289465915.vhd

Ensure that your VHD image is stored on directly attached storage as the wizard does not pick up network shares.

Having downloaded the Azure PowerShell, go ahead and run this from the start screen as an administrator.


Change to the drive and directory where you downloaded the script, paste the command and run it. The script asks for the location of the VHD; select the correct image and off you go.


My upload took about 1.75 hours and was around 10GB in size.


From this point ALL the steps are identical to those in my first post. If the applications you want to use are not on the Start menu, they will not show up in the publish applications window and you will need to add them manually by path. (Hence I suggested you write down the path.)

[Screenshot: publishing an application by path]

Once the app is published, you can simply look for more app invitations if the Microsoft RemoteApp client is already installed and configured against a different Azure RemoteApp service. The new apps will then show up and you can run them at your leisure.


Note here that the first time you run an app within a session from EACH Azure RemoteApp service, it will take longer than usual, as the connection to the new session host has to be set up. Also note that despite there being different session hosts, there is only one connection to Azure RemoteApp – all connections go through it, as shown by the screenshots of Task Manager (Processes). See also that where I have many RemoteApps open, almost no processing power is used, and not that much RAM either. I can also run duplicate copies of the same application on the same machine (one RemoteApp and one local), also shown in the Task Manager screenshot.

[Screenshot: Task Manager with several RemoteApps open]

In the preview I am limited to ten concurrent sessions.


The sessions listing in the Azure console allows you to keep track of this. I can even log off remote sessions, disconnect them (for later reconnection) or send a message to the console of the user.


Of course, simply because it is easier for me to do quickly, I have shown all this on a Windows 8.1 client. It is available on any platform where a RemoteApp client exists (OS X, iOS, Android, Windows).

Again I have gone well over my self-imposed 1000-word limit, so the hybrid implementation will have to keep for another day. Happy Apping.


MCSE Update – Beware the path you choose to tread.


Despite my crossing the professional tracks from full-time MCT to full-time Microsoft employee as an evangelist, I am still keen to engage with anyone on a certification journey, and with those still teaching and training our hard-working IT pro community. As such I feel the need to react to the announcement that there is now a choice in the path you take to gain your MCSE certification for Communications, Messaging or SharePoint.

So first up, what was the position?

Well, when Microsoft released Windows Server 2012 they also reinvented the MCSA and MCSE certifications, which I covered here and here, so no need to repeat that. The key point for this post is that the ONLY route to any of the new MCSE badges was the MCSA Server 2012 (three exams – 410, 411 and 412 – each getting progressively more challenging).

* The caveat to this (there is always one of those) is that if you held certain prior certifications you could take the upgrade exam (417), which I found harder than the other three put together.

* The other caveat is that if you possessed the MCSA Server 2008 then you could use that to gain your MCSE for a specified time.

All that being equal, essentially the only route in now is to get your MCSA Server first, then take the two examinations for your chosen specialism, as shown in the graphic below.

[Diagram: the MCSA Server 2012 route to the MCSE specialisms]

As a key: Communications relates to the Lync product, Messaging to Exchange, and SharePoint is self-explanatory. But as I explained in the previous posts, the MCSE-level exams are not limited to single products like the MCSA exams; they are much more detailed, complex and wide-ranging in style and content.

So what has changed?

Well, with the release of the Office 365 MCSA, which I have posted about quite a number of times here, there is now the option to study for an MCSE without holding any on-premises server certifications.

Gaining the MCSA Office 365 is now an available route to the MCSE.


Those who have read my trials and tribulations in achieving the aforementioned MCSA will know that it certainly is not an easy option. But there are more detailed questions to ask.

Is it right to be an MCSE – the premium level of certification now that Master and Architect have been benched – without any tested knowledge of key skills such as DNS, DHCP, VPN, TCP/IP, Active Directory, ADFS and so on?

My immediate and resounding answer to that is NO! Certainly not.

The clever thing here is that by granting an MCSA after only two exams instead of three, it looks like a quick and easy way to sneak under the skills barrier.

Far from it.

I have no doubt that those who cheat will still cheat, and to them it makes no difference as they had none of those skills before. But those who choose the cloud route rather than on-premises will still need the whole list of skills I mentioned above to be successful in the Office 365 exams.

Both exams test a great deal of Active Directory, ADFS, PowerShell and authentication knowledge, in addition to the key skills for administering the individual online versions of Exchange, Lync and SharePoint. The final element is of course the other portals that are required to administer Office 365 and set up the subscriptions.

It may be only two exams, but having taken all the exams on both routes, the Server 2012 on-premises route was by far the less challenging for me.

This may be because I already had the skills as a basis; those who come in the future knowing no on-premises products will not, in which case either route will be seriously beyond their reach without a great deal of study.

I think this addition is a good step and opens up a number of possibilities for future IT pros to concentrate on a cloud-first route into certification. Whatever the route, they will possess one of the most respected IT certifications around, which will be of great value for the three years until re-certification is required!

 

Just how important is ‘The Cloud’ in a world of connected devices?



It doesn’t really matter if you are a consumer or an enterprise user. The number of devices you are likely to be using has increased from 0 or 1 to around 4 or 5 already. This is likely to increase as we consider the rapid expansion of the Internet of Things (IoT).

Already there are internet-connected cars, such as those built by Qoros in China, who have developed an innovative solution using QorosQloud. There are many features of the system, but the most attractive to me is the ability to create a navigation plan based on your calendar and its meetings.


Whilst sat in the office at a desktop PC, you can set a meeting and its location. If your car is parked a short distance away, when you pick up your smartphone the walking navigation route to your car (not your final destination) appears and activates; when you arrive at your car and start it, the journey continues automatically. When you park some distance from the meeting location, in a car park identified by Qoros, the smartphone guides you through the final steps on foot. Impressive? I think so. This system exists now and is entirely hosted in the Microsoft Azure cloud! Read the press release here.


This is just a taster of what a world of connected devices can do. The ability of an enterprise to maintain and support the infrastructure required to manage such a solution on a wide scale is beyond the means of all but the largest, most cash-rich organisations. Indeed, the cost of the hardware and logistics required to run such datacentres efficiently, professionally and with SLAs at acceptable levels of reliability is currently within the reach of only two or three organisations on the planet.

What impact does the cloud have on the average mobile consumer and worker?

So let’s assume you aren’t in China, don’t own a Qoros car and just sit on our little island commuting from Birmingham to Reading on an infrequent basis, splitting your working life four ways.

  • Home office,
  • Workplace,
  • Mobile, and
  • Customer or event sites.

This not only describes my working pattern but those of a good many people (if you replace the commute start and finish locations with others!)

Could I do my job, and live my fairly normal home life, without the cloud? Imagine the unthinkable: all non-on-premises resources become unavailable for a day.

What could I still do?

Let’s take a look at what services I use on a day to day basis.

[Image: the services I use day to day]

So my normal day would be completely limited to telephone calls and using my laptop, PC, iPad or Surface to do local things, saving them to local storage or removable disks for transport to work. If I had an Exchange account that was not on Office 365 I would still have email.* Some enterprises may keep some of their services on-premises, but for my working life this is not the case.

What, then, could I use to keep connected to my digital life?

In my working life, I could use all the old traditional tools like Server 2012 R2, SharePoint and Exchange, all hosted on-premises. I would face all the challenges that brings for scalability, elasticity and sharing between organisations using ADFS and other federated identity solutions. Without these I would be connected internally but isolated from my wider connected environment of customers, partners and vendors.

So what about the consumer? Essentially, without the cloud we are pretty much disconnected. The vast majority of internet services are now run on SaaS, PaaS or IaaS cloud solutions. The days of normal businesses being able to host their own public-facing services are gone.

I include in this all other cloud providers that are not on the scale of the big two or three.

If you think the cloud is just arriving, nothing could be further from the truth; we are already completely and utterly dependent upon cloud services of all types.

The future is only going to get cloudier. The management of mobile devices using technologies such as Windows Intune, and the provision of ground-breaking services such as Azure Machine Learning and Azure RemoteApp, are only proving just how valuable and important the cloud is to a world of connected devices today.


Already OneDrive is integrated into Office 2013 and Office 365, Google integrate cloud storage into Android handsets, and Apple integrate iCloud into the iPhone and iPad – there is no escaping the cloud. Whether or not we send our enterprise applications, infrastructure and data into a public-facing cloud, we are all already enmeshed!

 

The internet age brought us information at our fingertips. The cloud era brings us the ability to manipulate, consume and use that information in ways, at speeds and in volumes we would never have believed. Microsoft Azure Storage now stores trillions of objects.


The future world, with the explosion of the IoT, where everyday objects are internet-enabled and produce data but have little or no storage on board, will require another rapid increase in the resources available in the cloud to store and analyse that data into usable information.

I was determined not to focus on the social media aspects in this post, but I cannot finish without a brief mention: the rapid expansion of the different platforms and their monetisation has simply added to the drive towards cloud hosting of data and services. It is too late to go back; the duty now is to continue to keep the cloud a safe and secure place for us all to rely upon in our working and personal digital lives.

Azure RemoteApp – Why is it such a killer solution?

So imagine you are the CTO/CIO of a business (any size, it doesn’t matter – from 10 people to 10,000). Your clever team of developers have just implemented a top-notch line-of-business (LOB) application using the latest tools, on the Windows platform. The application becomes the de facto tool for all your employees and the business cannot run without it. You decide the time is right to sit back and relax a little, as the world is a good and happy place.

The following week, the executive board decide that your previously rigid policy of no access to the network except by a corporately owned, domain-joined device is out of the window, and employees may either bring their own device to work or select from a range of devices that are not limited to the Windows platform, such as Mac OS X, iOS and Android. The relaxing is over, and you need to come up with a solution fast to prevent a revolt in your development team. They know and love their programming in .NET and so on, but have never created apps for the other platforms.

What do you do?

The solution is to dive into Microsoft Azure RemoteApp: publish your LOB application either entirely in the cloud, or as part of a hybrid solution utilising Windows Server 2012 R2 Remote Desktop Services RemoteApp (which has been around since Windows Server 2008 R2).

It is currently in preview, as shown below.

[Screenshot: the Azure RemoteApp preview]

Azure RemoteApp allows you to publish a list of applications to users, or groups of users, which will then run on devices where they could not run natively. The only requirement on the client platform is the Microsoft Remote Desktop client, which is available for Windows, Mac OS X, iOS and Android.

The trial solution includes a whole host of built-in applications, as well as Office 2013. Before I walk you through the simple setup and deployment, let’s finish off the tech specs and business benefits.

Importantly, in the mobile devices era, the concern over data security is eliminated: the application runs on the Azure servers (or your on-premises ones), so the loss of the device does not lead to data exposure or loss.

Scaling up and down is rapid, cost-effective and easy to achieve on the Azure platform. This can cater for rapid expansion or decline in service requirements. Seasonal workers can be accommodated without the capital costs and time involved in procuring and deploying servers. Equally importantly, scaling down doesn’t leave expensive assets switched off and unproductive.

Employees can use either their corporate credentials or a Microsoft Account, whichever you choose.

Sound good? Well, if you want to follow this walkthrough, simply sign up to an Azure trial (if you aren’t already an Azure customer) by clicking here. Then, once you are in, use any of the helpful tutorials or product documentation to get up to speed with the portal and navigation. Finally, to prepare you for the session, sign up to the Azure RemoteApp preview here.

Once this has been accepted and you are comfortable with the portal, read on. (The rest of the post assumes a basic knowledge of Azure services and principles.)

Sign in to your portal at https://manage.windowsazure.com where a list of your previously created items will be displayed.

[Screenshot: the Azure portal home page]

Scroll down the list of services until you find the RemoteApp section, select this tab.

Select + New at the bottom left of the screen, then App Services followed by RemoteApp (you may have to scroll the list down a little; the number of Azure services is growing rapidly). Two choices are available: Quick Create or Create with VPN. The latter is for hybrid deployments; for now we will choose Quick Create, as below.

[Screenshot: the Quick Create option]

Enter a name for your service and the location in which you want it hosted. The number of regions is also growing; I always select North Europe or West Europe, which relate to our datacentres in Dublin and Amsterdam respectively (note Dublin is North Europe!). Currently the only template image available is Office 2013 ProPlus running on a Windows Server 2012 R2 platform. (Although you will never see or know about the servers, Microsoft patch, update and maintain them in the background as part of the SLA.)

Once your service has been created it will appear under RemoteApp services. Select the service by clicking on it, and the service setup page will be displayed.

The next stage is to configure which applications and which users may take advantage of this service.

[Screenshot: the RemoteApp service dashboard]

Select the ‘publish remote programs’ option and add / remove from the list of available applications shown below.

[Screenshot: selecting programs to publish]

Having chosen the required built-in apps, you can now go ahead and click on ‘configure user access’.

Simply enter the email addresses of your users. You can even add groups to segment which users have access – these groups would come from the built-in Azure default Active Directory.

[Screenshot: configuring user access]

The remainder of this tab hosts a published apps list and a session list showing the status of all connected users.

Clicking on the dashboard will show your current and available session usage and other image and template information.

You are now all set. All that is left is to download the client application from https://www.remoteapp.windowsazure.com/, which will default to the Windows application page, but a link will take you to the all-clients download page. These are shown below and include Windows (x86, x64 and 8.1 RT), Android, iOS (iPad and iPhone) and Mac (OS X).

[Screenshot: the client download page for each platform]

Having done that, it is simply a matter of running the app, logging in with the registered email address and choosing the Windows application to run.

Let’s have a look at the client experience on a Windows 8.1 device (simply as I am being lazy with the time required to screenshot on an iPad and transfer the files – on a deadline!)

The first thing to remember is that, like on-premises RemoteApp, the first time you connect to an application the Remote Desktop Protocol session is initialised and authenticated, so the app takes a short while to open. All other apps hosted on the same host then open in that session and are much quicker. (Remember, if you close all the RemoteApps the session will be closed, so future starts will be slower again.)


Here it’s worth pointing out that the RemoteApp client can connect simultaneously to more than one RemoteApp service so an end user can have access using the same email address to many different RemoteApps from many different providers.


So run the application, in this case from the Start screen.


Log in with your registered email address and you will find an empty window.


Click on App invitations and you will see any provider invitations to connect to a RemoteApp service. Choose the one you need and the window is then populated with all the published apps. This can be refreshed mid-session to see if any apps have been unpublished.

[Screenshots: app invitations, and the populated app window]

From there, run the applications as you want, remembering that your PC, iPad, Mac or Android device is not doing the processing – the remote machine is doing that. All you are doing is running a remote connection to it and presenting the results on your screen, so the data remains on that server. You can connect to your OneDrive, which exposes your cloud-shared files (currently each user has 1TB of storage). See the screenshot below of the Task Manager Processes tab on my home PC when running four Azure RemoteApp programs: a total of 170MB of RAM and 0.1% of processor.

[Screenshot: Task Manager running four RemoteApp programs]

I have shown below some screenshots of PowerShell and CMD windows open, having run whoami and ipconfig, to show that the host is the same for each app and that the IP scheme and networking information are specific to the session.

[Screenshots: whoami and ipconfig output]

I have also shown a shot of the Open dialog in RemoteApp Excel and the temporary storage set up for the system, with a handily placed README.DOC warning not to store files there.

[Screenshots: the Open dialog and the README warning]

The files you do not want on OneDrive can be saved locally. Try creating a file, closing the session and accessing it again – it is all persistent, so your data is stored safely in Azure.

[Screenshot: a RemoteApp PowerShell session on the taskbar]

Here we see a RemoteApp Windows PowerShell session (I would run that, wouldn’t I?). You can see from the taskbar that the icon has a RemoteApp symbol on it (in green).

I have gone well over my normal word count and haven’t even touched on uploading your own images to host in the cloud, or on creating VPNs to run RemoteApp programs on-premises through Azure.

These will have to be for another day, but both involve a lot of PowerShell, always PowerShell!

MVA Heroes on tour

The TechNet UK IT pro team have been running a very successful competition over the summer. The aim is to promote the Microsoft Virtual Academy and drive IT pros to its content in a structured way, with a fun approach to the competition: the MVA Heroes.

It is a universal truth that IT pros love free stuff, or swag as it is known. They love prizes like laptops and phones even more!

The competition is over, but the collecting of the MVA Super Heroes goes on. We have sent many hundreds out to the eager-beaver students who complete the courses that qualify them for one of the six figurine stress toys.

[Image: the six MVA Hero figurines]

Well, last month our marketing manager Dan Pilling attended our annual marketing extravaganza known as MGX – it was a special occasion for Dan, as he won a very special award and was inducted into the Platinum Club. Dan decided to post a picture of one of the figurines on his plane seat and with celebrities he bumped into at the conference.

See here. So that got me thinking: I spent last week in Seattle at an internal technical conference, so I took my hard-earned Super Heroes with me and decided I would try and do a little better.

The results are below.

[Photo]

My plane seat with two heroes to keep me safe

[Photo]

Larry Kaye, Development Certification Product Manager from Microsoft Learning

[Photo]

Four Heroes take a trip to Seattle


Finally my favourite Hero meets a very famous person.

The first comments that identify the hero and the Tech Celebrity win a bit of swag from my collection.

[Photo]

More importantly, why not take your hero or your MVA Hero T-shirt when you travel? Send me pictures, and who knows, you may pick up some special tech swag… if it’s far enough away, that is. Seattle was 4781 miles away; can you do better?

Office 365 MCSA – Post Exam feedback!

If you have been following my journey to the Office 365 MCSA certification in previous posts, you will know that I was scheduled to take the exam on Wednesday 30th July whilst attending the Microsoft internal TechReady conference.

Not many people outside Microsoft will have heard about TechReady; that is because the event is for full-time employees to learn the new products, strategies and roadmaps for the following six months, and for very good reasons it is not broadcast like the TechEd events.

The similarity between TechEd and TechReady includes the provision of Certification Prep sessions and a Certification hall for attendees to schedule and take MCP examinations whilst onsite.

I don’t have the stats for the numbers taken and passed during the week, but I can tell you that I was in the largest Prometric exam hall I have ever seen; there were over 100 testing stations and a whole bunch of proctors to keep an eye on us.

I had spent several evenings in Seattle with my nose buried in the books (or screens) to ensure that I was successful. The conference pretty much runs from 0700 to 1900 every day, so the extra effort to study through jet-lag was no little work. (I don’t travel well, and the 8-hour time difference was a not insignificant factor in my studies.)

I did enjoy the surroundings of the Seattle Sheraton though, and the weather was superb all week.

[Photo]

Having attended a prep session for the 70-346 and 70-347 exams on the Tuesday afternoon, I decided to reschedule the exam to that evening. So in I went and sat the test.

I will leave the result until the end of the post, but I will say that without that level of study I would have done much, much worse. Obviously the NDA you agree to when taking exams prevents me from disclosing the details, but I can say a few things which will help those of you preparing for the experience of 70-347, Enabling Office 365 Services.

The exam I sat was a standard one with a wide range of the item types I have listed before. These included PowerShell build questions, drag and drop, hot spot and other types.

The overriding point I would like to make is that the exam was already significantly out of date in a few ways. Anyone who is a Microsoft Azure or Office 365 user will know that these cloud-based IaaS, PaaS and SaaS products change almost weekly. (Why not sign up for free trials now and check them out?)

This means that screenshots in your exam may not look like the current product; I had several instances where this was the case. The strategy for exams based on these products is under review (I met and spoke to the LeX team whilst at the conference).

I am prevented from giving too much detail, and giving the number of questions is fairly pointless, as these change often and when new questions are under test they are added in too.

I can and will say that the exam is a fair test of the whole product, covering SharePoint Online, Lync Online and Exchange Online, as well as the Office 365 portal and admin consoles and the PowerShell required to run these products from the command line. It was not easy, and it caused me no end of headaches in the review of the questions.

My top exam-taking tip is that your first answer is normally your best one; changing your answers just gets you in a mess.

I am pleased to say that I was successful with no wasted effort: I scored exactly the mark required to pass, no more, no less, as you can see from the exam sheet below.

[Image: my score report]

If you follow the Microsoft psychometrician Liberty Munson on Born2Learn, you will know that a score of 700 is not 70% and does not reflect a result worse than a score of 800 or 900. Personally I have never understood this and look forward to someone explaining it to me in words I understand. I have shown this score to point out that even though I studied really, really hard, know the product very well indeed and love PowerShell, had I got one more question wrong I would have failed this test.

So study hard, study the correct things and Good Luck.

The bottom line, however, is that I now have the MCSA Office 365 to add to my transcript. This will help me in my career as a Technical Evangelist, will flag me up on LinkedIn to people looking for speakers and allow me to teach the two courses related to the certification.

[Image: MCSA Office 365 logo]

How to keep company data secure in a ‘mobile first, cloud first’ environment


It’s a brave new world – a mobile-first, cloud-first world of technology. In that world there are many new ways of consuming data, on many new devices. Data security is of paramount importance to any user, IT pro, small business or enterprise; it is a universal requirement of making that data available.

The operating system, platform and delivery method of the data should not cause any increase in the risk to it. In short, the corporate entity must have complete control over the access to, consumption of and removal of data in terms of users, devices and platforms at all times.


That’s a fairly big ask. It is also one of the primary barriers to the adoption of cloud technologies and people-centric IT (PCIT).

What is Microsoft’s approach to this rather thorny issue then?

As you might imagine, Microsoft has a number of methods of achieving this end game. The adoption of these depends on whether you are fully cloud, hybrid or fully on-premises for your infrastructure and data requirements. One thing is for sure: they certainly have it covered.

For an on-premises scenario with a number of BYOD and corporately issued smartphones and tablets, the solution involves a number of products, including Windows Server 2012 R2, Windows Intune and System Center Configuration Manager (2012 R2). The elements of the server platform that assist with this solution are Active Directory Domain Services (AD DS), Dynamic Access Control, Active Directory Rights Management Services (AD RMS), Active Directory Federation Services (ADFS) and the all-new Web Application Proxy.

For a Microsoft Azure based solution, the new Enterprise Mobility Suite (EMS) is designed to cater for most of the same functionality. The EMS consists of Azure Active Directory Premium (for hybrid identity management and multi-factor authentication, as well as other added functions), Windows Intune and Azure Rights Management Services.

The Hybrid cloud customer would be able to take advantage of all these products to manage their data.

As an additional portion of goodness, Windows Server 2012 R2 also comes with Workplace Join and Work Folders.
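As a taster, enabling Work Folders on a file server takes only a couple of lines of PowerShell – a sketch under my own naming assumptions (FS-SyncShareService is the Work Folders role service):

# Add the Work Folders role service, then create a sync share for a security group
Install-WindowsFeature FS-SyncShareService
New-SyncShare -Name "CorpWorkFolders" -Path "D:\WorkFolders" -User "CONTOSO\Staff"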


If all of this isn’t enough security, Microsoft also has a scalable and robust Virtual Desktop Infrastructure solution for a whole number of different scenarios, which can actually prevent the data leaving the corporate network at all, whilst still giving remote users a standard interface and experience. (These include session virtualisation and desktop virtualisation, both pooled and personal, and, with Microsoft App-V, the ability to stream applications too.)

The final piece of the jigsaw is the new Azure RemoteApp, currently in preview, which now allows a cloud-based solution for application virtualisation. (RemoteApp is also available for your on-premises Windows Servers too.)

It is important to point out that the overall people-centric IT vision is not restricted to data security and management, but takes a three-pronged approach: enabling end users, unifying the environment and protecting data. Take a look at the PCIT whitepaper here.

So, assuming you have visited the links and read the whitepaper (which, after all, is why they are linked…), you now know the field. But what about the practicalities, and what scenarios are covered?

Gosh Ed, that’s 500 words where you have pretty much just listed solutions to data security in a whole bunch of scenarios. How do we use these and how do we choose what to use in what situation?

The rest of this post is dedicated to three examples of when to use these solutions. I will then go on to a more detailed technical explanation in a series of future posts dedicated to each solution.

So, Scenario 1.

An iPad user wants access to their corporate intranet and to files and folders, some of which are business-critical data files. What can we do to allow this access, but control the device and ensure the data is secured?

The iPad has the facility to download the profile settings and join a workplace environment without being formally domain-joined. This allows access to a company portal, for access to websites and applications.

[Screenshot: Workplace Join on an iOS device]

Using Windows Intune, an administrator can enforce policies for security and data wipe on the iOS device, securing the data.

[Screenshot: creating a mobile device management policy in Intune]
For access to secure data, or to applications incompatible with iOS, a virtual desktop (Microsoft VDI) or a Microsoft Azure VM could be used to keep the data off the device and allow access to the application from an otherwise incompatible operating system.

Scenario 2

An Android smartphone user wants access to work email and to files and folders for work use.

Windows Intune will secure the data and allow remote wipe of the device and/or the data if required. Policies may be applied by the administrator to ensure that the device has a password and that encryption is also enforced.


Scenario 3

A Windows RT 8.1 tablet user wants to use a non-domain-joined tablet for work access to email and applications, as well as Work Folders for data.

The combination of Windows Server 2012 R2 and the EMS suite will allow the administrator to provide Workplace Join, Work Folders and software deployment, as well as endpoint protection for the device. Additional policies may be applied with Windows Intune to enforce rules and security of the data, and to remotely wipe the device/data if required.

In a ‘mobile first, cloud first’ world of devices and data, security is always a concern, but the solutions available from Microsoft allow complete control of data access, security, integrity and removal. Don’t forget, of course, that EMS is powered by Microsoft Azure, and you can control your Azure subscription with – yes, you guessed it – PowerShell!
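For example, with today’s Azure PowerShell module, connecting up and listing your VMs is as simple as the sketch below (the subscription name is a placeholder of mine):

# Sign in, pick a subscription and list its virtual machines
Add-AzureAccount
Select-AzureSubscription -SubscriptionName "My Subscription"
Get-AzureVM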

Watch this space for the detailed technical solutions for the three scenarios above, with a special one for the Web Application Proxy all on its own. This ground-breaking server role replaces the Active Directory Federation Services proxy role and also does so much more!