New release to manage? Top tips on communicating, educating and activating users.


As IT professionals, it is all too easy to lose sight of the impact our daily job can have on the poor, unsuspecting species known as the 'user'. Having spent the last twenty-plus years working entirely in or around the IT industry, it could be said that I am both long in the tooth and set in my ways. The former is probably true; I sincerely hope the latter is not. Those of us not willing to change will indeed wither and fade.

Twenty years ago the needs and wants of the user were a secondary consideration. What equipment we used, what the applications and devices were capable of, how they looked and indeed the overall user experience were not foremost in the minds of the designer or implementer. This may not have been true across the board, but I have certainly been subjected to some seriously poor implementations in both the public and private sectors.

Today the needs and wants of the user seem to be a primary consideration, which can only be a good thing. Not only should applications be designed to make human-computer interaction as easy and intuitive as possible, but the actual deployment and updating of the solution should also take users into consideration.

This is not a new idea. The IT Infrastructure Library (ITIL) has a section covering Release Management:

 

6.3.4 Release and deployment management

The purpose of the release and deployment management process is to plan, schedule and control the building, testing and deployment of releases, and to deliver new functionality required by the business while protecting the integrity of existing services.

 Definition: release

One or more changes to an IT service that are built, tested and deployed together. A single release may include changes to hardware, software, documentation, processes and other components.

 Effective release and deployment delivers significant business value by delivering changes at optimized speed, risk and cost, and offering a consistent, appropriate and auditable implementation of usable and useful services.

 Release and deployment management covers the whole build, test and implementation of new or changed services, from planning through to early life support.

As a headline set of principles this is laudable; however, the communication and training process is critical to the acceptance of a system by its users. No matter how good the product, if it does not win the 'hearts and minds' it is unlikely to succeed.

For that reason the IT Pro has to be not only marketing savvy but a proficient marketer too.

It may not be possible to share with all users the detailed roadmap and dates of new products, or even of significant updates to current products, for a number of reasons ranging from the political and financial to the business-critical.

It should, though, be possible to provide a comprehensive outline of what to expect as a user. Once the roadmap is revealed, the communication of any significant outages, product limitations and even required training or activation processes becomes critical to the success of a release.

It is useless to have a sophisticated IT service if the users are unable to make the most of it through lack of communication or training, in addition to all the other ITIL processes shown below.

[Diagram: the ITIL release management processes]

Producing a fully rounded release requires all of these processes. Marketing it well and delivering it successfully relies upon communication and training.

So if you are an IT Pro considering deploying something new to users, have you thought about:

 

  • A SharePoint site developed purely for this release, giving all users access to documentation, FAQs and even e-learning training courses.
  • Working with the developers to make sure that it is the users who matter, not just a 'throw it over the wall' type of deployment – from Dev to Ops to user. Visual Studio Online and Visual Studio work very well together to provide a rounded development solution.
  • Providing an email alias for users to submit questions and – heaven forbid – bugs to.
  • Providing regular feedback to users, either through the SharePoint site or a website (Microsoft Azure is good for hosting this, of course).

With the speed of releases increasing for all types of deployment, from minor updates to completely new solutions, the user could easily be forgotten. This is not a good idea.

 

Remember the old adage ‘hell hath no fury like a user scorned’, or something like that.

Azure RemoteApp Part 2 – More RemoteAppyness from Azure


Having published a post on Azure RemoteApp recently, I have been absolutely inundated with requests (well three people have asked me) for information on how to use Azure RemoteApp with an application that is not published by default and that I DON’T want to run on my own premises.

 

So here is part 2 of 3 in the Azure RemoteApp story of goodness: publishing my own apps to Azure RemoteApp.

As a keen photographer I rely heavily on Adobe Lightroom, so I decided I would deploy a 30-day trial of it to Azure RemoteApp.

To be honest, I had thought this post would be difficult to describe and deliver, but at least 60% of the tasks are identical to those in the first post. The ONLY difference is that you must create and upload your own image, in the form of a Virtual Hard Disk (VHD), to Azure blob storage (no, I'm not being insulting to Azure storage – a blob is a Binary Large Object).

The tricky part is that there are a bunch of prerequisites for this image, its internal setup and the applications on it, so some patience is required.

To follow these steps you must already have an Azure subscription (a trial will do) and you must have the Azure RemoteApp preview service enabled. Both of these steps are described, with links, in my previous post, here.

Step 1 – Create your Image.

There are a number of ways to do this. Of course, PowerShell is the best way if you want to repeat the process, and I will publish a handy script to do the basics another time. For now, just use either Hyper-V or Disk Management to create a dynamically expanding VHD with a Master Boot Record (MBR) partition type. All three of those factors are mandatory.

Azure is not yet able to handle the newer VHDX format, so make sure it's a VHD. A dynamically expanding disk is much more efficient for the uploading and blob storage process, and the Azure system only handles the MBR partition style rather than the GPT (GUID Partition Table) style.
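If you would rather script the disk creation than click through Disk Management, something along these lines should do the job. This is only a minimal sketch, assuming the Hyper-V PowerShell module is available and using a made-up path and size:

  # Create a dynamically expanding VHD (not VHDX)
  New-VHD -Path 'D:\Images\goldimage.vhd' -SizeBytes 40GB -Dynamic

  # Mount it, initialise it with an MBR partition table and format a volume
  $disk = Mount-VHD -Path 'D:\Images\goldimage.vhd' -Passthru | Get-Disk
  Initialize-Disk -Number $disk.Number -PartitionStyle MBR
  New-Partition -DiskNumber $disk.Number -UseMaximumSize -AssignDriveLetter | Format-Volume -FileSystem NTFS -Confirm:$false
  Dismount-VHD -Path 'D:\Images\goldimage.vhd'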

Once you have created this, you can go ahead and create a VM based on that disk. The next prerequisite is that you MUST install Windows Server 2012 R2 on the disk. The machine can have several volumes, BUT only one instance of Windows Server can be installed. This caters for the scenario where the application you wish to use as a RemoteApp cannot co-exist on a system partition.

Having found a Windows Server 2012 R2 ISO or DVD and installed it to the VM (you will need to enter a valid licence key, which will be stripped out later on), there are several more tasks to complete before you can upload your VHD as a ‘gold’ image to the Azure storage platform in your subscription.

First, use either Server Manager or PowerShell to add the Desktop Experience feature, which is hidden under the User Interfaces and Infrastructure section.
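The PowerShell route is a one-liner. A minimal sketch, assuming an elevated PowerShell session on the new VM:

  # Add the Desktop Experience feature (under User Interfaces and Infrastructure)
  # A reboot will almost certainly be needed afterwards
  Install-WindowsFeature -Name Desktop-Experience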

[Screenshot: adding the Desktop Experience feature in Server Manager]

This will almost certainly require a reboot.

Having rebooted the VM, again use Server Manager or PowerShell to add the Remote Desktop Services role and, during the setup stage of the wizard, add only the RD Session Host option. The normal process for installing RDS is to choose the Remote Desktop Services installation option, as most of the work of setting up your systems and servers is then carried out for you.

That approach will not work here for a number of reasons, not least because it requires a domain structure, so for this install you will need to choose Role-based or feature-based installation.
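If you would rather skip the wizard, PowerShell can add just the role service you need. A minimal sketch (RDS-RD-Server is the feature name for the RD Session Host role service):

  # Add only the RD Session Host role service
  Install-WindowsFeature -Name RDS-RD-Server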

[Screenshot: choosing Role-based or feature-based installation]

There are now a few settings to make to ensure that the Azure upload works ok.

First, disable the Encrypting File System (EFS). This requires the following command, entered at an elevated command prompt.

Fsutil behavior set disableencryption 1

If you know what you are doing and are used to hacking around in your registry, you can instead set the following DWORD:

HKLM\System\CurrentControlSet\Control\FileSystem\NtfsDisableEncryption = 1

(The usual caveats apply around backing up the registry and not doing this unless you know what you are doing.)
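To keep everything in PowerShell, here is a minimal sketch of the same registry change (the same caveats apply):

  # Disable EFS via the registry – equivalent to the fsutil command above
  Set-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\FileSystem' -Name 'NtfsDisableEncryption' -Value 1 -Type DWord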

So far I have assumed that you are creating this image on-premises, which in this 'Cloud-First, Mobile-First' world is an error on my part. There is an additional step to take if you are creating your image within an Azure VM.

There is an XML file stored at \Windows\Panther\unattend.xml which needs to be renamed or deleted. If this step is not carried out, the upload script (which also assumes an on-premises image source) will fail.
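A minimal sketch of that step, run inside the Azure-hosted VM (renaming rather than deleting, so it can be put back if needed):

  # Rename the unattend file so the upload script does not trip over it
  Rename-Item -Path 'C:\Windows\Panther\unattend.xml' -NewName 'unattend.xml.bak'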

The final three steps are:

  • Fully update the operating system using Windows Update.
  • Install the applications you intend to publish as RemoteApps (my demonstration uses Adobe Lightroom, a digital image importing, cataloguing and editing application).
  • Generalize your system, so that when Azure starts the image during the process of uploading and provisioning it does not find keys, users or passwords that would block the process.

The command required for generalization is:

  C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown

This will place the VHD in the same state as immediately after installation, before entering the product key and administrator password (OOBE). Do remember that even though Sysprep (one of my favourite tools) has a switch designed for use with VMs, you should not use the /mode:VM switch, as this would cause Azure to reject the image.

Even though this is a very simple set of instructions, make sure that all of the steps are completed AND in the order stated. If you do not, expect a lot of red in your PowerShell results! (Hint: RED is bad. I saw a lot of red in creating this post.)

Step 2 – Upload your image.

Provided that you have completed Step 1 correctly, this is the easiest but longest part of the process.

First, in the Azure portal go to the RemoteApp section, select Template Images and then Upload, and answer the wizard's name and location options. (BIG NOTE – remember the location and make sure it is the same as the location where you create your RemoteApp service; if not, you will not see this image.)

[Screenshot: creating the template image in the Azure portal]

Having created the template image name, the wizard will automatically start downloading the Upload-AzureRemoteAppTemplateImage.ps1 script (remember where you save this).

The wizard also contains a link to download the Azure PowerShell module, and the details of the command to run; in this instance the command is shown below.

[Screenshot: the upload command displayed by the wizard]

Here is the command in a more readable form; simply copy and paste it into a Notepad document so you can reuse it (if your first few tries do fail):

.\Upload-AzureRemoteAppTemplateImage.ps1 -SAS “?sv=2012-02-12&sr=b&si=428348c2-81f9-4ce6-9021-635289465915&sig=DMAia%2F0qa%2B4pXoDad5mRyJEjx%2BTyWVNEJW1Ah%2BDXjnY%3D” -URI https://cdvwu110638459rdcm.blob.core.windows.net/goldimages/428348c2-81f9-4ce6-9021-635289465915.vhd

Ensure that your VHD image is stored on directly attached storage as the wizard does not pick up network shares.

Having downloaded the Azure PowerShell module, go ahead and run it from the Start screen as an administrator.

[Screenshot: Azure PowerShell on the Start screen]

Change to the drive and directory where you downloaded the script, paste the command in and run it. The script asks for the location of the VHD; select the correct image and off you go.

[Screenshot: the upload script running]

My upload took about 1.75 hours and was around 10GB in size.

[Screenshot: the completed upload]

From this point ALL the steps are identical to those in my first post. If the applications you want to use are not on the Start menu, they will not show up in the publish applications window and you will need to add them manually by path (hence my suggestion that you write down the path).

[Screenshot: adding an application manually by path]

Once the app is published, if the Microsoft RemoteApp client is already installed and configured for a different Azure RemoteApp service you can simply look for more app invitations. The new apps will then show up and you can run them at your leisure.

[Screenshot: refreshing app invitations in the RemoteApp client]

Note here that the first time you run an app within a session from EACH Azure RemoteApp service, it will take longer than usual, as the client has to set up the connection to the new session host. Also note that despite there being different session hosts, there is only one connection to Azure RemoteApp – all connections go through it, as shown in the Task Manager (Processes) screenshot. See also that even with many RemoteApps open, almost no processing power is used, and not that much RAM either. I can even run duplicate copies of the same application on the same machine (one RemoteApp and one local), also shown in the Task Manager screenshot.

[Screenshot: Task Manager showing RemoteApp processes]

In preview, the number of concurrent sessions I can run is limited to ten.

[Screenshot: the ten-session limit in preview]

The sessions listing in the Azure console allows you to keep track of this. I can even log off remote sessions, disconnect them (for later reconnection) or send a message to the user's session.

[Screenshot: log off, disconnect and send message options for sessions]

Of course, simply because it was easier for me to do quickly, I have shown all this on a Windows 8.1 client. It is available on any platform where a RemoteApp client exists (OS X, iOS, Android, Windows).

Again I have gone well over my self-imposed 1000 word limit so will keep the hybrid implementation of Azure for another day. Happy Apping.

 

 

MCSE Update – Beware the path you choose to tread.


Despite my crossing the professional tracks from full-time MCT to full-time Microsoft employee as an evangelist, I am still keen to engage with anyone on a certification journey, and with those still teaching and training our hard-working IT Pro community. As such, I feel the need to react to the announcement that there is now a choice in the path you take to gain your MCSE certification for Communications, Messaging or SharePoint.

So first up, what was the position?

Well, when Microsoft released Windows Server 2012, they also reinvented the MCSA and MCSE certifications, which I covered here and here, so no need to repeat that. The key point for this post is that the ONLY route to any of the new MCSE badges was the MCSA Server 2012 (three exams – 410, 411 and 412 – each getting progressively more challenging).

* The caveat to this (there is always one of those) is that if you held certain prior certifications you could take the upgrade exam (417), which I found harder than the other three put together.

* The other caveat is that if you possessed the MCSA Server 2008 then you could use that to gain your MCSE for a specified time.

All that being equal, essentially the only route in was to get your MCSA Server first, then take the two examinations for your chosen specialism, as shown in the graphic below.

[Diagram: the MCSE certification paths]

As a key: Communications relates to the Lync product, Messaging to Exchange, and SharePoint is self-explanatory. But as I explained in the previous posts, the two MCSE-level exams are not limited to single products like the MCSA-level ones; they are much more detailed, complex and wide-ranging in style and content.

So what has changed?

Well, with the release of the Office 365 MCSA, which I have posted about quite a number of times here, there is now the option to study for an MCSE without holding any on-premises server certifications.

Gaining the MCSA Office 365 is now an available route to the MCSE.


Those who have read my trials and tribulations in achieving the aforementioned MCSA will know that it certainly is not an easy option. But there are more detailed questions to ask.

Is it right to be an MCSE – the premium level of certification now that Master and Architect have been benched – without any tested knowledge of key skills such as DNS, DHCP, VPN, TCP/IP, Active Directory, ADFS and so on?

My immediate and resounding answer to that is NO! Certainly not.

The clever thing here is that granting an MCSA after only two exams instead of three makes it look like a quick and easy way to sneak under the skills barrier.

Far from it.

I have no doubt that those who cheat will still cheat, and to them it makes no difference, as they had none of those skills before. But those who choose the cloud route rather than the on-premises one will still need the whole list of skills I mentioned above to be successful in the Office 365 exams.

Both exams test a great deal about Active Directory / ADFS / PowerShell / Authentication in addition to the key skills for the administration of the individual online versions of Exchange, Lync and SharePoint. The final element is of course the other portals that are required to administer Office 365 and setup the subscriptions.

It may only be two exams but, having taken all the exams in both routes, the Server 2012 on-premises route was by far the less challenging for me.

That may be because I already had the skills as a basis; those who come to it in the future knowing no on-premises products will not, in which case either route will be seriously beyond their reach without a great deal of study.

I think this addition is a good step and opens up a number of possibilities for future IT Pros to concentrate on the cloud first route into certification. Whatever the route, they will possess one of the most respected IT certifications around which will be of great value for the three years until re-certification is required!

 

Just how important is ‘The Cloud’ in a world of connected devices?



It doesn’t really matter if you are a consumer or an enterprise user. The number of devices you are likely to be using has increased from 0 or 1 to around 4 or 5 already. This is likely to increase as we consider the rapid expansion of the Internet of Things (IoT).

Already there are internet-connected cars, such as those built by Qoros in China, who have developed an innovative solution using QorosQloud. There are many features of the system, but the most attractive to me is the ability to create a navigation plan based on your calendar and its meetings.


Whilst sat in the office at a desktop PC, you can set a meeting and its location. If your car is parked a short distance away, when you pick up your smartphone the walking navigation route to your car (not your final destination) appears and activates; when you arrive at your car and start it, the journey continues automatically. When you park some distance from the meeting destination, in a car park identified by Qoros, the smartphone guides you the final steps on foot. Impressive? I think so. This system exists now and is entirely hosted by the Microsoft Azure cloud! Read the press release here.


This is just a taster of what a world of connected devices can do. The ability of an enterprise to maintain and support the infrastructure required to manage such a solution on a wide scale is beyond the means of all but the largest, most cash-rich organisations. Indeed, the cost of the hardware and logistics required to run such datacentres efficiently, professionally and with SLAs at acceptable levels of reliability is currently within the reach of only two or three organisations on the planet.

What impact does the cloud have on the average mobile consumer and worker?

So let's assume you aren't in China, don't own a Qoros car and just sit on our little island commuting from Birmingham to Reading on an infrequent basis, splitting your working life four ways:

  • Home office
  • Workplace
  • Mobile
  • Customer or event sites

This not only describes my working pattern but those of a good many people (if you replace the commute start and finish locations with others!)

Could I do my job, and live my fairly normal home life, without the cloud? Imagine the unthinkable: all non-on-premises resources become unavailable for a day.

What could I still do?

Let’s take a look at what services I use on a day to day basis.

[Image: the services I use day to day]

So my normal day would be completely limited to telephone calls and using my laptop, PC, iPad or Surface to do local things, saving them to local storage or removable disks for transport to work. If I had an Exchange account that was not on Office 365 I would still have email.* Some enterprises may keep some of their services on-premises, but for my working life this is not the case.

What, then, could I use to keep connected to my digital life?

In my working life I could use all the old traditional tools – Server 2012 R2, SharePoint and Exchange, all hosted on-premises – and face all the challenges that brings for scalability, elasticity and sharing between organisations using ADFS and other federated identity solutions. Without the cloud I would be connected internally but isolated from my wider connected environment of customers, partners and vendors.

So what about the consumer? Essentially, without the cloud we are pretty much disconnected. The vast majority of internet services now run on SaaS, PaaS or IaaS cloud solutions. The days of normal businesses being able to host their own public-facing services are gone.

I include in this all other cloud providers that are not on the scale of the big two or three.

If you think the cloud is only just arriving, nothing could be further from the truth – we are already completely and utterly dependent upon cloud services of all types.

The future is only going to get cloudier. The management of mobile devices using technologies such as Windows Intune, the provision of ground-breaking services such as Azure Machine Learning and Azure RemoteApp are only proving just how valuable and important the cloud is to a world of connected devices today.


Already OneDrive is integrated into Office 2013 and Office 365, Google integrates cloud storage into Android handsets, and Apple integrates iCloud into iPhone and iPad. There is no escaping the cloud: whether or not we send our enterprise applications, infrastructure and data into a public-facing cloud, we are all already enmeshed!

 

The internet age brought us information at our fingertips. The cloud era brings us the ability to manipulate, consume and use that information in ways, at speeds and in volumes we would never have believed. Microsoft Azure Storage now stores trillions of objects.


The future world, with the explosion of the IoT, where everyday objects are internet-enabled and produce data but have little or no storage on board, will require another rapid increase in the resources available in the cloud to store and analyse that data into usable information.

I was determined not to focus on the social media aspects in this post, but I cannot finish without a brief mention: the rapid expansion of the different platforms and their monetisation has simply added to the drive towards cloud hosting of data and services. It is too late to go back; the duty now is to keep the cloud a safe and secure place for us all to rely upon in our working and personal digital lives.

Azure RemoteApp – why is it such a killer solution?

So imagine you are the CTO or CIO of a business (any size – it doesn't matter, from 10 people to 10,000). Your clever team of developers have just implemented a top-notch line-of-business (LOB) application using the latest tools on the Windows platform. The application becomes the de facto tool for all your employees and the business cannot run without it. You decide the time is right to sit back and relax a little, as the world is a good and happy place.

The following week, the executive board decide that your previously rigid policy of no access to the network except from a corporately owned, domain-joined device is out of the window, and employees may either bring their own device to work or select from a range of devices not limited to the Windows platform, such as Mac OS X, iOS and Android. The relaxing is over and you need to come up with a solution fast to prevent a revolt in your development team. They know and love their programming in .NET and the like, but have never created apps for the other platforms.

What do you do?

The solution is to dive into Microsoft Azure RemoteApp: publish your LOB application either entirely in the cloud, or as part of a hybrid solution utilising Windows Server 2012 R2 Remote Desktop Services RemoteApp (which has been around since Windows Server 2008 R2).

It is currently in preview, as shown below.

[Screenshot: the Azure RemoteApp preview label]

Azure RemoteApp allows you to publish a list of applications to users, or groups of users, that will not run natively on their devices. The only requirement on their platform is the Microsoft Remote Desktop client, which is available for Windows, Mac OS X, iOS and Android.

The trial solution includes a whole host of built-in applications, as well as Office 2013. Before I walk you through the simple setup and deployment, let's finish off the tech specs and business benefits.

Importantly, in the mobile devices era, the concern over data security is eliminated: the application runs on the Azure servers (or your on-premises ones), so the loss of the device does not expose any data to loss.

Scaling up and down is rapid, cost-effective and easy to achieve on the Azure platform, catering for rapid expansion or decline in service requirements. Seasonal workers can be accommodated without the capital costs and time involved in procuring and deploying servers. Equally importantly, scaling down doesn't leave expensive assets switched off and unproductive.

Employees can use either their corporate credentials or a Microsoft Account, whichever you choose.

Sound good? Well, if you want to follow this walkthrough, simply sign up for an Azure trial (if you aren't already an Azure customer) by clicking here. Then, once you are in, use any of the helpful tutorials or product documentation to get up to speed with the portal and navigation. Finally, to prepare for the session, sign up for the Azure RemoteApp preview here.

Once this has been accepted and you are comfortable with the portal, read on. (The rest of the post assumes a basic knowledge of Azure services and principles.)

Sign in to your portal at https://manage.windowsazure.com where a list of your previously created items will be displayed.

[Screenshot: the Azure portal home page]

Scroll down the list of services until you find the RemoteApp section, select this tab.

[Screenshot: the RemoteApp tab]

Select + New at the bottom left of the screen, then App Services followed by RemoteApp (you may have to scroll the list down a little – the number of Azure services is growing rapidly). Two choices are available: Quick Create or Create with VPN. The latter is for hybrid deployments; for now we will choose Quick Create, as below.

[Screenshot: the Quick Create dialog]

Enter a name for your service and the location in which you want it hosted. The number of regions is growing too. I always select North Europe or West Europe, which relate to our data centres in Dublin and Amsterdam respectively (note Dublin is North Europe!). Currently the only template image available is Office 2013 Pro Plus running on a Windows Server 2012 R2 platform. (Although you will never see or know about the servers, Microsoft patches, updates and maintains them in the background as part of the SLA.)

Once your service has been created it will appear under RemoteApp services. Select the service by clicking on it and you will reach the service setup page.

The next stage is to configure which applications and which users may take advantage of this service.

[Screenshot: the RemoteApp service dashboard]

Select the ‘publish remote programs’ option and add / remove from the list of available applications shown below.

[Screenshot: selecting programs to publish]

Having chosen the required built-in apps, you can now go ahead and click on 'configure user access'.

Simply enter the email addresses of your users. You can even add groups to segment which users have access – these groups would come from the built-in Azure default Active Directory.

[Screenshot: configuring user access]

The remainder of this tab hosts a published apps list and a session list showing the status of all connected users.

Clicking on the dashboard will show your current and available session usage and other image and template information.

You are now all set. All that is left is to download the client application from https://www.remoteapp.windowsazure.com/, which defaults to the Windows application page, but a link will take you to the all-clients download page. These are shown below and include Windows (x86, x64 and 8.1 RT), Android, iOS (iPad and iPhone) and Mac (OS X).

[Screenshot: the client download pages]

Having done that it is simply a matter of running the app, logging in with the registered email address and choosing the Windows application to run.

Let's have a look at the client experience on a Windows 8.1 device (simply because I am being lazy about the time required to take screenshots on an iPad and transfer the files – I'm on a deadline!).

The first thing to remember is that, like on-premises RemoteApp, the first time you connect to an application the Remote Desktop Protocol session is initialised and authenticated, so the app takes a short while to open. All other apps hosted on the same host then open in that session and are much quicker. (Remember, if you close all the RemoteApps the session will be closed, so future starts will be slower again.)

[Screenshot: the first connection being initialised]

Here it's worth pointing out that the RemoteApp client can connect simultaneously to more than one RemoteApp service, so an end user can have access, using the same email address, to many different RemoteApps from many different providers.

 

 

So run the application, in this case from the Start screen.

[Screenshot: the Azure RemoteApp client on the Start screen]

Log in with your registered email address and you find an empty window.

[Screenshot: the sign-in page]

[Screenshot: the empty RemoteApp client window]

 

Click on App invitations and you will see any provider invitations to connect to a RemoteApp service. Choose the one you need and the window is then populated with all the published apps. This can be refreshed mid-session to see if any apps have been unpublished.

[Screenshots: app invitations and the populated app list]

From there, run the applications as you want, remembering that your PC, iPad, Mac or Android device is not doing the processing – the remote machine is doing that. All you are doing is running a remote connection to it and presenting the output on your screen, so the data remains on that server. You can connect to your OneDrive, which exposes your cloud-stored files (currently each user has 1TB of storage). See the screenshot of the Task Manager Processes tab on my home PC when running four Azure RemoteApp programs: a total of 170MB of RAM and 0.1% of the processor.

[Screenshot: Task Manager with four RemoteApp programs running]

I have shown below some screenshots of PowerShell and CMD windows, having run a whoami and an ipconfig to show that the host is the same for each app and that the IP scheme and networking information is specific to that session host.
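For anyone recreating the test, these are the only two commands involved, run inside the remote PowerShell or CMD session:

  whoami      # prints the current user context (authority\username) for the session
  ipconfig    # prints the session host's IP addressing and networking details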

[Screenshot: whoami and ipconfig output in the remote session]

I have also shown a shot of the open dialog in RemoteApp Excel, and the temporary storage set up for the system with a handily placed README.DOC warning not to store files there.

[Screenshots: the Excel open dialog and the temporary storage warning]

Files you do not want on OneDrive can be saved locally. Try creating a file, closing the session and accessing it again – files are persistent, so your data is stored safely in Azure.

[Screenshot: the RemoteApp icon in the taskbar]

Here we see a RemoteApp Windows PowerShell session (I would run that, wouldn't I?). You can see from the taskbar that the icon has a RemoteApp symbol on it (in green).

I have gone well over my normal word count and haven't even touched on uploading your own images to host in the cloud, or on creating VPNs to run RemoteApp programs on-premises through Azure.

These will have to be for another day, but both involve a lot of PowerShell, always PowerShell!

MVA Heroes on tour

The TechNet UK IT Pro team have been running a very successful competition over the summer. The aim is to promote the Microsoft Virtual Academy and drive IT Pros to its content in a structured way, with a fun approach to the competition: the MVA Heroes.

It is a universal truth that IT Pros love free stuff, or swag as it is known. They love prizes like laptops and phones even more!

The competition is over, but the collecting of the MVA Super Heroes goes on. We have sent many hundreds out to the eager-beaver students who complete the courses that qualify them for one of the six figurine stress toys.

[Photo: the six MVA Super Hero figurines]

Well, last month our marketing manager Dan Pilling attended our annual marketing extravaganza known as MGX. It was a special occasion for Dan, as he won a very special award and was inducted into the Platinum Club. Dan decided to post a picture of one of the figurines on his plane seat and with celebrities he bumped into at the conference.

See here. That got me thinking: I spent last week in Seattle at an internal technical conference, so I took my hard-earned Super Heroes with me and decided I would try to do a little better.

The results are below.

My plane seat with two heroes to keep me safe


Larry Kaye, Development Certification Product Manager from Microsoft Learning.

Four Heroes take a trip to Seattle.


Finally my favourite Hero meets a very famous person.

The first comments that identify the hero and the Tech Celebrity win a bit of swag from my collection.

[Photo: my favourite Hero with a very famous person]

More importantly, why not take your hero or your MVA Hero T-shirt when you travel? Send me pictures and, who knows, you may pick up some special tech swag… if it's far enough away, that is. Seattle was 4,781 miles away – can you do better?

Office 365 MCSA – Post Exam feedback!

If you have been following my journey to the Office 365 MCSA certification in previous posts, you will know that I was scheduled to take the exam on Wednesday 30th July whilst attending the Microsoft internal TechReady conference.

Not many people outside Microsoft will have heard of TechReady. That is because the event is for full-time employees to learn the new products, strategies and roadmaps for the following six months, and for very good reasons it is not broadcast like the TechEd events.

The similarities between TechEd and TechReady include the provision of certification prep sessions and a certification hall for attendees to schedule and take MCP examinations whilst on site.

I don't have the stats for the numbers taken and passed during the week, but I can tell you that I was in the largest Prometric exam hall I have ever seen – there were over 100 testing stations and a whole bunch of proctors to keep an eye on us.

I had spent several evenings in Seattle with my nose buried in the books (or screens) to ensure that I was successful. The conference pretty much runs from 0700 to 1900 every day, so the extra effort to study through jet lag was no little work. (I don't travel well, and the eight-hour time difference was a not insignificant factor in my studies.)

I did enjoy the surroundings of the Seattle Sheraton though, and the weather was superb all week.

[Photo: studying at the Seattle Sheraton]

Having attended a prep session for the 70-346 and 70-347 exams on the Tuesday afternoon I decided to reschedule the exam to that evening. So in I went and sat the test.

I will leave the result until the end of the post, but I will say that without that level of study I would have done much, much worse. Obviously the NDA you agree to when taking exams prevents me from disclosing the details, but I can say a few things which will help those of you preparing for the experience of 70-347, Enabling Office 365 Services.

The exam I sat was a standard one with a wide range of the item types I have listed before. These included PowerShell build questions, drag and drop, hot spot and other types.

The overriding point I would like to make is that the exam was already significantly out of date in a few ways. Anyone who is a Microsoft Azure or Office 365 user will know that these cloud-based IaaS, PaaS and SaaS products change almost weekly. (Why not sign up for free trials now and check them out?)

This means that screenshots in your exam may not look like the current product; I had several instances where this was the case. The strategy for exams based on these products is under review (I met and spoke to the LeX team whilst at the conference).

I am prevented from giving too much detail, and giving the number of questions is fairly pointless, as they change often and new questions under test are added in too.

I can and will say that the exam is a fair test of the whole product, covering SharePoint Online, Lync Online and Exchange Online, as well as the Office 365 portal and admin consoles and the PowerShell required to run these products from the command line. It was not easy, and it caused me no end of headaches in the review of the questions.

My top exam-taking tip: your first answer is normally your best one, and changing your answers just gets you in a mess.

I am pleased to say that I was successful with no wasted effort – I scored exactly the mark required to pass, no more, no less, as you can see from the exam sheet below.

[Screenshot: my 70-347 score report]

If you follow Microsoft psychometrician Liberty Munson on Born2Learn, you will know that a score of 700 is not 70%, and does not reflect a result worse than a score of 800 or 900. Personally, I have never understood this and look forward to someone explaining it to me in words I understand. I have shown my score to point out that even though I studied really, really hard, know the product very well indeed and love PowerShell, had I got one more question wrong I would have failed this test.

So study hard, study the correct things and Good Luck.

The bottom line, however, is that I now have the MCSA Office 365 to add to my transcript. This will help me in my career as a Technical Evangelist, will flag me up on LinkedIn to people looking for speakers and allow me to teach the two courses related to the certification.


How to keep company data secure, in a ‘mobile first, cloud first’ environment


It's a brave new world – a mobile-first, cloud-first world of technology. In that world there are many new ways of consuming data, on many new devices. Data security is of paramount importance to any user, IT Pro, small business or enterprise; it is a universal requirement of making that data available.

The operating system, platform and delivery method of the data should not cause any increase in the risk to it. In short, the corporate entity must have complete control over the access to, consumption of and removal of data, in terms of users, devices and platforms, at all times.

[Diagram: People-Centric IT]

That's a fairly big ask. It is also one of the primary barriers to the adoption of cloud technologies and People-Centric IT (PCIT).

What is Microsoft’s approach to this rather thorny issue then?

As you might imagine, Microsoft has a number of methods of achieving this end game. Which you adopt depends on whether your infrastructure and data requirements are fully cloud, hybrid or fully on-premises. One thing is for sure: they certainly have it covered.

For an on-premises scenario with a number of BYOD and corporately issued smartphones and tablets, the solution involves a number of products, including Windows Server 2012 R2, Windows Intune and System Center 2012 R2 Configuration Manager. The elements of the server platform that assist with this solution are Active Directory Domain Services (AD DS), Dynamic Access Control, Active Directory Rights Management Services (AD RMS), Active Directory Federation Services (ADFS) and the all-new Web Application Proxy.

For a Microsoft Azure-based solution, the new Enterprise Mobility Suite (EMS) is designed to cater for most of the same functionality. The EMS consists of Azure Active Directory Premium (for hybrid identity management and multi-factor authentication, as well as other added functions), Windows Intune and Azure Rights Management Services.

The Hybrid cloud customer would be able to take advantage of all these products to manage their data.

As an additional portion of goodness, Windows Server 2012 R2 also comes with Workplace Join and Work Folders.

[Diagram: Work Folders]

If all of this isn't enough security, Microsoft also has a scalable and robust Virtual Desktop Infrastructure solution for a number of different scenarios that can prevent the data leaving the corporate network at all, whilst still giving remote users a standard interface and experience. (These include session virtualisation and desktop virtualisation, both pooled and personal, and with Microsoft App-V the ability to stream applications too.)

The final piece of the jigsaw is the new Azure RemoteApp, currently in preview, which now allows a cloud-based solution for application virtualisation. (RemoteApp is also available for your on-premises Windows Servers too.)

It is important to point out that the overall People-Centric IT vision is not restricted to data security and management; it takes a three-pronged approach of enabling end users, unifying the environment and protecting data. Take a look at the PCIT whitepaper here.

So, assuming you have visited the links and read the whitepaper (which, after all, is why they are linked…), you now know the field. But what about the practicalities, and what scenarios are covered?

Gosh Ed, that's 500 words where you have pretty much just listed solutions to data security in a whole bunch of scenarios. How do we use these, and how do we choose what to use in which situation?

The rest of this post is dedicated to three examples of when to use these solutions. I will then go on to a more detailed technical explanation in a series of future posts dedicated to each solution.

So, Scenario 1.

An iPad user wants access to their corporate intranet and files and folders, some of which are business critical data files. What can we do to allow this access, but control the device and ensure the data is secured?

The iPad has the facility to download the profile settings and join a workplace environment without being formally domain joined, allowing access to a company portal for websites and applications.

[Screenshot: Workplace Join on an iOS device]

Using Windows Intune, an administrator can enforce policies for security and data wipe on the iOS device, securing the data.

[Screenshot: creating a mobile device management policy in Windows Intune]

 

For access to secure data, or to applications incompatible with iOS, a virtual desktop (Microsoft VDI) or a Microsoft Azure VM could be used to keep the data off the device and allow access to the application from an otherwise incompatible operating system.

 

Scenario 2

An Android smartphone user wants access to work email and to files and folders for work use.

Windows Intune will secure the data and allow remote wipe of the device and/or the data if required. Policies may be applied by the administrator to ensure that the device has a password and that encryption is enforced.

[Screenshot: Windows Intune mobile device policy settings]

Scenario 3

A Windows RT 8.1 tablet user wants to use a non-domain-joined tablet for work access to email and applications, as well as Work Folders for data.

The combination of Windows Server 2012 R2 and the EMS suite will allow the administrator to provide Workplace Join, Work Folders and software deployment, as well as endpoint protection for the device. Additional policies may be applied with Windows Intune to enforce rules and security of the data, and to remotely wipe the device and/or data if required.

In a 'mobile-first, cloud-first' world of devices and data, security is always a concern, but the solutions available from Microsoft allow complete control of data access, security, integrity and removal. Don't forget, of course, that EMS is powered by Microsoft Azure – and you can control your Azure subscription with, yes, you guessed it, PowerShell!
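As a taster, here is a minimal sketch using the current Azure PowerShell module (the subscription name is a made-up example):

  Add-AzureAccount                    # sign in with your Microsoft or organisational account
  Get-AzureSubscription               # list the subscriptions linked to that account
  Select-AzureSubscription -SubscriptionName 'Contoso EMS'   # choose the one to work with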

Watch this space for the detailed technical solutions to the three scenarios above, with a special one for the Web Application Proxy all on its own. This ground-breaking server role replaces the Active Directory Federation Services proxy role and does so much more!

Office 365 MCSA – 70-347 Study Guide

The study of this exam's content is causing me no end of difficulty. Because of this, I have adopted my usual policy of forcing the issue and have booked my exam for Wednesday 30th July at 11:30 in Seattle. I will be attending the internal TechReady conference and will be sitting in on as many Office 365 sessions as I can.

But why am I having such difficulty with the study when I have used the component parts of the product suite for many years? Why does the conglomeration of these products cause me such a headache?

Well, with the constant upgrading of the Microsoft Learning certification exams, the types of question you are likely to be asked are no longer limited to multiple choice and drag-and-drop. Check out the Active Screen, Hot Area and Build List mini videos to see what you will face. The Microsoft Learning Experiences page here is a great place to start if you are unfamiliar with taking Microsoft certification exams. (Note that from 4th September this year there will be two choices for booking and taking your exam: you can sit it either at a Prometric centre or – returning after a seven-year break – at a Pearson VUE centre. Old hands definitely have a favourite interface and experience. It is equally important to note that the actual exam experience is exactly the same wherever and however you take it.)

This change in the method of asking questions can mean that you are presented with a screen taken from several layers down in the administration of a product you are not familiar with, such as the screen below.

[Screenshot: an example active-screen question]

 

Any idea which product that comes from, and where?

Anyone familiar with these exams can tell you that guessing really is a last resort, so becoming proficient in all the above products takes time, effort and no little patience. The benefit of a cloud solution such as Office 365 is that there is very little on-premises configuration and installation to do.

I have been madly using my free trial of Office 365 to navigate the products and set up SharePoint Online sites and all the other great features that Office 365 provides.

To do this I have been using a OneNote notebook, where I can drag and drop all the cool TechNet links and articles, and also run through the MVA course too.

If you haven't already signed up to the Microsoft Virtual Academy, you really should – it's quite simply awesome!

The Office 365 syllabus is not limited to the course linked above; a search for Office 365 brings up four pages of course content. So pick the correct ones, reading the synopsis and section headers to make sure you are not wasting valuable study time.

No matter how many courses you take and jump starts you watch, there is absolutely no substitute for using the product – hands on time and testing / breaking / fixing the software.

As a trainer I held many certifications, but if I hadn't used a product in a live environment I was always very reluctant to stand up and teach it – after all, IT is not really a hands-off job.

So to end this short study update here is a list of a few key resources I have found and am using for my 70-347 study.

Office 365 Identity and Authentication Poster

MOC 20346A (B version Released July 2014)

MOC 10968B

TechNet Office 365 for IT pro page (great jump off point)

Office 365 Fast Track site (exam covers planning and deployment)

PowerShell Cmdlets for Licensing users (it wouldn't be a Blogg(Ed) post without PowerShell now, would it? – see the sketch below)

Also go to the Windows Store and download

Posterpedia and the Microsoft Training and Certification Guide

There are several excellent and relevant resources in there, as well as a roadmap for your certification needs.
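Since I mentioned the licensing cmdlets, here is a minimal sketch of the kind of thing to practise, assuming the MSOnline module is installed (the UPN and SKU are hypothetical examples):

  Import-Module MSOnline
  Connect-MsolService                                  # prompts for tenant admin credentials
  Get-MsolAccountSku                                   # list the licence SKUs in the tenant
  Set-MsolUser -UserPrincipalName 'user@contoso.onmicrosoft.com' -UsageLocation 'GB'   # a usage location must be set first
  Set-MsolUserLicense -UserPrincipalName 'user@contoso.onmicrosoft.com' -AddLicenses 'contoso:ENTERPRISEPACK'   # assign an E3 licence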

Must dash – studying to do!

Office 365 MCSA – Halfway House

There are two certification examinations that make up the MCSA Office 365. The background, requirements and details are listed here and are partially shown in the graphic below.

[Diagram: the MCSA Office 365 exam requirements]

In short, to certify you need to pass 70-346 and 70-347 (the 70 simply identifies the retail examination; there are other codes for Academic and Academy exams – the content is identical, as is the passing score).

One of the problems for Microsoft in producing an Office 365 qualification is that the product encompasses so much and is continually changing and updating. So, although the MCITP in Office 365 and the Office 365 for Small Business qualifications are not that old, their content is (in my opinion) no longer relevant today.

Why am I telling you all this? Well, I wrote a blog post last month about self-study, drawing on my past life as an MCT (Microsoft Certified Trainer). As a trainer I advocate all manner of training methods, not least MOC (Microsoft Official Curriculum) and MOAC (Microsoft Official Academic Curriculum) – indeed, I have written them before now, for Windows Server 2012 and for Windows 8. Classroom-based training is a method that works very well for thousands of delegates and students every year.

There are those, however, for whom the course is out of financial range or the time required away from work is too great. If that is the case for you, the reader, then this series of blog posts is definitely for you.

To recap, we have so far dealt with the theory of making time, finding resources and actually studying for technical exams. I also decided that, since I hadn't taken any Microsoft exams since March and had been exceptionally sub-optimal in the BETA versions of the Office 365 tests, I would 'put my money where my mouth is'. (For the reader, sub-optimal relates to a score below 700, which is the score required to pass the exam. I don't consider it a fail; especially when teaching young people and apprentices, 'failure' is such a hard concept to grasp, since the modern schools system doesn't really have competition or failure in its curriculum – wrongly in my opinion, but that is an entirely different subject for a post all of its own.)

So I set myself the target of passing the Office 365 MCSA before July (this year) – and yes, I do like a challenge. Unfortunately I was unable to book both exams within the time limit (I was not about to forego my holiday at the Glastonbury Festival, or give my tickets to our editor Steven Mullaghan, much to his disappointment). The second exam will be 'in July' sometime.

I wasn't left with much study time; my role as a Technical Evangelist keeps me on the move, on my toes and rather busy, to say the least. I posted my decision to retake within June on 2nd June, and since then I have been on MVP Roadshows presenting on People-Centric IT, at System Center 2012 R2 IT Camps, planning for FY15 (which starts next week), racing round Donington Park grand prix circuit with the Microsoft Motorbike Club (proof below) and manning the Cloud World Forum stand at London Olympia.

[Photo: on track at Donington Park]

In between these great events I have been carrying on with normal family life and preparing for a big year ahead in my role as a Freemason, so I haven't had all that much time to devote to the exam (can you hear the beginnings of an excuse for being sub-optimal?).

 

The purpose of my little story above is to explain that the time available significantly alters the methods I use to study. Given time, I might explore every avenue of the product, read books, use it in anger, go to the Microsoft Virtual Academy (MVA) and check out the Jump Starts and other courses. I might even use my status as an MCT to access the MOC courseware library, download the virtual machines and trainer materials, and run the course for myself.

Sadly, there was not enough time for those methods on this occasion, although I did use the MVA courses designed to support this 346 test and the 347 one – an excellent resource of free technical training from highly skilled Microsoft staff and partners. Whilst discussing the MVA, why not sign up for the UK initiative, MVA Hero: a way of choosing your path for study and having a bit of fun at the same time. If this doesn't appeal to you, don't worry – the MVA search engine will find the course you want.

[Image: the MVA Hero programme]

So, with such limited time, I had booked the first exam on 2nd June – so I had to take it! (A good trick to stop you backing out; I don't have £100 to waste on missed or sub-optimal exams. For those in the know, MCTs receive a 50% discount on exam vouchers – another good reason to become one – and Microsoft employees receive free vouchers, another great reason to apply to work here with such great people and resources, although you do have to report your results, and the perceived peer pressure to be 'optimal' is huge, for me at least.)

I was left with about 6 days to go and in the middle was a really big weekend event that would need preparation and no chance to study for at least 3 of those days.

I took the last-minute cramming approach and spent 14 hours on the Sunday going through the MVA course and scouring TechNet articles, on everything from how to remove licences from Office 365 users with PowerShell to how to deploy a redundant AD FS infrastructure. Trust me, there were so many pointers in the course to areas to really work hard on that I was not short of ideas.

I had booked the exam for 0900 about 60 miles from home (not many seats are available in Prometric test centres in Birmingham, so it is really great news that Pearson VUE have also recently been awarded the contract to provide tests from September 2014).

I set my alarm for 0400, woke up at 0355 and set to a last-minute (well, last three hours) revision of the key topics, following the advice from my previous post. My technique is to copy the areas to be studied into a OneNote notebook and to create links to all the relevant TechNet pages, MVA courses, other blogs and PDFs.

Remember, the people who write the exams have to get the content and ideas from somewhere, and once you have taken some exams you will quickly find which Microsoft-approved resources are useful and which are not.

A final piece of advice: don't get bogged down in too much trivia. Do run through wizards live to see what you can and cannot do at each stage to achieve something. As an example (not from my exam, as that would breach the NDA):

Suppose you are looking at Exchange Online and want to work out how to track or manage malware detections in your email. There are several ways to do it, but not all of them would answer the question, so read the question carefully.

In this example, the Exchange Online Protection section has a malware filter where settings and rules are created; it also has a quarantine section where the relevant message would be listed. But if you wanted to track malware received or sent over a period of between 7 and 60 days, you would not use the Exchange admin portal – you would use the Office 365 admin portal and choose Reports. See below.

[Screenshot: the Reports section of the Office 365 admin portal]

 

These are taken from the Office 365 portal and clearly show what is asked for, but the question may ask specifically for Exchange Online – which may confuse you.

[Screenshot: the malware detections in received email report]

Clicking on malware detections in received email shows the second screen, where you can easily answer the question. None of this is available in the Exchange Online console.

I stress that this is not a question I have had or have seen. It is representative of the tricks and traps that such a complex product or suite of products can lead you into.

The question itself (i.e. 'In your Exchange Online deployment you want to track malware in received email over the last 60 days and identify the recipient of the greatest quantity of malware') is fairly simple.

If you had not drilled down through all the available menus and sections you would NEVER come across this section, buried three levels down.

Oh, and for those of you who rightly noticed, I haven't mentioned PowerShell much. The MSONLINE module is HUGE, and the MSOL cmdlets appear very regularly in the study and the test (but you expected that, didn't you!).

So, you have all been very patient: I took the exam yesterday. See below.

[Screenshot: my 70-346 score report]

All Microsoft exams require a passing score of 700; the maximum score is 1000. The theory is that once you reach 700 there is absolutely no difference between a score of 700 and a score of 900, because the questions are different, the exam has a set number of question types, and each area is marked and scored differently.

I have read the theory and seen the video where Liberty Munson, Microsoft's Principal Psychometrician in LeX Products, explains this, and I confess I absolutely do not understand it.

Suffice to say I was happy with the result and will now hope to give myself more time and try to produce a couple of posts mid study for the 70-347 exam.

The exam this week was all about setting up and getting working with Office 365 – security, connections and so on. The next exam is all about actually working with the products that make up Office 365 (Exchange Online, SharePoint Online and Lync Online, as well as OneDrive).

This is very definitely a greater challenge for me. Watch this space; I cannot commit to a date, as I have an even busier July than I did June (I am off to Seattle for the internal version of TechEd, called TechReady) and I also have to get ready for my next trip to the race track!

Don’t forget to ping me any questions you may have @edbaker1965 or leave a comment here.