UK IT Camps – Enterprise Mobility – Sign up now.

Life during my first year at Microsoft has been almost exactly as I imagined it might be, but as I always do at Christmas I reflected on the year past, and particularly this year on my first nine months as a Microsoft employee. Whilst doing so I remembered exactly what prompted me to sign up for this fun ride.


Back in 2012 I attended a number of Tech Days, which were later re-branded and reworked into IT Camps. The format of these really did excite me, enough to take a full day out of my busy schedule as a self-employed contractor. That was a rare thing, since a day ‘off’ is actually money lost, and because my engagements were often a week long, it could mean a week of income lost.

So what is so good about an IT Camp that it’s worth the time, trouble and sometimes lost income to come and join in the fun?

Firstly, an IT Camp is a FREE day of detailed technical education with a very heavy hands-on approach. There are IT Camps for all sorts of subjects, but the one I lead is the Enterprise Mobility Camp. This covers all aspects of Microsoft’s people-centric IT solutions, from advances in the Windows client operating system to deep dives into hybrid identity solutions using Active Directory both in Azure and on-premises with Windows Server 2012 R2.

The unique aspect of our UK IT Camps is that we always allow the participants to build their own agenda. This adds a much greater level of relevance to the audience and requires a much deeper level of preparation by the Technical Evangelists leading the camp.

And I am pleased to say it is even more fun to deliver a camp than to attend one.

The IT Camp content is created by the Technical Evangelists working for Microsoft at our HQ in Redmond (such as @simonster). For the Enterprise mobility camp there are over 500 PowerPoint slides in the material to cater for just about any part of the people-centric story.

Call me old-fashioned, but I do love the superb PowerPoint application. BUT even I wouldn’t sit through a day of 500-plus slides. No thanks.

In a typical IT Camp day we may use 20 of those; we prefer to use live demonstrations and explanations to get our point across, oh and of course PowerShell too.

So what can you expect if you sign up to one of our Enterprise Mobility Camps between now and June 2015?

As I already said, we don’t fix an agenda in advance; there are only three givens in the day.

It starts – you get lunch – it finishes. The bits in between will be filled with a mixture of discussion, presentations, demonstrations and lots and lots of hands-on lab work by you, the most important person in the room.

For this reason it is essential that you bring along a device that can use a browser and connect to the internet (we provide the internet connection) and any lab environments you may need. For those who have been before, if you don’t like working in pairs, then either bring two devices or a large screen (as the manuals are online as well and it can be tiring switching between them all day).

As an example, during the camps this year so far we have covered Microsoft Intune, in both cloud-only and hybrid modes connected to Microsoft System Center Configuration Manager, for mobile device management. We have demonstrated Azure RemoteApp in both cloud and hybrid configurations. Both Workplace Join and Work Folders have featured heavily, as has Azure Active Directory Premium. Office 365 integration and single sign-on / multi-factor authentication were also prominent. The identity piece of the puzzle is a critical one to understand, so many questions have been posed and answered.

The new 2015 Camps have also been updated to include Windows 10 content and most of our demo machines also run Windows 10.

We also get to demonstrate most of these through the use of iOS devices, Windows Phones projecting their screens through Miracast and even the odd Android device too.

A large number of IT Pros prefer to learn by a mixture of listening, watching and hands-on work; the UK IT Camp experience provides all of these in abundance.

Why not visit http://aka.ms/ukitcamp and register to come along and find out what @Deepfat and I (@edbaker1965) get up to and how you can begin to understand how the people-centric IT vision can be applied within your own work environment?

There are other added bonuses too. Since everyone present is focusing on the same topics, you get to meet fellow IT Pros interested in similar areas, who may well turn out to be great future contacts.

Finally, we also hand out odds and ends as prizes for asking that difficult (or seemingly easy) question that others are just too shy to ask.

 

What every IT Pro wants for Christmas.

This post will resonate with some, if not every, IT Pro. I am going to limit myself to ONE gift. (For that reason alone it is not a very realistic scenario, but bear with me.)


I am not presumptuous enough to think I know what every IT Pro might want, but that was the title given to me!

IT Pros are, as a rule, rather fond of free stuff (known in the industry as SWAG); they are also fond of gadgets (known in the industry as gadgets); but above all IT Pros are fond of peace and quiet, being left alone by management and users alike to get on with more important stuff.

At this time of year IT Pros are all preparing for the inevitable ‘good idea’ dreamed up by project teams and management, along the lines of “users will be off over Christmas, so let’s plan a major release of hardware or software, or more likely a complete refresh of both, deployed to the whole enterprise” (or at least your part of it). After all, the chaps in the IT department don’t need time off at Christmas, do they?

The one thing I CAN guarantee is that the above scenario is what all IT Pros do NOT want for Christmas.

I have three suggestions for every IT Pro’s dream Christmas present.

First, there is a cheap and cheerful piece of SWAG. When @Deepfat and I go out and talk at events, conferences or our very own IT Camps, we try to make sure we have enough swag with us to reward the questions we find most difficult to answer, or that comment which dissolves the room into laughter, or sometimes a delegate who can simply get a word in edgeways. This year we have been handing out the Microsoft Spider (as shown below).

[Photo: the Microsoft Spider]

With the proliferation of gadgets that rely on some form of electrical power, shouts of ‘has anyone got a charger for…?’ can often be heard around the Microsoft UK offices. For this reason the Spider is a must-have. Small and almost unbreakable, it provides power for:

  • Micro USB
  • Thunderbolt
  • iPhone/iPad

and connects to either a portable power pack or a standard USB 2 or USB 3 power source.

It has saved me from many embarrassing situations. In the ‘never go anywhere without one’ stakes it is my number 1. If you want one, come and see us talk, ask a question or make us laugh!

Secondly, the gadget of my choice comes from a shortlist of the Surface Pro 3, the GoPro and the Linx tablet.

The Surface Pro 3 has changed the way I work. It goes absolutely everywhere with me, and I have yet to find anyone who doesn’t like the way the pen works, or the way OneNote in A4 portrait mode is so easy to use without a keyboard attached.


The GoPro+ really is the epitome of a Boy’s Toy, but it has so many uses that it has to be on my shortlist.

I am a dedicated motorcyclist, and I long ago resigned myself to the fact that the only way to survive on the road on two wheels is to assume that everyone else using the road is actually trying to push you off it. It is also really easy to blame the nasty biker. To counter this I now have my GoPro+ firmly fixed to the front of my bike wherever I go, and I record all journeys to show I rode safely and responsibly. In a tech capacity, @Deepfat and I regularly use the time-lapse facility to record the set-up and break-down of the seemingly endless quantity of kit we carry everywhere at our larger events.


The GoPro Hero is a 4K device taking video, stills, time-lapse and burst-mode shots, all in a package the size of a matchbox. I thoroughly recommend it.

The final gadget is actually a cheat, really, for two reasons: firstly I don’t own one, and secondly it is a package offer containing two distinct pieces of equipment. Black Friday saw all sorts of good deals, and one of those has hung around a little longer. I am sure there are other good deals from other suppliers, but for me the Tescomobile.com offer of a Lumia 530 and a 7″ Linx tablet with Windows 8.1 is close to unbeatable. The cost is as low as £12.50 per month for the first year (reducing in future years) and includes both devices and a 12-month subscription to Office 365 Personal. What more could you want?


The final category was peace and quiet: freedom from management and users. Well, for some, a Christmas at home with family and friends is the furthest from peace and quiet they can get. For me it is the chance to kick back, relax (maybe study for an exam I am taking on 30th December, yes really), watch the great content on the new NOW TV Windows 8.1 app (and Xbox app too) or discover the amazing quality of my new Xbox One console with the Kinect 2.0 sensor. I am sure I will eventually get used to talking to a computer and issuing commands.

Of course, I am also helping a friend install Windows Server 2012 R2 Essentials, migrate from Small Business Server 2008 and set up the Office 365 mailboxes for the users. A busman’s holiday maybe, but it also helps to keep connected to the real world, and not just talk IT but practise it as well.

Because this is my post I am going to cheat again (there’s a theme developing here): I would also add extending my SONOS collection to more rooms, so I can play lovely Christmas music throughout the house. If you don’t have SONOS, get it quickly; there is a 12-month unlimited music offer on with Deezer at present.

I will also make sure there is plenty of time for my children to spoil me in the way they should (I refer them to the SONOS statement above) and for lots of quality time with my long-suffering wife Sue.

Whatever you choose from this list or any other, I wish you a happy, healthy and fun-filled Christmas holiday. Of course looking forward to many more Tech adventures next year too.

Roll up, roll up, get your FREE Microsoft Azure BETA exam code here.

Within the last month I have taken all three Microsoft Azure certification exams, two of them as BETA tests. Read on and you can get your own code to take a FREE Azure BETA exam (only 500 codes available, first come, first served).


So, I post quite often about certification and testing, but not that often about BETA exams. The Non-Disclosure Agreement we all sign when we take an exam means that I cannot say too much about the content, other than that you should study the Microsoft Learning pages for the detailed exam specifications.

70-532 – Developing Microsoft Azure Solutions

70-533 – Implementing Microsoft Azure Infrastructure Solutions

70-534 – Architecting Microsoft Azure Solutions

The exam I want to talk about is Exam 70-534, Architecting Microsoft Azure Solutions. This exam is not as deep-dive technical as the other two, which, if I am honest, are pretty hard: the developer exam was wall-to-wall coding, and as I am not a developer I am not expecting a pass there! The Implementing Infrastructure exam was almost wall-to-wall PowerShell but was also a disappointing experience, not because the test was poor but because I was poorly prepared and did not pass (that was not a BETA test). I am retaking that exam on 30th December, so watch this space for all sorts of whooping and hollering if I pass.

There are several distinctions between a BETA test and a live one. First, the questions, although checked and prepared, can have small typo or content errors; that is why they are tested in a live situation, to see how they fly. Secondly, the exam has a much longer time allowance, so that you can make detailed comments on those questions that do have typos or content errors. Thirdly, when you complete the exam there is no result passed down to you, and your printout just says ‘Thanks for coming, the result will be with you in a few weeks’.

I take a large number of exams and often these are the BETA ones so I am used to waiting for results, but I also often get a feel for those where I have been ‘sub-optimal’ and I am expecting to retake 70-534 when it is released.

The key content difference for me with the Architect exam is that it is very similar to the old, old MCSE Design exams, which regularly tripped up even technical product gurus. This exam covers Microsoft Azure end to end and requires a broad knowledge of all the services and how they hang together, as well as how you implement them for a customer. For that reason it is aimed more at implementation consultants, partners and contractors who may be delivering Microsoft Azure services to their customers.

The 70-532 and 70-533 exams are, in my opinion, aimed more at those customers implementing solutions directly.

Now for those who have read the post, or even just scrolled down, here are the beta instructions. Good luck.

Your Invitation to The Microsoft Azure Architecture Beta Exam

Microsoft recently released a new Azure architecture certification exam in beta: Exam 534 Architecting Microsoft Azure Solutions. This is an invitation to take the beta version of the exam at no cost for the first 500 respondents. Passing this beta exam earns you a Microsoft Specialist certification.

The beta exam is available at Pearson VUE testing centers or through the online proctoring option. To take advantage of this exclusive invitation:

Here’s the fine print:

  1. As this is a beta exam, you will not receive a score report immediately upon completion of the exam. Your results will be delayed for a period of several months, until the exam goes live.
  2. The code will only be available to the first 500 people who register for this beta exam.
  3. This beta exam is not available in India, Pakistan, China, or Turkey.
  4. Please review Microsoft’s beta exam policies in full.

Good luck on your next exam, whether you’re taking this beta or have another one scheduled.

The above is due to be posted on the Microsoft Learning Born to Learn site on Monday, so hurry up if you want to register for the exam before the codes are all gone. That’s a £100 Christmas present right there!

Early Christmas present for Azure RemoteApp fans.


Yesterday Microsoft announced that Azure RemoteApp would be generally available from Thursday 11th December.

Anyone who reads this blog will know that I rate this service very highly indeed. The ability to run Windows applications from any platform, without having to run the infrastructure on your own premises, is of huge benefit. But when you add to that the ability to upload your own images containing line-of-business applications, and to authenticate against your on-premises Active Directory or run the application in the cloud and store the data in your own data centre, it just about ticks ALL the boxes for me.

The blog post here announces both the date and the pricing details for the service. I have already posted about the technical aspects and the benefits to be gained. Now that the pricing has been released I am even more excited at the prospect of this becoming a major Azure service for all sizes of customer.

The Azure RemoteApp service will be available on standard pay as you go terms from 11th December and also as part of Volume Licensing from February 1st 2015.

The pricing page can be found here.

So from next Thursday you can reliably, and in a scalable manner, run any application that will run on Windows Server 2012 R2 from iOS, Android, Windows RT or Mac OS X. You can authenticate against Azure Active Directory or link to your own on-premises Active Directory.

Secure, Scalable Solutions.

What an early Christmas present that is!

What are the best IT Pro Tools for automation – and why?

Each month here in the @TechnetUk #ITPro office there is a mad scramble (bunfight / race / polite debate) to bid for the best blog topics for the month. Each quarter has a theme; this quarter it is ‘the right tools for the job’. I was lucky enough to see the email early, and our editor Charlotte accepted my bids for last week’s post on the Windows Server Technical Preview and for this one on automation tools.

As you might imagine this was a pretty popular one to bid for, especially as all IT Pros will always try to find the quickest, most efficient way to carry out their allotted tasks while it is yet day, all so that they can carry on with important things like Wolfenstein (the original, of course) and even a quick toe-dipping into the world of Xbox One.

Anyone who has ever read any of my posts would be forgiven for thinking ‘automation? That will be another exposition on the glories of PowerShell and why we need to learn it or learn golf’, and in part you may be correct. But I decided to read the title in full, and this gave me the opportunity to go further than a single tool. PowerShell could be described as the framework for a whole bunch of excellent tools (or modules), but in this post I will treat it as one tool, and I will include it because it is simply the number one productivity and automation tool available in the world of Microsoft server operating systems and business platforms such as Exchange, SharePoint and Office 365.

What else is available to an IT Pro as an automation tool? I had to think long and hard about this, as I haven’t really used much else on a day-to-day basis to automate routine tasks for quite some time.

What does the landscape look like in the world of automation?

What does an IT Pro want to automate?

Well, the roles of an IT Pro, even though they are changing and being blurred a little by the new DevOps school of thinking, are many and varied: from the network specialist who really doesn’t want to do all the IP planning, management and administration manually (or by spreadsheet), to the deployment specialist who absolutely doesn’t want to wander round a building installing images, agents and other software on client and server machines when everyone else has gone home.

But let us start with the traditional view of an IT Pro, the server administrator, and yes, PowerShell. I am not going to offer up the whole of PowerShell, as that is something I do on a regular basis. I am going to talk about DSC, or, to give it its formal title, PowerShell Desired State Configuration.

As I usually do I am going to quote the TechNet description of the feature and then dive a little deeper into it.

“DSC is a new management platform in Windows PowerShell that enables deploying and managing configuration data for software services and managing the environment in which these services run.

DSC provides a set of Windows PowerShell language extensions, new Windows PowerShell CmdLets, and resources that you can use to declaratively specify how you want your software environment to be configured. It also provides a means to maintain and manage existing configurations.”

Now that all sounds very well, but it doesn’t tell me exactly what the feature does in layman’s terms, nor does it describe how to use it, or really sell to me the impact it can have on my infrastructure and thereby the amount of time it will release for me to carry out other important tasks.

So PowerShell DSC gives us the ability to define exactly what we want our server to look like, in terms of roles and features installed and configuration, right down to individual registry settings or environment variables.
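To make that concrete, here is a minimal sketch of what a configuration looks like (the node name, registry key and values are all hypothetical; this assumes PowerShell 4.0 or later):

    Configuration ServerBaseline {
        Node 'SERVER01' {
            # Ensure the Web Server role is installed
            WindowsFeature IIS {
                Ensure = 'Present'
                Name   = 'Web-Server'
            }
            # Pin a single registry value
            Registry DeployFlag {
                Ensure    = 'Present'
                Key       = 'HKEY_LOCAL_MACHINE\SOFTWARE\Contoso'
                ValueName = 'Deployed'
                ValueData = 'True'
                ValueType = 'String'
            }
            # Guarantee an environment variable exists
            Environment Ring {
                Ensure = 'Present'
                Name   = 'DeploymentRing'
                Value  = 'Production'
            }
        }
    }
    # Compile the configuration to a MOF file and push it to the node
    ServerBaseline -OutputPath 'C:\DSC'
    Start-DscConfiguration -Path 'C:\DSC' -Wait -Verbose

The declarative style is the point: you state what should be true, and DSC works out what (if anything) needs doing.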

The seasoned IT Pro may well say at this point, ‘So what? I can do that with Group Policy if I am in a domain environment’ (and most IT Pros work in such an environment). The answer would be yes, of course you can. But Group Policy has a default refresh rate of 90 minutes, with a randomized offset of 0 to 30 minutes to prevent all machines hogging the network at the same time. The seasoned IT Pro will also tell you that up to 120 minutes is a very long time indeed in the world of server configuration.

DSC uses a set of built-in resources (which are growing all the time) to control a range of features, functions and roles in an entirely automated manner. DSC also allows the IT Pro to create custom resources. At this point I should add that, as with all things PowerShell, the community tends to share, and a large number of custom resources are already available for free.

The descriptions of the original built in resources can be found here.

In a default install, the built-in resources are those shown below.

[Screenshot: the default built-in DSC resources]
I should also add here that this is not basic-level scripting or PowerShell, and this post is not aimed at teaching you the skills required to script or to understand complex PowerShell commands. TechNet again provides a great tutorial on custom DSC resources here. In that tutorial the reader is shown how to create a custom resource that will create, configure or delete a website on a particular server.
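You can enumerate the resources available on your own machine, built-in and custom alike, and inspect the syntax of any one of them, with Get-DscResource:

    # List every DSC resource installed on this machine
    Get-DscResource
    # Show the property syntax for a single resource
    Get-DscResource -Name WindowsFeature -Syntax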

All this can be run on a schedule to ensure that the desired state is maintained across your entire server estate. Configurations can also be pushed or pulled, whichever you prefer; a sketch of the settings involved follows. There is also a great deal more DSC goodness coming with PowerShell 5.0 in Windows Server vNext.
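The scheduling side lives in the Local Configuration Manager on each node, which decides how often the desired state is re-applied and whether configurations are pushed or pulled. A rough sketch (node name and interval are illustrative):

    Configuration LCMSettings {
        Node 'SERVER01' {
            LocalConfigurationManager {
                # Re-check and auto-correct drift every 30 minutes
                ConfigurationMode              = 'ApplyAndAutoCorrect'
                ConfigurationModeFrequencyMins = 30
                # 'Push' here; use 'Pull' plus a download manager for pull mode
                RefreshMode                    = 'Push'
            }
        }
    }
    LCMSettings -OutputPath 'C:\DSC\LCM'
    Set-DscLocalConfigurationManager -Path 'C:\DSC\LCM'

Compare that 30-minute auto-correct cycle with Group Policy’s up-to-120-minute refresh mentioned above.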

There are also some good TechNet Virtual Labs for DSC and other PowerShell features. Check them out here. (33 of them covering PowerShell, DSC, Azure PowerShell, Automation etc.)


I shall save some deeper DSC diving for other posts, as this was NOT meant to be a PowerShell love-in.

So what other tools can I use to automate IT Pro tasks?

I have already alluded to the IP planning and deployment/management tasks that need automating and easing. Well, I have posted many times about the super-effective IP Address Management (IPAM) feature in Windows Server 2012 and 2012 R2. Suffice it to say that if you read this blog regularly you are already sufficiently acquainted with its principles to realize its value. Of course, Windows PowerShell 4.0 also added PowerShell CmdLets to enable automating your IPAM deployment and management, as shown below.

[Screenshot: the IPAM PowerShell CmdLets]
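As a quick illustration (this assumes the IpamServer module that accompanies the IPAM client features, and the address block is hypothetical), you can discover the CmdLets and register a block for IPAM to manage:

    # Discover the IPAM CmdLets available
    Get-Command -Module IpamServer
    # Register an address block for IPAM to manage
    Add-IpamBlock -NetworkId '10.10.0.0/16'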

TechNet Virtual Labs also does a rather good job of highlighting this feature in this lab. I ought also to mention here that the Microsoft Virtual Academy has a number of courses covering IPAM, and PowerShell for Active Directory courses that include DSC.

The final set of automation tools I am going to select today (I wanted to pick Azure Automation using PowerShell, but I promised I wasn’t going to make it all PowerShell) are those that enable the deployment of operating system images. I was spoilt for choice, since I could have picked System Center 2012 R2, Hyper-V or many other useful tools.

I have chosen some tools that are cost free once you have licensed a server operating system (Server 2012 or 2012 R2).

The mix of tools is Windows Deployment Services (WDS) and the Microsoft Deployment Toolkit (MDT) 2013. But before I discuss those, I would like to mention the Microsoft Assessment and Planning (MAP) Toolkit.

“The Microsoft Assessment and Planning (MAP) Toolkit is an agentless inventory, assessment, and reporting tool that can securely assess IT environments for various platform migrations—including Windows 8.1, Windows 7, Office 2013, Office 2010, Office 365, Windows Server 2012 and Windows 2012 R2, SQL Server 2014, Hyper-V, Microsoft Private Cloud Fast Track, and Windows Azure.”

This is a must-have tool for anyone planning to do anything to their network or clients/servers. Another free tool, available here. All the above-mentioned tools are part of the Microsoft Solution Accelerator programme, which seems to expand every time I look at the page. The MDT team blog also has masses of useful information.

So why have I chosen this set of tools? WDS allows me to deploy operating systems across the network to all my clients in a Lite Touch Installation (LTI) manner; this means I would have to have some interaction with the client. Currently the preferred Zero Touch solution uses System Center 2012 R2, but that can be a costly option.

To assist you in using this free service, Microsoft provides the MDT and also the Windows Assessment and Deployment Kit (ADK). The ADK is a hefty installation and provides a raft of useful tools. See the screenshot below; if you select everything, as I have here, the result is over 6GB of installation.

[Screenshot: the ADK component selection]

There are a number of TechNet Virtual Labs for the MDT, although most are focused on the integration with System Center Configuration Manager for larger enterprises. There is one for creating images using the MDT, though.


In short, these tools allow you to create images, or capture them from reference PCs, then store them until required for deployment to new or refreshed PCs on your network. Why am I considering this automation? Well, the use of an image in the new(ish) Windows Imaging Format (WIM) allows you to update, service and add or remove features, drivers and programs in the image at any time, as sketched below. The tools can also be used to deploy VHD and VHDX files, to allow client PCs to boot from VHD too. All this would take a long time to configure at each machine you want to deploy to.
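To give a feel for the offline-servicing side, here is a small sketch using the DISM PowerShell CmdLets (all paths are hypothetical) that mounts a WIM, injects a driver and saves the change:

    # Mount the first image contained in the WIM file
    Mount-WindowsImage -ImagePath 'C:\Images\install.wim' -Index 1 -Path 'C:\Mount'
    # Inject a driver into the offline image
    Add-WindowsDriver -Path 'C:\Mount' -Driver 'C:\Drivers\nic.inf'
    # Commit the change and unmount
    Dismount-WindowsImage -Path 'C:\Mount' -Save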

As with most tools that save you time in the long run, the deployment and configuration of this suite of tools is not a small task, and it will involve a degree of learning the principles and processes, which can be confusing: there are capture images, install images, boot images and reference images, as well as thin, thick and hybrid types of image. Enough images for you?

Oh, and I am sure it won’t surprise you to find out that MDT uses PowerShell to carry out all its tasks; as I have said ad nauseam, PowerShell is the future.

I don’t have enough space this time to do a run-through of MDT/ADK for developing and deploying images with WDS, but such walkthroughs are freely available on the internet, and I will do a YouTube one when I get time. It may flow better that way.

But all new tools take time, whether they be PowerShell, Azure Automation or any other new feature. That is why learning and certification is still such a good thing to be involved with. All of the products and features I have talked about today appear in Microsoft Official Curriculum Courses and in Microsoft Certification Exams too.

With the landscape changing so often, it is wise to invest in your career by learning and certifying so that your employer or your prospective employer can have some benchmark to judge you by.

Use the MVA and the other training avenues wisely. For all things training and certification you can use the many resources available to you at:

  • the Microsoft Learning website
  • the Born to Learn website
  • the Microsoft Virtual Academy

Watch this space for more on PowerShell DSC, Windows Server Technical Preview top five features and more.

Windows Server Technical Preview – My Favourite Features – Part 1

Microsoft released the first Technical Preview of Windows 10 to much acclaim back in October. There have been three releases so far, and we currently sit on the ‘last release of the calendar year’, Build 9879.

The Technical Preview is intended primarily for the enterprise, to evaluate the changes and inform the development of new and evolved features of the client operating system. This is a brave and intelligent step. Most followers of Windows in the enterprise will know that Microsoft traditionally releases its client and server platforms in pairs: XP/2003, Vista/2008, Windows 7/2008 R2, Windows 8/2012 and most recently Windows 8.1/2012 R2.

The dramatic changes inside Microsoft have not led to a change in this pattern, and there is a new server platform being developed alongside Windows 10. This server is as yet unnamed but is also in Technical Preview.

If you have an MSDN subscription you can find it there in both ISO and VHD formats (the new Hyper-V Server is there too). If you do not subscribe then you can find it here. The new Remote Server Administration Tools for the Windows 10 Technical Preview have also been released, to allow you to remotely manage your new server from your new client. The RSAT can be found here, in both 32-bit and 64-bit flavours.

For anyone interested in the Server Technical Preview, just about everything you could want to know can be accessed from this blog site. It belongs to Jose Barreto, a member of the File Server team within Microsoft, who has put together this invaluable survival guide. As you might imagine it is storage-focused, but it does cover most other areas too.

There is one final way you can have a look at and run the Server Technical Preview, and that is as a virtual machine in Microsoft Azure. If you do not have an Azure subscription, again this is part of your MSDN benefit (MSDN is sounding more and more like good value). Otherwise you can sign up for a cost-free trial here.


Windows Server 2012 was a huge leap in performance and function for the Windows Server family, and despite the familiar look and feel of the new server and most of its tools, there have been significant new features and improvements to old ones. BUT please remember this when looking at and playing with this new server operating system:

THIS IS A TECHNICAL PREVIEW – do not use it in production, and do not rely on it for any tasks you cannot afford to lose. Having said that, I have found it stable and reliable, as with the Windows 10 client. The difference is that I use the Windows 10 client on my main work machine and just about all the other machines I use (with a couple of exceptions), whereas the server version is very definitely a test-rig setup for me at present.

So, what is new, and of those new things, what are my favourite features and why? This is the first post in a series examining major new functionality in the Technical Preview.

In Server 2012 one of the big five features for me was Hyper-V Replica. The first new feature of the Technical Preview I want to describe is called Storage Replica.

To quote the TechNet site, Storage Replica (SR) is a new feature that enables storage-agnostic, block-level, synchronous replication between servers for disaster recovery, as well as stretching of a failover cluster for high availability. Synchronous replication enables mirroring of data in physical sites with crash-consistent volumes ensuring zero data loss at the file system level. Asynchronous replication allows site extension beyond metropolitan ranges with the possibility of data loss.

OK, that sounds (a) like a lot of technical stuff and (b) pretty exciting and revolutionary for an out-of-the-box, no-cost inclusion in a server operating system. So what exactly does it do, and how does it do it?

Well, Server 2012 introduced the next version of SMB (SMB 3.0); this brought a vast number of performance and reliability improvements for file servers and storage, as well as for normal communications using the SMB protocol.

In short, the feature enables an all-Microsoft DR solution for both planned and unplanned outages of your mission-critical workloads. It also allows you to stretch your clusters to metropolitan scale.

What is it NOT?

  • Hyper-V Replica
  • DFSR
  • SQLAlwaysOn
  • Backup

Many people use DFSR as a disaster recovery solution; it is not well suited to this, even though it can be pressed into service. Storage Replica is true DR replication, in either synchronous or asynchronous fashion.

Microsoft has implemented synchronous replication in a different fashion from most other providers: it does not rely on snapshot technology but replicates continuously instead. This leads to a lower RPO (Recovery Point Objective, meaning less data could be lost), but it also means that SR relies on the applications to provide consistency guarantees rather than on snapshots. SR does guarantee consistency in all of its replication modes.

There is a step-by-step guide available here, but I have included some notes below for those who don’t want to read it all now (all 38 pages of it). (Images are taken from that guide and from live screenshots.)

The Technical Preview does not currently allow cluster to cluster replication.


Storage Replica is capable of BOTH synchronous and asynchronous replication, as shown below. And anyone who knows anything about replication knows that to do this there must be some significant hardware and networking requirements.

[Diagrams: synchronous and asynchronous replication]

So what are the pre-requisites to be able to use Storage Replica in a stretch cluster?

The diagram below represents such a stretch cluster.

[Diagram: a stretch cluster]

  • A Windows Active Directory domain (it is not necessary to host this on the Technical Preview).
  • Four servers running the Technical Preview. All must be able to run Hyper-V and have a minimum of 4 cores and 8GB of RAM. (Note: physical servers are needed for this scenario; you can use VMs to test server-to-server replication, but not a stretch cluster with Hyper-V.)
  • Two sets of shared storage, each one available to one pair of servers.
  • At least one 10GbE connection on each server.
  • Ports open for ICMP, SMB (445) and WS-Man (5985) in both directions between all four servers.
  • A test network with at LEAST 8Gbps throughput and, importantly, a round-trip latency of less than or equal to 5ms. (This is measured using 1472-byte ICMP packets for at least 5 minutes; you can measure it with the simple ping command shown after this list.)
  • Membership in the built-in Administrators group on all server nodes.
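Something along these lines will do that latency measurement (the server name is hypothetical; -l sets the ICMP payload to 1472 bytes, which with the 28 bytes of IP/ICMP headers fills a standard 1500-byte MTU, and -n 300 sends roughly five minutes’ worth of once-per-second echoes):

    ping -l 1472 -n 300 sr-srv03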

This is no small list of needs.

The step-by-step guide demonstrates the setup in two ways, and is a total of 38 pages long.

All scenarios are achievable using PowerShell 5.0, as available in the Technical Preview. Once the cluster is built, it takes just a single command to configure the stretch-cluster replication.


You could of course choose to do it in stages using the New-SRGroup and New-SRPartnership CmdLets, along the lines sketched below.
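As a rough sketch (server names, volumes and group names are hypothetical, and parameter names may well change between preview builds), the staged approach looks something like this:

    # Create a replication group on each end, naming the data and log volumes
    New-SRGroup -ComputerName 'SR-SRV01' -Name 'RG01' -VolumeName 'D:' -LogVolumeName 'E:'
    New-SRGroup -ComputerName 'SR-SRV03' -Name 'RG02' -VolumeName 'D:' -LogVolumeName 'E:'
    # Tie the two groups together as source and destination
    New-SRPartnership -SourceComputerName 'SR-SRV01' -SourceRGName 'RG01' -DestinationComputerName 'SR-SRV03' -DestinationRGName 'RG02'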

If, like me, you do not have the hardware resources lying around to build such a test rig, you may want to try and test the server-to-server replica instead.

This requires:

  • A Windows Server Active Directory domain (it does not need to run the Windows Server Technical Preview).
  • Two servers with the Windows Server Technical Preview installed. Each server should be capable of running Hyper-V, have at least 4 cores and have at least 4GB of RAM. (Physical or VM is fine for this scenario.)
  • Two sets of storage, containing a mix of HDD and SSD media. (Note: USB and system drives are not eligible for SR, and no disk that contains a Windows page file can be used either.)
  • At least one 10GbE connection on each file server.
  • A test network with at LEAST 8Gbps throughput and, importantly, a round-trip latency of less than or equal to 5ms (measured with 1472-byte ICMP packets for at least 5 minutes, using the same ping command as above).
  • Ports open for ICMP, SMB (445) and WS-Man (5985) in both directions between both servers.
  • Membership in the built-in Administrators group on all server nodes.

NOTE – the PowerShell CmdLets for the server-to-server scenario work remotely and locally, but only for the creation of the replica, not for removing or amending it with the Remove or Set CmdLets (make sure you run those CmdLets locally ON the server you are targeting for the group and partnership tasks).

I do urge you to go off and read more about this solution, and test it if you can, but remember things are not yet fully baked and will change with each release, AND do not use them in production yet. Read the guide for known issues as well; there are a few.

Finally, why do I love this feature? No one likes to think about a disaster, but if you don’t plan for it, when it does happen it truly will be a disaster in every respect. Storage Replica allows a much cheaper but effective way of maintaining a current, accurate replica of your data, either on a separate server or on a separate site within a stretch cluster.

It is still pricey on hardware and networking, BUT much cheaper than a full hot-site DR centre with old-style full synchronous replication.

Watch this space for more Server Technical Preview hot features.

Future Decoded – 12th November – Enterprise Mobility

 


If you are at all interested in Enterprise Mobility then Wednesday 12th November at ExCeL in London is the place to be. This FREE event has lots going for it.

Enterprise mobility is not all that is on offer. The morning is dedicated to some big-hitting keynote speakers, including Professor Brian Cox, Sir Nigel Shadbolt and Michael Taylor.


The afternoon agenda is packed with excellent Enterprise Mobility sessions, from customer case studies to demonstrations of the latest Windows 8.1 gadgets, and deep-dive sessions including demonstrations of the journey into mobile device management at enterprise level.

There is no better time to get yourself up to speed on the Enterprise Mobility Suite from Microsoft, which includes Microsoft Azure Active Directory Premium, Windows Intune and Azure RMS. All of these work together with Windows Server to provide an end-to-end mobility and identity solution, irrespective of your end users’ device choice.


Some of Microsoft UK’s top technical specialists will be presenting and answering your EMS related questions.

  • Devices, devices everywhere – Jamie Burgess, Mobility Lead, Microsoft UK
  • Hybrid Identity Management – Daniel Kenyon-Smith, Solution Architect, Microsoft Consulting Services
  • Desktop and Application Virtualisation – Doug Elsley, Application and Desktop as a Service Lead, Microsoft UK

The session is being kicked off with a case study from Andy Turner of Mitchells and Butlers, a customer story well worth following.

You can register here; I do hope you have time to spare to attend this fantastic event. I will be there hosting the mobility track, so do come and find me and introduce yourself.

Oh and did I mention it is completely FREE!

New release to manage? Top tips on communicating, educating and activating users.


As IT Professionals, it is all too easy for us to lose sight of the impact our daily job can have on the poor, unsuspecting species known as the ‘user’. It could be said that, having spent the last 20-plus years working entirely in or around the IT industry, I am both long in the tooth and set in my ways. One is probably true; I sincerely hope that the latter is not. Those of us not willing to change will indeed wither and fade.

Twenty years ago the needs and wants of the user were a secondary consideration: what equipment we used, what the applications and devices were capable of and looked like, and indeed the user experience as a whole were not foremost in the minds of the designer or implementer. This may not have been true across the board, but I certainly have been subjected to some seriously poor implementations in both the public and private sectors.

Today the needs and wants of the user seem to be a primary consideration, which can only be a good thing. Not only should applications be designed to make the human-computer interaction as easy and intuitive as possible, but the actual deployment and updating of the solution should also take users into consideration.

This is not a new idea. The IT Infrastructure Library (ITIL) has a section covering release management:

 

6.3.4 Release and deployment management

The purpose of the release and deployment management process is to plan, schedule and control the building, testing and deployment of releases, and to deliver new functionality required by the business while protecting the integrity of existing services.

Definition: release

One or more changes to an IT service that are built, tested and deployed together. A single release may include changes to hardware, software, documentation, processes and other components.

Effective release and deployment delivers significant business value by delivering changes at optimized speed, risk and cost, and offering a consistent, appropriate and auditable implementation of usable and useful services.

Release and deployment management covers the whole build, test and implementation of new or changed services, from planning through to early life support.

As a headline set of principles this is laudable; however, the communication and training process is critical to the acceptance of a system by its users. No matter how good the product, if it does not win the ‘hearts and minds’ it is unlikely to succeed.

For that reason the IT Pro not only has to be marketing-savvy but actually should be a proficient marketer too.

It may not be possible to outline to all users the detailed roadmap and dates of new products, or even of significant updates to current products, for a number of reasons, ranging from the political and financial to the business-critical.

It should though, be possible to provide a comprehensive outline of what to expect as a user. Once the roadmap is revealed, then the communication of any significant outages, product limitations and even of required training or activation processes becomes critical to the success of a release.

It is useless to have a sophisticated IT service if the users are not able to make the most of it for lack of communication or training, in addition to all the other ITIL processes shown below.

[Diagram: the ITIL release and deployment management processes]

To produce a fully rounded release requires all of the processes. To market it well and to deliver it successfully relies upon Communication and training.

So if you are an IT Pro considering deploying something new to users, have you thought about:

 

  • A SharePoint site developed purely for this release, where all users can access documentation, FAQs and even e-learning training courses.
  • Working with the developers to make sure that it is the users that matter, not just the potential ‘throw it over the wall’ type of deployment, from Dev to Ops to user. Visual Studio Online and Visual Studio work very well together to provide a rounded development solution.
  • Providing an email alias for users to submit questions and, heaven forbid, bugs to.
  • Providing regular feedback to the users, either through the SharePoint site or a web site (Microsoft Azure is good for hosting this, of course).

With the speed of releases increasing for all types of deployment, from minor updates to completely new solutions, the user could easily get forgotten; this is not a good idea.

 

Remember the old adage ‘hell hath no fury like a user scorned’, or something like that.

Azure RemoteApp Part 2 – More RemoteAppyness from Azure


Having published a post on Azure RemoteApp recently, I have been absolutely inundated with requests (well three people have asked me) for information on how to use Azure RemoteApp with an application that is not published by default and that I DON’T want to run on my own premises.

 

So here is part 2 of 3 in the Azure RemoteApp story of goodness. Publishing my own apps to Azure RemoteApp.

As a keen photographer I rely heavily on Adobe Lightroom, so I decided I would deploy a 30-day trial of this to Azure RemoteApp.

To be honest, I had thought this post would be difficult to describe and deliver, but at least 60% of the tasks are identical to those in the first post. The ONLY difference is that you must create and upload your own image, in the form of a Virtual Hard Disk (VHD), to Azure blob storage (no, I’m not being insulting about Azure storage; a BLOB is a Binary Large Object).

The tricky part is that there are a bunch of pre-requisites for this image, its internal setup and the applications on it. So some patience is required.

To follow these steps, you must already have an Azure subscription (a trial will do) and you must have an Azure RemoteApp preview service enabled. Both of these steps are described and links included in my previous post, here.

Step 1 – Create your Image.

There are a number of ways to do this; of course PowerShell is the best way if you want to repeat the process, and I will publish a handy script to do the basics another time. For now, just use either Hyper-V or Disk Management to create a dynamically expanding VHD with a Master Boot Record (MBR) partition type. All three of those factors are mandatory.

Azure is not yet able to handle the newer VHDX format, so make sure it’s a VHD. A dynamically expanding disk is much more efficient for the uploading and blob storage process. And the Azure system only handles the MBR partition style rather than the GPT (GUID Partition Table) type.
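If you have the Hyper-V module to hand, a sketch of the disk creation (path and size are hypothetical) looks like this; the MBR part is set when you initialise the disk during Windows Setup, or yourself from PowerShell:

    # Create a dynamically expanding VHD (not VHDX)
    New-VHD -Path 'C:\Images\GoldImage.vhd' -SizeBytes 127GB -Dynamic
    # If you initialise the disk yourself, keep it MBR rather than GPT, e.g.:
    # Initialize-Disk -Number 1 -PartitionStyle MBR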

Once you have this created, you can go ahead and create a VM based on that disk. The next pre-requisite is that you MUST install Windows Server 2012 R2 on the disk. The machine can have several volumes, BUT only one instance of Windows Server can be installed. This caters for the scenario where the application you wish to use as a RemoteApp cannot co-exist on a system partition.

Having found a Windows Server 2012 R2 ISO or DVD and installed it to the VM (you will need to enter a valid licence key, which will be stripped out later on), there are several more tasks to complete before you can upload your VHD as a ‘gold’ image to the Azure storage platform in your subscription.

First, you should use either Server Manager or PowerShell to add the Desktop Experience feature, which is hidden under the User Interfaces and Infrastructure section.


This will almost certainly require a reboot.

Having rebooted the VM, again use Server Manager or PowerShell to add the Remote Desktop Services role and, during the setup stage of the wizard, add only the RD Session Host option. The normal process for installing RDS is to choose the Remote Desktop Services installation option, as most of the work of setting up your systems and servers is then carried out for you.

That approach will not work here for a number of reasons, not least that it requires a domain structure. So for this install you will need to choose ‘Role-based or feature-based installation’. For those who prefer PowerShell, both feature additions are sketched below.

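A minimal sketch of both feature additions (run in an elevated PowerShell session; -Restart reboots automatically where needed):

    # Add the Desktop Experience feature
    Install-WindowsFeature -Name Desktop-Experience -Restart
    # Add only the RD Session Host role service
    Install-WindowsFeature -Name RDS-RD-Server -Restart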

There are now a few settings to make to ensure that the Azure upload works ok.

First, disable the Encrypting File System (EFS); this requires the following command, entered at an elevated command prompt.

Fsutil behavior set disableencryption 1

If you know what you are doing and are used to hacking around in your registry, you can instead add the following DWORD here:

HKLM\System\CurrentControlSet\Control\FileSystem\NtfsDisableEncryption = 1

(The usual caveats apply around backing up the registry and not doing this unless you know what you are doing.)
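If you prefer PowerShell to a registry editor, the equivalent (same caveats apply) is:

    Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem' -Name 'NtfsDisableEncryption' -Value 1 -Type DWord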

So far I have assumed that you are creating this image on-premises, which in this ‘Cloud-First, Mobile-First’ world is an error on my part. There is an additional step to take if you are creating your image within an Azure VM.

There is an XML file stored at \Windows\Panther\unattend.xml which needs to be renamed or deleted. If this step is not carried out, the upload script (which also assumes an on-premises image source) will fail.
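A one-liner such as this, run inside the VM, does the renaming job:

    Rename-Item -Path 'C:\Windows\Panther\unattend.xml' -NewName 'unattend.xml.bak'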

The final three steps are:

  • Fully update the operating system using Windows Update.
  • Install the applications you intend to use as RemoteApps (my demonstration uses Adobe Lightroom, a digital image importing, cataloguing and editing application). Write down the install path of each application; you will need it later.
  • Generalize your system, so that when Azure starts the image during the process of uploading and provisioning, it does not find keys, users or passwords that would block the process.

The command required for generalization is

  C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown

This will place the VHD in the same state as immediately after installation, before entering the product key and administrator password (OOBE). Do remember that even though Sysprep (one of my favourite tools) has a switch designed for use with VMs, you should not use the /mode:vm switch, as this would cause Azure to reject the image.

Even though this is a very simple set of instructions, make sure that all of the steps are completed AND in the order stated. If you do not, expect a lot of red in your PowerShell results! (Hint: RED is bad. I saw a lot of red in creating this post.)

Step 2 – upload your image.

Provided that you have completed step 1 correctly this is the easiest but longest part of the process.

First, in Azure, go to the RemoteApp section, select Template Images and then Upload, and answer the wizard options of name and location. (BIG NOTE: remember the location, and make sure it is the same as the location you choose when you create your RemoteApp service; if not, you will not see this image.)


Having entered the template image name, the wizard will automatically start downloading the Upload-AzureRemoteAppTemplateImage.ps1 script (remember where you save this).

The wizard will also contain a link to download the Azure PowerShell module, and the details of the command to run.


Here is the command in a more readable form; simply copy and paste it into a Notepad document so you can reuse it (in case your first few tries fail):

.\Upload-AzureRemoteAppTemplateImage.ps1 -SAS “?sv=2012-02-12&sr=b&si=428348c2-81f9-4ce6-9021-635289465915&sig=DMAia%2F0qa%2B4pXoDad5mRyJEjx%2BTyWVNEJW1Ah%2BDXjnY%3D” -URI https://cdvwu110638459rdcm.blob.core.windows.net/goldimages/428348c2-81f9-4ce6-9021-635289465915.vhd

Ensure that your VHD image is stored on directly attached storage as the wizard does not pick up network shares.

Having downloaded Azure PowerShell, go ahead and run it from the Start screen as an administrator.


Change to the drive and directory where you downloaded the script, paste the command in and run it. The script asks for the location of the VHD; select the correct image and off you go.


My upload took about 1.75 hours and was around 10GB in size.


From this point ALL the steps are identical to those in my first post. If the applications you want to use are not on the Start menu, they will not show up in the Publish applications window, and you will need to add them manually by path (hence my suggestion to write down the path).


Once the app is published, you can simply look for more app invitations if the Microsoft RemoteApp client is already installed and configured against a different Azure RemoteApp service. The new apps will then show up and you can run them at your leisure.


Note here that the first time you run an app within a session from EACH Azure RemoteApp service, it will take longer than usual, as the client has to set up the connection to the new session host. Also note that despite there being different session hosts, there is only one connection to Azure RemoteApp; all connections go through this one connection, as shown in the Task Manager (Processes) screenshot below. See also that even with many RemoteApps open, almost no processing power is used, and not that much RAM either. I can also run duplicate copies of the same application on the same machine (one RemoteApp and one local), which is also shown in the Task Manager screenshot.

[Screenshot: Task Manager processes with multiple RemoteApps running]

In preview, I am limited to ten concurrent sessions.

[Screenshot: the sessions listing in the Azure console]

The sessions listing in the Azure console allows you to keep track of this. I can even log off remote sessions, disconnect them (for later re-connection) or send a message to the user’s console.


Of course, simply because it is easier for me to do quickly, I have shown all this on a Windows 8.1 client. It is available on any platform where a RemoteApp client exists (OS X, iOS, Android, Windows).

Again I have gone well over my self-imposed 1000-word limit, so I will keep the hybrid implementation of Azure RemoteApp for another day. Happy Apping.

 

 

MCSE Update – Beware the path you choose to tread.


Despite my crossing the professional tracks from full-time MCT to full-time Microsoft employee as an evangelist, I am still keen to engage with anyone on a certification journey, and with those still teaching and training our hard-working IT Pro community. As such I feel the need to react to the announcement that there is now a choice in the path you take to gain your MCSE certification for Communications, Messaging or SharePoint.

So first up, what was the position?

Well, when Microsoft released Windows Server 2012, it also reinvented the MCSA and MCSE certifications, which I covered here and here, so no need to repeat that. The key point for this post is that the ONLY route to any of the new MCSE badges was the MCSA Windows Server 2012 (three exams, 410, 411 and 412, each getting progressively more challenging).

* The caveat to this (there is always one of those) is that if you held certain prior certifications you could take the upgrade exam (417), which I found harder than the other three put together.

* The other caveat is that if you possessed the MCSA Server 2008 then you could use that to gain your MCSE for a specified time.

All that being equal, essentially the only route in now is to get your MCSA Server first, then take the two examinations for your chosen specialism.

As shown in the graphic below.

[Graphic: the MCSE certification paths]

As a key, Communications relates to the Lync product, Messaging to Exchange, and SharePoint is self-explanatory. But as I explained in the previous posts, the two MCSE-level exams are not limited to single products like the MCSA exams; they are much more detailed, complex and wide-ranging in style and content.

So what has changed?

Well, with the release of the Office 365 MCSA (which I have posted about quite a number of times here), there is now the option to study for an MCSE without holding any on-premises server certifications.

Gaining the MCSA Office 365 is now an available route to the MCSE.


 

Those who have read my trials and tribulations in achieving the aforementioned MCSA will know that it certainly is not an easy option. But there are more detailed questions to ask.

Is it right to be an MCSE, the premium level of certification now that Master and Architect have been benched, without any tested knowledge of key skills such as DNS, DHCP, VPN, TCP/IP, Active Directory, ADFS and so on?

My immediate and resounding answer to that is NO! Certainly not.

The clever thing here is that by granting an MCSA after only two exams instead of three, it looks like a quick and easy way to sneak under the skills barrier.

Far from it.

I have no doubt that those who cheat will still cheat, and to them it makes no difference, as they had none of those skills before. But those who choose the cloud route rather than on-premises will still need the whole list of skills I mentioned above to be successful in the Office 365 exams.

Both exams test a great deal of Active Directory, ADFS, PowerShell and authentication knowledge, in addition to the key skills for administering the individual online versions of Exchange, Lync and SharePoint. The final element is, of course, the other portals that are required to administer Office 365 and set up the subscriptions.

It may be only two exams but, having taken all the exams on both routes, the Server 2012 on-premises route was by far the less challenging for me.

This may have been because I already had the skills as a basis; those who come in the future knowing no on-premises products will not, in which case either route will be seriously beyond their reach without a great deal of study.

I think this addition is a good step, and it opens up a number of possibilities for future IT Pros to concentrate on a cloud-first route into certification. Whichever the route, they will possess one of the most respected IT certifications around, which will be of great value for the three years until re-certification is required!