Tuesday, February 27, 2018

Why following best practices in Azure is a good idea

Over the years I have seen many Azure solutions built in ways that are contrary to agreed best practices. Why does this happen? Typically, it is because people bring old concepts and methodologies to new environments like Azure. Yes, many of the fundamentals are the same. Things like TCP/IP and networking work much as they do on premises, but other things are very, very different.

One of the key differences when it comes to storage with Azure Virtual Machines (VMs) is the disk topology. When you spin up an Azure VM you typically get two drives, C: and D:. C: is the boot partition and holds the operating system, while D: is a temporary or caching disk whose contents can be lost whenever the VM is rebooted, redeployed or moved to another host, so nothing on it should be considered persistent.


Above you can see an example of a topology from an Azure machine. You will see that D: has the label ‘Temporary Storage’.


A closer look at D: reveals the contents shown above.


If you look at the contents of the warning file you see the above. Note the first line which says (in capitals):


Why am I emphasising this? I can’t tell you the number of people I have seen bring previous practices to Azure and put production data (such as Active Directory databases) onto this temporary drive because ‘this is the way they have always done it’. That, unfortunately, is only going to end in tears.

Best practice when it comes to Azure is to always add data disks to your VMs and start the drive lettering from F:. Yes, there is an additional cost for adding data disks, but that cost is small compared to the flexibility you gain.
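Adding a data disk can be done from the portal, or scripted. Here is a minimal sketch using the AzureRM PowerShell module that was current at the time; the resource group, VM and disk names are placeholders for illustration only:

```powershell
# Placeholder names - substitute your own
$rg = "MyResourceGroup"
$vm = Get-AzureRmVM -ResourceGroupName $rg -Name "MyVM"

# Add a new, empty 128 GB managed data disk on LUN 0
Add-AzureRmVMDataDisk -VM $vm -Name "MyVM-data1" -Lun 0 `
    -CreateOption Empty -DiskSizeInGB 128

# Push the configuration change to Azure
Update-AzureRmVM -ResourceGroupName $rg -VM $vm
```

Once attached, you initialise and format the disk inside the guest and assign it F: (or later).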

Case in point. I have a nested virtualisation server running in Azure that hosts a number of machines for testing. This machine has two data disks striped together for storage and performance optimisation. Using striping is another departure from the de facto approach that I’ll look at in an upcoming article.
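For the curious, striping two data disks is done inside the VM with Storage Spaces. A hedged sketch (pool and volume names are my own placeholders; ‘Simple’ resiliency is the striped layout):

```powershell
# Run inside the VM: pool the raw data disks that are eligible for pooling
$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName "DataPool" `
    -StorageSubSystemFriendlyName "Windows Storage*" -PhysicalDisks $disks

# 'Simple' resiliency = striping; two columns spread I/O across both disks
New-VirtualDisk -StoragePoolFriendlyName "DataPool" -FriendlyName "StripedDisk" `
    -ResiliencySettingName Simple -NumberOfColumns 2 -UseMaximumSize

# Initialise, partition as F: and format the new striped volume
Get-VirtualDisk -FriendlyName "StripedDisk" | Get-Disk |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -DriveLetter F -UseMaximumSize |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel "Data"
```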

Unfortunately, after I applied some recent Windows updates the machine decided it no longer wanted to boot. I tried all the usual troubleshooting tips to get the system to boot, but to no avail.


I therefore went into the disk configuration of the failed machine and ‘detached’ the existing data disks, which, as you can see, you can do from the Azure portal, although there are also PowerShell commands to accomplish this.

With the data disks ‘freed’ from the original failed machine, I proceeded to create a new virtual machine to mirror the original failed host. After doing this I went to the disks area of the new machine and selected the option to Add data disk. However, instead of creating new, clean disks, I elected to use existing disks and selected the ones that I had detached from the failed original.
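The same detach and re-attach can be scripted with the AzureRM module. A sketch under the assumption of managed disks, with placeholder resource group, VM and disk names:

```powershell
$rg = "MyResourceGroup"

# Detach the surviving data disk from the failed VM
$oldVm = Get-AzureRmVM -ResourceGroupName $rg -Name "FailedVM"
Remove-AzureRmVMDataDisk -VM $oldVm -DataDiskNames "FailedVM-data1"
Update-AzureRmVM -ResourceGroupName $rg -VM $oldVm

# Attach that existing managed disk to the replacement VM
$disk  = Get-AzureRmDisk -ResourceGroupName $rg -DiskName "FailedVM-data1"
$newVm = Get-AzureRmVM -ResourceGroupName $rg -Name "ReplacementVM"
Add-AzureRmVMDataDisk -VM $newVm -Name $disk.Name -Lun 0 `
    -CreateOption Attach -ManagedDiskId $disk.Id
Update-AzureRmVM -ResourceGroupName $rg -VM $newVm
```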

When I now looked at the new machine, with the existing disks attached, I found that the striping environment was already in place and needed no further configuration. All I needed to do was restore my virtual machines that were on the data disks using Hyper-V Manager. All really simple.

If I had installed everything on the C: drive then I would have lost the lot and would have needed to rebuild every virtual machine in that Hyper-V environment from scratch. That would have cost me a lot of time, whereas the total recovery time here was only a matter of minutes. That’s a BIG difference!

The moral of this tale is that a new environment like Azure does operate in a different manner from previous technologies. It is generally not appropriate to bring old practices to a new environment without taking time to understand the best practices for that environment. Doing things the same old way just because this is the ‘way it’s always been done’ can lead to a lot of pain and heartache. On the contrary, when you take the time to understand any new environment and follow best practices for it, things tend to be much easier, as the above hopefully illustrates. This applies as much to Azure as it does to Office 365. New technologies need new approaches and new best practices.

In summary, please oh please DON’T put your production data on C: or D: with Azure virtual machines.

Friday, February 23, 2018

February Azure Webinar Resources

Here are the slides from the February Azure webinar where we took a look at Azure networking.


The recording is also available at:


which CIAOPS patrons get free access to as part of their subscription.

This webinar set more of the groundwork for upcoming monthly webinars that will go deeper into Azure features and abilities.

So make sure you sign up for next month’s webinar.

February Office 365 Webinar Resources

Good to see such large numbers for this month’s webinar. Obviously, a topic of great interest to many.

Slides from this month’s webinar are at:


If you are not a CIAOPS patron and you want to view or download a full copy of the video from the session, you can do so here:


We looked at PowerApps in this session.

Watch out for next month’s webinar.

Thursday, February 22, 2018

Need to Know Podcast–Episode 174

In the absence of Marc Kean, who is busy at Microsoft, let me introduce my new co-host Brenton Johnson from Uptake Digital. Brenton comes from a ‘born in the cloud’ IT business that looks after customers’ digital needs and implements cloud solutions for them. In this episode we meet Brenton and find out about his background, and we also cover some news and updates from the world of Azure and Office 365. Have a listen and let us know what you think of the changes. We are still finding our feet in Marc’s absence.

Take a listen and let us know what you think: feedback@needtoknow.cloud

You can listen directly to this episode at:


Subscribe via iTunes at:


The podcast is also available on Stitcher at:


Don’t forget to give the show a rating as well as send us any feedback or suggestions you may have for the show.




January Update for Microsoft Teams

New features in Planner

New apps in Microsoft Teams

Outlook Groups app is retiring

Use SharePoint web parts to showcase data from inside and outside Office 365

Azure revenues

Hybrid Cloud printing

Azure Cloud Shell

Protect machines using managed disks and ASR

Wednesday, February 21, 2018

OneDrive Office sync conflicts

I recently wrote an article about

Offline file conflicts with SharePoint Online

that ran through the process of what happens when users go offline when working on shared files.

After doing some more poking around in the latest OneDrive for Business sync client I found this under the Office tab in Settings:


You can find more information on the first option here:

Use Office 2016 to sync Office files that I open

which notes:

If you turn off this setting, Office will no longer be able to automatically merge changes from different versions of documents. You'll also be prompted to upload a new copy of a file before you can share it directly from an Office desktop app.

You can also elect how to handle Sync conflicts, which by default is set to Let me choose to merge changes or keep both copies.

The default options are going to suit most people, but you can customise these if you wish to improve how conflicts are handled in your environment.

Saturday, February 17, 2018

My podcasts 2018


Apart from my Kindle and Audible consumption I spend a lot of time listening to podcasts. Whether travelling in the car, on the train, out walking, taking a flight, wherever. I’m not usually far from having a podcast in my ear.

So here’s my current top listening list:

1. Windows Weekly

The latest Microsoft news with some fun and entertainment along the way. Paul Thurrott’s musings alone make this podcast worth listening to.

2. The Tim Ferriss Show

Some really great advice, business insights and strategy. Also lots of life lessons that I have found work really well for me. A weekly must listen for me.

3. Microsoft Cloud Show

Beginning to lose its appeal for me. It is becoming too dev heavy and repeating stuff that I already know about. Also, it’s becoming a bit too much of a ‘space nerds’ podcast.

4. Hardcore History

Not a regular event, but when these episodes drop I’m all ears. They are a deep dive into history told by a master narrator. If you love history, you’ll love these episodes.

5. Jocko Podcast

Probably too hard core for most. For me it is a great mix of military history and business mindset training. If you have a ‘fanatical’ tendency then give this one a listen.

6. Unbeatable Mind Podcast

Still some worthwhile content but becoming less so for me. Maybe time to put this one on the back burner for a while.

7. Let’s Talk Crypto podcast

An Australian show that walks you through the basics. Needs some deeper content to keep me listening long term but for now a good summary and getting started point for crypto, especially if you want an Australian bent.

8. Microsoft Cloud IT Pro podcast

A lot of news around the MS Cloud but also a lot of snide comments about unfavoured MS services which can be a tad grating given they do have value to many. Short and sweet but perhaps too short? Again, another one up for review in 2018.

9. The Kevin Rose Show

A bit like the Tim Ferriss podcast. Plenty of interesting and different stuff that always makes you think. Somewhat irregular episodes but I am still enjoying what I’m hearing.

I listen to all episodes at 2X speed or faster, which allows me to crank through most of these in a week.

There are also a few new podcasts I’ve recently picked up that I am still evaluating as to whether they’ll remain favourites. I currently download them all, but do I listen regularly? Probably not. If I have missed a few episodes then, after a while, I’ll probably remove them from my playlist. Finding informative and enjoyable content is proving harder for me of late.

Since 2010 I have published my own podcast:

Need to Know podcast

which covers the Microsoft Cloud (typically Office 365 and Azure) as well as business topics. I encourage you to have a listen and let me know what you think.

So what do you listen to and recommend?

Thursday, February 15, 2018

Azure Shell comes directly to browsers

One of the really cool things Azure has introduced recently has been its Cloud Shell.


This is the ability to run a PowerShell command line directly in a browser or on a mobile device. You do this by selecting the shell icon in the portal as shown above. When you do so, you get a command line in the lower half of your browser. Really handy.

Now, all of this happens in a browser, but until now you needed to access it by logging into the Azure portal. Well, if you now navigate directly to:


You’ll be able to login to the Azure Cloud Shell directly.


When you go there you’ll need to select a subscription to use (in case you have a few).


Once you have selected this an Azure Cloud Shell will spin up right in your browser as shown above. It may take a few minutes to do this and actually get to the command prompt, so be patient.


Once there, you can execute PowerShell commands against the tenant.
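If you’re wondering what to try first, a few read-only commands make a good smoke test. A sketch under the assumption that Cloud Shell has loaded the AzureRM module for you (which it did at the time of writing):

```powershell
# List the subscriptions your login can see
Get-AzureRmSubscription

# List all resource groups with their locations
Get-AzureRmResourceGroup | Select-Object ResourceGroupName, Location

# List all virtual machines in the selected subscription
Get-AzureRmVM
```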

How cool is that?

Wednesday, February 14, 2018

Email Message Header Analyzer for Office 365

Much of the diagnostic detail relating to emails is buried in locations that you can’t see. If you have the need to examine email messages for troubleshooting or security this can be a challenge.


A great tool you can add to your arsenal is the free Message Header Analyzer which you can find here:


Once installed you will find an additional button in your OWA:


That when selected will give you a range of options you can use to dive deep into the technical information surrounding the email in question.


I especially like the ability to dig into the SPF and DKIM style details.


If you need to do any troubleshooting or email analysis on a regular basis, I’d highly recommend you add this to your inbox.

Tuesday, February 13, 2018

Double check those links

Unfortunately, as services like Office 365 become more prevalent so too do the attacks against them. These attacks are going to target people who are the least IT savvy.


The above is the first example of an email I received this morning. Being close to Valentine’s Day it would be easy for an ordinary user to click on the link provided inside to download the PDF of their order.


However, if you mouse over that link, you see that it actually re-directs you to a malicious web site, but of course a user isn’t going to know that.


I gotta say that the malicious web site really does look like an Office 365 login page, doesn’t it? The only obvious giveaway is the URL at the top of the page.


Upon closer inspection you see that it is in fact not going to the Office 365 login URL which is:


You’ll also note that the email address is already in the dialog box so all a user would need to do is press enter as they normally would.


At the next page they are prompted for their password. Again, a very, very authentic looking Office 365 login page.

Typically, the user would enter their password and hit enter. At this point their login details have been sent to the bad guys and the user is redirected to the correct Office 365 login page. The user, of course, thinks they entered something wrong and goes through the process again. However, their account has now been compromised, pretty much without them realising.


Here is the next phishing email that I received moments after getting the first. This one appears to be directly from Microsoft, requesting an update to the security of the Office 365 account.

This preys on the underlying fear most users have of technology in order to get them to click the link.


If they do so, they are again taken to another ‘official’ looking Office 365 login page as you see above.


Again, this one has a non-Office 365 login URL as shown above. Like the previous case, this site has its own certificate (HTTPS), making it appear even more legitimate.

So if you come across these sites, first course of action is to report them to Microsoft.

Submit spam, non-spam and phishing scam messages to Microsoft for Analysis

Because these types of attacks are new in the wild, they are typically not picked up by reputation based systems. Eventually they are picked up, as in the browser here:


but until they are, there really isn’t much that can be done.

I’ve said this before, security is tough:

The bad guys keep winning

and technology can’t be used to solve every issue. We need to couple that with education to help people ask the right question before potentially doing the wrong thing.

If something in your inbox doesn’t seem right, chances are it isn’t. So treat it with caution.

Friday, February 9, 2018

CIAOPS Need to Know Azure Webinar–February 2018


The February session will build on the knowledge we have covered so far and dive into Azure networking. There’ll also be news and updates as well as open Q & A, so I’d love to see you attend.

You can register for free at:

February Azure Webinar Registrations

The details are:

CIAOPS Need to Know Azure Webinar – February 2018
Thursday 22nd of February 2018
2pm – 3pm Sydney Time

All sessions are recorded and posted to the CIAOPS Academy.

There of course will also be open Q and A so make sure you bring your questions for me and I’ll do my best to answer them.

The CIAOPS Need to Know Webinars are free to attend but if you want to receive the recording of the session you need to sign up as a CIAOPS patron which you can do here:


or purchase them individually at:


Also feel free at any stage to email me directly via director@ciaops.com with your webinar topic suggestions.

I’d also appreciate you sharing information about this webinar with anyone you feel may benefit from the session.

CIAOPS Need to Know Office 365 Webinar–February


In the February webinar we’ll take a closer look at using PowerApps as a way to capture information and create forms inside SharePoint. There will be the usual news, updates and Q & A on Office 365.

You can register for free at:

February Webinar Registrations

The details are:

CIAOPS Need to Know Webinar – February 2018
Thursday 22nd of February 2018
11am – 12pm Sydney Time

All sessions are recorded and posted to the CIAOPS Academy.

There of course will also be open Q and A so make sure you bring your questions for me and I’ll do my best to answer them.

The CIAOPS Need to Know Webinars are free to attend but if you want to receive the recording of the session you need to sign up as a CIAOPS patron which you can do here:


or purchase them individually at:


Also feel free at any stage to email me directly via director@ciaops.com with your webinar topic suggestions.

I’d also appreciate you sharing information about this webinar with anyone you feel may benefit from the session.

Thursday, February 8, 2018

Offline file conflicts with SharePoint Online

It has been over three years since I wrote an article about file conflicts in Office 365 -

Resolving OneDrive for Business file conflicts

and as you can appreciate a lot has changed since then. Probably the biggest change is that we now have Files On-Demand and the ability to sync SharePoint Document Libraries. However, there will always remain challenges around shared files going offline while multiple people continue to work on them.

I will preface all this by saying that it is best practice to ‘Check Out’ any files you wish to use prior to going offline. Doing so will ensure you have exclusive write access to that file while you are offline and until you check that file back in.
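The check out and check in can also be scripted. A minimal sketch using the SharePoint PnP PowerShell module, assuming the module is installed and with placeholder site and file URLs:

```powershell
# Connect to the site that holds the document library (placeholder URL)
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/team" -UseWebLogin

# Check the file out before going offline for exclusive write access
Set-PnPFileCheckedOut -Url "/sites/team/Shared Documents/conflicts.xlsx"

# ...work offline, then once back online check the file back in
Set-PnPFileCheckedIn -Url "/sites/team/Shared Documents/conflicts.xlsx" `
    -CheckinType MajorCheckIn -Comment "Offline edits merged"
```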

Of course, not everyone is going to follow best practice and we are going to end up with the following scenario.


Let’s say that Lewis Collins (user 1) creates a new Excel spreadsheet called conflicts.xlsx in a SharePoint Document Library as shown above.


If Lewis opens that file using Excel Online and makes a change by adding the entry ‘Online 2’, as shown above, it is automatically saved back to the SharePoint Online Document Library.


A second user (Robert Crane – user 2) uses OneDrive Files On-Demand to sync a copy of that same file to their desktop as shown above.


This second user (user 2) now opens the file using Excel on desktop and makes changes to the file by adding the entry ‘Offline 3’ as shown.

You can see that because the user is still connected to the Internet any changes are automatically synced back to the SharePoint Online Document Library.

So, while everyone is online all changes are updated into the one location.


We can also look at the version history of the file and see all previous versions thanks to automatic version history in SharePoint Document Libraries. We can roll back or view any of these if we wish.

At this point, user 2 (Robert Crane), goes offline and is no longer connected to the Internet.


Now because user 2 didn’t check the file out prior to going offline, user 1 can continue to edit the file. They do so adding the entry ‘Online 4’ to the file, which is then immediately saved back to the SharePoint Document Library.


While offline, user 2 adds a new entry to their offline version of the same file. Here they create an entry ‘Offline 4’ as shown above.

Thus, we now have a situation where the file in SharePoint Online is different from the file on the user’s desktop. This will clearly create a conflict when user 2 returns online.


User 2 comes back online and at the next sync is informed of a conflict as noted in their file manager as shown above.


When user 2 attempts to open the file in conflict they are presented with the warning banner at the top as shown. They are given the option to either Save a Copy or Discard Changes.

If they select Discard Changes, any updates they made to the file while offline will be lost and the copy on their desktop will be overwritten with what is currently in SharePoint Online. They can’t recover their offline changes after this happens, because the changed file was only ever saved to their desktop.

If they select Save a Copy, the file they have changed will be uploaded to SharePoint Online replacing the current version in SharePoint Online.


The OneDrive sync client will then kick in and copy the file from user 2’s desktop to SharePoint Online Document Library replacing the version that others have been working on and potentially removing changes they have made.


When the sync is complete, user 2 should see the same situation on their desktop, as shown above, prior to going offline.

Now, the file that was changed by user 2 while they were offline has become the primary file in SharePoint Online and on desktops. However, any changes that user 1 made while user 2 was offline are no longer in the most current version of the file.

Before we tackle that situation let’s look at another experience for user 2 as they come back online with a different version of the file.


When user 2 comes back online with a different version of a file they will also see the system tray icon for their sync client display a warning as shown above.


If they select this the sync client will open and display a conflict message as shown above.


Clicking that message will show them greater detail on the conflict as shown above.


If they click to resolve the issue they will be presented with the above dialog providing two options.

The option Open in Office to merge changes will simply open Excel and take the user through the experience detailed above, i.e. save a copy or discard changes.

The second option Keep both files will rename the changed version on the desktop to conflicts-<PC name>.xlsx. Thus, the original file they were working on offline will be renamed and the newer version that is in SharePoint Online will be downloaded to the original name on their desktop. The idea is basically to create a second copy of the file, rather than overwriting the original. Users would then need to open both files and manually merge any changes back to a single file. The end result here is two files with different names, each holding the unique changes made by each user.


Let’s return to the situation where user 2, who was offline, comes back online, opens the file in conflict and selects to save their copy back into SharePoint Online by using the Save a Copy button.

This means that any changes user 1 made to the file while user 2 was offline are ‘lost’ because user 2 has overwritten the file with their version.


However, don’t forget that SharePoint Online Document Libraries include automatic versioning. This means that when user 2 uploaded their file, the file user 1 had been working on isn’t deleted, it is simply saved as a previous version. So, both files are still in SharePoint Online in full fidelity. One is current and one is the previous version.


You have the ability to compare previous versions or restore previous versions if you wish.


My experience is that Excel is a fairly complex program and in most cases you’ll have to manually merge any changes between the two documents. However, as you can see above, with Word the application can generally merge changes automatically for you using the revisions ability built into the program.

As I said at the beginning of this article, best practice is to check documents out prior to going offline to avoid conflicts. If that doesn’t transpire, then you will probably need to manually merge changes using versions in SharePoint Online. However, as you can hopefully see, SharePoint Online will retain both versions of the file if you do go offline. I would suggest, however, that you have a play with exactly how this works in your environment prior to needing it. SharePoint is magic, but it doesn’t read minds, yet!

Wednesday, February 7, 2018

Learning Azure while mining cryptocurrency

One of the things that I advocate when it comes to learning new technologies is to find a use for it that interests you. Typically, that means find a problem you need to solve as I have said here:

Scratch your own itch

I used this approach to learn about Azure many years ago as I detailed here:

I finally get Azure

I continue to try all sorts of things in Office 365 and Azure but I thought I’d share this experience of using Azure to mine cryptocurrencies.

Warning, warning, spoiler alert – from what I can see, it isn’t profitable to use Azure for cryptocurrency mining. In 24 hours I managed to mine $8 worth and it cost me $50 in Azure credits. Not a good ROI; however, what I learned during that same period was huge.

My aim was to determine how well Azure IaaS fared when it came to mining and what was the optimal family of VMs to use. I settled on using Minergate as the software to do the actual mining. Yes, there are better options when it comes to mining software, but Minergate is free, is a simple install and can be set up in a few minutes. Minergate allows you to mine multiple coins, but for this experiment I stuck to just trying to mine Monero.


I then proceeded to run up various Azure VMs, install the Minergate software and complete a benchmark. I then set the machine to mining and looked at the Hashes/sec as a second data point.

You can see the results from the table above. The winner was the NC12 VM, even though it was the most expensive to run per minute.

So why do I have two entries for NC12 machines in the table above and why are the results so different? Interestingly, when you run an N series VM in Azure it doesn’t include the drivers for the GPUs. Thus, without installing the drivers you get a plain old CPU server. You’ll find the GPU drivers here:

Set up GPU drivers for N-series VMs running Windows Server

As you can see from the above table, with the GPU drivers loaded the benchmark jumps threefold!

Obviously, the more CPUs and GPUs you throw at crypto mining the better results you are going to get, and that’s why I reckon the DS5_V2 promo machine is also a good option. The downside here is that the promo pricing for this machine won’t last forever. If the pricing goes up, then it will become less economical to mine.

All in all, an interesting experiment and learning experience for me. I will continue to fiddle with crypto mining on Azure down the track and try stuff like using Linux instead of Windows as the OS, and maybe look at some clustering options. However, my personal takeaway is that crypto mining on Azure isn’t economically viable, and given that Azure rolls up costs like electricity into a single per hour cost, I don’t see how it can work economically for an individual using their own on-premises hardware either. I’m sure some people do make money mining crypto at home but, at this point, I can’t see how it can truly be profitable.


Another Azure activity I saw in action was the Security Center which flagged Minergate as malware on my VMs. I’ll now sit down and start playing with this more.

Azure, always interesting but for crypto mining not really profitable (yet!).

Thursday, February 1, 2018

Enable activity auditing in Office 365


Here’s something I suggest you ensure is enabled in all Office 365 tenants.

Visit the Office 365 Security and Compliance center as an administrator. From the menu on left, select the Search & investigation heading. From the items that appear select Audit log search.

If your audit logging hasn’t been enabled, you will see a hyperlink on the right that says Start recording user and admin activity. If that link is visible, select it as shown above.


You will then receive the above confirmation. Select Turn on.


You’ll be taken back to the Audit log search page where you’ll see a message telling you that logging is being enabled.


When that process is complete return to the Audit log search and select the Activities drop down.


You’ll now be able to audit a huge range of activities and produce a report, like this -


Here, I’ve run a report to display any files that have been accessed. From the results I can see the user, IP address and the file that was accessed.


You can now also set up an alert on any of these activities.

To do this, select the Alerts option on the left in the Security & Compliance center. From the items that appear select Manage alerts.


On the right select the + New alert policy button.


Set the Alert Type to Custom.


Select the Send this alert when… option and again choose the activity for the alert. The available options should be pretty much the same as you saw before with the audit logs.


Then choose which users you wish the alert to apply to as well as an email address to send the alert to.

As with all alert settings, ensure that you don’t make these too general, because you’ll end up getting too many alerts and spamming yourself.

The important thing here is that auditing is not enabled by default. The best practice recommendation is therefore to go and turn it on so you can audit activity in your tenant.
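For those who prefer the command line, the same thing can be done from Exchange Online PowerShell. A hedged sketch, assuming you have already connected a remote PowerShell session to Exchange Online:

```powershell
# Enable the unified audit log - the equivalent of selecting
# 'Start recording user and admin activity' in the portal
Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true

# Confirm it is now on
Get-AdminAuditLogConfig | Select-Object UnifiedAuditLogIngestionEnabled

# Example report: files accessed over the last 7 days
Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) `
    -Operations FileAccessed -ResultSize 100
```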