Azure AD Domain Services Cloud only user passwords

I have been creating a Windows Virtual Desktop (WVD) environment for internal testing. I’ll be sharing the process and tricks soon, but this issue was one that I really didn’t know about for Azure AD Domain Services until someone pointed it out to me. I am eternally grateful to gerry_1974 on the Microsoft Tech Community for the information that led to the resolution. I thought I’d also share it here so others can avoid the oversight I made and the frustration that went with it.

I recently wrote about setting up Azure AD Domain services for a cloud only environment

Moving to the Cloud – Part 3

The reason I needed to do this was to support my planned “cloud only” WVD test environment. Azure AD Domain Services is basically designed to create an ‘old style’ domain that WVD host machines connect to. That will change down the track, but for now WVD needs a traditional AD. Since I did not have an existing on premises domain, I planned to use Azure AD Domain Services.

After getting things working eventually (more about that soon), I was able to successfully login to my WVD environment with a user who didn’t have Multi Factor Authentication (MFA) enabled. I then tried a user with MFA and received:


The remote computer that you are trying to connect to requires Network Level Authentication (NLA), but your Windows domain controller cannot be contacted to perform NLA. If you are an administrator on the remote computer, you can disable NLA by using the options on the Remote tab of the System Properties dialog box.

I put the issue down to being about MFA but as it turned out, I was so wrong!

When you have cloud only users with Azure AD Domain Services, password hashes in a format suitable for NT LAN Manager (NTLM) authentication are not automatically generated! To force this generation, each cloud only user must change their password, per:

Enable user accounts for Azure DS

which says:

The steps to generate and store these password hashes are different for cloud-only user accounts created in Azure AD versus user accounts that are synchronized from your on-premises directory using Azure AD Connect. A cloud-only user account is an account that was created in your Azure AD directory using either the Azure portal or Azure AD PowerShell cmdlets. These user accounts aren’t synchronized from an on-premises directory.

and most importantly:

For cloud-only user accounts, users must change their passwords before they can use Azure AD DS. This password change process causes the password hashes for Kerberos and NTLM authentication to be generated and stored in Azure AD.

After having this brought to my attention, I understand why this is, but I would also say this could be a very painful process if you have a lot of users who want access to something like WVD.

Thus, another little configuration tip to remember if you are setting up a cloud only environment that utilises Azure AD Domain Services. Before users can use services that depend on Azure AD Domain Services (like Windows Virtual Desktop), they need to change their password so the NTLM password hash can be generated for use by Azure AD Domain Services.
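To avoid asking every affected user to manually reset their password, the change can be triggered programmatically. Here’s a minimal sketch, assuming the Microsoft Graph passwordProfile schema (the user id and token below are placeholders, not values from this article):

```python
import json
from urllib import request

GRAPH = "https://graph.microsoft.com/v1.0"

def build_force_change_request(user_id: str, token: str) -> request.Request:
    # Forcing a password change at next sign-in is what triggers Azure AD
    # to generate the Kerberos/NTLM password hashes for a cloud only user.
    payload = {"passwordProfile": {"forceChangePasswordNextSignIn": True}}
    return request.Request(
        f"{GRAPH}/users/{user_id}",
        data=json.dumps(payload).encode(),
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = build_force_change_request("user-object-id", "access-token")
print(req.get_method())  # PATCH
```

Loop something like this over the affected users and each will be prompted to set a new password at next sign-in, which is what generates the hashes Azure AD Domain Services needs.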

Ignite 2019 sessions on YouTube

Not everyone, including me, is able to get to Microsoft Ignite for various reasons. Microsoft, to their credit, live streams and records the sessions. Eventually, these sessions make their way onto YouTube which is my preferred viewing platform. However, what is missing is a catalogue of the links to each session.


As in previous years:

Ignite 2017 sessions on YouTube

Ignite 2018 sessions on YouTube

I have started building this index and making it available on my GitHub:

Ignite 2019 sessions on YouTube

Please note, not all the sessions are there as yet. I add them as I discover them along the way through the year.

Of course, if you have a link to a session that I don’t have up there yet, please send it along so I can add it and we can all benefit.

Thanks again to Microsoft for doing this and uploading the sessions to YouTube. They are a great source of learning and allow people like me, who couldn’t get to Ignite, to work through the content.

Moving to the Cloud–Part 3

This is part 3 of a multi part examination of moving to the Microsoft cloud. If you missed the earlier episodes, you’ll find them here:

Moving to the Cloud – Part 1

which covered off setting up a site to site VPN to Azure and

Moving to the Cloud – Part 2

which looked at creating traditional ‘drive mapped’ storage as PaaS.

It is now time to consider identity. We need to know where a user’s identity will live in this new environment because there are a few options. Traditionally, a user’s identity has lived on premises in a local domain controller (DC) inside an Active Directory (AD). With the advent of the cloud we now have Azure Active Directory (AAD) as an option as well. It is important here to remember that Azure Active Directory (AAD) is NOT identical to on premises Active Directory (AD) per:


What this means is that native Azure AD (AAD) can’t do some things that on premises Active Directory (AD) can do. Much of that is legacy services like Group Policy and machine joins, etc. You’ll see that Windows 10 machines can be joined to Azure AD (AAD) directly but legacy systems, like Windows 7, 8 and Windows Servers can’t be directly joined to AAD. That’s right. As we stand today, even the latest Windows Server cannot be directly joined to AAD like it can be joined to an AD on premises.

Thus, if you have legacy services and devices as well as Windows Servers you want to remain as part of your environment, you are going to need to select an identity model here that supports traditional domain joins. I will also point out that, as of today (changing in the future), if you want to implement Windows Virtual Desktop (WVD), you will also need a traditional AD to join those machines to. However, if you have no devices that require legacy services, for example if your environment is totally Windows 10 pro based with no servers (on prem or in Azure IaaS), then all you will need is Azure AD.

Thus, not everyone can jump directly to AAD immediately. Most will have to transition through some form of hybrid arrangement that supports both AAD and AD in the interim. However, most transitions ultimately aim at eliminating on premises infrastructure to limit costs such as patching and updating physical servers. This is what we are aiming for in this scenario.

In a migration from a traditional on premises environment with a domain controller (DC) and AD we now have a number of options when it comes to identity in the cloud.

1. You can maintain the on premises domain controller and AD, while using Azure AD Connect to synchronise (i.e. copy) the user’s identity to the AAD. It is important to note here that the identity in Azure is a COPY and the primary identity remains on premises in the local AD. This is still the case if you implement things like password write back that are part of Azure AD P1 and Microsoft 365 Business. Having the user’s primary identity still on premises means this is where you need to go to make changes and updates.

2. You can swing the domain controller from on premises to Azure IaaS. This basically means setting up a new VM in the Azure VNET that has been created already, joining it to the existing on premises domain across the VPN, then using DCPromo to make it a domain controller. To make it the ‘primary’ domain controller, you swing across the domain infrastructure roles via the following in PowerShell:

Move-ADDirectoryServerOperationMasterRole -Identity "Target-DC" -OperationMasterRole SchemaMaster,RIDMaster,InfrastructureMaster,DomainNamingMaster,PDCEmulator

and then DCPromo the original on premises domain controller out and remove it altogether. This way you now have your domain controller and AD on the VM in Azure IaaS, working with machines in the Azure VNET and on premises thanks to the site to site VPN established earlier (told you it would be handy!). In essence, this is like picking up the domain controller hardware and moving it to a new location. Nothing else changes. The workstations remain on the same domain, group policy is unaffected, etc. The downside is that you still need to continue to patch and update the new domain controller VM in Azure, but the maintenance and flexibility are superior now that it is in Azure IaaS.

3. You replace the on premises domain with Azure AD Domain Services. Think of this like a cloud domain controller as a service. It is a domain controller as PaaS. This means that when you use Azure AD Domain Services, Microsoft will spin up two load balanced domain controller VMs and connect these directly to AAD, so the users there now appear in the PaaS domain controllers. Using Azure AD Domain Services removes the burden of having to patch, update, scale, etc domain controllers for your environment. It also gives you a traditional AD environment you can now connect things like servers to. However, there are some trade-offs. When you use Azure AD Domain Services you must start a new domain. This means you can’t swing an existing domain across onto it, like you can in option 2 above. This means detaching and reattaching all your legacy devices, like servers, from the original to the new domain. You also get limited functionality with traditional AD services like Group Policy. You should see Azure AD Domain Services as a transitional step, not an end point.

With all that in mind, you need to make a decision on what works best for your environment, now and in the future. Most environments I see want to eliminate the on premises domain controller hardware as soon as possible and not replicate it going forward. That desire therefore means a migration to PaaS using Azure AD Domain Services.

The first step in this process is going to be to ensure that all your users are in Azure AD. The assumption here is that you have already set up your Microsoft 365 environment and the users are configured in Azure AD. If you are retaining an on premises domain controller, you’ll need to have set up Azure AD Connect to copy the user identities to Azure AD. Azure AD is where Azure AD Domain Services will draw its identities from when it is installed, so the users need to be there first. Once the users appear in Azure AD, the next step will be to set up Azure AD Domain Services. You can think of a traditional on premises domain controller as roughly equivalent to Azure AD combined with Azure AD Domain Services.

Setting up Azure AD Domain Services is done via the Azure portal.


Login as a global administrator and locate Azure AD Domain Services and select that.


You’ll most likely find that no services are as yet configured. Select the Add option from the menu across the top as shown above.


You then need to complete the details. Here we face an interesting question, what should we call this new ‘traditional’ managed domain we are about to create with Azure AD Domain Services? Should it be the same as what is being used in Azure AD already?



How you configure this is totally up to you. There is guidance, as shown above, which can be found at:

Active Directory: Best Practices for Internal Domain and Network Names

In this case I have decided to go for a sub-domain, as recommended, and prefix the new Azure AD Domain Services domain with ‘ds’ i.e.


With all the options completed, select Next – Networking to continue.


Unfortunately, you can’t configure Azure AD Domain Services on the same subnet that has service endpoints as you can see above. You’ll see this if you configured your Azure storage to use private endpoints as we have, which has been previously recommended.

If so, then you can select the Manage link below this box and simply add a new subnet to your Azure VNET and then use that to connect Azure AD Domain Services to.


Before you continue to Administration, ensure that you are adding Azure AD Domain Services to your existing Azure VNET as the default is to create a new VNET, which is NOT what you want here. You want to connect it to an existing VNET you have established previously.

When you have selected your existing Azure VNET and a suitable subnet, select the Next – Administration button to continue.


Here you’ll need to decide which users will be administrators for the domain. From the documentation:

What can an AAD DC Admin do?

Only Microsoft has domain admin and enterprise admin rights on the managed domain. AAD DC Admins can do the following:

  • Admins can use remote desktop to connect remotely to domain-joined machines

  • Admins can join computers to the domain

  • Admins are in the administrators group on each domain-joined machine

Considerations for the AAD DC Administrators group

  • Pick group members for the AAD DC Administrators group that have these needs:

    • Users that need special administrative permissions and are joined to the domain

    • Users that need to join computers to the domain
  • Do not change the name of the AAD DC Administrators group. This will cause all AAD DC Admins to lose their privileges.

The default will be your global administrators plus members of a special group called AAD DC Administrators that will be created. So, you can simply add any Azure AD user to this group and they will have admin privileges in the Azure AD Domain Services environment going forward.
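If you have a number of admins to add, membership of that group can be scripted too. A sketch of the Microsoft Graph request body used to add a member to a group (the user object id is a placeholder; the request is POSTed to /groups/{group-id}/members/$ref):

```python
import json

GRAPH = "https://graph.microsoft.com/v1.0"

def build_add_member_body(user_id: str) -> bytes:
    # Graph adds a group member by POSTing a reference to the user object
    # to /groups/{group-id}/members/$ref.
    return json.dumps({"@odata.id": f"{GRAPH}/directoryObjects/{user_id}"}).encode()

body = build_add_member_body("user-object-id")
print(json.loads(body)["@odata.id"])
```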

You can of course configure these permissions any way you wish but generally the defaults are fine so select the Next – Synchronization button to continue.


The final question is whether you wish to have all or a subset of your Azure AD users synchronised into the Azure AD Domain Services environment. In most cases you’ll want all users, so ensure that option is selected and press the Review + create button to continue.


You should now see all your settings and, importantly, note the box at the bottom about consenting to store password hashes for NTLM and Kerberos authentication in Azure AD. These older protocols have potential security concerns, and having the hashes stored somewhere other than a domain controller is something you need to be aware of. Generally, there won’t be any issues, but make sure you understand what that last box means for your security posture.

Press the Create button when complete.


You’ll then receive the above warning about what configurations options can’t be changed after the fact. Once you have reviewed this and you wish to proceed, select the OK button.

Your deployment into Azure will then commence. This process generally takes around an hour.


You should see the above message when complete and if you select Go to resource you’ll see:


You’ll note that it still says Deploying here, so you’ll need to wait a little longer until that process is complete.


In about another 15 minutes you should see that the domain is fully deployed as shown above. Here you will note that two domain controllers have automatically been allocated on the subnet into which Azure AD Domain Services was deployed. You can select from a number of menu options on the left, but the service is pretty basic. Most times you’ll only need to look at the Activity log here from now on.

Can you actually manage the domain controllers like you can on premises? Yes, somewhat. To do that you’ll need to download and install the:

Remote Server Administration Tools for Windows 10

on a Windows 10 workstation that can access these domain controllers.


You can then use that to view your domain in the ‘traditional way’ as shown above.

Thus, with Azure AD Domain Services deployed, you have a ‘traditional’ domain but without infrastructure and with your Azure AD users in there as well.

The summary of the options around identity here are thus:

1. Primary = local AD, Secondary = none (which can be linked to Azure via a VPN)

2. Primary = Azure AD, Secondary = none (no on premises infrastructure like servers to worry about)

3. Primary = local AD, Secondary = Azure AD (thanks to Azure AD Connect, but need a VPN again to connect to Azure IaaS)

4. Primary = Azure AD, Secondary = Azure AD Domain Services (which can be linked back to on premises via a VPN)

In this case, we’ll be going with Option 4. You can see however that a VPN is going to be required for options 1, 3 and 4. That’s why one of the first steps in this series was to set one up.

With all that now configured, let’s now look at the costs involved. The costs here will vary on what identity solution you select. If you stay with an on premises domain controller only, you will need to have site to site VPN to resources in Azure. The costing for this has been covered previously:

Moving to the Cloud  – Part 1

and equates to around AU$36 per month with less than 5GB of traffic outbound from Azure. The Azure AD Connect software you use to synchronise user identities to Azure AD is free.

If you move the domain controller to a virtual machine in Azure, there will be the cost of that virtual machine (compute + disk storage). The cost will therefore vary greatly with the VM type you select. I’ll be covering more about VM options in this migration in an upcoming article, but for now let’s keep it simple and say we use an A2v2 Standard VM (4GB RAM, 20GB HDD) for a single role as just a domain controller. The cost for that is around AU$76 per month. If you also still have on premises infrastructure, like Windows Servers, that needs access to the domain, then you’ll also need a site to site VPN to communicate with the domain controller VM in Azure IaaS. Thus, to move the domain controller to Azure IaaS and still allow access to on premises infrastructure, the cost would be around AU$112 per month (Azure VM + VPN). Of course, if you can migrate all your on premises server infrastructure to Azure IaaS, you probably wouldn’t need the VPN, but there would then be the costs of the additional infrastructure in Azure. Balanced against this cost in Azure IaaS is the saving in local hardware, power, etc.

Again, let’s keep it simple for now and say we want to maintain on premises infrastructure but have a dedicated domain controller in Azure IaaS so the one on premises can be de-commissioned. That means the costs would be AU$112 per month for a domain controller in Azure IaaS and a VPN back to on premises.

Finally, the last identity option is if we wanted to use the Azure PaaS service, Azure AD Domain Services, which means no infrastructure at all but also means we need to start with a new ‘clean’ domain separate from the existing on premises one. The costs of this Azure PaaS service can be found at:

Azure Active Directory Domain Services pricing

which reveals:


For smaller directories (<25,000 objects) the cost is going to be a flat AU$150 per month. Remember here, when comparing costs, there are no VMs to back up or operating systems to patch because it is PaaS. This is a domain controller as a service and Microsoft will take care of all the infrastructure “stuff” for you as part of that service. Of course, if you need on premises infrastructure to access Azure AD Domain Services, you’ll again need a site to site VPN to get there. If all your infrastructure is cloud based, then no site to site VPN is required. However, in this scenario, we still want access to on premises infrastructure, so the costs would be AU$186 per month (Azure AD Domain Services + VPN).

In summary then, the configuration options/costs will be:

Option 1. Retain on premises AD = AU$36 per month

Option 2. Move domain controller to Azure IaaS = AU$112 per month (estimated typical cost)

Option 3. Migrate domain controller to Azure PaaS = AU$186 per month
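As a quick sanity check of the arithmetic behind those options (monthly AU$ figures as quoted earlier in this article):

```python
# Monthly costs (AU$) quoted earlier in the article.
vpn = 36       # site to site VPN back to on premises
dc_vm = 76     # A2v2 Standard VM running as a domain controller
aad_ds = 150   # Azure AD Domain Services (<25,000 objects)

option1 = vpn           # retain on premises AD, VPN to Azure resources
option2 = dc_vm + vpn   # DC moved to Azure IaaS, VPN back on premises
option3 = aad_ds + vpn  # Azure AD Domain Services + VPN

print(option1, option2, option3)  # 36 112 186
```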

Going forward we’ll be selecting Option 3, because we are aiming to minimise the amount of infrastructure to be maintained and we want to move to PaaS as soon as possible. That means the total cost of the migration so far is:

1. Site to Site VPN = AU$36

2. Storage = AU$107

3. Identity (PaaS) = AU$150

Total maximum infrastructure cost to date = AU$293 per month

This means we have:

1. Eliminated the old on premises domain controller (hardware, patching, backup, power, etc costs)

2. Can connect on premises infrastructure to Azure AD (via Azure AD Domain Services and the VPN)

3. Have mapped tiered storage locations for things like archiving, profiles, etc that are PaaS

4. We can now build out a Windows Virtual Desktop environment

The next item that we’ll focus on is setting up a Windows Virtual Desktop environment as we now have all the components in place to achieve that.

Moving to the Cloud–Part 2

This is part of a multi part examination of the options of moving to the Microsoft cloud. If you missed the first episode, you’ll find it here:

Moving to the Cloud  – Part 1

which covered off setting up a site to site VPN to Azure.

The next piece of the puzzle that we’ll add here is storage.

Storage in the Microsoft cloud comes in many forms, SharePoint, Teams, OneDrive for Business and Azure. We’ll get to stuff in Microsoft 365 like SharePoint, Teams and OneDrive later, but to start off with we want to take advantage of the site to site VPN that was set up in Part 1.

In Azure there are three different access tiers of storage: hot, cool and archive. They vary by access speed and cost. The slower the access, the cheaper it is. Hot is the fastest access, followed by cool, then archive. You can read more about this here:

Azure Blob storage: hot, cool, and archive access tiers

The other variable here with Azure storage is the performance tier: standard or premium. You can read more here:

Introduction to Azure storage

Basically, standard performance tier uses HDD while Premium uses SSD. Apart from performance, the major difference is how the storage cost is actually calculated. With the standard tier, you are only billed for the space you consume BUT you are also billed for access (read, write, delete) operations. With premium, you are billed for the total capacity of the storage you allocate immediately BUT, you are not billed for any access operations.
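The difference between the two billing models can be sketched with a couple of toy functions (the rates below are illustrative only, not Azure’s actual pricing):

```python
def standard_monthly_cost(used_gb: float, operations: int,
                          gb_rate: float, per_10k_ops_rate: float) -> float:
    # Standard tier: billed for consumed space PLUS access operations.
    return used_gb * gb_rate + (operations / 10_000) * per_10k_ops_rate

def premium_monthly_cost(provisioned_gb: float, gb_rate: float) -> float:
    # Premium tier: billed for the full provisioned capacity, operations free.
    return provisioned_gb * gb_rate

# Illustrative rates: a lightly used standard account stays cheap...
light = standard_monthly_cost(100, 10_000, 0.02, 0.05)
# ...while premium charges for the whole allocation regardless of use.
flat = premium_monthly_cost(1024, 0.30)
print(round(light, 2), round(flat, 2))  # 2.05 307.2
```

This is why the choice hinges on how much of the allocated capacity you actually use and how heavily the data is accessed.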

So the key metrics you need to keep in mind when you are designing a storage solution in Azure are, firstly, the access tier (hot, cool or archive), then the performance tier (standard or premium), and the capacity you desire for each. You may find some combinations are unavailable, so check out the document linked above for more details on what is available with all these options.

The easiest approach to Azure storage is to create an Azure SMB Share and map these directly on a workstation which I have previously detailed here:

Creating an Azure SMB Share

as well as an overview on pricing:

Clarification on Azure SMB file share transactions

Azure SMB file shares currently only support the hot and cool tiers. You can use archive storage, but only via blob access, not SMB files. So what good are all of these, you may ask? Well, if you read my article:

Data discovery done right

You’ll find that I recommend dividing up your data into items to be deleted, archived and to be migrated.

So we need to ask ourselves the question, what data makes sense where?

Let’s start with Azure archive storage. What makes sense in here, given that Azure archive storage is aimed at replacement of traditional long term storage (think tape drives)? Into this, you want to put data that you aren’t going to access very often, and that doesn’t make sense going into Teams, SharePoint and OneDrive. What sort of data doesn’t make sense going into SharePoint? Data that can’t be indexed such as large image files without text, Outlook PST backups, custom file types SharePoint indexing doesn’t support (think some types of CAD files and other third party file types). In my case, Azure archive storage is a great repository for those PST backups I’ve accumulated over the years.

Here is the guidance from Microsoft:

  • Hot – Optimized for storing data that is accessed frequently.

  • Cool – Optimized for storing data that is infrequently accessed and stored for at least 30 days.

  • Archive – Optimized for storing data that is rarely accessed and stored for at least 180 days with flexible latency requirements (on the order of hours).

We now repeat this with the cool tier storage; remember that this tier directly supports Azure SMB files. So, what makes sense here? There is obviously no hard and fast rule but again, what doesn’t make sense going into SharePoint? Stuff that can’t be indexed, is typically large, is not accessed that often (but more often than archive storage) AND that you also want accessible via a mapped drive letter. In my case, the data that springs to mind is my desktop utility apps (like robocopy), ISO images (of old versions of SharePoint server I keep in case I need to do a migration) and copies of my podcast recordings in MP3 format.

We repeat this again for the hot tier, which is the fastest and most expensive storage option. Initially I’m going to place the user profile data here when I get around to configuring Windows Virtual Desktop (WVD) in this environment. That needs to be quick; however, most other current data files I have will go into Microsoft 365. Being the most expensive tier of storage, I want to keep this as small as possible and only put data here that REALLY makes sense.

You don’t have to use all three tiers as I do. You can always add more storage later if you need to, but I’d recommend you work out what capacity you want for each tier and then implement it. For me, I’m going for 100GB archive, 100GB cool and 50GB hot as a starting point. Your capacities will obviously vary depending on how much data you plan to put in each location. That’s why you need to have some idea of where all your data is going to go BEFORE you set all this stuff up. Some will go to Azure, some will go to Microsoft 365, some will be deleted and so on.

As for performance tiers, I’m going to stick with standard across all storage accounts for now to keep costs down and only pay for the capacity I actually use.

Let’s now look at some costs by using the Azure pricing calculator:


I’ll firstly work out the price for each based on 1TB total storage for comparisons between the tiers and to SharePoint and OneDrive for Business.

All the storage calculations are in AU$, out of the Australian East data center, on the standard performance tier and locally redundant unless otherwise stated.

You can see that 1TB of archive storage is only AU$2.05, but it ain’t that simple.


There are other operations, as you can see above, that need to be taken into account. I have adjusted these to what I believe makes sense for this example but, as you can see, variations here can significantly alter the price (especially the read operations).

The estimated total for 1TB of archive storage on the standard performance tier = AU$27.05 per month.

Now, as a comparison, if I change the performance tier to Premium I get:


The price of the storage goes way up, while the price of operations goes way down. So, if you want to minimise costs and you have lots of operations on your storage, standard tier is your best option.

The estimated total for 1TB of archive storage on the premium performance tier = AU$224.22 per month.

Over eight times the cost of the standard tier.

In my case, I don’t need 1TB of storage, I only want 100GB of storage.


When I now do the estimation for 100GB of archive storage, the cost of just the storage falls by 10x (as expected) to AU$0.20. Don’t forget, however, about the storage operations, which remain the same. So, my storage cost went down but my operation costs remained the same. Thus,

The estimated total for my 100GB of archive storage on the standard performance tier = AU$25.95 per month.

While premium is:


The estimated total for my 100GB of archive storage on the premium performance tier = AU$22.78 per month.

As outlined before, as a general rule of thumb with archive storage, the premium performance tier is better value at low storage capacities when there are lots of data operations (since operations are free on premium). Once the capacity increases with premium performance, the price ramps up.

So why would I recommend staying with the standard performance tier? Although I ‘estimate’ that my archive will be small, I want the flexibility to grow the capacity if I need it. Remember that we don’t set a storage capacity quota for blob storage; it can just grow as needed, and the bigger the storage capacity, the more it will cost me if I go premium. Given that storage capacity here is more important than working with the data, I want the cheapest storage costs I can get as the data capacity increases. Thus, I’ll stick with the standard performance tier. Also, remember that I’m estimating that when my storage reaches 100GB I’ll be billed AU$25.95 per month, but until I reach that capacity, and the fewer operations I do on files there, the cheaper this storage will be. I therefore expect my ‘real world’ costs to in fact be much less than this AU$25.95 figure over time.

Let’s now look at the next two storage locations, which will be Azure SMB file shares.

Unfortunately, the pricing calculator doesn’t allow us to easily calculate the price for an SMB share on the cool access tier (Azure SMB files don’t currently support the archive tier). However, the pricing is only an estimate, and I know that if I place it on the cool access tier it will be cheaper anyway, so I’m going to keep it simple.


Thus, for reference:

The estimated total for 1TB of SMB file storage on the standard performance tier = AU$106.58 per month.

remembering that for the standard tier we need to take into account the cost of operations as shown.

and for Premium:


The estimated total for 1TB of SMB file storage on the premium performance tier = AU$348.00 per month.

With premium storage, you don’t need to worry about operations, however don’t forget, if you go premium you’ll be paying for the total allocated capacity no matter how much you are actually using. Thus, I’ll again be sticking with standard storage.

So, for my 50GB Azure SMB files hot tier I calculate the following:


The estimated total for my 50GB of hot SMB file storage on the standard performance tier = AU$32.40 per month.

Now how can I get an idea of what the cool SMB file price will be? Although it is not this simple, I’m going to use a ratio from:

Azure Block blob pricing


So, by my super rough rule of thumb maths I get:

cool/hot = 0.02060/0.0275 ≈ 0.75

Thus, cool storage is roughly 75% of the cost of hot storage.

The estimated total for my 100GB of cool SMB file storage on the standard performance tier = AU$32.40 per month x 2 x 0.75 = AU$48.60 per month

The 2 x is because the hot price I have is only for 50GB and I want 100GB of cool storage.
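That back-of-the-envelope estimate is easy to check in a few lines (using the figures quoted above):

```python
hot_rate, cool_rate = 0.0275, 0.02060  # AU$/GB/month block blob rates quoted above
ratio = round(cool_rate / hot_rate, 2) # cool as a fraction of hot, ~0.75
hot_50gb = 32.40                       # estimated 50GB hot SMB share, per month
cool_100gb = round(hot_50gb * 2 * ratio, 2)  # double the capacity, cool discount
print(ratio, cool_100gb)  # 0.75 48.6
```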

In summary then, I will create 3 x storage repositories for my data:

– 100GB blob archive storage = AU$25.95 per month

– 100GB SMB file cool storage = AU$48.60 per month

– 50GB SMB file hot storage = AU$32.40 per month

250GB total storage estimated cost = AU$106.95 per month

Again remember, this is my estimated MAXIMUM cost, I expect it to be much lower until the data capacities actually reach these levels.

Now that I have the costs, how do I actually go about using these storage locations?

Because archive storage is blob storage I’ll need to access it via something like Azure Storage Explorer as I can’t easily use Windows Explorer. I’m not expecting all users to work with this data so Azure Storage Explorer will work fine to upload and manipulate data if needed by a select few.

As for the SMB file cool and hot storage I’m going to map these to two drives across my VPN as I have detailed previously:

Azure file storage private endpoints

This means they’ll just appear as drive letters on workstations and I can copy data up there from anything local, like a file server. The great thing is that these Azure SMB file shares are only available across the VPN and not directly from elsewhere, as the article shows. That can be changed if desired, but for now that’s the way I’ll leave it. I can also potentially get to these locations via Azure Storage Explorer if I need to. The flexibility of the cloud.

So far we now have:

– Site to Site VPN to Azure (<5GB egress from / unlimited ingress to Azure) = AU$36.08 per month

– 100GB blob archive storage = AU$25.95 per month

– 100GB SMB file cool storage (mapped to Y: Drive) = AU$48.60 per month

– 50GB SMB file hot storage (mapped to Z: Drive) = AU$32.40 per month

Total maximum infrastructure cost to date = AU$143.03 per month
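As a sanity check, the running total is just the sum of the four line items above:

```python
# Running monthly infrastructure estimate (AU$), from the figures above
costs = {
    "Site to Site VPN (Basic, <5GB egress)": 36.08,
    "100GB blob archive storage": 25.95,
    "100GB SMB file cool storage (Y:)": 48.60,
    "50GB SMB file hot storage (Z:)": 32.40,
}
total = sum(costs.values())
print(f"Total maximum infrastructure cost = AU${total:.2f} per month")
```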

So we now have in place the ability to start shifting data that doesn’t make sense going into Microsoft 365 SharePoint, Teams and OneDrive for Business. Each of the three new storage locations has its advantages and disadvantages. That is why I created them all: to give me the maximum flexibility at the minimum cost.

We continue to build from here in upcoming articles. Stay tuned.

Optimising Azure OMS data ingestion


Every month when I receive my Azure bill I take a careful look at it to see if there is anything I can optimise. This month I saw that the top cost was from my Log Analytics workspace, as you can see above. This was no surprise, because it basically represents the amount of data that had been ingested from my remote workstations into Azure Sentinel for analysis.


When I look at Azure Sentinel I can see that I am bringing in more performance logs than security events each day. Now the question is, am I really getting value from that much performance logging ingestion? Probably not, so I want to turn it down a notch, not ingest quite so much and, hopefully, save a few dollars.


To do this, I’ll need to log into the Azure Portal and then go to Log Analytics workspaces.


I’ll then need to select Advanced settings from the menu on the left.


The first thing I checked, under Data > Windows Event Logs, was that I’m only capturing the errors in the Application and System logs for the devices, which I was.


Next I went to Windows Performance Counters and adjusted the sample interval. I have increased it to every 10 minutes for now to see what difference that makes. I could also remove or add certain performance counters here if I wanted, but for now I want to work with the current baseline.
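As a rough illustration of why the sample interval matters, here is the sample-count arithmetic, under the simple assumption that ingested volume scales roughly linearly with the number of samples taken:

```python
# Rough illustration: going from a 60-second to a 600-second sample interval
# cuts the number of performance samples per counter (and, roughly, the
# ingested volume) by a factor of 10. Assumes volume scales with sample count.
seconds_per_day = 24 * 60 * 60

samples_60s = seconds_per_day // 60    # samples/day per counter at 60s
samples_600s = seconds_per_day // 600  # samples/day per counter at 10 mins

print(samples_60s, samples_600s, samples_60s / samples_600s)
```

That is 1440 versus 144 samples per counter per day, so a tenfold reduction in performance data ingestion is the theoretical best case.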

With all that done, I’ll wait and see what the cost differences are in next month’s invoice and adjust again if necessary.

My software and services 2020


Here’s last year’s post for comparison:

My software and services – 2019

All my PCs are running the latest version of Windows 10 (1909) without any issues, and none during the upgrade process either. I do have Windows 10 and Office Insider builds happening on an original Surface PC as a testbed. All Windows 10 Pro machines are directly joined to Azure AD and managed via Intune. All machines run no third-party AV, as Windows Defender is a far better option in my experience. Thanks to Microsoft E5 on my production tenant, I am also using Microsoft Defender ATP at the back end for monitoring and investigation of endpoint threats.

The WD Sentinel DX4000 runs Windows Storage Server 2008 and replacement has been delayed due to the “pending” arrival of the NBN, which hopefully will provide better bandwidth. In the meantime I have established a site to site VPN to Azure and have begun moving data into Azure storage. In the end this device will merely function as a backup device, but for the time being I need to wait for better bandwidth. Hopefully that will arrive this year, I’m told.

My two main tenants are an Office 365 E5 demo environment and a Microsoft 365 production environment. The Windows 10 Pro machines are Azure AD joined to the Microsoft 365 production domain. The production Microsoft 365 tenant has Microsoft 365 Business for all users except myself. I have a Microsoft 365 E5 license on which I have configured all the services, including integrated PSTN calling via Switch Connect.

I use most major browsers:

– Edge – mainly for logging into my production tenant

– Edge Insider – will soon become my major production browser and is used for production and business websites, like reading Microsoft docs.

– Chrome – I am minimising my use of this on existing machines and not installing it on any new machines. I want to move away from Chrome totally and get it off all my machines as soon as possible.

– Brave – I have become increasingly concerned about the surreptitious tracking that many sites perform, especially social media sites. I therefore now do all my ‘random browsing’, searching and viewing of social media sites in Brave. I became aware of the extent of tracking when I was adjusting the security settings in Edge Insider and found the following:


Made me realise that I probably need to take this ‘do not track’ stuff more seriously!

– Firefox – I occasionally use this for testing or isolation but less so now thanks to profiles in Edge Insider.

I have now cranked Edge Insider up to the maximum security level but wanted to isolate the most likely tracking culprits into another browser that was security focused. After some evaluation, I have chosen Brave to be this browser. This is now where I do all the stuff that is more likely to be tracked and now hopefully blocked or at least minimised. I have also set this browser up to use Duck Duck Go as the default search engine, otherwise I use Bing for my production browsers.

Services like SharePoint Online and OneDrive I use regularly both in the demo and production tenant. I have the OneDrive sync client installed, running and connected to various locations on my production and demo tenants. I can now sync across all my different tenants as well as my consumer OneDrive storage. We have come a long way with the sync client!

I use Microsoft Teams, which is now my main messaging application. All the CIAOPS Patron resources like the intranet, team, etc. reside in the Office 365 E5 demo tenant, but I connect to it on my desktop normally via an Azure B2B guest account from my production tenant. Thus, I can admin the Patron resources in a browser if need be, but I get the same experience on my desktop as any Patron would. Handy to know what works and doesn’t work with Microsoft Teams guest access. Thanks to Microsoft E5 and Switch Connect, I also have Teams connected as a phone.

I use Lastpass to keep my passwords and private information secure. It allows me to do things like generate and store unique passwords for each website that I sign up for. It is also available across all browsers on my machine (including Microsoft Edge). I also now also use Lastpass to store secure notes.

The extensions I run in all my browsers are:



I use the automation sites If This Then That and Zapier to automate many different tasks. A good example of one of these is automatically publishing to various social media sites. I am now using Microsoft Power Automate more and more for automation and I am still looking to dive deeper using things like Azure Functions in 2020. I have now replaced Socialoomph to post precisely scheduled social media posts with my own solution in  Power Automate.

For my Office 365 and Azure email newsletters I use Mailchimp.

My preferred public social networks for business, in order are:

1. Twitter

2. Linkedin

3. Facebook

The Apowersoft software allows me to display both iOS and Android devices on my Windows desktop which is really handy for demonstrations and presentations.

I also use Yammer extensively but for more specialised roles and thus don’t consider it really a ‘public’ social network, more a private one.

I consume a lot of content from YouTube, both for business and personal interest. I also use YouTube extensively for my publicly available video training.

Microsoft Office desktop software is still part of my everyday workday via applications such as Outlook, Word, Excel, PowerPoint, etc. I use the desktop version of Outlook on my Surface Pro 6, which lives on my desk, but I only use Outlook Web App on my travelling Surface Pro 4 device. I could happily stop using Outlook on the desktop, I believe, but I still use it so I understand the experience most users have. However, I do see the day when Outlook on the desktop begins to lose its appeal.

One of the things I have added to my desktop version of Outlook is a digital certificate that signs every email that I now send. This helps the receiver confirm that the message they have received is in fact from me and that it hasn’t been altered in any way. There are some issues when people attempt to reply to these emails from a mobile device but I believe a fix from Microsoft is not far away.

The key application from the suite for me is OneNote. OneNote is my go to Swiss Army knife for just about everything digital. I use it to capture all sort of data. I even use it as a diary as I have detailed previous here:

One of the ways I use OneNote

The reason OneNote is key is because:

1. Just about everything I put in there is searchable.

2. It is freely available across all platforms.

3. All my information is synced and accessible on all devices.

4. It is available on the web or offline if needed.

There are now two versions of OneNote: the Windows Store OneNote and OneNote 2016. Microsoft has changed its stance on future upgrades to OneNote 2016 desktop, which is great to hear, and kudos to Microsoft for taking feedback on that score. I am a big user of OneNote on my iPad with the Apple Pencil. This combination has allowed me to totally eliminate my paper notebooks for things such as journaling.

I use Pure Text to easily paste information, especially to and from OneNote as only text.

I am now a big Microsoft To-Do user. I use it to keep many tasks and items that I need to follow up. I love how it is available on all my devices and syncs across them all as well. I was becoming a bit worried when it had sat there with no updates for a long while, but that has changed now with heaps of updates being released. I’m keen to see where To-Do goes in 2020.

I use Windows terminal now for things like PowerShell execution and Microsoft Whiteboard for demonstrations and training.

Another key service I use everyday along with Office 365 and OneNote is Azure. Typically, I use it for running up virtual machines that I test various things with but I also use it to backup my local data as well as that of other members of my family using Azure Backup.

Azure desktop backup

I have also now implemented an Azure site to site VPN as well as Azure SMB file storage to start moving my data into. I use Azure Sentinel to monitor all my services and machines in one single console and tell me about any incidents. My plan for 2020 is to keep building out my Azure infrastructure to include Azure AD Domain Services, Windows Virtual Desktop and more. Stay tuned for updates throughout 2020.

There is just so much that can be done with Azure and I pretty much use it everyday.

For a subset of my local data that I wish to remain secure I use Truecrypt to create encrypted volumes. All my Windows 10 machines run with full disk encryption thanks to Bitlocker, but stuff like financial and customer data I keep inside Truecrypt volumes for that extra layer of security. I understand that Truecrypt is no longer maintained and may have some very minor security flaws, but for how and why I use it, it is more than adequate.

Production data is also protected using Windows Information Protection, which provides yet a further level of protection and extends that to all devices, including mobile devices like phones and tablets.

To capture my desktop for my online training academy or my YouTube channel I use Camtasia. I use SnagIt to capture screen shots and add highlights and emphasis to these. Snagit allows me to capture complete screens or specific areas quickly and easily.

I use Microsoft Teams to record my podcasts, which I then produce with Camtasia. These are uploaded to Podbean, where they are syndicated across various networks.

To compose and publish blog articles I use Open Live Writer.

The majority of images I use, like the one at the top of this article, come from Pexels. Pickit is also another great option.

For improved meeting management productivity I use Microsoft FindTime.

I use Visual Studio Code to do most of my PowerShell editing and publishing. The end result typically lands in my GitHub repository, where you will find a range of scripts and other resources that I maintain regularly. With Visual Studio Code I can edit, publish and sync between all my machines and my GitHub repository no matter where I am. Very handy.

Here are also a few of the other items I use regularly that are not for business:

Amazon Prime Video – the only place to get the latest The Grand Tour action. I also liked the Jack Ryan series as well as The Gymkhana Files.

Netflix – Just added this recently and have found many great documentaries.

XBox Live Gold – access to all the online Xbox goodness.

Duolingo – language learning, Japanese and Italian at the moment

Tinycards – language and facts learning via flashcards. Also handy for certification exams.

So there you have it, the major software and services that I use regularly. I continue to search out additional software that will improve my productivity. If you use something that you’ve found really handy, please let me know, as I am always keen to explore what works for others.

Moving to the Cloud–Part 1

This year I thought I’d try to embrace as much of the Microsoft Cloud technology available as I can. However, I’ll approach it through the lens of an SMB business moving to the cloud, and lay it out in a staged manner for easier comprehension. This post therefore represents the first in a series that covers the methods and configuration you can use in moving your infrastructure to the cloud.

That said, there is no one single approach or method that will work for all. However, by running through the various options and also explaining what value these may have, hopefully people will get a better idea of all the options that are available. As I said, there isn’t necessarily any right or wrong here, just my thoughts on the approach that I take given typical scenarios I see.

The first thing you’ll need to go and do is get a Microsoft 365 tenant. I’ll cover off what I recommend specifically and why in later posts, but for now, you’ll need to have a tenant.

Next, you’ll need to add a paid Azure subscription to this same tenant. I have detailed about this approach here:

Deploy Office 365 and Azure together

In short, doing so will give you more options and capabilities, especially when it comes to infrastructure. The good news is that you’ll only pay for what you use, so as you build your solution out you can keep costs down.

With your Microsoft 365 and Azure subscriptions in place, I would suggest that the starting point should be a site to site VPN to Azure. This basically extends your on premises network into Azure.

In my situation, I have Ubiquiti equipment so I followed articles like:

Connecting Ubiquiti Unifi USG to Azure via VPN

The Azure Site to Site VPN documentation is here:

Create a Site-to-Site connection in the Azure portal

This article is also handy:

Step by step: Configuring a site to site VPN gateway between Azure and on premises

Given that there are already a lot of detailed documents out there on doing this, I’m not going to cover it here. However, you’ll basically need to:

1. Create a virtual network in Azure.

2. Create a virtual network gateway in Azure and connect to the virtual network you created above.

3. Create a connection from the virtual network gateway in Azure back to your on premises environment.

4. Configure the on premises equipment to connect to Azure.


When complete, you should have something that looks like the above. There isn’t a lot that you can do with this configuration just yet, but it is going to be the basis for what is used going forward. What it gives us in effect is a single network that spans both on premises and Azure.
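If you prefer the command line, the four steps above can be sketched with the Azure CLI. This is a rough outline only; the resource names, address ranges, IP addresses and shared key below are all placeholders, and your on premises device configuration (step 4) will depend entirely on your equipment:

```shell
# 1. Create a virtual network with a GatewaySubnet (names/ranges are placeholders)
az network vnet create --resource-group MyRG --name MyVNet \
  --address-prefixes 10.1.0.0/16 \
  --subnet-name GatewaySubnet --subnet-prefixes 10.1.255.0/27

# 2. Create a public IP and a Basic VPN gateway (provisioning takes 30-45 mins)
az network public-ip create --resource-group MyRG --name MyGwIP
az network vnet-gateway create --resource-group MyRG --name MyVpnGw \
  --vnet MyVNet --public-ip-address MyGwIP \
  --gateway-type Vpn --vpn-type RouteBased --sku Basic

# 3. Represent the on premises side as a local network gateway, then connect
az network local-network-gateway create --resource-group MyRG --name OnPremGw \
  --gateway-ip-address 203.0.113.1 --local-address-prefixes 192.168.1.0/24
az network vpn-connection create --resource-group MyRG --name S2SConnection \
  --vnet-gateway1 MyVpnGw --local-gateway2 OnPremGw --shared-key 'ReplaceMe'

# 4. Configure the on premises device with the gateway's public IP and shared key
```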

Now, let’s consider the costs.

An Azure virtual network is free.

There are a number of different VPN options in Azure per:

VPN gateway pricing


In this case I’m going to select the Basic VPN, simply because it has enough bandwidth, tunnels, etc. for my needs. The Basic VPN is typically only recommended for dev/test environments, but to keep costs down I’ll use it going forward.


So, if I now use the Azure Pricing Calculator to get an estimate of the costs, I get the above (in Australian dollars out of an Australian datacenter). Costs will vary depending on currency and location. You should also note that basically:

1. Data transfers into Azure are free.

2. You get the first 5 GB of data transfers out of Azure for free also.

So my expected initial VPN cost will be:

AU$36.08 per month

for up to 5GB of outbound (unlimited inbound) traffic.

What’s the comparison cost if we step up to the next level of VPN?


You see that the cost jumps to AU$190.44 per month.

How easy is it to change VPN gateways in Azure if you want to? Deleting and re-creating is easy; the downside is simply the time taken, because spinning up a VPN gateway in Azure generally takes between 30 and 45 minutes. When you do so, you may also get a different external IP address for the gateway, which would mean a change to the configuration of the on premises environment. However, none of this is difficult to do if needed. So for now, I’m going to stay with the Basic gateway because it is all I need and I want to keep costs down.
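The difference between the two SKUs adds up quickly, which is why Basic is worth it for my needs:

```python
# Monthly cost comparison of the two VPN gateway SKUs (my AU$ figures)
basic = 36.08
next_tier = 190.44

monthly_diff = next_tier - basic
yearly_diff = monthly_diff * 12
print(f"AU${monthly_diff:.2f} extra per month, AU${yearly_diff:.2f} per year")
```

That is over AU$150 extra per month, or more than AU$1,800 per year, for capacity I don’t currently need.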


When I look at my bill for the month, as it turns out, the cost of the Basic VPN Gateway for the month, shown above, is pretty much what the calculator determined. The variance is probably just a small amount of outbound data that I used. So, you can be pretty confident that the cost of the VPN with less than 5GB of outbound traffic will be a fixed cost per month. We’ll cover how to budget for outbound traffic in upcoming articles, so stay tuned. However, for now, I know I am going to have a fixed cost of AU$36.08 for just my Basic VPN gateway every month. Add that to the budget.

In summary, one of the first steps in migrating an on premises environment to the cloud is to establish a site to site VPN. You can do this easily with Azure, and the expected cost for the most basic configuration is around AU$36 per month. The benefit is that you have now extended your on premises network to Azure and can start taking advantage of the services there.

Watch out for upcoming articles on the next stages of this process.

Azure file storage private endpoints

I’ve previously detailed how to create an Azure SMB File Share:

Creating an Azure SMB file share

as a way to create a ‘cloud USB’ drive that you can map to just about any desktop quickly and easily. All of this is accomplished securely but many remain hesitant to do this across the Internet directly. Luckily, there is now an option to map this SMB share to an IP address inside an Azure VNet to restrict access if desired.


Before you set this up you will need an existing Azure VNet as well as a paid Azure subscription. You can add a Private Endpoint to an existing Azure Storage account or create one at the same time you create a new Azure Storage account. In this case, I’m going to use an existing account.

In the Azure portal search for “private link”, which should then take you to the Private Link Center as shown above. Select the Add button on the right.


You’ll need to select a Resource Group as well as a Name as shown above.


You’ll then need to select the Azure Storage account and the file option to connect to an existing SMB file share, as shown above.


Next, you’ll need to connect it to an existing VNet, and if you want to access the resource privately by name, you’ll need to integrate it with a private DNS zone, which will also be set up for you as part of this process.


You can then add tags. Note – when I created mine, if I assigned tags here I couldn’t create the Private Endpoint, which appears to be a bug. So, if for some reason you find the same issue, create the Private Endpoint without tags and then add them later.

With all that done, select the Create button to finish the configuration on the Review + Create page.
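For reference, the same configuration can be sketched with the Azure CLI. The resource and account names below are placeholders, and the exact parameter names (e.g. `--group-id` versus `--group-ids`) can differ slightly between CLI versions:

```shell
# Create a private endpoint for the 'file' sub-resource of a storage account
# (names are placeholders; the subnet must exist in the chosen VNet)
STORAGE_ID=$(az storage account show \
  --resource-group MyRG --name mystorageacct --query id -o tsv)

az network private-endpoint create --resource-group MyRG --name MyFilePE \
  --vnet-name MyVNet --subnet MySubnet \
  --private-connection-resource-id "$STORAGE_ID" \
  --group-id file --connection-name MyFilePEConn

# For private name resolution, link the privatelink.file.core.windows.net
# private DNS zone to the VNet (the portal wizard does this step for you)
```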


When the set up process is complete you’ll now see your endpoint as shown above with an allocated IP address on the Vnet you selected.


If you then look at your Vnet, as shown above, you will see that the Storage Account is seen as a connected device.


If you now visit the Storage Account and select Firewalls and virtual networks as shown above, you can configure what networks can access this new Private Endpoint.

Leaving the option set to All networks means that you can still map to that SMB share directly across the Internet, which you may want.


However, in the above case, I have selected to restrict the access to the Vnet only.


Doing so means that the ONLY way I can now access that SMB Share is via the selected Vnet. I can’t get to it using the Azure portal on my remote desktop machine as shown above.


If I wanted to access this from a remote location, outside the Vnet across the Internet, I could add those details below. However, I have chosen not to do this.

Now that my Azure SMB file share has a dedicated IP address that is restricted to access via an Azure VNet, how do I work with this share directly on premises? Easy. I set up an Azure Site to Site VPN to that same VNet, and now I can access that Azure SMB file share from my local machines by mapping to something like the IP address.
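Once the VPN is up, mapping the share from an on premises Windows machine is the usual net use command, just pointed at the private endpoint IP. The IP address, share name, storage account name and key below are all placeholders for your own values:

```shell
rem Map the Azure file share via its private endpoint IP across the VPN
rem 10.0.0.5 = private endpoint IP; replace share/account/key with your own
net use Z: \\10.0.0.5\myshare /user:Azure\mystorageacct <storage-account-key> /persistent:yes
```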


Thus, the only way that Azure SMB file share can be accessed is across a Site to Site VPN, making it even more secure.


Private Endpoints support connections to a number of Azure PaaS services, as shown above. This is handy, as it allows you to connect your Azure IaaS services (like VMs) directly to Azure PaaS services (like storage) quickly and easily. What’s the benefit? Remember, IaaS is typically billed on time used, while PaaS is billed on resource consumption. So why should I pay for a VM to store my data, paying for the time it runs (typically 24/7) plus disk storage, when I could use Azure Storage and mostly be billed just for the data capacity?
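To make that IaaS versus PaaS billing point concrete, here is a toy comparison. The rates are made-up illustrative numbers, not actual Azure prices, but they show the shape of the difference between paying for run-time versus paying for capacity:

```python
# Toy comparison: serving 100GB of files from an always-on VM vs PaaS storage.
# All rates are illustrative placeholders, NOT actual Azure pricing.
vm_per_hour = 0.15                      # hypothetical small VM rate (AU$)
vm_monthly = vm_per_hour * 24 * 30      # billed for time, running 24/7
vm_disk_monthly = 10.00                 # plus managed disk storage

paas_per_gb_month = 0.30                # hypothetical file storage rate
paas_monthly = paas_per_gb_month * 100  # billed on capacity: 100GB

print(f"VM: AU${vm_monthly + vm_disk_monthly:.2f}, PaaS: AU${paas_monthly:.2f}")
```

Even with generous assumptions for the VM, paying only for consumed capacity wins for this kind of workload.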

PaaS is the future and has many benefits over IaaS. You should be looking to shift as much of your infrastructure to PaaS as possible, to take advantage of things like reduced maintenance, cost savings, etc. Private Endpoints are an easy way to start doing just that. For more information on Azure Private Endpoint visit:

What is Azure Private Endpoint?