Tuesday, January 31, 2017

Need to Know Podcast–Episode 130

Marc and I have some brief news and cloud updates for you and then we go straight into our guest for this episode. I speak with MVP Alan Burchill about his upcoming Microsoft Ignite presentation:

Using Edge in the Enterprise

Microsoft Edge is one of the most secure and web-standards-compatible browsers on the market. See how the new management features in Windows 10 can help IT Professionals provide support for legacy web sites while still allowing users to access web sites with the latest web standards.

Don't forget to send us your feedback at feedback@needtoknow.cloud

You can listen to this episode directly at:

https://ciaops.podbean.com/e/episode-130-alan-burchill/ 

or on Soundcloud here:

Subscribe via iTunes at:

https://itunes.apple.com/au/podcast/ciaops-need-to-know-podcasts/id406891445?mt=2

The podcast is also available on Stitcher at:

http://www.stitcher.com/podcast/ciaops/need-to-know-podcast?refid=stpr

Don’t forget to give the show a rating as well as send us any feedback or suggestions you may have for the show.

Resources

@alanburchill

@marckean

@directorcia

www.grouppolicy.biz

Azure ready

Office 365 German datacenters

Microsoft tech days online

Microsoft tech summit - Birmingham

Monday, January 30, 2017

Learning online advertising–Part 1

There are a lot of things in this world I have no real clue about. One of these is online advertising to grow your business via the likes of Google and Facebook ads. My plan therefore is to endeavour to make sense of these and share with you my journey. So, let the story begin.

There are a lot of “so called” online advertising experts out there spruiking their wares. Others I have talked to have failed to provide any concrete or repeatable evidence of how they have successfully used online advertising to boost their businesses. To my mind this is not a very satisfactory set of circumstances, and it has led me to the conclusion that I need to dive deep and understand this for myself, in terms that make sense to me.

I think step one in this process is to define a tangible and profitable goal you are looking to achieve: something where you can measure the path from online advertising clicks directly through to profit, which it seems to me few actually do definitively. You want to know that X number of clicks will generate you, on average, Y dollars. Otherwise you are simply wasting money in my books.

So, for the purposes of my adventures here and to make measurement easier, I am going to stick to one simple desired outcome which is:

I want to boost the number of paid subscriptions to my online Getting Started with SharePoint Online Course.  

That now gives me an endpoint to take people to after they have clicked an online ad. See my ad. Like my ad. Click my ad. Pay for my product. Profit generated. Simple right?

Now the second thing I did was put a hard stop on the amount of money I was going to spend before making adjustments. The amount that I settled on was $100. Thus, each time a $100 threshold was crossed I would stop and review before continuing.

image

The above shows you my first attempt using just Facebook ads. For my $100 I received about 86,000 views and about 1,100 clicks. That is a click-through rate of a bit over 1%, which is not unexpected from what I have determined. Now the big question: how many of these 1,100 or so clicks actually converted into dollars? Answer? Zero. Yup, zero. None of the 1,100 or so people who clicked actually signed up and paid for the course.
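Before looking at what went wrong, it is worth being clear on the arithmetic those numbers imply. Here is a minimal PowerShell sketch of the click-to-profit maths; the course price used is a hypothetical placeholder, not the actual price of the course.

# Rough campaign arithmetic, a sketch only. $coursePrice is a hypothetical placeholder.
$spend = 100        # dollars spent on the campaign
$views = 86000      # ad impressions
$clicks = 1100      # clicks through to the destination
$coursePrice = 49   # hypothetical course price in dollars

$ctr = $clicks / $views                                    # click-through rate, roughly 1.3%
$cpc = $spend / $clicks                                    # cost per click, roughly 9 cents
$breakEvenSales = [math]::Ceiling($spend / $coursePrice)   # sales needed just to cover the ad spend
$breakEvenConversion = $breakEvenSales / $clicks           # conversion rate needed from those clicks

"CTR: {0:P2}  CPC: {1:C2}  Break-even sales: {2}  Break-even conversion: {3:P2}" -f $ctr, $cpc, $breakEvenSales, $breakEvenConversion

Working the numbers this way makes the target explicit: at that hypothetical price, only a handful of sales out of 1,100 clicks would have covered the spend, which is why zero conversions is the figure that matters, not the clicks.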

Disappointing based on pure numbers but let’s have a think about this. The Facebook online ads do appear to have been doing their work by bringing people to the online course. The problem appears to be actually converting them to buying customers. In that respect one would have to conclude that issues lie with the destination not the online ad. Something about the destination site is not resonating with people. In short, the destination is not making the value statement well enough.

Of course there could be other factors such as the online ads attracting the wrong demographic and so on. However, I need to pick something to focus on and adjust so, to me, the most obvious is that the destination is failing to convert.

Thus, the next step in this learning process is to revamp the destination and make it more appealing and more focused on conversion. I’ve got some ideas on some improvements that can be made and once these are done I’ll click off the next $100 spend with Facebook online ads and see what results that produces. I’ll also spend more time doing research about this whole online advertising process and report back my findings.

However, in summary I would suggest:

1. Start with a well-defined goal you want your online advertising to achieve. Something that is measurable.

2. Aim to have a clear path from your online advertising to profit. Be very clear about how a click on an online ad is going to generate you profit.

3. Place a spending limit on your advertising, at which point you can stop and review the success of your campaign.

4. Understand where the issues lie: getting people to click, or converting them after they have clicked. The process is not just a single component; there are many moving parts here.

Finally, remember that no matter what anyone says, simply throwing up online advertising is no automatic guarantee of success. Although many claim to be doing it successfully, most aren’t (from what I see), so always base decisions on hard evidence and past performance, not the emotional promises of what the future “may” bring with some “special” method. Science is the foundation of any art. You need to get the basics right before you gain insight.

Need to Know Podcast–Episode 129

Marc and I give our usual Office 365 and Azure update. We then speak with John Liu about his upcoming presentation at Microsoft Ignite Australia:

Serverless in Office 365 - Build Services with Azure Functions

There was a time when it was the good old days - we ran SharePoint on-premises, and business always wanted more customizations, and so we built them and delivered great value. Now, it seems as if everything is moving towards Office 365, and there are fewer and fewer customizations we can do. To the business, it seems more and more we have to say no - we can't do that. Code has to run somewhere. Suddenly, we are in the realm of hosting our own webjob; we may even have to find or buy a VM. We are left to wonder if we need to learn Docker too. Thank goodness - in 2016 we saw the rise of a new buzzword, #Serverless - it promises a solution to many of these problems and lets us focus on what we do best again - creating solutions, without having to worry so much about where or how to run them. This session introduces Azure Functions and the various amazing scenarios in which it can be applied to SharePoint and Office 365. Whether you are a new, veteran or even front-end developer, or an IT Pro well versed in PowerShell, this session is for you.

Don't forget to send us your feedback at feedback@needtoknow.cloud

You can listen to this episode directly at:

https://ciaops.podbean.com/e/episode-129-john-liu/ 

or on Soundcloud here:

 

Subscribe via iTunes at:

https://itunes.apple.com/au/podcast/ciaops-need-to-know-podcasts/id406891445?mt=2

The podcast is also available on Stitcher at:

http://www.stitcher.com/podcast/ciaops/need-to-know-podcast?refid=stpr

Don’t forget to give the show a rating as well as send us any feedback or suggestions you may have for the show.

Resources

@johnliu

@marckean

@directorcia

Azure news from Marc

Azure functions

Microsoft Cloud run rate

OneDrive and Officelens updates

Dynamics 365 roadmap

Tuesday, January 24, 2017

Need to Know Podcast–Episode 128

Marc and I are joined by a returning guest to talk all about her upcoming Microsoft Ignite Australia presentations. Sonia Cuff gives us the low down on what to expect with the following two sessions she is presenting:

Making SaaS part of your IT Strategy

With the business pushing for SaaS apps, why are we saying no? Can you balance an in-house infrastructure under strict controls & policies with a business reliance on an outsourced, uncontrolled solution? We'll look at how to enable the business while still protecting them and how to keep your sanity.

and

The CEO reviewed your project & you won't believe what happened next

Your budget was agreed. A reasonable timeframe was achieved. The implementation went smoothly. So why is the business still unhappy? You'll learn why Digital Transformation is more than just technology deployment. We'll show you what successful Digital Transformation looks like to the CEO & how you can ensure your IT work really is enabling people to achieve more. Find out some of the practical tools that Microsoft provides that can help you navigate this conversation with your executive stakeholders.

There is also the latest Office 365 and Azure news and don't forget to send us your feedback at feedback@needtoknow.cloud

You can listen to this episode directly at:

https://ciaops.podbean.com/e/episode-128-sonia-cuff/ 

or on Soundcloud here:

 

Subscribe via iTunes at:

https://itunes.apple.com/au/podcast/ciaops-need-to-know-podcasts/id406891445?mt=2

The podcast is also available on Stitcher at:

http://www.stitcher.com/podcast/ciaops/need-to-know-podcast?refid=stpr

Don’t forget to give the show a rating as well as send us any feedback or suggestions you may have for the show.

Resources

@cuff_s

@marckean

@directorcia

Marc's Azure news

Azure Active Directory meets Power BI

Office 365 planned service changes

Cloud platform roadmap

The Missing Chair: http://themissingchair.com.au/

Personal website: http://soniacuff.com

MS Ignite Session: Making SaaS part of your IT Strategy https://msftignite.com.au/sessions/session-details/2209/making-saas-part-of-your-it-strategy-

MS Ignite Session: The CEO reviewed your project & you won’t believe what happened next … https://msftignite.com.au/sessions/session-details/2320/the-ceo-reviewed-your-project-you-wont-believe-what-happened-next-

Microsoft FastTrack: https://fasttrack.microsoft.com/

Sharing files with external users using OneDrive for Business

Here’s a bright shiny new video detailing how to share files with external users from OneDrive for Business. You’ll see how to share with external users via email address and via a direct URL. Nice and easy.

If you wish to share with external users via email (i.e. they have to actually log in to view the document), then they’ll need a free Microsoft account, which they may already have or can easily set up using their own email address.

If you wish to share the document without the need for a login you can also do that easily via OneDrive for Business.

Sharing your files via OneDrive for Business means that you retain the source of the files and control over them. It also means there is a single point of truth when it comes to the document. That alone makes it worth using OneDrive for Business to share business documents.

Office 2013 via Office 365 is going away

A key date that is fast approaching is the removal of availability and support of Office 2013 from the Office 365 portal. As detailed here:

https://blogs.technet.microsoft.com/skywriter/2017/01/19/office-365-planned-service-changes-2017/

Office 2016 is the recommended version of Office 365 ProPlus and includes all the latest upgrades and new features. As we announced in September 2015, when we released Office 2016, beginning March 1, 2017, the Office 2013 version of Office 365 ProPlus will no longer be available for installation from the Office 365 portal. Beginning March 1, 2017, your users will no longer see Office 2013 as an option for download through the Office 365 portal, and admins will no longer have the option under Software download settings in the admin portal to choose to enable Office 2013. In addition, we will no longer provide feature updates for this version, nor provide support.

The requirement to upgrade an old version of Office on the desktop has been covered previously, and I detailed it here:

Questions about Office 2016 via Office 365

Probably the major point with Office 2016 is that it doesn’t support connection to Exchange 2007. This is typically going to affect those users still running SBS 2008, so you have been warned.

One of the subscription benefits of Office 365 is that subscribers have access to the latest software. They should now ensure that they have upgraded any previous versions to the latest that Office 365 offers.

As mentioned in my article, users have 12 months from the release date of new software to upgrade to the latest version. Failing to do so will result in their current version going into ‘reduced functionality mode’, where they can only carry out basic functions such as opening and reading documents.

If a user has Office 2013 from Office 365 they will not be upgraded automatically; they will need to install the new software manually. For answers to more questions about Office from Office 365 I urge you to read the above articles and make the changeover as soon as possible.
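If you want a quick way to see which Click-to-Run build of Office a desktop is running before the March 1 cut-off, a minimal sketch like the following reads the usual Click-to-Run registry location. The path and value name are assumptions based on standard Click-to-Run installs, so verify them in your own environment.

# A quick check of the installed Click-to-Run Office version.
# Assumes the standard Click-to-Run registry location; verify on your own machines.
$c2r = "HKLM:\SOFTWARE\Microsoft\Office\ClickToRun\Configuration"
if (Test-Path $c2r) {
    $version = (Get-ItemProperty -Path $c2r).VersionToReport
    if ($version -like "15.*") {
        Write-Output "Office 2013 (Click-to-Run) detected: $version - plan the upgrade to Office 2016."
    }
    else {
        Write-Output "Click-to-Run Office version: $version"
    }
}
else {
    Write-Output "No Click-to-Run installation found at $c2r"
}

Version numbers starting with 15 are the Office 2013 generation, while 16 indicates Office 2016, so this gives a simple way to flag machines that still need the upgrade.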

Monday, January 23, 2017

The middle age spread

This is part nine of my presentation “Making money from the cloud”. You can find the full slides at:

https://doc.co/LyrxvF/qcihGm

and the previous parts are at:

We live in exponential times

Consider the following

Major Trends

Macro Trends

Software will eat the world

The phone is the desktop

Build a tailored service

Focus on adding value

image

I believe the reality of many IT services businesses today is a model that looks like the graphic above. To my mind, it illustrates that the majority of resources inside an IT services business are spent on managing and maintaining human capital. That human capital could be people management (i.e. employees) or it could be knowledge management (i.e. keeping up to date), but is most likely a combination of both. No matter what components constitute it, it is by far the largest drain on the business and is something that affects IT resellers both large and small.

In this old model, the human capital resource has to be the widest component to cater for all eventualities and is the base on which everything else sits. Most IT providers need people and knowledge to cover the huge variety of products and services they sell and the systems they utilise to support these. Some of these may only be required occasionally but there is too much risk involved in not having them covered. So the base of the structure traditionally needs to be the widest to support those layered on top of it.

The traditional model for revenue growth for IT providers has been to constantly add more products and customers. Adding more products generally also means introducing additional vendors. For example, ‘we hear there is good money in VoIP phone systems, let’s do that’, and off the business goes, charging down the path of adding more products that require additional resources for ill-defined or unknown returns. Likewise, many IT providers have traditionally taken on any client they come across because their focus is on revenue rather than profit. If duly examined, many IT resellers would find that probably 20% of their customers provide 80% or more of the profit in their business, yet the amount of resources dedicated to the most profitable customers is probably quite low. That is simply an indication that the IT reseller has lost business focus and is merely fighting fires. In short, they are letting the business control them.

Much of the diversity of products that resellers have to support comes from the variety of customers they also elect to support. Many customers have little in common with other customers, so each becomes a unique instance to accommodate. This requires unique knowledge and lots of time spent doing things that can’t be applied elsewhere or aren’t worth automating. The greater the variety of customers on board, the exponentially worse this all becomes.

With a huge variety of both customers and products to support, you end up carrying far more resources than you need, ‘just in case’. This means an ever decreasing width as you move towards the top of the structure shown above, because the lower level must always be larger than the one above it, ‘just in case’. Unfortunately, at the top of this model sits the smallest component of all: profit. It has been eaten away by all the supporting structure underneath. Thus, the business now has what I like to call the ‘middle age spread’, far bigger at the bottom than the top. That is not what you want it to look like, if we are honest, right?

You’ll also notice that I have included an unnamed mystery box floating over the whole structure. This is something that nearly every IT reseller I know of does not do or even take seriously, yet it is one of the most important factors in the success of a business. Any ideas on what it could be? Stay tuned.

The question is, what can be done to fix the situation? The next article will start delving into the solutions in more detail.

Friday, January 20, 2017

January webinar resources

Welcome to 2017. The first webinar of the New Year is now done and dusted. You can see the slides above or download them directly from:

January 2017 Need to Know Webinar

If you are not a CIAOPS patron and you want to view or download a full copy of the video from the session, you can do so here:

http://www.ciaopsacademy.com/p/january-2017-need-to-know-webinar/

You can also now get access to all webinars via:

http://ciaops-academy.teachable.com/courses/need-to-know-webinars

for a nominal fee.

Thanks to everyone who attended and I hope to see you again next month.

Need to Know podcast–Episode 127

In this episode we are joined by Alessandro Cardoso, Technology strategist at Microsoft to talk about his upcoming Microsoft Ignite Australia sessions:

Managing Red Hat on Azure with OMS [OPEN312]

With the capability to deploy a Red Hat supported Virtual Machine in Azure, you may be asking: "What else can I do with my Azure Red Hat VM?” We will introduce Microsoft Operations Management Suite (OMS), walking you through the incredible analytic power of the system for Linux and Windows Azure VMs. With Linux, OMS allows you to collect Syslog events, performance data, and Nagios/Zabbix alerts.

And

Deploying Linux on Microsoft Public and Private cloud [OPEN323]

Heterogeneous environments with Microsoft Windows Clients, Microsoft Windows Server, Linux, FreeBSD, and the cloud are the norm. Being able to run all of your virtualized workloads on a single hypervisor simplifies management and optimizes server capacity. Learn how to deploy Linux VM to Hyper-V or Windows Azure.

You can listen to this episode directly at:

http://ciaops.podbean.com/e/episode-127-alessandro-cardoso/

or on Soundcloud here:

or subscribe via iTunes at:

https://itunes.apple.com/au/podcast/ciaops-need-to-know-podcasts/id406891445?mt=2

The podcast is also available on Stitcher at:

http://www.stitcher.com/podcast/ciaops/need-to-know-podcast?refid=stpr

Don’t forget to give the show a rating as well as send us any feedback or suggestions you may have for the show.

Resources

@cloudtidings

@marckean

@directorcia

Microsoft OMS

Azure news from Marc

New features in Microsoft Flow

Using Flow for event registration

Project Osaka

Monday, January 16, 2017

Issues with Azure File Backup on SBS

One of the initial steps that I have been advocating when migrating SBS servers to Azure was the installation of the Azure Backup agent (marsagentinstaller.exe) on the SBS box in order to back up files and folders. It was the first step before moving on to more complex operations. After further research, it turns out that doing this will break other SBS functionality.

The reason is that the Azure Backup agent needs at least PowerShell V3.0 per:

https://docs.microsoft.com/en-us/azure/backup/backup-client-automation

Now it turns out that installing PowerShell V3 or higher on an SBS server breaks things, per this:

https://blogs.technet.microsoft.com/sbs/2012/12/15/windows-management-framework-3-0-applicability-on-windows-small-business-server-20082011-standard/

which concludes:

Our guidance at this time is that Windows Management Framework 3.0 should not be deployed on a server running Windows Small Business Server 2008 Standard or Windows Small Business Server 2011 Standard.

Windows Management Framework 3.0 contains PowerShell v3.0.

The bottom line, from what I can determine, is that you shouldn’t install the Azure file backup agent on an SBS box, because the minimum version of PowerShell the agent requires is not supported there.

The Azure file backup agent will actually install and run on an SBS server. However, as part of that installation it will also install PowerShell v3.0, which can cause lots of other issues. Thus, even though it can be installed, DON’T install it, because its components will cause problems elsewhere on SBS.
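If you want to confirm what is already on a box before making any decisions, a minimal sketch like the following, run in an elevated PowerShell session on the server, reports the current PowerShell version and warns when an agent install would drag in WMF 3.0.

# Pre-flight check before considering the Azure Backup agent on an older server.
# A sketch only - run locally in an elevated PowerShell session.
$os = (Get-WmiObject -Class Win32_OperatingSystem).Caption
$ps = $PSVersionTable.PSVersion
Write-Output "Operating system  : $os"
Write-Output "PowerShell version: $ps"
if ($ps.Major -lt 3) {
    Write-Warning "PowerShell is below v3.0. Installing the Azure Backup agent would pull in WMF 3.0, which is not supported on SBS 2008/2011 Standard."
}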

Unfortunately, the Azure file backup agent can only back up files on the host it is installed on. This means you can’t install it on a member server and back up files across the network that live on the SBS box. However, the way you can do this (in theory) is using Azure application backup, which I’ll now have to go and check actually operates in an SBS environment.

Makes things tough when your production OS doesn’t support the latest software eh?

Need to Know podcast–Episode 126

For our continued focus on speakers at the upcoming Microsoft Ignite event on the Gold Coast we speak with Andrew McMurray from Microsoft about Azure Information Protection. Andrew’s presentation is:

Prevent unwanted and embarrassing leakage with Azure Information Protection

Microsoft Azure Information Protection helps you safeguard your data throughout the complete data lifecycle. Data is "born" protected and carries the protection wherever it travels. So you don't need to worry where it's stored or with whom it's shared - you can rest assured it's always protected. Join us to learn more about the technology and how it can solve your information protection challenges.

Marc and I also do our usual wrap up of the latest Microsoft cloud news.

You can listen to this episode directly at:

http://ciaops.podbean.com/e/episode-126-andrew-mcmurray/

or on Soundcloud here:

or subscribe via iTunes at:

https://itunes.apple.com/au/podcast/ciaops-need-to-know-podcasts/id406891445?mt=2

The podcast is also available on Stitcher at:

http://www.stitcher.com/podcast/ciaops/need-to-know-podcast?refid=stpr

Don’t forget to give the show a rating as well as send us any feedback or suggestions you may have for the show.

Resources

andrew.mcmurray@microsoft.com

@marckean

@directorcia

AIP Slides: https://aka.ms/IPdeck
AIP video of slides: https://aka.ms/IPvideo
News: https://aka.ms/aipnews
Blogs: https://aka.ms/aipblogs
Security Overview: https://aka.ms/rmssec
Web: https://aka.ms/aip
Overview: https://aka.ms/aipoverview
Forum: https://www.yammer.com/AskIPteam
AAD Sync: https://aka.ms/aipaadsync

Azure news from Marc

Azure AV2 machines now available

Microsoft Staffhub is here

Study says Teams to pass Slack

Sunday, January 15, 2017

Using Microsoft Flow for event confirmations

One of the handy features that many third party webinar products provide is the ability for people to register on a web page and then receive confirmation of that registration via email. Unfortunately, if you are looking to run a public Skype for Business meeting this feature is not currently available. However, Office 365 does provide some tools that allow you to build an even more powerful solution than the ones provided by third parties.

image

The tool to do this with is Microsoft Flow, which you can access via:

https://flow.microsoft.com

You can then login with your Office 365 credentials.

If you then select the My Flows option from the menu bar in the top left you should see a screen as shown above. Here select Create from blank.

image

Flow allows me to connect to external web services such as Typeform, which is what I have used to create the public registration page. Basically, the Typeform registration will ask for First Name, Last Name and Email address.

I can search for the service I wish to use in the box as shown above. In this case I enter typeform.

image

I’ll need to authorise and connect to that third party web service. In the case of Typeform, I’ll need to locate the API key in my Typeform account and insert it into my flow.

image

Once the service is connected to my flow I can select the registration form I’m going to use to capture my data. Here, I’m using an existing Typeform form called “Flow demo”, which I can select from a drop down list of all the Typeforms I have set up.

This flow will start when a new response is submitted to this form.

Now select the + New step button below.

image

From the items that appear, select Add an action.

image

From the search box that appears I enter “Office 365” and select the Send an email option as shown.

You may need to authorise the connection to email but once that is done you will see the fields To:, Subject and Body that you can now fill.

I can now insert dynamic content into these regions. Dynamic content effectively means fields from the connected services, which appear on the right and which I can now select.

So, I click in the To: field in the flow and then select the appropriate question from the Typeform form on the right that will yield the registrant’s email address. Thus, I will be sending a reply email to the registration email address that was collected by Typeform.

image

I then complete the rest of the information I want to go out in the confirmation email as shown above.

image

Now, here’s where Flow is superior to other third party registration services. I have a custom list in my Team Site that I also want to populate with the registration details so I have a copy. This list is just the name and email address as you see above.

image

I go back to my flow and Add an action again.

image

Into the search box that appears I enter “SharePoint” and select the option Create item that appears.

image

I now enter the Team Site URL and the list within that Team Site I want to populate from Typeform.

You will then see the fields from that list appear (here Email and Name), which again I can now populate with dynamic content from Typeform.

image

I could continue on and add more steps if I wanted but I’ll now give this Flow a name and select the Create flow option in the top right to save the changes and activate the automation.

image

Now, one of the things that you may get is an error like the above. From what I understand, this is telling you that something about the dynamic content, in my case from Typeform, isn’t quite right. I’ll need to do more digging to understand why this happens, but if it does you’ll need to debug your flow.

In my case, for some reason, Flow doesn’t like the Typeform First name or Last name response, which is weird as the email response is fine. Something I need to investigate further. For the time being I simply deleted these Typeform fields from my flow.

image

If all is good you should receive a message like that shown above and you can select the Done link on the right.

image

The flow you just created should now appear in My Flows as shown above. You can view, edit, disable and track the flow from here if needed.

image

So if I now go and complete the Typeform registration, it should kick off my flow.

image

If I look at the status of my flow I indeed see it has executed successfully as shown above.

image

If I now check my Team Site list I can see that item has been added as shown above.

image

The person registering has also received an email (above),

image

and I also have the sent item in my mailbox as seen above.

So there you have it, a pretty quick way to create a registration confirmation process with the added benefit of saving registration information into a Team Site list.

There are of course limits to what can be done with Flow at this stage, but it is improving rapidly and I am keen to spend more time with the service to improve my knowledge, because it provides a great opportunity to automate business processes. The ability for Flow to connect to third party applications like Typeform, as shown here, is where the real power lies I believe.

I look forward to the continued improvement in Flow and suggest that if you have Office 365 you should start looking at it to help automate more of your business.

Thursday, January 12, 2017

Publish your Office 365 calendar publicly

image

There are plenty of times when it is handy to be able to give people anonymous access to your calendar, no matter where they are.

To enable this, login to your Office 365 web portal.

image

Navigate to your web calendar typically by selecting the calendar icon from the portal home page.

image

In the top right of the page select the Cog icon as shown.

image

This will slide a blade out from the right of the window. At the bottom of this, locate and select Calendar as shown; it is under the Your app settings area.

image

Locate and select the option Publish Calendar which appears under the Shared Calendars option towards the bottom on the left.

image

Now determine your sharing options from the pull down menus. Since this is going to be available publicly, I’d recommend you select Availability only from the Select permissions options.

Once you have made your changes select the Save icon at the top.

image

You’ll now get an HTML calendar URL as well as an ICS calendar URL that you can copy, paste and send to any contacts.

image

If you navigate to the HTML link, you should see something like the above displayed. The entries there will depend on the permissions you selected previously.

Of course, when you update your private calendar the external links will also be updated, since they are basically a view onto this calendar.

Once you have configured this and copied the link, it is really easy to provide people with an idea of what your calendar looks like, now and in the future. Pretty cool eh?
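If you prefer to script this rather than click through the web interface, the Exchange Online cmdlets Get-MailboxCalendarFolder and Set-MailboxCalendarFolder expose the same publishing settings. The sketch below assumes an existing Exchange Online remote PowerShell session, and the mailbox address is a placeholder.

# A minimal sketch of enabling calendar publishing via Exchange Online PowerShell.
# Assumes an existing Exchange Online remote PowerShell session; the mailbox address is a placeholder.
$calendar = "user@yourdomain.com:\Calendar"

# Publish the calendar with free/busy information only
Set-MailboxCalendarFolder -Identity $calendar -PublishEnabled $true -DetailLevel AvailabilityOnly

# Retrieve the published HTML and ICS URLs to hand out
Get-MailboxCalendarFolder -Identity $calendar | Select-Object PublishedCalendarUrl, PublishedICalUrl

The DetailLevel value of AvailabilityOnly corresponds to the Availability only permission recommended above.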

Azure Av2 machines now available

image

https://azure.microsoft.com/en-us/pricing/details/cloud-services/?WT.mc_id=azurebg_email_Trans_33675_1284_Tier_2_Release_MOSP

The latest generation of the A-series, Av2 Standard, has similar CPU performance to the standard A-series and faster disk. Suitable for development workloads, build servers, code repositories, low-traffic websites and web applications, Av2 Standard also works for microservices, early product experiments, and small databases.

Need to Know podcast–Episode 125

We are back for 2017! Marc and I do our usual news and cloud updates, followed by a returning guest, MVP Troy Hunt. Troy chats to us about his upcoming Microsoft Ignite Australia presentation, Applied Azure: Building a Large Scale Real World Application on a Coffee Budget, which makes for really interesting listening.

You can listen to this episode directly at:

https://ciaops.podbean.com/e/episode-125-troy-hunt/

or on Soundcloud here:

 

or subscribe via iTunes at:

https://itunes.apple.com/au/podcast/ciaops-need-to-know-podcasts/id406891445?mt=2

The podcast is also available on Stitcher at:

http://www.stitcher.com/podcast/ciaops/need-to-know-podcast?refid=stpr

Don’t forget to give the show a rating as well as send us any feedback or suggestions you may have for the show.

Resources

@troyhunt

@haveibeenpwned

@marckean

@directorcia

Have I been Pwned

Azure updates from Marc

Updated SharePoint Team Sites move beyond first release

Microsoft Partner services being revamped

New unified DLP in Office 365

Microsoft Connect car platform

Replacement to Azure RemoteApp coming soon

Azure Backup protects against ransomware

Tuesday, January 10, 2017

Using Azure DNS with Office 365

One of the tasks that you need to perform when you add a custom domain to Office 365 is to first verify that you actually own the domain name.

image

The Office 365 domain setup wizard, as shown above, will give you a TXT record you need to insert into your DNS zone so ownership can be verified by Office 365 before proceeding further.

Azure has the ability to host DNS records for you rather than using a hosting provider, so let’s see how you configure this.

image

Open your Azure Resource Manager Portal and select to add a DNS Zone from the market place.

image

The name of your new DNS zone has to match the domain you wish to host; here that is azlab01.net. I have also elected to place this new DNS zone into a Resource Group for easy management.

image

After a few moments, the new DNS zone will be created and you can navigate to it in the Azure Resource Manager Portal to manage it.

Simply select the new DNS Zone to view its details.

image

You should see something similar to the above.

You will notice that two DNS records have already been created, shown in the lower half of the screen.

image

In the top right of the blade you’ll find the name servers as shown above.

image

You’ll need to update the domain registration for that domain to point to these name servers instead of where they are currently pointing as shown above.

image

In the top left of the blade select Record Set to create a new DNS record in this zone.

image

A new blade will appear, as shown. To verify our Office 365 domain we need to add a TXT record with the string provided as shown above.

When complete, save the new record.

image

If we now look at our DNS zone we see an additional TXT record as expected.

image

If we return to Office 365 and select Verify, our domain should successfully be verified thanks to Azure DNS. We can now proceed onto managing the individual domain records ourselves in Azure DNS. To do this select the option I’ll manage my own DNS records and select Next to continue.

image

As expected, and as shown above, we get a long list of DNS records to add to our zone. Now here’s where the benefits of using Azure DNS shine through.

We can use PowerShell with Azure DNS to set all our records using a script. Thus, instead of adding them manually one by one via a browser, we simply run a script that does all the work for us.

Get-AzureRmDnsRecordSet -ZoneName <domain> -ResourceGroupName <resource group>

To view the existing Azure DNS zone information run the above command once you are connected to Azure.

image

As you can see from the above, one of the entries is the TXT record entered into the Azure DNS zone manually via the portal.

To add an MX record for instance to the zone, execute the following command:

New-AzureRmDnsRecordSet -Name "@" -RecordType MX -ZoneName <domain.com> -ResourceGroupName <resource group> -Ttl 3600 -DnsRecords (New-AzureRmDnsRecordConfig -Exchange domain-com.mail.protection.outlook.com -Preference 0)

That should produce the following record in your zone:

image

If you now execute the appropriate commands that add the remaining records to your zone, you can then return to Office 365 and complete the wizard.

image

If everything is in order you should now get confirmation that your domain has been successfully configured for Office 365 as shown above.

The huge benefit that Azure DNS provides here is the ability to script this totally. Most of the DNS records you need to add for Office 365 are identical or derived from the custom domain you wish to add. Thus, all you need to do is set some parameters at the top of your script and the remainder stays identical. You can use one PowerShell script to set the DNS zone records for EVERY custom domain you wish to add to Office 365! How much time is that going to save you if you need to set up lots of custom domains?
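As an illustration of that parameterisation, here is a minimal sketch using the same AzureRM DNS cmdlets shown above. The record values (MX endpoint, SPF TXT and autodiscover CNAME) follow the usual Office 365 pattern, but treat them as assumptions and always use the exact values the Office 365 domain wizard gives you.

# A sketch of a parameterised Office 365 DNS script for Azure DNS.
# Record values follow the common Office 365 pattern - confirm against your own domain wizard output.
$domain        = "azlab01.net"          # custom domain being added
$resourceGroup = "dnszones"             # hypothetical resource group holding the DNS zone
$mxTarget      = ($domain -replace "\.", "-") + ".mail.protection.outlook.com"

# MX record for Exchange Online
New-AzureRmDnsRecordSet -Name "@" -RecordType MX -ZoneName $domain -ResourceGroupName $resourceGroup -Ttl 3600 `
    -DnsRecords (New-AzureRmDnsRecordConfig -Exchange $mxTarget -Preference 0)

# SPF value added to the existing TXT record set at the zone apex (the one holding the verification string)
$txt = Get-AzureRmDnsRecordSet -Name "@" -RecordType TXT -ZoneName $domain -ResourceGroupName $resourceGroup
Add-AzureRmDnsRecordConfig -RecordSet $txt -Value "v=spf1 include:spf.protection.outlook.com -all"
Set-AzureRmDnsRecordSet -RecordSet $txt

# Autodiscover CNAME for Outlook
New-AzureRmDnsRecordSet -Name "autodiscover" -RecordType CNAME -ZoneName $domain -ResourceGroupName $resourceGroup -Ttl 3600 `
    -DnsRecords (New-AzureRmDnsRecordConfig -Cname "autodiscover.outlook.com")

Wrap commands like these in a function that takes the domain and resource group as parameters and the same script covers every custom domain you set up.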

Another benefit Azure provides is the ability to assign different rights to different users in your Azure portal. Maybe only a few users can update records, while others can only view them.

image

As you can see from the Azure pricing calculator above, Azure DNS is not a free service. There is a small fee based on the number of zones and DNS queries on those domains you have. In this case, for 1 zone with 1 million queries the cost is AU$1.15 per month, which is really pretty cheap.

I think Azure DNS has a lot of benefits for IT Professionals managing domains. They could aggregate them all under their own partner tenant and become like a hosting business. They could also host the zone records in the individual customer’s Azure tenant, which of course could use the same logins as Office 365 because Office 365 comes with a free Azure tenant. I also like the idea of bringing this sort of thing back to a single supplier rather than using multiple hosting providers.

However, I think the real killer benefit is simply the ability to script everything thanks to PowerShell. This alone is going to save me so much time when I set up test domains and labs. It also means I won’t make spelling mistakes when entering the records for Office 365. All I’ll need to do is change the variable at the top of my script to match the domain I want to work with and then the script is good to go. How easy is that?