Power Automate Azure Key Vault access inconsistencies

I’m in the process of building a Flow that connects to a Dataverse database inside a Microsoft Team. When you create one of these, you also get the ability to create Cloud Flows (aka Power Automate).

image

However, there is an issue when you try to use something like the Azure Key Vault actions here.

image

In the above, you can see that I’m in my default Power Automate environment and the Get secret action of Azure Key Vault is accessible as expected and shows all the items I have inside the vault.

image

However, if I swap to another environment that was created as part of a Team (here an environment called Automation), you’ll see that I can add the Azure Key Vault Get secret action but I no longer see the items inside the vault as I did before! I am using the same user in both cases.

It clearly has something to do with the connection,

image

which shows up as invalid as you can see above.

image

If I try to add a new connection, I see the above dialog but can’t make any changes or enter any information. It looks like I might need to investigate the Connect with a service principal option.

However, for now, there seems to be a limitation when you use the Azure Key Vault actions inside anything that is not the default Power Platform environment in your tenant. I assume this is because these environments are limited to Microsoft Teams and have innate restrictions that I’ll need to find more information on. If you know what this is, I’d love to hear from you.

Create a Dataverse database in Microsoft Teams

What I want to achieve in this process is to create a single Microsoft Dataverse database inside a Microsoft Team and allow a basic Power Automate Flow to add data to it.

image

Firstly, navigate to a Microsoft Team in the environment (here Automation), and select the + (plus) icon along the menu on the right as shown above.

image

In the list of options that appears, search for, and select Power Apps as shown above.

image

The first interesting thing, once you do that, is that you can typically only select from pre-existing Power Apps listed in the dialog. However, there is a create an app in Power Apps option that you can select towards the bottom of the dialog as shown above.

image

You should then see a dialog, like the one above, telling you to wait while things get set up.

image

If this process gets hung up after a minute or two, just refresh the page in your browser. You should now see something like what is shown above, with a list of your Microsoft Teams on the left. If you select the Microsoft Team you want to put the Dataverse database into (here Automation), you should see that nothing is built yet in the information area on the right.

image

Select the New button on the right and then App from the options that appear as shown above.

image

If you take a quick peek at the Power Platform admin center in a new browser tab, and then select Environments from the menu on the left, or use the direct link:

https://admin.powerplatform.microsoft.com/environments

You’ll see that a new Power Platform environment has been created matching the name of the Microsoft Team (here Automation).

As the Microsoft documentation on Power Platform environments says:

https://docs.microsoft.com/en-us/power-platform/admin/environments-overview

A Power Platform environment is a space to store, manage, and share your organization’s business data, apps, chatbots, and flows. It also serves as a container to separate apps that might have different roles, security requirements, or target audiences.

In essence, think of an environment as a container to store the things you create in the Power Platform. When you create a Power Apps app inside a Microsoft Team, it is created in a unique container for that Team.

image

The idea is that you should be able to easily switch between environments. However, if you navigate to the Power Platform service directly at:

https://make.powerapps.com/

You are not able to see the environment just created in Microsoft Teams as shown above for some reason. It seems the only environments you can see here are those created directly in the Power Apps make portal.

image

You can drill into the new Teams environment you just created in the Power Platform admin center by selecting it from the list. Information about the environment will be displayed as shown above.

image

If you return to your app creation process inside the Microsoft Team, you’ll now need to give your app a name (here Capture).

image

Typically, you would build a full app here, but for now all we want to create is a single database, so select the Data icon (cylinder) on the left as shown and then select the Create new table button to the right of it.

image

You’ll then be asked to give the table a name (here Id). If you open the Advanced settings option at the bottom of the dialog, you’ll see that there are not many additional options to select from.

Select the Create button to continue.

image

You should now see the table displayed as shown above. You’ll also notice that there is already a column called Name created. This is a bit like when you create a new SharePoint list and get a single column created for you as well.

image

If you try to edit this initial column by selecting the header and then the Edit column option from the menu that appears above,

image

you’ll find there are not a lot of options available. This may be limiting, or just annoying as it is in SharePoint, but for now just leave that column in place. You’ll just need to remember to put some data in it as it is a required field.

image

You can then add any additional columns you require. Here I’ve added the columns Domain, Date and Value. These are the fields I want to populate with custom data.

image

If you return to the previous screen, you should now see the Dataverse database listed as shown above.

image

Returning to the Build page in Power Apps in Microsoft Teams, and selecting the Microsoft Team (here Automation), you should now see some entries in the Items created for Automation list on the right. Here, you should also see the database just created as noted above.

image

If you select the database directly from this screen you can drill in and see the table and any entries as shown above. No data appears in the table yet as none has been added.

image

The way to get data into the database here will be via a very basic Power Automate Flow. It is good practice to also create this inside the same Power Platform environment in which the Dataverse database was just created. Do this via the Cloud Flows option on the left as shown above.

image

To create a Flow, select Cloud Flows, then from the menu at the top right select the + New button. From the options that appear, select Cloud Flows and then the type of Flow desired (here an Instant Flow).

image

The process for creating a Flow is the same as if you were creating a standalone Flow via the Power Automate service. In this case, simply add the Dataverse Add a new row action as shown above. Configure this action to connect to the Dataverse database created earlier (Ids), add some text for the required default Name field (here Hello), and then data for Date, Domain and Value as shown above.

Save and Run the Flow.

image

If everything is correct, the Flow should run without errors as shown above.

image

If you then look at the details of the database you should see that it now has data inside it as shown above.

image

You could also create a Flow directly from the Power Automate service, but remember to switch to the new Microsoft Teams environment, created when the Power Apps app was added to the Microsoft Team, before creating the Flow.

image

The final interesting item here is to look at the capacity of the new database in the Power Platform admin center where you’ll find that, although you have a total size of 2GB, about 25% has already been consumed by the system.

For more information about Dataverse for Teams, consult the Microsoft documentation here:

About the Microsoft Dataverse for Teams environment

Set up PAYG for Power Platform

The Power Platform now has the ability to be licensed Pay As You Go (PAYG). This is a great option to get access to many advanced capabilities on demand. When you configure this option, the billing is done via Azure rather than Microsoft 365. This means that, prior to setting up PAYG for the Power Platform, you’ll need to have an Azure subscription in place. As I have highlighted before:

Deploy Office 365 and Azure together

Once you have the Azure subscription in place, inside the same tenant where you want to enable PAYG for the Power Platform, you’ll need to have or create an Azure Resource group that will be associated with the PAYG option. You need to create this ahead of time. The following will show you how to create one if you need to:

Manage Azure resource groups by using the Azure portal
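If you prefer to script that step, a minimal sketch using the Az PowerShell module might look like the following. The resource group name and region here are examples only:

# Requires the Az PowerShell module: Install-Module Az
Connect-AzAccount    # sign in to the Azure subscription that will carry the PAYG billing

# Create a resource group to associate with the PAYG billing policy
# (the name and location are example values only)
New-AzResourceGroup -Name "rg-powerplatform-payg" -Location "australiaeast"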

image

You’ll then need to visit the Power Platform admin center which is at:

https://admin.powerplatform.microsoft.com/

then select Billing policies. From the top menu now select New billing policy as shown above.

image

Give the policy a name; it needs to be at least 10 alphanumeric characters. Select Next to continue.

image

Now select the Azure subscription you want the billing tied to. Then select the Resource Group that you want to use; this must already exist in your Azure subscription, as I noted earlier. Finally, select the region and press Next to continue.

image

Now add any environments you want covered by this billing policy and press Next to continue.

image

Here select Create Billing Policy.

image

The policy should now be created and displayed as shown above.

You can create additional billing policies if you wish by simply repeating the above process. Doing so allows you to tie each policy to a different Azure subscription and/or Resource Group for billing and management if needed.

For more details on the Power Platform PAYG option see:

Set up pay-as-you-go

CIAOPS Need to Know Microsoft 365 Webinar – March

image

Join me for the free monthly CIAOPS Need to Know webinar. Along with all the Microsoft Cloud news we’ll be taking a look at Power BI.

Shortly after registering you should receive an automated email from Microsoft Teams confirming your registration, including all the event details as well as a calendar invite.

You can register for the regular monthly webinar here:

March Webinar Registrations

(If you are having issues with the above link copy and paste – https://bit.ly/n2k2203 – into your browser)

The details are:

CIAOPS Need to Know Webinar – March 2022
Friday 25th of March 2022
11.00am – 12.00pm Sydney Time

All sessions are recorded and posted to the CIAOPS Academy.

The CIAOPS Need to Know Webinars are free to attend but if you want to receive the recording of the session you need to sign up as a CIAOPS patron which you can do here:

http://www.ciaopspatron.com

or purchase them individually at:

http://www.ciaopsacademy.com/

Also feel free at any stage to email me directly via director@ciaops.com with your webinar topic suggestions.

I’d also appreciate you sharing information about this webinar with anyone you feel may benefit from the session and I look forward to seeing you there.

Automated user tenant access control

image

If you have ever used an on-premises Active Directory (AD), you may be aware of the setting, shown above, that allows you to set users’ logon hours. This was typically done to prevent users logging in after hours, say from 9pm to 6am.

Unfortunately, with Azure AD there is no direct equivalent setting but we can create something similar quickly and easily using Power Automate.

image

The first step in this process is to create a new Azure AD security group that will contain the users who will be prevented from accessing the tenant. You can also create this security group in the Microsoft 365 portal but it is better to do it in Azure as you’ll need to get the ObjectID for this group as highlighted above. There is also no need to actually put any users into this group as they’ll be added dynamically by Power Automate.
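If you would rather script the group creation, a rough sketch with the Az PowerShell module might look like this. The display name and mail nickname below are examples only:

# Create a new Azure AD security group and capture its ObjectID
# (the display name and mail nickname are example values)
$group = New-AzADGroup -DisplayName "Blocked Sign-in Hours" -MailNickname "blockedsigninhours"
$group.Id    # this is the ObjectID you'll need later in the Flow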

image

To prevent access to the tenant you’ll need to create a Conditional Access policy as shown above. This will require you to have a license for Azure AD Premium P1 or P2 for each user. Microsoft 365 Business Premium already includes Azure AD P1, so if that is already in the environment you need nothing additional.

Give this new Conditional Access policy a name and set it to include just the Azure AD security group created previously. It is also best practice to exclude at least one administrator account to prevent you from being ‘locked out’ of your tenant. Ensure that All cloud apps is selected and that Block access is configured, as shown above. Finally, turn this Conditional Access policy On.

With no members in this Azure AD Security group, no one will be restricted from accessing the tenant.
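For reference, a blocking policy like this can also be created programmatically via the Microsoft Graph API. The following is only a rough sketch, assuming you already have a Graph access token with the Policy.ReadWrite.ConditionalAccess permission; the token, group ID and admin ID are placeholders:

# Placeholder values - replace with your own IDs and token
$token   = "<graph-access-token>"
$groupId = "<objectid-of-blocking-group>"
$adminId = "<objectid-of-excluded-admin>"

$policy = @{
    displayName = "Block after hours sign in"
    state       = "enabled"
    conditions  = @{
        clientAppTypes = @("all")
        users          = @{ includeGroups = @($groupId); excludeUsers = @($adminId) }
        applications   = @{ includeApplications = @("All") }
    }
    grantControls = @{ operator = "OR"; builtInControls = @("block") }
}

# Create the Conditional Access policy
Invoke-RestMethod -Method Post -Uri "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies" `
    -Headers @{ Authorization = "Bearer $token" } -ContentType "application/json" `
    -Body ($policy | ConvertTo-Json -Depth 5)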

image

Next, create a List in SharePoint that contains a list of users you want to be blocked. You can achieve this by adding the Person field to the list in question. This will basically allow you to enter users who are in your tenant by doing a lookup from Azure AD.

image

Create a new Flow and use the Recurrence trigger to start it as shown above. Select an appropriate time once a day when users will be prevented from accessing the tenant, say 9pm.

image

Next, add the Get items action as shown above. Configure this action to retrieve the users from the SharePoint list just created.

image

Then use the Apply to each action to loop through all the users returned by this Get items action as shown above.

image

Inside the Apply to each action, use the Get user profile (V2) action as shown, with the user’s email address, which is effectively their Azure AD identity.

image

Also inside the Apply to each action add the Add user to group action as shown above. Populate this action with the ObjectID of the Azure AD security group obtained at the start and the Id from the Get user profile (V2) action.
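Behind the scenes, the Add user to group action is roughly equivalent to the Microsoft Graph ‘add member’ call. Here is a hedged PowerShell sketch of the same operation, with the token and IDs as placeholders:

# Placeholder values
$token   = "<graph-access-token>"
$groupId = "<objectid-of-blocking-group>"
$userId  = "<id-from-get-user-profile>"

# Add the user as a member of the security group
$body = @{ "@odata.id" = "https://graph.microsoft.com/v1.0/directoryObjects/$userId" } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "https://graph.microsoft.com/v1.0/groups/$groupId/members/`$ref" `
    -Headers @{ Authorization = "Bearer $token" } -ContentType "application/json" -Body $body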

image

Now test the Flow manually to ensure it works correctly.

image

Once the Flow completes, the users in the list in SharePoint should now appear in the Azure AD security group as shown above.

image

Now when a user on that list attempts to log in, because they are part of a security group included in a blocking Conditional Access policy, they will no longer have access to the tenant.

image

To allow access again, simply create another scheduled Flow, executing at, say, 6am, that uses the Get group members action and then a Remove member from group action inside an Apply to each action, as shown above. In essence, this removes all users from the Azure AD security group used by the blocking Conditional Access policy, so those users are no longer blocked from accessing the tenant.
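Again, roughly the same thing could be done with the Microsoft Graph API directly. A sketch, using the same placeholders as before and ignoring result paging for brevity:

# Placeholder values
$token   = "<graph-access-token>"
$groupId = "<objectid-of-blocking-group>"
$headers = @{ Authorization = "Bearer $token" }

# Get the current members of the group, then remove each one
$members = Invoke-RestMethod -Method Get -Uri "https://graph.microsoft.com/v1.0/groups/$groupId/members" -Headers $headers
foreach ($member in $members.value) {
    Invoke-RestMethod -Method Delete `
        -Uri "https://graph.microsoft.com/v1.0/groups/$groupId/members/$($member.id)/`$ref" `
        -Headers $headers
}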

image

Run this Flow manually and ensure it completes,

image

and you should find that the Azure AD security group is now empty, as shown above.

image

The users in the list who were previously blocked should now be able to access the tenant as normal. Left to run on its schedule, this automation will block access for users in the SharePoint list from 9pm to 6am each day.

This automation is very quick and easy to set up. It can solve the challenge of ‘forcing’ users to take a break from work after hours, rather than always being ‘on’, aiding their mental health and making them more productive when they do work. It could also be used to improve security by allowing accounts to operate only during ‘business hours’, limiting exposure after hours, which is when many attacks happen.

This process could be extended and enhanced to provide more granular options, as well as alerting, to suit any need. However, hopefully, it demonstrates how easy it is to solve business challenges thanks to the Power Platform and the integration of Microsoft 365. Remember, the only extra you need is Azure AD Premium P1 to enforce Conditional Access, something already included in Microsoft 365 Business Premium.


Get a list of devices from Defender for Business into a SharePoint list

image

One of the great things about an API is that it can be used in many places. I have already shown how to:

Offboard devices from Microsoft Defender for Business using an API with PowerShell

and I can do something similar with the Power Platform.

The first step in that process is to get a list of Microsoft Defender for Endpoint devices and put them into a pre-existing list in SharePoint. For that I use the above Flow.

image

Once the Flow has been triggered, I grab the Azure AD application credentials from the Azure Key Vault. I’ve covered how to create an Azure AD application here:

https://blog.ciaops.com/2019/04/17/using-interactive-powershell-to-access-the-microsoft-graph/

and using a PowerShell script I wrote here:

https://blog.ciaops.com/2020/04/18/using-the-microsoft-graph-with-multiple-tenants/

Getting the Azure AD application credentials into an Azure Key Vault can be done manually or by using this scripted process I’ve covered previously:

Uploading Graph credentials to Azure Key Vault

Once they are in the Azure Key Vault they are easy to access securely using the Flow action Get secret as shown above.
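For comparison, the same secrets can be read outside of a Flow with a couple of lines of PowerShell. This assumes the Az.KeyVault module and uses example vault and secret names, not the actual ones used here:

# Requires the Az.KeyVault module; vault and secret names below are example values
Connect-AzAccount
$clientId     = Get-AzKeyVaultSecret -VaultName "my-vault" -Name "AppClientId" -AsPlainText
$clientSecret = Get-AzKeyVaultSecret -VaultName "my-vault" -Name "AppClientSecret" -AsPlainText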

image

The next step is to delete devices I already have in the list in SharePoint because I want only current devices to be brought in. To achieve this, I get all the items from my destination SharePoint list using the Get items action. Then, using the Apply to each action and the Delete item action inside that loop, existing entries will be removed so I have a clean list.
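If you wanted to do the same clean-up outside of the Flow, a sketch with PnP PowerShell might look like this. The site URL and list name are hypothetical and would need to match your own environment:

# Requires the PnP.PowerShell module; site URL and list name are example values
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/automation" -Interactive
Get-PnPListItem -List "Devices" | ForEach-Object {
    Remove-PnPListItem -List "Devices" -Identity $_.Id -Force   # clear out existing rows
}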

image

I’ll now use the HTTP action to execute an API call to the Defender environment as shown above. The API endpoint URI to get a list of devices in Defender for Endpoint is:

https://api.securitycenter.microsoft.com/api/machines

Access is granted via Active Directory OAuth and the Authority is https://login.microsoftonline.com. You also need to use the credentials of the Azure AD application obtained previously from the Azure Key Vault, as shown above. Ensure that the Audience is https://api.securitycenter.microsoft.com/.
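To illustrate what the HTTP action is doing, here is a minimal PowerShell sketch of the same request. It assumes the Azure AD application has been granted an appropriate Defender for Endpoint API permission (such as Machine.Read.All); the tenant and credential values are placeholders:

# Placeholder tenant and app credentials (e.g. the values retrieved from Key Vault)
$tenantId     = "<tenant-id>"
$clientId     = "<app-client-id>"
$clientSecret = "<app-client-secret>"

# Request a token for the Defender for Endpoint API
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body @{
        client_id     = $clientId
        client_secret = $clientSecret
        scope         = "https://api.securitycenter.microsoft.com/.default"
        grant_type    = "client_credentials"
    }

# Call the machines endpoint and show a few fields from each device
$machines = Invoke-RestMethod -Uri "https://api.securitycenter.microsoft.com/api/machines" `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" }
$machines.value | Select-Object computerDnsName, osPlatform, lastSeen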

image

The output of this API request will be JSON, so we now use the Parse JSON action to obtain the fields needed. To understand what the JSON looks like, and to insert a sample into this action, look at the Microsoft documentation here:

List machines API

which provides a response sample that you can use.

image

The last action in the Flow takes the parsed JSON output and enters those details into the pre-existing SharePoint list that was created to house this information.

image

I’ve kept the destination list simple, as you can see above. Basically, the final Apply to each action places each device and its information as a row into the destination SharePoint list.
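Continuing the PowerShell sketch from the HTTP action step, the equivalent of that final loop might look something like this. The list and column names are illustrative only and would need to match your own SharePoint list:

# Assumes Connect-PnPOnline has already been run (see the earlier clean-up sketch)
# and that $machines holds the response from the machines API call above
foreach ($machine in $machines.value) {
    Add-PnPListItem -List "Devices" -Values @{
        "Title"    = $machine.computerDnsName
        "Platform" = $machine.osPlatform
        "LastSeen" = $machine.lastSeen
    }
}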

image

If I now run this Flow, I see it runs successfully.

image

Looking at my SharePoint list I see I have a new list of items as expected.

image

If you weren’t aware, the ‘eyelashes’ on an entry in SharePoint indicate it is new.

Now I have a copy of all the machines in my Defender for Endpoint environment in a SharePoint list. You will also see that my SharePoint device list contains an additional ‘Offboard’ column that I am going to use when I implement another Flow to offboard devices from Defender for Endpoint, much like I did with PowerShell previously.

You can also easily extend the operation across multiple tenants if you want, using an Azure AD application in each.

The great thing about using the Power Platform and APIs is that, for many, it is much easier to get the result they want than having to write code such as PowerShell. Also, the Power Platform has many capabilities, such as sending emails, adding extra metadata and so on, that are much easier to use than PowerShell. Once the Defender for Endpoint device list is in SharePoint, there is really no end to what could be done.

With that in mind, stay tuned for an upcoming post on how to use what’s been done here and another Flow to actually offboard devices from Defender for Endpoint.

Celebrating anniversaries with Power Automate

A very common requirement is to remind people about anniversaries. In a business this could take the form of birthdays or commencement dates. It could, however, just as easily be any sort of event that happens on a certain date.

Previously, I’ve shown how to:

Send recurring tweets using Microsoft Flow

However, in this case, instead of simply rotating through a list of posts we want to match today’s date to a date on a list and then broadcast the message that corresponds to that date entry.

image

The starting point for this process is to create a reference list containing the dates and details you wish to share. I recommend a SharePoint list as the easiest place to do this, as shown above. Of course, this list can contain as much detail and as many additional columns as you wish, but for this example I’ll keep it simple and just have two fields. It is important that you have at least one column (here Dateoption) that holds the date, in the current year, on which that item will be displayed.

image

For simplicity, I have also configured my date column in the SharePoint list to exclude time and display in standard format as shown above. There is nothing stopping you from using Date & Time if you wish; it just makes the filtering a little more complex later in this process.

You’ll then want to create a Power Automate Flow that looks like this:

image

It all starts with a Recurrence trigger that will fire this process once a day like so:

image

If you select the Show advanced options in this action like so:

image

You can set an exact time when this Recurrence trigger will fire (say 10am). However, since this example is a daily anniversary, it only needs to be triggered once at any time during the day.

image

We now need our process to determine what the current date is and we can do this using the Current time action as shown above.

image

Next, add the Convert time zone action. There are two reasons for adding this action. Firstly, the Current time action returns today’s date in UTC, which may cause issues if, like me, you are not in that time zone. I want the current time, but as a local value (i.e. reflecting the actual time in Sydney, Australia), hence the Source and Destination time zone field settings.

The second reason for the Convert time zone action is so the time value is in the right format for a comparison test later on in the process. Thus, the Format string field should be set to yyyy-MM-dd as shown.
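To make the conversion concrete, here is roughly the same logic sketched in PowerShell, using the Windows time zone ID for Sydney:

# Convert UTC now to Sydney local time, formatted yyyy-MM-dd
$utcNow    = [DateTime]::UtcNow
$localTime = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId($utcNow, "AUS Eastern Standard Time")
$today     = $localTime.ToString("yyyy-MM-dd")   # e.g. 2022-03-25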

Now, I need to add the Get items action to actually go and look at what is in my SharePoint list.

image

In this action I enter the Site Address and List Name; however, I also expand the advanced options to reveal the Filter Query field as shown.

image

The Filter Query field limits the items returned by this action to only those that match the filter. I want the returned items to be only those that match today’s date, which I have already formatted and stored in the Converted time result. So the filter compares the date field from my SharePoint list (here Dateoption) to the Converted time result. It is important to note that the Converted time result is enclosed in single quotes (') so the value is treated as a string for comparison; if you don’t do this, you’ll get errors when your Flow runs.
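For example, if the Flow ran on 25 March 2022, the Filter Query evaluated at run time would look something like:

Dateoption eq '2022-03-25'

with the date value coming from the Converted time output.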

image

With the values that the Get items action returns, you’ll need to perform a number of steps. For this, use the Apply to each action as shown above.

image

In the case of this example, I’m simply going to post the text from the Title field of the SharePoint item that matches today’s date into a chat message as shown above. Again, this action could be anything you want; in fact, I’ll talk about how I use this with Twitter later on. For now, the expectation is that if there is a match in the SharePoint list for today’s date, then the text for that entry will appear in a Teams chat message.

image

We need to do one more thing before we are finished here. As this is an anniversary calendar, we want to increment the current item for today so it recurs on the same date next year. To do that, we use the Add to time action to add 12 months to the date we determined, as shown above.

image

Before the new date can be added back to the SharePoint list it needs to be formatted correctly. This is achieved using the Compose action, as shown, with the following expression:

formatDateTime(body('Add_to_Time'),'yyyy-MM-dd')

You’ll notice that the date format yyyy-MM-dd is the same as the one we set in the Convert time zone action earlier.

image

All that remains is to use the Update item action to update the item in the SharePoint list with the new date just composed. As shown above, the same SharePoint site and list are selected, along with the item ID and Title, but the Dateoption field is set to the newly formatted date output from the previous action.

You can now save your Flow and run a manual test.

image

If I look at the chat in my Team I see the expected message that matches the item Title field in the original SharePoint list.

image

Also, looking at my original SharePoint list, I see that the date of today’s item has been incremented by twelve months as shown.

image

One of the ways that I use this process with Twitter is to regularly post anniversary dates around ANZAC participation from World War One, which are taken from my site ANZACS in France.

The idea is that the Flow checks this list of dates and then tweets out the text in the Title field if there is a match. Then it increments the PostDate field by twelve months, ready for next year. You’ll also see that I have added another custom column that records the original date of the action, just so I can filter and sort easily. Feel free to follow @ANZACSIF to be reminded of these dates.

As I initially mentioned, I believe there are plenty of applications for this type of process in a business. The most common ones I would suggest are staff birthday and anniversary reminders. The great thing is that with Power Automate it is easy to modify this process to suit whatever need you have. It also makes it easy to edit the events and more if you need to, because all you have to do is modify the SharePoint list that this process uses.

The possibilities are endless thanks to the Power Platform.

Power Automate ODATA filter failure when field named ‘Date’

image

So, I was doing some testing with a new Flow in Power Automate. What I wanted it to do was, at a recurring time each day, look for today’s date in a list of SharePoint items and then display other values from any matching record in a Teams chat. To prototype this, I created a very simple list with two columns, Title and Date, as shown above. Remember, the Title field is generally created for you by default when you create a basic SharePoint list.

image

In Power Automate I used a SharePoint Get items action, as shown above, to get the information I wanted. To filter down to the data I used an ODATA query like:

Date eq '2021-12-31'

to test. The problem was, as shown above, that no results were feeding through to the Apply to each action that followed directly after.

There were no errors indicated in my Flow. I tried a number of different format options and so on, trying to work out what the issue was.

image

The issue turned out to be the name of the field – Date – that I had created in my SharePoint list! Once I created a new column called Dateoption with the same format, entered the same data into it and removed the offending Date column, the Flow successfully filtered data as expected and passed the results to the following Apply to each action, as you can see above.

The moral of the story is that you should avoid naming your fields after ‘reserved’ terms like ‘Date’. Make the name something unique like ‘Datefield’. If you use a common term like ‘Date’, as I did, you might struggle to troubleshoot the resulting issues.

Hopefully, this will save you from wasting the time I did solving this, time better spent creating your Flow!