Incentivising Microsoft Teams contributions with Power Automate

One of the most important parts of technology adoption is incentivising people. One simple way to do this is to acknowledge their contributions in Microsoft Teams. I’ve spoken about this before, using something like:

Custom Praise badges in Microsoft Teams

The challenge there is that it is a manual process. I recently wondered how the same kind of thing could be achieved using automation.

What if you could automatically determine the number of chat messages that someone posted into Microsoft Teams, and then acknowledge the highest contributors with a custom ‘thank you’ message? Here’s how I managed to do just that using the Microsoft Graph and Power Automate.

The core information needed about messages posted in Microsoft Teams will come from the Microsoft Graph, specifically:

getTeamsUserActivityUserDetail

The first step in this process will be to create an Azure AD application in the tenant, which I have detailed previously:

https://blog.ciaops.com/2019/04/17/using-interactive-powershell-to-access-the-microsoft-graph/

The credentials for this Azure AD application will then need to be uploaded into Azure Key Vault, as I have covered here:

Uploading Graph credentials to Azure Key Vault

You’ll also need to set the API permissions for this application to:

image

Reports.Read.All

You’ll need to trigger your Flow in the way that suits you; typically you’d schedule it to run on the first of each month. You’ll then need to use the Get secret action from the Azure Key Vault connector as shown above. You should also note that this is a premium connector, so you’ll need an appropriate Power Platform license to use it.

image

Next comes the HTTP action, which is also a premium connector. The URI to use here will be:

https://graph.microsoft.com/beta/reports/getTeamsUserActivityUserDetail(period='D30')?$format=application/json

The D30 parameter determines how many previous days to consider. According to the documentation, the valid options for the number of days are 7, 30, 90 and 180.

The security parameters come from the Azure Key Vault Get secret actions.
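
If you use the HTTP action’s built-in Active Directory OAuth authentication, the settings will look roughly like the following. The values shown are placeholders only, with the sensitive ones coming from the Get secret outputs:

Authentication: Active Directory OAuth
Tenant: <your Azure AD tenant ID>
Audience: https://graph.microsoft.com
Client ID: <the application (client) ID of your Azure AD application>
Credential Type: Secret
Secret: <the value returned by the Get secret action>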

image

The data returned from this HTTP action will be in JSON format, so that is fed into the Parse JSON action as shown above.

One of the tricky things about parsing JSON is getting the right schema. When I first attempted this I received the error:

"message": "Invalid type. Expected String but got Null.",

After some digging I found that the lastActivityDate parameter sometimes returned a string and sometimes a null value. Thus, I needed to modify the original JSON schema for that parameter to accommodate both potential results:

"lastActivityDate": {
    "type": [
        "string",
        "null"
    ]
},

Here is the whole working JSON schema I used:

{
    "type": "object",
    "properties": {
        "value": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "reportRefreshDate": {
                        "type": "string"
                    },
                    "userId": {
                        "type": "string"
                    },
                    "userPrincipalName": {
                        "type": "string"
                    },
                    "lastActivityDate": {
                        "type": [
                            "string",
                            "null"
                        ]
                    },
                    "isDeleted": {
                        "type": "boolean"
                    },
                    "deletedDate": {},
                    "assignedProducts": {
                        "type": "array"
                    },
                    "teamChatMessageCount": {
                        "type": "integer"
                    },
                    "privateChatMessageCount": {
                        "type": "integer"
                    },
                    "callCount": {
                        "type": "integer"
                    },
                    "meetingCount": {
                        "type": "integer"
                    },
                    "meetingsOrganizedCount": {
                        "type": "integer"
                    },
                    "meetingsAttendedCount": {
                        "type": "integer"
                    },
                    "adHocMeetingsOrganizedCount": {
                        "type": "integer"
                    },
                    "adHocMeetingsAttendedCount": {
                        "type": "integer"
                    },
                    "scheduledOneTimeMeetingsOrganizedCount": {
                        "type": "integer"
                    },
                    "scheduledOneTimeMeetingsAttendedCount": {
                        "type": "integer"
                    },
                    "scheduledRecurringMeetingsOrganizedCount": {
                        "type": "integer"
                    },
                    "scheduledRecurringMeetingsAttendedCount": {
                        "type": "integer"
                    },
                    "audioDuration": {
                        "type": "string"
                    },
                    "videoDuration": {
                        "type": "string"
                    },
                    "screenShareDuration": {
                        "type": "string"
                    },
                    "hasOtherAction": {
                        "type": "boolean"
                    },
                    "urgentMessages": {
                        "type": "integer"
                    },
                    "postMessages": {
                        "type": "integer"
                    },
                    "replyMessages": {
                        "type": "integer"
                    },
                    "isLicensed": {
                        "type": "boolean"
                    },
                    "reportPeriod": {
                        "type": "string"
                    }
                },
                "required": [
                    "reportRefreshDate",
                    "userId",
                    "userPrincipalName",
                    "lastActivityDate",
                    "isDeleted",
                    "deletedDate",
                    "assignedProducts",
                    "teamChatMessageCount",
                    "privateChatMessageCount",
                    "callCount",
                    "meetingCount",
                    "meetingsOrganizedCount",
                    "meetingsAttendedCount",
                    "adHocMeetingsOrganizedCount",
                    "adHocMeetingsAttendedCount",
                    "scheduledOneTimeMeetingsOrganizedCount",
                    "scheduledOneTimeMeetingsAttendedCount",
                    "scheduledRecurringMeetingsOrganizedCount",
                    "scheduledRecurringMeetingsAttendedCount",
                    "audioDuration",
                    "videoDuration",
                    "screenShareDuration",
                    "hasOtherAction",
                    "urgentMessages",
                    "postMessages",
                    "replyMessages",
                    "isLicensed",
                    "reportPeriod"
                ]
            }
        }
    }
}

image

I then filtered the results, firstly to remove users with zero messages and then to remove specific users (in this case me), using the two Filter actions shown above.

image

One of the limitations I found during this development process was that Flows are not very good at sorting information. It would have been nice if I could have just sorted my results from largest to smallest. I solved this by firstly creating a temporary variable into which I would store the highest number of chat messages. I then looped through all the remaining results using the Apply to each action shown above, and set this variable to the greatest value found throughout the results.
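
As a rough sketch, inside that loop a Condition action can compare each item against the variable with an expression along these lines (the action and variable names here are illustrative, and the variable is initialised to zero beforehand):

greater(items('Apply_to_each')?['teamChatMessageCount'], variables('MaxMessages'))

If the condition is true, a Set variable action then updates MaxMessages with items('Apply_to_each')?['teamChatMessageCount'].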

image

I then employed another Filter action to remove any results that didn’t match this maximum value.
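
In the Filter array action’s advanced mode, that condition can be expressed as something like (again, the names are illustrative):

@equals(item()?['teamChatMessageCount'], variables('MaxMessages'))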

The next challenge came when considering external users, who have a UPN in the format of:

user_domain.com#EXT#@tenant.onmicrosoft.com

It turns out that the Power Automate user lookup actions don’t seem to work with this type of UPN; apparently they only work with an email address!

image

To solve this, I created two more variables to use when extracting the external user’s email address.

image

I needed to extract and create the user’s email address before using it in the Search for users (V2) action. To do this I used two functions inside two Set variable actions. The first is:

slice(items('Apply_to_each_2')?['userPrincipalName'],0,lastIndexOf(items('Apply_to_each_2')?['userPrincipalName'],'#EXT'))

which removes everything from #EXT onwards. This typically leaves the format:

user_domain.com

The second function is:

replace(variables('EmailAddress'),'_','@')

which replaces the underscore character with an “at” symbol. The output after both of these functions complete is the user’s full email address, which I can then feed into the Search for users (V2) action to locate my user(s).
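
If you prefer, the two functions can also be combined into a single expression that produces the same result (a sketch only, using the same action name as above):

replace(slice(items('Apply_to_each_2')?['userPrincipalName'],0,lastIndexOf(items('Apply_to_each_2')?['userPrincipalName'],'#EXT')),'_','@')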

image

Now that I have the user information, I can use it with the Get an @mention token for a user action, and then post a message containing that @mention (and anything else) into the Team using the Post message in a chat or channel action, to publicly recognise the user(s).

image

The end result is a nice message, like the one shown above, that acknowledges the user(s) for their contribution in the Microsoft Team. The best part is that, now it is automated, it’ll do all the hard work for you going forward!

The moral of the story here is that Power Automate combined with the Microsoft Graph can achieve just about anything you want. Yes, it does require a little bit of setting up with an Azure AD application, Key Vault and so on, but these are typically one-off tasks. There are also some advanced techniques when it comes to parsing the JSON, but once you start working with it, it becomes easier. In short, none of that should put you off what is possible with the Microsoft Cloud environment: put the technology to work for you to give you more freedom and improve your business processes.

Defender for Endpoint remediation levels

If you read the Microsoft documentation:

Automation levels in automated investigation and remediation capabilities

you’ll find that there are five different levels of remediation automation you can set:

– No automated response

– Semi – require approval for all folders

– Semi – require approval for non-temp folders

– Semi – require approval for core folders

– Full – remediate threats automatically

which are all detailed here:

Levels of Automation

Note:

Full automation is recommended and is selected by default for tenants that were created on or after August 16, 2020 with Microsoft Defender for Endpoint, with no device groups defined yet.

Thus, Automation levels rely on Device Groups in Defender for Endpoint.

image

You see this when you create a Device Group as shown above.

image

With Defender for Endpoint P2 you find Device Groups via https://security.microsoft.com | Settings | Endpoints | Device groups as shown above.

image

However, with Defender for Business (above), you’ll see that there are currently no options for Device Groups. This basically means that all remediation will be performed automatically.

I don’t think that it is really a problem, but it is another difference between Defender for Endpoint P2 and Defender for Business. I have not tested Defender for Endpoint P1, but I assume that it has the same lack of Device Groups as Defender for Business; however, I would need to check to be 100% sure.

Adoption with fun

*** UPDATE April 2023 – Dilbert has moved behind a paywall and is no longer available. See my updated post Adoption with fun and astronomy for alternate engaging content.

The majority of IT products and services are not actually used by IT people (amazing eh?). They are in fact used by ordinary people (aka Muggles) in businesses, trying to do their job. For these people, changing the way that they work is frustrating because they need to adopt new approaches and tools. Helping with this adoption is, I believe, key to the success of modern approaches to IT.

A handy technique that I have found to work well is to make using new systems fun. In the distant past, when I was implementing SharePoint on premises, I used to implement the Daily Dilbert web part to post a Dilbert cartoon onto the front page of the SharePoint Intranet each day. The idea was to help drive adoption by getting people to visit the company Intranet to read the Dilbert comic and then, hopefully, dive into the other content that was there.

Today, the technology has changed but the adoption challenge hasn’t. I thought that I’d therefore share with you a way to get a Dilbert comic into your Teams channel daily using Power Automate.

This is all made possible via APIs and a suitable one I found is:

https://dilbert-api.glitch.me/json

which will produce an output that looks like this:

{"title":"Simulation TestingElbonia University Partial Win","image":"https://assets.amuniversal.com/4f2025a02e0d013a8769005056a9545d.png"}

In here you’ll see an image link to the Dilbert Cartoon.

Step one is to create a new Flow that is triggered at a recurring time.

image

Next, you want to add the HTTP action. In here, use the GET method and set the URI to the above API link, as shown above.

The HTTP action is actually a ‘premium’ connector and may not be available to you by default. Thus, you may need an upgraded Power Platform license to have this available. Remember however, you’ll only need that license for the user creating and running that Flow.

image

You’ll then need to add the Parse JSON action as shown above. The Content here will be the Body from the HTTP action above; to create the schema, simply copy and paste the output of the API above into the Generate from sample option.
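
For reference, using the sample output shown earlier, Generate from sample should produce a schema along these lines:

{
    "type": "object",
    "properties": {
        "title": {
            "type": "string"
        },
        "image": {
            "type": "string"
        }
    }
}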

image

Now add the Post message in a chat or channel action.

image

Enter the options to post into the Team and Channel of your choice, as shown above.

image

For the Message field select the </> option from the menu bar across the top, as shown. This will allow you to use raw HTML code here.

Type the following:

<img src = "

then select the option to insert dynamic content like so:

image

(the lightning bolt icon)

image

In the list that appears you should be able to select image, as shown above.

image

Add the following text after the dynamic field:

" width="738" height="229">

so the completed Message field looks like:

image
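
In plain text, the completed field is effectively the following, where the middle token is the image dynamic content (the Parse JSON action name here is an assumption):

<img src = "@{body('Parse_JSON')?['image']}" width="738" height="229">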

It is important that the HTML formatting is correct, otherwise the image will not display.

image

If you now test your Flow you should see the cartoon appear in your Teams channel as shown above. If you have scheduled your Flow daily then you should see a new comic every day. Remember, there is only one cartoon every 24 hours! Rerunning the Flow before then will simply display the same strip.

When the daily comic is more than three frames then it is cut off by default like so:

image

However, clicking on the comic will enlarge it for full viewing. This limitation is due to the height and width parameters in the HTML code used inside the Flow. Most strips are only three frames, which is why I chose those height and width defaults to give the best readability most of the time, but you can vary those parameters if you wish.

So, the idea is to make a Team a more fun place to visit regularly, hopefully with people engaging with the content to help drive adoption.

This Flow/API method can be utilised with just about anything that supports an API. Another I have found (although somewhat more risqué) is a Chuck Norris API here:

https://api.chucknorris.io/

which can be moulded to give a similar result (be it text only).

The only limitation of all of this is the need for the premium Flow HTTP action, but as I said, it is well worth the investment and is only really necessary for the user creating the Flow. Having a premium license for Flow opens up so many more capabilities, so it is highly recommended if you want to get serious about automation inside your environment.

Happily, Daily Dilbert is back baby! And now in Microsoft Teams.

Power Platform Community November webinar–Sessions

As mentioned here:

Power Platform Community November webinar

we had some issues with the screen recordings. The presenters graciously agreed to re-record each of their sessions, and the recordings are now available for viewing here:

Patron Power Platform Community November 2021 Webinar – Session 1 – YouTube

Patron Power Platform Community November 2021 Webinar – Session 2 – YouTube

Patron Power Platform Community November 2021 Webinar – Session 3 – YouTube



Connect to Microsoft 365 using PowerShell

Once you have set up your PowerShell environment, the next thing to do is use it to connect to Microsoft 365 services like Exchange Online and Teams.

I have created several free automation scripts at:

https://github.com/directorcia

to make that process easy.

In this video, I’ll walk you through the steps of using what I have created to make it simple to connect to any Microsoft 365 service using PowerShell quickly and easily.

Here is a direct link to the video:

https://www.youtube.com/watch?v=c1PwAbzM8RI
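
For context, connecting to a couple of these services manually looks something like the commands below; the scripts simply automate this kind of thing for you (the UPN shown is a placeholder):

Connect-ExchangeOnline -UserPrincipalName admin@yourtenant.onmicrosoft.com
Connect-MicrosoftTeams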

Setting up PowerShell for use with Microsoft 365

This video will show you the process of setting up PowerShell on a new, clean Windows 10 environment to support working with Microsoft 365. Basically, you grab my free setup script here:

https://github.com/directorcia/Office365/blob/master/o365-setup.ps1

and paste that into an elevated PowerShell window and run it. The required PowerShell cloud modules will then be installed into your environment, making it ready to connect to Microsoft 365, Azure, Intune, etc.
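
As an illustrative sketch only (the actual script maintains its own list of modules and checks), the commands involved are of this form:

# Run from an elevated PowerShell prompt
Install-Module -Name ExchangeOnlineManagement -Force
Install-Module -Name MicrosoftTeams -Force
Install-Module -Name Microsoft.Online.SharePoint.PowerShell -Force
Install-Module -Name Az -Force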

Here is a direct link to the video:

https://www.youtube.com/watch?v=UUb_lYxvlOc

Azure Cloud Shell now available in Microsoft 365

image

If you take a close look at your Microsoft 365 admin center, as shown above, you might see a new icon in the top right. The Azure Cloud Shell is now available right from here.

image

If you select that, you’ll then see a PowerShell-style window appear at the bottom of the page, as shown above. Here you can run all your favourite scripts directly in a browser!
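
For example, because the shell is already authenticated against your tenant, something like the following should work straight away (the Exchange Online module is typically pre-installed in Cloud Shell):

Get-AzSubscription
Connect-ExchangeOnline
Get-Mailbox -ResultSize 5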

I’ve covered the Azure Cloud Shell in previous articles:

Azure Cloud Shell

Connecting to Exchange Online with Azure Cloud Shell

and you can read the Microsoft documentation here:

Overview of Azure Cloud Shell

The only limitation, it seems to me, is that you need an active Azure subscription tied to your Microsoft 365 environment, because Azure Cloud Shell needs some storage to operate. But who doesn’t have an Azure subscription in their tenant these days, right?

Deploy Office 365 and Azure together

(Hint, this is another reason to ALWAYS sell an Azure subscription when you sell Microsoft 365 if you are a reseller).

Hopefully, Microsoft might allow some included storage in the future for those without an Azure subscription.

Having the ability to run PowerShell directly from the browser with Microsoft 365 is a super handy addition, and hopefully the functionality will keep being extended.

PowerShell with Azure Conditional Access

Recently, I did a video demonstrating how PowerShell can be used to automate Endpoint Management:

PowerShell with Endpoint Manager

I’ve now also created a video demonstrating how to automate Azure Conditional Access using PowerShell. As before, I am only making these scripts available via the CIAOPS Patron program.

In this video you’ll see me automatically back up both Conditional Access locations and policies, then apply best practice locations and policies, and finally restore the original policies, all using scripting.
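
To give a flavour of the approach (this is not the patron script, just a minimal sketch using the Microsoft Graph PowerShell module), backing up the current Conditional Access locations and policies to JSON can be as simple as:

Connect-MgGraph -Scopes "Policy.Read.All"
Get-MgIdentityConditionalAccessNamedLocation | ConvertTo-Json -Depth 10 | Out-File .\ca-locations-backup.json
Get-MgIdentityConditionalAccessPolicy | ConvertTo-Json -Depth 10 | Out-File .\ca-policies-backup.json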

Again, these scripts are not free and are part of the CIAOPS Patron program. You’ll find my free stuff at https://www.github.com/directorcia.