Adoption with fun and astronomy

A while back I detailed how to schedule a Dilbert comic to appear daily in a Microsoft Teams channel:

Adoption with fun

Sadly, Dilbert has moved behind a paywall which means that process no longer works. As such, I have been searching for a suitable replacement and have settled on the Astronomy Picture of the Day from NASA.

The basic concept from the Dilbert process is the same. This process also requires a premium Power Automate connector, which you can easily configure using either a Power Platform Premium license or the Power Platform pay-as-you-go (PAYG) configuration with Azure, which I have shown previously.

image

The starting process is to create a new Scheduled Cloud Flow and select the time when you wish that Flow to execute.

image

You will then need to add an HTTP action as shown above. This is the premium connector mentioned previously. The HTTP action will need to use the GET method for the URI:

https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY

image

Open that URI in a new browser tab and you should see some JSON returned as shown above. Copy all of it.
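To get a feel for what the Parse JSON action will be doing with this response, here is a minimal Python sketch of the same extraction. The sample below is a trimmed, placeholder version of an APOD response, not a real API result:

```python
import json

# Trimmed sample of the JSON returned by the APOD endpoint
# (placeholder values for illustration only).
sample = '''
{
  "date": "2023-04-01",
  "explanation": "A sample explanation of the image.",
  "hdurl": "https://apod.nasa.gov/apod/image/sample_hd.jpg",
  "media_type": "image",
  "title": "A Sample Title",
  "url": "https://apod.nasa.gov/apod/image/sample.jpg"
}
'''

apod = json.loads(sample)

# These are the three fields the Teams message will use later.
print(apod["title"])
print(apod["hdurl"])
print(apod["explanation"])
```

The Flow does exactly this, just declaratively: Parse JSON turns the raw text into named fields you can reference in later actions.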

image

Next add the Parse JSON action to the Flow. Then select the Generate from sample button at the bottom of this action as shown above.

image

Paste the text you copied from the browser window in here and select Done.

image

Ensure you have the Body option selected in the Content area as shown above.

image

Next, add the Post message in a chat or channel action as shown above. Select the appropriate Team and Channel. Then, in the Message area, select the </> icon in the top right to enable HTML editing.

image

Complete the formatting any way you wish but this is what I used:

<br><h1>Space Image of the day</h1>
<p><b>@{body('Parse_JSON')?['title']}</b><br><br>
<img src="@{body('Parse_JSON')?['hdurl']}"><br><br>
@{body('Parse_JSON')?['explanation']}</p>

Basically, I’m going to display a heading, then the title, high definition image and explanation (from the returned result).
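For illustration only, the same substitution expressed in Python, where the apod dict stands in for the Parse JSON output (placeholder values, not a real API result):

```python
# The 'apod' dict stands in for the output of the Parse JSON action
# (placeholder values for illustration only).
apod = {
    "title": "A Sample Title",
    "hdurl": "https://apod.nasa.gov/apod/image/sample_hd.jpg",
    "explanation": "A sample explanation of the image.",
}

# Mirror of the HTML message template used in the Flow.
message = (
    "<br><h1>Space Image of the day</h1>"
    f"<p><b>{apod['title']}</b><br><br>"
    f"<img src=\"{apod['hdurl']}\"><br><br>"
    f"{apod['explanation']}</p>"
)
print(message)
```

The Flow expressions like @{body('Parse_JSON')?['title']} are doing the same job as the f-string substitutions above.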

The result when the Flow runs is:

image

and when the image is clicked on, you see something like:

image

Remember, the whole idea here is to encourage people to regularly visit the Team in question and hopefully drive more engagement with the environment.

Connecting a Sparkfun ThingPlus ESP32-S2 WROOM to Azure IoT Central

One of the main aims I’ve had with all my IoT projects was eventually to integrate them into Azure. One way that I found was via:

Connecting to Azure IoT hub

The limitation there is that it really only gets the telemetry into Azure. From Azure IoT Hub you need to send the data off to another application to get any real value.

What I wanted to achieve was to send that data into Azure and have it display some sort of result, like a graph, without having to do too much low-level work.

The solution was to use the Azure IoT Central service. So the project plan was to use what I learned in building an

Adafruit Huzzah Temperature sensor

but instead of simply displaying the results on the serial console, to have them sent to Azure and displayed in a graph.

The starting point was:

Quickstart: Connect an ESPRESSIF ESP32-Azure IoT Kit to IoT Central

The problem is that the hardware device used in that project now appears to be obsolete:

image

https://au.mouser.com/ProductDetail/Espressif-Systems/ESP32-Azure-IoT-Kit?qs=PqoDHHvF64%252BuVX1eLQkvaQ%3D%3D

Instead, I decided to use a:

SparkFun Thing Plus – ESP32-S2 WROOM

The hope being that it would be close enough to what the original document wanted.

Also for guidance and source files I used:

Connecting ESPRESSIF ESP32 to Azure IoT Central using the Azure SDK for C Arduino library

You should start here:

Getting started with the ESPRESSIF ESP32 and Azure IoT Central with Azure SDK for C Arduino library

which will take you through setting up a new IoT Central Application, which I won’t repeat here. The result of that will be 3 items that will need to be included in the code for the device:

  • ID scope
  • Device ID
  • Primary key

Next, you’ll need to download all the source files in the repo and include them in a new PlatformIO project. The files are:

  • AzureIoT.cpp
  • AzureIoT.h
  • Azure_IoT_Central_ESP32.ino
  • Azure_IoT_PnP_Template.cpp
  • Azure_IoT_PnP_Template.h
  • iot_configs.h

I renamed Azure_IoT_Central_ESP32.ino to main.cpp in my project.

The next thing you’ll need to do is set your local wifi parameters in the file iot_configs.h. The settings should look like:

// Wifi

#define IOT_CONFIG_WIFI_SSID "<YOUR LOCAL SSID>"

#define IOT_CONFIG_WIFI_PASSWORD "<YOUR WIFI ACCESS POINT PASSWORD>"

Make sure you save any changes you make to the files.

In this same file, also locate and set the Azure IoT Central settings like:

// Azure IoT Central

#define DPS_ID_SCOPE "<ID SCOPE>"

#define IOT_CONFIG_DEVICE_ID "<DEVICE ID>"

// Use device key if not using certificates

#ifndef IOT_CONFIG_USE_X509_CERT

#define IOT_CONFIG_DEVICE_KEY "<PRIMARY KEY>"

which need to include the values obtained when configuring Azure IoT Central earlier.

If you now build your code and upload it to the device you should find that it will connect to your local wifi and start sending information to Azure IoT Central.

image

The device configured in Azure IoT Central should report as connected as shown above when you view this in the Azure IoT Central portal at:

https://apps.azureiotcentral.com/

image

If you then select the Raw Data menu item as shown above, you will see the data from your device being received regularly into Azure.

image

If you look at the serial monitor connected to the device locally you should see something like the above indicating that data is being sent up to Azure.

This indicates that there is now a working connection to the Azure IoT Central portal. The problem is that the data currently being sent is just static dummy data that never changes. What I want to do is send actual data read from a temperature sensor connected to my device. So I need to find the source of that data in the code and replace it with dynamic readings from the sensor.

Turns out the source of that dummy data is in the file Azure_IoT_PnP_Template.cpp around line 236:

image

What I now want to do is replace the static value of 21.0 for temperature and 88.0 for humidity with actual readings from the device.

To achieve that I’ll need the code from the previous project that read the temperature data which is here:

https://github.com/directorcia/Azure/blob/master/Iot/huzzah-tempsens.ino

I’m going to add that to a new file in my project called ciaath.cpp to keep my code separate from the templated Azure stuff. In there I’ll have 2 functions:

float ciaaht_getTemp() which returns temp.temperature

float ciaaht_getHumidity() which returns humidity.relative_humidity

Remember, both temp and humidity are objects and all I want is the actual numeric value in there.

I’ll also create a ciaath.h file that looks like:

#ifndef CIAATH_H
#define CIAATH_H
void ciaaht_init();
float ciaaht_getTemp();
float ciaaht_getHumidity();
#endif

The idea is that this tells other pieces of code about these functions. You’ll also note I have a function ciaaht_init() to initialise the temperature sensor at start up.

Back in the Azure_IoT_PnP_Template.cpp file I need to include the line:

#include "ciaath.h"

to tell it about the functions in my ciaath.cpp file. I can now change the lines that report the temperature and humidity from their original static values to the values read from the temperature sensor connected to my device:

static float simulated_get_temperature() { return ciaaht_getTemp(); }
static float simulated_get_humidity() { return ciaaht_getHumidity(); }

which basically get the data from my device which will then be sent to Azure.

Back in main.cpp I need to add:

#include "ciaath.h"

to tell it about my custom functions. I also have to add around line 359:

ciaaht_init();

to initialise the temperature sensor on my device at startup.

Once this all compiles and uploads to the device I can again check Azure IoT Central portal and see in the Overview menu item

image

and I see my temperature and humidity are no longer a constant.

If I heat up the temperature sensor connected to my device I see:

image

and if I leave it to return to normal I see:

image

I’ve put all the code up at:

https://github.com/directorcia/Azure/tree/master/Iot/ESP32-S2/IoT-Central

so you can have a look and use it if you need to.

I did need to get some help along the way, especially with the code, working out where the values uploaded to Azure came from initially, and how to structure the .h files to make things cleaner. I'm no coder, but hopefully my explanation here helps other non-coders. Let me know if I haven't got it right, as I really want to better understand all this.

I’m now super happy I have this working and I’m confident that I can use this as a base to start creating more powerful projects connected to Azure!

Need to Know podcast–Episode 300

In this episode I cover off why adding Azure to every environment makes sense. Even though the billing model is different, that doesn't mean there isn't an opportunity to add value to an environment with what Azure can provide. There are also plenty of updates from the Microsoft Cloud, with many exciting new things to try. Listen along and let me know if you have any feedback.

You can listen directly to this episode at:

https://ciaops.podbean.com/e/episode-300-why-you-should-add-azure/

Subscribe via iTunes at:

https://itunes.apple.com/au/podcast/ciaops-need-to-know-podcasts/id406891445?mt=2

The podcast is also available on Stitcher at:

http://www.stitcher.com/podcast/ciaops/need-to-know-podcast?refid=stpr

Don't forget to give the show a rating as well as send me any feedback or suggestions you may have for the show.

This episode was recorded using Microsoft Teams and produced with Camtasia 2022.

Brought to you by www.ciaopspatron.com

Resources

@directorcia

@directorcia@twit.social

Join my shared channel

CIAOPS merch store

Become a CIAOPS Patron

CIAOPS Blog

YouTube edition of this podcast

Introducing the new Microsoft Teams, now in preview

The new Teams

Welcome to the new era of Microsoft Teams

Windows 365 Frontline available in public preview

Adding your Microsoft Store for Business and Education apps to the Microsoft Store in Intune

What’s New at Microsoft Secure

Avatars for Microsoft Teams

Introducing Microsoft Security Copilot: Empowering defenders at the speed of AI

Explaining the Microsoft 365 Copilot System

Microsoft Incident Response Retainer is generally available

Microsoft awarded Best Advanced Protection for Corporate and Consumer Users by AV-TEST

New Microsoft Intune Devices experience

What’s new in Microsoft Intune – 2303 (March) edition

How to enable Microsoft Authenticator Lite for Outlook mobile (preview)

Getting Endpoint Privilege Management rule policies working

In a recent article:

Getting Endpoint Privilege Management working

I detailed how to get the basics of Endpoint Privilege Management working using settings policies.

The next step in the process is to get the rules policies working in conjunction with this. The scenario is that we only want to allow a single application to run with elevated privileges on a device. Here, that application will be the Adobe Acrobat Reader installer.

As before, we'll need to go back into https://intune.microsoft.com under the Endpoint Security menu option.

image

We’ll firstly need to edit the original Settings policy from the previous article and change the Default elevation response to Deny all requests as shown above. This will block any request to elevate by default.

image

Next, we’ll need to create a new policy with the Profile set to Elevation rules policy as shown above.

image

As always, we need to give this new policy a name.

image

On the following screen select Edit instance on the right as shown above.

image

On the blade that appears from the right, you’ll need to give the Rule a name and then a description if you wish.

For the Elevation type I have selected User confirmed rather than Automatic, as well as requiring the Validation to be a Business justification as shown.

Next, enter the actual file name of the Acrobat Reader installer, acrordr2300120064_en_US.exe, in the File name field.

Screenshot 2023-04-04 180747

To get the file hash I used the PowerShell command Get-FileHash, as shown above.
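If you'd rather script this, or aren't on a Windows machine, the same SHA256 hash (the default algorithm used by Get-FileHash) can be computed with a few lines of Python. The installer path shown in the comment is a placeholder:

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Return the hex SHA256 hash of a file, the same value
    PowerShell's Get-FileHash produces by default."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large installers don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Example usage (placeholder path):
# print(sha256_of_file("acrordr2300120064_en_US.exe").upper())
```

Note that PowerShell displays the hash in uppercase; the hex digits are what matter when pasting into the rule policy.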

Screenshot 2023-04-04 180929

The remaining details were obtained from the properties of the file, as shown above.

I then saved this Rule and completed the creation of the policy using the standard process, ensuring I applied it to the appropriate group in my environment.

Once again, you need to wait until the policies have been pushed out to all devices.

Screenshot 2023-04-04 180525

With the policies deployed, if I now right mouse click on the Acrobat Reader installation file and select Run with elevated privileges I see,

Screenshot 2023-04-04 181930

that the configured app is identified in the dialog and I need to provide a business justification for the installation as was configured in the rules policy.

Screenshot 2023-04-04 182041

Screenshot 2023-04-04 182221

Once that has been completed the application installs as normal.

Screenshot 2023-04-04 182359

The Adobe Reader application runs on the device once the installation is completed as shown above.

Screenshot 2023-04-04 182512

If I try and install another application by using the run with elevated privileges option (here, on the file officesetup.exe), it is blocked as shown above because the default setting policy is deny all. To allow this, another rule for that specific file would need to be created in the policy.

This means that you can now create a default Privilege Management settings policy to deny all requests to elevate and then have specific rules to only allow pre-defined applications to be run as administrator on the device. Remember, all this can be done without needing to have a local administrator on the device.




Getting Endpoint Privilege Management working

If you are not aware yet, Endpoint Privilege Management is now available in public preview.

image

You can find it in https://intune.microsoft.com under the Endpoint Security menu option as shown above.

image

You’ll firstly need to use the Create Policy menu option, as shown to create a policy for your environment.

Select Windows 10 and later for the Platform (only option currently available).

Select Elevation settings policy for the Profile.

Select Create to continue.

image

As always, give the new policy a name and select Next to continue.

image

The most important thing here is to ensure that the option Endpoint Privilege Management is set to Enabled as shown above.

In this case, the Default elevation response is set to Require user confirmation.

Select Next to continue.

Continue through the rest of the policy as normal, ensuring you assign this policy to an appropriate group in your organisation.

image

You can then select the new policy to view it and then select View report to see the results of how the policy has been applied in your environment.

It is important to ensure your workstations are at the appropriate update level. At the moment that is:

image

The policy will NOT work until you are at this level.

Screenshot 2023-04-04 153526

The above shows the client I used was Win 10 22H2 Build 19045.2788.

Screenshot 2023-04-04 153056

When the policy is applied successfully to the device, you will find a new directory C:\Program Files\Microsoft EPM Agent has been created as shown above.

Screenshot 2023-04-04 153137

If you look inside that directory you will see the above structure.

Screenshot 2023-04-04 153323

With these files now on the device, you can right mouse click on an executable and you should now see the option Run with elevated access as shown above.

Screenshot 2023-04-04 153409

When you select that option you will now be prompted, per the policy options, to enter a confirmation as shown above.

You can find documentation from Microsoft here:

Use Endpoint Privilege Management with Microsoft Intune

CIAOPS Need to Know Microsoft 365 Webinar – April

laptop-eyes-technology-computer

Join me for the free monthly CIAOPS Need to Know webinar. Along with all the Microsoft Cloud news we’ll be taking a look at Microsoft Defender for Cloud Apps.

Shortly after registering you should receive an automated email from Microsoft Teams confirming your registration, including all the event details as well as a calendar invite.

You can register for the regular monthly webinar here:

April Webinar Registrations

(If you are having issues with the above link, copy and paste this one: https://bit.ly/n2k2304)

The details are:

CIAOPS Need to Know Webinar – April 2023
Friday 28th of April 2023
11.00am – 12.00pm Sydney Time

All sessions are recorded and posted to the CIAOPS Academy.

The CIAOPS Need to Know Webinars are free to attend but if you want to receive the recording of the session you need to sign up as a CIAOPS patron which you can do here:

http://www.ciaopspatron.com

or purchase them individually at:

http://www.ciaopsacademy.com/

Also feel free at any stage to email me directly via director@ciaops.com with your webinar topic suggestions.

I’d also appreciate you sharing information about this webinar with anyone you feel may benefit from the session and I look forward to seeing you there.