Fight!
A while ago, I created a solution using System Center Orchestrator for a customer. Although it worked fine, most of the solution was built with PowerShell scripts; I leveraged Orchestrator basically to monitor a folder for certain types of files. Since this is all local to the customer's infrastructure, the first impulse was to use Orchestrator, because it's hard to leverage a cloud solution for on-premises data, right? Wrong!
Challenge accepted! Let’s try to do the same using Azure Automation!
The first part of the challenge is to mimic the basic Orchestrator side of the solution:
My first thought was to use the .NET FileSystemWatcher object with PowerShell. However, this object only raises events while the PowerShell session that created it is open, making it impractical for unattended automation. So, taking the simplest approach: polling the folder every X minutes for files should do the trick, for now.
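For context, this is roughly what the FileSystemWatcher approach looks like in plain PowerShell; the event subscription only lives as long as the session that created it, which is why it doesn't fit an unattended runbook (the folder path and filter below are just placeholders):

```powershell
# Watch a folder for new files; events fire only while this session stays open.
$watcher = New-Object System.IO.FileSystemWatcher 'C:\Drop', '*.csv'
$watcher.EnableRaisingEvents = $true

# Register an action for the Created event.
Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    Write-Host "New file detected: $($Event.SourceEventArgs.FullPath)"
} | Out-Null

# As soon as the session ends, the watcher and its subscription are gone,
# which rules this out for a scheduled, unattended runbook.
```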
Now let's try to do that in Azure Automation! Let's start with a variable asset that holds the file path to be monitored:
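If you'd rather script this than click through the portal, something along these lines creates the variable asset (the resource group, account, and variable names are just examples):

```powershell
# Requires the AzureRM.Automation module and an authenticated session (Login-AzureRmAccount).
New-AzureRmAutomationVariable -ResourceGroupName 'MyResourceGroup' `
                              -AutomationAccountName 'MyAutomationAccount' `
                              -Name 'FolderPath' `
                              -Value 'C:\Drop' `
                              -Encrypted $false
```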
Now for the Runbook:
The testing runbook looks like this:
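In rough terms, the polling runbook boils down to reading the variable asset and listing whatever is in the folder; a minimal sketch (the variable name and file filter are assumptions) could look like this:

```powershell
# Read the folder path from the Automation variable asset.
$folderPath = Get-AutomationVariable -Name 'FolderPath'

# Poll the folder and report any files found.
$files = Get-ChildItem -Path $folderPath -File
foreach ($file in $files) {
    Write-Output "Found file: $($file.FullName)"
}
```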
For testing, let’s use the test pane:
Notice it might be queued for a while.
After a bit, if you look at the results:
These are the two files in the folder, as expected.
The last step of this part is to add a schedule. For that we’ll create a schedule asset and assign this runbook to run. Don’t forget to save and publish the runbook.
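The same thing can be done from PowerShell; as a sketch (the names and the hourly interval below are placeholders):

```powershell
# Create an hourly schedule asset...
New-AzureRmAutomationSchedule -ResourceGroupName 'MyResourceGroup' `
                              -AutomationAccountName 'MyAutomationAccount' `
                              -Name 'HourlyFilePoll' `
                              -StartTime (Get-Date).AddMinutes(10) `
                              -HourInterval 1

# ...and link it to the published runbook.
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName 'MyResourceGroup' `
                                           -AutomationAccountName 'MyAutomationAccount' `
                                           -RunbookName 'Poll-Folder' `
                                           -ScheduleName 'HourlyFilePoll'
```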
Update: So much to learn, so little time! I was just reviewing some articles on automation and there is a much better way to do this. Let’s try this again.
First we will create a Webhook for our dear published runbook:
And make sure you set the run settings for a hybrid worker:
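For reference, the webhook can also be created from PowerShell, and the -RunOn parameter is what points it at a hybrid worker group (all names below are placeholders; note the webhook URI is only shown at creation time):

```powershell
# Create a webhook for the published runbook, targeted at a hybrid worker group.
$webhook = New-AzureRmAutomationWebhook -ResourceGroupName 'MyResourceGroup' `
                                        -AutomationAccountName 'MyAutomationAccount' `
                                        -RunbookName 'Poll-Folder' `
                                        -Name 'Poll-Folder-Webhook' `
                                        -IsEnabled $true `
                                        -ExpiryTime (Get-Date).AddYears(1) `
                                        -RunOn 'MyHybridWorkerGroup' `
                                        -Force

# Save this somewhere safe; it cannot be retrieved again later.
$webhook.WebhookURI
```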
Now you’ll need to create a scheduler job. In the new Azure Portal, go to:
Set the Job collection:
Configure the Action settings:
And the (much more granular) schedule:
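Before wiring the webhook into the Scheduler, it's worth confirming it actually queues the runbook; a plain HTTPS POST is enough, which is exactly what the Scheduler action will do on every occurrence (the URI below is obviously a placeholder):

```powershell
# The Scheduler job will do essentially this on each run:
$webhookUri = 'https://s1events.azure-automation.net/webhooks?token=...'
$response = Invoke-RestMethod -Method Post -Uri $webhookUri

# Azure Automation replies with the id of the job that was queued.
$response.JobIds
```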
Note that the Scheduler Service is not free. If you need granularity, keep that in mind. Review this to check the pricing.
And there is my first run:
Then the second:
In the next article, I will finalize the execution of my script through Azure Automation to disable and enable jobs.
I intended to write a second post, but when I started it, I realized it would be much simpler to add my second script (the one in the second PowerShell .NET box in SCO) as a function within my existing script than to make it a second Azure Automation runbook and have to handle remote authentication from the worker against Azure (certificates, publish settings file, etc.).
So, here's what I did. Here is the script as it looks in PowerShell ISE, with the main function (Process-File) collapsed:
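The exact contents are specific to the customer, but the overall shape is just a Process-File function plus a loop over whatever the poll finds, roughly along these lines (this is an assumed outline, not the original code):

```powershell
function Process-File {
    param (
        [Parameter(Mandatory)]
        [System.IO.FileInfo]$File
    )
    # The actual business logic lives here (collapsed in the ISE screenshot).
}

# Poll the monitored folder and hand each file to Process-File.
$folderPath = Get-AutomationVariable -Name 'FolderPath'
Get-ChildItem -Path $folderPath -File | ForEach-Object {
    Process-File -File $_
}
```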
As you can see, this does everything Orchestrator was doing in the original workflow.
The main changes when running this through Azure Automation concern authentication (how to use the variables) and keeping in mind that it will run locally on a remote computer. Since you probably just moved the script over from your local ISE, it shouldn't feel too different!
My original lines looked like this:
Of course, I was using an Orchestrator encrypted variable. But with Azure Automation, I can have an asset that is a credential:
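Inside the runbook, that credential asset comes back as a regular PSCredential object; a minimal sketch (the asset name is an assumption):

```powershell
# Retrieve the credential asset stored in the Automation account.
$credential = Get-AutomationPSCredential -Name 'FileProcessingAccount'

# From here on it behaves like any other PSCredential.
$userName = $credential.UserName
```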
Neat, eh? Note also that I'm using Invoke-Command so I can use these credentials, since I can't specify which credentials the hybrid worker runs the script as, and it would otherwise run as the local system. I had to do the same in Orchestrator anyway, since the native .NET PowerShell activity won't allow credentials to be set.
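Since the hybrid worker runs the runbook as Local System, wrapping the work in Invoke-Command is a simple way to run it under the stored credential instead; as a sketch (the computer name, paths, and inner logic are placeholders):

```powershell
$credential = Get-AutomationPSCredential -Name 'FileProcessingAccount'

# Run the actual work under the stored credential rather than as Local System.
Invoke-Command -ComputerName 'localhost' -Credential $credential -ScriptBlock {
    param ($Path)
    # ...file processing happens here, now under the supplied credential...
    Get-ChildItem -Path $Path -File
} -ArgumentList 'C:\Drop'
```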
So, let’s test this!
And done!
I started the job at about 1:56 PM, so I should have a log file in the local folder of the worker machine:
That looks about right!
And the contents are consistent with the purpose of the script:
After enabling the scheduler, the runbook ran and generated output (and logs) whenever a file was found in the folder:
So, although the script may be slightly different, this setup required no Orchestrator, no SQL, and no additional setup besides the hybrid worker. Let's go, Azure Automation!
Hope this helps!