For our PowerShell script to work in Azure, we're going to need to give it access to the iTextSharp.dll. We achieve this by creating and uploading a custom module into our Azure Runbook Assets library.
Create the Automation Account
Before we go anywhere, we need to set up an Azure Automation account to execute our runbooks. This account is used to authenticate against our Azure subscription so we can access our Azure resources (such as storage accounts).
In Azure, create yourself a new Automation account.
Fill out the details, and pick the same resource group we used before just to keep things organized.
Expand out your Automation Account --> Overview --> Assets --> Credentials and click Add Credential.
Fill out the details, and add the new credential.
Create and upload the iTextSharp module
We need to upload our iTextSharp library so we can access it within our PowerShell code running in Azure. To do this, we create a new module by simply uploading the library.
Go to the location on your hard disk where you saved the itextsharp.dll, and put it in a zip file with the same name.
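If you prefer to do this from the command line, the built-in Compress-Archive cmdlet (PowerShell 5+) can create the zip. A quick sketch, assuming the DLL was saved to C:\Temp (adjust the path to match your system):

```powershell
# Assumption: itextsharp.dll was saved to C:\Temp - change this to your actual location
Set-Location 'C:\Temp'

# Create itextsharp.zip alongside the DLL; the zip must have the same name as the DLL
Compress-Archive -Path '.\itextsharp.dll' -DestinationPath '.\itextsharp.zip'
```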
Back in your Azure Automation account, click Modules and then Add a Module.
Browse to your zip file, and give it a few minutes to upload to your module library.
Create the runbook
In your Azure Automation account, click Runbooks and then Add a runbook.
Create a new runbook, and set the Runbook Type to PowerShell.
Once the runbook has been created, it'll take you into the editor view. This is where we begin to create the wizardry.
When an Azure Runbook executes, think of it as standing up a little sandboxed Windows system with a PowerShell console. When that console fires up, it loads all of the modules we've imported, such as our iTextSharp library, and we'll even be able to access the local file system to a very limited degree.
First we're going to define all the references to our storage container that holds our mail attachments we saved from our Flow.
```powershell
# Define storage values
$storageAccountName = 'myBillStorage'
$storageAccountKey  = '123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ=='
$ContainerName      = 'attachments'
$localFileDirectory = 'C:\'
```
Next we load in our Azure Automation Credential that we created earlier, and authenticate against our Azure subscription.
```powershell
# Load the Azure Automation credential
$AzureOrgIdCredential = Get-AutomationPSCredential -Name 'myCreds'
$Null = Add-AzureAccount -Credential $AzureOrgIdCredential
$Null = Select-AzureSubscription -SubscriptionName 'Pay-As-You-Go'
```
Now we create an Azure Blob Storage Context object. This contains all the information that relates to the details of our blob, such as where it's located, and how to access it.
```powershell
# Configure the Azure Blob Storage Context
$ctx = New-AzureStorageContext `
    -StorageAccountName $storageAccountName `
    -StorageAccountKey $storageAccountKey
$BlobName = "myBill.pdf"
$localFile = $localFileDirectory + $BlobName
```
We then take that storage context object and use the Get-AzureStorageBlobContent cmdlet to download our PDF onto our sandboxed Runbook system.
```powershell
# Download the PDF from Azure Blob Storage
$null = Get-AzureStorageBlobContent `
    -Destination $localFile `
    -Container $ContainerName `
    -Blob $BlobName `
    -Context $ctx
```
Next we load our iTextSharp library. When a Runbook loads modules, they are placed in C:\modules\user\&lt;modulename&gt;\&lt;module&gt; on the sandboxed system. Using this location is how we load our custom module into the running PowerShell context.
```powershell
# Load the DLL
Write-Output "Loading itextsharp.dll"
Add-Type -Path "C:\modules\user\itextsharp\itextsharp.dll"
```
Now that the iTextSharp library is loaded, we can copy in the two functions we created on our local system earlier (including Get-BillValues), and they should work exactly as-is.
At this point we have:
- Downloaded the PDF attachment to the local system
- Imported our iTextSharp.dll into the running PowerShell runspace
- Defined and loaded our functions
If you think about it, this means that the environment where our runbook is executing now resembles the local system we used when developing the PowerShell code to extract our text. All that's left to do is call our functions.
```powershell
# Execute the Get-BillValues function
$billDetails = Get-BillValues -Path $localFile
$json = $billDetails | ConvertTo-Json
```
And finally, we do a little housekeeping:
```powershell
# Clean up
# Remove the PDF from Azure Blob Storage
$null = Remove-AzureStorageBlob -Container $ContainerName -Blob $BlobName -Context $ctx

# Delete the local file we saved from Azure
Remove-Item $localFile
```
Save and Publish the Runbook.
Adding a webhook
The last thing we need to do is to add a webhook to the Runbook that we can use to trigger from our Flow.
Back in your Runbook properties, click Webhooks --> Add Webhook.
Fill out the details and copy the URL it generates. It's important to grab the URL now, because for security reasons this is the only time you'll be able to view it; otherwise you'll have to delete the webhook and recreate it.
Back in our Flow, after the Create Blob action, insert an HTTP action. Set the method to POST and the URI to the webhook URL we just copied.
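If you'd like to confirm the webhook fires before wiring it into the Flow, you can POST to it from a local PowerShell session. A minimal sketch, where the URL is a placeholder for the one you copied when creating the webhook:

```powershell
# Placeholder: substitute the webhook URL you copied earlier
$webhookUrl = 'https://s1events.azure-automation.net/webhooks?token=<your-token>'

# A plain POST with no body is enough to start the runbook
$response = Invoke-RestMethod -Method Post -Uri $webhookUrl

# Azure Automation responds with the ID(s) of the job(s) it queued
$response.JobIds
```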
The flow should now look something like this:
There we have it. When an email arrives, it'll save the attachment to Azure Blob Storage, and then kick off our Azure Runbook that parses the text.
What we learned
- How to create an Azure Runbook
- How to upload a custom library to Azure as a module, and reference it within a Runbook
- How to trigger a runbook from Flow with a webhook
Now that we can parse the PDF attachment in Azure and extract our information from the bill, the next step is to return that information back to Microsoft Flow and create our calendar events.
- Going with the Flow: Part 1 - Introduction
- Going with the Flow: Part 2 - Creating the Flow
- Going with the Flow: Part 3 - Saving attachments to Azure Blob Storage
- Going with the Flow: Part 4 - Parsing attachments with Powershell
- Going with the Flow: Part 5 - Working with Azure Runbooks
- Going with the Flow: Part 6 - Completing the Flow
- Going with the Flow: Part 7 - Visualizing data in PowerBI