Transforming JSON Objects in Logic Apps

Introduction

Many integration scenarios require translating messages from one data model to another. This is described by the Message Translator Enterprise Integration Pattern. Common translation scenarios include:

  • Translation between two different proprietary data models
  • Translation between a proprietary data model and an industry standard specification, and vice-versa
  • Translation between a proprietary data model and a Canonical Model, and vice-versa
  • Normalisation from different third-party formats to an internal Canonical Model
  • Content Filtering to remove unnecessary or sensitive information
  • Content Enrichment which can include different input messages from diverse sources
  • Applying an Envelope Wrapper to add metadata into the message

Logic Apps, as an Integration Platform as a Service (iPaaS), offers different capabilities that allow us to transform messages flowing through. The Enterprise Integration Transform Connector allows us to use XSLT-based, graphically designed maps to convert XML messages from one XML format to another. This connector can be used together with the flat file decoder and encoder to transform between flat files and XML, and with the EDIFACT and X12 encoders and decoders to translate between EDI formats and XML. Even though flat files, EDI and XML are quite common in legacy integrations, JSON is now the de facto standard for data interchange.

While there are clear transformation tools for these legacy formats, I have been asked more than a couple of times how to transform JSON objects to different data models within a Logic App. In this blog post, I’ll share some tips on how to leverage Logic Apps capabilities to implement JSON transformations in integration workflows.

Scenario

To show how to transform JSON objects, I’ll be working with a fictitious scenario. The Keep Yourself Active Company is organising a corporate Step Challenge across its multiple locations. In addition to the step competition, the company wants to collect data about workout distances and energy burned to show on dashboards. Due to the large scale of the competition, they chose to use more than one fitness tracking app, so they need to support different units of measure, e.g. calories and kilojoules for energy burned, and kilometres and miles for distances. The integration team has started architecting the solution and has designed a Canonical Model to represent each participant’s data. They have decided to implement the Normalisation pattern to standardise the diverse formats coming from the different apps into the Canonical Model. In addition, the team has implemented a routing and tracking framework that requires an Envelope Wrapper to include some metadata. Sample input messages from the HR system and from one of the fitness tracking apps, together with a sample canonical message, are shown below.

Input 1: Employee Details coming from the HR System

{
   "firstName": "Paco",
   "lastName": "de la Cruz", 
   "location": "Melbourne",
   "country": "Australia",
   "department": "Information Technologies",
   "email": "paco@keepyourselfactive.com.au"
}

Input 2: Tracked Activities from the Fitness Tracking App 1

{
   "user": "paco@keepyourselfactive.com.au",
   "workouts": [
      {
         "date": "2017-05-22",
         "type": "run",
         "distanceInMiles": 3.73,
         "time": "31:21",
         "energyInCalories": 533,
         "elevationInFeet": 119
      },
      {
         "date": "2017-05-24",
         "type": "run",
         "distanceInMiles": 3.74,
         "time": "32:05",
         "energyInCalories": 529,
         "elevationInFeet": 121
      },
      {
         "date": "2017-05-27",
         "type": "run",
         "distanceInMiles": 3.73,
         "time": "31:12",
         "energyInCalories": 534,
         "elevationInFeet": 118
      }
   ]
}

Input 3: Step Count from the Fitness Tracking App 1

{
   "user": "paco@keepyourselfactive.com.au",
   "steps": [
      {
         "date": "2017-05-22",
         "steps": 11813
      },
      {
         "date": "2017-05-23",
         "steps": 8340
      },
      {
         "date": "2017-05-24",
         "steps": 10980
      },
      {
         "date": "2017-05-25",
         "steps": 9753
      },
      {
         "date": "2017-05-26",
         "steps": 8798
      },
      {
         "date": "2017-05-27",
         "steps": 12531
      },
      {
         "date": "2017-05-28",
         "steps": 7689
      }
   ]
}

Expected output in the Canonical Model, including an Envelope with metadata.

{
   "metadata": {
      "messageId": "6468f980-a167-4307-888e-874a843aebe4", 
      "timestamp": "2017-05-29T01:00:00:000Z", 
      "entityType": "ActiveChallengeParticipantWeekRecords",
      "version": "2017-04-01"
   },
   "payload": {
      "participant": {
        "givenName": "Paco",
        "familyName": "de la Cruz", 
        "office": "Melbourne",
        "country": "Australia",
        "department": "Information Technologies",
        "email": "paco@keepyourselfactive.com.au"
      },
      "steps": [
         {
            "date": "2017-05-22",
            "steps": 11813
         },
         {
            "date": "2017-05-23",
            "steps": 8340
         },
         {
            "date": "2017-05-24",
            "steps": 10980
         },
         {
            "date": "2017-05-25",
            "steps": 9753
         },
         {
            "date": "2017-05-26",
            "steps": 8798
         },
         {
            "date": "2017-05-27",
            "steps": 12531
         },
         {
            "date": "2017-05-28",
            "steps": 7689
         }
      ],
      "workouts": [
         {
            "date": "2017-05-22",
            "type": "run",
            "distanceInKms": 6.00,
            "time": "31:21",
            "energyInKJ": 2230
         },
         {
            "date": "2017-05-24",
            "type": "run",
            "distanceInKms": 6.02,
            "time": "32:05",
            "energyInKJ": 2213
         },
         {
            "date": "2017-05-27",
            "type": "run",
            "distanceInKms": 6.00,
            "time": "31:12",
            "energyInKJ": 2234
         }
      ]
   }
}

As we can see, this scenario includes several Enterprise Integration Patterns, including Message Translator, Content Enrichment (assuming the information comes from different APIs), Normalisation, Canonical Model and Envelope Wrapper. Let’s have a look at how to implement this transformation in Logic Apps.

Preparing my scenario in a Logic App

For demonstration purposes, I’m implementing this scenario using a simple Logic App with Data Operations – Compose actions to create the input JSON messages, as shown below. However, in real-life scenarios, you would expect to receive these payloads from other APIs.

Once we have the JSON messages in the Logic App, I’m using the Data Operations – Parse JSON action so that I can use dynamic content tokens from these JSON objects in later actions. I’ve used each payload as a sample message to generate the corresponding JSON schema.
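
In case it helps, this is roughly what the parse action for the employee details looks like in code view. This is a minimal sketch: the schema is abbreviated, and 'Compose_Input_Employee_Details' is just a hypothetical name for the Compose action that holds the sample input.

"Parse_Input_Employee_Details": {
   "inputs": {
      "content": "@outputs('Compose_Input_Employee_Details')",
      "schema": {
         "type": "object",
         "properties": {
            "firstName": { "type": "string" },
            "lastName": { "type": "string" },
            "location": { "type": "string" },
            "country": { "type": "string" },
            "department": { "type": "string" },
            "email": { "type": "string" }
         }
      }
   },
   "runAfter": {
      "Compose_Input_Employee_Details": [
         "Succeeded"
      ]
   },
   "type": "ParseJson"
}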

Mapping a flat JSON object

The easiest part of the mapping is to map a flat JSON object to another one. So let’s start with the participant object within the payload. To transform one JSON object in a particular data model to a different one, we can leverage the Data Operations – Compose action to create an object with the required data model. We can use the dynamic content tokens from the previous Data Operations – Parse actions.

And below is the code behind. There, you can see that we are constructing a JSON object with the required properties and we are using the properties of the parsed input payload to do so.

"Transform_Participant_by_Using_Compose": {
   "inputs": {
      "country": "@{body('Parse_Input_Employee_Details')?['country']}",
      "department": "@{body('Parse_Input_Employee_Details')?['department']}",
      "email": "@{body('Parse_Input_Employee_Details')?['email']}",
      "familyName": "@{body('Parse_Input_Employee_Details')?['lastName']}",
      "givenName": "@{body('Parse_Input_Employee_Details')?['firstName']}",
      "office": "@{body('Parse_Input_Employee_Details')?['location']}"
   },
   "runAfter": {
      "Parse_Input_Steps": [
         "Succeeded"
       ]
   },
   "type": "Compose"
}

That was too easy! This is how you map a JSON object with a flat structure in Logic Apps. Of course, you can use any of the functions available in the Workflow Definition Language in your mappings.
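
For instance, if the target model required a single (hypothetical) fullName property, the Compose input could build it with the concat function:

"fullName": "@{concat(body('Parse_Input_Employee_Details')?['firstName'], ' ', body('Parse_Input_Employee_Details')?['lastName'])}"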

Mapping repeating records or an array in a JSON object

It’s a common scenario to have repeating records, or an array of objects, in our messages that we need to translate to a different data model.

To do so, we can use the For Each Loop available in Logic Apps, and use a Compose action inside. I’m showing how to transform the workouts array to the required data model using this approach. I’m utilising the @mul (multiply) function to calculate the distance in Kms and energy in KJ. As mentioned before, you can use any of the available functions in your mappings.

The code behind is shown below.

"For_each_Workout": {
    "actions": {
        "Transform_Workouts_by_Using_Compose": {
            "inputs": {
                "date": "@{item()?['date']}",
                "distanceInKms": "@mul(item()?['distanceInMiles'], 1.60934)",
                "energyInKJ": "@mul(item()?['energyInCalories'], 4.184)",
                "time": "@{item()?['time']}",
                "type": "@{item()?['type']}"
            },
            "runAfter": {},
            "type": "Compose"
        }
    },
    "foreach": "@body('Parse_Input_Workouts')?['workouts']",
    "runAfter": {
        "Transform_Participant": [
            "Succeeded"
        ]
    },
    "type": "Foreach"
}

Another way to map an array of objects is by using the Data Operations – Select action. The main advantage of this approach is that we don’t need to explicitly implement a For Each loop. Designer and code views of this action are shown below.


"Transform_Workouts_by_Using_Select": {
    "inputs": {
        "from": "@body('Parse_Input_Workouts')?['workouts']",
        "select": {
            "date": "@item()?['date']",
            "distanceInKms": "@mul(item()?['distanceInMiles'],1.60934)",
            "energyInKJ": "@mul(item()?['energyInCalories'],4.184)",
            "time": "@item()?['time']",
            "type": "@item()?['type']"
        }
    },
    "runAfter": {
        "For_each_Workout": [
            "Succeeded"
        ]
    },
    "type": "Select"
}

Including an array of objects in the Compose action

To create the JSON object in the Canonical Model, as detailed at the beginning of this post, we need to create the participant object and insert two arrays of objects (steps and workouts) while creating the JSON message. Again, this is possible by using the Data Operations – Compose action. For demonstration purposes, I’m creating the Canonical Model without the Envelope first. The Canonical Model includes the transformed participant, the steps array and the translated workouts array. I’m composing the participant again, inserting the steps array as it comes from the input, and inserting the workouts array as the output of the Select action.

This is the code behind. As you can see, we can insert arrays of objects while using the Data Operations – Compose action.

"Transform_Payload_to_Canonical_Model": {
   "inputs": {
        "participant": {
            "country": "@{body('Parse_Input_Employee_Details')?['country']}",
            "department": "@{body('Parse_Input_Employee_Details')?['department']}",
            "email": "@{body('Parse_Input_Employee_Details')?['email']}",
            "familyName": "@{body('Parse_Input_Employee_Details')?['lastName']}",
            "givenName": "@{body('Parse_Input_Employee_Details')?['firstName']}",
            "office": "@{body('Parse_Input_Employee_Details')?['location']}"
        },
        "steps": "@body('Parse_Input_Steps')?['steps']",
        "workouts": "@outputs(Transform_Workouts_by_Using_Select)"
    },
    "runAfter": {
        "For_each_Workout": [
            "Succeeded"
            ]
        },
    "type": "Compose"
}

Adding an Envelope Wrapper to the JSON Payload

We can also use a Compose action to add the Envelope Wrapper, as shown below. We just need to insert the payload within the composed JSON as follows.

And this is the code behind.

{
  "metadata": {
    "entityType": "ActiveChallengeParticipantWeekRecords",
    "messageId": "@{guid()}",
    "timestamp": "@{utcnow('o')}",
    "version": "2017-04-01"
  },
  "payload": "@outputs('Transform_Payload_to_Canonical_Model')"
}

Other mapping scenarios

There are other JSON mapping scenarios which can easily be solved using the same actions, such as:

  • Sending more than one payload to an HTTP-triggered Azure Function. You might have scenarios in which you need to send more than one JSON payload to an HTTP-triggered Azure Function. To do so, you can use a variation of the Envelope Wrapper pattern, inserting the different payloads into a wrapping request body, as sketched after this list.
  • Transforming a payload which includes an array of objects or repeating records. In the scenario above, I showed how to include an array into a new message. However, you can apply the same principles to map a JSON object which includes an array of objects into a different data model.
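
The sketch below shows the first idea, assuming a hypothetical Compose action that wraps the participant and workouts objects from the scenario above into a single request body:

"Compose_Function_Request_Body": {
    "inputs": {
        "participant": "@outputs('Transform_Participant_by_Using_Compose')",
        "workouts": "@body('Transform_Workouts_by_Using_Select')"
    },
    "type": "Compose"
}

The output of this Compose action can then be used as the body of the Azure Functions action.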

More complex scenarios?

Chances are that you would have transformation requirements that go beyond the current capabilities in Logic Apps. For those cases, you have two main alternatives:

  • Create an Azure Function that handles the transformation using code, or
  • Transform the JSON object into XML using the @xml() function; then implement the mapping using an XML map; and finally transform the XML back to JSON using the @json() function (see the sketch below). I’m not a big fan of this approach, as transforming JSON into XML and back might have an impact on your data model due to the restrictions of each standard.
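
As a rough sketch of that round trip (the action names are hypothetical, and bear in mind that @xml() expects a JSON object with a single root property, so the payload is wrapped first):

"Wrap_Payload_Under_Root": {
    "inputs": {
        "canonical": "@outputs('Transform_Payload_to_Canonical_Model')"
    },
    "type": "Compose"
},
"Convert_Payload_to_XML": {
    "inputs": "@xml(outputs('Wrap_Payload_Under_Root'))",
    "runAfter": {
        "Wrap_Payload_Under_Root": [
            "Succeeded"
        ]
    },
    "type": "Compose"
}

After the XML map has been applied, an expression like "@json(body('Transform_XML'))" would bring the result back to JSON, where 'Transform_XML' stands for whatever the XSLT transform action is called.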

Conclusion

In this post, I’ve shown how to use Logic Apps capabilities to transform JSON objects to a different data model. We can leverage the Data Operations actions to do so. The Parse JSON action allows us to use the different properties of the JSON object as dynamic content tokens in subsequent actions. The Compose action allows us to create a new JSON message, and it can be used within a For Each loop to work with arrays. And the Select action is quite handy for mapping arrays of objects into a different model. Furthermore, for more complex scenarios you can always make use of Azure Functions or, if you are familiar with BizTalk Maps, XML transforms.

How are you implementing your JSON object transformations in Logic Apps? Feel free to share your ideas or post your questions below.

HTH and happy clouding!

Cross-posted on Mexia Blog

Implementing the Polling Consumer Pattern using Azure Logic Apps

Introduction

When implementing integration projects, it’s quite common that upstream systems don’t have the capabilities to push messages to downstream systems, or that, due to different constraints or non-functional requirements, the receivers are required to pull messages from those systems. Gregor Hohpe describes in his book “Enterprise Integration Patterns” the Polling Consumer Pattern, in which a receiver is in charge of polling for messages from a source system. In this pattern, the receiver usually polls with an incremental approach, i.e. polling only for changes from the source system, as opposed to getting a full extract. In most of these scenarios, the provider system does not keep any state on behalf of the receiver; thus, it is up to the receiver to keep a state which allows it to get changes since the previous successful poll.

Azure Logic Apps provides many trigger connectors which already implement the Polling Consumer Pattern out-of-the-box. For example, the Salesforce adapter can trigger a workflow when a record is created or modified; the SharePoint adapter can initiate a workflow when a file or an item is created or modified; and the Gmail adapter can start a workflow when an email arrives. All these triggers work on a recurrent basis, e.g. every 5 minutes. For all these triggers, the Logic App connector has to keep a trigger state or polling watermark which allows it to get only changes since the last poll. For example, the Gmail connector has to know if there are new emails since the last time it executed a poll. This trigger state or polling watermark should work even if we temporarily disable the Logic App. Even though there are many trigger connectors that make our life very easy, there might be scenarios in which a polling trigger connector is not available for a particular API or system. For example, what if we need to poll for changes from an Oracle database, from a custom API, or from an Atom feed? In this blog, I will show how to implement a custom Polling Consumer Pattern using Azure Logic Apps for those scenarios in which a trigger connector is not yet available.

Scenario

To illustrate the implementation of the Polling Consumer pattern, I will implement a fictitious scenario in which ‘New Hire’ events in an HR System are used to trigger other processes, e.g. identity and workstation provisioning. You might imagine that the end-to-end scenario could be implemented using the Publish-Subscribe Pattern. That’s true; however, in this blog I will focus only on the Polling Consumer interface on the publisher side.

The HR System in my scenario provides an Atom feed to poll for updates. This Atom feed is exposed as a RESTful API endpoint and requires two query parameters: ‘updated-min’ and ‘updated-max’. Both parameters are date-time values in ISO 8601 UTC format (e.g. yyyy-MM-ddTHH:mm:ss.fffZ). The lower bound (updated-min) is inclusive, whereas the upper bound (updated-max) is exclusive.

Even though my scenario is using a date-time polling watermark, the same principles can be used for other types of watermarks, such as Int64, base64 tokens, and hashes.

Coming back to my scenario, let’s imagine that my Logic App is to run every 10 minutes and that I start it on the 1st of May at 8 AM UTC; I would expect my Logic App to send HTTP requests to the HR System like the following:
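
For illustration, the first two polls would look more or less like this (the host name and path are hypothetical, and I’m assuming the year 2017):

[GET] https://hrsystem.example.com/api/employees/newhires?updated-min=2017-05-01T08:00:00.000Z&updated-max=2017-05-01T08:10:00.000Z

[GET] https://hrsystem.example.com/api/employees/newhires?updated-min=2017-05-01T08:10:00.000Z&updated-max=2017-05-01T08:20:00.000Z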

The first request would return New Hire events that occurred between 8:00 AM and just before 8:10 AM. The next one from 8:10 to just before 8:20, and so on.

Components

To implement this pattern, I will use:

  1. A Logic App, to implement a custom Polling Consumer Pattern workflow.
  2. An Azure Storage Table to persist the polling watermark.
  3. An Azure Function to extract the current polling watermark.
  4. An Azure Function to update the polling watermark after the poll.

Show me the code!

After describing the scenario, let’s start having fun with the implementation. To implement the pattern, I will follow the steps below:

  1. Create an Azure Resource Group
  2. Create an Azure Storage Account, create a Table, and populate the Table with my Polling Watermark
  3. Create an Azure Function App
  4. Develop and deploy the Azure Functions which will allow me to retrieve and update the Polling Watermark
  5. Develop the Azure Logic App

Each step is described in detail below.

1. Create an Azure Resource Group

You might already have an Azure Resource Group which contains other resources for your solution. If that’s the case, you can skip this step. I will start by creating a new Resource Group for all the resources I will be using for this demo. I’ll name it ‘pacopollingconsumer-rgrp’.


2. Create an Azure Storage Account, create a Table, and populate the Table with the Polling Watermark

Once I have my Azure Resource Group, I’m going to create an Azure Storage Account. I’ll use this Storage Account to create a Table to persist my polling watermark. I want to create a framework that can be used for more than one scenario, whether it’s polling for changes from different entities in the same source system, or from more than one source system. So, I’ll prepare my table to handle more than one entity and more than one source system. I’ll name the storage account ‘pacopollingconsumerstrg’ and use the default settings.


Once I’ve created the Storage Account, I’ll create a table. I’ll use the Azure Storage Explorer for this. Once I’ve downloaded it, I’ll open it and add my Azure account by signing in to Azure.


After signing in, I select my subscription. Then, I should be able to see all my existing Storage Accounts.


I create a new Table by right-clicking on the Tables branch.


I’ll name the new Table ‘PollingWatermark’.


Once the Table has been created, I’ll add a new Entity.


As mentioned above, I want to be able to use this table to handle more than one entity and more than one source system. I’ll use the Table PartitionKey to store the source system, which for this demo will be ‘HRSystem’, and the RowKey to store the entity, which will be ‘EmployeeNewHire’. I will create a new column of type DateTime to store my Watermark and set its initial value. Bear in mind that the Azure Storage Explorer works with local time; however, the value will be stored in UTC.
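
As a sketch, the stored entity would look more or less like the JSON below (the ‘Watermark’ column name and the initial value are my assumptions for this demo):

{
    "PartitionKey": "HRSystem",
    "RowKey": "EmployeeNewHire",
    "Watermark": "2017-05-01T08:00:00.000Z"
}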


Cool, now we have our Azure Table Storage ready 🙂

3. Create the Azure Function App

At the time of writing this post, there is no Logic App connector for Azure Table Storage. There is already a UserVoice request for it here, and if you want it, I would suggest you vote for it (I already did 😉). In the absence of a Logic App connector for Azure Table Storage, we will be using an Azure Function App instead. I’ll create an Azure Function App called ‘pacopollingconsumer-func’ in my resource group, using the recently created storage account for its logs, and will use the consumption plan option, which is priced per execution, as explained here.


Once I’ve created my Function App, I’ll download the publish profile which I’ll be using later to publish my functions.


4. Develop and Deploy the Azure Functions

Even though you can author your Azure Functions from the portal, I really like the option of building and testing my code locally with all the advantages of using Visual Studio. At the time of writing, Azure Functions tooling on Visual Studio supports creating C# functions as scripts (.csx files) on VS 2015 as described here, and creating class libraries on Visual Studio 2015 and 2017 as shown here. The use of compiled class libraries brings some performance benefits, and it will also make it easier to transition to the planned Visual Studio 2017 tools for Azure Functions discussed here. So, based on these recommendations, I’ve decided to use a class library project. I’ve already developed an Azure Functions class library project that implements this pattern and made it available on GitHub, at https://github.com/pacodelacruz/PollingConsumer. Even though you might want to reuse what I have developed, I strongly suggest you get familiar with developing Azure Function Apps using class library projects, as described here.

Some notes in regard to my ‘PacodelaCruz.PollingConsumer.FunctionApp’ project:

  • In order to run Azure Functions locally, you have to install the Azure Functions CLI available here.
  • You will need to update the External program and Working Directory paths in the Project Properties / Web as described here.


  • You might need to get the already referenced NuGet packages:
    • Microsoft.WindowsAzure.ConfigurationManager
    • Microsoft.AspNet.WebApi
    • Microsoft.Azure.WebJobs.Host

NOTE: I have found that Newtonsoft.Json.JsonSerializerSettings in Newtonsoft.Json version 10.0.2 does not work properly with Azure Functions, so I’m using Newtonsoft.Json version 9.0.1. I recommend not updating it for the time being.

  • You might want to update the host.json file according to your needs, instructions here.
  • You will need to update the appsettings.json file in your project to use your own Azure Storage connection strings; a sketch of this file follows this list.
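
A minimal sketch of that file is shown below. The exact shape depends on your Azure Functions CLI version, and the placeholder values are to be replaced with your own connection strings; the ‘PollingWatermarkStorage’ setting is the one the project uses for the table described later.

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage connection string used by the Functions runtime>",
    "PollingWatermarkStorage": "<connection string of the storage account holding the PollingWatermark table>"
  }
}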

Now, let’s explore the different components of the project:

  • The PollingWatermarkEntity class is used to handle the entities in the Azure Storage Table called ‘PollingWatermark’. If you are not familiar with working with Azure Storage Tables in C#, I would recommend reading through the corresponding documentation.

  • The PollingWatermark class helps us wrap the PollingWatermarkEntity and make it more user-friendly. In the constructor, we name the PartitionKey as SourceSystem and the RowKey as Entity. Additionally, we return another property called NextWatermark, which is used as the upper bound when querying the source system and when updating the Polling Watermark after we have successfully polled it.

Now, let’s have a look at the Functions code:

  • GetPollingWatermark function. This HTTP-triggered function returns a JSON object containing a DateTime Polling Watermark stored in an Azure Storage Table, based on the ‘sourceSystem’ and ‘entity’ query parameters provided in the GET request. The function bindings are defined in the corresponding function.json file.

  • UpdatePollingWatermark function. This HTTP-triggered function updates a DateTime Polling Watermark stored in an Azure Storage Table, based on the JSON payload sent as the body of the PATCH request. The JSON payload is the one returned by the GetPollingWatermark function; the function uses its ‘NextWatermark’ property as the new value. The function bindings are defined in the corresponding function.json file.

Once you have updated the appsettings.json file with your own connection strings, you can test your functions locally. You can use Postman for this. In Visual Studio, hit F5 and wait until the Azure Functions CLI starts. You should see the URL to run the functions, as shown below.


Once your project is running, you can call the functions from Postman by adding the corresponding query parameters. By calling the GetPollingWatermark function hosted locally, you should get the PollingWatermark, as previously set, as a JSON object.

[GET] http://localhost:7071/api/GetPollingWatermark?sourceSystem=HRSystem&entity=EmployeeNewHire
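
The response should look similar to the JSON object below (property names are assumed from the PollingWatermark class described above, with NextWatermark being the current UTC time at the moment of the call):

{
    "SourceSystem": "HRSystem",
    "Entity": "EmployeeNewHire",
    "Watermark": "2017-05-01T08:00:00.000Z",
    "NextWatermark": "2017-05-01T08:10:00.000Z"
}

This same object is what you later send as the request body of the PATCH call to UpdatePollingWatermark.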


To call the UpdatePollingWatermark function, you need to use the PATCH method, specify that the body is of type application/json, and add the JSON object obtained in the previous call as the request body. After calling it, you should see that the PollingWatermark has been updated based on the value sent. So far so good 🙂

[PATCH] http://localhost:7071/api/UpdatePollingWatermark


After having successfully tested both functions, we are ready to publish our Function App project. To do so, right-click the project and then click Publish. This allows us to import the publish profile that we downloaded in one of the first steps described above.


After having successfully published the Azure Functions, we need to configure the connection string in the Function App application settings. You will need to add a new app setting called PollingWatermarkStorage and set its value to the connection string of the storage account containing the PollingWatermark table.


Now you should be able to test the functions hosted on Azure. You need to go to your Function App, navigate to the corresponding function, and get the function URL. As I set the authorisation level to ‘Function’, a function key will be included in the URL.


Bear in mind that we need to add the corresponding query parameters or request body. Your URLs should look something like:

[GET] https://pacopollingconsumer-func.azurewebsites.net/api/GetPollingWatermark?code=[functionKey]&sourceSystem=HRSystem&entity=EmployeeNewHire

[PATCH] https://pacopollingconsumer-func.azurewebsites.net/api/UpdatePollingWatermark?code=[functionKey]

5. Develop the Logic App implementing the Polling Consumer workflow.

Now that we have implemented the required Azure Functions, we are ready to build our Logic App. Below you can see the implemented workflow. I added the following steps:

  • A Recurrence trigger.
  • A Function action to call the GetPollingWatermark function with the GET method, passing the query parameters.
  • A Parse JSON action to parse the response body.
  • An HTTP action to call the HR System endpoint, passing the updated-min and updated-max query parameters using the Watermark and NextWatermark properties of the parsed JSON.
  • A call to a nested Logic App. Because the Atom feed is a batch of entries, in the nested Logic App I implement debatching using SplitOn. You could also use a For Each loop and send each entry to an Azure Service Bus queue or topic.
  • A Function action to update the polling watermark for the next poll by calling the UpdatePollingWatermark function with the PATCH method, passing the response obtained from the previous function call as the request body.


In case it’s of help, you can have a look at the code view of the Logic App. Just bear in mind that I removed some sensitive information.
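
As a minimal sketch of the key step, the HTTP action that polls the HR System could look like the snippet below, assuming the Parse JSON action is called ‘Parse_Polling_Watermark’ and using the hypothetical HR System endpoint from earlier:

"Get_New_Hire_Events": {
    "inputs": {
        "method": "GET",
        "queries": {
            "updated-max": "@body('Parse_Polling_Watermark')?['NextWatermark']",
            "updated-min": "@body('Parse_Polling_Watermark')?['Watermark']"
        },
        "uri": "https://hrsystem.example.com/api/employees/newhires"
    },
    "runAfter": {
        "Parse_Polling_Watermark": [
            "Succeeded"
        ]
    },
    "type": "Http"
}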

Other potential scenarios

Now that you have seen how I implemented a custom Polling Consumer pattern on Azure Logic Apps, supported by Azure Functions and Azure Storage Tables, you should be able to implement the same pattern with slight variations for your own scenarios. You might need to use a numeric or hash polling watermark or, instead of passing it to an API, you might need to pass it to a function that, in turn, queries a database for which a connector with a trigger is not yet available.

Conclusion

The Polling Consumer pattern is quite common in integration projects. Logic Apps provides different trigger connectors which implement the Polling Consumer pattern out-of-the-box. However, there are scenarios in which we need to connect to a system for which there is no trigger connector available yet. In these scenarios, we need to implement a custom Polling Consumer pattern. Through this post, we have seen how to implement a custom Polling Consumer pattern using Azure Logic Apps together with Azure Functions and Azure Storage. To do so, we created an Azure Resource Group, an Azure Storage Table to persist the polling watermark, and an Azure Function App; we developed the Azure Functions using a C# class library project, tested them locally, deployed them to Azure, and configured and tested them on Azure as well. Finally, we developed a Logic App that implements the Polling Consumer pattern.

As discussed above, you can implement the same pattern with slight variations to accommodate your own scenarios with different requirements; and I really hope this post will make your implementation easier, faster, and more fun. Do you have any other ideas or suggestions on how to implement a custom Polling Consumer pattern? Feel free to share them below 🙂

I hope this has been of help, and happy clouding!

Cross posted on Mexia’s Blog