Ten-point checklist when migrating from BizTalk to Azure iPaaS (Logic Apps)


Businesses are evolving increasingly fast, and IT can either enable or hinder this evolution. IT changes should always bring business value and never compromise core business needs.

As part of becoming more agile, a common concern for many of our customers is the transition from on-premises integration platforms to a cloud or hybrid solution, in particular migrating their BizTalk Server environments to the Microsoft Azure Integration Platform-as-a-Service (iPaaS), which is based on Azure Logic Apps.

There are a number of reasons you might want to consider migrating your BizTalk solutions to the Microsoft Azure iPaaS, including:

  1. Enabling or supporting the digital transformation journey. Azure services like Logic Apps, Azure Functions, and API Management allow you to expose and consume modern APIs, which are key enablers of digital transformation.
  2. Reducing your OpEx. Significantly reduce your IT operation and licensing costs by leveraging PaaS and serverless components.
  3. Gaining agility. Azure allows you to deliver business value in weeks instead of months, not only because of the tooling, but also because of the capabilities of the platform and the availability of hundreds of connectors.
  4. Unlocking new business solutions. With Azure, new business solutions are possible: from the asynchronous messaging of Service Bus, to the eventing of Event Grid, to smart solutions with Stream Analytics, Cognitive Services and Machine Learning, to the monetisation of APIs with API Management, to advanced monitoring and alerting with OMS and Log Analytics, to name a few.

With all of that said, supporting your digital transformation leveraging integration can sometimes be challenging. Therefore, it’s imperative you start early, plan thoroughly and implement well to avoid unnecessary complications.

To get the full article, download the document with the ten-point checklist from my employer’s site to get your BizTalk to Azure iPaaS transition in good shape.



My participation on the Global Integration Bootcamp 2018 in Melbourne

This year, we had the second edition of the Global Integration Bootcamp, an event focused on Microsoft’s integration platforms. I participated with a talk about what’s new in Microsoft’s Azure iPaaS, also known as Azure Integration Services. Here are my slides:

Happy Clouding!

Microsoft Azure Integration Platform as a Service (iPaaS) – Logic Apps and its Azure allies [Update]


A year ago, I wrote a post about this very same topic, the Microsoft Azure Integration Platform as a Service (iPaaS). At that time, the core iPaaS product offering, Azure Logic Apps, was roughly six months past its general availability launch. Since then, we’ve seen an impressive release cadence from the product team, an ever-growing list of connectors from other Microsoft product teams and third-party providers, and considerable growth of the user base.

In this post, I’ll talk about the current state of Azure Logic Apps and the other Azure services that are relevant when building application and process integration solutions. It’s worth mentioning that there is another platform that targets hybrid data integration solutions, Azure Data Factory; however, that’s an ETL platform and won’t be discussed in this post.

What’s an Integration Platform as a Service (iPaaS)?

According to Gartner, an “Integration Platform as a Service (iPaaS) is a suite of cloud services enabling development, execution and governance of integration flows connecting any combination of on premises and cloud-based processes, services, applications and data within individual or across multiple organizations.” Its capabilities usually include:

  • Communication Protocol Connectors (HTTP, SFTP, AS2, etc.)
  • Application connectors, including SaaS and on-premises apps
  • Ability to handle data formats, like JSON or XML, and data standards like EDIFACT, HL7, etc.
  • Orchestration
  • Routing
  • Data validation and transformation
  • Monitoring and Management tools
  • Full life-cycle API Management
  • Development and solution life-cycle management tools

Microsoft’s core iPaaS product offering is Azure Logic Apps. However, we could argue that one of Microsoft’s key differentiators and advantages within the iPaaS market is how Logic Apps is enriched by the whole Azure ecosystem, which keeps getting richer and better.

Azure Services to Build Integration Solutions

Based on Gartner’s definition of iPaaS, I’ll describe the different Azure services we can leverage to develop, implement, and manage enterprise-class integration solutions. The figure below shows the different capabilities of an iPaaS and the Azure product offerings we can use to implement them.

01 iPaaS Components

Orchestration, Data Handling and Transformation, and Routing

Logic Apps workflows allow us to orchestrate our integration solutions. They provide a visual workflow designer (via the Azure Portal or Visual Studio) to design, build, and manage our integration workflows. These workflows are serverless, which means that we can focus on the functionality and business value, while all the infrastructure and scaling are fully abstracted and we pay only for what we use. With Logic Apps, we can implement long-running workflows, as described in previous posts, using the Webhook action and the Service Bus connector. A Logic App workflow implementing some conditions, a loop, and some connectors is shown below.

02 workflow

Behind the scenes, Logic Apps workflows are based on the Workflow Definition Language. Within a Logic App, we can validate messages and transform them using Data Operations, Liquid templates, or XSLT maps for XML documents. For routing, we can use different Logic Apps features to control the flow of the workflow, and we can also use Azure Service Bus topic subscriptions for pub/sub scenarios.
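As a sketch, the skeleton of a workflow definition with a hypothetical HTTP request trigger and a Transform XML action (which requires an Integration Account map; the map name below is illustrative) looks roughly like this:

```json
{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "triggers": {
    "manual": {
      "type": "Request",
      "kind": "Http",
      "inputs": {}
    }
  },
  "actions": {
    "Transform_XML": {
      "type": "Xslt",
      "runAfter": {},
      "inputs": {
        "content": "@triggerBody()",
        "integrationAccount": { "map": { "name": "OrderToCanonical" } }
      }
    }
  },
  "outputs": {}
}
```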

Additionally, we can externalise Business Rules from Logic Apps workflows.


In addition to the workflows themselves, the ever-growing list of connectors is another of the great features of this platform. These connectors allow us to trigger workflows and to get data from and push data to many diverse apps in the cloud and on-premises through different protocols. Additionally, the on-premises data gateway allows hybrid connectivity for some of the connectors. Below is a snapshot of the 170+ connectors available at the time of writing. Please bear in mind that this list is always changing and that some connectors are still in preview and might not be available in all regions.

Protocol Connectors

  • FTP
  • HTTP
  • HTTP with Azure AD
  • HTTP with Swagger
  • RSS
  • SFTP
  • SMTP
  • SOAP-to-REST
  • WebHook
  • AS2
Hybrid Connectors

  • BizTalk
  • DB2
  • File System
  • Informix
  • MQ
  • MySQL
  • Oracle DB
  • PostgreSQL
  • REST
  • SAP
  • SharePoint
  • SOAP (to REST and pass-through)
  • SQL Server
  • Teradata
Azure Connectors

  • Azure AD
  • Azure API Management
  • Azure App Services
  • Azure Application Insights
  • Azure Automation
  • Azure Blob Storage
  • Azure Container Instance
  • Azure Data Lake
  • Azure Data Factory
  • Azure Event Grid
  • Azure File Storage
  • Azure Functions
  • Azure Kusto
  • Azure Logic Apps
  • Azure ML
  • Azure Resource Manager
  • Azure Security Center
  • Azure SQL Data Warehouse
  • Azure Storage Queues
  • Azure Table Storage
  • Computer Vision API
  • Common Data Service
  • Content Moderator
  • Cosmos DB
  • Custom Vision
  • Event Hubs
  • Face API
  • LUIS
  • QnA Maker
  • Service Bus
  • SQL Server
  • Text Analytics
  • Video Indexer
Other Microsoft Connectors

  • Bing Maps
  • Bing Search
  • Dynamics 365
  • Dynamics 365 for Financials
  • Dynamics Nav
  • Microsoft Forms
  • Microsoft Kaizala
  • Microsoft StaffHub
  • Microsoft Teams
  • Microsoft To-Do
  • Microsoft Translator
  • MSN Weather
  • Office 365 Excel
  • Office 365 Groups
  • Office 365 Outlook
  • Office 365 Video
  • OneDrive
  • OneDrive for Business
  • OneNote
  • Outlook Customer Manager
  • Outlook Tasks
  • Outlook.com
  • Project Online
  • Power BI
  • SharePoint
  • Skype for Business
  • VSTS
  • Yammer
Third-Party SaaS Connectors

  • 10to8
  • Adobe Creative Cloud
  • Apache Impala
  • Appfigures
  • Asana
  • Aweber
  • Basecamp3
  • Benchmark Email
  • Bitbucket
  • Bitly
  • Blogger
  • Box
  • Buffer
  • Calendly
  • Campfire
  • Capsule CRM
  • Chatter
  • Cognito Forms
  • D&B Optimizer
  • Derdack Signl4
  • DocFusion
  • Docparser
  • DocuSign
  • Dropbox
  • Easy Redmine
  • Elastic Forms
  • Enadoc
  • Eventbrite
  • Facebook
  • FlowForma
  • FreshBooks
  • Freshdesk
  • Freshservice
  • GitHub
  • Gmail
  • Google Calendar
  • Google Drive
  • Google Sheets
  • Google Tasks
  • GoToMeeting
  • GoToTraining
  • GoToWebinar
  • Harvest
  • HelloSign
  • HipChat
  • iAuditor
  • Infobip
  • Infusionsoft
  • Inoreader
  • insightly
  • Instagram
  • Instapaper
  • Intercom
  • Jira
  • JotForm
  • Kintone
  • LeanKit
  • LiveChat
  • Lithium
  • MailChimp
  • Mandrill
  • Marketing Content Hub
  • Metatask
  • Muhimbi PDF
  • MySQL
  • Nexmo
  • Oracle Database
  • Pager Duty
  • Parserr
  • Paylocity
  • Pinterest
  • Pipedrive
  • Pitney Bowes Data Validation
  • Pivotal Tracker
  • Planner
  • Plivo
  • Plumsail Documents
  • Plumsail Forms
  • Plumsail SP
  • PostgreSQL
  • Redmine
  • Salesforce
  • SendGrid
  • ServiceNow
  • Slack
  • Smartsheet
  • SparkPost
  • Stripe
  • SurveyMonkey
  • Tago
  • Teamwork Projects
  • Teradata
  • Todoist
  • Toodledo
  • Trello
  • Twilio
  • Twitter
  • Typeform
  • UserVoice
  • Vimeo
  • WebMerge
  • WordPress
  • Workday HCM
  • Workday Finance
  • Wunderlist
  • YouTube
  • Zendesk
  • Zoho

A key component of Logic Apps connectivity is the on-premises data gateway, which allows the connectors listed above as “Hybrid” to connect to on-premises resources. The data gateway is an agent that can be installed on a server on-premises or on a VM in an Azure VNET. All the interchanges between the data gateway and Logic Apps are outbound (from on-premises to Azure) via a managed Service Bus Relay over encrypted channels. The data gateway can be installed on more than one VM to provide high availability.

03 On Prem data Gateway

Serverless Compute (Custom Code and Custom Connectors)

As shown in the previous sections, Logic Apps provide many connectors and a lot of functionality for message processing; however, in some scenarios we might need to write custom code. With Azure Functions, we can write custom nano-services in C#, F#, JavaScript, or any other of the supported languages, and we can invoke those functions from our Logic Apps synchronously via HTTP or asynchronously via Azure Service Bus or Azure Event Grid, as described below.

We can use Azure Functions as part of our custom message processing or to send or receive messages to or from an application for which there is no out-of-the-box connector.

Additionally, we can use Azure Functions Hybrid connections to securely connect to resources on-premises via outbound calls to Azure Service Bus Relay.

Messaging and Eventing

Integration solutions usually require a way to send and receive messages in an asynchronous way. Additionally, modern computing is all about events, and increasingly, integration solutions have to react to or push events.

Azure Service Bus Messaging provides reliable message queuing and publish-subscribe messaging capabilities. Azure Service Bus Queues and Topics allow us to decouple in time upstream and downstream systems or different interrelated services. Service Bus Queues provide ordered message delivery to competing consumers, while Service Bus Topics enable publish-subscribe messaging scenarios where multiple consumers can process the same message.

The following Service Bus features make it a key component of most integration solutions on Azure:

  • temporal decoupling
  • message processing load balancing (competing consumers)
  • Publish/Subscribe pattern via Service Bus Topics;

Azure Event Grid is a fully-managed event-routing platform on Azure which we can leverage as part of our integration solutions.  Logic Apps can react to events on Event Grid coming from the different publishers or publish events for other handlers to consume.

While Service Bus provides rich and robust messaging capabilities, like transactions, ordering, sessions, dead-lettering, enqueue time, long time to live and deduplication, among others; Event Grid offers hyper-scale event routing, with filtering, routing, built-in Azure publishers, with a push-push model, but a short time to live and no ordering. Additionally, due to these limitations, Event Grid is not meant for critical or transactional messages, but for events, which might still point to its source, e.g. the event of a blob being created containing the URL of the actual blob.
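To make that last point concrete, an Event Grid event is only a lightweight description of something that happened, typically pointing back to its source. A blob-created event might look like the following (all values are illustrative; the actual payload carries additional fields):

```json
[
  {
    "topic": "/subscriptions/{subscription-id}/resourceGroups/{group}/providers/Microsoft.Storage/storageAccounts/{account}",
    "subject": "/blobServices/default/containers/orders/blobs/order-1234.json",
    "eventType": "Microsoft.Storage.BlobCreated",
    "eventTime": "2018-03-01T10:00:00.000Z",
    "id": "00000000-0000-0000-0000-000000000000",
    "data": {
      "api": "PutBlob",
      "contentType": "application/json",
      "contentLength": 524,
      "blobType": "BlockBlob",
      "url": "https://{account}.blob.core.windows.net/orders/order-1234.json"
    },
    "dataVersion": "",
    "metadataVersion": "1"
  }
]
```

The handler is expected to use the `data.url` to fetch the actual blob; the event itself is not the message.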

API Mediation and Management

API mediation and management is a core functionality of integration solutions that need to expose RESTful APIs.

Azure API Management is an Azure service that functions as an API gateway for backend APIs in the cloud or on-premises, providing capabilities such as security, throttling, caching, request and response transformation, and a developer portal.

Furthermore, Azure Functions Proxies provide a small subset of what API Management does and can be leveraged as a lightweight API gateway for HTTP-triggered Logic Apps, including:

  • Securing endpoints in a very similar way to App Services.
  • Modifying requests and responses, which also allows us to mock APIs.
  • Consolidating multiple disparate APIs into simpler URL routes.

Depending on the level of features we require, we can go with the lightweight Azure Functions Proxies or with the full API Management for our integration solutions.
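As a sketch, a lightweight façade for an HTTP-triggered Logic App could be declared in a Function App’s proxies.json like this (the proxy name, route, and the %ORDERS_LOGICAPP_URL% app setting are hypothetical):

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "SubmitOrder": {
      "matchCondition": {
        "methods": [ "POST" ],
        "route": "/api/orders"
      },
      "backendUri": "%ORDERS_LOGICAPP_URL%"
    }
  }
}
```

The %...% syntax resolves the backend URI from an app setting, which keeps the Logic App callback URL, including its SAS signature, out of source control.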

Monitoring and Management

Azure Logic Apps provide built-in monitoring tools that allow you to check the run history, trigger history, status, performance, etc. You can also install the Logic Apps Management Solution on OMS, which gives you a very rich aggregated view and charts of all your logic apps that are being monitored. The OMS Logic Apps Management Solution view is shown in the figure below.

11 OMS Solution

Additionally, you can build custom activity monitoring dashboards and publish custom queries based on the diagnostics sent to Azure Log Analytics. There is more information in my previous posts.

A sample of what you can build is shown in the figure below.

12 BAM Dashboard

Development and Solution Life-cycle Management

To develop some of the iPaaS solution components we can use the Azure Portal; however, when we think about the whole CI/CD pipeline, it’s better to work with Visual Studio and with Visual Studio Team Services (VSTS) for source control, builds, and release definitions. There is more information about CI/CD for Logic Apps in my previous post.

Handy References

If you want to know more about Logic Apps, you might find the following links handy.


Wrapping Up

In this post, I’ve discussed the current state of the Microsoft Azure Integration Platform as a Service (iPaaS) components, with Logic Apps as the main player, complemented by other Azure product offerings. As mentioned previously, one of Microsoft’s main advantages in the iPaaS market is the whole Azure ecosystem and how we can build integration solutions leveraging the capabilities of all these different platforms. And we need to consider not only the technologies described here, but also all those that can be utilised through the connectors, such as Azure Cognitive Services and Azure Machine Learning.

Additionally, many of these components offer per-execution pricing or affordable entry-level pricing options. This enables us to start small when building enterprise-class integration solutions and grow from there.

I hope you’ve found this post useful, and please feel free to ask any questions or add your comments below.

Happy clouding!

Follow me on @pacodelacruz

Cross-posted on Deloitte Platform Engineering Blog

Publishing Custom Queries of Logic Apps Execution Logs

In a previous post, I showed how to implement Business Activity Monitoring for Logic Apps. However, sometimes developers, ops, or business users want to query execution logs to get information about the processing of business messages. Whether for troubleshooting or auditing, there are some questions these personas might have, like:

  • When was a business document processed?
  • What was the content of a received document?
  • How was that message processed?

As we saw in that post, we can send diagnostic log information and custom tracked properties to Azure Log Analytics. We also saw how easy it is to query those logs to get information about Logic Apps executions and the messages processed. Now the question is, how can we publish those custom queries so that different users can make use of them? In this post, I’ll show one easy way to do that.

1. Tracking the relevant custom properties and sending data to Log Analytics.

The first thing to do is to track the relevant custom properties we need for our queries as tracked properties in our Logic App workflow. Then you need to configure the Logic App workflow to send diagnostics information to Azure Log Analytics. You can follow the instructions on my previous post to perform those steps.

2. Creating the queries to get the information our users need

Once the information is being logged in Log Analytics, we need to create the queries that give the users the information they need. For that, we first need to open the Azure Log Analytics portal. To open the portal, we need to:

  • Go to the Log Analytics Resource on the Azure Portal
  • Go to Log Search
  • Click on Analytics

And now you are ready to create your own queries.

20 Open Log Analytics

Based on the tracked properties of the Logic App workflow shown in my previous post, I wrote a query to get all orders processed in the selected time range. This query returns the order number, total, date, channel, customer id, the name of the Logic App workflow that processed the message, and the workflow run id. The last two columns are quite handy for troubleshooting.
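A sketch of such a query follows; the trackedProperties_* column names are illustrative and depend on the property names and types you actually tracked:

```kusto
AzureDiagnostics
| where Category == "WorkflowRuntime"
| where isnotempty(trackedProperties_orderNumber_s)
| project TimeGenerated,
          trackedProperties_orderNumber_s,
          trackedProperties_total_d,
          trackedProperties_date_s,
          trackedProperties_channel_s,
          trackedProperties_customerId_s,
          resource_workflowName_s,
          resource_runId_s
```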

21a Query and Results

3. Saving the custom Azure Log Analytics query

Once we have the query ready, we can save and export it so that we can publish it later. To do that, we follow the steps below:

  • Click the Save button
  • Give the query a name and a category. The category is quite useful when searching among all saved queries.
  • Then we click the Export button
  • And select the Share a Link to Query option, so the link to the query is saved in the clipboard.

21 Save and Export Query

4. Publishing the custom Azure Log Analytics query to the users

After we have the link to the query, we can publish it on the same dashboard we created for our BAM charts, described in my previous post. We need to:

  • Edit the shared Azure Dashboard
  • Add the Markdown tile.
  • Add the markdown text which contains the link to the query created above.

Now the users will have all the charts and custom queries they need in one single place!

22 Add Query to Dashboard

Making it easier for users to get to the workflow run logs of a particular business message on Logic Apps

Logic Apps provide a very detailed view of the execution of workflows. However, I’ve often been asked to make it easier for users to get the run details of a particular business message. Here is a tip on how to do it.

First, we need to create a query to get the runId of the workflow instance that processed the message. I created this query to get those details for the orders submitted by a particular user.
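A sketch of such a query, filtering by customer and projecting the run id (the customer id value and the column names are illustrative and depend on your tracked properties):

```kusto
AzureDiagnostics
| where Category == "WorkflowRuntime"
| where trackedProperties_customerId_s == "CUST-0042"
| project TimeGenerated,
          trackedProperties_orderNumber_s,
          resource_workflowName_s,
          resource_runId_s
```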

  • Once we have that query, we publish it to the same markdown tile in our dashboard.
  • We also add the link to the workflow Azure resource to the same dashboard tile.

Now users can query the orders submitted by a user, get the workflow run id, and get the workflow run details in very few clicks.

23 Query Logic App Instance by Business Id


In this post, we’ve seen how to create and publish custom queries of Logic Apps execution logs. We’ve also seen how to make it easier for users to get the workflow run details of the processing of a particular business message. Now you should be ready to start creating and publishing your own custom queries and building amazing monitoring and tracking dashboards for your Logic Apps solutions.

I hope you’ve got some useful tips from this post and that you’ve enjoyed it. Feel free to leave your questions or comments below.

Follow me on @pacodelacruz

Cross-posted on Deloitte Platform Engineering Blog

Business Activity Monitoring on Azure Logic Apps with Azure Log Analytics


Azure Logic Apps provide built-in monitoring tools that allow you to check the run history (including all inputs and outputs of triggers and actions), trigger history, status, performance, etc. Additionally, you can enable diagnostic logging on your Logic Apps and send all these runtime details and events to Azure Log Analytics. You can also install the Logic Apps Management Solution on OMS, which gives you a very rich aggregated view and charts of all your logic apps that are being monitored.

All these tools are great for developers or system administrators, who want to monitor and troubleshoot the execution of the workflows. However, sometimes we need to track and monitor more business-related information. Additionally, it’s quite common that business users want to monitor, with a business perspective, what’s happening at the integration layer.

In this post, I will show how to implement tracking capabilities for business-related information and how to create a Business Activity Monitoring (BAM) dashboard for Logic Apps.


In a previous post, I introduced a fictitious company called “Farm to Table”, which provides fresh produce drone delivery. This company has been leveraging Logic Apps to implement their business processes, which integrate with multiple systems on the cloud. As part of their requirements they need to monitor the business activity flowing through this integration solution.

“Farm to Table” want to be able to monitor the orders they are receiving per channel. At the moment, customers can place orders via SMS, a web online store, and a mobile app. They also want to be able to track placed orders by customer number and order number.



To be able to track and monitor business properties and activity on Logic Apps, there are some prerequisites:

  1. An Azure Log Analytics workspace. We can use a previously created workspace or create a new one following the steps described here.
  2. Diagnostic logging to Azure Log Analytics enabled on the Logic App that processes the messages we want to track, following the instructions detailed here.
  3. The Logic Apps Management solution for OMS installed on the Azure Log Analytics workspace.

Adding Tracking Properties to the Logic App Workflow

Once diagnostic logging is enabled, Logic Apps tracks, by default, all workflow instances and actions in Azure Log Analytics. This tracking can very easily be extended using Tracked Properties on the workflow actions. In the tracked properties, we can include business-related data; for example, in this scenario we could track customer id, order number, order total, and channel.

“Farm To Table” has implemented an HTTP-triggered Logic App workflow that receives orders from different channels, validates them, maps them to a canonical model, enriches them, and then puts them into a Service Bus queue. The order canonical model processed by this workflow follows the schema of the instance below:
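A sketch of such a canonical order instance (field names and values are illustrative, chosen to be consistent with the properties tracked in this scenario):

```json
{
  "orderNumber": "ORD-1234",
  "customerId": "CUST-0042",
  "date": "2018-03-01T10:00:00Z",
  "channel": "MobileApp",
  "total": 59.90,
  "items": [
    {
      "productId": "APL-GRN",
      "description": "Green apples, 1 kg",
      "quantity": 2,
      "unitPrice": 4.50
    }
  ]
}
```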

To track the business properties of the orders, we will add the tracked properties to the action that sends the message to Service Bus. It’s noteworthy that tracked properties within a Logic App action can only reference the trigger’s inputs and the action’s inputs and outputs.

In the Logic App action that sends the message into a Service Bus queue, we will add the tracked properties to include the customer Id, order number, order Total, date, and channel. I’m also adding a flag to simplify my queries, but that’s optional. The code below shows the trackedProperties section added to our workflow action.
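A sketch of what such a trackedProperties section might look like on the Service Bus send action. The expressions depend entirely on the shape of the action’s inputs; for the Service Bus connector, the message body is base64-encoded in ContentData, so you would need to decode it first. Action and property names here are illustrative:

```json
"Send_message_to_a_queue": {
  "type": "ApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "connection": {
        "name": "@parameters('$connections')['servicebus']['connectionId']"
      }
    },
    "method": "post",
    "path": "/@{encodeURIComponent('orders-queue')}/messages",
    "body": {
      "ContentData": "@{base64(triggerBody())}"
    }
  },
  "trackedProperties": {
    "customerId": "@{json(base64ToString(action().inputs.body.ContentData)).customerId}",
    "orderNumber": "@{json(base64ToString(action().inputs.body.ContentData)).orderNumber}",
    "total": "@{json(base64ToString(action().inputs.body.ContentData)).total}",
    "date": "@{json(base64ToString(action().inputs.body.ContentData)).date}",
    "channel": "@{json(base64ToString(action().inputs.body.ContentData)).channel}",
    "flowName": "OrderProcessing"
  }
}
```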

Once we have started tracking those properties and we have information already logged on Azure Log Analytics, we can start querying and creating charts for our own Business Activity Monitoring Dashboard.

Querying Azure Log Analytics

Let’s start querying what we are tracking from the Logic Apps on Log Analytics. Bear in mind that there is a delay between the execution of the Logic App and the log being available on Log Analytics. Based on my experience, it usually takes anywhere between 2 and 10 minutes. You can find detailed documentation on the Log Analytics query language here.

The query below returns the custom data I’m tracking in the Logic App workflow. When building your queries, it’s worth noting that:

  • Workflow actions are being logged using the AzureDiagnostics Log Type and with the WorkflowRuntime category.
  • Logic Apps prepend the prefix “trackedProperties_” to each property and append a suffix to declare its type.
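Putting those two conventions together, a query over the tracked order properties might look like this (the column names are illustrative and depend on the property names and types you tracked):

```kusto
AzureDiagnostics
| where Category == "WorkflowRuntime"
| where isnotempty(trackedProperties_orderNumber_s)
| project TimeGenerated,
          trackedProperties_customerId_s,
          trackedProperties_orderNumber_s,
          trackedProperties_total_d,
          trackedProperties_channel_s
```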

This query should return a result set like the one below:

Additionally, we can add filters to our queries, for instance, to get all orders by the CustomerId, we could use a query as follows:
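For example (the customer id value and the column names are illustrative):

```kusto
AzureDiagnostics
| where Category == "WorkflowRuntime"
| where trackedProperties_customerId_s == "CUST-0042"
| project TimeGenerated,
          trackedProperties_orderNumber_s,
          trackedProperties_total_d,
          trackedProperties_channel_s
```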

Creating a Monitoring Dashboard

Previously, Microsoft suggested creating OMS custom views and queries for this purpose. The steps to create a custom OMS dashboard for Logic Apps are described here. However, after the upgrade of OMS to the new Log Analytics query language (previously known as the Kusto query language), the recommended approach is now to use the new Azure Log Analytics portal, create queries and charts, and pin them to a shared Azure Dashboard. If you have created both custom OMS dashboards and custom Log Analytics Azure Dashboards, you will probably agree that shared Azure Dashboards and Log Analytics charts are much more user-friendly.

The steps to create and share an Azure Dashboard that includes Log Analytics data and charts are described here. We will follow these steps to create our own Business Activity Monitoring Dashboard as a shared Azure Dashboard. I won’t repeat what’s in Microsoft’s documentation; I’ll just show how I’m creating the charts to be added to the Dashboard.

Order Count by Channel

“Farm to Table” want to have a chart with the order count summarised by channel for the last 7 days in their BAM Dashboard. The query below returns those details.
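A sketch of such a query (the trackedProperties_channel_s column name is illustrative and depends on the properties you tracked):

```kusto
AzureDiagnostics
| where Category == "WorkflowRuntime"
| where TimeGenerated > ago(7d)
| where isnotempty(trackedProperties_channel_s)
| summarize OrderCount = count() by trackedProperties_channel_s
```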

Once we get the results, we need to select Chart and then the Doughnut option. After that, we are ready to pin our chart to the Azure shared dashboard.

15 Chart Count By Channel

Order Total by Date and Channel

The company also want to have a chart with the order total summarised by date and channel for the last 7 days in their Dashboard. The query below returns those details.
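A sketch of such a query, summarising by day and channel (column names are illustrative):

```kusto
AzureDiagnostics
| where Category == "WorkflowRuntime"
| where TimeGenerated > ago(7d)
| where isnotempty(trackedProperties_channel_s)
| summarize OrderTotal = sum(trackedProperties_total_d)
          by bin(TimeGenerated, 1d), trackedProperties_channel_s
```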

Once we get the results, we need to select Chart and then Stacked Columns. After that, we are ready to pin our chart to the Azure shared dashboard.

16 Chart Total By Date

Business Activity Monitoring Dashboard

Once we have pinned our charts, we will be able to see them in our shared Azure Dashboard. These charts are very handy and allow us to visualise the data dynamically, as shown below.

20 BAM Dashboard


In this post, we’ve seen how to easily track and monitor business information flowing through our Logic Apps, using Logic App native integration with OMS and Azure Log Analytics. Additionally, we’ve seen how friendly and cool the Log Analytics charts are. This gives Logic Apps another great competitive advantage as an enterprise-grade integration Platform as a Service (iPaaS) in the market.

I hope you’ve learned something useful and enjoyed this post! Feel free to post your comments or questions below

Happy clouding!


Follow me on @pacodelacruz

Cross-posted on Deloitte Platform Engineering Blog

Preparing Azure Logic Apps for CI/CD to Multiple Environments


Logic Apps can be created from the Azure Portal or using Visual Studio. This works well if you want to create one Logic App at a time. However, if you want to deploy the same Logic App to multiple environments, e.g. Dev, Test, or Production, you want to do it in an automated way. Azure Resource Manager (ARM) Templates allow you to define Azure resources, including Logic Apps, for automated deployment to multiple environments in a consistent and repeatable way. ARM Templates can be tailored for each environment using a parameters file.

The deployment of Logic Apps using ARM Templates and parameters can be automated with different tools, such as PowerShell, Azure CLI, or VSTS. In my projects, I normally use a VSTS release definition for this.

You probably have noticed that the Logic App Workflow Definition Language (the JSON code behind) has many similarities with the ARM Templates structure, including the use of expressions and functions, variables, and parameters.

ARM Template expressions and functions are written within JSON string literals wrapped in square brackets []. ARM expressions and functions can appear in different sections of the ARM Template, including the resources member, which might contain Logic Apps. The value of these expressions is evaluated at deployment time. More information here.

Logic App expressions and functions are defined within the Logic App definition and might appear anywhere in a JSON string value. Logic Apps expressions and functions are evaluated at execution time. These are declared using the @ sign. More information here.

These similarities can be confusing by themselves. I’ve seen that it’s quite a common practice in ARM Templates containing Logic Apps to use ARM Template expressions inside the Logic App definition; for example, using ARM parameters, ARM variables, or ARM functions (like concat) within the definition of a Logic App. This might seem OK, as this is what you would normally do to tailor your deployment for any other Azure resource. However, with Logic Apps, this can be quite cumbersome. If you’ve done it, I’m almost sure you know what I’m talking about.

In this post, I’ll share some practices that I use to ease the preparation of Logic Apps for Continuous Integration / Continuous Delivery (CI/CD) to multiple environments using ARM Templates, when values inside the Logic App definition have to be customised per environment. If you don’t have to change values within the Logic App definition, then you might not need to follow every step of this post.

Why it’s not a good idea to use ARM template expressions inside a Logic App definition?

As I mentioned above, if, when preparing your Logic Apps for CI/CD with ARM Templates, you have used ARM Template expressions or functions inside a Logic App definition, you have most probably realised that it’s quite troublesome. I personally don’t like to do it that way for two reasons:

  1. Editing the Logic App definition to include ARM Template expressions or functions is not intuitive. Adding ARM Template expressions and functions to be resolved at deployment time, in a way that still produces Logic Apps expressions and functions to be evaluated at execution time, can be messy. Things become harder when you have string functions in a Logic App, like @concat(), that accept values obtained from ARM Template expressions, like [parameters()] or [variables()]. I’ve heard and read many people complaining about it.
  2. Updating your Logic App after you have your ARM Template ready, requires more work. It’s not unlikely that you would need to update your Logic App after you’ve prepared the ARM Template for it. Whether you need to fix a little bug found at testing, or you are required to change or add some functionality, the chances are that you would need to update the ARM template without the help of the Logic App Editor; and if you are unlucky, changes would touch those complex ARM template expressions inside your Logic App definition. Not very fun!

So, the question is, is it possible to create ARM Templates for Logic Apps that can be parameterised for multiple environments while avoiding using ARM template expressions inside the Logic App definition? Fortunately, it is :). Below, I describe how.


For this post, I will work with a rather simple scenario: a Logic App that is triggered when a message is received in a Service Bus queue and posts the message to an HTTPS endpoint using basic authentication. The endpoint URL, the username and the password will be different for each environment. Additionally, the Service Bus API Connection will have to be defined per environment.

This very simple workflow created using the Logic App editor is shown below:

And the code behind this workflow is as follows:
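In essence, it is a workflow with a Service Bus trigger and an HTTP action that still uses static values. The sketch below illustrates the shape of such a definition; the queue name, endpoint URL and credentials are placeholders, and the Service Bus connection wiring is the one generated by the designer:

```json
{
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "$connections": {
            "defaultValue": {},
            "type": "Object"
        }
    },
    "triggers": {
        "When_a_message_is_received_in_a_queue": {
            "type": "ApiConnection",
            "recurrence": { "frequency": "Minute", "interval": 1 },
            "inputs": {
                "host": {
                    "connection": {
                        "name": "@parameters('$connections')['servicebus']['connectionId']"
                    }
                },
                "method": "get",
                "path": "/@{encodeURIComponent('myqueue')}/messages/head"
            }
        }
    },
    "actions": {
        "HTTP_Post_to_endpoint": {
            "type": "Http",
            "runAfter": {},
            "inputs": {
                "method": "POST",
                "uri": "https://dev.example.com/api/messages",
                "body": "@triggerBody()",
                "authentication": {
                    "type": "Basic",
                    "username": "devuser",
                    "password": "devpassword"
                }
            }
        }
    },
    "outputs": {}
}
```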

The code is very straightforward, but the endpoint, username and password are still static. Not ideal for CI/CD!

Preparing the Logic App for CI/CD to be deployed to multiple environments

In this section, I’ll show how you can prepare your Logic App for CI/CD to be deployed to multiple environments using ARM Templates, without having to use any ARM Template expressions or functions inside a Logic App definition.

1. Add Logic Apps parameters to the workflow for every value that is to be changed for each environment.

Similarly to ARM Templates, the Logic App workflow definition language accepts parameters. We can use these Logic Apps parameters to prepare our Logic App definition for CI/CD. We need to add a Logic App parameter for every value that is to be tailored for each environment. Unfortunately, at the time of writing, adding Logic App parameters can only be done via the code view.

Using the code view, we need to:

  • Add the parameters definition with a default value. You should follow the same principles as parameters in ARM templates, but in this case they are defined within the Logic App definition. The default value is the one you would otherwise use as a static value at development time.
  • Update the workflow definition to use those parameters instead of the fixed values.
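As an illustration (the parameter names and default values below are from my sketch, not prescriptive), the parameters definition and the updated HTTP action inputs could look like this. Note that comments are not standard JSON; I include the separator comment only for orientation:

```json
"parameters": {
    "endpointUrl": {
        "type": "String",
        "defaultValue": "https://dev.example.com/api/messages"
    },
    "username": {
        "type": "String",
        "defaultValue": "devuser"
    },
    "password": {
        "type": "SecureString",
        "defaultValue": "devpassword"
    }
},

// ...and, in the HTTP action, the fixed values replaced by parameter references:

"inputs": {
    "method": "POST",
    "uri": "@parameters('endpointUrl')",
    "body": "@triggerBody()",
    "authentication": {
        "type": "Basic",
        "username": "@parameters('username')",
        "password": "@parameters('password')"
    }
}
```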

I’ve done this using the code view of the workflow shown above. The updated workflow definition is as follows.

After this update, the workflow should work just as before, but now, instead of having fixed values, you are using Logic Apps parameters with default values. If you are following along with your own Logic App, you can test it yourself 🙂

2. Get the Logic App ARM Template for CI/CD.

Once the Logic App is ready, we can get the ARM Template for CI/CD. One easy way to do it is to use the Visual Studio Tools for Logic Apps. This requires Visual Studio 2015 or 2017, the latest Azure SDK and the Cloud Explorer. You can also use the Logic App Template Creator PowerShell module. More information on how to create ARM Templates for Logic Apps here.

The Cloud Explorer will allow you to log in to your Azure Subscription and see the supported Azure resources, including Logic Apps. When you expand the Logic Apps menu, you will see all the Logic Apps available for that subscription.

Once you’ve found the Logic App you want to export, right click on it, and click on Open with Logic App Editor. This will open the Logic App Editor on Visual Studio.

In addition to allowing you to edit Logic Apps on Visual Studio, the Visual Studio Logic App Tools let you download the ARM Template that includes the Logic App. You just need to click the Download button, and you will get an almost ready-to-deploy ARM Template. This functionality exports the Logic App API Connections as well.

For this workflow, I got an ARM Template as follows:

As you can see, this ARM Template includes:

  • ARM Template parameters definition. This is where we define the ARM Template parameters. We can set a default value. The actual value for each environment is to be set on the ARM Parameters file.
  • Logic App parameters definition: These are declared within the definition of the Logic App. These are the ones we can define using the code view of the Logic App, as we did above.
  • Logic App parameters value set: Here is where we set the values for the parameters for the Logic App. This section is declared outside of the definition property of the Logic App.

The structure of the ARM Template can be seen in the picture below.
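In outline, such a template has the three places where parameters appear, as sketched below. The comments are only for orientation (they are not standard JSON, though Visual Studio tolerates them), and the resource properties are trimmed to the essentials:

```json
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        // 1. ARM Template parameters definition
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Logic/workflows",
            "apiVersion": "2016-06-01",
            "name": "[parameters('logicAppName')]",
            "location": "[resourceGroup().location]",
            "properties": {
                "definition": {
                    "parameters": {
                        // 2. Logic App parameters definition (inside the workflow definition)
                    },
                    "triggers": {},
                    "actions": {}
                },
                "parameters": {
                    // 3. Logic App parameters value set (outside the workflow definition)
                }
            }
        }
    ]
}
```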

3. Set the Logic App parameters values with ARM Template expressions and functions.

Once we have the ARM Template, we can set the Logic App parameters values with ARM expressions and functions, including ARM parameters or ARM variables. I’ve done it with my ARM Template as shown below.

Before you check the updated ARM Template, some things to note:

  • I added comments to the ARM Template only to make it easier to read and understand, but I don't recommend it. Comments are not part of standard JSON; however, Visual Studio and ARM Template deployments tolerate them.
  • I used the "-armparam" and "-armvar" suffixes on the ARM Template parameters and variables, respectively. I did it only to show a clear distinction between ARM Template parameters and variables and Logic Apps parameters and variables. The notation alone is already sufficient to tell them apart (square brackets [] for ARM Template expressions and functions, and the @ sign for those of Logic Apps).
  • I just used ARM Template parameters and variables to set the values of Logic App parameters, but you can use any other ARM Template function or expression that you might require to set Logic App parameter values.
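The Logic App parameters value set then ends up as the only place with ARM Template expressions, along these lines (a sketch using the parameter names from my scenario and the suffix convention described above):

```json
"properties": {
    "definition": {
        // ...the Logic App definition stays untouched, with no ARM expressions inside...
    },
    "parameters": {
        "endpointUrl": {
            "value": "[parameters('endpointUrl-armparam')]"
        },
        "username": {
            "value": "[variables('username-armvar')]"
        },
        "password": {
            "value": "[parameters('password-armparam')]"
        }
    }
}
```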

As you can see, now we are only using ARM Template expressions and functions outside the Logic App definition. This is much easier to read and maintain. Don’t you think?

4. Prepare your ARM Parameters file for each environment.

Now that we have the ARM Template ready, we can prepare an ARM Parameters file for our deployment to each environment. Below I show an example of this.
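A parameters file for, say, a test environment could look like the following sketch. The values are placeholders; in particular, you would normally inject secrets like the password from your release pipeline or a key vault rather than committing them to source control:

```json
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "logicAppName": { "value": "myworkflow-test" },
        "endpointUrl-armparam": { "value": "https://test.example.com/api/messages" },
        "username-armparam": { "value": "testuser" },
        "password-armparam": { "value": "<injected-by-your-release-pipeline>" }
    }
}
```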

5. Work on your CI/CD Pipeline.

Once we have the ARM Template and the ARM Parameter files, we can automate the deployment using our preferred tool. If you want to use VSTS, this is a good video that shows you how.

6. Deploy and enjoy.

Once you have deployed the ARM Template, you will be able to see the deployed Logic App. The Logic App parameters value set section is hidden, but if you execute the Logic App, you will see how the values have been set accordingly.

Do you want this to be easier?

You might be thinking, just as I am, that this process is not as intuitive as it should be and is a bit time-consuming. If you wish to ask the product team to improve this, you might want to vote for these user voice requests on the links below:

Wrapping Up.

In this post, I’ve shown how to prepare your Logic Apps for CI/CD to multiple environments using ARM Templates in a more convenient way, i.e. without using ARM Template expressions or functions inside the Logic App definition. I believe that this approach makes the ARM Template of a Logic App much easier to read and to maintain.

This method not only avoids the need to write complex ARM Template expressions inside a Logic App definition, but also allows you to update your Logic App in the designer after it has been deployed using ARM Templates, and later update the ARM Template by simply updating the Logic App definition section. That's much better, isn't it?

I hope you’ve found this post handy, and it has helped you to streamline the configuration of your CI/CD pipelines when using Logic Apps.

Do you have a different preferred way of preparing your Logic Apps for CI/CD? Feel free to leave your comments or questions below,

Happy clouding and automating!

P.S. And remember: “I will never use ARM Template expressions inside a Logic App definition” 😉

Follow me on @pacodelacruz

Cross-posted on Deloitte Platform Engineering Blog

Implementing the Correlation Identifier Pattern on Stateful Logic Apps using the Webhook Action


In many business scenarios, there is the need to implement long-running processes which first send a message to a second process and then pause and wait for an asynchronous response before they continue. Since this is asynchronous communication, the challenge is to correlate the response with the original request. The Correlation Identifier enterprise integration pattern targets this scenario.

Azure Logic Apps provides a stateful workflow engine that allows us to implement robust integration workflows quite easily. One of the workflow actions in Logic Apps is the webhook action, which can be used to implement the Correlation Identifier pattern. One typical scenario in which this pattern can be used is when an approval step with a custom API (with a similar behaviour to the Send Approval Email connector) is required in a workflow.

In this post, I will show how to implement the Correlation Identifier enterprise integration pattern on Logic Apps leveraging the webhook action.

Some background information

The Correlation Identifier Pattern

Adapted from Enterprise Integration Patterns

The Correlation Identifier enterprise integration pattern proposes to add a unique id to the request message on the requestor end and return it as the correlation identifier in the asynchronous response message. This way, when the requestor receives the asynchronous response, it knows which request that response corresponds to. Depending on the functional and non-functional requirements, this pattern can be implemented in a stateless or stateful manner.

Understanding webhooks

A webhook is a service that is triggered by a particular event and results in an HTTP call to a RESTful subscriber. A much more comprehensive definition can be found here. You might be familiar with the configuration of webhooks with static subscribers. In a previous post, I showed how to trigger a Logic App by an SMS message with a Twilio webhook. This webhook sends all events to the same HTTP endpoint, i.e. a static subscriber.

The Correlation Identifier pattern on Logic Apps

If you have used the Send Approval Email Logic App connector, you will have seen that it implements the Correlation Identifier pattern out-of-the-box in a stateful manner. When this connector is used in a Logic App workflow, an email is sent, and the workflow instance waits for a response. Once the email recipient clicks on a button in the email, the particular workflow instance receives an asynchronous callback with a payload containing the user selection, and it continues to the next step. This approval email comes in very handy in many cases; however, a custom implementation of this pattern might be required in different business scenarios. The webhook action allows us to have a custom implementation of the Correlation Identifier pattern.

The Logic Apps Webhook Action

To implement the Correlation Identifier pattern, it’s important that you have a basic understanding of the Logic Apps webhook action. Justin wrote some handy notes about it here. The webhook action of Logic Apps works with an instance-based, i.e. dynamic webhook subscription. Once executed, the webhook action generates an instance-based callback URL for the dynamic subscription. This URL is to be used to send a correlated response to trigger the continuation of the corresponding workflow. This applies the Return Address integration pattern.
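In code view, a webhook action follows the subscribe/unsubscribe shape sketched below (the URIs are placeholders). Note how @listCallbackUrl() provides the instance-based callback URL that the subscriber is expected to store:

```json
"Webhook": {
    "type": "HttpWebhook",
    "runAfter": {},
    "inputs": {
        "subscribe": {
            "method": "POST",
            "uri": "https://example.com/api/subscribe",
            "body": {
                "callbackUrl": "@{listCallbackUrl()}"
            }
        },
        "unsubscribe": {
            "method": "POST",
            "uri": "https://example.com/api/unsubscribe"
        }
    }
}
```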

We can implement the Correlation Identifier pattern by building a custom API connector for Logic Apps following the webhook subscribe and unsubscribe pattern of Logic Apps. However, it's also possible to implement this pattern without writing a custom API connector, as I'll show below.


To illustrate the pattern, I'll be using a fictitious company called FarmToTable. FarmToTable provides delivery of fresh produce by drone. Consumers subscribe to the delivery service by creating their personalised list of produce to be delivered on a weekly basis. FarmToTable needs to implement an SMS confirmation service so that an SMS message is sent to each consumer the day before the scheduled delivery date. After receiving the text message, the customer must confirm within 12 hours whether they want the delivery or not, so that the delivery is arranged.

The Solution Architecture

As mentioned above, the scenario requires sending an SMS text message and waiting for an SMS response. For sending and receiving the SMS, we will be using Twilio. More details on working with Logic Apps and Twilio in one of my previous posts. Twilio provides webhooks that are triggered when SMS messages are received. The Twilio webhooks only allow static subscriptions, i.e. calling one single HTTP endpoint. Nevertheless, the webhook action of Logic Apps requires the webhook subscribe and unsubscribe pattern, which works with an instance-based subscription. Thus, we need to implement a wrapper for the required subscribe/unsubscribe pattern.

The architecture of this pattern is shown in the figure below and explained afterwards.

Components of the solution:

  1. Long-running stateful workflow. This is the Logic App that controls the main workflow, sends a request, pauses and waits for an asynchronous response. This is implemented by using the webhook action.
  2. Subscribe/Unsubscribe Webhook Wrapper. In our scenario, we are working with a third-party service (Twilio) that only supports webhooks with static subscriptions; thus, we need to create this wrapper. This wrapper is composed of four parts.
  • Subscription store: A database to store the unique message Id and the instance-based callback URL provided by the webhook action. In my implementation, I’m using Azure Cosmos DB for this. Nevertheless, you can use any other suitable alternative. Because the only message id we can send to Twilio and get back is the phone number, I’m using this as my correlation identifier. We can assume that for this scenario the phone number is unique during the day.
  • Subscribe and Start Request Processing API: this is a RESTful API that is in charge of starting the processing of the request and storing the subscription. I'm implementing this API with a Logic App, but you can use an Azure Function, an API App or a custom API connector for Logic Apps.
  • Unsubscribe and Cancel Request Processing API: this is another RESTful API that is only going to be called if the webhook action on the main workflow times out. This API is in charge of cancelling the processing and deleting the subscription from the store. The unsubscribe step has a similar purpose to the CancellationToken structure used in C# async programming. In our scenario, there is nothing to cancel though. Like the previous API, I’m implementing this with a Logic App, but you can use different technologies.
  • Instance-based webhook: this webhook is to be triggered by the third-party webhook with a static subscription. Once triggered, this Logic App is in charge of getting the instance-based callback URL from the store and invoking it. After making the call back to the main workflow instance, the subscription is to be deleted.
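The documents in the subscription store can be as simple as a record keyed by the correlation identifier (here, the phone number) that holds the callback URL. An illustrative sketch:

```json
{
    "id": "+61000000000",
    "callbackUrl": "<instance-based callback URL received from the webhook action subscribe call>"
}
```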

The actual solution

To implement this solution, I’m going to follow the steps described below:

1. Configure my Twilio account to be able to send and receive SMS messages. More details here.

2. Create a Service Bus Namespace and 2 queues. For my scenario, I’m using one inbound queue (ScheduledDeliveriesToConfirm) and one outbound queue (ConfirmedScheduledDeliveries). For your own scenarios, you can use other triggers and outbound protocols.

3. Create a Cosmos Db collection to store the instance-based webhook subscriptions. More details on how to work with Cosmos Db here.

  • Create a Cosmos DB account (with the DocumentDB API).
  • Create a database.
  • Create a collection.

4. Create the “Subscribe and Start Request Processing API”. I’m using a Logic App workflow to implement this API as shown below. I hope the steps with their comments are self-explanatory.

  • The workflow is Http triggered. It expects, as the request body, the scheduled delivery details and the instance-based callback URL of the calling webhook action.
  • The provided Http trigger URL is to be configured later in the webhook action subscribe Uri of the main Logic App.
  • It stores the correlation on Cosmos Db. More information on the Cosmos Db connector here.
  • It starts the request processing by calling the Twilio connector to send the SMS message.

The expected payload for this API is like the one below. This payload is to be sent by the webhook action subscribe call on the main Logic App:

    {
        "callbackUrl": "https://prod-00.australiasoutheast.logic.azure.com/workflows/guid/runs/guid/actions/action/run?params",
        "scheduledDelivery": {
            "deliveryId": "2c5c8390-b6c8-4274-b785-33121b01e219",
            "customer": "Paco de la Cruz",
            "customerPreferredName": "Paco",
            "phone": "+61000000000",
            "orderName": "Seasonal leafy greens and fruits",
            "deliveryAddressName": "Home",
            "deliveryDate": "2017-07-20",
            "deliveryTime": "07:30",
            "createdDateTime": "2017-07-19T09:10:03.209"
        }
    }
You can have a look at the code behind here. Please use it just as a reference, as it hasn’t been refactored for deployment.

5. Create the “Unsubscribe and Cancel Request Processing API”. I used another Logic App workflow to implement this API. This API is only going to be called if the webhook action on the main workflow times out. The workflow is shown below.

  • The workflow is Http triggered. It expects as the request body the message id so the corresponding subscription can be deleted.
  • The provided Http trigger URL is to be configured later in the webhook action unsubscribe Uri of the main Logic App.
  • It deletes the subscription from Cosmos Db. More information on the Cosmos Db connector here.

The expected payload for this API is quite simple, as shown below. This payload is to be sent by the webhook action unsubscribe call on the main Logic App:

    {
        "id": "+61000000000"
    }

The code behind is published here. Please use it just as a reference, as it hasn’t been refactored to be deployed.

6. Create the Instance-based Webhook. I’m using another Logic App to implement the instance-based webhook as shown below.

  • The workflow is Http triggered. It’s to be triggered by the Twilio webhook.
  • The provided Http trigger URL is to be configured later in the Twilio webhook.
  • It gets the message Id (phone number) from the Twilio message.
  • It then gets the instance-based subscription (callback URL) from Cosmos Db.
  • Then, it posts the received message to the corresponding instance of the main Logic App workflow by using the correlated callback URL.
  • After making the callback, it deletes the subscription from Cosmos Db.

The code behind for this workflow is here. Please use it just as a reference, as it is not ready to be deployed.

7. Configure the Twilio static webhook. Now, we have to configure the Twilio webhook to call the Logic App created above when an SMS message is received. Detailed instructions in my previous post.

8. Create the long-running stateful workflow. Once we have implemented the subscribe/unsubscribe webhook wrapper required by the Logic App webhook action, we can start creating the long-running stateful workflow. This is shown below.

In order to trigger the Unsubscription API, the timeout property of the webhook action must be configured. This can be specified under the settings of the action. The duration is to be configured in the ISO 8601 duration format. If you don't want to resend the request after the timeout, you should turn off the retry policy.
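For instance, to express the 12-hour window of this scenario in code view, the webhook action would carry a limit alongside its inputs (a sketch; the subscribe/unsubscribe inputs are elided here):

```json
"Webhook": {
    "type": "HttpWebhook",
    "inputs": {
        // subscribe and unsubscribe as configured earlier
    },
    "limit": {
        "timeout": "PT12H"
    }
}
```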

  • The workflow is triggered by messages on the ScheduledDeliveriesToConfirm Service Bus queue.
  • Then the webhook action:
    • Sends the scheduled delivery message and the corresponding instance-based callback URL to the Subscribe and Start Request Processing Logic App.
    • Waits for the callback from the Instance-based webhook. This would receive, as an HTTP post, the response sent by the customer. If a response is received before the timeout limit, the action succeeds and continues to the next action.
    • If the webhook action times out, it calls the Unsubscribe and Cancel Request Processing Logic App and sends the message id (phone number); and the action fails so the workflow does not continue. However, if required, you could continue the workflow by configuring the RunAfter property of the subsequent action.
  • If a response is received, the workflow continues assessing the response. If the response is ‘YES’, it sends the original message to the ConfirmedScheduledDeliveries queue.

The code behind of this workflow is available here. Please use it as a reference only, as it hasn't been refactored for deployment.

Now, we have finished implementing the whole solution! 🙂 You can have a look at all the Logic Apps JSON definitions in this repository.


In this post, I’ve shown how to implement the Correlation Identifier pattern using a stateful Logic App. To illustrate the pattern, I implemented an approval step in a Logic App workflow with a custom API. For this, I used Twilio, a third-party service, that offers a webhook with a static subscription; and created a wrapper to implement the subscribe/unsubscribe pattern, including an instance-based webhook to meet the Logic Apps webhook action requirements.

I hope you find this post useful whenever you have to add a custom approval step or correlate asynchronous messages using Logic Apps, or that I’ve given you an overview of how to enable the correlation of asynchronous messages in your own workflow or integration scenarios.

Feel free to add your comments or questions below, and happy clouding!

Follow me on @pacodelacruz

Cross-posted on Deloitte Platform Engineering Blog

Microsoft Azure Integration Platform as a Service (iPaaS) – Logic Apps, Azure Functions, API Management and Service Bus working together

Originally posted on Mexia’s blog.


If you work for an established organisation going through Digital Transformation or in a modern company born in the digital era, the chances are that IT is required to implement integration solutions more than ever before. Whether an organisation is an incumbent or a new entrant, they need to leverage the power of technology to lead the change or embrace it. Monolithic systems will not be capable of meeting the needs of digital organisations, creating the need to interconnect existing systems with best-of-breed cloud and distributed apps and to expose all these systems to other systems, business partners or consumer apps through easy to consume APIs.

Microsoft provides different technologies on Azure which enable us to build robust application, data and process integration solutions. One of the core offerings for Azure integration is Logic Apps – however there are other Azure PaaS offerings which when used with Logic Apps form the robust building blocks of first-class integration solutions. These technologies together provide a very rich and fully-managed Integration Platform as a Service (iPaaS). In this post, I will try to describe these Azure technologies that we can leverage to make our life easier, and perhaps even more fun, when implementing enterprise integration solutions.

Azure Services to build integration solutions

Out of the very large and growing list of Azure services, there are some that we can utilise to build our integration solutions, including Logic Apps, Azure Functions, API Management, and Service Bus Messaging. I will briefly describe each of them below.

Logic Apps

Logic Apps is a very robust and powerful platform to orchestrate and automate integration workflows. The visual designer, together with all the available connectors and the deployment and management tools, makes it a great tool for these kinds of scenarios. Logic Apps also follows the serverless computing model, in which you just focus on what you want to achieve, without worrying at all about servers, patching, scaling, etc.

Workflow and Orchestration Capabilities

Logic Apps workflows, which can be easily designed and implemented graphically (via the Azure Portal or Visual Studio), are based on the Workflow Definition Language. These workflows provide rich ways to process and manipulate data that can be obtained or pushed via different connectors. The figure below shows a simple Logic App workflow with different actions, a condition and a loop.

In Logic Apps workflows we can implement different actions, like calling an HTTP endpoint, calling Azure API Apps, calling WebHooks, waiting, calling a Logic App nested workflow, implementing conditions and loops, querying arrays, terminating a workflow, or returning an HTTP response.
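To give a flavour of the Workflow Definition Language, a minimal, purely illustrative definition combining an HTTP action, a condition and a terminate action could look like this (the URI is a placeholder):

```json
{
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "triggers": {
        "Recurrence": {
            "type": "Recurrence",
            "recurrence": { "frequency": "Hour", "interval": 1 }
        }
    },
    "actions": {
        "Get_status": {
            "type": "Http",
            "runAfter": {},
            "inputs": { "method": "GET", "uri": "https://example.com/api/status" }
        },
        "Check_status": {
            "type": "If",
            "runAfter": { "Get_status": [ "Succeeded" ] },
            "expression": "@equals(outputs('Get_status')['statusCode'], 200)",
            "actions": {
                "Stop_with_success": {
                    "type": "Terminate",
                    "runAfter": {},
                    "inputs": { "runStatus": "Succeeded" }
                }
            }
        }
    },
    "outputs": {}
}
```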

Triggers, Input and Outputs

Logic Apps provides a growing list of connectors that allow us to trigger workflows and get data from and push data to many different protocols, SaaS applications, other Azure and Power Apps services, and on-premises systems. Below there is a snapshot of the 100+ connectors available at the time of writing (a few of them are still in preview).

Protocol Connectors

Azure Services and Power Apps Connectors

SaaS Connectors

B2B, XML, EDI and AS2 Connectors

Hybrid Connectors

As we can see, with all these connectors, Logic Apps allows us to very easily and quickly connect to many different protocols, apps, and also to other Azure services. Additionally, the Enterprise Integration Pack brings some of the BizTalk Server functionality to Logic Apps and makes it really easy to implement B2B integrations based on AS2 and EDI.

Azure Functions

Another Azure service that is very useful when building integration solutions is Azure Functions. Azure Functions are the evolution of Azure WebJobs, and allow us to create microservices or small pieces of code that can run synchronously or asynchronously as part of composite and distributed cloud solutions. Azure Functions are built on top of the WebJobs SDK, but with some additional benefits and capabilities, including the option of being deployed on a serverless-style Dynamic Service Plan, in which you pay per consumption only, and also the capability of being triggered via HTTP. Additionally, with Functions you can deploy from very simple to quite complex pieces of code developed in different languages, such as bash (.sh), batch (.bat / .cmd), C#, F#, Node.js, PHP, PowerShell, and Python.

Triggers, Inputs and Outputs

Azure Functions support different triggers, input and output bindings, which I summarise in the table below. These bindings make Functions very easy to be called and allow them to get data from and push data to other microservices, Logic Apps or other systems or services.

| Type / Service | Trigger | Input | Output |
| --- | --- | --- | --- |
| Schedule | * | | |
| HTTP Call | * | * | * |
| Azure Blob Storage | * | * | * |
| Azure Event Hubs | * | * | * |
| Azure Storage Queues | * | * | * |
| Azure Service Bus Messaging | * | * | * |
| Azure Storage Tables | | * | * |
| Azure Mobile Apps Tables | | * | * |
| Azure DocumentDB | | * | * |
| Azure Notification Hubs | | | * |
| Twilio SMS Message | | | * |
| SendGrid Emails (not fully documented yet) | | | * |
| Files in cloud file storage SaaS, such as Box, DropBox, OneDrive, FTP and SFTP (not fully documented yet) | * | * | * |
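As an illustration of how these bindings are declared, a function.json with a Service Bus trigger and a Blob output binding might look like the sketch below (the queue name, blob path and connection setting names are placeholders):

```json
{
    "disabled": false,
    "bindings": [
        {
            "type": "serviceBusTrigger",
            "direction": "in",
            "name": "inputMessage",
            "queueName": "incoming-orders",
            "connection": "ServiceBusConnection"
        },
        {
            "type": "blob",
            "direction": "out",
            "name": "outputBlob",
            "path": "processed/{rand-guid}.json",
            "connection": "StorageConnection"
        }
    ]
}
```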


Integration processes usually require custom data or message validation, enrichment, transformation or routing. These can simply be implemented as custom code in Azure Functions.

Service Bus Messaging

Azure Service Bus Messaging provides reliable message queuing and publish-subscribe messaging capabilities. Azure Service Bus Queues and Topics allow us to decouple in time upstream and downstream systems or different interrelated services. Service Bus Queues provide ordered message delivery to competing consumers, while Service Bus Topics enable publish-subscribe messaging scenarios where multiple consumers can process the same message.

The following Service Bus features make it a key component of most integration solutions on Azure:

  • temporal decoupling;
  • the ability to load balance message processing using competing consumers;
  • the capability of implementing Publish/Subscribe architecture via Service Bus Topics;
  • and the fact that Functions and Logic Apps can very easily read and write messages from and to Service Bus.

API Management

API Management is an Azure service which acts as an API Gateway or front end for backend APIs in the cloud or on-premises. In addition, it provides a Developer Portal which helps to speed up the adoption and use of the implemented APIs. Some of the benefits of API Management are:

  • securing backend APIs;
  • API response caching;
  • request throttling, routing and transformation;
  • API call tracing and logging;
  • usage and health analytics;
  • and a rich portal for developers.

Thanks to all these features, API Management can be leveraged in many integration solutions which need to expose RESTful APIs and require any kind of mediation.

When to use each of these technologies and how they complement each other

Now that I have described the Azure iPaaS offerings, it’s worth analysing when we could use each of these technologies and how they complement each other. I summarise this in the table below.

Logic Apps
  When to use it:
  • To implement and orchestrate visually designed integration workflows.
  • To orchestrate distributed microservices.
  • To leverage the 100+ connectors to interact with different protocols, SaaS systems and services, other Azure services, and on-premises systems.
  • To implement cloud-based B2B integrations with AS2 and EDI.
  Use together with:
  • Functions – so custom logic can be implemented in microservices to be orchestrated by Logic Apps.
  • Service Bus – to decouple in time different microservices and steps in the integration process.
  • API Management – for those HTTP-triggered apps when some of the capabilities of API Management are required.

Azure Functions
  When to use it:
  • To implement code-based microservices or processing.
  Use together with:
  • Logic Apps – so different microservices can be orchestrated.
  • Azure Service Bus Messaging – to decouple in time different microservices.
  • API Management – for those HTTP-triggered functions when some of the capabilities of API Management are required.

Service Bus
  When to use it:
  • To decouple in time upstream systems from downstream systems or different microservices.
  • To implement multiple consumers on a Publish/Subscribe pattern.
  • To allow load distribution with multi-instance competing consumers.
  Use together with:
  • Functions and Logic Apps – to decouple in time different microservices and steps in the integration process.

API Management
  When to use it:
  • When any of the API Management features is required, for example: securing backend APIs, API response caching, request throttling, request routing, request transformation, API call tracing or logging, usage and health analytics, or providing a rich portal for developers.
  Use together with:
  • Functions and Logic Apps – that are triggered by an HTTP call and require some kind of mediation.


Azure provides a very robust Integration Platform as a Service (iPaaS), which is based on Logic Apps and can be complemented with Azure Functions, Service Bus Messaging and API Management

The breadth and capabilities of many different Azure technologies, and how they complement each other, is what differentiates Azure from other iPaaS vendors. We can leverage many different services to build first-class integration solutions. Logic Apps is the core engine to implement these. Logic Apps can connect to many different protocols, SaaS apps and services, to other Azure and Power Apps services, to on-premises systems and to B2B trading partners via a growing list of connectors. Logic Apps integration workflows can easily be extended with custom code implemented as microservices on Azure Functions. In order to decouple these integration processes in time, we can leverage Service Bus Messaging. And in case we need to expose our integration services as RESTful APIs, we might want to make use of all the features and capabilities of API Management. Additionally, these integration solutions can be enhanced by other Azure services, such as Cognitive Services to, for example, get the sentiment of social media feeds, or Azure Active Directory for authenticating and authorising calls to our APIs. All of this with all the PaaS magic and the powerful DevOps capabilities of Azure.

Thanks for reading, and please share your thoughts or queries on this topic below 🙂

Happy clouding!