Microsoft Flow

Microsoft’s automation service, Microsoft Flow, has been upgraded to include 5 new services and improved capabilities for working with JSON and HTTP.

The 5 new services include: Azure Data Lake, Bitbucket, Eventbrite, Infusionsoft and Pipedrive.

Azure Data Lake allows you to read and add data to an Azure Data Lake account.

Bitbucket is a web-based hosting service for projects that use Git revision control.

Eventbrite is a self-service ticketing platform used to create and discover local events. Eventbrite has triggers for when events are created or when someone orders a ticket for an event. This means you can copy events to other systems or keep a separate log of all the event attendees.

Infusionsoft is sales and marketing automation software built exclusively for small business success.

Finally, Pipedrive is a CRM & pipeline management tool that helps you focus on actions that matter. There are actions for adding deals or updating the stage of a deal.

Define HTTP Authentication

Although we now connect natively to 95+ different services, you may also have HTTP endpoints that you want to work with that aren’t natively supported by Microsoft Flow. Microsoft Flow has long supported interacting with custom, unauthenticated HTTP endpoints, and it’s now possible to set up calls to authenticated HTTP endpoints as well. To connect to a custom HTTP endpoint, first select the HTTP service when you add an action:

There are three possible actions:

  • HTTP – a basic call to any endpoint
  • HTTP + Swagger – if you have a Swagger endpoint for your API, you can provide it here and Flow will generate an experience where you can easily fill out the inputs for the action
  • HTTP Webhook – call out to another endpoint and wait for a response to come back (i.e., a callback); see the sketch after this list
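To make the HTTP Webhook pattern concrete, here is a heavily simplified Python sketch of the two halves of a callback: the caller subscribes with a callback URL and stops waiting on the response, and the external service later posts its result to that URL. All URLs and payload fields below are hypothetical placeholders, not a real API.

```python
import requests

# Hypothetical external API that supports callbacks.
SUBSCRIBE_URL = "https://api.example.com/v1/long-running-job"

# URL the external service should call once the work is done
# (in a flow, this callback URL is generated and managed for you).
CALLBACK_URL = "https://example.com/flow/callback/abc123"

# 1. The caller registers the callback and returns immediately.
requests.post(SUBSCRIBE_URL, json={"callbackUrl": CALLBACK_URL}, timeout=30)

# 2. Much later, the external service resumes the waiting workflow by
#    posting its result to the callback URL it was given.
requests.post(CALLBACK_URL, json={"status": "completed", "result": 42}, timeout=30)
```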

After you have selected the action, select Show advanced options and then select Authentication. The default is None, but you can choose one of four other types of authentication. After you select the endpoint’s authentication type you will be able to configure any specific data you need to send.
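As a rough illustration of what the plain HTTP action does on your behalf when you pick Basic authentication, the Python sketch below sends the same kind of authenticated request. The endpoint URL and credentials are placeholders, not a real service.

```python
import requests

# Placeholder endpoint and credentials -- substitute your own authenticated API.
ENDPOINT = "https://api.example.com/v1/orders"

# Basic authentication: the username/password pair is sent in the
# Authorization header, which is what the action configures for you when
# you choose Basic under Show advanced options > Authentication.
response = requests.get(ENDPOINT, auth=("flow-user", "s3cret"), timeout=30)
response.raise_for_status()

print(response.json())
```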

Parse JSON messages

When working with the Request trigger or the HTTP actions, you may need to parse out JSON data. This week we added a new data operation called Parse JSON. This action takes:

  1. The content you want to parse
  2. A JSON schema to parse the content against

When the flow runs, it will evaluate the content against the schema. This means you can use the different fields you defined in the schema throughout the rest of your workflow as dynamic content.
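As a rough analogue of what Parse JSON does, the Python sketch below validates a made-up ticket-order payload against a hand-written JSON schema using the third-party `jsonschema` package, then reads individual fields the way dynamic content exposes them in later steps. Flow performs its own evaluation; this is only an illustration.

```python
import json
from jsonschema import validate  # pip install jsonschema

# Hand-written schema -- in Flow, this is the schema you give to Parse JSON.
schema = {
    "type": "object",
    "properties": {
        "eventId": {"type": "integer"},
        "attendee": {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "email": {"type": "string"},
            },
        },
    },
    "required": ["eventId", "attendee"],
}

# Made-up payload, e.g. the body of a Request trigger or an HTTP response.
payload = json.loads(
    '{"eventId": 42, "attendee": {"name": "Ada", "email": "ada@example.com"}}'
)

# Evaluate the content against the schema; a mismatch raises ValidationError.
validate(instance=payload, schema=schema)

# Fields defined in the schema can now be used individually, much like
# dynamic content in the rest of a workflow.
print(payload["attendee"]["email"])
```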

To make it easier to generate a JSON schema, we have also added a Use sample payload button. When you select this button, you can paste in your JSON content and it will attempt to generate a JSON schema that matches that payload. We have also added this new button directly on the HTTP Request card.
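Conceptually, Use sample payload infers a schema from one example document. The short Python sketch below shows a simplified version of that idea with a hand-rolled inference function; Flow's actual generator is more sophisticated, and the sample payload here is invented.

```python
import json


def infer_schema(value):
    """Infer a minimal JSON schema fragment from a sample value."""
    if isinstance(value, dict):
        return {
            "type": "object",
            "properties": {k: infer_schema(v) for k, v in value.items()},
        }
    if isinstance(value, list):
        # Use the first element as representative of the array's items.
        return {"type": "array", "items": infer_schema(value[0]) if value else {}}
    if isinstance(value, bool):  # check bool before int (bool is a subclass of int)
        return {"type": "boolean"}
    if isinstance(value, int):
        return {"type": "integer"}
    if isinstance(value, float):
        return {"type": "number"}
    if value is None:
        return {"type": "null"}
    return {"type": "string"}


sample = json.loads(
    '{"eventId": 42, "attendee": {"name": "Ada", "email": "ada@example.com"}}'
)
print(json.dumps(infer_schema(sample), indent=2))
```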

Better filtering for flow runs

Finally, we have added additional options for filtering the runs of your flow. Previously, you could only see succeeded or failed runs when you selected a flow and looked at its history.

Now you can also filter to see just the runs that are currently in progress, or the runs that were cancelled. You can also filter specifically to see failed checks (meaning the trigger did not fire because of an error) or failed runs (meaning one or more of the actions did not execute successfully).
