Over the past few weeks and months, I’ve posted quite a few blog posts about working with JSON files, particularly in Power Automate, and interacting with APIs.
The content has come from what I’ve learned integrating the Power Platform with third-party APIs. It’s been enjoyable and at times frustrating, so I’ve collated and summarised each of those posts as a point of reference for anyone out there who finds themselves working with JSON.
Table of Contents
The Structure of JSON
JSONs can be complex and are often intimidating. I definitely felt that way initially. Sometimes I got the right results but didn’t always completely understand why!
It’s worth investing some time to understand how JSONs are constructed and I hope the posts below help do that.
In this article I explain the basics of JSON including what denotes a JSON array and what constitutes an object with key/value pairs.
I also discuss how you can transform arrays, and cover the essentials of parsing a JSON: what parsing actually means and what the benefits are.
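To illustrate the building blocks, here is a minimal example (the names and values are invented for illustration): square brackets denote an array, curly braces denote an object, and each object holds key/value pairs.

```json
{
  "orders": [
    { "id": 1, "customer": "Contoso", "total": 49.99 },
    { "id": 2, "customer": "Fabrikam", "total": 12.50 }
  ]
}
```

Here `orders` is a key whose value is an array of two objects, and `"id": 1` is a key/value pair within the first object.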
My objective in this post was to describe a JSON structure, and how to use the knowledge gained to extract the particular data you require using Power Automate. I explain how to target specific arrays in a JSON, the objects they contain and how to count the number of objects that are returned.
I also discuss how to make use of free online tools such as JSON formatters to visualise and navigate JSON structures effectively.
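As a sketch of what targeting looks like in practice, assume a parsed response contains an array called `orders` (a hypothetical name). Expressions along these lines retrieve the first object in the array and count the objects returned:

```
First object in the array:
    body('Parse_JSON')?['orders']?[0]

Number of objects returned:
    length(body('Parse_JSON')?['orders'])
```

The `?` operator makes each step null-safe, so a missing property returns null rather than failing the expression.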
When constructing a JSON in a Canvas app, even if your code is written perfectly, sometimes through no fault of your own, carriage returns and line breaks are inadvertently introduced. This is likely to result in API call failures.
I suffered from this and it took me days to track down the root cause. I now always use the Power Fx ‘Substitute’ function to remove possible carriage returns and new line characters from a newly constructed JSON. This simple step can save hours of frustration and helps ensure reliable API interactions.
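A hedged sketch of the pattern, assuming a Canvas app variable `varJSON` that holds the newly constructed JSON (`Char(13)` is a carriage return, `Char(10)` a line feed):

```
Set(
    varCleanJSON,
    Substitute( Substitute( varJSON, Char(13), "" ), Char(10), "" )
)
```

Nesting the two `Substitute` calls strips both characters in a single step before the JSON is sent to the API.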
Preparing JSON for Parsing
When you receive your API response from an HTTP action, the returned JSON can be very complex. Simplifying or removing unnecessary content can be a great help.
This post introduces a technique to reduce the JSON returned from an API in Power Automate to just the data you require, before parsing.
Using the ‘Select’ action, focus on the specific section of the JSON that you need, discarding the unnecessary parts to create a more manageable dataset. This simplification facilitates better data extraction with the Parse JSON action and hence easier use of data in the subsequent flow actions.
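As an illustrative sketch (the field names are hypothetical), a ‘Select’ action with its From input pointed at the returned array, and a Map that keeps only the fields you need, might look like this:

```
From:
    body('HTTP')?['value']

Map:
    "Name":  item()?['displayName']
    "Email": item()?['mail']
```

The output is a smaller array containing only `Name` and `Email`, which is far easier to parse and work with downstream.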
When the data returned from an API call is complex and contains more information than you need, follow the recommendation in this post. I explain how to extract data from nested JSON structures using Power Automate, a common requirement when processing API responses.
The ‘Select’ action is used to directly map and flatten nested components into a simplified array. This approach avoids the need for ‘Apply to each’ loops, resulting in more efficient flows.
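For instance (with a hypothetical structure), if each item in a `customers` array contains a nested `address` object, ‘Select’ can map the nested values straight into a flat array without a loop:

```
From:
    body('Parse_JSON')?['customers']

Map:
    "Name": item()?['name']
    "City": item()?['address']?['city']
```

Each output object then has `Name` and `City` at the top level, with the nesting removed.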
Good practice for Parsing JSON
When you are ready to Parse your JSON, take a moment to think about the data.
What will you do if some expected data is not returned by the API? Will it generate an error and cause the Parse JSON action to fail? 99% of your API calls may return the data you expect but be prepared for the 1% that may not.
In this post I explain how you can build flexibility into your flows by improving the format of a JSON schema to handle incomplete data, in particular when an API returns a null value.
Null values may be a valid response from an API, but unless you specifically allow nulls when defining your JSON schema in the Parse JSON action, an error could be returned. It’s vital to avoid data errors so your flow doesn’t fail.
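The key change is in the ‘Parse JSON’ schema: allow more than one type for a property. A minimal sketch, with an illustrative property name:

```json
{
  "type": "object",
  "properties": {
    "email": { "type": ["string", "null"] }
  }
}
```

With `"type": ["string", "null"]`, a null `email` in the response no longer causes the Parse JSON action to fail.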
This article focuses on validating the JSON data returned by APIs to ensure all required fields are present before processing.
By defining required properties in the ‘Parse JSON’ action schema, the flow can check for missing data or empty strings and handle them appropriately. This can allow the initiation of alternative actions, so that any incomplete datasets can be managed effectively.
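A sketch of a schema that declares required properties (the names are illustrative):

```json
{
  "type": "object",
  "properties": {
    "id":   { "type": "integer" },
    "name": { "type": "string" }
  },
  "required": ["id", "name"]
}
```

A follow-up ‘Condition’ such as `empty(body('Parse_JSON')?['name'])` can then catch empty strings and route them to an alternative branch.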
Making Use of JSON Data
Now you’ve parsed the JSON and have the data you need in Power Automate, how will you use it? You may want to transform or merge arrays, or even pass the JSON to a child flow. Here is how you can do just that.
This article discusses how to handle scenarios where an API returns an empty string. Empty strings in JSON can lead to errors, particularly when inserting data into Dataverse.
Using the Power Automate ‘Coalesce’ function, you can replace empty strings with null values before processing. By implementing this approach, you can ensure that flows handle incomplete data gracefully without any failures.
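As a hedged sketch of the substitution itself (the field name is hypothetical), an expression inside a ‘Select’ or ‘Compose’ action can turn an empty string into a null before the row is written to Dataverse:

```
if( empty(item()?['phone']), null, item()?['phone'] )
```

The `empty` check matches both empty strings and nulls, so the output is consistently null whenever no real value was returned.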
There are two methods for handling JSON arrays in Power Automate: ‘Apply to each’ loops and the ‘Select’ action.
If specific actions are required for each item in an array, for example, sending an email, then a loop is required.
However, if your objective is to simply transform the contents of an array, then the ‘Select’ action is a much more efficient alternative. ‘Select’ enables streamlined and faster flows when manipulating JSON data.
The ‘Union’ function is employed within a ‘Compose’ action to combine arrays in Power Automate.
This approach appends the second array to the first while removing any duplicate elements. The merged result is stored in a new array variable for further use.
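A minimal sketch inside a ‘Compose’ action, assuming two array variables named `varArray1` and `varArray2`:

```
union( variables('varArray1'), variables('varArray2') )
```

`union` returns the combined array with duplicate elements removed; the ‘Compose’ output can then be stored in a new array variable for later actions.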
Passing a JSON array from a parent flow to a child flow in Power Automate can be a challenge given that child flows with manual triggers do not accept arrays as input types.
The solution is to convert the JSON array into a string in the parent flow and pass it as ‘Text’ to the child flow. In the child flow, the string is then parsed back into a JSON array, enabling seamless data transfer between flows without input type constraints.
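As a hedged sketch (the variable name and the ‘Text’ trigger input name are illustrative), the two conversions look like this:

```
Parent flow – value passed to the child flow's Text input:
    string( variables('varOrders') )

Child flow – restore the array from the trigger input:
    json( triggerBody()?['text'] )
```

`string()` serialises the array for the manual trigger’s text input, and `json()` turns it back into a usable array on the other side.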
Happy New Year!
The end of 2024 brings to a close my posts on JSON – at least for the moment. So for a last time, happy parsing and a Happy New Year!