Due to its enterprise use, parallel runs, and ample allowances, we rarely think about the need to optimize our flows.
Yet there are a couple of genuine reasons to, the first being Power Platform API calls. Every variable initialization, connector call, and internal loop step uses a Power Platform API call, and a standard user account only allows a maximum of 40k per day (https://docs.microsoft.com/en-us/power-platform).
Then there is completion time. It's not something we usually think about, but when a flow is called by an app or a virtual agent, response times can be critical.
And finally, there is the warm feeling you get when you know your flow is super optimized.
So I thought I would share my top 5, each with an example of how it can save either Power Platform API calls or time.
1. Array Positioning
This one you probably already know, but just in case (and because it was the first optimization trick I learned): when you query a SharePoint list or other data source with a filter that should return one result, Power Automate will auto-add an 'Apply_to_each', as it doesn't know only one result will be returned and so treats the output as an array.
To get around this you can use array positions:
outputs('Get_items')?['body/value'][0]?['ID']
The [0] represents the first position ([2] would be the 3rd), so when this is set as a variable you are setting the first ID, not an array of IDs.
- Time savings: 0% (they were the same to the millisecond)
- API calls saved: 1
No time saving and only one API call saved, but it definitely makes the flow look a lot neater.
There are lots more uses, like patterned returns (every odd row, sketched below) and splitting one variable to make API savings and decrease flow times.
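As a hedged sketch of the every-odd-row pattern (this assumes dynamic indexing in a 'Select' whose 'From' is a number range; 'ID' is just an example field):
From: range(0, div(add(length(outputs('Get_items')?['body/value']), 1), 2))
Map: outputs('Get_items')?['body/value'][mul(2, item())]?['ID']
Each number in the range doubles as an index, so only positions 0, 2, 4 and so on (the odd rows) are returned, with no loop and no extra API calls.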
2. Office Scripts
Office Scripts often deliver unparalleled performance, and I would recommend reading my other blog about writing to Excel for more detail, especially around large datasets.
Out of the box, the only way to write data to Excel is the 'Create_row_into_table' action, which has to sit in a loop. This can be very slow, and it uses a Power Platform API call for every row. An Office Script can do the same job, and not only is it quicker, it takes just 1 API call.
The below script just needs 'colA', 'colB', etc. replaced with your field names, in both the for loop and the interface:
function main(workbook: ExcelScript.Workbook, param: tab[]) {
  let wb = workbook.getWorksheet("Sheet1");
  let rowCount = param.length;
  let starterrow = 2;
  // Clear any existing data rows, leaving the header row in place
  let count = wb.getUsedRange().getRowCount();
  wb.getRange("2:" + count).delete(ExcelScript.DeleteShiftDirection.up);
  // Write each incoming object to its own row (columns A to D)
  for (let i = 0; i < param.length; i++) {
    const currentObject = param[i];
    const formattedrow = [[currentObject.colA, currentObject.colB, currentObject.colC, currentObject.colD]];
    const rowRange = `A${starterrow + i}:D${starterrow + i}`;
    wb.getRange(rowRange).setValues(formattedrow);
  }
  return rowCount + " Order Lines in total";
}

// Update the field names and types here to match your table
interface tab {
  colA: string;
  colB: string;
  colC: number;
  colD: string;
}
- Time savings: 97.5%
- API calls saved: 100 (based on 100 items)
The uses of Office Scripts are almost endless, with complex looping easily replaced by a few lines of code, plus additional functionality like regular expressions.
3. Xpath Expression
The standard way to sum an array would be to loop over every item and increment a variable with its value. Over large datasets that is a lot of API calls and can be very slow.
However, this is where xpath can help. It can reference anywhere in an XML structure, but the cool bit is that it has built-in functions like sum.
To get there we need 3 steps (sketched after this list):
- a 'Select' to create an array of the numbers,
- a 'Compose' to create an XML-compliant object (root/fieldName),
- and finally a 'Compose'/'Set_variable' to store the sum in, by converting the object to XML and using xpath.
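As a minimal sketch (the source field 'Amount' is hypothetical; the 'number' key must match the xpath below), the 'Select' map and the 'ComposeConvert' input could look like:
Select Map (text mode): @{item()?['Amount']}
ComposeConvert:
{
  "root": {
    "number": @{body('Select')}
  }
}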
So with the above in place, we would store the sum with:
xpath(xml(outputs('ComposeConvert')), 'sum(/root/number)')
The above uses the 'xml' expression to convert the output of 'ComposeConvert' to XML, and returns the sum of the 'number' items.
- Time savings: 99.9% (yep, xpath took almost 0 seconds)
- API calls saved: 98 (based on 100 items)
Xpath can do Min, Max, Average, and Count as well as Sum. It can also use And/Or operators to filter across fields, and much, much more.
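For example, swapping the function in the same expression returns a count instead of a sum ('count' being a standard XPath function):
xpath(xml(outputs('ComposeConvert')), 'count(/root/number)')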
4. Avoid Loops in Loops
There are a few times when a loop within a loop is required. The 2 most common I come across are looking up a field in a different SharePoint list, and 'Attachments' within 'Get_Emails'. The problem is that performance can be very poor, so to improve it we can use a little-known behaviour within Power Automate: 'Select' and 'Compose' actions called in a loop act like a single item within the loop, but referenced outside the loop they return an array.
So in a SharePoint example we use a 'Select' to get the fields from the second 'Get_items' instead of looping over them.
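As a sketch (the inner action name 'Get_items_2' and the 'ID' field are just examples), the 'Select' inside the 'Apply_to_each' could be configured as:
From: outputs('Get_items_2')?['body/value']
Map: { "ID": @{item()?['ID']} }
Referenced after the loop, outputs('Select') then returns one entry per iteration.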
The only issue is that this creates an array of arrays:
[{"body":[{"ID":1}]},{"body":[{"ID":2}]},{"body":[{"ID":3}]},{"body":[{"ID":4}]},{"body":[{"ID":5}]}]
So we need to convert it to a string so that we can remove the '[{"body":' from the beginning and the '}]' from the end, then replace all the ']},{"body":[' strings with ',' so that we get:
[{"ID":1},{"ID":2},{"ID":3},{"ID":4},{"ID":5}]
This is then the input for the 'Parse_JSON' (so we can turn it back into an array). The substring starts at index 9 (skipping the nine characters of '[{"body":', which leaves the inner '[' as the new opening bracket), with the length shortened by 11 (the 9 skipped at the start plus the closing '}]'). Finally we wrap it in another replace to remove any ',,', as an empty return would leave nothing between the commas. So you get:
replace(replace(substring(string(outputs('Select')),9,sub(length(string(outputs('Select'))),11)),']},{"body":[',','),',,',',')
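The 'Parse_JSON' schema for the cleaned-up string would then be something like this (a sketch assuming only the 'ID' field):
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "ID": { "type": "integer" }
    }
  }
}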
- Time savings: 64%
- API calls saved: 99 (based on 100 by 1 items)
If the 'Get_items' inside the loop returned more than 1 item, the savings would be exponentially greater.
Moving everything out of loops, or replacing loops with a 'Select', will always improve the performance of a flow.
5. HTTP Connectors vs Out of the Box
Probably my favourite, as this saved an app I was making due to slow load times. When we use the SharePoint 'Get_items' action it is set to return all data, and that includes a lot of hidden/unnecessary metadata which you have to receive along with the fields you actually want. But the 'Send_an_HTTP_request_to_SharePoint' action allows you to set the OData metadata level in the 'Accept' header to 'nometadata'. This is a game changer, as you can now strip out all that metadata. The performance improvement is massive and only gets bigger with larger datasets:
- Time savings: 92% (based on 1000 items)
- API calls saved: 0
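For reference, a minimal sketch of the action's configuration (the 'Orders' list name and the $select fields are hypothetical):
Method: GET
Uri: _api/web/lists/getbytitle('Orders')/items?$select=ID,Title
Headers: { "Accept": "application/json;odata=nometadata" }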
HTTP actions for SharePoint and the new Outlook HTTP action give you the full suite of the underlying API, with much more control and additional features.
So there are my top 5. Each has many more applications, and there are plenty more optimization tricks out there (like trigger conditions), but these are the ones I use the most.