david wyatt

Deploying Between Environments with Power Automate Instead of Pipelines

Although not a capability from the outset (and I think it should have been), ALM has been a definite focus for Microsoft, first with Azure DevOps pipelines and now the new Power Platform pipelines.
But neither is quite perfect. DevOps is anything but low-code and requires support from your IT department. The new pipelines look promising, though they are still very much in preview (missing features) and they require managed environments. Managed environments are great, with lots of features, but they require all makers AND users to have premium licenses. So that bog-standard SharePoint Power App would require all its users to have a premium license even though it has no premium capabilities.

The great thing about the Power Platform is that it's built on Dataverse (the same place where we can store our data), and it has a great API. Even better, there are Admin/Dataverse connectors also built on the API, so with the right know-how we can pretty much build our own pipeline.

There are a couple of caveats:

  1. The pipeline solution would have to be installed in every environment
  2. The maker/owner of the solution would have to have System Admin/Customizer security role
  3. There is no way to sign in connections on behalf of another maker/owner, so when a solution is imported for the first time a human-in-the-loop step is required to sign in the connection references.

So my pipeline has a couple of components:

  • SharePoint List as trigger and tracker
  • SharePoint List for human in the loop actions
  • Export Flow
  • Import Flow
  • Change Owner Flow
  • An Environment Variable

The solution looks like this:

solution contents

The overarching design plan is below, with each environment passing the solution along.

system diagram

For protection I set the list's item-level permissions so users can only edit items they created; that way others can't trigger people's pipelines.

sharepoint settings

The SharePoint list should have the following fields:

  • Solution Name
  • Version Number
  • From Environment
  • To Environment
  • Deployment (Dev-Test, Test-Prod, Dev-Dev)
  • Change Number (for Test to Prod to show compliance)
  • Service Account (owner of solution in Test / Prod)
  • Account Sign in (used to request manual sign in to connectors)
  • Status (Queued, Exporting, Exported, Importing, Complete, Failed, Not Approved)
  • Error Message (Fail messages or not approved details)
  • Solution Data (where we store the solution)

The first 8 should be filled in by the requester; Status should default to Queued; Solution Data and Error Message are left empty and filled in by the flows.

The Status is used by the flows as the key trigger condition:

  • Queued - triggers export
  • Exporting - while creating export
  • Exported - when export complete and triggers import
  • Importing - while importing (including waiting for approval)
  • Complete - import complete
  • Not Approved - if not approved
  • Failed - any exception
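As a rough illustration, the status lifecycle above can be modelled as a small state machine (a sketch only — the status names come from the list, the transition map is my reading of the flows described below):

```python
# Sketch of the deployment status lifecycle: maps each status to the
# statuses a flow is allowed to move the SharePoint item to next.
VALID_TRANSITIONS = {
    "Queued":       {"Exporting", "Failed"},
    "Exporting":    {"Exported", "Failed"},
    "Exported":     {"Importing", "Failed"},
    "Importing":    {"Complete", "Not Approved", "Failed"},
    "Complete":     set(),   # terminal
    "Not Approved": set(),   # terminal
    "Failed":       set(),   # terminal
}

def can_transition(current: str, target: str) -> bool:
    """Return True if a flow may move the item from current to target."""
    return target in VALID_TRANSITIONS.get(current, set())
```

This kind of check is what the flows' trigger conditions enforce implicitly: the export flow only fires on Queued, the import flow only on Exported.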

As a point of reference, my solution follows my business processes, so yours might not be the same. For me, Dev-Test does not require any approval. Nothing can go straight from Dev to Prod; it has to go Dev to Test to Prod. Test to Prod requires a pre-approved change request (with all test evidence and approvals). Someone then has to validate/approve the change request to ensure it's valid before it imports to Prod. To help control/identify this, all our environments have either DEV, TEST, or PROD in their name, which helps the flow know when to request change approval.

The Export Flow

The export flow is triggered when the From Environment matches its own environment (stored in an environment variable) and the Status is Queued.

```
@equals(triggerOutputs()?['body/FromEnvirnoment'], parameters('Environment (ia_Environment)'))
@equals(triggerOutputs()?['body/Status/Value'], 'Queued')
```

First it updates the SharePoint item to Exporting; this makes sure we don't have any accidental triggers and lets the user know the flow has started.
It will then use the unbound Dataverse action to create a solution export, selecting ExportSolution as the Action Name.

unbound solution export

The outputs('Perform_an_unbound_action')?['body/ExportSolutionFile'] will be stored in the Solution Data field.
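For reference, the unbound action is a wrapper over the Dataverse Web API's ExportSolution action. A hedged sketch of the request it sends is below — the org URL is a placeholder and authentication is omitted, but SolutionName and Managed are the documented parameters, and the response's ExportSolutionFile property is the Base64 payload referenced above:

```python
import json

def build_export_request(solution_name: str, managed: bool = True) -> dict:
    """Build the ExportSolution request the unbound action sends under the hood.
    The response's ExportSolutionFile property holds the Base64-encoded .zip."""
    return {
        "method": "POST",
        # placeholder org URL - substitute your environment's URL
        "url": "https://yourorg.crm.dynamics.com/api/data/v9.2/ExportSolution",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"SolutionName": solution_name, "Managed": managed}),
    }
```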

update item

A backup of the export can be saved as an attachment, but first we need to convert the contents from Base64 to binary; we can then attach it.
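In the flow itself this is the base64ToBinary() expression; in plain code the conversion is just a Base64 decode (a minimal illustration — the sample bytes are fabricated):

```python
import base64

def solution_to_bytes(solution_data: str) -> bytes:
    """Decode the Base64 string stored in the Solution Data field
    back into the raw .zip bytes that can be attached to the item."""
    return base64.b64decode(solution_data)

# Round trip: bytes -> Base64 (as ExportSolution returns it) -> bytes
original = b"PK\x03\x04 fake zip header"
encoded = base64.b64encode(original).decode("ascii")
assert solution_to_bytes(encoded) == original
```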

attach file

So the full flow should look something like this:

full flow

The Import Flow

The import flow is triggered when the To Environment matches its own environment (again stored in an environment variable) and the Status is Exported.
First it updates the SharePoint item to Importing; this makes sure we don't have any accidental triggers and lets the user know the flow has started.
As I mentioned, I have additional steps for Test-to-Prod imports, so the flow first does some checks:

No Approval:

  • Type = Dev-Test or Dev-Dev
  • From Environment includes DEV
  • To Environment does not include PROD

Approval:

  • Type = Test-Prod
  • From Environment includes TEST
  • To Environment includes PROD
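The two patterns above can be expressed as a small routing check (a sketch using the DEV/TEST/PROD environment-naming convention described earlier; the function and return values are illustrative, not part of the flow itself):

```python
def approval_route(deployment_type: str, from_env: str, to_env: str) -> str:
    """Decide whether an import needs change approval, based on the
    deployment type and the DEV/TEST/PROD convention in environment names."""
    if (deployment_type in ("Dev-Test", "Dev-Dev")
            and "DEV" in from_env and "PROD" not in to_env):
        return "no approval"
    if (deployment_type == "Test-Prod"
            and "TEST" in from_env and "PROD" in to_env):
        return "approval required"
    # anything that matches neither pattern is reported as an exception
    return "reject"
```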

I use the SharePoint list with the human-in-the-loop names to create the approvers for the approval flow. If the request does not follow one of the two patterns, an exception is reported.

power automate flow

We use the unbound action again, but this time with Action Name ImportSolution. The export contents were stored in the Solution Data field, and we need to create a unique import GUID.

unbound import

```
guid()
```
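Under the hood, ImportSolution takes the Base64 customisation file plus the fresh GUID as its ImportJobId. A hedged sketch of the request body follows — the org URL is a placeholder, and the OverwriteUnmanagedCustomizations/PublishWorkflows values shown are my assumptions, not settings from the article:

```python
import json
import uuid

def build_import_request(solution_data_b64: str) -> dict:
    """Build the ImportSolution request: the Base64 export from the
    Solution Data field plus a unique ImportJobId (the guid() in the flow)."""
    return {
        "method": "POST",
        # placeholder org URL - substitute the target environment's URL
        "url": "https://yourorg.crm.dynamics.com/api/data/v9.2/ImportSolution",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({
            "CustomizationFile": solution_data_b64,
            "ImportJobId": str(uuid.uuid4()),  # must be unique per import
            "OverwriteUnmanagedCustomizations": False,  # assumed setting
            "PublishWorkflows": True,                   # assumed setting
        }),
    }
```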

At this point we have successfully imported our solution, but there are a couple of things still to do.

First we need to check if the connections need updating. As I said, we can't create those connections without signing in as that account. But once connected, future imports will be fine and can use the signed-in connection references.

This is just another simple piece of logic that sends an approval request; once the approvers have completed the sign-ins, they set it to approved, i.e. complete (again, the SharePoint list with the human-in-the-loop people is used for the approvers).
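One way to check whether references still need signing in is to query Dataverse's connectionreferences table for rows with no connection bound yet. A sketch of the OData query is below — the table and column names are from the connection reference schema as I understand it, so verify them in your environment:

```python
def unbound_connection_refs_query(org_url: str) -> str:
    """Build an OData query for connection references that have no
    connection assigned yet, i.e. ones a human still needs to sign in."""
    return (
        f"{org_url}/api/data/v9.2/connectionreferences"
        "?$select=connectionreferencelogicalname"
        "&$filter=connectionid eq null"
    )
```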

human in the loop logic

Next we need to change the owner of the solution, as all of its contents (flows, apps, connection references, environment variables, etc.) are now owned by the account that ran the import flow.
To keep things simple this is a separate child flow, and I will go over it in the next blog (it's useful outside of pipelines too). If you plan to use the same account, you can skip this stage.

Then, as before, the final step is to update the SharePoint item, but this time to Complete.

full flow


As you can see it's not exactly simple, but I wouldn't say difficult, and if you want to save money on premium licenses it could be the way to go.

I'm hopeful that in the future I might be able to automate the connections, as connection data is stored in another Dataverse table too.

Top comments (10)

Bala Madhusoodhanan

@wyattdave Love the post... "Every problem is an opportunity to invent new possibility"

Oliver Burgos

Hey David this was a great post! It was so good that I joined the dev.to community right away!

Unfortunately, the pictures are not clear and it is difficult to replicate these flows to test them.

Would it be possible for you to somehow share all the actions expanded?

Maybe attach a PDF to this article or to a new one?

Thanks!!!!

david wyatt

Hi Oliver, let me see if I can find the solution and share it with you if that will help. It's a little out of date now as we can use the new Dataverse connectors across environments (so don't need to have in every environment now)

Oliver Burgos

That would be awesome! And many thanks for answering so quickly!

This sounds like a great improvement! --> "we can use the new Dataverse connectors across environments (so don't need to have in every environment now)"

david wyatt

I've uploaded it here: github.com/wyattdave/Power-Platfor.... Like I said, it's a bit out of date and will need some work to make it fit your setup, but hopefully it points you in the right direction. Any questions, let me know 😊

Oliver Burgos

This is awesome! Thank you David!

Thangaraj Moorthi

Will the above steps help to import the solution into all other environments?
Based on the blog above, it doesn't seem the solution is imported into another environment from the one where it is deployed. Could you please point out where you switch environments when importing a solution?

Thank you.

david wyatt

The pipeline automatically exports and imports, but you have to have the pipeline solution deployed in all environments.
The Export flow runs in the environment with the solution. The import flow is triggered in the environment you want to import it into.
I use trigger conditions on the SharePoint list: each import flow uses an environment variable in its trigger condition, so on each import of the pipeline solution, set the environment variable to that environment.

Thangaraj Moorthi

Thank you, David for taking some time and replying to my query.

david wyatt

No problem, if you have any more questions let me know