
david wyatt


Why your Power Platform Setup Needs a Repo

A repo (repository) is simply a place to store what you develop. GitHub is probably the most famous repository (though it does more than just that), but there are others like Bitbucket, Artifactory and Assembla.

Your work can be stored as code (editable individual files) or as packages (aka artifacts). There are multiple benefits, but the main ones are:

  • Source control (different save versions)
  • Redundancy (backup)
  • Multi-dev working (split/merge different versions)
  • Integrated deployments (Dev/Test/Prod)

So the question you are now asking is: why would I need a repo in the Power Platform? My apps, and now even flows, have version history. Each environment has a copy of the solution, and I store copies when I deploy manually or through Pipelines (a copy of the solution is stored in Dataverse). But there are a few reasons, and it is also easier than you think, so let's dive in.

  1. Why - what are the benefits?
  2. How to set up

1. Why - what are the benefits?

Connections
There is one big, big problem in the Power Platform that also happens to be its biggest strength: connections. Connections are great, you use your own so there's no messing around setting up SPNs (Service Principal Names), but now we have a problem: they are your connections. That means we really shouldn't be sharing them (hacked by power automate and how to avoid it). This is fine in Default for your personal solutions, but what about when it's a business solution? If you want other devs to be able to update/maintain the flows/apps, you have to give them access to your connections, which not only causes security issues but can cause export issues (if a flow uses a different person's connection reference, you can't see it in the solution, so it will export without it, causing dependency issues).

And this is where a repo comes in: once development is finished, the solution is uploaded to the repo and deleted from the environment. The next dev who needs to work on it simply downloads it and imports it with their own connections.

repo process
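To make that re-import step concrete, here is a minimal sketch using the Dataverse Web API's ImportSolution action. The environment URL, token acquisition and zip file name are placeholders, not part of any real setup:

```python
# A minimal sketch of the re-import step via the Dataverse Web API's
# ImportSolution action. ENV_URL, TOKEN and the zip file name are
# placeholders - swap in your own environment and auth details.
import base64
import uuid
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<bearer token>"  # e.g. acquired via MSAL; not shown here

with open("MySolution_1_0_0_1.zip", "rb") as f:
    solution_b64 = base64.b64encode(f.read()).decode("ascii")

resp = requests.post(
    f"{ENV_URL}/api/data/v9.2/ImportSolution",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
        "OData-Version": "4.0",
    },
    json={
        "CustomizationFile": solution_b64,
        "OverwriteUnmanagedCustomizations": True,
        "PublishWorkflows": True,
        # ImportJobId lets you poll the importjob table for progress
        "ImportJobId": str(uuid.uuid4()),
    },
)
resp.raise_for_status()
```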

Code Consistency
I have seen it far too often: a solution is deployed to prod, then the developer 'tinkers' with the dev version, trying new things out. They stop and forget, and when a new update is required these unlogged changes cause unexpected behaviour and bugs. Or, on the flip side, there is a bug in prod and you can't recreate it in dev because you are actually working on a different version.

Again, going to the repo ensures you are always working off the right version.

Housekeeping
Dev environments can get very, very messy very quickly. They can become hard to navigate and monitor. Not to forget that it is all stored in Dataverse, where we have limited capacity (and it is not cheap to buy more). One solution is to burn the environment (delete it after x days), and this is a good approach, but it means you need a repo even more, or solutions could be lost.

Storing in a repo keeps your dev environment tidy and will most likely save you money in the long run.

Backups
Although Power Platform environments do have recovery backups, it is good practice to diversify your storage. Storing on different platforms ensures you do not lose your data if any accidents/disasters happen. It also protects against personal errors (if a dev deletes a solution by mistake, an environment rollback would then wipe every other developer's work since the backup).
 
backups


2. How to set up

There are two main ways to store your solution: as code or as an artifact.

If you want to go the code route, then for Power Apps you would need to use Git version control (it's been experimental for 2+ years now), which integrates directly with GitHub. I won't go into this as there is good documentation here.

For Power Automate you have more options, as at its core a flow is a row in a Dataverse table (workflow), so you can store it in any database. You would also need to do the same for connection references, environment variables and other components. All doable (I've gone into detail on what's in a solution here), but I would strongly recommend going with the easier artifact approach.
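To illustrate what 'a row in a Dataverse table' means in practice, here is a minimal sketch (the environment URL and token are placeholders again) that lists the cloud flows in an environment straight from the workflow table:

```python
# A minimal sketch showing that cloud flows really are rows in the Dataverse
# "workflow" table: category 5 = modern (cloud) flow, and the flow's
# definition JSON lives in the "clientdata" column. ENV_URL/TOKEN are
# placeholders as before.
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<bearer token>"

resp = requests.get(
    f"{ENV_URL}/api/data/v9.2/workflows",
    headers={"Authorization": f"Bearer {TOKEN}", "OData-Version": "4.0"},
    params={
        "$select": "name,clientdata",
        "$filter": "category eq 5",  # 5 = modern (cloud) flow
    },
)
resp.raise_for_status()
for flow in resp.json()["value"]:
    print(flow["name"])  # clientdata holds the flow definition JSON
```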

With the artifact approach we let the Power Platform do all the packing, and we just download the completed zip file with everything we need.

As we are just storing a zip file, it can live anywhere; the easy option is obviously SharePoint, but if you want to do it properly I would still recommend somewhere like Artifactory.

The Dataverse API has the ExportSolution action, which creates the solution export: you pass in some key information and it returns the zip file. This makes it easy to integrate with pro-code solutions, but the easier way is to leverage Power Automate and the Dataverse Unbound Action (which calls the same API).
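If you do want the pro-code route, here is a minimal sketch of that call, assuming the same placeholder environment URL and bearer token as before:

```python
# A minimal sketch of the pro-code route: calling the ExportSolution action
# directly. The response carries the zip as a base64 string in
# ExportSolutionFile. ENV_URL/TOKEN are placeholders as before.
import base64
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<bearer token>"

resp = requests.post(
    f"{ENV_URL}/api/data/v9.2/ExportSolution",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
        "OData-Version": "4.0",
    },
    json={"SolutionName": "MySolution", "Managed": False},
)
resp.raise_for_status()

zip_bytes = base64.b64decode(resp.json()["ExportSolutionFile"])
with open("MySolution.zip", "wb") as f:
    f.write(zip_bytes)
```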

The below example exports the solution and creates an item in a SharePoint list. It stores some key information and a unique reference (the run GUID), then attaches the export to the item.

export flow
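For comparison, here is a hypothetical pro-code analogue of that storage step; a JSON sidecar on disk stands in for the SharePoint list item, and the zip bytes are assumed to come from the ExportSolution call above:

```python
# A hypothetical pro-code analogue of the flow's storage step: tag each
# export with a run GUID and keep a small metadata record beside the zip
# (a JSON sidecar on disk standing in for the SharePoint list item).
# zip_bytes is assumed to come from the ExportSolution call above.
import json
import uuid
from datetime import datetime, timezone
from pathlib import Path

def archive_export(solution_name: str, zip_bytes: bytes, repo_dir: str = "exports") -> str:
    run_id = str(uuid.uuid4())  # unique reference, like the flow's run GUID
    out = Path(repo_dir)
    out.mkdir(exist_ok=True)
    (out / f"{solution_name}_{run_id}.zip").write_bytes(zip_bytes)
    metadata = {
        "solution": solution_name,
        "runId": run_id,
        "exportedAt": datetime.now(timezone.utc).isoformat(),
    }
    (out / f"{solution_name}_{run_id}.json").write_text(json.dumps(metadata, indent=2))
    return run_id
```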

This can be called when needed (like when a dev finishes for the day), on a schedule, or during the deployment process.

trigger options


Repos can be thought of as another 'burden' to maintain, with an 'it's low code so I don't need pro-code processes' approach, or an 'it's all done for me by Microsoft' one, but what we should really be looking at is the value. If the solution has high value, then we need to be risk averse, and this is where established pro-code processes like ALM (Application Lifecycle Management) and repos come in (why re-invent the wheel?).
