
Power Platform ALM - Enhanced Source Control

Maintaining healthy source control is an important element of Application Lifecycle Management (ALM). Microsoft has released the Power Platform Build Tools, which allow you to export your solution as configuration files into an Azure DevOps source repository.

There are plenty of blogs that explain how to set this up with Azure DevOps pipelines if you have not done it before. Here is a quick screenshot of what that pipeline looks like, along with the YAML for you to copy and paste.

pool:
  name: Azure Pipelines

variables:
  SolutionName: 'PurchaseRequestSolution'

steps:
- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.tool-installer.PowerPlatformToolInstaller@0
  displayName: 'Power Platform Tool Installer'

- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.export-solution.PowerPlatformExportSolution@0
  displayName: 'Power Platform Export Solution'
  inputs:
    PowerPlatformEnvironment: ''
    SolutionName: '$(SolutionName)'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'

- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.unpack-solution.PowerPlatformUnpackSolution@0
  displayName: 'Power Platform Unpack Solution'
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
    SolutionTargetFolder: '$(Build.SourcesDirectory)\$(SolutionName)'

- script: |
    echo commit all changes
    git config user.email ""
    git config user.name "Automatic Build"
    git checkout main
    git add --all
    git commit -m "solution init"
    echo push code to new repo
    git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin main
  displayName: 'Command Line Script – Commit to Repo'

If you are using the above YAML, make sure to fix the service connection in the Export task and set the SolutionName variable in your pipeline.

On successful execution of the pipeline, you will see your solution extracted and organized into folders and files in your repo.

For Dataverse-based solutions, most of these configuration files are XML, as they were inherited from the Dynamics XRM base. These XML files are nicely formatted, which makes it easy to compare them between branches and commits for change tracking.

However, the more modern components of Power Platform, such as Power Automate flows, are stored as minified JSON files without any line breaks or formatting. The comparison of these JSON files therefore looks something like this: with all the code on a single line, the diff utility cannot pick up the exact change the way it can in the XML files.

I have written the following script; when added to your pipeline, it iterates through all the JSON files and prettifies them. Add this script between the unpack and commit tasks of the original pipeline, and make sure that the $solutionFolder variable on the first line of the script points to the folder where the unpack task extracted the files.

$solutionFolder = "$(Build.SourcesDirectory)\$(SolutionName)"

$files = Get-ChildItem -Path $solutionFolder -Filter *.json -Recurse

foreach ($file in $files) {
    # Read the whole file as a single string so the newline check works
    $json = Get-Content $file.FullName -Raw

    # Only prettify files that are minified (i.e. contain no line breaks)
    if ($json -notmatch '\r?\n') {
        $json = ($json | ConvertFrom-Json) | ConvertTo-Json -Depth 100
        Set-Content -Path $file.FullName -Value $json
    }
}
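As a sketch of how this slots into the pipeline, the script can run as an inline PowerShell task between the unpack and commit steps. The displayName below is my own choice, and the folder path assumes the unpack target from the earlier YAML:

```yaml
- task: PowerShell@2
  displayName: 'Prettify JSON Files'
  inputs:
    targetType: 'inline'
    script: |
      $solutionFolder = "$(Build.SourcesDirectory)\$(SolutionName)"
      $files = Get-ChildItem -Path $solutionFolder -Filter *.json -Recurse
      foreach ($file in $files) {
          $json = Get-Content $file.FullName -Raw
          if ($json -notmatch '\r?\n') {
              $json = ($json | ConvertFrom-Json) | ConvertTo-Json -Depth 100
              Set-Content -Path $file.FullName -Value $json
          }
      }
```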

The pipeline will look like this

By prettifying the JSON files, the comparison function works much better and you get a similar experience to the XML files. This comes in handy during code reviews and the pull request process.

If you want to minify these JSON files again in other pipelines, where you consume and package the files from source control, the following PowerShell script will help.

$solutionFolder = "C:\temp"

$files = Get-ChildItem -Path $solutionFolder -Filter *.json -Recurse

foreach ($file in $files) {
    # Read the whole file as a single string
    $json = Get-Content $file.FullName -Raw

    # Rewrite the file as compressed (single-line) JSON
    $minJson = ($json | ConvertFrom-Json) | ConvertTo-Json -Depth 100 -Compress
    Set-Content -Path $file.FullName -Value $minJson
}
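In a packaging pipeline, this could sit as an inline PowerShell task just before the Build Tools pack step. Treat the following as a sketch: the displayName values and the folder/output paths are my own assumptions, mirroring the export pipeline shown earlier:

```yaml
- task: PowerShell@2
  displayName: 'Minify JSON Files'
  inputs:
    targetType: 'inline'
    script: |
      $solutionFolder = "$(Build.SourcesDirectory)\$(SolutionName)"
      $files = Get-ChildItem -Path $solutionFolder -Filter *.json -Recurse
      foreach ($file in $files) {
          $json = Get-Content $file.FullName -Raw
          $minJson = ($json | ConvertFrom-Json) | ConvertTo-Json -Depth 100 -Compress
          Set-Content -Path $file.FullName -Value $minJson
      }

- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.pack-solution.PowerPlatformPackSolution@0
  displayName: 'Power Platform Pack Solution'
  inputs:
    SolutionSourceFolder: '$(Build.SourcesDirectory)\$(SolutionName)'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
```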
