Azure DevOps: Update Repo files via Release Pipeline

Before we get into the meat of the article, I have to give some big thanks to Toon Vanhoutte and his blog entry on authenticating to DevOps using the REST API. Toon truly earned the Real MVP award for his work.

In previous entries, we discussed using the OAuth token to modify variables, but had not discussed (or even found) the source account, nor the additional permissions needed to update files within the repository. His work researching and inquiring with the product team to determine the user account and permission level carried this process past the final hurdle.

Permission Prerequisites

As mentioned, there are some prerequisites that must be in place for our solution to work. Thankfully, they are not too difficult to manage. Navigate to Project Settings and locate the Repositories tab. After the tab opens, select the individual repository you wish to work in. You will see security and membership info on the right-hand side. Once there, select the Add option and add the user account Project Collection Build Service (*your project*).

Save your changes, and after a short time, you will be greeted with the same screen as before, but with a new addition: the Project Collection Build Service account showing in the window. We now need to set appropriate permissions. Select the account, and we will Allow the Contribute permission, in addition to any inherited permissions. All other permissions can be set to Deny.

Uploading Files via Pipeline

To re-upload our files, we will rely on something we have of course used before: API calls via Invoke-RestMethod. Thankfully, the DevOps API is well documented, and we can translate it to our needs.

Again, similar constraints apply as in everything I have covered so far: we want to keep the solution within PowerShell and will complete the entire process with it. For this instance, it helps to know our end state, so we will work backward to arrive at our solution.

Creating a Push Request

To get the new file data into the Git Repository, we need to generate a push request via the DevOps API. Sounds simple enough. Looking over the documentation, we can see the format and information we will need.

We will post the data to our Project and Organization at dev.azure.com. We will use a base64 auth header containing our system token to authorize the request. Finally, we will format our data as the following for upload.

{
  "refUpdates":  [
    {
      "name":  "refs/heads/master",
      "oldObjectId":  "0987654321234567890asdf"
    }
  ],
  "commits":  [
    {
      "changes":  [
        {
          "newContent":  {
            "content": "ContentData",
            "contentType":  "rawtext"
          },
          "changeType":  "edit",
          "item":  {
            "path":  "/path/to/data.txt"
          }
        }
      ],
      "comment":  "Added rule asdf to set testNSG.csv"
    }
  ]
}

This may look familiar, as I used this piece before in our other DevOps entry to discuss how JSON conversion looks within PowerShell.

PowerShell JSON Recap

When PowerShell runs the ConvertTo-Json cmdlet, hashtables are converted to braces ({ }) and arrays are converted to brackets ([ ]). Therefore, we can expect the structure below to be converted to the JSON body above.

@{
  refUpdates = @(
    @{
      "name" = "$refName"
      "oldObjectId" = "$objectId"
    }
  )
  commits = @(
    @{
      "comment" = "$comment"
      "changes" = @(
        @{
          "changeType" = "edit"
          "item" = @{
            "path" = "$filePath"
          }
          "newContent" = @{
            "content" = "$nsgConcat"
            "contentType" = "rawtext"
          }
        }
      )
    }
  )
}
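One caveat worth flagging here, as general ConvertTo-Json behavior rather than anything specific to this API: the cmdlet only descends two levels deep by default, which will silently flatten a nested body like ours into placeholder strings. A quick sketch of the difference:

```powershell
# Minimal nested structure mirroring the shape of our request body
$body = @{ commits = @( @{ changes = @( @{ item = @{ path = '/data.txt' } } ) } ) }

# Default depth (2) truncates the inner hashtables to type-name strings
$body | ConvertTo-Json

# -Depth 100 preserves the full nested structure
$body | ConvertTo-Json -Depth 100
```

This is why the full script later pipes the request through ConvertTo-Json -Depth 100.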

Looking at the requested values, and through the documentation, it can be easy to get tripped up in grabbing the correct information. To clear the confusion, let’s run through how to populate our JSON body.

Creating the JSON Request Body

You will note that the request body is divided into two initial parts: the commits and the ‘refUpdates’ sections. The name field specifies which branch you want to target for the push. In our case of pushing to the master branch, we would use refs/heads/master.

refUpdates = @(
  @{
    "name" = "refs/heads/master"
  }
)

Next, we have the oldObjectId. In short, this is the latest commit ID on the branch we want to update. We will take a quick detour on how to get that information.

Retrieving the OldObjectId

Getting the oldObjectId manually is thankfully not too difficult. Navigate to your repository, go to the root, and select History. Once there, locate the most recent commit, click the three dots next to the commit value, and select Copy Full SHA.

Again, simple enough, but how do we get that without needing to get hands-on with the portal? We are once more able to leverage the DevOps API to get this information.

Using the reference documentation, we are able to craft our request. Sending a request to our Project and Organization in dev.azure.com, we now also specify that we want to locate the information from the master branch.

Running an Invoke-WebRequest we get the full commit information in JSON. Converting out of JSON, we can find the data we need in the first value as the Commit ID. With the Commit ID now received, we can add it to our ‘refUpdates’ section

#Generate Get Commit URL (note: this assumes the repository shares its name with the project, hence {1} twice)
$commitUri = ('https://dev.azure.com/{0}/{1}/_apis/git/repositories/{1}/commits?searchCriteria.itemVersion.version={2}&api-version=5.0' -f $organization, $project, $branch)

#Retrieve Commit Data
$commit = Invoke-WebRequest -Uri $commitUri -Headers $header

#Retrieve Commit ID from Request
$commitId = ($commit.Content | ConvertFrom-Json).value[0].commitId

refUpdates = @(
  @{
    "name" = "refs/heads/master"
    "oldObjectId" = "$commitId"
  }
)

Creating the Commits Section

With our ‘refUpdates’ out of the way, we can now craft our commits section. Starting with the easiest option, the comment field is fairly straightforward: this is the commit message that accompanies your push. It should briefly explain the changes that were made.

Within the changes section, we first see the ‘changeType’ option. There are several options available: Add, Edit, Rename, and Delete, among others. The requirements between these four are fairly similar, with only minor differences. The parameters needed for each can be found below.

| Parameter        | Add | Edit | Rename | Delete |
|------------------|-----|------|--------|--------|
| changeType       | X   | X    | X      | X      |
| sourceServerItem |     |      | X      |        |
| item             | X   | X    | X      | X      |
| newContent       | X   | X    |        |        |
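As an aside, a hypothetical delete change would, per the table above, need only the ‘changeType’ and item values, with no ‘newContent’ block. A sketch of what that entry would look like:

```powershell
# Hypothetical delete change entry - /path/to/oldItem.ext is a placeholder path
@{
  "changeType" = "delete"
  "item" = @{
    "path" = "/path/to/oldItem.ext"
  }
}
```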

Covering edits today, we know we will need the ‘changeType’, item, and ‘newContent’ values. For now, let’s use ‘changeType’ edit, and get the item information next. Item is the path to your file within the repository. If all your items are in the root, the path will be /myItem.ext; otherwise, it will include any subfolders, akin to /path/to/myItem.ext.

If you open the portal, you can get this path by navigating to your file and copying the path that appears after your project name.

Lastly, we will cover ‘newContent’. Another field that is exactly what it says on the tin: this is the new data that gets uploaded. It is specified in two parts, content and contentType. ContentType can be either ‘rawtext’ or ‘base64encoded’. That setting dictates how the content is read. If you select ‘rawtext’, the input is taken exactly as is. If you select ‘base64encoded’, you will need to supply the data as a base64-encoded value, and it will be decoded accordingly.
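If you did want to go the ‘base64encoded’ route, the conversion in PowerShell is short. A sketch, assuming plain UTF-8 text content:

```powershell
# Encode the content to base64 for the 'base64encoded' contentType
$raw = 'This is new content'
$bytes = [Text.Encoding]::UTF8.GetBytes($raw)
$encoded = [Convert]::ToBase64String($bytes)

# The newContent block then carries the encoded value
$newContent = @{
  "content" = $encoded
  "contentType" = "base64encoded"
}
```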

We will use ‘rawtext’ for our example, as it gives us more control and visibility. With that decided, we can craft our commits section. After this, we can combine it with the ‘refUpdates’ to get our request body.

commits = @(
  @{
    "comment" = "$comment"
    "changes" = @(
      @{
        "changeType" = "edit"
        "item" = @{
          "path" = "$filePath"
        }
        "newContent" = @{
          "content" = "$csvUpload"
          "contentType" = "rawtext"
        }
      }
    )
  }
)

Posting the Data

With all of that information in place, we can upload our file using Invoke-RestMethod. Let’s walk through a practical example below.

#Get Environment Variables
$person = $env:RELEASE_DEPLOYMENT_REQUESTEDFOR
$organization = ($env:SYSTEM_COLLECTIONURI.Replace('https://vsrm.dev.azure.com/', '')).replace('/','')
$project = $env:SYSTEM_TEAMPROJECT
$systemToken = $env:SYSTEM_ACCESSTOKEN

#Create Base 64 String
$auth = (':{0}' -f $systemToken)
$auth = [Text.Encoding]::UTF8.GetBytes($auth)
$auth = [Convert]::ToBase64String($auth)
$header = @{Authorization=('Basic {0}' -f $auth)}

#Set Additional Variables
$instance = 'dev.azure.com'
$branch = 'master'
$version = '5.0'
$filePath = '/path/to/myItem.ext'
$content = 'This is new content'
$comment = "Adding new content to $filePath - Requested by $person"
$refName = ('refs/heads/{0}' -f $branch)

#Retrieve Commit ID from Request
$commitUri = ('https://{0}/{1}/{2}/_apis/git/repositories/{3}/commits?searchCriteria.itemVersion.version={4}&api-version={5}' -f $instance, $organization, $project, $project, $branch, $version)
$commit = Invoke-WebRequest -Uri $commitUri -Headers $header
$commitId = ($commit.Content | ConvertFrom-Json).value[0].commitId

#Create JSON Post Structure
$updateRequest = @{
  refUpdates = @(
    @{
      "name" = "$refName"
      "oldObjectId" = "$commitId"
    }
  )
  commits = @(
    @{
      "comment" = "$comment"
      "changes" = @(
        @{
          "changeType" = "edit"
          "item" = @{
            "path" = "$filePath"
          }
          "newContent" = @{
            "content" = "$content"
            "contentType" = "rawtext"
          }
        }
      )
    }
  )
}

$updateRequestJson = $updateRequest | ConvertTo-Json -Depth 100

#Post Data back to repo
$updateUri = ('https://{0}/{1}/{2}/_apis/git/repositories/{3}/pushes?api-version={4}' -f $instance, $organization, $project, $project, $version)
Invoke-RestMethod -Method Post -Uri $updateUri -Headers $header -Body $updateRequestJson -ContentType 'application/json'

And with that run, we will now be able to see our file updated with the new information, with our commit comment attached, and the commit performed by the Project Collection Build Service account.

If you additionally wish to record who triggered the release, you can embed the environment variable RELEASE_DEPLOYMENT_REQUESTEDFOR into the comment line for tracking, as seen above.

I hope this was helpful, and I look forward to bringing you more content in the future.

