If you don't already know, I'm a pretty heavy user of Chocolatey, especially when coupled with Boxstarter to spin up new machines and get all my software on them without being chained to the desk through the whole process.
If you're a *nix user and haven't met Chocolatey, I highly suggest you get to know it; you'll fall (back) in love with Windows 😉
That said, it's the community of package authors that really makes Chocolatey worthwhile: developers who include Chocolatey packages (built on the NuGet spec) in their release processes, and passionate users like me who build Chocolatey packages for software they know and/or love.
As of this writing, I manage 29 packages on Chocolatey.org. One thing it's imperative package authors do, though, is keep their packages up to date as new versions of the software they're wrapping get released. Without this, packages simply atrophy and the value of Chocolatey goes down. That said, managing nearly 30 packages would be an all-consuming task if I were forced to monitor all the various distribution channels and hand-roll updates whenever new versions were released. So how do I do it? Let me show you.
Step 1: Figure out the trigger
There are a handful of ways the software behind the packages I manage announces new releases in a way I can monitor:
- GitHub Releases: If the software releases via GitHub, there's an API for that. However, it's not an API you can trigger from, so you basically have to poll it; for me, that ends up being roughly every 6 hours (see the sketch after this list).
- RSS XML: If the software releases or posts to a destination that has an RSS feed, you're in really good luck here, because you can trigger off that RSS feed and kick off your process – very handy.
- Custom endpoint: If the app has its own "Check for Updates" feature built in, you can inspect it to see how it's doing the work, then replicate it as part of your package updating.
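To make the polling approach concrete, here's a minimal PowerShell sketch that checks GitHub's Releases API for the newest release of a repo (`someowner/somerepo` is a placeholder for whatever project you package):

```powershell
# Poll the GitHub Releases API for the latest (non-draft, non-prerelease) release.
# GitHub requires a User-Agent header on API calls.
$release = Invoke-RestMethod `
    -Uri 'https://api.github.com/repos/someowner/somerepo/releases/latest' `
    -Headers @{ 'User-Agent' = 'choco-package-updater' }

# The tag name usually carries the version, e.g. 'v2.3.1'
$latestVersion = $release.tag_name.TrimStart('v')
Write-Output "Latest upstream version: $latestVersion"
```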
Step 2: Implement the check
Use Microsoft Flow
Why Flow? There are a few reasons:
- It’s free
- It only requires a Microsoft Account (literally nothing else)
- It’s free
Once you’ve figured out how the app either posts updates or checks for them, it’s time to write a Flow that does this check.
- For GitHub Releases or custom endpoints, you'll use the `Recurrence` trigger in Flow and set it to an interval you don't think will get you throttled or banned by the endpoints 😉
- For RSS XML, you can use the "On new Feed Item" trigger.
Because it’s more complicated (and nowadays more common) we’ll walk through the Custom Endpoint example here.
Set up the trigger
- Click `My Flows` in the left-hand side
- Choose `New | Create from blank`
- Click `Search hundreds of connectors and triggers`
- In the search box, type `recurrence` and choose the `Schedule` option that shows up
- Define your interval. I recommend 6 hours: very few choco users expect instant updates, and this will check for/ship updates at least a few times a day, which tends to be within what the community expects.

Note: This timer starts when you click `Save` unless you expand `Advanced Options` and choose a Time Zone and define a start time for it to begin kicking off. So if you want it to run at 6a, 12p, 6p, and 12a, define a midnight-tomorrow start time now.
- Add the next step of the flow.
When this kicks off, what should it do? The algorithm looks like this:
- Check the app’s version endpoint & extract latest available version data
- Download the latest version binary & compute SHA256 hash on it (used in Chocolatey pkg)
- Query Chocolatey’s API and extract latest published version data
- Compare the two version numbers
- Do work in case of discrepancy (we'll go into more detail on this later)
Let's dive deeper into each of these steps.
Query the app’s version endpoint & extract latest version data + Compute SHA of binary
These two steps are best done by an Azure Function, mostly because, well, Flow can't compute SHA hashes.
To implement this in a (C#) Azure Function that will be called by Flow, first create an HTTP-triggered Azure Function (good news! You can use Azure Functions in a free Azure subscription, and chances are very good you'll never end up being billed). In this Function, hit the HTTP endpoint for your software's update check. Next, stream the binary through .NET's SHA library to compute the hash. When done, send the latest version number plus the SHA for it back to the caller. For my most popular package (Vivaldi) this looks like this:
```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Security.Cryptography;
using System.Xml.Linq;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

public static class GetLatestVersionFunction
{
    // Shared HttpClient so repeated invocations don't exhaust sockets
    private static readonly HttpClient _client = new HttpClient();

    [FunctionName("GetLatestVersion")]
    public static async System.Threading.Tasks.Task<IActionResult> RunAsync(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,
        TraceWriter log)
    {
        // The update-check URL lives in app settings so it can change without a redeploy
        string targetUrl = Environment.GetEnvironmentVariable(@"ReleaseVersionCheckUrl");

        // Vivaldi's update check returns an RSS document; dig out the <enclosure> of the first <item>
        var xdocResult = XDocument.Load(await _client.GetStreamAsync(targetUrl));
        var enclosureElement = xdocResult
            .Element(@"rss")
            .Element(@"channel")
            .Element(@"item")
            .Element(@"enclosure");
        var version = enclosureElement.Attributes()
            .Single(a => a.Name.LocalName.Equals(@"version", StringComparison.OrdinalIgnoreCase));
        var x64url = enclosureElement.Attribute(@"url").Value;
        var x86url = x64url.Replace(@".x64", string.Empty).Replace(@"X64", string.Empty);

        // We make use of using(), streams, and GC.Collect() here to make sure our Function keeps
        // its memory footprint low so as not to incur unnecessary usage charges on the consumption plan
        string x64hashString, x86hashString;
        using (var sha = SHA256.Create())
        {
            using (var versionByteStream = await _client.GetStreamAsync(x64url))
            {
                x64hashString = string.Join(string.Empty, sha.ComputeHash(versionByteStream).Select(b => b.ToString("X2")));
                log.Info($@"64-bit hash: {x64hashString}");
            }
            GC.Collect();

            using (var versionByteStream = await _client.GetStreamAsync(x86url))
            {
                x86hashString = string.Join(string.Empty, sha.ComputeHash(versionByteStream).Select(b => b.ToString("X2")));
                log.Info($@"32-bit hash: {x86hashString}");
            }
            GC.Collect();
        }
        GC.Collect();

        return new OkObjectResult(new { version = version.Value, x86 = new { download = x86url, hash = x86hashString }, x64 = new { download = x64url, hash = x64hashString } });
    }
}
```
You tie this Function into your Flow with the `HTTP` action via a GET:

- Type `HTTP` and choose the `HTTP` action
- Add the URL to your Azure Function
- Once you've got this response, it's time to parse it into something you can use later on in your Flow. To do this, we use the `Parse JSON` action with this schema:
```json
{
"type": "object",
"properties": {
"version": {
"type": "string"
},
"x86": {
"type": "object",
"properties": {
"download": {
"type": "string"
},
"hash": {
"type": "string"
}
}
},
"x64": {
"type": "object",
"properties": {
"download": {
"type": "string"
},
"hash": {
"type": "string"
}
}
}
}
}
```
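For reference, the response body this schema describes looks something like the following (URLs and hashes here are purely illustrative):

```json
{
  "version": "2.3.1440.41",
  "x86": { "download": "https://example.com/app.exe", "hash": "ABC123..." },
  "x64": { "download": "https://example.com/app.x64.exe", "hash": "DEF456..." }
}
```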
Query Chocolatey’s API to get the latest published version of the target package
Similar to calling the Azure Function for the latest available version of a package's target software, querying the Chocolatey API will also use the `HTTP` action with the HTTP endpoint for it to hit. A few things to keep in mind:

- You'll want to have tried this in a separate tool first so you can see the response that comes back
- If at all possible, get the response in JSON (e.g. set the `Accept` header to `application/json` to let the target endpoint know you want the data back in JSON) because Flow does a lot better with this than XML (on the Free tier, anyway)
Start by, again, adding the `HTTP` action to your Flow.
Chocolatey uses the NuGet API under the covers, which in turn utilizes the OData spec to perform filters, searches, etc. at the HTTP query level. For example, in my Vivaldi implementation the target URL is `http://chocolatey.org/api/v2/Packages()?$filter=Id%20eq%20'vivaldi'%20and%20not%20IsPrerelease&$orderby=Published%20desc&$top=1`. This returns all packages with:

- `Id == vivaldi`
- `!IsPrerelease`
- `ORDER BY Published (date) desc`
- `TOP 1`
This gets me back the last published version of Vivaldi in the Chocolatey community repository. In addition, on this HTTP action I set the `Accept` header to `application/json` so choco gives me back the result as JSON.
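If you want to sanity-check the query outside Flow first, a quick PowerShell sketch of the same request looks like this (swap in your own package id):

```powershell
# Ask the Chocolatey OData feed for the newest non-prerelease 'vivaldi' package, as JSON
$url = "https://chocolatey.org/api/v2/Packages()?" +
       "`$filter=Id%20eq%20'vivaldi'%20and%20not%20IsPrerelease&`$orderby=Published%20desc&`$top=1"
$result = Invoke-RestMethod -Uri $url -Headers @{ Accept = 'application/json' }

# The OData v2 JSON shape wraps results in a 'd' array; TOP 1 means we take the first item
$result.d[0].Version
```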
Once you've got this response, it's time to parse it into something you can use later on in your Flow. To do this, we again use the `Parse JSON` action, this time with this schema:
```json
{
"type": "object",
"properties": {
"d": {
"type": "array",
"items": {
"type": "object",
"properties": {
"__metadata": {
"type": "object",
"properties": {
"uri": {
"type": "string"
},
"type": {
"type": "string"
},
"edit_media": {
"type": "string"
},
"media_src": {
"type": "string"
},
"content_type": {
"type": "string"
}
}
},
"Id": {
"type": "string"
},
"Version": {
"type": "string"
},
"Title": {
"type": "string"
},
"Summary": {
"type": "string"
},
"Description": {
"type": "string"
},
"Tags": {
"type": "string"
},
"Authors": {
"type": "string"
},
"Copyright": {},
"Created": {
"type": "string"
},
"Dependencies": {
"type": "string"
},
"DownloadCount": {
"type": "integer"
},
"VersionDownloadCount": {
"type": "integer"
},
"GalleryDetailsUrl": {
"type": "string"
},
"ReportAbuseUrl": {
"type": "string"
},
"IconUrl": {
"type": "string"
},
"IsLatestVersion": {
"type": "boolean"
},
"IsAbsoluteLatestVersion": {
"type": "boolean"
},
"IsPrerelease": {
"type": "boolean"
},
"Language": {},
"LastUpdated": {
"type": "string"
},
"Published": {
"type": "string"
},
"LicenseUrl": {
"type": "string"
},
"RequireLicenseAcceptance": {
"type": "boolean"
},
"PackageHash": {
"type": "string"
},
"PackageHashAlgorithm": {
"type": "string"
},
"PackageSize": {
"type": "string"
},
"ProjectUrl": {
"type": "string"
},
"ReleaseNotes": {
"type": "string"
},
"ProjectSourceUrl": {
"type": "string"
},
"PackageSourceUrl": {
"type": "string"
},
"DocsUrl": {
"type": "string"
},
"MailingListUrl": {
"type": "string"
},
"BugTrackerUrl": {
"type": "string"
},
"IsApproved": {
"type": "boolean"
},
"PackageStatus": {
"type": "string"
},
"PackageSubmittedStatus": {
"type": "string"
},
"PackageTestResultUrl": {
"type": "string"
},
"PackageTestResultStatus": {
"type": "string"
},
"PackageTestResultStatusDate": {
"type": "string"
},
"PackageValidationResultStatus": {
"type": "string"
},
"PackageValidationResultDate": {
"type": "string"
},
"PackageCleanupResultDate": {},
"PackageReviewedDate": {
"type": "string"
},
"PackageApprovedDate": {},
"PackageReviewer": {
"type": "string"
},
"IsDownloadCacheAvailable": {
"type": "boolean"
},
"DownloadCacheStatus": {
"type": "string"
},
"DownloadCacheDate": {},
"DownloadCache": {},
"PackageScanStatus": {
"type": "string"
},
"PackageScanResultDate": {
"type": "string"
}
},
"required": [
"__metadata",
"Id",
"Version",
"Title",
"Summary",
"Description",
"Tags",
"Authors",
"Copyright",
"Created",
"Dependencies",
"DownloadCount",
"VersionDownloadCount",
"GalleryDetailsUrl",
"ReportAbuseUrl",
"IconUrl",
"IsLatestVersion",
"IsAbsoluteLatestVersion",
"IsPrerelease",
"Language",
"LastUpdated",
"Published",
"LicenseUrl",
"RequireLicenseAcceptance",
"PackageHash",
"PackageHashAlgorithm",
"PackageSize",
"ProjectUrl",
"ReleaseNotes",
"ProjectSourceUrl",
"PackageSourceUrl",
"DocsUrl",
"MailingListUrl",
"BugTrackerUrl",
"IsApproved",
"PackageStatus",
"PackageSubmittedStatus",
"PackageTestResultUrl",
"PackageTestResultStatus",
"PackageTestResultStatusDate",
"PackageValidationResultStatus",
"PackageValidationResultDate",
"PackageCleanupResultDate",
"PackageReviewedDate",
"PackageApprovedDate",
"PackageReviewer",
"IsDownloadCacheAvailable",
"DownloadCacheStatus",
"DownloadCacheDate",
"DownloadCache",
"PackageScanStatus",
"PackageScanResultDate"
]
}
}
}
}
```
As with your custom Azure Function, Flow will use this schema to give you IntelliSense when you want to use the properties of the response from Chocolatey (the version, in our case) in subsequent actions of the Flow.
The thing to notice here, though, is that – per the schema – the result is always an array. But we've only requested `TOP 1`, so we just need the first item in this array. Not to worry: Flow provides a robust set of "expressions" to do exactly this kind of thing.
Add another step to your flow, this time searching for `compose` and choosing the `Data Operations` action:
Once added, click in the `Inputs` text box and notice the pop-out. Choose the `Expression` tab, type `first`, and pick it.
While inside the `first()` function, switch back to `Dynamic Content` and choose the `Parse JSON` step you added prior to this.
This adds Flow's reference to the output of that step as a parameter to the `first()` function, thereby choosing the first item in the JSON array exactly as we want. But we also want the `.Version` property off of that. No matter, just tack it on the end like so, and click OK.
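The finished expression ends up looking something like this (assuming your parse step kept the default `Parse_JSON` name; `d` is the array property from the schema above):

```
first(body('Parse_JSON')?['d'])?['Version']
```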
Compare the two results
Now we've got the latest available version of the software (the Azure Function's `version`) as well as the latest published version of the software (the Chocolatey query's `Version`), so we need to compare them. What would a workflow orchestrator be if it didn't have control blocks?
This is as simple as adding an `if` block to your flow:

- Click `+ New step` and search for `condition`. Choose the `Condition` Control action.
- The left side should be the Chocolatey version, the right side the available version, and the operator `does not start with`:
This is because the Chocolatey version, due to the semver nature of NuGet package versioning, will always start with the same value as a matching available version, but may not always equal it exactly; for example, a package-only fix is conventionally published with an extra date segment (software version 1.2.3 becomes package version 1.2.3.20190115).
Once you’ve defined this, we can add more logic to the ‘If Yes’ side of the conditional’s canvas:
Step 3: Publish a new version
Here’s where things got pretty inventive, if I do say so myself. Until recently, the ‘If Yes’ portion of this conditional just sent me an e-mail letting me know a new version was out and included the version number and the relevant SHA values. I would then take these, manually edit the Chocolatey package for the software, and push it up.
As you can imagine, this got quite arduous, especially with a package like Vivaldi, which regularly ships either release or snapshot builds about once a week.
So I got to thinking: what exactly needs to happen at this point? Well, my local workflow was:

- Clone/pull my GitHub repo containing the pkg
- Replace the version of the software in the nuspec and the links within the install.ps1 file
- Replace the SHA values in the install.ps1 file
- `choco pack`
- `choco push`
Then it dawned on me: Azure DevOps now has free build/release pipelines for open-source projects, and this was one such project! Azure DevOps builds on actual machines. Machines which can run PowerShell scripts. And – as I found out – machines which already have Chocolatey installed. So full automation should be entirely possible (spoiler alert: it was). Here's how I did it.
Step 3.1: Get the repo ready
Rather than checking in a nuspec file targeting the actual version, and a chocolateyInstall.ps1 file with the real links to binaries and SHAs, I needed to instead have tokenized versions of these files. So my nuspec changed to have this in it:
```xml
<package xmlns="http://schemas.microsoft.com/packaging/2015/06/nuspec.xsd">
  <metadata>
    <id>vivaldi</id>
    <version>$version$</version>
```
and my chocolateyInstall.ps1 changed to this:
```powershell
$packageName = 'vivaldi'

$packageArgs = @{
  packageName    = $packageName
  fileType       = 'exe'
  url            = '$32url$' # token to be replaced
  silentArgs     = '--vivaldi-silent --do-not-launch-chrome --vivaldi-update'
  checksum       = '$32sha$' # token to be replaced
  checksumType   = 'sha256'
  url64bit       = '$64url$' # token to be replaced
  checksum64     = '$64sha$' # token to be replaced
  checksumType64 = 'sha256'
}
```
Because Chocolatey's command line utilizes NuGet to do the actual packing, I can pass `--version <version>` to `choco pack` and it'll take care of the `<version>` line in the nuspec for me. The others, however, I needed to handle some other way. Of course, PowerShell is very, well, powerful, and has this ability.
I created a new file, `pack.ps1`, which takes care of the find/replace as well as the `choco pack` command. I parameterized this file so it could be easily called from outside and do all the right work. The outcome looked like this:
```powershell
Param(
    # Version # of the build
    [Parameter(Mandatory = $true)]
    [string]
    $version,
    # 32-bit download URL
    [Parameter(Mandatory = $true)]
    [string]
    $url32,
    # 32-bit SHA256 value
    [Parameter(Mandatory = $true)]
    [string]
    $sha32,
    # 64-bit download URL
    [Parameter(Mandatory = $true)]
    [string]
    $url64,
    # 64-bit SHA256 value
    [Parameter(Mandatory = $true)]
    [string]
    $sha64,
    # Output directory
    [Parameter()]
    [string]
    $outputdirectory = '.\')

# Swap the URL & SHA tokens in chocolateyinstall.ps1 for the real values
$installFile = Get-ChildItem .\tools\chocolateyinstall.ps1
$content = Get-Content $installFile
$content = $content -replace [regex]::Escape('$32url$'), $url32
$content = $content -replace [regex]::Escape('$64url$'), $url64
$content = $content -replace [regex]::Escape('$32sha$'), $sha32
$content = $content -replace [regex]::Escape('$64sha$'), $sha64
Set-Content $installFile.PSPath -Value $content

# NuGet (via choco) handles the $version$ token in the nuspec
choco pack --version $version --out $outputdirectory
```
Now I could call this script from anywhere, giving it the new version number for Vivaldi along with the new SHA values and it would create a new Chocolatey package ready for pushing up to the community repo. Let’s see how to do that next.
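A local invocation looks something like this (the version, URLs, and SHA values here are purely illustrative):

```powershell
.\pack.ps1 -version '2.3.1440.41' `
    -url32 'https://example.com/Vivaldi.2.3.1440.41.exe' `
    -sha32 '<sha256 of the 32-bit installer>' `
    -url64 'https://example.com/Vivaldi.2.3.1440.41.x64.exe' `
    -sha64 '<sha256 of the 64-bit installer>' `
    -outputdirectory '.\out'
```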
Step 3.2: Creating the Azure DevOps pipeline
As we've already pointed out, Azure DevOps is free for open-source projects. Host your Chocolatey package's source on GitHub (recommended anyway) and you're all set.
Build Pipeline
In Azure DevOps, create a new Build Pipeline by going to `Pipelines | Builds | + New | New build pipeline`
Configure your new Build Pipeline to use your GitHub repo as its source, picking the right branch, and click `Continue`
Since our flow is pretty custom, choose `Empty job` at the top
The agent pool should be the Hosted 2017 pool, as these machines have everything we need (Chocolatey) pre-installed and ready to go. So just click `+` on the `Agent job 1` task list now
There are only two things we need to do: run the `pack.ps1` script, and drop the resulting `.nupkg` file into a location from which it can be pushed (by a Release process) to the community feed.
Search for `powershell` in the task list, choose `PowerShell`, and click `Add`
For this task, choose `Inline` as the script type, as it's a pretty straightforward command we've got to run. It should look like this:

```powershell
.\pack.ps1 -version $(pkgver) -url32 $(32url) -sha32 $(32sha) -url64 $(64url) -sha64 $(64sha) -outputdirectory "$(Build.ArtifactStagingDirectory)"
```
You might be wondering, "Where are the SHA & URL variables coming from? How will they be populated?" 😃 We'll get to that.
Add another task to this pipeline: the Publish Artifact task (search `artifact`):
Because we set the `outputdirectory` of the pack script to the `ArtifactStagingDirectory`, the task's default configuration is all we need:
This will take the output of the pack script (the new `nupkg` file) and put it in a drop folder that can be picked up by other pipelines – most notably a Release pipeline, which will push the file to the Chocolatey community repository.
Configure the input variables
Let's back up now and look at where we define those input variables to the build process. As you might've guessed, head to the `Variables` tab up top:
In the `Pipeline variables` section, add the following:

- `32sha`
- `32url`
- `64sha`
- `64url`
- `pkgver`

and make them all settable at queue time. The end result should look like this:
Now these can be 1) set at the time we queue a build (we'll see this in a minute) and 2) used by tasks in our Build pipeline, like our `pack.ps1` PowerShell task.
Under the `Save & queue` button, you can click `Save` now, and we're done.
Release pipeline
Now that we've built the `nupkg` for our new release, we need to push it out to the community repository. While we could have done this with that same inline PowerShell script, it's more correct to do it in a separate pipeline – and doing so has its advantages.
Firstly, it's a separation of concerns. If anything ever changes with how/where Chocolatey publishes things, we don't have to risk mucking up how the package builds in order to adapt; we just change our Release pipeline.
Secondly, Azure DevOps Release Pipelines allow for some nice things. Not the least of which are gated releases and release approvers. This means that before a Release happens, you can send an e-mail to one or more people requiring one or more of them to Approve the Release before it actually takes place. I recommend doing this for the first few executions of your new pipeline so you can sanity check things before they go out the door. It gives you a sense of confidence before you completely take the training wheels off. 😃
To create a Release pipeline triggered from our Build pipeline, go to `Releases | + New | Release pipeline`
Again, due to the custom nature of what we're doing, start with an `Empty job`
In the workflow UI, click the `Add an artifact` box in the Artifacts area
Here simply choose the Build Pipeline you created earlier, and all is automatically wired up for you:
Next, configure the Release for Continuous Deployment by clicking the lightning-bolt icon over the artifact box and simply enabling it:
With this done, it's time to move on to the tasks for the release. Click the "1 job, 0 task" link in the Stage 1 area:
This should look familiar, as it's a lot like the Build Pipeline tasks area, so go ahead and search for `powershell` again and choose the PowerShell task:
The PowerShell to run, again, is pretty simple so just define it inline:
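The script itself isn't reproduced here, but it amounts to something like this sketch (the artifact path depends on your pipeline and artifact names, and `ChocoApiKey` is a hypothetical variable name):

```powershell
# Find the nupkg dropped by the Build pipeline and push it to the community feed.
# The folder layout under the working directory depends on your artifact's name.
$nupkg = Get-ChildItem "$(System.DefaultWorkingDirectory)" -Recurse -Filter '*.nupkg' |
    Select-Object -First 1
choco push $nupkg.FullName --source 'https://push.chocolatey.org/' --api-key $(ChocoApiKey)
```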
Note: You can get the Chocolatey API key to use in this call from your Chocolatey account page. You may want to store it as a secret variable in your Release pipeline if you'd rather not have it shown in plain text on the screen at any time. You can do this like so:
If you want to turn on approvals, go back to the `Pre-deployment conditions` area in the pipeline:
Enable it, and decide who – in your DevOps team/org – you want to approve releases:
Step 4: Connect the dots
So far we’ve:
- Created an Azure Function to look for new versions of our software and compute the SHA
- Created a Microsoft Flow that checks the latest version published on Chocolatey and compares that to the version returned from our Azure Function
- Configured our package source to accept parameterized build commands
- Configured an Azure DevOps Build & Release pipeline to build a `nupkg` for a new version of our software and push it to the Chocolatey community repository
But wait… how do we get the Flow to kick off the Build? We aren’t checking in any new code, so it can’t be a Continuous Integration trigger…
… everything in Azure has an API, my friend!
Queue a build from Microsoft Flow
A simple HTTP POST to the right API and we're golden. You can find the details in the Azure DevOps REST API documentation, or just follow along (hey, you've made it this far…).
Step 4.1.1: Get a PAT for your Azure DevOps instance
Back in Azure DevOps, hover over your avatar in the upper right (or the generic one if you haven’t given yourself a picture yet) and choose Security
You'll be dropped into the Personal Access Tokens (PAT) area of your account; click `Add`
To be most secure, choose only the Organization in which you created the Build pipeline, and grant only Build read and Build read & execute permissions:
Copy the resulting PAT and be ready to use it in a minute.
Step 4.1.2: Queue a build with Microsoft Flow via the Azure DevOps API
As we saw earlier, we can make simple HTTP requests with the `HTTP` action in Flow. Spin another one up now to queue a Build on Azure DevOps. It should look like so:
You can deduce the URL to use by simply copying the URL from your browser when you're viewing your Build Pipelines.
You'll also need to pass your PAT in an `Authorization` header using `Basic` auth. We're also going to be posting some JSON to the endpoint, so set the `Content-Type` header as well:
The body is where you tell it which Build Definition you want to queue, and where you pass any necessary queue-time parameters (remember the variables we defined?). That'll end up looking like this:
Note how you pull out the values of the responses from previous steps and inject them into the body of the request to queue your build.
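For reference, the same call can be made outside Flow; a PowerShell sketch of the request (the organization, project, definition id, PAT, and parameter values below are all placeholders) looks roughly like this:

```powershell
# Queue a build via the Azure DevOps REST API.
$pat = '<your PAT>'
$headers = @{
    Authorization  = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
    'Content-Type' = 'application/json'
}

# 'parameters' is itself a JSON *string* carrying the queue-time variables
$body = @{
    definition = @{ id = 12 }
    parameters = (@{
        pkgver  = '2.3.1440.41'
        '32url' = 'https://example.com/app.exe'
        '32sha' = '<sha256>'
        '64url' = 'https://example.com/app.x64.exe'
        '64sha' = '<sha256>'
    } | ConvertTo-Json)
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post -Headers $headers -Body $body `
    -Uri 'https://dev.azure.com/myorg/myproject/_apis/build/builds?api-version=5.0'
```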
That’s pretty much it! At this point, your build will kick off and if you set a Release approver, you’ll get notified that the release is waiting after the Build completes.
But we can do better, right? Like getting a notification when the Build is all done?
Waiting for the build to complete then notifying
Unfortunately, the Azure DevOps API doesn't have any way to know when a Build is done other than polling for status. But if you run the above step, you'll see our HTTP POST to the Queue Build endpoint comes back with a large payload – including the id of the queued Build. We can use this information, and another piece of Control logic provided by Microsoft Flow, to query for Build status until the Build is done.
Parse the response from the Queue Build call into JSON we can use
- Add a `Parse JSON` action to our Flow
- Its input should be the output of the Queue Build HTTP call
- Its schema should be:
```json
{
"type": "object",
"properties": {
"_links": {
"type": "object",
"properties": {
"self": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
},
"web": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
},
"sourceVersionDisplayUri": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
},
"timeline": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
},
"badge": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
}
}
},
"properties": {
"type": "object",
"properties": {}
},
"tags": {
"type": "array"
},
"validationResults": {
"type": "array"
},
"plans": {
"type": "array",
"items": {
"type": "object",
"properties": {
"planId": {
"type": "string"
}
},
"required": [
"planId"
]
}
},
"triggerInfo": {
"type": "object",
"properties": {}
},
"id": {
"type": "integer"
},
"buildNumber": {
"type": "string"
},
"status": {
"type": "string"
},
"queueTime": {
"type": "string"
},
"url": {
"type": "string"
},
"definition": {
"type": "object",
"properties": {
"drafts": {
"type": "array"
},
"id": {
"type": "integer"
},
"name": {
"type": "string"
},
"url": {
"type": "string"
},
"uri": {
"type": "string"
},
"path": {
"type": "string"
},
"type": {
"type": "string"
},
"queueStatus": {
"type": "string"
},
"revision": {
"type": "integer"
},
"project": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"name": {
"type": "string"
},
"url": {
"type": "string"
},
"state": {
"type": "string"
},
"revision": {
"type": "integer"
},
"visibility": {
"type": "string"
}
}
}
}
},
"project": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"name": {
"type": "string"
},
"url": {
"type": "string"
},
"state": {
"type": "string"
},
"revision": {
"type": "integer"
},
"visibility": {
"type": "string"
}
}
},
"uri": {
"type": "string"
},
"sourceBranch": {
"type": "string"
},
"queue": {
"type": "object",
"properties": {
"id": {
"type": "integer"
},
"name": {
"type": "string"
},
"pool": {
"type": "object",
"properties": {
"id": {
"type": "integer"
},
"name": {
"type": "string"
},
"isHosted": {
"type": "boolean"
}
}
}
}
},
"priority": {
"type": "string"
},
"reason": {
"type": "string"
},
"requestedFor": {
"type": "object",
"properties": {
"displayName": {
"type": "string"
},
"url": {
"type": "string"
},
"_links": {
"type": "object",
"properties": {
"avatar": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
}
}
},
"id": {
"type": "string"
},
"uniqueName": {
"type": "string"
},
"imageUrl": {
"type": "string"
},
"descriptor": {
"type": "string"
}
}
},
"requestedBy": {
"type": "object",
"properties": {
"displayName": {
"type": "string"
},
"url": {
"type": "string"
},
"_links": {
"type": "object",
"properties": {
"avatar": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
}
}
},
"id": {
"type": "string"
},
"uniqueName": {
"type": "string"
},
"imageUrl": {
"type": "string"
},
"descriptor": {
"type": "string"
}
}
},
"lastChangedDate": {
"type": "string"
},
"lastChangedBy": {
"type": "object",
"properties": {
"displayName": {
"type": "string"
},
"url": {
"type": "string"
},
"_links": {
"type": "object",
"properties": {
"avatar": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
}
}
},
"id": {
"type": "string"
},
"uniqueName": {
"type": "string"
},
"imageUrl": {
"type": "string"
},
"descriptor": {
"type": "string"
}
}
},
"parameters": {
"type": "string"
},
"orchestrationPlan": {
"type": "object",
"properties": {
"planId": {
"type": "string"
}
}
},
"logs": {
"type": "object",
"properties": {
"id": {
"type": "integer"
},
"type": {
"type": "string"
},
"url": {
"type": "string"
}
}
},
"repository": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"type": {
"type": "string"
},
"clean": {},
"checkoutSubmodules": {
"type": "boolean"
}
}
},
"keepForever": {
"type": "boolean"
},
"retainedByRelease": {
"type": "boolean"
},
"triggeredByBuild": {}
}
}
```
Wait until Build Status comes back completed
- Add a `Do Until` block to our Flow
- Add an HTTP action to query for Build status:
Note: the `href` token in the `URI` parameter comes from the output of the Queue Build Parse JSON action you just added; choose the first `href` that shows up in the Dynamic Content area
- Parse the response from the Build Status query. Its JSON schema should be:
```json
{
"type": "object",
"properties": {
"_links": {
"type": "object",
"properties": {
"self": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
},
"web": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
},
"sourceVersionDisplayUri": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
},
"timeline": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
},
"badge": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
}
}
},
"properties": {
"type": "object",
"properties": {}
},
"tags": {
"type": "array"
},
"validationResults": {
"type": "array"
},
"plans": {
"type": "array",
"items": {
"type": "object",
"properties": {
"planId": {
"type": "string"
}
},
"required": [
"planId"
]
}
},
"triggerInfo": {
"type": "object",
"properties": {}
},
"id": {
"type": "integer"
},
"buildNumber": {
"type": "string"
},
"status": {
"type": "string"
},
"result": {
"type": "string"
},
"queueTime": {
"type": "string"
},
"startTime": {
"type": "string"
},
"finishTime": {
"type": "string"
},
"url": {
"type": "string"
},
"definition": {
"type": "object",
"properties": {
"drafts": {
"type": "array"
},
"id": {
"type": "integer"
},
"name": {
"type": "string"
},
"url": {
"type": "string"
},
"uri": {
"type": "string"
},
"path": {
"type": "string"
},
"type": {
"type": "string"
},
"queueStatus": {
"type": "string"
},
"revision": {
"type": "integer"
},
"project": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"name": {
"type": "string"
},
"url": {
"type": "string"
},
"state": {
"type": "string"
},
"revision": {
"type": "integer"
},
"visibility": {
"type": "string"
}
}
}
}
},
"project": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"name": {
"type": "string"
},
"url": {
"type": "string"
},
"state": {
"type": "string"
},
"revision": {
"type": "integer"
},
"visibility": {
"type": "string"
}
}
},
"uri": {
"type": "string"
},
"sourceBranch": {
"type": "string"
},
"sourceVersion": {
"type": "string"
},
"queue": {
"type": "object",
"properties": {
"id": {
"type": "integer"
},
"name": {
"type": "string"
},
"pool": {
"type": "object",
"properties": {
"id": {
"type": "integer"
},
"name": {
"type": "string"
},
"isHosted": {
"type": "boolean"
}
}
}
}
},
"priority": {
"type": "string"
},
"reason": {
"type": "string"
},
"requestedFor": {
"type": "object",
"properties": {
"displayName": {
"type": "string"
},
"url": {
"type": "string"
},
"_links": {
"type": "object",
"properties": {
"avatar": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
}
}
},
"id": {
"type": "string"
},
"uniqueName": {
"type": "string"
},
"imageUrl": {
"type": "string"
},
"descriptor": {
"type": "string"
}
}
},
"requestedBy": {
"type": "object",
"properties": {
"displayName": {
"type": "string"
},
"url": {
"type": "string"
},
"_links": {
"type": "object",
"properties": {
"avatar": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
}
}
},
"id": {
"type": "string"
},
"uniqueName": {
"type": "string"
},
"imageUrl": {
"type": "string"
},
"descriptor": {
"type": "string"
}
}
},
"lastChangedDate": {
"type": "string"
},
"lastChangedBy": {
"type": "object",
"properties": {
"displayName": {
"type": "string"
},
"url": {
"type": "string"
},
"_links": {
"type": "object",
"properties": {
"avatar": {
"type": "object",
"properties": {
"href": {
"type": "string"
}
}
}
}
},
"id": {
"type": "string"
},
"uniqueName": {
"type": "string"
},
"imageUrl": {
"type": "string"
},
"descriptor": {
"type": "string"
}
}
},
"parameters": {
"type": "string"
},
"orchestrationPlan": {
"type": "object",
"properties": {
"planId": {
"type": "string"
}
}
},
"logs": {
"type": "object",
"properties": {
"id": {
"type": "integer"
},
"type": {
"type": "string"
},
"url": {
"type": "string"
}
}
},
"repository": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"type": {
"type": "string"
},
"clean": {},
"checkoutSubmodules": {
"type": "boolean"
}
}
},
"keepForever": {
"type": "boolean"
},
"retainedByRelease": {
"type": "boolean"
},
"triggeredByBuild": {}
}
}
```
- Add a `Delay` action to wait 10 seconds before checking again
- Use the `status` token as the `Choose a value` input in the Do Until loop's condition:
Additionally, set the overall timeout to 5 minutes and the count limit to 31 executions. This ensures the loop bails after about 5 minutes by either hitting the timeout or hitting the max number of executions (given the 10-second delay between each).
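In expression form, the loop's exit condition ends up as something like this (`Parse_JSON_2` stands in for whatever your Build Status parse step is named; `completed` is the status Azure DevOps reports when a build finishes):

```
@equals(body('Parse_JSON_2')?['status'], 'completed')
```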
Notify when done
- Add a new Mobile Notification Action:
- Configure it to send a link to the Build Status page for the build:
The overall picture
My Vivaldi MS Flow implementation does everything I’ve walked through here, but for both Release and Snapshot versions of Vivaldi. In parallel. Yes, Flow can even execute things in parallel. Here’s the top-level view of my Vivaldi Updater for MS Flow:
Conclusion
We've got a wealth of power at our fingertips as developers in the 21st century. Companies like Microsoft continue to invest in higher and higher abstractions so we can focus on solving our business problems instead of things that have been solved 1000 times before in infrastructure, networking, etc. Tools like Flow, Functions, and DevOps have become the new staples of the cloud developer and offer an incredible amount of power, flexibility, and productivity when used to their fullest. As an added bonus, you can often experience all of these awesome features for free.
The next time you're wondering how to automate something, have a look at MS Flow (or MS Logic Apps), Functions, and Azure DevOps to see if they can help you focus on the bigger problems instead of the minutiae. This is just one example out of dozens where I've used these technologies to give me back time I'd otherwise spend doing something repetitive.