The blog of Richard Fennell
Whilst presenting on ‘Migrating DevOps Toolsets’ at DDDNorth last weekend, I mentioned a blog post & flowchart I had created a few years ago to guide people through their options when migrating from TFS to what was then called VSTS (Azure DevOps Services). When I got home, I thought the post & flowchart were worth updating. I also took the chance to make them a little more generic, so they may be useful as a guide for a wider range of DevOps toolset migrations ...
Background I have been doing the regular maintenance in our Azure DevOps Pipelines of updating the versions of tasks. This usually means you perform one of the following actions:

- Just increment the major version number in the YAML, e.g. `task: MyTask@1` to `task: MyTask@2`, when there is a newer version of the task available.
- If the task is out of support and abandoned, swap to a different task that does the same action and is still being supported.
- Fork the out-of-date task and perform the updates to bring it back into support.
- Swap to a PowerShell/Bash script that wraps a CLI tool to do the same action - increasingly my go-to solution.

It is a shame the first option is often not possible, but don’t get me, or other MVPs, started on the long-running subject of abandoned insecure Azure DevOps extensions ...
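As a sketch of the first option, using the generic `MyTask` name from above (the input name is illustrative), the change is just the version suffix in the step definition; it is worth checking the task's release notes first, as a new major version can rename or remove inputs:

```yaml
steps:
  # Before: pinned to the old major version
  # - task: MyTask@1
  # After: bump the major version and re-run the pipeline to
  # confirm the inputs are still valid for the new version
  - task: MyTask@2
    inputs:
      someInput: "value" # illustrative input name
```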
I have posted in the past about the issues a misconfigured cache can have in Azure DevOps Pipelines or GitHub Actions. Well, it caught me out again, wasting a few hours of my time. Today, I had a failing integration test in a CI/CD pipeline, but all the tests passed locally. The failing test was looking for a certain number of rows returned from an API with known seed test data. I had revised the seed data and my test, but in my CI/CD pipeline my test was failing as it was still looking for the old number of rows. ...
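A minimal sketch of the kind of guard that avoids this class of stale-cache failure, assuming a `Cache@2` task and illustrative paths: include a glob over the seed-data files in the cache key, so editing the seed data changes the key and produces a cache miss instead of restoring stale content.

```yaml
# Illustrative key segments and paths; Cache@2 hashes the files
# matched by the glob segment into the key, so changing any seed
# data file changes the key and forces a fresh population
- task: Cache@2
  inputs:
    key: 'integration-testdata | "$(Agent.OS)" | tests/SeedData/**'
    path: $(Pipeline.Workspace)/.testdata-cache
```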
The Issue I recently had an issue trying to add a new Azure DevOps Pipeline Agent Pool to an existing Azure DevOps 2022 Server via the Team Project Collection Settings UI. When I tried to add the agent pool I got the error ‘Access denied needs Manage permissions to perform this action. For more information, contact the Azure DevOps Server administrator’. The problem was that I was the Azure DevOps Server administrator ...
The Issue I have been chasing what turned out to be a non-existent fault when trying to ingest test code coverage data into our SonarQube instance. I saw my ‘problem’ in a .NET 8.0 solution with XUnit v3 based unit tests; this solution was being built using this Azure DevOps Pipelines YAML:

```yaml
- task: SonarQubePrepare@7
  inputs:
    SonarQube: "SonarQube"
    scannerMode: "dotnet"
    jdkversion: "JAVA_HOME_17_X64"
    projectKey: "${{ parameters.sonarQubeProjectKey }}"
    projectName: "${{ parameters.sonarQubeProjectName }}"
    projectVersion: "$(GitVersion_Major).$(GitVersion_Minor)"
    extraProperties: |
      # Additional properties that will be passed to the scanner,
      # Put one key=value per line, example:
      sonar.cpd.exclusions=**/AssemblyInfo.cs,**/*.g.cs
      # Ingest the test results and coverage data
      sonar.cs.vscoveragexml.reportsPaths=$(Agent.TempDirectory)/**/*.coveragexml
      sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/**/*.trx
- task: DotNetCoreCLI@2
  displayName: ".NET Build"
  inputs:
    command: "build"
    arguments: >
      --configuration ${{ parameters.buildConfiguration }}
      --no-restore
    projects: "$(Build.SourcesDirectory)/src/MySolution.sln"
- task: DotNetCoreCLI@2
  displayName: ".NET Test"
  inputs:
    command: "test"
    projects: "$(Build.SourcesDirectory)/src/MySolution.sln"
    arguments: >
      --configuration ${{ parameters.buildConfiguration }}
      --collect "Code coverage"
      --no-restore
      --no-build
- task: SonarQubeAnalyze@7
  displayName: "Complete the SonarQube analysis"
  inputs:
    jdkversion: "JAVA_HOME_17_X64"
- task: SonarQubePublish@7
  displayName: "Publish Quality Gate Result"
  inputs:
    pollingTimeoutSec: "300"
```

At the start of the SonarQubeAnalyze@7 task log I could see that the .coverage file was found and converted into a .coveragexml file. However, there were multiple ‘The device is not ready’ errors when parsing this file later in the process. ...
The Issue I have some long standing PowerBI reports that I use for summarizing project data. They use a variety of data sources, including Azure hosted SQL instances. I recently moved the Azure hosted SQL databases to a new instance as part of a major tidy up of my Azure resources. This of course caused my reports to break. I thought swapping the SQL connection details in PowerBI would be easy, and I guess it was, but it took me too long to work out how. ...
The Issue We have used SonarQube and the OWASP Dependency Checker plugin for many years to perform analysis and vulnerability checking within our Azure DevOps Pipelines. Recently, whilst picking up an old project for a new phase of development, I came across a couple of problems due to changes in both tools since the project CI/CD pipelines were last run:

- The OWASP Dependency Checker vulnerabilities were not appearing in SonarQube as issues
- The OWASP Dependency Checker HTML report could not (always) be loaded in SonarQube

The issues were just down to changes in both tools over time. It just goes to show that you can’t just set up a CI/CD system and expect it to work forever; changes are always being introduced in cloud-based tools. ...
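For context, ingesting Dependency Checker output into SonarQube is typically wired up via scanner properties. A hedged sketch, assuming the community dependency-check-sonar-plugin and illustrative report paths — the property names vary between plugin versions, which is exactly the kind of drift described above:

```yaml
- task: SonarQubePrepare@7
  inputs:
    SonarQube: "SonarQube" # service connection name, illustrative
    scannerMode: "dotnet"
    extraProperties: |
      # Point SonarQube at the reports the dependency-check scan wrote;
      # these property names come from the dependency-check-sonar-plugin
      # and may differ in older plugin versions
      sonar.dependencyCheck.jsonReportPath=$(Common.TestResultsDirectory)/dependency-check/dependency-check-report.json
      sonar.dependencyCheck.htmlReportPath=$(Common.TestResultsDirectory)/dependency-check/dependency-check-report.html
```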
Another podcast I recently recorded with our friends at Grey Matter has just been published: ‘Secure by design: The DevSecOps mindset’
I recently came across an interesting side effect with the Azure DevOps cache task if its settings are not correctly configured. One that caused me to get somewhat confused before I realised what had occurred. The Problem I had a working pipeline that, as part of its build process, ran the OWASP Dependency Checker task. This can be slow to run as it has to download the current vulnerability database. To try to speed up my builds I have been using the cache task to cache the current pipeline run’s downloaded vulnerability database, so on the next run the vast majority of the database is already downloaded. ...
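The shape of that caching step, as a minimal sketch with illustrative key segments and path (the Dependency Checker task then needs pointing at the cached data directory):

```yaml
# Illustrative: $(CacheDate) is assumed to be a variable set earlier
# in the pipeline to today's date. Because Cache@2 only saves when the
# full key misses, the date segment forces a fresh save each day while
# restoreKeys falls back to the newest previous day's database.
- task: Cache@2
  displayName: "Cache OWASP vulnerability database"
  inputs:
    key: 'owasp-dependency-check | "$(Agent.OS)" | "$(CacheDate)"'
    restoreKeys: |
      owasp-dependency-check | "$(Agent.OS)"
    path: $(Pipeline.Workspace)/dependency-check-data
```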
Updated 30-Oct-2025: added more details on Screen Refresh Rate and Teams Room App. The Problem We have owned a Surface Hub v1 for a number of years, and it has served us well. However, with Microsoft ending support for Windows 10, it was in danger of becoming a large piece of sculpture in the office. This is not just because we did not want to run a Windows 10 device when security patches were not available, but that the embedded version of Teams would not even load. ...