Microsoft DevOps FastTrack & Azure DevOps Migrations


While I was writing this post, Microsoft rebranded VSTS (Visual Studio Team Services) to Azure DevOps. I have used the new name throughout so that the post matches the latest naming and documentation references.

Introduction

I recently completed a very nice project that finished with a migration weekend, bringing several TFS collections to Azure DevOps. In this write-up I share my experiences running the DevOps FastTrack program, as well as my approach to migrating from TFS to Azure DevOps.

Microsoft FastTrack Program

The Microsoft FastTrack (DevOps Accelerator) program is available to customers that qualify. It consists of a two-week engagement run by Microsoft-selected and Microsoft-trained consultants. With Rene van Osnabrugge and myself, Xpirit has two of them!

The program comes in two flavors. The first is called “Lighthouse” and is focused on helping the customer learn and experience DevOps principles in the areas where they need it most. These areas are investigated up front, and the customer puts together a team of people to work closely together on the selected areas. As the consultant delivering the program, it is your responsibility to help the customer get the most out of the engagement. Running this program typically touches all of the Application Lifecycle Management topics you can think of.

The second flavor is the “Migration” scenario. This applies to customers that have one or more TFS environments and want to migrate to Azure DevOps. In this program the consultant works closely with a predefined team to help the customer migrate from TFS to Azure DevOps.

Both flavors start with delivering the Microsoft DevOps story: the transition to 1ES (One Engineering System) and the drive to become a true DevOps organization. Many of the resources used for this can be found publicly on the Microsoft DevOps Blog.

One of the key takeaways for me is seeing how a product like Azure DevOps has evolved over the years and how dedicated Microsoft is to delivering it to the market. Azure DevOps is dogfooded internally at Microsoft by the 800+ engineers working on the product, and the Windows and Office teams use it as well. Seeing DevOps practices applied at such a scale provides lots of confidence in the tools.

Running these types of programs allows customers to get a real head start in their DevOps transformation!

Database Import Service Migrations

My colleague Manuel Riezebosch has written a detailed blog post on how we typically run Azure DevOps migrations. During my recent FastTrack migrations I have developed this strategy further. The most important lesson learned running them is that this approach always fits, although the devil is always in the migration details. The approach follows the large-database migration route using the TFS Database Import Service, because this makes the process repeatable for a collection of any size.

The official guide has a good structure and lots of references and pointers to help customers, but it lacks guidance on turning all the steps involved into a repeatable process; they are meant to be executed manually. Having done many of these migrations, I have learned that you will be repeating these steps, and yourself, multiple times. With my DevOps mindset, I like to script these repeating activities; in other words: automate everything! On top of that, put your scripts and logs in a Git repository so you keep track of everything. Even comparing a log file between runs can be valuable! A small sketch of that habit follows.
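As an illustration, here is a minimal helper I tend to use to make every step logged and tracked: run the step through a wrapper that writes a timestamped log file and commits it to the migration Git repository, so each run can be diffed against the previous one. The helper and its name are my own convention, not part of the official tooling.

```powershell
# Run a migration step, capture its output to a timestamped log and commit the log to Git.
function Invoke-MigrationStep {
    param(
        [string]$Name,          # short step name, used in the log file name
        [scriptblock]$Action    # the actual step to execute
    )
    $log = "logs\$Name-$(Get-Date -Format 'yyyyMMdd-HHmmss').log"
    & $Action 2>&1 | Tee-Object -FilePath $log
    git add $log
    git commit -m "Log for step '$Name'" | Out-Null
}

# Example usage (hypothetical step):
# Invoke-MigrationStep -Name "validate" -Action { & .\TfsMigrator.exe validate /collection:$collectionUrl }
```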

In my approach, running a migration always consists of three distinct phases:

(Figure: the three migration phases – Preparation, Migration, Post-Migration)

Preparation Phase

In this phase you plan all the work that needs to be, and can be, executed before starting the actual migration. This allows you to take most of the work off the critical path. Some pointers to think about:

  • Are the users ready for this migration?
  • Do we want our situation to be the same after the migration? Can we make improvements before starting the migration?
  • Can we move away from Team Foundation Version Control towards Git?
  • What about the build definitions, do you have agents to build your software on?
  • Can we utilize Azure Hosted Agents?
  • Do we have or need a private NuGet repository? Can we move to Package Management?
  • Are the extensions currently in use available and installable in Azure DevOps?

There are so many questions to ask and answers to be provided, next to the actual work to be done, that there is a substantial risk of taking on things that are not strictly required for the migration itself. I have had customers that wanted to change everything and take on many improvements while moving to Azure DevOps. My recommendation is to keep the migration as small as possible. First, this makes the migration so much easier; it is complex enough already. Secondly, it limits the impact when executing the migration. Most of the time the migration is done in a weekend to limit impact, and as we know, weekends are short.

Key takeaway here: make sure to discuss all topics with the customer, but try to limit the scope! Address further improvements in an Agile fashion after the migration.

Migration Phase

The migration phase consists of all the activities that are necessary until you can trigger the actual database import process. This phase should be run at least once as a dry-run import into Azure DevOps. I prefer moving to a first dry run very quickly, because it makes everything much more practical and allows you to find and prove the changes you are going to have to make. Again, script your changes, because you will need to apply them at least one more time. Unfortunately (and luckily, otherwise it would be quite boring) every migration differs, and a “typical” database migration scenario involves the following steps (two script sketches follow the list):

  • Restore Database – Restore a backup of the production TFS database(s).
  • Attach Collection – Attach the collection to a newly created TFS instance; most of the time this instance is also used to upgrade to the latest available TFS version and Migrator tooling.
  • Run TFSMigrator Validate – Run the validation to determine the activities needed for your collection(s), and follow the instructions it gives.
  • Export Workitems – If flagged, export the work item type definitions that need fixing.
  • Fix Workitems – Fix the work item type and/or process definitions.
  • Import Workitems – Import the changed definitions.
  • Delete Global Lists – Global lists are not supported and need to be deleted.
  • Run TFSMigrator Prepare – Run a prepare with the TFS Migrator to validate that your collection is now ready to be imported.
  • Copy JSON File – Put the import configuration in the right place.
  • Create DB User – Create the SQL user the Import Service uses to access the database.
  • Detach Database – Take the collection offline from TFS.
  • Prepare JSON file – Change the details in the configuration.
  • Queue TFSMigrator Import – Trigger the TFS Migrator and wait 😊
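To give an impression of what scripting these steps looks like, here is a minimal PowerShell sketch of the first half of the list: restore, attach, validate and prepare. All paths, server names, the collection URL and the AAD tenant (contoso.com) are placeholders to adapt, and you should check the TfsMigrator and TfsConfig command lines against the version of the tooling you are actually running.

```powershell
# Assumed locations and names - adjust to your environment.
$collectionUrl = "http://localhost:8080/tfs/DefaultCollection"
$migrator      = "C:\Migration\Tools\TfsMigrator.exe"
$tfsConfig     = "C:\Program Files\Microsoft Team Foundation Server 2018\Tools\TfsConfig.exe"
$logDir        = "C:\Migration\Logs"

# Restore Database: restore a backup of the production collection database.
Invoke-Sqlcmd -ServerInstance "localhost" -Query @"
RESTORE DATABASE [Tfs_DefaultCollection]
FROM DISK = N'C:\Migration\Backups\Tfs_DefaultCollection.bak'
WITH RECOVERY
"@

# Attach Collection: attach the restored database to the migration TFS instance
# (the exact TfsConfig syntax differs per TFS version).
& $tfsConfig attach /collectionDb:"localhost;Tfs_DefaultCollection"

# Run TFSMigrator Validate: keep the log so it can be diffed between runs.
& $migrator validate /collection:$collectionUrl |
    Tee-Object -FilePath "$logDir\validate-$(Get-Date -Format yyyyMMdd-HHmm).log"

# Run TFSMigrator Prepare: generates the import.json used to queue the import later.
& $migrator prepare /collection:$collectionUrl /tenantDomainName:contoso.com /region:WEU |
    Tee-Object -FilePath "$logDir\prepare-$(Get-Date -Format yyyyMMdd-HHmm).log"
```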
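The work item and database steps can be scripted in the same way. Again a sketch with assumed names; the TFSEXECROLE grant reflects the import guide's requirement that the Import Service reads the collection database through a dedicated SQL login.

```powershell
# Same assumed values as the previous sketch.
$collectionUrl = "http://localhost:8080/tfs/DefaultCollection"
$migrator      = "C:\Migration\Tools\TfsMigrator.exe"
$witadmin      = "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer\witadmin.exe"

# Export / Fix / Import Workitems: export a type definition, edit it, import it back.
& $witadmin exportwitd /collection:$collectionUrl /p:"MyProject" /n:"Bug" /f:"C:\Migration\WorkItems\Bug.xml"
# ... apply the fixes the validate step asked for to Bug.xml ...
& $witadmin importwitd /collection:$collectionUrl /p:"MyProject" /f:"C:\Migration\WorkItems\Bug.xml"

# Delete Global Lists: not supported in Azure DevOps, so destroy them up front.
& $witadmin destroygloballist /collection:$collectionUrl /n:"Builds - MyProject" /noprompt

# Create DB User: the SQL login the Import Service uses to read the collection database.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "Tfs_DefaultCollection" -Query @"
CREATE LOGIN [TfsImport] WITH PASSWORD = N'<a-strong-generated-password>';
CREATE USER [TfsImport] FOR LOGIN [TfsImport];
EXEC sp_addrolemember @rolename = N'TFSEXECROLE', @membername = N'TfsImport';
"@

# Queue TFSMigrator Import: after detaching the collection and preparing import.json.
& $migrator import /importFile:"C:\Migration\import.json"
```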

Post-Migration Phase

In this last phase you get to enjoy your new Azure DevOps environment and take the actions needed to start using it. During this phase you execute the scripts you prepared earlier, after an initial dry-run import. These are all things you can prepare in advance; some topics to think about (an agent-setup sketch follows the list):

  • Install used / required Marketplace extensions
  • (Re)create Agent Pools
  • Setup Agents
  • Enable / ask for desired preview features
  • Test access and permissions with real users
  • Check if all users are informed and ready
  • Enable alternate accounts for users
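For the agent setup, registration of a self-hosted agent can be scripted up front using the documented unattended configuration flags of the Azure Pipelines agent. A sketch; the organization URL, PAT, pool name and agent version are placeholders:

```powershell
# Download and extract the agent package (check the releases page for the current version).
$agentZip = "$env:TEMP\agent.zip"
Invoke-WebRequest -Uri "https://vstsagentpackage.azureedge.net/agent/2.140.0/vsts-agent-win-x64-2.140.0.zip" -OutFile $agentZip
Expand-Archive -Path $agentZip -DestinationPath "C:\agent"

# Register the agent unattended against the new organization and run it as a service.
Set-Location "C:\agent"
.\config.cmd --unattended `
    --url "https://dev.azure.com/myorg" `
    --auth pat --token "<personal-access-token>" `
    --pool "Default" --agent "$env:COMPUTERNAME" `
    --runAsService
```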

These post-migration steps are often executed under more time pressure. Therefore it is advisable to prepare well with a dry run, because you do not want to run into uncovered issues directly after the production import. Having dealt with some bigger issues after a migration, I know for sure you want your scripts to be ready; anything can happen.

Migration scripts

Before starting with the scripts, a word of caution: you run these scripts at your own risk! You will need to change them to fit your situation. I also never run against or touch the original (production) TFS database(s). I always create a new machine for the migration, preferably a server that is re-creatable and cannot cause issues in your current production environment.

To help you get started I have publicly shared my starting repository here: https://github.com/JasperGilhuis/AzureDevOpsMigration.

In the repository you will find my default structure, along with a Markdown file containing a more detailed description of the files and their function. The scripts all follow a more or less similar approach (which could certainly be improved) and are therefore mostly self-explanatory.

Currently the repository contains the Migration Phase scripts only.

Running a Migration

As mentioned before, any migration has many moving parts, so be prepared for last-minute changes. If you did a dry run weeks before the actual migration, do another one close to the production migration. A new version of the tools may be available, or recent changes in TFS may require new changes on your side. Depending on the environment you are running the migration in, you may also find that the Azure DevOps Import Service has new public IP addresses. This may require you to open new firewall ports, and unfortunately, in many organizations that can be a bit of a challenge.

Once you have worked your way through all the scripts and have been able to successfully trigger an import, you will get an email message when the import is done. Sometimes it is nice to get more updates than the default ones, and to be able to send them to more people at once. You could set up a service that checks areas of the status page for changes. In my experience, the Wachete service works really well for this.
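If you prefer not to depend on an external service, a simple polling script can do roughly the same thing: hash the content of the status page and send a mail when it changes. A sketch; the URL and mail parameters are placeholders, and a real status page will likely require authentication, which is omitted here:

```powershell
# Poll a status page and send a mail whenever its content changes.
$url = "https://dev.azure.com/myorg/_import-status"   # placeholder URL
$lastHash = ""
while ($true) {
    $content = (Invoke-WebRequest -Uri $url -UseBasicParsing).Content
    $hash = [BitConverter]::ToString(
        [Security.Cryptography.SHA256]::Create().ComputeHash(
            [Text.Encoding]::UTF8.GetBytes($content)))
    if ($lastHash -and $hash -ne $lastHash) {
        Send-MailMessage -To "team@contoso.com" -From "migration@contoso.com" `
            -Subject "Import status page changed" -SmtpServer "smtp.contoso.com"
    }
    $lastHash = $hash
    Start-Sleep -Seconds 300   # check every five minutes
}
```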

Migration War Stories

Some of the migrations did not run as smoothly as expected; others worked like a breeze. Here are a few areas to reflect on. In all of the odd cases I had tremendous support from Microsoft and was always able to resolve the issues!

All historical users

If you have no way of connecting your AAD instance directly, or for whatever other reason, you may end up with all your TFS users as historical users.

In this scenario I have found that the Database Import Service actually imports the “old” users, which means the old members are still visible in groups and teams. These entries cannot be used for current permissions, but they can serve as placeholders for replacing the old user with the new one using the REST API.
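A sketch of what that replacement can look like with the Graph REST API (still in preview at the time of writing). The organization name, the user accounts, the group name and the API version are all placeholders, so verify the endpoints against the current REST documentation before using anything like this:

```powershell
# Assumed values: organization name and a PAT with Graph (read & manage) scope.
$org = "myorg"
$pat = "<personal-access-token>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$base = "https://vssps.dev.azure.com/$org/_apis/graph"

# Look up the descriptors of the old (historical) user, the new user and the target group
# (paging via continuation tokens is omitted for brevity).
$users  = (Invoke-RestMethod "$base/users?api-version=5.0-preview.1"  -Headers $headers).value
$groups = (Invoke-RestMethod "$base/groups?api-version=5.0-preview.1" -Headers $headers).value
$old  = $users  | Where-Object principalName -eq "old.user@tfs.local"     # hypothetical accounts
$new  = $users  | Where-Object principalName -eq "new.user@contoso.com"
$team = $groups | Where-Object displayName   -eq "MyProject Team"

# Add the new user to the group, then remove the historical placeholder.
Invoke-RestMethod -Method Put    -Headers $headers "$base/memberships/$($new.descriptor)/$($team.descriptor)?api-version=5.0-preview.1"
Invoke-RestMethod -Method Delete -Headers $headers "$base/memberships/$($old.descriptor)/$($team.descriptor)?api-version=5.0-preview.1"
```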

Broken Build and Release Definitions

These areas typically need work in the Post-Migration phase. I have had a production import show undesired behavior: several of the customer's custom extensions appeared to be installed, but their tasks were not actually available. I worked around this by using the TFX command-line utility to install the tasks manually. Microsoft quickly provided a fix for this issue, so it was not a blocker in the end. But make sure you test your build and release definitions, so you are not flying blind.
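For reference, uploading a task manually with the cross-platform TFX CLI looks roughly like this; the organization URL, PAT and task folder are placeholders:

```powershell
# Install the TFX CLI (requires Node.js) and sign in with a PAT.
npm install -g tfx-cli
tfx login --service-url https://dev.azure.com/myorg --token <personal-access-token>

# Upload the build/release task from a local folder containing its task.json.
tfx build tasks upload --task-path ./MyCustomTask
```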

Process Customizations

While I know it is possible to customize processes in TFS and in Azure DevOps, I am not really a big fan. What I have learned is that many customizations are a sign of an overly “controlled” or “micromanaged” process, applied in an Agile DevOps environment where teams need very little to deliver end-user value. In many of these assignments I try to move away from the current TFS customizations and get everybody on the same process. This is often successful, and it lets me help teams focus on the more valuable things. The out-of-the-box functionality in Azure DevOps is often more than sufficient.

Database Collations

TFS allows many collations for the underlying SQL Server database platform, which may result in TFS content containing special characters. There might be issues in this area, and there is only one way to find out: do a dry run. With the help of Microsoft I have always been able to resolve the problems. I had one migration that ran into conflicts during the database import and unexpected end results, but by cleaning all workspaces I was able to import the collection successfully. In this scenario deleting data was acceptable for the customer, because we had only historical users and therefore the workspaces were not of high value. More importantly, it shows that unexpected things occur and you need to be prepared to take action!
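Cleaning all workspaces can be scripted around tf.exe. A rough sketch that parses the detailed listing; the paths and collection URL are placeholders, the parsing is fragile by nature, and deleting a workspace discards its pending changes, so only run something like this when that has been agreed with the customer:

```powershell
# Path to tf.exe from Team Explorer / Visual Studio - adjust to your installation.
$tf = "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer\tf.exe"
$collectionUrl = "http://localhost:8080/tfs/DefaultCollection"

# List every workspace in the collection, then delete each one by name and owner.
$output = & $tf workspaces /collection:$collectionUrl /owner:* /computer:* /format:detailed
$name = $null
foreach ($line in $output) {
    if ($line -match '^Workspace\s*:\s*(.+)$') { $name = $Matches[1].Trim() }
    elseif ($line -match '^Owner\s*:\s*(.+)$' -and $name) {
        $owner = $Matches[1].Trim()
        & $tf workspace /delete "$name;$owner" /collection:$collectionUrl /noprompt
        $name = $null
    }
}
```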

Multiple Collections

Customers with many collections, often created for legitimate reasons or inherited from previous TFS installations, can be a bit of a discussion point. The Database Import Service is a clean and solid approach, but customers need convincing to take this route. Often customers feel that a single organization is the only acceptable result, which implies that many collections will result in many organizations. While this may be suboptimal for some reasons, Azure DevOps has first-class functionality to support it. I believe spending time on lengthy manual and custom migrations is often a waste of time that could be spent on delivering value for the organization.

Go for it!

Migrating to Azure DevOps using the TFS Database Import Service has my personal preference. In my experience the migration approach described above has proven to be solid and can be applied in a reasonably short amount of time, with strong support from Microsoft. My advice: prepare well and go for it!

Cover photo by Ethan Weil on Unsplash 
