
Migrating Azure DevOps repositories to GitHub Enterprise with the GitHub import APIs

Azure DevOps (ADO) teams keep asking for a repeatable way to land on GitHub Enterprise Cloud without babysitting manual Git mirrors. The good news: GitHub’s import surface now covers one-off REST-based imports, the GitHub Enterprise Importer (GEI) GraphQL APIs, and automation-friendly tooling such as the gh ado2gh extension. Below is a field-tested playbook that blends those APIs, shows where the ado2gh tool shines, and demonstrates how to kick off migrations programmatically from .NET.

What actually moves during an ADO ➜ GitHub migration

According to the latest GitHub docs, GEI lifts Git source (including commit history), pull requests, reviewer history, work-item links, attachments, and branch policies. Git LFS pointers are supported, but binary objects still need to be pushed separately after the migration, and you’ll want to double-check Git/GitHub limits such as 400 MiB per file during import and 2 GiB per commit.
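
A quick pre-flight scan catches over-limit blobs before you queue anything. Here’s a minimal C# sketch, assuming a local clone (the path is a placeholder); it shells out to git cat-file so blobs anywhere in history are covered, not just the working tree:

using System.Diagnostics;

// List every object in the clone and flag blobs over the 400 MiB import limit.
// `git cat-file --batch-check --batch-all-objects` prints "<oid> <type> <size>".
var psi = new ProcessStartInfo("git", "cat-file --batch-check --batch-all-objects")
{
    WorkingDirectory = "/src/my-ado-clone", // placeholder path
    RedirectStandardOutput = true
};

using var git = Process.Start(psi)!;
string? line;
while ((line = git.StandardOutput.ReadLine()) is not null)
{
    var parts = line.Split(' ', StringSplitOptions.RemoveEmptyEntries);
    if (parts.Length >= 3 && parts[1] == "blob"
        && long.TryParse(parts[2], out var size) && size > 400L * 1024 * 1024)
    {
        Console.WriteLine($"Over-limit blob {parts[0]}: {size / (1024 * 1024)} MiB");
    }
}
git.WaitForExit();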

Choosing the right import interface

Option: GitHub Importer REST API (/repos/{owner}/{repo}/import)
  When to use it: Small Git repos or TFVC conversions when you just need code and commit authors.
  Gotchas: The Source Import API is retiring (its sunset began April 12, 2024); it never moved pull requests or policies, so it’s no longer the recommended path for ADO migrations.

Option: GitHub Enterprise Importer GraphQL API
  When to use it: Full-fidelity migrations (code, branches, pull requests, attachments, branch policies).
  Gotchas: Requires GraphQL, PATs for both clouds, and orchestration of multiple mutations/queries.

Option: gh ado2gh CLI extension
  When to use it: When you prefer guardrails: generate scripts, queue batches, reclaim mannequins, download logs.
  Gotchas: Needs both GH_PAT and ADO_PAT, weekly updates, and PowerShell when you generate scripts.

The ado2gh tool in practice

GitHub ships gh ado2gh, a GitHub CLI extension purpose-built for Azure DevOps migrations (docs). A typical flow:

  1. gh extension install github/gh-ado2gh (requires GitHub CLI v2.4+).
  2. Export GH_PAT (destination) and ADO_PAT (source). When targeting ghe.com data residency, also export TARGET_API_URL=https://api.<subdomain>.ghe.com and pass --target-api-url on every command.
  3. gh ado2gh generate-script --ado-org <source> --github-org <destination> --output migrate.ps1 to scaffold a PowerShell script that queues each repo migration. Flags like --all pull in extras (pipelines, teams, Boards wiring) and --download-migration-logs captures artifacts.
  4. Review/edit the script (rename target repos, drop the ones you don’t want) and run it (./migrate.ps1).
  5. For one-off repos, gh ado2gh migrate-repo ... is faster.

Think of gh ado2gh as an opinionated shell around the same GraphQL APIs discussed next. You still manage PAT scopes, rate limits, and migration sequencing—but you inherit a ton of ergonomics (CSV manifests, retry logic, log downloads, mannequin reclamation helpers).
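
If the rest of your tooling is .NET, you can also drive the CLI from a thin wrapper instead of PowerShell. A minimal sketch, assuming GH_PAT and ADO_PAT are already in the environment (the org, project, and repo names below are placeholders):

using System.Diagnostics;

// Queue one repo through the gh ado2gh extension; flags mirror the
// migrate-repo documentation.
var arguments = string.Join(' ',
    "ado2gh migrate-repo",
    "--ado-org fabrikam --ado-team-project Tailwind --ado-repo tailwind-api",
    "--github-org octocat-enterprise --github-repo tailwind-api");

var gh = Process.Start(new ProcessStartInfo("gh", arguments)
{
    RedirectStandardOutput = true,
    RedirectStandardError = true
})!;

Console.WriteLine(await gh.StandardOutput.ReadToEndAsync());
await gh.WaitForExitAsync();
if (gh.ExitCode != 0)
    Console.Error.WriteLine(await gh.StandardError.ReadToEndAsync());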

Programmatic migrations with the GraphQL API (and .NET)

When you need finer control—maybe you’re cherry-picking only the repositories tied to a specific cost center—you can talk straight to the GEI GraphQL surface. The core contract, per the API guide, looks like this:

  1. Get your destination’s ownerId via organization(login: ...).
  2. Register the source once with createMigrationSource, setting type: AZURE_DEVOPS and reusing the sourceId for every repo.
  3. Kick off each repo with startRepositoryMigration, passing the ADO repo URL, destination name, PATs, and visibility. Up to five repos run concurrently.
  4. Poll with getMigration / getMigrations until you see SUCCEEDED (or grab the failureReason if it trips).
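
Steps 1 and 2 reduce to two small GraphQL calls. Here’s a hedged sketch of those helpers (the function names are mine; they take the authenticated HttpClient configured in the snippet that follows):

using System.Text;
using System.Text.Json;

// Shared POST helper; `http` is the authenticated GraphQL client shown below.
static async Task<JsonElement> PostGraphQlAsync(HttpClient http, object payload)
{
    var response = await http.PostAsync(string.Empty,
        new StringContent(JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json"));
    response.EnsureSuccessStatusCode();

    using var document = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
    return document.RootElement.Clone();
}

// Step 1: look up the destination organization's node ID.
static async Task<string?> GetOwnerIdAsync(HttpClient http, string login)
{
    var data = await PostGraphQlAsync(http, new
    {
        query = "query ($login: String!) { organization(login: $login) { id } }",
        variables = new { login }
    });
    return data.GetProperty("data").GetProperty("organization").GetProperty("id").GetString();
}

// Step 2: register Azure DevOps as a migration source once; reuse the ID for
// every repository in the batch.
static async Task<string?> CreateMigrationSourceAsync(HttpClient http, string ownerId)
{
    var data = await PostGraphQlAsync(http, new
    {
        query = @"mutation ($ownerId: ID!) {
          createMigrationSource(input: { name: ""ADO"", url: ""https://dev.azure.com"", ownerId: $ownerId, type: AZURE_DEVOPS }) {
            migrationSource { id }
          }
        }",
        variables = new { ownerId }
    });
    return data.GetProperty("data").GetProperty("createMigrationSource")
        .GetProperty("migrationSource").GetProperty("id").GetString();
}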

Here’s a trimmed .NET 8 console snippet that loops over a list of repositories and orchestrates those mutations with the plain HttpClient + System.Text.Json stack. It assumes you already resolved ownerId, sourceId, and the two PATs (GitHub + Azure DevOps):

using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;

// Placeholders for the values resolved earlier (organization query +
// createMigrationSource mutation) and the two PATs.
var githubPat = Environment.GetEnvironmentVariable("GH_PAT")!;
var adoPat = Environment.GetEnvironmentVariable("ADO_PAT")!;
var ownerId = "<destination-org-node-id>";
var sourceId = "<migration-source-id>";
var reposToMigrate = new[]
{
    (SourceUrl: "https://dev.azure.com/<org>/<project>/_git/<repo>", TargetName: "<repo>"),
};

var http = new HttpClient { BaseAddress = new Uri("https://api.github.com/graphql") };
http.DefaultRequestHeaders.UserAgent.ParseAdd("ado-migration-sample");
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", githubPat);

var startMutation = @"mutation ($input: StartRepositoryMigrationInput!) {
  startRepositoryMigration(input: $input) {
    repositoryMigration { id state failureReason }
  }
}";

async Task<string?> StartMigrationAsync(string sourceRepoUrl, string targetName)
{
    var payload = new
    {
        query = startMutation,
        variables = new
        {
            input = new
            {
                sourceId,
                ownerId,
                repositoryName = targetName,
                sourceRepositoryUrl = sourceRepoUrl,
                targetRepoVisibility = "private",
                continueOnError = true,
                githubPat,
                accessToken = adoPat // the Azure DevOps PAT
            }
        }
    };

    var response = await http.PostAsync(string.Empty,
        new StringContent(JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json"));
    response.EnsureSuccessStatusCode();

    using var document = JsonDocument.Parse(await response.Content.ReadAsStringAsync());

    // GraphQL reports failures in an `errors` array alongside HTTP 200, so
    // surface those before digging into `data`.
    if (document.RootElement.TryGetProperty("errors", out var errors))
    {
        Console.Error.WriteLine($"GraphQL error: {errors}");
        return null;
    }

    return document.RootElement
        .GetProperty("data")
        .GetProperty("startRepositoryMigration")
        .GetProperty("repositoryMigration")
        .GetProperty("id")
        .GetString();
}

foreach (var repo in reposToMigrate)
{
    var migrationId = await StartMigrationAsync(repo.SourceUrl, repo.TargetName);
    Console.WriteLine($"Started {repo.SourceUrl} ➜ {repo.TargetName} as {migrationId}");
}

From here you can add a getMigration query (same HTTP client) that inspects state/failureReason, download the “Migration Log” issue that GEI opens in each destination repo, and fan out status alerts to Teams/Slack.
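
That status check can reuse everything above. A minimal polling sketch, assuming a migration ID captured from the start loop (the docs model migrations as Migration nodes whose state moves through QUEUED and IN_PROGRESS before settling on SUCCEEDED or FAILED):

var statusQuery = @"query ($id: ID!) {
  node(id: $id) { ... on Migration { state failureReason } }
}";

async Task<(string? State, string? FailureReason)> GetMigrationAsync(string migrationId)
{
    var payload = new { query = statusQuery, variables = new { id = migrationId } };
    var response = await http.PostAsync(string.Empty,
        new StringContent(JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json"));
    response.EnsureSuccessStatusCode();

    using var document = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
    var node = document.RootElement.GetProperty("data").GetProperty("node");
    return (node.GetProperty("state").GetString(), node.GetProperty("failureReason").GetString());
}

// Poll every 30 seconds until the migration settles.
while (true)
{
    var (state, failureReason) = await GetMigrationAsync(migrationId);
    Console.WriteLine($"{migrationId}: {state}");
    if (state is "SUCCEEDED" or "FAILED")
    {
        if (failureReason is not null) Console.Error.WriteLine(failureReason);
        break;
    }
    await Task.Delay(TimeSpan.FromSeconds(30));
}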

Edge cases to bake into your automation

  • Branch policies: imported automatically, but user-scoped or cross-repo policies stay behind. Rebuild those with branch protection rules post-cutover (a REST sketch follows this list).
  • Rate limits: GEI caps you at five concurrent repository migrations. Queue the rest so you don’t get throttled mid-flight.
  • Large binaries: the importer skips Git LFS objects—push them after the fact or switch to GitHub Actions Artifacts.
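
For that first bullet, the REST branch-protection endpoint covers most policy rebuilds. A minimal sketch, assuming the githubPat from earlier and placeholder owner/repo/branch names; the endpoint requires all four top-level keys, so unused ones are passed as explicit nulls:

using System.Net.Http.Headers;
using System.Text;

// PUT /repos/{owner}/{repo}/branches/{branch}/protection
var rest = new HttpClient { BaseAddress = new Uri("https://api.github.com/") };
rest.DefaultRequestHeaders.UserAgent.ParseAdd("ado-migration-sample");
rest.DefaultRequestHeaders.Accept.ParseAdd("application/vnd.github+json");
rest.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", githubPat);

var protection = """
{
  "required_status_checks": null,
  "enforce_admins": true,
  "required_pull_request_reviews": { "required_approving_review_count": 2 },
  "restrictions": null
}
""";

var result = await rest.PutAsync("repos/octocat-enterprise/tailwind-api/branches/main/protection",
    new StringContent(protection, Encoding.UTF8, "application/json"));
result.EnsureSuccessStatusCode();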

Mannequins and fixing user history

Even with perfect PAT scopes, legacy commenters show up as mannequins: placeholder accounts that carry over issue and pull-request history. GitHub lets you reclaim mannequins either interactively or via the CLI:

  • Bulk-export the mannequin roster: gh ado2gh generate-mannequin-csv --github-org <destination> --output mannequins.csv.
  • Map each mannequin row to a real GitHub username (the CSV shape is shown below).
  • Reclaim in one go: gh ado2gh reclaim-mannequin --github-org <destination> --csv mannequins.csv (add --skip-invitation if you’re on Enterprise Managed Users and don’t want email approvals).
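
The generated CSV is a plain three-column file (mannequin-user, mannequin-id, target-user per the docs); you fill in the last column before reclaiming. An illustrative row with made-up values:

mannequin-user,mannequin-id,target-user
mona-ado,MDQ6VXNlcjE=,mona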

Until you reclaim mannequins, their activity stays searchable only by placeholder ID. Commit authorship is separate—engineers can reattribute commits by adding the email they used in ADO to their GitHub account (or, for Enterprise Managed Users, by aligning IdP primary emails).

Suggested migration checklist

  1. Dry run quickly: migrate a staging org first, read the Migration Log issue, and capture fix-ups.
  2. Freeze + communicate: the importer is not delta-aware; plan for a short content freeze or manually backfill deltas after the cutover.
  3. Script your GraphQL flow: wrap the snippet above with resilient retry logic and status polling, then store migration IDs so you can resume after interruptions (see the resume sketch after this list).
  4. Reclaim mannequins + reapply permissions: mannequin reclamation doesn’t grant repo access—add teams/individuals once they’ve reclaimed history.
  5. Watch for ruleset conflicts: org-level branch rules (required signatures, email suffixes, etc.) can block pushes from the importer. Temporarily relax them or migrate into a staging org.
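
For item 3, here’s a minimal resume sketch, assuming the reposToMigrate list and StartMigrationAsync helper from the earlier snippet (the state-file name is a placeholder):

using System.Text.Json;

// Persist migration IDs as each repo starts so a crashed run resumes polling
// instead of re-queuing repos that are already in flight.
var stateFile = "migrations.json";
var started = File.Exists(stateFile)
    ? JsonSerializer.Deserialize<Dictionary<string, string>>(File.ReadAllText(stateFile))!
    : new Dictionary<string, string>();

foreach (var repo in reposToMigrate)
{
    if (started.ContainsKey(repo.SourceUrl)) continue; // queued on a previous run

    var migrationId = await StartMigrationAsync(repo.SourceUrl, repo.TargetName);
    if (migrationId is not null)
    {
        started[repo.SourceUrl] = migrationId;
        File.WriteAllText(stateFile, JsonSerializer.Serialize(started)); // persist immediately
    }
}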

With these pieces—GitHub’s import APIs, the ado2gh CLI, and a thin .NET orchestrator—you can move curated sets of Azure DevOps repositories into GitHub Enterprise Cloud while keeping code, branches, pull requests, attachments, and branch policies intact, then rapidly clean up identities so contributors see their own history on day one.
