CLI-first documentation

ado-sync

Bidirectional sync between local test specs and Azure DevOps Test Cases.

ado-sync supports Gherkin .feature and Markdown .md specs, test code in C#, Java, Python, and JavaScript/TypeScript (including Jest, Playwright, Cypress, Puppeteer, and TestCafe), plus mobile, desktop, and other stacks: Detox, Espresso, Flutter, Robot Framework, Go, RSpec, PHPUnit, Rust, CSV, and Excel.

Use it for

  • Creating and updating Azure DevOps Test Cases from local specs
  • Writing IDs back into source files so tests and work items stay linked
  • Publishing results from TRX, JUnit, Playwright, and Cucumber outputs
  • Supporting AI-assisted summarization, generation, and MCP workflows

Start here

Quick start

npm install -g ado-sync
ado-sync init
export AZURE_DEVOPS_TOKEN=your_pat
ado-sync push --dry-run
ado-sync push

Minimum config:

{
  "orgUrl": "https://dev.azure.com/YOUR-ORG",
  "project": "YOUR-PROJECT",
  "auth": { "type": "pat", "token": "$AZURE_DEVOPS_TOKEN" },
  "testPlan": { "id": 12345 },
  "local": { "type": "gherkin", "include": "specs/**/*.feature" }
}

Core flow

  1. Run ado-sync init to create the config.
  2. Set AZURE_DEVOPS_TOKEN or another supported auth method.
  3. Use ado-sync push --dry-run to inspect pending work safely.
  4. Use ado-sync push or ado-sync pull once the mapping looks correct.
  5. Publish automated test results with ado-sync publish-test-results.
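Put together, the flow above can be run as a short script — a minimal sketch, assuming ado-sync is installed globally and a config has already been created by ado-sync init:

# Sketch of the core flow; assumes ado-sync is on PATH.
export AZURE_DEVOPS_TOKEN=your_pat   # or another supported auth method

ado-sync validate            # check config and Azure connectivity first
ado-sync push --dry-run      # inspect pending work without writing anything
ado-sync push                # create/update Test Cases, write IDs back
ado-sync publish-test-results   # publish automated test results

Running validate before the first push surfaces auth and connectivity problems early, while --dry-run keeps the initial mapping inspection side-effect free.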

Common commands

Command                 Purpose
init                    Interactive config wizard
validate                Check config and Azure connectivity
push                    Local specs to Azure Test Cases
pull                    Azure Test Cases to local files
status                  Show pending changes without modifying anything
diff                    Field-level drift between local and Azure
generate                Scaffold spec files from ADO User Stories
publish-test-results    Publish TRX, JUnit, Playwright, or Cucumber results
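As an illustration, publishing a JUnit report might look like the line below — the --format and --results option names are hypothetical placeholders, not the documented flags; check ado-sync publish-test-results --help for the real names:

# Hypothetical flags for illustration only; verify against --help.
ado-sync publish-test-results --format junit --results test-results/junit.xml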

Working modes

Mode                       Entry points
Day-to-day sync            push, pull, status, diff
Planning and generation    generate, story-context, coverage, stale
Automation and agents      mcp-server, agent-setup, publish-test-results

Documentation map

AI providers

ado-sync supports multiple AI providers for test-step summarization, spec generation, and failure analysis. Provider SDKs are optional, so you only install what you need.

  • heuristic for zero-setup summaries
  • local for GGUF models through node-llama-cpp
  • ollama for local model serving
  • openai, anthropic, github, bedrock, azureai, and azureinference for hosted models
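Provider selection typically lives in the config file. The keys below are an illustrative assumption, not the documented schema — for example, pointing summarization at a locally served Ollama model might look roughly like:

{
  "ai": { "provider": "ollama", "model": "llama3" }
}

Consult the CLI and advanced docs for the actual configuration keys and per-provider options.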

Detailed setup and provider-specific examples are covered in the CLI, advanced, and MCP server docs.