Automating Work Item Creation from Git Commits

Build a Node.js webhook listener that parses conventional commit messages and automatically creates Azure DevOps work items with proper fields, area paths, and deduplication.

Overview

Manually creating work items every time a developer commits a fix or feature is tedious, error-prone, and almost always forgotten. Azure DevOps provides native commit-to-work-item linking, but the out-of-the-box experience barely scratches the surface. This article walks through building a Node.js service that listens for Git push events via Azure DevOps service hooks, parses conventional commit messages, and automatically creates fully populated work items — bugs, tasks, and features — without anyone touching a board.

Prerequisites

  • An Azure DevOps organization with at least one project and Git repository
  • A Personal Access Token (PAT) with Work Items (Read & Write) and Code (Read) scopes
  • Node.js v18+ installed locally
  • Basic familiarity with Express.js and REST APIs
  • An Azure DevOps project with configured area paths and iteration paths

Work Item Linking Basics

Azure DevOps supports two native syntaxes for linking commits to existing work items directly from your commit message.

The Hash Syntax

The simplest approach uses # followed by the work item ID:

git commit -m "Fixed the null reference exception #1234"

This creates a link between the commit and work item 1234. The link type is "Fixed in Commit" when the commit lands on the default branch, or "Associated with Commit" otherwise. This works in Azure Repos out of the box — no configuration needed.

The AB# Syntax (Cross-Project)

When you host code on GitHub but track work in Azure DevOps, use the AB# prefix:

git commit -m "Resolved timeout in payment gateway AB#5678"

This syntax requires the Azure Boards GitHub app to be installed on your repository. It works across organization boundaries, which makes it useful for open-source projects backed by private boards.

What Linking Does NOT Do

Here is the critical gap: neither syntax creates work items. They only link commits to items that already exist. If you commit "fix: resolve crash on login" with no work item number, nothing happens on the board. That gap is exactly what we are going to fill.
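Detecting whether a message already carries one of these references is a one-regex job. A sketch (the function name is our own, not anything Azure DevOps ships):

```javascript
// Return the work item IDs a commit message already references.
// AB# is matched first so the #5678 inside AB#5678 is not counted twice;
// the [^A-Za-z#] guard keeps the plain-# branch away from AB#'s hash.
function referencedWorkItems(message) {
  var ids = [];
  var pattern = /\bAB#(\d+)|(?:^|[^A-Za-z#])#(\d+)/g;
  var match;
  while ((match = pattern.exec(message)) !== null) {
    ids.push(parseInt(match[1] || match[2], 10));
  }
  return ids;
}

console.log(referencedWorkItems('Fixed the null reference exception #1234')); // [ 1234 ]
console.log(referencedWorkItems('Resolved timeout in payment gateway AB#5678')); // [ 5678 ]
console.log(referencedWorkItems('fix: resolve crash on login')); // []
```

An empty result is the signal our automation later keys on: no reference means nothing will appear on the board unless we create it.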


Configuring Automatic State Transitions

Before building custom automation, squeeze everything you can out of native features. Azure DevOps can automatically transition work item states when pull requests complete.

Navigate to Project Settings > Boards > Settings and enable:

  • Resolve work items on PR completion — moves linked items to "Resolved" when a PR merges
  • Close work items on PR completion — moves linked items to "Closed" instead

You configure this per-repository under Project Settings > Repositories > [Your Repo] > Settings.

The JSON payload for this setting via the REST API looks like:

{
  "autoCompleteWorkItems": true,
  "completionOptions": {
    "transitionWorkItems": true,
    "mergeStrategy": "squash"
  }
}

This is useful but limited. It only transitions state — it does not create items, assign area paths, set priority, or fill custom fields. For that, we build.


Azure DevOps Service Hooks

Service hooks are the event system that powers our automation. They fire HTTP POST requests to your endpoint when events occur in Azure DevOps.

Configuration Steps

  1. Navigate to Project Settings > Service hooks
  2. Click Create subscription
  3. Select Web Hooks as the service
  4. Choose Code pushed as the trigger event
  5. Filter by repository and branch if needed
  6. Enter your endpoint URL and configure authentication

The Push Event Payload

When a push occurs, Azure DevOps sends a payload shaped like this:

{
  "subscriptionId": "00000000-0000-0000-0000-000000000000",
  "notificationId": 4,
  "id": "03c164c2-8912-4d5e-8009-3707d5f83734",
  "eventType": "git.push",
  "publisherId": "tfs",
  "message": {
    "text": "Shane pushed updates to master"
  },
  "resource": {
    "commits": [
      {
        "commitId": "a1b2c3d4e5f6",
        "author": {
          "name": "Shane",
          "email": "[email protected]",
          "date": "2026-02-08T10:30:00Z"
        },
        "committer": {
          "name": "Shane",
          "email": "[email protected]",
          "date": "2026-02-08T10:30:00Z"
        },
        "comment": "feat: add webhook listener for service hook events",
        "url": "https://dev.azure.com/myorg/myproject/_apis/git/repositories/..."
      }
    ],
    "refUpdates": [
      {
        "name": "refs/heads/master",
        "oldObjectId": "aaa111",
        "newObjectId": "bbb222"
      }
    ],
    "repository": {
      "id": "repo-guid",
      "name": "my-api",
      "project": {
        "id": "project-guid",
        "name": "MyProject"
      }
    },
    "pushedBy": {
      "displayName": "Shane",
      "uniqueName": "[email protected]"
    }
  },
  "createdDate": "2026-02-08T10:30:05Z"
}

The resource.commits array contains every commit in the push. Each commit's comment field is what we parse.
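Pulling the pieces we care about out of that payload is a few lines. A sketch (our own helper, using the field names from the sample above):

```javascript
// Extract commit IDs and messages from a git.push service hook payload.
// Other event types have a different resource shape, so bail out early.
function extractCommitMessages(payload) {
  if (payload.eventType !== 'git.push') return [];
  var commits = (payload.resource && payload.resource.commits) || [];
  return commits.map(function(c) {
    return { commitId: c.commitId, message: c.comment };
  });
}

var sample = {
  eventType: 'git.push',
  resource: {
    commits: [
      { commitId: 'a1b2c3d4e5f6', comment: 'feat: add webhook listener for service hook events' }
    ]
  }
};

extractCommitMessages(sample).forEach(function(c) {
  console.log(c.commitId + ' ' + c.message);
});
// a1b2c3d4e5f6 feat: add webhook listener for service hook events
```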


Parsing Conventional Commits

The Conventional Commits specification gives commit messages a machine-readable structure:

<type>(<scope>): <description>

[optional body]

[optional footer(s)]

We map types to Azure DevOps work item types:

Commit Type    Work Item Type    Priority
feat           Feature           2
fix            Bug               2
bug            Bug               1
task           Task              3
chore          Task              4
docs           Task              4
perf           Bug               2
refactor       Task              3

Here is the parser:

// commit-parser.js
function parseConventionalCommit(message) {
  var pattern = /^(feat|fix|bug|task|chore|docs|perf|refactor)(\(([^)]+)\))?(!)?:\s*(.+)/i;
  var match = message.match(pattern);

  if (!match) {
    return null;
  }

  var type = match[1].toLowerCase();
  var scope = match[3] || null;
  var breaking = match[4] === '!';
  var description = match[5].trim();

  var workItemTypeMap = {
    feat: 'Feature',
    fix: 'Bug',
    bug: 'Bug',
    task: 'Task',
    chore: 'Task',
    docs: 'Task',
    perf: 'Bug',
    refactor: 'Task'
  };

  var priorityMap = {
    feat: 2,
    fix: 2,
    bug: 1,
    task: 3,
    chore: 4,
    docs: 4,
    perf: 2,
    refactor: 3
  };

  return {
    type: type,
    scope: scope,
    breaking: breaking,
    description: description,
    workItemType: workItemTypeMap[type],
    priority: breaking ? 1 : priorityMap[type]
  };
}

module.exports = { parseConventionalCommit: parseConventionalCommit };

Test it:

var result = parseConventionalCommit('feat(auth): add OAuth2 PKCE flow');
console.log(result);
// {
//   type: 'feat',
//   scope: 'auth',
//   description: 'add OAuth2 PKCE flow',
//   breaking: false,
//   workItemType: 'Feature',
//   priority: 2
// }

var bugResult = parseConventionalCommit('fix!: prevent SQL injection in user search');
console.log(bugResult);
// {
//   type: 'fix',
//   scope: null,
//   description: 'prevent SQL injection in user search',
//   breaking: true,
//   workItemType: 'Bug',
//   priority: 1   <-- elevated because of breaking change
// }

Azure DevOps REST API for Work Items

The Work Items API uses JSON Patch format. Every field update is a patch operation.
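Concretely, creating a Bug means POSTing a body like this (field values illustrative) to https://dev.azure.com/{org}/{project}/_apis/wit/workitems/$Bug?api-version=7.1 with the application/json-patch+json content type:

```json
[
  { "op": "add", "path": "/fields/System.Title", "value": "Prevent SQL injection in user search" },
  { "op": "add", "path": "/fields/System.State", "value": "New" },
  { "op": "add", "path": "/fields/Microsoft.VSTS.Common.Priority", "value": 1 }
]
```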

Creating a Work Item

// devops-client.js
var https = require('https');

function createWorkItem(config, workItemType, fields, callback) {
  var patchDocument = [];

  Object.keys(fields).forEach(function(field) {
    patchDocument.push({
      op: 'add',
      path: '/fields/' + field,
      value: fields[field]
    });
  });

  var options = {
    hostname: 'dev.azure.com',
    path: '/' + config.organization + '/' + config.project +
          '/_apis/wit/workitems/$' + encodeURIComponent(workItemType) +
          '?api-version=7.1',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json-patch+json',
      'Authorization': 'Basic ' + Buffer.from(':' + config.pat).toString('base64')
    }
  };

  var req = https.request(options, function(res) {
    var body = '';
    res.on('data', function(chunk) { body += chunk; });
    res.on('end', function() {
      if (res.statusCode === 200) {
        callback(null, JSON.parse(body));
      } else {
        callback(new Error('API returned ' + res.statusCode + ': ' + body));
      }
    });
  });

  req.on('error', callback);
  req.write(JSON.stringify(patchDocument));
  req.end();
}

function updateWorkItemState(config, workItemId, state, callback) {
  var patchDocument = [
    {
      op: 'add',
      path: '/fields/System.State',
      value: state
    }
  ];

  var options = {
    hostname: 'dev.azure.com',
    path: '/' + config.organization + '/' + config.project +
          '/_apis/wit/workitems/' + workItemId +
          '?api-version=7.1',
    method: 'PATCH',
    headers: {
      'Content-Type': 'application/json-patch+json',
      'Authorization': 'Basic ' + Buffer.from(':' + config.pat).toString('base64')
    }
  };

  var req = https.request(options, function(res) {
    var body = '';
    res.on('data', function(chunk) { body += chunk; });
    res.on('end', function() {
      if (res.statusCode === 200) {
        callback(null, JSON.parse(body));
      } else {
        callback(new Error('API returned ' + res.statusCode + ': ' + body));
      }
    });
  });

  req.on('error', callback);
  req.write(JSON.stringify(patchDocument));
  req.end();
}

module.exports = {
  createWorkItem: createWorkItem,
  updateWorkItemState: updateWorkItemState
};

Work Item Fields Reference

The most commonly used fields when creating work items programmatically:

var fields = {
  'System.Title': 'Add OAuth2 PKCE flow',
  'System.Description': '<p>Created automatically from commit a1b2c3d</p>',
  'System.WorkItemType': 'Feature',
  'System.State': 'New',
  'System.AssignedTo': '[email protected]',
  'System.AreaPath': 'MyProject\\API\\Authentication',
  'System.IterationPath': 'MyProject\\Sprint 14',
  'Microsoft.VSTS.Common.Priority': 2,
  'Microsoft.VSTS.Common.Severity': '2 - High',
  'System.Tags': 'auto-created; from-commit'
};

Note the double backslash in area paths — these are hierarchical and the API requires the full path from the project root.
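String-concatenating these paths by hand is easy to get wrong, so a tiny helper (our own, not part of the client module above) can keep the separators straight. In JS source each separator is written '\\', but the string actually sent to the API contains a single backslash per level:

```javascript
// Join a project name and path segments into an area/iteration path.
function buildPath(projectName, segments) {
  return [projectName].concat(segments || []).join('\\');
}

console.log(buildPath('MyProject', ['API', 'Authentication']));
// MyProject\API\Authentication
```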


Building the Express Webhook Server

Now we wire everything together into a production-ready Express server.

Project Setup

mkdir devops-webhook-listener
cd devops-webhook-listener
npm init -y
npm install express body-parser

Your package.json should look like:

{
  "name": "devops-webhook-listener",
  "version": "1.0.0",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "body-parser": "^1.20.2",
    "express": "^4.18.2"
  }
}

The Complete Server

// server.js
var express = require('express');
var bodyParser = require('body-parser');
var commitParser = require('./commit-parser');
var devopsClient = require('./devops-client');

var app = express();
var PORT = process.env.PORT || 3500;

var config = {
  organization: process.env.AZDO_ORG,
  project: process.env.AZDO_PROJECT,
  pat: process.env.AZDO_PAT,
  webhookSecret: process.env.WEBHOOK_SECRET,
  defaultAreaPath: process.env.DEFAULT_AREA_PATH || '',
  defaultIterationPath: process.env.DEFAULT_ITERATION_PATH || ''
};

// Scope-to-area-path mapping
var scopeAreaMap = {
  auth: 'Authentication',
  api: 'API',
  ui: 'Frontend',
  db: 'Database',
  infra: 'Infrastructure',
  docs: 'Documentation'
};

app.use(bodyParser.json({ limit: '1mb' }));

// Webhook authentication middleware
function verifyWebhook(req, res, next) {
  if (!config.webhookSecret) {
    return next();
  }

  var authHeader = req.headers['authorization'];
  if (!authHeader) {
    console.error('[WEBHOOK] Missing authorization header');
    return res.status(401).json({ error: 'Missing authorization header' });
  }

  // Azure DevOps sends Basic auth; with the subscription's username left
  // blank, the header is 'Basic ' + base64(':' + secret)
  var expected = 'Basic ' + Buffer.from(':' + config.webhookSecret).toString('base64');
  if (authHeader !== expected) {
    console.error('[WEBHOOK] Invalid authorization');
    return res.status(403).json({ error: 'Invalid authorization' });
  }

  next();
}

// Health check endpoint
app.get('/health', function(req, res) {
  res.json({
    status: 'healthy',
    uptime: process.uptime(),
    timestamp: new Date().toISOString()
  });
});

// Main webhook endpoint
app.post('/webhooks/push', verifyWebhook, function(req, res) {
  var payload = req.body;

  // Validate event type
  if (payload.eventType !== 'git.push') {
    console.log('[WEBHOOK] Ignoring event type: ' + payload.eventType);
    return res.status(200).json({ message: 'Event type ignored' });
  }

  var commits = payload.resource && payload.resource.commits;
  if (!commits || commits.length === 0) {
    return res.status(200).json({ message: 'No commits to process' });
  }

  var projectName = payload.resource.repository.project.name;
  var repoName = payload.resource.repository.name;
  var pushedBy = payload.resource.pushedBy;

  console.log('[WEBHOOK] Processing %d commits from %s/%s by %s',
    commits.length, projectName, repoName, pushedBy.displayName);

  // Respond immediately — process asynchronously
  res.status(202).json({
    message: 'Processing ' + commits.length + ' commits',
    accepted: true
  });

  // Process each commit
  var processed = 0;
  var created = 0;
  var errors = 0;

  // Log the summary once every commit is accounted for, whether it was
  // skipped synchronously or created via an async API call
  function checkDone() {
    if (processed === commits.length) {
      console.log('[DONE] Processed %d commits: %d created, %d skipped, %d errors',
        commits.length, created, commits.length - created - errors, errors);
    }
  }

  commits.forEach(function(commit) {
    var parsed = commitParser.parseConventionalCommit(commit.comment);

    if (!parsed) {
      processed++;
      console.log('[SKIP] Non-conventional commit: %s', commit.comment.substring(0, 60));
      return checkDone();
    }

    // Skip if commit already references a work item (#1234 or AB#5678;
    // the #\d+ pattern matches both syntaxes)
    if (/#\d+/.test(commit.comment)) {
      processed++;
      console.log('[SKIP] Commit already linked to work item: %s', commit.comment.substring(0, 60));
      return checkDone();
    }

    // Build area path from scope
    var areaPath = config.defaultAreaPath;
    if (parsed.scope && scopeAreaMap[parsed.scope]) {
      areaPath = projectName + '\\' + scopeAreaMap[parsed.scope];
    }

    // Build work item fields
    var fields = {
      'System.Title': parsed.description.charAt(0).toUpperCase() + parsed.description.slice(1),
      'System.Description': buildDescription(commit, parsed, repoName),
      'System.State': 'New',
      'System.Tags': 'auto-created; from-commit; ' + parsed.type,
      'Microsoft.VSTS.Common.Priority': parsed.priority
    };

    if (areaPath) {
      fields['System.AreaPath'] = areaPath;
    }

    if (config.defaultIterationPath) {
      fields['System.IterationPath'] = config.defaultIterationPath;
    }

    if (pushedBy && pushedBy.uniqueName) {
      fields['System.AssignedTo'] = pushedBy.uniqueName;
    }

    // Add reproduction steps for bugs
    if (parsed.workItemType === 'Bug') {
      fields['Microsoft.VSTS.TCM.ReproSteps'] =
        '<p>Identified from commit <code>' + commit.commitId.substring(0, 8) +
        '</code> in repository <strong>' + repoName + '</strong>.</p>' +
        '<p>Commit message: ' + commit.comment + '</p>';
    }

    devopsClient.createWorkItem(config, parsed.workItemType, fields, function(err, workItem) {
      processed++;

      if (err) {
        errors++;
        console.error('[ERROR] Failed to create %s: %s', parsed.workItemType, err.message);
      } else {
        created++;
        console.log('[CREATED] %s #%d: %s (Priority: %d)',
          parsed.workItemType, workItem.id, fields['System.Title'], parsed.priority);
      }

      checkDone();
    });
  });
});

function buildDescription(commit, parsed, repoName) {
  var html = '<h3>Auto-generated from Git Commit</h3>';
  html += '<table>';
  html += '<tr><td><strong>Repository</strong></td><td>' + repoName + '</td></tr>';
  html += '<tr><td><strong>Commit</strong></td><td><code>' + commit.commitId.substring(0, 8) + '</code></td></tr>';
  html += '<tr><td><strong>Author</strong></td><td>' + commit.author.name + '</td></tr>';
  html += '<tr><td><strong>Date</strong></td><td>' + commit.author.date + '</td></tr>';
  html += '<tr><td><strong>Type</strong></td><td>' + parsed.type + '</td></tr>';
  if (parsed.scope) {
    html += '<tr><td><strong>Scope</strong></td><td>' + parsed.scope + '</td></tr>';
  }
  if (parsed.breaking) {
    html += '<tr><td><strong>Breaking Change</strong></td><td>Yes</td></tr>';
  }
  html += '</table>';
  html += '<h4>Full Commit Message</h4>';
  html += '<pre>' + commit.comment + '</pre>';

  return html;
}

app.listen(PORT, function() {
  console.log('[SERVER] Webhook listener running on port %d', PORT);
  console.log('[CONFIG] Organization: %s', config.organization);
  console.log('[CONFIG] Project: %s', config.project);
  console.log('[CONFIG] Default Area Path: %s', config.defaultAreaPath || '(project root)');
});

Running the Server

export AZDO_ORG="myorganization"
export AZDO_PROJECT="MyProject"
export AZDO_PAT="your-personal-access-token"
export WEBHOOK_SECRET="a-strong-shared-secret"
export DEFAULT_AREA_PATH="MyProject\\Backend"
export DEFAULT_ITERATION_PATH="MyProject\\Sprint 14"

npm start

Output:

[SERVER] Webhook listener running on port 3500
[CONFIG] Organization: myorganization
[CONFIG] Project: MyProject
[CONFIG] Default Area Path: MyProject\Backend

When a push with conventional commits lands:

[WEBHOOK] Processing 3 commits from MyProject/my-api by Shane
[SKIP] Non-conventional commit: Merge branch 'feature/auth' into main
[CREATED] Feature #4521: Add OAuth2 PKCE flow (Priority: 2)
[CREATED] Bug #4522: Prevent SQL injection in user search (Priority: 1)
[DONE] Processed 3 commits: 2 created, 1 skipped, 0 errors

Integrating with CI Pipelines

Beyond commit-triggered work items, you can create work items when builds fail. This is powerful for tracking regressions.

Add this as a standalone script that runs in a pipeline step guarded by condition: failed():

// create-failure-workitem.js
// Usage: node create-failure-workitem.js <build-number> <build-url> <reason>

var devopsClient = require('./devops-client');

var config = {
  organization: process.env.AZDO_ORG,
  project: process.env.AZDO_PROJECT,
  pat: process.env.AZDO_PAT
};

var buildNumber = process.argv[2] || 'unknown';
var buildUrl = process.argv[3] || '';
var failureReason = process.argv[4] || 'Unknown failure';

var fields = {
  'System.Title': 'Build Failure: ' + buildNumber,
  'System.Description':
    '<p>Build <strong>' + buildNumber + '</strong> failed.</p>' +
    '<p>Reason: ' + failureReason + '</p>' +
    (buildUrl ? '<p><a href="' + buildUrl + '">View Build</a></p>' : ''),
  'System.State': 'New',
  'System.Tags': 'build-failure; auto-created; ci',
  'Microsoft.VSTS.Common.Priority': 1,
  'Microsoft.VSTS.Common.Severity': '1 - Critical'
};

devopsClient.createWorkItem(config, 'Bug', fields, function(err, workItem) {
  if (err) {
    console.error('Failed to create work item: ' + err.message);
    process.exit(1);
  }

  console.log('Created Bug #%d for build failure: %s', workItem.id, buildNumber);
  process.exit(0);
});

In your azure-pipelines.yml:

stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '20.x'

          - script: npm ci && npm test
            displayName: 'Install and Test'

          - script: |
              node create-failure-workitem.js \
                "$(Build.BuildNumber)" \
                "$(System.TeamFoundationCollectionUri)$(System.TeamProject)/_build/results?buildId=$(Build.BuildId)" \
                "Build failed during test execution"
            displayName: 'Create Bug for Failure'
            condition: failed()
            env:
              # AZDO_ORG must be the organization name alone; the client
              # builds the dev.azure.com URL itself, so do not pass the full
              # $(System.TeamFoundationCollectionUri) here
              AZDO_ORG: myorganization
              AZDO_PROJECT: $(System.TeamProject)
              AZDO_PAT: $(System.AccessToken)

Handling Duplicate Prevention

One problem you will hit immediately: if a developer pushes the same branch multiple times (amending, rebasing), you will create duplicate work items. Here is a deduplication layer using commit hashes:

// dedup-store.js
// Simple in-memory store. Replace with Redis or a database in production.

var processedCommits = {};
var MAX_ENTRIES = 10000;

function hasProcessed(commitId) {
  return processedCommits[commitId] === true;
}

function markProcessed(commitId, workItemId) {
  // Evict oldest if at capacity
  var keys = Object.keys(processedCommits);
  if (keys.length >= MAX_ENTRIES) {
    delete processedCommits[keys[0]];
  }

  processedCommits[commitId] = true;
  console.log('[DEDUP] Commit %s mapped to work item #%d', commitId.substring(0, 8), workItemId);
}

function getStats() {
  return {
    trackedCommits: Object.keys(processedCommits).length,
    maxCapacity: MAX_ENTRIES
  };
}

module.exports = {
  hasProcessed: hasProcessed,
  markProcessed: markProcessed,
  getStats: getStats
};

Then in your webhook handler, add the check before creating:

var dedupStore = require('./dedup-store');

// Inside the commit processing loop:
if (dedupStore.hasProcessed(commit.commitId)) {
  processed++;
  console.log('[DEDUP] Skipping already processed commit: %s', commit.commitId.substring(0, 8));
  return;
}

// After successful creation:
dedupStore.markProcessed(commit.commitId, workItem.id);

For production systems, replace the in-memory store with Redis:

// dedup-redis.js
var redis = require('redis');
var client = redis.createClient({ url: process.env.REDIS_URL });

client.connect().catch(function(err) {
  console.error('[REDIS] Connection failed:', err.message);
});

function hasProcessed(commitId, callback) {
  client.get('commit:' + commitId).then(function(result) {
    callback(null, result !== null);
  }).catch(callback);
}

function markProcessed(commitId, workItemId) {
  // Expire after 30 days
  client.set('commit:' + commitId, String(workItemId), { EX: 2592000 });
}

module.exports = {
  hasProcessed: hasProcessed,
  markProcessed: markProcessed
};

Common Issues & Troubleshooting

1. "TF401027: You need the Git 'GenericRead' permission to perform this action"

This happens when the PAT lacks the correct scopes. Your PAT needs:

  • Work Items: Read & Write
  • Code: Read (for commit details)
  • Project and Team: Read

HTTP 403
{
  "$id": "1",
  "innerException": null,
  "message": "TF401027: You need the Git 'GenericRead' permission to perform this action.",
  "typeName": "Microsoft.TeamFoundation.Git.Server.GitNeedsPermissionException",
  "typeKey": "GitNeedsPermissionException"
}

Fix: Regenerate the PAT with full "Work Items (Read, write, & manage)" and "Code (Read)" scopes. Do not use fine-grained scopes unless you know exactly which ones are needed.

2. "VS402336: The field 'System.AreaPath' contains a value that is not in the list of supported values"

The area path you specified does not exist in the project. Area paths are case-sensitive and must include the full hierarchy from the project root.

HTTP 400
{
  "message": "VS402336: The field 'System.AreaPath' contains value 'MyProject\\backend' that is not in the list of supported values."
}

Fix: Note the lowercase backend — area paths are case-sensitive. Use MyProject\\Backend if that is what is configured. Query available area paths first:

curl -u :$PAT \
  "https://dev.azure.com/{org}/{project}/_apis/wit/classificationnodes/Areas?api-version=7.1&\$depth=3"
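The response is a tree of classification nodes. A sketch that flattens it into the exact strings System.AreaPath accepts, assuming each node carries a name and an optional children array as in that response:

```javascript
// Flatten a classification nodes tree into full area path strings.
function flattenAreaPaths(node, parentPath) {
  var path = parentPath ? parentPath + '\\' + node.name : node.name;
  var paths = [path];
  (node.children || []).forEach(function(child) {
    paths = paths.concat(flattenAreaPaths(child, path));
  });
  return paths;
}

// Illustrative tree shaped like the API response
var tree = {
  name: 'MyProject',
  children: [
    { name: 'Backend', children: [{ name: 'Authentication' }] },
    { name: 'Frontend' }
  ]
};

flattenAreaPaths(tree, null).forEach(function(p) { console.log(p); });
// MyProject
// MyProject\Backend
// MyProject\Backend\Authentication
// MyProject\Frontend
```

Checking the parsed scope against this list before setting System.AreaPath turns the VS402336 error into a log line instead of a failed creation.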

3. "The request body could not be parsed. Content-Type must be application/json-patch+json"

This is the most common mistake when calling the work items API. The content type is NOT application/json.

HTTP 415
{
  "message": "The request body could not be parsed."
}

Fix: Set the Content-Type header explicitly:

headers: {
  'Content-Type': 'application/json-patch+json',   // NOT application/json
  'Authorization': 'Basic ' + Buffer.from(':' + pat).toString('base64')
}

4. Webhook Payloads Arriving Empty or Malformed

When the service hook fires but req.body is empty or undefined, it usually means the body parser middleware is misconfigured or another middleware is consuming the body stream.

// Wrong - this consumes raw body and leaves nothing for json parser
app.use(function(req, res, next) {
  var data = '';
  req.on('data', function(chunk) { data += chunk; });
  req.on('end', function() { req.rawBody = data; next(); });
});

// Right - use body-parser with verify to capture raw body without consuming it
app.use(bodyParser.json({
  verify: function(req, res, buf) {
    req.rawBody = buf.toString();
  }
}));

5. "TF237124: Work Item is not ready to save" — Missing Required Fields

Custom work item types or processes may have required fields beyond the defaults. If you get this error, you are missing a field that your process template mandates.

HTTP 400
{
  "message": "TF237124: Work Item is not ready to save. The following fields are required: [Acceptance Criteria]"
}

Fix: Query the work item type's fields to discover required ones:

curl -u :$PAT \
  "https://dev.azure.com/{org}/{project}/_apis/wit/workitemtypes/Bug/fields?api-version=7.1" | \
  jq '.value[] | select(.alwaysRequired == true) | .referenceName'
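The same filter in Node, for machines without jq, assuming the value/alwaysRequired/referenceName shape the jq query relies on:

```javascript
// Given the JSON from the workitemtypes/{type}/fields endpoint,
// return the reference names of fields flagged alwaysRequired.
function requiredFieldNames(response) {
  return (response.value || [])
    .filter(function(f) { return f.alwaysRequired === true; })
    .map(function(f) { return f.referenceName; });
}

// Illustrative response fragment
var sample = {
  value: [
    { referenceName: 'System.Title', alwaysRequired: true },
    { referenceName: 'System.Description', alwaysRequired: false },
    { referenceName: 'Custom.AcceptanceCriteria', alwaysRequired: true }
  ]
};

console.log(requiredFieldNames(sample).join(', '));
// System.Title, Custom.AcceptanceCriteria
```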

Best Practices

  • Always respond to webhooks within 10 seconds. Azure DevOps will retry failed deliveries and eventually disable the subscription if your endpoint consistently times out. Accept the payload with a 202 and process asynchronously.

  • Implement idempotency from day one. Webhook deliveries can be duplicated. Use commit SHAs as idempotency keys and store them in Redis or a database. The in-memory approach works for prototyping but loses state on restart.

  • Skip merge commits. Merge commits are noise — they do not represent actual work. Filter them out before parsing:

    // Covers Git-style merges ("Merge branch ...") and Azure Repos
    // PR merges ("Merged PR 123: ...")
    if (/^Merge (branch|pull request)|^Merged PR \d+/.test(commit.comment)) {
      return; // skip merge commits
    }
    
  • Use the System.History field to add context. Every work item has a discussion thread. Append a comment explaining why the item was auto-created so future readers understand its origin:

    fields['System.History'] = 'Auto-created from commit ' + commitId +
      ' pushed to ' + repoName + ' by ' + authorName + '.';
    
  • Set up alerting on your webhook endpoint. If your listener goes down, commits will pile up without creating work items, and nobody will notice until sprint review. Use a health check endpoint and an uptime monitor.

  • Tag all auto-created items consistently. Use a tag like auto-created on every work item your service generates. This lets you query, report, and bulk-modify them easily. It also makes it trivial to distinguish human-created items from automated ones.

  • Rate limit your API calls. Azure DevOps has rate limits (approximately 200 requests per minute per user for PAT authentication). If a developer pushes 50 commits at once, batch the processing or add a small delay between API calls:

    function processWithDelay(items, delayMs, processFn) {
      var index = 0;
      function next() {
        if (index >= items.length) return;
        processFn(items[index]);
        index++;
        setTimeout(next, delayMs);
      }
      next();
    }
    
  • Log everything. Include the commit SHA, author, parsed type, created work item ID, and any errors in structured log output. When something goes wrong at 2 AM, logs are all you have.

  • Validate your PAT expiration proactively. PATs expire silently. Add a startup check that calls the Azure DevOps API and logs the PAT's expiration date. Better yet, set up a calendar reminder to rotate it.

