Introducing benchmarks: setting standards for cycle time


Engineering leaders have consistently told us how difficult it is to set standards and establish baselines for productivity. What does good look like, and how do we communicate it across teams?

With a data set of millions of pull requests synced into our system, we’re now sharing benchmarks to help teams quickly see where they are blocked and how their cycle time stacks up against other fast-moving teams.

Understanding PR Cycle Time Benchmarks

Our Benchmarks feature calculates the average of all the ‘time to merge’ values within a specified range and lets you see how your team performs compared to others. Note: ‘time to merge’ is the duration between when a PR is first opened and when it is merged.

Note: PR cycle time in Tara AI is measured in hours or days when synced to GitHub.
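To make the calculation concrete, here is a minimal sketch of averaging ‘time to merge’ over a set of PRs. The function name and the `(opened_at, merged_at)` pair representation are assumptions for illustration, not Tara AI’s actual implementation.

```python
from datetime import datetime, timedelta

def average_time_to_merge(prs):
    """Average 'time to merge' across PRs, where each PR is a
    (opened_at, merged_at) pair of datetimes. Illustrative sketch only."""
    durations = [merged - opened for opened, merged in prs]
    total = sum(durations, timedelta())  # sum needs a timedelta start value
    return total / len(durations)

# Two hypothetical PRs: one merged in 12 hours, one in 36 hours.
prs = [
    (datetime(2023, 6, 1, 9, 0), datetime(2023, 6, 1, 21, 0)),
    (datetime(2023, 6, 2, 9, 0), datetime(2023, 6, 3, 21, 0)),
]
avg = average_time_to_merge(prs)
print(avg.total_seconds() / 3600)  # 24.0 hours
```

A real pipeline would pull these timestamps from the GitHub API rather than hard-coding them.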

Four Grades of PR Efficiency

Our Benchmarks indicator categorizes your PR efficiency into one of four grades:

  • Excellent: If your average ‘time to merge’ is 24 hours or less, your team is doing a fantastic job of maintaining an efficient workflow.
  • Good: A period of 25 to 48 hours indicates a good level of efficiency, with some room for improvement to get your team to the ‘Excellent’ mark.
  • Fair: 49 to 119 hours may be a sign that a few aspects of your PR workflow need a closer look and refinement; there are likely blockers in your development process that need to be addressed.
  • Deficient: If the average time exceeds 120 hours, it’s a clear sign that your team needs to look into your processes, assess the hurdles, and take action to improve efficiency.
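The four thresholds above can be sketched as a simple grading function. The function name and the exact boundary handling (e.g. treating exactly 119 hours as ‘Fair’) are assumptions of this sketch:

```python
def pr_efficiency_grade(avg_hours: float) -> str:
    """Map an average 'time to merge' in hours to one of the
    four benchmark grades described above."""
    if avg_hours <= 24:
        return "Excellent"
    if avg_hours <= 48:
        return "Good"
    if avg_hours < 120:
        return "Fair"
    return "Deficient"

print(pr_efficiency_grade(16))      # Excellent
print(pr_efficiency_grade(7 * 24))  # Deficient -- a 7-day cycle time
```

The two example calls mirror the journey described later in this post: a 7-day average lands in ‘Deficient’, while 16 hours lands in ‘Excellent’.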

Know Where You Stand

While internal metrics are valuable, understanding where you stand in the broader ecosystem can offer deeper insight. The age-old question “How is my team performing?” is difficult to answer; Benchmarks gives you visibility into how your team stacks up against other software teams, for a new level of understanding of its performance.

Teams can use Benchmarks as an objective measure of efficiency in their PR workflows. By looking at Benchmarks, teams can establish a productivity baseline and identify strengths and areas for improvement in their development process.

How we improved our PR Cycle Time from 7 days to 16 hours

When benchmarks first rolled out, our cycle time was high

What happened: When our benchmarks feature was rolled out internally, our team noticed that our cycle time was alarmingly high, nearly reaching 7 days! It was in the Deficient category, and it was clear that we needed to improve our efficiency and reduce our cycle time.

The solution: In an effort to improve team velocity, we initiated an internal push amongst our developers, emphasizing the importance of making smaller PRs that are easier to review and stay on top of.

The result: In a month, we successfully improved our PR Cycle Time from 7 days to 16 hours. Since then, we have continued to maintain a PR Cycle Time of under a day. We are now able to keep our process efficient by constantly monitoring the PR stats and seeing how we are performing using the established baseline.

Final Words

We’re excited to hear how this new feature helps your team improve its performance as well as its workflows.

As always, your feedback is invaluable to us. Let us know what you think by commenting below or tweeting us @taradotai.