
How to Optimize Flow Metrics Quality in Scaled Agile Framework (SAFe)


In May 2021, the Scaled Agile Framework (SAFe) updated its metrics guidance to better cover measurements of outcomes, Flow, and competency. The Flow Framework and its metrics are excellent tools for leading a team, so they make welcome additions to SAFe.


However, when you examine this guidance in a daily DevOps workflow, one essential dimension seems to be missing: Flow metrics quality. Let’s explore how to identify metrics quality within SAFe’s existing metrics guidance.

Didn’t the Flow Framework Already Cover Quality?

In the Flow Framework, quality is measured in terms of business results, or outcomes. This approach treats quality as consumers perceive it; some call this “quality in use.” Quality in use is important, but it shouldn’t be your only metric. Focusing on the outcome alone can cause us to miss the interconnected elements that led to it. To fully understand Flow metrics quality, we must measure it throughout the release cycle with actionable metrics.

Flow Metrics Quality in a DevOps Environment 

To measure quality throughout the release cycle, you must examine it in both the quality assurance (QA) and production environments. This can only be achieved by correctly measuring and interpreting test pass rates and defects.

Test Pass Rate

The test pass rate trends of automated test suites immediately tell you three things:


  1. Whether you have a regression

  2. Whether your test cases are out of sync with the latest functional changes

  3. How quickly your team can resolve issues and return the pass rate to 100 percent
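As a sketch of how these trend signals can be computed, the snippet below derives pass rates from raw pass/fail results and flags a suspected regression when the latest run drops below the baseline of the preceding runs. The data and the threshold logic are illustrative assumptions, not part of SAFe or the Flow Framework.

```python
from statistics import mean

def pass_rate(results):
    """Fraction of passing cases in one suite run (True = pass)."""
    return sum(results) / len(results)

def analyze_trend(runs):
    """Flag a suspected regression when the latest run's pass rate
    drops below the average of the preceding runs."""
    rates = [pass_rate(r) for r in runs]
    baseline = mean(rates[:-1])
    return {
        "latest_rate": rates[-1],
        "regression_suspected": rates[-1] < baseline,
        "fully_green": rates[-1] == 1.0,
    }

# Hypothetical nightly runs of the same automated suite.
runs = [
    [True, True, True, True],    # 100 percent
    [True, True, True, True],    # 100 percent
    [True, False, True, False],  # 50 percent: regression or stale tests?
]
report = analyze_trend(runs)
```

A dashboard built on signals like these answers the three questions above at a glance; distinguishing a real regression from out-of-sync test cases still takes a human look at the failures.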

In production, “shift-right” tests determine whether the essential use cases still work for consumers. Unfortunately, the shift-right approach paints only a fraction of the bigger picture. Let’s examine the Flow metrics quality in the graph below.

Test Pass Rate Production and QA - Copado

The graph shows that the production pass rate trend has less volatility than its QA counterpart. Nevertheless, each one requires action, since both deviate from the 100 percent norm. Possible action items include:

  • Fixing a bug

  • Adjusting a test case

  • Resolving an environment issue

Trends like these can also reveal process bottlenecks and provide the context necessary to address them. 
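One way to quantify the volatility comparison above is to take the standard deviation of each pass rate series and list the runs that deviate from the 100 percent norm. The weekly figures here are made up purely for illustration.

```python
from statistics import pstdev

def volatility(rates):
    """Population standard deviation as a rough volatility measure."""
    return pstdev(rates)

# Hypothetical weekly pass rates for the two environments.
qa_rates = [1.0, 0.85, 0.95, 0.70, 1.0, 0.90]
prod_rates = [1.0, 0.98, 1.0, 0.97, 1.0, 0.99]

# Any run below the 100 percent norm needs an action item:
# fixing a bug, adjusting a test case, or resolving an environment issue.
qa_action_needed = [r for r in qa_rates if r < 1.0]
prod_action_needed = [r for r in prod_rates if r < 1.0]
```

In this made-up data, QA is both more volatile and further from the norm, which is the expected shape: QA absorbs the churn so production stays stable.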


In practice, you must complement automated regression testing with exploratory testing to guarantee sufficient test coverage of the new functionality. Comprehensive automation solutions like Copado Robotic Testing can help cover all the bases.  

Defect InFlow/OutFlow

Defect inFlow/outFlow trends are interesting because they reveal whether your team can fix bugs faster than new ones are discovered in testing. However, your Flow metrics quality discoveries may not always be as cut and dried as they seem.


For example, low defect inFlow is not always something to be proud of. It could just indicate insufficient testing. Transparent defect inFlow and outFlow can help you see what’s behind these observations. Open communication allows your teams to understand your measurements and address them through data-driven decisions. 


The graph below shows an example of a company with a higher defect outFlow than inFlow. This team appears to have invested in paying off its quality debt. For those unfamiliar, “quality debt” refers to the effort needed to fix the defects open at a given time. The Flow load and the number of open defects can be expected to decline.

Defect InFlow/OutFlow - Copado

A defect inFlow/outFlow graph is handy for teams developing a large or challenging product that requires a longer release cycle. Since the point where outFlow overtakes inFlow is easy to pinpoint, system maturity becomes measurable.
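A minimal sketch of that analysis, assuming weekly inFlow/outFlow counts: it pinpoints the first week in which outFlow overtakes inFlow and tracks the running count of open defects. All figures are hypothetical.

```python
def maturity_point(inflow, outflow):
    """First week in which more defects are fixed (outFlow)
    than reported (inFlow); None if it never happens."""
    for week, (reported, fixed) in enumerate(zip(inflow, outflow), start=1):
        if fixed > reported:
            return week
    return None

def open_defect_trend(inflow, outflow, start):
    """Running count of open defects, week by week."""
    open_count, trend = start, []
    for reported, fixed in zip(inflow, outflow):
        open_count += reported - fixed
        trend.append(open_count)
    return trend

inflow = [12, 10, 9, 6, 4]   # hypothetical defects reported per week
outflow = [8, 9, 11, 10, 9]  # hypothetical defects fixed per week
crossover = maturity_point(inflow, outflow)
trend = open_defect_trend(inflow, outflow, start=20)
```

In this sketch the crossover lands in week 3, and the open-defect count declines from there, which is exactly the quality-debt payoff pattern described above.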

Using Flow Metrics Quality to Measure Release Readiness

The number of open must-fix defects indicates how close you are to release readiness. You can measure this with simple mechanisms, like must-fix tags, that systematically classify defects by severity. These tags help your team identify potential stumbling blocks with ease.
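As an illustration of that systematic classification, the sketch below tags defects as must-fix based on severity. The severity names and the record layout are assumptions for the example, not a Copado or SAFe schema.

```python
# Illustrative severity scheme; adjust to your own defect taxonomy.
MUST_FIX_SEVERITIES = {"blocker", "critical"}

def tag_must_fix(defect):
    """Attach a must-fix tag when the severity warrants it."""
    if defect["severity"] in MUST_FIX_SEVERITIES:
        defect.setdefault("tags", []).append("must-fix")
    return defect

# Hypothetical defect records.
defects = [
    {"id": "D-101", "severity": "critical"},
    {"id": "D-102", "severity": "minor"},
    {"id": "D-103", "severity": "blocker"},
]
tagged = [tag_must_fix(d) for d in defects]
must_fix_ids = [d["id"] for d in tagged if "must-fix" in d.get("tags", [])]
```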


Automated deployment pipelines can be configured to check for the absence of must-fix tags and enforce test pass rate criteria. You can then compare a release candidate’s quality to its production state. These basic test and defect metrics can help any team manage the essential elements of software quality with data. The goal is to extend software quality measurements to other aspects, like data security and business goals.
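A release gate of this kind might be sketched as follows; the function name and criteria are illustrative, not an actual pipeline configuration.

```python
def release_gate(open_defects, suite_pass_rate, required_pass_rate=1.0):
    """A release candidate clears the gate only when no open defect
    carries a must-fix tag and the pass rate meets the criterion."""
    blockers = [d["id"] for d in open_defects if "must-fix" in d.get("tags", [])]
    ready = not blockers and suite_pass_rate >= required_pass_rate
    return ready, blockers

# One open must-fix defect blocks the release even with a green suite.
ready, blockers = release_gate(
    open_defects=[{"id": "D-7", "tags": ["must-fix"]}],
    suite_pass_rate=1.0,
)
```

Returning the list of blockers, not just a boolean, lets the pipeline report exactly which defects are holding up the release.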


Artificial intelligence can also incorporate Flow metrics, alongside other data, to predict release quality before you deploy. Copado Robotic Testing makes it easy to assess how far you are from release and can adapt tests to the changes you make throughout the development cycle.