ALM_Demo and ALM CAPs

The purpose of the ALM_Demo and ALM Content Acceleration Packs (CAPs) is to provide a set of items (Scorecards, Contexts, KPIs, Metrics, and more) that automatically gather information from across your enterprise to build ALM-related key performance indicators (KPIs) from out-of-the-box (OOTB) Data Warehouse data, as well as Dashboard pages that display the relevant information.

The relevant content pack is ALM. For details, see Integration with ALM.

User Story

  1. Dan is the R&D Director responsible for multiple applications within the IT department. He has a weekly meeting with his team to discuss the performance and expectations of the applications that his organization implements. Before the meeting, he logs on to the IT Business Analytics (ITBA) application.
  2. Dan opens the ALM-Pre-Release Page to view the latest updates on the overall release performance.
  3. He starts looking at the KPIs in the Quality matrix section and notices that the Fixing Ratio KPI value is very low and does not meet the normal thresholds. Its status is Red. The other Quality KPI values are in the normal range.
  4. Dan switches the KPI to the trend view to see the historical trend of the KPI in the component below.

  5. He notices that at several points in the last few months, the KPI value showed a poor ratio.
  6. To understand if there is a specific period in the month that affects the overall ratio, Dan drills down by changing the periodicity to weekly.

  7. He finds that the ratio is good overall; however, at the beginning and at the end of each month it drops for unknown reasons.

  8. Dan wishes to continue his analysis to understand which team is responsible for this situation.
  9. He changes the visualization back to Bars, selects the Fixing Ratio KPI, and clicks the breakdown by ALM Release to understand which release project may be causing the KPI result.

  10. He realizes that the Hercules release worsened dramatically, causing the value of the KPI to drop.

  11. He writes an annotation to Peter, the PMO who manages this release project, requesting clarification.
  12. Peter receives an email from Dan regarding the problematic KPI. He logs on to the ITBA application, checks the same KPI, and sees that the ratio is low.
  13. Peter checks the future behavior of the KPI (forecast) and understands that something needs to be done to overcome this issue.
  14. He sets up a brainstorming meeting with his team to understand why the Fixing Ratio decreases at the beginning and at the end of each month, and asks the team to come up with possible reasons and a plan.
  15. The team finds out that for the past few months, the more experienced engineers attended training workshops at the beginning and at the end of each month, which left the newer, less experienced engineers to deal with defects. This caused low-quality delivery and a low Fixing Ratio.

ALM_Demo and ALM

User interface elements are described below (when relevant, unlabeled elements are shown in angle brackets):


Pages

ALM-Application LifeCycle Management Overview Page

ALM-Pre-Release Page

ALM-Post-Release Page

Scorecards
  • Application Lifecycle Management
  • ALM-Rnd Director

Business Contexts
  • ALM_Demo
  • ALM_Defect
  • ALM_Requirement
  • ALM_Test
  • ALM_TestInstance
  • ALM_TestRun

For details, see Semantic Layer - Contexts and Universes.

KPIs and Metrics

Note: The ALM_Demo CAP includes KPIs whose names are followed by (Demo). The ALM CAP includes the same KPIs without the (Demo) suffix. The KPIs are identical; the different names allow both CAPs to be activated at the same time.

  • Agile Medium and Low Defect Resolution KPI. The number of medium and low defects resolved within 30 days of the release date, relative to the number of defects detected in the release minus the pending defects (unresolved defects less than 30 days old at the release date).

  • Agile Urgent and High Defect Resolution KPI. The number of high and urgent defects resolved within 30 days of the release date, relative to the number of defects detected in the release minus the pending defects (unresolved defects less than 30 days old at the release date). A computation sketch appears after this list.

  • Average CPE Incident Resolution Duration KPI. The average time needed to close a CPE incident during the measurement period.

  • Average Cycle Duration KPI. The average cycle duration. This KPI enables you to measure whether the organization has adopted Agile best practices. Short cycles can eventually lead to more agility from a time-to-market perspective.

  • Average Time to Resolve Production Defect KPI. The average time needed to fix a defect in production during the measurement period. A production defect is a post-release defect (detected after the release end date).

  • Average Time to Review Requirement KPI. The average time spent to review and approve a requirement during the measurement period.

  • CPE Incident Backlog KPI. The number of open CPE incidents.

  • CPE Long Duration Incident Backlog KPI. The number of CPE incidents that are still open after 30 days.

  • CPE Medium and Low Incident Backlog KPI. The number of open CPE incidents with medium and low severity.

  • CPE Urgent and High Incident Backlog KPI. The number of open CPE incidents with urgent and high severity.

  • Cycle Time on User Story KPI. The average time needed to implement a User Story.

  • Defect Resolution Time KPI. The average time it takes to close a defect during the measurement period.

  • Detected Vs Closed Defects Ratio KPI. The ratio between detected defects and closed defects during the measurement period.

  • Feature Actual Burndown KPI. The number of actual features that are ongoing.

  • Feature Planned Burndown KPI. The number of ongoing planned features.

  • Feature Traceability KPI. The number of features that are covered by a test case, relative to the sum of the number of completed features that are tested and the number of uncompleted features that are covered by a test case.

  • Fixing Ratio KPI. The number of fixed defects relative to the total number of defects. A computation sketch appears after this list.

  • Number of Closed CPE Incident KPI. The number of CPE incidents that were closed during the measurement period.

  • Number of Escalated CPE Incidents KPI. The number of escalated CPE incidents that were opened during the measurement period.

  • Number of Escaped Defects KPI. The number of defects that were not discovered during pre-production quality testing, and were found after the release (their discovery date is after the release).

  • Number of Opened CPE Incident KPI. The number of CPE incidents that were opened during the measurement period.

  • Percentage of Actual vs Planned Executed Tests KPI. The number of tests that were executed relative to the total number of test instances that were opened during the measurement period.

  • Percentage of Authorized Test Cases KPI. The number of test cases whose planning status is Ready, relative to the total number of test cases that were created during the measurement period.

  • Percentage of Automated Test Cases KPI. The number of test cases that were automated relative to the total number of test cases that were created during the measurement period.

  • Percentage of Completed Test Instances KPI. The number of test instances that were executed relative to the total number of test instances that were planned to be executed during the measurement period.

  • Percentage of Completed Test Runs KPI. The number of test runs that were executed relative to the total number of tests that were run during the measurement period.

  • Percentage of Critical Defects KPI. The number of critical defects (with 'Urgent' or 'Very High' status) that occurred relative to the total number of defects that were opened during the measurement period.

  • Percentage of Documented Requirements KPI. The number of requirements with attachments or descriptions larger than 50 words relative to the total number of requirements during the measurement period.

  • Percentage of Failed Test Instances KPI. The number of test instances that failed relative to the total number of test instances that were run during the measurement period.

  • Percentage of Failed Test Runs KPI. The number of test runs that failed relative to the total number of test runs that occurred during the measurement period.

  • Percentage of Rejected Defects KPI. The number of defects that were rejected relative to the total number of defects that were opened during the measurement period.

  • Percentage of Reopened Defects KPI. The number of reopened defects (supposedly fixed defects, or defects that were once fixed but reappeared) relative to the total number of defects that were logged during the measurement period.

  • Percentage of Requirements Traced to Tests KPI. The number of requirements that have a corresponding test relative to the total number of requirements. Note: The assumption is that a cycle duration is shorter than the KPI periodicity (if the KPI periodicity is monthly, the cycle duration should be a month or less).

  • Percentage of Reviewed Requirements KPI. The number of business or functional requirements that have been reviewed relative to the total number of business or functional requirements that were planned to be reviewed during the measurement period.

  • Percentage of Successful Test Cases KPI. The number of test cases with a passed execution status in the last run (last test instance) relative to the total number of test cases that were planned to be executed during the measurement period.

  • Percentage of Successful Test Instances KPI. The number of successful test instances relative to the total number of test instances that occurred during the measurement period.

  • Percentage of Successful Test Runs KPI. The number of successful test runs relative to the total number of test runs that occurred during the measurement period.

  • Percentage of Test Instances Resulting in Defects KPI. The number of test instances linked to defects relative to the total number of test instances that were run during the measurement period. Note that the KPI only counts test instances that are directly linked to defects.

  • Percentage of Tested Requirements KPI. The number of requirements that are actually covered and completed by tests that run, relative to the total number of requirements during the measurement period. Note: By default, the KPI is based on cycles; if the organization does not use cycles, the default can be changed to Projects.

  • Regression Ratio KPI. The number of regression defects relative to the total number of defects.

  • Reject Ratio KPI. The number of rejected defects relative to the total number of defects.

  • Reopen Ratio KPI. The number of submitted defects that have been fixed or closed by Dev and then reopened because there are still problems. Defects that are reopened n times will be counted n times.

  • User Story Traceability KPI. The number of user stories that are Validated or Done, do not have "N/A" as their QA Status, and are actually covered by a test case, relative to the total number of user stories that are Validated or Done and do not have "N/A" as their QA Status.
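
Many of the KPIs above are simple ratios over defect records. As a minimal illustrative sketch (not the actual Data Warehouse calculation), the following shows how the Fixing Ratio and the Agile Urgent and High Defect Resolution KPIs could be computed. The Defect fields, the status values, and the reading of "pending defects" are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List, Optional

@dataclass
class Defect:
    severity: str                      # assumed values: "Urgent", "High", "Medium", "Low"
    status: str                        # assumed values: "Fixed", "Open", "Rejected", ...
    detected_on: date
    resolved_on: Optional[date] = None # None while the defect is unresolved

def fixing_ratio(defects: List[Defect]) -> float:
    """Fixing Ratio: fixed defects relative to the total number of defects."""
    if not defects:
        return 0.0
    fixed = sum(1 for d in defects if d.status == "Fixed")
    return fixed / len(defects)

def agile_defect_resolution(defects: List[Defect], release_date: date,
                            severities=("Urgent", "High")) -> float:
    """Agile defect resolution: defects of the given severities resolved within
    30 days of the release date, relative to the defects detected in the release
    minus the pending defects (read here as unresolved defects less than
    30 days old at the release date)."""
    in_scope = [d for d in defects if d.severity in severities]
    resolved = sum(1 for d in in_scope
                   if d.resolved_on is not None
                   and d.resolved_on <= release_date + timedelta(days=30))
    pending = sum(1 for d in in_scope
                  if d.resolved_on is None
                  and d.detected_on > release_date - timedelta(days=30))
    denominator = len(in_scope) - pending
    return resolved / denominator if denominator > 0 else 0.0
```

For instance, 40 fixed defects out of 50 total yield a Fixing Ratio of 0.8; in the user story above, it is a drop in this value at month boundaries that Dan investigates.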

Data (External Tables)

ALM_Demo

ALM-Application LifeCycle Management Overview Page

ALM-Pre-Release Page

ALM-Post-Release Page