Software delivered with speed, stability, and maximized organizational and operational availability helps offset fluctuating customer demands and expectations, competition, security threats, regulatory compliance, and a host of other external pressures in fast-paced business markets.
To give the software industry holistic, standard measurements of these technological capabilities, grounded in rigorous research and yielding reliable, repeatable, and scalable benchmarks, the DORA (DevOps Research and Assessment) platform was formed to address these key areas and improve software product and process performance.
Furthermore, the DORA platform specifically addresses:
“the unavailability of external benchmarking data to drive performance comparisons, and the inability to measure improvement quantitatively over time in relation to changes in the rest of the industry, prevents teams from understanding the dynamic wider context in which they operate. This can lead to teams failing to take sufficiently strong action, and falling further behind the industry over time…Software development and delivery include several key capabilities: strong technical practices, decoupled architectures, lean management practices, and a trusting organizational culture.” [1]
These challenges were addressed through the DORA Assessment Tool [1], which was designed to collect and analyze qualitative assessments targeted at business leaders, both directly and indirectly (through channel partners such as consultancies and system integrators), and to report the resulting quantitative, industry-standard metrics to the software community at large. The DORA assessment surveyed software professionals across the entire production value stream, including development, testing, QA, IT operations, information security, and product management.
The key capabilities stated above are measured along four primary dimensions [1]:
- Technical (continuous delivery practices, including version control, test and deployment automation, trunk-based development, and shifting security left)
- Process (lean methods, including work visualization, work decomposition for single-piece flow, and work-in-process (WIP) limits; a brief sketch of a WIP-limit check follows this list)
- Measurement (use of metrics for business decisions and use of monitoring tools)
- Cultural (measures including organizational trust, information flow, learning, and job satisfaction)
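As a concrete illustration of the Process dimension, the minimal sketch below checks Kanban-style WIP limits. The column names, card identifiers, and limits are purely illustrative assumptions, not part of the DORA assessment itself.

```python
# Minimal sketch of a work-in-process (WIP) limit check for a Kanban-style
# board. Column names, limits, and card IDs are illustrative assumptions.
wip_limits = {"In Progress": 3, "In Review": 2}
board = {
    "In Progress": ["US-101", "US-102", "US-103", "US-104"],
    "In Review": ["US-099"],
}

for column, limit in wip_limits.items():
    count = len(board.get(column, []))
    status = "OK" if count <= limit else f"over limit by {count - limit}"
    print(f"{column}: {count}/{limit} ({status})")
```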
The software metrics derived from the DORA Assessment Tool methodology include deployment frequency, lead time for changes, time to restore service, change failure rate, and software delivery and operational (SDO) performance. Each of these key DORA metrics is defined in the table below.
| Metric | Aspect of software delivery performance | Reference |
| --- | --- | --- |
| Deployment frequency | For the primary application or service you work on, how often does your organization deploy code to production or release it to end users? | [2] |
| Lead time for changes | For the primary application or service you work on, what is your lead time for changes (how long does it take to go from code committed to code successfully running in production)? | [2] |
| Time to restore service [mean time to restore (MTTR)] | For the primary application or service you work on, how long does it generally take to restore service when a service incident or a defect that impacts users occurs (unplanned outage or service impairment)? | [2] |
| Change failure rate | For the primary application or service you work on, what percentage of changes to production or released to users result in degraded service (lead to service impairment or service outage) and subsequently require remediation (a hotfix, rollback, fix forward, or patch)? | [2] |
| Availability [software delivery and operational (SDO) performance] | At a high level, availability represents the ability of technology teams and organizations to make and keep promises and assertions about the software product or service they operate. Notably, availability is about ensuring a product or service is available to and can be accessed by end users. This measure of availability also captures how well teams define their availability targets and learn from any outages, making sure their feedback loops are complete. | [3] |
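To make these definitions concrete, the minimal sketch below computes each metric from a small set of hypothetical deployment and incident records. All record structures, field names, and values here are illustrative assumptions; real data would come from your delivery pipeline and incident-tracking tools.

```python
from datetime import datetime, timedelta

# Hypothetical deployment and incident records; field names are assumptions.
deployments = [
    {"committed_at": datetime(2020, 1, 5, 9), "deployed_at": datetime(2020, 1, 6, 10), "failed": False},
    {"committed_at": datetime(2020, 1, 7, 11), "deployed_at": datetime(2020, 1, 8, 15), "failed": True},
    {"committed_at": datetime(2020, 1, 10, 16), "deployed_at": datetime(2020, 1, 13, 9), "failed": False},
]
incidents = [
    {"started_at": datetime(2020, 1, 8, 15), "restored_at": datetime(2020, 1, 8, 17, 30)},
]
period = timedelta(days=30)  # observation window

# Deployment frequency: deployments per day over the window.
deployment_frequency = len(deployments) / period.days

# Lead time for changes: median time from code committed to code
# successfully running in production.
lead_times = sorted(d["deployed_at"] - d["committed_at"] for d in deployments)
median_lead_time = lead_times[len(lead_times) // 2]  # upper median for even counts

# Time to restore service (MTTR): mean duration of user-impacting incidents.
restore_times = [i["restored_at"] - i["started_at"] for i in incidents]
mttr = sum(restore_times, timedelta()) / len(restore_times)

# Change failure rate: share of deployments that required remediation.
change_failure_rate = sum(d["failed"] for d in deployments) / len(deployments)

# Availability: fraction of the window not lost to incident downtime.
availability = 1 - sum(restore_times, timedelta()) / period

print(f"Deployment frequency: {deployment_frequency:.2f} per day")
print(f"Median lead time for changes: {median_lead_time}")
print(f"Time to restore service (MTTR): {mttr}")
print(f"Change failure rate: {change_failure_rate:.0%}")
print(f"Availability: {availability:.3%}")
```

Note that DORA itself gathers these measures through survey responses rather than pipeline telemetry; the sketch simply mirrors the metric definitions from the table.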
Finally, efficient software delivery is realized through speed, stability, and maximized organizational and operational availability, as quantified by these key metrics across the value stream.
Summary
Based on the discussion above, UrbanCode Velocity delivers a unique, working software solution that comprehensively addresses these core requirements, optimizing flow through the value stream using Agile and Lean principles. Discover and dynamically interact with UrbanCode Velocity’s value stream key metrics in the complementary blog article located here. The HCL team provides practical solutions that uncover data-driven insights and propagate DevOps best practices across the enterprise.
[1] Forsgren, N., M. Chiarini Tremblay, D. Vandermeer, J. Humble (2017). “DORA Platform: DevOps Assessment and Benchmarking,” In Proceedings of the International Conference on Design Science Research in Information System and Technology (DESRIST) 2017, Karlsruhe, Germany.
[2] DORA (DevOps Research and Assessment). Accelerate: State of DevOps Report, 2019.
[3] DORA (DevOps Research and Assessment). Accelerate: State of DevOps: Strategies for a New Economy, 2018.
#DevOps #UrbanCodeVelocity