At the heart of the latest wave of tool consolidation in enterprise software delivery is data—and getting all that precious information into one place.
By creating a single platform experience to help software delivery teams at large-scale organizations plan and track all work from ideation to operation, tool vendors hope to support customers in this goal: faster, more accurate knowledge-sharing between teams and tools in real time to accelerate the planning, building, and delivering of products. A single source of truth for business-centric IT reports helps organizations understand whether investments are paying off and improves decision making.
The problem with the notion of one tool, however, is that software delivery work is too complex to incorporate into a single approach. Specialist discipline teams across the value stream need targeted tools for their key slice of the process. For enterprises employing more than 10,000 IT staff members, consolidating all work and data into one tool is impossible considering the sheer scale of operations—not to mention the nuanced needs of the people involved with high-level requirements, portfolio management, and test management.
Enterprises need another way to mine and handle precious software delivery data to turn it into meaningful IT and business intelligence. One option is a best-of-breed approach that maximizes the value of the latest cutting-edge tools, supports the spiraling network of specialized software delivery teams, and enables an adaptive modular infrastructure that is responsive to the evolving needs of the business.
A vital part of this foundation is automating the flow of data across end-to-end tool networks to enhance cross-team collaboration and productivity in real time. This includes joining, abstracting, and measuring cross-tool data that represents the flow of value as features (business value), defects (quality), risk (security vulnerabilities), and debt (impediments to future delivery).
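As a rough illustration of that abstraction step, cross-tool data can be mapped from tool-specific work item types onto the four flow-item categories above. This is a minimal sketch; the tool names, type labels, and mapping table are hypothetical, not any vendor's schema:

```python
from dataclasses import dataclass

# Hypothetical mapping from (tool, item type) to a flow-item category.
# Real toolchains would derive this from each tool's metadata.
FLOW_TYPE_MAP = {
    ("jira", "Story"): "feature",          # business value
    ("jira", "Bug"): "defect",             # quality
    ("servicenow", "Vulnerability"): "risk",  # security
    ("jira", "Tech Debt"): "debt",         # impediments to future delivery
}

@dataclass
class FlowItem:
    source_tool: str
    source_id: str
    flow_type: str  # feature | defect | risk | debt

def to_flow_item(tool: str, record: dict) -> FlowItem:
    """Abstract a tool-specific work item into a normalized flow item."""
    flow_type = FLOW_TYPE_MAP.get((tool, record["type"]), "feature")
    return FlowItem(tool, record["id"], flow_type)

item = to_flow_item("jira", {"id": "PROJ-42", "type": "Bug"})
print(item.flow_type)  # defect
```

Once every tool's records pass through a mapping like this, the same metric can be computed across the whole toolchain regardless of where an item originated.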
At a glance, this may seem to be a catch-22: you need one source of truth but also must embrace a best-of-breed toolchain that is siloed and contains multiple viewpoints of a product’s development. Faced with data ambiguity and the rising costs of a massive warehouse strategy, enterprises have begun to use enterprise toolchain integration to create “one system” to optimize their data analytics strategy for software delivery. There’s a reason that the data integration market is expected to be worth $12.24 billion by 2022, growing at a respectable 13.7% CAGR. Data is the currency of business.
Through integration, automating the flow of data, and simplifying the cross-tool reporting process, organizations can achieve the following results:
Save Money on ETL
Given that most large-scale organizations use at least three main tools in their software delivery toolchains (for development, testing, and support), according to research my company has conducted, the cost of data extraction and storage begins to stack up, and fast. By connecting the tools in the value stream, you can extract the data that matters from multiple data points in real time. Then, through modeling, you can instantly align, reconcile, and abstract this cross-tool data into a specific set of value-stream metrics that measure the rate of business-value delivery for software products through the lens of your customers (whether internal or external).
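For instance, once cross-tool items have been reconciled into a common shape, a simple value-stream metric such as items completed per time window falls out directly. The sketch below is illustrative only; the field names and sample dates are assumptions:

```python
from datetime import date

# Hypothetical completed work items, already reconciled from several tools
# into one normalized shape.
completed = [
    {"flow_type": "feature", "done": date(2024, 1, 10)},
    {"flow_type": "defect",  "done": date(2024, 1, 12)},
    {"flow_type": "feature", "done": date(2024, 1, 20)},
]

def flow_velocity(items, start, end):
    """Count items of each flow type completed within the window."""
    counts = {}
    for it in items:
        if start <= it["done"] <= end:
            counts[it["flow_type"]] = counts.get(it["flow_type"], 0) + 1
    return counts

print(flow_velocity(completed, date(2024, 1, 1), date(2024, 1, 31)))
# {'feature': 2, 'defect': 1}
```

The point is that the metric is computed over the normalized model, not over each tool's raw database, so it survives tool swaps and workflow changes.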
Because data is stored in tools in different formats, cross-tool metrics require data normalization to make sense of it all. Moreover, unique (and unpublished) database schemas per tool require in-depth exploration that steals precious time from software delivery specialists who should be focusing on their job, not wading through endless data lakes. In addition, some tools generate a database per project, meaning hundreds of databases with nuanced schemas in each database. The sheer complexity and effort of this data work may explain why lead times for building reports are so long; if it takes 2–3 weeks to build a report, its findings are most likely going to be out of date. Likewise, any upgrade to tools or change to workflows will result in rework to accommodate the changes to schemas, fields, and values. Instead, you should seek to use self-learning modeling technology that enables you to project a business view over the collected data (even if you’re not a data specialist).
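A minimal sketch of that modeling idea: keep the tool-to-canonical mapping declarative, so a workflow or schema change means editing one table rather than rewriting every report. The tool names and workflow states here are assumptions for illustration:

```python
# Declarative status model: each tool's raw workflow states map to a small
# canonical set. When a team renames or adds a state, only this table
# changes; every downstream report keeps working.
STATUS_MODEL = {
    "jira": {"To Do": "new", "In Progress": "active", "Done": "done"},
    "azdo": {"New": "new", "Active": "active", "Closed": "done"},
}

def normalize_status(tool: str, raw_status: str) -> str:
    """Project a tool-specific status onto the canonical model."""
    try:
        return STATUS_MODEL[tool][raw_status]
    except KeyError:
        # Surface unmapped states explicitly instead of guessing.
        return "unknown"

print(normalize_status("azdo", "Active"))  # active
```

Returning an explicit "unknown" for unmapped states makes schema drift visible immediately, rather than silently skewing the metrics.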
Improve and Safeguard Data Access
Getting the data out of the tools can be a challenge unto itself. As more tools are consumed as a service, SaaS vendors protect their shared infrastructure’s performance by throttling API calls, and many purge data periodically to save on storage, limiting access to historical data. Through built-for-purpose, sophisticated toolchain integration, you can extract the information you need from the tools in near-real time.
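A common defensive pattern against API throttling is exponential backoff with jitter. The sketch below assumes a hypothetical `RateLimitError` raised when a vendor returns HTTP 429; a real client would also honor the vendor's `Retry-After` header:

```python
import random
import time

class RateLimitError(Exception):
    """Hypothetical error raised when the vendor throttles a call."""

def fetch_with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry a throttled API call with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            # Double the wait each attempt; jitter avoids thundering herds.
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
            time.sleep(delay)
    raise RuntimeError("rate limit not cleared after retries")

# Demo: a call that is throttled twice before succeeding.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError()
    return "payload"

print(fetch_with_backoff(flaky, base_delay=0.01))  # payload
```

Integration platforms bake this kind of throttle-awareness in, which is part of what the passage above means by "built-for-purpose" extraction.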
By letting data integration and automation software do the heavy lifting, you can focus on harnessing that precious data in several ways to support the business.
The benefits include the following:
- Automated traceability: Trace requirements all the way through development (including test) and delivery and back in order to meet stringent regulatory and compliance obligations.
- Ability to find bottlenecks: Pinpoint the wait states that are slowing down your end-to-end flow in order to identify where to focus improvement initiatives.
- Data-driven decision making: Use value-stream metrics aligned to key business results to monitor the real-time health of the value stream, inform strategic planning, and drive faster decision making.
- Manage work-in-progress (WIP): Understand how much work is in the whole system—and not just one area of the value stream—to help ensure demand isn’t outweighing capacity and undermining output.
With a fully connected and automated workflow and a means to extract real-time value-stream metrics, you can make more sophisticated decisions that continuously improve the flow of business value at a faster clip.
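As a minimal sketch of the WIP idea above, a snapshot of normalized work items can be grouped by value-stream stage to show where work is piling up. The item IDs and stage names are assumptions for illustration:

```python
from collections import Counter

# Hypothetical snapshot of in-flight items across the whole value stream,
# already normalized from multiple tools.
items = [
    {"id": "A-1", "stage": "design"},
    {"id": "A-2", "stage": "build"},
    {"id": "A-3", "stage": "build"},
    {"id": "A-4", "stage": "test"},
]

def wip_by_stage(items):
    """Count in-flight items per stage to reveal where WIP accumulates."""
    return Counter(it["stage"] for it in items)

print(wip_by_stage(items))
# Counter({'build': 2, 'design': 1, 'test': 1})
```

Comparing these counts against each stage's capacity is what tells you whether demand is outweighing capacity across the system, not just in one team's tool.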