Senior IT managers need to contend with a variety of software development projects going on at the same time. And there is no shortage of challenges: identifying the projects and the measures that make sense across them, implementing data collection from a variety of sources, normalizing the results across projects of different sizes, selecting meaningful ways to present all this information and, finally, using it effectively. Approaching the analysis, design and implementation systematically will make the ALM quality dashboard simpler, more actionable and more effective.
Analysis for an ALM quality dashboard
- Metrics for the ALM quality dashboard: Metrics for the ALM dashboard must span all phases of the application lifecycle evenly: requirements gathering, design, development, testing and maintenance. They also need to balance efficiency measures against effectiveness measures. For example, an efficiency measure for requirements gathering will need to be balanced with an effectiveness measure like end-user satisfaction. If end users do not see their requirements addressed in a prioritized, systematic fashion within a reasonable period of time, they may feel their requests are simply joining a lengthening backlog.
- Normalization needed across projects: Applications can be small or large; mainframe-based, client-server or mobile. They may be multiyear, large-investment projects or short-run, simple ones. The ALM quality dashboard needs to carry much the same measures on all projects so that you're comparing them in a meaningful way. Weighting projects by size and complexity is often necessary to make those comparisons fair.
- Actionable metrics: The metrics need to be actionable, especially if they are part of an ALM quality dashboard. The actions flowing out of them need to be practical and implementable, leading to improved ALM quality.
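To make the normalization idea concrete, here is a minimal sketch of weighting defect counts by project size and complexity. The project names, KLOC sizes and complexity weights are hypothetical examples, not prescribed values:

```python
# Normalize raw defect counts across projects of different sizes so they
# can be compared side by side on one dashboard. Sizes are in KLOC
# (thousands of lines of code); the complexity weights are illustrative.

def normalized_defect_density(defects, size_kloc, complexity_weight=1.0):
    """Defects per KLOC, discounted for more complex projects."""
    return defects / (size_kloc * complexity_weight)

projects = [
    # (name, open defects, size in KLOC, complexity weight)
    ("payroll-mainframe", 120, 400, 1.5),
    ("mobile-expenses",    18,  25, 1.0),
]

for name, defects, kloc, weight in projects:
    density = normalized_defect_density(defects, kloc, weight)
    print(f"{name}: {density:.2f} weighted defects/KLOC")
```

Note how normalization can invert the raw picture: the small mobile project (0.72 weighted defects/KLOC) compares worse than the large mainframe project (0.20), even though it has far fewer defects in absolute terms.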
Design of an ALM quality dashboard
- Handful of metrics. ALM quality dashboards are useless if they carry too many metrics. The whole idea of a dashboard is to get a comprehensive picture of quality quickly, without wading through an ocean of numbers. It is always possible to classify every metric you could measure as primary, secondary or tertiary, depending on how much effect it has on overall ALM quality. Dashboards should focus on the primary ones only.
- Display design. An ALM quality dashboard display should be restricted to one screen full of information, whether it’s browser-based or a display that pops up on a laptop, desktop or a mobile device like a smartphone or tablet. Increasingly, tablets and smartphones are becoming ideal vehicles for dashboards.
- Drill-down capabilities. Dashboards are just springboards for further analysis, and further action may not be possible without deeper analysis of the metrics they display. Drilling down enables developers and managers to unearth the anomalies behind an out-of-range number.
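The tiering and drill-down ideas above can be sketched together: show only primary metrics on the one-screen view, and keep the lower tiers reachable on demand. The metric names, tiers and values here are hypothetical:

```python
# Classify metrics into primary/secondary/tertiary tiers. The top-level
# dashboard shows only the primary tier; lower tiers stay available
# through drill-down. All names and values are illustrative assumptions.

metrics = {
    "escaped_defects":        {"tier": "primary",   "value": 7},
    "requirements_churn_pct": {"tier": "primary",   "value": 12},
    "code_review_coverage":   {"tier": "secondary", "value": 85},
    "unit_test_runtime_s":    {"tier": "tertiary",  "value": 340},
}

def dashboard_view(metrics, tier="primary"):
    """Return only the metrics belonging to the requested tier."""
    return {name: m["value"] for name, m in metrics.items()
            if m["tier"] == tier}

top_level = dashboard_view(metrics)               # the one-screen view
drill_down = dashboard_view(metrics, "secondary") # revealed on demand
```

The same tier field that trims the top-level view is what makes drill-down cheap: moving down a level is just re-querying the store with a different tier.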
Implementation, follow-up and action for an ALM quality dashboard
- Data from multiple sources: Data for ALM quality metrics may come from multiple test management solutions, some central and some distributed. Making all this information flow to a central system for analysis and action is not trivial. The dashboard system chosen needs to accommodate direct feeds from database queries, reports, spreadsheets or exported flat files. These sources need to be monitored to ensure that data from all of them has been received before the dashboards are generated.
- Comparable timeline matching: The timelines reflected in the data need to be comparable. This is of special importance if some of the data sources are networked to the ALM dashboard software but others are asynchronous and come through file transfers, spreadsheets or reports.
- Corrective actions: Corrective actions and feedback are ideally fed directly from the dashboard to the people responsible for the various quality metrics. If comments or questions can be emailed directly from the dashboard to the people responsible for that metric, at that level of drill-down, it will improve course corrections.
- Follow-up and loop back: Reporting systems should allow feedback and discussion on an ongoing basis as part of a feedback loop. Metrics that are out of range and actions that need to be taken and were taken should all be available on the dashboard. These capabilities are possible when dashboards are implemented as part of collaborative software like Microsoft SharePoint Portal.
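The source-monitoring step described above can be reduced to a simple completeness check before generation runs. A minimal sketch, where the source names and the set of feeds received so far are illustrative assumptions:

```python
# Hold dashboard generation until a feed has arrived from every expected
# source. Source names here are hypothetical examples.

EXPECTED_SOURCES = {"central_test_mgmt", "offshore_qa_export", "perf_lab_csv"}

def missing_sources(received):
    """Return the expected feeds that have not yet arrived."""
    return EXPECTED_SOURCES - set(received)

def ready_to_generate(received):
    """True only when every expected feed has been received."""
    return not missing_sources(received)

received = {"central_test_mgmt", "perf_lab_csv"}
if not ready_to_generate(received):
    print("Holding dashboard generation; waiting on:",
          ", ".join(sorted(missing_sources(received))))
```

In practice this check would run on a schedule (or on each feed arrival), with asynchronous sources such as file transfers marking themselves received as they land.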
ALM quality dashboards need to convey a large amount of critical information to IT management for observation, analysis and corrective action. In reality, projects could be in various stages of the ALM lifecycle, they could be in-house or outsourced, the development teams could be geographically dispersed and the applications themselves may vary in size and complexity. Creating a single ALM quality dashboard in such cases is challenging, but if we focus on the few key characteristics such a dashboard needs for maximum efficiency and effectiveness, it can be designed, implemented and used well.