Report testing checklist: Perform QA on data analysis reports

To incorporate data analysis features into software, fully test the reports they generate. Use this checklist to cover these reports' requirements, the test environment, APIs and test data.

Software-generated reports provide a great deal of value to customers. Reporting tools and dashboards serve up exceptional data analysis and slick graphics within software. But QA for these software-generated reports is difficult.

QA teams face two challenges when testing reports: first, getting well-defined, planned and designed reports with enough time for testing; and second, eliciting the relevant data to test them in a nonproduction environment.

Testers should also understand how to use the reporting tool; however, reporting tools change frequently. I'm not sure why, but in my experience, businesses replace the reporting tool more often than the toilet paper in the office restroom. Worse yet, many companies support multiple tools. Development and testing with one reporting tool is difficult; doing the same tasks for multiple, often-changing tools is close to impossible.

To test reports in software properly, practice the measures in this checklist:

  1. Report requirements: Request better software requirements for reports.
  2. The test environment: Ensure the test environment works with reporting tools and other components.
  3. Connections and APIs for test servers: Set up necessary connections early.
  4. Test data sets: Get appropriate test data and account for how it differs from real data.

Following this checklist will help QA professionals test reporting features and software-generated reports.

1. Report requirements

Treat a report like any other feature, and call for thorough software requirements. Without functional definition from the start, reports can end up delaying release.

These requirements ensure appropriate design and testing:

  • A list of the desired data columns, which is particularly useful if the report is a simple, non-graphical spreadsheet with only data columns;
  • Requirements for advanced elements, including the filters, sorting options and calculations expected from the reporting software; and
  • A description of the types of graphics expected. Requirements should include how to display calculations, not just the type of calculation the report's target users need. Ask stakeholders to describe desired pop-out designs and which data displays on the X and Y axes of charts. (A machine-readable sketch of such requirements appears below.)

As with most software requirements, companies should not rely on a developer or tester to guess the intent of the end product. Report requirements need stakeholder input if the feature is going to meet the customer's expectations.
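
One way to keep these requirements testable is to capture them in a machine-readable fixture that testers review alongside the user story. The following Python sketch is only an illustration; the report name, columns, filters and chart settings are invented and would come from the actual stakeholder requirements.

    from dataclasses import dataclass

    # Hypothetical report specification captured from stakeholder requirements.
    # Every field value below is illustrative, not prescriptive.
    @dataclass
    class ReportSpec:
        name: str
        columns: list        # data columns the report must show
        filters: list        # user-selectable filters
        default_sort: str    # default sort column
        chart_type: str      # e.g., "bar", "line" or "none" for a plain data grid
        x_axis: str = ""     # only meaningful when chart_type is not "none"
        y_axis: str = ""

    monthly_sales_spec = ReportSpec(
        name="Monthly sales by region",
        columns=["region", "month", "orders", "revenue"],
        filters=["region", "date_range"],
        default_sort="revenue",
        chart_type="bar",
        x_axis="region",
        y_axis="revenue",
    )

    def check_report_columns(spec, report_rows):
        """Fail if a generated report row is missing a required column."""
        missing = [c for c in spec.columns if report_rows and c not in report_rows[0]]
        assert not missing, f"Report '{spec.name}' is missing columns: {missing}"

    # Example usage with a stubbed report row:
    check_report_columns(monthly_sales_spec,
                         [{"region": "EMEA", "month": "2023-01", "orders": 120, "revenue": 45000}])

A fixture like this doubles as a review artifact: if a stakeholder cannot fill in the fields, the requirements are not ready for development or testing.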

Last-minute development and the testing window

To ship an overall release on time, a development team can end up rushing any given software component -- but reports suffer this fate often. Developers nearly always create reporting features at the end of the development cycle, or the stakeholder drops the task on the team after the rest of development and testing is already complete. Reports tend to be a surprise -- i.e., work that no one communicated to the people who build the feature until the last minute.

By the time developers finish building the reports, the testing cycle is either over or close to it. And generally, no one has allocated time to test the reports. Accordingly, the final product suffers.

To avoid these issues, advocate for better planning around reports in software builds. Since this feature is so important to the customer, the powers-that-be should not rush developers and testers to work on reports after they've finished the application's other functionalities.

There's a dynamic that makes report design difficult: The people responsible for the requirements often are not familiar with the application's database structure. The requirements' authors might not understand how the report displays in the reporting tool the IT organization uses, and they might not be familiar with ways the reporting tool can enhance the data views. Be sure to ask questions during reviews, and get the visual elements defined along with the data elements.

QA professionals should review reporting user stories carefully:

  • Do you have enough requirements to create data and test the report?
  • Is there a screenshot or prototype of the report look and feel?
  • Are the expected filters and default sorting options clear?

2. The test environment

The QA professional responsible for checking the report features needs a controlled test environment, especially to ensure data integrity during the software tests.

  • The test environment should be dedicated to reports, not shared with other development work.
  • The report testing environment needs to be refreshable, meaning the data can be reset to a known state before each test run generates reports. If you don't refresh the data, you will waste time tracing data on the report to verify the software functioned correctly; a minimal refresh sketch follows this list.
  • Install the application with a defined configuration. A defined, known configuration ensures that the report displays and makes calculations with a valid data set. Without it, testers will see differences in data depending on configuration settings. Request information as early as possible on how the customers expect to configure the system. Match that configuration as closely as possible to ensure report data representations are accurate.
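
A minimal sketch of the refreshable environment, assuming the report test data lives in a SQLite database rebuilt from a seed SQL script; the file names, the orders table and the expected row count are hypothetical. The point is only that every report test starts from the same known data set.

    import sqlite3
    from pathlib import Path

    SEED_SCRIPT = Path("report_seed_data.sql")   # hypothetical seed script
    TEST_DB = Path("report_test.db")             # hypothetical report test database

    def refresh_report_data():
        """Drop the report test database and rebuild it from the seed script
        so every test run starts from the same known state."""
        if TEST_DB.exists():
            TEST_DB.unlink()
        conn = sqlite3.connect(TEST_DB)
        try:
            conn.executescript(SEED_SCRIPT.read_text())
            conn.commit()
        finally:
            conn.close()

    def test_report_source_rows():
        refresh_report_data()                    # known state before the report runs
        conn = sqlite3.connect(TEST_DB)
        try:
            # Because the seed data is fixed, the expected count is known in advance.
            count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
            assert count == 250, f"unexpected seed row count: {count}"
        finally:
            conn.close()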

3. Connections and APIs for test servers

Plan ahead to set up connections for the reports' test server and the reporting tool. Get the system working before the test starts if the report is designed to show data over time.

  • At the system or regression testing stage, execute tests within the reporting test environment. In this way, you'll execute functional tests while creating a data set for the overall report testing. Use the test cases to define the expected data that the report should display.
  • Test connections between this environment and the reporting tool as early as possible. Connections should return the expected data and should not generate errors; a connection smoke test is sketched after this list.
  • Verify that the data downloaded onto the report is correct. Often, incorrect data indicates that the tester needs to reconfigure the connection between the application that collects the data and the tool that analyzes it.
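
As an example of the early connection check, assume the reporting tool exposes a REST endpoint that returns report rows as JSON and accepts a bearer token; the URL, token and expected fields below are placeholders, and the real call depends on the tool in use. The sketch uses the requests library.

    import requests

    REPORT_API_URL = "https://reporting.example.test/api/reports/monthly-sales"  # placeholder
    API_TOKEN = "test-environment-token"                                          # placeholder

    def check_report_connection():
        """Smoke-test the connection between the test environment and the
        reporting tool: the call should succeed and return the expected fields."""
        response = requests.get(
            REPORT_API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            timeout=30,
        )
        # The connection should not generate errors...
        response.raise_for_status()

        rows = response.json()
        assert rows, "report endpoint returned no data"
        # ...and should return the expected data columns.
        expected_fields = {"region", "month", "orders", "revenue"}  # illustrative
        missing = expected_fields - set(rows[0])
        assert not missing, f"report data is missing fields: {missing}"

    if __name__ == "__main__":
        check_report_connection()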

4. Test data sets

Testers should emphasize the importance of a test data set that populates the reports as expected. Reports are designed to work with production data, which is not available to the tester. Test server data is usually vastly different, though the data types should be the same. Test servers might contain unusual data, including data generated by negative tests, and there are no automatic data feeds coming in from real users.

Most development projects lack production-like data because of cost, licensing or space. IT organizations must scrub production data before they can use it in a test environment. It's hard to set up an API connection that picks up production data only after testers have scrubbed it and loads it onto the test server when needed. Data API access usually requires additional security measures or a contractual agreement. But testers can work around these challenges to get workable data into the testing environment.

  • Plan time in test execution to create data for reports, or manually pull and upload a data set for testing purposes. You may be able to copy data over from another test server rather than creating it manually.
  • Ensure test data types match those of production data; a simple type comparison is sketched after this list.
  • Investigate data problems that previous test runs might have caused. A well-designed test environment can help prevent such problems.
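
To illustrate the data type check, the sketch below compares column types between a scrubbed production extract and the test data set feeding the reports, using pandas; the CSV file names and columns are hypothetical.

    import pandas as pd

    # Hypothetical files: a scrubbed extract of production data and the report test data set.
    PRODUCTION_SAMPLE = "scrubbed_production_sample.csv"
    TEST_DATA = "report_test_data.csv"

    def compare_data_types(prod_path, test_path):
        """Return columns whose data types differ between production-like data
        and the test data set that feeds the reports."""
        prod = pd.read_csv(prod_path)
        test = pd.read_csv(test_path)

        mismatches = {}
        for column in prod.columns:
            if column not in test.columns:
                mismatches[column] = ("present in production", "missing in test data")
            elif prod[column].dtype != test[column].dtype:
                mismatches[column] = (prod[column].dtype, test[column].dtype)
        return mismatches

    if __name__ == "__main__":
        for column, (prod_type, test_type) in compare_data_types(PRODUCTION_SAMPLE, TEST_DATA).items():
            print(f"{column}: production={prod_type}, test={test_type}")

Catching type mismatches this way keeps report calculations and formatting behaving in the test environment the way they will against production data.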
