Testing overload: Where do we go from here?
January 18, 2017
With the recent decision by the Federal Aviation Administration (FAA) that all Boeing 787 Dreamliners must be reset every 21 days[1], it’s no wonder that organizations are feeling the demands of testing, and of testing efficiently. Many companies in the embedded market already use tools like static code analysis (SCA) to help with testing, but with all the requirements and regulations, you have to wonder: “Have we reached testing overload?”
The first real bug, a moth trapped in a relay, was found on September 9, 1947 in the Harvard Mark II[2]. Computer scientists spent the next several decades writing, testing, debugging, and reviewing code, though not necessarily as separately defined tasks. It wasn’t until the late 1970s that debugging and testing were established as separate tasks[3].
In 1979, Glenford Myers published the first edition of The Art of Software Testing, a book that helped establish how software testing should be managed. Myers helped pave the way for cleaner coding, testing tools, and better software. But between 1979 and 2013, Myers’ book went through 69 editions. With that number, not to mention the many, many other testing books and guides that have entered the market since, it’s no wonder people feel overwhelmed.
Testing used to be an individual task: developers reread their own code and corrected whatever mistakes they could catch with the naked eye. Peer review was better than self-review, but it was still overwhelming and destined to let issues fall through the cracks.
Fast-forward to today, and the tools available to help developers with testing are endless. And, as with books on proper testing, the same question remains: “How do you weed through all the options and know which tools are right for you?”
Many developers in the embedded market use SCA as a testing tool while they’re writing code. SCA is effective because developers can find and fix issues before the code is merged into the shared codebase, making it easier for multiple people to work together on a project. But SCA tools have their own issues and their own barriers to success. For example, developers are often left with a very long list of findings to correct, and it can be hard to figure out where to start and which issues matter most.
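To make that concrete, here is a minimal C example of the kind of defect an SCA tool typically flags at the developer’s desktop, before the code is ever committed. The function and values are invented for illustration; specific checkers and messages vary by tool:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Illustrative only: a typical defect pattern that static
 * analysis catches before code review or integration. */
char *copy_name(const char *src)
{
    char *buf = malloc(16);
    /* An analyzer would flag the next line: 'buf' may be NULL
     * if malloc fails, and 'src' may not fit in 16 bytes. */
    strcpy(buf, src);
    return buf;
}

int main(void)
{
    char *name = copy_name("embedded");
    printf("%s\n", name);
    free(name);
    return 0;
}
```

The code compiles and usually runs fine, which is exactly why such defects slip past casual review; static analysis flags them without ever executing the program.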
It’s a developer’s dream to see defects in a prioritized list. Imagine how much easier testing would be if developers could triage defects according to how likely each is to be a false positive, better managing their time and workload by skipping items that were incorrectly flagged.
Luckily, embedded developers can now take advantage of a long-overdue ranking system for false-positive detection within SCA. Developers can cut through the noise and see exactly where teams should start with their testing efforts.
Rogue Wave Software’s Klocwork 2017 offers a feature that allows developers to prioritize, review, and triage the issues in their projects. Based on sophisticated analysis of the code, including factors such as analysis complexity, the tool attaches a recommendation to each detected issue, so developers can be confident that the flagged issues are real and should be tackled first. Because many factors feed into the triage suggestions, whether the run is a desktop analysis or a continuous integration build, the suggestion for an individual issue can also change as the code changes.
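As a rough sketch of what score-based triage might look like, the example below sorts a handful of findings by a per-issue confidence score. The issue IDs, descriptions, and scores are hypothetical, and this is not Klocwork’s actual data model or algorithm:

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical issue record; fields and scores are invented
 * for illustration, not taken from any particular tool. */
typedef struct {
    const char *id;
    const char *description;
    double confidence;   /* 0.0 = likely false positive,
                            1.0 = likely true defect */
} issue_t;

/* Sort descending by confidence so the issues most likely
 * to be real defects appear at the top of the list. */
static int by_confidence_desc(const void *a, const void *b)
{
    double da = ((const issue_t *)a)->confidence;
    double db = ((const issue_t *)b)->confidence;
    return (da < db) - (da > db);
}

int main(void)
{
    issue_t issues[] = {
        { "NULL-DEREF-01",  "Possible NULL pointer dereference",  0.92 },
        { "UNINIT-VAR-07",  "Variable may be used uninitialized", 0.35 },
        { "BUF-OVERRUN-03", "Possible buffer overrun",            0.78 },
    };
    size_t n = sizeof issues / sizeof issues[0];

    qsort(issues, n, sizeof issues[0], by_confidence_desc);

    for (size_t i = 0; i < n; i++)
        printf("%.2f  %-15s %s\n",
               issues[i].confidence, issues[i].id,
               issues[i].description);
    return 0;
}
```

In practice the score would come from the analysis engine itself and be recomputed as the code changes, but the sorting step captures the core idea: spend reviewer time on the findings most likely to be real.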
Because developing for the embedded market is so complex and getting more so every day, testing will always be a large, important task. Tools will continue to make this process less of a burden so developers can cut through the noise, fix what really matters, and move on to developing the next important part of a project.
Rogue Wave Software
www.roguewave.com
LinkedIn: www.linkedin.com/company/rogue-wave-software
Facebook: www.facebook.com/pages/Rogue-Wave-Software/340310722680802
YouTube: www.youtube.com/user/roguewavesoftware
References:
1. Kaszycki, Michael. “Airworthiness Directives; The Boeing Company Airplanes.” Rgl.faa.gov. December 2, 2016. Accessed January 19, 2017. http://rgl.faa.gov/Regulatory_and_Guidance_Library/rgad.nsf/0/828bbc426b0667298625807d00585f86/$FILE/2016-24-09.pdf.
2. Computerworld staff. “Moth in the machine: Debugging the origins of ‘bug’.” Computerworld. September 3, 2011. Accessed January 18, 2017. http://www.computerworld.com/article/2515435/app-development/moth-in-the-machine--debugging-the-origins-of--bug-.html.
3. “Software testing history.” ExtremeSoftwareTesting.com. Accessed January 18, 2017. http://extremesoftwaretesting.com/Info/SoftwareTestingHistory.html.