The process of locating, identifying and targeting software vulnerabilities has changed beyond measure in the last 20 years. Automated tools are a huge factor in this and have made the whole process a lot easier, yet subtle and esoteric weaknesses are beginning to creep into software – and they aren’t easy to spot.
You can’t picture what a house looks like if all you have to go on is a single brick. You have to look at how the bricks work together and how the house is able to stand upright. To glean the best picture, you also need to know what’s inside the house and what surrounds it. Similarly, in order to gain a full and thorough understanding of the security vulnerabilities in software, it’s important to understand the environment in which the target software will run, the third parties that it depends on, and how every aspect works together, from the source code to the way it handles errors while running.
Gaining a comprehensive understanding of the security vulnerabilities in a piece of software, and of how to mitigate them, requires a broad knowledge of the risks that can occur and how they manifest themselves as vulnerabilities. Combining multiple software analysis techniques can make this process easier.
Manual program analysis. Historically, this was how researchers looked for flaws in software, and reviewing code by hand to understand which parts are vulnerable remains a popular approach today. Researchers can focus on vulnerabilities with a high probability of exploitation rather than those that merely cause crashes. Sometimes a problem is easy to spot; other issues can only be identified by researchers with an in-depth understanding of one specific program. This is where automated analysis comes in.
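As an illustration, here is the kind of flaw a manual review might catch. The function and input below are hypothetical, invented for this sketch: user input is concatenated straight into a command string, which a reviewer who knows the string later reaches a shell would immediately flag.

```python
# Hypothetical example: a reviewer spots that user input flows unsanitised
# into a command string, enabling command injection.

def build_lookup_command(username: str) -> str:
    # BUG: no validation -- a crafted username injects extra shell commands.
    return "getent passwd " + username

# A crafted input demonstrates the problem:
malicious = "alice; rm -rf /"
print(build_lookup_command(malicious))
```

The fix a reviewer would suggest is to validate the input or avoid the shell entirely, e.g. by passing arguments as a list to `subprocess.run(["getent", "passwd", username])`.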
Dynamic analysis. This involves software testing performed while a program is executing, usually through an automated tool. Here researchers can observe how a program runs and responds to different inputs, for example, causing certain actions to fail by inputting malformed data. Overall this is a solid way to find vulnerabilities as it measures the state of the program in real-time and helps discover vulnerabilities that are extremely difficult to detect by other methods.
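A minimal sketch of the idea: run the target on a range of inputs, including malformed ones, and record how it behaves at runtime instead of letting it crash the harness. Both `parse_record` (the program under test) and the probe inputs are made up for this example.

```python
# Minimal dynamic-analysis harness: execute the target with varied inputs
# and observe its runtime behaviour. parse_record is a hypothetical target.

def parse_record(data: bytes) -> dict:
    name, _, age = data.partition(b",")
    # Fragile: assumes valid UTF-8 and a numeric age field.
    return {"name": name.decode(), "age": int(age)}

def dynamic_test(target, inputs):
    """Run the target on each input, recording failures rather than crashing."""
    failures = []
    for data in inputs:
        try:
            target(data)
        except Exception as exc:
            failures.append((data, type(exc).__name__))
    return failures

probes = [b"alice,30", b"", b"bob,", b"\xff\xfe,x"]
for data, error in dynamic_test(parse_record, probes):
    print(data, "->", error)
```

Real dynamic analysis tools go much further, instrumenting memory accesses and code coverage while the program executes, but the principle is the same: measure the program's actual behaviour under hostile input.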
Static code analysis. This is another great technique for finding vulnerabilities, and one that does not require executing the program. Researchers perform the analysis on source code, so it's important they choose tools with a deep understanding of the programming language's syntax, the compiler and the target architecture. But no matter how intelligent the tool is, a researcher running it against a fresh code base that has never been exposed to static analysis should be prepared for a flood of findings, many of which may be false positives.
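The core idea can be sketched with Python's standard `ast` module: parse source code into a syntax tree and flag suspicious constructs without ever running the code. This toy analyser only looks for calls to `eval` and `exec`; production tools model data flow, types and compiler behaviour far more deeply.

```python
# Toy static analyser: walk a program's syntax tree and flag calls to
# dangerous built-ins, without executing the code under analysis.
import ast

DANGEROUS = {"eval", "exec"}

def find_dangerous_calls(source: str):
    """Return (function name, line number) pairs for flagged calls."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in DANGEROUS):
            findings.append((node.func.id, node.lineno))
    return findings

code = "x = input()\nresult = eval(x)\n"
print(find_dangerous_calls(code))  # [('eval', 2)]
```

Note how even this crude rule can fire on perfectly safe code, which is exactly why a fresh code base tends to produce a pile of false positives that need manual triage.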
Static software composition analysis. This looks at the composition of software and the third-party components it depends on. Although this method can be tricky (third-party components can themselves be made of third-party software), static analysis techniques can determine which libraries are in use by examining package manager metadata or proprietary databases. The resulting inventory is then cross-referenced with vulnerability databases such as the NIST NVD to identify which libraries pose a security risk. This can help to identify weak links and indicators of how they can be exploited.
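The workflow reduces to two steps: extract the dependency list from package manager metadata, then cross-reference it against a vulnerability feed. In this sketch the requirements text, the `libfoo`/`libbar` package names and the advisory record are all invented stand-ins for real metadata and NVD data.

```python
# Sketch of software composition analysis: parse declared dependencies and
# cross-reference them against a vulnerability database.

def parse_requirements(text: str) -> dict:
    """Parse pinned requirements.txt-style lines into {name: version}."""
    deps = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            name, _, version = line.partition("==")
            deps[name.lower()] = version
    return deps

# Hypothetical stand-in for records fetched from a feed such as NIST NVD.
ADVISORIES = {("libfoo", "1.2.0"): "CVE-XXXX-0001 (hypothetical)"}

def audit(requirements: str):
    """Return (name, version, advisory) for each known-vulnerable dependency."""
    deps = parse_requirements(requirements)
    return [(name, ver, ADVISORIES[(name, ver)])
            for name, ver in deps.items() if (name, ver) in ADVISORIES]

reqs = "libfoo==1.2.0\nlibbar==2.0.1\n"
print(audit(reqs))
```

The hard part in practice is the recursion the paragraph above alludes to: each dependency has its own dependency tree, so real tools resolve the full transitive closure before auditing it.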
Fuzzing. Fuzzing is a popular discovery technique among researchers. This method of vulnerability discovery uses purpose-built software or hardware to generate test cases and submit them to the target software. Increasingly sophisticated fuzzers are uncovering high-profile vulnerabilities. The technique has come a long way in a short time and promises to continue to push the boundaries of vulnerability discovery.
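A minimal mutation fuzzer illustrates the generate-and-submit loop: take a known-good seed input, randomly corrupt it, feed it to the target and collect the inputs that trigger failures. The `parse_header` target and its "HD" format are hypothetical; real fuzzers like AFL add coverage feedback, smarter mutations and crash triage on top of this loop.

```python
# Minimal mutation fuzzer: flip one byte of a seed input per iteration and
# record every input that makes the (hypothetical) target raise an error.
import random

def parse_header(data: bytes) -> bytes:
    """Parse a toy 'HD' header: 2-byte magic, 1-byte length, 1 pad, body."""
    if data[:2] != b"HD":
        raise ValueError("bad magic")
    length = data[2]
    body = data[4:4 + length]
    if len(body) != length:
        raise ValueError("truncated body")
    return body

def mutate(seed: bytes, rng: random.Random) -> bytes:
    data = bytearray(seed)
    data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

def fuzz(target, seed: bytes, iterations: int = 1000, rng_seed: int = 1):
    rng = random.Random(rng_seed)  # fixed seed keeps the run reproducible
    crashes = []
    for _ in range(iterations):
        case = mutate(seed, rng)
        try:
            target(case)
        except Exception as exc:
            crashes.append((case, type(exc).__name__))
    return crashes

crashes = fuzz(parse_header, b"HD\x04\x00data")
print(len(crashes), "failing inputs found")
```

Even this crude loop quickly finds inputs the parser rejects; the interesting cases in real targets are the ones it silently mishandles instead.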
From the vantage point of the applied end of the security industry, this is a high-level view of the state of software vulnerability discovery in 2016. While much progress has been made in the last 20 or so years, there is still much to be done around enhancing consistency, coverage, automation and vulnerability discovery techniques.