The most important, and perhaps obvious, lesson is that software flaws are here to stay, said Peter Mell, a senior computer scientist at the National Institute of Standards and Technology (NIST) and the creator of the National Vulnerability Database (NVD), one of the four databases surveyed. “The problem of people breaking into computers is not going away any time soon,” Mell said. “There are certainly more patches every year that system administrators need to install, but the caveat is that more vulnerabilities seem to apply to less important software.”
NIST created the National Vulnerability Database in 2005, and software makers and security service providers have cooperated to create the Common Vulnerability Scoring System (CVSS), a standardized measure of the severity of software flaws. The NVD finished scoring the flaws in its database with the CVSS in late November.
Four databases were surveyed: the Computer Emergency Response Team (CERT) Coordination Center’s database, the National Vulnerability Database (NVD), the Open Source Vulnerability Database (OSVDB), and the Symantec Vulnerability Database.
The number of flaws cataloged by each database in 2005 varied widely because of differing definitions of what constitutes a vulnerability and differing editorial policies. The OSVDB, which counted the highest number of flaws in 2005 at 7,187, breaks vulnerabilities down into their component parts, so what another database might classify as a single flaw can be assigned multiple entries.
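To make the counting difference concrete, here is a minimal sketch of the two editorial policies. The advisory data and field names are invented for illustration and do not come from any real database feed; the point is only that the same flaw can produce one entry or several depending on how it is counted:

```python
# Hypothetical advisory: one security flaw that affects three components
# of the same product. All names here are made up for illustration.
advisories = [
    {
        "id": "EXAMPLE-2005-0001",
        "components": ["ftp server", "web admin console", "logging daemon"],
    },
]

def count_per_advisory(entries):
    """One database entry per advisory, regardless of affected components."""
    return len(entries)

def count_per_component(entries):
    """One database entry per affected component (OSVDB-style splitting)."""
    return sum(len(entry["components"]) for entry in entries)

print(count_per_advisory(advisories))   # 1 entry under a per-advisory policy
print(count_per_component(advisories))  # 3 entries under a per-component policy
```

The same underlying flaw inflates one database’s totals threefold relative to the other’s, which is one reason raw counts across databases cannot be compared directly.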
Symantec’s SecurityFocus database had the lowest count, at 3,766 vulnerabilities.
The variations in editorial policy, the lack of cross-referencing between databases, and unmeasurable biases in the research community and in disclosure policy mean that the databases, or refined vulnerability information (RVI) sources, do not produce statistics that can be meaningfully compared, Steve Christey, the editor of the Common Vulnerabilities and Exposures (CVE) list, wrote in an e-mail to security mailing lists on Thursday. The CVE is a dictionary of security issues compiled by The MITRE Corp., a government contractor and nonprofit organization. “In my opinion, RVI sources are still a year or two away from being able to produce reliable, repeatable, and comparable statistics,” he wrote. “In general, consumers should treat current statistics as suggestive, not conclusive.”

Recent numbers produced by the U.S. Computer Emergency Readiness Team (US-CERT) revealed some of the problems with refined vulnerability sources. Some mainstream media outlets took the figure, compared it to the CERT Coordination Center’s previous data, which is compiled from a different set of vulnerability reports, and concluded that vulnerabilities had increased 38 percent in 2005 over the previous year. In fact, discounting the updated reports yielded a 41 percent decrease, to 3,074 vulnerabilities, according to an analysis by Alan Wylie, an independent computer programmer. If that data point could be compared with statistics from CERT/CC, it would have placed the number of flaws reported in line with the previous three years.

Mell conducted an informal survey of entries for flaws in products from well-known companies and found that six of 14 software makers had seen the number of vulnerability reports double, while another four firms saw the number of reports decrease.
http://www.securityfocus.com/news/11367?ref=rss