Poor coding puts business applications at risk of hacking

Poor code quality is exposing more than two-thirds of retail and financial services applications to cyber-attacks similar to the Heartbleed bug, according to new research.

Software analysis firm CAST says new findings from its ongoing research into application software health show that a growing number of data breaches and security incidents can be directly linked to poor software design.

The firm found that 70 per cent of retail and 69 per cent of financial services applications featured data input validation violations, a form of coding error central to the Heartbleed bug that exposed over 60 per cent of the internet’s servers to hackers earlier this year.

In the Heartbleed case, a missing input validation check in the heartbeat extension of the Transport Layer Security (TLS) encryption protocol allowed hackers to claim they had sent more data than they actually had. The server then replied with more data than it should have, exposing the contents of the victim machine’s memory.

CAST executive vice president Lev Lesokhin, who led the analysis, said: “So long as IT organizations sacrifice software quality and security for the sake of meeting unrealistic schedules, we can expect to see more high-profile attacks leading to the exposure and exploitation of sensitive customer data.

“Businesses handling customer financial information have a responsibility to improve software quality and reduce the operational risk of their applications, not only to protect their businesses, but ultimately their customers.”

According to Verizon’s 2014 Data Breach Investigations Report, input validation flaws were exploited in 80 per cent of attacks against applications last year in the retail industry alone – including the record-breaking eBay data breach, which resulted in hackers gaining access to over 145 million user records.

CAST’s biennial CRASH Report found that government IT projects had the highest percentage of applications without any input validation violations at 61 per cent, while independent software vendors came in dead last at 12 per cent.

The data showed that the financial services industry has the highest number of input validation violations per application (224) even though their applications, on average, are only half as complex as the largest application scanned.

The research team also found a significant correlation between application robustness (an application’s ability to avoid failures) and application security.

Dr Bill Curtis, chief scientist at CAST and author of the report, said: “Some security experts argue software security is different from software quality and should be treated separately. The CRASH Report data proves this is false. Badly-constructed software won’t just cause systems to crash, corrupt data, and make recovery difficult, but also leaves numerous security holes.”
