The Stack Archive

Nearly all websites still use Google as an after-the-fact virus alert service

Tue 19 Apr 2016

A new report from Google into incidents of web hijacking highlights research showing that only 6% of websites have any kind of security scanning capable of flagging infections, SQL injections and other hacking attacks before Google itself begins to recognise and flag the problem in search results – by which time the compromised sites have already been exposing their users to risk.

The report, entitled Remedying Web Hijacking: Notification Effectiveness and Webmaster Comprehension [PDF], paints a comprehensive picture of the current state of website hacking, covering 760,935 hijacking incidents that occurred between July 2014 and June 2015.

Common attack vectors include drive-by downloads, scams and cloaking redirects, and fall into two categories, each of which receives a different warning flag in Google search results. Search Quality flags relate to incidents in which a site has been hijacked in order to insert content linking to disreputable locations, with the aim of stealing ‘link juice’ (domain authority) and actual traffic; these are flagged in results with the legend ‘This site may be hacked’. Safe Browsing flags indicate actively malicious content, such as binary-pushing JavaScript or attempts to break out of the browser sandbox, and these are signified in search results with ‘This site may harm your computer’.
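As an illustration of the Search Quality category, a webmaster could diff a known-good snapshot of a page against the live version and look for newly injected outbound link hosts. The following is a hypothetical sketch using only the Python standard library – the function names and the baseline-snapshot workflow are our own assumptions, not anything described in the report:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class LinkExtractor(HTMLParser):
    """Collects the host names of all <a href> links in a page."""

    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    host = urlparse(value).netloc
                    if host:
                        self.hosts.add(host)


def injected_link_hosts(baseline_html: str, live_html: str) -> set:
    """Return link hosts present in the live page but absent from the
    known-good baseline -- a crude signal of injected spam links."""
    def hosts(html):
        parser = LinkExtractor()
        parser.feed(html)
        return parser.hosts

    return hosts(live_html) - hosts(baseline_html)
```

In practice an attacker may also cloak the injected content (serving it only to crawlers), so a real check would need to fetch the page with a crawler-like User-Agent as well as a normal one; this sketch only covers the simple visible-injection case.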

The report, led by Frank Li of UC Berkeley, indicates how heavily webmasters rely on Google to find and identify such attacks, citing a study [PDF] by StopBadware and CommTouch which surveyed over 600 webmasters of compromised websites and found that only 6% of them employed any proactive monitoring of their domains. Instead, 49% of the affected webmasters were notified by Google’s added warnings in search results, and another 35% via third-party reporting channels, including warnings from friends and notifications from their hosting provider.
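For context on what even minimal proactive monitoring might look like, here is a hypothetical sketch of change detection based on content hashing: each run fingerprints the site’s pages and flags any page whose digest differs from the previous run. The function names, the baseline file and the choice of SHA-256 are illustrative assumptions, and a real deployment would still need to distinguish legitimate edits from injected content:

```python
import hashlib
import json
from pathlib import Path
from urllib.request import urlopen

BASELINE_FILE = Path("page_baselines.json")  # hypothetical digest store


def fingerprint(body: bytes) -> str:
    """SHA-256 digest of a page body."""
    return hashlib.sha256(body).hexdigest()


def check_pages(pages: dict, baselines: dict):
    """Compare each page's current digest against its stored baseline.

    pages: URL -> current page bytes; baselines: URL -> previous digest.
    Returns (list of URLs that changed, updated baseline mapping).
    """
    changed, updated = [], dict(baselines)
    for url, body in pages.items():
        digest = fingerprint(body)
        if url in baselines and baselines[url] != digest:
            changed.append(url)
        updated[url] = digest
    return changed, updated


if __name__ == "__main__":
    # A cron job could run this daily: fetch the pages, compare against
    # the last run, persist the new digests, and alert on any change.
    baselines = (
        json.loads(BASELINE_FILE.read_text()) if BASELINE_FILE.exists() else {}
    )
    bodies = {u: urlopen(u).read() for u in ["https://example.com/"]}
    changed, baselines = check_pages(bodies, baselines)
    BASELINE_FILE.write_text(json.dumps(baselines))
    for url in changed:
        print("ALERT:", url, "changed since last check")
```

Even a crude check like this would put a site ahead of the 94% of surveyed webmasters who, per the StopBadware/CommTouch study, had no monitoring at all.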

The Google report depicts an internet dominated by major content management systems (especially WordPress, Joomla and Drupal) which are under constant attack because of their market share, but where whatever security provisions may be in place – if any – are not backed up by active and ongoing scrutiny. The report comments on who should actually be taking responsibility for website security and the resolution of successful attacks:

‘Site operators are best positioned to redress hijacking incidents, but our and prior work has shown that webmasters are often unaware their site is compromised until an outside alert. Alternatively, hosting providers own the serving infrastructure for compromised sites, of which security scanning could be a service. However, doing so comes at a financial cost or technical burden; today, few providers scan for harmful content or [vulnerabilities]. Finally, ISPs, browser vendors, and search engine providers can enact incentives to spur action, but ultimately they possess limited capacity to directly help with remediation. These factors—representing the decentralized ideals of the Internet—make web compromise more challenging to address than account or credit card breaches where a centralized operator responds.’

The Google study shows that popular sites – harder targets with higher potential yields – generally respond to search-based browser warnings about site safety within 14 days, often resolving the problem within 24 hours, but must then await Google’s fortnightly scanning cycle before the changes are registered. Search Quality infections do not follow the same fixed timetable, since Google handles these on a per-case basis, with webmasters requesting re-evaluation as necessary; the report notes that resolving such issues can take several months.

However, attackers who are seeking to make sites infectious (rather than to leverage their popularity) have a far greater preference for low-ranked, less popular sites: these still have enough residual traffic to make the attack worthwhile, but are far less likely to be under scrutiny from their webmasters.
