Thursday, October 17, 2013

Hackers use botnet to scrape Google for vulnerable sites



Some 35,000 sites running vBulletin, a popular web forum package, were recently hacked by attackers exploiting files left over from the program's installation process, according to security researcher Brian Krebs.


The hack by itself is fairly standard, but the way in which it was carried out shows how search engines like Google can unwittingly become a party to such hacking.


Krebs' findings emerged in conjunction with work by the security research firm Imperva, whose researchers believe the hacks are being executed by a botnet. The botnet not only injects malicious code into the target sites, but also scrapes Google in a massively parallel fashion, looking for vBulletin-powered sites that might make good targets.


Why scrape Google in parallel? As a workaround for Google's defense mechanisms against automated searches.


Such defenses work well against a single user scraping Google: after a certain number of automated searches from one host, the user is presented with a CAPTCHA, which stops most bot-driven scrapes. But when a great many such searches run in parallel, it doesn't matter that each host eventually runs afoul of a CAPTCHA. If each host completes even a few dozen searches before being blocked, thousands of hosts together can harvest far more results than any single system could. (Krebs did not describe the size of the botnet used, however.)
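
To make that concrete, here is a minimal, purely illustrative Python sketch of why parallelism defeats a per-host CAPTCHA wall. Every number in it is invented (the article gives neither the botnet's size nor Google's thresholds), and the actual searching is omitted:

    from concurrent.futures import ThreadPoolExecutor

    # Invented figure: suppose each host completes about 50 searches before
    # Google serves it a CAPTCHA and blocks further queries.
    QUERIES_BEFORE_CAPTCHA = 50

    def bot_scrape(bot_id: int) -> int:
        """Stand-in for one compromised host searching until it is blocked."""
        completed = 0
        for _ in range(QUERIES_BEFORE_CAPTCHA):
            # ... issue one search and collect result URLs (omitted) ...
            completed += 1
        return completed  # this host is now stuck at a CAPTCHA

    # 10,000 hypothetical bots x 50 searches apiece = 500,000 searches total,
    # even though every individual host eventually hit the CAPTCHA wall.
    with ThreadPoolExecutor(max_workers=100) as pool:
        total = sum(pool.map(bot_scrape, range(10_000)))
    print(f"Searches completed before every host was blocked: {total}")

The defense caps each host, not the fleet; an attacker with a botnet simply brings more hosts.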


The hacks themselves, of which Krebs has identified two, are fortunately rather easy to detect. One adds surreptitious admin accounts to the vulnerable vBulletin installations; the other, "apparently used in a mass website defacement campaign," adds an admin account named "Th3H4ck".
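
In either case, detection comes down to auditing the forum's administrator list for accounts the owner never created. Below is a minimal sketch of such an audit, run against an in-memory stand-in for vBulletin's user table so it is self-contained; the real table lives in MySQL, and treating usergroup 6 as the default Administrators group is an assumption about a stock install:

    import sqlite3

    # Illustrative stand-in for vBulletin's `user` table (really in MySQL).
    # Assumption: usergroupid 6 is the Administrators group on a stock install.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE user (userid INTEGER, username TEXT, "
        "usergroupid INTEGER, joindate INTEGER)"
    )
    conn.executemany(
        "INSERT INTO user VALUES (?, ?, ?, ?)",
        [
            (1, "site_owner", 6, 1262304000),   # legitimate admin
            (2, "regular_user", 2, 1262305000),
            (3, "Th3H4ck", 6, 1381968000),      # rogue admin from the attack
        ],
    )

    # The audit itself: list every administrator account and its join date,
    # so any account the site owner does not recognize stands out.
    for userid, username, joindate in conn.execute(
        "SELECT userid, username, joindate FROM user WHERE usergroupid = 6"
    ):
        print(userid, username, joindate)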


Now the good news: The very thing that made it possible to find those vulnerable vBulletin sites -- a properly crafted Google search -- can also be used to identify hacked vBulletin installs. If you see a site you know in those search results, tell its administrator; there's a good chance they don't know they've been hacked.
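
Administrators need not wait to stumble across their own site in those results; the same leftover paths can be probed directly. A minimal sketch, assuming a hypothetical forum address, which checks whether the installer directories this attack targets are still reachable (an HTTP 200 means they should be deleted):

    import urllib.error
    import urllib.request

    SITE = "http://forum.example.com"  # hypothetical; substitute your own site
    LEFTOVER_PATHS = ["/install/", "/core/install/"]

    for path in LEFTOVER_PATHS:
        try:
            resp = urllib.request.urlopen(SITE + path, timeout=10)
            print(f"WARNING: {path} is reachable (HTTP {resp.status}); remove it")
        except urllib.error.HTTPError as err:
            print(f"OK: {path} returned HTTP {err.code}")
        except urllib.error.URLError as err:
            print(f"Could not reach {path}: {err.reason}")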


Scanning for vulnerabilities with Google isn't by itself new; Bruce Schneier pointed out in 2008 how this process was not only possible but could be automated. But deploying such Google scanning via a botnet to seek out vulnerable sites in a massively parallel operation is a relatively new wrinkle, and one likely to persist until Google finds a way to block such activity en masse without impacting regular search services.


Krebs points out it's difficult to place the blame exclusively on vBulletin. The makers of the software point out that its installation instructions ask that users remove the "/install" and "/core/install" directories after setting up the program.
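
For reference, the cleanup those instructions call for is a one-time deletion after setup is complete. A minimal sketch, assuming a hypothetical document root:

    import shutil
    from pathlib import Path

    DOCROOT = Path("/var/www/forum")  # hypothetical; adjust to your server

    # Delete the installer directories vBulletin's instructions say to remove
    # once the forum is set up.
    for leftover in ("install", "core/install"):
        target = DOCROOT / leftover
        if target.is_dir():
            shutil.rmtree(target)
            print(f"Removed {target}")
        else:
            print(f"Already absent: {target}")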


In that sense, this issue is akin to the ways ColdFusion projects have been plagued by break-ins -- in part because many outfits are running older, unpatched versions of the software, but mainly because many firms don't follow Adobe's own instructions for hardening ColdFusion setups.


The oft-targeted WordPress has the same issue: It's easy to set up, but securing it requires the end user to take a number of hardening steps that are often skipped.


This story, "Hackers use botnet to scrape Google for vulnerable sites," was originally published at InfoWorld.com. Get the first word on what the important tech news really means with the InfoWorld Tech Watch blog. For the latest developments in business technology news, follow InfoWorld.com on Twitter.


Source: http://www.infoworld.com/t/hacking/hackers-use-botnet-scrape-google-vulnerable-sites-228799?source=rss_infoworld_blogs