Company Creates “Crash Test Dummy” for the Web
Don’t know why no one thought of this before. A company has created a series of automated bots that surf the net, download any free software offered, and fill out web forms with unique email addresses, all in an attempt to find out which sites are safe and which will bombard users with spam or infect them with spyware and trojans.
This kind of bot is called a “HoneyMonkey.” It’s akin to a “honeypot,” a fake system intended to get hacked so observers can study hackers’ techniques, and a “honeynet,” a whole network designed for the same fate. A “HoneyMonkey” is a fake user, designed to get screwed over by bad websites so the malware on those sites can be catalogued.
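To make the idea concrete, here’s a minimal sketch, my own and not SiteAdvisor’s or Microsoft’s code, of how such a bot might tag each site with a unique bait e-mail address before filling out its forms. The domain, URL, and form field are all hypothetical:

```python
# Hypothetical honeymonkey-style probe: a sketch, not SiteAdvisor's actual code.
import hashlib
import urllib.parse
import urllib.request

BAIT_DOMAIN = "bait.example.com"  # assumed: a mail domain we control and monitor

def bait_address(site_url: str) -> str:
    """Derive a unique e-mail address for one site; spam sent to it implicates that site."""
    tag = hashlib.sha256(site_url.encode("utf-8")).hexdigest()[:12]
    return f"probe-{tag}@{BAIT_DOMAIN}"

def probe_signup(site_url: str, form_path: str = "/signup") -> int:
    """POST a registration form with the bait address; return the HTTP status code."""
    form = urllib.parse.urlencode({"email": bait_address(site_url)}).encode("ascii")
    request = urllib.request.Request(site_url.rstrip("/") + form_path, data=form)
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status

if __name__ == "__main__":
    url = "http://free-screensavers.example.com"  # hypothetical target
    print(bait_address(url))  # prints a unique probe-...@bait.example.com address
    # probe_signup(url)       # commented out: the example site doesn't exist
```

Because each address maps back to exactly one site, any spam that later arrives at it identifies the site that leaked or sold it.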
From SecurityFocus:
SAN FRANCISCO — File-sharing software that installs adware, Web sites that attempt to compromise a visitor’s computer, and free downloads that install a host of other unwanted software — the Web has become a confusing, and sometimes dangerous, place for the average home user.
[Photo caption: Tom Pinckney, vice president of engineering and co-founder, SiteAdvisor]

A group of graduates from the Massachusetts Institute of Technology (MIT) aim to change that by crawling the Web with hundreds, and soon thousands, of virtual computers that detect which Web sites attempt to download software to a visitor’s computer and whether giving out an e-mail address during registration can lead to an avalanche of spam.
“We put a robot ‘you’ in first, and then you can see if it comes out with a bunch of arrows stuck in it,” Pinckney said. “By using virtual yous, we end up measuring behaviors in a more holistic approach, creating a list of sites that are good and a list of sites that have dubious behavior.”
The service builds on research into data collected through legions of computers that automatically browse the Internet, downloading free programs and filling out forms. Microsoft has used such client-side honeypots, which the software giant’s researchers call honeymonkeys, to browse the riskier side of the Web for servers that use zero-day exploits against visitors. The University of Washington used similar techniques to survey the Web and found that one in twenty executables on the Internet contains spyware.
SiteAdvisor wanted to take the model further, so it created virtual computers that act as the most permissive of users, downloading any offered program and entering uniquely identifiable e-mail addresses into any forms. The result is that the service can keep track of the consequences of visiting a site and giving that site information. So far, the company’s browsing automatons, virtual Windows machines running on Linux computers, have browsed more than 1.6 million Web sites.
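The article doesn’t describe SiteAdvisor’s internals, but one plausible way to “keep track of the consequences” of a visit is to snapshot the virtual machine’s disk before and after a browsing session and diff the two. This is an assumed approach, not their actual tooling, and the mount point is hypothetical:

```python
# Assumed approach, not SiteAdvisor's actual tooling: diff a disk snapshot
# taken before a browsing session against one taken after it.
from pathlib import Path

def snapshot(root: str) -> set[str]:
    """Record every file path under root (a mounted guest disk, for instance)."""
    return {str(p) for p in Path(root).rglob("*") if p.is_file()}

def dropped_files(before: set[str], after: set[str]) -> list[str]:
    """Files that appeared during the visit: candidate drive-by payloads."""
    return sorted(after - before)

if __name__ == "__main__":
    guest_disk = "/mnt/vm-disk"  # hypothetical mount point for the VM's disk image
    before = snapshot(guest_disk)
    # ... drive the browser inside the virtual machine here ...
    after = snapshot(guest_disk)
    for path in dropped_files(before, after):
        print("new file after visit:", path)
```

Running the browser inside a disposable VM means each probe starts from a clean state, so anything that shows up in the diff was put there by the site under test.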
The company plans to offer consumers a free basic service and make its money by charging for a premium version.
“If your car had an EULA 20 pages long that allowed the manufacturer to wallpaper your house in Exxon ads, would you go for that?” he asked. “That is what these sites are trying to do to our computers.”
“Because there are some bad screensaver sites out there, no one is downloading free screensavers anymore,” he said. “This could help the good sites.”
If this type of rating system catches on, sites that host questionable code or spyware will be forced to clean up their acts, and in the long run that kind of pressure will be good for the whole internet community. Of course, SiteAdvisor will likely be gobbled up by one of the big AV companies before long and bundled into one of their comprehensive offerings. Better yet, such a service could be picked up by a portal company such as AOL or Earthlink.