CyberPatrol Blog

Follow Our Blog for News and Advice on Creating a Safe and Secure Online Experience

June 25, 2009

Block Viruses Distributed by Web Pages

Filed under: Computer Security, Education/Library, SMB, computer virus, schools — BarbR @ 3:42 pm

Computer Security for Schools and Small Businesses

For a small-to-medium enterprise such as a business, school, or library, protecting the computer network is not easy.  Hackers are constantly concocting new ways to infect the network with viruses and other malware by way of the web pages that network users visit.  Although the enterprise can choose from an array of tools to protect its network, those tools can be expensive and cumbersome, and no tool or combination of tools is perfect.  Finding the right mix of cost, effectiveness, and ease of use is a problem.

To address this problem, CyberPatrol has developed a smart service for steering network users away from dangerous web sites.  Known as SiteSURV, the service relies on CyberPatrol’s SiteCAT system, which constantly crawls (spiders) the web to assess and categorize web pages.  The service provides two layers of filtering.  One layer examines sites according to their content and purpose, and blacklists those that appear to be dangerous.  The second layer analyzes the files and downloads from each site to determine whether they contain signatures of known malware.
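
For readers who like to see the idea in code, here is a minimal sketch of what two-layer filtering can look like.  The category lookup, the signature set, and the function names are placeholders of our own for illustration, not SiteSURV’s actual implementation.

```python
import hashlib

# Layer 1: site categories blocked on sight (content and purpose).
BLOCKED_CATEGORIES = {"adult", "parked_domain", "warez"}
# Layer 2: signatures (here, SHA-256 hashes) of known-bad files.
KNOWN_MALWARE_HASHES: set[str] = set()

def site_category(url: str) -> str:
    """Stand-in for a category lookup built from crawled pages."""
    return "parked_domain" if "parked" in url else "general"

def allow_request(url: str, downloaded_files: list[bytes]) -> bool:
    # Layer 1: refuse sites in risky categories outright.
    if site_category(url) in BLOCKED_CATEGORIES:
        return False
    # Layer 2: refuse if any file from the site matches a known signature.
    return not any(
        hashlib.sha256(data).hexdigest() in KNOWN_MALWARE_HASHES
        for data in downloaded_files
    )

print(allow_request("http://parked.example.test/", []))     # False: category blacklist
print(allow_request("http://news.example.test/", [b"ok"]))  # True: no match
```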

We asked Chris Overton, VP of CyberPatrol, to explain these two layers of protection.  First, he highlighted the security achieved just by keeping users away from sites of questionable content:  “Certain types of sites tend to deliver malware more than others.  Along with adult and XXX sites, ‘parked domains’ and ‘warez’ sites are more likely to deliver malware than other site categories.  We know this because files pulled from these sites have a higher percentage of malware infection than files from other sites.  So, we can infer that preventing access to these dangerous site categories will advance the fight against malware infections.  Preventing access to a dangerous site protects against all the malware at that site, regardless of whether anyone has developed signatures to detect any or all of the different malware there.”
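
The inference Chris describes can be pictured as a simple calculation: tally the infection rate of scanned files per category, then treat categories above a chosen cutoff as dangerous.  The sample data and threshold below are ours, purely for illustration.

```python
from collections import defaultdict

scans = [
    # (site category, whether the scanned file was infected)
    ("warez", True), ("warez", True), ("warez", False),
    ("parked_domain", True), ("parked_domain", False),
    ("news", False), ("news", False), ("news", False),
]

totals, infected = defaultdict(int), defaultdict(int)
for category, is_infected in scans:
    totals[category] += 1
    infected[category] += int(is_infected)

THRESHOLD = 0.30  # hypothetical cutoff for a "dangerous" category
for category in totals:
    rate = infected[category] / totals[category]
    status = "BLOCK" if rate > THRESHOLD else "allow"
    print(f"{category:15s} infection rate {rate:.0%} -> {status}")
```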

Chris further described what SiteCAT does when it crawls a web site:  “SiteCAT’s algorithms analyze a web site based on several factors – content, structure, link count, link references, and so on.  Based on this analysis, our system decides which pages/files to download from that site.  Typically we’ll download the main index page of a site and analyze it; then our algorithms decide how much deeper to dig.  All files we want to analyze are pulled by the crawler and saved into our analysis archive.  Then the files feed into a malware detection engine, which looks for the signatures of malware such as a virus or a worm.  If we detect any malware when we crawl the site, we can blacklist it and prevent all of the malware the site might deliver, even malware that we have not specifically detected.”
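
The flow Chris outlines can be sketched roughly as follows: fetch a site’s index page, sample some of the files it links to, and run each through a signature check.  The link extraction, depth limit, and hash-based scan here are simplified stand-ins, not SiteCAT’s actual algorithms.

```python
import hashlib
import re
from urllib.parse import urljoin
from urllib.request import urlopen

KNOWN_MALWARE_HASHES: set[str] = set()  # would be fed by a signature database

def looks_infected(data: bytes) -> bool:
    """Signature check stand-in: hash the file and compare to known hashes."""
    return hashlib.sha256(data).hexdigest() in KNOWN_MALWARE_HASHES

def crawl_site(index_url: str, max_files: int = 5) -> bool:
    """Return True if any sampled file from the site appears infected."""
    index_html = urlopen(index_url, timeout=10).read()
    if looks_infected(index_html):
        return True
    # Naive href extraction stands in for SiteCAT's structural analysis,
    # which decides how much deeper to dig on each site.
    links = re.findall(rb'href="([^"]+)"', index_html)[:max_files]
    for link in links:
        try:
            data = urlopen(urljoin(index_url, link.decode(errors="ignore")), timeout=10).read()
        except Exception:
            continue  # skip resources that fail to download
        if looks_infected(data):
            return True
    return False
```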

In other words, SiteSURV allows an enterprise to adopt a conservative, one-strike-and-you’re-out approach toward web sites.  If a site either contains suspicious content or shows a single instance of infection, the enterprise can block it entirely.
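
Put as a toy rule, the one-strike policy looks like this: a single suspicious signal, whether a risky category or one infected file, blacklists the whole domain.  The inputs and names are hypothetical, not part of SiteSURV’s interface.

```python
blacklist: set[str] = set()

def review_site(domain: str, category: str, infected_file_found: bool) -> None:
    # One strike (risky category or a single infected file) blocks every page on the site.
    if category in {"adult", "parked_domain", "warez"} or infected_file_found:
        blacklist.add(domain)

review_site("warez.example.test", "warez", False)
review_site("news.example.test", "news", False)
print(blacklist)  # {'warez.example.test'}
```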
