
Referer Spam

Monday, November 12, 2007

In writing my blogs and running my server, I've made an effort - from a technical standpoint - to streamline operations as much as possible. I've also made an attempt to make information about this site as public as possible, for the benefit of my reader(s). That's why, for instance, I've made my web stats available for all to see.

In doing this, however, I've run up against a particularly frustrating type of spam: referer spam. In short, there are herds of maliciously controlled computers crawling the web, looking for sites (such as mine) that publish their stats. They repeatedly request a page while claiming - in the Referer field of the HTTP request header - that they were sent to it by some website they're trying to promote. Since I publish my stats, my pages then show that a bunch of people came to my website via a link on "besttexasholdem.com" or "livenudegirls.com" or some other site that I can assure you doesn't actually link to me. This benefits them in Google's rankings, since suddenly a bunch of pages appear to link to them.
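To make the trick concrete, here's roughly what one of these forged requests looks like on the wire (the page and headers are illustrative, not taken from my actual logs) - a perfectly ordinary request except that the Referer is a lie:

```
GET / HTTP/1.1
Host: www.walkingsaint.com
Referer: http://besttexasholdem.com/
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)
```

The server has no way to verify the Referer header; it simply records whatever the client sends, which is exactly what the spammers count on.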

Now, despite the fact that I generally don't let search engines crawl the stats pages (using the robots.txt file), that only eliminates the benefit for the referer bots - it doesn't stop the requests (nor does it keep them from filling my log files with crap that still shows up in the stats). So, after ignoring the problem for months (if not years), I finally got off my butt and implemented something that should cut down on it considerably.
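For reference, keeping crawlers out of the stats pages takes only a couple of lines in robots.txt (the /stats/ path here is just an example; the actual path on my server may differ):

```
User-agent: *
Disallow: /stats/
```

Of course, this only works for well-behaved crawlers like Google's - the spam bots themselves ignore robots.txt entirely, which is why it doesn't stop them.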

Using a combination of mod_security2 and mod_setenvif (both modules for Apache), I've managed to deny web access to most of the bots, and I don't even log the accesses anymore. If the Referer field of the HTTP request header matches any substring in a list I maintain, mod_security2 sends back an error status of 412, which denies access but would normally still be logged. I then use mod_setenvif with that same list to set an environment variable, which I tell the access log to ignore. This solves all my problems - except that I'll have to periodically check for new patterns in referer spam headers.
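A minimal sketch of how the two modules fit together, assuming Apache 2.x with mod_security2 and mod_setenvif loaded - the substring pattern and log path are placeholders, not my actual configuration:

```apache
# mod_security2: deny any request whose Referer matches a spam substring.
# The 412 status refuses the request; the pattern list here is illustrative.
SecRuleEngine On
SecRule REQUEST_HEADERS:Referer "holdem|nudegirls" \
    "phase:1,deny,status:412,id:1001"

# mod_setenvif: mark the same requests with an environment variable...
SetEnvIfNoCase Referer "holdem|nudegirls" referer_spam

# ...and keep anything so marked out of the access log.
CustomLog /var/log/apache2/access.log combined env=!referer_spam
```

The key design point is that the same substring list drives both directives: mod_security2 rejects the request, and the env=!referer_spam clause on CustomLog keeps the rejected request from ever reaching the log, so it can't pollute the published stats.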

I guess it's good enough for now, and it's a bit of peace of mind, too.


Blogger Laura said...

i didn't pay attention to any of that. you know why? Cus you shaved.
I was momentarily confused and wondered whose blog I'd stumbled upon.

9:22 PM, November 12, 2007  



