
Hackers really annoying me... Time to start banning countries?


Recommended Posts


Guest So Called
So if you can identify a spammer by ip as soon as a page is requested you could give him a 404 error, right?
All categories, not just spammers: bad web crawlers (ones that ignore my robots.txt file), what I call domain sniffers (sites that want to sell my URL to somebody else if I forget to renew it), etc. I posted about all the various categories elsewhere in this forum. I can use the IP address, client domain, user agent string, any number of ways to categorize traffic into human and not human, and into good web crawlers vs. any other scripted activity that is negative behavior in my opinion. Yeah, I have a wide range of options:

  • silently drop the connection
  • send 404 Not Found
  • send null <HTML></HTML>
  • send random text (you know the Lorem Ipsum thing?)
  • send 500 Internal Server Error
  • send 403 Forbidden
  • 301 forward them to a URL of my choosing

Currently I just drop connections that aren't legitimate human visitors or legitimate search engines indexing my site.
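For what it's worth, several of those options can be expressed as mod_rewrite rules in an .htaccess file. A minimal sketch, assuming Apache 2.x with mod_rewrite enabled; the IP ranges and the "BadBot" user agent string are placeholders, not real offenders:

```apacheconf
RewriteEngine On

# Send 403 Forbidden to a hypothetical bad IP range
RewriteCond %{REMOTE_ADDR} ^198\.51\.100\.
RewriteRule ^ - [F,L]

# Send 404 Not Found to a hypothetical bad user agent
# (non-3xx status codes on the R flag need Apache 2.2+)
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
RewriteRule ^ - [R=404,L]

# 301 forward a hypothetical range to a URL of your choosing
RewriteCond %{REMOTE_ADDR} ^203\.0\.113\.
RewriteRule ^ http://example.com/ [R=301,L]
```

Truly dropping the connection with no response isn't something mod_rewrite does on its own; that usually takes a script or firewall-level rules, which is presumably what the poster's setup does.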

The thing with .htaccess and mod_rewrite is that those accesses would disappear and I'd never know whether legitimate traffic was attempted. But since I log everything, I can see even the bans, and I have the opportunity to modify them.
But that's kind of my point: instead of nuking whole countries, you can permanently disable known malicious hosts and attackers, and thus keep your logs clear, while still allowing humans from those countries. This would also be a more efficient option, BTW. If it troubles you that a spam IP might become a legitimate one (which doesn't seem to be the case, but still...), you could set a timeout for each entry and periodically unban expired entries.
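The timeout idea could be as simple as storing a timestamp with each ban and periodically dropping the stale ones. A hypothetical sketch in Python; the (ip, banned_at) tuple format and the 30-day TTL are assumptions, not anything from the poster's actual setup:

```python
import time

BAN_TTL = 30 * 24 * 3600  # assumed policy: expire bans after 30 days


def prune_expired(entries, now=None):
    """entries: list of (ip, banned_at_unix) tuples.

    Returns only the entries still inside the TTL window;
    everything older gets unbanned by omission.
    """
    now = time.time() if now is None else now
    return [(ip, ts) for ip, ts in entries if now - ts < BAN_TTL]


# One stale entry (banned at epoch 0) and one fresh entry
bans = [("198.51.100.7", 0), ("203.0.113.9", time.time())]
active = prune_expired(bans)  # the stale entry drops out
```

Run from cron, this would rewrite the ban list on whatever schedule suits you.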

Guest So Called

I'm tired of writing ban after ban to lamers in the same country when the country produces no legitimate accesses. My subject material is clearly not interesting to humans in the countries I've banned, and if that changes I can identify the human traffic and rewrite the bans to allow the humans and be more specific about the lamers and their scripts. Keep in mind, this is an experiment. It's not like I'm denying access to Wikipedia for an entire country.


I'm tired of writing ban after ban to lamers in the same country when the country produces no legitimate accesses.
But you won't be writing any bans if you just put the automated rules in place once. The only thing you'll be adding is new referrer entries. And since only a few particular referrers are the problem, you could match any ".ru", ".md", etc. (other than google.*, that is) and ban any IP that sends such a referrer. Simple, effective, automated, and it allows legitimate traffic in.
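A rough sketch of that referrer match as mod_rewrite rules, assuming Apache with mod_rewrite enabled; the exact TLD list and the google.* exemption are just the examples from this thread:

```apacheconf
RewriteEngine On

# Exempt google.* referrers, then refuse requests whose referrer
# hostname ends in .ru or .md
RewriteCond %{HTTP_REFERER} !google\. [NC]
RewriteCond %{HTTP_REFERER} \.(ru|md)(/|$) [NC]
RewriteRule ^ - [F,L]
```

New problem TLDs or referrer domains become one-line additions to the second RewriteCond instead of per-IP bans.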

I'm tired of writing ban after ban to lamers in the same country when the country produces no legitimate accesses.
I normally go the route of saying I'm tired of analyzing logs and looking for things to ban when I'm not actually being attacked. I can sit in the server room and look at requests from Russia and China all day if I want to, but I have better things to do than look at attacks that don't succeed.

Guest So Called
But you won't be writing any bans if you just put the automated rules in place once. The only thing you'll be adding is new referrer entries. And since only a few particular referrers are the problem, you could match any ".ru", ".md", etc. (other than google.*, that is) and ban any IP that sends such a referrer. Simple, effective, automated, and it allows legitimate traffic in.
I can already do that from my script, but they just keep coming up with new URLs, and many of those URLs are not inside the banned countries. And that only solves the link spammer or log spammer problem. I don't see any reason why I should allow traffic from domain sniffers and vulnerability probers and all the other lamers. I guess my explanation that this is an experiment is not getting through. Part of the experiment is that I have three sites with absolutely no content, no reason why any human would ever want to visit those sites, and one site with content. At some level I can subtract the content accesses from the site with content, and compare what's left with the non-human traffic on the other three sites. That traffic falls into three categories: good web crawlers; bad web crawlers, lamers, and hackers; and a small amount of traffic I haven't identified yet. I'm just banning the bad traffic that appears in common across all four sites. In over 50,000 accesses in the last few years I've seen only 2-3 human accesses that were accidentally banned. I fixed my rules so that if that traffic repeats they'll get into the site.
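The cross-site comparison described here boils down to set intersection over client IPs. A hypothetical illustration in Python; the IP addresses are made-up documentation addresses, and the "visitors to a no-content decoy site are non-human" premise is the poster's, not a general rule:

```python
# Client IPs seen on each site (hypothetical data). The three decoy
# sites have no content, so anything hitting them is assumed non-human.
decoy1 = {"203.0.113.5", "198.51.100.7"}
decoy2 = {"203.0.113.5", "192.0.2.44"}
decoy3 = {"203.0.113.5"}
content_site = {"203.0.113.5", "192.0.2.99"}  # 192.0.2.99 is a real reader

# Bad traffic appearing in common across all four sites
common_bad = content_site & decoy1 & decoy2 & decoy3
```

Only the IP that shows up on the content site *and* every decoy gets flagged, which is why the real reader at 192.0.2.99 survives the filter.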

Guest So Called
I normally go the route of saying I'm tired of analyzing logs and looking for things to ban when I'm not actually being attacked. I can sit in the server room and look at requests from Russia and China all day if I want to, but I have better things to do than look at attacks that don't succeed.
Yeah, but you've got a job. I have nothing better to do than my hobbies. Today I'm dividing my attention between the Internet and taking care of my sick dog. And I don't necessarily even look at my logs every day. It's a hobby. It's an experiment.

