Hi all. Most websites have potential vulnerabilities, and hackers can probe for these loopholes and exploit your website easily. So in today's tutorial, we will show you how to protect your website from being scanned for vulnerabilities by third parties, and how to automatically block anyone who tries to scan it. So let's get started.
Steps to Set Up:
1. First of all, you need to create a directory to trap the scanner. Create a new directory in the root directory and name it trap (or whatever you want). When someone scans your website, they will access this directory and their IP will be blocked, which prevents any further scanning of your website. Please note that with this method not only guests but also members can be banned (see the whitelist sketch after the trap script below).
2. The next thing to keep in mind is that we don't want to block legitimate bots like the Google bot (and other search-engine crawlers). To make sure these bots never enter the trap directory, we must add this piece of code to the robots.txt file:
User-agent: *
Disallow: /trap/
So now these bots will ignore this directory and move ahead.
Now create a new index.php file in the trap/ directory and add this code:
<?php
// Record the visitor's user agent, the current time, and their IP address.
$agent = $_SERVER['HTTP_USER_AGENT'];
$datum = date("Y-m-d (D) H:i:s");
$ipad  = $_SERVER['REMOTE_ADDR'];
// The ban line that will be appended to .htaccess.
$ban  = "Deny from $ipad # /trap/ $agent - on $datum\r\n";
$file = "../.htaccess";
// Check whether this IP is already banned.
$search = file_get_contents($file);
$check  = strpos($search, $ipad);
if ($check === FALSE) {
    // Not banned yet: append the Deny rule to .htaccess.
    $open = @fopen($file, "a+");
    @fputs($open, $ban);
    echo "The IP <b>".$ipad."</b> has been blocked for an invalid attempt to access a file, directory or invalid scanning attempt.";
    @fclose($open);
} else {
    // Already banned: just show the message again.
    echo "The IP <b>".$ipad."</b> has been blocked for an invalid attempt to access a file, directory or invalid scanning attempt.";
}
exit;
?>
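Note: if you are worried about banning yourself or trusted members (as mentioned in step 1), you could add a small whitelist check at the very top of trap/index.php, before the code above. This is just a minimal sketch; the IP addresses below are placeholders, not real values:
<?php
// Minimal whitelist sketch - replace these placeholder IPs with your own.
$whitelist = array('127.0.0.1', '203.0.113.10');
if (in_array($_SERVER['REMOTE_ADDR'], $whitelist)) {
    // Deny the page for whitelisted visitors, but do not write a ban.
    exit('Access denied.');
}
?>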
Make sure you CHMOD .htaccess to 666 so that the script can append to it.
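If you can't change permissions via FTP or shell, a one-off PHP snippet like this sketch can do it (run it once from the directory containing .htaccess, then delete it; the path here is an assumption):
<?php
// One-off helper: make .htaccess writable (0666) so trap/index.php can append to it.
if (@chmod(__DIR__ . '/.htaccess', 0666)) {
    echo ".htaccess is now writable.";
} else {
    echo "chmod failed - set the permissions via FTP instead.";
}
?>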
$file - you need to change this depending on where you place your trap.
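For reference, after a scanner trips the trap, the entry appended to .htaccess will look roughly like this (the IP, user agent, and date shown are illustrative examples, not real values):
Deny from 203.0.113.45 # /trap/ Mozilla/5.0 (compatible; EvilScanner/1.0) - on 2015-06-01 (Mon) 13:37:00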
The setup is now complete; next we need to place the bait that lures scanners into your trap.
Edit the index file in your root directory and add this piece of code at the bottom:
<a href="/trap/"><img src="images/pix.JPG" border="0" alt=" " width="1" height="1"></a>
Since bots and scanners follow links while crawling, they will see this line of code, immediately travel into our /trap/ folder, and get trapped in it. When a bot or scanner tries to access the directory, it will get an "access denied" message and its IP will be banned. You can use this in many situations, e.g. for your admin directory where you have the ACP, so that no one can access it.
Last but not least, you should know that this is not a 100% security guarantee, as it can still be bypassed. In practice the chance is small, though: the vast majority of attempts come from script kiddies who scan the site and never figure out how to bypass this restriction. :)