True, but using a robots.txt file is far easier than creating rules to block bots. If the majority of the bots in question are "rogue", then a robots.txt file won't be of much help. If they aren't rogue, a robots.txt file can easily control which bots may crawl the site and which areas of the site they are permitted to crawl. Doing this doesn't sacrifice the benefits of allowing a bot to crawl the site if some search engine presence is desirable. Since some malicious bots ignore robots.txt, denying them access in .htaccess is more effective for those.
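For illustration, here's a minimal sketch of both approaches. The "BadBot" user-agent string is just a placeholder, and the .htaccess fragment assumes Apache with mod_setenvif enabled. A robots.txt in the site root asks well-behaved crawlers to stay out of certain areas:

```
# robots.txt — honored only by cooperative bots
User-agent: *
Disallow: /private/

# Ask a specific bot to stay away entirely
User-agent: BadBot
Disallow: /
```

For bots that ignore robots.txt, an .htaccess rule can actually refuse the request by matching the User-Agent header:

```
# .htaccess — flag requests whose User-Agent contains "BadBot"
SetEnvIfNoCase User-Agent "BadBot" bad_bot

# Deny flagged requests, allow everyone else
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

Note the difference: robots.txt is a request the bot can ignore, while the .htaccess rule is enforced by the server, though a bot that spoofs its User-Agent will still slip past a match like this.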