GermanPrince Posted September 21, 2009

robots.txt files are used to stop search engines from referencing certain files or directories. Say you don't want search engines to index "/scripts", "/data" or "/images". You would write the following:

User-Agent: *
Disallow: /scripts/
Disallow: /data/
Disallow: /images/

What if you don't want anyone BUT Google to index your images? You would use the following:

User-Agent: *
Disallow: /images/

User-Agent: Googlebot-Image
Allow: /images/

"#" is used for comments. Take a peek at the following for a better idea:

User-Agent: Googlebot-Image
# Disallows Google Images from indexing certain images
Disallow: /images/

Each search engine has its own bot name. It would be great if you guys listed some for people to use. That is how you use robots.txt files. This is subject to change.
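You can check how a well-behaved bot would interpret rules like the ones above using Python's standard-library robots.txt parser. A minimal sketch, assuming the second example from the post; the bot name "SomeBot" and the image path are made up for illustration:

```python
from urllib import robotparser

# The rules from the post: block /images/ for everyone,
# but allow it for Google's image crawler.
rules = """\
User-Agent: *
Disallow: /images/

User-Agent: Googlebot-Image
Allow: /images/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# An arbitrary bot falls back to the "*" group, which disallows /images/.
print(rp.can_fetch("SomeBot", "/images/photo.jpg"))          # False

# Googlebot-Image matches its own, more specific group, which allows it.
print(rp.can_fetch("Googlebot-Image", "/images/photo.jpg"))  # True
```

The key point the parser makes visible: a bot uses the most specific User-Agent group that matches its name, and only falls back to "*" if nothing else applies.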
Ingolme Posted September 22, 2009

For more information, visit http://www.robotstxt.org/
A list of search engine robots can be found here: http://www.robotstxt.org/db.html