Effective use of robots.txt
You can use robots.txt to provide an autodiscovery mechanism that lets spiders find
your XML Sitemap file. The search engines can be told where the file lives with one
simple line in robots.txt:

Sitemap: sitemap_location

The sitemap_location should be the complete URL to the Sitemap, such as
http://www.yourdomain.com/sitemap.xml. You can place this line anywhere in the file.
For full instructions on how to apply robots.txt, see Robots.txt.org. You may also
find it valuable to use Dave Naylor's robots.txt generation tool to save time and
heartache (http://www.davidnaylor.co.uk/the-robotstxt-builder-a-new-tool.html).
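Putting this together, a minimal robots.txt that allows full crawling and includes the Sitemap directive might look like the sketch below (the domain is a placeholder, not a real site):

```
# Apply to all spiders; an empty Disallow blocks nothing
User-agent: *
Disallow:

# Autodiscovery pointer to the XML Sitemap (must be the full URL)
Sitemap: http://www.yourdomain.com/sitemap.xml
```

The Sitemap directive is independent of any User-agent block, which is why it can sit anywhere in the file.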
You should use great care when making changes to robots.txt. A simple typing error
can, for example, suddenly tell the search engines to stop crawling any part of
your site. After updating your robots.txt file, it is always a good idea to check
it with the robots.txt testing tool in Google Webmaster Tools.
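To illustrate how small a dangerous typo can be: a single extra slash is the difference between blocking nothing and blocking the entire site. Both fragments below are syntactically valid robots.txt:

```
# Safe: an empty Disallow value blocks nothing
User-agent: *
Disallow:

# Dangerous: one added slash tells all spiders to crawl nothing
User-agent: *
Disallow: /
```

This is exactly the kind of error a testing tool will flag before the search engines act on it.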