Using robots.txt, you can ban specific robots, ban all robots, or block robot access to specific pages or areas of your site. If you are not sure what to type, look at the bottom of this page for examples.
An example of an SEO-optimized robots.txt file (it should work on most WordPress blogs; just edit the sitemap URL at the end):
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/

User-agent: Mediapartners-Google
Allow: /

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Image
Allow: /

User-agent: Googlebot-Mobile
Allow: /

# Replace with your own blog's sitemap URL
Sitemap: http://www.example.com/sitemap.xml
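For comparison, the simpler cases mentioned at the start of this post look like this. Each snippet below is a complete robots.txt on its own, and BadBot and /private/ are just placeholder names:

# Ban every robot from the whole site
User-agent: *
Disallow: /

# Ban a single robot (a made-up one called BadBot) from the whole site
User-agent: BadBot
Disallow: /

# Block one directory for every robot and leave the rest open
User-agent: *
Disallow: /private/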
When robots (like the Googlebot) crawl your site, they begin by requesting your site's robots.txt file (for example, http://www.yoursite.com/robots.txt) and checking it for instructions before fetching any other pages.
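If you want to double-check how a crawler would interpret the file above, Python's standard urllib.robotparser module applies the same rules. Here is a minimal sketch; the crawler name SomeBot and the example.com URLs are just placeholders, and only part of the example file is pasted in:

from urllib.robotparser import RobotFileParser

# A shortened copy of the example robots.txt from this post
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/

User-agent: Mediapartners-Google
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A generic crawler is kept out of the blocked directories...
print(parser.can_fetch("SomeBot", "http://www.example.com/wp-admin/"))               # False
print(parser.can_fetch("SomeBot", "http://www.example.com/2009/05/some-post.html"))  # True

# ...while the AdSense crawler (Mediapartners-Google) is allowed everywhere.
print(parser.can_fetch("Mediapartners-Google", "http://www.example.com/wp-admin/"))  # True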