A robots.txt file on a website instructs search engine spiders to access certain files or directories and ignore the rest. You won't want search engine spiders to waste time crawling useless setup files on your WordPress blog, right? Pointing spiders to the areas and files that matter puts more emphasis on your WordPress blog's SEO. But most of us don't know what to write in that robots.txt file, so here is a tutorial on writing a robots.txt file for your WordPress blog.
Remember that your visitors see only the website content that is visible to them, while spiders also see your site's directories. A WordPress installation creates directories in your site's root folder such as wp-admin and wp-includes. If you don't restrict spiders from accessing those directories and the files inside them, they will waste time crawling them and spend less time on your actual blog content (posts and pages). So it's highly recommended to keep a robots.txt file and use it with your WordPress blog.
How to write a robots.txt file
1. Open a new Notepad file.
2. Copy and paste the following lines into that Notepad file.
Sitemap: YOUR SITEMAP URL

User-agent: *
Disallow: /wp-content/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
Disallow: /cgi-bin/
Allow: /wp-content/uploads/

User-agent: Googlebot
Disallow: /*.pdf$
Disallow: /*.php$
Disallow: /*.js$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*.php*
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.txt$
Disallow: /*?*
Disallow: /wp-*
Disallow: */feed/
Disallow: */trackback/
Disallow: /cgi-bin/
Disallow: /go/
Allow: /wp-content/uploads/

User-agent: Googlebot-Image
Allow: /*
3. Replace YOUR SITEMAP URL with your sitemap's location (for example, https://example.com/sitemap.xml). Save the file as "robots.txt" on your computer. (You only have to type robots as the file name and choose to save it as a text file; Notepad will automatically add the .txt extension.)
4. Upload the robots.txt file to your website's root directory.
5. You are done.
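Optionally, you can confirm that the uploaded file behaves as expected with a quick check using Python's built-in urllib.robotparser. The sketch below is not part of the original tutorial: the domain example.com and the sample paths are placeholder assumptions for your own blog, and urllib.robotparser does not understand the * and $ wildcards that Googlebot supports, so it only sanity-checks the plain prefix rules.

# Rough check of a live robots.txt using only Python's standard library.
# example.com and the sample paths are placeholders, not values from the tutorial.
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"  # replace with your blog's address
parser = RobotFileParser(SITE + "/robots.txt")
parser.read()  # fetch and parse the live robots.txt

# These paths are disallowed for the generic "*" crawler, so expect False.
for path in ["/wp-admin/", "/wp-includes/", "/trackback/"]:
    print(path, "allowed:", parser.can_fetch("*", SITE + path))

# A normal post URL matches no Disallow rule, so expect True.
print("/my-first-post/", "allowed:", parser.can_fetch("*", SITE + "/my-first-post/"))

If the blocked paths print True, double-check that robots.txt really sits at your site's root and that the file name is all lowercase.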