How do I optimize my robots.txt file?

by Delhi Rocks   Last Updated June 26, 2018 03:04 AM

I have updated my robots.txt file and blocked some pages and directories on my site. Is this correct or not? When I search in Google, the blocked pages still keep showing.

User-agent: *
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /tag/
Disallow: /author/
Disallow: /category/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Allow: /*.js$
Allow: /*.css$

Sitemap: http://delhirocks.com.au/sitemap_index.xml
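One way to sanity-check which URLs these rules actually block is Python's standard-library `urllib.robotparser`. A minimal sketch follows, with two assumptions worth flagging: the `ROBOTS_TXT` constant is just a local copy of the rules for testing, and the wildcard `Allow: /*...` lines are left out because `urllib.robotparser` matches paths by literal prefix and does not implement Google-style `*`/`$` wildcards. The blank line after `User-agent:` is also removed, since under the original robots exclusion format a blank line ends a record.

```python
import urllib.robotparser

# Local copy of the question's rules, for testing only (ROBOTS_TXT is
# not part of any API). Wildcard Allow lines omitted: urllib.robotparser
# matches rule paths as literal prefixes, not Google-style wildcards.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /tag/
Disallow: /author/
Disallow: /category/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: http://delhirocks.com.au/sitemap_index.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Matched by a Disallow rule -> crawling is blocked:
print(rp.can_fetch("*", "http://delhirocks.com.au/wp-admin/"))   # False
print(rp.can_fetch("*", "http://delhirocks.com.au/tag/delhi/"))  # False

# Not matched by any Disallow rule -> allowed by default:
print(rp.can_fetch("*", "http://delhirocks.com.au/wp-content/uploads/logo.png"))  # True
print(rp.can_fetch("*", "http://delhirocks.com.au/about/"))      # True
```

Note that `can_fetch` only tells you whether a compliant crawler may fetch the URL. A `Disallow` rule does not remove already-indexed pages from Google; a page blocked by robots.txt can still appear in results if other sites link to it, which may be why the pages keep showing.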
