Robots.txt: need it, don't need it? What's the skinny?
While researching the Panda changes to Google's algorithm, I noticed a trend: old tactics seemed to just stop getting talked about:
- no more posts on x after 2007
- no posts on Y after 2008
- no posts on Z after 2009
This post put forth the idea that if your robots.txt file looked like this one, you'd be telling the search engines not to index your site.
Web Designer Mag should fix its SEO - Yoast.
Out of curiosity I checked my own robots.txt file, and it was "bad" like the one described.
SOoooo, I googled "What should my robots.txt file say?"
this one said:
The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
kinda scary...
This one, from the official Google Webmaster Central support forum, said:
User-agent: *
Disallow:
Sitemap: http://www.somesite.com/sitemap.xml
that'll be fine (and no, that doesn't disallow everything)
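For what it's worth, you can put the two snippets side by side with Python's standard-library robots.txt parser and see what each one actually permits (example.com here is just a placeholder URL, not from either post):

```python
from urllib import robotparser

# The "scary" version: Disallow with a slash blocks the whole site.
blocking_rules = ["User-agent: *", "Disallow: /"]

# The forum's version: an empty Disallow blocks nothing.
allowing_rules = ["User-agent: *", "Disallow:"]

rp_block = robotparser.RobotFileParser()
rp_block.parse(blocking_rules)

rp_allow = robotparser.RobotFileParser()
rp_allow.parse(allowing_rules)

url = "http://www.example.com/some-page.html"
print(rp_block.can_fetch("*", url))  # False - crawlers are told to stay out
print(rp_allow.can_fetch("*", url))  # True - crawlers may fetch anything
```

The difference comes down to that one trailing slash: `Disallow: /` disallows every path, while a bare `Disallow:` disallows no paths at all.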
So who's full of it?
These two seem to contradict each other... either that, or I'm not understanding the premise.
Anyone shed light on this, gang?
yes, I am....