I still get some traffic from Bing and Yahoo, and make a few Amazon sales, so I keep trying to improve this situation with Google.
In Webmaster Tools just now, I noticed that it says I have 25 warnings. I went to the sitemap section, and it says "Sitemap contains URLs which are blocked by robots.txt," with an issue count of 25.
Yet when I go to the Health section and look at crawl errors, there are only 3 URLs that are not found, and when I go to the "Blocked URLs" section, it says there are 0 blocked URLs. The robots.txt file was downloaded just 14 hours ago.
Someone please help me understand all of this. Although Webmaster Tools says everything I've submitted is indexed, whenever I check a rank checker there is never a single keyword in the top 100. I've decreased keyword density and done no additional backlinking.
Should there not be a robots.txt file at all?
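For reference, my understanding is that a fully permissive robots.txt, one that blocks nothing, looks something like this (the sitemap URL here is just a placeholder, not my actual file):

```text
# Allow all crawlers to access everything
User-agent: *
Disallow:

# Point crawlers at the sitemap
Sitemap: http://www.example.com/sitemap.xml
```

An empty `Disallow:` line means nothing is blocked, so if my file looks like this, I don't see how 25 sitemap URLs could be reported as blocked.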