Accidentally barred directory sites with robots.txt

by burtie
3 replies
My mate did something amazingly dumb.

He changed hosts recently, as his old host had been giving him serious problems.

In the process, he accidentally changed his robots.txt to

User-agent: *
Disallow: *

Doh!

Now, Google has (obviously) started to remove his listings!

He has since fixed robots.txt so it reads as it should.
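For anyone who lands on this thread later: under the original robots.txt standard, `*` is not a wildcard in a Disallow line, but Google does treat it as one, which is why the site got blocked. A permissive file that lets every crawler index everything is simply:

```
User-agent: *
Disallow:
```

(An empty Disallow means "disallow nothing"; use `Disallow: /` only when you really do want to block the whole site.)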

How can he get Google to re-read robots.txt quickly?

The website is
www.landau.ws
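If it helps, the fixed file can be sanity-checked locally with Python's standard-library `urllib.robotparser`. One caveat: this parser follows the original 1994 robots.txt rules, which have no wildcards, so it will not reproduce Google's handling of `Disallow: *`, but it is fine for confirming a permissive file. A minimal sketch (the site URL is just the one from this thread):

```python
from urllib.robotparser import RobotFileParser

def is_allowed(robots_lines, url, user_agent="Googlebot"):
    """Parse robots.txt lines and report whether user_agent may fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_lines)
    return parser.can_fetch(user_agent, url)

# The fixed file: an empty Disallow line allows everything.
fixed = ["User-agent: *", "Disallow:"]
print(is_allowed(fixed, "http://www.landau.ws/"))
```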
#accidentally #barred #directory #robotstxt #sites
  • Profile picture of the author Bruce Hearder
    He's done Step 1 already, that is, fixing up the robots.txt file.

    Step 2. Give the site a couple of manual pings, using services like Pingoat, Pingler, or Ping-O-Matic.

    Step 3. Change the content on the page so that there is an incentive for BigG to reindex it, making sure the page size changes by a fair bit.

    Otherwise there is always the concern that G looks at the page's last-modified date and page size, and if neither has changed much, it moves on to the next site.
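A manual ping of the kind those services send can also be sketched with only the Python standard library. The endpoint URL and the `weblogUpdates.ping` method below are assumptions based on the classic XML-RPC ping convention, so check the current docs of whichever service you use:

```python
import xmlrpc.client

# Assumed classic Ping-O-Matic XML-RPC endpoint; verify before relying on it.
PING_ENDPOINT = "http://rpc.pingomatic.com/"

def build_ping(site_title: str, site_url: str) -> str:
    """Build the XML-RPC request body for a weblogUpdates.ping call."""
    return xmlrpc.client.dumps((site_title, site_url),
                               methodname="weblogUpdates.ping")

payload = build_ping("Landau", "http://www.landau.ws/")
print(payload)

# To actually send it (a live network call, so commented out here):
# proxy = xmlrpc.client.ServerProxy(PING_ENDPOINT)
# print(proxy.weblogUpdates.ping("Landau", "http://www.landau.ws/"))
```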

    Hope this helps, let us know how he gets on with things..

    Take care

    Bruce
    • Profile picture of the author pannsoln
      I'm the dumb mate!

      I'm pleased to report that Google has re-indexed my robots.txt.

      Curiously, however, the Google Webmaster Tools area says that it still has the old one!

      Thanks to Burtie and Bruce for your help!
      • Profile picture of the author Bruce Hearder
        No problems..

        Glad we can help

        Take care

        Bruce
