Need help regarding robots.txt files & link crawling - SEO
I'm doing some on-page SEO tasks, and setting up a robots.txt file is one of them.
While checking how many pages Google has crawled, I noticed that many unnecessary URLs have been crawled. I found the following types of irrelevant links:
https://rated-builders.com/emails/
https://rated-builders.com/service/index.php
https://rated-builders.com/service/login.php
https://rated-builders.com/service/create-job.php
https://rated-builders.com/emails/Ne...ewsletter.html
https://rated-builders.com/emails/Receipt/receipt.html
https://rated-builders.com/emails/St...tationery.html
https://rated-builders.com/emails/Pr...ouncement.html
https://rated-builders.com/emails/Si...ouncement.html
https://rated-builders.com/emails/Pr...0-%20Copy.html
https://rated-builders.com/emails/Pr...0Announcement/
https://rated-builders.com/emails/Newsletter/
https://rated-builders.com/emails/Receipt/
https://rated-builders.com/emails/Si...0Announcement/
https://rated-builders.com/emails/Stationery/
I want to disallow all of the above links from search engines. Can anyone show me the exact robots.txt syntax to block these irrelevant URLs?
I used the following syntax in my robots.txt file:

User-agent: *
Disallow:
Disallow: /emails/
Disallow: /login.php
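For reference, a corrected version might look like the sketch below. It assumes you want to block everything under /emails/ and /service/ (the login, index, and create-job pages all live under /service/ in the URLs listed above). Note that an empty `Disallow:` directive allows everything, and each directive must be on its own line with a space after the colon:

```
User-agent: *
Disallow: /emails/
Disallow: /service/
```

Keep in mind that robots.txt only stops crawling; URLs that are already indexed may still appear in results until they are removed, for example via a noindex meta tag or a removal request in Google Search Console.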
Please help me...!!!
- daniel27lt
- altonroot