Anyone have experience with large sites (50,000 pages+)
So for an experiment last November I wrote a program that publishes a 1200-word WordPress post every 5 minutes. The program pulls its data from spreadsheets of business information scraped from Yellow Pages and other list sources, giving 67 different variables per post.
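For anyone curious, the templating side can be sketched roughly like this (Python, with made-up column names and a toy template — not the actual script, which runs to ~1200 words and 67 placeholders):

```python
import csv
import string

# Toy template standing in for the real ~1200-word one.
# Placeholder names here are hypothetical spreadsheet columns.
TEMPLATE = string.Template(
    "$business_name is a $category serving $city. "
    "Call $phone or visit them at $address."
)

def build_posts(csv_path):
    """Yield one filled-in post per spreadsheet row."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            yield TEMPLATE.substitute(row)
```

From there a cron job (or a loop with a 5-minute sleep) pushes each generated post to WordPress via its XML-RPC or REST API.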
Each 1200-word post tests in Copyscape at 40%-60% unique, so G has cached and indexed every last page it gets around to. There are 37,000 pages on the domain now, and traffic last week spiked up to 400 uniques a day. The content is in an extremely competitive niche, so this is a success for sure.....but.
Linking structure is:
Home --> 2nd level --> post
Every post has a link back to the 2nd-level menu and the home page, plus 20 "related posts" links feeding G juice internally.
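As a rough sketch (Python, with made-up data structures, not the real plugin code), the per-post link block works out to:

```python
import random

def link_block(post_id, category_posts, home_url, category_url, n_related=20):
    """Build the internal links for one post: home, the 2nd-level
    menu page, and up to 20 related posts from the same category."""
    # Never link a post to itself.
    pool = [p for p in category_posts if p["id"] != post_id]
    related = random.sample(pool, min(n_related, len(pool)))
    links = [home_url, category_url]
    links += [p["url"] for p in related]
    return links
```

With 37,000 posts and 22 outbound internal links each, every post ends up with internal inlinks too, which is the point of the related-posts block.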
Question #1: after G has indexed all the pages and counted the link juice, will traffic take another large bump up?
Question #2: do large sites with 80K-90K pages indexed at around 50% unique per page have a real advantage with G?
I have no idea what to expect, as the largest site we've run in the past had 350 unique posts.
TZ
$php_coding = "consistent cash";
echo ("Give me" . " " . $php_coding . "!");