
GOODROI Internet Marketing » One reason why new sites have a hard time ranking:

"INCUMBENCY

The sites that currently are in the top rankings have the benefit of being the incumbents and it is a huge benefit. These incumbent sites gain hundreds of free links from scrapers.

Scraper sites put up massive amounts of auto-generated content and wrap AdSense around it. Where do these scraper sites get this content? They simply republish the SERPs for thousands of keywords. Thus the link and snippet for the top ranking sites gets republished on hundreds if not thousands of scraper sites.

I know what you are thinking: but, Greg, these scraper sites have very little link popularity. It is true that they only pass a little link popularity, but a little link popularity from 500 or 1,000 sites is some really nice link juice. And new sites do not have any of this link juice."

I have found a very similar situation with the online directories, which do pretty much the same thing as the scrapers. My clients are in the medical field, so the directories are just bought lists of doctors compiled into data directory pages with AdSense spread all over them.

It just compounds the incumbency factor, and the mass publication of all that data causes headaches when trying to do focused local SEO for dentists and orthodontists who are small, one-city shops. My client base is somewhere around 2,000, and those directories get frustrating fast.
