Unfortunately, the Google Sandbox now has two levels. Yes, you still need a lot of trust to get your pages ranking. But before you start worrying about that, you need to worry about getting your pages indexed in the first place.
Because, of course, you’re not really indexed if most of your pages have gone supplemental!
What happened? Well, apparently the Google Indexer got tired of her nickname “Loose Louise”; she’s born again and now very choosy, thankyouverymuch, about who she lets in. Rumor has it she’s only interested in trusted sites that are after a whitehat, long-term commitment.
“What exactly is that supplemental index again?”
I won’t parrot the quasi-official Google answer here. (In fact, I can promise that I will never give you the “official Google answer” on Tropical SEO.) Take your pick:
* The Google Supplemental index is the Siberian work camp for web pages.
* The Google Supplemental index is to the normal index as Scoreboard Media is to Tropical SEO.
* The Google Supplemental index is where they put web pages with little trust.
* The Google Supplemental index is where they put web pages that aren’t going to rank for anything important.
Got it? Cool.
Now, assuming your site is supplemental, here are five tips to get it out of Supplemental Hell.
1. Give each page a unique title. This is so basic it kills me, but sometimes people still aren’t doing it. There is absolutely no reason not to do this: a unique title also helps SEO-wise, helps accessibility-wise, gets higher clickthroughs in the SERPs, and so on. (The first sketch after this list checks a site for missing or duplicate titles.)
2. Give each page a unique META DESCRIPTION. Remember when we all thought META tags were dead? Well, Google’s gotten funny about that. Let’s not waste time wondering why. Just give every page a unique META DESCRIPTION, even if it just matches the title tag. (The same sketch after this list flags missing or duplicate descriptions, too.)
3. Make sure each page has a good amount of unique content. This problem can rear its ugly head in a few different ways. Most commonly, a particular piece of content is being served at multiple URLs. This is usually a CMS or shopping cart issue, and the fix will be unique to whatever system you use. Also pretty common is the existence of very thin pages (a lot of large, hollow/empty web directories have this problem). My rule of thumb (not authoritative, just what I go by) is that a page should have at least 100 words of unique content. (The second sketch after this list flags thin pages and likely duplicates.)
4. Get some more trusted links. Link building is all about trust these days. A few links from older, already-ranking domains will do wonders toward convincing Google that a newer site deserves to be trusted. I also like to get a few higher-PageRank links in there (sitewide? even better). Yes, I know that tip is going to bring out the haters (insert “PageRank is dead!” comment here). But “overall link weight” does seem to matter again in Google, if not for rankings, then at least for indexing.
5. Get some links to internal pages. This is all about convincing Google your site doesn’t have “hollow shell syndrome”: when a site has, say, 20 pages and a few dozen backlinks, but 100% of those backlinks point to the homepage. Most often, the homepage of such a site is in the normal index while all of the internal pages have gone supplemental. I usually go “brute force” at one internal page and get 3 or 4 links to it, giving that one page so much link weight that Google pretty much has to index it; normally, GoogleBot then revisits the entire site and re-crawls and indexes the other internal pages, too (up to a point: if the site has hundreds or thousands of pages, you’ll need to rinse and repeat a few times).
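
For tips 1 and 2, you don’t have to eyeball every page by hand. Here’s a minimal Python sketch of the kind of audit I mean. It assumes you have the requests and beautifulsoup4 packages installed, and the URL list is just a made-up example; point it at your own pages.

```python
# A quick-and-dirty audit for tips 1 and 2: flag pages with missing or
# duplicate <title> / meta description values. The URL list is a made-up
# example; swap in your own pages.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

URLS = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
    "http://www.example.com/products.html",
]

def title_and_description(url):
    """Return (title, meta description) for a page; empty strings if missing."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else ""
    return title, description

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in URLS:
    title, description = title_and_description(url)
    if not title:
        print(f"MISSING TITLE: {url}")
    if not description:
        print(f"MISSING META DESCRIPTION: {url}")
    titles[title].append(url)
    descriptions[description].append(url)

# Any title or description shared by more than one URL is a duplicate.
for text, dupes in titles.items():
    if text and len(dupes) > 1:
        print(f"DUPLICATE TITLE ({text!r}): {', '.join(dupes)}")
for text, dupes in descriptions.items():
    if text and len(dupes) > 1:
        print(f"DUPLICATE META DESCRIPTION ({text!r}): {', '.join(dupes)}")
```

Anything that comes back flagged goes on the rewrite list.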
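
For tip 3, here’s another rough sketch (same assumptions: requests and beautifulsoup4 installed, made-up example URLs). It counts the visible words on each page against my 100-word rule of thumb and compares pages to each other to catch the “same content at multiple URLs” problem. The 0.8 overlap threshold is my own guess, not anything official.

```python
# A rough check for tip 3: flag thin pages and pages that look like the same
# content served at two URLs. The thresholds are my own rules of thumb, not
# anything official, and the URL list is a made-up example.
from itertools import combinations

import requests
from bs4 import BeautifulSoup

URLS = [
    "http://www.example.com/widgets.html",
    "http://www.example.com/widgets.html?sort=price",
    "http://www.example.com/contact.html",
]
MIN_WORDS = 100          # my 100-words-of-unique-content rule of thumb
OVERLAP_THRESHOLD = 0.8  # word-set overlap above this smells like duplicate content

def visible_words(url):
    """Return (set of words, word count) for a page's visible text."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # drop scripts and styles so only visible text remains
    words = soup.get_text(separator=" ").lower().split()
    return set(words), len(words)

pages = {url: visible_words(url) for url in URLS}

# Thin pages: not enough text to stand on their own.
for url, (_, count) in pages.items():
    if count < MIN_WORDS:
        print(f"THIN PAGE ({count} words): {url}")

# Near-duplicates: two URLs whose word sets overlap heavily (Jaccard similarity).
for (url_a, (set_a, _)), (url_b, (set_b, _)) in combinations(pages.items(), 2):
    if not set_a or not set_b:
        continue
    overlap = len(set_a & set_b) / len(set_a | set_b)
    if overlap >= OVERLAP_THRESHOLD:
        print(f"POSSIBLE DUPLICATE CONTENT ({overlap:.0%} overlap): {url_a} vs {url_b}")
```

If two URLs come back as near-duplicates, that’s your cue to dig into the CMS or cart settings and settle on one URL for that piece of content.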