March 31, 2011

Google: Bing Is Cheating, Copying Our Search Results

Google has run a sting operation that it says proves Bing has been watching what people search for on Google and the sites they select from Google’s results, then using that information to improve Bing’s own search listings. Bing doesn’t deny this.
As a result of the apparent monitoring, Bing’s relevancy is potentially improving (or getting worse) on the back of Google’s own work. Google likens it to the digital equivalent of Bing leaning over during an exam and copying off of Google’s test.
“I’ve spent my career in pursuit of a good search engine,” says Amit Singhal, a Google Fellow who oversees the search engine’s ranking algorithm. “I’ve got no problem with a competitor developing an innovative algorithm. But copying is not innovation, in my book.”
Bing doesn’t deny Google’s claim. Indeed, the statement that Stefan Weitz, director of Microsoft’s Bing search engine, emailed me yesterday as I worked on this article seems to confirm the allegation:
As you might imagine, we use multiple signals and approaches when we think about ranking, but like the rest of the players in this industry, we’re not going to go deep and detailed in how we do it. Clearly, the overarching goal is to do a better job determining the intent of the search, so we can guess at the best and most relevant answer to a given query.
Opt-in programs like the [Bing] toolbar help us with clickstream data, one of many input signals we and other search engines use to help rank sites. This “Google experiment” seems like a hack to confuse and manipulate some of these signals.
Later today, I’ll likely have a more detailed response from Bing. Microsoft wanted to talk further after a search event it is hosting today. More about that event, and how I came to be reporting on Google’s findings just before it began, comes at the end of this story. But first, here’s how Google’s investigation unfolded.
Postscript: Bing: Why Google’s Wrong In Its Accusations is the follow-up story from talking with Bing. Please be sure to read it after this. You’ll also find another link to it at the end of this article.

Hey, Does This Seem Odd To You?

Around late May of last year, Google told me it began noticing that Bing seemed to be doing exceptionally well at returning the same sites that Google would list, when someone would enter unusual misspellings.
For example, consider a search for torsoraphy, which causes Google to return this:

In the example above, Google searched for the correct spelling — tarsorrhaphy — even though torsoraphy was entered. Notice the top listing for the corrected spelling is a page about the medical procedure at Wikipedia.
Over at Bing, the misspelling is NOT corrected — but somehow, Bing manages to list the same Wikipedia page at the top of its results as Google does for its corrected spelling results:

Got it? Despite the word being misspelled — and the misspelling not being corrected — Bing still manages to get the right page from Wikipedia at the top of its results, one of four total pages it finds from across the web. How did it do that?
It’s a point of pride to Google that it believes it has the best spelling correction system of any search engine. Google claims it can even correct misspellings that have never been searched on before. Engineers on the spelling correction team closely watch to see if they’re besting competitors on unusual terms.
So when misspellings on Bing for unusual words — such as above — started generating the same results as with Google, red flags went up among the engineers.

Google: Is Bing Copying Us?

More red flags went up in October 2010, when Google told me it noticed a marked rise in two key competitive metrics. Across a wide range of searches, Bing was showing a much greater overlap with Google’s top 10 results than in preceding months. In addition, there was an increase in the percentage of times both Google and Bing listed exactly the same page in the number one spot.
By no means did Bing have exactly the same search results as Google. There were plenty of queries where the listings had major differences. However, the increases indicated that Bing had made some change to its search algorithm that was causing its results to become more Google-like.
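Google hasn’t published how it computes these metrics, but both measurements are simple to state. Here’s a minimal sketch in Python of how overlap between two engines’ results might be tracked; the result lists are hypothetical placeholders, not real data:

    # Toy illustration of the two overlap metrics described above.
    # results_a and results_b are lists of URLs, best-ranked first.

    def top10_overlap(results_a, results_b):
        # Fraction of engine A's top 10 that also shows up in engine B's top 10.
        shared = set(results_a[:10]) & set(results_b[:10])
        return len(shared) / 10.0

    def same_number_one(results_a, results_b):
        # True when both engines rank the same page first.
        return bool(results_a and results_b and results_a[0] == results_b[0])

    def overlap_report(paired_results):
        # Aggregate across many queries: average top-10 overlap and the
        # percentage of queries where the #1 result is identical.
        overlaps = [top10_overlap(a, b) for a, b in paired_results]
        same_top = [same_number_one(a, b) for a, b in paired_results]
        return (sum(overlaps) / len(overlaps),
                100.0 * sum(same_top) / len(same_top))

Tracked month over month across a large query sample, a sudden jump in either number is exactly the kind of shift Google describes.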
Now Google began to strongly suspect that Bing might be somehow copying its results, in particular by watching what people were searching for at Google. There didn’t seem to be any other way it could be coming up with such similar matches to Google, especially in cases where spelling corrections were happening.
Google thought Microsoft’s Internet Explorer browser was part of the equation. Somehow, IE users might have been sending data about what they were doing on Google back to Bing. In particular, Google told me it suspected either the Suggested Sites feature in IE or the Bing toolbar might be doing this.

To Sting A Bing

To verify its suspicions, Google set up a sting operation. For the first time in its history, Google crafted one-time code that would allow it to manually rank a page for a certain term (code that will soon be removed, as described further below). It then created about 100 of what it calls “synthetic” searches, queries that few people, if anyone, would ever enter into Google.
These searches returned no matches on Google or Bing — or a tiny number of poor quality matches, in a few cases — before the experiment went live. With the code enabled, Google placed a honeypot page to show up at the top of each synthetic search.
The only reason these pages appeared on Google was because Google forced them to be there. There was nothing that made them naturally relevant for these searches. If they started to appear at Bing after appearing on Google, that would mean Bing had taken Google’s bait and copied its results.
This all happened in December. When the experiment was ready, about 20 Google engineers were told to run the test queries from laptops at home, using Internet Explorer, with Suggested Sites and the Bing Toolbar both enabled. They were also told to click on the top results. They started on December 17. By December 31, some of the results started appearing on Bing.
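Google hasn’t said how it produced its gibberish terms, but generating strings that are vanishingly unlikely to appear in any indexed document is straightforward. A hypothetical sketch:

    import random
    import string

    def synthetic_query(length=10, seed=None):
        # Build a nonsense lowercase string in the style of "hiybbprqag":
        # random letters, long enough that no real page should contain it.
        rng = random.Random(seed)
        return "".join(rng.choice(string.ascii_lowercase) for _ in range(length))

    # Roughly 100 test queries, as in Google's experiment:
    queries = [synthetic_query(seed=i) for i in range(100)]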
Here’s an example, which is still working as I write this, hiybbprqag at Google:

and the same exact match at Bing:

Here’s another, for mbzrxpgjys at Google:

and the same match at Bing:

Here’s one more, this time for indoswiftjobinproduction, at Google:

And at Bing:

To be clear, before the test began, these queries found either nothing or a few poor quality results on Google or Bing. Then Google made a manual change, so that a specific page would appear at the top of these searches, even though the site had nothing to do with the search. Two weeks after that, some of these pages began to appear on Bing for these searches.
It strongly suggests that Bing was copying Google’s results, by watching what some people do at Google via Internet Explorer.

The Google Ranking Signal

Only a small number of the test searches produced this result, about 7 to 9 (depending on when exactly Google checked) out of the 100. Google says it doesn’t know why they didn’t all work, but even having a few appear was enough to convince the company that Bing was copying its results.
As I wrote earlier, Bing is far from identical to Google for many queries. This suggests that even if Bing is using search activity at Google to improve its results, that’s only one of many signals being considered.
Search engines all have ranking algorithms that use various signals to determine which pages should come first. What words are used on the page? How many links point at that page? How important are those links estimated to be? What words appear in the links pointing at the page? How important is the web site estimated to be? These are just some of the signals that both Bing and Google use.
Google’s test suggests that when Bing has many of the traditional signals, as is likely for popular search topics, it relies mostly on those. But in cases where Bing has fewer trustworthy signals, such as “long tail” searches that bring up fewer matches, then Bing might lean more on how Google ranks pages for those searches.
In cases where there are no signals other than how Google ranks things, such as with the synthetic queries that Google tested, then the Google “signal” may come through much more.
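Neither company publishes its scoring function, but the behavior Google describes is what you’d expect from a weighted blend of signals, where a borrowed clickstream signal matters more as the traditional signals thin out. This is purely a conceptual sketch, not how either engine actually scores pages:

    def blended_score(traditional_signals, clickstream_score):
        # traditional_signals: dict mapping signal name -> (value, confidence 0..1).
        # Where on-page and link signals are plentiful, they dominate; for a
        # synthetic query they are absent, so the clickstream term wins.
        if traditional_signals:
            confidence = (sum(c for _, c in traditional_signals.values())
                          / len(traditional_signals))
            base = sum(v * c for v, c in traditional_signals.values())
        else:
            confidence, base = 0.0, 0.0
        return confidence * base + (1.0 - confidence) * clickstream_score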

Do Users Know (Or Care)?

Do Internet Explorer users know that they might be helping Bing in the way Google alleges? Technically, yes — as best I can tell. Explicitly, absolutely not.
Internet Explorer makes clear (to those who bother to read its privacy policy) that by default, it’s going to capture some of your browsing data, unless you switch certain features off. It may also gather more data if you enable some features.

Suggested Sites

Suggested Sites is one of the likely ways that Bing may have been gathering information about what’s happening on Google. This is a feature (shown to the right) that suggests other sites to visit, based on the site you’re viewing.
Microsoft does disclose that Suggested Sites collects information about sites you visit. From the privacy policy:
When Suggested Sites is turned on, the addresses of websites you visit are sent to Microsoft, together with standard computer information.
To help protect your privacy, the information is encrypted when sent to Microsoft. Information associated with the web address, such as search terms or data you entered in forms might be included.
For example, if you visited the Microsoft.com search website at http://search.microsoft.com and entered “Seattle” as the search term, the full address http://search.microsoft.com/results.aspx?q=Seattle&qsc0=0&FORM=QBMH1&mkt=en-US will be sent.
I’ve bolded the key parts. What you’re searching on gets sent to Microsoft. Even though the example provided involves a search on Microsoft.com, the policy doesn’t prevent any search — including those at Google — from being sent back.
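Once a full results URL has been phoned home like that, recovering the search term is trivial. A quick sketch using Python’s standard library:

    from urllib.parse import urlparse, parse_qs

    def search_term(url):
        # Pull the "q" parameter out of a logged search-results URL, if present.
        params = parse_qs(urlparse(url).query)
        return params.get("q", [None])[0]

    print(search_term("http://search.microsoft.com/results.aspx?q=Seattle&mkt=en-US"))
    # -> Seattle
    print(search_term("http://www.google.com/search?q=tarsorrhaphy"))
    # -> tarsorrhaphy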
It makes sense that the Suggested Sites feature needs to report the URL you’re viewing back to Microsoft. Otherwise, it doesn’t know what page to show you suggestions for. The Google Toolbar does the same thing, telling Google what page you’re viewing, if you have the PageRank feature enabled.
But to monitor what you’re clicking on in search results? There’s no reason I can see for Suggested Sites to do that — if it indeed does. But even if it does log clicks, Microsoft may feel that this is “standard computer information” that the policy allows to be collected.

The Bing Bar

There’s also the Bing Bar — a Bing toolbar — that Microsoft encourages people to install separately from Internet Explorer (IE may come with it pre-installed through some partner deals). When you install the toolbar, by default it is set to collect information to “improve” your experience, as you can see:

The install page highlights some of what will be collected and how it will be used:
“improve your online experience with personalized content by allowing us to collect additional information about your system configuration, the searches you do, websites you visit, and how you use our software. We will also use this information to help improve our products and services.”
Again, I’ve bolded the key parts. The Learn More page about the data the Bing Bar collects ironically says less than what’s directly on the install page.
It’s hard to argue that gathering information about what people search for at Google isn’t covered. Technically, there’s nothing misleading — even if Bing, for obvious reasons, isn’t making it explicit that to improve its search results, it might look at what Bing Bar users search for at Google and click on there.

What About The Google Toolbar & Chrome?

Google has its own Google Toolbar, as well as its Chrome browser. So I asked Google. Does it do the same type of monitoring that it believes Bing does, to improve Google’s search results?
“Absolutely not. The PageRank feature sends back URLs, but we’ve never used those URLs or data to put any results on Google’s results page. We do not do that, and we will not do that,” said Singhal.
Actually, Google has previously said that the toolbar does play a role in ranking. Google uses toolbar data in part to measure site speed — and site speed was a ranking signal that Google began using last year.
Instead, Singhal seems to be saying that the URLs that the toolbar sees are not used for finding pages to index (something Google’s long denied) or to somehow find new results to add to the search results.
As for Chrome, Google says the same thing — there’s no information flowing back that’s used to improve search rankings. In fact, Google stressed that the only information that flows back at all from Chrome is what people are searching for from within the browser, if they are using Google as their search engine.
Postscript: See Google On Toolbar: We Don’t Use Bing’s Searches

Is It Illegal?

Suffice to say, Google’s pretty unhappy with the whole situation, which does raise a number of issues. For one, is what Bing seems to be doing illegal? Singhal was “hesitant” to say that since Google technically hasn’t lost anything. It still has its own results, even if it feels Bing is mimicking them.

Is it Cheating?

If it’s not illegal, is what Bing may be doing unfair, somehow cheating at the search game?
On the one hand, you could say it’s incredibly clever. Why not mine what people are selecting as the top results on Google as a signal? It’s kind of smart. Indeed, various small services in the past have let people bookmark their top choices from various search engines.
Google doesn’t see it as clever.
“It’s cheating to me because we work incredibly hard and have done so for years but they just get there based on our hard work,” said Singhal. “I don’t know how else to call it but plain and simple cheating. Another analogy is that it’s like running a marathon and carrying someone else on your back, who jumps off just before the finish line.”
In particular, Google seems most concerned that the impact of mining user data on its site potentially pays off the most for Bing on long-tail searches, unique searches where Google feels it works especially hard to distinguish itself.

Ending The Experiment

Now that Google’s test is done, it will be removing the one-time code it added to allow for the honeypot pages to be planted. Google has proudly claimed over the years that it had no such ability, as proof of letting its ranking algorithm make decisions. It has no plans to keep this new ability and wants to kill it, so things are back to “normal.”
Google also stressed to me that the code only worked for this limited set of synthetic queries — and that it had an additional failsafe. Should any of the test queries suddenly become even mildly popular for some reason, the honeypot page for that query would no longer show.
This means if you test the queries above, you may no longer see the same results at Google. However, I did see all these results myself before writing this, along with some additional ones that I’ve not done screenshots for. So did several of my other editors yesterday.

Why Open Up Now?

What prompted Google to step forward now and talk with me about its experiment? A grand plan to spoil Bing’s big search event today? A clever way to distract from current discussions about its search quality? Just a coincidence of timing? In the end, whatever you believe about why Google is talking now doesn’t really matter. The bigger issue is whether you believe what Bing is doing is fair play or not. But here’s the strange backstory.
Recall that Google got its experiment confirmed on December 31. The next day — New Year’s Day — TechCrunch ran an article called Why We Desperately Need a New (and Better) Google from guest author Vivek Wadhwa, praising Blekko for having better date search than Google and painting a generally dismal picture of Google’s relevancy overall.
I doubt Google had any idea that Wadhwa’s article was coming, and I’m virtually certain Wadhwa had no idea about Google’s testing of Bing. But his article kicked off a wave of “Google’s results suck” posts.
Trouble In the House of Google from Jeff Atwood of Coding Horror appeared on January 3; Google’s decreasingly useful, spam-filled web search from Marco Arment of Instapaper came out on January 5. Multiple people mistakenly reported Paul Kedrosky’s December 2009 article about struggling to research a dishwasher as also being part of the current wave. It wasn’t, but on January 11, Kedrosky weighed in with fresh thoughts in Curation is the New Search is the New Curation.
The wave kept going. It’s still going. Along the way, Search Engine Land itself had several pieces, with Conrad Saam’s column on January 12, Google vs. Bing: The Fallacy Of The Superior Search Engine, gaining a lot of attention. In it, he did a short survey of 20 searches and concluded that Google and Bing weren’t that different.

Time To Talk? Come To Our Event?

The day after that column appeared, I got a call from Google. Would I have time to come talk in person about something they wanted to show me, relating to relevancy? Sure. Checking my calendar, I said January 27 — a Thursday — would be a good time for me to fly up from where I work in Southern California to Google’s Mountain View campus.
The day after that, Bing contacted me. They were hosting an event on February 1 to talk about the state of search and wanted to make sure I had the date saved, in case I wanted to come up for it. I said I’d make it. I later learned that the event was being organized by Wadhwa, author of that TechCrunch article.
A change on Google’s end shifted my meeting to January 28, last Friday. As is typical when I visit Google, I had a number of different meetings to talk about various products and issues. My last meeting of the day was with Singhal and Matt Cutts, who heads Google’s web spam team — where they shared everything I’ve described above, explaining this is one reason why Google and Bing might be looking so similar, as our columnist found.
Yes, they wanted the news to be out before the Bing event happened — an event that Google is participating in. They felt it was important for the overall discussion about search quality. But the news landing so close to the event comes down to when I could make the trip to Google. If I’d been able to go earlier, I might have been writing this a week ago.
Meanwhile, you have this odd timing of Wadhwa’s TechCrunch article and the Bing event he’s organizing. I have no idea if Wadhwa was booked to do the Bing event before his article went out or if he was contracted to do this after, perhaps because Bing saw the debate over Google’s quality kick off and decided it was good to ride it. I’ll try to find out.
In the end, for whatever reasons, the findings of Google’s experiment and Bing’s event are colliding, right in the middle of a renewed focus of attention on search quality. Was this all planned to happen? Gamesmanship by both Google and Bing? Just odd coincidences? I go with the coincidences, myself.
[Postscript: Wadhwa tweeted the event timing was a coincidence. And let me add, my assumption really was that this is all coincidence. I'm pointing it out mainly because there are just so many crazy things all happening at the same time, which some people will inevitably try to connect. Make no mistake. Both Google and Bing play the PR game. But I think what's happening right now is that there's a perfect storm of various developments all coming together at the same time. And if that storm gets people focused on demanding better search quality, I'm happy].

The Search Voice

In the end, I’ve got some sympathy for Google’s view that Bing is doing something it shouldn’t.
I’ve long written that every search engine has its own “search voice,” a unique set of search results it provides, based on its collection of documents and its own particular method of ranking those.
I like that search engines have each had their own voices. One of the worst things about Yahoo changing over to Bing’s results last year was that in the US (and in many countries around the world), we were suddenly down to only two search voices: Google’s and Bing’s.
For 15 years, I’ve covered search. In all that time, we’ve never had so few search voices as we do now. At one point, we had more than 10. That’s one thing I love about the launch of Blekko. It gave us a fresh, new search voice.
When Bing launched in 2009, the joke was that Bing stood for either “Because It’s Not Google” or “But It’s Not Google.” Mining Google’s searches makes me wonder if the joke should change to “Bing Is Now Google.”
I think Bing should develop its own search voice without using Google’s as a tuning fork. That just doesn’t ring true to me. But I look forward to talking with Bing more about the issue and hopefully getting more clarity from them about what they may be doing and their views on it.
Opening image from Real Genius. They were taking a test. There’s no suggestion that Google is cool Chris Knight or that Bing is dorky Kent (or vice versa). It’s a great movie.

March 30, 2011

55 Quick SEO Tips Even Your Mother Would Love

 Everyone loves a good tip, right? Here are 55 quick tips for search engine optimization that even your mother could use to get cooking. Well, not my mother, but you get my point. Most folks with some web design and beginner SEO knowledge should be able to take these to the bank without any problem.

1. If you absolutely MUST use JavaScript drop down menus, image maps or image links, be sure to put text links somewhere on the page for the spiders to follow.

2. Content is king, so be sure to have good, well-written and unique content that will focus on your primary keyword or keyword phrase.

3. If content is king, then links are queen. Build a network of quality backlinks using your keyword phrase as the link. Remember, if there is no good, logical reason for that site to link to you, you don’t want the link.

4. Don’t be obsessed with PageRank. It is just one itsy-bitsy part of the ranking algorithm. A site with lower PR can actually outrank one with a higher PR.

5. Be sure you have a unique, keyword focused Title tag on every page of your site. And, if you MUST have the name of your company in it, put it at the end. Unless you are a major brand name that is a household name, your business name will probably get few searches.

6. Fresh content can help improve your rankings. Add new, useful content to your pages on a regular basis. Content freshness adds relevancy to your site in the eyes of the search engines.

7. Be sure links to your site and within your site use your keyword phrase. In other words, if your target is “blue widgets” then link to “blue widgets” instead of a “Click here” link.

8. Focus on search phrases, not single keywords, and put your location in your text (“our Palm Springs store” not “our store”) to help you get found in local searches.

9. Don’t design your web site without considering SEO. Make sure your web designer understands your expectations for organic SEO. Doing a retrofit on your shiny new Flash-based site after it is built won’t cut it. Spiders can crawl text, not Flash or images.

10. Use keywords and keyword phrases appropriately in text links, image ALT attributes and even your domain name.

11. Check for canonicalization issues – www and non-www domains. Decide which you want to use and 301 redirect the other to it. In other words, if http://www.domain.com is your preference, then http://domain.com should redirect to it.

12. Check the link to your home page throughout your site. Is index.html appended to your domain name? If so, you’re splitting your links. Outside links go to http://www.domain.com and internal links go to http://www.domain.com/index.html.

Ditch the index.html or default.php or whatever the page is and always link back to your domain.
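Tips 11 and 12 both boil down to answering every variant of an address with a 301 redirect to one canonical URL. You’d normally do this in your web server’s configuration; here’s a rough sketch of the logic as WSGI-style Python, assuming www is the preferred host:

    def canonical_redirect(app):
        # Wrap a WSGI app: send non-www hosts and /index.html requests
        # to the canonical www address with a 301, so links aren't split.
        def middleware(environ, start_response):
            host = environ.get("HTTP_HOST", "")
            path = environ.get("PATH_INFO", "")
            if host and (not host.startswith("www.") or path == "/index.html"):
                host = host if host.startswith("www.") else "www." + host
                path = "/" if path == "/index.html" else path
                start_response("301 Moved Permanently",
                               [("Location", "http://%s%s" % (host, path))])
                return [b""]
            return app(environ, start_response)
        return middleware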

13. Frames, Flash and AJAX all share a common problem – you can’t link to a single page. It’s either all or nothing. Don’t use Frames at all and use Flash and AJAX sparingly for best SEO results.

14. Your URL file extension doesn’t matter. You can use .html, .htm, .asp, .php, etc. and it won’t make a difference as far as your SEO is concerned.

15. Got a new web site you want spidered? Submitting through Google’s regular submission form can take weeks. The quickest way to get your site spidered is by getting a link to it through another quality site.

16. If your site content doesn’t change often, your site needs a blog because search spiders like fresh text. Blog at least three times a week with good, fresh content to feed those little crawlers.

17. When link building, think quality, not quantity. One single, good, authoritative link can do a lot more for you than a dozen poor quality links, which can actually hurt you.

18. Search engines want natural language content. Don’t try to stuff your text with keywords. It won’t work. Search engines look at how many times a term is in your content and if it is abnormally high, will count this against you rather than for you.
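Nobody outside the search engines knows exactly where “abnormally high” begins, but you can sanity-check your own copy. A rough heuristic sketch:

    import re

    def keyword_density(text, phrase):
        # Occurrences of the phrase per 100 words of body text.
        words = re.findall(r"[a-z']+", text.lower())
        hits = text.lower().count(phrase.lower())
        return 100.0 * hits / max(len(words), 1)

    # If this creeps into double digits, the page probably reads as
    # stuffed to humans as well as to search engines.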

19. Not only should your links use keyword anchor text, but the text around the links should also be related to your keywords. In other words, surround the link with descriptive text.

20. If you are on a shared server, do a blacklist check to be sure you’re not on a proxy with a spammer or banned site. Their negative notoriety could affect your own rankings.

21. Be aware that by using services that block domain ownership information when you register a domain, Google might see you as a potential spammer.

22. When optimizing your blog posts, optimize your post title tag independently from your blog title.

23. The bottom line in SEO is Text, Links, Popularity and Reputation.

24. Make sure your site is easy to use. This can influence your link building ability and popularity and, thus, your ranking.

25. Give link love, Get link love. Don’t be stingy with linking out. That will encourage others to link to you.

26. Search engines like unique content that is also quality content. There can be a difference between unique content and quality content. Make sure your content is both.

Compete Says Bing’s Combined U.S. Market Share Rose To 29% Last November

A couple of weeks ago, comScore came out with a report that said Microsoft’s Bing had reached an all-time high market share of 11.8% in November 2010.
According to rival Compete, however, Bing’s market share is actually much larger than that.
Based on search data from its panel of more than two million US-based internet users, the Kantar Media company says Bing-powered search engines as a whole grew 4.3 percent month-over-month in query volume, driving Bing’s total market share up by 1 percentage point.
Bing (MSFT) and Yahoo’s search products (which are powered by Bing these days) had 14.4% and 14.6% market share, respectively, which means the combined market share of the search engines rose to a healthy 29% in November 2010, according to Compete’s search data.
Bing.com also saw the highest growth in the number of unique visitors, with a month-over-month increase of 7.4 percent (Yahoo’s number of UVs actually declined 0.3 percent). In November 2009, Bing’s market share was just over 10%, according to Compete.
Both Ask and AOL’s share remained flat from October 2010 to November 2010, which means there’s only one search engine whose market share effectively declined last November: Google’s.
According to Compete, Google has seen its query volume decline for the second month in a row now, with a recent 1.1 percent month-over-month drop. Compete registered 66.4% market share for the search engine, down a noteworthy 7 percentage points compared to November 2009.
Excuse the blurry screenshot, but this was the best I could do (see source image).

Share Your Best Google AdWords Practices Week

Google is trying something new this week, encouraging AdWords professionals to share their best practices on a daily basis over the course of this week.
AdWordsPro Mini posted a thread at the Google AdWords Help forum asking AdWords advertisers to share their best practices. The official Google AdWords representative explained:
This week is "Share your Best Practices week - March 28 to April 1". What is this?: To show our appreciation of your dedication to this forum and to give you a special opportunity to demonstrate what you know, we're launching this "Share your Best Practices week - March 28 to April 1". Over the course of this week, we encourage you to share your AdWords Best Practices on important, difficult, or confusing topics.
How does it work?: Everyday, we will choose a topic and post it as question and let you share your own Best Practices! You are encouraged to share tips, tricks, useful resources that you may have developed, and your experiences on what has worked for you on this topic.
The best of the Best Practices shared for each topic during the course of this week might even receive a surprise gift!
The first topic posted was "Tips and Best Practices on optimizing AdWords ad text." The thing is, it has received very limited participation as of right now. So I am hoping to drive a bit of attention to it and start the conversation.
Forum discussion at Google AdWords Help.

Is Google Rolling Out A Farmer / Panda Update?

There has been a slight uptick in the number of people reporting an update to the Google Farmer/Panda algorithm change.

Yesterday I gave a status update on farmer/panda and said I do not believe it is rolling out. Some disagreed with me but most did not.
The thing is, as Donna said, it might be rolling out on certain Google datacenters and not to everyone yet.
Trusted sources in the ongoing WebmasterWorld threads say they have seen a 20-25% uptick in traffic since the Google update. So while they initially saw a drop of more than 60% in Google referrals, they have recently regained 20-25% of their traffic. In addition, some say they are seeing the update roll out to Google UK.
On the subject of improvement after the Google US algorithm update, a senior WebmasterWorld member gave some deeper insight into his traffic:
About 85% of my traffic comes from the homepage. So when I saw a 60% drop in traffic post-Panda, it was via phrase-based re-ranking on the homepage, but I did see 200 to 400 position drops (according to WMT) on my 5 shallow pages (that had affiliate links). Since internal pages only account for about 15% of my traffic, I really didn't notice their demotion impact. So what I think I am seeing, is an increase in traffic due to a specific phrase. That phrase (for the homepage) fell 550 positions, but has recovered to the top 100. This seems to have resulted in a slight upward nudge in longer-tail phrases that utilize that phrase. So "blue widgets" fell to -550, but as of Saturday it now ranks about #99. Subsequently, "big blue widgets" went from #20 to #16. For me, this has been very phrase specific, but I see the "shallow content" aspect. I'm not sure which is the chicken or which is the egg.
So maybe we are seeing early signs of Google updating and rolling out the Farmer/Panda update even further?

March 29, 2011

10 ways to be a great SEO

Despite the fact that SEO can make or break a business online, SEO still conjures up a lot of negativity.

Some of the negativity is fair. While there are plenty of legitimate SEOs, the market still has its fair share of snake oil salesmen going from client to client in a hit-and-run fashion, promising the world but delivering none of it.
The negativity and controversy that exists around SEO can make the SEO market a tough place to do business. Here are 10 ways to be a great SEO and to demonstrate to your clients and prospective clients that you're committed to providing a legitimate, top-notch service.
  • Don't guarantee results. No matter how skilled you are and no matter how good your track record is, it's impossible to guarantee results since there's so much that is out of your control, even if you do everything right. So instead of promising things you aren't able to promise, describe what you can do to put your client in a position to achieve results and how the things you've done in the past resulted in success for other clients. If you find yourself dealing with someone who is demanding guarantees, consider moving on since these situations rarely end well in my experience.
  • Set expectations. Even though you can't guarantee top SERPs, you can help a client create realistic expectations. Whether it's giving the client timeframes for the tasks you'll be completing or explaining the process of SEO, it's important to be sure that the client knows what to expect.
  • Educate. Some SEOs don't like to tell their clients what they're doing. They think of their knowledge as a trade secret and believe that if they educate the client, the client will eventually fire them and take over SEO themselves. This type of mentality is the hallmark of an amateur consultant. Most clients prefer transparency to secrecy and aren't interested in firing consultants performing services that don't fall under their core competencies. You should think of knowledge sharing as a way to demonstrate your competence to clients, making yourself even more valuable.
  • Admit when you don't know something. SEO is a dynamic field and things are always changing. In many cases, hard and fast rules don't exist and there aren't any 'official' answers. So when a client asks you about something and you don't have an answer, don't make one up; say so and look into it.
  • Keep your skills and knowledge up-to-date. Search engine algorithms are in a constant state of flux and the field of SEO is one of the most dynamic on the internet. Make sure you're staying on top of the latest developments since no client wants an SEO in 2009 who has 2005 skills and knowledge.
  • Define deliverables. Last week I wrote that SEO is a journey, not a destination. It's important for clients to understand that. If someone wants to hire you for a few weeks, there's a lot that you can deliver but there's also a lot that you can't deliver. Therefore I always recommend detailing what deliverables you can provide in the timeframe that the client gives you, being realistic about what this means to the client's overall SEO strategy.
  • Don't push the limits. The line between white hat and black hat is often blurred and even if your risk tolerance is high, you shouldn't assume that a client's is too. Think of yourself as a doctor when working with clients and remember to 'do no harm'.
  • Provide references. Even though some prospective clients won't ask for them, offering up references proactively is a good idea because it helps you stand out in a market that still has more than a few snake oil salesmen.
  • Don't confuse SEO with PPC. Be careful about confusing organic SEO and paid search marketing. Yes, I've actually met people who didn't know the difference because their SEO 'experts' had led them to believe that the two were the same. Obviously that was probably a way of masking the fact that they were unable to deliver organic results.
  • Remember that SEO is more than just Google. Even though Google deservedly receives most of the SEO attention because it has the market share in the major markets, good SEO is holistic and many of your clients might receive significant benefits from other search engines. Therefore don't exclude other search engines from your services.

WordPress SEO resources and tips

WordPress is a popular open-source blogging platform that has a powerful set of features and can be extended through the use of free plug-ins, of which there are thousands.
 
To maximize the usefulness of WordPress as a business tool, you should take advantage of every opportunity to optimize your blog for search engines.
WordPress Documentation

WordPress' Codex includes a page dedicated to SEO topics and this page is a must-read for anybody operating a WordPress blog.
This page includes important information about setting up SEO-friendly permalinks, maintaining a healthy Robots.txt file for search engines and making sure that you're taking advantage of feed submissions. It also links to websites with useful SEO information.
WordPress Plugins

There are a number of great plug-ins that make it easy to implement SEO techniques with WordPress.

All in One SEO Pack

The "All in One SEO Pack" is one of the most downloaded WordPress plug-ins and performs a number of functions, including optimizing your page titles, adding META tags automatically and removing common duplicate content.

Google XML Sitemaps

Another popular plug-in, Google XML Sitemaps creates a Google sitemaps compliant XML-Sitemap for your blog.

KB Robots.txt

This plug-in enables you to create and edit Robots.txt files from within the WordPress admin.

Redirection

Redirection helps you monitor 404 errors, automatically manage 301 redirections when a post URL changes and has a powerful set of features that helps you redirect users to different pages based on various conditions.

Breadcrumb NavXT

Add breadcrumb navigation to your WordPress blog with this plug-in.

WordPress Related Posts

This plug-in allows your blog to feature "related posts."

SEO Friendly Images

SEO Friendly Images automatically adds the ALT and TITLE attributes to images.

Nofollow Case by Case

This plug-in enables you to decide which comments, pingbacks and trackbacks have or don't have the "nofollow" attribute which is otherwise automatically added by WordPress.

SEO strategy for new websites

Starting a new company is extremely hard, which is probably why most businesses fail within the first couple of years.
Challenges such as marketing and hiring the right staff are some of the major issues that even good managers struggle with.
Launching a new website is just as hard and, as if things weren't difficult enough, Google gives people an extra hurdle which is sometimes called the "sandbox".

When you launch a new site, Google doesn't trust it at all. Even if the BBC launched a new site it would start off without any trust and would receive very little traffic.
As the site attracts links from other sites, it gradually earns enough trust to start ranking for some long tail search terms.
If the site gains enough trusted links it may start to rank highly for competitive keywords (ie keywords with lots of PPC advertisers) but this can take up to 24 months.

You are probably thinking that this is a harsh move by Google, but with the sheer volume of sites being launched it needs to have some method of making sure only the really good ones reach the top.
Luckily you can follow a few simple tips to reach the top quicker.
1. Your first links
When you launch your brand new site the first few links can make all the difference. Linking to it from other trusted sites right out of the gate can dramatically shorten the time it takes to build trust. Conversely, starting a site off with low quality links is suicide.

Submit to a couple of trusted directories such as Business.com and the Yahoo! Directory and maybe link to the site from some of your company's other websites (link from the news sections to appear more natural).

2. Spin-off sites
Sometimes a new site is a spin-off from a particular section of an existing site. For example, a car insurance site might have a caravan insurance section but then decide to launch a dedicated caravan insurance website.
In this case, the best method would be to create the new site on the new domain and not allow Google to access it. Then once the site was ready you would 301 redirect the old pages to the new pages and hope that the rankings and trust from the old domain passed across.

3. Building trust
Most people try to get relevant links to their sites, which are great for improving rankings in a particular niche but not quite as good for building trust. The ideal links for building trust are from major blogs and news sites.
These can also be quite relevant as usually the article linking to you is related to your niche - even though the rest of the site isn't.

Sites that receive a lot of attention from the mainstream press almost always start ranking a lot more quickly than sites that don't have the benefit of a large PR budget.

Why you need to know PPC to be great at SEO

Andrew Girdwood wrote an article in April last year arguing that to be great at PPC you have to be good at SEO.
There are some excellent tips here, which are well worth taking a look back over, but I believe this can be applied both ways.
Looking at this from the opposite perspective, I have listed some reasons why, in my opinion, having a strong understanding of the paid search advertising model can be a big advantage to improving SEO techniques.
Below are a few tips about how you can integrate PPC knowledge into your SEO strategy:
  • Paid search helps you to understand the importance of click through rates (CTRs) because PPC managers will try to write concise and interesting PPC headlines to generate a maximum number of clicks.

    For SEO, you can look to integrate the top performing PPC ad headline ideas into webpage title tags; maximising CTRs for organic listings, while still using optimised keywords relevant to a specific search.

  • In many cases, optimising a website’s homepage is the easy option for targeting the most important keywords. However, if this were a paid search campaign, you would not send users to a general webpage with many navigational options, where the user may not find what they originally searched for, get confused and hit the back button.

    Instead, you would try and keep the bounce rate low by selecting a PPC landing page containing content related to a specific query. Shouldn’t you be doing the same for organic search traffic?
     
  • Organic search snippets are commonly very bland and boring for users to read, but the main objective of a PPC ad is to be compelling so that it attracts a user’s eye and increases the number of clickthroughs.

    PPC descriptions can be very time-consuming to perfect, but once they are performing well they can easily be applied to freshen up meta descriptions and provide a more interesting listing in the SERPs.

    You should also remember not only to use important keywords (so that these are highlighted when searched for), but also to clearly describe the landing page content so that the description attracts as many relevant clicks as possible where traffic is likely to be of a good quality and convert at a higher rate.

    This can be very effective when applied to webpages which are already generating significant volumes of search engine traffic.
     
  • Landing page optimisation is often overlooked during an SEO project. However, it is possible to optimise a webpage effectively while still using many of the usability techniques you would apply to a PPC ad.

    This should create a clear navigational path to direct users towards the end goal of a converting sale or lead. Sometimes you may have to finely balance ranking against landing page quality: for example, a #6 ranking achieving a 5% conversion rate, compared to a less optimised #8 ranking with a 15% conversion rate.

    But the end goal should always be the quality of traffic, so optimising for conversions should always be a major consideration.
     
  • Many website URLs are long, messy and often don’t contain targeted keywords. If you have a PPC display URL which is achieving high CTRs, try to incorporate this into your organic strategy to build a URL structure which is tidier for the search engine spiders and more attractive to click for a searcher.

  • PPC is frequently seasonally-based. For example, in PPC you may heavily promote summer holidays during the early months of a New Year. Seasonality isn’t as common in SEO, but if you look at this with a PPC mindset you may try and optimise for seasonal keywords more heavily.

    Perhaps aiming to improve organic rankings for specific terms such as “Beach Holiday Destinations in 2009”, also targeting additional traffic from Google News and Universal Search results.

    If you already have high rankings for more competitive terms such as “summer holidays”, rather than creating new content on a separate URL you could update the previous version instead, maintaining rankings and providing users with more relevant and higher converting content.
     
  • Actual PPC keyword data can be invaluable to an SEO campaign. While there are many very good keyword research tools available, few compare to actual search volume and impression data from Google AdWords.

    In addition to improving the accuracy of estimating search traffic, the clickthrough rate and conversion figures can be very useful towards selecting terms which are key to a business. Some keywords may not convert as highly as expected when put into practice, so testing these first using PPC means you can focus upon proven high-quality keywords, rather than basing selection on assumptions which may prove to be incorrect.

    This is especially important when you consider that it may take over 6 months to find this out while waiting for rankings to reach a high-traffic referring level.

  • Many high converting PPC keywords are likely to be long tail variations which you may not have considered otherwise. When added to an SEO campaign, these are likely to be less competitive to obtain high organic rankings for and may quickly become a very profitable source of traffic.
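As a toy illustration of those last two points, here is a hypothetical sketch that sorts an exported PPC keyword report to surface proven terms worth targeting organically. The column names (keyword, impressions, clicks, conversions) are assumptions about the export format, not a real AdWords schema:

    import csv

    def promising_seo_terms(path, min_conversions=1):
        # Rank PPC keywords by conversion rate, then CTR, as candidates
        # for organic targeting.
        scored = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                imps = int(row["impressions"])
                clicks = int(row["clicks"])
                convs = int(row["conversions"])
                if convs >= min_conversions and clicks and imps:
                    scored.append((convs / clicks, clicks / imps, row["keyword"]))
        return [kw for _, _, kw in sorted(scored, reverse=True)]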

Six design tips for better URLs

It’s an area that is overlooked on many websites, but URL design is an important consideration. Bad URLs may mean that your website won’t be found, visited, or submitted to sites like Digg.
Ideally, URLs should be short, readable, descriptive and memorable. Visitors to your site, other sites linking to yours, and of course the search engines will appreciate the care you have taken over your URL structure.
Here are a few tips for URL design…
Make URLs readable
If your URLs are readable and describe the content of the web page they lead to in some way, people will be more likely to click on them.
For example, this URL from VentureBeat is easy to understand: http://venturebeat.com/2008/09/17/google-to-buy-valve/, but this one from ClickZ is incomprehensible: http://blog.clickz.com/080916-122203.html. There are far worse ones out there too - for some reason, laws of nature and all that, the bigger the media brand the worse this seems to become (serial offenders include CNN, Reuters, MSNBC, etc).
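Most publishing systems can build readable URLs automatically from the page title. A minimal sketch of the usual slug approach:

    import re

    def slugify(title):
        # "Google to Buy Valve?" -> "google-to-buy-valve"
        slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
        return slug.strip("-")

    print(slugify("Google to buy Valve"))  # google-to-buy-valve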
Keep them short
A short URL is more likely to be remembered and is much easier for people to type straight into the address bar of their browser.
Also, if you need to email a link to someone, a URL longer than around 70 characters will not display well in many email clients. Easier said than done, in many cases (it can be tricky for publishers).
Make them permanent
People bookmark webpages on their browsers, or on social sites like Digg and Delicious, so once you have published a web page, don't alter the URL, as users clicking those bookmarked links will just encounter an error.
Make them guessable
If your URL structure makes sense, then people will be able to guess and navigate around a site by editing the URL. For example, if you are on this page: http://econsultancy.com/blog, then you know that, by removing the 'blog' part, you can get back to the homepage, or add another term, such as 'research', to reach that section of the site.
Use keywords in URLs
Having keywords in URLs can be useful for search engines, as well as making them more readable. This is something that Google has advised in the past.
Another reason for including keywords is that, as the URL will be displayed under the page title and extract in search results page, this information may be used by people considering whether to click on a link.
Avoid session IDs
Session IDs cause problems for search robots because a different URL is generated for each human visitor. As a result, many search robots have a rule that they don't crawl these pages, since the same page can have many different addresses across sessions.
Websites that have registration processes, or wish to track user activity will often use session IDs; an alternative is to use session cookies instead.
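If session IDs can't be avoided entirely, one mitigation is to strip them from any URL you expose to crawlers or bookmarks. A sketch; the parameter names here are common examples, not a definitive list:

    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}  # examples

    def strip_session_ids(url):
        # Return the URL with session-identifier query parameters removed,
        # so every visitor's links collapse to one crawlable address.
        parts = urlparse(url)
        clean = [(k, v) for k, v in parse_qsl(parts.query)
                 if k.lower() not in SESSION_PARAMS]
        return urlunparse(parts._replace(query=urlencode(clean)))

    print(strip_session_ids("http://example.com/page?sid=abc123&color=blue"))
    # -> http://example.com/page?color=blue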

Google provides tips on duplicate content

Duplicate content is an important issue and something that can have an adverse effect on a website’s search engine rankings.
So lots of site owners will be pleased to hear that Google has provided some tips on how to address it.
According to Sven Naumann on Google's Webmaster Central Blog, there are two main types of duplicate content:
Duplicate content within one website
This is often unintentional and can be the result of sites having pages for similar products where the content has been only slightly changed, or because landing pages have been created for PPC campaigns.
In this case, Google recommends that webmasters include the preferred version of the URL on their sitemap file, which will help the search engine's crawlers find the best version.
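A sitemap file is just XML listing the URLs you want crawled, so the fix is to emit only the preferred version of each page. A minimal hand-rolled sketch (in practice you'd usually let a plug-in or CMS generate this):

    from xml.sax.saxutils import escape

    def sitemap_xml(preferred_urls):
        # Emit a minimal sitemaps.org-format file listing only the
        # canonical version of each URL.
        entries = "".join("  <url><loc>%s</loc></url>\n" % escape(u)
                          for u in preferred_urls)
        return ('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                '%s</urlset>\n' % entries)

    print(sitemap_xml(["http://www.example.com/", "http://www.example.com/widgets"]))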
Duplicate content across domains
This refers to content identical to that on your website appearing on third party domains, often when sites use scrapers to copy your text and use it to push themselves up the rankings.
Naumann claims that Google manages to determine the original source of the content "in most cases", and that having your content copied shouldn’t impact on your search rankings.
He offers the following tips if sites with scraped content are ranking higher than the original website:
  • Make sure your site’s content is being crawled by Google.

  • Check the Sitemap file to see if you made changes for the particular content which has been scraped.

  • Make sure your site is in line with Google’s webmaster guidelines.

Google's Matt Cutts' tips on SEO


Google software engineer Matt Cutts has summed up a site review session he did at the recent PubCon in Las Vegas.
In reviewing a few sites, he gives some insight into what site mistakes a webmaster should avoid, and gives some tips on improving a site’s search engine visibility.
Duplicate content
Some of the people whose sites were reviewed had a number of different domains, and the overlapping content and pages from the various sites can cause problems:
“We discussed the difficulty of adding value to feeds when you’re running lots of sites. One thing to do is to find ways to incorporate user feedback (forums, reviews, etc.).”
“The wrong thing to do is to try to add a few extra sentences or to scramble a few words or bullet points trying to avoid duplicate content detection. If I can spot duplicate content in a minute with a search, Google has time to do more in-depth duplicate detection in its index.”
Organising content
One of the sites Matt looked at contained hundreds of articles going back to 1996, and included a very messy sitemap page listing all the articles.
“So how should a webmaster make a sitemap on their site when they have hundreds of articles? My advice would be to break the sitemap up on your pages. Instead of hundreds of links all on one page, you could organize your articles chronologically (each year could be a page), alphabetically, or by topic.”
Having multiple domains
Some of the sites' owners had many different domains registered:
“My quick take is that if you’re running 50 or 100 domains yourself, you’re fundamentally different than the chiropractor with his one site: with that many domains, each domain doesn’t always get as much loving attention, and that can really show."
"Ask yourself how many domains you have, and if it’s so many that lots of domains end up a bit cookie-cutter-like.”
Reciprocal links
Matt spotted a few instances of sites swapping reciprocal links as a quick method of increasing link popularity.
“When I saw that in the backlinks, I tried to communicate that 1) it was immediately obvious to me, and therefore our algorithms can do a pretty good job of spotting excessive reciprocal links, and 2) in the instances that I looked at, the reciprocal links weren’t doing any good."
"I urged folks to spend more time looking for ways to make a compelling site that attract viral buzz or word of mouth. Compelling sites that are well-marketed attract editorially chosen links, which tend to help a site more.”

Working words: How to write for SEO


I often fill these pages with rants about what not to do when writing copy for search engine optimisation (SEO) and for a web audience.
However, it struck me recently that I have not spent much time exploring best practice in SEO copywriting and how to ensure your content is as fit for purpose as possible.
I am going to remedy that today. Please comment if you have any questions or additions.
Spiders
Spiders are the crawlers sent out by search engines such as Google to trawl through the web, recording details and information about the pages they visit.
What they find determines how well you rank for the words and terms within your site. So, what can you do to help?
1) Choose your keywords and phrases and stay conscious of them to ensure they appear whenever naturally possible within the website's text.
2) Spiders pay more attention to bold and italic words, so when you use such formatting, try to make sure it is a keyword or phrase.
3) They read left to right, so place your keywords as early as possible in the text.
4) Spiders consider headlines and sub headings to be particularly important, so make sure they are as relevant as possible and, if natural and readable, contain your keywords.
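Points 2 and 4 are easy to audit mechanically. Here's a rough sketch, using only Python's standard library, that checks whether a keyword appears anywhere in a page's headings or emphasised text:

    from html.parser import HTMLParser

    class EmphasisAudit(HTMLParser):
        # Collect the text inside heading, bold, and italic tags.
        WATCHED = {"h1", "h2", "h3", "b", "strong", "i", "em"}

        def __init__(self):
            super().__init__()
            self.depth = 0
            self.snippets = []

        def handle_starttag(self, tag, attrs):
            if tag in self.WATCHED:
                self.depth += 1

        def handle_endtag(self, tag):
            if tag in self.WATCHED and self.depth:
                self.depth -= 1

        def handle_data(self, data):
            if self.depth and data.strip():
                self.snippets.append(data.strip())

    def keyword_emphasised(html, keyword):
        audit = EmphasisAudit()
        audit.feed(html)
        return any(keyword.lower() in s.lower() for s in audit.snippets)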
Humans
I am often amazed at how many people forget the real purpose of SEO. The motive is not to get to the top of the search engine results pages, not really. That is like saying the purpose of cooking is to heat meat, rather than eat it.
The idea is to gain greater visibility and traffic. That means all content on a page must be interesting to people. People, not search engines.
So, make your copy fun, interesting, relevant, grammatical. Everything a real person looks for when browsing the web. Also, try these tips:
1) Alliteration is good. Humans like alliterative phrasing and looking out for such opportunities allows the writer to concentrate on creating elegant copy rather than just functional words.
2) Use short sentences. The online reader is lazy. To help them read the page more easily, keep sentences short and try to limit paragraphs to just two or three lines.
3) Keep a sense of humour. Unless your pages cover genocide throughout history, there will be opportunities to make the odd joke. Seize these. The more personality you have on your site, the better.
4) Call to action. Your copy does not just exist for the reader to read, you are trying to secure business. While your content should not be one endless pitch, do not be afraid to include a call to action somewhere in it.
Linking out
There is a lot of confusion out there over links. Most people are happy with the basics: inbound links good, farmed links bad and so on, but some people worry that linking out from your content could devalue your site.
I believe there are three important points here.
1) If you are a human reading a website, outbound links to sources provide credibility and relevant further information. That makes you happy and gives you a positive impression of the page.
2) Should a search engine spider be looking at your content, outbound links to relevant pages are no problem, it will only get suspicious if you are linking out all the time and to irrelevant places.
This will devalue your site's importance to the search engines and, of course, too many links can make you look very suspicious.
3) If you want to link to a new page, make sure it opens in a new window. The last thing you want to be doing is directing people away from your site.
Linking in
Inbound links are brilliant; the more the better, particularly if you can receive them from high authority pages.
However, the main way to secure these links is to create impressive content that real people will want to link to.
This means good copywriting can result in good SEO. Rubbish and repetitive content may be cheaper or easier but it will not provide the long-term benefits you need.
Quality is everything
If you can't write, don't. If you can, then make sure you take the time to really hone your copy and make it as good as it can be.
Each time a company commits words to its pages, it is presenting itself to a potentially huge audience.
This might be the page that scores hugely in Digg or Sphinn and carries your firm's name around the world. Or, it might be that just one person sees it but he or she carries enormous buying power.
Make your words impress the search engines and the online community. Make your words work.

Tips on optimising your blog for SEO and readers

Jennifer Slegg at Search Engine Land has some useful tips on optimising your blog, for the benefit of both your readers and the search engines.
Some of Jennifer's tips focus on increasing your visibility in search engines; others, including advice on fonts and descriptive titles, aim to make your blog more accessible and easier to read.
Here is a small selection of tips:
Make sure you have RSS available:
“Many hosted blogging solutions don’t have RSS automatically available, so you will need to add it. And when you do add it, ensure you have those RSS links in an obvious spot. Place all those handy subscribe links in your sidebar, which is exactly where people will look for them.”
Post regularly:
“The more frequently you post, the more likely Googlebot and other bots will stop by on a more regular basis. Google loves updated fresh sites, so it makes sense to feed the bot what it wants.”
Watch out for Trackback & Comment Spam:
“You don’t want Google or Yahoo to find masses of spammy links on your site. Use one of the many tools on the market for your blog platform to manage both comment and trackback spam.”
Be Aware of Your Anchor Text:
“When you link to someone’s blog entry, or even a previous blog entry on your own site, make sure you link well. This means instead of linking to someone’s blog entry with the anchor text “click here”, you link to them using anchor text related to the blog entry.”

Tips on internal linking strategy

Internal linking is an important and often overlooked element of SEO and, unlike external links, the site owner has complete control over this, so it's crucial to make the most of it.
Scott Allen of Search Engine Guide has written a useful article on internal linking strategies. Here are a couple of his tips, alongside some from our SEO Best Practice Guide...
Link within your content
Scott advises webmasters to go through their site's content and work out where it would make sense to link to other pages which you want to rank better.
Make sure that the anchor text matches the keywords and phrases that you want the pages to rank well for. Scott also advises that these target keywords should appear on the destination page, as this will make the linking more effective.
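
If you want to see which anchor texts you are already using internally before reworking them, a short script can list every internal link on a page. Here is a minimal sketch in Python, assuming the requests and BeautifulSoup libraries are installed and using sitename.com as a placeholder domain:

    import requests
    from bs4 import BeautifulSoup

    page = "http://www.sitename.com/some-article"
    soup = BeautifulSoup(requests.get(page).text, "html.parser")

    # Print each internal link alongside the anchor text pointing at it
    for a in soup.find_all("a", href=True):
        if a["href"].startswith("/") or "sitename.com" in a["href"]:
            print(a["href"], "<-", a.get_text(strip=True))

Anchor texts that come back as "click here" or "read more" are the ones worth rewriting around your target keywords.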
Link from stronger pages
Some webpages may be more popular in the search rankings and receive more links than the rest of your website. You can strengthen these weaker pages by including some contextual links to them from the stronger pages.
Some additional tips from E-consultancy's SEO Guide...
Links from footers
This approach can also help with usability, as it is effectively a mini site map on every page. It also allows some lengthier keyphrases to be included.
For example, E-consultancy ranks well in Google for keyphrases such as 'web project management' and 'online surveys & research', both of which are included in the footer.
Sitemaps
Spending time on a well-constructed sitemap is not only effective for SEO, but also has usability benefits.
  • Always have a sitemap with text links, not images.

  • Make sure your anchor text reflects user behaviour and keyphrase analysis, so include not just product names, but also the most popular keywords for that page.
Links from e-newsletters and blogs
Articles in newsletters and blogs are another source of internal links and, as they are often hosted on separate or sub-domains, they may provide additional value.
Include relevant anchor text in the links back to the main site, not just the title of the article.

50 SEO tips for online retailers

SEO for online retailers is the process of improving a website's potential in order to gain more organic, non-paid traffic from the major search engines. Normally, SEO uplift doesn't happen overnight, and it can take a long while to rank well for non-brand key terms.

The rule of thumb is this: the more competition a term has, the harder you'll find it to rank for that term. With that said, you've got to start somewhere, and there are at least 50 ways I can think of to improve your SEO.
Choose your hosting provider carefully
1. If you're targeting one specific region, say the UK, ensure that the physical IP address is country-specific, which will improve the likelihood of ranking in the UK
2. Always opt for a fixed IP address even if it costs slightly more
3. Run an IP address search to ensure the IP address hasn't been blacklisted before. Domain Tools is an excellent source for quick IP address lookups
4. Ensure the server returns accurate responses (a quick way to verify these yourself is sketched after this list):
  • 200 OK The request has succeeded - As an example you should see this server response for your homepage (www.sitename.com)
  • 301 Moved Permanently - As an example you should see this server response for your non www version of your homepage (sitename.com)
  • 302 Found - Use this server response only if you are redirecting temporarily
  • 404 Not Found - Always return a correct 404 response so you get an indication when a page is broken and can offer a better user experience
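
To verify those responses yourself, here is a minimal sketch in Python, assuming the requests library is installed and using sitename.com as a stand-in for your own domain:

    import requests

    # The canonical homepage should return 200 OK
    r = requests.get("http://www.sitename.com/", allow_redirects=False)
    print("www homepage:", r.status_code)  # expect 200

    # The non-www version should 301-redirect to the www version
    r = requests.get("http://sitename.com/", allow_redirects=False)
    print("non-www:", r.status_code, "->", r.headers.get("Location"))  # expect 301

    # A deliberately bogus URL should return a proper 404
    r = requests.get("http://www.sitename.com/no-such-page", allow_redirects=False)
    print("bogus page:", r.status_code)  # expect 404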

Increase crawl rates because you can never get enough of Google
5. To check when your site was last crawled and indexed, search for site:sitename.com and play with “date range” advanced search options
6. Update the site's content as often as possible. For online retailers, new promotions and offers provide a fantastic opportunity to update content
7. Ensure pages are loading quickly by analyzing your code, content and images. Web Page Analyzer is an excellent tool for analyzing page load times
8. Fix duplicate content issues, such as having two versions of your homepage, for example www.sitename.com and www.sitename.com/index.php
9. Add an XML site map and submit it to the major search engines
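
Tip 9 needs no special tooling; an XML site map is just a small file listing your URLs. Here is a minimal sketch in Python that writes one in the standard sitemaps.org format (the URLs and file name are illustrative):

    from xml.sax.saxutils import escape

    urls = [
        "http://www.sitename.com/",
        "http://www.sitename.com/products/",
        "http://www.sitename.com/offers/",
    ]

    # Write a minimal sitemap in the sitemaps.org 0.9 format
    with open("sitemap.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write("  <url><loc>%s</loc></url>\n" % escape(url))
        f.write("</urlset>\n")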

Ensure image optimisation across the site to enjoy traffic from Google image search

10. Keep images at folder level rather than on a subdomain, so sitename.com/images/ is better than images.sitename.com
11. Use a descriptive name for the image file, such as the product name, and give every image alt text that is just as descriptive
12. Add a caption by placing a small description directly under, above or beside your image
13. Where possible, save images in JPEG format
14. Use a free tool such as Xenu to find images with no alt text
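
If you would rather script the audit from tip 14 than run a desktop tool, here is a minimal sketch in Python, assuming the requests and BeautifulSoup libraries are installed and using a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("http://www.sitename.com/").text
    soup = BeautifulSoup(html, "html.parser")

    # Report every image with a missing or empty alt attribute
    for img in soup.find_all("img"):
        if not img.get("alt"):
            print("Missing alt text:", img.get("src"))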

Ensure metadata optimisation to get a high level of qualified traffic
15. Ensure that every page has unique metadata in terms of page title and page description
16. Limit page titles to 70 characters and page descriptions to 150 characters
17. Don't bother too much with the meta keywords tag; the major engines largely ignore it, so spend the time elsewhere
18. Optimise each page around one key term
19. Place the most important term first, followed by a soft (non-spammy) call to action and your brand
20. For product pages, opt for an auto-generated metadata solution based on <product title> + <call to action> at <site name> (a sketch follows this list)
21. Use AdWords to test which text gets the best CTR by creating a few ad variations that include your key term
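
As promised in tip 20, here is what an auto-generated metadata template can look like in code. This is a sketch in Python; the function names and wording are my own, and the length caps come from tip 16:

    def product_title_tag(product, cta="buy online", site="Sitename"):
        # <product title> + <call to action> at <site name>, capped at 70 chars
        return ("%s - %s at %s" % (product, cta, site))[:70]

    def product_description(blurb):
        # Page descriptions capped at 150 characters
        return blurb[:150]

    print(product_title_tag("Red iPod Nano 8GB"))
    # -> Red iPod Nano 8GB - buy online at Sitename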

Content is truly king
22. Every page should have unique content which reads well for users (and therefore for the search engine spiders as well)
23. Don't repeat the key term more than three times, to avoid keyword stuffing (a simple counter is sketched after this list)
24. Place the key term in the page H1 title, image alt text and once in bold
25. Use "recommended products" to link between similar products to increase their relevancy

Apply essential URL and coding tweaks
26. Use robots.txt to block parts of the site you don't wish the engines to index (a way to verify your rules is sketched after this list)
27. Offer an HTML site map which is auto updated based on the XML site map
28. For sites running on an Apache server, use an .htaccess file to avoid content duplication
29. Use breadcrumbs navigation across the site for better user experience and SEO
30. From time to time, view your site using a text browser such as SEO Browser to "see" how spiders are likely to find your on page content
31. If your site architecture has more than three levels, restructure it to make the information more accessible to both users and spiders
32. Keep URLs short for better SEO and a better viral effect, as short URLs are more memorable
33. Include your key term such as a product title in the URL
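
To verify the robots.txt rules from tip 26, and confirm you haven't accidentally blocked pages you want indexed, Python's standard library can parse the file for you. A sketch, with a placeholder domain and paths:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("http://www.sitename.com/robots.txt")
    rp.read()

    # Confirm the pages you care about are crawlable by a generic bot
    for path in ["/", "/products/", "/checkout/"]:
        ok = rp.can_fetch("*", "http://www.sitename.com" + path)
        print(path, "crawlable" if ok else "BLOCKED")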

Apply essential maintenance from time to time
34. Fix all your 404 errors and consider redirecting visitors to a more appropriate page (a simple checker is sketched after this list)
35. When products are removed from stock or discontinued, ensure that a 301 is placed to the main category or to a similar product
36. When linking to another site, consider checking whether you're linking to a bad neighborhood using a free tool such as text link checker
37. If you have multiple domains, unify around one domain using a 301 redirect, taking into account the links pointing to each domain, domain age and the domain name
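
For tip 34, you don't have to wait for webmaster tools to surface broken pages. Here is a sketch of a simple checker in Python, assuming the requests library is installed and with illustrative URLs:

    import requests

    pages = [
        "http://www.sitename.com/old-product",
        "http://www.sitename.com/discontinued-range",
    ]

    for url in pages:
        code = requests.head(url, allow_redirects=False).status_code
        if code == 404:
            print("404 - consider a 301 to a relevant category:", url)
        elif code == 301:
            print("301 already in place:", url)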

The Google PageRank issue
38. Don't pay too much attention to Google PR, as it won't affect your ranking
39. If you want to control how PageRank flows, use the nofollow attribute on links to pages such as "terms and conditions", "privacy policy" etc

Content is king, so start blogging
40. Place your blog at directory level, so www.sitename.com/blog/ rather than a subdomain blog.sitename.com
41. Blogging at least once a week will help increase your crawl rate
42. Read The Definitive Guide To Higher Rankings For Your Blog

Get more links, otherwise no one will see you
43. If you're considering directories as part of your link acquisition, focus on quality reviewed directories such as Yahoo and Best of the Web
44. Don't pay for links; you'll get caught at some point
45. Help your customers help you by placing on each page an easy way to share content using a sharing tool such as AddToAny
46. Ask for links, take 1: place a nice soft request to link back to your site in your sale confirmation email
47. Ask for links, take 2: ask your suppliers and contractors for a link
48. Give bloggers prior notice of new products and ask for a review
49. Offer great products at competitive prices and the links will come organically
50. Kick off a social media strategy to encourage discussion (and links) on social networks and other user-powered sites

March 28, 2011

5 SEO techniques for website images

One of the most overlooked aspects of SEO is images. Most websites have lots of images but few actually apply SEO techniques to them.
Not implementing SEO techniques with your images could mean that you're missing out on valuable traffic from Google Image Search, which is one of Google's most popular properties. Here are 5 SEO tips that can help you capitalize on all of the searches that are being done for images.
Use Descriptive Image Names and Folder Names
If you're using image names like 00103.jpg on your website, you can't expect search engines to easily identify what the image might contain.
Use descriptive image names instead. If your website contains an image of a red iPod Nano, for instance, the image name red-ipod-nano.jpg is better. If you really want to get sophisticated, you can build a folder structure that includes relevant keywords as well (e.g. /products/apple/ipods/red-ipod-nano.jpg).
Unfortunately, a lot of content management systems and ecommerce platforms automatically give uploaded images useless names (and place them into folders with useless names) so you may need to modify your software to achieve better naming and folder structures. But if you have lots of images, doing so may be a worthwhile investment.
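
If your platform lets you hook into image uploads, a tiny helper can produce these descriptive, hyphenated names automatically. A sketch in Python; the function name is my own:

    import re

    def image_slug(product_name, ext="jpg"):
        # "Red iPod Nano" -> "red-ipod-nano.jpg"
        slug = re.sub(r"[^a-z0-9]+", "-", product_name.lower()).strip("-")
        return "%s.%s" % (slug, ext)

    print(image_slug("Red iPod Nano"))  # red-ipod-nano.jpg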
Use Descriptive Alt Tags
Another way you can give search engines clues about what's contained in your images is to use descriptive alt tags.
The more descriptive, the better, within reason, of course. You don’t want to be too generic, but at the same time you don’t want your alt tags to contain a Tolstoy novel. For instance, instead of using 'Ford Mustang' as your alt tag, 'This blue 1965 Ford Mustang won best of show' is better.
On my websites, I've noticed that Google Image Search seems to prefer sentence-form descriptions. I won't claim I have enough data to call this observation conclusive, but it seems logical: text in sentence form is likely to contain more descriptive keywords, and it probably signals to search engines that you're not spamming.
On that note, a reminder: do not under any circumstances use keyword stuffing in your alt tags.
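
One crude way to catch accidental stuffing before a page ships is to flag alt text that repeats any single word too often. This is purely a heuristic sketch in Python, not a rule any search engine publishes:

    from collections import Counter

    def looks_stuffed(alt_text, max_repeats=2):
        # Flag alt text where any one word appears more than max_repeats times
        words = alt_text.lower().split()
        return any(n > max_repeats for n in Counter(words).values())

    print(looks_stuffed("This blue 1965 Ford Mustang won best of show"))   # False
    print(looks_stuffed("mustang ford mustang car mustang deals mustang")) # True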
Use Descriptive Anchor Text
As with alt text, if you're linking to your images using text, use good anchor text that describes the contents of the image. Most of the time, this probably means that anchor text contains some of the same keywords you've used in the image name and alt text.
Use Larger Images
I've read several reports suggesting that Google Image Search prefers images that are on the larger side. While I have no first-hand evidence of this, it's important to remember that SEO isn't truly effective unless users click on your listings.
It makes sense that someone searching for an image will be more inclined to click on a larger, higher-quality image than a smaller, lower-quality one, so using a larger image seems to be a good approach where available and appropriate.
Focus on the Page
As with all SEO, context is everything. It's not just about naming your image files right, using good alt text, etc. It's about making sure that the pages your images are located on are tasty to search engines too.
When your pages themselves are well-optimized, the implementation of these image SEO tips will be icing on the cake.

25 SEO tips to create trust with the search engines

There's no two ways about this: Google and the other search engines have their favourites. I'm sure you've seen it all before, either working for your client or evaluating your competition. There are a number of sites in every niche that rank well for whatever content they publish, whether or not that content is optimised or has any inbound links, and without really trying too hard.

You, on the other hand, might have worked hard to rank for that content, earned some great natural links and plenty of buzz, but you've got little to show for it. What you don't know is that these websites have managed to reach a high trust level with the search engines, which helps their content rank highly.
Building trust with the engines takes time (and patience); here are 25 essential SEO tips that will help:
1. Update your whois records and ensure there's a visible record of the person or company that owns the domain, including address, telephone number and email address.

2. Although this might raise some eyebrows, I've had better results with the established top-level domains, so consider .com, .org, .net and the like (even if a .tel domain does look cool).

3. Domain age is very important to Google, so when starting a new online business, consider purchasing an existing domain rather than registering a new one. Sedo is a great place to start your search.

4. If the domain has more than 2 words, don't use hyphens in the URL otherwise it might look spammy so www.PremiumBlueWidgets.com is better than www.Premium-Blue-Widgets.com.

5. Google assumes that if you've started an online business you're here to stay for the long run, so whether you're purchasing a new domain or renewing one, go for a longer renewal period of two-plus years.

6. Get a fixed clean IP address and stick with it for the long run.

7. Verify your site with Google Webmaster Central to learn about issues on your site which might affect your trust, such as spyware infestation.

8. Place your contact details including phone number, address and even pictures of your office clearly on the site.

9. Have a visible privacy policy and terms and conditions where applicable, and add them to your site maps.

10. Don't over-optimise your site because it could be an indication of some artificial work going on.

11. Don't repeatedly cross-link between other sites you own, as it might be seen as an attempt to 'game the engines'.

12. Don't update existing content too often as it might be seen as a sign of manipulation.

13. Don't use doorway pages; they frustrate both users and the engines.

14. Your links should come organically and the acceleration of link popularity should reflect that, so 1000 backlinks to a new site in one day is a bad sign.

15. Backlinks should include unoptimised anchor text as well as optimised text, for example 'the-websol.blogspot.com' rather than 'conversion rate optimisation', because that's how most people link naturally.

16. When linking to you, the referring pages should have as few outgoing links as possible.

17. Keep your pages below 100K to ensure they're easily crawled (a quick size check is sketched after this list).

18. Use Google's XML site map generator to create a site map and keep it updated whenever you add or remove pages.

19. Analyze your robots.txt file to ensure you're not blocking parts of your site. SEO Book has an excellent tool to analyze your robots.txt file.

20. Build on your great content to create natural links, and avoid paid links; count on links earned through social bookmarking instead.

21. Your content should have a very low bounce rate, as a higher bounce rate is an indication of less relevant content.

22. A link from the three major human-edited directories (DMOZ, Yahoo Directory and Best of the Web) helps.

23. Limit your dofollow links to 100 (you might get away with 200), though this is very hard to maintain simply because the nature of the web is sharing.

24. For new domains, limit your dofollow links to trusted sites within your industry for the first 6 months.

25. Google likes fresh new content and even rewards it via its fresh content algorithm, so the more fresh, unique content, the better.
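
As promised in tip 17, here is a quick way to check page weight. A sketch in Python, assuming the requests library is installed and using a placeholder domain:

    import requests

    resp = requests.get("http://www.sitename.com/")
    size_kb = len(resp.content) / 1024.0

    # Tip 17: aim to stay below 100K per page
    print("Page size: %.1fKB" % size_kb)
    if size_kb > 100:
        print("Consider trimming markup, scripts or inline styles")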

SEO tips for product pages

The SEO benefits of product pages can often be overlooked, with many just offering basic product details, photos and a brief description.
So how can you make the most of them from a search marketing perspective?
Here are some suggestions...
  • Unique product descriptions
    Avoid simply reproducing the manufacturer’s product description as several other sites will probably be using it already. It may be easier but will not help you move up the rankings as search engines will mark it down as duplicate content.

    A unique product description should not only help your conversion rate, it should also allow you to attract more search traffic.

    In addition, a slightly lengthier product page will enable you to use more of your most important keywords and phrases. This could improve your rankings for a range of terms.

  • Use keywords in URLs
    Many online retailers, especially those with large product ranges, use dynamic databases, which can produce some ugly, lengthy URLs which bear no relation to the product.

    Adding related keywords to your product page URLs will help them rank more highly in search engines, as well as making them more understandable.

  • Use customer product reviews
    Customer reviews can not only help persuade shoppers to make purchases, they can also be very useful for SEO.

March 17, 2011

Why You Should Never Duplicate Your Competitor's SEO Strategies

Engaging in competitive research before and during your SEO, PPC, Social Media, and Link Building campaigns is smart business. As they say, "information is power."
But too much information can also be a handicap. It's not difficult to become so inundated with information that you suffer overload or run into conflicting advice. That leads to decision paralysis: you don't know the right course of action to take, or you wind up using good information to make bad judgment calls.
Some time ago, I was working on a client's keyword research and received the following email:
We decided to optimize our website only for keywords that bring up our competitors when searched. So, what I have to do is to take every keyword that is in your research and to run a search on Google to see if our competitors are there. You'll hear back from me early next week.
I have no doubt that if this client's competitor jumped off a bridge, the client would follow. This is a great example of taking information you have and making a bad decision with it.
Now, there is nothing wrong with wanting to be ranked for the same keywords your competitors are ranked for. But, this cannot be your sole optimization campaign strategy.
Dave Thomas, the founder of Wendy's, once said he wanted to place a Wendy's across the street from every McDonald's in America. A smart strategy. It follows the same basic principle as why car dealerships all congregate together: customers looking for one may be swayed when they see more available options.
But, here is what Dave Thomas knew about McDonald's that I guarantee most people don't know about their own competition: McDonald's does a significant amount of research before building a new store in a new location. Thomas realized that McDonald's only enters markets where they are confident their restaurants will thrive. As Dave saw it, what was lucrative for Ronald would also be profitable for Wendy!

How SEO Smart Is Your Competition?

Before you follow your competitor off that cliff, are you sure each of your competitors has performed the right research on all their keywords? Do you know whether every keyword they rank for is bringing in traffic and conversions? Have they employed research strategies that have gotten them ranking for every possible keyword that will produce profits?
More than likely, the answer is "no" to more than one of those questions. That's not to say that any of your competitors don't know what they are doing. In fact, they may have a very strong and successful online marketing campaign. But, chances are pretty good they are not doing all things perfectly.
Are there some targeted keywords that they are not ranking for? Do they know all the different ways a potential customer will search for their product or service? Are they investing time into keywords that produce little traffic or no conversions? If you don't know the answers to any of the questions posed above, then this may not be someone you want to blindly follow when it comes to setting the course for your own online marketing efforts.

Is Your Competition Making Mistakes?

From a competitive standpoint, it's always good to know what your competitors are doing, who they are targeting, and what areas they are venturing into. A failure to know this information can lead to developing a poor business marketing strategy. While Dave Thomas wanted to be everywhere his competitor was, he also never stopped identifying locations to put a Wendy's that McDonald's hadn't yet exploited.
We often explore our own clients' competitors and see that many do not have a full grasp of which keywords they should be targeting. Part of this is ignorance. Another part is a lack of insight from those running the SEO campaigns. Or it could be strictly a lack of budget invested in SEO. Who knows.
Those that employ a "me too" marketing strategy will undoubtedly find themselves following competitors through the same mistakes, costing themselves valuable time and money. Or, in the case of the client I mentioned above, missing out on entire segments of convertible traffic solely because their competitor isn't ranking for the same phrase.
Think about what can be accomplished (and how much money can be saved) if marketing dollars are placed into a more forward-thinking campaign: one that doesn't focus solely on competitors but instead focuses on the audience. After all, it's not your competitors who'll be buying from you; it's your targeted consumer.

How Budget Smart Is Your Competition?

But there is one area where it may be important to follow in your competitors' footsteps: the breadth and reach of the campaign. I often hear from business owners wanting to outperform their competition in both natural and paid rankings, but they don't want to invest the money needed to make that happen.
This is where it becomes difficult for those of us managing the campaigns. An SEO can only do what the budget allows. If your competition is outspending you ten to one, and they have good people managing their campaigns, there is little chance that you'll be able to outperform them, no matter how much you cross your fingers, tap your heels together, or complain to your SEO that you're not doing as well as you had hoped.
Money isn't everything in SEO, but it certainly does open the door to a greater online presence and bolder optimization strategy. A bigger investment can implement broader keyword research, more targeted link building, and a more keyword and search engine friendly site. These things matter in SEO.
That's not to say you have to match your competition dollar for dollar. Working smarter is just as good as working harder. But, unfortunately, it still takes money to make money.
Doing what your competitors do, without ever really understanding why, is a bad SEO strategy. Pay attention to what your competitors are doing, but also know why, and make sure those same goals and objectives match up with your own before following them down ANY path, including one that might require a larger investment into your online marketing campaign.
Ultimately, you want to be able to compete for business for the same keywords, provided they are the right keywords. But you also want to find and exploit areas that your competition hasn't.
If your online marketing campaign is simply a reaction, you'll never be ahead of your competitors. You'll always be playing catch-up. Instead of being the "me too" guy, you can become the industry authority, leaving the others playing catch-up and trying to be like you.

Are you policing your domain from spammers?

It was not a pretty sight. I watched the look on his face as he was shown a page from his domain that should not have been there. Precisely how it got there, no one knows, but it was clearly placed on his site by search spammers out to gain an advantage for some of their Web sites. It was a lovely little page about prescription drugs, chock full of links to other places. How could that page have gotten there? And what was it there for? Welcome to the seedy little world of black hat SEO. If you don't know whether your site is vulnerable, you need to find out, so that you can make sure your own site is properly protected.
So let's first examine why anyone would put such a page on a Web site. That one is simple. The links from that site were highly valuable to spammers. In this case, not only was it a well-known site, but it was a .org site, whose links are even more valuable than those from .com sites, because they are more likely to be genuine expressions of quality. Except in this case.
How is it that the site owner didn't know the page was there? That one's easy, too. The spammer did not link to the page from anywhere on the real site, so the only way you'd discover it would be if you knew the URL, or if you were checking the server for stray pages.
How can you protect yourself? That question is a bit tougher, but your Webmaster needs to answer it:
  • Protect your userIDs. Carelessly leaving default passwords on well-known IDs (such as root) or using easy-to-crack passwords leaves you wide open for a drive-by spammer. Did you know that software programs can try millions of passwords over time to find the one for your site? Don't make it easy for them.
  • Keep up with security patches. Your Webmaster ought to be keeping up with exploit notifications for any software installed on your Web server. Always applying the latest security updates makes it much harder for spammers to sneak in through an unguarded spot.
  • Monitor suspicious traffic. Your server logs all traffic to your site, and you can install programs that search the logs for failed access attempts and other odd patterns. Some people block suspicious IP addresses, but I think the real villains just troop off to a new IP address from their bank of addresses. The real reason to monitor traffic is so you'll spot that cracker program trying a million passwords, which makes you especially vigilant because you know you are under attack.
  • Monitor stray pages. You were waiting for this one, right? If you know what pages should be on your site, you can check the server for any that don't belong. Often, greedy spammers put them right in the top-level www directory, because the closer a page sits to the home page, the more its links might be worth.
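
Here is one simple way to script that stray-page check, assuming you keep a manifest of the files that are supposed to be on the server. A sketch in Python; the manifest file and web root path are illustrative:

    from pathlib import Path

    # Files you know should be on the server, one relative path per line
    manifest = set(line.strip() for line in open("manifest.txt"))

    # Walk everything actually sitting under the web root
    webroot = Path("/var/www/html")
    for f in webroot.rglob("*"):
        if f.is_file() and str(f.relative_to(webroot)) not in manifest:
            print("Stray file - investigate:", f)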
Understand, I used to manage the Webmasters at ibm.com, but I am not a real Webmaster. Real ones know that this was the Bert and Ernie explanation of Web security. If you are using a shared hosting plan, then your Web hosting company probably does this stuff for you, but if you are using dedicated or partial server or cloud server hosting, you might be expected to do it yourself. If you host your own servers, you definitely need someone to protect your site.
But don't overlook one last possibility of how that spammy page got onto that poor .org site: the inside job. It's possible that their SEO company did it, but it's even more likely that an employee did it, perhaps even their Webmaster. Anyone could try to boost another site, either for personal gain or in exchange for some cash from the spammer.
If you haven't been policing your servers, don't be surprised if someone is squatting on a few pages that you don't even know are there.