March 17, 2011

Why You Should Never Duplicate Your Competitor's SEO Strategies

Engaging in competitive research before and during your SEO, PPC, Social Media, and Link Building campaigns is smart business. As they say, "information is power."
But too much information can also be a handicap. It's easy to become so inundated with information that you suffer overload or receive conflicting advice. That leads to decision paralysis: you don't know the right course of action to take, or you wind up using good information to make bad judgment calls.
Some time ago, I was working on a client's keyword research and received the following email:
We decided to optimize our website only for keywords that bring up our competitors when searched. So, what I have to do is to take every keyword that is in your research and to run a search on Google to see if our competitors are there. You'll hear back from me early next week.
I have no doubt that if this client's competitor jumped off a bridge, the client would follow. This is a great example of taking information you have and making a bad decision with it.
Now, there is nothing wrong with wanting to be ranked for the same keywords your competitors are ranked for. But, this cannot be your sole optimization campaign strategy.
Dave Thomas, the founder of Wendy's, once said he wanted to place a Wendy's across the street from every McDonald's in America. A smart strategy. It follows the same basic principle that leads car dealerships to congregate together: customers looking at one may be swayed when they see more available options.
But, here is what Dave Thomas knew about McDonald's that I guarantee most people don't know about their own competition: McDonald's does a significant amount of research before building a new store in a new location. Thomas realized that McDonald's only enters markets where they are confident their restaurants will thrive. As Dave saw it, what was lucrative for Ronald would also be profitable for Wendy!

How SEO Smart Is Your Competition?

Before you follow your competitor off that cliff, are you sure each of your competitors has performed the right research on all their keywords? Do you know that every keyword they rank for is bringing in traffic and conversions? Have they employed research strategies that have gotten them ranking for every possible keyword that will produce profits?
More than likely, the answer is "no" to more than one of those questions. That's not to say that your competitors don't know what they are doing. In fact, they may have a very strong and successful online marketing campaign. But chances are pretty good they are not doing all things perfectly.
Are there some targeted keywords that they are not ranking for? Do they know all the different ways a potential customer will search for their product or service? Are they investing time into keywords that produce little traffic or no conversions? If you don't know the answers to any of the questions posed above, then this may not be someone you want to blindly follow when it comes to setting the course for your own online marketing efforts.

Is Your Competition Making Mistakes?

From a competitive standpoint, it's always good to know what your competitors are doing, who they are targeting, and what areas they are venturing into. A failure to know this information can lead to developing a poor business marketing strategy. While Dave Thomas wanted to be everywhere his competitor was, he also never stopped identifying locations to put a Wendy's that McDonald's hadn't yet exploited.
We often explore our own clients' competitors and find that many do not have a full grasp of which keywords they should be targeting. Part of this is ignorance. Another part is a lack of insight from those running the SEO campaigns. Or it could simply be a lack of budget invested in SEO. Who knows.
Those who employ a "me too" marketing strategy will undoubtedly find themselves following competitors through the same mistakes, costing themselves valuable time and money. Or, in the case of the client I mentioned above, missing out on entire segments of convertible traffic solely because their competitor isn't ranking for the same phrase.
Think about what can be accomplished (and how much money can be saved) if marketing dollars are placed into a more forward-thinking marketing campaign; one that doesn't solely focus on competitors but instead focuses on the audience. After all, it's not your competitors who'll be buying from you, it's your targeted consumers.

How Budget Smart is Your Competition?

But there is one area where it may be important to follow in your competitor's footsteps. That's in the area of breadth and reach of the campaign. I often hear from business owners wanting to outperform their competition in rankings both naturally and paid, but they don't want to invest the money needed to make that happen.
This is where it becomes difficult for those of us managing the campaigns. An SEO can only do what the budget allows. If your competition is outspending you ten to one, and they have good people managing their campaigns, there is little chance that you'll be able to outperform them, no matter how much you cross your fingers, tap your heels together, or complain to your SEO that you're not doing as well as you had hoped.
Money isn't everything in SEO, but it certainly opens the door to a greater online presence and a bolder optimization strategy. A bigger investment can fund broader keyword research, more targeted link building, and a more keyword- and search-engine-friendly site. These things matter in SEO.
That's not to say you have to match your competition dollar for dollar. Working smarter is just as good as working harder. But, unfortunately, it still takes money to make money.
Doing what your competitors do, without ever really understanding why, is a bad SEO strategy. Pay attention to what your competitors are doing, but also know why, and make sure those same goals and objectives match up with your own before following them down ANY path, including one that might require a larger investment into your online marketing campaign.
Ultimately, you want to be able to compete for business for the same keywords, provided they are the right keywords. But you also want to find and exploit areas that your competition hasn't.
If your online marketing campaign is simply a reaction, you'll never be ahead of them. You'll always be playing catch-up. Instead of being the "me too" guy, you can become the industry authority, leaving the others playing catch up and trying to be like you.

Are you policing your domain from spammers?

It was not a pretty sight. I watched the look on his face as he was shown a page from his domain that should not have been there. Precisely how it got there, no one knows, but it was clearly placed on his site by search spammers, out to get an advantage for some of their Web sites. It was a lovely little page about prescription drugs chock full of links to other places. How could that page have gotten there? And what was it there for? Welcome to the seedy little world of black hat SEO. If you don't know whether your site is vulnerable, you need to find out, so that you can make sure your own site is properly protected.
So let's first examine why anyone would put such a page on a Web site. That one is simple. The links from that site were highly valuable to spammers. In this case, not only was it a well-known site, but it was a .org site, whose links are even more valuable than .com sites, because they are more likely to be genuine expressions of quality. Except in this case.
How is it that the site owner didn't know the page was there? That one's easy, too. The spammer did not link to the page from anywhere on the real site, so the only way you'd discover it would be if you knew the URL. Or you were checking the server for stray pages.
How can you protect yourself? That question is a bit tougher, but your Webmaster needs to answer it:
  • Protect your userIDs. Carelessly leaving default passwords on well-known IDs (such as root) or using easy-to-crack passwords leaves you wide open for a drive-by spammer. Did you know that software programs can try millions of passwords over time to find the one for your site? Don't make it easy for them.
  • Keep up with security patches. Your Webmaster ought to be keeping up with exploit notifications for any software installed on your Web server. Always applying the latest security updates makes it much harder for spammers to sneak in through an unguarded spot.
  • Monitor suspicious traffic. Your server logs all traffic to your site, and you can install programs that search the logs for failed access attempts and other odd patterns. Some people block suspicious IP addresses, but I think the real villains just troop off to a new address from their bank of IPs. The real reason to monitor traffic is so you'll notice that cracker program trying a million passwords, and knowing you are under attack makes you especially vigilant.
  • Monitor stray pages. You were waiting for this one, right? If you know what pages should be on your site, you can check the server for any that don't belong. Often, greedy spammers put them right in the top-level www directory because the closer to the home page on the site, the more that the link might be worth.
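The "stray pages" check above can be largely automated. Here is a minimal sketch in Python: it walks your web root and reports any file that isn't in a list of pages you know you published. The directory path and the known-page list are assumptions you'd replace with your own.

```python
import os

def find_stray_pages(web_root, known_pages):
    """Walk the web root and report any files not in the known set.

    web_root and known_pages are placeholders: point them at your
    document root and the set of relative paths you actually publish.
    """
    strays = []
    for dirpath, _dirnames, filenames in os.walk(web_root):
        for name in filenames:
            # Record the path relative to the web root for comparison
            path = os.path.relpath(os.path.join(dirpath, name), web_root)
            if path not in known_pages:
                strays.append(path)
    return sorted(strays)
```

Run on a schedule (say, a nightly cron job), anything it reports is either a page someone forgot to register or a page a spammer planted.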
Understand, I used to manage the Webmasters at ibm.com, but I am not a real Webmaster. Real ones know that this was the Bert and Ernie explanation of Web security. If you are using a shared hosting plan, then your Web hosting company probably does this stuff for you, but if you are using dedicated or partial server or cloud server hosting, you might be expected to do it yourself. If you host your own servers, you definitely need someone to protect your site.
But don't overlook one last possibility of how that spammy page got on that poor .org site: the inside job. It's possible that their SEO company did it, but even more likely that their employee did it, perhaps even their Webmaster. Anyone could try to boost up another site, either for personal gain or in exchange for some cash from the spammer.
If you haven't been policing your servers, don't be surprised if someone is squatting on a few pages that you don't even know are there.

Who Needs Profits...When You've Got Good Rankings?!!

Search engine marketing is an intense game of strategy, analysis, and patience. But it's also a game with multiple, sometimes even conflicting, goals. Depending on who you talk to, some will tell you SEO is about rankings, while others will tell you it's about conversions. It's a classic political struggle trying to answer the question, "what will bring in the greatest profits?"
You need exposure to get the traffic that leads to new business. But, you need to be user friendly in order to convert the traffic you're getting into new business. Which comes first, the chicken or the egg?
Anyone who has been optimizing a site for more than a week understands the value of getting strong search engine placement. Anyone that has had top rankings for more than a week also understands that bringing in new traffic that doesn't convert is pointless.

Why SEO is Like Government (and why government isn't like SEO)

SEO is a lot like government. No matter how many years we've been at it, there always seems to be more to do. And, like a good (or bad) law, we often don't see the effects right away. But, unlike government, SEOs analyze the results of their work. When a bad strategy is implemented, it gets repealed. Not very often is a bad law or government program withdrawn, regardless of the "unintended consequences."
Sigh.
But, I digress.
With SEO, there is almost always something that can be done to improve your site and your search rankings. But, after making specific changes, you must be patient enough to wait for the results of those changes. Then you can come back and compare the new results against previous results. This is the same whether you're making changes to improve your engine rankings or to increase conversion rates.
The changes you can make to your site are virtually endless when testing is involved. But, making too many changes too quickly, without testing and comparing the results, will almost always lead to a less than optimal marketing campaign.

Making Changes that Make Sense (and a lot of cents)

When you make changes without implementing proper tracking and testing procedures, you will often get positive or negative results (or a combination of both), but you won't be in a position to pinpoint which changes were responsible for which results.
Let's say you made two changes to your home page, one was for rankings, the other for usability. If both rankings and conversions increase, you probably have performed two winning changes. But, what if rankings went up while conversions went down?
Simple, go back and undo the usability changes, right? Not necessarily.
It may be that your optimization changes improved rankings but negatively affected usability, despite the other usability changes you made. The usability changes may have actually produced a positive improvement, but that improvement was counteracted by the optimization changes that, while improving rankings, had a larger negative effect on conversions. Performing both of these changes at the same time makes it hard to pinpoint cause and effect.
Had you performed these changes separately, say the usability changes first, you might have seen an increase in conversion rates with little or no effect on rankings. The following week you would then make your optimization changes to find that your rankings went up, but your conversions dropped to levels lower than they were previously.
Now you know what to do! You undo your optimization changes, because, in this case, better rankings reduced conversions. Since you measured and tracked the results of each change, you can easily undo the change that had the greatest negative impact and then perhaps try something different to improve rankings.
Looking for opportunities to improve your site is an ongoing process. Every change and every test gives you valuable insight into what's working and what isn't. If you uncover a problem, you can't sit on your hands and do nothing. But once a "solution" is implemented, be patient and look to the results to see if it was a viable solution after all.


The Goal is Profits (not first page rankings)

In search engine marketing, there are often many goals: improve rankings, get more sales, increase conversions, drive more traffic, etc.
Profits can be achieved by improving rankings, getting more sales, increasing conversions, driving more traffic, etc. But none of these is the goal itself; each is a means to the goal. Those paths can, and often do, intersect, and any of them can also lead you further away from your goal if you're not careful.
When I talk about getting more conversions for less money, I don't necessarily mean spending less money, though that would be nice. In fact, getting more conversions for less usually means spending more money overall while paying less for each conversion.
Testing every change on your site allows you to keep making improvements in SEO, usability, conversions, etc. so that you can achieve your goal of getting each conversion at a lower cost than the month before. The way I see it, if marketing works the way it should, your marketing budget should always be increasing rather than decreasing, assuming, of course, that you can handle the increased business that the improvements continue to bring in.
Making sure you use a measured approach to all your marketing efforts allows for slow, steady, and consistent growth in profits. When it's all said and done, it comes back to doing all that you can to improve your business, and measuring the results to make sure that what you're doing is working. Measuring only the end result, without measuring the successes or failures of the processes along the way, will only produce a nice-tasting goulash of a marketing campaign. Why settle for that when you can have prime rib instead?

SEO Techniques

SEO techniques are the set of specific tasks performed by a search engine optimization company on behalf of a client who wants high search engine positions to attract targeted traffic, with the intention of increasing conversion rates and brand awareness.

Effective website optimization should enable the search engines to index a site using the most relevant keywords related to the content promoting the client's goods and services. Implementing successful SEO techniques requires an extensive knowledge of logic and a very good understanding of the targeted market sector.

Search engine optimization must be focused on human visitors in order to achieve good-quality traffic and conversion rates. Page content should be specific, informative, and relevant to a search query. Writing relevant, quality content is one of the most important SEO techniques, and it will unlock the doors of your website to real visitors.

A website is the visual and graphic interface to a company. The web designer incorporates many skills, including graphic design and expert coding, to represent the client's goods and services and to reflect the client's quality, expertise, and brand. The foundation of strong SEO should be laid by an SEO specialist prior to the website design stage. A professional web designer and an SEO professional working in tandem will produce an effective Internet platform for search engine marketing.

There are many ways to implement good SEO techniques, but be aware that there are also bad techniques that should be avoided at all costs.

Bad techniques include hidden text, optimizing irrelevant keywords, linking to bad neighbourhoods, keyword stuffing, and doorway (or gateway) pages. These methods yield very little relevant traffic and poor conversion rates, and they carry a high risk of getting your website banned from the search engine index. They are often used by "so called" SEO companies offering services at very, very cheap rates.

Developing your Internet marketing strategy is an ongoing process when optimizing for natural, relevant search engine results. The SEO specialist will become an integral part of the marketing team.

Studying market trends and analyzing competitors and consumers, combined with extensive keyword research, will form the basis of the optimization methodology and outline the SEO techniques required for the Internet marketing strategy.

Key factors to combine within the SEO project plan are as follows:

Domain Name - Short names are easier to remember. Include short primary keywords, without hyphens where possible.

Domain Extension - .com or .net for the global market; .co.uk for UK-specific traffic.

Host Location - If you're attracting UK business, host in the UK.

URL Names - Include relevant keywords, unique to each page.

Robots.txt - A file that permits or denies robots or crawlers access to areas of your site.

Navigation Structure - Keep it simple.

Meta Tags - Title and description. Unique detail for each page, related to the page content.

H1 Tag - Use for the short on-page content description.

H2 and H3 Tags - Use as headings for subcategories within the content.

Page Content - Critical Component.

Internal Keyword Link Strategy - Use targeted keywords to link between pages.

Keyword Visibility - Within page Content.

Image Alt Tags - Help with accessibility.

Privacy Policy - Assures trust and confidentiality.

The site should conform to W3C standards.

Create and submit sitemaps, formatted as .xml, .htm, or .txt.

Create and submit RSS feeds to relevant feed directories.

Create and submit Articles

Find relevant websites within the same market sector or niche and form a link partnership.

Submit your website to relevant or industry-related directories.

Link exchanges should use relevant keyword anchor text.

Utilise relevant Social Networks and Forums related to your Market Sector.

Utilise Blog sites relevant to your Market Sector.

The above factors are proven SEO techniques that will help increase targeted traffic by achieving good or high-ranking search engine positions.

Search marketing companies require an understanding of logical information processing; this article is a primary example of the organic optimization services provided by WebPageOne Solutions.

SEO 101 - Basic Optimization Techniques

Believe it or not, basic SEO is all about common sense and simplicity. The purpose of search engine optimization is to make a website as search engine friendly as possible. It's really not that difficult. SEO 101 doesn't require specialized knowledge of algorithms, programming or taxonomy but it does require a basic understanding of how search engines work.

For the purposes of brevity this piece starts with a few assumptions. The first assumption is a single, small business site is being worked on. The second assumption is the site in question is written using a fairly standard mark-up language such as HTML or PHP. The last assumption is that some form of keyword research and determination has already taken place and the webmaster is confident in the selection of keyword targets.

There are two aspects of search engines to consider before jumping in. The first is how spiders work. The second is how search engines figure out what pages relate to which keywords and phrases.

In the simplest terms, search engines collect data about a unique website by sending an electronic spider to visit the site and copy its content which is stored in the search engine's database. Generally known as 'bots', these spiders are designed to follow links from one page to the next. As they copy and assimilate content from one page, they record links and send other bots to make copies of content on those linked pages. This process continues ad infinitum. By sending out spiders and collecting information 24/7, the major search engines have established databases that measure their size in the tens of billions.
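The link-following process described above is essentially a breadth-first traversal of the web's link graph. As a minimal sketch (in Python, with a `get_links` callable standing in for the fetch-and-parse step a real spider would perform), the idea looks like this:

```python
from collections import deque

def crawl(start_url, get_links):
    """Breadth-first crawl sketch: follow links page to page,
    recording each page once, the way a search spider builds its index.

    get_links is an assumption: a callable returning the outbound
    links found on a page (a real spider would fetch and parse HTML).
    """
    seen = {start_url}
    queue = deque([start_url])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)            # "copy" the page into the index
        for link in get_links(url):
            if link not in seen:     # dispatch a bot only to new pages
                seen.add(link)
                queue.append(link)
    return order
```

A real crawler adds politeness delays, robots.txt checks, and URL normalization, but the core loop (dequeue a page, record it, enqueue its unseen links) is the same.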

Knowing the spiders and how they read information on a site is the technical end of basic SEO. Spiders are designed to read site content like you and I read a newspaper. Starting in the top left hand corner, a spider will read site content line by line from left to right. If columns are used (as they are in most sites), spiders will follow the left hand column to its conclusion before moving to central and right hand columns. If a spider encounters a link it can follow, it will record that link and send another bot to copy and record data found on the page the link leads to. The spider will proceed through the site until it records everything it can possible find there.

As spiders follow links and record everything in their paths, one can safely assume that if a link to a site exists, a spider will find that site. There is no need to manually or electronically submit your site to the major search engines. The search spiders are perfectly capable of finding it on their own, provided a link to your site exists somewhere on the web. Search engines have an uncanny ability to judge the topic or theme of pages they are examining, and use that ability to judge the topical relationship of pages that are linked together. The most valuable incoming links, come from sites that share topical themes.

Once a search spider finds your site, helping it get around is the first priority. One of the most important basic SEO tips is to provide clear paths for spiders to follow from point A to point Z in your website. This is easily accomplished by providing easy-to-follow text links directed to the most important pages on the site, either in the navigation menu or simply at the bottom of each page. One of these text links should lead to a text-based sitemap, which lists and provides a text link to every page in the site. The sitemap can be the most basic page in the site, as its purpose is more to direct spiders than to help lost site visitors, though designers should keep visitors in mind when creating it. Google also accepts more advanced, XML-based sitemaps, which you can read about in their Webmaster Help Center.
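For reference, an XML sitemap of the kind Google accepts is a simple list of URLs; only the `<loc>` element is required, and the other tags are optional hints. The example.com URLs below are, of course, placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-03-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
  </url>
</urlset>
```

The file is conventionally saved as sitemap.xml in the site root and submitted through the search engine's webmaster tools.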

There will be cases where allowing spiders free access to every page on a site is not desirable. Therefore, you'll need to know how to tell spiders, via a "robots.txt" file, that some site content is off limits and should not be added to their database. (To learn more about setting up your robots.txt file, start with Jennifer Laycock's article on robots.txt basics.)
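A robots.txt file lives in the site root and pairs User-agent lines with Disallow rules. A small sketch (the directory names and the bot name are hypothetical):

```text
# Allow all crawlers everywhere except the admin and test areas
User-agent: *
Disallow: /admin/
Disallow: /test/

# Block one specific bot entirely (hypothetical bot name)
User-agent: BadBot
Disallow: /
```

Keep in mind robots.txt is a request, not an access control: well-behaved spiders honor it, but it is no substitute for the server-side protections discussed earlier.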

Offering spiders access to the areas of the site one wants them to access is half the battle. The other half is found in the site content. Search engines are supposed to provide their users with lists of pages that relate to the search terms people enter in their search box. Search engines need to determine which of billions of pages are relevant to a small number of specific words. In order to do this, the search engine needs to know your site relates to those words.

To begin with, there are a few elements a search engine looks at when examining a page. After the URL of a site, a search spider records the site title. It also examines the description meta tag. Both of these elements are found in the "head" section of the source code.

Titles should be written using the strongest keyword targets as the foundation. Some titles are written using two or three basic two-keyword phrases. A key to writing a good title is to remember that human readers will see the title as the reference link on the search engine results page. Don't overload your title with keyword phrases. Concentrate on the strongest keywords that best describe the topic of the page content.

The description meta tag is also fairly important. Search engines tend to use it to gather information on the topic or theme of the page. A well written description is phrased in two or three complete sentences with the strongest keyword phrases woven into each sentence. As with the title tag, some search engines will display the description on the search results pages, generally using it in whole or in part to provide the text that appears under the reference link.
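Putting the two elements together, a "head" section written along the lines described above might look like this (reusing the article's own hypothetical "Blue Widgets by Smith and Co." example):

```html
<head>
  <title>Blue Widgets for Construction | Smith and Co.</title>
  <meta name="description" content="Smith and Co. blue widgets are
    trusted construction widgets used by leading builders. Order
    durable blue widgets for your next construction project.">
</head>
```

The title leads with the strongest keyword phrase, and the description works the secondary phrases into complete sentences, since both may be shown verbatim on the results page.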

Due to abuse by webmasters, such as using irrelevant terms, search engines place minor (if any) weight in the keywords meta tag. As such, it is not necessary to spend a lot of time worrying about the keywords tag.

After reading information found in the "head" section of the source code, spiders continue on to examine site content. It is wise to remember that spiders read the same way we do, left to right and following columns.

Good content is the most important aspect of search engine optimization. The easiest and most basic SEO rule is search engine spiders can be relied upon to read basic body text 100% of the time. By providing a search engine spider with basic text content, you offer the engines information in the easiest format for them to read. While some search engines can strip text and link content from Flash files, nothing beats basic body text when it comes to providing information to the spiders. You can almost always find a way to work basic body text into a site without compromising the designer's intended look, feel and functionality.

The content itself should be thematically focused. In other words, keep it simple. Some pages cover multiple topics on each page, which is confusing for spiders. The basic SEO rule here is if you need to express more than one topic on a page, you need more pages. Fortunately, creating new pages with unique topic-focused content is one of the most basic SEO techniques, making a site simpler for both live-users and electronic spiders.

When writing page content, try to use the strongest keyword targets early in the copy. For example, a site selling "Blue Widgets" might use the following as a lead-sentence;

"Blue Widgets by Smith and Co. are the strongest construction widgets available and are trusted by leading builders and contractors."

The primary target is obviously construction applications for the blue widget. By placing the keyword phrases "blue widgets" and "construction widgets" alongside other keywords such as the singular words "strongest", "trusted", "builders", and "contractors", the sentence is crafted to help the search engine see a relationship between these words. Subsequent sentences would also have keywords and phrases woven into them. One thing to keep in mind when writing page copy is that unnecessary repetition of keywords (keyword stuffing) is often considered spam by search engines. Another thing to remember is that, ultimately, the written copy is meant to be read by human eyes as well as search spiders. Read your copy out loud. Does it make sense and sound natural? If not, you've overdone the use of keyword phrases and need to make adjustments.

Another important element a spider examines when reading the site (and later relating the content to user queries), is the anchor text used in internal links. Using relevant keyword phrases in the anchor text is a basic SEO technique aimed at solidifying the search engine's perception of the relationship between pages and the words used in the link. For example... we also have a popular series of articles on the basics of SEO written by Stoney deGeyter. Linking the term "basics of SEO" is an example of using keyword phrases in the anchor text. Terms such as "SEO 101" or "SEO for beginners" could also have been used.
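In markup, the difference between keyword-rich and generic anchor text is one attribute's worth of words. A small illustration (the URL is hypothetical):

```html
<!-- Keyword anchor text tells the engine what the target page is about -->
<a href="/seo-basics.html">basics of SEO</a>

<!-- A generic anchor points to the same page but carries no topical signal -->
<a href="/seo-basics.html">click here</a>
```

Both links work identically for visitors, but only the first reinforces the relationship between the target page and its keyword phrase.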

Remember, the foundation of successfully optimizing your site is simplicity. The goal is to make a site easy to find, easy to follow, and easy to read for search spiders and live-visitors, with well written topical content and relevant incoming links. While basic SEO can be time consuming in the early stages, the results are worth the effort and set the stage for more advanced future work.

Great Paid Directories - Effective way to get permanent backlinks with high PR

It's well known that promoting a site in the search engines for popular queries requires a lot of "dofollow" links from different websites. Links from pages with high PR are almost always purchased with monthly payments. But what do you do if your budget for search engine promotion is small and there is no possibility of buying links long term? There are old, proven methods for getting backlinks from forums, blogs, and social networking websites. All of these methods are based on writing comments in posts or threads that already have PR or could earn it in the future. As you might guess, it's a lottery. You can guess right and get a good backlink, but in most cases the pages where your links are located will never get high PR. Or your comments could be deleted by moderators who decide they're advertising and you're a spammer. In any case, such methods of getting links can't be relied on as systematic or as guaranteeing good results.


A solution to this problem is offered by GreatDirectories.org, a popular website dedicated to the most effective directories. I must say I'm always somewhat skeptical of web directories, or I was until recently, when I found the service described below. GreatDirectories.org has selected the 13 most effective paid directories with high PR and the availability of deep links. Many of these are authoritative general directories that have existed for several years and have high PR not only on their main pages but also on the inner pages where links are placed. Most important of all, these directories guarantee permanent links: you pay for the submission only once, and the links remain forever. And this is exactly what we need!

The list contains 12 general directories with PR6 and 1 directory with PR5. In addition to a link to your site's home page, each directory provides 3 deep links, which in itself is a good opportunity to boost your inner pages. Almost all of the directories are on different IPs. Best of all, the GreatDirectories.org service guarantees a 25% discount on submission to all 13 web directories.

Benefits of using the submission service for these paid directories:

1. The opportunity to get backlinks from high-PR pages (up to PR5) at low cost.
2. Payment is made once and the links are permanent.
3. You save 25% by using this service compared to submitting to each directory individually.
4. The opportunity to get deep links to your site's inner pages: 3 links in each directory.
5. You can write a different title and description for each directory, giving you unique content on each page where your links appear.

New Google Algorithm Changes

Most online entrepreneurs know by now that content is king, and that phrase has just become more important than ever. Google has always stressed the importance of quality original content, and while many webmasters followed these guidelines, some still decided to take the easy way out and risk the odds.

Google has been under a lot of public pressure lately to deal with the endless spam and the large, low-quality websites that still seem to be getting top spots in the search results. They have finally released another algorithm change to help combat spam websites, content scrapers, and websites that generally republish previously published content.

People have been getting these top rankings by using techniques such as:
  • Commenting on blog posts with dofollow links back to a website using specific keywords
  • Putting links in forum profile pages
  • Spamming comment pages with links
The point of this update is to help ensure that the original publishers of content show up above the copied sources. While only about 2% of queries changed in some way with this update, it is a significant step in combating spam. Now more than ever, it is vitally important that websites create and publish quality content that is totally original and related to their business.
Google is continuing to change its approach to managing this form of blackhat SEO spam that web marketers use to rank new websites.

So, if you have not made a New Year's SEO resolution yet, consider resolving to start publishing great content, and recommend to all your clients that they also add original content on a regular basis. This will help ensure that your website is not penalised and demoted by the Google team.

Content has always been king and it will continue to reign supreme for the foreseeable future. Invest in good content and you will help secure your rankings for the future.

Do Sites Need Fresh Content for SEO?

A lot of clients ask me if they NEED a blog or fresh content added to their site on a regular basis to keep their rankings. The truth is you don't need to add new content to hold your rankings. Many sites stay static for years and hold onto their rankings with few changes ever being made.

You do need some relevant content, so I would always suggest having a minimum of about 6-12 good content pages on various topics, or pages answering questions related to your niche. When adding content, always try to add material that will be useful and informative for your readers.

Avoid adding content just because it targets keywords or because you think you need to add it. That being said, it is good to have a blog and fresh content because the more content you have, the more traffic you can potentially get.

Publishing quality articles in your niche can also help you establish yourself as a knowledgeable industry leader, which can result in more business overall. Blogs and content are also a great way to take advantage of social media marketing on sites such as Twitter and Facebook. Votes on these sites are becoming quality signals for search engines.

So, simply put, having fresh quality content can be a great benefit and is highly recommended, but it is not necessary for achieving or holding onto rankings for the main keyword phrases you have optimized your home page and other core pages for.

J.C. Penney Penalised by Google for Buying Links

Today, the New York Times published an article about a search engine optimization investigation of J.C. Penney. Perplexed by how well jcpenney.com did in unpaid (organic) search results for practically everything the retailer sold, they asked someone familiar with the world of search engine optimization (SEO) to look into it. The investigation found that thousands of seemingly unrelated websites (many that appeared to contain nothing but links) were linking to the J.C. Penney website, and most of those links had very descriptive anchor text. It was almost as if someone had arranged all of those links in order to get better rankings in Google.

J.C. Penney was found to have hired an SEO company that outsourced the buying of links to TNX.net. As a result, J.C. Penney ranked globally for highly targeted one-word keywords such as "jumpers" and "shirts". The New York Times blew the whistle on this blackhat SEO attempt and published a documented story on it.
The New York Times contacted Matt Cutts, the head of Google's webspam team, who confirmed that the tactics violated the Google webmaster guidelines. Shortly after, the J.C. Penney website was nowhere to be found for the queries it had previously ranked number one for. Matt tweeted that "Google's algorithms had started to work; manual action also taken".

Matt Cutts' Response on J.C. Penney
Since Google manually reverted these rankings, J.C. Penney can now be found on the 4th and 5th pages of Google for its highly searched keywords.

Google's Information from Webmaster Central on Paid Links
Google and most other search engines use links to determine reputation. A site's ranking in Google search results is partly based on analysis of those sites that link to it. Link-based analysis is an extremely useful way of measuring a site's value, and has greatly improved the quality of web search. Both the quantity and, more importantly, the quality of links count towards this rating.
However, some SEOs and webmasters engage in the practice of buying and selling links that pass PageRank, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. Buying or selling links that pass PageRank is in violation of Google's Webmaster Guidelines and can negatively impact a site's ranking in search results.
Not all paid links violate our guidelines. Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results. Links purchased for advertising should be designated as such. This can be done in several ways, such as:
  • Adding a rel="nofollow" attribute to the <a> tag
  • Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
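As an illustration of those two options (the markup below is a generic sketch; the advertiser URL and the /out/ path are made up for the example, not taken from any real site), a paid advertising link can either carry a nofollow attribute so it passes no PageRank, or be routed through an intermediate page that search engines are blocked from crawling:

```html
<!-- Option 1: add rel="nofollow" directly to the paid link,
     so it does not pass PageRank -->
<a href="http://www.example-advertiser.com/" rel="nofollow">Visit our sponsor</a>

<!-- Option 2: point the link at an intermediate redirect page
     (e.g. /out/sponsor) instead of the advertiser directly -->
<a href="/out/sponsor">Visit our sponsor</a>
```

For option 2, the intermediate directory is then blocked in the site's robots.txt, so the redirect page is never crawled and no PageRank flows through it:

```
User-agent: *
Disallow: /out/
```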

The Traffic Results for J.C. Penney


Traffic has significantly decreased and online sales have dropped for J.C. Penney.

Information on Hiring an SEO Company
This is the direct result of outsourcing SEO work to an SEO company that relied on blackhat SEO techniques and bought links on restricted or dead websites to raise rankings. Companies wishing to hire a reputable SEO company need to read up on Google's ethical guidelines and make sure that the work they are paying for is whitehat SEO and will produce positive results.