April 14, 2011

Google News is about looks, too

Getting a site accepted as a Google News source is a very hard task indeed, and it’s not for everyone. If you are targeting Google News as part of your SEO, it’s important to realise it’s not just about content. If you want any hope of success, you have to have the right look.
It might sound strange that appearance can have any effect on your chances of being selected for Google News, but it’s true. Google has been cracking down on the non-news sites that have worked their way into its news sources, and it looks at more than content when doing so. This is why looking more like a media site helps.
Google News Search
Here are some changes you can make if you’re targeting Google News as part of your search engine optimisation:
Design
If your site looks like a company website, it’s not going to make the grade. Convincing Google of your newsworthiness starts with the right layout, and a magazine-style layout is recommended. Talk to us at SEO.co.uk for further advice.
Supplementary pages
Having the right supplementary pages also gives off the signals that Google is looking for. As well as a good About Us page, one which features some meaty content, it’s a good idea to have a featured authors page, listing the names of your writers. This can make your site more attractive to site users as well.
Make things easier with a niche
No business can afford to run a general news site as part of its search engine optimisation campaign. Pick a niche within your industry, and feature news from there. This speciality gives you enough focus to make content writing easier, and gives your coverage more of a news drive. Cornering a niche will also increase the likelihood of Google choosing your site for the hard-to-find information it offers.

Get Bitten by the News Bug!

The Pew Research Center’s Project for Excellence in Journalism has recently released a survey which shows that the number of people who get their news online has surpassed the number who get their information via newspapers. Forty-six per cent of Americans surveyed get their news online around three times a week, compared to forty per cent who get their information through print media.
Getting news and information through televised broadcasts is still the leading platform, with fifty per cent of Americans getting their news from TV bulletins. What’s also incredibly interesting is that nearly half of Americans surveyed – a fascinating forty-seven per cent – get their news from mobiles, especially when it relates to local matters.
News blogs can be a great source of traffic
The report also shows a steady decline in the number of people getting their information from traditional media. Every media sector apart from online is losing its audience in some shape or form, and we feel this data shows just how important posting regular news items on a company blog really is.
With more people searching online than ever before for products, news, opinion and more, having an up-to-date news blog on your site can help to attract organic traffic as well as keeping your core visitors coming back to your site for more.
But your news blog should always be original and objective, and should say something to your readers that no other site does. Your news should be sourced in-house and be relevant to your industry when it comes to search engine optimisation. The better the quality of your news content, and the more care taken when writing it, the more fans and followers you should win over for years to come, provided you keep it up!

April 13, 2011

Snaptu it, Facebook…

Facebook is, frankly, enormous. Not just in a social networking sense, but in a brand sense, a commercial sense and a recognition sense. Over 30 million people in the United Kingdom currently have a Facebook account, and that number is steadily growing.
While quite a lot of people access Facebook via their smartphones, there is still a huge number who log on through conventional handsets. Though the smartphone trend is ploughing ahead, a whopping 80 per cent of the global phone market is still made up of ‘non-smartphones’.
Facebook has recently bought Snaptu
In what many see as a clever move, Facebook has quickly moved to acquire London-based company Snaptu for a price estimated at between $40-70 million. The American-Israeli technology firm’s expertise lies in making social networks and platforms accessible through ‘feature phones’. One of Facebook’s key goals for 2011 is to expand in the mobile market – the team at Snaptu seem happy with the acquisition, blogging: “We soon decided that working as part of the Facebook team offered the best opportunity to keep accelerating the pace of our product development.”
In the past, Facebook’s head of mobile business, Henri Moissinac, has said: “We want to have every user in every market using Facebook – we’re investing in smartphones and at the same time in mass market phones because we believe with great features and great integrations, every phone can become sociable.”
Mobile search is becoming incredibly important in the world of website optimisation as companies look to harness searchers on the move. With 80 per cent of the global phone market identified as ‘non-smartphone’, Facebook is already laying down the foundations to corner as much of it as possible. Link building is very important when promoting your website, but keep in mind that a strong mobile presence will also be crucial in the coming years as the medium grows.

 

Facebook Getting Tough on Profiles

Facebook and other social media websites are a vital part of search engine optimisation campaigns all over the globe. Making proper use of Facebook and similar platforms is a great way to increase your brand’s exposure across the web – but sadly, like a lot of things, there are people who try to take advantage of social media for quick rankings and to gain an advantage over the competition as quickly as possible.
Facebook has recently clamped down on profiles where they believe the user is underage, and has claimed that they’re currently deleting approximately 20,000 underage users every single day. Users looking to join Facebook have to be 13 or older, but because of the popularity of the medium, a lot of people under 13 are obviously trying their luck and joining the largest social website of our time!
Facebook deletes up to 20,000 profiles every day
But there are other underlying issues here. If a 12-year-old can create a profile, then anyone can create a profile, whether the reasons are for good or bad. There’s a difference between a kid joining Facebook to share photos with their friends and a company creating a lot of fake profiles to post a lot of fake, positive comments on the wall of their company page.
“There are people who lie. There are people who are under 13 [accessing Facebook]. Facebook removes 20,000 people a day, people who are underage,” says Mozelle Thompson, chief privacy advisor at Facebook. comScore puts that number much higher in the US, where its research suggests that as many as 3.6 million underage people regularly access Facebook.
It’s great to see that Facebook has recognised there’s something of a ‘profile problem’, and that they’re working hard to fix it where they can. How can they solve the problem though in the long-term? Leave us your thoughts below!

 

Twitter – Keep it Private.

We’re massive fans of social media, and recommend it highly for anybody who’s looking to incorporate an SEO campaign alongside their web presence. But for all the benefits the likes of Twitter and Facebook can bring to an organic SEO campaign, people still have a hard time remembering that it’s a very public platform. Just ask Calvin Harris and Katy Perry…
It’s worth remembering that no matter how bad things get, either in a personal or professional sense, it’d be a wise idea not to air your dirty laundry in public. What’s the point?
Perry and Harris have been generating headlines over the last day or so because Harris has pulled out of Perry’s tour around the UK. Harris cited behind-the-scenes issues, but instead of both parties sorting things out amicably, a slanging match erupted between Harris and Perry on Twitter, with thousands watching.
Katy Perry and Calvin Harris have been arguing over Twitter
Of course, this is a very high-profile incident between two chart stars, but there are lessons for everyone to learn here. Many people still, understandably, mistake Twitter for an innocent form of fun. Twitter, though, is regarded as a valid form of publishing and can carry all the associated legal consequences – just ask ex-Liverpool FC footballer Ryan Babel, who was hit with a hefty fine for his tweets in January.
The tweets between Harris and Perry focused largely on what many assume were the terms of the deal between the two, with talk of ‘goalposts being moved’. Pretty dangerous when each tweet will potentially be read by millions of people.
Whatever problems you encounter in your business, deal with them professionally. Twitter can very easily – as the example above shows – turn things into a public slanging match, adding fuel to the fire. Keep things in-house. You owe it not only to yourself, but to the poor souls who follow you and have to read your tweets!

 

Five things you must do when using social networking websites for SEO

It is impossible not to notice the increasing popularity of social networking websites such as Facebook and Twitter. Millions of internet users now use websites of this kind on a regular basis and it is not just those using them for personal and social reasons that are logging in. Online business owners are developing and updating profiles on social networking sites and for some, this is making a big difference to their search engine optimisation campaigns.
If used appropriately, a social networking website can help a business to gain exposure, build trust and respect for its brand and direct traffic to its web pages. These are impressive results and are why so many site owners are becoming interested in what social networking websites can do for them and how these results can be achieved.
If you want to use social networking websites successfully as part of your SEO campaign, there are five things you must do:
1. Be polite. You are representing your company when you interact with others within social networking environments and must remember this at all times. You must always be polite and respectful if you want to show others the professionalism of your business.
2. Provide relevant information. While not all the content you create has to be about your business, a lot of it should be. This is what many of your followers will expect, and it will be one of the main reasons they are following you.
3. Be fun. If you are going to keep the attention and interest of your audience, you not only have to provide useful information but you have to make things fun too. There are many ways this can be achieved, such as running competitions, holding quizzes and giving away free prizes. You do not always have to be serious when using social networking sites for SEO.
4. Be ready to respond to others. When using social networking websites, it is not just about giving information to others but also about reading their responses, communicating with them and learning from the information they offer you. If another user makes a comment, try and respond. Always acknowledge those that show an interest in your profile and try to help bonds between you and them to develop.
5. Be sociable. If you are not ready to be sociable on behalf of your business then there is little point in using sites of this kind. You have to check on your social networking page often, add information, respond to others, comment on the profiles of others and really get involved in the social networking community any way you can. This is how you will gain the most exposure.
We at the-websol.blogspot.com are helping many of our clients to establish themselves within the social networking communities and are seeing impressive results. If you require some help with sites like this or with any other SEO techniques, we are here to offer our assistance.

 

AdWords Finally Gets Phone Support

It’s something that those involved in search engine marketing have been crying out for for years! But the day has finally come – Google has announced that it now provides phone support for its AdWords program, meaning that people will hopefully no longer be stuck in the wilderness when they need questions answered about their PPC campaigns.
It’s taken Google nearly ten years to implement this most basic of measures, but better late than never! We’re still waiting to test it ourselves to see how effective it is, but anything has to be an improvement on the methods used beforehand. Previously, getting an answer out of Google in regard to an account was a hugely drawn-out process: if the solution to your problem wasn’t in the official Google FAQ, then email was pretty much your only option, with replies taking days at a time to resolve matters.
Google has added phone support for AdWords
Which is a shame, because pay per click is an incredibly effective form of search engine marketing, and Google’s main source of revenue. You’d think, in that sense, that it would do everything it could to make the service as accessible as possible. There’s no use complaining now though: a problem’s been identified and fixed, and we’re looking forward to seeing how much easier accounts are to manage in a practical sense.
As far as search marketing services go, PPC is a great tactic to employ for quicker results and an overall long-term improvement in your ROI – as long as you have the budget, that is. PPC can be an expensive business, but a good campaign done the right way can get you fantastic results!

April 11, 2011

Google Has Stopped Street View Photography In Germany

In most locations, Google sends its Street View cars out on a repeated basis “to make sure the information is accurate and kept up to date,” as the Street View website explains.
But that’s not happening in Germany.
Despite the recent German court ruling that declared photography from streets legal in Germany, Google has stopped Street View photography there and says it has “no plans to launch new imagery on Street View in Germany.” A Google spokesperson says the company’s priorities have changed:
Our business priority is to use our Google cars to collect data such as street names and road signs to improve our basic maps for our users in a similar way that other mapping companies do.
Google will continue to show its existing Street View photos for the 20 German cities that are online now, but there won’t be any updates to those photos. It’s unclear if this decision is final, or if the company might change its plans in the future.
Aside from the mention of new priorities, the company isn’t saying why it has stopped Street View photography in Germany. It’s easy to assume that the service’s difficult birth factored into the decision. German officials raised objections almost as soon as Google announced plans to launch Street View there. After lengthy negotiations, Google eventually agreed to let German residents opt out of having their buildings appear online, and nearly 250,000 German households and businesses took Google up on that offer. I’m not a programmer, but I can’t help wondering if the presence of so many blurred buildings — and the potential challenge of updating Street View while maintaining their privacy — is a factor in Google’s decision.

Woman Follows Google Maps “Walking” Directions, Gets Hit, Sues

Is Google responsible for giving out bad directions through its Google Maps service? We’re about to find out. After Googling walking directions for a trip in Park City, Utah, Lauren Rosenberg claims she was led onto a busy highway, where she was struck by a vehicle. She’s now suing Google for damages.
The case, Rosenberg v. Harwood, was filed in Utah, in the US District Court’s Central Division (Gary Price of ResourceShelf tipped us to it today). Harwood is Patrick Harwood, the person who actually hit Rosenberg, according to the suit. Both Harwood and Google are being sued in the same case, for damages “in excess of $100,000.”
Rosenberg used Google Maps on January 19, 2009, via her Blackberry, to get directions between 96 Daly Street, Park City, Utah and 1710 Prospector Avenue, Park City, Utah. Google provided these, telling her as part of the route to walk for about 1/2 mile along the calm-sounding “Deer Valley Drive.”
That’s an alternative name for that section of Utah State Route 224, a highway that lacks sidewalks, the case says. Rosenberg wasn’t warned about this, putting Google directly at fault in the accident, the case claims:
Defendant Google, through its “Google Maps” service provided Plaintiff Lauren Rosenberg with walking directions that led her out onto Deer Valley Drive, a.k.a. State Route 224, a rural highway with no sidewalks, and a roadway that exhibits motor vehicles traveling at high speeds, that is not reasonably safe for pedestrians. The Defendant Google expects users of the walking map site to rely on the accuracy of the walking directions given….
As a direct and proximate cause of Defendant Google’s careless, reckless, and negligent providing of unsafe directions, Plaintiff Lauren Rosenberg was led onto a dangerous highway, and was thereby stricken by a motor vehicle…
Here’s the route:

In the screenshot above, you can see that Google quite clearly warns:
Walking directions are in beta. Use caution – This route may be missing sidewalks or pedestrian paths.
That would seem to negate part of the suit’s claim – except that Rosenberg used a Blackberry. The Blackberry version of Google Maps might not have carried this warning. I don’t have a Blackberry so can’t check for myself, but I’m looking into this. I know that on the iPhone version, there is no warning.
Certainly it seems embarrassing for Google to be routing people onto busy highways when they explicitly use the “walking” directions option. But then again, Google’s not alone. Bing does the same thing in its directions, which also contain a warning (at least in the web version):

Part of the issue seems to be that there’s no easy-to-find pedestrian path between these two points in Park City. Looking at the satellite view on Google Maps, there appears to be an alternative dirt path that runs roughly along the same direction. But I can’t tell if this was open to public use or not. Since it’s not along a major road, it’s something that Google Maps probably didn’t pick up.
Instead, Google’s making its best guess. The results range from laughable to annoying when it gets things wrong. Some examples:
  • Google used to advise swimming across the Atlantic Ocean to get from the US to Britain, as a joke
  • Google, before walking directions were added in July 2008, took a lot of ribbing for turning a 30 second walk in Sydney into an 18 minute car trip
  • When Google Maps rolled out its new bike directions feature in March, I found in my area, it made some wildly bad guesses
But are Google’s bad guesses also dangerous? I suspect a court is going to find that despite getting bad directions from Google (or a gas station attendant, a local person or any source), people are also expected to use common sense. So when you come to an intersection like this, as Rosenberg would have come to before crossing onto the highway:

You might be expected to consider for yourself whether it is safe to continue. Or when you’re walking down the road itself, and it looks like this:

It becomes self-evident there’s no sidewalk and probably not a good place for pedestrians to walk, regardless of whether you got a warning from Google or not.
Here’s to Google improving its directions and perhaps using more common sense of its own: understanding whether a street is a busy highway, and maybe simply not offering routes when in doubt, rather than guessing.
And here’s also to common sense about anyone following any directions they’re given.

Reports: Google CEO Page Ties Bonuses To Social Success, Reorganizes Google Mgmt. Team


2011 bonuses for Google employees are reportedly being tied to how well the company fares in its efforts to integrate social elements across Google products. That’s according to two reports from Silicon Alley Insider — reports that say the news first spread in a company-wide memo last Friday from co-founder (and now new CEO) Larry Page.
There are also reports tonight that Page has promoted a half-dozen Google execs to new senior vice president roles; more on that below.
SAI has published a screenshot that it says is an internal Google FAQ describing a “2011 Multiplier” affecting employee bonuses. The screenshot says, in part:
For Googlers on the Company Plan, the multiplier has both an upside and a downside. It can range from 0.75 to 1.25 depending on how well we perform against our strategy to integrate relationships, sharing and identity across our products. If we’re successful, your bonus could be up to 25% bigger. If not, your bonus could be as much as 25% less than target. We all have a stake in the success of this effort and this multiplier is designed to reflect that.
(emphasis is mine)
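Taken at face value, the multiplier described in the FAQ is straightforward arithmetic: a target bonus scaled by a factor between 0.75 and 1.25. A quick sketch of that calculation (the function name and the flat target-times-multiplier model are my assumptions for illustration, not anything Google has published):

```python
def bonus_with_multiplier(target_bonus, multiplier):
    """Scale a target bonus by the reported 2011 'social' multiplier.

    The 0.75-1.25 range comes from the leaked FAQ; the simple
    target * multiplier model is an assumption for illustration.
    """
    if not 0.75 <= multiplier <= 1.25:
        raise ValueError("multiplier is said to range from 0.75 to 1.25")
    return target_bonus * multiplier

# Full success: up to 25% bigger than target
print(bonus_with_multiplier(10_000, 1.25))  # 12500.0
# Full miss: as much as 25% less than target
print(bonus_with_multiplier(10_000, 0.75))  # 7500.0
```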
Google’s missteps in the social space are well documented. Google Wave has already been shuttered, and Google Buzz has caused the company a number of headaches … not to mention lawsuits. Google bought services like Jaiku and Dodgeball, but later closed both.
Last September, Google began talking about a new plan: a social layer across its products. That plan has started to be realized in recent weeks with the expansion of social signals on search results pages, and last week’s launch of Google +1.
Page hints at the importance of Google Profiles when he refers to “identity” in the quote above. Most, if not all, of Google’s social efforts are dependent on getting searchers to create Google Profiles – and the company has been improving, promoting and emphasizing those for a few years now.
But for more and more Internet users, profiles belong to Facebook. Facebook gets about 25% of all US page views and was the most visited US web site in 2010. And it’s not just a US thing. Just look at comScore’s 2010 Europe Digital Year in Review, which reported that Europeans spend more time on Facebook than Google and showed Facebook’s dominant position among social networking sites in more than a dozen countries.
So the big question might be this: Do users really want Google to be more social? Or has that ship already sailed? Google employees may get an answer in their Q4 bonus checks.
There’s more discussion of this on Techmeme.
Meanwhile, the LA Times is reporting that Page also completed a major overhaul of Google’s management structure today (Thursday). The Times reports the following promotions to new Senior VP status:
  • Andy Rubin to senior vice president of mobile
  • Vic Gundotra to senior vice president of social
  • Sundar Pichai to senior vice president of Chrome
  • Salar Kamangar to senior vice president of YouTube and video
  • Alan Eustace to senior vice president of search
  • Susan Wojcicki to senior vice president of ads

Gov’t To Okay Google-ITA Deal After Google Agrees To Burdensome Conditions


The Justice Department (DOJ) has signed off on the Google acquisition of travel software company ITA, but says Google must meet certain conditions. The Justice Department said that as originally proposed the acquisition would have “substantially lessened competition among providers of comparative flight search websites in the United States.”
The DOJ is requiring Google to do a number of things if it wishes to proceed with the ITA acquisition:
Google will be required to continue to license ITA’s QPX software to airfare websites on commercially reasonable terms.  QPX conducts searches for air travel fares, schedules and availability.  Google will also be required to continue to fund research and development of that product at least at similar levels to what ITA has invested in recent years.  Google will also be required to further develop and offer ITA’s next generation InstaSearch product to travel websites, which will provide near instantaneous results to certain types of flexible airfare search queries.  InstaSearch is currently not commercially available, but is in development by ITA.
In other words, Google must effectively operate ITA as though the acquisition had not been made: continue software development and continue to license the platform to competitors. The proposed settlement also restricts how Google might use information it obtains from its operation of ITA:
To prevent abuse of commercially sensitive information, Google will be required to implement firewall restrictions within the company that prevent unauthorized use of competitively sensitive information and data gathered from ITA’s customers.  The proposed settlement delineates when and for what purpose that data may be used by Google.  Google is also prohibited from entering into agreements with airlines that would inappropriately restrict the airlines’ right to share seat and booking class information with Google’s competitors.
Beyond this, there will also be an enforcement mechanism to ensure that Google complies with the above and doesn’t do anything “unfair” with ITA:
The department said that Google will also be required to provide mandatory arbitration under certain circumstances and provide for a formal reporting mechanism for complainants if Google acts in an unfair manner.
It’s now Google’s move. The company could simply walk away from the acquisition, accept the DOJ’s settlement terms, or fight the litigation.

April 03, 2011

Meta Description and Content Management Systems – SEO and Social Media Factors


Most of us know that while the search engines no longer consider the meta description in their ranking factors, this element of your page is still important in getting traffic to your site.

How Meta Descriptions are used on Search Engines and Social Media Websites

1. While not a ranking factor, the meta description is still important in organic searches as the search engines use this in the ‘snippet’ that appears below the title in search results.  The more relevant and enticing this snippet is, the more likely you are to get users to click on the listing.
Google Search Results showing Meta Description used as snippet for HighRankings.com
2. Many social media sites include the meta description when you share a link. As with organic search listings, a relevant and enticing description here is more likely to get clicked on and shared on further. (See this post on using titles for social media success for more information on how links are posted on social media sites.)
HighRankings.com page posted on Facebook showing Meta Description.
So, to have relevant, enticing descriptions across platforms for every page on your website, it is best to insert a unique meta description for each individual page – but sometimes this is not feasible.
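For a hand-rolled site, that per-page uniqueness can be as simple as keeping a mapping from URL path to a handwritten description and rendering it into each page’s &lt;head&gt;. A minimal sketch in Python (the paths, descriptions and function name are all hypothetical):

```python
import html

# Hypothetical handwritten descriptions, one per page
DESCRIPTIONS = {
    "/": "SEO tips, news and tutorials from our in-house team.",
    "/about": "Who we are, what we do, and the writers behind the blog.",
}

def meta_description_tag(path):
    """Return the meta description tag for a page, or '' if none was written."""
    description = DESCRIPTIONS.get(path)
    if description is None:
        # No default: let the search engines build their own snippet
        return ""
    return '<meta name="description" content="%s">' % html.escape(description, quote=True)

print(meta_description_tag("/about"))
```

The empty-string fallback mirrors the no-default behaviour discussed below: a page with no handwritten description simply gets no tag at all.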

How Content Management Systems deal with Meta Descriptions

These days, most Content Management Systems provide a meta description field for each page (or you can use a plugin/add-in that creates this feature). After many years of coding by hand to keep complete control, I’ve opted to use WordPress with the Genesis Theme Framework [http://www.studiopress.com/], which allows you to create unique Titles, Descriptions, etc. for each page – as well as for categories and tags.
Genesis Premium WP Theme Framework SEO Options and Settings
I train my clients in how best to use these fields, but very often I find that they leave them empty. And if you don’t enter anything in the Meta Description field, there simply is none: this theme framework provides no default or automated description.

What if a page has no Meta Description?

While the system uses the post/page title as the &lt;title&gt; if you don’t enter a Custom Document Title, the team at StudioPress, developers of the Genesis WordPress Theme Framework, decided not to generate a default description. That decision was based on input from SEOs like Joost de Valk, who feels that if you are not going to handcraft a good description for a page yourself, you are better off letting the search engines generate the snippet for you: “…most plugins pick the first sentence, which might be an introductory sentence which has hardly anything to do with the subject, or another sentence with a keyword in it, which might be completely wrong to pick as description.”
SEOmoz states that in some cases you should deliberately choose to leave out the meta description, “…if the page is targeting longer tail traffic (3+ keywords), for example with hundreds of articles or blog entries or even a huge product catalog, it can sometimes be wiser to let the engines themselves extract the relevant text.”
In the image at the top of the page showing Google search results for “meta description”, the listing for Wikipedia illustrates the benefit of letting Google create the snippet. The page covers the Meta Element in general, and the snippet shows the specific text about Meta Description. A handcrafted description would probably be too general, and an automated meta description might take just the first paragraph of the page: “Meta elements are the HTML or XHTML &lt;meta … &gt; element used to provide structured metadata about a Web page. Multiple elements are often used on the same page: the element is the same, but its attributes are different. Meta elements can be used to specify page description, keywords and any other metadata not provided through the other head elements and attributes.”

When the Search Engine Generated Snippet Doesn’t Work Well

But the search engines don’t always do a good job with the snippets. Take the Google Webmaster Blog for example:
Google search result for “importance of meta description”
Google doesn’t even take its own advice in the post, probably because it has so much confidence in its ability to create a snippet. The post has no meta description, so the search result shows a snippet taken from content on the page related to the search.
In this case that content is a comment posted by a reader, not text from the main post, where the information on the topic should be. A handcrafted description would probably represent the page better, but even if the first paragraph of the post had simply been used as a default, it would have made a good meta description:
The first paragraph of the Google Webmaster Post
With no meta description, the description of the page when shared on other sites is less than helpful. When the link is shared on Facebook, it again pulls in a reader comment, which tells you little about the post.
The result when sharing the post on Facebook, showing a poor description as there was no Meta Description.
I do agree that automatically generated meta descriptions are not necessary as far as organic search results are concerned, but a description of some kind is important when links are shared on social media and some directory sites via a sharing tool or another automated method. Obviously a handwritten, unique meta description is the ideal, and for best results we should aim to create these. But being realistic, and honest – I have to admit I’ve caught myself leaving out the description more times than I am happy with – there will be many times when the meta description is not filled in.
I also find it annoying when I go to share a link to someone else’s site on one of my social media accounts and the description is not helpful, so I either delete it or rewrite it.  If you want people to share your links on a regular basis, make it as easy as possible, and make the information that accompanies your link as relevant as possible.
For me an automatically generated meta description would work because I just think it’s good SEO copywriting, and copywriting in general, to include an opening paragraph which describes what the post is about and includes your keyphrases.  I coach my clients to do the same, but many people do not write this way.

Balancing out descriptions for Organic Search Results and Social Media Sharing

So should a Content Management System include a default meta description if you do not add one, perhaps taken from the first 150 characters of a page? Traffic from organic search results is still definitely the more important source of traffic for nearly all websites, but social media marketing has become an important part of many online marketing campaigns, and the ability to share links from your website is an important part of this.
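As a rough sketch of how a CMS might generate such a default: take the first 150 characters of the page, trimmed to a word boundary. (The function name and word-boundary handling are my own assumptions; the 150-character figure comes from the suggestion above.)

```python
import re

def default_meta_description(page_text, limit=150):
    """Fallback meta description: first `limit` characters of the page,
    trimmed to the last complete word so it doesn't cut mid-word."""
    # Collapse whitespace so headings and paragraph breaks don't leak in
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) <= limit:
        return text
    cut = text[:limit]
    # Back off to the last full word, if there is one
    return cut[: cut.rfind(" ")].rstrip() if " " in cut else cut

print(default_meta_description(
    "Meta elements are the HTML or XHTML meta element used to provide "
    "structured metadata about a Web page. Multiple elements are often "
    "used on the same page, with differing attributes."))
```

A CMS theme or plugin could call something like this whenever the description field is left empty, which at least guarantees a sensible snippet for sharing tools.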

Are there other options for creating descriptions?

Open Graph protocol [http://ogp.me/] is another way to indicate the information to be posted on social media sites. Perhaps its use will become commonplace amongst developers of Content Management Systems, themes and plugins/add-ons, as these can be set to automate the description for social media sharing separately from the meta description.  Or the developers may come up with new ways to handle this issue.
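The protocol itself is just a handful of meta tags in the page head; a minimal sketch (the URL and text values here are placeholders):

```html
<head prefix="og: http://ogp.me/ns#">
  <!-- Open Graph tags: social sites read og:description instead of the meta description -->
  <meta property="og:title"       content="The Importance of the Meta Description">
  <meta property="og:type"        content="article">
  <meta property="og:url"         content="http://www.example.com/meta-description-post/">
  <meta property="og:description" content="Why a good description matters when links are shared on social media.">
</head>
```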
Until then I’ll have a big post-it on my computer, “Don’t forget the Meta Description!”, and if you’ve found another way to address this issue with Meta Descriptions, I’d love to hear all about it.

What’s Going On??? And What Do You Do About It?



So unless you’ve been living under a rock, you’ll have noticed five trending topics dominating the online conversations recently:
  1. Charlie Sheen – people are analyzing his actions and talking about how brilliant he is at creating a conversation and how people should learn to get the exposure he has gotten.  My personal opinion is that he had no master plan that is backed by brilliance.  The man has some issues and the fallout just happens to be captivating America.  I am going to pass on further commentary on this topic.  Enough people are covering it and I’m just not buying that we should spend a minute more analyzing him and his “results” on the web.  If ya disagree, comment below and let me know why…otherwise let’s move on!
  2. The earthquake in Japan and the related Tsunami warnings – My heart goes out to the people impacted by this shocking and horrible tragedy.  I plead with everyone that is able, to rally in the coming weeks to support the people in need and donate whatever you can.  There is a lot of talk about how social media played a role in spreading information and it really is interesting to hear the different ways that social media has changed how things play out when there is a tragedy.  I am also not going to spend time talking about this one.  There are lots of articles out there to read if you are interested.  I just wanted to talk a minute here to acknowledge those suffering and pledge to do my part in helping them.
  3. The Facebook changes – OK this one I will cover briefly below, so keep reading.  I have a couple resources to mention.
  4. The Google Farmer/Panda update.  One would think this topic doesn’t need yet another post breaking it all down, but based on questions I am getting and bizarre advice I see floating around out there – I think it DOES actually need some more coverage.  So keep reading for that.
  5. Large companies being busted for poor SEO practices.  This one I am just going to cover right now with one statement: black hat, nefarious SEO practices don’t work long term; you will get caught.  Doing it right the first time, while it may take longer to get results initially, is the best way to go for 2 reasons – you’ll be able to sustain long-term success, and it’s usually better for your site visitors (when you are providing real content of value and obtaining legitimate links that point to content worthy of a link).
OK, so now let’s break down #3 and #4 a little further…

Facebook:

The new Fan/Business Pages are now live as of Friday March 11, 2011
So anyone that didn’t upgrade their page prior to Friday had it happen automatically.
Anyone creating a new Fan Page with a custom tab going forward needs to use iFrames and not the Static FBML app now.  (Don’t worry, if you have a current page with the Static FBML tab they are still supporting it and it will display just fine – you just can’t add new ones anymore.)
The really cool thing about the iFrames is that you can have your designer create a tab that looks and feels like your current website or Blog theme and then they just place it on your page (don’t forget it needs to be 520 pixels wide)
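At its simplest, the custom tab is just a page you host, loaded into Facebook’s canvas via an iframe kept within that 520-pixel limit; a minimal sketch (the URL is a placeholder):

```html
<!-- Hosted page that Facebook loads inside the tab; must fit the 520px canvas -->
<iframe src="http://www.example.com/facebook-tab.html"
        width="520" height="800"
        frameborder="0" scrolling="no"></iframe>
```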
Need some resources/apps for the new iFrames?  Here are a couple good ones:
(Note: the iFrames do require some coding knowledge)
Static HTML: iframe tabs
Easy to use and getting good feedback from users so far.  One click installation and allows for showing content to fans and non-fans.  You have to use an outside editor (either via code or WYSIWYG editor) and then paste it into this tool.  The best news:  it’s free.
iFrames for Pages by Wildfire – Another easy tool.  Also easy to display fan and non-fan content. This one also requires you to create your own code and then paste it in.  Also free.
There are lots of other tools out there too, but these two seemed worth mentioning.
Fan Pages are becoming more important than ever.  We are seeing acquisitions of companies that offer various Fan Page technologies and apps.  It has finally become apparent to some big players in the industry that these pages are here to stay and that they are a strong marketing tool for site owners to use in their arsenal of marketing strategies. (Why does nobody listen to me? I’ve been posting about the importance of these pages for the past year! LOL)
While it’s true you definitely need some coding knowledge to create a killer tab for your Fan Page, it’s not really that different from when we were using the Static FBML app – you needed coding knowledge for that as well.
Stay tuned as things evolve and more cool apps start popping up!

Google

Now time to jump into Google’s latest change – which has caused more hysteria than we usually see.
For those that don’t know – Google made a change that impacted about 12% of search results.
The goal of the recent update?  To remove poor quality sites from the top of Google’s results pages.  (A great goal in my opinion)
Here is what Google’s Matt Cutts had to say:
“This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
The reason behind the change is obvious – if searchers aren’t finding quality sites they are likely to leave and start searching on another engine (like perhaps Bing – which is now officially the second place engine according to recent stats!).  Google became Google by delivering high quality results that people could count on.
Most people should be OK with this change – unless your site was impacted by it.  If you were impacted and it was well deserved (read as: your site sucks and you were lucky you got away with the rankings for as long as you did), there is something you can do: improve your site and your content.
If you were impacted and don’t feel you should have been, let Google know: http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en
Let’s break down the problem areas and include some suggestions to get them fixed.
Through information that Matt Cutts and Amit Singhal have shared, we can surmise the following:
Sites that have a high percentage of duplicate content are going to be considered low quality.
Let’s look at this scenario: If you have a page here and there that has duplicate content but the rest of the site has unique content, those pages with the dupe content may not fare well but your site as a whole should still do OK.
But when going page by page, if the engines are hard pressed to find quality, unique content then it will end up impacting your site as a whole.  Google’s official take on this: “low quality content on part of a site can impact a site’s ranking as a whole.”  So be careful!
Some people repost content, add a line or two with their own opinion, and feel that is enough unique content to satisfy the engines.  That is going to be problematic going forward!  If lots of pages have only a small percentage of unique content, those pages won’t fare well either.
Google also seems to be looking for pages that have a high volume of ads that aren’t relevant to the content of the page (especially if the ads are higher up on the page).  These pages will not fare well in today’s Google.
If your site happened to rank well for a phrase but the phrase is not really present in the Title tag or the body content, the page isn’t going to continue to do well going forward.
Keywords being repeated over and over and over will trigger the new algorithm to devalue the page.  Keywords should be used appropriately.  (One test many copywriters share is reading your content out loud; your brain processes text differently when it hears it spoken, and you’ll know if there are too many keywords stuffed in there.)
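Besides reading the copy aloud, a quick sanity check is to measure how often a phrase appears relative to the total word count. A rough sketch (the 3% threshold here is my own illustrative assumption, not a figure published by Google):

```python
def keyword_density(text, phrase):
    """Return how often `phrase` occurs per 100 words of `text`."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count every position where the phrase's words appear in sequence
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits / len(words) if words else 0.0

copy = ("pet turtle care guide: how to care for your pet turtle "
        "and keep your pet turtle healthy")
density = keyword_density(copy, "pet turtle")
if density > 3.0:  # illustrative threshold only
    print("Possible keyword stuffing: %.1f%%" % density)
```

A crude metric like this won’t replace editorial judgment, but it flags pages where a phrase is clearly repeated far more often than natural writing would produce.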
Now here is something lots of people are going to struggle with – high bounce rates and low time spent on the site can impact rankings.  This is great, in my opinion.  It forces people to really look at the quality of their site and improve things so they entice visitors to not bounce and to stay longer.  It means strong marketing principles and good copy are going to be more important than ever.  The sad truth is this should have always been important to site owners – but most can’t see beyond their quest for the holy grail (top rankings) and they don’t pay attention to what people do once they get to the site.
CTR (clickthrough rate) from the SERPs (search engine results pages) will also impact rankings.  If your site comes up a lot but no one is clicking, you can be sure it’ll impact things.  So what can you do to improve clickthrough rates in the SERPs?
Here are a few ideas:
Pay attention to length, and be sure your descriptions in the SERPs aren’t being truncated before you reveal the most important info. In other words, look at where Google pulls the data it displays in the SERPs (usually your Title tag is used to create the main headline for the listing) and make sure that text is compelling and explains exactly what the page has to offer; it’s also helpful if it includes a keyword phrase.
Ex: A Must Have Comprehensive Guide: Learn How To Care For Your Pet Turtle – not so good – the keyword phrase is pushed to the back of the phrase and will likely get truncated.
Ex: How To Care For Your Pet Turtle – Must Have, Comprehensive Guide – much better – the keyword phrase is at the start, but you still get the descriptive, compelling content in there.
Many companies insist their branding info be put in the Title tag.  If it has to be there, put it at the end.
Think of your Title tag like a little ad – be descriptive of what the page is about but be compelling so people want to click through.  Promise a benefit (as long as your content delivers on what you promise)
OK, back to what Google is now looking for after the most recent update:
They don’t want to see boilerplate content (content that is the same and repeated on every page)
Sites that have a lot of low quality inbound links aren’t going to do well.
Sites that aren’t getting some social exposure (mentions and links from social media sites) won’t do well.  (Yep we keep saying it, and it’s true – you need to be on Twitter, Facebook and LinkedIn)
So take all these factors and add them up and see how your site is going to fare overall.
No single one of these factors is likely to trigger Panda on its own; it’s the combination that matters.
In a recent interview Matt Cutts was quoted as saying “Whenever we look at the most blocked sites, it did match our intuition and experience”.   Hmmm, notice the usage of the word “blocked” – he seems to be saying if you don’t score well on the areas outlined above, your site will be blocked.
So what do you do if you’ve been impacted by this update:
In Google’s words…
“If you believe you’ve been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content.”
Review your web stats and see what pages have taken a hit lately – you can work on improving those pages first (but a site wide review is never a bad idea).  See if you can identify the difference between the pages that are still faring well and those that have been slammed.
If you really want to fix things, take an in-depth analytical approach and note which of the factors noted above are present on each page and strive to eliminate all the problem areas.
Focus on improving your content and the user experience and you should be just fine.
Build your exposure online via social media networks.

Use the AdWords Dashboard



This post is focused on outlining some of the free tools and features that Google AdWords offers but that people often don’t make full use of, because they forget about them or don’t know they are available. In the past many AdWords campaigns were monitored from within bid management platforms or from within Google Analytics, but the new AdWords home tab is seeking to change all that.
The past problem with running your campaigns within the AdWords interface was the limited ability to get a quick snapshot of account performance and quickly identify problems to resolve. The old platform was next to useless, and most people quickly started using the Campaigns tab as the default page for their reporting and analysis – even though using Google Analytics for discovering actionable insights has its own issues due to a 24-48 hour delay in importing data.
Old Dashboard

Benefits of PPC Dashboards?

The biggest benefit of the new AdWords home tab is that your dashboard can be customised so you can focus on the campaign metrics that matter to each member of the team. It can be set up so your account manager, marketing manager and CEO each have a unique dashboard where they can quickly view and monitor the top-level details. While the new dashboard feature has been around for about two months in most accounts, it’s likely that most people have failed to notice the update, as their default tab is Campaigns or they never bother to log in to their AdWords account.
One of the better parts of the new dashboard feature is that it can re-use the existing filters you have created and been using to refine the data shown in your campaign dashboard. The other benefit of the new interface is that you no longer have to deal with the limitation of the old keyword performance module, which was not available for accounts with more than 10,000 keywords and was therefore fairly useless for most large accounts.
New Dashboard

So How do I create new modules?

You can use the AdWords interface to create and save filters that can then be added as modules on your home tab, giving you a lot more control over what information is contained on your dashboard. Some of the important filters I think most accounts should have are:
  • Keywords over your Average CPC Rates
  • Keywords over your Average CPA Rates
  • Keywords with a higher than average conversion rate
  • Keywords with a lower than average conversion rate
  • Keywords that have a lower than optimal ad position
  • Keywords that have a lower than optimal CTR
  • Keywords that changed status to limited, issues or not eligible to run
These are just a few filters I have created that should help you attain a better ROI while ensuring that your AdWords campaigns remain conversion focused. The dashboard modules work best when they are tailored to suit your campaign metrics, and it is typically better to use multiple rules in your filters to get only the necessary information. You can quickly see just the keywords driving the bulk of your traffic, those giving you the most exposure via impressions, or those making the cash register ring via conversions.
Customize Modules

What about information overload?

Probably one of the best features of the AdWords dashboard is that, if you label your filters well, you can minimise the modules; as the screenshot below shows, the count of items that qualify for each filter is shown in brackets.
Mini notifications

What about mobile dashboards?

A growing number of businesses need someone checking the campaigns while on the road or out of the office, and credible mobile solutions are becoming increasingly important – for small companies and agencies using AdWords, and even at C-level with an always-connected global workforce. The new dashboard does load on mobile devices such as iPhone, Android and even Windows Phone 7, but beyond quickly checking stats it doesn’t offer the full functionality people expect from mobile apps.
The full mobile version offers a bit more functionality, as you can view your saved filters and any custom alerts for key account events, but it is not yet available for all mobile devices, such as Windows Phone 7.
While AdWords for mobile offers a fairly limited level of functionality, it’s likely to be vastly improved as Google seeks to put its platform into the hands of marketers wherever they can get a mobile signal. Outside of the updated AdWords mobile dashboard, only a limited number of bid management platforms, like Acquisio, have mobile apps for managing your PPC campaigns. Here’s hoping Microsoft adCenter rolls out a mobile application soon; otherwise, if you run campaigns outside of Google AdWords, you may have to consider a bid management platform if you spend a fair bit of time out of the office or at conferences.

What are the limitations of the AdWords dashboard?

The downside is that the information contained in the modules is not placed into context, as it was in the previous version where you could see the top-level performance of a particular campaign. The screenshot below shows how easily you could see individual campaigns performing against a previous period of time, although it was limited: it could not be broken down by AdGroup or keyword, and you could not select multiple active campaigns.
Old performance

Google AdWords Automated Simplicity



It has long been rumoured that Google was working on its own bid management platform. It now appears that it is slowly being added to AdWords accounts as part of a bigger roll out. Interestingly, there has been no notification to users who have it enabled; a new button just appears next to “alerts” on the campaign dashboard.
Automated Rules is a fairly obvious extension of the AdWords platform, and it’s certainly worth spending some time seeing how it can reduce some time-consuming daily tasks. There are always more suitable Google AdWords tools for any campaign, depending on budget, technical skills and the complexity of the project, but Automated Rules does have some good potential.
create rule

What is Automated Rules available for?

  • Ad Groups
  • Ads
  • Keywords
  • Campaigns

What is Automated Rules not available for?

  • Display Network
  • Ad Extensions
  • Remarketing

What is Automated Rules created for?

Based on the Google documentation around the product, it is initially focusing on 3 types of common changes that AdWords users regularly perform on their accounts.
  1. Status changes – pause an ad or keyword when it has spent its allocated budget
  2. Bid Changes – raise the bids to first page estimated CPC when conversion rate drops
  3. Budget Changes – raise or lower budget on a particular day of the week

Who is AdWords Automated Rules suitable for?

  1. Businesses without in-house PPC resources
  2. Those not currently using a bid management platform
  3. Simple campaigns that are measuring on Clicks, Impressions and Conversions.
  4. Improving account management by ensuring you keep within targets on Clicks, Costs or Conversions.

Who will lose out to Automated Rules?

Initially it will be those using entry-level bid management platforms like PPC BidMax, who might shift to using Google’s Automated Rules product for their grunt work and do their reporting within Google Analytics. While the new automated feature does remove a large percentage of the tedious tasks of AdWords campaign management, one of the reasons I would still advise exploring a bid management platform is the automation of reports. The Google AdWords platform does not offer the easiest interface for generating reports, and they often require data to be exported into Microsoft Office for analysis and presentation.
requirements

Which PPC bid platforms won’t be impacted anytime soon?

A number of the leading bid management platforms offer a far more extensive set of rules, conditions and reporting functionality than the Google AdWords platform is ever likely to offer. So if you like what Automated Rules can do but want something more, consider looking at one of the leading bid platforms, which can also work with Facebook & Bing:
  • Acquisio
  • Marin Software
  • Clickable
  • Kenshoo

How does Automated Rules work?

When you create a rule, you set what the rule applies to and what the automatic action is when preset requirements are reached. Rules can be applied as a one-off, or daily, weekly or monthly, at an hour you select. The analysis can be based on your account data from the current day, week or month, or up to one previous calendar month. The most interesting option is to use data from the previous business week (Mon–Fri), which allows a large number of advertisers to increase the volume and quality of traffic to their site during the peak traffic times of Monday to Friday.
frequency

Where does Automated Rules fail?

  1. If the AdWords platform is down for maintenance and your rule is scheduled to run, no changes will be made, where a bid management platform will retry later.
  2. Any rule you create will expire after one year, which makes the product not scalable.
  3. If you have too many elements being evaluated you will experience a timeout and your rule will not be applied. This is a problem for large accounts.
  4. You can have only 10 active rules per user, per account.
  5. The rules can only be applied as frequently as once a day, unlike bid management platforms that may run rules every few minutes.
  6. You cannot use data from previous weekends, only business days.
  7. It only works for Google AdWords campaigns, and you will likely run campaigns on other platforms as well.
  8. The rules have to be created manually; they are not highlighted like Google Analytics Insights does for creation of Advanced Segments

What fail safe measures does Automated Rules have?

It’s great that the AdWords team has taken the time to build in a fairly extensive list of fail-safe measures to track and record what Automated Rules has done, although how easy the changes are to undo is not yet clear. You can (and are encouraged by the notifications to) set up an email to be sent every time the rule runs, or only when errors occur as the rule is applied to your account.
The other great feature is the preview results button, which lets you see how many keywords, ads or ad groups the rule will be applied to. There is a log of automated rules, so you can monitor when they ran and click to view what changes were made. You must also authorize Google AdWords to allow Automated Rules to make changes before you can apply any rules.
preview

How do you get it?

Contact your AdWords account manager to get your account whitelisted for the beta test if it is not already visible in your account.

PPC Tool Review: QueryMiner Free Negative Keyword Tool



I’ve blogged on more than one occasion about a major issue I have with many of Google’s AdWords features and settings: the ROI focus tends to be more around Google’s ROI than yours as the advertiser. Even with a new feature like AdWords optimize for conversions where it looks like Google is legitimately aiming to help advertisers leverage a feature aimed at better ROI for them, there are some major holes.
This represents a major opportunity for paid search software vendors, and it should also prod you, the AdWords advertiser, to investigate opportunities to leverage tools and processes outside of AdWords to ensure that you’re working to build your business and not Google’s. And indeed this is precisely where smart PPC software companies like Click Equations, Acquisio, et al are focusing their efforts.
I recently came across a tool that fits into this category as well: QueryMiner’s free negative keyword tool.  The tool is pretty slick and most importantly it’s very heavily focused on ROI, cost and conversions for advertisers.

What is QueryMiner Trying to Accomplish?

According to the splash page:
Most AdWords accounts have anywhere from 2%-35% of their budgets being wasted on irrelevant search queries.  Get your FREE analysis and see if queryminer is right for you.  It’s not unusual for the free negative keywords we give you to easily cover the cost of your queryminer subscription.
They also have some content on the site about their approach to search queries. Lots of smart people have been talking about why search queries are more important than keywords in paid search. Most AdWords accounts make fairly liberal use of broad, phrase, and now modified broad match. As the QueryMiner team points out, this leads to a lot of waste in many accounts, so getting a good audit of where this waste may lie can be a pretty massive benefit to an advertiser. Like anything else in the search marketing tools space, though, the devil is in the details of how these recommendations are made.

How Does QueryMiner Recommend Negative Keywords?

This is really the most intriguing thing about the tool: it’s entirely based on conversions. A pretty common process for paid search advertisers is to analyze search queries manually and look for negative keyword opportunities.
This can be a really powerful process, but there are a few issues you often run into:
  • Lack of Data Per Query – Search queries aren’t keywords, and oftentimes single search queries just don’t have a lot of data attached to them (and if they do, they’re often core terms within your campaign that you wouldn’t consider setting as negatives).
  • Lots of Data to Look at in Aggregate – While there is often a lack of data per query, there is often a lot of data to mine in aggregate: this makes it difficult to find actionable negative keyword candidates, because you’re analyzing a lot of different statistically insignificant data sets.
  • A Need to Look at the Wrong Data – To solve this problem most PPC analysts will simply opt for the biggest possible data set to try to get more information about each search query. The issue here is that you’re looking at data from a year, sometimes more, back, which is far less relevant than “fresher” data (your campaign might have been structured completely differently a year ago, and what’s become a winning query through optimization may look like a loser because it performed poorly when you had bad ad copy and a poorly designed landing page).
QueryMiner has a novel approach that solves for a lot of these issues: it clusters together like terms and bubbles up clusters of terms that are spending significantly and not converting.
So for example you might only have a few clicks on the term “baby toys” and a few more on “organic baby toys” and a few more on “baby toys that aren’t expensive” and so on. By pulling a standard report you may struggle to get enough data on these terms individually to know for sure that they should be paused or are good negative keyword candidates. But what if the aggregate of everything with “baby toys” in the query had spent over five thousand dollars without converting?
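A simplified sketch of that clustering idea (this is my own reconstruction of the approach described, not QueryMiner’s actual algorithm, and the $100 threshold is illustrative): group queries sharing a phrase, aggregate their spend and conversions, and flag clusters that spend heavily without converting.

```python
from collections import defaultdict

def flag_negative_clusters(query_stats, phrases, min_cost=100.0):
    """query_stats: list of (query, cost, conversions).
    Aggregate spend and conversions for each phrase across all queries
    containing it, then flag phrases with spend >= min_cost and zero
    conversions as negative-keyword candidates."""
    clusters = defaultdict(lambda: [0.0, 0])  # phrase -> [cost, conversions]
    for query, cost, conv in query_stats:
        for phrase in phrases:
            if phrase in query:
                clusters[phrase][0] += cost
                clusters[phrase][1] += conv
    return [(p, cost) for p, (cost, conv) in clusters.items()
            if conv == 0 and cost >= min_cost]

stats = [
    ("baby toys", 40.0, 0),
    ("organic baby toys", 55.0, 0),
    ("baby toys that aren't expensive", 30.0, 0),
    ("wooden blocks", 80.0, 3),
]
print(flag_negative_clusters(stats, ["baby toys", "wooden blocks"]))
```

Individually none of the “baby toys” queries look significant, but the cluster as a whole has spent $125 with zero conversions, which is exactly the kind of aggregate signal the tool surfaces.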
These are the types of insights QueryMiner is aiming to bubble up.

Getting Started with QueryMiner

One thing that would improve the tool for me would be AdWords API integration; at the moment you have to grab a search query report, upload the file, and “stake your claim” (run your report). The tool is also still in beta and relatively new, so you may run into some hiccups here and there.
Setup is still a pretty easy process on the whole, and they even offer a handy tutorial on grabbing a search query report on the landing page, but I’d anticipate API integration being on tap for the product. Although QueryMiner gives you more flexibility in the volume of data you can analyze, it’s still a good idea to grab as much data as is logical for you, so that the tool can give you more suggestions for possible negatives. Once you have your search query report pulled and uploaded, you’re off to the races.

Analyzing Your Results

Believe it or not we’re already at the meat of the tool’s utility with that one simple step. The ease of use of the tool is really pretty nice. We’re already on to a screen with some really interesting data after uploading our search query report.
The tool has a “freemium” model so it gives you some of your data at no cost then requires you to sign up to get the rest. The nice thing is you get a glimpse of some of the things they’ll find with a deeper report, and it shows you what percentage of the data they have you’re seeing – so you can gauge the value of the data you’re seeing for yourself and decide whether it’s worth it to move forward with a full account. My guess is for most paid search accounts that are of a decent size the value will be worth it. Here’s a sample of some things they found in a real AdWords account:

Here we see three pretty egregious examples of irrelevant terms that are spending and not converting. The “negative” is representative of the entire cluster – what it’s telling us is where there’s a negative cluster of terms spending a lot in aggregate and which campaign they live in. This is now a great opportunity to take these terms and make them negatives.

Final Verdict

Because of the "try before you buy" design of the tool, I think it's worth any advertiser's time to get their initial list of possible negatives, assess the value of the sample, and calculate whether a subscription is worth the cost. In the larger picture of paid search software, I think this data-driven, conversion-centric approach to analyzing a metric Google doesn't make obvious (search queries are hidden underneath keywords) is indicative of where paid search software is heading. Whether or not QueryMiner can bring enough value to your account to justify the cost, it's precisely the type of tool you should be using to augment (and in some ways counteract) Google's suite of free tools.

Analysis of Online Marketing Campaigns Effectiveness from A to Z

Many people wonder why the last financial crisis barely affected the Internet advertising market while other industries suffered huge losses. Some experts said the industry never had much money in it, so it had nothing to lose; others said that small and medium businesses – the bulk of online advertisers – did not really suffer that much.
We think another answer is closer to the truth: Internet advertising allows targeted spending of the advertising budget. You can deliver your message to a clearly defined audience at the moment that audience is ready to receive it. You can also easily determine which advertising channel is the most profitable, and which of your ad texts, graphics or pages brings more sales.
In this article we will consider the most important aspects of building a comprehensive system for analyzing marketing effectiveness (offline as well as online), and highlight the key points to keep in mind when working with incoming data.

Setting business objectives and KPIs

The first stage is setting business objectives. Define your company's goals and break them down by department. Below is an example of departments and their goals for an online shop:
Sales
  • increase the percentage of visits that end in a purchase
  • increase the number of products per order
  • increase the average order value
Marketing
  • decrease marketing expenses per purchase
  • increase brand awareness
  • grow the conversion rate – turning visitors into buyers
  • increase the number of repeat purchases
  • increase the number of registrations
  • increase the number of mentions across the web
Call center
  • increase the number of handled calls
  • grow the conversion rate – turning calls into sales
  • reduce consultation time
  • reduce calls not related to placing orders (warranty, availability, etc.)
Web development
  • faster page load speed
  • less site downtime
  • fewer unavailable pages (404 and other errors)
Search engine optimization
  • increase visits from organic search
  • increase revenue from visitors who come from search engines
  • etc.
Based on these objectives, you should then define key performance indicators (KPIs) – the measurable expression of your business goals.
For example, you may use CRM data to measure the number of purchases; brand searches in Google Analytics plus type-in traffic to measure brand awareness; or the number of brand mentions on Twitter per calendar month to measure the growth of references on the web, and so on.
As a result, you get measurable indicators that let you track the dynamics and understand whether you are on the right path.
The result of this work may look like a table listing each KPI and how it is measured:
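As a sketch, such a KPI table could even live in code as a simple mapping from objective to indicator and data source. The entries below are illustrative, echoing the examples above; they are not a prescribed schema:

```python
# Hypothetical KPI table: business objective -> how it is measured and where.
kpis = {
    "increase repeat purchases": {
        "kpi": "repeat purchases per month",
        "source": "CRM system",
    },
    "increase brand awareness": {
        "kpi": "brand searches + type-in visits per month",
        "source": "Google Analytics",
    },
    "increase mentions on the web": {
        "kpi": "brand mentions on Twitter per calendar month",
        "source": "Twitter search",
    },
}

for objective, row in kpis.items():
    print(f"{objective}: track '{row['kpi']}' via {row['source']}")
```

Keeping the table in one place – a spreadsheet or a structure like this – makes it easy to check, during reporting, that every number you quote traces back to a business objective.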

Establishment of a web analytics system

Setting up a web analytics system is the next step in building effective marketing analysis. We recommend Google Analytics: it is free, easy to operate, and offers enough functionality to capture almost all the necessary data.
Installing it consists of two simple steps:
  • Install the Google Analytics tracking code (add a few lines of code to every page of the site)
  • Set up goal tracking, following your KPIs. For example, if you need data on the number of new buyers and the quantity of goods per purchase, set up two goals: "New customer" and "Successful sale".
You should also identify your other reliable data sources (for example, Twitter, keyword research in Wordtracker, call-center software, etc.). And it is better not to change how data is gathered mid-stream, because that can undermine its reliability.
Some tricks that can help you track the most common goals:
1) Call tracking:
  • Display different phone numbers depending on the traffic source
  • Have your operator ask the caller to perform an action on the site (press a "Thanks for your call" button, press CTRL+Enter) which is recorded as the goal "Successful call"
  • Use a combined method (if the client reads out, for example, a product code during the call, a special script is executed automatically on their computer)
2) Tracking file downloads (price lists, case studies, company profiles, etc.). The simplest way is to add an "onclick" handler to the download links, which fires a virtual pageview in Google Analytics.
3) Tracking registered users. If you want to monitor the behavior of registered users, you can assign values to Google Analytics "custom variables" when they register on the site, and then segment users by those values.

Advertising campaigns tracking

It is important to track as many of your marketing initiatives as possible, because each advertising campaign influences your business. For example, don't just start advertising on TV – measure the result. If you notice that an ad brings sales during, say, the Champions League final but nothing during an Indian soap opera, stop wasting money on the ineffective placement.
Let's consider the most common challenges facing a marketer who drives audiences to your company's website:

Online campaigns tracking

  • Tracking PPC effectiveness. You can track Google AdWords by linking your AdWords account to Analytics. Other PPC systems (Microsoft adCenter, Facebook, MIVA, etc.) can be tracked by tagging links with the Google and Excel URL builders, making it possible to see exactly which Facebook campaign or which adCenter keywords lead to goal completions.
  • Internal mailings, banner placements, Twitter, etc. can be tagged the same way.

Google URL Builder tool
Offline campaigns tracking
  • Create tagged links with the help of URL builders
  • Create interim pages with short URL addresses (www.amazon.com/SuperBowl)
  • Set up automatic redirection, so that a visitor is redirected through a chain:
http://www.amazon.com/SuperBowl ->
http://www.amazon.com/?utm_source=tv&utm_medium=video&utm_campaign=superbowl2011 ->
http://www.amazon.com/
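What a URL builder does is mechanical enough to sketch in a few lines of Python: append utm_source, utm_medium and utm_campaign parameters to the landing page, exactly as in the Amazon chain above. This is a minimal sketch, not Google's own tool:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def tag_url(base_url, source, medium, campaign):
    """Append Google Analytics utm_* campaign parameters to a URL,
    preserving any query parameters the URL already carries."""
    scheme, netloc, path, query, fragment = urlsplit(base_url)
    params = parse_qsl(query)
    params += [("utm_source", source),
               ("utm_medium", medium),
               ("utm_campaign", campaign)]
    return urlunsplit((scheme, netloc, path, urlencode(params), fragment))

# The short "vanity" URL printed in a TV spot redirects to the tagged URL,
# which Google Analytics records before the final redirect to the homepage.
redirects = {
    "http://www.amazon.com/SuperBowl":
        tag_url("http://www.amazon.com/", "tv", "video", "superbowl2011"),
}
print(redirects["http://www.amazon.com/SuperBowl"])
# http://www.amazon.com/?utm_source=tv&utm_medium=video&utm_campaign=superbowl2011
```

The same function covers the online case too – tagging banner, email or non-AdWords PPC links – so all campaigns show up consistently in your traffic-source reports.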

Correct understanding of indicators

Even a well-configured web analytics system doesn't guarantee effective usage.
We recommend building reports around the business objectives and KPIs you defined at the very first stage. Avoid abstract statements like "the bounce rate has decreased by 10%" or "the number of pages viewed has doubled". Be more specific.
Always ask yourself, "So what does it mean?" If two or three answers to that question don't connect the indicator to a business objective, the indicator doesn't matter.
For example:
  • The bounce rate increased by 10%. So what does it mean?
  • Pages viewed per visitor fell by 12%. So what does it mean?
  • Advertising income fell by 8%. Now we've got it: bounce rate is worth reducing because it affects advertising revenue.
The main goal of web analytics is to identify problems, suggest solutions, and measure the results.
For example:
The problem: a low number of site visits.
The solution: invest in search engine optimization.
The measured result: the search engine visits report shows visits up 56% over the last three months, with income from those visitors up ~$200,000 (+59%).
You should pay attention first of all to these top 10 reports:
  • Visitors. The "Map Overlay" report identifies your most attractive markets geographically. For example, we can see that Los Angeles, San Francisco and Denver convert much better than New York, Atlanta or Dallas.
  • E-commerce. The "Overview" report shows which traffic sources and campaigns bring the greatest profit, and lets you see conversion dynamics and revenue data for a given period.
  • Goals. The "Funnel Visualization" report identifies the bottlenecks that cause users the greatest difficulty on the path to a goal (for example, you may find that 90% of visitors abandon registration on the second page of a form).
  • Traffic Sources. The "Google AdWords" reports show which AdWords campaigns give the best return on investment.
  • Traffic Sources. The "Source/Medium" reports compare conversion across different methods of attracting an audience (direct, organic, CPC, banner, email, etc.).
  • Content. The "Top Content" report identifies the most popular pages of your site, so you can focus conversion improvements there first; using the $Index metric, you can also determine which pages generate more revenue and which generate less (and fix the latter).
  • Traffic Sources. The "Keywords" report (compared with the previous period) shows which of your most valuable keywords (in terms of sales) have started to convert less.
  • Content. The "Site Search" report identifies the most popular topics on your site, so you can provide users with the right content.
  • Traffic Sources. The "Direct Traffic" report indicates the level of interest in your brand and the dynamics of your core of regular customers (in conjunction with brand queries in the "Keywords" report).
  • Visitors. The "New vs. Returning" report shows whether visitors find the project interesting and are ready to come back again and again.

Conclusions

  • One of the most powerful advantages of online advertising is the ability to measure the effectiveness of every marketing activity.
  • You don't need to buy expensive web analytics tools: the free Google Analytics can solve 99% of the problems.
  • Try to integrate tracking of all campaigns using the URL Builder.
  • Don't try to review as many reports as possible; base your marketing analysis on your business objectives and KPIs.
  • Always ask yourself "So what does it mean?" and "What can I do with this?" to stay oriented among indicators and numbers. A set of recommendations on what to do is the best outcome of a good analysis.
  • Make a list of the 10 reports you will review regularly to obtain the data your business needs to develop.

125 Social Bookmarking Sites : Importance of User Generated Tags, Votes and Links

The positive effects of social bookmarking for publishers of news sites, blogs and other web sites are outstanding. Social bookmarking can introduce the sites you own or like to others with similar tastes, drive traffic to your site, and earn valuable backlinks.
Some social bookmarking sites, like Propeller.com, pass on link juice, while others use the NoFollow attribute. But don't let NoFollow fool you: the search engines look beyond the incoming links from social bookmarking sites to gauge their value to their indexes. The external metadata compiled from user-generated descriptions, tags, titles and categorization is highly valued by search engines. By the same philosophy as anchored backlinks, descriptive content about a site written by users who have nothing to do with its marketing or coding can be extremely powerful in gauging the importance and relevance of that site's content and tags.
Bookmarks show how a site is perceived, and when these sites allow voting, they also show the engines (or whatever classification system monitors voting) how people feel about the site's quality. Furthermore, social bookmarking can introduce a site to the search engines: in some cases, people may find and bookmark a site or its internal pages before a search engine finds those pages via another form of inbound link.
Monitoring social bookmarking services like Del.icio.us, StumbleUpon and Ma.gnolia can help search engines in multiple ways by:
  • Indexing Sites Faster : Humans bookmark sites launched by their friends or colleagues before a search engine bot can find them.
  • Deeper Indexing : Many bookmarked pages sit deep within sites and are not easily linked to by others – reachable only through poor or nonexistent site navigation, or through links from external pages.
  • Measuring Quality : Essentially, the more users bookmark a page, the more quality and relevance that page is assumed to have. A site bookmarked across multiple services by multiple users is much more of an authority than one with several bookmarks from a single user.
  • External Meta Data : Users who bookmark sites tag them with keywords and descriptions which add an honest and unbiased definition which is created by the public and not the owner of the site.
  • Co Citation : Social bookmarking sites tend to categorize sites and pages based upon the tags humans use to describe them; search algorithms can therefore classify these sites with their peers.
  • Number of Votes : Similar to the number of bookmarks, the more votes a page receives on Digg or Reddit, the more useful that information usually is. If the same page receives multiple votes across multiple social news voting sites, the higher quality the site.
  • Categorization : Like Co Citation, categorization can help define the subject of a site, therefore better helping the engine address searcher intent.
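The "Measuring Quality" and "Number of Votes" points suggest a simple heuristic, sketched below in Python. This is a toy formula of my own for illustration – no search engine publishes its actual weighting:

```python
def bookmark_authority(bookmarks):
    """Score a page by its distinct (user, service) bookmark pairs:
    many users across many services beats many bookmarks from one user.

    A toy heuristic illustrating the idea, not any engine's real formula.
    """
    pairs = {(b["user"], b["service"]) for b in bookmarks}
    users = {user for user, _ in pairs}
    services = {service for _, service in pairs}
    # Breadth across both users and services amplifies the raw count.
    return len(pairs) * min(len(users), len(services))

page_a = [  # three users, three services
    {"user": "ann", "service": "del.icio.us"},
    {"user": "bob", "service": "StumbleUpon"},
    {"user": "cat", "service": "Furl"},
]
page_b = [  # one user bookmarking everywhere
    {"user": "ann", "service": "del.icio.us"},
    {"user": "ann", "service": "StumbleUpon"},
    {"user": "ann", "service": "Furl"},
]
print(bookmark_authority(page_a) > bookmark_authority(page_b))  # True
```

The point of the `min(users, services)` factor is that either dimension alone is easy to game: one enthusiast (or spammer) can bookmark a page everywhere, but convincing many independent users on many independent services is much harder.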
To help share the wealth of social bookmarking, I've put together a list of 125 social bookmarking sites – some very popular, others newer or somewhat unheard of. Beyond the major social bookmarking and tagging sites, you will find some diamonds in the rough that are niche-oriented or treated well by Google and other search engines. One such surprise is Searchles, which is regarded as an authority by multiple search engines; bookmarks within Searchles sometimes rank quite high in some branded organic results.

125+ Social Bookmarking/News Sites You Should Consider

  1. Backflip : Backflip is a free service currently being run by volunteers. Backflip was started in 1999 by Netscape veterans Tim Hickman and Chris Misner. As a research tool, Backflip is clearly of value to the education community, and that community (or at least certain segments) has certainly embraced Backflip. A Google search of sites that contain the term “Backflip.com” results in numerous education-related links, including Teacher Tools.
  2. barksbookmarks : BARKS=BookmARKS is a website that combines social bookmarking, blogging, RSS, and non-hierarchical editorial control.
  3. BibSonomy : BibSonomy is run by the Knowledge Data Engineering Group of the University of Kassel, Germany. It's specifically designed for researchers to share bookmarks and bibliographies.
  4. Blinklist : A social bookmarking site launched by Mindvalley. According to their site, they launch several web businesses a year and focus on three areas: technology, media and marketing. BlinkList has a user-friendly interface, indicating that it's being run well and efficiently. They also claim to be "fully profitable". Furthermore, you can label and comment on any web page on the Internet.
  5. Blipoo : Meet Blipoo, a social bookmarking site for "cool" people sharing "cool" stories. It claims to help bloggers drive more traffic to their blogs because it allows self-promotion.
  6. BlogBookMark : Designed specifically for blog hunters, BlogBookmark.com claims to have the hottest news, gossip, and blog chatter from around the web. I highly suggest that mainstream bloggers bookmark their entries here.
  7. BlueDot : This basic social networking service allows users to save and share bookmarks.
  8. blurpalicious : Get Blurped! Not too different from other social bookmarks, but I love the tagline.
  9. Bmaccess : Social bookmarking with thumbs :)
  10. Bookkit : BookKit.com is an absolutely free web service designed to facilitate bookmark (favorites) management needs.
  11. BookMarkAll : Bookmarkall is an online bookmark community where users can create, organize and share their favorite web links online and access them anywhere.
  12. Bookmark-manager : Organizer for bookmarks, calendar, diary and knowledge.
  13. bookmarktracker : Free online storage, management, synchronizing and RSS sharing of your bookmarks.
  14. Bookmax : You can store your bookmarks and links to your favorite sites online and access them from wherever you are : basic Social Bookmarking.
  15. Buddymarks : The online personal, group and social bookmarks manager.
  16. Bukmark : Bukmark is a social bookmarking website.
  17. Chipmark : Another basic social bookmarking site.
  18. Citeulike : A free service to help academics to share, store, and organise the academic papers they are reading.
  19. Claimid : Manage your online identity. Although this is not a normal social bookmarking site, users can bookmark sites which reference their identity and build backlinks in this fashion.
  20. Clipclip : Clipclip allows you to save images and text, with a “bookmarklet”.
  21. Cloudytags : A unique word analyzer connects to your page, gets all the words and suggests the real tags your site is showing to the world.
  22. Complore : Derived from com-(with,together) and explore-(search, research). As the name suggests, complore is a vision to connect people from diverse backgrounds
  23. Connectedy : Lets you establish a personal link directory online. As you surf the web, you collect links, categorize them in a way that makes sense to you.
  24. Connotea : Social bookmarking (for researchers).
  25. Contentpop : It has the latest Web 2.0 features such as social bookmarking, blogging & RSS. It also uses the word POP in the title which means it must be good.
  26. coRank : coRank is a site where you can share whatever you find interesting on the web with people who value your opinion
  27. Crowdfound : CrowdFound is essentially a social bookmarking website, but with a different vision in mind
  28. de.lirio.us : Store, share and tag your favourite links. An open source clone of del.icio.us with private bookmarking, tagging, blogging, and notes.
  29. del.icio.us : THE social bookmarking site : It allows you to easily add sites you like to your personal collection of links and to categorize those sites with keywords. Not to mention that if enough people save your site in a bookmark, it will make their popular page and send a lot of traffic. Delicious is owned by Yahoo and is a MUST for your social media and bookmarking strategy.
  30. Diigo : Social bookmarking on steroids.
  31. Digg : The social news site that changed the Internet, Digg is a high power authority and a listing in Digg for a site, even if it only has a couple of votes, will rank highly on Google and other search engines for certain terms. If your site is shared and voted upon on Digg, and makes the Digg homepage, you’ll get a lot of traffic and attention from other bloggers who read Digg.
  32. Dropjack : DropJack.com is a social content website and owned by the ExactSeek company.
  33. Easybm : Allows users to bookmark their frequently visited sites on their private page, allowing 1-click access to their favorite web sites.
  34. Enroll : Social bookmarking system based in India.
  35. ez4u : Social Bookmarking – Ez4u to Bookmark : “Ez4u to Organize Ez4u to Share with Others Ez4u to Remember”
  36. Favoor : Favoor is your personalized new start page. Collect your favorite internet addresses.
  37. Folkd : Folkd is a social web-service about pages, news, audios, videos and blogs.
  38. Freelink : Freelink.org provides free pages of links that you can access anywhere at anytime.
  39. Freezilla : FreeZilla claims to be the first Web 2.0 freebies and promotions social networking site.
  40. Fungow : Fungow was designed to help better organize and keep track of your bookmarks.
  41. Furl : Like Delicious, LookSmart’s Furl.net is one of the first social bookmarking sites and considered an authority by the major search engines. Listing your sites in Furl will lead to traffic from organic rankings and its popular page drives traffic.
  42. Gather : Gather is a place to contribute articles and content, blog, tag and connect with people who share your passions. (Plus you can link out from the articles in this authority site).
  43. Getboo : GetBoo.com is yet another free online bookmarking service which allows you to store, edit and retrieve your bookmarks from anywhere online.
  44. Google : Allows users to save and create bookmarks in their Google Toolbar that can be accessed anywhere online. Google is getting more social by the day, so take advantage of Google Bookmarks and citations, because one day they will probably have some influence on the external metadata considered by the Google ranking algorithm.
  45. Hanzoweb : Hanzoweb – Bookmark, tag & share knowledge online
  46. Hyperlinkomatic : Hyperlinkomatic – bookmark list manager.
  47. i89.us : i89.us offers a free service which allows you to save your favorite website/links at one location that can be accessed from anywhere.
  48. Icio : Danish Bookmarking engine.
  49. Ikeepbookmarks : Popup feature allows you to add links while surfing the web
  50. Iloggo : Simple web based bookmarking tool that you can use for attractively displaying your favorite websites on one page.
  51. Jigg : Jigg.in is a socializing community with the latest stories / news submitted by users and has a familiar name :)
  52. Kaboodle : Kaboodle is a 2.0 shopping community where people recommend and discover new things.
  53. Kinja : Kinja is a blog guide, collecting news and commentary from some of the best sites on the web.
  54. Lifelogger : “LifeLogger is a great way to keep things that matter to you alive and sparkling.” And worth considering in a bookmarking campaign.
  55. Lilsto : Lilisto lets you store, manage and find your favorite links (or bookmarks) and removes the need to maintain them through your browser.
  56. Linkagogo : Favorites and Social Bookmarking Application, its unique dynamic toolbars automatically adapt themselves.
  57. Linkarena : German Social Bookmarking site.
  58. Linksnarf : Social link sharing with groups of friends.
  59. Listerlister : ListerLister is a social list building community where you can create, add to, and vote for both lists and the items added to them.
  60. Ma.gnolia.com : Like Furl and Delicious, another major bookmarking site which lets users organize bookmarks, search other people's favorites and make friends and contacts.
  61. Markaboo : MarkaBoo is tool for saving websites, files, and notes from your browser, email or mobile phone.
  62. Marktd : Marktd is a reference & voting system that highlights marketing articles considered valuable by the marketing community.
  63. Memfrag : memFrag stores your favorites personal notes, making them globally accessible from any computer.
  64. Memotoo : Lets users centralize and share their personal data.
  65. Mister Wong : Mister Wong is a social bookmarking site that originated in Germany, and has since become a popular and widespread tool.
  66. Mixx : An up and coming bookmarking and social news sharing network which should rival Digg, Reddit and others, Mixx blends popular photos, videos and stories.
  67. Mobleo : Allows you to easily add, organize, and share your mobile phone bookmarks with your friends using your desktop computer.
  68. Multiply : Florida-based social network Multiply, which reports nearly 3 million users and $6 million in funding, opened its social bookmarking site recently and has done well. Definite authority :)
  69. Murl : My URLs is a free online bookmarks manager, think of it as a bookmarks community.
  70. MyBookmarks : MyBookmarks – access your bookmarks anytime, anywhere. Free productivity tool for business, student or personal use. Another popular bookmarking site.
  71. Myhq : Store your bookmarks in one central location. Fast, text-based, banner free!
  72. MyLinkVault : A free online bookmark manager. Other bookmark managers can be so clumsy to use – trying to rearrange your bookmarks can be slow and frustrating.
  73. mySiteVote : mySiteVote is a community where you can vote your favorite site/s and view how popular a site is.
  74. MyWebDesktop : A collaboration and communication tool, designed to be as generic and easy to use as a telephone and email.
  75. Newsvine : The mission of Newsvine is to bring together big and little media in a way which respects established journalism
  76. Newsweight : NewsWeight is a democratic news, information, and entertainment resource.
  77. Oyax : Oyax is a social bookmark manager which allows users to easily add sites you like to personal collection of links, categorize those sites with keywords.
  78. Philoi : Person-to-person link sharing community. Save bookmarks and share links with your friends.
  79. PlugIM : PlugIM is a user driven internet marketing community. Submit content, share articles, comment on projects and promote your favorites to the front page
  80. Propeller : Formerly known as Netscape, AOL's Propeller has become a great social bookmarking news community which is considered an ultimate authority by Yahoo Search and passes link juice in its news story profiles. Propeller is also going to be redesigned very soon, which should be quite exciting.
  81. QuickieClick : QuickieClick is a second generation social bookmarking website with a visual twist.
  82. Rambhai : An Indian social bookmarking community
  83. RawSugar : A social search engine powered by user contributions. It's an online community, with over 130,000 URLs already tagged by its members.
  84. Reddit : Timely and shocking news oriented, Reddit stories are instantly voted upon and if liked by the community as a whole, can drive incredible traffic and users.
  85. Searchles : Owned by the DumbFind search engine, in my opinion Searchles is a much overlooked bookmarking tool and loved by Google, Yahoo and the other major search engines with its passing of link juice and high rankings for terms within search results themselves. Do not overlook Searchles.
  86. Segnalo : Italian Social bookmarking site.
  87. Simpy [late addition]: Social bookmarking & search, Simpy lets users “save, tag, search and share bookmarks, notes, groups and more.”
  88. Sitebar : A solution for people who use multiple browsers or computers and want to have their bookmarks available from anywhere without need to synchronize them
  89. Sitejot : A free online bookmark manager. Like every other social bookmarking site, it allows users to manage all of their bookmarks online in one convenient place.
  90. Sk*rt : sk*rt is a social media ranking platform of “pure goodness”, targeted towards women. Given the right story, Sk*rt can send A LOT of targeted traffic.
  91. Slashdot : The godfather of social news, SlashDot bookmarks are still quite powerful .. keep in mind the site has a heavy slant towards Linux and Open Source issues.
  92. SocialDanger : SocialDanger is a Web 2.0 open source content management system.
  93. Socialogs : A Digg-like Social Bookmarking Service.
  94. Sphinn : Very popular search marketing oriented social news and discussion site run via the Pligg system.
  95. Spotback : Spotback is a personalized rating system that recommends relevant content based on personal rating history using collaborative filtering
  96. Spurl : Another cherished bookmarking and tagging site, Spurl lets users keep online bookmarks & tags while offering full text searching, recommendations & storing of entire documents.
  97. Squidoo : Kind of spammed out, Squidoo is a 2.0 property which lets people and businesses set up a "lens" listing links, tags and relevant RSS feeds on different subjects.
  98. Startaid : I’ve noticed that StartAid bookmark pages rank highly in Google and other search engines. This basic bookmarking service allows users to describe, tag and categorize sites.
  99. StumbleUpon : Owned by eBay, StumbleUpon is an amazing blend of social bookmarking, voting, networking, web surfing, search and blogging. Best of all, StumbleUpon can send major traffic with its userbase of around 3 million users.
  100. Stylehive : The Stylehive is a collection of all the best products, brands, designers and stores discovered and tagged by the Hive community
  101. Syncone : SyncOne is an Internet aggregator of bookmarking and browsing.
  102. Tagfacts : Basic bookmarking and tagging, a social knowledge base.
  103. Taggly : Store, share and tag your favorite links.
  104. Tagne : TagNe.ws is user-submitted, community voted links and resources related to SEO, Blogging, RSS, Tagging, Internet Marketing and more.
  105. Tagtooga : Says that this bookmarking engine can be used to discover great sites difficult to find in Google/Yahoo by browsing categories.
  106. Tagza : A very young Social Bookmarking site mostly being used by Indian and Pakistani web masters.
  107. Technorati : Always changing and reinventing themselves, this recognized authority offers links to blogposts, tagging and a social bookmarking WTF section.
  108. Tedigo : Personal and social bookmarking in Spanish and English made simple.
  109. Thinkpocket : Lets users pocket websites you find valuable. It is a web service that aims to help store, organize and share your favorite sites
  110. Thoof : Thoof is a user generated news and information service that claims to learn about what users are interested in and delivers news that they care about.
  111. Totalpad : TotalPad is a new online news and article community where people are free to voice their opinions
  112. Urlex : With URLex system users are able to leave a comment regarding any internet link on any site. Possibly good for linking :)
  113. Uvouch : Another basic social bookmarking site, users can save their findings with one click, at one place and access it from anywhere.
  114. Vmark : An online bookmark and online favorites manager.
  115. Voteboat : VoteBoat is a user-controlled rating and voting site.
  116. Votelists : VoteLists lets users create a list of rankable items. Other can add items, comment on them, rate them and more!
  117. Vuju : Vuju allows user to submit/publish content which can be tagged and promoted.
  118. WeTogether : Social bookmarking site where people will have great opportunities to promote their own sites.
  119. Whitelinks : Securely store and quickly access favorite websites whenever connected to the Internet.
  120. Wink : A social search engine where users can share results and answer questions. Users build profiles which can link out to bookmark pages or other web sites (hint hint).
  121. Wirefan : Social bookmarking and news article submission site.
  122. Xilinus : Organize and manage bookmarks online.
  123. Xlmark : xlmark is an easy social bookmarking site.
  124. Yahoo! Bookmarks: The MOST POPULAR social search and bookmarking service on the web. It’s similar to Delicious and something they launched before acquiring Delicious. Yahoo Bookmarks lets users store bookmarks using their Yahoo Toolbar and access them from any computer.
  125. Yattle: Bookmark Management and Mini-Blogging Service.
  126. Zlitt: Zlitt is a social bookmarking system which gives users the opportunity to share and tag favorite news, images and videos.
  127. Zurpy: Saves bookmarks, text clippings, images, files, and news feeds in one place.