You Are Browsing The SEO Category

Google AJAX Search Results

June 28 2010 // SEO // 2 Comments

A few months ago I wrote about how you could track keyword rank in Google Analytics. I said I’d blog about how to create ranking reports based on the new data but as I did due diligence on those reports I found something much more interesting.

AJAX Search Results Performance

There is a profound difference in the performance of Google AJAX search results versus normal search results.

Google Search Results Comparison

Standard search results drive 13% more traffic per keyword, 38% more keywords per landing page and 56% more traffic per landing page than their AJAX counterparts.

Creating The Search Result Comparison

The Google Analytics rank hack captures a specific rank parameter present on AJAX search results. If that parameter doesn’t exist, no rank is reported for that keyword. Upon implementing the GA rank hack you find that Google is serving AJAX results approximately 20% to 25% of the time. This is simply a product of comparing traffic reported with a rank to the total traffic reported in the profile.
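As a back-of-the-envelope check, you can estimate that share directly from the two traffic totals. The numbers below are hypothetical placeholders, purely to illustrate the arithmetic:

```python
# Hypothetical weekly totals from a Google Analytics profile.
ranked_visits = 21_500   # visits where a cd= rank parameter was captured
total_visits = 100_000   # all organic Google visits in the profile

# Share of traffic served via AJAX results -- lands in the 20-25% band.
ajax_share = ranked_visits / total_visits
```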

You can also create advanced segments that include or exclude the AJAX search results by using the user defined value (Rank:) implemented in the GA rank hack. This gives you the ability to easily compare the performance of AJAX versus normal search results.

Each week I capture the amount of traffic as well as the number of keywords and landing pages that drove that traffic. I then create calculated metrics around traffic per keyword, keywords per landing page and traffic per landing page. There’s no real right or wrong number for these metrics, but they can be quite useful in trend analysis and, in this instance, for comparison analysis.
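If you want to reproduce these calculated metrics yourself, it’s just three divisions over the weekly totals. Here’s a minimal sketch; the totals below are made-up stand-ins for whatever your own profile reports:

```python
# Hypothetical weekly totals pulled from a Google Analytics profile.
weekly = {
    "visits": 42_000,        # total organic traffic
    "keywords": 6_500,       # distinct keywords that drove traffic
    "landing_pages": 1_800,  # distinct landing pages that received traffic
}

# The three calculated metrics used for trend and comparison analysis.
traffic_per_keyword = weekly["visits"] / weekly["keywords"]
keywords_per_landing_page = weekly["keywords"] / weekly["landing_pages"]
traffic_per_landing_page = weekly["visits"] / weekly["landing_pages"]
```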

I’ve aggregated results from two large sites that come from very different verticals. Both showed similar results. I recognize that two is a small sample size so I encourage others to perform the same analysis to increase the confidence level of these results.

What’s Going On With AJAX Results?

It’s difficult to say with certainty why AJAX search results would perform so radically differently from normal search results.

I first compared the Browser and OS profile for each to see if there was a distribution discrepancy. Sure enough, there is. Normal results are presented with far greater frequency (almost to exclusion) for the following:

  • Safari/Macintosh
  • Chrome/Windows
  • Firefox/Macintosh
  • Safari/iPhone
  • Safari/Android
  • Safari/iPad
  • Safari/iPod
  • Chrome/Macintosh

Yet, when I isolate Internet Explorer/Windows (sadly, the dominant combination) I see the same pattern for each result type. So while the difference in Browser and OS is interesting, it doesn’t seem to be the reason for the performance differential between AJAX and normal search results.

Could it be algorithmic? It seems far-fetched to think Google would implement a different algorithm for AJAX search results. Nevertheless, let’s put it down using the ‘no idea is a bad idea’ brainstorming methodology.

Perhaps it’s related to the presentation of search results. This makes a bit of sense since they are being rendered differently. But most people can’t tell the difference between the two, so I don’t think the way in which they are rendered would result in different user behavior.

But what about the intersection between algorithm and presentation? What if the implementation of Onebox, personalization or AdWords varied by type of search result? Could Google be using AJAX search results for testing purposes?

Further analysis leads me to believe this might be true.

The June Gloom Update

I started tugging at the thread of this analytic sweater when I noted that total traffic from Google was going up but ranked traffic was going down.

Google Total Visits

Google AJAX Visits

The opposing trend lines don’t make sense. This strange trend began around June 1st which is why I call it June Gloom.

Of course, a simple explanation could be that Google was showing fewer AJAX results. So I dug further.

Ranked Traffic on Google AJAX Results

The great thing about the Google Analytics rank hack is that you can track the trend of each position.

Google AJAX Traffic by Rank

The trend for keyword rank 1 and 2 dropped dramatically (~40% and ~25% respectively) while all other ranks were flat or improved during the same period. The drop in traffic from rank 1 and rank 2 far surpassed the increases in traffic from all other ranks combined. Hence, the drop in overall AJAX results traffic.

Clearly, this same pattern is not happening with normal search results. If it were, the total traffic from Google would be going down. Instead it’s going up. This also indicates that it is not about how often AJAX search is presented.

What Caused June Gloom?

Maybe the tracking parameter is broken. It could be, but that would mean it was broken only on keywords ranked first and second, and only some of the time for those ranks. This isn’t impossible, but it certainly seems improbable. Inspection of AJAX search results didn’t turn up any missing rank parameters.

I decided to check a handful of high volume keywords which had a number 1 ranking. An advanced segment for rank 1 can quickly identify these keywords. On both sites, the top performing keywords did not show any impact from June Gloom.

I double checked this by using the new Top Queries report in Google Webmaster Tools. Downloading and comparing the weekly data, I rounded the position number and then created a pivot table to look at impressions and clicks for keywords in the first position. There was virtually no difference in impressions, clicks or click rate pre and post June Gloom.
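That pivot is easy to reproduce in a few lines of code if spreadsheets aren’t your thing. The rows below are hypothetical stand-ins for a downloaded Top Queries report (query, average position, impressions, clicks):

```python
# Hypothetical rows from a Top Queries CSV download.
rows = [
    ("blue widgets", 1.2, 10_000, 3_100),
    ("widget reviews", 1.4, 4_000, 1_050),
    ("cheap widgets", 2.6, 6_000, 600),
]

# Round the position and keep only queries that round to position 1,
# mirroring the pivot table described above.
first = [(imp, clk) for _, pos, imp, clk in rows if round(pos) == 1]
impressions = sum(imp for imp, _ in first)
clicks = sum(clk for _, clk in first)
click_rate = clicks / impressions
```

Run that against a pre and post June Gloom week and compare the three totals.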

This doesn’t mean it’s not happening, only that it’s not happening on the top queries. Remember, the Top Queries report is not all-inclusive and it probably draws from all results – both AJAX and standard.

If top queries were unaffected, it followed that long tail keywords were being impacted by June Gloom. This could be based on keyword volume, keyword type or number of words in the query. The latter was the easiest to test.

By using the Google Analytics API we grabbed keywords that had delivered traffic at rank 1. We used an Excel function to count the number of words and then created a pivot table showing the sum of clicks by word count. We then compared the distribution of traffic by word count in a pre and post June Gloom week. The result? The distribution was virtually the same.
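The Excel word count and pivot can also be done in code. This sketch uses hypothetical (keyword, clicks) rows in place of the API response, but the logic is the same pivot we built:

```python
from collections import defaultdict

# Hypothetical (keyword, clicks) rows as they might come back from the
# Google Analytics API for traffic at rank 1.
rows = [
    ("shoes", 120),
    ("red running shoes", 40),
    ("buy cheap red running shoes", 8),
    ("running shoes", 75),
]

# Sum clicks by the number of words in each query -- the same pivot
# table we built in Excel, just in code.
clicks_by_word_count = defaultdict(int)
for keyword, clicks in rows:
    clicks_by_word_count[len(keyword.split())] += clicks

# Normalize to a distribution so pre and post weeks can be compared.
total = sum(clicks for _, clicks in rows)
distribution = {n: c / total for n, c in sorted(clicks_by_word_count.items())}
```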

Word Count Traffic Distribution

So we can only theorize that long tail keywords based on volume or type are being impacted by June Gloom. What would do that? Greater AdWords reach? More personalization? A higher rotation of Onebox and/or 7-pack listings? Or maybe it is algorithmic. Unfortunately, we have been unable to find the smoking gun – that query or queries that present differently on each type of search result.

Or perhaps we’ve botched it – made a gross assumption or overlooked the obvious. Once again, peer review is encouraged.

Who’s this we? This is a good time to mention that my colleague Jeremy Post provided invaluable assistance in data gathering and in talking through the analysis.

June Gloom and Caffeine

These days no change in traffic can escape the Caffeine question. Strangely, there is some evidence of Caffeine in the data.

The number of landing pages driving total traffic has increased. Landing pages are up nearly 20% for standard search results and only down 3% for AJAX search results. In fact, one of the two sites saw a small increase in landing pages from AJAX results. The number of landing pages was least affected in AJAX results compared to the changes seen in traffic and keywords.

This leads me to believe that the improvements in crawl and indexation brought on by Caffeine may be adding more landing pages. However, these new landing pages haven’t yet acquired a cluster of keywords. This drives down the keywords per landing page metric. Essentially, the rate at which keywords are added is lower than the rate at which landing pages are being added.
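A quick illustration of that rate argument, with made-up numbers:

```python
# Hypothetical snapshot before Caffeine: keywords spread over landing pages.
before = 6_000 / 1_500   # 4.0 keywords per landing page

# After: landing pages grow ~20% while keywords grow only ~3%.
after = 6_200 / 1_800    # ~3.4 -- the ratio falls even though both counts rose
```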

The difference on AJAX search results is that there were substantial keyword losses even as new landing pages were being added. June Gloom may actually have been tempered by the impact of Caffeine.

AJAX Results and June Gloom

The evidence suggests there is a difference between AJAX and normal search results. And June Gloom only serves to reinforce this theory.

SEO is supposed to be a zero sum game. One site gains traffic at the expense of another. Are other sites benefiting from AJAX results? What sites are now getting more traffic from those top two positions?

Of course, AJAX results and June Gloom could be the vehicle Google is using to test Onebox, 7-pack, personalization and universal search options. SEO is still a zero sum game … but the game board is expanding.

Are you ready?

The Best SEO Tools Not About SEO

June 22 2010 // SEO + Technology // 5 Comments

There are plenty of great blog posts about SEO tools, though you should be careful to look at a curated and updated list. Actually an SEO tool wiki would be an interesting idea. But I digress.

Instead of discussing the SEO tools I use I thought I’d share the other tools I use each and every day. Tools that have become indispensable, saving me time, energy and headaches.

Dropbox

Dropbox Logo

Sharing files can be a hassle unless you have Dropbox.

Dropbox is essentially a cloud based storage system. I started using it to sync files between my laptop and desktop computers. But what Dropbox is really good for is sharing files with clients.

Email is unreliable and you often wind up spending time waiting for folks to find and download files. Dropbox lets you create a shared folder for each client where you can keep all related materials. Not only can your client find the materials, they can point internal resources to it with ease. This is particularly useful if a client uses contracted or offshore developers.

You may have to convince clients to install Dropbox. Don’t worry, the 2GB plan is free, installation is easy and the instant value it delivers will earn you quick kudos.

Adium

Adium Logo

I love Instant Messaging. Short of being on-site, this is often the best way to communicate, clarify and remove roadblocks. Email is slow and asynchronous. The phone doesn’t provide the added context of links or screen shots. IM is fast and effective. It can also be a hassle if you have clients on multiple IM platforms. Yahoo! Messenger, Google Talk, Jabber, AIM and more.

Setting up accounts with each is easy, but having every IM client up and running at once creates problems. Context switching between each platform’s UI is not trivial. The messages arrive in different ways (with different sounds) and you wind up having multiple windows begging for your attention.

That’s where Adium comes in.

Adium is a free instant messaging application for Mac OS X that can connect to AIM, MSN, Jabber, Yahoo, and more.

Adium unifies all your IM programs into one slick interface. The tabbed chat feature is particularly nice so that you don’t have a new window for each IM conversation cluttering up your monitor.

You can even combine contacts (the same person on multiple IM platforms) “so that each one represents a person, not an account.” This is nice when you don’t care how you reach them, just that you reach them. Like the phone, you don’t care who the carrier is, you just want to connect.

The good news is you can use Adium even if your friends or clients don’t. The bad news, it’s Mac only. Windows users might want to check out Trillian instead.

TinyGrab

TinyGrab Logo

If a picture is worth a thousand words, perhaps a screen shot is worth a few hundred.

Screen grabs are a vital part of the SEO process. You want to show clients what you’re seeing and how to fix it. If you’re building a presentation deck this isn’t a huge problem.

If you’re having an IM conversation about an issue (with Adium I hope), the traditional screen grab can be slow and clunky. Enter TinyGrab.

Download this tool and each time you take a screen grab it saves it to the cloud and copies a tiny URL of it to your clipboard. Then simply paste it into your conversation and you’ll be looking at the same thing in no time.

The free version of TinyGrab gives you 10 grabs a day. For a one-time fee of £10 you can upgrade to the premium version for unlimited grabs.

These tools make me more productive every day. Do you have other tools that make a difference in your daily life? Share them here.

Google’s Heisenberg Problem

June 13 2010 // SEO // 6 Comments

Google has a Heisenberg problem. In fact, all search engine algorithms likely have this problem.

Google Heisenberg Problem

The Heisenberg Uncertainty Principle

What is the Heisenberg Uncertainty Principle? The scientific version goes something like this.

The act of measuring one magnitude of a particle, be it its mass, its velocity, or its position, causes the other magnitudes to blur. This is not due to imprecise measurements. Technology is advanced enough to hypothetically yield correct measurements. The blurring of these magnitudes is a fundamental property of nature.

The quantum mechanics that go into the Heisenberg Uncertainty Principle are hard to follow. That’s probably an understatement. It is fascinating to read about the verbal jousting between Heisenberg, Schrödinger and Einstein on the topic. (I’m envisioning what the discussion might look like as a series of Tweets.)

Yet, some of the mainstream interpretations you get from the Heisenberg Uncertainty Principle are “the very act of measuring something changes it” and, by proxy, that “the observer becomes part of the observed system”.

These two interpretations can be applied to Google’s observation of the Internet.

Google Changes the System

Observer from Fringe

Not only does Google observe and measure the Internet, they communicate about what they’re measuring. Perhaps quantum theorists (and those looking at comments on YouTube) would disagree with me, but the difference is also that the Internet is made up of real sentience. Because of this, the reaction to observation and measurement may be magnified and cause more ‘blur’.

In ancient SEO times, meta keywords were used in the algorithm. But the measurement of that signal caused a fundamental change in the use of meta keywords. This is a bit different than the true version of the Heisenberg Uncertainty Principle where measuring one signal would cause the others to blur. In this case, it’s the measurement of the signal that causes it and the system to blur.

Heisenberg and the Link Graph

Today the algorithm relies heavily on the link graph. Trust and authority are assigned based on the quality, quantity, location and diversity of links between sites. Google has been observing and measuring this for some time. The result? The act of linking has blurred.

The very act of measuring the link graph has changed the link graph.

In an unmeasured system, links might still be created organically and for the sole purpose of attribution or navigation. But the measured system has reacted and links are not created organically. The purpose for a link may be, in fact, to ensure measurement by the observer.

Nowhere is this more clear than Demand Media’s tips to driving traffic.

Demand Studios Traffic Tips

Clearly, the observer has become part of the observed system.

Social Search Won’t Help

There are those who believe that moving from document based measurement to people based measurement will solve algorithmic problems. I disagree.

In fact, the people web (aka social search) might be more prone to blur than the document web. Documents can’t, in and of themselves, alter their behavior. The raft of content produced won’t simply change on its own; people have to do that. And that’s precisely the problem with a people based algorithm.

Think links, social mentions and other Internet gestures are perverted by measurement now? Just think how they’d change if the measurement really were on people. The innate behavior of people on the web would change, as would their relation to documents. Do I have to mention Dr. House’s favorite mantra?

No, social search won’t help. Is it another signal? Sure. But it’s a signal that seems destined to produce a higher amount of blur.

Does Google Measure Blur?

Does Google have an attribute (meta data) attached to a signal that determines the rate or velocity of blur? I have no idea. But those folks at Google are pretty dang smart. And clearly something like that is going on since signals have gone in and out of favor over time.

At SMX Advanced 2010, Matt Cutts made it clear that the May Day algorithm change was made, primarily, to combat content farms. These content farms are a form of blur. The question is what type of signals did they trace the blur back to? Were they content based signals (e.g. quality of content based on advanced natural language processing) or link based signals?

Does Google realize they are, inadvertently, part of the problem? Aaron Wall believes it’s more intentional than inadvertent in his excellent How To Fix The Broken Link Graph post.

Should Google Shut Up?

Shut Up Google

Google has always been somewhat opaque in how it discusses the algorithm and the signals that comprise it. However, they have been trying to communicate better over the last few years. Numerous blogs, videos and Webmaster Central tools all show a desire to give people better guidance.

You’ll still run into stonewalls during panel Q&A, but more information seems to be flowing out of the Googleplex. On the one hand, I very much appreciate that, but a part of me wonders if it’s a good thing.

In this case, the observer is clearly acting on the system.

Should Google tell us meta keywords are really dead? Or that they don’t process Title attributes? Or that domain diversity is really important? Sure, we’ll still have those who postulate about the ranking factors (thank you SEOmoz), but there would be a lot less consensus. It might produce less homogeneity around SEO practices. In other words, the blur of other signals might lessen if the confidence in signal influence weren’t as clear.

It’s not that the algorithm would change, but perhaps the velocity in which a signal blurs beyond usefulness slows.

The Solution

Well, there really isn’t a solution. A search engine must observe and measure the Internet ecosystem. Acknowledging that they’re part of the system and working on ways to minimize their disturbance of the system would be a good start. Hey, perhaps Google already does this.

The number of signals (more than 200) may be a reaction to the blur produced by their measurement. More signals mean a distribution of blur? But I somehow doubt it’s as easy as a checks and balances system.

Google could maintain very different algorithms – 2 or even 3 – and randomly present them via their numerous data centers. However, they’d all have to provide a good user experience and it’s been difficult for Google to maintain just the one thus far.

I don’t have the answer, but I know that the rate at which the system blurs as a result of Google’s observation is increasing. I believe Google must account for this as they refine the algorithm.

What do you think?

I Ask Good Questions

June 12 2010 // Humor + SEO // Comments Off on I Ask Good Questions

I ask good questions. Really good questions. Even Matt Cutts says so.

Really Good SEO Question

Best Question at SMX Advanced

The You&A with Matt Cutts at this year’s SMX Advanced covered two main topics: Caffeine and May Day. After a humorous back-and-forth between Danny and Matt the public Q&A period started.

Danny asks about Caffeine update and HTML5. A “really good question,” Matt says. Matt says HTML5 is completely unrelated to Caffeine, and Google doesn’t give bonus points for code that validates. But Google does have an HTML parser in the wings.

That’s my question! Proof will come with the video where Danny attributes the question to me by name. Not that it really matters, but it’s nice to be acknowledged and get a pat on the back every once in a while. As an aside, the online submission form is a great way for introverts to interact.

Okay, it’s probably not the best question, but maybe Gil Reich can add a new Best Questions category to his Best of SMX Advanced 2010 post.

Best SMX Advanced Session

Speaking of the best. The You&A with Matt Cutts was great but I walked away from Search Marketing in the Facebook Zone with actionable information and a renewed passion for paid search.

Dennis Yu from BlitzLocal shared real world campaigns and tips on getting the most from Facebook advertising. And Marty Weintraub from aimClear reminded me that figuring out a new platform is amazingly fun. Thank you.

SMX is the Best

Speaking of thanks. A big thank you to Danny Sullivan who has done a super job in providing the search industry with a valuable conference series and continues to be a great ambassador for the search community. See you next year.

Unlink at your own risk

June 01 2010 // Rant + SEO + Web Design // 3 Comments

unlinking text

There’s a new unlinking meme going around that contextual links are a bad thing for web content. That they’re a distraction and take away from the prose of the journalist or blogger. It’s amazing that so many smart people actually believe the myth that people are reading their content word for word.

They’re not.

People Scan Text

Jakob Nielsen found that 79% of users scanned content and further research has supported this finding. But there are ways to increase the readability of web content.

highlighted keywords (hypertext links serve as one form of highlighting; typeface variations and color are others)

That’s right. Links actually help the usability and readability of your content.

Writing for the Web

Putting all the links at the end may encourage users to skip your content. People aren’t patient and while it would be nice if they were, I don’t think that’s going to change. Fighting against this instinct doesn’t seem to be a winning strategy, nor is it entirely bad.

Different mediums dictate different writing styles. A novel versus haiku versus grant writing. They’re all very different in style, syntax and structure. Contextual links are simply a part of the style, syntax and structure of web content.

Links and SEO

Don’t forget that links are still an important part of SEO and recent research indicates that links within the text likely carry more trust and authority. And while backlinks are far more important, establishing a hub of authority and your presence within a ‘neighborhood’ is going to help your content get read by the right people.

So, get over the anecdotal stories and past the vanity. Links within text are valuable in web communication. Period.

Why aren’t you watching Matt Cutts videos?

May 30 2010 // Rant + SEO // 5 Comments

Matt Cutts Videos

The average number of views a Google Webmaster Central video, starring Matt Cutts, receives is approximately 4,000.

That’s right, only 4,000 people (if you believe every view is unique) tune in to learn from the guy who heads up Google Web Spam and is the face of Google search.

I don’t get it.

Is the SEO community that small?

One person recently commented that the Facebook Like button made him feel lonely.

Fundamentally, this means that the web is a lonelier place for me. It’s like walking on a sidewalk on one side of the street, where it’s totally empty, and getting a glimpse that the other side of the street is crowded with friends chatting. The friends are there: they’re just not mine. I must be a loser.

When I look at the number of views Google Webmaster Central videos get, I begin to feel similarly. Are there only 4,000 people who share my passion for search engine optimization? Is the SEO community that small?

SEO Search Volume

I decided to do a little research. First up was to see what type of search volume ‘SEO’ gets using Google’s Keyword Tool.

search engine optimization search volume

It’s not monster volume but it certainly shows that there’s a fair amount of interest in the topic.

Quick note, if you don’t like Google’s new interface you can force the old version of the Keyword Tool with the following URL:

https://adwords.google.com/select/KeywordToolExternal?forceLegacy=true

SEO Site Traffic

So what about some of the major sites out there? What type of traffic do they get?

SEOmoz Traffic

SEOBook Traffic

MattCutts.com Traffic

While I don’t usually find Compete* to be accurate, it shows that sites like SEOmoz and SEO Book get nearly 1 million visitors a month. Even Matt Cutts gets nearly 400,000 visitors a month to his blog. So, the idea that only 4,000 people are watching his videos is … shocking.

*I find Compete to be wildly wrong most of the time but the alternative is Quantcast which has SEOmoz at 46K, SEO Book at 3K and Matt Cutts at 15K. While I usually find better success with Quantcast, these numbers seem outlandishly wrong.

RSS Webmaster Central Videos

Webmaster Central Videos are usually less than 2 minutes long and Matt provides as much information as he can on a specific topic. Sometimes the topic isn’t that illuminating, and sometimes Matt can’t divulge as much as you might like. But it’s better than a poke in the eye with a sharp stick, right?

Read between the lines or listen for what seems like an offhand comment and you often do learn something. As I was writing this, a new video was uploaded that addresses the May Day algorithm change.

The best way to ensure you’re watching these videos is to subscribe to them via RSS. This is easy, so do it now!

Just go to the Google Webmaster Central Channel on YouTube and click on the RSS icon in your browser toolbar. If you’re not using RSS, well … shame on you.

Optimize Sitelinks

May 11 2010 // SEO // Comments Off on Optimize Sitelinks

Sometimes you overlook the obvious. It’s right there in front of your face but you don’t see it. If it were a snake it would have bit you.

Google Insights for Search Sitelinks

Remember to Optimize Your Sitelinks

Sitelinks appear under your site’s home page search result on Google. Not every site gets them, and when you do, the links are generated algorithmically. The sitelink algorithm isn’t particularly refined, so it’s up to you to optimize the links.

Thankfully, Google makes sitelink management rather painless. Through Google Webmaster Central you can quickly review the algorithmically generated sitelinks and block ones you don’t wish to appear in the sitelink list. No, you cannot add a sitelink. The potential for abuse is simply too high.

Sitelink Management

SEO is not a set it and forget it business and this extends to sitelink management.

Once you’ve blocked a sitelink, it won’t appear in the Google search results for 90 days. This period will be extended every time you visit the Sitelinks page on Webmaster Tools.

So don’t forget to check in on your sitelinks to ensure that you’re putting your best foot forward on your branded search queries. An embarrassing sitelink could wind up damaging your brand.

Et Tu Google?

Just prior to SMX West Google released an SEO self-assessment. Their SEO Report Card (pdf) showed that only 32% of Google products (14 of 44) had appealing sitelinks. So, take comfort, it’s not just you that’s overlooked sitelink optimization.

Yet, two months after identifying this issue, sitelinks for Google Insights for Search have clearly not been optimized.

Et Tu Google?

Track Keyword Rank in Google Analytics

April 21 2010 // Analytics + SEO // 23 Comments

In February, Matt Cutts referenced a parameter in AJAX based Google search results that would let you track the rank of that result. Sure enough, it’s there and with just a little bit of know how you can track keyword rank in Google Analytics.

Tracking Rank in Google Analytics

At first glance you might think that tracking keyword rank would be tough to implement, but it’s really not. Here’s an easy step-by-step guide to capturing keyword rank in Google Analytics.

Create a New Google Analytics Profile

Simply click on Analytics Settings within Google Analytics. You must be a Google Analytics administrator to do this.

Google Analytics Settings

At the bottom, find and click on Add Website Profile.

create new profile

You want to Add a Profile for an existing domain and then select the domain and enter a Profile Name. I suggest something easy and descriptive like “Google Rank”.

create new google profile

When you’re done you’ll see a new profile appear in your Analytics Settings list. Don’t worry if you see a yellow triangle with an exclamation point in the Status column. The tracking for a new profile takes a bit of time to populate. As long as the current tracking for that domain is working, this will take care of itself.

Create Profile Filters

Click the Edit link next to your new profile so you can create three filters. The first ensures this profile will only report organic traffic.

analytics organic filter

The second ensures this profile will only report Google traffic.

analytics google filter

The third one is a bit more complicated and involves capturing the keyword rank using a regular expression in an Advanced Filter.

google analytics keyword rank filter

If the picture isn’t clear enough you want to enter: (\?|&)cd=([0-9]+)

All the regular expression is doing is looking for that special parameter (?cd= or &cd=) in the URL, capturing the number (aka rank) after the cd=, and using it in the User Defined field. You might be able to get away with just &cd=([0-9]+) but smart folks like Yoast are using both. I did a quick test, captured the delimiter data ($A1), and found 100% of it to be the ampersand (&). That said, I recommend covering your bases and matching on both.

Be sure to use $A2, since the 2 refers to the second set of parentheses, where rank is captured. If you’re interested (like I was), the advanced filters help on Google isn’t a bad read and this regex cheat sheet is a nice reference as well.
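If you want to sanity check the pattern before trusting it in a filter, you can exercise it outside of Google Analytics. This is a small sketch of the same regex in Python; the referrer URLs are hypothetical examples showing both delimiter cases:

```python
import re

# The same pattern used in the advanced filter. Group 1 is the
# delimiter ($A1); group 2 is the rank ($A2).
RANK_PATTERN = re.compile(r"(\?|&)cd=([0-9]+)")

def extract_rank(referrer):
    """Return the cd= rank from a Google referrer URL, or None if absent."""
    match = RANK_PATTERN.search(referrer)
    return int(match.group(2)) if match else None

# Hypothetical referrer URLs.
print(extract_rank("http://www.google.com/search?q=seo&cd=3&hl=en"))  # 3
print(extract_rank("http://www.google.com/url?cd=1&q=seo"))           # 1
print(extract_rank("http://www.google.com/search?q=seo"))             # None
```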

That’s it! Really, you’re done.

Wait and Review Your Keyword Ranking Reports

google keyword ranking report

You’ll have to wait a day for the data to be collected since filters are not retroactive.

Wake up the next day and visit your new Google Rank profile. You’ll need to navigate to the User Defined section under Visitors. Once you click User Defined you’ll hopefully see a clean keyword ranking report. The (not set) value at the top indicates that no rank was captured, most likely because it was not an AJAX search result.

Now, there are other ways to configure these filters to combine keyword and rank, or exclude non-AJAX URLs. I’ve chosen to do it this way because I find it easier to view and more flexible in creating additional filters and custom reports. That’s not to say that you couldn’t create yet another profile to try different filter variations. Don’t be afraid to try (and break) things until you figure it out.

In my next post I’ll show you some ways to configure ranking reports and gain additional keyword insight.

When SEO Won’t Work

April 05 2010 // Marketing + SEO // 1 Comment

There have been a number of recent posts around selecting the right SEO clients. And I’ve certainly had my share of frustrations. Yet, one of the issues I often run into are potential clients who think SEO is the only marketing tactic they need to grow their business. This is just as difficult as a client who distrusts SEO.

seo infomercial

When SEO won’t work

Okay, SEO will always work but it won’t always be the best way to grow your business. Too many start-ups seem to believe that they’ll be able to drive massive amounts of traffic from Google. End of marketing plan.

The fact is SEO is but one part of an overall marketing plan. Sometimes it can be a very large part of the plan, particularly if you have a long-tail strategy. More often than not it’s going to be a focused SEO effort on a handful of high value keywords. While this is a fine strategy it may not bring a lot of traffic right away.

SEO is not a ‘just-add-water’ solution

SEO is tougher than it looks, particularly if you’re looking at optimizing a handful of competitive keywords. Sure, the basics are easy but the devil is in the details. Even when you’re doing all the right things, it may take time to conquer the rankings for those keywords. Never mind the pesky keyword volume data that can provide a reality check on expected traffic.

Unlike skeptics, SEO converts have a distorted sense of the speed and effectiveness of SEO. While they don’t understand the mechanics, they’re sure that some expert can wave a magic wand and turn on the Google spigot.

SEO Infomercial

The lure of SEO is, of course, that it’s free. In some ways SEO is like a late night infomercial. Promises of flat abs in 30 days with just a 10 minute daily workout! Did we mention that it folds up and fits under your bed too?!

SEO can be a very effective low-cost channel. But to get those flat abs you still need to eat right. And if you’re listening carefully to the legal disclaimer you’ll hear that those magical results were ‘not typical’. That means it’ll take longer than 30 days and more work than 10 minutes daily.

Swiss Army Knife SEO

Your marketing plan should be like a Swiss Army Knife, with SEO being just one of the tools. Not only that, but you need to use that tool the right way. Getting a whole bunch of traffic that doesn’t convert isn’t going to help your business. Don’t neglect the other tools at your disposal. In fact, some of those other tools might actually help your SEO efforts in the long run.

And if SEO won’t work there’s always social media, right?

Display Advertising and SEO

March 25 2010 // Advertising + Marketing + SEM + SEO // Comments Off on Display Advertising and SEO

A new study by .Fox Networks and comScore shows (again) the positive relationship between display advertising and search.

Video and display advertising both successfully increased brand engagement in each of the four campaigns analysed. The average uplift across the campaigns saw site visitation increase by more than a factor of seven over a four week period following exposure to an ad, with consumers three times more likely to conduct search queries using brand or relevant generic terms in the same time period.

display advertising and seo

Advertising Attribution

These studies all point to a synergy between advertising channels. That’s not ground-breaking, though the measurement of it is innovative. What marketers have been trying to figure out is attribution. What channel or channels should get credit for a sale or lead? It goes to the heart of the old marketing adage: I know I’m wasting half of my marketing budget, I just don’t know which half.

Impact on Display

Many advertisers and agencies still measure the success of a display campaign based on traditional click-through rate (CTR) and ROI. The low CTR of display ads makes marketers suspicious. The concept of a view-through conversion made sense to some, but it still seemed like a bunch of hand waving and didn’t solve the problem of attribution. New services like Vizu also go beyond clicks and provide measurable brand lift based on display campaigns.

Studies and tools that provide multi-channel insight into conversion will help advertisers move beyond antiquated success metrics and increase their display advertising budgets.

Impact on Search

Convincing advertisers of the relationship between display and search is only half the battle. How will advertisers respond? The obvious knee-jerk reaction is to increase their display advertising spend. But is that really where advertisers should start?

If display generates more search volume, wouldn’t you first ensure search was optimized to convert that additional volume? Even within search, would you allocate more dollars into PPC or SEO? Would you prefer to pay for that customer twice or once?

Display and SEO

I’d argue that the first action item based on this study would be to invest in SEO. We already know that the vast majority of search clicks come from organic listings. The importance of rank cannot be denied, even with recent studies showing interesting behavior around brands.

Display primes the pump and generates intent. But you could be generating that intent for your competitor if you haven’t done enough SEO. Branded terms are likely safe, but the ‘relevant generic terms’ are a battlefield.

For example, if Best Buy ran a display campaign for HDTVs, this would create additional search volume for branded searches (Best Buy) and relevant generic searches (HDTVs). A brand search works out just fine. But a search for hdtvs returns Walmart as the first retailer result. Best Buy could wind up spending advertising dollars to drive sales for Walmart.

My fear is that instead of investing in SEO, advertisers will simply throw money at the problem through PPC. Never mind that you’ll still only capture a small segment of that additional search volume; it also eats into your overall ROI.
