You Are Browsing The SEO Category

SEO Affirmative Action

March 07 2009 // SEO // 2 Comments

Is Google’s Vince Change SEO Affirmative Action?

Google Vince Update

The recent ‘Vince Change’ by Google has been discussed, debated and well documented by Aaron Wall. In essence, the Vince Change gave brands more trust for short queries.

At first, the Vince Change didn’t bother me too much. Change is part of the business (keeps many of us in business) and it seems a waste to get worked up by such things. But the more I thought about it the more it bugged me.

Google Taxonomic Search

Late last year I’d theorized that Google might be employing taxonomic limits to certain search results. I wrote about the topic in response to some reports of unusual SERP behavior for product searches.

The general idea surrounding taxonomic search is that Google wants an even distribution of site types on certain searches. For products you’d likely want the manufacturer as well as a few retailers, a few review sites, a few shopping comparison sites and a few discussion forums. Having the results dominated by any one type (regardless of SEO strength) would be a suboptimal user experience.

I’m a bit ambivalent about the idea of taxonomic search. It probably would provide a better user experience, and it simply raises the bar for good SEO. I’m still not sure Google is doing this, but the Vince Change makes it more intriguing to think about.

Commercial Results

I kept mulling over the Vince Change, both in what it was and how it might have been implemented. Matt Cutts, whom I like a lot, provided an answer that discussed the former but not the latter.

The part that caught my attention was when Matt discussed the SERP for ‘eclipse’ and specifically his mention of “commercial results”. So is it not brand but commercial results that were impacted by the Vince Change? Is there a simple taxonomy of commercial versus non-commercial sites?

But the RankPulse results don’t really bear that out. There were plenty of other commercial sites there previously, they just didn’t have the same name recognition. I couldn’t shrug off the Vince Change.

Is trust assigned or earned?

If the Vince Change was about trust and authority, how did these brand sites receive trust and authority? Was it assigned or earned? If it was earned, how exactly was it earned? If it had been earned through SEO (backlinks, on-page optimization etc.) shouldn’t they have already been highly ranked?

I found it difficult to determine how you’d ‘turn the knob’ of trust and authority and only have it impact certain types of sites. “Certain types of sites” would lead naturally to a taxonomic structure of some sort. But that would be a transparent proxy for brand, and I want to believe Matt when he says they didn’t target brands.

Instead, did Google simply assign this trust and authority? A type of SEO affirmative action. If so, based on what criteria? If it’s not ‘brand’, what is it? It doesn’t seem like traffic, nor does it seem to have roots in traditional SEO. As of SMX West it wasn’t SearchWiki data, but maybe that’s changed since. Google does move fast.

Or maybe it’s another piece of data? Advertiser saturation or CTR via DoubleClick? Could it be about money?

Why the Vince Change matters

The Vince Change seems to target very broad queries. While short queries are still a big and important part of SEO, the trend is toward longer search queries. Yet the inability to determine how trust and authority pooled around these brands is unsettling. What prevents the same type of change from being implemented on long tail queries?

Brands should be required to earn their trust on a level playing field.

Are we … being hustled?

How To See What Googlebot Sees

March 04 2009 // SEO // 1 Comment

Use the text-only cached version of a page to see what Googlebot sees. Here’s how it works.

Let’s take a snapshot of ReadWriteWeb.

ReadWriteWeb

Next, find the page in a Google search result.

ReadWriteWeb Google Search Result

Click the Cached link, which brings you to the full version of the cached page. This will often be the same (Google is quick!) but it might be a slightly older version of the page.

ReadWriteWeb Google Cached

On the far right you’ll see a ‘Text-only version’ link. Click it and you’ll see what Googlebot sees.

ReadWriteWeb Google Cache Text Only Version

Googlebot doesn’t see pretty pictures, great graphic design or your spot-on color palette. This is why I tell clients to treat search engines like blind five-year-olds.

One of the main reasons you’ll want to do this is to ensure Googlebot sees all of your links. Those neat little drop down menus have to be done the right way. Google text-only cache is also a good way to eyeball the amount of text and number of links you have on a page. Both are important factors in good SEO.

A word of warning. This shouldn’t replace spider test tools like the aptly named SpiderTest, nor does it give a comprehensive view of what Googlebot sees. The Google text-only cache is a quick, complementary tool.
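If you want to script this check across a handful of pages, here’s a minimal sketch in Python. It assumes the strip=1 parameter still requests the text-only version of the cache (the target page is just an example), and Google may throttle or block automated requests, so treat it as a quick spot check rather than a crawler.

    import urllib.request
    from html.parser import HTMLParser

    class CachePeek(HTMLParser):
        """Tally the links and visible text in the text-only cache."""
        def __init__(self):
            super().__init__()
            self.links = 0
            self.text_chars = 0

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links += 1

        def handle_data(self, data):
            self.text_chars += len(data.strip())

    # Example page to inspect; strip=1 asks for the text-only cache.
    page = "www.readwriteweb.com"
    cache_url = "http://www.google.com/search?q=cache:" + page + "&strip=1"

    request = urllib.request.Request(cache_url,
                                     headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(request).read().decode("utf-8", "ignore")

    peek = CachePeek()
    peek.feed(html)
    print(peek.links, "links,", peek.text_chars, "characters of visible text")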

Thanks to Vanessa Fox who passed along this tip in an SMX West session.

Digital Discovery Is SEO

March 04 2009 // Rant + SEO // Comments Off on Digital Discovery Is SEO

Today I read Edelman Digital’s Five Digital Trends to Watch for 2009, curated by Steve Rubel. One of these five trends was ‘Digital Discovery’ or ‘The Power of Pull’.

For more than 100 years, marketers have largely focused on reaching stakeholders through push, e.g. paid and earned media. Now, however, in an age when Google dominates, it’s equally important that we turn our attention toward digital discoverability. This requires that brands create relevant content that people will “pull” through search engines and social networks.

Call it whatever you want, but Digital Discovery is just a fancy name for SEO.

I like Steve Rubel and find his blog posts informative and often thought-provoking. So why exactly is he carefully sidestepping the true craft of Search Engine Optimization? Why not call linkbait … linkbait? And is this actually a trend for 2009?  The ‘SEO for Press Releases’ session is a golden oldie on the search conference circuit.

Mr. Angry

So perhaps it was the audience of public relations professionals, who are late in adopting SEO, that shaped the report? Do they view SEO as snake oil? That we’re all a bunch of hucksters? There are bad SEOs and good SEOs, just like you’ll find good PR flacks and bad PR flacks. That doesn’t mean I start calling public relations something like ‘brand maximization’.

It irks me. Steve is a respected voice in his industry and beyond. This was a missed opportunity to help change the perception of SEO.

Google Did You Mean Search Results

February 26 2009 // SEO // 1 Comment

Google is presenting two Did You Mean search results above regular search results, making misspelling optimization or fat finger SEO less effective.

used boks mispelling google search results

The result above is for used boks, a misspelling of used books. The Used Books Blog doesn’t get a ton of traffic from the term, but it’s been a steady trickle given that I’m the first result. (Take that Amazon!)

And I still might get some visits on the term, but I’d guess not nearly as much now that Google is actually presenting two Did You Mean results above the misspelled results.

I’m not sure when Google began to display Did You Mean search results, but it’s relatively new since I often use the used boks example with clients. And Google could be testing the feature, so it might disappear and revert to displaying the lone Did You Mean line at the top of the results.

In fact, there are other misspellings that don’t trigger the Did You Mean search results.

toothpast google search results

Come on! Toothpast doesn’t trigger Did You Mean search results but used boks does? At least used boks could refer to a used jersey worn by the Springboks, a successful South African rugby team nicknamed the Boks.

I’m not against the feature. In many cases it likely provides users with a better search experience. Is Google testing this new feature? Does it only present Did You Mean results on certain misspellings? If so, what are the criteria?

I wouldn’t put misspellings near the top of my optimization list, but they are often non-competitive and with a little effort you can snag the top spot and drive qualified traffic.
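If you want to scout candidate misspellings, here’s a rough sketch that generates the classic fat finger variants (dropped letters and transposed neighbors) from a seed phrase. The function is purely illustrative; you’d still vet the output against real query data before optimizing for any of it.

    def fat_finger_variants(phrase):
        """Generate dropped-letter and transposed-letter typos of a phrase."""
        variants = set()
        for i in range(len(phrase)):
            # Dropped letter: 'used books' -> 'used boks'
            variants.add(phrase[:i] + phrase[i + 1:])
            # Transposed neighbors: 'books' -> 'oboks'
            if i + 1 < len(phrase):
                variants.add(phrase[:i] + phrase[i + 1] + phrase[i] + phrase[i + 2:])
        variants.discard(phrase)  # swapping identical letters is not a typo
        return sorted(variants)

    print(fat_finger_variants("used books"))
    # includes 'used boks', 'usd books', 'sed books' and other candidates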

Understanding Google’s new Did You Mean results could change the priority of or eliminate the need for misspelling optimization.

The Future of Search is Numbered

February 23 2009 // SEM + SEO // 2 Comments

The future of search is numbered. No, it’s not what you think. I’m not predicting the demise of search. Quite the opposite.

Instead I’m talking about two trends that could have interesting implications for how both SEO and PPC campaigns are constructed.

The number of words per query is going up

Bill Tancer, General Manager of Global Research at Hitwise, shared the following statistics during an SMX West presentation.

Number of Words per Query Going Up

The trend is clear. People are using more words in their search queries. The reason behind the increase is debatable. Is it increased comfort and savvy with search, or frustration with the search results? Either way, long tail searches are on the rise and will likely trend this way for some time.

People are using numbers to reformulate queries

In a post titled Study on the Structure of Search Queries, Bill Slawski of SEO by the Sea discusses a Yahoo! research paper titled The Linguistic Structure of English Web-Search Queries (pdf). One of the findings was that people were using numbers to modify their searches.

The type of word most likely to be reformulated is “number.” Examples included changing a year (“most popular baby names 2007” → “most popular baby names 2008”), while others included model, version and edition numbers (“harry potter 6” → “harry potter 7”), most likely indicating that the user is looking at variants on a theme, or correcting their search need.

The data set for this research is from 2006. Yet, combined with the increase in words per query I’d theorize that numbers remain a powerful way to search today.

I’m the first person to warn against using yourself as an example but … I’m going to break that rule right now.

I often find myself using numbers, particularly years, when searching. If I’m doing research on a volatile topic (like SEO) I might come across ancient results from 2004. These are rarely helpful.

So I’ll begin to iterate and modify my query with years to find more relevant results. I had to do something similar when trying to cobble together Google’s share of search from October 2004 to October 2008.

Speaking of Google …

Google displays the date as a meta description prefix

Sometime last year Google began to insert a date prefix before the meta description in search results. They did this for pages in which Googlebot found a date – mostly blog posts.

meta description date prefix

I certainly noticed, since some of my well-crafted 150-character meta descriptions were suddenly being cut off because of the inserted date.

Why exactly would Google do this? They’ve certainly made it clear they like “fresh” content but they could present fresh content without the date. I can only surmise that Google believes users are looking for and will benefit from seeing the date.

Oddly, the date prefix does not seem to be searchable. When I search for ‘Magical Thinking by Augusten Burroughs 2007’ the same result is displayed but the visual treatment changes.

date search result without meta description date prefix

Once again Google tromps all over my well-crafted meta description, but this time without the meta description date prefix. This suggests that the date prefix isn’t searchable. But perhaps it should be.

Are you including numbers in your SEO and PPC strategies?

We know that users are using more words per query and that they’re fond of reformulating queries using numbers. This should be enough evidence to implement a robust modifier strategy, if you haven’t already.

Those who already have long tail programs should think about increasing the use of numbers as valid modifiers. As an example, wouldn’t review sites benefit from bidding on terms like ‘digital camera reviews 2009’ or ‘trek bicycle ratings 2008’?
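As a sketch of what a number modifier strategy might look like, here’s a tiny generator that crosses base terms with years. The seed terms are hypothetical, and you’d want to vet the output against real query volume before bidding on any of it.

    from itertools import product

    # Hypothetical seed terms; years act as long tail modifiers.
    base_terms = ["digital camera reviews", "trek bicycle ratings"]
    years = [2008, 2009]

    candidates = [f"{term} {year}"
                  for term, year in product(base_terms, years)]

    for keyword in candidates:
        print(keyword)
    # digital camera reviews 2008
    # digital camera reviews 2009
    # trek bicycle ratings 2008
    # trek bicycle ratings 2009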

Search (whether paid or organic) is about matching your content (or ad) to the user queries. Don’t let your programs become dated.

SearchWiki Not a Signal in Search Algorithm … Yet

February 17 2009 // SEO // 1 Comment

google top secret logo

Launched in November 2008, SearchWiki lets you move, delete, add and comment on the results of a Google search. At launch I wrote that SearchWiki turned us into mechanical turks. That’s not a bad thing since the Google algorithm needs a human tutor.

Google’s SearchWiki feature has been a lightning rod within the search industry, drawing the ire of many and prompting the creation of Greasemonkey scripts to turn off SearchWiki functionality. It even spilled over into ‘mainstream’ media, with TechCrunch’s Michael Arrington asking Google to shelve the new feature.

So it was no surprise that the SMX West presentation on SearchWiki was well attended and full of search engine marketers eager to learn what exactly Google was up to with SearchWiki. Corey Anderson started out by giving a short presentation, outlining the four different uses of SearchWiki.

  • Bookmarking
  • Improving proper name searches
  • Collecting information on a task
  • Refinding hard-to-find information

The presentation neither satisfied the audience nor answered the obvious questions. What ensued was a President George Bush type of press conference, where reporters asked the same question over and over in slightly different ways and got the same ‘no comment’ answer each time. You could come away from the SearchWiki session thinking you’d learned nothing, or you could read between the lines and learn a lot.

Anderson stated that SearchWiki data wasn’t a signal in the Google search algorithm “right now”. He was, however, noticeably excited by all the data they were collecting. The task ahead was to sift through the data to determine if it could be turned into information and used as a signal.

While Anderson made it clear Google felt SearchWiki was a benefit to users on its face, there is no doubt that Google is interested in the possibility of using SearchWiki data as a search algorithm signal.

SearchWiki usage numbers were something Anderson wasn’t willing to divulge. It’s not that he didn’t know them, he just wasn’t willing or able to share them. He noted that Google was “comfortable” with SearchWiki usage. The crowd full of metric junkies let out a mirthless chuckle and groan at this ultra-fuzzy term.

Anderson did mention that they had been testing different visual treatments to boost engagement and usage of SearchWiki. Obviously you need to measure usage to determine which visual treatment performs better. Google knows, they’re just not willing to tell us “right now”.

If SearchWiki weren’t important, there wouldn’t be this deliberate and careful black hole of information. I don’t blame Google or Anderson. SearchWiki provides a very rich new stream of data, and the potential for data pollution is problematic. Publishing usage numbers would only help those looking to hide unnatural edits from detection.

SearchWiki data isn’t a signal in the algorithm now, but the SMX West 2009 presentation made it crystal clear that they reserve the right to use it in the future.

Twitter is not a Google competitor

February 15 2009 // SEO + Social Media + Technology // 3 Comments

Twitter search is a great feature but it in no way threatens Google’s dominance in search.

Recency does not equal relevance

The major flaw of using Twitter as a search engine is that recent tweets on a subject do not equal relevance on that subject. This should be obvious, but let’s do a few searches to illustrate the point. I’ll use searches that appeared in the top 100 from the Google Hot Trends list at some point on February 15, 2009.

Yosemite Camping

yosemite camping twitter search

yosemite camping google search

Which of the results best satisfies the query? Without question it’s Google. Let’s try another.

Daytona 500 Pace Car

daytona 500 pace car search on twitter

daytona 500 pace car search on google

In this instance both provide the answer. The Daytona 500 Pace Car is a 2010 Chevy Camaro. The Google result tells me it’s black and gold and gives me plenty of authority sites to visit.

Twitter, on the other hand, doesn’t provide this level of detail. In addition, two of the five results are from Mahalo and a third is from kinougo. More on ‘him’ later. For now let’s try one last search.

Crayola Factory

crayola factory search on twitter

crayola factory search on google

Hands down Google satisfies this query better than Twitter. The only link available is, again, from our friend kinougo. So who is kinougo?

kinougo twitter profile

Essentially, auto-generated tweets based on hot searches. But where do they lead?

daytona 500 pace car on kinougo

Look at that! An API-based link farm with Google AdSense as the revenue source. That looks … awful!

Twitter would have been a near-complete bust if it were not for Mahalo and kinougo. Yet these sites are simply exploiting Twitter search, not contributing to it in a natural way. I doubt the user experience on these clicks would reinforce the idea that Twitter is the place to search. Probably the exact opposite.

Recency works only for hyper real-time events: earthquakes, Presidential debates and conferences to name a few. (Sidenote: There’s this other site called FriendFeed which actually did a bang up job on real-time commentary on the Presidential debates.)

Authority is nonexistent

What makes anyone believe that the latest 5 or 10 tweets on a subject are at all authoritative? Twitter has no mechanism to determine the best result for a given query other than the tweets from its users in a very short time span. Do you trust the random users of Twitter that much? I don’t, and neither does Google.

The Google algorithm tries to present the most authoritative, the most right, the most useful results, not just johnny-come-lately blog posts and certainly not some 140 character missive. They might not always get it right, but they’re trying … hard.

Duplication is a problem

alzheimer's disease twitter search

Twitter doesn’t handle duplicate results, opening itself up to spam, both deliberate and unintentional. Even for the hyper real-time events, how many times do you need to see the same quote over and over again?
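Collapsing the obvious duplicates isn’t hard, either. Here’s a rough sketch of the kind of normalization a search engine might apply (the tweets are made up): strip links, retweet prefixes and punctuation, then keep the first tweet for each resulting fingerprint.

    import re

    def fingerprint(tweet):
        """Normalize a tweet so near-identical copies collide."""
        text = tweet.lower()
        text = re.sub(r"https?://\S+", "", text)  # drop links
        text = re.sub(r"\brt @\w+:?", "", text)   # drop retweet prefixes
        text = re.sub(r"[^a-z0-9 ]", "", text)    # drop punctuation
        return " ".join(text.split())             # collapse whitespace

    def dedupe(tweets):
        """Keep only the first tweet for each fingerprint."""
        seen, unique = set(), []
        for tweet in tweets:
            key = fingerprint(tweet)
            if key not in seen:
                seen.add(key)
                unique.append(tweet)
        return unique

    tweets = [
        "New hope for Alzheimer's disease research http://bit.ly/abc123",
        "RT @newsbot: New hope for Alzheimer's disease research http://bit.ly/xyz789",
    ]
    print(dedupe(tweets))  # only the first tweet survives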

Twitter is not a Google competitor

Relevance, authority and duplication all ensure that Twitter is not, and likely never will be, a Google competitor. At best Twitter could provide supplemental information to a real search engine. Twitter is the crawl at the bottom of a cable news channel.

“I just ate a mango” isn’t going to disrupt the search world.

9 Reasons You Should Have Been at SMX West

February 14 2009 // SEM + SEO // 1 Comment

SMX West 2009

I’m not the biggest fan of conferences. The information is often stale, vendors seem to outnumber attendees, and the speakers are cordoned off, making it more like attending a guided tour at a very boring zoo.

But SMX West was different and absolutely worthwhile. Here’s why.

Big Brains

There were a lot of smart people at SMX West. Savvy search engine marketers and an accessible bunch of representatives from the major search engines. Getting to hear Vint Cerf speak was also a treat. However, upon exiting I heard someone on a cell phone talking about Cerf’s keynote. “Yeah, some guy from Google, he was pretty good.” Like I said, a lot of smart people, not all.

Big Ideas

Danny Sullivan kicked it off by asking Google to essentially blow up the black box surrounding AdWords and AdSense. Imagine going back to a transparent, Overture-like bidding system. Or knowing exactly what the revenue share was on AdSense.

Vint Cerf shocked everyone when he spoke about an interplanetary network which would be operational this year. But it was the idea of ‘bit rot’ that I found most interesting. How do we ensure that information from today can be read in the far future? Hey, I have some old MacPaint files still hanging around!

Finally, Matt Cutts announced a new ‘canonical’ link tag that serves as a sort of mini-301 redirect aimed at reducing the amount of duplicate content. Even better, the three major search engines have all agreed to use the new ‘canonical’ link tag.
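For reference, the element is a single line in a page’s head that tells engines which URL is the preferred one. Here’s a tiny sketch that builds it; the URL is a hypothetical example.

    def canonical_tag(preferred_url):
        """Build the rel=canonical link element for a page's head."""
        return f'<link rel="canonical" href="{preferred_url}" />'

    # A session ID variant of a page can point engines at the clean URL:
    print(canonical_tag("http://www.example.com/product/blue-widget"))
    # <link rel="canonical" href="http://www.example.com/product/blue-widget" />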

Cloak and Dagger

Search engine representatives versus search engine marketers, some of whom have used less-than-white-hat techniques to gain traffic and rank. Panelists often seemed to be addressing Matt Cutts directly as they spoke about their techniques, looking at him for any response or reaction. This dance was, in itself, interesting to watch, but it was the session about Google’s SearchWiki and Personalized Search that really seemed like something out of a Jason Bourne movie. You could learn a lot if you read between the lines.

Camaraderie

The amount of good will in the search industry is extremely high. I witnessed Michael Gray and Rae Hoffman giving constructive feedback to Corey Anderson and Bryan Horling who presented at the SearchWiki session. And those with power strips were generous and helped many to keep the juice flowing. Everyone seems to realize that this is a small ecosystem in which both sides must flourish.

Star Gazing

There were some high profile folks at SMX West. I’m not one to be bowled over by notoriety, titles or fame (hey, people are people), but it is interesting to see some of the more well known folks close up. You can’t help but smile at Rae Hoffman’s exuberance, be swayed by Rand Fishkin’s giddy passion or meet a nicer guy than Matt Cutts.

samantha fox

Answers

You get what might be the final word on topics like dashes versus underscores or relative versus absolute. And not just a whisper-down-the-lane opinion, but right from the mouth of Maile Ohye or from the seemingly tireless Vanessa Fox, who oddly and continually triggers the distracting image of Samantha Fox in my mind.

Numbers

Data jockeys galore roam the conference rooms and hallways, and numbers and statistics are often central to the presentations. You can leverage the combined research power of the panelists for meaningful, up-to-date statistics such as the percentage of traffic that comes from being on page one or the click distribution between organic and paid results on SERPs.

Access

Not the often frustrating Microsoft Office product but instead the unparalleled ability to speak with colleagues and search engine product managers and engineers. Nearly all panelists were willing to talk with folks and answer questions after their presentations, often long after they should have told people to go jump in a lake.

Now, I’m not a super social extrovert so I wasn’t chatting people up or doing the evening party circuit. But I certainly could have and might have if it hadn’t been for the nightly hour long drive home to the East Bay.

Humor and Rumor

Have you ever noticed how those two words differ by only one letter? There was quite a bit of laughter at SMX West. Danny Sullivan was consistently funny throughout. Rae and Debra Mastaler traded analogies on link building and sex.

Todd Friesen (aka oilman) was the target of a Twitter prank. And Nathan Buggia, from Microsoft, hit on both humor and rumor as he spat out one-liners sitting behind a MacBook Pro during the site review session.

For all of these reasons and more, SMX West 2009 was a great success. Make sure you’re there next year.

Post Click SEO

January 27 2009 // Marketing + SEO // Comments Off on Post Click SEO

post click seo

Should search engine optimization professionals be more involved in post-click marketing? Yes.

SEO shouldn’t end with the click. Getting the click is half the battle. What happens after that click is just as important. Post-click marketing has taken off as more and more marketers realize the need to optimize pages and conversion paths.

SEO should be leading the charge, not taking a back seat.

Post Click SEO

Keyword targeting adjustments are the first way in which SEO can influence post-click marketing. What keywords are bringing users to the page, and which of those keywords are performing best from a conversion perspective?

Decisions can be made to alter the keyword targeting of the page to optimize for the better performing keyword(s). Traffic may go down but the effective yield of that traffic may go up based on the higher conversion rate.

This process might also help identify areas for new content that better meet the needs of those arriving on poorly performing keywords.
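Here’s a minimal sketch of that analysis with made-up numbers: compute the conversion rate for each keyword and rank accordingly, then let the best converting terms drive the page’s targeting.

    # Hypothetical analytics export: keyword -> (visits, conversions).
    keywords = {
        "cheap running shoes": (1200, 18),
        "running shoe reviews": (400, 22),
        "best running shoes 2009": (250, 20),
    }

    def conversion_rate(visits, conversions):
        return conversions / visits if visits else 0.0

    # Rank keywords by how well their traffic converts, not by raw volume.
    ranked = sorted(keywords.items(),
                    key=lambda kv: conversion_rate(*kv[1]),
                    reverse=True)

    for keyword, (visits, conversions) in ranked:
        rate = conversion_rate(visits, conversions)
        print(f"{keyword}: {visits} visits, {rate:.1%} conversion rate")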

Meta descriptions and titles are extremely valuable from a post-click perspective because they can set expectations and even include a call to action. Optimizing solely for the keyword or keyword phrase in the meta description might a) fail to encourage clicks and b) drive unqualified clicks.

Remember, search engine marketers are constantly testing new ad copy to increase click-through rates and conversion. Search engine optimizers should do the same, using the larger canvas of the meta description.

The meta description can even be used for promotional purposes. Changing the meta description on a product page to include an offer of free shipping will likely increase the click yield and, if the page matches their expectation, will continue to convert effectively.
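As a sketch of that idea (the product and offer are hypothetical), you could template the promotional meta description and flag anything that runs past the roughly 150 characters engines display before truncating.

    def promo_meta_description(product, offer, max_length=150):
        """Compose a meta description with an offer, flagging likely truncation."""
        description = (f"{product} in stock and ready to ship. {offer} "
                       "Compare models, read reviews and order today.")
        if len(description) > max_length:
            print(f"Warning: {len(description)} characters may get cut off.")
        return description

    print(promo_meta_description("Canon PowerShot cameras",
                                 "Free shipping on every order."))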

On-site optimization in the form of keyword density, headers and even bold text can all have an impact on conversion. On-site SEO is about making the page easier to understand. While the initial audience for this optimization is a search engine, those same changes help everyday readers.

A highly descriptive header will help tell both the search engine and the reader what that page is about. H2s and H3s can help further explain the topic of that page. Again, both search engines and readers benefit.

Keyword density also increases the readability of a page. Here’s an example.

The procedure was a success and fully solved the patient’s condition.

This type of sentence might appear near the end of a descriptive paragraph. The writer probably referenced the procedure and condition beforehand and believes that the reader will fill in the blanks, turn generalizations into specifics and make the mental connections needed to gain comprehension. Search engines will do none of these things.

Instead, what if the sentence read as follows.

The stomach stapling procedure was a success and fully solved the patient’s obesity condition.

Changing generalizations to specifics and filling in those mental connections for the reader ensures that search engines and users immediately understand the content upon scanning the page. And people are scanning pages more than ever.

Don’t make your users do the work.

Post-click marketing and SEO

Companies that seek to maximize clicks in isolation and conversion in isolation will see results.  But those results will be less than what could have been obtained. Don’t be fooled by the false positive of this silo mentality.

Search engine optimization and post-click marketing should work in tandem to get the most out of every click.

The Worst Site in the Best Neighborhood

January 22 2009 // SEO // Comments Off on The Worst Site in the Best Neighborhood

You’ve heard the old real estate adage to buy the worst house in the best neighborhood. Well, the same is true on the Internet.

You want to be the worst site in the best neighborhood.

Worst House In The Best Neighborhood

Let me explain.

Google tries very hard to understand what a site is about. One of the primary ways it does this is by looking at who links to you and who you link to. These links are essentially your neighbors and collectively they make up your neighborhood.

The quality of your neighbors matters. Not so much in terms of PageRank, though that doesn’t hurt, but by the topic of those sites. If your site is about surfing you’ll want links from surfing blogs, famous surfers, surf competition sites (like Mavericks), surf equipment retailers and even a tide table. Do this and Google will quickly understand which neighborhood you belong to and begin to rank you accordingly for relevant search terms.

But what if you get links from a Victorian antique dealer, a florist, a Bridget Fonda fan page and a Pachinko repair site? Now, perhaps all of these sites linked to you because they like surfing, but … Google will be confused. (Remember, a search engine is like a blind five-year-old.) It will have a difficult time knowing which neighborhood to put you in.

Google probably likes tract housing.

You know, the kind where all the houses in a neighborhood look nearly the same. Oh, the paint might be a bit different, and there might be three different floor plans, but the overall effect is that they’re alike. Make sure Google knows what your ‘house’ looks like so it can put you in the proper neighborhood. Google doesn’t do eclectic.

This is why quality link building is so important. Because the only other way to establish yourself is through global link popularity or quantity link building.

No, I’m not telling you to aspire to be the worst.

New Kids on the Block

But when you start out you’re going to be the new site on the block. You have no ‘street cred’, no ‘Google juice’. You need to move into the best neighborhood by getting links from those sites within that neighborhood, the more authoritative and popular the better.

Over time you’ll paint the place, put in hardwood floors, remodel the kitchen, trim back the hedges and put on an addition. You’re building sweat equity and the value of your site will grow. If you’re doing it right, you won’t be the worst for long.

But in the beginning you’ll be the worst site in the best neighborhood … and that will be just fine.

Photo credit: Shilashon
