You Are Browsing The SEO Category

SEO Holy Trinity

January 06 2009 // SEO // 3 Comments


The SEO Holy Trinity. In the name of the Title, and of the URL and of the Meta Description.

If you’re just getting started with search engine optimization (SEO) you must get these three things right. Here’s the Blind Five Year Old explanation of why these three elements are critical to your SEO success.

Title

A search engine walks into a library. No, it’s not the set-up line to a joke. It’s an analogy for how a search engine will look for content. Everyone likes new content, even search engines, so it’ll walk straight over to the new releases section.

Now, remember, a search engine is like a five year old. So when it wanders over to the new releases you need to make sure it can understand what your page is about. The title is the best way to communicate what your post or page is about.

A five year old will probably not care who wrote the book, so don’t put your name, company or blog at the beginning of the title. If you have it at all, put it last.

Frog and Toad Are Friends

A five year old isn’t going to get your clever double entendre or understand irony. Keep that stuff in the body of your content and use your title like you would for a children’s book. For example, what do you think Frog and Toad Are Friends is about?  Yeah, probably about a frog and a toad who are friends. Bottom line – keep it simple.

Keep it brief. Search engines, like most five year olds, have a short attention span. It’s not that they can’t read all the words, but Google is only going to display 65 characters to people looking for your content. So keep it under that limit. If you’re using your brand at the end of the title, you can go just past the 65 character limit and thus sacrifice the brand name to the HTML ether.
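If you want to sanity-check a title before publishing, a few lines of code will do it. Here’s a minimal Python sketch; build_title is a hypothetical helper of my own, and the 65 character display limit is the figure cited above.

    def build_title(page_title, brand=None, display_limit=65):
        # Descriptive part first; brand last, per the advice above.
        title = page_title if brand is None else f"{page_title} | {brand}"
        # Anything past the limit gets sacrificed to the HTML ether.
        return title, title[:display_limit]

    full, visible = build_title("Frog and Toad Are Friends", "Blind Five Year Old")
    print(full)     # what goes in the <title> tag
    print(visible)  # what Google would actually display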

URL

Let’s extend the library analogy above. You can think of the URL as the spine of the book. A search engine will have less of a chance of understanding what that book (your page or post) is about if you don’t have a descriptive URL.

Dewey Decimal Sticker Example

Imagine if all a search engine saw on the spine of your page was a Dewey Decimal classification sticker. Here are some examples of what that might look like on your site:

http://www.domain.com/?p=124

http://www.domain.com/viewDocument.aspx?d=903

Now, if you were a high-functioning adult you might be able to interpret that code, or you might find your way to the card catalog to research what it means.

But search engines are like five year olds!

They’re not going to do that. They aren’t going to understand it and they’re not going to take the time to investigate or research. The search engine will simply move on to something that looks more interesting and is easier to understand.

Instead, use your title (minus your brand name if you’re using that) as your URL. Simply use hyphens between each word and you’ve created a URL that a five year old will understand.

If you can, don’t bury the page in lots of directories. Make your URL as flat as possible while retaining the context. For instance, which of the two URLs below is easier to understand at a glance?

http://www.domain.com/toys/non-electronic/animals/stuffed/bears/brown/paddington.htm

http://www.domain.com/toys/paddington-bear-stuffed-animal.htm

In a pinch you can take out words that don’t contribute additional meaning. So, you could drop ‘the’ or ‘and’ in many cases to make the URL even more concise.
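To make that concrete, here’s a minimal Python sketch of the whole approach: title in (brand stripped), hyphenated slug out, with an option to drop those low-meaning stop words. The slugify function and its stop word list are my own illustration, not any standard.

    import re

    STOP_WORDS = {"the", "and", "a", "an", "of", "are"}

    def slugify(title, drop_stop_words=True):
        # Lowercase, strip punctuation, then hyphenate the remaining words.
        words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
        if drop_stop_words:
            words = [w for w in words if w not in STOP_WORDS]
        return "-".join(words)

    print(slugify("Frog and Toad Are Friends"))  # frog-toad-friends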

Meta Description

You might have read elsewhere that the meta description doesn’t matter. Don’t believe it for a second! The meta description is important because it’s your chance to market your content.

Let’s use the library analogy again. Readers might scan the book title. It tells them a little bit about the content of the book. Okay, it’s about a toad and a frog who are friends but … what do they do? The meta description is the blurb on the back of the book that tells you more.

The meta description is the place to entice people to click on the search result. You might get ranked well for a term, but the next step is to get them to click. Sure, you have a much better chance based on your rank, but the text you use in the meta description does matter.

Try to get your keyword phrase in there, twice if at all possible. Why? Because Google will bold the search terms in your result. The bold styling reinforces that they’ve gotten the correct result and draws the eye to that listing.

So put something in there that tells them they want to come to your page. Make it exciting or interesting. Tell them what they’ll gain from visiting your page. Sell your content.
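If you want a quick audit of the mechanical parts, the sketch below checks the two things a script can check: the phrase appears (ideally twice, for double bolding) and the text fits in the snippet. The roughly 155 character snippet length is my assumption, not something from this post, and no script can check whether the copy actually sells.

    def check_meta_description(description, keyword, max_length=155):
        mentions = description.lower().count(keyword.lower())
        return {
            "fits_snippet": len(description) <= max_length,
            "mentions": mentions,  # aim for 2 so Google bolds it twice
            "sells": None,         # no script can check this part for you
        }

    desc = ("Frog and Toad are friends, but what do the friends actually do? "
            "Five warm, funny stories about what Frog and Toad get up to.")
    print(check_meta_description(desc, "frog and toad"))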

Putting It All Together

So here’s what the SEO Holy Trinity winds up looking like in Google.
SEO Holy Trinity Example

This page or post was optimized for the phrases ‘2009 Internet Predictions’ and ‘2009 Technology Predictions’. (I could have done a post for each one to be even more focused but … time is not infinite!) Do the searches for yourself and see how I rank.

Now this isn’t the end of SEO. You’ll need to do more to achieve high rankings and traffic. But the SEO Holy Trinity is the base that you must have to ensure success.

In the name of the Title, and of the URL and of the Meta Description.

2009 Internet and Technology Predictions

December 23 2008 // Advertising + Marketing + SEM + SEO + Social Media + Technology // 9 Comments

Now is the time when bloggers go on the record with their thoughts for the year ahead. Place your bets! Stake your claim! Here’s mine.

Crystal Ball 2009 Predictions for Internet and Technology

Facebook Becomes A Portal

Realizing that social media and advertising are like oil and water, Facebook repositions itself as a portal, leveraging Microsoft’s Live Search as the revenue model. This might also result in the acquisition of Netvibes to provide a more robust offering to compete with Yahoo!

Identity Systems Fail

Confused about the difference between OpenID, Facebook Connect and Google Friend Connect, users throw up their hands and decide not to use any of the above.

Video Advertising Succeeds

The adoption of video is surging faster than many expected. Longer formats and better quality will bring even more eyeballs who will grudgingly accept advertising.

Microformats Go Mainstream

Why they aren’t already mainstream is shocking. Nevertheless, in 2009 we’ll see microformats become a standard and search results will become far more robust as a result.

Banner CTR Becomes Obsolete

Brands will finally realize that measuring success by click through rate (CTR) isn’t working. Measurement ‘beyond clicks’ will be the new yardstick, whether that’s through new brand advertising measurement services like Vizu or through monitoring services like Brandwatch and Trackur among a gaggle of others.

RSS Adoption Spikes

Someone will (finally) figure out how to market RSS to ‘the masses’, who will grasp the sublime benefits of having content come to them instead of the other way around.

Kindle 2.0 Flops

Amid a weak economy, Amazon releases the newest version of the Kindle. Other readers have gained ground where the Kindle has not, and at its core the Kindle is a solution without a problem.

Google Search Share Stalls

The move by Facebook (see above) causes a radical change in the search landscape. Microsoft passes Yahoo! for second place and talk of a Microsoft-Yahoo merger is (unbearably) reignited.

FriendFeed Surpasses Twitter

FriendFeed adoption increases at an accelerated pace due to quick innovation, uncluttered design and an interface that lends itself to communication.

Someone ‘Dies’

Users reach social media overload and VCs get even more nervous about revenue, creating social media shrinkage. In this instance ‘dying’ means a company goes under or is purchased for a song. My short list includes Plurk, Twitter, Digg and Seesmic. This isn’t a reflection on the people or the product but on the inability to truly reach the mainstream with a service that has a profit model.

There are plenty of other things I believe will happen in 2009, but they seem more obvious or are simple extensions of current trends. Instead, I tried to be a bit bolder, at least on a few of my predictions.

We’ll check in this time next year to see how I fared. In the meantime, feel free to comment and provide your feedback and reaction to my predictions.

Google Taxonomic Search And SEO Affirmative Action

December 04 2008 // SEO // 4 Comments

There are rumors that Google is applying taxonomic limits to specific searches, usually surrounding products or brands.

When I first heard these rumors I dismissed the idea. Even if Google created a taxonomy of sites, I couldn’t see them using that data as part of the search algorithm. But I quickly began to see that Google likely has a robust taxonomy and could relatively easily use it as a signal within the algorithm.

Does Google have a taxonomy of websites?

Yes. It already displays this taxonomy via the AdWords placement targeting interface.

The real question is whether Google has attached a taxonomic classification to sites outside of the AdSense network. I’d guess that it may not be comprehensive, but that it includes sites that meet some quality threshold, potentially PageRank.

Why would Google use a taxonomic signal in search?

Let’s start with the idea that Google wants to ensure they deliver the best set of results to users. Would a product-related SERP that contained more than five shopping comparison sites be the best result set? How about forums? Or blogs? Or retail sites?

Think about it. If you searched for ‘casio exilim’ would you want a SERP that had Walmart, Circuit City, Office Depot, Ritz Camera, Best Buy, Target and eBay? Similarly, would you want one that had PriceGrabber, BizRate, Buzzillions, Yahoo! Shopping, PriceRunner and NexTag?

I’m guessing that most of the time the natural ebb and flow of the algorithm works out and the results are a nice mixture of these different types of sites. But sometimes one type of site might be better optimized for a term and the type of taxonomic overload above might occur.

How would Google implement a taxonomic search signal?

At first, the idea of using a taxonomic parameter in search sounded tough. Yet, once again, I quickly realized that it might be more straightforward than I thought.

Wouldn’t you simply add some logic that made it slightly more difficult for a site of the same taxonomic classification to rank for a query if there was already one ranked higher? The difficulty would increase based on the number of sites in the same taxonomic classification for a query. So, if two shopping comparison sites were already ranked, it would be far tougher for the third and even more so for the fourth.

It’s essentially a smoothing mechanism that gives slightly more opportunity to sites in another taxonomic classification to rank for a query that they might be shut out of otherwise.
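Here’s a toy Python sketch of that smoothing logic. To be clear, this is pure speculation about how such a limiter could work; the 10% penalty step is an arbitrary assumption.

    def apply_taxonomic_limiter(results, penalty=0.10):
        # results: (url, score, taxonomy_class) tuples.
        seen = {}
        adjusted = []
        for url, score, cls in sorted(results, key=lambda r: r[1], reverse=True):
            count = seen.get(cls, 0)  # same-class sites already ranked above
            adjusted.append((url, score * (1 - penalty) ** count, cls))
            seen[cls] = count + 1
        return sorted(adjusted, key=lambda r: r[1], reverse=True)

    serp = [("pricegrabber.com", 0.95, "shopping comparison"),
            ("bizrate.com", 0.93, "shopping comparison"),
            ("nextag.com", 0.91, "shopping comparison"),
            ("dpreview.com", 0.88, "review site")]
    for url, score, cls in apply_taxonomic_limiter(serp):
        print(f"{score:.3f}  {cls:20s}{url}")
    # The review site jumps to second; the third comparison site falls to last.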

A taxonomic limiter would be a safeguard mechanism and perhaps serve as an over-optimization trigger.

What does a taxonomic search signal mean for SEO?

A taxonomic search signal would be a bit like affirmative action for SEO, giving less-optimized sites a leg up on the competition. It also means greater SERP volatility as sites within one taxonomic class swap positions.

For instance, two sites in the same taxonomic class could be very close and should be ranked 1st and 2nd, but because of the taxonomic limiter are ranked 1st and 5th. If the second site eclipses the first, it would immediately be ranked 1st since the limiter wouldn’t apply. Consequently, the first site would drop, likely to 4th or 5th because the limiter is now being applied.

Is Google really using a taxonomic search signal?

Maybe. While the idea makes sense, the changes some are seeing could instead be caused by one of the numerous algorithm tweaks. But the tools to measure and monitor SEO strength are getting better. Companies can monitor their rank consistently over time and identify when things go sideways. Sometimes the changes don’t make a whole lot of sense and that’s when these types of theories and rumors are born.

What do you think? Is Google using a taxonomic search signal and limiter in search results?

Does Google Have Pac-Man Fever?

December 02 2008 // Humor + SEM + SEO + Technology // 2 Comments

Google’s share of US searches continues to rise according to a recent comScore press release. In October 2008 Google led with 63.1% of all searches conducted. The resulting pie chart shows that Google is closing in on a Pac-Man like position in the search market.

It hasn’t been this way for that long though. Following is the historical comScore data I’ve cobbled together showing Google’s share of the search market.

October 2004: 34.8%
October 2005: 39.0%
October 2006: 45.4%
October 2007: 58.5%
October 2008: 63.1%

So, in the space of four years the search market went from a dog fight to a laugher. If Google continues on this path the pie chart will take on true Pac-Man dimensions.

Now, I’m not sure who’s Inky, Blinky, Pinky or Clyde but Google certainly has the other search players on the run.

None of them seems to have the right medicine to reduce the Google fever that has swept the country. Acetaminophen (AOL), ibuprofen (Yahoo!), naproxen (Ask) and aspirin (MSN) have all failed to bring the temperature down. And upstart homeopathic remedies (Powerset, Cuil etc.) haven’t made a dent either.

Would mixing some of these medicines together help? Some Yahoo! and MSN with a dash of Powerset? Not likely. And in some cases mixing medicines can prove lethal.

Could Zoetrope Be A SERP Tracking Tool?

November 25 2008 // SEO + Technology // Comments Off

Zoetrope is a new web tool being jointly developed by Adobe and researchers at the University of Washington. The general idea is to allow users to view the Internet over time. Think of it as the Wayback Machine on steroids. Sarah Perez does a good job writing about Zoetrope in ReadWriteWeb.

Yet, it’s the following video that does the best job of explaining Zoetrope.

Could Zoetrope be used as a SERP tracking tool?

If Zoetrope could capture specific search engine results pages (SERPs), then it could be a very powerful SERP tracking tool.

Let’s say I have 10 high value keywords for which I want to be highly ranked on Google. Using Zoetrope I could conceivably capture the daily SERP for each keyword and link it to the appropriate destination page. This would allow me to see if changes to my page had any impact on SERP ranking for that keyword.

I could even create ‘lenses’ for my competitors and review how they’ve tweaked their site to help influence SERP rank. Zoetrope could provide an unparalleled level of detail and analysis for a savvy SEO practitioner.
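Zoetrope doesn’t expose anything like an API for this today, so the sketch below only illustrates the workflow in Python: a hypothetical fetch_serp function stands in for whatever would actually capture the ranked results each day.

    import datetime

    snapshots = {}  # (keyword, ISO date) -> list of result URLs in rank order

    def capture(keyword, fetch_serp):
        # fetch_serp is a hypothetical stand-in, not a real Zoetrope call.
        today = datetime.date.today().isoformat()
        snapshots[(keyword, today)] = fetch_serp(keyword)

    def rank_history(keyword, my_url):
        # My page's rank for the keyword on each captured day (None = unranked).
        history = {}
        for (kw, day), urls in snapshots.items():
            if kw == keyword:
                history[day] = urls.index(my_url) + 1 if my_url in urls else None
        return history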

Yet there’s a big, actually huge, ‘if’ in the statement above. I doubt that Zoetrope does, or even could, capture every SERP. But that doesn’t mean there isn’t a way to do this. In fact, I think it provides a pretty interesting business opportunity.

Want Zoetrope to help you track SERPs for a group of keywords? Simply sign up for a subscription and they would begin to capture the information. Toss in a 30 day free trial to get the ball rolling and I think you’d have a number of people clamoring for and using Zoetrope for SERP tracking.

SearchWiki Turns You Into Free Mechanical Turk

November 21 2008 // SEO + Technology // 1 Comment

Yesterday Google launched SearchWiki, a search feature that lets users customize search results by moving, deleting, adding and commenting on search results. The search algorithm will now have access to a set of aggregated human data. As I’ve written about before, the Google search algorithm will benefit from having a human feedback mechanism.

Google SearchWiki turns users into a free Mechanical Turk.

Not familiar with Amazon’s Mechanical Turk? Here’s a relevant excerpt from Wikipedia:

The Amazon Mechanical Turk (MTurk) is one of the suite of Amazon Web Services, a crowdsourcing marketplace that enables computer programs to co-ordinate the use of human intelligence to perform tasks which computers are unable to do.

Now, clearly the search algorithm is able to produce search results. However, the algorithm still isn’t very smart. SearchWiki lets the algorithm learn as humans move, delete and add results for specific search queries.

While Google clearly states that your specific changes will not be seen by others, it seems impossible to think that Google won’t use that information to influence search results over time. Not convinced? SEO by the Sea reviews a recent Google patent that points to Google’s continuing goal of leveraging more data and behavior into search results.

Google has been experimenting with something like this for some time, and it finally seems ready for prime time. This also means the flirtation with Digg is likely done for good, unless abuse by SEO practitioners overwhelms the signal.

How do SEO gurus react to SearchWiki?

If you employ a short-term, chase-the-algorithm type of SEO, you’re seeing both a threat and an opportunity. The threat is that the human feedback mechanism could help to curb over-optimization and subtle gaming. However, it also provides an opportunity to create a SearchWiki army that could coordinate changes to specific search results. The worst case scenario is that people are paid to delete, add or move certain results for specific searches.

I’m not saying SearchWiki is a bad thing. But make no mistake, Google is using it as a free way to make their core product better.

Search Pogosticking and SEO

November 14 2008 // SEO + Technology // 18 Comments

Search pogosticking is defined as going back and forth from a search engine results page (SERP) to individual search result destination sites. The behavior may indicate poor search results since the user hasn’t been satisfied by one or more of the SERP results.

Google and Yahoo are clearly using, or thinking about using, this metric as part of their search algorithm. It makes a lot of sense and could provide an important user-defined input for the algorithm as well as guard against potential over-optimization. As an aside, access to this human feedback mechanism may be one of the reasons why Google hasn’t been eager to pay a premium for Digg.

How would search pogosticking influence SEO?

A user is presented with search results based on a specific query. The engine captures which result the user clicks on and whether they return to that SERP to click on subsequent results and/or refine the query. (It could even conceivably measure the time between each click as a proxy for satisfaction with a result, which would reduce the chances of penalizing results that did deliver value.) This information can be aggregated for each query and compared to average pogosticking behavior by SERP rank.

So, let’s say that for a specific query the engine sees that the current SERP has an abnormally high pogostick rate for the top ranked result. This information could then be fed back into the algorithm as a negative signal, thereby reducing its SERP rank in a future algorithm update. Obviously, it would have to be statistically above the average pogostick rate for that position to be flagged. However, I’m certain both Google and Yahoo have smart folks who can calculate when it reaches significance.
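As a back-of-the-envelope illustration, here’s what that flagging might look like in Python. The simple z-test and the thresholds are my assumptions; the post only requires that the rate be statistically above the average for that position.

    import math

    def abnormal_pogostick(clicks, pogos, baseline_rate, z_threshold=2.0):
        # Is this result's pogostick rate significantly above the average
        # rate for its SERP position? (One-sample z-test sketch.)
        rate = pogos / clicks
        se = math.sqrt(baseline_rate * (1 - baseline_rate) / clicks)
        return (rate - baseline_rate) / se > z_threshold

    # Top result pogosticked 40% of the time vs. a 25% average for position one.
    print(abnormal_pogostick(clicks=1000, pogos=400, baseline_rate=0.25))  # True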

Does search pogosticking sound far-fetched?

It shouldn’t. Do a Google search and look at the page source. You’ll see something like the following next to each search result.

onmousedown="return rwt(this,'','','res','1'

The ‘res’ likely stands for result and the next parameter is the actual rank for that result on that page. More information on this rewrite and tracking behavior can be found at searchlores and blogmal.

Even without this technical knowledge, it should be obvious they’re doing something like this if they’re providing users with customized search results. I’m not sure the ability to provide custom or personalized results is the true aim, but it makes the collection of this information more palatable for many users.

SEO by the Sea recently posted about this topic including a relevant pogosticking patent application by Yahoo. I recommend you read his post as well.

How can I track pogosticking?

You can’t know for sure whether a user is pogosticking. Bounce rate can sometimes be a good indication, but there are instances where query intent is satisfied and a high bounce rate is the expected result. Many Q&A sites fit this description.

The best course of action is to review pages with high bounce rates and make certain you’re matching query intent and delivering value to those users. Post-click SEO is going to become a larger part of the equation, blurring the line between SEO and traditional design and UI.

SEO Success By A Thousand Optimizations

November 05 2008 // SEO // 1 Comment

Google has been quoted as saying that their search algorithm takes into account over 200 signals (or variables) to determine the ranking of a page for a given query. This isn’t really news to anyone who has been paying attention but it should change how many approach search engine optimization (SEO).

Of course some of the 200 signals are weighted more heavily than others, but Google can twist the knob up or down on each of these to dial in what they believe are the most relevant results. As such, concentrating on a short list of critical search engine optimization components will only get you so far.

Don’t get me wrong, you must get the short list right, but if you stop there you’re likely going to chase the algorithm instead of ride it.

Search engine success by a thousand optimizations.

You might have heard the saying ‘death by a thousand cuts’. The inverse is true for search engine optimization: success can be attained by doing a lot of little things right. Use natural language to name your images. Change the anchor text or nofollow extraneous ‘see more’ links. Bold your targeted keywords within your content. The list goes on and on.

I’m often asked to provide a potential return on investment for undertaking a small SEO effort. Yet, it’s nearly impossible for me to assign that value. Will it help? Yes. How much? I don’t know. Why don’t I know? 200 signals! And it’s not just the sheer number of signals but the fact that the weighting of those signals changes.

For example, I see more and more sites abandoning meta keywords altogether. I believe this is a mistake. While the weighting for meta keywords might be very low right now due to high abuse, it is still one of the signals, and some criteria for analysis remain in place. At any given time Google could determine that the proverbial ‘coast is clear’ and begin to increase the weight of meta keywords once again.

Follow blind five year old principles.

My philosophy is to treat search engines like blind five year olds. As such, I think of these 200 signals as hints I can provide to help them make a decision. The goal is to provide the search engine with as many hints as possible, keeping in mind that they must also be good hints. (Providing a whole bunch of mediocre hints will likely have the same impact as a handful of really good ones.)

If that five year old is putting a puzzle together, I need to make sure I show it the edge pieces and flip over the pieces of, say, the Austrian castle to make things easier. That will go a long way. But what if I can also sort the sky from the grass and even begin to group the pieces into different hues?

Suddenly, it’s a lot easier to complete the puzzle.

Five Foot Web Design

October 28 2008 // Marketing + SEO // 5 Comments

How do you make sure your message is getting across to users? Five foot web design.

Here’s how five foot web design works. Print out a nice color copy of the page in question. Tape it to the wall. Take a few steps back until you’re around … five feet away. What do you see? If you can’t make out the message then you might have a problem.

I’m no web design or user interface guru and I don’t pretend to have Jakob Nielsen’s experience or expertise. Instead I’m armed with the knowledge that you have a very short time to persuade someone to stay on a page. So don’t be coy. Don’t fill it up with a whole bunch of stuff. Keep it simple. Tell users exactly what your site or web page is about, what you want them to do and where to go next.

How do we get people to slow down on a street in the physical world? A sign. Let’s look at the ubiquitous yard sale sign. Which of the two signs below do you think is more effective?

No contest, right? The same rules apply on the information superhighway.

There are other real world examples that support five foot web design. My first job out of college was as an account coordinator at an ad agency. One of my tasks was writing up meeting notes. My first attempt was returned to me with more red on it than I’d seen in a long time. The message: say it with as few words as possible.

I also learned about one of their hiring techniques. Only the first page of a resume was reviewed. Any subsequent pages were thrown in the trash and never read. The lesson: brevity was and is critical, particularly in advertising.

Five foot web design simplifies your message and increases the odds that users will understand, remember and take action on that message. It will also help your search engine optimization (SEO) efforts.

The reason five foot web design supports SEO is that it forces you to use your keyword phrases prominently and to rely less on explanatory text and links. In short, it’ll help boost your keyword density.

You’ll need to use your headers wisely. Nice big H1s that don’t mince words. H2s and H3s that support your keyword(s). Strong anchor text. Judicious use of bold text. Strong call to action buttons. Keep it clean. White space is your friend.

Poker Copilot is a good example of five foot web design. Recently, they switched to much larger buttons in hopes of increasing downloads.

The spike at the end of the graph says it all. Spike is not even an adequate word here. It’s a kangaroo leap to a new level.

I also recommend the book Don’t Make Me Think by Steve Krug which presents a more detailed look at web design that dovetails nicely with five foot web design.

In lieu of additional reading, simply take a moment to step back from the computer, or roll back in your chair, and see if your site is benefiting from five foot web design.

Fat Finger SEO: Optimize For Misspellings

October 12 2008 // SEO // 1 Comment

Not everyone is the world’s best speller, and even those who are sometimes hit the wrong key and wind up searching for a misspelled keyword. It’s human nature and there’s no reason why you shouldn’t optimize for the most common misspellings. And there’s an easy way to target these misspellings without making yourself look like a fool.

Simply create content on the topic that mentions the common misspelling. Use the misspelling in the browser title, meta description and URL. I recommend acknowledging the misspelling in the meta description (as well as in the content) to help validate their search and encourage clicks. This is important because you have a powerful competitor in Google’s Did You Mean functionality and display.
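If you want help finding candidates, a small script can generate likely fat finger variants for a keyword before you go digging through reports. The sketch below is my own illustration and the adjacent-key map is deliberately partial.

    def typo_candidates(word, adjacent=None):
        # Partial QWERTY neighbor map; extend it for your own keywords.
        adjacent = adjacent or {"o": "ip", "e": "wr", "s": "ad"}
        typos = set()
        for i, ch in enumerate(word):
            typos.add(word[:i] + word[i + 1:])      # dropped letter: 'boks'
            for n in adjacent.get(ch, ""):          # adjacent key hit
                typos.add(word[:i] + n + word[i + 1:])
        for i in range(len(word) - 1):              # transposed letters
            typos.add(word[:i] + word[i + 1] + word[i] + word[i + 2:])
        return typos - {word}

    print(sorted(typo_candidates("books")))
    # ['bioks', 'boiks', 'bokos', 'boks', ...] worth checking against your reports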

You can even dedicate an entire blog post to the misspelling if you want, like I’ve done on my Used Books Blog.

Here, I’ve created a post about used boks, acknowledging it as a common misspelling for used books. The post is focused, humorous (or so I hope) and topical.

A more straightforward approach is one used at Caring.com. In this instance, we’ve optimized for the keyword ‘genzar’, a common misspelling of a cancer drug.

While it’s currently the number one search result, I’d still like to see the misspelling in a shortened browser title. I’d also note that this was a user submitted question to the site. Be sure to check your keyword and internal search reports to identify misspelling candidates.

The increase in mobile searches will likely result in more fat finger searches. So be on the lookout for this easy source of traffic. But at the same time, don’t drive yourself crazy finding and optimizing every misspelling.
