Google Analytics Userscripts

February 17 2011 // Analytics // 1 Comment

If you spend a lot of time in Google Analytics you may quickly find yourself frustrated with the user experience. Here are 3 userscripts that make using Google Analytics way more efficient.

What are Userscripts?

Userscripts are small pieces of JavaScript code that tweak your web experience or add functionality to it. You install userscripts as a simple add-on in Chrome, Firefox (requires Greasemonkey) or Internet Explorer (requires IE7Pro).

In a nutshell, userscripts make things better. A lot better.
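Under the hood, a userscript is ordinary JavaScript preceded by a metadata block that tells Greasemonkey (or Chrome) which pages to run it on. Here’s a minimal sketch; the @match pattern and the title tweak are purely illustrative, not taken from any of the scripts below:

```javascript
// ==UserScript==
// @name        GA Example Tweak
// @description Minimal skeleton of a Google Analytics userscript
// @match       https://www.google.com/analytics/*
// ==/UserScript==

// The tweak itself: prefix the page title so GA tabs are easy to spot.
function prefixTitle(title) {
  return '[GA] ' + title;
}

// Userscripts run inside the page, where `document` is available.
if (typeof document !== 'undefined') {
  document.title = prefixTitle(document.title);
}
```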

Cleaner Profile Switching

This userscript lets you switch from one Google Analytics profile to another and see the same report. It also gives you the option of opening that new profile in a separate tab.

Cleaner Profile Switching Userscript

This is a huge time saver if you’ve got multiple profiles (which you should) since you won’t have to build the report from scratch each time.

Get it: Cleaner Profile Switching

Absolute Conversion

This userscript calculates and displays the number of conversions next to the conversion rate.

Absolute Conversion Userscript

So instead of navigating to the Goals menu or doing some math in your head, you can quickly see your conversion numbers. Please note that while this is a handy userscript, it breaks when Google Analytics samples data.

Get it: Absolute Conversion Userscript

Accordion Menu

This userscript makes all of the top level Google Analytics menus expandable without waiting for the browser to reload.

If you use Google Analytics often, you probably get tired of clicking on a main section report title, only to wait for the page to load so you can click on a sub-report. Think about it: how many times have you clicked on “Traffic Sources” fully intending to click on “All Traffic Sources” as soon as possible? Or “Content” just to get to “Top Content”?

This userscript is a massive time saver.

Get it: Accordion Menu Userscript

Using Userscripts

I should warn you that userscripts can sometimes be janky and cause problems. In fact this post was originally going to feature four userscripts until I noted a problem with one of them. Don’t let this keep you from trying them out. Userscripts are super easy to uninstall and many of the creators are eager to get feedback on how to improve them.

Give these Google Analytics userscripts a try and let me know if you have any others you swear by.

Valentine’s Day Thank You

February 14 2011 // Career + Life // 12 Comments

I often fail to thank folks properly. I mean to do it but … I wind up getting busy and then a week goes by and then a thank you seems false. Yet, I truly do appreciate it!

Invisible Double High Five

So it seems apt on Valentine’s Day to thank the many people who have influenced, supported and helped me over the past year.

Aaron Bradley (@aaranged) at SEO Skeptic has consistently provided engaging dialog on my blog. I hope to return the favor. Aaron is a thought leader, willing to rely on his own critical analysis in looking at a subject.

Michael Martinez (@seo_theory) at SEO Theory isn’t going to give you the same old SEO spiel found on hundreds of other blogs. Like Aaron Bradley, it’s great to find free thinkers in an industry with plenty of sheep.

Rob Diana (@robdiana) at Regular Geek is amazingly smart and engaging. Not only has he been a great supporter but he is vital to helping me find the best information on the Internet.

Matt McGee (@mattmcgee) and Danny Sullivan (@dannysullivan) are great people who have given me the opportunity to be an Editor at Sphinn. I sincerely appreciate their confidence.

Matt Cutts (@mattcutts) is a decent and generous guy. He may not always provide the answer I want to hear, but he’s nearly always there with an answer nonetheless.

Paul Buchheit (@paultoo) is, among other things, the founder of FriendFeed. That alone gets him on my list but his personal blog has also been inspirational.

Rick Bucich (@rbucich) has been a long time supporter of me and my blog. He’s wicked smart about SEO, so it’s a real compliment to have him in my corner.

Andrew Hanelly (@hanelly) has made some complimentary comments on the blog. That’s great, particularly since his own blog is great as well.

Jeremy Post (@jeremypost) is the best colleague I’ve had since I began SEO. Smart, hard-working and an all around good guy. He keeps me on my toes. Bonus – he brews his own beer.

Jonathan Mendez (@jonathanmendez) at Optimize and Prophesize provides amazing insight, bridging search and display. I’ve been lucky to interact with him a few times and always feel smarter afterward.

Lisa Barone (@LisaBarone) at Outspoken Media was kind enough to feature my Facebook SEO Guide in one of her posts. I’d be lying if it wasn’t nice to be acknowledged by one of the ‘cool crowd’ in the industry.

Rand Fishkin (@randfish) is CEO and Co-Founder of SEOmoz. You’d have to live under a rock not to know of Rand. I don’t know Rand personally, outside of a small email exchange, but his personal blog has been influential. I’m blogging more and better able to deal with haters because of his writings.

Aleyda Solis (@aleyda) at Aleyda Solis has been a tremendous supporter. My Spanish isn’t very good so I’ll simply say muchas gracias.

Kirby Freeman (@kirbyfreeman) is whip smart with a true gift for building product. She made me look good.

Michael Fruchter (@fruchter) has been a great supporter and another source for great content.

Micah France (@micah_france) has been generous with his comments and Tweets. They don’t go unnoticed.

Eric Logan (@ericloganvanman) is nearly always the first person to Like one of my posts on FriendFeed. It just seems like he’s got my back.

Roberto Bonini (@rbonini) is also quick to Like my content on FriendFeed. I appreciate it.

Louis Gray (@louisgray) introduces me to new and interesting services – constantly. He’s perhaps the nicest guy you’ll ever meet too.

Tad Chef (@onreact_com) at SEOptimise is always interesting and was kind to include me in his 30 Great SEO Blogs You Might Not Know Yet.

Elisa Gabbert (@egabbert) at WordStream is smart and funny. It’s awesome when someone like that references your work.

Alexia Tsotsis (@alexia) put my contact information smack dab on TechCrunch. That can’t be bad for business.

Tamar Weinberg (@tamar) at Techipedia does a fantastic job finding the best in Internet marketing. It was an honor to be on her list of Best Internet Marketing Posts of 2010.

Matt Gammie (@mattgammie) has been an interesting new and diverse voice. I appreciate the dialog.

Derek Perez (@perezd) at Perezium is wise beyond his years. He’s a hoot to be around but serious about the intersection of code, UX and start-ups.

Chris Eppstein (@chriseppstein) is an amazing Software Architect. Many of our conversations about search wind up as blog posts. I hope that continues.

Srikanth AD (@srikanth_AD) has been a great supporter, particularly on Quora.

Jill Whalen (@jillwhalen) at High Rankings is quick with an answer and always has an informed opinion. I may not always agree, but I like that she’s got an honest point of view.

Mark Essel (@VictusFate) at Victus Spiritus let me ride shotgun on his entrepreneurial ride. It’s been amazing to follow and his frequent blog posts often point me in interesting directions.

Mahendra Palsule (@ScepticGeek) at Skeptic Geek is a gold mine of information and insight. I’m thankful for his support and appreciate his editorial prowess.

Kristi Hines (@kikolani) at Kristi Hines is a dynamo. I certainly appreciate the mention.

Ruud Hein (@RuudHein) at Search Engine People is a great writer and search historian. Bonus – he’s friendly on Twitter.

Barry Schwartz (@rustybrick) has included me in a number of his daily search recaps. Thank you.

Danny Brown (@dannybrown) is a paragon for all bloggers. He’s smart, down-to-earth and incredibly responsive.

Greg Sterling (@gsterling) at Screenwerk was kind to chat with me at SMX Advanced. He’s the guy to talk to about local and mobile.

Bill Slawski (@bill_slawski) at SEO by the Sea provides an incredible service to the SEO industry. We all appreciate it.

Donna Fontenot (@DonnaFontenot) at DazzlinDonna is as generous and nice as she claims to be.

Bill Rowland (@billrowland) at Nexus Interactive Marketing has commented on the blog a number of times. I’m thankful for his contributions.

Marty Weintraub (@aimclear) at aimClear for reminding me that search is fun. I hope to deliver as much value when I next present.

The music is playing so let me quickly squeeze in others who have written about me on blogs, mentioned me on Twitter or included me in their daily news.

The number of folks who have been kind to me is overwhelming. I hope I continue to earn your comments and support. And I know I’ve left off a lot of people (particularly folks at FriendFeed). So thank you to those that I have unintentionally missed.

Happy Valentine’s Day.

Google Should Follow NoFollow Links

February 12 2011 // SEO // 5 Comments

What would Google look like if it relied exclusively on the nofollow link graph? It would be a sort of Bizarro Google. We’ll call it Elgoog.

Elgoog Logo

In this search engine Matt Cutts would help ensure every SERP contained at least one result for Viagra.

Sounds like the stuff of comic books. Yet, I think the nofollow link graph could actually be useful in improving search quality.

Types of NoFollow Links

First, let’s think a bit about the nofollow link graph. It isn’t homogeneous. There are different types of nofollow links and they should carry different weights within the Elgoog algorithm. I see six general classes of nofollow links: Administrative, Sculpting, Advertising, Editorial, Social and Comment.

Administrative nofollow links are internal links used for areas such as log-in, privacy, ratings or feedback. They’re generally benign.

Sculpting nofollow links are internal links used to ‘hoard page rank’ or to ensure anchor text focus. The latter is something I don’t see discussed much when abandoning the idea of page sculpting. If Google isn’t passing anchor text through nofollow links, you might find it useful to put a nofollow on obtuse and extraneous links such as ‘read more’. Whatever your position on page sculpting, it’s a type of manipulation.

Advertising nofollow links are external links used to identify paid links.

Editorial nofollow links are external links used when one site is explicitly not endorsing another site. If Aaron Wall ever linked to Mahalo you better believe he’d use a nofollow. These links are incredibly rare and, as such, quite valuable.

Social nofollow links are external links used by social platforms such as Twitter and Facebook. They could conceivably be deemed sculpting or editorial nofollow links but I’ll split them out here since they’re probably a big piece of the nofollow link graph.

Comment nofollow links are external links used in blog comments and forums. Over the last two years it has become clear that Google can determine the location of a link – whether it comes from body text, the footer, navigation or comments. The comment nofollow graph is where you’ll find an enormous amount of spam.
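For illustration, here’s how those six classes might be teased apart programmatically. The heuristics, and the link attributes they rely on, are my own guesses, not anything Google has published:

```javascript
// Classify a nofollow link by where it appears and where it points.
// `link` is a plain object describing one nofollow link.
function classifyNofollow(link) {
  const internal = link.fromHost === link.toHost;
  if (internal) {
    // Internal housekeeping pages vs. deliberate sculpting.
    return link.adminPage ? 'administrative' : 'sculpting';
  }
  if (link.paid) return 'advertising';
  if (link.inComment) return 'comment';
  if (link.socialPlatform) return 'social';
  // External, unpaid, outside comments and social platforms:
  // an explicit non-endorsement.
  return 'editorial';
}
```

An internal link flagged as administrative housekeeping stays benign; an external, unpaid link outside of comments and social platforms falls through to editorial, the rarest bucket.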

NoFollow Link Graph

The nofollow link graph would power the Elgoog algorithm. Sites with a high sculpting to total link ratio would rank well. Similarly, a high advertising to total link ratio may boost your Elgoog rank. Editorial nofollow links would provide a lift to the linked site. And of course, sites that receive a large amount of comment nofollow links would shoot to the top of Elgoog results.

In looking at just the comment nofollow link graph it seems like it should be pretty easy to identify spam. Elgoog loves spam! Remember my comment spam examples? Those comments all have links that would help that site rank well in Elgoog. As such, they should rank nowhere near the top in Google.

See how that works. If it ranks high in Elgoog, it ranks low in Google. Your Elgoog score is a signal. Sounds crazy, but on some level I feel like it might work.

Measure Unnatural Link Behavior

If people no longer naturally link, why not measure the inverse? Measure the unnatural link behavior.

At SMX West 2010 Bing made a presentation on Social Search that included an interesting slide that showed the difference between a natural community and a spam community.

How To Identify Link Spam

The difference is rather stark. Spam was highly visible.

Elgoog would reward sites with obvious link exchange schemes. It would seek out uniform anchor text. And those that received links from sites completely off topic would also benefit. In fact, Elgoog might require topic diffusion in your link graph. (Sites about lawn mowers better have links from wedding dress and online chess sites.) Of course, large amounts of links should be built in a short time frame. Slow and steady link profiles would be frowned upon.

Elgoog and the nofollow link graph could help measure manipulated trust and authority or, at a minimum, trust and authority disinformation. As the link graph degrades, wouldn’t that be a valuable signal?

Maybe Google does this already. Or they’ve poked at it and found it flawed. (I can already poke holes in it myself.) Yet, I feel like there’s a thread of potential here. Rooting out manipulated trust and authority seems a far better initiative than judging content farm quality.

What do you think?

How To Quickly Identify Comment Spam

February 10 2011 // SEO // 5 Comments

Most of the time comment spam is pretty obvious, but every now and again you want to make certain you’re trashing the right comments.

Google Your Comments

The fastest way to identify comment spam is to copy the comment and search for it on Google. Make sure to use quotes around the entire comment, which tells Google to look for the query in that exact order. If you get any results on the exact comment, it’s most likely comment spam.
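If you do this often, the quoting is easy to script. A quick sketch that builds the exact-match query URL for a suspect comment:

```javascript
// Build a Google search URL that looks for the comment verbatim.
// Wrapping the text in quotes asks Google for that exact phrase.
function exactMatchSearchUrl(comment) {
  const quoted = '"' + comment.trim() + '"';
  return 'https://www.google.com/search?q=' + encodeURIComponent(quoted);
}
```

Paste the resulting URL into your browser and look at the result count.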

Here’s one I received a few weeks ago.

Finding Comment Spam

I’ve highlighted the fact that Google is returning about 26,400 results for this bit of comment spam.

Here’s another I plucked off of another blog during my morning reading.

Comment Spam Example

Two things to note here. First, this technique still works even when you reach the 32-word query limit. Second, this gem returned about 90,500 results. (Is it just me, or does the ‘about’ sort of sound like ‘allegedly’ in this instance?)

Comment Spam

This sad and pathetic practice gives SEO a bad name. I sometimes think a database of comment spam should be created. Who knows, maybe it already exists.

I can think of a number of ways it could be used to help sites rid themselves of this parasite. Why couldn’t Google alert owners (via Google Webmaster Tools) when it identified a comment that reached a certain spam threshold? Seeing the same comment even 100 times should be more than enough, never mind the 26K and 90K examples I’ve provided.

What about abandoned blogs or splogs where there is an abundance of comment spam? Couldn’t a comment spam database help flag these sites and remove them from the index and link graph?

Of course it’s more difficult than it sounds but wouldn’t it be worth it?

Retailers Slow To Adopt Like Button

February 09 2011 // eCommerce + SEO + Social Media // 4 Comments

In April 2010 Facebook launched the Open Graph and Like button, allowing sites to better control how their pages are displayed in Facebook News Feeds and search results.

Retailers Slow To Adopt Like Button

Yesterday I visited all of Internet Retailer’s Top 100 retailers to see if they were using the Like button. I did not include those who were using the Like button for their Facebook Page but instead was looking for Like button usage on product pages.

Adoption Rate of Facebook Like Button by Retailers

The adoption rate of the Like button for eCommerce seems low, with only 27% of the Top 100 online retailers using the Facebook Like button.

Like vs Share

Facebook Share was not included in the above numbers, but is more widely used by retailers. Yet, the share functionality is no longer promoted or recommended by Facebook. Searching for it on their developer platform turns up very little, and what does appear usually points to the Like button and Open Graph documentation.

While not specifically measured, I’m unsure if any of these retailers (even those with the Like button installed) were using the related social plugins. In particular, the Recommendations plugin could be an interesting cross sell feature for retailers.

Facebook Insights

Of those using the Like button, only 35% were tracking usage via Facebook Insights. My methodology for validating this was to use the Facebook Linter tool on a retailer’s domain. I counted those who had the appropriate Facebook Insights for Domains verification (fb:admins, fb:app_id or fb:page_id) enabled.
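That verification boils down to the presence of one of three meta properties in the page source. Here’s a rough approximation of the check; the tag format is assumed from Facebook’s Insights for Domains documentation at the time:

```javascript
// Does this HTML carry any of the Insights verification properties?
// Looks for fb:admins, fb:app_id or fb:page_id in a <meta> tag.
function hasInsightsVerification(html) {
  return /<meta[^>]+property=["']fb:(admins|app_id|page_id)["']/i.test(html);
}
```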

This is somewhat less surprising given the difficulty of verification, the lack of robust data in Facebook Insights and the fact that retailers can benchmark success by tracking downstream traffic from Facebook on their own. However, this metric should be of concern to Facebook.

eCommerce Opportunity

Facebook Money

The Like button and Open Graph present a huge opportunity for retailers and eCommerce. Using Facebook SEO, retailers can optimize the way their products are presented on Facebook.

Each Like is a type of micro-review and an opportunity for retailers to leverage brand affinity. In addition, sites can publish stream updates to users who have Liked pages via the Open Graph API.

There are 500 million active users who spend 700 billion minutes a month on Facebook. When will retailers decide to dedicate more effort to reach this captive audience?

Blekko Censors Search

February 03 2011 // Rant + SEO // 11 Comments

The needs of the many outweigh the needs of the few.

Blekko Doesn't Grok Spock

Blekko Spam

Just prior to Farsight 2011, Blekko removed twenty sites from its search results.

“These sites are the worst spam publishers on the Web according to our users,” said Rich Skrenta, CEO of Blekko. “They are literally responsible for millions of pages on the Web that our users say are just not helpful and they’d prefer they were banned permanently. So we’re going to do that for them.”

Blekko has some interesting functionality around spam so I can see why they’d want to highlight it based on the recent spam/content farm meme surrounding search. That’s understandable. But censorship is not the answer.

Blekko Users

There is precious little data as part of this announcement. How big is Blekko? Quantcast and Compete show that the monthly unique visitor count is anywhere between 16,000 and 143,000. However, to mark anything as spam you have to be a Blekko user.

The November 2010 public launch of Blekko provided some insight into numbers and usage.

Blekko has been testing its solution to search with roughly 8,000 beta testers who have created more than 3,000 different slashtags. Blekko tells us that 11% of its existing user base come back to the site on a weekly basis.

I was a beta tester. So were a number of my colleagues – innovators, technologists and SEOs. As a search marketer we were eager to try out a competing search engine. I’m not a Google apologist.

Without hard data the math gets fuzzy, but the total number of registered users seems relatively small and is likely still composed of innovators. Do these people represent everyone?

Blekko Searches

The other missing piece of data is the searches related to these spam complaints. We don’t know the types of searches that were performed, nor the result set that was presented to users. Are the spam complaints a measure of the sites or a measure of the quality of results returned by Blekko?

Are spam complaints produced on general search queries or long tail queries? Is the incidence of spam complaints for specific sites different based on query type? (Information vs Transaction vs Navigation.)

The spam interface also leads to another question. How many of the spam complaints were made without visiting the site in question?

Blekko Censorship

Aaron Bradley took the words out of my mouth in his Blekko, Can I Please Have My Spam Back? post.

At the end of the day, I have no respect for a search engine that censors my results based on notions of quality, rather than relevancy.  It ceases to be comprehensive, it smacks of elitist righteousness and – most of all – decisions about the validity of content are being made on my behalf by people I don’t know.

Quality and taste are subjective. The fact that Blekko has chosen to use the feedback from a biased minority to censor results for the majority is unfortunate. Is the message that mainstream users don’t know enough to make their own decisions, the right decisions? If I search for ‘food’, should unhealthy foods be removed from search results?

In all seriousness, would Blekko remove specific books that users had marked as spam? According to the American Library Association, this would mean Catcher in the Rye, To Kill a Mockingbird and The Color Purple would vanish from the landscape.

Use spam feedback to reorder results, but let me make up my own mind. I don’t need a nanny search engine.

Disclosure: While I consult for Buzzillions, this post is my personal opinion and does not reflect those of Buzzillions.

Google Bait and Switch

February 02 2011 // Rant + SEO // 5 Comments

Does Google truly understand SEO? One would hope so but in the last few weeks Google took one step forward and two steps back.

What Google Says

Matt Cutts gave SEO a sort of backhanded compliment in a recent post about search neutrality.

I don’t believe all search engine optimization (SEO) is spam. Plenty of SEOs do a great job making their clients’ websites more accessible, relevant, useful, and fast.

I like Matt and I think he does understand and may even appreciate SEO.

And a recent Google Webmaster Help video titled Using Webmaster Tools Like an SEO was also a positive sign. The content is very basic and Maile seems to be talking like Mr. Rogers, but that’s probably to ensure the video helps beginners and those for whom English is a second language. So, they talk the talk.

What Google Does

Does Google walk the walk? The new Google Engage program recently launched and I’m seeing ads on Google promoting it.

Google SEO Search Ads

The keyword targeting seems focused around any term containing SEO. I got this one to fire when I searched for ‘seo services’.

A different version popped up during my morning Google Reader review.

Are you an SEO?

What’s the problem? Google Engage has pretty much NOTHING to do with SEO. Here’s the landing page.

Google Engage Landing Page

The highlighting is my own, but is there to underscore the fact that they’re equating search engine optimization with AdWords services. I find this disturbing.

I would give most people outside of the industry a pass on distinguishing between SEO, SEM and PPC. Google is no outsider. I think it’s pretty clear that SEO is about optimizing a site and pages for natural search. SEO is not about paid search.

Yet here they are, advertising against SEO keywords and using an SEO focused display URL to encourage AdWords business. I’m left to believe that those behind Google Engage either don’t understand what SEO really is, or know exactly what it is and seek to convince people to spend on paid search traffic instead of optimizing for free search traffic.

Bait and Switch

So which is it? When I search for ‘sem services’ I get a different ad.

SEM Services Adwords Ad

That ad takes me to an interesting page.

Google Defines SEO and SEM

Huh. Looks like Google’s got the definitions down pat. So I’m left to assume Google Engage is purposefully muddying the waters.

Am I blowing this out of proportion or are you disturbed by this bait and switch technique?

Google Testing Supersized Sitelinks

January 29 2011 // SEO // 6 Comments

Is Google testing a larger font size for sitelinks?

Supersized Sitelinks

Supersize Sitelinks

It looks like the sitelinks in both paid and organic listings have been supersized. Anyone else seeing supersized sitelinks?

Also, if you’re up for a chuckle, take a gander at the ‘Something different’ selections for apple.

Google Search Quality Decline or Elitism?

January 27 2011 // Marketing + SEO + Technology // 8 Comments

Are content farms really the problem or are you just a snob?

The recent complaints about Google’s search quality (here, here, here and here) range from real spam to indictments of content farms. I think we can all agree that spam (cloaking, scrapers, splogs, status code manipulation etc.) should be weeded out. But that leaves us with the larger issue: the quality of results.

Quality

The definition of quality usually refers to a ‘degree of excellence’ or ‘superiority of kind’. It’s often associated with grade. Think back to your time in school. Did you ever get back a paper you thought deserved a higher grade? You were certain it was an A paper and you got a B+ instead!

B+ Grade

Quality is a matter of taste.

Taste

Ruination IPA or Coors Light

What about beer? I adore Stone’s Ruination IPA. But I’m certain a lot more Coors Light is sold in a day than Ruination IPA in a month, maybe even a year. Even if I were to try to determine the best IPA, there would be many conflicting and passionate opinions on the topic.

Value

Perhaps it’s about value instead? Ruination IPA costs a pretty penny while Coors Light is cheap. Maybe Coors Light is the best value because of the ratio of price to quality. But people value things in very different ways. This is clear when looking at restaurant reviews.

Applebees vs The French Laundry

When I read restaurant reviews I can tell whether the reviewer has the same food bias as I do. I treat reviews which laud huge portions, or rock bottom prices, or extol the virtues of never-ending refills differently. Their view of what a good meal is differs from mine. They’re looking for quantity, no matter how mediocre the food. I’m looking for quality and generally don’t want a pound and a half of garlic mashed potatoes.

There’s nothing wrong with either perspective. But they are different.

Popularity

Google Serves Lots of People

Look around folks. What do you see more of? Fast food or fine dining? It’s fast food hands down.

And you can see this in nearly every area of life. Justin Bieber and Miley Cyrus are wildly popular musicians but I’m listening to Kasabian and Kaiser Chiefs. I haven’t touched Internet Explorer in years but it’s (sadly) still the most popular browser.

Mahalo, Squidoo and eHow get millions of visitors a month. These sites are popular, and while you might find them distasteful, lacking quality or providing little value, many others (clearly) disagree.

Do I like these sites? No. Perhaps I’m a snob. Maybe you are too.

Numbers

The number of searches has skyrocketed in the last five years. Using comScore’s monthly numbers, core searches have gone from 6.9 billion at the beginning of 2007 to 16.4 billion at the beginning of 2011.

US Search Volume 2007 to 2011

At the same time Pew reports a growing percentage of adults are now online and using search engines on a daily basis.

Audience

The search audience has changed. One way to measure this is to plot daily search engine usage by adults against the innovation curve.

Diffusion of Innovation

The U.S. Census Bureau puts the population of the US at around 300 million. Using the CIA World Factbook we can estimate that 80% of those are over the age of 14. I’m going to use the resulting number (240 million) as my adult population number.

In 2007 Pew reported that 70% of adults were online and that 40% of them used search on a daily basis.

  • 240,000,000 X 70% X 40% = 67,200,000

In 2010 Pew reported that 79% of adults were online and that 49% of them used search on a daily basis.

  • 240,000,000 X 79% X 49% = 92,904,000
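The arithmetic above is simple enough to sanity check in a few lines:

```javascript
// Adult (15+) population: ~300M US residents, ~80% over the age of 14.
const adults = Math.round(300_000_000 * 0.80); // 240,000,000

// Daily searchers = adults × share online × share who search daily.
const daily2007 = Math.round(adults * 0.70 * 0.40); // 67,200,000
const daily2010 = Math.round(adults * 0.79 * 0.49); // 92,904,000
```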

innovation adoption of search

In both 2007 and 2010 daily search usage penetrated the Early Majority. The difference is that the Early Majority now outnumber the Innovator and Early Adopter groups combined.

Early Majority Rule Search Volume

And that’s just three years; imagine the difference between 2005 and 2010. The picture of a daily search user is very different today.

Mental Models

The nature of our searches (as a whole) is likely changing because of who is now searching. The mental model of an Innovator or Early Adopter is going to be different than that of someone in the Early Majority.

Each group is going to approach search with different ideas and baggage. The Innovator and Early Adopter are more likely to be open to new experiences and to explore. They are more risk tolerant.

The Early Majority and Late Majority are more likely to apply their information seeking behaviors from other mediums to search. They’re looking for the familiar.

Brands

Many seemed surprised when Google Instant revealed a ‘bias’ toward brands. It has since been confirmed that Google is not engaging in any internal bias. That bias is a user bias. It’s a prediction based, in large part, on the volume of searches.

Should we really be surprised? Many of these companies are spending a fortune to advertise and market their brand. Their goal is to capture mindshare and they are succeeding. So much so that people, particularly the Early and Late Majority, go online to search for those brands.

Brand Search Acceleration

In 2005, a DoubleClick report (Search Before The Purchase) showed relatively low levels of brand search. While it accelerated closer to the actual purchase, in some instances only 27% of searches were on brand. Do you honestly think that’s still true today?

eCommerce has certainly grown in that time. The number of navigational searches has climbed, which is closely related to brand. People continue to search (a lot) for Facebook or Craigslist as a way to get to those destinations. But last year Bing also reported that Walmart was the 8th most searched term.

Users

Matt Cutts tells us not to chase the algorithm but to chase the user. But who is the user really? The audience has changed! And if the algorithm is trying to use human feedback as a signal, wouldn’t the results reflect that new composition?

Might that be why, in October of 2010, many people noticed an algorithm change that seemed to skew toward bigger brands? It’s what Jonathan Mendez called ‘gentrification of the SERPs’. (I wish I’d come up with that term!)

I may not think the results got better, but perhaps someone from the Early Majority or Late Majority did. They look at those results and see a lot of familiar brands and that instills confidence.

Content Farms

So when you see eHow at the top of a result and cringe, others might be thinking Google has led them to the easiest and best result. When you find a Mahalo page you might grind your teeth, but others could walk away thinking they got exactly what they needed.

I may enjoy reading the works of Shakespeare but plenty of others will be super happy to have the CliffsNotes version instead.

Which User is Google Optimizing For?

McGoogle

I believe Google when they say they want to provide the most relevant results. But there is a fair bit of subjectivity involved because the user is not some monolithic, homogeneous blob. Quality, taste, value and popularity are all going to inform what people think is relevant.

If Google is optimizing for the majority, that may mean a very different interpretation of relevancy. There’s nothing really wrong with that, but if you’re an Innovator or Early Adopter, you might think things are getting worse and not better.

There’s usually a better place to eat right down the street from a McDonald’s, but it’s McDonald’s that still gets most of the business. There are some places (North Beach in San Francisco for instance) that have a ‘no-chains’ policy.

Google could certainly do that. They could stand up and say that fast food content from Demand Media wouldn’t gain prime SERP real estate. Google could optimize for better instead of good enough. They could pick fine dining over fast food.

But is that what the ‘user’ wants?

SEO Status Codes

January 20 2011 // Analytics + SEO // 5 Comments

One of the more technical aspects of SEO is to understand, monitor and manage status codes.

Soup Nazi 400 Bad Request

What are Status Codes?

Status codes are an essential part of HTTP, the request-response protocol that powers the web. Each time someone (including Googlebot) visits a page, the browser or bot asks the server for that resource. The status code is the server's numeric response to that request and provides guidance on how to proceed. You might be familiar with status codes such as 404 and 301.
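HTTP groups its status codes by the first digit: 2xx means success, 3xx redirection, 4xx a client error and 5xx a server error. As a minimal illustration (the helper function is my own, not part of any library):

```python
def status_class(code):
    """Map an HTTP status code to its general class, based on the first digit."""
    classes = {2: "success", 3: "redirection", 4: "client error", 5: "server error"}
    return classes.get(code // 100, "other")

print(status_class(301))  # redirection
print(status_class(404))  # client error
```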

SEO Status Codes

I recommend bookmarking the status code definitions documented by the W3C. However, I want to provide a quick reference guide specifically for SEO.

200

OK or Success. This is the response code you want to see most often. At a minimum, I want Googlebot to see a 200 response code on at least 90% of requests during a crawl.

301

Moved permanently. This is the right way to redirect, telling search engines to index that content at its new location.

302

Moved temporarily. This is the wrong way to redirect (except in very rare cases). You're essentially putting the content into limbo: it's no longer at the original location, but search engines won't index the temporary location either.

304

Not modified. This can be used for crawl efficiency, telling search engines that the content has not changed. You're basically telling Googlebot not to bother and to move on to other content. The advent of Caffeine may have made this unnecessary, but I think it's still worthwhile.
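To sketch how a 304 works on the server side: the client sends an If-Modified-Since header carrying the date of its cached copy, and the server compares that to the resource's last-modified time. This helper is a hypothetical simplification of that comparison, not any framework's API:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def conditional_status(if_modified_since, last_modified):
    """Return 304 if the resource hasn't changed since the client's cached copy,
    else 200. if_modified_since is the raw header value (or None);
    last_modified is a timezone-aware datetime for the resource."""
    if if_modified_since:
        try:
            client_copy = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200  # unparseable header: serve the full response
        if last_modified <= client_copy:
            return 304  # nothing new; the bot can move on
    return 200

last_modified = datetime(2011, 1, 1, tzinfo=timezone.utc)
print(conditional_status("Sat, 01 Jan 2011 00:00:00 GMT", last_modified))  # 304
print(conditional_status(None, last_modified))  # 200
```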

404

Not found. This happens when the server can't find any content at that specific location. Too many 404s are bad; in my experience they act as a negative algorithmic signal. Google simply doesn't trust that sending a user to that site will be a positive experience.

I don’t have a hard and fast number for when 404s become problematic. I believe it’s probably based on a percentage of total requests to that site. As such, it’s just good practice to reduce the number of 404s.

That does not mean zero! I don't recommend putting a 301 in place when it should return a 404. A request for domain.com/foo should return a 404. Ditto for returning a 200 when it should be a 404, the so-called soft 404. (Yes, I've seen this lately.) I'd be surprised if having no 404s at all wasn't also some sort of red flag.

410

Gone. If you know that content no longer exists, just say so. Don’t encourage Googlebot to come back again and again and again via a 404 which doesn’t tell it why that page no longer exists.

500

Internal Server Error. This generally means the server encountered an error and never returned an appropriate response. 500 errors basically tell the search engine that the site isn't available. Too many 500 errors call into question the reliability of that site. Google doesn't want to send users to a site that ultimately times out and doesn't load.

How to Track Status Codes

There are a number of ways you can track status codes. For spot checking purposes, I recommend installing one of two Firefox add-ons: HttpFox or Live HTTP Headers. These add-ons let you look at the communication between user agent and server. For example, here's what happens when I type 'www.searchengineland.com' directly into my browser bar.

HttpFox Example

Using HttpFox I see that it performs a 301 redirect to the non-www version and then resolves successfully. Google Webmaster Tools also provides you with nice insight through the Crawl Errors reporting interface.
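You can trace the same kind of 301-then-200 chain programmatically. This sketch spins up a throwaway local server so it's self-contained (in practice you'd point it at a real URL) and follows redirects by hand, recording each status code along the way:

```python
import http.client
import http.server
import threading
from urllib.parse import urlsplit

class Handler(http.server.BaseHTTPRequestHandler):
    """Tiny demo server: /old 301-redirects to /new, which returns 200."""
    def do_GET(self):
        if self.path == "/old":
            self.send_response(301)
            self.send_header("Location", "/new")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
    def log_message(self, *args):  # keep the demo quiet
        pass

def trace(host, port, path, limit=10):
    """Follow redirects manually, recording each status code on the way."""
    chain = []
    for _ in range(limit):
        conn = http.client.HTTPConnection(host, port)
        conn.request("GET", path)
        resp = conn.getresponse()
        chain.append(resp.status)
        location = resp.getheader("Location")
        conn.close()
        if resp.status not in (301, 302) or not location:
            break
        path = urlsplit(location).path or "/"
    return chain

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
print(trace("127.0.0.1", server.server_address[1], "/old"))  # [301, 200]
server.shutdown()
```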

But if you really want to use status codes to your benefit you’ll need to count and track them every day via log file analysis. I recommend creating a daily output that provides the count of status codes encountered by Googlebot and Bingbot.
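A minimal sketch of that daily tally, assuming combined-format access logs (the log lines and regex below are illustrative; real bot verification should also check the requesting IP, since anyone can claim a Googlebot user agent):

```python
import re
from collections import Counter

# Simplified pattern for the request + status portion of a combined log line.
LOG_RE = re.compile(r'"\w+ \S+ HTTP/[\d.]+" (\d{3})')

def bot_status_counts(lines, bots=("Googlebot", "Bingbot")):
    """Tally status codes per search-engine bot from raw access-log lines."""
    counts = {bot: Counter() for bot in bots}
    for line in lines:
        match = LOG_RE.search(line)
        if not match:
            continue
        for bot in bots:
            if bot.lower() in line.lower():
                counts[bot][int(match.group(1))] += 1
    return counts

# Made-up sample lines, just to show the shape of the output.
sample = [
    '1.2.3.4 - - [20/Jan/2011:06:25:01] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '1.2.3.4 - - [20/Jan/2011:06:25:09] "GET /gone HTTP/1.1" 404 0 "-" "Googlebot/2.1"',
    '5.6.7.8 - - [20/Jan/2011:06:26:02] "GET / HTTP/1.1" 200 512 "-" "Bingbot/2.0"',
]
print(bot_status_counts(sample))
```

Run against a full day's log, the resulting counts drop straight into a dashboard.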

Status Code Reports

Using those daily numbers you can construct insightful and actionable dashboard graphs.

Sample Status Code Reports

While this may take some doing, the investment is worthwhile. You can quickly identify and resolve 404s and 500s. Many will find it helpful to have this data (concrete numbers!) so you can prioritize issues within a larger organization.

You’ll also gain insight into how long it takes search engines to ‘digest’ a 301 and much more. Status code management can be a valuable part of an advanced SEO program.

xxx-bondage.com