
Google Text-Only Cache Bookmarklet

March 22 2010 // SEO + Technology // 2 Comments

text only no html

Last year I wrote about Google’s text-only cache, which lets you see what Googlebot sees. This fits in well with the Blind Five Year Old philosophy, since Google doesn’t care if your site is pretty. I know, it’s not a perfect analogy because Google would still need to read the text, but … think of it as braille for Googlebot.

I still recommend the technique but wanted to pass along a better way to access Google’s text-only cache.

Google Text-Only Cache Bookmarklet

A bookmarklet is a bookmark that delivers one-click functionality to a browser or web page. You’re probably using a few already (bit.ly anyone?). Following is a bookmarklet that shows the Google text-only cache of any page.

Text Cache

Simply drag the link above to your bookmark bar to have one-click access to Google’s text-only cache of the page you’re on. This bookmarklet comes from SEOmoz, where they’ve compiled a list of 30 SEO bookmarklets along with instructions on how to create your own.
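If you’d rather see what’s behind the Text Cache link above, the underlying JavaScript is only a couple of lines. Here’s a minimal sketch; the cache: operator is standard, but treat the exact URL and the &strip=1 parameter (which requests the text-only view) as assumptions, since Google has changed its cache URLs over time.

```javascript
javascript:(function () {
  // Send the current page to Google's cache, asking for the text-only view.
  // Assumption: cache: queries go through /search and &strip=1 strips the HTML.
  window.location.href = 'http://www.google.com/search?q=cache:' +
    encodeURIComponent(window.location.href) + '&strip=1';
}());
```

Paste the whole thing into the URL field of a new bookmark and you’ve rolled your own.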

Create Your Own Bookmarklet

Creating your own bookmarklet really is easy. Here’s one I created that gives you one-click access to SEM Rush.

SEM Rush

The instructions SEOmoz provides are solid, but limited to simple queries. Anything more and you’ll need to learn additional JavaScript commands and syntax. If you’re technically inclined that’s not a huge task, but start out by futzing around with the simple stuff. As always, doing it is the best way to learn.
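To make that concrete, here’s a rough sketch of what a simple tool bookmarklet like the SEM Rush one might look like under the hood. The /info/ report path is an assumption for illustration; check the tool’s actual URL structure before relying on it.

```javascript
javascript:(function () {
  // Pull the bare hostname of the page you're on (drop a leading www.).
  var domain = window.location.hostname.replace(/^www\./, '');
  // Open the tool's domain report in a new window.
  // Assumption: SEM Rush exposes reports at /info/<domain>.
  window.open('http://www.semrush.com/info/' + encodeURIComponent(domain));
}());
```

Swap out the URL pattern and you’ve got a one-click bookmarklet for nearly any tool that accepts a domain or URL in its query string.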

So grab or create SEO bookmarklets and spend less time navigating and more time analyzing.

Google, is there a …

March 20 2010 // Humor + Life + SEO // 1 Comment

Google Suggest can be an endless source of entertainment and insight. Here’s one I caught in late January.

Google Suggests for Is There a

At first glance it seems like a strange combination but upon further inspection it’s a lot like a Google-style Burroughs cut-up.

Theology

You’ve got the adult and kids versions of theology with ‘is there a god’ and ‘is there a santa claus’. The current suggester also includes ‘is there an afterlife’. Some heavy stuff.

Health

From herpes to AIDS to cancer, health queries are rising. I imagine that many dealing with these issues might wind up typing in ‘is there a god’ or ‘is there anybody out there’. The Internet can isolate but also connect.

Lyrics

The Pink Floyd song is easy to spot, though at first glance I thought it was a Duran Duran reference. I’m an 80s fan and won’t apologize for it! The other song is by Band of Horses. I hadn’t heard of them until now. All the songs do have a yearning, ethereal feel to them.

Miscellaneous

The meteor shower must have been very topical at the time because it’s not included in the current suggester. However, things falling from the heavens certainly fit the overall theme.

As for hdmi cables, get the cheap ones.

The Best SMX West Session You Didn’t Attend

March 19 2010 // SEO // 2 Comments

SMX West 2010 Logo

This was the second straight year I attended SMX West. It’s still the best search conference I’ve been to, but a number of the sessions were nearly the same as the prior year. However, there were enough new sessions to keep me busy and stoke my interest. Among those new sessions, the surprise was ‘Search Behavior: Using Research To Improve Results’.

Search Behavior: Using Research To Improve Results

The session was on the third day right after lunch. It had been a long conference and many were more interested in attending ‘Ask The SEOs’ to see if any sparks would fly with Aaron Wall on the panel.

The few who did attend got an entertaining, theoretical, scientific and academic review of search. The panel was composed of Gordon Hotchkiss, Carla Borsoi, Venkat Kolluri and Shari Thurow.

The Brain on Search

Gordon Hotchkiss from Enquiro presented some academic research from UCLA that showed brain function by level of search savvy.

UCLA brain and search study

An interesting point of reference was that the search savvy develop habitual search patterns. This would clearly make it difficult to change the search interface in any major way without disrupting those patterns. How long might it take to rewire the brain to get the most out of search? And what new pattern would emerge as a result? It makes any Google 2.0 interface extremely risky and unlikely.

Types of Search

I’ve long been interested in the different types of search: informational, navigational and transactional. I’ve had personal experience optimizing information- and transaction-based sites, and there is evidence that the habitual pattern for each search type varies. Let me tell you, the SEO strategy is different.

Shari Thurow from Omni Marketing Interactive explored each type of search, mixed in personas and tossed in an F-bomb to boot. There are a number of ways you can use this information, not the least of which is to craft better titles and meta descriptions to match the search type and intent. Rank matters (a lot) but, just like PPC, good copy can also move the needle.

How Do Users Search?

Carla Borsoi from Ask also spoke about information search, looking at how users search. The focus was on questions, and the few examples she provided were illuminating.

Carla Borsoi Search Behavior Presentation

Here you’re seeing the question a user really wants to ask versus what they actually type into a search engine. Clearly, matching queries with intent is a tough business. It’s one of the reasons why all of the search engines have moved to search suggesters so that they can better match query intent.

The Search Behavior session may not have been the most actionable of sessions. There were no tips, tools or lists of resources that you could use immediately. You didn’t get insight (overt or between the lines) from a search engine representative. What you did get was a rich background into the psychology of search and the implications it may have on our industry now and in the future.

In other words, you saw the forest and not the trees.

Google Caffeine Is Not An Algorithm Change

February 05 2010 // SEO // Comments Off

Google Caffeine

There’s still a lot of speculation and conjecture about Google Caffeine. More than a few have analyzed and theorized based on the preview Google provided. Some have even given tips on optimizing for Caffeine. Don’t believe the hype.

The Google Caffeine Myth

Google Caffeine is not an algorithm change. That’s not to say that the results won’t change here and there, but for the most part the actual scoring is unchanged. There are no special techniques or changes needed to address Google Caffeine.

What is Google Caffeine?

Google Caffeine is a re-architecture of the entire indexing system, done for performance. In short, Caffeine:

  • allows Google to crawl and index more pages
  • allows Google to crawl and index pages faster

The result? The indexing data fed into individual signals is more robust and updated more frequently. This might produce some slight changes to results, but only because of the change in data, not because of a change in scoring.

In Google terminology, Google Caffeine increases the rate of data refreshes. I recommend you read and listen to what Matt Cutts has to say on the topic. Vanessa Fox was also a voice of reason with her Search Engine Land Google Caffeine review.

Now, in re-architecting the indexing system, not all data structures are 100% compatible. So, some tiny changes have likely been made, but they’re not meant to be algorithmic changes.

Why Ask For Caffeine Feedback?

Google provided the SEO community with access to Caffeine results and asked for feedback. This request was looked upon by many as an indication that this was an algorithm change.

In fact, this request was a type of distributed QA. The goal of the new indexing system is to increase performance without appreciably changing algorithmic results. Letting a bunch of hyperactive SEOs into the sandbox helped them identify any flaws in the re-architected code.

Why Re-Architect The Indexing System?

Most seem to believe the impetus for Caffeine was real-time search. I’m sure this was a factor, but I doubt it was the primary reason. Google could have pulled off real-time search without Caffeine.

The indexing code in use was written ages ago in Internet time. Since then, the Internet has grown and changed dramatically. Algorithmic changes and new signals could only go so far in ensuring quality results. If the data set they were using was incomplete or ‘old’, no amount of signal tinkering would have the desired impact. It’s a simple GIGO (garbage in, garbage out) problem.

Google needed more data to better inform the algorithm. They needed to see more of the Internet. More links. More page changes. Any of you with ego-based Google Alert feeds would have noticed a substantial increase in activity over the last month. It’s not that you’re getting more popular (sorry); Google is just indexing more and doing so faster than ever before.

Caffeine Is The Beginning

The real question should be what will Google do once Caffeine is fully implemented. Once Google gets all of this data, new patterns will emerge and algorithm changes are certain to follow.

Google 2.0

February 02 2010 // SEO + Web Design // 1 Comment

Google has recently been trying to streamline search results as the number of universal search elements grows. It’s what Marissa Mayer, VP of Search Products and User Experience, calls ‘user interface jazz’.

Solving Google’s Jazz Problem

Recent attempts to solve the jazz problem have revolved primarily around a left contextual navigation pane. Whether it is always exposed or only introduced when clicked, Google seems sure that this is the way to solve search overload.

But is it really?

Google and Web 2.0

Google is rooted firmly in Web 1.0. There’s (clearly) nothing wrong with that. Yet the interface hasn’t changed all that much as the web has evolved. While Mayer acknowledges the bimodal world of screen sizes (larger desktop screens but smaller mobile interfaces), does the interface fully acknowledge and take advantage of these advancements?

The Splinternet is real and seems only to be expanding with the launch of the iPad.

New User Interfaces

Some of the most interesting new interfaces are far more visual and horizontal in nature, allowing the user to digest more information at a glance. Think about what Google search results could look like if they used an interface like Lazyfeed or the Times Skimmer.

Google 2.0

Here’s a quick Frankenstein of what Google 2.0 could look like using a bit of Google’s Jazz UI and the Times Skimmer.

Google 2.0 User Interface

One or all of these results or panes could update in real-time. Another could present a fully embedded video. Yet another could present thumbnails for image matches. The possibilities, while not endless, are numerous.

I’ve kept the left-hand navigation, but you could just as easily do without it. In fact, that would better adhere to Google’s search motto: don’t make the user do something we can do for them.

Google Takes The Safe Route?

Of course, Google would need to determine how to present AdWords effectively in this environment. Perhaps the fear of disrupting AdWords revenue is why a major UI change isn’t in the cards. But this seems like a contradiction in how Google sets goals and measures success.

Achieving 65% of the impossible is better than 100% of the ordinary – Setting impossible goals and achieving part of them sets you on a completely different path than the safe route. Sometimes you can achieve the impossible in a quarter, but even when you don’t, you are on a fast track to achieving it soon. Measuring success every quarter allows for mid course corrections and setting higher goals for the next quarter.

Maybe Google has already tested radical new UI with unsatisfactory results. Or maybe Google is taking the safe route, thinking that the search interface can remain relatively static as the web transforms.

Is Google really doing enough to solve user interface jazz?

How To See Google Analytics Traffic Faster

February 01 2010 // Analytics + SEO // Comments Off

Sometimes you want to see your Google Analytics traffic faster. Whether you’re obsessive, impatient, troubleshooting or benchmarking, you might find yourself frustrated with the 3-4 hour time lag, particularly if it’s a site with a decent amount of traffic.

Stop Waiting for Google Analytics Traffic

Here’s a quick and easy tip to see your Google Analytics traffic faster. (Remember, this only works if you’re looking at intraday traffic.)

Go to the Visitors > Visitor Trending > Visits report in Google Analytics. Then make sure you’re looking at the graph by hour. The report will look something like this.

Google Analytics Traffic Graph

Now, at the far right, select the Advanced Segments drop-down and choose one of the default segments. My favorite is Non-paid Search Traffic. Then deselect All Visits so only Non-paid Search Traffic is checked. The result? You get a peek at a few more hours of traffic.

Google Analytics Non-Paid Search Graph

You can leave All Visits on to see the difference between the two if you’re really interested. For me, it’s all about looking at the day’s traffic in comparison to the same day last week. Using the same report with All Visits you get something like this.

Google Analytics All Visits Comparison Graph

Look at just Non-paid Search Traffic and you get to see those most recent hours. This is the report if you’re serious about SEO.

Google Analytics Non Paid Search Graph Comparison

You can use any of the default advanced segments, and usually any custom advanced segment that produces enough traffic. So stop refreshing your dashboard stats again and again without success. Instead, follow these few steps and get ahead of the curve.

Yelp robots.txt

January 16 2010 // Humor + SEO // Comments Off

The other day I was doing some robots.txt research and found a great little Easter egg on Yelp.

Yelp Robots.txt

A robots.txt file is a great place to drop Asimov’s Three Laws of Robotics. Thanks for the chuckle, Yelp!
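For anyone who hasn’t poked around in one, robots.txt is plain text: lines starting with # are comments that crawlers simply ignore, which is exactly what makes this kind of Easter egg possible. A made-up sketch of the idea (not Yelp’s actual file, and the Disallow path is only a placeholder):

```
# Asimov's Three Laws of Robotics:
# 1. A robot may not injure a human being or, through inaction,
#    allow a human being to come to harm.
# 2. A robot must obey the orders given it by human beings, except
#    where such orders would conflict with the First Law.
# 3. A robot must protect its own existence as long as such protection
#    does not conflict with the First or Second Law.

User-agent: *
Disallow: /private/
```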

2010 Internet, SEO and Technology Predictions

January 03 2010 // Advertising + Marketing + SEO + Social Media + Technology // 5 Comments

As we begin 2010, it’s time for me to go on the record with some predictions. A review of my 2009 predictions shows a few hits, a couple of half-credits and a few more misses. Then again, many of my predictions were pretty bold.

2010 Technology Predictions

This year is no different.

The Link Bubble Pops

At some point in 2010, the link bubble will pop. Google will be forced to address rising link abuse and neutralize billions of links. This will be the largest change in the Google algorithm in many years, disrupting individual SEO strategies as well as larger link based models such as Demand Media.

Twitter Finds a Revenue Model

As 2010 wears on, Twitter will find and announce a revenue model. I don’t know what it will be and I’m unsure it will work, but I can’t see Twitter waving their hands for yet another year. Time to walk the walk, Twitter.

Google Search Interface Changes

We’ve already seen the search mode test that should help users navigate and refine search results. However, I suspect this is just the beginning and not the end. The rapid rate of iteration by the Google team makes me believe we could see something as radical as LazyFeed’s new UI or the New York Times Skimmer.

Behavioral Targeting Accelerates

Government and privacy groups continue to rage against behavioral targeting (BT), seeing it as some Orwellian advertising machine hell bent on destroying the world. Yet, behavioral targeting works and savvy marketers will win against these largely ineffectual groups and general consumer apathy. Ask people if they want targeted ads and they say no, show them targeted ads and they click.

Google Launches gBooks

The settlement between Google, the Authors Guild and the Association of American Publishers will (finally) be granted final approval and then the fireworks will really start. That’s right, the settlement brouhaha was the warm up act. Look for Google to launch an iTunes like store (aka gBooks) that will be the latest in the least talked about war on the Internet: Google vs. Amazon.

RSS Reader Usage Surges

What, isn’t RSS dead? Well, Marshall Kirkpatrick doesn’t seem to think so and Louis Gray doesn’t either. I’ll side with Marshall and Louis on this one. While I still believe marketing is the biggest problem surrounding RSS readers, advancements like LazyFeed and Fever make me think the product could also advance. I’m still waiting for Google to provide their reader as a white label solution for eTailers fed up with email overhead.

Transparent Traffic Measurement Arrives

Publishers and advertisers are tired of ballpark figures or trends which are directionally accurate. Between Google Analytics and Quantcast people now expect a certain level of specificity. Even comScore is transitioning to beacon based measurement. Panel based traffic measurement will recede, replaced by transparent beacon based measurement … and there was much rejoicing.

Video Turns a Profit

Online video adoption rates have soared and more and more premium content is readily available. Early adopters bemoan the influx of advertising units, trying to convince themselves and others that people won’t put up with it. But they will. Like it or not, the vast majority of people are used to this form of advertising and this is the year it pays off.

Chrome Grabs 15% of Browser Market

Depending on who you believe, Chrome has already surpassed Safari. And this was before Chrome was available for Mac. That alone isn’t going to get Chrome to 15%. But you recall the Google ‘What’s a Browser?’ video, right? Google will disrupt browser inertia through a combination of user disorientation and brand equity. Look for increased advertising and bundling of Chrome in 2010.

Real Time Search Jumps the Shark

2009 was, in many ways, the year of real time search. It was the brand new shiny toy for the Internati. Nearly everyone I meet thinks real time search is transformational. But is it really?

A Jonathan Mendez post titled Misguided Notions: A Study of Value Creation in Real-Time Search challenges this assumption. A recent QuadsZilla post also exposes a real time search vulnerability. The limited query set and influx of spam will reduce real time search to an interesting, though still valuable, add-on. The Internati? They’ll find something else shiny.

The Link Bubble

December 28 2009 // SEO // 6 Comments

The real estate bubble popped. Will the link bubble be next?

Link Bubble

The real estate bubble was the product of greed, low interest rates, loose lending policies and derivatives. Nearly anyone could get a house, and people bought into the idea that real estate would always be a good investment. The result of this irrational exuberance? Homes were valued at far more than they were worth.

The Link Bubble

Are links that different than real estate?

Links have traditionally been a reliable sign of trust and authority because they were given out judiciously, a lot like mortgages. For a long time link policies were tight. You needed references and documentation before you earned that link.

In addition, links weren’t looked upon as an investment tool. The concept that links influenced SEO hadn’t taken hold. The motivation behind links was relatively pure and that meant Google and others could rely on them as an accurate signal of quality.

Links or Content?

Many have recently bemoaned the death of hand crafted content and the rise of content farms as a threat to search quality. But is content really the problem?

Content has little innate value from a search perspective. Yes, search engines glean the content topic based on the text. It’s like knowing the street address of a house. You know where it is and, probably, a bit about the neighborhood. But it doesn’t tell you about the size, style or quality of the home.

Long tail searches are akin to searching for a house by street address. So, content without links may sometimes produce results. But the vast majority of searches will require more information. That’s where links come in.

McDonald’s Content

Let’s switch analogies for a moment. Some have called Demand Media the McDonald’s of content. There’s a bit of brilliance in that comparison, but not in the way most think.

Both McDonald’s and Demand Media crank out product that many would argue is mediocre. Offline, McDonald’s buys the best real estate and uses low prices, brand equity and marketing to ensure diners select them over competitors.

Online, Google holds the prime real estate. But that real estate can’t be outright purchased. And in the absence of price, we’re left with brand equity and marketing. Online, brand equity translates into trust and authority. And links are the marketing that help build and maintain that brand equity.

Demand Media has brands (their words) that give it automatic trust and authority. Publish something on eHow and it automatically inherits the domain’s trust and authority, built on over 11 million backlinks.

Writers for Demand Media are provided revenue share opportunities on their articles. Here’s one of the tips they give to writers to boost traffic to their articles.

2) Link to your article from other websites.

Link from your own website or blog, from a message board or forum, from your social networking profile on MySpace or Facebook and more. The more high quality links to your article there are on the web, the more highly a search engine will rank it.

Demand Media combines the installed brand equity of multiple sites (which happen to be cross-linked) with an incentive to contributors to generate additional links. The content doesn’t have to be great when links secure premium online real estate.

There might be something better down the road, but McDonald’s is always right there at the corner.

Link Inflation

The last few years have produced major changes surrounding links. Linkbuilding is now a common term and strategy. A number of notable SEO firms tout links as the way to achieve success.

Linkbuilding firms sprang up. Linkbuilding software in various shades of gray was launched. Paid links of various flavors flourished. Social bookmarking and networking accelerated link inflation. And new business models like Demand Media emerged to take advantage of the link economy, creating collections of sites and implementing incentives that produce something resembling derivatives.

Link policies went from tight to loose and people got greedy. Anyone can get links these days. So what’s the natural result of this link activity?

Link Bubble Pops

Link Recession

The value of links is inflated and at some point the system will correct. The algorithm will change to address the abuse of links. Unlike the Federal Reserve, Google probably isn’t looking for a soft landing, nor is it going to extend a bailout.

Some links will continue to matter. Links that are in the right neighborhood. The ones with tree lined streets, good schools and low crime. But will links from cookie cutter planned communities still be valuable? Strong links will mean more because they’ll hold their value, while many more links will be neutralized.

I’m no Nouriel Roubini, but I do believe that a major link correction is coming in 2010. Google must address the link bubble to make search results better.

Google Brand Search Results

December 15 2009 // SEO // Comments Off

In late October Google launched a new type of search result for brand queries. Noted on Google Blogoscoped, a brand search result takes up an enormous amount of real estate and is composed of one regular listing and two indented listings.

When are brand search results shown?

The brand result is only triggered by certain queries. Oddly, it’s not just a brand name. Instead, it’s a brand name coupled with another keyword or keywords.

So, a search for ‘Peet’s’ will not trigger a brand result. But a search for ‘Peet’s Major Dickason’ will.

Google Brand Result

Is it really about brands?

Upon the launch of this new result it seemed like it was solely a function of what was in the domain. A search for ‘wooden bar stools’ triggered the new result.

Google Brand Result

Here, the domain of bar-stools-barstools.com triggered the new result when it matched the ‘brand + modifier’ query. A simple domain match against the keyword query, right?

Yet, when you search on this same term today you don’t get the new brand result.

Google wooden bar stools query

So what started as a simple domain match has evolved into something more refined. This isn’t capricious in nature. Google is always looking for ways to improve search quality. In this scenario they have weighted the actual brand or company site as more authoritative for a fairly large set of ‘brand + modifier’ queries.

What are the criteria for Google brand search results?

How Google is doing this refinement is unclear. Some have surmised Google is using some sort of brand database or that it’s related to their DNS service.

I tend to believe there was some sort of initial criteria (broad in nature) for this new brand result, with a built-in refinement mechanism. The refinement would be accomplished through analysis of user behavior on these results (e.g., relative CTR and length of clicks), SearchWiki data or human editing.

User behavior might be difficult to mine, since the large amount of real estate brand results take up likely makes them click magnets. And click length might describe the value and quality of the site, but not whether it merits the brand result presentation.

SearchWiki data and human editing are essentially the same concept, with the former being accomplished by a decentralized group of users (the proverbial cloud) and the latter being done by a centralized group of Google employees. Human editing doesn’t seem that far-fetched to me, particularly if Google used user behavior or SearchWiki data to identify domains for review. The result would be an easy and efficient review queue.

Why is Google interested in brands?

No matter how Google is doing it, the fact that they are is a signal that brands and companies provide a new proxy for trust and authority. The first step in this direction was the Vince change which unseated many highly optimized sites from root terms and replaced them with relevant brands or companies.

The current algorithm continues to struggle with trust and authority, with over-optimization reducing the value of on-page factors and the rise of link pollution quickly eroding the value of off-page factors.

Google brand search results show a continuing interest in moving beyond current signals to improve search quality and deliver better results. So while the new brand search result went largely unreported, it could be a harbinger of a larger algorithmic shift in 2010.
