Google confusion reigns. Recently a thread opened on the Site Reference Forums for people to vent their frustration over Google's changing algorithms, the see-saw nature of rankings, and the reams of conflicting information on what Google is actually looking for. The thread has been a bit of therapy both for those who have seen their rankings slip and for those who have yet to crack Google's top rankings. It is, I fear, indicative of how many webmasters feel when trying to understand Google.
Google – An 'Open' Book
Even though there has never been so much confusion about how Google ranks websites, Google has never tried harder to be open with the SEO community about their ranking policies. In the past year or so, Google has offered more examples, more direct advice, and more proactive measures to help website owners understand what they are looking for and, more importantly, what to avoid. Between Google Sitemaps, people like Matt Cutts offering free advice from within the Googleplex, and Google's own program of actively informing website owners of potential problems with their sites, Google is sharing a ton of information about how they work, but relatively few webmasters seem to be listening.
It may partly be Google's newfound openness that is causing much of the confusion. In general, website owners who watch their rankings slip for no apparent reason, or who cannot seem to get their quality site to crack the top rankings, tend to be skeptical about anything Google says. When a webmaster reads that Google frowns on link exchanges, yet sees a competitor dominating the rankings with a link exchange, they naturally dismiss the advice as untrue.
The fact these people do not seem to want to admit is this: Google is changing. The old ranking techniques that could once be used as a shortcut to a top ranking no longer work. Google has learned, and continues to learn, for better or worse, how to weed out websites that try to manipulate their rankings by pandering to an imperfect algorithm.
The algorithm is still imperfect, and so is Google. With their increased focus on reducing the effect of algorithm pandering, they have demoted some very legitimate websites and also raised some lower quality websites into the top of their rankings. But they are changing all the same – and widely accepted 'shortcut' techniques are their target.
When it comes to SEO, a lot of webmasters live on Tom Smykowski's 'jump to conclusions mat'. If they see one website that is using link exchanges and has a top ranking for a particular keyword, they assume that link exchanges must be the reason for the high ranking. If they find a single site that uses hidden text and ranks well, then it must follow that Google 'likes' these things. Conclusion jumping is easy to do – especially when a particular method works for your website, or is working for a competitor's website.
Recently I wrote and published an article entitled “Valid HTML – Does Google Care?” The article raised the question of whether Google actually prefers invalid HTML over valid HTML. Four websites were tested on two keyphrases that had no competition. In the results, sites with invalid HTML consistently ranked higher than those with valid HTML.
The article was heavily criticized, and rightfully so (mea culpa, mea culpa, mea culpa). A general conclusion was drawn from two very isolated cases, cases that were being tested on keywords with no real-life competition. At best, the article could offer some insight into how Google ranks websites for keywords with no competition; to draw any larger conclusions would be a logical error. (As a point of interest, there are indications that while Google does not give additional weight to valid HTML, they do encourage it to ensure crawlability – invalid HTML can certainly make a website impossible to crawl. And of course, there is no penalty for valid HTML.)
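The crawlability point is easy to illustrate with a toy link extractor built on Python's standard html.parser (a minimal sketch, not how Google actually crawls; the page markup and class name are invented for illustration). A crawler can only discover pages through links it can parse, so a careless markup mistake, such as a comment accidentally left wrapped around part of the navigation, makes those links invisible:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values the way a simple crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Only anchor tags with an href contribute to crawl discovery.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A stray comment left around the old navigation hides that link entirely:
# the parser reports it as a comment, never as a start tag.
page = """
<html><body>
<a href="/products.html">Products</a>
<!-- <a href="/archive.html">Archive</a> -->
</body></html>
"""

extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/products.html'] – the commented-out link is never discovered
```

The same blindness applies to links buried behind broken tags or scripts: if the parser cannot see the anchor, the page behind it may as well not exist.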
The point of all this is simple: few SEO rules are going to hold true in every example. Search results have both macro-environments and micro-environments. To draw conclusions about more general SEO theories from a single micro-environment, or even a handful of micro-environments, is often a big mistake. Just because you see one example of a link exchange working well, that does not mean that Google wants to encourage link exchanges or that they are unable to stop link exchanges.
What We Do Know
There is only one real certainty about Google: they are changing and will continue to change. For the past few years, Google has been refining their algorithm and rebuilding their index. Anyone who thinks “Big Daddy” is the last of the major updates has not been paying attention. Google does not sit still and will constantly change to more efficiently find what they are looking for.
We know that Google is looking for content – good original content. They want websites that are linked to naturally from reputable resources. They want to be able to trust your website, which means that websites they already trust, and that are related to your industry, must 'vouch' for you. They want websites that are easy to use for visitors, websites that visitors are actually looking for.
We also know what Google is not looking for. They are not looking for websites that pander to an algorithm, nor for sites that try to fake their way into popularity through bogus links. The time is coming when outright abuse of the system will no longer be a viable option.
All this is very vague – so how about some specifics?
- Link exchanges for SEO are a bad idea. There is a big difference between exchanging links for SEO and two sites that happen to exchange links. Automatic link exchange schemes, highly unrelated links, and massive numbers of links regardless of quality should be avoided at all costs.
- Purchasing links is definitely a bad idea. Some argue that Google could never find out who is selling links. The fact is, though, that purchased links, at least at the most basic level, are extremely easy to detect.
- Thousands of links do not mean what they used to. Links must be related. If you want to see a significant effect, they should also come from highly trusted websites in a natural way.
- It is possible to be hurt by bad links.
- Obvious search engine spam, such as hidden text, redirects, etc., will eventually get you banned.
- Being a careful webmaster is important. Innocent and careless mistakes can and will cost you. Setting up a website with broken links, multiple URLs for the same page, poor navigation, and the like screams poor quality.
- Rehashing content does not work. Google wants one of two things: 1) completely original content, or 2) content conglomerated in a completely original fashion. This goes along with the much-discussed duplicate content filter.
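The "multiple URLs for the same page" pitfall above can be reduced with URL canonicalization. The sketch below, using only Python's urllib.parse, collapses a few common variants to one form; the specific rules (dropping "www.", fragments, and a trailing "index.html") are illustrative assumptions, not a description of what Google does:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Reduce common URL variants of the same page to one canonical form."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):          # treat www and bare host as one site
        netloc = netloc[4:]
    if path.endswith("/index.html"):       # directory index = the directory itself
        path = path[: -len("index.html")]
    if not path:
        path = "/"
    # Fragments never reach the server, so they are dropped outright.
    return urlunsplit((scheme.lower(), netloc, path, query, ""))

variants = [
    "http://Example.com/index.html",
    "http://www.example.com/",
    "http://example.com/#top",
]
print({canonicalize(u) for u in variants})  # {'http://example.com/'} – all collapse to one URL
```

Picking one form and redirecting the rest to it keeps a crawler from seeing three "different" pages with identical content.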
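As for the duplicate content filter, near-duplicate detection is commonly sketched with word shingles and Jaccard similarity. This is a textbook technique, not a claim about Google's actual filter, and the sample sentences are invented:

```python
def shingles(text, k=3):
    """Break text into the set of overlapping k-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity: shared shingles over total distinct shingles."""
    return len(a & b) / len(a | b) if a | b else 0.0

original = "google wants completely original content on every page"
rehash   = "google wants completely original content on each page it indexes"
fresh    = "our weekly recipe column covers slow cooked winter stews"

sim_rehash = jaccard(shingles(original), shingles(rehash))
sim_fresh  = jaccard(shingles(original), shingles(fresh))
print(sim_rehash > sim_fresh)  # True – the rehash scores far closer to the original
```

Light rewording leaves most shingles intact, which is exactly why shuffling a few words in copied content does not make it original.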
How do we know these things? Because Google has been open about what they are looking for. These are not theories based on observations; these are things that Google has discussed openly through employees, at conferences, and through the tools they offer us. Unfortunately, many webmasters simply do not want to listen.