I have been doing SEO for some time now and I have witnessed many a strange occurrence in the SERPs. Most of these weird occurrences I would attribute directly to a Google filter or Google penalty. I was inspired by a post over at WebmasterWorld, and as far as I know there is not a current list online that covers all of the potential Google penalties, so I have decided to put together an arbitrary list of them. Please note that there is no proof, i.e. no press release from Google stating these exist; rather, these are ideas, theories and assumptions drawn from SEOs' experiences.
Google Sandbox: The Sandbox filter is usually applied to brand new websites, but it has been seen applied to domains that have been around for a while. Since most websites do not make it past a year, Google implemented a filter that prevents a new site from getting decent rankings for competitive keyword terms. Brand new sites can usually still rank for non-competitive keyword terms, though.
How to work around the Sandbox: Google uses a system called trust rank. The idea behind trust rank is that if authority sites link to your new site, then you must be an authority site as well; since Google trusts these older, more respected sites, it will trust yours too, getting you out of the sandbox right away. That is not an easy thing to do, so if you are not able to get these links, try expanding your content to rank for many more less competitive keywords and keyword phrases (long tail keywords).
Google -30: This Google filter is applied to sites that use spammy SEO tactics. When Google finds you using doorway pages, JavaScript redirects, etc., it will drop your rankings by 30 spots.
How to get around this: If you find yourself a victim of the Google -30 filter, then usually just removing the spam elements from your site will get you back in. You can always fill out a request for re-inclusion if worst comes to worst. Here are some resources for the Google -30: Arelis, Threadwatch, SERoundtable.
Google Bombing: Google Bombing is a filter applied to sites that gain a large number of inbound links with the same anchor text. This raises a red flag to Google, as it is extremely unnatural for an inbound linking structure to have the exact same anchor text across the board.
How to work around this: If your site actually has this filter applied, then most likely you have been banned from the search engines and a re-inclusion request is probably your best bet. If the filter is not applied but through your monitoring you see the potential for it, then you might want to go back and ask people to change your anchor text, buy some links with varying anchor text, etc. Here are some resources for Google Bombing: Search Engine Watch, NYTimes, Google Blogspot.
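If you want to monitor this yourself, a quick sanity check is to tally how concentrated your anchor text is. This is just a local sketch: the anchor list below is made-up sample data, and in practice you would feed it an export from whatever backlink tool you use.

```python
from collections import Counter

def anchor_text_report(anchors):
    """Summarize how concentrated a backlink profile's anchor text is.

    `anchors` is a list of anchor-text strings (hypothetical sample
    data here). Returns (most_common_anchor, share_of_total_links).
    """
    counts = Counter(a.strip().lower() for a in anchors)
    top, n = counts.most_common(1)[0]
    return top, n / len(anchors)

# Made-up backlink export: 4 of 6 links use the exact same anchor.
anchors = [
    "blue widgets", "blue widgets", "blue widgets",
    "example.com", "click here", "blue widgets",
]
top, share = anchor_text_report(anchors)
print(top, round(share, 2))  # blue widgets 0.67
```

If one phrase makes up the bulk of your inbound anchors, that is the kind of profile this filter is said to flag.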
Google Bowling: This is not really a filter so much as a series of black hat techniques that will get a site banned. Usually people use this term in reference to a competitor or a page/site they want OUT of the SERPs. Google bowling is usually only effective against sites that are much newer, with lower trust rank. Trying to do this to a large site with high trust rank is going to be virtually impossible.
How to get around this: Google says that there is nothing a competitor can do to drop YOUR rankings. Many SEOs do not believe this, and seoblackhat sells services for something like this. A re-inclusion request is basically your only option. Here are some resources for Google Bowling: Web Pro News, ThreadWatch and SEroundtable.
Google Duplicate Content Filter: A duplicate content filter is applied to sites that take content that has already been created, cached and indexed on other sites. News sites are usually exempted from the duplicate content filter by a manual review. The pages this is applied to usually do not rank very well in the SERPs. PageRank can be devalued, and if a page does not have inbound links you could see your results being put into omitted search results and supplemental results.
How to get through this: If you find yourself in this filter, then your first step can be trying to remedy the duplicate content. Contact the person stealing your content and ask them to remove it. You can contact the person's web host to see if they will take down their site, and the last resort is “trying” to contact Google and alert them to what is going on. Keep on top of your content by using Copyscape to check for duplicate content.
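As a rough local check (separate from Copyscape), you can compare two pages' text with Python's standard `difflib`. This is only a crude similarity heuristic, not Google's actual duplicate-content detection:

```python
import difflib

def similarity(a, b):
    """Return the ratio of matching text between two strings (0..1).

    A rough duplicate-content heuristic: near-identical pages score
    close to 1.0. This is NOT how Google detects duplicates; it is
    just a quick local comparison.
    """
    return difflib.SequenceMatcher(None, a, b).ratio()

original = "Google filters can devalue pages that copy existing content."
copied   = "Google filters can devalue pages that copy existing content!"
print(similarity(original, copied) > 0.9)  # True: near-identical pages
```

Running this against your own pages and a suspected scraper's pages gives you a quick sense of how much overlap exists before you escalate.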
Google Supplemental Results: Google's supplemental results take pages on your site that have been indexed and put them into a sub-database in Google. Supplemental results do not rank well; rather, Google uses its supplemental DB to populate its results when it doesn't have enough results to show for a given query. This means pages of your site in Google's supplemental DB will not help you in the SERPs.
How to get through this: It's pretty simple, actually. Just get some inbound links to your pages. Check this post out to find out more about the Google Poo (supplemental results).
Google Domain Name Age Filter: The Google domain name age filter is closely related to trust rank and the sandbox, but it is possible to be out of the sandbox, have trust rank and still be in this filter. The idea behind this filter is that older sites and domain names are more likely to rank well for keyword terms than newer sites. If you are in this filter, you will most likely not rank well for competitive terms until your site grows older.
How to work around this: Quality links from authority sites with high trust rank will help you do much better in the SERPs.
Google’s Omitted Results Filter: Pages within your website that are in omitted search results will not show up in a Google search unless a user specifically chooses to show all omitted results. Users usually do not even get to the last page of results to do this, which keeps any omitted page of yours completely out of a Google search result. The reasons this happens are a lack of inbound links, duplicate content, duplicate meta titles, duplicate meta descriptions and poor internal linking.
How to get out of this: To get omitted pages out of this filter, simply alter the meta tags, fix any duplicate content and get some quality inbound links.
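Finding duplicate meta titles across your pages is easy to script. A minimal sketch, assuming you have already scraped each URL's `<title>` into a dict (the URLs and titles below are hypothetical):

```python
from collections import defaultdict

def duplicate_titles(pages):
    """Group URLs by meta title so duplicates are easy to spot.

    `pages` maps URL -> title text (scraped however you like).
    Returns only the titles shared by more than one URL.
    """
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical site: two pages share the same title.
pages = {
    "/": "Acme Widgets",
    "/about": "Acme Widgets",
    "/contact": "Contact Acme",
}
print(duplicate_titles(pages))  # {'acme widgets': ['/', '/about']}
```

The same grouping works for meta descriptions; any key that comes back with multiple URLs is a page worth rewriting.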
Google’s Trust Rank Filter: Like the PageRank algorithm, the trust rank algorithm has many factors that determine a site's trust rank. Some of the known factors are the age of a site, the amount of quality authority links pointing to it, how many outbound links it has, the quality of its inbound linking structure, its internal linking structure and overall SEO best practices in meta and URL structure. All sites go through this filter, and if your trust rank is low, your rankings in the SERPs will be too.
How to work with this: An old site and a new site can both have high or low trust rank. It is basically determined by the amount of quality authority links pointing to a site, how many outbound links it has, the quality of its inbound linking structure, its internal linking structure and overall SEO best practices in meta and URL structure. Optimize these and you will have quality trust rank.
links.htm Page Filter: This filter penalizes a site's ranking based on the use of a links.htm page. Reciprocal linking is an old technique that is not promoted by Google anymore. This filter affects your ranking in the SERPs.
How to work with this filter: Instead of using “links” as your page title and name, try using something like “mynewbuddies” or “coolsites”, as this will help get around this filter. Reciprocal links are an old SEO technique and Google devalues reciprocal linking structures. Here is someone discussing this at SEOChat.
Reciprocal Link Filter: Google is very open about reciprocal linking and clearly states that its algorithm can detect reciprocal link campaigns. Sites that only participate in reciprocal linking will usually have a hard time ranking in the search engines, but depending on what you are using your site for, a reciprocal link campaign might be exactly what you need. For example, if you are building an AdSense site, then you do not want to spend too much time building the site up, and a reciprocal linking campaign will help your site's inbound links grow over time.
How to work with this filter: When it comes to building an inbound linking structure, try to utilize some or all of the 15 types of links (and how to get them) from a post I did a ways back. Here are some resources about this filter: Matt Cutts here and here, Search Engine Guide and WebmasterWorld.
Link Farming Filter: Link farms are sites/pages that have a mass amount of unrelated links grouped together arbitrarily. Link farms can also contain related links, but most commonly they are unrelated. IP farms and bad link neighborhoods are all part of link farming. Being part of a link farm can get your rankings dropped in Google and possibly get you banned.
How to get around this: Currently the only way to get around this is to NOT participate in link farming. Here are some resources on link farming:
Co-citation Linking Filter: This popular Google filter watches your inbound link structure. If your automotive site is an outbound link on a site whose other outbound links point to casino and porn sites, then Google will think your site is related to porn and casinos. Poorly constructed co-citation will damage your ranking and make it hard for you to rank well for the terms you are targeting.
How to work with this: When considering a link partner or a paid link, or when monitoring your inbound links, be sure to follow this linking quality guideline page that was derived from Patrick Gavin over at Text Link Ads.
Too Many Links at Once Filter: This filter is applied when too many inbound links are acquired by a site too fast. The result can be a ban across all search engines. How these links are obtained, how many there are and over what period of time they appear are all factors for this filter.
How to get around this: Simply do not participate in black hat linking schemes and link spamming and you should never have a problem with this. Here is some information concerning this filter over at Aaron Wall's SEObook.com.
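If you want to watch for unnatural spikes yourself, one simple approach is to bucket new inbound links by the week they were first seen. A sketch, assuming your backlink tool can export first-seen dates (the sample dates are made up):

```python
from collections import Counter
from datetime import date

def weekly_link_velocity(link_dates):
    """Count new inbound links per ISO week so sudden spikes stand out.

    `link_dates` are the dates links were first seen, exported from
    whatever backlink tool you use (hypothetical data below).
    Keys are (iso_year, iso_week) tuples.
    """
    return Counter(tuple(d.isocalendar()[:2]) for d in link_dates)

# Made-up export: two links in week 1 of 2024, one in week 2.
dates = [date(2024, 1, 1), date(2024, 1, 2), date(2024, 1, 9)]
print(weekly_link_velocity(dates))
```

A week whose count dwarfs the surrounding weeks is the kind of spike this filter is said to look for.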
Too Many Pages at Once Filter: Google is keen on natural site development. Anything that looks “unnatural” is going to be flagged by the search engines. Adding too many pages too fast will raise this flag/filter. Some people believe that 5000 pages a month is the max, but in my opinion this number can fluctuate depending on other factors and filters your site might be going through at any given time. The effect of this filter can result in pages being omitted, pages landing in supplemental results and, in the extreme case, a Google ban.
How to get through this filter: If you have a system that pulls content in, or are using a dynamic content generator, be sure to limit it per week, and I would stay under 5000 pages per month just to be on the safe side. Depending on how large or well known your site is, the limit will be adjusted.
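A dynamic content generator can be throttled with a few lines. A minimal sketch; the 5000/month figure is the rule of thumb from this post, not an official Google number:

```python
def pages_to_publish(queued, published_this_month, monthly_cap=5000):
    """Cap how many generated pages go live this month.

    The 5000/month default is the rule-of-thumb from this post, not
    an official Google limit; adjust it for your own site.
    """
    remaining = max(0, monthly_cap - published_this_month)
    return queued[:remaining]

# Hypothetical queue of 100 generated pages, 4980 already live.
queue = [f"/article-{i}" for i in range(100)]
print(len(pages_to_publish(queue, published_this_month=4980)))  # 20
```

Everything past the cap simply waits in the queue until the next month's budget resets.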
Broken Link Filter: Broken internal links can prevent pages from being crawled, cached and indexed. If pages like your home page do not have a link back to them on all pages, this can count against you in the SERPs and against your overall quality score for things like PR. This is not just bad SEO and bad site design; it is bad for your users and can cause poor traffic and poor SERP rankings.
How to get through this: Make sure you have a quality footer and a sitemap that covers all of your pages in one central hub, and make sure you test your site for broken links. (Be sure to use full URLs in your linking via source code.)
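A basic broken-internal-link check can be scripted with the standard library. This sketch parses a page's HTML and flags internal hrefs that don't match the set of pages your site actually serves; the sample HTML and page set are made up:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def broken_internal_links(html, known_pages):
    """Return internal hrefs that don't resolve to a known page.

    `known_pages` is the set of paths your site actually serves;
    a full checker would issue HTTP requests instead.
    """
    parser = LinkExtractor()
    parser.feed(html)
    return [h for h in parser.links
            if h.startswith("/") and h not in known_pages]

# Hypothetical page with one good link and one dead one.
html = '<a href="/home">Home</a> <a href="/old-page">Old</a>'
print(broken_internal_links(html, {"/home", "/sitemap"}))  # ['/old-page']
```

In practice you would feed every page of the site through this and fix or redirect whatever comes back.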
Page Load Time Filter: The page load filter is very simple. If your website takes too long to load, then a spider will time out and move past your site or page. This will result in the page NEVER being cached and indexed. Ultimately this means your site or page will not be present in Google's SERPs.
How to work with this: Make sure your pages are optimized for load time. If you are using Flash or many images, use JavaScript pre-loading. Limit the file size of your pages as much as possible so the spiders can read the entire document, and be sure to use Web 2.0 and CSS best practices.
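You can time your own pages to see whether they are anywhere near a spider's timeout. A sketch: the fetch function is injected so the timing logic is easy to run without a network, and `fake_fetch` is a stand-in for a real HTTP request (e.g. a wrapper around `urllib.request.urlopen`):

```python
import time

def timed_fetch(fetch, url, timeout=10.0):
    """Time how long a page takes to fetch; flag pages over budget.

    `fetch` is any callable that retrieves the URL. The timeout
    budget here is an arbitrary example, not a known spider limit.
    """
    start = time.perf_counter()
    fetch(url)
    elapsed = time.perf_counter() - start
    return elapsed, elapsed > timeout

def fake_fetch(url):
    """Stand-in for a real HTTP request, for demonstration."""
    time.sleep(0.05)
    return "<html></html>"

elapsed, too_slow = timed_fetch(fake_fetch, "http://example.com",
                                timeout=1.0)
print(too_slow)  # False: well under the 1-second budget
```

Swap `fake_fetch` for a real HTTP call and run it across your slowest pages to find the ones worth optimizing first.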
Over-Optimization Filter: Over-optimization can cause a Google ban or hardship in the rankings. Over-optimization could be considered keyword stuffing, too much keyword density and keyword proximity optimization, meta tag stuffing, etc. Stay away from over-optimization.
How to get around this: Don't over-optimize!
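Keyword density is easy to measure yourself before a filter does. A crude sketch; whatever threshold Google actually uses, if any, is unknown, so treat the output as a relative signal only:

```python
import re

def keyword_density(text, keyword):
    """Percent of words in `text` equal to `keyword`.

    A crude way to spot pages stuffed with one term; the "safe"
    threshold is unknown, so compare pages against each other.
    """
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Deliberately stuffed sample copy: 5 of 8 words are the keyword.
text = "widgets widgets buy widgets cheap widgets widgets now"
print(keyword_density(text, "widgets"))  # 62.5
```

A page where one term accounts for a huge share of the copy reads as stuffed to users long before it trips any filter.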
There are some filters I have not mentioned, but I thought I would give a smaller list of other filters that could be attributed to Google:
Keyword Stuffing Filter:
Meta Tag Stuffing Filter:
Automated Google Query Filter:
IP Class Filter:
Google Toolbar Filter:
Click through Filter in serps:
Google -950 Filter:
I would like to hear what other SEO professionals have to say about my list, as I have seen a lot of these myself and have heard others speak of them. Since Google has not come out and told all of us which filters exist and which do not, please consider this an exercise in knowledge expansion for all.