Archive for January, 2008

Search Engine Penalties and Filters – 2008 Guide

January 24th, 2008 by Joe Griffin

There are two words that are guaranteed to strike fear into the heart of any website proprietor: penalties and filters. They damage rankings and ultimately may result in banishment from the search engines. However, before panic sets in, let us guide you through the basics of penalties and filters.

A penalty is caused by significant violations of a search engine’s website guidelines, such as:

  1. Cloaking (showing one version of a site to search engines and another version to human visitors).

  2. Hidden text (text not easily read by search engines that can be used to inflate a website’s keywords).
  3. Linking out to “bad neighborhoods” (e.g. pills, porn or casino sites).
  4. Consistent and abusive negative link building.

Penalties can be issued after a person reviews a website or after it has been crawled and processed by search engines. They result in a website being heavily held back in the rankings, or removed from the search engine’s index entirely. A penalty is called a ban when a website is completely removed from a search engine.

To remove a penalty, a site needs to first correct the problem that caused it, then contact the search engine and request reinclusion. It is important to note that penalties are not common for business websites, particularly in Google.

  1. In Google, this is done through a Webmaster Tools account, which can be set up for free.

  2. MSN has just created a Webmaster Tools system like Google’s.
  3. Yahoo does not have a defined reinclusion process.
A filter is caused by passing a search engine’s threshold setting for one or more optimization/link building elements, such as:

  1. Too many keyword mentions in a page’s body content (over-optimization).

  2. Too much keyword blurring between a site’s pages.
  3. Too many links with the same anchor text.
  4. Too much keyword-rich internal linking (can cause the 950 filter).
  5. Too much link building in a short period of time.
  6. Too much link building using the same anchor text.
  7. Link building in bad neighborhoods.
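To make the idea of a threshold concrete, here is a minimal sketch (in Python) of the kind of keyword-density check the first item describes. The 5% cutoff is an invented illustration; search engines do not publish their actual thresholds.

```python
def keyword_density(text, keyword):
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Hypothetical threshold -- real engines keep theirs secret.
OVER_OPTIMIZATION_THRESHOLD = 0.05  # 5% of body copy

body = "cheap widgets cheap widgets buy cheap widgets today"
density = keyword_density(body, "cheap")
if density > OVER_OPTIMIZATION_THRESHOLD:
    print(f"Keyword density {density:.0%} may trip an over-optimization filter")
```

The same pattern (measure an element, compare against a cutoff) applies to the link-based triggers in the list, e.g. the share of inbound links using identical anchor text.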

Filters are common. They are issued automatically after the site is crawled and processed by search engines and result in a site being held back in the rankings.

  1. Filters can be keyword-based or site-based.

  2. Filters can be mild (held back a few positions) or heavy (held back hundreds of positions).
  3. Filters can have a time element (like the normal Google Sandbox process, where a site is initially held back many positions and over time gets held back less and less until eventually it ranks near its allinanchor rankings).

In general, to remove a filter a site needs to first correct the problem that caused it, then wait for the search engine(s) to crawl the site again and find the corrections. The next time the search engine updates its rankings with the corrected data, the filter will be lifted automatically.

  1. In some cases, like the Google Sandbox, you can simply outwait a filter. Websites commonly spend anywhere between a few months and a year or so in the Google Sandbox. The time websites spend in the Sandbox has significantly decreased over the last few years.

  2. In some cases, you can remove a filter on a site by doing things that search engines like (such as getting quality links to the site from other respected websites in the same sector) to outweigh the things about the site or optimization they don’t like.

Filters are common, especially in Google.

  1. The Google Sandbox is technically a filter. Google closely examines new websites for over-optimization to try to minimize spammy websites filling its SERPs. As a result, newer websites often trip filters when they start an optimization campaign using traditional SEO (lots of keywords on the page, keyword-heavy titles and descriptions, keyword-heavy anchor text in incoming links). This pattern of new websites getting filtered and eventually getting released from the filter is called the Google Sandbox. Websites can speed up their release from the Sandbox by getting quality inbound links and not going overboard with on-site optimization.

  2. New websites are not the only targets. Older websites can also trip filters when they go overboard with over-optimization.

Being aware of how search engines assess penalties and filters is essential to avoiding them. Do not try to cheat the system, over-optimize or trick search engines. They are savvy to these tactics and punish those who attempt to take advantage of them. What may help in the short term will only end up hurting in the long term.

Being competitive in the SERPs is important, and sitting idly is not a good strategy. If you must build links or optimize your website, try to stick to the guidelines and use common sense.

Submitawebsite Launches SEOJumpstart

January 11th, 2008 by Adrienne Embery-Good

I chat daily with people visiting our website. Whether they currently have a Search Engine Optimization strategy or want to begin one, the first question is always the same: Cost.

Cost is at the forefront of many decisions we make in life from the groceries we buy for our dinner to the advertisements we buy to drive traffic. So it is no surprise that the innovators at Submitawebsite put themselves in your shoes. We came up with a streamlined program designed to guide you through the crucial, fundamental steps of SEO at a price that fits your budget.

Foundation and structure are essential. One supports while the other gives shape. If you think of building your search strategy like building a house, then SEOJumpstart is your concrete slab and frame. It lays the groundwork for your On-Page and Off-Page link building efforts. It also assembles your framework by building and optimizing your website’s content. Keywords, links, directory registration, BlogReviews and Press Releases are the building materials.

If you are launching your SEO efforts, SEOJumpstart is a blueprint containing key building blocks. If you have invested in SEO before, but have been unsuccessful, SEOJumpstart will rebuild your strategy with fundamental optimization activities.

So, if you haven’t guessed it by now, I used to work in the homebuilding industry. That may seem like a far cry from internet marketing, but building SEO is no different. Every project needs to be rooted in substance or else it is doomed to come crashing down. Submitawebsite has bundled the substance you need for successful SEO in an affordable package. And it starts at a flat $1,500.

Analytics 101 – Interpreting Rankings

January 2nd, 2008 by Adrienne Embery-Good

Ranking #1 on Google may seem like the proverbial Holy Grail, but it is only half the battle. Stellar rankings are irrelevant if they do not provide ROI. The goal is to identify which keywords not only rank well, but also drive quality traffic. This can only be accomplished by applying analytics to your rankings data.

Now we understand the word “analytics” can conjure up visions of pie charts, statistical labyrinths and dizzying mathematical patterns, but have no fear. We are here to supply a crash course in interpreting analytics.

First of all, there are a variety of analytics programs available that provide similar, or almost identical, data sets (Google Analytics is a popular choice). The most notable differentiation is the amount of time spent making each program’s buttons and chart colors more eye-popping than the next guy’s. These tools, in combination with tracking keyword ranking improvements, provide valuable insight into the effectiveness of hours and hours of hard work.

Essentially, there are three levels of analysis, each providing more detailed information. The goal is to identify which keywords are money makers and which are all bark and no bite. For example, there may be a term with mediocre rankings that converts more traffic, while a high-performing term drives traffic to the site but falls short on conversion. If a keyword ranks on the first page of search engines but does not lead to converted traffic, optimization efforts might be better spent elsewhere.

Level 1:
Begin the process by identifying well-ranking terms and cross-reference them with your analytics data. The analytics tool will show valuable information about these keywords such as average time on the site, percentage of new visitors, number of pages visited and bounce rate. See below for definitions:
Visits = the number of visitors a keyword has driven to the website
Pages per Visit = the number of pages viewed during each visit
Average Time on Site = the minutes a visitor spends navigating the website
Percentage of New Visits = the percent of visitors viewing the site for the first time
Bounce Rate = the percentage of visitors that leave the site from the landing page they arrived on, without visiting other pages
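To make these definitions concrete, here is a small sketch (Python) that derives the same metrics from raw visit data. The per-visit record format is invented for illustration; real analytics tools report these figures directly.

```python
# Each visit record: (keyword, pages_viewed, minutes_on_site, is_new_visitor)
# -- a made-up format standing in for an analytics export.
visits = [
    ("winter boots", 4, 3.5, True),
    ("winter boots", 1, 0.2, True),
    ("boots", 2, 1.0, False),
]

def keyword_metrics(visits, keyword):
    """Compute the five metrics defined above for one keyword."""
    rows = [v for v in visits if v[0] == keyword]
    n = len(rows)
    return {
        "visits": n,
        "pages_per_visit": sum(r[1] for r in rows) / n,
        "avg_time_on_site": sum(r[2] for r in rows) / n,
        "pct_new_visits": 100 * sum(r[3] for r in rows) / n,
        # A "bounce" is a visit that never left the landing page.
        "bounce_rate": 100 * sum(1 for r in rows if r[1] == 1) / n,
    }

print(keyword_metrics(visits, "winter boots"))
```

With the sample data, “winter boots” shows two visits, 2.5 pages per visit and a 50% bounce rate, since one of its two visitors never left the landing page.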

Level 2:
The next step is determining whether these keywords are sending quality traffic by cross-referencing them with the analytics, which will identify the behavior associated with them. As a rule of thumb, positive feedback will show the keyword averaging two or more pages viewed per visit and a bounce rate of 50% or less.

This research shows which keywords result in quality traffic, i.e. traffic that is likely to purchase or fulfill whatever task it is you are trying to achieve. It may also highlight keywords that perform well in spite of poor rankings in the SERPs; when keywords like this are discovered, it presents an opportunity for refining optimization. When it is revealed that big first-page terms are not converting a site’s traffic (because they have bounce rates that look like the 2007 New England Patriots’ win percentage), pull some resources (i.e. links) away from them and place the emphasis on the second- and third-page terms that send just as much converting traffic with only a fraction of the visitors.

Level 3:
The highest level of analysis is accomplished with Goal/Conversion Tracking. The process isolates the keywords that result in completed transactions. For example, most websites have forms that need to be completed and submitted to either complete a purchase or receive information (websites with shopping carts or request a quote functions are good examples). After a form is completed, a “Thank you for….” page pops up. Keywords that lead to “Thank you for…..” pages denote a purchase or submission. Focus search engine optimization efforts on these keywords to further increase profits.

There are other helpful areas above and beyond the three basic levels. These include Funnels and the Overview or Dashboard view of the website’s traffic.

Funnels:
This strategy is used with Goal/Conversion Tracking and focuses on the pages within a website’s submission flow. Let’s say a registration or purchase form spans five pages. Sometimes visitors will abandon the form in the midst of completion, and usually there is a pattern. Tracking can be set up to include all pages before conversion so the point of abandonment can be determined. This insightful information can be used to filter out any offensive, personal or invasive questions and encourage completion of the purchase or submission.
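Finding the point of abandonment boils down to locating the biggest drop in visitor counts between consecutive funnel steps. A sketch, with invented numbers for a five-page form:

```python
# Hypothetical visitor counts at each step of a five-page form.
funnel = [("step1", 1000), ("step2", 800), ("step3", 780),
          ("step4", 300), ("step5", 290)]

# Visitors lost between each step and the next.
drops = [
    (funnel[i][0], funnel[i][1] - funnel[i + 1][1])
    for i in range(len(funnel) - 1)
]
worst_step, lost = max(drops, key=lambda d: d[1])
print(f"Most visitors abandon after {worst_step} ({lost} lost)")
```

Here step 3 loses 480 of its 780 visitors, so that page (perhaps the one asking the invasive questions) is where to look first.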

Overview or Dashboard view of the website’s traffic:
This timeline highlights spikes in traffic and reveals popular seasons, which often present good opportunities for focused SEO efforts. If a business sells winter boots, it will likely see a significant spike in traffic in December. Therefore, December would be a good target for increased marketing campaigns.
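Spotting those seasonal spikes programmatically is just a comparison against the average. The monthly counts and the 50%-above-average cutoff below are both invented for illustration:

```python
# Invented monthly visit counts; a "spike" here is any month more than
# 50% above the average -- the threshold is an assumption.
monthly_visits = {"Jan": 900, "Feb": 850, "Nov": 1400, "Dec": 2600}

average = sum(monthly_visits.values()) / len(monthly_visits)
spikes = [month for month, v in monthly_visits.items() if v > 1.5 * average]
print(spikes)
```

With these numbers only December clears the bar, matching the winter-boots example above.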

In a nutshell, converting traffic to profit is what makes an investment in search engine optimization successful. Traffic comes from the keywords associated with the product or website, and analytics highlight the patterns and relationships between keywords and buying behavior. Take rankings one step further and turn those first-page placements into first-rate ROI.