Archive for September, 2009

Domain Trust

September 28th, 2009 by Patrick Hare

Domain Trust can be described as a search engine’s “secret credit score” for your website. Just like a real credit score, it involves a variety of factors, and those factors are subject to change based on market circumstances. Another similarity is that a complex algorithm determines your credit score, in much the same way that a domain trust algorithm helps determine where you’ll find yourself in the search engine listings. It should be noted that there is a difference between “domain trust,” which has several factors we’ll discuss below, and “page trust,” which has to do with how a site’s content presents itself.

Domain trust isn’t the only factor in good rankings. A trusted site is no good for search engines if it is hard to read, classify, or compare to similar websites. Likewise, a perfect site with few inbound links is going to have a hard time getting found for any terms that make money for webmasters. Still, good domain trust goes a long way toward getting search engine rankings quickly, and keeping them over the long term, so it is important for anyone interested in SEO to understand the factors involved.

Generally, domain trust is based around the factors below:

  • Domain age (physical and cached) – A site doesn’t have to be very old to get good search engine rankings, but we have found that older sites tend to get better results faster when they make changes. This could be because spammers tend to acquire lots of brand new domains and try to rank them, so a preference for established sites will cut out quite a few bad eggs. This does not appear to be as big a factor as it was in the past, but it is worth noting. To get an idea of when the site may have been cached by search engines, Archive.org can show the earliest date for content on your site. Note that a domain that expired and was purchased again should be considered to have started on the domain’s registration date in WHOIS.
  • Domain expiration date – If the domain is less than a year old, and expires on its first birthday, it may be questionable. Note that registering a domain for 5 years does not guarantee better rankings, but there is some thought in the SEO industry that short registration times indicate that you don’t think your website is worth $7/year to maintain, so the search engines may not value it that much either. Since a lot of spammers and domain tasters go with the absolute minimum registration period, there is some merit to the idea that a 2-year registration sets you apart from the bad guys.
  • Links to Domain – The quality and quantity of inbound links to your website is probably a bigger topic of discussion in the SEO world than the quality of your website itself. Links coming to your site should come from trustworthy sites, and they should point to your homepage as well as to interior pages. They should come from a diverse set of sites, but ideally from relevant, diverse sites, because the value of the referring site matters. If the New York Times gives your site one link, you may get more trust from it than from 1,000 links from unknown bloggers. The old adage “lie down with dogs, wake up with fleas” applies to linking as well.
  • Links from Domain – Where do your links go? Remember that links are endorsements, and they say a lot about the endorser. Getting back to the “credit score” analogy, linking to questionable sites is as smart as co-signing a credit card application for everyone at the local halfway house. If you link to adult, online pharmacy, and gambling sites, then the search engine is going to associate your site with them and their level of trustworthiness. While some sites in this field may be trustworthy from a search engine standpoint, you are unlikely to get good rankings in the “family friendly” results. In the same way, you can lose trust by participating in link exchanges, web rings, and other link swapping ventures, even if all the sites are PG-rated. When choosing to add a link to another site, you would do well to check its PageRank, content quality, and its other link partners.
  • Interior (on-site) links – A good navigation structure on your site will help pass link authority from your interior pages back to the home page and vice versa. In this case, using good Page Trust tactics will improve domain trust at the same time.
  • Who Owns the Site – Search engines don’t discriminate on a name-by-name basis, but they do take note if you own a lot of other domain names that have already been penalized. Many of the bigger spammers may own hundreds or thousands of other sites, and they follow the same pattern with all of them. It may be in a search engine’s best interest to pull domain trust from all sites owned by a single party, since it is a fast way to show relevant results via the trusted sites that remain. If you’ve only got a few sites, and none of them are banned, you probably don’t have to worry. There is a school of thought that you may have a hard time ranking different sites with nearly identical topics, so it is best to pick a “favorite” site that you want to rank.
  • Other sites on IP address – There is a certain level of debate about this factor. If a high number of sites on a certain IP range (or “C block”) are very similar or have a bad history, then the search engines would have a reason to look at all of them with some suspicion. Unfortunately, sites in some fields, like real estate, might all go through a half dozen template builders who can integrate MLS functionality, so they all end up in the same batch of IP addresses. Some of these agencies have run into trouble for SEO violations, and sometimes a whole set of sites can get banned for linking to each other. If you suspect this is a problem, buying a dedicated IP address for your site usually only costs a few dollars per month.
  • Past History – You may have bought a domain name that is already in the doghouse. Google Webmaster Tools can usually tell you if this is the case. If you really want to check for big problems before purchasing a domain, you may even want to ask to get added (by way of your own metatag) to the site’s Webmaster Tools profile. If there is a message saying that your site is not part of Google’s results, then you may have problems. If you already own a domain with this kind of message, then you can file a reinclusion request, and you should explain that you just bought the domain. Once again, there is some speculation that search engines like Google keep a permanent history on domain penalties, and it may affect future rankings.

As with everything else in the field of Search Engine Optimization, some of these factors are subject to debate. Many people in SEO will tell you that domain age is not important anymore, or that a site’s future expiration date is no longer a factor. Nonetheless, if you are assuming that the site is going to be around for five years, you might as well lock in the registration for that time period. Every once in a while, companies lose their domain names because they didn’t renew them, at which point their SEO project becomes moot. In cases like these, checking the domain expiration date would have been a great piece of advice, even if only from the perspective of maintaining lifetime customer value.

What can you do to improve domain trust? First, make sure you get links from trustworthy sources. Make sure your site navigation helps search engines pass link authority to the top pages on the site, and back to the homepage. Clean up any questionable past linking practices. To show that you’re part of the internet, you can link out to a few trusted pages on sites like Wikipedia or a good source of online information. If you sell anything, ask your vendors and wholesalers to link to you. You can also work to become an authority in your field by making the website a source for information that people want to read, or by offering tools that people keep coming back to use. Building trust takes a bit of time, but it also pays off in the form of better search engine positions. Normally, when you improve your domain trust you are also adding usability and features that make your site a better place to visit, and this in turn leads to owning the profitable website that you originally intended to have.

Note: Web.com Search Agency has a Domain Trust Tool that can show you how well your site stacks up to top sites in Google. It also gives you a rough indicator of what you’ll need in order to get the same results.

The Golden Triangle

September 25th, 2009 by Patrick Hare

Eye Tracking Reveals the Importance of Top Search Engine Positions

The field of search engine optimization has lots of interesting jargon, but a concept that should be crucial to any website owner is known as the “Golden Triangle.” Universities and private organizations have conducted eye-tracking studies, where they have test subjects go to a search engine and look up a phrase. Through “heat mapping” or measuring the most popular points on a page that people look at, a pattern emerges.

The pattern resembles a right triangle, and indicates that people pay very close attention to the top results on a search results page. Studies have shown that around 100% of people see the top 3 results, but only 20% of people get to the last result on the page. In the world of search engine optimization, the importance is obvious, since a top ranking on the first page gets 5 times the visual exposure of the bottom listing. Thus, the term “Golden Triangle” reflects not just the heat map’s coloring, but also the potential profit margin for a company that holds such a coveted spot.

Getting a top 10 ranking in a search engine like Google can be incredibly difficult, and getting the top spot is even more of a challenge. However, the investment can be worth the money if you’re going for competitive terms. There aren’t many other advertising media where a little extra effort can quintuple your exposure, so if you have a website that is doing a good job in search engines but needs that extra push, hiring an SEO consulting firm may give you the leverage you need to claim the Golden Triangle positions with the most exposure.

Should You Buy Sponsored Links?

September 23rd, 2009 by Patrick Hare

Special Note: Buying the wrong kinds of sponsored links can have serious SEO consequences. In some cases, the links will have no value whatsoever and are a waste of money, but in other situations you could damage or eliminate your search engine rankings if you engage in obvious link buying tactics. Even if you’re just buying links for traffic, you could be hurting yourself in the search engines. When in doubt, consult with an SEO professional.

Aside from the content and composition of your website, the number and type of inbound links to your website play a huge role in search engine rankings. Google even patented its PageRank algorithm, which clearly indicates the importance of links to your website from other sites on the internet. Originally, the quantity of inbound links had a greater impact on rankings, but now the trustworthiness of every website that links to you plays a greater role in your search engine rankings. Almost all of the top sites on search engine rankings got there by way of good inbound links and good content, in that order.

Unfortunately, inbound links aren’t easy to get. It can be difficult to convince anyone on the internet to give you a free link, especially since experienced webmasters have been approached with multiple linking schemes from people who offered worthless directory links in exchange for good ones. You can get a certain quantity of good links from vendors and customers, and you can build a presence by commenting on relevant forums, writing a blog, or doing something newsworthy. For the average site owner, this can be quite time consuming, and a lack of link building experience can lead to some rookie mistakes that significantly reduce the value of the acquired links.

People in the SEO field also buy links because they work. In fact, an optimization company that didn’t recommend link acquisition to a link-poor client would probably be considered negligent. Despite Google’s warnings about buying links, many of which are valid, it is difficult to get ahead without venturing into the gray market of link buying. Most of the sites in the top 20 search results for big terms, outside of Wikipedia and Google, have used one or more versions of link buying to get ahead. This could include reciprocal links, text link ads, sponsored postings on blogs, sponsored postings on forums, and even directory submissions. For instance, you have to “buy” a link on the Yahoo directory, even though you are technically paying for a site review. Other directories, including Business.com and Best of the Web (BOTW), will also check out your site to make sure it is legitimate before posting your link. Even though you are technically paying for a link, there is an editorial component in the process, and if your site is no good then it will get rejected. References from human-edited directories help search engines determine which sites are trustworthy, so paying for a site review and link is worth the money.

There are ways to acquire backlinks in a search engine friendly manner, and there are sites that can help you get those links. Web.com Search Agency is one of them. We have an experienced team of link builders who can build your online presence by getting relevant links from sites in your field. We can make sure that you are listed in relevant directories, and that you get references from other important websites. This is a bit different than the classic “link buying” model where text link brokers would place your link on just about any site that had space open. The old model is usually easier to detect by search engines, but it can still be effective in some cases. As with any SEO practice, link buying depends on your comfort level, since some clients are more willing to push the envelope in link acquisition.

As a final note, you should keep in mind that links are just one of several dozen factors in proper search engine optimization. Normally, a link building project is accompanied by extensive keyword research, site design changes, tag and content optimization, and navigational changes designed to help search engines read all of the important pages on your website and classify them in the best manner possible. Over time, many sites will start to get unsolicited external links and will be able to move away from sponsored links completely. Sites with interesting features, tools, and updated attractions will also find that they get a steady progression of natural links. Therefore, your long term goal should be to create a site that keeps people coming back, but in the meantime you may want to consider buying enough links to reach the critical mass necessary to get the ball rolling.

When to Leave “Bad SEO” in Place

September 23rd, 2009 by Patrick Hare

Sometimes we come across a site that gets great rankings in spite of itself. It might have a terrible title tag, poor content, a lousy navigational structure, or abysmally formatted URLs. Nonetheless, the site is holding top spots on Google and other engines for some very competitive terms, and holds its own against sites that obviously put a lot of time into building a solid SEO profile.

What do you change on a site that breaks the rules and still succeeds? Obviously, you can change what isn’t working. SEO consulting projects for high ranking sites with issues should be done a lot more cautiously, especially when the owner is making a viable income off the current site traffic. You might choose to bypass the homepage and work on interior pages first, in order to boost rankings on the pages that aren’t getting anywhere. If interior pages are copying the homepage title, then they probably aren’t ranking as well as they could be. You can leave the awful homepage title in place for now and fix what you know is broken.

Can you go wrong by adding content to lean pages? Sometimes. Your added content may be cannibalizing keywords from another page that previously had solid rankings. Running a ranking report that shows which URLs rank for which keywords can help you focus your topics and keyword choices in a way that prevents a drop in site rankings.

Similarly, URL structures are sometimes better left alone, assuming that they’re getting indexed. Instituting a server-level rewrite is usually recommended for most sites with multiple products and alphanumerical pages, but it can create a seismic shift in search engine rankings. This shift is not always positive.
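When a rewrite is warranted, it is typically done with something like Apache’s mod_rewrite. The following is a minimal sketch only: it assumes an Apache server, and the `/products/` pattern and `catalog.php` filename are invented for illustration, not taken from any particular site.

```apache
RewriteEngine On

# Serve readable URLs like /products/blue-widget from the old dynamic page
RewriteRule ^products/([a-z0-9-]+)$ /catalog.php?item=$1 [L,QSA]

# 301-redirect the old alphanumeric URL to the new form so that existing
# rankings and link authority have a chance to follow the move
RewriteCond %{QUERY_STRING} ^item=([a-z0-9-]+)$
RewriteRule ^catalog\.php$ /products/%1? [R=301,L]
```

The permanent (301) redirect is the important part: without it, search engines treat the new URLs as brand new pages, which is exactly the kind of shift that can turn out negative.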

What about links? Sometimes we see sites that are engaging in link acquisition schemes that can cause big penalties in Google. For instance, we see sites with thousands of identical anchor texts on hundreds of low value sites, or with sitewide text links. We still recommend removing those links ASAP, since a competitor doing a backlink check might recognize paid links and report them to Google. Naturally, some alternative “search engine friendly” link building should be in place while these links are being removed. One advantage of reducing multiple bad links is that you can eventually recognize the valuable ones that have been powering the site. You may even want to try and rework these inbound links so anchors go from saying “click here” to something a little more topical.

Usually a trip through Google Webmaster Tools will help you identify all the issues that can be worked on in the meantime. If pages aren’t getting indexed, or the site has a lot of dead links, then fixing these issues will help the site maintain rankings, and perhaps rise. Duplicate titles and descriptions should be addressed as quickly as possible. As a general rule, if Google is pointing something out, you should endeavor to fix it, and repairing duplicate descriptions can be as simple as adding a few words in front of a canned description on each page. Duplicate titles are a bigger problem, especially if they match the homepage title, and even a quick edit on these could improve results for all pages.

This is not to say that a site shouldn’t be fixed eventually. High ranking sites with bad SEO require a lot of thought and a clear strategy, but you shouldn’t overthink things or become paranoid. Good client communication is always important, but it is especially valuable in cases like these. By explaining the site’s many SEO faults, and how you plan to fix them, you can adjust your timeline to address poor performing keywords before making changes to riskier sections of the site. You may want to delay implementation on high value pages until a slow time in the buying cycle (like after Christmas) so your client doesn’t get a negative impression of your SEO credentials. This also gives you the opportunity to put the “bad” stuff back in place if your SEO isn’t doing the job. Ideally, the implementation of search engine optimization best practices should lead to even higher rankings on your client’s website, but a cautious approach is always best when a site is ranking for all the wrong reasons.

Get Found in Bing Visual Search and Google Image Search

September 16th, 2009 by Patrick Hare

Bing’s visual search feature was unveiled a few days ago, and the reviews are good. Much like the image search features you can find on Google and Yahoo, the search shows pictures, but it offers more ways of classifying them and lets you start searching from an image catalog.

If you’ve got a website with lots of images, then it is in your best interest to be found in the image search results. Some stores make up to 20% of their sales from image searches. However, if your picture isn’t labeled correctly, it will probably never become part of a search engine’s visual catalog. Google even has an image labeler game that it uses to get a better understanding of what images are about, and as computers get smarter they will be able to identify images by themselves, in the same way newer digital cameras can identify faces and smiles.

Here’s some basic advice for labeling images:

  • Give your pictures a name. Whenever possible, make the filename for the picture into a brief description of the content.
  • Use the right alt tags. Image alt tags should also describe the content of the picture, especially if you’re using a website design program that gives numerical filenames to images. The alt tag should be fairly brief. If you’re selling vintage TV sets, you could label the image “RCA XL100 Color TV Set” and you’d be more likely to get found in image results.
  • Make the image directory readable. Many sites exclude the image directory in the robots.txt file. If your images can’t be seen, they won’t become part of the image search.
  • Write a descriptive caption. Captions get associated with images and may be part of the image search algorithm.
  • Add context on webpages. If your web page content talks about a certain topic, the search engine may associate that topic with the image.
  • Use the “IMG Title” attribute in your source code, and describe the image here as well.
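Put together, the advice above might look something like this on a product page (a hypothetical example; the filename, alt text, and caption are invented for illustration):

```html
<h1>Vintage RCA Television Sets</h1>

<!-- Descriptive filename, brief alt text, and an img title attribute -->
<img src="/images/rca-xl100-color-tv.jpg"
     alt="RCA XL100 Color TV Set"
     title="RCA XL100 Color TV Set, restored 1970s console model" />

<!-- A caption near the image gives the engines one more association -->
<p class="caption">A restored RCA XL100 color television from the 1970s.</p>
```

On the crawling side, check that your robots.txt does not contain a line like `Disallow: /images/`, which would keep these pictures out of the image index entirely.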

Optimizing your images can be a great way to get extra search engine traffic, or just to make the web a better place. Naturally, if you don’t want your images to be part of the search experience, you can always exclude them from the search engine spiders, but most people are trying to attract more customers through every channel possible. If you’ve got an ecommerce website with thousands of pictures, it may be worth the time investment to ensure that every picture is sufficiently descriptive to get itself properly placed in Bing’s Visual Search. As search engines look to serve more people from a visual standpoint, you can give your site a clear advantage with proper image labeling and searchability.

Note: Optimizing images should be one part of a larger search engine optimization project. Web.com Search Agency can help you structure your site for optimum search engine readability, and help you identify aspects of your site that may be preventing it from getting the best possible search engine positions.

How To Lower Your Bounce Rate

September 16th, 2009 by Patrick Hare

Whether you’re using Google Analytics, Clicktracks, Webtrends, or any other analytics program, one of your big considerations from a web design and SEO standpoint should be your bounce rate. Simply defined, “bounces” are visits to your site that don’t go to any other page, and usually this means that the visitor has either closed the window (which is bad) or clicked back to the search results (which is worse).

Search engines have a vested interest in seeing whether a click resulted in a clickback, because this serves as an indicator that the result that was presented may not be sufficiently relevant to the search topic. An engine like Google doesn’t even need to use its analytics program to know that your site bounced, since it already knows that a unique IP address made a search, visited a site, and then came back within a certain amount of time. There is good reason to believe that a high enough bounce rate can get your rankings lowered for that keyword. Luckily, your analytics program can tell you what the bounce rate is on certain pages, and can drill down by traffic source, so you can quickly discover whether your bounces are the result of searches, direct traffic, or referring sites.
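The underlying arithmetic is simple: a page’s or source’s bounce rate is single-page visits divided by total visits. Here is a rough sketch of the drill-down described above (the visit records are made up for illustration, not pulled from any particular analytics export):

```python
from collections import defaultdict

def bounce_rates(visits):
    """Compute bounce rate per traffic source.

    Each visit is a (source, pages_viewed) pair; a visit with
    only one pageview counts as a bounce.
    """
    totals = defaultdict(int)
    bounces = defaultdict(int)
    for source, pages_viewed in visits:
        totals[source] += 1
        if pages_viewed <= 1:
            bounces[source] += 1
    return {s: bounces[s] / totals[s] for s in totals}

# Hypothetical data: (traffic source, pages viewed during the visit)
visits = [
    ("search", 1), ("search", 4), ("search", 1), ("search", 2),
    ("direct", 3), ("direct", 1),
    ("referral", 1), ("referral", 1),
]
print(bounce_rates(visits))
# {'search': 0.5, 'direct': 0.5, 'referral': 1.0}
```

A referral source bouncing 100% of the time, as in this toy data, is exactly the kind of pattern worth investigating before spending more on that traffic.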

Here are some of the top tips for reducing your website visitor bounce rate:

  1. Eliminate bad traffic. This tip is more for money saving purposes than SEO, but it is a very important consideration. You can use analytics to see what PPC keywords are more likely to bounce, and you should probably shut down those keywords until you either improve your site or determine if they are relevant. Likewise, if you’re buying traffic from other sites and it isn’t making a profit for you, then you should cut your losses.
  2. Don’t let them blink. Website marketers talk about a “ten second rule,” an “8 second rule” or a “4 second rule” when they are referring to the time a visitor takes to see whether they want to stay on the site. In the days of dialup, people were more patient when they were waiting for the page to load, but now you have to grab people with a proposition as soon as they land on the site. Likewise, if your page takes a long time to load, you may get bounced before it finishes.
  3. Pretend your webpage is a TV screen. The term “above the fold” is used in online marketing to denote the space on a webpage that is visible before people have to scroll down. If you had to put your site information in the visible space of a TV screen, consider how you would place information in order to keep people interested.
  4. Improve your calls to action. Every page on your site should be a selling page, and your visitors should have clear choices and options that attract clicks. If you’re selling a variety of products, make sure that all the important categories are among the first things visitors can find.
  5. Improve site navigation. Search engines can land visitors on just about any page. If the site navigation (either on the left margin or along the top) contains the keyword that a visitor is looking for, then that person is more likely to click to another page.
  6. Modernize your page design. Web visitors have certain expectations when they visit a site. If you’re using blurry images, multiple large fonts, and a site design that hasn’t been updated in years, people will leave.
  7. Cut back on the animations and videos. The human eye is attracted to movement. Any moving elements on your page should be calling attention to an important message or offer, but at most there should be two moving elements. Ideally, there should be a single animation or video, since it won’t distract people.
  8. Use the right color scheme and fonts. This can vary by industry, but most of the time sites with black backgrounds and dark text are hard to read. Rough looking fonts like Times New Roman are not as easy on the eyes, and small fonts may be hard for some people to read. When in doubt, use Verdana. Also, remember that too many bright colors on a page may draw the eye away from a call to action.
  9. Say thanks. If you have a form on the page, make sure to take visitors to a separate “thank you” page when the form is filled out. This is also a great page to add to the “goals” section in Analytics. A “thank you” page also represents an opportunity to show customers some of the services, features, and benefits that they might have missed.
  10. Encourage them to learn more. You can have a brief product description with a link for more info, or a quick message that draws people to go to a link, or a link to a “how to” video that gets people to a new page.

(As a final technical issue, you may see an artificially high bounce rate if you don’t have analytics installed on every page. This won’t hurt you in the search engines, but it does make it hard to gather accurate enough data to make a decision.)

Search engine factors aside, lowering your bounce rate is a tremendous opportunity to improve conversions and squeeze more opportunities out of your existing traffic. You can even use analytics data to see if some referring sites are sending you bad traffic, or if you are experiencing click fraud. For instance, if a site that runs content match PPC ads is sending you traffic with a 99% bounce rate and an average of one second on the site, you’re probably getting ripped off. Even without worrying about click fraud, improving your user experience is going to get people further into your site, which makes them more likely to bookmark a page, make a phone call, or make a purchase. Ideally, you want any ecommerce website design to follow a retail store model, where longer visit times and a pleasant shopping experience have been shown to increase the likelihood of a purchase. Since the internet puts thousands of similar stores at a customer’s fingertips, you want to make sure people aren’t walking out the door within a few seconds of their initial visit. By making a few positive steps to cut your bounce rate, you will not only improve your search engine position potential, you can also improve your ROI, lifetime customer value, and the amount of money in your pocket.

How To Improve Your Adwords Quality Score

September 15th, 2009 by Patrick Hare

The Google Adwords Quality Score is a bit of an enigma to most people who run their own Pay-Per-Click (PPC) marketing campaigns. This is because there are multiple factors involved, several of which are confidential to Google. The importance of the quality score, however, cannot be overstated, because it determines your position on sponsored results pages, and an ad with a higher quality score can get a top position at a lower cost per click than the ad below it. High volume advertisers who pay better attention to quality score could potentially save hundreds of thousands of dollars per year on the same amount of traffic. Smaller businesses could still save tens of thousands of dollars, and get better online exposure or more clicks in general.

In order to understand the Adwords Quality Score, it is necessary to understand some of the history of Google Adwords, and Pay-Per-Click in general. PPC ads originally had been based on an “auction” method, where the highest bid on a keyword got the top spot. Search engines really didn’t care where the keyword landed, or even if the keyword was relevant to landing page content. Early on, Google Adwords adopted a different method for showing ads, where a higher click through rate (CTR) would result in better ad placement. In a manner of speaking, Google would rather make a nickel ten times than a quarter once. This still didn’t stop people from sending cheap traffic to barely relevant pages. Google prides itself on relevance, so it instituted a quality score which uses various algorithms to ensure that the keyword is relevant to the web page. MSN Adcenter and Yahoo Search Marketing also have followed suit with similar page quality guidelines.

In the course of our everyday PPC management work at Web.com Search Agency, we have compiled a useful set of guidelines for PPC quality management. Here are some hints on how to improve your quality score:

  1. Read Google’s Quality Score Guidelines. Google updates this list regularly, and you may find a new quality improvement factor that is affecting your keywords.
  2. Improve Your Landing Page and Content. If you have a batch of keywords landing on a particular page, make sure that most variations of the keyword are visible on the page. This does not mean that you should stuff every variation into the content, but the highest volume words should be represented. Ideally, having the main keyword in the page title and an H1 tag would be a good idea as well.
  3. Build a history. Quality score is based on data related to your account, your ads, and your keywords. New accounts are going to experience a lag in exposure, which sometimes has to be overcome by high bidding. It can take a few weeks for keywords and ads to settle in their proper positions.
  4. Make better ads. Click through rate improves ad position. You want to create ads that are relevant to the products or services on the landing pages. You may want to use dynamic keyword insertion, which has been shown to increase clickthrough and conversion rates when used properly.
  5. Don’t delete or edit old ads, pause them! Any time an ad is edited, its quality score resets. If you’re creating new ads, it is best to let the old ones run while the new ones build a history, so you don’t experience a sharp drop in traffic. If you’re using the Adwords Editor to make wholesale changes, you can experience a huge disruption if you aren’t feathering in new ads. Note that your keyword destination URL can be different than your ad’s destination, and a similar effect may occur if these destinations all change at the same time.
  6. Create specialized landing pages. One of the advantages of PPC is that you can land people on highly specialized pages that regular users will never see. Every keyword could potentially have its own landing page, but most people group pages by a set of keywords. You have the potential to adjust your quality score upward and test out new page elements at the same time, and you can prevent “duplicate content” SEO issues by placing your ads in a folder not visible to normal search engine spiders.
  7. Get rid of poor performers. If you have keywords that aren’t relevant enough, you should take them out of your adgroup. Generally speaking, smaller adgroups with tight sets of focused keywords tend to perform better. This also allows you to customize your ads for better keyword topic matching.
  8. Throw your landing page into the Google Keyword Tool. There is a feature on this tool that lets Google spider a page, and you can see what types of keywords Google thinks are relevant to the page. This tool will also show you potential broad match terms that might land people on your page, so you can use negative matching to filter out bad matches.
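Tip 2 above, keeping your target keywords visible in the title, H1, and body text of a landing page, can be spot-checked with a short script. The sketch below is a hypothetical illustration that uses only Python's standard library; the `LandingPageAudit` class and `keyword_coverage` function are names invented for this example, not part of any Google tool.

```python
from html.parser import HTMLParser


class LandingPageAudit(HTMLParser):
    """Collects the page title, H1 text, and all visible text from raw HTML."""

    def __init__(self):
        super().__init__()
        self._stack = []       # open tags, innermost last
        self.title = ""
        self.h1 = ""
        self.all_text = []

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if "title" in self._stack:
            self.title += data
        if "h1" in self._stack:
            self.h1 += data
        self.all_text.append(data)


def keyword_coverage(html, keywords):
    """For each keyword, report whether it appears in the title, an H1, or anywhere on the page."""
    parser = LandingPageAudit()
    parser.feed(html)
    page_text = " ".join(parser.all_text).lower()
    report = {}
    for kw in keywords:
        k = kw.lower()
        report[kw] = {
            "in_title": k in parser.title.lower(),
            "in_h1": k in parser.h1.lower(),
            "on_page": k in page_text,
        }
    return report
```

Running this against a landing page's HTML for the ad group's highest-volume keywords gives a quick checklist of which terms are missing from the title or H1.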

A final potential dividend that comes with improving your Quality Score is a higher conversion rate. Through its scoring system, Google is ensuring that the content on your site matches the keywords on your landing page. If you make your landing page keywords more prominent, people are more likely to look around the site and follow the sales process to its conclusion. In a manner of speaking, your PPC ad makes a promise, and your on-page content keeps it. Therefore, the work you do to improve your quality score saves you money, improves the prominence of your website, and earns a higher profit when people arrive at your website. Investing in a better quality score simultaneously serves as a cost-cutting and profit-increasing initiative, and should be a major part of any company’s continuous improvement process.

Cheap Market Research – Spy on Your Online Competition

September 14th, 2009 by Patrick Hare

Every year companies spend millions of dollars on market research, focus groups, surveys, and special studies to find out how to better serve their customers and gain market share. If you’re working in the online sector and are in a David vs. Goliath position, you can turn your competitors’ formidable research to your own advantage. The internet is a vast source of publicly available information, and several services aggregate valuable data from various places and make it available for free or at a low cost. Since much of this information applies to your competition, you can effectively run a sophisticated intelligence operation on your adversaries without resorting to expensive or dangerous cloak-and-dagger tactics.

Piggybacking on your competition’s work isn’t exactly new. One of the best known stories in marketing involves Burger King’s strategy for finding customers, which usually involved opening a store up right next to a McDonald’s franchise. McDonald’s spent a lot of time and money figuring out how much traffic was in an area, checking out various locations, and paying people to do all kinds of research which would decide if a certain spot was good for a store. Burger King apparently trusted the McDonald’s approach as well, and succeeded in drawing away business as a result. Another well known success story involves Compaq’s reverse engineering of IBM’s PC architecture, which opened the door for the proliferation of cheap “IBM Compatible” personal computers (now just called PCs) and the success of Microsoft.

Whether you’re operating out of a garage or your own office building downtown, you can still glean a lot of actionable intelligence from online sources. Some of this information can be obtained for free, while other data may be available through a subscription that costs a few hundred dollars per year.

To find out how much traffic your website is getting compared to that of your competitors, Compete.com is a great source of information. The free version lets you compare your site traffic over time to data for 2 competitors, but you can open this up to 5 with a paid subscription. The advantage of Compete.com is that you can see if your competition is more popular (or spending more money) in one season versus another. For example, Godaddy.com spikes in February because it buys Super Bowl commercials at this time. Another online tool, Spyfu.com, lets you measure sampled Pay-Per-Click (PPC) spending by your competitors, and even allows you to export a list of the keywords that your competition is being found for. If you know your enemy’s most important keywords, you can get a pretty good glimpse into their current marketing strategy. This allows you to either add these terms into your own campaign, or carve out a niche where you aren’t going to be outbid by someone with deeper pockets.

One of the easiest ways of getting market intelligence on another website is to closely look at the site itself. Many of the top sites online have gone out of their way to make the shopping process as easy as possible, and get people to the checkout page with the smallest amount of trouble. If you look at a few dozen websites and take notes about what you like, you’ve probably reaped the benefits of several thousand hours of research and testing. You can improve your own site further by learning from sites like MarketingExperiments.com, which regularly runs case studies and webinars that test actual conversion results from different site types. You can then study the different tactics used by your adversaries to see which ones are the most popular for your field of business.

Want to find out why another site has better search engine rankings than yours? Web.com Search Agency has a variety of tools that can point you in the right direction. Search Engine Optimization (SEO) is the practice of getting higher positions for websites by enhancing the factors that search engines want to see. The two biggest factors are the layout of the website and the external links pointing at it. First, a search engine has to be able to classify and categorize your site so it will match up with whatever keyword searches you want to be found for. As a hint, these are usually the very same keywords you are buying in PPC. Second, you have to have external links pointing at your site, which qualify as virtual endorsements. Our SEO Competition Report shows you how the top 10 sites in Google for a particular keyword stack up when it comes to links and several other factors. If you want to find out where a competitor is getting links, a Yahoo (not Google) search on the site using the command “linkdomain:” will show you most of the other party’s links. For example, if you’re starting a search engine and want to beat Google in its own rankings, you can go to Yahoo and do the search “linkdomain:google.com” and find out that you’re up against 1.4 billion links.

You can even monitor another company’s reputation and offline marketing success with a few well placed searches. By searching on a company name with the term “commercial” you can see comments by people about whether or not they liked a TV advertising spot. You can use Google Trends for larger companies to see how often people are searching on particular company keywords. This tool is also great because it lets you see year-over-year results for certain searches, allowing you to identify seasonal sweet spots for some keywords. The Google Keyword Tool is another great way of finding out what people think of another company. By doing a search on the company name, you can get the last month’s data on what words people associate with it, and you can even find out how many people use a certain word or phrase when the company’s name comes up.

The advent of the World Wide Web has made it possible to learn things about competitive strategies which previously had been known only to a handful of people. On top of that, the learning curve for finding out this type of information is fairly short. Within the space of a few searches, you can get a workable profile of just about any company’s market position and its relative success in the online world. By assembling pieces of information from different sources, you can find out which strategies you can emulate, and what approaches would be most profitable if they were applied to your own business model. You might also find gaps in your own marketing strategy and learn ways to shore up your own practices in order to make them more profitable. Finally, you may learn how much information about your own site is available on the internet, which will encourage you to create a profile that impresses the people who just might be spying on you!

You Don’t Need a Website To Be A Blogger

September 11th, 2009 by Patrick Hare

(But a blog can do good things for your website.)

Anyone who can type can become a blogger. There are free sites (like Blogger.com) that make it possible to set up your own blog and host it without having to buy a domain name or pay for a hosting account. In fact, even homeless people use blogs to improve their situations and communicate with the outside world. If you do have a website of your own, adding a blog has definite advantages in the eyes of the search engines, and you can leverage your blog into a traffic generator, customer service portal, and claim to fame, all at the same time.

For example, the aforementioned Blogger.com is owned by Google, and allows people to post updates onto a Blogger profile or directly onto their own sites. Every time a new post is published, Blogger will FTP (file transfer) the new posting onto your own hosting account, in a folder that is separate from the nuts and bolts of your website. You can alter the template of your blog so it has the same look and feel as the rest of your site, and only a savvy reader will know that you’re using Blogger. We use it for our own blog, it’s free, and we have seen our blog entries end up in Google within a few minutes of publication. WordPress is also a very popular blog application, and has quite a few bells and whistles of its own.

What are the advantages to having a blog? From a Search Engine Optimization (SEO) standpoint, blogs are a fast way to add fresh content to your site. Each blog entry ends up on its own page, and you don’t have to pay a webmaster (or rewrite your site code) every time you make a new post. If you have new information that may be of interest to the general public, you can post a blog entry with text and/or pictures and then send the blog link to people in your network. You can even serve up contextual ads on your blog (through channels like Google Adsense) so you have the potential to make money through blogging. As a disclaimer, blogging should not be counted on as a reliable source of income, but there are quite a few people who do sponsored reviews for a few dollars per post.

There are plenty of business and personal advantages to blogging, but it does take a bit of time and effort. If you’re considering the blogging route, you should be aware that there are quite a few bloggers out there, given that the only barriers to entry in the blogging world might be a library card or the ability to borrow some time on a computer. Blogs become more popular when they are updated frequently with interesting topics. A little promotional work is also in order, so you will want to start by drawing attention to your blog, either by sending a link to friends and family, or to other sources if you don’t want your friends and family to know what you’re up to. If you’re a business, you can send blog links to your vendors, add blog links to your Twitter Posts and Facebook updates, or send out emails to customers. Once you get the ball rolling, you can even ask your readers to suggest topics, or go into depth on certain issues. Blogs are definitely a great channel for people who want to bring attention to themselves and their websites, and worth the small amount of time it takes to set one up.

Posted by Patrick Hare, Web.com Search Agency

Do Online Forums Still Get Search Engine Traffic?

September 10th, 2009 by Patrick Hare

In the past, having an online forum was a great way to get SEO traffic to your site. Search engines would index new forum entries as their own pages, and a lively debate between contributors would generate good online content with lots of potential long-tail keyword traffic. Forums regarding common questions and problems in a particular field often created their own SEO niche, so a person seeking information on an arcane topic could see several different posts in the top 10 Google results.

Today, there are still some forums that get good placement, and a few of these are even relevant to the world of search engine optimization. A search phrase like “google 30 penalty” will bring up the WebMasterWorld forum among the search results, with some entries from October of 2006. Several other forums are also visible in the search results, from the same time period, so a person could reasonably assume that this topic is a little dated.

If you dig deep enough into just about any topic, you can usually find search engine results for forum entries that go back into the mid-1990s. Aside from seeing the democratic nature of the internet at work, you can also chart the various abuses and anachronisms that have made forums less popular today. On the abuse side, you can see a lot of people posting link spam into forums, and there are even a few offshore SEO outfits that still make money by adding your link to their footer and posting comments into forums of all kinds. Every link that gets added contributes to the forum’s lack of relevancy. On the anachronistic side, forums haven’t had much of a makeover since they were first rendered into HTML. Modern site design practices usually tend toward the “call to action” model, where people expect to see their question answered in eight seconds or less. This may be why Yahoo Answers has broken into the SERPs, since people can choose the best answer for their question after a period of time.

Part of the feasibility question involving forums has to do with moderators. Being a forum moderator can be a full-time job, especially if you allow anonymous posting. Automated and semi-automated (automated post, human CAPTCHA entry) postings can destroy a forum’s credibility with off-topic posts and links to bad sites. Poor moderation can lead to a loss of visitors, as well as less search engine credibility, since links to casinos don’t usually fit a DIY home improvement forum. Even a self-policed forum where abuse is reported by members can have its issues, because negative postings can lead to attention from attorneys. Although the Communications Decency Act protects site owners from liability for third-party postings, this will not stop lawyers from trying to get the posts removed.

Should you add an internet forum to your site to get search engine traffic? The answer to this question is going to be based on the type of traffic you expect to get, and how you plan to monetize it. If you can ensure that you have a usable header on every forum page, which makes it easy to turn visitors into customers, then a forum may still be a good way of attracting sales. However, you will still need to find people who are eager to post questions and answers about various topics. Over the past few years, some of our clients have seen their forum traffic dry up. When a posted question does not get any answers after a period of time, then the interested party will go elsewhere for a response. Similarly, if your volunteer gurus get tired of answering questions, or grow weary of the debate and abuse found on many threads, they might choose to spend their time elsewhere. When it comes to forums, a critical mass of people is going to have to invest time and energy in responding to questions and policing the pages for spam, so you should definitely be cognizant of the challenge.

For most people building a new website, a forum is not part of the site design plan. At Web.com Search Agency, we haven’t given much online forum optimization advice in some time, because people just haven’t been asking for it. In many cases blogs have supplanted the need for forum content, because a blog topic can cover a lot of ground previously covered by discussions. Most blogs still allow for comments and responses, so the art of web communication is not totally lost. For people who have older sites, with forums in place, unplugging a forum with several thousand pages could represent a genuine SEO concern, since those postings may have gotten links from outside sources, and they might get some pretty good traffic. By delving into Analytics and Webmaster Tools, you can discover the overall value of the forums before taking any drastic measures.
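As a rough illustration of that kind of audit, the following hypothetical sketch uses only Python's standard library and assumes a simple two-column CSV export of pages and pageviews (it does not call any real Analytics or Webmaster Tools API). It computes what share of total pageviews lands on forum URLs.

```python
import csv
import io


def forum_traffic_share(report_csv, forum_prefix="/forum/"):
    """Given an analytics page report as CSV text with columns
    'page' and 'pageviews', return the fraction of all pageviews
    that land on URLs under the forum prefix."""
    total_views = 0
    forum_views = 0
    for row in csv.DictReader(io.StringIO(report_csv)):
        views = int(row["pageviews"])
        total_views += views
        if row["page"].startswith(forum_prefix):
            forum_views += views
    return forum_views / total_views if total_views else 0.0
```

If forum pages turn out to account for a meaningful share of traffic, unplugging them deserves a closer look at which specific threads earn the visits and inbound links.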

In the final analysis, internet forums may have served their purpose and been supplanted by more social outlets. People can post a question to Twitter and get a quick response. Search engines have become better at presenting relevant information, so forums are not the first source for getting information. Site and software developers may have corrected many of the problems people were trying to solve, or the forums may have even become successful to the extent that most important questions have already been answered. There will always be search categories where forum posts take up the bulk of the SERPs, but most savvy optimizers will create a page for any search term that gets an appreciable amount of keyword traffic. Over the coming years, it will be interesting to see whether online forums enjoy a technological renaissance or fade into obscurity.