Archive for February, 2010

Think Locally – Get Found In Search Engines for “Nearby” Searches

February 26th, 2010 by Patrick Hare
Every day, people search for products and services that they can get in their own neighborhoods. A new Google search feature even allows users to refine or restrict their queries to results near their own location. Google’s example query, “things to do on St. Patrick’s Day,” modified for Minneapolis, shows the power of a local search for a fairly general query.

Having local information in the text of your website, and claiming your local listing with Google Local Business Center, are two ways of making sure the search engine understands where you are. If you have any kind of service that is restricted to a geographic area, it is in your best interest to help the search engines understand as much about you as they can. For instance, if you’re a cosmetic surgeon in Beverly Hills, you want to list all of the procedures that are done in your office, and you want to make sure that “Beverly Hills, California” and surrounding communities are referenced in the text of your pages, or at least in the form of an address at the bottom of important pages like the homepage and contact page.
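
As a simple sketch of that last point, an address block in the footer of your homepage or contact page might look like the HTML below (the business name and street address here are invented for illustration):

  <div id="address">
    Example Cosmetic Surgery Center<br />
    123 Main Street<br />
    Beverly Hills, California 90210
  </div>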

Ensuring that your site is locally oriented is already critical for SEO today, and it may be even more important in the future as GPS-based search applications become more available to smartphone customers. The Google Android platform is providing its own GPS mapping service, and your site is less likely to be associated with your business’s map location if you don’t take advantage of tools like the Google Local Business Center. Any business with a local presence, or with franchises, may want to revisit its website(s) to ensure that search engines can associate the site with a specific geographic area. As more and more customers get plugged into geo-based search tools, and search gets more segmented, a site that thinks locally can gain a competitive advantage, not to mention a little more foot traffic in tighter times.

SEO Tactics For Small Business – How David Beats Goliath

February 26th, 2010 by Patrick Hare
A recent report shows that Fortune 500 companies aren’t very good at SEO. This can be very good news for small and medium-sized businesses that compete with major players in the same product and service fields. By using a set of sound SEO tactics, you can often earn enviable positions on search engines despite the fact that you don’t have a multibillion-dollar marketing budget.

One of the advantages you have as a small business is that you can operate quickly, without the bureaucracy associated with big-business projects. If you do your own SEO, your changes can be uploaded to your site in a matter of days or weeks. Your link building process can be scaled directly to the keywords you’re going after, and a traffic volume that is ample for you might not be sufficient for a large company’s ROI calculations.

How can a small business build winning SEO campaigns in a big business environment? First, you should concentrate on your niche. Large businesses can try to be all things to all people, but they may not be able to produce a specialty that a certain segment of the customer base wants. For example, if a large company is selling athletic shoes, it may not focus too tightly on a particular category of shoes. Therefore, if your SEO efforts target something like “cross training shoes” or even “shoes made from recycled tires,” you’ve got an area of focus that an in-house SEO team (if the company has one) may not be able to concentrate on in the midst of a big campaign.

As time goes on and big businesses come to understand the value of SEO, the number of Fortune 500 companies that use SEO will certainly grow. However, big business moves slowly when it comes to implementing changes, and SEO can be such an enigma to corporate managers that there could be room for small business growth in the search engines for some time. By building small business SEO campaigns now, you can also gain a foothold against competing SEO initiatives down the road.

Want to Increase Rankings? Meet Googlebot.

February 25th, 2010 by Jessica Runberg

In the complex world of SEO, there are a variety of ways to increase your search engine rankings. From adding keywords to your meta titles and on-page content to building a winning portfolio of links, there’s no shortage of things you can do. But none of it is worthwhile unless you can get Googlebot to visit your site.

Who is Googlebot?

While the name may sound a bit like a superhero (it’s a bird…it’s a plane…it’s Googlebot?), Googlebot is actually the web crawler that scans the Web to build Google’s searchable index of website listings. This spider plays a very important role in how a website ranks on Google.

The presence of Googlebot on your site means that search-engine giant Google is actively recording changes you’ve made and adjusting its index (and eventually its rankings) accordingly. How do you know if your site’s been crawled? All you have to do is look at the cached version of your site (available on the Google toolbar). It will show you the date and time your site was cached, as well as a copy of your site as it appeared at that time.
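
A quick way to check from the search box itself is Google’s cache: operator; the domain below is a placeholder for your own:

  cache:www.example.com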

You can also tell Googlebot not to visit selected pages and directories on your website. Perhaps you have database content, scripts, or images on the site that you don’t want in the search engine listings. You may have directories that duplicate content on other pages (one example would be “printer friendly” pages), or you may want to limit the bandwidth load caused by search engine spiders. Whatever the reason, you can either add robots meta tags (such as noindex or nofollow) to individual pages or use the Disallow directive in a robots.txt file to tell Google to focus exclusively on the content you want.
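
As a sketch of both approaches, suppose you wanted to keep crawlers out of hypothetical /print/ and /scripts/ directories. Your robots.txt file could include:

  User-agent: *
  Disallow: /print/
  Disallow: /scripts/

And to keep an individual page out of the listings, you could place a robots meta tag in that page’s <head> section:

  <meta name="robots" content="noindex, nofollow" />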

So, while Googlebot isn’t exactly a superhero, its regular presence on your website can mean the start of enhanced Google rankings. Ready to learn more? Contact us at 1-877-Rank-321 to speak to one of our SEO specialists about Googlebot, search engines, rankings and more!

The Meta Title Tag: Increase Your Rankings in 60 Characters or Less!

February 25th, 2010 by Jessica Runberg

Search engine optimization is a complex process. From selecting keywords to building a comprehensive link building strategy, getting your website to rank highly in the engines takes time, hard work and patience. No matter what others promise you, nobody can get you on the first page of Google’s organic listings overnight. However, there are a few simple things that you can do right now to give your website a boost.

Among these “rankings boosters” is having a unique, keyword-optimized HTML meta title for every page on your website. This tag is written in your website’s HTML code and appears in the title bar at the top of your Web browser window. It can also be displayed in search engine results as the blue link above the site description.
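
For example, a page selling cross training shoes might carry a title tag like the one below in its HTML <head> (the store name is hypothetical):

  <head>
    <title>Cross Training Shoes | Example Shoe Store</title>
  </head>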

This short line of text, usually no longer than 60 characters, has a BIG impact on your SEO. Search engine optimization experts consider a proper title tag to be the most important factor in defining how search engines classify a Web page. A poorly written title tag can undermine the effectiveness of high quality link building and content creation.

Think of it this way: if the search engines ignored all other aspects of your on- and off-page SEO and only analyzed your meta title tag, what’s written here should tell them exactly what the page is about. The truth is, sometimes this scenario isn’t that much of an exaggeration. That’s how important the meta title is.

Although the meta title tag should tell readers exactly what the page is about, it’s amazing how many websites have chosen a generic, non-descriptive tag for their pages. This is one of the biggest mistakes webmasters make. The other is trying to cram too many keywords into the meta title, including keyword variations that don’t complement each other. Over-optimizing is just as disastrous as not trying to optimize your title tag at all.

Like all aspects of SEO, the key is to strike a delicate balance. By including one or two of your top keywords in your title tag, you can clearly tell the search engines (and Internet users) what the page is about. Just make sure the keywords are unique to the page and do not conflict or overlap with those on any other pages on the site.

Want to learn more? Please give us a call at 1-877-Rank-321. Our SEO consultants can evaluate your website for meta tag and SEO effectiveness.

Link Farm

February 25th, 2010 by Lisa Rosenkrantz

Anyone with knowledge of search engine optimization (SEO) realizes that good, solid inbound links from reputable, relevant sites are required for ranking highly in the search engines. They lend an air of trustworthiness and quality to your website, which is critical to being part of the Google country club. To do anything unseemly to acquire these links is asking for trouble – you can be penalized or even completely blackballed.

What are link farms?
They consist of sites that link to other sites purely for the sake of linking, inflating a site’s apparent popularity. Further, some link farm services identify potential reciprocal link partners, offer to exchange links and create directory-like link pages for Web sites.

The problem is that most of the sites are random and not relevant to the linkee at all. This is a red flag for Google, and when they detect link farm spam they take action by removing the offending site from their index.

What’s wrong with link farming?
It is an unfair practice and an affront to searchers and to sites that follow the rules. When you participate in link farming, instead of linking to related information of value to your visitors, you’re sending them to sites about totally irrelevant topics, sometimes even adult content. Partnering with companies that supply those links is a bad idea because they don’t have your best interests in mind.

Why is it still in place?
The practice is a holdover from a decade ago, when good quality natural links were more difficult to cultivate. Link farm companies cropped up to provide auto-registration, categorization and link page updates. Now that the search engines can crawl and index far more sites, link farms are unnecessary; they persist only as a way of influencing search engine results and inflating PageRank.

The companies that operate these schemes call themselves reciprocal link management services and claim that they provide direct networking. They also tout themselves as an alternative to search engines for attracting visitors to websites. Some people apparently like what they hear, because many of these companies have a stable customer base.

What can happen if you participate in a link farm?
Search engines, particularly Google, offset link farm spam pages by identifying specific characteristics associated with them and then filtering those pages from their index and search results. It may take time, but it will most likely happen to you if you’re involved in artificial linking schemes.

What are some good linking strategies?

• Make sure your links are relevant to your site’s topic.
• Only allow links that are good quality and have a decent PageRank. Search engines judge you by the company you keep.
• Use links with relevant keywords, and place them in your content.
• Don’t take part in any offers to trade links unless you’re familiar with the site. Delete any unsolicited emails asking you to pay for automated link generators.

Link building is essential for ranking in the search engines, but it can be a daunting task. Web.com Search Agency’s SEO strategists are experts in the art and science of link building and will take the lead in helping you boost your online presence, your traffic and your bottom line.

How to Measure Traffic From Other Sites

February 19th, 2010 by Patrick Hare

Find ways to monetize traffic that doesn’t come from search engines, and get more traffic like it.

Aside from search engine traffic, websites can get a significant amount of traffic from other online references. For instance, if you have a site that gets links from CNN or the Wall Street Journal, you may see an overwhelming amount of traffic in a short amount of time. On a smaller scale, there may be sites in your industry that send you high quality traffic which converts at a better rate than paid or natural search engine traffic. Furthermore, proper measurement of this referral traffic can help you detect bad traffic, which comes in the form of undesired references or fraudulent clicks.

One of the best free tools for measuring site traffic is Google Analytics. If you have it installed on every page of your site, you can log in and go to Traffic Sources > Referring Sites to see the top sites that send you traffic. You can also click on specific sites in the list to see how much traffic each one sends, and you can even get a graph of the days on which the traffic load is heaviest.
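
As a reminder, Analytics can only report on pages that carry its tracking code. A sketch of the standard asynchronous snippet, pasted before the closing </head> tag of every page, looks like this (replace UA-XXXXXX-X with your own account ID):

  <script type="text/javascript">
    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXXX-X']); // your account ID goes here
    _gaq.push(['_trackPageview']);
    (function() {
      // load ga.js asynchronously so it doesn't block page rendering
      var ga = document.createElement('script');
      ga.type = 'text/javascript';
      ga.async = true;
      ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
      var s = document.getElementsByTagName('script')[0];
      s.parentNode.insertBefore(ga, s);
    })();
  </script>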

The average amount of time that a visitor spends on your site, and the average number of pages visited, can give you a good impression of your site’s popularity among different demographics. For instance, if visitors from Facebook spend a while looking around but traffic from MySpace doesn’t stay as long, you may want to adjust the content ads running on one network or the other. Generally speaking, time spent on a site indicates the site’s value to a reader, so sources whose visitors stay longer and view more pages are better targets for your site’s value proposition.

One of the best things about segmenting your online traffic sources is that you can see how visitors from a particular site respond to your content. For instance, if the bounce rate (the percentage of visitors who view one page and leave) is high, you may consider either improving the destination page or finding a better source of visitors. If the bounce rate is unnaturally high and the time spent on the site is very low (a few seconds, say), then you may be the victim of click fraud.

Another positive aspect of traffic segmentation happens when you have goal, ecommerce, or conversion tracking installed. With goal tracking, you can understand the amount of traffic that takes a specific step like filling out a form or making a purchase. With ecommerce tracking in Google Analytics, you can put a real dollar value on the traffic from certain sites. If you’re paying a site for ad space, you may only need a calculator to see your ROI for that traffic. Similarly, if you see that the traffic isn’t driving sales, you can quickly re-allocate your dollars to better performing sites.
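
As a quick worked example (the figures here are invented): suppose you pay a site $200 a month for ad space, and ecommerce tracking attributes $800 a month in revenue to visitors referred from that site.

  ROI = (revenue - cost) / cost
      = ($800 - $200) / $200
      = 300%

If the same report showed only $150 in attributed revenue, you would know to move that budget elsewhere.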

Knowing “who is visiting your site” can be especially useful when search engine traffic is taken out of the equation. In fact, Google counsels people to design their sites as if search engines didn’t exist, and Google’s PageRank algorithm is based on how sites link to each other, presumably as references for further information. Developing a reputation among referring sites can feed back nicely into SEO initiatives, since a strong site presence lends itself to more links, and more links in turn can lead to better search engine positions.

Measuring referral traffic helps you understand what others think of your site, and communicating with other site owners may help you uncover business opportunities or potential partnerships with mutual growth potential. Finally, a clearer understanding of visitors who aren’t driven by search engines will give you a better ability to go after traffic from sites similar to the ones that drive conversions, and this can create a revenue stream with a lower cost and higher ROI than many paid search engine strategies.

Overoptimization – What is it?

February 19th, 2010 by Patrick Hare
What is overoptimization? In the world of SEO, overoptimization refers to the idea that your site has been heavily manipulated for search engine rankings. Normally, an overoptimized site is easy for a person to detect, and even easier for a search engine algorithm to spot.

Some symptoms of an overoptimized site include:

  • The site’s title exactly matches its H1 tag.
  • Keyword density on the site is very high. Usually, you can spot this by reading the content out loud and hearing the same phrases repeated. In the distant past, high keyword density led to good rankings, but today it is best to keep a keyword’s density below the 7% range; some experts even quote 3%. (See the worked example after this list.)
  • The keyword appears in the title more than twice. (Keyword stuffing anywhere else is not good either.)
  • Multiple identical backlinks with the exact same anchor text.
  • Multiple pages, usually based around all 50 states or 100 major cities, where the only difference in on-page content is the name of the city or state. This is easy for search engines to spot. Content on every page should be original.
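
Here is the worked density example promised above, using invented numbers:

  Keyword density = (keyword occurrences / total words) x 100
  A 500-word page using a phrase 15 times: (15 / 500) x 100 = 3%
  The same page using it 35 times: (35 / 500) x 100 = 7%

At 35 repetitions in 500 words, the phrase shows up roughly once per sentence, which is exactly the “read it out loud” symptom described above.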

One way to see if the search engines are penalizing you for overoptimized content is to do a search for your top keyword in Google preceded by the allinanchor: operator. This should show you where your site would be ranking based on its inbound link profile. If your site’s regular ranking is more than a couple of spots below its allinanchor score, and your keyword is prominent in the homepage title and content, you may very well be overoptimized. In any case, you should really consider tinkering with your site content to bring it up to current best practices.
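
For instance, if your top keyword were the hypothetical phrase “cross training shoes,” you would search Google for:

  allinanchor:cross training shoes

and compare your site’s position in those results against its position for the plain query.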

In many cases, overoptimization is done by people who have started on the path of DIY SEO, but who are overzealous. Whether or not there are specific “overoptimization filters” in the search engines, there are algorithms for page quality and original content, and an overoptimized page is not going to carry the same level of quality as a natural-looking page with useful content. Therefore, it is best to avoid overoptimization at the outset, but if you’re revisiting an old SEO project that may be tripping some search engine filters, a bit of pruning and editing may create a cohesive profile that a search engine will find useful and acceptable. In some cases, a streamlined site can rapidly move to its rightful place in the rankings, which means that fixing overoptimization can be one of the cheapest ways to get better search engine rankings.

Domain Name Optimization

February 15th, 2010 by Jessica Runberg

For most of our clients, the domain name is already part of the equation. However, there are occasions when there is an opportunity to purchase new or additional domain names in the name of rebranding or expanding an existing business.

So, what’s in a name really?

Your domain name says a lot about your business – and even more about your SEO. The name you choose will become a part of your corporate identity and branding efforts, while also helping you rank higher in the search engines. In other words, the domain name you choose can enhance your online success – or detract from it.

While Google does give preference to Web domain names that include keywords, this shouldn’t be your only consideration. URLs should be unique, while also being easy to remember. Additionally, they should be descriptive of your products and services. Domains such as CareerBuilder.com and AutoTrader.com are great examples of short, effective and easy-to-remember domain names.

If you choose a URL that is not descriptive of your business, it may take a while to gain brand recognition. When now-popular sites such as Amazon.com, Hulu.com and Kayak.com first launched their online businesses, they had to overcome the “what is this site about?” factor. Having a more obvious domain name, such as Cars.com or Travelocity.com, is far easier from a branding perspective.

If at all possible, we suggest avoiding hyphens in the domain name. They’re difficult for people to remember and can make it awkward to tell someone your URL out loud. Above all, the key is to keep it short and simple. If you choose a domain name that accurately describes your business in as few words as possible – especially when one of those words is a keyword – you’ll be setting up your website for success.

As always, please feel free to contact us at 1-877-Rank-321 if you have any questions.

How to Remove a URL from Google

February 15th, 2010 by admin

by Megan Homan

Getting a page to rank well in Google takes time, and so does getting a page removed. The steps you must take to get a web page removed, blocked or redirected are very easy; however, you must be patient as you wait for it to be officially gone from the search results. To remove a particular web page, you must make the necessary changes to your website and then wait for the Google spiders to recrawl it before the page drops out of the index.

Among Google’s Webmaster Tools you will discover a URL Removal Tool that will help expedite the process, but only if you are looking to remove a web page permanently. This tool will help in weeding out unwanted web pages, content and images – basically stuff you no longer have use for – and will have little to no impact on your SEO tactics.

If you don’t have access to your server, you can prevent a search engine from indexing a page through simple coding. Adding a noindex meta tag tells the web crawlers not to index the page and will eventually stop it from appearing in the search results. The following code, placed in the page’s <head> section, will stop the crawlers in their tracks: <meta name="robots" content="noindex">. Adding this tag will not provide a quick fix, nor will the page automatically disappear. As mentioned above, you will have to wait for the page with the noindex tag to be recrawled before the URL is removed from the results.

Another option for making a page disappear from search results is to use a 301 redirect. This method of removing URLs should be used if the page is or was an important part of your site, especially if it has link power that you don’t want to lose. When the web crawlers visit that page, they will be redirected to its replacement, and the original version will eventually be dropped from the index.
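
How you implement the redirect depends on your server. On an Apache server, for example, a single line in your .htaccess file does the job (the paths and domain below are placeholders):

  Redirect 301 /old-page.html http://www.example.com/new-page.html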

Build Website Exposure with Directory Submissions

February 15th, 2010 by admin

by Megan Homan

Web directory submission is a good way to help increase your website’s exposure, even though directories’ original purpose has evolved into something entirely different. Directories began as a way to help people find websites and were even blamed for ruining the popularity of the standard telephone book. When more advanced search technologies like Google, Yahoo! and MSN made their way onto the scene, people lost interest in directories. Today, online directories are mostly viewed as link building opportunities that benefit your search engine optimization efforts.

When a website is submitted to a directory, a backlink to your site is created, thus increasing your site’s link juice. The more established a website’s link portfolio is, the better it looks to the search engines. These types of links, however, are not as reputable to the search engines as a one-way link from an independent website relevant to your industry (which is not easy to come by). Links from directory submissions are much easier to get – you just have to know which ones to choose and how to submit to them for maximum benefit.

Choosing an ideal website directory for submission is an important factor in determining how successful the link pointing to your site will be. It is best to submit to credible, well-established directories that have good PageRank in the “eyes” of Google. You want to position your site in categories that are relevant to your topics and products.

There are two types of link directory submission options – paid and free. Paid directories are actually managed by humans, making them a more appealing option because of the credibility they have established with the search engines. Paid directories tend to have better PageRank, so they are typically the better choice. However, to develop a well-rounded link portfolio, throwing in some free directories with lower PageRank can still be beneficial.

Aside from paid or free directories, there are also general and niche directories to consider. General directories will list your site in very broad categories, whereas niche directories allow you to place your site in very specific categories relevant to your site. Be careful how deep you go within a niche directory, because the further into the site your listing sits, the lower your chances of appearing on a page with high PageRank. Remember, though, it is good to appear as relevant to your sector as possible.