Archive for June, 2009

Search Engine Positioning

June 30th, 2009 by Patrick Hare

Though the term “search engine positioning” sounds a little more antiquated than “search engine optimization,” it has some interesting connotations that may bridge the gap between marketing people who are savvy about the internet and those who aren’t up to speed on the ever-changing face of the search marketing world. It may also help people in the search engine optimization (SEO) business sell their services to brick-and-mortar marketing departments who understand the old “positioning” concept vs. its meaning in the world of SEO.

Classic marketing students will be familiar with the concept of positioning as it relates to how a brand name is perceived. Most marketing school graduates were required to read the book “Positioning: The Battle for Your Mind” by Al Ries and Jack Trout. Briefly summarized, the book contends that your brand image tells people what your capabilities are. In an example from the book, it was noted that Xerox did not compete well in the computer market against IBM because people associate IBM with computers and Xerox with copiers. Since the book’s publication, “positioning” has itself become a buzzword, so it creates a clear idea in the mind of a listener who has been exposed to the concept. As a phrase, “search engine positioning” may therefore be a bit deceptive (are you trying to position a search engine?), but it is possible to clarify the idea in a way that clears up its real meaning and gets potential customers on board with its possibilities.

Any SEO expert will consider “search engine positioning” to be synonymous with optimization. Additionally, the marketing concept of “positioning” on the internet doesn’t really work because search is demand driven, and branding involves sending a repetitive message through various channels until someone understands the connection between the brand and its service offering. However, the process of relating the concept of SEO to people who don’t work in the search engine trenches is difficult, and takes many different approaches. With a word like “positioning,” it is possible to relate that your spot in the search engine rankings is going to build brand equity (albeit more slowly) by creating an association between your company and the “what you do” searches made by the average searcher.

Many people in SEO have no interest in brand building, but traditional marketing types often seem to think of nothing else. Therefore, it is necessary to show brand-conscious SEO prospects how you can use optimization and PPC to build brand presence while simultaneously creating a profitable sales channel for non-branded searches. One of the keys to creating long term customer relationships in the SEO world is to show value month over month, and the residual aspect of good keyword optimization is that people will come back to the site by searching for the brand name once they know it, assuming the brand name matches up with the URL.

Finally, there are people who only want to build a brand presence on the search engines, and don’t necessarily have an interest in the potential for all of the traffic generated by search terms. Many clients want to have a presence on the web that takes people to a site and presents information in the same manner as a billboard or commercial. These are not always bad clients to have, as they may spend their advertising dollars in amounts comparable to radio and television campaigns. By adding value and creating better search engine positions for keywords relative to the brand being built, a search engine marketing agency can prepare for the day that the customer’s strategy changes into a keyword conversion model. This usually happens when a shakeup or merger brings in a manager more familiar with the internet and the potential of search. In cases like these, showing tangible value in combination with online branding expertise generally creates an opportunity for your agency to be retained.

PPC Glossary

June 29th, 2009 by Patrick Hare

If you’ve got someone managing your Pay-Per-Click (PPC) campaign, then you are probably hearing a lot of jargon and acronyms. The world of PPC is similar to search engine optimization when it comes to abbreviations and tech-speak, but it is still important for a customer to understand what is being referenced in a conversation, since your PPC agency is using your money to make the best possible purchases. Search Engine Marketing (SEM) agencies generally use a whole raft of terms in their internal reporting, and unfortunately they don’t always tone down the jargon when speaking to customers.

Here is a by-no-means-definitive list of common PPC phrases that are used in the world of search engine marketing:

Adcenter – MSN’s PPC marketing platform, which shows ads on Bing and Microsoft properties.

Ad Group – A specific set of keywords associated with a related ad or group of ads. It is possible to have multiple ad groups in a campaign.

Adwords – Google’s Pay-Per-Click platform, which shows ads on Google, AOL, and multiple custom search engines.

Average Position – The average placement position of your ad over all its impressions. It normally takes several weeks for your average position to rise to the top in Adwords, even with high bids.

Bid – How much you are willing to spend on a click. A bid can be a “default” bid on a group of keywords, or a specific bid on a single keyword.

Broad Match – A type of keyword matching that will show your ad for the widest variety of terms deemed relevant by the search engine. In general, broad match keywords give a higher volume with a lower conversion rate.

Campaign – A set of ad groups. Even though each ad group can contain different keywords and ads, the campaign has a daily budget, geographic targeting, and other settings which will be effective for all the ads in the campaign. Once the campaign’s daily budget is exhausted, all the ads in the campaign stop appearing for their relevant keywords.

Channel – In content match, one of the websites where your ad is showing. You can target specific channels or sets of channels, choosing, for example, to appear on Facebook, or only on the sections and demographics where your ad is likely to get better results.

Click Fraud – Clicks from fraudulent activity. Some fraud comes from competitors intentionally clicking on your ad. Other fraud involves using automated tools to collect commissions for clicks (via channels like Google Adsense) or to exhaust your daily budget.

Content Bid – The default bid for a set of keywords on the content network. Note that this should be a fraction of your search bid.

Content Matching – The placement of PPC ads on websites. Ideally, these sites should have content that is relevant to the service offering of the company running the ad. Google runs content matching on its “content network.”

Contextual Advertising – Essentially the same thing as Content Matching, given that your ad is triggered by keywords in context on a news, blog, article, or other content page.

Conversion – A desired action taken by a website visitor. With PPC, it is relatively easy to measure conversions on form fill-outs and purchases, but it is more difficult to quantify dollars earned from phone calls triggered by online ads. Search engines track visitors for up to 30 days, so your conversion may not happen until a subsequent visit several days later.

Conversion Rate – The percentage of times your ad resulted in a form fill-out, sale, or action. To calculate conversion rate, you take the number of actions divided by the number of clicks. A typical e-commerce conversion rate may be 2.5%, but a form fill-out conversion rate could be 10-20%. A very low conversion rate indicates problems.

Cost Per Conversion – The dollar cost associated with the conversion rate. If you had 100 clicks for one dollar each, and had a 5% conversion rate (5 conversions), your cost per conversion is $20. The formula is essentially the dollar cost divided by the number of conversions.

CPC – Cost Per Click. How much each click costs you. Because bidding is based on many factors, this amount is generally expressed as an average over multiple clicks. When applied to advertising, it is synonymous with Pay-Per-Click (PPC).

Creative – The text of a PPC Ad. In Google, the first line of your creative is limited to 25 characters, and the second and third lines to 35 characters each.

CTR – Click Through Rate. The actual number of clicks on your ad divided by the number of impressions.
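
Several of the metrics above (CTR, conversion rate, cost per conversion) are simple ratios, and they are easy to sanity-check by hand. Here is a minimal sketch in Python using the worked example from the Cost Per Conversion entry; the function names are just for illustration, not part of any platform’s reporting API:

```python
# Illustrative PPC arithmetic for the glossary metrics above.
# These are the generic ratio formulas, not any ad platform's
# exact reporting logic.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks divided by impressions, as a percentage."""
    return 100.0 * clicks / impressions

def conversion_rate(conversions: int, clicks: int) -> float:
    """Conversions divided by clicks, as a percentage."""
    return 100.0 * conversions / clicks

def cost_per_conversion(total_cost: float, conversions: int) -> float:
    """Total spend divided by the number of conversions."""
    return total_cost / conversions

# Worked example from the glossary: 100 clicks at $1.00 each with a
# 5% conversion rate (5 conversions) works out to $20 per conversion.
spend = 100 * 1.00
print(cost_per_conversion(spend, 5))  # 20.0
```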

Day Parting – Setting your ads to appear at certain times of day, or certain days of the week. Many B2B customers will day part so their ads run during business hours when they can answer the phone.

Dynamic Keyword Insertion – The placement of a variable field in ad text that will show a searcher’s actual search phrase (within limits) in the body of the ad.

Exact Match – A keyword setting that only lets your ad show up when a precise keyword is searched upon. In Google Adwords, brackets are used for an exact match, so a keyword like [vegetable man] would only trigger an ad when that phrase is typed in with no words before, after, or between those two words.

Geotargeting – In a campaign, setting your ads to appear in a certain city, state, or country. Also called Local Matching.

Keyword – A word or phrase that triggers an advertisement. Keyword matches can be based on broad match, phrase match, or exact match. Negative Keywords can be used to keep ads from showing for the wrong searches.

Impressions – Number of times your ad is served up, either as a search match or through content placement.

Landing Page – The web page that the display ad or keyword “lands” on when an ad is clicked. If there is not a specific page assigned to an individual keyword, the landing page is determined by the “default URL” in the ad.

Negative Keyword – A keyword that is used to determine when an ad should not be shown. For example, if you have a campaign about computer keyboards, but don’t sell wireless keyboards, then you would add “-wireless” to your list of keywords. Anyone typing in “wireless keyboards” would not see your ad because the negative keyword excludes it.

Phrase Match – A keyword matching type that only shows your keyword when someone types in the specified phrase in a search query. With phrase match in Google Adwords, quotes are used around the keyword, so a phrase match designation for “dog food” would trigger an ad when someone typed in a search for “best dog food recipe.”
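
To make the three match types concrete, here is a toy illustration in Python. Real ad platforms match far more loosely than this (plurals, misspellings, semantic variants all come into play), so treat these functions as a sketch of the basic idea only:

```python
# Toy versions of the match types described above. Real search
# engines use much more sophisticated matching; this only shows
# the basic contract of each type.

def exact_match(keyword: str, query: str) -> bool:
    """[keyword]: the query must equal the keyword word for word."""
    return query.lower().split() == keyword.lower().split()

def phrase_match(keyword: str, query: str) -> bool:
    """"keyword": the keyword's words must appear in order in the query."""
    kw, q = keyword.lower().split(), query.lower().split()
    return any(q[i:i + len(kw)] == kw for i in range(len(q) - len(kw) + 1))

def broad_match(keyword: str, query: str) -> bool:
    """Broad: every keyword word appears somewhere in the query."""
    return set(keyword.lower().split()) <= set(query.lower().split())

# The glossary's own examples:
print(phrase_match("dog food", "best dog food recipe"))  # True
print(exact_match("vegetable man", "vegetable man"))     # True
print(exact_match("dog food", "best dog food recipe"))   # False
```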

Position Preference – A setting that specifies the desired position of your PPC ads. Note that this can sometimes keep your ad from showing.

PPC – Pay-Per-Click. A model of serving ads where you only pay when someone actually clicks on the ad.

Quality Score – A somewhat complex formula used by search engines that weighs whether your keywords are relevant to the landing page, how much trust your campaign has, and the click-through rate on your ads. Changing an ad can change its quality score, so it is usually best to copy the ad and edit it, then pause the old one when the new one has gained its own quality level. There are also geographic and bidding factors.

ROI – Return On Investment. There are various ways to calculate the value of a PPC campaign. Normally it takes at least a month to ramp up a campaign and gain a quality score, and more time to determine buying patterns and seasonality. PPC ROI can be tough to track beyond the simple cost/benefit analysis, but if clicks are not resulting in profitable sales, then improvement is called for on the site and in the campaigns.
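
The simple cost/benefit view of ROI mentioned above can be sketched in one line; the revenue figure below is a hypothetical example, not data from any campaign:

```python
# A minimal sketch of the simple cost/benefit view of PPC ROI.
# The dollar amounts are hypothetical examples.

def roi(revenue: float, cost: float) -> float:
    """Return on investment as a percentage of spend."""
    return 100.0 * (revenue - cost) / cost

# e.g. $500 of tracked sales from $200 of click spend:
print(roi(500.0, 200.0))  # 150.0
```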

Yahoo! Search Marketing – Yahoo’s PPC platform, previously known as Overture, previously known as GoTo. It shows ads on Yahoo! and its partner network.

Some of the more advanced features of PPC management have not been touched on in this glossary, given that a tutorial would be a better spot for showing people all of the features available in Pay-Per-Click. Google offers video tutorials for its Adwords product which are highly recommended for any DIY PPC enthusiasts. The world of PPC management can be very expensive, and anyone who is in charge of an account (or a liaison to an agency) should be cognizant of the different aspects of pay-per-click, and their relative advantages or pitfalls.

What Does “Above the Fold” Mean?

June 26th, 2009 by Patrick Hare

“Above the Fold” is a newspaper term that has been adapted to web design. Originally, it referred to the information on the top section of a newspaper’s front page. Newspapers are sold on street corners, in vending machines, or at newsstands, so the biggest stories had to have easy-to-read headlines in an obvious place in order to get purchased. In website design, “above the fold” is equally applicable in the sense that the most important or actionable information should be visible without people having to scroll down.

As a selling point, having a clear message above the fold on your website is more important than it ever was for a newspaper. Most ecommerce and lead generation websites pay far more than the cost of a newspaper to get one person to visit the site. Whether you are purchasing traffic through an agency or Pay-Per-Click (PPC) service like Google Adwords, you will quickly find that the best converting keywords can cost anywhere from fifty cents to twenty-seven dollars per click! If you’re investing this kind of money in one set of eyeballs, you should make sure that the landing page (the relevant page on your site, which should be the best match for your keyword) presents information to the reader quickly and encourages them to take action right away.

Eye tracking studies on popular sites like Google show how much value the top of a webpage contains. On a search engine, 100% of the people see the first three results, which is why people use search engine optimization techniques or Pay-Per-Click marketing to appear in these spaces. The shape of the “heat map” for eye tracking on Google is called the “golden triangle,” because of the value found in that section of above-the-fold space.

On a regular website, people have been conditioned to look to the center of the page for information, and to the top right for phone numbers, so there should always be a clear message at the top of just about every page on your website. Remember that you don’t have a lot of control over how or where people land on your site, so useful information should be built into the top and center of every page. As a side note, it has been observed that placing a contact form on the right margin “marginalizes” it, so it gets less attention than it would get if it were closer to the middle of the webpage. For people who get most of their customers through form fill-outs, even a small increase in conversion rate can be the difference between profitability and bankruptcy, so this should be one angle that is exploited whenever possible.

Another industry term used in website conversion optimization is “hero shot,” which essentially refers to your product or service as the “hero” of the page. Logically, the image should convey that the product or service is the heroic solution to whatever problem or need the customer wants addressed. People respond to images more quickly than they do text, so a picture showing how the product solves a problem will get a reaction. Depending on the landing page, the hero shot should be as relevant as possible to the search term. It generally goes in the upper center of the page below the header, which is sometimes called the “hero space.”

One way to find out how well your on-page content is working is to use click tracking and analytics tools. Google Analytics has an overlay feature that shows which links people click on. Crazyegg and Clicktale offer even more advanced features which will either show you a heat map for user behavior or actual movies of mouse movement. This kind of advanced analytics is invaluable for testing site design and conversion features, because you can get unbiased information about how your site is performing.

The best information to put above the fold is your phone number, a brief contact form, or your top product categories and pricing. Secondarily, trust is a huge factor in turning visitors into customers, so you should be showing trust indicators like secure site logos, accreditations, and BBB membership information where people will see it right away. If you take credit cards and PayPal, you should have the logos visible right away, since acceptance of certain cards may be the difference between a sale and a lost visit.

The most important takeaway for the “above the fold” philosophy is that you want to create a quick value proposition that encourages people to stay on your website and get the information that they need. In the same way that newspapers offer a concise and attention grabbing headline to sell all the pages inside, your landing page’s above-the-fold (or “above-the-crease”) space should do the same thing. No matter how traffic comes to your site, or what pages people land on, the “above the scroll” information is going to be the factor that determines whether you keep your audience or lose a potential customer.

SEO Tips

June 24th, 2009 by Patrick Hare

There are multiple places online to get tips for SEO. In fact, some sites pride themselves on giving up to 55 different tips on aspects of search engine optimization. If you’re new to the game, or just a small business looking to build SEO into your website, these tips can be overwhelming and jargon heavy. Here are a few basic optimization tips for the casual web designer.

  1. Don’t just copy other people’s metatags. This was the preferred “poor man’s SEO” practice for a long time. Sites would grab the top titles, keywords, and descriptions, make a few minor edits, and publish. This was based on the fallacy that on-page optimization was the source of search engine ranking. You should make your site relevant to today’s search demand, not a copy of a copy of a site that was somewhere 5 years ago.
  2. Find out the best keywords by using the Google Keyword Tool. Look at the “related keywords” list to see if people are typing in synonyms or are looking for a specific service that you can provide.
  3. Make every page title unique. Page titles are the key to how a search engine will rank the site. There are many other factors, but sometimes you can get great results on an existing site just by changing the title.
  4. The home page title should cover the broadest possible topic matter, and interior pages should be more specific. A roofing contractor in Phoenix would start with a homepage title containing “Phoenix Roofing Contractor” and interior pages would have things like “Flat Roofs” and “Leaky Roof Repair” as topics.
  5. Each major page should have at least 250 words of readable content. “Readable” means that you can read it out loud without sounding repetitive. Even though there are plenty of ranking pages with less than 250 words, they usually have other factors in play.
  6. Put your site up now, submit it to search engines, and get links to it. After that you can add pages. In many cases people are afraid to launch a site because they are afraid that thousands of visitors are going to stop by and notice that the site is not perfect. This is not going to happen. Search engines need time to index your site, and traffic is going to go from zero to a trickle in about 2 months, so you need to get the ball rolling right away.
  7. Follow Google’s Guidelines, Install Webmaster Tools, and Install Analytics. These tools are powerful and free. Sometimes people are afraid to give Google so much information about site traffic, but Google pretty much already knows, so you might as well be a part of the intelligence gathering.
  8. Compare advice from different “SEO Experts” and ask questions. Don’t be afraid to ask one optimization expert about the claims, advice, or techniques of another, without naming names. Nobody has a special relationship with Google, and if anyone says they have a proprietary or secret technique, it is probably not as good as the transparent one sold by reputable agencies. Agencies similar to Search Agency may not have esoteric SEO knowledge, but they do have experience in working with hundreds of sites, plus proprietary tools that make agency-based optimization much cheaper and less risky than DIY SEO.

Search engine optimization firms can help you get started with basic SEO and usually are able to offer products like link building, site optimization, and content writing which are above the skill level of the average webmaster. In fact, we have a network of webmasters that partner with us to provide SEO services during the site creation or redesign process. We have even had many site designers hire us to work on an optimization project for them so they could learn the basics of SEO and then build it into their own sites. The educational value of hiring an optimization firm can be very high as long as it involves clear account management and reporting. It is also very profitable for the SEO firm, since knowledgeable customers renew their contracts as their online business becomes more profitable, and they need to spend more time on making sales than maintaining their own SEO.

SEO Checklist If You Can’t Find Your Site in Google or Other Search Engines

June 23rd, 2009 by Patrick Hare

“Why can’t I find my site in Google” is obviously a very common question in the world of search engine optimization. In most cases there is a stock answer involving the age of the site, the number of backlinks, or the lack of content on the site. The title of the web page may not match up with the overall context of the page, or it may be “index.html” or “Welcome” instead of something descriptive. If the site isn’t being found at all on the search engines, here is a list of what to look for:

  1. Robots.txt file exclusion – Every once in a while a webmaster will create a development site and use the command “Disallow: /” in the robots.txt file, which can usually be found at: http://www.[yoursitehere].com/robots.txt . If you don’t have a robots file, that won’t stop you from getting found. Sometimes, webmasters and amateurs put that same wrong command in the file, and the result is a quick vanishing act in the search engines. (Note: If you use Google Webmaster Tools, it will tell you about robots.txt commands like this one, by saying that Google is not being allowed to index your site.)
  2. Metatag exclusions – In your source code, which you can see by going to your website and selecting “View” and then “Source” in the menu bar, look for a line like <meta name="robots" content="noindex,nofollow"> on the page. If this tag is there, and you want your site to be found, it should be removed.
  3. Duplicate On-Site Content – Do the pages on your site have the same content on multiple pages? Are the titles all the same? If so, the search engines may not know which page deserves the most attention. All content on your site should be unique, unless you need to have a standard piece of boilerplate on some pages. Even so, there should be plenty of content built around the boilerplate.
  4. Duplicate Websites – A few years ago, our customers would independently come to an amazing revelation. Since our SEO worked on one website, they figured that they could copy the entire site onto a .net or .org domain name and hold down more than one spot in Google’s top 10. Unfortunately, this did not work because duplicate websites essentially get ignored in Google. If you are substantially copying the content off someone else’s website, or just scraping and pasting it, you are also unlikely to see results.
  5. Content embedded in images, Flash, and JavaScript. Search engines have done a better job reading Flash files over the past few years, but there are still drawbacks. For one thing, lots of Flash designers embedded text in images, which can’t be easily read by search engines. In the same way, we have seen websites that looked like they had text, but were actually one big image. Search engines prefer text that is easy to read. When JavaScript is used, the search engines may be able to read it, but may not know what to do with the information or how to index it. In the same way, AJAX code is difficult for search engines to classify, or even find since it is delivered dynamically from a database.
  6. Use of frames. There are still a few sites built in frames, and they still get the same poor results. Normally all the search engine sees is a homepage with a header, which is often just a picture. In this case the search engine doesn’t have anything to read other than a page title.
  7. Copyright Violation (DMCA) – If you have been accused of copying someone else’s online or offline content, they can file a DMCA Removal Request with search engines. Normally these engines will attempt to contact you, but simultaneously they may remove your site content from their indexes. Most of the time you will get a letter in the mail from a law firm when this happens, but if your contact information is difficult to find due to private domain registration, then you may not get notified that way.
  8. Bad neighborhood – What kind of content is on your site, and how do the search engines see it? Most of our clients would be considered “good neighborhood” sites, since we do not do SEO for adult, gambling, or offshore pharmaceutical clients. However, your content may have a keyword profile that is too similar to something that would not be found in safe search results. You may be linking to bad sites and not know it. In some cases we have had customers who had been hacked, and were hosting links to very bad domains, phishing sites, and the like. Once again, Google Webmaster Tools makes it easy to see how Google sees you, and by extension you can get a sense of how Bing and Yahoo are seeing your site.
  9. Sending Malware and Viruses – Usually this is the result of having had your site hacked, but you may also be hosting software that does this type of thing on purpose. For some time, sites like Yahoo would take you out of their listings for a year if you were passing viruses, even unintentionally. Malware usually comes with “free” screensaver and chat programs, and normally you get a warning in Webmaster Tools, and a red notice on the search engine results saying “this site may harm your computer.” Experience with one client (hacked via SQL injection) shows that the red notice cut off 90% of natural SEO traffic while the warning was up. Search engines have the choice of showing a warning or taking your site out of the index, and a new site is more likely to be removed.
  10. Penalty/Filter – If your site is new, it may be seeing the “sandbox” filter, or it may not have been indexed yet. If you bought a domain name from someone else, it may have been banned for bad behavior. Lots of sites and domains for sale online are being sold because they tripped a spamming filter in Google, and no longer generate revenue.
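
The first two items on the list above can be checked programmatically with nothing but the Python standard library. This is a rough sketch: the robots.txt content and the example.com URL are hypothetical stand-ins for your own site’s files.

```python
# A quick self-check for items 1 and 2 on the checklist above,
# using only the standard library. The robots.txt content and
# URL are hypothetical examples.
from urllib.robotparser import RobotFileParser
from html.parser import HTMLParser

# Item 1: does this robots.txt block all crawlers?
robots_txt = """\
User-agent: *
Disallow: /
"""
rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.can_fetch("Googlebot", "http://www.example.com/"))  # False

# Item 2: does a page's source carry a noindex robots metatag?
class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

finder = RobotsMetaFinder()
finder.feed('<meta name="ROBOTS" content="NOINDEX,NOFOLLOW">')
print(finder.noindex)  # True
```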

If you believe that you have been penalized in Google, or Webmaster Tools has told you that you have, then you can always file a reinclusion request with Google and Yahoo, but MSN’s reinclusion link apparently leads visitors to a search page with no information.

Most of the time, it is not difficult to get found in the search engines, but it is necessary to be patient. To see if you have been cached in Google, all you have to do is paste your domain name into the search box and see if your site is listed. Choose the “Cached” link on your listing to see what Google can read and when your site was last visited.

Getting found by the search engines is the first step on the long road to rankings domination, but it is still the most important. Almost every other search engine optimization initiative regarding your site is going to be judged against how the web pages are classified by search engine spiders. Advanced link building, content writing, pagerank sculpting, image optimization, and W3C compliance all take a back seat to being properly cached by the engines and placed somewhere among the billions of pages on the World Wide Web.

SEO Consultancy

June 23rd, 2009 by Patrick Hare

Search Agency offers search engine optimization consultancy for international clients and people who want to expand their domestic online presence. Our SEO consulting services can be customized and range from hourly analysis and strategy consultation to an intensive audit of your company’s SEO profile. Whether you’re looking to get an audience in a small geographical area, or dominate the global search market, Search Agency can create a strategic roadmap which can be implemented by people on your end, by Search Agency, or by any one of our website development contacts. Every year, we provide multiple specialized consultancy solutions to large and small companies looking to discover roadblocks and challenges to getting found in top positions on Google, MSN (now Bing), and Yahoo search engines in the US and abroad.

Job for an English Degree? Try Search Engine Optimization

June 22nd, 2009 by Patrick Hare

One of the toughest parts of having a liberal arts degree involves the job market. Even before the economy slowed down, degrees like Communications, Political Science, Psychology, and English were not necessarily the route to a high paying job. However, the increasingly semantic nature of search engines has led people with BA Degrees to find jobs in search engine optimization (SEO) which is widely assumed to be a technical field.

Even before search engines touted their “semantic” capabilities, the field was weighted toward English majors. This derived from the “content is king” philosophy in SEO which had the thesis that the content on a website was the prime factor in getting good search engine rankings. Content is still a very important piece of the puzzle, and for that reason SEO firms routinely hire writers and train them in search engine optimization.

To be sure, there is a great need for technical knowledge in the field of SEO. Most of the better agencies and consulting firms have programmers and webmasters on staff for internal and external initiatives. From an entry level standpoint in SEO, the requirements are a bit more basic. For instance, a general knowledge of HTML and experience putting together a basic website are a good start.

The process of search engine optimization basically involves matching up actual search queries (in search engines like Google) with relevant websites. Search engines gain their credibility by presenting information that is as germane to the search term as possible. The order of the words in the search is also an important factor, so people looking for Hotel Rooms in San Francisco will either type in “San Francisco Hotel Room” or “Hotel Room San Francisco.” Either way, a talented content writer can incorporate both phrases into a web page, and that same writer would have the skills necessary to format the meta content (what the search engine uses as a topic guide) to optimize the page.

Another important piece of the SEO puzzle is link popularity and link text. Essentially, the sites linking to your site are telling the search engines what you are about. If 100 sites say that you are about “orange widgets,” and you have made your site relevant to that term, you are likely to get a good ranking in the search engines. Once again, a person who has experience writing term papers and doing research will have an advantage when it comes to finding out good keywords, soliciting links via email, and determining the best word order in the actual link text.

Search engines like Google and Bing like to say that their ranking algorithms (the set of rules that determines how one site gets better placement than another) have up to 300 different factors in deciding placement. Many of these factors are closely guarded trade secrets, but to a certain extent the major rules are known to the SEO world. Given that many of these rules favor “natural language” in website design, and those rules are the same grammar and usage rules that have been drilled into the heads of most English Language and Literature graduates, there is a natural fit for people in this field. A testament to the power of this idea is the good placement for sites like Wikipedia, which almost always deliver very good results for any search. Despite challenges to the veracity of the content on these sites, there is plenty of information, lots of references, and each page is clearly formatted in a style that is useful to the reader and the search engine.

What can a Liberal Arts student do to improve their job prospects in the SEO field? First, do some research. Sources like SEOBook and PlanetOcean have useful guides for learning the basics of SEO. Search Engine Optimization for Dummies is a clear guide to the general principles of SEO, and helps people learn a few terms to drop during the job interview process. There are even plenty of YouTube videos dedicated to the topic. Second, creating a website and optimizing it is a real-world way to learn the ins and outs of search engine placement. The site doesn’t even have to sell anything, but it can function as a test case for SEO training. By learning first hand how to install free tools like Google Analytics and Webmaster Tools, you will be able to see how the search engine universe works, how to monitor traffic, and even how people find your website.

There is quite a bit of on-the-job training in the optimization world, and experience in working with diverse websites is key to getting a well rounded SEO education. By dealing with sites in sectors like health care, legal, real estate, retail ecommerce, and lead generation, you can learn more about the internet landscape than most business school graduates. Secondarily, you will get a feel for universal search engine principles and those that are applicable to specific niche fields on the World Wide Web. Eventually, your SEO knowledge can be a stepping stone to careers in online marketing and product development, which generally have a higher pay grade than other jobs open to people with BA degrees.

Finally, people with English, Psychology, and Communications degrees can benefit from the relatively open and growing status of the Search Engine Optimization industry. Experienced SEO professionals are still in demand despite cuts in other job categories. Many companies are only just learning that dollars spent on search engine marketing create a higher return than newspaper, radio, and TV advertising. These companies are scrambling for SEO professionals, or they are retaining the agencies that employ them. Given the natural fit between the skills needed for a BA degree and the semantic requirements of search engines, now may be the best time for graduates to consider a career in SEO.

Basic Robots.txt File Information

June 22nd, 2009 by Patrick Hare

On many occasions customers come to us with the complaint that they can’t be found. They either had rankings on all search engines and suddenly disappeared, or never were seen in the first place. Believing that they are the victims of a ban in the search engines, they come to us for search engine optimization advice. In many cases, the culprit is found in the robots.txt file, in the form of the classic:

User-agent: *
Disallow: /

(Special Note: Using this command will make your site disappear from the search engines!) The forward slash after the Disallow tells the engines to ignore all files. The solution to this problem is to delete the forward slash, which tells search engines that everything is fair game. If you use Google Webmaster Tools, you will be told that the robots file prevents the indexing of your site. Many times a webmaster will upload this accidentally, or forget to take it down when a dev site goes live. The command effectively tells every honest search engine spider to stop reading your site and go away. Note that unethical spiders that scrape for phone numbers, email addresses, and content will not even bother to look at your robots.txt file, unless they are programmed to look for the files you don’t want found. If you are looking to block spiders run by dishonest people on the internet, the robots.txt file is probably not going to help you, so you should look to server-level exclusions.
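You can see the effect of that single forward slash for yourself with Python’s standard urllib.robotparser module. The two robots.txt bodies below are the variants discussed above:

```python
from urllib import robotparser

BLOCK_ALL = "User-agent: *\nDisallow: /\n"   # the classic mistake: hides the whole site
ALLOW_ALL = "User-agent: *\nDisallow:\n"     # slash deleted: everything is fair game

def can_crawl(robots_txt, agent, path):
    """Parse a robots.txt body and ask whether the given agent may fetch the path."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, path)

print(can_crawl(BLOCK_ALL, "Googlebot", "/index.html"))  # False
print(can_crawl(ALLOW_ALL, "Googlebot", "/index.html"))  # True
```

This is the same parsing logic honest spiders apply, which is why one character can make an entire site vanish from the rankings.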

Depending on the complexity of your site, the robots.txt file can be modified to support your SEO initiatives. If you have a series of pages in a shopping cart, forum, or section that you want to exclude, you can disallow a specific directory:

Disallow: /Example

If you have multiple directories, you would just add them to the list:

Disallow: /Example
Disallow: /secret_plans
Disallow: /things_we_do_not_want_the_world_to_know
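These directory rules can also be checked with urllib.robotparser; the file paths below are hypothetical:

```python
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /Example
Disallow: /secret_plans
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())
print(rp.can_fetch("Googlebot", "/secret_plans/launch.html"))  # False (inside a disallowed directory)
print(rp.can_fetch("Googlebot", "/public/about.html"))         # True
```

Note that the matching is a simple prefix check, so `Disallow: /Example` also blocks every page underneath that directory.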

Alternatively, you can use a newer wildcard format that disallows pages with certain phrases or string segments in them. If you wanted to disallow all the pages with a session ID in them, you could use a command that says:

Disallow: /*sessionid

Keep in mind that this will effectively shut search engines out of these pages, so you should ensure that your string is long enough that it does not accidentally blind the engines to pages that you want to get found. The wildcard robots disallow is ideal for people who may have bought sites and then found out that the site was a parked domain with thousands of “junk” pages installed by a previous owner. Even if you don’t have any of those pages on your site, it can take months for Google to notice that they no longer exist. By excluding them in your robots file, the removal of those cached pages can take less time.
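Python’s built-in urllib.robotparser does not honor these wildcard patterns (it only does the original prefix matching), but the matching rule the major engines apply can be sketched with a small regex translation, where “*” matches any run of characters and “$” anchors the end of the URL. The sample URLs are hypothetical:

```python
import re

def wildcard_blocked(pattern, path):
    """Sketch of wildcard-style robots.txt matching:
    '*' matches any run of characters; '$' anchors the end of the URL."""
    regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(regex, path) is not None

print(wildcard_blocked("/*sessionid", "/store/page?sessionid=abc123"))  # True: blocked
print(wildcard_blocked("/*sessionid", "/store/page.html"))              # False: crawlable
```

This also shows why a too-short string is dangerous: any page whose URL happens to contain the segment gets swept up by the pattern.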

In the past, people have disallowed the /images directory, but we normally don’t recommend this. Image and universal search features on search engines allow your images to get indexed, and that indexing leads to traffic. One of our clients made a substantial number of sales based on image search, so excluding this directory should be done with some thought.

If you want to exclude certain search engines, or direct them away from certain directories, it is easy to set up separate exclusion protocols in the file. For instance, excluding Yahoo! (which uses the “Slurp” robot) from seeing a directory would be done this way:

User-agent: slurp
Disallow: /Example

If you want to see a list of all the useragents (a useragent is essentially the name of the robot) you can exclude, this site has a nice database.
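Here is how such a per-robot group behaves, again sketched with urllib.robotparser (the test path is hypothetical):

```python
from urllib import robotparser

rules = """\
User-agent: slurp
Disallow: /Example

User-agent: *
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())
print(rp.can_fetch("slurp", "/Example/page.html"))      # False: Slurp is excluded
print(rp.can_fetch("Googlebot", "/Example/page.html"))  # True: everyone else may crawl
```

The blank line between groups matters: each User-agent block is a separate record, and a robot obeys the most specific record that names it, falling back to the “*” record otherwise.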

Finally, you may want to tell the search engines about your XML Sitemap, if you haven’t already submitted it through Webmaster Tools. Doing this is easy, since all you have to do is add the following command (with your own sitemap URL) to the bottom of the robots.txt file:

Sitemap: http://www.example.com/sitemap.xml

For most people with a normal site, the whole robots.txt file should look like this:

User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml

There are quite a few great online resources that will guide you through tips and tricks regarding the robots.txt file. Smaller sites only need a basic robots.txt file, which only needs to be modified if search engines have trouble finding pages, or crawl too deep and need to be excluded, slowed down, or properly directed. Larger sites will want to look into protocols that can keep the search engines on the right pages, prevent duplicate content issues, and even keep unnecessary files from getting added to search results. Even though the robots.txt file gets overlooked by many webmasters, we have seen that Google and other engines may look at it many times a day, so any big change to your site should at least include a review of the robots file. Finally, if your site vanishes from all of the major search engines at the same time, the robots.txt file should be the first place you look before checking out potential new webmasters.

[Note: For more advanced information on the Robots Exclusion Standard, Wikipedia has some good information on this topic.]

Why Do Companies in New York, Chicago, and Los Angeles Use Arizona SEO Firms?

June 17th, 2009 by Patrick Hare

Many of our biggest customers come from outside the State of Arizona. Given the virtual nature of search engine optimization, this is no surprise, since SEO work can be done just about anywhere in the world, though results may vary. Since its inception in 1997, Search Agency has been performing SEO projects for firms in cities which have top level marketing agencies within the space of a few blocks. We service clients in Los Angeles, London, Chicago, New York, Miami, Houston, and around the world, and many of our Arizona SEO colleagues get a crack at the same prospects before they become clients.

What’s behind the “gold rush” for Arizona SEO firms? One consideration may be price. An expensive Park Avenue marketing agency has to pay a New York City price for rent, and higher salaries for people who need to live in NYC apartments, which are still not cheap despite recent housing price adjustments. As a result, the expense gets passed on to the client. A good deal of the talent pool for technology also finds itself priced out of “big city” living and finds the cost of living in Arizona more affordable, despite the air conditioning cost that comes with four months of triple digit temperatures.

However, price is not the only consideration for SEO work. There are plenty of Far East SEO firms who can do a project for a fraction of the cost in US dollars compared to prices charged by Phoenix, Scottsdale, and Tempe SEO companies. Some of our clients have gone this route first, and then come to us. Part of the problem with offshore search engine optimization is that its practitioners are not as familiar with US English vernacular, so the on-page content is in many cases inscrutable. Search engines are steadily improving their natural semantic recognition features, so bad content gets bad results. Secondarily, offshore SEO firms are not necessarily as up to date on higher-end linking and optimization practices. Many companies will use automatic tools to create content for your site, using scraped or “spun” information that approximates what you’d get when you put a dictionary in a blender. Lastly, the 12-hour time difference means that emails don’t usually get answered until the next day, and you can’t have many teleconferences where people on both sides of the phone are awake.

Outside of New York, one of the biggest sources of clients for Search Agency is Chicago. It may seem unusual for Phoenix firms to do Chicago SEO work, given that Phoenix is not windy, has no nearby lakes, and is rarely cold enough to require a jacket, but Chicago companies have been using Arizona optimization for years. Many of our Chicago SEO clients even fly out to our Scottsdale Airpark location in their own planes and pay us a visit. If they wanted to shop around for Arizona SEO, they would only have to travel a few miles to find several other agencies that do the same thing. The advantage of being in town with so many other agencies is the cross-pollination that comes with proximity. Secondarily, the many technology, website design, and hosting companies in Arizona provide a rich source of individuals who understand the HTML and programming techniques necessary for implementation.

Surprisingly, we also service SEO clients in London, where “optimisation” is the preferred spelling of the word. Some of these customers have a US or global presence, but other firms are looking for a global SEO experience base that may not be found servicing the comparatively lower number of sites in the UK. Other factors in play could be the cost of labor in the UK or EU, and the favorable exchange rate that makes it cheaper to hire a US firm and still get a higher level of quality and people you can ring up during working hours (albeit just barely).

What is not so surprising is the number of customers who come to us through search engines; a scan of the top agencies in the nation is weighted with SEO firms from Phoenix, Scottsdale, Mesa, Glendale, Tempe, and Chandler, Arizona. Many of the most popular search terms bring up AZ agencies in the top 100 at the expense of larger cities. Given the lower population of the Phoenix metro area compared to LA or New York, this may seem unusual, until you consider that search engine rankings are a testament to the talent of the agency that earned them, and Arizona has quite a few people who are good at search engine optimization.

SEO Consultant – What To Ask Before Hiring One

June 16th, 2009 by Patrick Hare

The field of search engine optimization consulting contains people with varying skill levels and capabilities. When looking for an SEO consultant, you want an experienced individual or company who can give you advice that is useful and timely. Here are a few things to ask when selecting a search engine optimization consultant:

  1. How many sites have you worked on? A good consultant should have worked on a few dozen sites at minimum. Every site has its own nuances, and a well rounded SEO consultant will have come across different challenges with each site. Experience in problem solving and proactive prevention are two keys to making sure your site isn’t the victim of any “rookie mistakes.”
  2. Can I call your references? It is not uncommon for optimization customers to demand non-disclosure agreements. However, there are generally a few people who are willing to be called as a reference, especially if the consultant delivered substantially good results.
  3. What do you think of my site? Some consultants won’t look at your site without getting paid first. If someone is proposing an SEO job without at least looking through your site structure, you can put them at the bottom of the list of potential agencies. Several agencies, including Search Agency, will check out your site in order to craft a proposal that best meets your needs. Almost any good agency or consultant would point out glaring SEO obstacles before submitting a proposal. You shouldn’t expect to get in-depth optimization advice for free, but if your consultant did not mention a major obstacle, it is always possible that he/she did not notice it, which is a red flag in itself.
  4. How much SEO do I need? Many consultants have fixed fees or a minimum hourly rate. The scope of your optimization project is going to be determined by factors like keyword competition, the number of pages to optimize, your current link popularity, and difficulties peculiar to your website. Some sites need extra coding, restructured URLs, or upgrades before SEO can be properly implemented. Even so, custom SEO consulting generally takes this into account before a proposal is generated.
  5. Is implementation included? Most of the time your webmaster will be the one to add metatags, upload content, or make back end changes to the site. SEO professionals can often add this information in a pinch, but it doesn’t always make sense to have too many hands touching your website code. Moreover, webmasters may not be willing to share passwords, and should ideally be part of the process so search engine optimization work does not get blown out by the next update.
  6. Have you worked on sites in my field? Despite the fact that SEO follows pretty much the same rules whether you’re selling baseball bats or music boxes, there is a certain advantage to having someone who already knows the keywords associated with your products and services. Admittedly, the structure of your site has to be search engine friendly no matter what you are selling, so SEO experience in your field is not a requirement, but it is a big plus.
  7. What kind of techniques do you use? Even today, some advisors use tricks that are a few years out of date, and may harm your site’s search engine rankings. By learning about the type and kind of SEO processes being used, you can do some homework and make your own decisions.
  8. What are the risks of your advice? Most of the time, bad SEO advice does nothing. In this case, “nothing” is expensive, but you aren’t directly being harmed. Search engines usually devalue bad links to your site, and ignore duplicate content, so you are paying for no progress. In extreme cases, bad SEO consulting will get you filtered, penalized, or banned in the search engines. Remember that you are responsible for your website, and search engine robots can penalize you automatically if you meet a certain threshold of “black hat” practices on your site.
  9. How long do you expect it to take? The stock answer is that it should take months, unless you have a great site and link profile with a clear stumbling block. This is not common. New sites take longer to rank than old sites. New links to your site need time to “age” to get the best results. If someone tells you that it only takes 2 weeks to get top 10 results, then you should be wary.
  10. How much will it cost? Keep in mind that this answer will be wildly different depending on who you ask. If someone is claiming great results for a few hundred dollars, this is unlikely. If the cost is in the tens of thousands of dollars, then you should get more bids and also get plenty of up-front information about what is required to get the job done. If you’re selling airline tickets or online dating, the price will be quite high, but if you’re selling a local service, the price should be lower. Search Engine Optimization should not be a high-pressure field, so if someone is too pushy about getting you on board, you should at least shop around.
  11. How do you measure results? In most cases, results can be measured by search engine ranking reports and Google Analytics, a free tool from Google that measures all traffic to your site. Your SEO consultant should set you up as the “owner” of the analytics account, while retaining access in order to see how things are going and to generate custom reports that show progress. Counting backlinks and the number of pages optimized only matters when results are achieved, so if you’re getting work with no progress, then it may be time to ask more questions.

Transparency is the key to any good SEO consulting relationship. Given that almost every known optimization technique is discussed online at length in blogs and forums, the consultant’s technique and experience are what you are buying. A good SEO advisor is worth several times the money you pay him or her, but the key to getting the best ones involves asking enough questions up front to ensure that you have made the best choice. Don’t be afraid to ask around, or even compare one person’s claims against another’s. In almost every case, a consultant should be able to talk about techniques that another vendor is selling, and discuss the pros and cons of any particular course of action.

Search engine optimization agencies usually pool the consulting job among several specialists, which allows for more vibrant and dynamic approaches to your particular website’s attributes. This also allows each specialist to determine the most relevant links, the best content, and the ideal structure on your site for search engine robots. In turn, your site becomes easier for the search engines to classify, and your rankings should noticeably improve over the course of your project.