Monitoring How Your Web Site Ranks In The Search Engines

The position your web site enjoys in the search engine results will constantly fluctuate. Here are a few of the most common reasons why:

  1. New web sites are constantly being added to, and dropped from, the index.
  2. Competitors change the content of their web sites, and their positions change with it.
  3. The search engines change their algorithms.
  4. The search engines use thousands of servers to deliver results. The query you enter now may be answered by a different server, with a slightly different index, than the one you entered 20 minutes ago.

Checking Your Search Engine Placement

DO NOT buy any software that promises to monitor your ranking across all the major engines. We are often asked in conversation why Metamend does not offer a “ranking” report. There are a variety of reasons for this. The most obvious reason we avoid them is that a synonym for “rank” is “stink”, and that is exactly what these reports do. The other major reasons we avoid such tools are:

  1. They are unreliable.
  2. They are biased.
  3. They may result in IP banning.

1. All ranking checkers we have looked at, including the market leaders, are highly inaccurate. For this reason alone, anyone offering such a service should attach so many disclaimers that the buyer would realize it is a virtually worthless tool. We even resorted to writing our own customized solution, but found there was no measurable way to produce good, honest, reliable results.

One of the big problems in determining a web site’s search engine ranking is the variance between the search engine’s indexes. Use any search engine you want, but we’ll use Google to illustrate the point. When you type a query into Google, where is it actually answered? Type Google.com into your browser; sometimes you end up there, sometimes at www.google.com, sometimes at www.google.ca, sometimes at google.fr, etc.
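You can see this server spread for yourself without sending a single search query, simply by resolving the engine’s hostnames. Here is a minimal sketch (we use Python purely for illustration; any DNS lookup tool will show the same thing). The hostnames are real, but the exact addresses you see will vary by location and over time:

  import socket

  # Each hostname resolves to several physical addresses; your query
  # may land on any one of them, each with its own copy of the index.
  for host in ("google.com", "www.google.com"):
      addresses = sorted({info[4][0] for info in socket.getaddrinfo(host, 80)})
      print(host, "->", addresses)

Run it twice, a few minutes apart, and you may well get different address lists back. Each of those machines may be serving a slightly different snapshot of the index.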

Depending on whether you check rankings on “google.com”, “www.google.com”, or any other variant (the same holds for Yahoo! and the rest), and depending on whether the index you queried is the most up to date, the results you get can differ dramatically. The reasons vary: server loads, indexes spread across hundreds if not thousands of web servers, ongoing indexing, the time of day, the position of the moon relative to the sun (well, you get the idea) are all factors that make ranking report results unreliable.

2. How objective are the reports you are receiving? In a scenario where you have paid someone based on a “Top X Ranking” guarantee, the report they send you may not be worth the paper it is printed on, let alone the cost of the monitor you view it on. After all, the person or software you chose is reporting on its own success, and the data it uses will be the data most favorable to itself. Are these the actual queries being entered by the general public? Are they truly the most commonly used terms? Are they objectively the “right” words, or simply the ones you happen to use? This obviously affects your results.

3. Lastly, some engines have begun to strike back against ranking reports. Google actually shut down 100 IP addresses from Comcast, charging that the provider hosted accounts that abused Google’s terms of service by performing an overwhelming number of automated queries.

The conflict centers on ranking reports, and has raged around software that attempts to determine where a web site, and its competitors, are placed within the search engine results. Google and other engines have long stated an aversion to computer-generated search requests. They view these requests as a hostile attack on their systems, somewhat akin to “Denial of Service” attacks. Such software ties up substantial server resources and bandwidth by hitting the engines with hundreds of queries at once, queries that are essentially all variants on the same terms, just with the words mixed into different orders.
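To give a sense of the scale involved, consider how quickly those word-order variants multiply. The short sketch below (Python again, with a made-up five-word phrase) shows that a single phrase already produces 120 distinct queries; multiply that by dozens of phrases, several engines, and a report that re-runs regularly, and the load adds up fast:

  from itertools import permutations

  # One made-up five-word phrase, reordered every possible way.
  words = ["cheap", "victoria", "bc", "hotel", "rooms"]
  variants = [" ".join(p) for p in permutations(words)]
  print(len(variants))   # 120 distinct queries from a single phrase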

These queries slow down the search engines for their real clients: regular users who are searching for something. Further damaged are the search engines that use the number of queries leading to click-throughs as part of their ranking algorithm. The software or service can be taught to automatically follow certain URLs, spoofing the engine into believing the web site in question is more popular than it really is and forcing it to rank higher in subsequent searches. This is an unethical practice.

And ranking report software does all of this with the click of a button.

While the enforcement tactics Google used on Comcast’s subscribers were heavy-handed, they certainly did not take this action without consideration, and this probably will not be the last time we see action of this type. Denying service to blocks of IP addresses when it cannot track down the specific abuser may seem harsh, but running a business where individuals try to manipulate your product requires vigilance and the occasional “warning shot across the bow.” This escalation may be the first step in a battle that could see many more innocent web site operators (who are not trying to spoof the engines) caught in the crossfire. How would you like to be a web site operator whose site becomes inaccessible from an engine because others within the same IP range used “ranking” report software for their web sites? The engine could effectively shut down your business, albeit unintentionally. Bottom line: Metamend will not consciously contribute to this problem. We prefer the honest, verifiable approach.

Checking The Position Of A URL Search Engine by Search Engine

You can check your URL manually with each search engine. Before checking how your web site performs, first see whether your site is listed at all. Some of the search engines make this easy, and some don’t. We suggest that you simply search for your domain name; if you are indexed, it will show up (a concrete example follows below). Once you have ascertained that your web site has in fact been indexed, you can check how it performs for the terms you believe it should be found under. To check your position for specific terms or phrases, enter each of your keyword phrases into each engine, one at a time, and check the results. If you place well, you’ll know that you have covered the bases for the term you just checked, and can move on to the next one. If not, you will need to review your content, meta tags, alternative tags, coding, etc. for errors or omissions.
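On the question of whether you are listed at all, most of the major engines support some form of domain-restricted query; the exact syntax varies from engine to engine, so check each engine’s help pages. On Google, for example, entering

  site:www.yourdomain.com

(with your own domain substituted in, of course) lists the pages Google currently has in its index for that domain. If nothing comes back, your web site has most likely not been indexed yet.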

Should A Web Site That Is Performing Well Be Resubmitted?

Absolutely. You should resubmit on a schedule of every four to six weeks, simply to remind the search engines that your web site exists so that it is not dropped from the index. Be careful, however: oversubmitting will hurt the performance of your web site, and can cause it to be dropped from the index.