Getting Googley Eyed!
Hello Google! You probably don’t know this but we’ve been helping you succeed for the past four years. You see, we’re a search engine optimization company dedicated to ensuring proper indexing within your search engine. Well, actually, ALL of the top search engines, but everyone wants to be ranked highly on yours, so we pay very close attention to your mood swings and temperament in order to keep you satisfied.
As it stands, our clients do rather well within your listings, which raises the question: why did you change the way your results are produced? In your attempts to ban spam sites from your index you've let in a -LOT- of unsavory elements, and they are ranking pretty high on your charts. I suspect you folks will be plugging these holes as time marches on. I certainly hope this is the case.
I've also noticed that a lot of the top results which appear to come from different domains are actually just affiliate sites with disguised front pages; click on any link within the site and you are redirected to 'the mother ship.' Musicians Friend and Music123.com are examples of this type of 'doorway' marketing. I assume these sites will eventually be weeded out as well, but for now they are permeating your results. At best it's a nuisance for the searcher; at worst, it's quickly diluting the relevance of your results. Be careful, Google! The Internet is a fickle place.
For all you site owners out there, here are three observations, theories I've been researching recently, relating to how Google has adjusted the way they produce search results.
One: Word Frequency Filters. Certain words seem to trigger a filter in Google and eliminate sites which were once on the list. The word 'software', for example, seems to do this for certain sites but not for others. It appears to depend upon the density of the word: how many times it is used within the text of the page and how it is used within sentence structure, i.e., repetition. Some words will not trigger the filter no matter how many times they are repeated. Other words will trigger the filter at a maximum stand-alone word count of a mere three or four times. More than that trips the filter. Which brings us to observation number two.
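To make the observation concrete, here is a minimal sketch of the kind of stand-alone count and density check you might run on your own page text. The three-to-four repetition ceiling is this article's observation, not a published Google limit, and the function name is our own invention.

```python
import re

def keyword_density(text, keyword):
    """Count stand-alone occurrences of a keyword in page text and
    return (count, density). Purely illustrative: the notion that
    3-4 repetitions trips a filter is an observed pattern, not a
    documented rule."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    count = words.count(keyword.lower())
    density = count / len(words) if words else 0.0
    return count, density

count, density = keyword_density(
    "Our software is great software. Download the software today.",
    "software")
# count = 3, density = 3/9
```

If a word like 'software' appears as a stand-alone term more than a handful of times, the observation above suggests rewording some occurrences rather than repeating them.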
Two: Phraseology. Although this is just a personal theory I will be investigating further in the near future, it seems the way certain words are used within specific phrases can affect Google's search results. The frequency of the exact phrase also has a bearing on things. It's a sort of combined allowable keyword density, where certain keywords are OK on their own (used sparingly) and also OK within certain specific phrases in the content, again with limits on repetition. But if the keyword has been spread out through the use of different text phrases, the frequency filter is not triggered as readily.
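The "spreading" idea above can be sketched as counting how many distinct surrounding phrases a keyword appears in: a keyword repeated inside one identical phrase spreads less than the same keyword used in varied phrasings. This is our own illustrative proxy for the theory, not anything Google has documented.

```python
import re

def phrase_spread(text, keyword):
    """Return the number of DISTINCT three-word phrases that contain
    the keyword. A higher spread means the keyword is used in varied
    phrasings; a low spread means verbatim repetition. Hypothetical
    measure, named by us for illustration."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower()
    phrases = set()
    for i, w in enumerate(words):
        if w == kw:
            phrases.add(" ".join(words[max(0, i - 1):i + 2]))
    return len(phrases)

# Varied phrasing: two distinct contexts for 'software'
phrase_spread("free software downloads and software reviews", "software")  # -> 2
# Verbatim repetition: one context, repeated
phrase_spread("buy software now buy software now", "software")  # -> 1
```

Under the theory above, the first page would slip past the frequency filter more readily than the second, even though both use the keyword twice.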
Remember, Google recently purchased a company working on phraseology and natural language analysis. It may be a little early for that software to be entirely integrated into Google's search algorithms, but they may be testing the waters.
Metamend will soon be introducing technology not only to absorb web site content through the use of a multilingual thesaurus tool (just another of our little inventions), but also to suggest terms people may want to have on their sites to increase their relevance. We think having the ability to tell people what is NOT on their site but -should- be is kind of a cool tool. We trust you will agree.
Three: A theory I call "The Vagueness Factor." Taking the two points above into consideration, it is logical, and somewhat interesting, to note that the more arduously specific you are with your search terms in Google, the better your results. Use a three-, four-, or five-word query and you stand a much better chance of finding what you were looking for. It wasn't always like that. You used to be able to be 'vague' in your searches and simply type in "Used Car Dealers." This would bring you a list of sites of commercial operations which sold used cars. Now, however, a somewhat vague term like this gives you back DMOZ category listings, national automobile dealer networks, and other resources from which to choose. A lot more work for the lazy searcher.
Type in an extremely specific query, however, such as "Used Car Dealers, Honda, Memphis" and you'll find exactly what you want. This raises two important points. A) Google is, for lack of a better term, "training" the general searching public to be much more specific when looking for something. A neat trick. I like it. B) They seem to be moving toward integrating location-based search results. Geographic addresses are becoming increasingly relevant to have on your web site. Check your web site to ensure your full address is on there, on several pages. Also, if your site is indeed location specific, for example a bank or gas station, remember the first two points as well and add some content with the location name in several different sentence phrases. E.g. "Here are directions to our gas station in Memphis!" or "When visiting Memphis our gas station is easy to find." or "Our gas station is conveniently located in Memphis, just off the Interstate."
The point being: by combining location, keywords, and phraseology together within your content (but not overdoing it on the repetition), you'll stand a much better chance of being found in Google's newly adjusted search results.
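The combined advice above can be sanity-checked with a short sketch: count how many distinct sentences on a page mention both the location and the keyword. The function name and the example sentences (taken from the gas-station examples above) are illustrative; no particular count is known to be required by Google.

```python
import re

def location_phrase_count(page_text, location, keyword):
    """Count sentences that mention both the location and the keyword.
    Several distinct sentences suggest the varied phrasing recommended
    above; many identical ones would just be repetition. Hypothetical
    helper for illustration only."""
    sentences = re.split(r"[.!?]+", page_text)
    hits = [
        s for s in sentences
        if location.lower() in s.lower() and keyword.lower() in s.lower()
    ]
    return len(hits)

text = ("Here are directions to our gas station in Memphis! "
        "When visiting Memphis our gas station is easy to find. "
        "Our gas station is conveniently located in Memphis "
        "just off the Interstate.")
location_phrase_count(text, "Memphis", "gas station")  # -> 3
```

Three differently worded sentences carrying the same location and keyword is exactly the spread-it-out pattern the observations above favor.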
Another ED NOTE:
I won't go into all the details here, but if you are a Metamend client, you should already know we anticipated location-based searching as a growing trend and have, in fact, developed and implemented address-extraction technology within our SEO services.