Search Engine Spam, Or Spamming the Search Engines
If you are grocery shopping, it's easy not to buy Spam. In the world of search engine optimization, though, it's a different story. Most of us receive hundreds of spam emails each day, but "spam" also refers to "spamming the search engines," something you want to avoid at all costs when performing any search engine optimization work on a web site.
So What is Search Engine Spam?
“Spam,” or “spamming the search engines,” refers to any technique used to trick or “spoof” the search engines’ robots or spiders. Webmasters and web site operators are constantly looking for new ways to trick the search engines into giving their web sites better rankings. One such method is known as search engine cloaking. On the other side of the battle, the programmers, administrators, and developers at the search engines themselves are continually refining their definitions of spam so they can drop sites that use spam tactics. The search engines hate spam because they try to provide their users with the most relevant results possible, and they view spamming tactics as decreasing the value of those results.
If you are not keeping up with the latest search engine guidelines on what constitutes spam, you could unwittingly be using spam. That’s one very strong argument for using a search engine optimization specialist, like Metamend, to keep your web site up to date with the latest rules and algorithms.
To help give you an idea of some of the more common spam tactics, we are listing some here, and explaining why they were classified as such. Hopefully this will provide you with some insight into where the next change may come from, and also some idea as to the things that the search engines look for.
Keyword Stuffing in MetaTags
Webmasters commonly used to insert an unlimited number of keywords in their meta tags. Because these metatags were the primary tool most engines used to evaluate a web site, enough mentions of the word “radioactive” in your tags practically guaranteed a top ranking for that keyword. In response to the stuffing problem, search engine operators set character limits for the meta description and meta keywords tags: they would read only the first ‘X’ number of characters, or ignore an over-length tag completely. Repetitious terms therefore no longer benefited a site as heavily. Later, to combat web sites that simply repeated the term “travel” 50 times and thus ranked highly for that one term, the search engines also limited the number of times a term could be used within a tag. If you went over that limit, they could drop your site from their index as spam.
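To illustrate the difference, here is a hypothetical pair of tags (the terms and lengths are made up for the example; no engine publishes its exact limits):

```html
<!-- Stuffed tag: one term repeated over and over, the kind of
     repetition that character and term-count limits were built to catch -->
<meta name="keywords" content="travel, travel, travel, travel, travel, travel">

<!-- A more sensible tag: distinct, relevant terms within a modest length -->
<meta name="keywords" content="travel, vacation packages, discount airfare">
<meta name="description" content="Discount travel packages and airfare deals.">
```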
Using Irrelevant Keywords And Terms in MetaTags
Because the value of a web site was measured by how many “eyeballs” saw it, using completely irrelevant keywords in metatags was a favorite way to improve a site’s ranking. For example, a webmaster who inserted a popular term like “sex” or “MP3” into his tags could earn a high ranking for it, even if the content of the web site was all about houseplants. Of course this forced the search engines to adapt again, and they began penalizing web sites whose keyterms did not appear in, or were unrelated to, the text of the site.
Tiny and Invisible Text
This was a popular strategy. Because your web pages need to appeal to users as well as the search engines, it is sometimes very difficult to place every keyword and keyterm where you want it in your text. To get around this, some web site operators used “invisible text” (text written in the same color as the page background) to fill their pages with content. It was therefore visible to the engines, but not to the human eye. A similar trick was “tiny text”: text so small that it appeared as a line of periods or dots on the monitor, but still read as text to the search engines. The search engines have banned both practices.
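Both tricks can be sketched in a few lines of era-appropriate HTML (the keyword string is a made-up example):

```html
<!-- "Invisible" text: the font color matches the page background,
     so human visitors see nothing while spiders read the words -->
<body bgcolor="#FFFFFF">
  <font color="#FFFFFF">cheap travel cheap travel cheap travel</font>

  <!-- "Tiny" text: rendered so small it looks like a row of dots
       on screen, yet it is still ordinary text to a spider -->
  <span style="font-size: 1px">cheap travel cheap travel cheap travel</span>
</body>
```

Modern engines detect both patterns easily, by comparing text color to background color and by flagging absurdly small font sizes.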
Meta Refresh Tag
The meta refresh tag is a low-tech version of cloaking. It was used to hide “doorways,” or “leap pages,” from users. The tag automatically redirected visitors to another page after a specified delay, measured in whole seconds. By setting the delay to zero, web site operators could hide their doorway pages from users entirely. The search engines would spider the doorway page and then follow the link, while users were sent to another, often completely different, page. This was very popular with the adult industry: it was possible to search for “Financial Advisor,” see a list of relevant results, choose one, and then be redirected to a pornographic web site. Since there were so many variations on redirects, the administrators at most of the search engines decided to ban any kind of redirect page. The rule is that visitors should see the same pages a search engine’s spider sees.
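A doorway page of this kind looked roughly like the following (the URL is a placeholder, not a real site):

```html
<head>
  <!-- A zero-second refresh: spiders index the keyword-rich doorway
       page below, while human visitors are whisked away immediately -->
  <meta http-equiv="refresh" content="0; url=http://example.com/real-page.html">
</head>
<body>
  Financial advisor financial planning investment advice ...
</body>
```

The `content` attribute holds the delay in seconds, then the destination URL; a value of `0` makes the doorway effectively invisible to people.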
Excessive Search Engine Submission
Most search engines limit how many submissions they will accept from a particular web site within a certain period; the window may be a day, a week, or a month. Excessive submission occurs when a search engine receives too many submissions of the same URL within that period. There was a time when re-submission would get your site re-indexed, and the search engines favored web sites that were ‘active’; re-submitting signalled that your web site was active. Today, if you exceed the limits, you might get your web site banned. This is very important to avoid, because many web site operators add their site to lots of free submission tools.
A web site should not be over-submitted, because doing so can have the opposite effect to the one desired. Submission is a very important part of search engine optimization: get it done properly, and don’t try to overdo it. In many other areas of SEO, extra effort helps the process work better; you can work harder at building links, and you can spend more time creating content-rich text for the web site. But over-submitting will only undo all your hard work.