Updated: July 27, 2017
Anyone entering the world of search engine optimization (SEO) quickly learns the neat tricks of the trade. Some of these practices are legitimate (in the eyes of Google) and will not result in a penalty or outright ban from the search engines.
But others – known in the trade as “black hat” – are questionable at best these days. Some of these practices were fine back in the ’90s, when search engines were still in their infancy. Today, however, search technology is much more advanced and can easily spot many of the practices I’ll outline below.
One thing to remember, though, is that when we say search engines, we primarily mean Google. We’re not forgetting other search engines like Yahoo, Bing, Baidu and the rest. It’s just a reality that Google captures over two-thirds of the Internet’s searches. So when we optimize a website for the search engines, we are primarily working with Google from an SEO perspective.
Continue reading for 8 SEO tricks you want to avoid altogether. Doing so is your best insurance against being penalized by Google.
Because once you’re in that hole, it’s a real challenge to dig yourself out…
8 Black Hat SEO Practices To Avoid At All Costs
The practices described below are generally considered by Google to be “black hat.” If they decide to manually review your site’s code and remove you from their index, it can take a long time to recover. It’s best to avoid these practices involving keywords, links and other technical elements of your website.
1. Keyword stuffing
Keyword stuffing is perhaps the oldest trick in the book when it comes to SEO. Search engines loathe keyword stuffing and can absolutely detect it. Basically, keyword stuffing consists of repeating keywords over and over again. It usually appears at the bottom of a page in very small text.
For instance, let’s say you’re trying to target the phrase “mountain vacations.” One common keyword stuffing technique might look like this in your site’s code:
<h6>mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations</h6>
As you may or may not know, an <h6> heading makes text very, very tiny. Including this on the bottom of a webpage isn’t noticeable by people, but is counted by search engines. During the Stone Age of SEO, this is how early webmasters got their sites to the top of the search engines.
Keyword stuffing can also be done in the meta description and meta keywords tags, and in image alt attributes.
For alt attributes, say we have an image and cram our keyword into both the alt and title attributes over and over. Google considers this keyword stuffing, and it could land you in trouble.
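Here’s a rough, hypothetical sketch of what a stuffed alt attribute might look like (the image filename and wording are invented for illustration):
<img src="cabin.jpg" alt="mountain vacations mountain vacations cheap mountain vacations best mountain vacations" title="mountain vacations mountain vacations mountain vacations">
A legitimate alt attribute would simply describe the image – something like alt="log cabin overlooking a mountain valley" – with the keyword appearing once at most.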
To see whether a webpage contains any of these elements, simply use your browser’s View Source feature.
While it’s possible to trick the search engines for a while if you’re really experienced, they almost always detect keyword stuffing sooner or later and act accordingly. Competitors can also file spam reports with Google, so our advice is to avoid keyword stuffing altogether.
Additional reading: 11 Steps to Increasing Keyword Saturation while Maintaining Valuable Content
2. Invisible, barely visible or hidden text
A constant dilemma for search engine marketers is to develop web pages that appeal to both visitors and the search engines. The dilemma is that search engines love simple pages with loads of content, whereas people like pages with animation, graphics and lots of special effects – the very same elements search engines cannot crawl or index.
One of the ways SEOs used to get around this was to create text that was invisible or hidden. But with today’s more sophisticated search engines, this can be construed as keyword stuffing and get you in trouble.
One way webmasters would do this is to create text in the same – or nearly identical – color as the page’s background. The visitor wouldn’t see any words, but the search engines would still find all of those keywords.
For example, you could set a white page background (<body bgcolor="#FFFFFF">) and a white text color (<font color="#FFFFFF">). It’s also possible to offset one of the colors slightly so the text color isn’t an exact match. That’s a little harder for the spiders to detect, but if Google manually reviews the page, they will definitely catch it.
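As a rough illustration (the inline style below is just one of several ways to write it), white-on-white text might look like this:
<p style="color:#FFFFFF; background-color:#FFFFFF;">mountain vacations mountain vacations mountain vacations mountain vacations</p>
Visitors see a blank space, but the words are still sitting in the HTML for the crawler to read.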
CSS is another creative way webmasters have adapted the hidden text strategy. They basically would use Cascading Style Sheets (CSS) to hide text from humans while making it available to search engines.
Below is an example of our keyword hidden with the CSS visibility: hidden property.
<div style="visibility:hidden;">mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations mountain vacations</div>
To see the text, someone would have to look at your page’s source code.
Human reviewers at Google do audit sites, so if they check yours out, they’ll certainly notice if you’ve done this.
There’s one instance where hidden text is generally okay: CSS that lets visitors toggle content between hidden and visible states, such as tabbed product descriptions on ecommerce sites. This is generally viewed as acceptable (we think), but only if the user can choose whether or not to view the text.
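For comparison, here’s a minimal sketch of user-controlled hidden text – the kind that’s generally fine – using the standard HTML details and summary elements rather than a full CSS tab setup (the product copy is invented):
<details>
  <summary>Read the full product description</summary>
  <p>Hand-built acoustic guitar with a solid spruce top and mahogany back and sides.</p>
</details>
The content starts out hidden, but the visitor – not just the crawler – can choose to reveal it.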
One more way to hide text with CSS is to use layers and place text behind pictures or other objects on the page. Using the z-index property, the webmaster simply assigns the viewable item a higher z-index value than the hidden text.
Next, they use absolute positioning – another CSS feature – to place the text and the image in the exact same location.
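Roughly speaking, and assuming made-up class names, the layering trick looks something like this:
<style>
  .hidden-keywords { position: absolute; top: 0; left: 0; z-index: 1; }
  .cover-image { position: absolute; top: 0; left: 0; z-index: 2; }
</style>
<div class="hidden-keywords">mountain vacations mountain vacations mountain vacations</div>
<img class="cover-image" src="banner.jpg" alt="Mountain banner">
Because the image has the higher z-index and sits in the same spot, visitors never see the keywords even though they’re right there in the page’s HTML.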
Again, this tactic is harder for a computer to detect but careful review by a Google auditor will certainly reveal it. It’s best to avoid this or any other tactics designed to hide text from visitors but make it viewable by the search engine spiders.
3. Selling links to increase a target URL’s PageRank
Another black hat practice search engines strongly frown upon is selling links on your site. Paid links often look unnatural – when you spot a block of them, none of the links have anything in common.
Take the following example, for instance, which may look similar to what you’ve seen across the bottom of some web pages:
Mountain vacations – Plastic Surgeons in Florida – Buy Gold – Used Cars for Sale
As you can tell, none of these links has anything to do with the others, which is a tell-tale sign of sold links. If the links are all for businesses located in the same town, there’s generally less of a problem. But if a page has links going to an offshore gambling site, for instance, the risk of getting penalized is much higher.
Using reciprocal link directories can also result in a penalty in some situations, especially if they have a wide focus of unrelated content.
Additional reading: Directory Submission Dos and Don’ts
Somewhat related to selling links is the risk of your site getting infected by malware or being hacked. If you’ve set up Google Search Console on your site, you should receive a warning from Google saying your site has been hacked or is hosting malware.
If you end up in Google’s penalty box for selling links, it can take some time and effort to get out. First you should remove the offending links and pinky promise Google never to do it again. It usually takes three months or longer from the time they set your PageRank to zero until you’re back in.
Unfortunately, many webmasters find the potential profits too good to pass up despite the rigid warnings and penalties for selling links. Check out Google’s Webmaster Guidelines to learn more about their position on this topic.
4. Hidden links and phantom pixels
Two other common black hat practices involve links that Google really can’t stand and loves to penalize sites for: hidden links, and what’s known as the “phantom pixel.”
Hidden links are hyperlinks that are obscured from a visitor’s view and strategically placed to direct the search engine to an unrelated site. The webmaster likely wants these off-topic sites to be indexed and rank well. Using hidden links boosts link juice (or PageRank) on favored web pages.
Whether paid for or not, the point is the links are NOT there for the site visitor to find. Since they hold no value for the site visitor, Google and other search engines penalize sites that have them.
Techniques for hiding links are quite similar to the invisible and semi-visible strategies used for keywords, and CSS layering like we discussed above is another option. Heck, you could even hide a link in the period at the end of this sentence. Even though the link is technically invisible, search engines will still consider it a hidden link and act accordingly.
Phantom pixels work much like a link hidden in the period at the end of a sentence, except the link is placed on a 1×1-pixel image. These images can also contain keywords in their alt attributes (like we discussed above), but webmasters mostly use these super-small images to hide links.
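Both tricks are simple to sketch. Assuming invented URLs and filenames, a link hidden in a period and a phantom-pixel link might look like this:
Plan your next trip today<a href="http://example.com/offshore-casino">.</a>
<a href="http://example.com/offshore-casino"><img src="dot.gif" width="1" height="1" alt="mountain vacations"></a>
Either way, the visitor has practically nothing to see or click, which is exactly why Google treats these as hidden links.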
Like other things we’ve already talked about, phantom pixels are another way for your site to be penalized or even banned – assuming Google discovers these black hat tricks on your site.
And eventually you can assume you will be caught.
5. Doorway pages
Another black hat tactic used by aggressive SEOs is to create large numbers of pages whose only purpose is to rank well for as many keywords as possible. These pages are generally very low quality. Many of them are automatically generated by software programs designed to optimize pages around a specific long-tail keyword.
Two tools are generally used to create these pages. One is a software program that copies or “scrapes” content from other web pages or RSS feeds. These pages are re-published to link or redirect visitors to main sales pages on the site.
The other tool is what’s known as Markov chain content generation. This tool uses special algorithms to combine words in unique ways. These pages generally escape many spam filters but read as complete nonsense to humans.
Here’s an example of Markov chain content generation:
A bowling ball daydreams, because a power drill eats the maelstrom about another polygon. Another highly paid spider buries the college-educated line dancer.
Whatever you do, do NOT use software to automatically generate content. While it’s fine to use content management systems and other software to manage your site’s content, content for people should be created by a real person.
6. Meta & JavaScript redirects
If you’ve been surfing online and noticed your browser loading a different page – sometimes on completely different sites – you’ve been redirected. The process generally only takes a split second and is hardly noticeable by site visitors.
Redirects are common and okay in Google’s eyes, if they’re used to guide visitors to the most up-to-date content on your site. In fact, here at SEO Advantage we use 301 redirects all the time to funnel visitors to our most relevant pages.
But when redirects are used improperly, it could land you in hot water.
What black hat search marketers do is build a keyword-rich page designed to rank the site high in the search engines. However, when a person clicks on the link in the search results, a redirect sends the visitor to a page more suitable for real people.
More specifically, two ways search marketers use redirects for nefarious purposes are the meta refresh tag and JavaScript.
Meta refresh is a section in the HTML code that causes the browser to redirect the visitor to the desired page. See the example below:
<meta http-equiv="refresh" content="1; url=index.html">
The “1” in the content attribute is the number of seconds the keyword-rich page will display before the visitor is redirected. Search marketers do this in the hope that Google will index the keyword-rich page before visitors are redirected.
JavaScript, the other tactic, sends visitors on to the “real” page while leaving the search engine to index the keyword-rich shadow page, since crawlers historically haven’t executed JavaScript reliably. The redirect is ignored and the keyword-rich page gets indexed instead.
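A bare-bones sketch of the JavaScript version (the destination URL is hypothetical) is just a script on the keyword-rich page that sends the browser elsewhere as soon as it loads:
<script>
  // Browsers run this immediately; a crawler that doesn't execute JavaScript indexes the page as-is.
  window.location.replace("http://example.com/real-sales-page.html");
</script>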
While redirects do serve an important and legitimate purpose in website management, we recommend you avoid meta refresh and JavaScript redirects. Use a 301 redirect if you’re updating your site’s pages and content.
Additional reading: Google Search Results Warn Smartphone Users of Faulty Redirects
7. Duplicate content
Many ecommerce sites around the Internet use product descriptions provided by the manufacturer or someone else, so several sites likely contain the exact same language for a given product. While duplicating product descriptions isn’t considered spam by Google and others, it can result in your pages being filtered out of the search results.
In light of this fact, you should consider content duplication to be spam and avoid it at all costs.
For example, if you’re an affiliate or reselling products, you should add unique content and value to the product descriptions supplied by a manufacturer or seller. One effective way to do this is to create comparison charts for your products.
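A simple comparison table – sketched here with invented product details – adds information the manufacturer’s stock description doesn’t provide:
<table>
  <tr><th>Model</th><th>Weight</th><th>Battery life</th><th>Best for</th></tr>
  <tr><td>TrailCam 100</td><td>320 g</td><td>6 hours</td><td>Day hikes</td></tr>
  <tr><td>TrailCam 200</td><td>410 g</td><td>12 hours</td><td>Multi-day treks</td></tr>
</table>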
If you don’t do anything but simply “cut and paste” product descriptions from elsewhere, there will be no way to differentiate your site from the hundreds of others using the same exact text. You also run the risk of being buried or de-listed on the search engines.
Additional reading: 7 Ways to Avoid Future Duplicate Content Issues
In general, though, most of the concern surrounding duplicate content is hyped. SEO guru Neil Patel hits the nail on the head when he says:
“Calm down, people. In my view, we’re living through a massive overreaction. For some, it’s a near panic. So, let’s take a deep breath and consider the following…
Googlebot visits most sites every day. If it finds a copied version of something a week later on another site, it knows where the original appeared. Googlebot doesn’t get angry and penalize. It moves on. That’s pretty much all you need to know.
A huge percentage of the internet is duplicate content. Google knows this. They’ve been separating originals from copies since 1997, long before the phrase ‘duplicate content’ became a buzzword in 2005.”
Unless you’re blatantly copying massive chunks of content verbatim, your site likely won’t be penalized in any sort of significant way.
8. IP delivery (aka “cloaking”)
Most commonly referred to as “cloaking,” IP delivery is perhaps one of the most controversial and complex black hat SEO strategies. It basically serves one version of a page to real visitors while showing a different version to search engine spiders. Search engines don’t like this at all and will penalize any website caught cloaking (especially smaller sites).
What cloaking does is detect the IP address the visitor is coming from. If the IP address isn’t assigned to a search engine spider, the site will assume the visitor is human and give them one version of the page. If, however, the site determines that the IP address is from a search engine spider, another version is shown.
While the general wisdom is that cloaking is bad, there are a few instances where it’s okay. Web pages built with Macromedia Flash are one example. Since search engines don’t index Flash content very well, an SEO might cloak the Flash page in order to give the spider meaningful content to index.
In this sense, cloaking is okay but is ripe for exploitation, which is what the controversy boils down to.
Google itself engages in the practice of IP delivery to an extent, so in one sense they’re okay with it. For instance, say you’re in Florida looking for a tire shop. If you type in “tire shops” in a Google search, you’re likely to see all the shops in your area. They do this by identifying where your IP address is based and showing you a custom results page.
Plenty of brand names, including Google, use cloaking with impunity. Since Google trusts these names, they turn a blind eye. But smaller, lesser known websites exploiting this technique will get penalized.
The only instance where cloaking is accepted for sure is Google’s First Click Free program (FCF), which enables password-protected subscription sites to be indexed while only allowing a visitor to see a single page of content. By nature, you have to use cloaking with these kinds of sites.
So unless you’re a well known brand that Google trusts to use cloaking (…I mean IP Delivery…) in the right ways or are a subscription based site, you should consider this an unsafe SEO strategy.
That’s what it all boils down to, folks – whether your site is known and trusted or not.
Watch this video with Matt Cutts to see Google’s stance on cloaking:
(Source: https://support.google.com/webmasters/answer/66355?hl=en)
Sustainable SEO is About Playing it Safe, Not Taking Shortcuts
The practices mentioned above really should be avoided altogether. Although you may think you can get around the search engine spiders, a manual review by a real person will eventually expose these black hat tactics and result in a penalty.
So play it safe and stick with the basics. While it may seem daunting at first, the benefits will be far greater and more sustainable.
Have you used any of these tactics to rank high in the search engines?
I’m sure you won’t fess up. No need to! Some of these techniques are very outdated – hidden text, phantom pixels and stuffed keyword tags are ancient techniques that don’t work anymore. But please do share what it was like for you back when those techniques still worked.
What was your experience? Did your site get penalized? Any new technique you’d like to share?
Let us know in the comments below, or via Facebook.
ConsultantWorldWide says
The best easy way to avoid any of Google Penalties .. is to remove Google codes from your web pages!
You will be ranking up and up ;P
>> Be smart and Think out of the Box! <<
Cheers! and Happy New Year 🙂
Spook SEO says
This is just great. People will know what could be the potentially dangerous programs that could ruin their hard-earned website. Anything that will give you excess content abnormally should be crossed out in the list. Gone are the days when you can flood your page with articles and not get caught by the algorithms.