What is Google Penguin?
Among the several factors that take a website into the good books (that’s the first few SERPs) is the number and quality of links pointing back to it from other relevant websites. Of course, webmasters try to game the system and exploit loopholes in the search engine’s algorithm to pose as deserving recipients of Google’s favors. Among the shady practices adopted by desperate webmasters are link schemes such as buying and selling links, building thinly populated websites that link back to the master website, exchanging links, posting anchor text links on blogs and forums, and what not. All this leads to a severe deterioration in the quality of the web search results Google provides to its users, and that’s where the need to weed out offending websites arises. Penguin is Google’s solution, and it has been reshaping the web by penalizing unscrupulous link-scheme websites ever since it rolled out on 24th April, 2012.
Coded to target websites that adopt ‘black hat’ tactics to build links, Penguin forces website owners to reconsider how they place their website’s links across the web. If they choose to work diligently towards providing differentiated content and attaining domain authority, thereby motivating other great websites to host their links genuinely, they are in for a great friendship with Penguin. Others, as expected and as has been observed, are doomed to face the wrath of the Penguin update. It forces webmasters to answer some important questions whenever they decide to leave a link on a website in one way or another –
- Is this website really top quality?
- Is the niche of this website closely related to my website’s niche?
- Will diverse and natural anchor text be used to point readers to my website?
If you are to live with Google Penguin, the answers to the above had better be YES.
What does Google Penguin target and penalize?
To co-exist with Google Penguin, it’s of prime importance that webmasters know what the algorithm update dislikes and is always on the hunt for. Here’s a detailed look at all that Penguin hates:
Buying and selling links – Trust Penguin to crack down on websites that sell space to host links irrespective of the relevance and quality of those links. Such blatant malpractices bring down the health of the World Wide Web, and paid links are right at the top of Penguin’s hit list. On the same lines, Google Penguin also targets websites that purchase links. It’s hardly a smart practice to place one’s links on money-making link sellers in the first place, and Penguin ensures that even the webmasters who hunt down closely matching link purchases are dissuaded from indulging in the sale and purchase of links.
Unscrupulous redirecting paths to accumulate links – Imagine webmasters purchasing expired domains and redirecting them to their own, or, slightly less desperately, planting manipulative links to domains that then redirect users onwards to theirs. Of course, that’s a bad idea, and Penguin ensures that the perpetrators of this malpractice are adequately penalized. The original purpose of an old domain is identifiable in Google’s caches, as is the shape of its link graph. When Google identifies two unrelated link graphs on a domain that cover absolutely different topics, it raises the red flag and levies a penalty.
Interlinked domains exchanging links among themselves – In its quest to rid the World Wide Web of worthless content, Penguin identifies networks of interlinked domains that are primarily built to keep users crisscrossing among websites from the same network. Penguin analyzes the variance among these domains and finds it nothing like that of genuine link networks, and that’s how it catches hold of useless link networks.
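To make the point concrete, here’s a toy sketch of why manufactured link networks stand out statistically. The function name, the sample domains, and the metric itself are illustrative assumptions, nothing Google has published: in a genuine link graph very few domain pairs link to each other both ways, whereas a reciprocal network approaches full interlinking.

```python
from itertools import combinations

def interlink_density(domains, links):
    """Fraction of possible ordered domain pairs that actually link.
    `links` is a set of (source, target) domain pairs."""
    pairs = list(combinations(domains, 2))
    possible = 2 * len(pairs)  # each unordered pair can link both ways
    actual = sum((a, b) in links
                 for a in domains for b in domains if a != b)
    return actual / possible if possible else 0.0

# A fully reciprocal three-domain network (hypothetical domains):
network = {"a.com", "b.com", "c.com"}
links = {("a.com", "b.com"), ("b.com", "a.com"),
         ("a.com", "c.com"), ("c.com", "a.com"),
         ("b.com", "c.com"), ("c.com", "b.com")}
print(interlink_density(network, links))  # 1.0 — everyone links to everyone
```

A natural link graph of unrelated sites would score far closer to zero, which is the kind of variance signal the paragraph above alludes to.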
Exact-matched anchor texts – Whereas anchors were originally meant to help relevant web pages rank well for certain keywords, webmasters and SEO masterminds have misused them by spreading anchor texts lifted straight from their target keywords across the length and breadth of low-quality websites, or by buying their way into other websites. Penguin puts an end to all of this by identifying and penalizing websites that blatantly leave exact or very closely matched anchor texts.
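One way to see what “exact or very closely matched” means in practice is to audit a backlink profile’s anchor texts yourself. The following is a minimal sketch under stated assumptions — the classification buckets and sample anchors are made up for illustration, not Penguin’s actual rules:

```python
def classify_anchor(anchor: str, keyword: str) -> str:
    """Bucket an anchor text relative to a target keyword."""
    a, k = anchor.lower().strip(), keyword.lower().strip()
    if a == k:
        return "exact"      # anchor is the keyword verbatim
    if k in a:
        return "partial"    # keyword embedded in a longer anchor
    return "diverse"        # branded, generic, or unrelated anchor

def exact_match_ratio(anchors, keyword):
    """Share of anchors that exactly match the keyword."""
    labels = [classify_anchor(a, keyword) for a in anchors]
    return labels.count("exact") / len(labels)

anchors = ["best running shoes", "click here", "our homepage",
           "best running shoes", "best running shoes deals"]
ratio = exact_match_ratio(anchors, "best running shoes")
print(f"exact-match share: {ratio:.0%}")  # exact-match share: 40%
```

A profile dominated by exact matches is exactly the unnatural pattern the paragraph above describes; a healthy profile skews heavily towards the “diverse” bucket.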
Over-optimized websites – Websites that compromise the interests of readers in favor of what the search engine deems important are also at the receiving end of Penguin’s wrath. Web pages with a keyword density beyond the nominal range of 1-1.5% stand to lose favor in the eyes of Google Penguin. On similar lines, site-wide links to key post titles placed in your blog’s footer could prove to be a reason for a Penguin penalty, as such tactics are considered over-optimization.
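The 1-1.5% range above is a folk guideline rather than a published Google parameter, but keyword density itself is simple to compute. A self-contained sketch (the sample page text is invented for illustration):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Words consumed by occurrences of `keyword`, as a % of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = keyword.lower().split()
    n = len(kw_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw_words)
    return 100.0 * hits * n / len(words) if words else 0.0

page = ("Buy cheap widgets here. Our cheap widgets are the best "
        "cheap widgets money can buy.")
print(f"{keyword_density(page, 'cheap widgets'):.1f}%")  # 40.0%
```

A page like the sample above, where a single phrase accounts for 40% of the copy, is the kind of keyword stuffing that reads badly to humans long before any algorithm flags it.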
Low-quality article marketing & blog spam – Planting worthless articles in article directories just for the sake of linking back to one’s website is among the most deleterious link-building tactics adopted by desperate webmasters. Such practices jeopardize the quality of a user’s web experience to no end. Google Penguin is coded to find such websites by analyzing their inbound link-backs, and to levy appropriate penalties on them.
Other clear signals that make Penguin mad – Apart from being ultra-dedicated towards ensuring that none of the above-mentioned deficiencies is associated with your websites, you’d also want to make note of other clear signals that Penguin is tuned to detect and punish, so that you can keep your websites Penguin-proof. Excessive presence on low-quality and unmonitored guest-blogging networks, inbound links from compromised websites, the presence of irrelevant inbound or outgoing links irrespective of the quality of the websites they link to or from, and any black-hat techniques employed to plant links across the World Wide Web – all are avoidable practices.
What are Google Penguin Updates?
The motive behind every update to any of Google’s algorithms is to fine-tune its working and move a step further towards a more enjoyable and relevant web experience for users. Ever since the first Penguin update took the SEO fraternity by storm, there have been four more updates to the algorithm, each including more parameters to identify websites relying upon unscrupulous link-building means and to measure their defaults accurately. Among these, the latest update rolled out on 4th October, 2013, and is known as Penguin 2.1, although it happens to be the 5th version of Penguin hopping about in the cyber sphere. Let’s introduce you to Penguin 2.1, which is believed to have had a much bigger impact on defaulting websites than its predecessors.
Apart from assessing all indexed websites on the basis of the links they’ve planted since Penguin 2.0 did its cleaning job, Penguin 2.1 also identifies new link spam and levies appropriate penalties on the defaulters. The following have been strict no-nos ever since Penguin 2.0:
Forum-based spamming – Whether it’s your forum biographies using exact-match anchor text links or comments with exact-match anchor texts left on forum threads, Penguin 2.0 will not be lenient and treats such activities as attempts at spam.
Unperturbed “do-follow” blogs – When a blog doesn’t add no-follow to posted links, Google treats links acquired there as a clear-cut signal that someone is trying to pull one over on the search engine’s algorithm. Penguin 2.0 enforces exactly that, which is why being listed on do-follow websites is no longer an appealing bet for webmasters.
Blogroll spamming – Right from comment-signature spam on blogs, even if the links are no-follow, to blogroll links that have deteriorated, everything is treated harshly by Penguin 2.0, and that’s reason enough for webmasters to revise how they behave in terms of leaving links on blogs.
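Whether a link actually passes equity comes down to the `rel` attribute on the anchor tag. Here’s a minimal sketch, using only Python’s standard-library `html.parser`, of auditing a page for links that lack `rel="nofollow"` — the class name and the sample URLs are hypothetical:

```python
from html.parser import HTMLParser

class DofollowAudit(HTMLParser):
    """Collect outbound links that lack rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.dofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        rel = (d.get("rel") or "").lower().split()
        if "nofollow" not in rel:
            self.dofollow.append(d.get("href"))

html = ('<p><a href="https://example.com/a" rel="nofollow">ok</a> '
        '<a href="https://example.com/b">passes PageRank</a></p>')
audit = DofollowAudit()
audit.feed(html)
print(audit.dofollow)  # ['https://example.com/b']
```

From a blog owner’s side, running a check like this over comment sections and blogrolls is one way to make sure the site isn’t inadvertently vouching for spam links.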
Just in case you thought Penguin 2.0 had made life hell for webmasters, wait till you see the next version of Penguin. Touted as Penguin 3.0, the next update to Google’s algorithm is supposed to roll out sometime towards the end of 2014, and it is keeping webmasters on the edge of their seats. It might be a lifesaver for several websites to take these suggestions into account as they prepare for the next-generation Penguin update:
Optimized anchors could be dead – We’re undoubtedly moving towards search engines that disregard any sort of optimized anchor text; exact-matched anchor texts are already dead and buried, and optimized anchors could be next. Co-occurrence and co-citation, along with socially integrated SEO, are the way forward, and optimized anchors could well get lost in the dust.
Non-tolerance for suspicious backlinks – Any low-quality backlink originating from a suspicious website could be a severe blow to your website. The focus will be more on motivating websites to ensure that they leave links on websites with better or at least equivalent domain authority, so that web users have a fulfilling browsing experience.
That said, it needs to be underscored that Penguin updates are also an opportunity for several websites to undo their mistakes from the past and recover their rankings. Indeed, it’s Google’s proactive approach to making the web a more fulfilling experience for its users that gives birth to innovative and revolutionary algorithms such as Penguin.