Google has been releasing so many animal-themed updates to its service that it’s beginning to resemble a zoo. Over the past few years there have been many changes to Google’s search and ranking algorithms, and with so much fuss over each one it can be difficult to figure out exactly what’s going on. Only Google has all the answers, but observation and expertise can be used to figure out most of it. To help you understand Google’s main algorithm changes of the past year, here is a brief summary of each.
Penguin 2.0 is technically the fourth release of Google’s Penguin algorithm, but Google described it as the second major iteration, hence the name. It was released in May 2013, with a smaller follow-up (Penguin #5, or 2.1) in October of the same year. Penguin’s main focus is sites with large quantities of low quality backlinks.
In the past, before the first Penguin algorithm, links were an easy way to make Google rank your website higher. Google’s algorithm treats each link to a site as a recommendation, with recommendations from higher quality sites being more significant. Links from less important sites are not valued as highly, but a large number of them could still count for more than a few high quality ones.
Many SEO experts took advantage of this fact, and even began creating ‘link farms’ that served no purpose except to link to each other, and all their clients’ sites. This would make a website appear more important due to the large number of backlinks, regardless of how good the site actually was. The new Penguin algorithms were created to counter this exploitative form of link building, attempting to shift the balance back towards high quality websites, and away from link farmers.
With each release of Penguin, Google’s web crawlers check websites for bad backlinks, and potentially even identify link farms, evaluating a site based on the quality of all other sites linking to it. Having poor quality backlinks is not a sign of a bad site in itself, but Penguin particularly punishes sites which have accumulated a lot of bad links in a short space of time.
There are two ways to deal with bad backlinks to your site: ask the webmaster of the linking site to remove the link (or add a rel="nofollow" attribute to it), or disavow the links yourself. Both methods take time, so recovering a site that has been hit by Penguin will not happen overnight, but sites which are already high quality, or which are relatively new with few backlinks, can focus on attracting the good links that signal genuine quality to both visitors and the algorithm.
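When webmasters don’t respond, disavowing is the fallback. The disavow file Google accepts is a plain UTF-8 text file, uploaded through Webmaster Tools (now Search Console), with one URL or domain per line and # marking comments. A sketch of what one might look like (the domains and URLs here are made-up examples):

```
# Contacted the webmaster on 01/06/2014, no response
domain:spammy-link-farm.example

# Individual pages can be disavowed too
http://low-quality-directory.example/links/page1.html
http://low-quality-directory.example/links/page2.html
```

The domain: prefix disavows every link from that domain at once, while bare URLs disavow individual pages. Google treats the file as a strong suggestion rather than a directive, and the effect only appears gradually as the linking pages are recrawled.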
In short, if you are looking for sites to link to your own, you should be focusing on quality not quantity. Most of the time, simply being a high quality site is enough to encourage other high authority sites to link to you, and will also be of benefit to your visitors, who are on the lookout for a good website, not one with a lot of backlinks they know nothing about.
Panda 4.0 was released in May 2014, and is the latest in a relatively long line of Google updates that started in 2011. Although it’s impossible to know exactly what factors Panda looks at, it is known that it focuses on the overall quality of your site. A Google Fellow, Amit Singhal, wrote a list of guidelines for what Panda might look at, asking questions such as “Does this article provide a complete or comprehensive description of the topic?” and “Would you trust the information presented in this article?”
Although Panda judges quality across your entire site, a few practices are widely believed to attract its attention: thin pages with little original content, large amounts of duplicated content, and pages where advertising crowds out the content itself.
“Google is rolling out our Panda 4.0 update starting today.” — Matt Cutts (@mattcutts), 20 May 2014
To summarise, Panda rewards sites which have a lot of high quality, informative and helpful content, which is usually a sign of websites that are useful to visitors. As long as you keep your visitors in mind, your site should avoid being penalised by Panda.
Hummingbird is not like Panda and Penguin. Rather than being an algorithm update that focuses on specific changes, Hummingbird is really more of an update to Google’s entire search functionality. It was not designed with the aim of targeting specific practices, but instead encourages a new way of searching and of being found: conversational search queries.
As a series of computer programs and algorithms, Google’s search engine struggles to think like a human being, and can’t make the logical connections that we can. In its most basic form, Google is able to look at a keyword, such as “wood” and find pages on the internet that also talk about “wood”. Additional functions, such as understanding that people searching for “wood” might also be searching for “logs” are added by humans, and help Google find more relevant results.
However, not everyone searching for “wood” is necessarily after “logs”; they might be more interested in “mahogany” items, or perhaps they were looking for a “forest”. Adding extra words to a search, such as “wood table” or “walks in the wood”, did help people find the right sites a lot of the time, but this was still somewhat limited: Google really only looked at the keywords “walk” and “wood” and tried to find sites which had both.
The Hummingbird update sought to change the way that Google worked by helping it to understand more conversational terms, looking at your entire search query in one go, rather than picking out keywords. Thus a search for “good walk in the wood” is now more likely to find you websites about lovely woodland walks than it is to find “excellent logs in walking distance”. Helping Google to understand these conversational queries also helps users to find what they’re actually looking for.
Hummingbird isn’t a change that can really be taken advantage of, and shouldn’t be either. The essence of this update was that users should be able to find what they are looking for more naturally, and since you only want visitors to your site who are actually interested in what you offer, this can only be a good thing.
The key to Hummingbird is that if you focus on building a site that has relevant information, Google will be able to direct searchers who are looking for that information to you. There is much less emphasis now on keywords than there was before, and more emphasis on what your site can offer. As long as you are able to promote what you do, your site should see the benefits of Hummingbird.
Ok, so a payday loan isn’t an animal, but this was another major addition to Google’s search algorithms that is worth mentioning, albeit briefly. The Payday Loan updates mainly target ‘web spam’: low quality sites that stuff their pages with heavily searched, frequently spammed query terms.
So called because one of the most commonly spammed search queries is “payday loans”, this update was rolled out internationally and affected different languages to different degrees: reportedly around 0.2% of English language sites were affected, but as many as 4% in Turkey, where web spam is more common.
Although this update mostly affected payday loan, insurance and claims sites, any site competing for an extraordinarily spammed search query could be affected if keywords have been overused at the expense of content quality. Most businesses need not worry, however.
It might seem like there’s a lot there to take in, but really it all comes down to a single fact: as time goes on, Google are continuing to encourage sites that are high quality, informative, helpful and relevant for visitors. It’s true in every business, and it’s true online as much as offline: putting your customers first should always be your main priority.