An algorithm is a mathematical system of calculations designed to produce a result. Search engines use algorithms to weigh various elements and determine which webpage is most relevant to a search query.
Search engines like Google use many elements, commonly referred to as “signals,” in their algorithms to determine relevance.
Such signals include the use of the search term in the title tag, headings, or URL, and even the proximity of the keyword to the beginning of the content. Others involve links from other websites: the relevance of the page the link originates from, the text of the link, and the page the link points to. Google states that it uses over 200 signals in its ranking algorithms.
Google uses many different algorithms together, both to determine relevance and to reduce manipulation of its rankings by detecting black-hat tricks and keyword abuse, collectively known as “spam.”
One of the most effective algorithms in Google’s history is named Penguin. Penguin was designed to deal with link spam: attempts to game Google by creating links from websites that have no topical connection to the website being linked to. This can include buying links that pass relevance and PageRank, or using blog comments to gain follow links to the destination website. Another type of link spam is guest blogging, which until recently was considered acceptable by Google but is now treated as spam because too many people used it purely to obtain backlinks.
Google has released multiple versions of Penguin: Penguin 1.0 in April 2012, Penguin 2.0 in May 2013, and Penguin 2.1 in October 2013. Many in the SEO industry are awaiting, some with great trepidation, the arrival of Penguin 3.0. The 2.0 version greatly affected a large number of websites that had previously used questionable or outright black-hat linking tactics to boost their rankings. Google has also introduced the Disavow Links tool, which allows those hit by a Penguin penalty to dissociate their site from spam backlinks. The tool, however, doesn’t tell you which links are good or bad, leaving it to the website’s owner or SEO to perform a link audit and manually disavow links.
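The disavow file itself is just a plain-text list uploaded through Google’s tool, with one entry per line: a full URL to disavow a single linking page, or a `domain:` entry to disavow every link from a site. The domains below are hypothetical examples, not real spam sources:

```text
# Disavow file example (lines beginning with "#" are comments)

# Disavow all links from an entire domain:
domain:spammy-directory.example.com
domain:paid-links.example.org

# Disavow links from a single page only:
http://blog.example.net/comment-spam-post.html
```

Before uploading a file like this, it is worth attempting to have the links removed at the source, since disavowing is meant as a last resort.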
Penguin has made it much harder to obtain backlinks that transfer PageRank and relevance (so-called follow links), as Google only wants such links to be earned “naturally,” without any effort on the part of the destination website. Although many SEOs and websites still use black-hat tricks to obtain links, it is unknown whether Penguin 3.0 or another variation will cause such links to be downgraded and such websites penalized.
Panda began as a stand-alone algorithm designed to assess, among other things, the quality of the content on webpages. It measures how unique content is compared to other content on the web in order to reduce duplication in the SERPs. Penalties can be applied to a whole website or to a specific page if it is found to duplicate content from its own site or from other sites. This matters if you have an ecommerce site, for example, that uses product descriptions from a manufacturer, since other websites selling the same product may use the same description. It is best to tailor your content to the product or topic and to write unique copy that is relevant and focused.
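Google has never published how Panda scores duplication, but the general idea of near-duplicate detection can be illustrated with a toy sketch: break each text into overlapping word “shingles” and compare the sets. Everything here (the function names, the sample product descriptions) is an illustrative assumption, not Google’s actual method:

```python
def shingles(text, k=3):
    """Split text into overlapping k-word shingles, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets: 1.0 = identical, 0.0 = no overlap."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# A copied manufacturer description scores as an exact duplicate;
# a rewritten, product-focused description scores much lower.
manufacturer = "Durable stainless steel water bottle keeps drinks cold for 24 hours"
copied = "Durable stainless steel water bottle keeps drinks cold for 24 hours"
rewritten = "This insulated steel bottle holds 750 ml and keeps water icy all day"

print(jaccard_similarity(manufacturer, copied))     # 1.0 (exact duplicate)
print(jaccard_similarity(manufacturer, rewritten))  # low score (unique copy)
```

A real system would be far more sophisticated, but the takeaway for site owners is the same: copy that matches a description found elsewhere on the web is trivially detectable.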
Although Panda fits the animal theme of Google’s algorithms, it was actually named after a Google engineer, Navneet Panda!
Since its launch, Panda has been folded into Google’s overarching algorithm, Hummingbird.
In August 2013, Google launched a new version of its overall algorithm, named Hummingbird, carrying on the animal theme of Penguin and Panda. Hummingbird increased linguistic and semantic analysis, meaning Google seeks to divine the intent behind a query rather than simply returning results that match the keywords used.