How Google Can Improve Their Penguin Algorithm

Google Penguin

For webmasters and SEO professionals, Penguin has been a big topic throughout the past year. Many sites have seen their search engine rankings change because of Penguin, and some have even dropped off the map in Google's search results entirely. Moving forward, I think Google could make some improvements to help both webmasters and people who simply use Google to look up information online.

One of the problems with Penguin is that it can penalize websites for mistakes made years ago, such as hiring a bad SEO company that built low-quality links to the site. The argument can be made that hiring the wrong person can hurt your company in any business, which is fair, but one of the biggest improvements Google could make is to give websites that made mistakes more opportunities to recover from a Penguin penalty and get back to ranking in Google search results. Here are three ways I think the algorithm could be improved.

For starters, more frequent updates, like Google now runs for the Panda algorithm. (Note: Panda deals more with on-site content, while Penguin is more about off-site factors, namely links.) It is understandable why Google wouldn't want to update it too frequently, since it is a complex algorithm, but something along the lines of once every few months sounds like a fair deal.

The second idea I can see helping improve Penguin is an option to choose whether you want to accept links pointing to your website. This may be controversial, since it could add a lot of information to deal with, especially for people who aren't well versed in SEO, but it is an idea people have been discussing on internet marketing forums throughout the year. One advantage of having this option is that if anybody points negative SEO links at your website, you can simply disassociate yourself from them.
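Google's existing disavow tool already works in roughly this direction, just in reverse: you upload a plain text file listing links you want ignored rather than approving the ones you accept. A hypothetical "accept links" setting could plausibly reuse a similar format. The snippet below follows the real disavow file syntax; the domains are made up for illustration.

```
# Lines starting with # are comments
# Ask Google to ignore a single linking page
http://spammy-article-farm.example.com/scraped-post-123.html
# Ask Google to ignore every link from an entire domain
domain:link-network.example.net
```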

The third idea is for Google to discount backlinks from certain websites so they don't hurt a site's rankings when the site owner had nothing to do with those links. Arguably Google already does this, but it has not been very open about the process. For example, if a random website scrapes people's content into junky articles and puts a link to their site next to it, it would make sense for Google to have a system that devalues those links so webmasters don't need to think much about them.
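Nobody outside Google knows how this works internally, but the basic idea can be sketched in a few lines of Python. The domain list, weights, and scoring below are entirely made up for illustration and are not Google's method.

```python
from urllib.parse import urlparse

# Hypothetical list of domains known to scrape content or sell links.
# In reality Google would build something like this from its own quality signals.
DEVALUED_DOMAINS = {"scraper-site.example.com", "link-network.example.net"}

def link_value(source_url, base_value=1.0):
    """Return the weight a backlink contributes to a site's score.

    Links from devalued domains count for nothing instead of counting
    against the site (a simplified sketch, not how Google actually scores links).
    """
    domain = urlparse(source_url).netloc.lower()
    if domain in DEVALUED_DOMAINS:
        return 0.0  # ignored entirely, so it neither helps nor hurts
    return base_value

backlinks = [
    "https://scraper-site.example.com/auto-generated-page",
    "https://respected-blog.example.org/genuine-review",
]
total = sum(link_value(url) for url in backlinks)
print(total)  # only the genuine link counts, so the total is 1.0
```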

Sometimes in Webmaster Tools you may see tons of links coming from sites you have nothing to do with. These can be sites that scrape other websites' content or analyze them for whatever reason. If Google took this into account, it would make sense not to even show these links in people's Webmaster Tools accounts, so we wouldn't need to worry about them.
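Until something like that exists, a webmaster can approximate it by filtering the link report they download themselves. The sketch below assumes a simple one-column CSV export of linking URLs and a handmade list of domains to ignore; both the file layout and the domain names are assumptions for illustration.

```python
import csv
from urllib.parse import urlparse

# Domains you already know are scrapers or analysis crawlers (your own list).
IGNORED_DOMAINS = {"scraper-site.example.com", "seo-stats-crawler.example.net"}

def filter_links(export_path):
    """Yield linking URLs from an exported link report, skipping ignored domains."""
    with open(export_path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            url = row[0].strip()
            if urlparse(url).netloc.lower() not in IGNORED_DOMAINS:
                yield url

# Example usage with a hypothetical export file name
for url in filter_links("links_to_your_site.csv"):
    print(url)
```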

Of all these ideas, I think more frequent updates matter most, because in some cases websites can lose their rankings for a significant amount of time; in 2014, for instance, Google took over a year to refresh the Penguin algorithm. By improving the algorithm and updating it more regularly, Google can help both webmasters and searchers, since it can help quality websites recover their rankings while lowering the rankings of sites that show up in search results through low-quality backlinks.
