There are so many rumblings about the “next big Google update” that it sometimes seems as though every time someone so much as sneezes in the vicinity of a SERP, everyone else starts panicking and staring at real-time analytics in a paranoid sort of fashion.


Before the conspiracy theories start again after the latest grind through the rumour mill, I thought it might be worthwhile to take a quick look at what’s currently doing the rounds and what brands can do to be prepared ahead of time rather than relying on knee-jerk reactions.

Google Penguin

The infamous link spam assassin is still being refined and tweaked with new incarnations, but it only rolls out updates periodically in refreshes. That means if you get a slap from the algorithm for doing something naughty offsite, you not only have to clean yourself up but then wait for the next refresh before you can completely recover, as all your links are essentially reprocessed taking disavows and removals into account.


Google has been talking about making Penguin “real time” for a while now, meaning you won’t need to wait for the next refresh to recover; that’s a long-awaited development, especially for brands that have taken a Penguin hit due to negative SEO activity rather than any dodgy practices of their own. Gary Illyes of Google was overheard at SMX East very recently saying that the next Penguin was in the “foreseeable” future and that he hoped it would arrive by the end of 2015 as the real-time version. Fingers crossed on that front.

What does this mean for brands? Well, if you’ve already got a penalty for some reason then, provided you’ve taken the appropriate action, you should be able to get it lifted at the next refresh at the very least. On the other hand, the algorithm going real time will also mean that any bad linking activity is picked up more rapidly, as Penguin essentially becomes more reactive.

So if you’re still doing dodgy, anchor-text-rich link building then a) stop (seriously, you still haven’t?) and b) clean up your act now with removals and disavows so you don’t get hit by the refresh. In fact, even if you are doing outreach and authority development the right way, it is worth doing a health check and a bit of backlink pruning anyway, just to keep things shipshape. Just make sure you don’t over-prune if you can’t find any evidence that you’re under a penalty – remember, you can have an over-optimised link profile from both directions, as it were.
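If it helps to picture the mechanics, here’s a minimal sketch of that clean-up step, assuming you’ve already reviewed your backlinks (exported from Search Console or your link tool of choice) and flagged the toxic ones in a CSV – the file names and columns are purely illustrative, not any standard format:

```python
import csv

# Illustrative file names only: a CSV of backlinks you've manually reviewed and
# flagged as toxic, and the plain-text file you'd upload to Google's disavow tool.
FLAGGED_LINKS_CSV = "flagged_backlinks.csv"   # expected columns: url, domain
DISAVOW_FILE = "disavow.txt"

domains_to_disavow = set()
with open(FLAGGED_LINKS_CSV, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Only disavow a whole domain if you're sure the entire site is junk;
        # otherwise list the individual offending URL instead.
        domains_to_disavow.add(row["domain"].strip().lower())

with open(DISAVOW_FILE, "w", encoding="utf-8") as f:
    f.write("# Links we asked to have removed but heard nothing back\n")
    for domain in sorted(domains_to_disavow):
        f.write("domain:" + domain + "\n")

print("Wrote " + str(len(domains_to_disavow)) + " domains to " + DISAVOW_FILE)
```

The output is just a plain text file in the format Google’s disavow tool expects: one `domain:` or URL entry per line, with `#` lines as comments. Attempt removals first and keep the disavow file as the record of what you couldn’t get taken down.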

Google Panda

The latest Google Panda update, targeting rubbish onsite copy, spam and content that adds no value to anyone, human or otherwise, is still rolling out after two and a half months. This seems to be a recurring theme with recent algorithm updates: their scope and scale keep increasing to the point that a complex, prolonged rollout is needed rather than a simple switch-flip, but the downside is once again that penalty recovery can take a lot longer.


In many ways Panda is a slightly more mysterious beast than Penguin because the definition of “poor content” keeps shifting. At first it was just spam, keyword stuffing and pages with little to nothing on them. Then it grew to include excessive duplication (including at metadata level), and then a broader understanding of “value-adding” content – just because a piece of copy is a decent length, and even well written, doesn’t mean it is actually doing anything useful for the user. The trick is not just to say something but to say something worthwhile!

Continuous content auditing and strategy review is the best weapon in the anti-Panda arsenal. Don’t do it once and then deem it done for an arbitrary number of months. Measure, analyse, adapt, publish, rinse and repeat. If your content isn’t performing well – in terms of visibility, engagement, conversion assistance, whatever it should be doing – then change your approach. Improve or just remove content that doesn’t perform. These days “more” is most definitely not always “better” when it comes to your website’s content.
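To make “measure, analyse, adapt” a little more concrete, here’s a rough sketch of the flagging step, assuming you’ve exported page-level performance data to a CSV – the column names and thresholds are invented for illustration and should come from whatever each page is actually meant to achieve:

```python
import csv

# Hypothetical export of page-level performance (e.g. from your analytics platform).
# Column names and thresholds are illustrative only - set them against the job
# each page is actually supposed to do.
AUDIT_CSV = "content_performance.csv"   # columns: url, organic_sessions, conversions
MIN_SESSIONS = 50
MIN_CONVERSIONS = 1

improve_or_remove = []
with open(AUDIT_CSV, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        sessions = int(row["organic_sessions"])
        conversions = int(row["conversions"])
        if sessions < MIN_SESSIONS and conversions < MIN_CONVERSIONS:
            improve_or_remove.append(row["url"])

print(str(len(improve_or_remove)) + " pages flagged for improvement or removal:")
for url in improve_or_remove:
    print(" - " + url)
```

Everything on that list then gets a decision – improve it, consolidate it, or remove it – rather than being left to quietly drag the rest of the site down.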

Other Major New Algorithm Factors

Lots of additional parts of the Google ranking algorithm continue to emerge as the engine goes on adapting to users and their ever-changing behaviour patterns. Speed and user-friendliness (in terms of things like tap target sizes, link proximity and so on) are becoming huge for cross-device usage, so now more than ever a well-designed website is one built to perform for search as well as for users. Google have confirmed that if you recover from one penalty but then end up being given another, the second (and any subsequent) penalty will be more severe. So no more spamming your way up and then disavowing your way out, only to spam your way back up in a different way (and about darn time, too).

I could go on (and on, and on, and on some more) about all the bazillions of algorithmic ranking factors past, present and postulated future, but there is one last flavour of Google’s algorithm that I want to draw everyone’s attention to.

Machine Learning – The Self-Writing Algorithm

People constantly badger Google to release a big list of all the factors they use as visibility signals so that the SEOs and webmasters of the world can finally have transparency. Google usually respond with a wide variety of excuses, PR sidestepping and shouts of “look, something shiny” before fleeing the room. But what a lot of people forget is that this isn’t (just) because Google want to be insanely top-secretive about their ranking factors.

It is because, increasingly, they don’t actually know.

Google makes use of an immense amount of computing power, and the algorithm driving organic search uses machine learning: it processes data and user responses to make decisions and thereby update itself. The machine-learning portions of the algorithm are constantly tweaking and testing to see which changes to ranking factors result in the best experience for users (which they determine using – yep, more data).
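If that sounds abstract, here’s a toy illustration (and it is only a toy, nothing to do with Google’s actual systems) of what a self-updating ranking system does at its simplest: nudge a ranking factor’s weight, measure whether users respond better, and keep the change if they do.

```python
import random

def user_satisfaction(weight):
    """Pretend measurement of how happy users are with results ranked using
    this weight. Noisy, and peaks at an 'ideal' value the system doesn't know."""
    return 1.0 - abs(weight - 0.7) + random.uniform(-0.05, 0.05)

weight = 0.2   # current weight given to some ranking factor
step = 0.05    # size of each experimental tweak

for _ in range(200):
    candidate = weight + random.choice([-step, step])     # try a small change
    if user_satisfaction(candidate) > user_satisfaction(weight):
        weight = candidate                                 # keep changes users prefer

print("Learned weight: %.2f" % weight)   # drifts towards whatever users respond to best
```

Scale that loop up to billions of data points and hundreds of interacting factors, and you can see why nobody is going to hand you a tidy list of weights.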

Now this Algobot (as I like to think of it) is smarter and faster than a whole campus of Google engineers, but as far as we’re aware there is no daily Algobot report that summarises all its experiments and changes to some supervising Google employee. It just ticks over, tirelessly and incrementally improving organic search results. How can anyone possibly try to pre-empt such a contraption?

The answer is deceptively simple: the Algobot wants users to have a good experience using Google. That way they’ll come back and use Google again, allowing Google to make a profit from advertising space. So focus on the experience of your users – for content, for links, for design, for social media, for every possible digital touchpoint (and every other touchpoint too, for that matter, because happy brand advocates search for you by name, which gives you a brand boost anyway). Then make sure your site is technically sound so the bots can crawl everything nice and easily, and good organic visibility will follow.
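“Technically sound” covers a lot of ground, but here’s one tiny, concrete example using Python’s standard library: checking whether your robots.txt is accidentally blocking Googlebot from pages you care about (the domain and paths below are placeholders).

```python
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"               # placeholder - use your own domain
PAGES = ["/", "/blog/", "/category/widgets/"]  # placeholder paths worth checking

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()   # fetches and parses the live robots.txt

for path in PAGES:
    allowed = parser.can_fetch("Googlebot", SITE + path)
    print(path + ": " + ("crawlable" if allowed else "blocked by robots.txt"))
```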

It’s science, after all.


4Ps isn’t just another London SEO agency. To discuss how SEO and content are evolving together in order to keep pace with new developments in user interaction and search algorithms, give us a call on +44 (0)207 607 5650 for a no-obligation coffee and chat about data, marketing and user behaviour across all inbound channels.