The speed of the average flash sale, and the pace at which the retail marketplace can shift during an intensive sales period, make reactive agility key to much of your strategy. But don't forget your customers' overall experience during this period, and definitely don't overlook SEO when it comes to your Black Friday planning, or you may find your organic visibility suffering when the dust has settled.


First things first – when is Black Friday? This year it falls on November 25th 2016.

Landing Pages

If you’re doing any wider marketing or pre-emptive advertising, you’ll probably want to set up a dedicated landing page so customers have somewhere to go if they’re specifically searching for your brand in conjunction with Black Friday. Offering deals on a range of products? A little old-fashioned on-page optimisation for a selection of deals landers can pay off – check query volumes around your products to see what might do you a favour on the organic traffic front. Tailored landers will earn their keep with your PPC campaigns anyway by helping keep Quality Scores up and CPCs down.


Whatever you do though, don’t get hung up on pushing your landing pages hard or going nuts over organic visibility. SEO just isn’t the right acquisition channel to focus on for Black Friday – organic SERP changes simply aren’t fast or agile enough to roll with the punches when the time comes. Prep your landers, by all means promote in advance, but on the day itself stick to your PPC to bring in the punters.

Website Load Testing

This isn’t something to just hand off to your IT team or hosting provider – and neither is it just about “uptime.” Don’t get carried away with the thought of queuing systems solving all your problems either – the best performers on Black Friday are those that don’t need queuing systems to keep things running, so a thorough load testing programme ahead of time is essential to make sure your systems can cope.


Don’t just push your site to breaking point when testing – remember that sites slow before they stop, and speed issues will cost you at least as many conversions as the site being completely offline. Consider streamlining your pages to make them “lighter” on the showy stuff to keep the essentials up and running. Web application monitoring and stress testing firm SciVisum noted in their 2014 Christmas eCommerce Report that seven out of ten leading UK retailers saw noticeably slower page download times last year, and half of them made no attempt to streamline their content to help servers cope with demand.

One of the biggest failures I’ve seen when it comes to load and stress testing is reliance on “concurrent users” as a metric – unfortunate on a few levels, the most notable of which is that this is largely meaningless as a measure of capacity. A “concurrent user” is simply a person who has been active on your website within the current session timeout. But take a fairly average ten minute timeout and suddenly you can be looking at all sorts of mad scenarios based purely on concurrent users.


Someone hits your homepage and bounces off? That’s one concurrent user. Someone hits your homepage, browses six products, adds two to the basket and checks out? That’s one concurrent user too. But it doesn’t take a network engineer to realise that those two users are making very different levels of demands on your infrastructure, despite being counted as the same for “concurrent users” purposes. What you need to know to judge true site capacity is active users, or active user journeys, or whatever you care to name it – the number of active browsers, add-to-baskets and checkout processes that your site can handle before it grinds to a stuttering halt. So if anyone mentions to your team that the site can handle however many thousands of concurrent users for Black Friday, shoot them. Then go and find a metric that will give you a more realistic idea of your site capacity.
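To make the distinction concrete, here’s a minimal sketch (all figures purely illustrative) of how two scenarios with an identical “concurrent users” count can put wildly different loads on your infrastructure:

```python
# Illustrative sketch: "concurrent users" hides the real load.
# A bouncer and a full checkout journey each count as ONE concurrent
# user, yet they generate very different request volumes.

SESSION_TIMEOUT_MIN = 10  # a fairly typical session timeout

# Hypothetical profiles: requests made within the session window
profiles = {
    "bouncer": 1,        # hits the homepage and leaves
    "full_journey": 9,   # home + six products + basket + checkout
}

def requests_per_minute(mix):
    """Total requests/minute for a mix of {profile: concurrent users}."""
    total = sum(profiles[p] * n for p, n in mix.items())
    return total / SESSION_TIMEOUT_MIN

# Two scenarios, both "1,000 concurrent users":
quiet = {"bouncer": 900, "full_journey": 100}
busy  = {"bouncer": 100, "full_journey": 900}

print(requests_per_minute(quiet))  # 180.0 requests/minute
print(requests_per_minute(busy))   # 820.0 requests/minute
```

Same headline number, more than four times the load – which is why a journey-based metric tells you far more than a concurrent-user count.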

Stress Test Full UX

Remember too that website infrastructure isn’t the only thing that will be under additional load around Black Friday – call centres, warehousing, packing and general fulfilment logistics will all be getting a good workout. Don’t sell ten million things with free next day delivery if you can only fulfil five million – all that will do is tank your brand, and that will hurt a lot more in the long run. User experience goes considerably beyond the online clicks – you need to make sure the customer has a smooth journey whatever their touchpoint, and that includes post-conversion care.


You can ease the burden at certain points by limiting delivery options to a realistic level and throttling your offers and bargains back as order numbers start to push capacity. Just make sure your operations team knows what that capacity is in the first place, and that you have an integrated measurement plan in place to track how many orders are pouring in and where they’re coming from, so your advertising can respond accordingly.

This ties back into system load testing (think consumers will recognise the day and be okay to wait if your site goes down? Your call centre staff will tell you otherwise!) but also highlights the need to streamline your conversion process ahead of time. Review analytics funnel data, A/B and user test extensively to make sure even your senile aunt Edith can breeze through the checkout process. You should be doing this anyway as part of daily business, but especially before big peaks like Black Friday when chances are your customer support staff are going to be maxed out anyway. Use your statistics from last year and then add a big ol’ buffer of at least 20% because retail analysts are projecting even more uptake in the UK this year.
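As a back-of-envelope example of that buffer (the peak figure below is hypothetical – substitute your own analytics data):

```python
# Hypothetical capacity target: last year's observed peak plus the
# minimum 20% buffer suggested above.

last_year_peak = 500   # illustrative peak orders/minute from your analytics
buffer = 0.20          # at least 20% uplift for projected growth

capacity_target = last_year_peak * (1 + buffer)
print(capacity_target)  # 600.0 orders/minute to plan (and load test) for
```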

Emergency Measures

If despite all efforts your site goes offline or has to start using a queuing system of some sort, there’s only one number you need to remember:

503

If your site stops normal operations for any reason – whether it goes offline entirely due to overloading or just starts pushing visitors through a queuing system – make sure all affected URLs return a 503 HTTP response until it’s back to business as usual. Not a 404 (is your homepage really gone?) and not a 200 (if your site isn’t a-okay, this isn’t correct), but a 503, which translates to “Service Temporarily Unavailable.” That’s exactly what is happening, and it’s the equivalent of putting up a sign advising any friendly search bots happening by that things are a bit hectic right now but you’ll be back soon, so could they nip off and come back later? This has zero effect on organic visibility in the short term (say, the duration of Black Friday…) and saves awkward situations where a bot might think your homepage is returning a 404 or that your new content is all about queues.

In fact you can specify additional information in your HTTP response to help further – a Retry-After header asking search engines to come back at a specific date and time, or after a particular number of seconds. For example, to ask them to come back at 8am the following day:

Retry-After: Sat, 26 Nov 2016 08:00:00 GMT

Or, if more cloud servers are being pulled in to boost capacity post-haste, ask them to try again in two hours (the value is a plain number of seconds):

Retry-After: 7200
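If you’re generating these headers in code, Python’s standard library can produce the correctly formatted HTTP-date for you (the recovery time below is illustrative – use whenever you actually expect to be back):

```python
# Sketch: generating both Retry-After forms with the standard library.
from datetime import datetime, timezone
from email.utils import format_datetime

# HTTP-date form: ask crawlers to come back at 8am GMT the next morning
# (illustrative date: the day after Black Friday 2016)
reopen = datetime(2016, 11, 26, 8, 0, 0, tzinfo=timezone.utc)
print("Retry-After: " + format_datetime(reopen, usegmt=True))
# Retry-After: Sat, 26 Nov 2016 08:00:00 GMT

# Delay-seconds form: ask them to try again in two hours
delay = 2 * 60 * 60
print("Retry-After: " + str(delay))
# Retry-After: 7200
```

`format_datetime(…, usegmt=True)` emits the `GMT`-suffixed date format that HTTP headers expect, so you don’t have to hand-roll the string.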

While you’re at it, make a nice holding page for users (if you aren’t using a queuing system) and for goodness’ sake slam the brakes on the advertising if the site goes down, otherwise you’re just pouring fuel on the fire in your server rooms.

But wait, I hear some of you cry – “my queuing system lets me whitelist IPs, so can’t I just whitelist Googlebot so it sees a normal 200 homepage and all will be well?” Honestly? This makes me flinch just a tad – any point at which you are giving a search engine special treatment, say by letting it get easily at content users have to queue for or simply can’t reach at the time, you’re veering dangerously close to Google’s own definition of cloaking, which violates their Webmaster Guidelines:

“the practice of presenting different content or URLs to human users and search engines”

Can you do this without seeing enormously adverse results? Well…probably. Best practice vs what you can get away with are often two different things. Googlebot is a smart cookie, will normally work out what is going on and is often surprisingly forgiving in circumstances like these.

But sometimes it isn’t. Sometimes it will decide to make an example of someone skirting – or over – the line of the guidelines, and you can’t be sure it won’t pick you. That’s why they call it best practice. Not only that, but remember that other search engines with increasingly significant market share, including Bing, will struggle to understand what is going on unless you follow the best practice solution and give them a nice clear instruction. It comes down to how comfortable you – and your brand – are with the risk of long-term issues versus the short-term setup effort. The only one who can answer that is you!