As customers demand more expansive and interactive multichannel engagement, many conventional content management systems are falling short of the rich media experiences brands need to stay ahead of the content curve. This has driven demand for additional plugins, speed-focused CDNs, and even entirely separate content marketing platforms that plug into existing architecture to expand rich media capabilities.

Because of the “plug and play” nature of these systems, a lot of them end up deploying content via JavaScript. These days that’s almost never an issue for users, but search engines are another story. It’s all very well deploying beautiful rich content across your website, but if nobody finds it because your SEO tanks, that’s a misfire for your brand as a whole.

Never fear: the intrepid 4Ps Labs team has put together a guide to the SEO implications of these content marketing platforms and their JavaScript delivery.

Rich Media Platforms

Script Served Content On Google & Yandex

Google and Yandex are probably the most similar of the major search engines in terms of crawling capability and technology. Both specifically request that webmasters do not block CSS and JavaScript assets in their robots.txt files, in order to allow full indexing. Yandex still supports the escaped-fragment method of AJAX indexing. (Google has formally deprecated this method, saying it is no longer necessary, although it does still support it – so no need to panic if your site is set up to use it.)
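To make that concrete, here is a minimal robots.txt sketch of what to avoid and what to allow – the directory paths are illustrative, not a standard layout:

```
# Don't do this – blocking script and style directories stops Google
# and Yandex from rendering your pages fully:
# User-agent: *
# Disallow: /js/
# Disallow: /css/

# Instead, leave script and style assets crawlable:
User-agent: *
Allow: /js/
Allow: /css/
```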

So does this mean you’re all set for indexing? On most content marketing platforms that deploy via JavaScript, yes, it does. We’ve experimented with deploying content through a script container using Google Tag Manager. While our results suggest that indexing isn’t as smooth and rapid as with conventional inline content, they do prove that Google definitely does index – and can rank – content that is served via a script.
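For a sense of what that kind of deployment involves, here is a minimal sketch of the sort of client-side injection a GTM Custom HTML tag performs. The function name, markup, and element ID are illustrative only – not part of any GTM or platform API:

```javascript
// Build a rich heading as an HTML string (illustrative markup only).
function buildRichHeading(title, blurb) {
  return '<div class="rich-heading">' +
    '<h1>' + title + '</h1>' +
    '<p>' + blurb + '</p>' +
    '</div>';
}

// In the browser, a tag would inject this on page load, e.g.:
// document.getElementById('heading-slot').innerHTML =
//   buildRichHeading('My Web Store', 'Beautiful stuff and thangs.');
```

The crucial point for SEO is that none of this markup exists in the page’s source HTML – it only appears after the script runs, which is exactly why crawler rendering capability matters.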

That isn’t the end of the story, however.


The other major search engines include notables like Bing and Baidu, which can’t be ignored even if they aren’t a focus for your particular search strategy. Baidu is, of course, the only real priority for mainland China, and Bing’s search market share is growing rapidly: the most recent stats from comScore confirm that Bing has passed 20% of US search market share, and since Bing also powers Yahoo, that gives it the better part of 35% of US search all told. That’s over a third of a pretty powerful search audience that no brand in its right mind should ignore.

The bad news is that Bing’s and Baidu’s bots aren’t anywhere near as smart as Google’s or Yandex’s, and they can’t parse or index content that is served via a script. Does that mean you need to reject these rich content platforms, or sacrifice at least a potential third of your search audience?

Not at all.

Firstly, ask about the search indexing capabilities of your content platform early in the acquisition process. Get your development team involved as soon as possible to make sure your existing website systems can talk to an SEO-friendly integration of the new content platform. What you’re looking for is the ability to embed an inline HTML equivalent of the script-served content into the output source code of your pages. This doesn’t necessarily need to contain all the styling and other elements, but make sure it contains key markup like H tags and fully readable text content.

But wait, I hear you say – isn’t this a form of cloaking? Hiding text and links in divs designed only to be read by search engines rather than users? Am I sitting on a Google penalty waiting to happen?

Actually, no.

Google’s own guidelines point out that not all hidden text is deceptive, and one of the scenarios they specifically discuss is a site using technologies, like JavaScript, that aren’t accessible to all visitors. The way they ask webmasters to deal with this is to place the content inline on the website, wrapped in a <noscript> tag, so it is also served to users with JavaScript disabled – which is in essence what search crawlers like Bingbot and Baiduspider are.


The full details are on Google’s quality guidelines page about hidden text and links, but the pertinent bit is this specification for what to do about content served via JavaScript:

“Place the same content from the JavaScript in a <noscript> tag. If you use this method, ensure the contents are exactly the same as what’s contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser.”

For example, one common use of an external content platform is to add rich media headings to the top of category or product listing pages on eCommerce sites. In this case you should end up with output source code that looks roughly like this (the script part will of course vary depending on the platform you’re using for rich content delivery; the example below is based on some output from our chums at Amplience):

<script type="text/javascript">
// Platform delivery script goes here – the exact call varies by
// provider; the original output fragment included an options object
// like: {type:"html5",src:"", xd:[""]},
</script>

<noscript>
<div><h1>My Web Store</h1>
<p>My web store sells a lot of beautiful and purdy stuff and thangs.</p>
<img src="" alt="My Web Store Banner"></div>
</noscript>


Provided this guideline is followed, there’s no violation in place and all is well. Just make sure you use that <noscript> tag as a clear indicator to all search engines that you’re not up to anything cheeky, and ensure you replicate exactly what would otherwise be rendered by your rich content platform – no sneaking in anything different (or additional), or you’ll be ripe for a Google slap.
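As a rough safeguard, you could add a sanity check to your build or QA process that compares the visible text of the <noscript> fallback against what the script would render. This is a sketch with our own helper names, not a platform feature, and the regex-based tag stripping is only suitable for simple, trusted markup like the example above:

```javascript
// Strip tags and collapse whitespace so only text content is compared.
// (Naive regex stripping – fine for simple known markup, not a full
// HTML parser.)
function visibleText(html) {
  return html.replace(/<[^>]*>/g, ' ').replace(/\s+/g, ' ').trim();
}

// True when the noscript fallback carries exactly the same text as
// the script-rendered content – the condition Google's guideline asks for.
function noscriptMatchesScript(scriptHtml, noscriptHtml) {
  return visibleText(scriptHtml) === visibleText(noscriptHtml);
}
```

Run against the two versions of your page heading, a mismatch flags exactly the kind of divergence that could look like cloaking.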