When you relaunch a website, there are many considerations to take into account before you hit “go”. One of these is how search engine spiders, such as Googlebot, “crawl” a website to discover new and updated pages to add to the Google index. These crawlers read the technical structure of your website, and they can run into errors when a new website build is developed without an SEO strategy behind it. For a digital strategist, some of the most important considerations in search are the technical elements that make up a website, so understanding what’s “under the hood” is imperative.

Why Screaming Frog?

At 4Ps Marketing, we are active users of Screaming Frog to assist us in this task. There are many similar tools out there, but this platform offers a user-friendly interface with exportable data. In this article, we cover the features of the software that are most useful when relaunching your website, in order to reduce the risk of traffic loss from technical issues.

Run an entire site crawl

The first step in preparing for a website relaunch is to run a full crawl of your existing website. To do this in Screaming Frog, enter the URL at the top and don’t forget to “exclude” any folders that contain pages or images you don’t want included in your crawl. Examples might be /images/, /admin/ and /forum/; these folders could contain many URLs that are unimportant or may cause the crawl to fail.
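Screaming Frog’s exclude list accepts regular expressions. As a rough illustration (the URLs and patterns below are hypothetical examples, not a real configuration), a few lines of Python show how such patterns filter URLs out of a crawl:

```python
import re

# Hypothetical exclude patterns, in the regex form Screaming Frog expects
EXCLUDE_PATTERNS = [
    r".*/images/.*",
    r".*/admin/.*",
    r".*/forum/.*",
]

def should_crawl(url, patterns=EXCLUDE_PATTERNS):
    """Return True if the URL matches none of the exclude patterns."""
    return not any(re.match(p, url) for p in patterns)

urls = [
    "https://www.example.com/products/widget",
    "https://www.example.com/images/logo.png",
    "https://www.example.com/forum/thread-42",
]

# Only URLs outside the excluded folders would be crawled
to_crawl = [u for u in urls if should_crawl(u)]
```

Excluding these folders up front keeps the crawl focused on indexable pages and avoids wasting crawl time on assets and private sections.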

Features: before and after the new website launches

While Screaming Frog has many features, we want to highlight those that are most useful when working on a site launch.

Feature: 301 (Moved Permanently) redirects

Before launch:
• Check the current 301 redirects to identify any potential redirect loops when the new website launches.

After launch:
• Check that the requested 301s have been implemented correctly.
• Clearly visualise the destination URLs to check they are as expected.
• Ensure the redirects are set as 301s, not 302s or 307s (temporary redirects).

Feature: Meta data

Before launch:
• Extract and record meta data so you can refer back to it if the migration loses any.
• Identify any missing meta data so you can implement it during the build.

After launch:
• Check that all meta data has been migrated as expected.
• Easily export the data to amend any new categories that have been missed.

Feature: Canonicals

Before launch:
• Identify the current canonical strategy.
• Identify instances where it is necessary to build canonicals into the new structure.

After launch:
• Check that canonicals have been implemented as expected.
• Identify any pages that are missing a canonical tag.
• Identify any canonical contradictions, which can cause issues if they have not been implemented correctly.

Feature: Analytics tracking code (most beneficial after launch)

• Set up filters to check that the source code contains the correct tracking code.
• This will highlight any pages or sections that do not include tracking.
• Without this, it is very difficult to verify that all pages contain the tracking code.
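The redirect checks above can also be sanity-checked offline. As a minimal sketch (the URLs and redirect map below are hypothetical; in practice you would export this data from a Screaming Frog crawl rather than hard-code it), this flags temporary redirects and redirect loops in a redirect map:

```python
# Hypothetical redirect export: old URL -> (HTTP status code, destination URL)
redirects = {
    "/old-page": (301, "/new-page"),
    "/old-blog": (302, "/blog"),   # temporary redirect - should be a 301
    "/a": (301, "/b"),
    "/b": (301, "/a"),             # redirects back to /a, forming a loop
}

def find_temporary_redirects(redirects):
    """Flag redirects that are not permanent (301)."""
    return [url for url, (status, _) in redirects.items() if status != 301]

def find_loops(redirects):
    """Follow each redirect chain and flag starting URLs that revisit a URL."""
    loops = []
    for start in redirects:
        seen, current = set(), start
        while current in redirects:
            if current in seen:
                loops.append(start)
                break
            seen.add(current)
            current = redirects[current][1]
    return loops
```

Running both checks on the example map would flag /old-blog as a temporary redirect and /a and /b as a loop, which are exactly the problems worth fixing before traffic hits the new URLs.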

Following these steps before and after you launch your website will aid you in designing a more successful website strategy and will reduce the chance of visibility loss in search. The tool has many other features not covered in this article, but you can find a full list of features over at Screaming Frog.

4Ps isn’t just another SEO agency. To discuss how SEO and analytics are evolving together in order to keep pace with new developments in user interaction and technology, give us a call on +44 (0)207 607 5650 for a no-obligation coffee and chat about data, marketing and user behaviour across all inbound channels. What could a 4Ps analytics consultant do for your business?