In 2019, when the SaaS and hardware solutions company ShockWatch rebranded to SpotSee, their keyword rankings, organic traffic, and leads dropped significantly. We performed a technical analysis of their new site and found an improper redirect chain from their old site. We also uncovered a manual penalty caused by improper schema markup.
It was a site migration mistake that cost them dearly.
Our strategy involved researching niche keywords that would resonate better with their prospects, setting up proper redirects from the old website, correcting the schema markup so the manual penalty would be lifted, and writing optimized titles and descriptions for the site’s core pages.
The result? SpotSee earned 34 SERP position increases for niche keywords, which led to a 77.8% increase in organic traffic and a 56.7% increase in organic conversions.
But SpotSee could have avoided this headache entirely had they engaged us for their site migration — and not after it had flopped. Which leads us to the importance of…
Auditing the staging site’s architecture to ensure keyword relevance
A staging site is a testing site — either a clone of your live site or a model of the site you’re building. During the construction of a new site or a website migration, it allows you to fix bugs, as well as test new features and adjustments in a safe environment before you go live with them on your actual site.
But if it’s just a test environment, should you do SEO on a staging site? Absolutely!
Begin by auditing the staging site’s architecture to ensure keyword relevance. Ask yourself: Do the niche keywords, primary keywords, and secondary keywords on each page make sense? Have you answered the questions searchers are asking when they enter those keywords? Are the pages organized in a logical fashion that leads searchers through the sales funnel in an efficient manner? Do the pages crosslink in a manner that enhances the user experience?
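One way to make this audit concrete is to script part of it. Below is a minimal sketch in Python (using the requests and BeautifulSoup libraries) that checks whether each staging page’s title and H1 actually mention the keyword that page is supposed to target. The staging URLs and keyword map are placeholders; swap in your own page-to-keyword mapping.

```python
# Minimal sketch: check that each staging page's title and H1 reflect its target keyword.
# The URL-to-keyword mapping below is a placeholder for your own keyword map.
import requests
from bs4 import BeautifulSoup

KEYWORD_MAP = {
    "https://staging.example.com/impact-indicators": "impact indicators",
    "https://staging.example.com/temperature-monitoring": "temperature monitoring",
}

for url, keyword in KEYWORD_MAP.items():
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    h1 = soup.h1.get_text(strip=True) if soup.h1 else ""

    # Flag any page whose title or H1 never mentions the keyword it should target.
    for label, text in (("title", title), ("H1", h1)):
        if keyword.lower() not in text.lower():
            print(f"{url}: target keyword '{keyword}' missing from {label}: '{text}'")
```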
Read our site migration success story to see how this works in practice — and how it helped recover millions of dollars in lost revenue.
Mapping the redirects to ensure link equity is preserved
The next thing to look at is the redirects. If you’re creating new versions of existing web pages, it’s important that visitors can still find those pages. You can use a mapping tool to create 301 redirects, which preserve the backlink equity that’s vital to your site’s authority. A 301 redirect tells search engines that the old page has permanently moved — so searchers should be sent to the new page, and the new page carries the relevance and authority of the original.
While it’s possible to use mass redirects, it makes more sense to review each redirect individually. That way, you can confirm each one is necessary and as efficient as possible, and make sure there are no orphaned pages or 404 errors. Unnecessary redirects and redirect chains slow your website down, and incorrect redirects detract from the user experience — so giving your 301s some extra attention can make a big difference from an SEO perspective.
Proper 1-to-1 URL redirect mapping is critical to your new site’s success. The redirects act as a guide for Google to understand that each new page should carry on the old page’s legacy. Mass redirects muddy the waters and result in lost authority.
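A short script can sanity-check that mapping before launch. Here is a minimal sketch that assumes a two-column CSV of old-URL, new-URL pairs (the file name and format are assumptions, not a standard); it flags old URLs that don’t redirect at all, don’t use a 301, take more than one hop, or land somewhere other than their mapped destination.

```python
# Minimal sketch: verify a 1-to-1 redirect map. Each old URL should answer with a single
# 301 hop that lands exactly on its mapped new URL.
# Assumes redirect_map.csv has two columns (old_url,new_url) and no header row.
import csv
import requests

with open("redirect_map.csv", newline="") as f:
    for old_url, new_url in csv.reader(f):
        response = requests.get(old_url, allow_redirects=True, timeout=10)
        hops = response.history  # every redirect response encountered along the way

        if not hops:
            print(f"NO REDIRECT: {old_url} answered {response.status_code} directly")
        elif hops[0].status_code != 301:
            print(f"WRONG TYPE: {old_url} used a {hops[0].status_code}, not a 301")
        elif len(hops) > 1:
            print(f"CHAIN: {old_url} took {len(hops)} hops to reach {response.url}")
        elif response.url.rstrip("/") != new_url.rstrip("/"):
            print(f"WRONG TARGET: {old_url} landed on {response.url}, expected {new_url}")
```

Checking every pair this way is the “look at each redirect individually” step in practice: it catches chains and mis-mapped URLs before search engines ever see them.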
Staging site tech audit
Fix foundational issues before they become problems.
You’ll want to perform a tech audit on your staging site before the new site goes live. Potential problems range from incorrect redirects and slow page load times to videos that won’t play and forms that don’t behave the way you want them to. A staging site crawl and tech audit lets you identify and remedy these issues before they ever reach your live site.
Waiting until the site goes live to check for technical issues is like building a skyscraper before the blueprints are approved. Fixing problems becomes far more complex and costly once the site is live.
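As a rough illustration, part of that pre-launch check can be scripted. The sketch below (the staging URLs and the two-second threshold are placeholders, and it’s no substitute for a full crawl) flags non-200 responses, slow pages, and pages missing a title or meta description.

```python
# Minimal sketch of a pre-launch spot check over a list of staging URLs (placeholders below).
# Flags non-200 responses, slow responses, and pages missing a title or meta description.
import requests
from bs4 import BeautifulSoup

STAGING_URLS = [
    "https://staging.example.com/",
    "https://staging.example.com/products",
    "https://staging.example.com/contact",
]

for url in STAGING_URLS:
    response = requests.get(url, timeout=15)

    if response.status_code != 200:
        print(f"{url}: returned {response.status_code}")
    if response.elapsed.total_seconds() > 2:
        print(f"{url}: slow response ({response.elapsed.total_seconds():.1f}s)")

    soup = BeautifulSoup(response.text, "html.parser")
    if not soup.title or not soup.title.get_text(strip=True):
        print(f"{url}: missing <title>")
    if not soup.find("meta", attrs={"name": "description"}):
        print(f"{url}: missing meta description")
```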
At Digital Current, we create a new robots.txt file for the new domain and any subdomains to block content that doesn’t provide value for search engines. We also review templates for header tags, breadcrumbs, and schema to ensure search engines have no trouble extracting the information they need to index your new site properly. And we run speed tests on both the old site and the staging site to identify ways to improve site speed on the new site.
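Before that robots.txt goes live, it’s worth confirming it blocks what you intend and nothing more. Here’s a minimal sketch using Python’s built-in robots.txt parser; the domain and the two URL lists are placeholders for your own core pages and low-value paths.

```python
# Minimal sketch: sanity-check a robots.txt before launch.
# The domain and URL lists are placeholders for your own site.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

MUST_STAY_CRAWLABLE = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]
SHOULD_BE_BLOCKED = [
    "https://www.example.com/cart/",
    "https://www.example.com/search?q=test",
]

for url in MUST_STAY_CRAWLABLE:
    if not parser.can_fetch("Googlebot", url):
        print(f"BLOCKED core page: {url}")

for url in SHOULD_BE_BLOCKED:
    if parser.can_fetch("Googlebot", url):
        print(f"NOT BLOCKED low-value page: {url}")
```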
Setting up tracking codes
Last, but certainly not least, we set up tracking codes for Google Analytics, Google Search Console, Bing Webmaster Tools, Google Tag Manager, and any other third-party data sources such as Salesforce or Crazy Egg.
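A quick way to confirm those snippets actually made it onto the staging templates is to search each page’s HTML for the expected IDs. This is a rough sketch only; the staging URLs and the GTM/GA IDs are placeholders for your own.

```python
# Minimal sketch: confirm tracking snippets appear in each staging page's HTML before launch.
# The URLs and container/measurement IDs below are placeholders.
import requests

STAGING_URLS = [
    "https://staging.example.com/",
    "https://staging.example.com/contact",
]

EXPECTED_SNIPPETS = {
    "Google Tag Manager": "GTM-XXXXXXX",
    "Google Analytics": "G-XXXXXXXXXX",
}

for url in STAGING_URLS:
    html = requests.get(url, timeout=10).text
    for name, marker in EXPECTED_SNIPPETS.items():
        if marker not in html:
            print(f"{url}: {name} snippet ({marker}) not found")
```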
These tracking codes will be critical in the next step of the site migration, which we’ll discuss in Part 4: Monitoring and Reporting. (Catch up on Part 1 and Part 2!)
In the meantime, sign up for our email newsletter to get the latest digital marketing insights — and first access to our downloadable website migration checklist!