12.09.19

Seven ways to make site migrations a crawl in the park with Screaming Frog

Site migrations can unnerve even the most experienced “I-was-there-when-it-was-AltaVista” SEOs. But the right tools and checklists can lighten the migration workload.

Here are seven ways the iCrossing UK SEO team use Screaming Frog to make the migration minefield a walk (or crawl) in the park...

1. Crawl the current site(s) to compare, post-migration

Whenever you take on a migration, it’s best practice to crawl the pre-migration site(s) first. This will give you a complete record of URLs and site structure to compare against your post-migration site, making it easier to spot any URLs that go missing, are no longer internally linked to, or have changed their indexability. All those things that shouldn’t happen during a migration but can – even with your best preparation.

Check the Directives tab in the Spider to make sure you have a list of the noindexed pages. Then you can make sure they stay that way post-migration.

Tip – remember to crawl all areas of the site, including nofollow pages which aren’t crawled by default. To do this, make sure ‘Follow Internal “nofollow”’ is ticked (Configuration > Basic > Follow Internal “nofollow”).

Changing your robots.txt crawl settings to ‘Ignore robots.txt but report status’ might make your crawl larger, but it’ll be useful when comparing the status of pages pre and post-migration to make sure none have slipped through.
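Once you have both crawls exported (for example, the Internal tab as CSV), the pre/post comparison itself can be scripted. Here's a minimal Python sketch; the filenames are hypothetical and the `Address`/`Indexability` column names are assumptions based on a typical Internal export, so check them against your own version of the Spider:

```python
import csv

def load_crawl(path):
    """Read a Screaming Frog 'Internal' CSV export into {url: indexability}."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Address"]: row.get("Indexability", "")
                for row in csv.DictReader(f)}

def compare_crawls(pre, post):
    """Compare two {url: indexability} dicts from pre/post-migration crawls.

    Returns (missing, changed):
      missing - URLs present pre-migration but gone (or no longer linked) after
      changed - URLs whose indexability flipped, as {url: (before, after)}
    """
    missing = sorted(set(pre) - set(post))
    changed = {url: (pre[url], post[url])
               for url in set(pre) & set(post)
               if pre[url] != post[url]}
    return missing, changed
```

Point `load_crawl()` at your two exports (e.g. `compare_crawls(load_crawl("pre.csv"), load_crawl("post.csv"))`) and you get an instant list of URLs that vanished or changed their indexability, rather than eyeballing two spreadsheets.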

Case study: Flying past geo-redirects

We recently worked with a client that was in the process of migrating several of their EMEA websites. Although we could crawl their EN and FR sites, their DE site kept redirecting unexpectedly due to a geo-redirect.

 

But all was not lost. By using Forms Based Authentication (Configuration > Authentication > Forms Based), we could load up the website within the built-in browser and navigate to the DE site, which allowed a cookie to be dropped on the page. So, instead of being redirected back to the EN version of the site, we could crawl the DE site without an issue.

Case study: Crawling the uncrawlable

Although not technically something encountered during a migration, the Forms Based Authentication feature of the Spider can also be very useful when websites won’t play crawl – and can be used to bypass those pesky CAPTCHA forms (technically I’m not a robot, so I’m not lying…).

As staging environments are often password protected, Forms Based Authentication can even be used to crawl a website that needs login details first.

2. Crawl multiple sites at once

If your migration is consolidating several sites at once, it’s good practice to make a full list of URLs pre-migration. But we busy SEOs don’t have hours to spend running multiple crawls (there are links that need building, too!). The good news? Screaming Frog lets you crawl several domains at the same time.

Source: Twitter

3. Decide which pages to keep and which to consolidate

As part of a migration you’ll usually need to combine multiple pages into a single authority page on a topic, and at the same time, gather information like sessions, clicks, and referring domains.

Screaming Frog streamlines this consolidation process by pulling all this data during the crawl. It uses the APIs offered by Google Analytics, Google Search Console, Ahrefs, Majestic and Moz. Simply connect to your APIs (Configuration > API Access) and crawl your site with ease, waving goodbye to hours spent downloading URLs in separate platforms and wasting time with Excel.

4. Crawl redirects

One of the most beneficial uses for the Screaming Frog SEO Spider is to check URLs are redirecting to the right location by crawling redirects. Screaming Frog themselves have an excellent guide on this, but here’s a quick recap on the steps to take:

  1. Switch to List Mode

  2. Make sure ‘Always Follow Redirects’ is ticked

  3. Crawl the website

  4. Export the Redirect Path

  5. Identify any URLs that don’t match (common issues include missing trailing slashes, additional subfolders, redirect chains, capital letters in URLs, and redirects to pages with 4XX errors – though hopefully not all at once).
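The mismatch-spotting in the final step can also be automated. Here's a rough Python sketch, assuming a hypothetical redirect map of old URL to intended destination alongside the final URLs the crawl actually observed; it flags the common trailing-slash and capitalisation culprits by name:

```python
def audit_redirects(expected, observed):
    """Flag old URLs whose crawled destination doesn't match the redirect map.

    expected: {old_url: intended_destination}
    observed: {old_url: final_url_after_following_redirects}
    Returns {old_url: (intended, actual, note)} for every mismatch.
    """
    issues = {}
    for old, intended in expected.items():
        actual = observed.get(old)
        if actual == intended:
            continue  # redirect landed exactly where planned
        if actual is None:
            note = "not crawled"
        elif actual.rstrip("/") == intended.rstrip("/"):
            note = "trailing slash"        # only / at the end differs
        elif actual.lower() == intended.lower():
            note = "capitalisation"        # only letter case differs
        else:
            note = "mismatch"
        issues[old] = (intended, actual, note)
    return issues
```

Feed it the destinations from your exported redirect report and you get a labelled worklist instead of a manual spreadsheet comparison.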

5. Check rendering issues before they become issues

Nobody likes a website that doesn’t render properly. And of course, nobody likes a website that isn’t optimised for mobile.

Make sure your post-migration site loads properly and is correctly sized for mobile by taking rendered screenshots of key pages (Configuration > Spider > Rendering > JavaScript > Window Size – Googlebot Mobile:Smartphone).

You can even save the screenshot to quickly send to your client to show them what the site looks like on mobile.

6. Test robots.txt files

Ever assumed nothing will change during a migration? It’s a classic mistake, right up until you’re faced with URL structures changing, internal links disappearing and never-before-seen parameters cropping up.

Thankfully, there are steps you can take to minimise this risk. When reviewing a staging environment with Screaming Frog, you can mimic the robots.txt file from the live site (Configuration > Robots.txt > Custom > Download live site robots.txt). With this, you can crawl the staging environment with the real-life limitations applied.

It’ll help to identify the parameters that will stay blocked, and whether any new ones have arisen during development that could lead to crawl budget waste or page duplication. This is particularly useful for hefty e-commerce sites.

Want to step it up a level? You can then head back to your custom robots.txt file to test the disallows you’ll want to recommend to your client, making sure they’re blocking the correct parameters (and nothing else!)

If you want to block the /ideas/ subfolder (maybe you can’t handle the great ideas), you can add it to the custom robots.txt file and test whether it’s blocked. The crawl will then obey your custom robots.txt file, showing you how the site would react. You can also update the file and check whether new parameters or URLs have cropped up that you don’t want pushed live.
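You can sanity-check the same disallow rules outside the Spider with Python's standard-library robots.txt parser. A quick sketch using the /ideas/ example above (the domain and paths are purely illustrative):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical custom robots.txt you're planning to recommend to the client
custom_rules = """\
User-agent: *
Disallow: /ideas/
"""

parser = RobotFileParser()
parser.parse(custom_rules.splitlines())

# /ideas/ URLs should come back blocked, everything else left crawlable
blocked = not parser.can_fetch("*", "https://www.example.com/ideas/great-idea")
allowed = parser.can_fetch("*", "https://www.example.com/blog/post")
```

It's a handy second opinion before the rules go live: run your full URL list through `parser.can_fetch()` and confirm only the intended parameters and subfolders are caught.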

7. Scheduling crawls

Many migrations take place outside working hours, since these tend to be quieter periods for website traffic. But you don’t want to wake up at 2am to get the crawl going (we hope!). With Screaming Frog’s crawl scheduling, you can run a crawl following a migration without sacrificing your wind-down hours outside of work.

This is simple to do…

  1. Set up the configuration you want to have in your scheduled crawl

  2. Save crawl configuration (File > Configuration > Save As)

  3. Set up crawl scheduling (File > Scheduling > Add) – this includes when you want the crawl to start, your configuration of choice, any APIs you want to include, and any exports you’d like (e.g. redirects or redirect chains), plus the file location to save them

  4. Sit back and relax

The benefits of scheduling a site crawl post-migration? You can get stuck into actionable recommendations straight away, as you won’t have to wait for a crawl to finish during work hours before identifying issues, making the whole process much more streamlined.

 

Want to know more about using Screaming Frog for a site migration? Get in touch at results@icrossing.co.uk.
