Indexing a refreshed website on Google

I’m working on a refreshed website for Central Bedfordshire Council.

This will mean a new structure, new URLs etc. I want to avoid any pitfalls along the way, so these are the steps I am going to take.

Site Map

Create a site map and submit this via the Google Search Console.
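
For reference, generating the sitemap itself is easy to script. Below is a minimal sketch in Python, assuming our CMS can export the page list as a plain-text file (`urls.txt` and the output path are placeholders):

```python
# A minimal sketch: build a sitemap.xml from a plain list of URLs.
# "urls.txt" (one URL per line) is a placeholder -- substitute however
# your CMS exports its page list.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)

with open("sitemap.xml", "w") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write(f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>\n')
```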

Fetch as Google

I will use this tool in the Search Console to index the top 500 pages on the website. This looks like a manual process – unless I am missing something?

Tests from today show that it takes less than two minutes for Google to recognise a new URL once I submit it.
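
One semi-automated step I did find: you can nudge Google to re-read a submitted sitemap by pinging it, via the sitemap ping endpoint Google documents. A minimal sketch, with a placeholder sitemap URL:

```python
# Ask Google to re-read the sitemap via its documented ping endpoint.
# The sitemap URL below is a placeholder.
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "https://www.example.gov.uk/sitemap.xml"  # placeholder

ping = "https://www.google.com/ping?sitemap=" + quote(SITEMAP_URL, safe="")
with urlopen(ping) as resp:
    print(resp.status)  # 200 means the ping was received
```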

Redirect key URLs

The new website will be on the same domain using the same CMS, which means I can manually redirect the top 200 URLs to their new equivalents. These account for 85% of our traffic.
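
Before launch I can script a check that each redirect actually returns a 301 to the right target. A minimal sketch, assuming Python with the third-party `requests` library and a hypothetical `redirects.csv` of old-URL,new-URL pairs:

```python
# Verify each old URL answers 301 with the expected Location header.
# "redirects.csv" (rows of old_url,new_url) is a placeholder.
import csv
import requests

with open("redirects.csv", newline="") as f:
    for old_url, new_url in csv.reader(f):
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        target = resp.headers.get("Location", "")
        if resp.status_code != 301:
            print(f"WARN {old_url}: expected 301, got {resp.status_code}")
        elif target != new_url:
            print(f"WARN {old_url}: redirects to {target}, not {new_url}")
        else:
            print(f"OK   {old_url} -> {new_url}")
```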

Remove URLs

If I redirect the key URLs then, in theory, I should go ahead and remove the old ones from the Google search results to allow the new pages to push through via submitting the Site Map and Fetch as Google. I’m slightly hesitant, though: if (for some reason) these steps don’t work, we could temporarily disappear from the Google search results.

Other steps?

I’d like to hear your feedback on any other steps you would take, or advice on which order to take the steps I have described above.

Much appreciated!

Alan

7 thoughts on “Indexing a refreshed website on Google”

  1. I would say you’re overthinking it a little bit. You don’t need to remove any URLs: use a 301 redirect if the page has moved permanently. If it doesn’t exist any more then just make sure it is returning a 404 header. Google will handle the rest, replacing the old URLs with the new ones and dropping any pages from its index that 404. I also don’t think it’s necessary to manually submit any pages. If they are in your sitemap, Google will find them. I don’t know that submitting manually speeds anything up.

    Also, be patient. Google’s index doesn’t change overnight, and it has a long memory. It may continue to crawl old URLs on your site for some time. Just make sure they continue to either 301 or 404.

    • Hi Brad

      Possibly overthinking like you say – but it’s a high-profile project and we want it to be a success on day 1, so I want to cover every angle possible.

      I’ll take a look in more detail at 301 redirects though, and it looks from yours and other comments like submitting the new Site Map on launch day is going to be the #1 priority.

      Thanks

      Alan

      • Alan – one point I forgot, and which I think is most crucial when considering SEO changes and also for monitoring afterwards: you want access to your server log, and it’s worth the dev time to build a program to filter and output the activity of Googlebot as it crawls the site. Where is Googlebot spending its time? Is it crawling where you want it to crawl? Is it finding the proper headers (i.e. 404 or 301) for pages that no longer exist? Is it getting bogged down crawling URLs that aren’t important or that you didn’t know existed? Crawl budget is a valuable resource. You can learn a lot from looking at what Googlebot is doing on your site.
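
        A minimal sketch of that kind of filter, assuming Python and the standard “combined” access-log format (“access.log” is a placeholder filename):

        ```python
        # Tally which paths Googlebot requests and which status codes it
        # receives. Note the user agent alone can be spoofed; a rigorous
        # check would also verify Googlebot IPs via reverse DNS.
        import re
        from collections import Counter

        LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

        paths, statuses = Counter(), Counter()
        with open("access.log") as log:  # placeholder filename
            for line in log:
                if "Googlebot" not in line:
                    continue
                m = LINE.search(line)
                if m:
                    paths[m.group("path")] += 1
                    statuses[m.group("status")] += 1

        print("Status codes served to Googlebot:", dict(statuses))
        for path, count in paths.most_common(20):
            print(f"{count:6d}  {path}")
        ```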

  2. I’d also ensure the 404 page for any failed redirects / missing / lost links has a search option or a decent internal sitemap, rather than JUST a funkily designed one.

    If it were me, I’d hold off on removing URLs until you know all the redirects work and the site is running smoothly. But that’s just me.

    Remember there are other search engines as well as Google to consider…

  3. Hi Alan

    Site Map:
    Yes, do this once your redirects are in place.

    You could/should even submit a sitemap of the *OLD* URLs if you have them. This will speed up Google seeing the redirects, no harm in that.

    Fetch as Google:
    Won’t do any harm but that’s really what the sitemaps will help with. And even if you didn’t submit a sitemap or do Fetch as Google, Google would still find the new URLs eventually – fairly quickly for the ‘top’ ones, and maybe a few days/weeks later with the others. So if there’s time I would do this step but if you’re up against it I’d lower the priority of this one.

    Redirect key URLs:
    Try and redirect as many of the other 15% of pages as possible too if there’s time. Even if no-one visits them, they may have external links pointing to them which will be helping the site.

    You don’t say, but I assume you’re doing one-to-one redirecting, i.e. not bulk redirecting a big batch to the homepage?

    Remove URLs:
    Avoid this. It’s a pain to do, as you can’t just instruct Google to remove them; Google needs to see in (I think) your robots.txt file or on the page itself (with a robots ‘noindex’ meta tag) that you want them removed. Let them fall out naturally, which the above steps (Redirects, Sitemap) will help speed up.

    Other steps?
    Hmm.
    – Do the new pages have the same/similar content as the old?
    – Make sure internal links are updated to point to the new content. Some say you can even leave the internal links pointing at the old URLs for a period, again just to make Google ‘see’ the redirects. Either is fine, as long as you know why you’re doing it.
    – Make sure canonical tags are updated (if there are any) to point to the new URLs and not reference the old ones.
    – If there’s a staging server you’re using before pushing live, have you made sure any robots.txt blocking rules are removed? See that a lot. (There’s a rough check for this and the canonical point sketched after this list.)
    – Make sure your Google Analytics (or other analytics) is carried over.
    – Annotate in GA that you migrated on this date.
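
    The robots.txt and canonical points are easy to sanity-check with a script. A rough sketch, assuming Python with the third-party `requests` library (the domain and sample paths are placeholders):

    ```python
    # Two quick pre-launch checks: a leftover staging "Disallow: /" in
    # robots.txt, and canonical tags still pointing at old/off-site URLs.
    import re
    import requests

    BASE = "https://www.example.gov.uk"            # placeholder domain
    SAMPLE_PATHS = ["/", "/bins", "/council-tax"]  # placeholder pages

    robots = requests.get(BASE + "/robots.txt", timeout=10).text
    if re.search(r"^Disallow:\s*/\s*$", robots, re.MULTILINE):
        print("WARN: robots.txt still has a blanket 'Disallow: /' rule")

    for path in SAMPLE_PATHS:
        html = requests.get(BASE + path, timeout=10).text
        canonical = None
        for tag in re.findall(r"<link[^>]*>", html, re.I):
            if "canonical" in tag.lower():
                m = re.search(r'href=["\']([^"\']+)', tag)
                if m:
                    canonical = m.group(1)
                break
        if canonical is None:
            print(f"WARN {path}: no canonical tag found")
        elif not canonical.startswith(BASE):
            print(f"WARN {path}: canonical points elsewhere: {canonical}")
        else:
            print(f"OK   {path}: canonical is {canonical}")
    ```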

    I think that’s probably everything as far as SEO goes. Hope that helps.

    Jon

  4. Hi Jon

    Thanks for the comments.

    Site Map – yes, I agree, we should submit the current site URLs. I’ll look at that.

    Redirecting URLs – this will be done manually, pointing old URLs to new ones. We have 1,200 pages on the site so it’s not possible to go deeper than the top 200. Last time we refreshed the site we did no redirects and Google re-indexed in a few days. But to provide a far better customer experience, I want to cover the time it does take with some of the steps above.

    On other steps, all content is being rewritten from scratch, new structure, format – the works! So it should be a lot easier for Google to index properly.

    We’re launching a BETA site in Jan with robots.txt rules in place to block it, so yes, good point to remember to ensure this is unblocked when we switch to live.

    Thanks

    Alan
