elcomCMS Site Go-Live Checklist
We've been working with elcomCMS for many years now, and helped with a number of site go-lives. In this post I wanted to go through a simple checklist of items to include in your own elcomCMS site go-live process.
elcomCMS is a pretty feature-rich product, so in the interests of keeping this post as simple as possible there are a number of items I've not covered, eg canonical URLs, Google parameter insertion, download tracking, code blocks for setting Google Analytics custom variables, and article meta robots tags. Leave a comment if there are any in particular you'd like covered.
Although this is intended as a go-live process, it doesn't hurt to review your own site and ensure the following are in place, even if it has been live for a while now.
If you're going live with a completely new site (ie the domain is newly registered) then the process is a little simpler (eg you don't need to get redirects in place). Quite often, though, the sites I've been involved with have been site refresh projects, where the domain has been around for a while and the project is about replacing it with a new site design - and usually a change of CMS (eg moving from another CMS to elcomCMS).
With that in mind I'm going to include the steps as if they apply to an existing site being refreshed.
Here are the general steps:
1. Set the basic on-page SEO items
2. Put 301 redirects in place
3. Create an XML sitemap
4. Exclude Site_Layout and other admin assets from Google
5. Add Google Analytics
6. Update robots.txt to unblock the site
7. Set up Google Webmaster Tools and submit the sitemap
8. Check mobile requirements
9. Implement any custom analytics reporting
And now the details:
In the lead-up to go-live you should ensure that the basic on-page SEO items are set. These include article attributes such as page titles, meta descriptions, URLs and canonical URLs, as well as article content such as headings, lists and general content items.
I won't go into detail on all of these since they are generally pretty well known. However I will point out a few key things.
Page Title and user-friendly URL are set at the top of the Article attributes screen. Meta Description is set lower down, in the Metadata -> Meta tags section.
Canonical URLs can also be set at the article level, along with Meta robot tags. You can safely ignore the Meta keywords tag these days - it is not required anymore and doesn't impact SEO.
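As a quick reference, once these attributes are set the rendered head of an article should end up containing tags along these lines (the page name and URLs below are placeholder examples only):
<title>Contact Us | Example Company</title>
<meta name="description" content="How to get in touch with the Example Company team, including phone, email and office locations.">
<link rel="canonical" href="http://www.example.com/contact-us">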
This is a crucial step for a site refresh. It is very important that all of the old site URLs are redirected to their new site equivalent URLs (here's a case study of what can happen if you don't put 301 redirects in place).
There's a few ways to do this, and depending on the size of the site one may be preferred.
For small sites, using the redirect settings within each article is probably easiest. Simply enter the old URLs that should be redirected to the new article in the article's redirect settings.
For larger sites, building a redirect mapping file is probably easiest since it can be easily constructed from a spreadsheet, and is also very easy to update and add to. Because elcomCMS is a Microsoft .NET based product you'll generally be working with IIS. I've covered how to build and load the rewritemap.config redirect files here, so check that out for further details.
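For reference, the rewrite map approach is essentially a list of old-to-new URL pairs that the IIS URL Rewrite module checks each request against. A minimal sketch is below - the file name, map name and URLs are placeholders, so adjust them to your own site. First, the map of old to new URLs (eg in rewritemaps.config):
<rewriteMaps>
  <rewriteMap name="OldSiteRedirects">
    <add key="/old-about-page.html" value="/about-us" />
    <add key="/old-contact.html" value="/contact-us" />
  </rewriteMap>
</rewriteMaps>
And then a single rule in web.config that looks up each request in the map and issues a 301 (Permanent) redirect:
<rule name="Old site redirects" stopProcessing="true">
  <match url=".*" />
  <conditions>
    <add input="{OldSiteRedirects:{REQUEST_URI}}" pattern="(.+)" />
  </conditions>
  <action type="Redirect" url="{C:1}" redirectType="Permanent" />
</rule>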
Creating an XML sitemap is a good way to guide Google as to what is important on the site and should be indexed.
Use the XML sitemap section to include all relevant content folders, but exclude any admin folders (eg site_layout) and membership folders (eg pages that will only be viewable by logged-in members). Create the XML sitemap and set it to be regenerated regularly so it stays current as content is updated.
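For reference, the generated file follows the standard sitemaps.org format - a minimal example with placeholder URLs looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/about-us</loc>
  </url>
</urlset>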
Usually sites have a Site_Layout folder that includes lots of behind-the-scenes assets (eg footers, banners, embedded articles). As a general rule it is best to keep these pages out of the Google index, since it is unlikely a person would want to search for and view any of these items directly.
To exclude them from Google, the noindex and noarchive meta tags should be set. Note: it is ok to allow following of links in these items (ie usually leave the nofollow unticked), but it will depend on the particular assets.
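In the rendered page source this ends up as a robots meta tag along these lines (with following of links still allowed):
<meta name="robots" content="noindex, noarchive, follow">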
Google Analytics (or other analytics package) is important to get in place just prior to go-live.
It can be placed in a number of spots, but the recommended practice is to insert it into one of the global master pages.
It is possible to insert it using the Code Insertion box in Global Admin, however please note that this code is not inserted into all modules (eg the ecommerce module doesn't include it in its pages).
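For reference, the standard Universal Analytics snippet (the current Google Analytics code at the time of writing) is pasted just before the closing </head> tag of the master page - UA-XXXXXX-X below is a placeholder for your own property ID:
<script>
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
  ga('create', 'UA-XXXXXX-X', 'auto');
  ga('send', 'pageview');
</script>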
Usually when developing the site, it is in a staging environment and has been blocked from Google (googlebot crawler) using the robots.txt file.
Once the site goes live it is important to unblock the site from Google. This is managed via the Admin -> robots.txt settings.
If a site is blocked (eg during development on a staging site) the robots.txt will have a blanket disallow like this:
User-agent: *
Disallow: /
It is important that this blanket disallow is removed once the site goes live, and changed to something that only blocks areas such as the login page and the FTP folder.
On a live site, this is GOOD:
User-agent: *
Disallow: /login.aspx
On a live site, this is BAD:
User-agent: *
Disallow: /
Once live it is important to update the Google Webmaster Tools account for the site (or create an account if one doesn't already exist).
Add the XML sitemap (created earlier in step 3) to the Google Webmaster Tools account.
It usually only takes a day or two for Google to crawl the sitemap and start indexing the pages. It is normal for only a subset of the pages to be indexed by Google - so don't be worried if only 60% of the pages in the sitemap are indexed.
If you are especially keen you can also set up Bing Webmaster Tools and submit the sitemap there as well.
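As a small extra, both Google and Bing will also pick up a sitemap referenced directly from robots.txt - assuming the sitemap sits at the site root, it's a single additional line:
Sitemap: http://www.example.com/sitemap.xml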
If the site is responsive then no further changes to the above are required.
However, if the site is adaptive and renders different versions for different devices, there may be some additional implementation required - for example, signalling dynamic serving to Google with the Vary: User-Agent HTTP header, or adding rel="alternate" and rel="canonical" annotations if separate mobile URLs are used.
Getting Google Analytics in place with the default settings is generally fine for 90% of the reporting you'll be doing.
However, for full insight into sites I'm generally doing a fair bit of custom setup these days, including specific conversion goals, custom segments, custom reports and custom variables where required. For simplicity I've not covered them here, but if there are specific custom reporting requirements noted by the client, it is important to implement these as close as possible to go-live.
Hope this helps. If you have any questions please leave a comment or feel free to contact me directly.