Preparing for a site launch
Creating a website is often a chaotic process, and as the deadline looms it is easy to let quality slip. Here is the checklist I follow to make sure nothing gets left out.
Final cross-browser test ¶
Perhaps the most important check of all: a final cross-browser test once all content and images are in. HTML can easily become poorly formed during content entry, so checking again against your agreed browser list ensures the site will display consistently at launch.
Validate code ¶
I validate code at the template stage, but I also like to complete a further validation check before the site launches. At this stage the errors found are minimal, but it is a final quality assurance check. You can check both your HTML and your CSS with the W3C's free online validators. Validating code is good for many reasons, not least because it helps ensure search engine spiders can deep crawl your site at the earliest opportunity.
Check links ¶
You can guarantee that somewhere in your site there will be a broken link. There are many link checkers out there, but if your site isn't too big you can validate the entire site at the WDG HTML Validator, which will point out any broken links as well as any code errors.
Site features ¶
Somewhere along the line I always seem to forget something, so I check that the following features are in place before launch.
404 page
Make sure a bespoke 404 page is in place and configured on the server. This helps users recover from errors and helps search engines keep their indexes up to date.
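How you wire this up depends on your server. As a minimal sketch, assuming an Apache server and a page published at /404.html (both assumptions, not part of the original checklist), a single directive in .htaccess does the job:

```apache
# .htaccess — serve our bespoke error page whenever Apache returns a 404
# (/404.html is an example path; point it at whatever page you have published)
ErrorDocument 404 /404.html
```

Serving the page directly like this, rather than redirecting to it, keeps the 404 status code intact, which is what lets search engines drop dead URLs from their indexes.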
Print friendly CSS file
A print friendly CSS file is a simple but valuable addition to a site, and there are plenty of good print CSS tutorials around. It is a small job that noticeably enhances usability.
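The usual approach is to link a separate stylesheet with media="print". The selectors below are hypothetical (your navigation and sidebar will have their own ids), but they show the typical moves: hide the screen furniture, switch to legible type, and expose link destinations.

```css
/* print.css — linked from the page head with:
   <link rel="stylesheet" type="text/css" href="print.css" media="print" /> */

/* hide screen-only furniture (example ids — match your own markup) */
#navigation, #sidebar, #search { display: none; }

/* simple, legible type for paper */
body { font: 12pt/1.5 Georgia, "Times New Roman", serif; color: #000; background: #fff; }

/* print the destination of each link after its text */
a:after { content: " (" attr(href) ")"; }
```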
Google Sitemap XML file
Creating an XML sitemap file is a simple way of helping Google to spider your content. Many content management systems can generate the file for you, or you can use an online sitemap creator.
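The format itself, defined at sitemaps.org, is just a list of URLs with optional hints. A minimal example follows; the domain, date and values are placeholders, not taken from the article.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want spidered -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```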
Google Analytics
Adding the Analytics tracking code to your site allows you and your client to track its success. It is easily forgotten, but making sure it is set up, and that your client has access to the reports, is crucial.
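Google supplies the exact snippet in your Analytics account and it changes from time to time, so always copy it from there. Roughly speaking, though, you are pasting something like the following into the page head, with the account ID swapped for your own (UA-XXXXXXX-X below is a placeholder):

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXXX-X']); // placeholder — use your own account ID
  _gaq.push(['_trackPageview']);
  (function() {
    // load ga.js asynchronously so it doesn't block page rendering
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```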
robots.txt
If there are areas of the site you don't want spidered, you should ensure that a robots.txt file is in place on the server. If you forget and go live without it, content you wanted kept out of the indexes may well get spidered anyway.
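The file itself is plain text sitting at the root of the site. A small example, with hypothetical paths and an example domain; list whatever areas you want kept out, and you can point spiders at your XML sitemap while you are there:

```
# robots.txt — lives at http://www.example.com/robots.txt (example domain)
User-agent: *
Disallow: /admin/
Disallow: /staging/

# optional: tell spiders where the XML sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```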
These are simple features that should be on every professional site.
Generating free traffic ¶
It is well known that Google doesn't like island sites - that is to say, sites that have no links from other sites pointing at them. You can easily generate links by submitting your site to CSS galleries and exchanging links with other sites. If you get into one of the CSS galleries you are likely to get a good amount of traffic, and this will encourage Google to deep crawl your site more quickly. Submitting your site to search engines is a good idea too; most major search engines have a manual submission form.
Anything else? ¶
What do you do? Do you have a checklist when you launch a site? Or do you just do these things by habit?
See Also

- 10 ways to spot a good front-end coder: Front-end coding is certainly a craft and there are some amazing people doing it. When I visit a good site I often rummage around in the undergrowth to learn from the Masters. Here are ten practices I commonly see from the best coders.
- Site Navigation in CSS and XHTML: Coding navigation in CSS and XHTML properly is important for both search engines and accessibility. Here's my take on how to do it, although I would be very interested to hear what others think.
- The joy and curse of WYSIWYG: Allowing sites to be updated by non-technical staff is essential. WYSIWYG editors are a must in Content Management Systems, but can cause frustration too.