Website Utilities

Introduction

Creating a website is relatively easy, and browsers are very tolerant of small mistakes in HTML and CSS. What they show might not be what you want, but they will try to show something rather than a blank page. Even so, web page authors should try to keep up with the latest techniques and code. Over the years HTML, CSS and JavaScript have changed, and although most old code still works, such as deprecated HTML tags, there are now better ways of writing pages.

Almost every editor now has code hints, and there are utilities around to check your site for accessibility, errors and loading speed. Other utilities will check for server vulnerabilities. Others still will minify the JavaScript and CSS files, and some will even minify the HTML page itself.

I may as well say something about a pet peeve of mine. Sometimes when you look at a website's source code, the first screenful is nothing but included CSS and JS files. What on earth for? Do people really need to load everything on every page? Perhaps they do, but for myself I won't load the libraries for FancyBox, Google Charts, Prism or anything else I use unless the page actually uses them. The files themselves may be tiny, perhaps just a few kilobytes each, but they still use resources.
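One way to sketch the alternative is a small map from page features to the library each one needs, so a page pulls in only what it uses. The feature names and file paths below are hypothetical, not the real library files:

```javascript
// Sketch of per-page loading: map each page feature to the library it needs.
// The keys and paths are hypothetical placeholders.
const LIBRARIES = {
  lightbox: "/js/fancybox.umd.js",
  charts: "/js/google-charts-loader.js",
  highlight: "/js/prism.js",
};

// Return only the scripts a page actually needs, in the order requested.
function scriptsFor(features) {
  return features.filter((f) => f in LIBRARIES).map((f) => LIBRARIES[f]);
}

// A page that only shows code samples gets one file, not all three.
const scripts = scriptsFor(["highlight"]);
```

A build step or server-side include can then emit just those script tags, rather than the same wall of includes on every page.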

The Utilities

The idea behind all of these utilities is to produce a page that is free of errors and loads into the browser as fast as possible. I'm not even sure it is possible to optimize a website or page to pass every test, but these tools point out the major bottlenecks. Here are some of the tools I use:

Accessibility - The W3C page Web Accessibility Evaluation Tools List has 160 validators. I haven't yet made up my mind which ones I prefer to use.

Analytics - These analyze a website's traffic more than validate it, but they are useful in their own way. For someone who runs a website it's always nice to know that people are visiting it. This site gets around 100 visitors a day, my site about HMS Gambia gets around 14, and my old offroading site is lucky to get any visitors at all. But website analytics should be much more than that. I want to know how people are finding the sites, which pages are most popular, and whether any pages are having problems. The online one I use is Google Analytics.

Web Server logs can also be analyzed. There are several of these utilities but the classics remain Analog, AWStats and Webalizer. The original Analog site, www.analog.cx, is long gone but luckily it was mirrored. C:Amie took Analog on as a maintenance project. The files can be downloaded from the site or from GitHub. The results displayed by Analog are mostly text based, but can be "beautified" using Report Magic.

CSS - I use the W3C CSS Validation Service to check I haven't made any dumb mistakes in the site's CSS file. The W3C Nu HTML Checker can also check CSS files. I minify it using Minify, which can also handle JS files.

Editor - I've been using Adobe Dreamweaver to edit the pages for several years. The HTML hints are great. The only downside to the program is the spell checking. Every other editor has automatic spell checking; Dreamweaver never has, and it is annoying. Everyone makes spelling mistakes at some time or another, and it would be nice if checking was done as you type.

HTML - I use the W3C Nu HTML Checker and the W3C Markup Validation Service.

Images - Apart from video, images are the largest content on most websites. I optimize the images I use in Adobe Photoshop, and the Cloudinary Website Speed Test shows which images are slowing a website down.

JavaScript - JSLint is good at tidying up JavaScript and finding mistakes. It can be a little too strict at times, and sometimes rewriting the code to satisfy its rules breaks it when it is minified. I minify it using Minify, which can also handle CSS files.

Links - Xenu Link Sleuth has not been updated in years, but it's still very good at what it does. The W3C Link Checker is good as well.

Minification - This is the process of removing unnecessary white space and line breaks from files. It reduces the file size without more aggressive compression. I use Minify, which can handle both CSS and JS files. I should add that I run my own Apache webserver, the "Server in the Cellar", and that has the gzip deflate option turned on.
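To give a feel for what minification does, here is a deliberately naive sketch for CSS. Real tools like Minify parse the syntax properly and do far more; this just strips comments and collapses whitespace:

```javascript
// Naive CSS minification sketch: strip comments, collapse whitespace.
// Real minifiers parse the stylesheet and handle many more cases.
function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, "") // remove /* comments */
    .replace(/\s+/g, " ")             // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, "$1") // drop spaces around punctuation
    .trim();
}

// minifyCss("body {\n  color: red; /* brand */\n}") → "body{color:red;}"
```

Even this crude approach shrinks a typical hand-written stylesheet noticeably, which is why it's worth running before the server's gzip compression takes over.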

Mobile Friendly - Some tests are designed to check whether pages are mobile friendly. Google has two tools for doing this, Google PageSpeed and Mobile-Friendly.

Page Loading Times - Some tests measure everything about a page and give results based on an analysis of the whole thing. Solarwinds Pingdom is one such; others I use are Catchpoint Web Page Test and, of course, Google PageSpeed.
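The gzip deflate option mentioned under Minification is usually enabled in Apache through mod_deflate. A minimal sketch, assuming that module is available; the MIME types may need adjusting for a particular server:

```apache
# Minimal mod_deflate sketch: compress text-based responses before sending.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

Compression and minification complement each other: minification removes characters, gzip then compresses what remains on the wire.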

Search Engines - In June 2021, according to StatCounter, Google was used for 92.47% of worldwide searches while Bing was used for just 2.31%; Statista gave Google an 87.76% share and Bing 5.56%. Google has Google Search Console, which offers useful insights. Although Google is by far the most used search engine, there is also Bing Webmaster Tools.

Server - As I run my own Apache webserver, the "Server in the Cellar", I'm interested in tools that can help optimize and protect it. One of the better ones I've found is Detectify; the company once offered free scans for a single site, and although the offer was withdrawn years ago, I started using it then and they still scan mine. Immuniweb Website Security Test, Ionos Website Checker, Mozilla Observatory, and Pentest-Tools Website Vulnerability Scanner are also useful.

This page created February 7, 2021; last modified July 24, 2021