I may as well say something about a pet peeve of mine. Sometimes when you look at a website's source code, the first screenful is nothing but included CSS and JS files. What on earth for? Do people really need to load everything on every page? Perhaps they do, but for myself I won't load the libraries for FancyBox, Google Charts, Prism or anything else I use unless the page actually uses them. The files themselves may be tiny, perhaps just a few kilobytes, but each one still costs a request and uses resources.
Woktron has a good, understandable guide to optimizing Apache for performance.
The idea behind all of these utilities is to produce a page free of errors that loads into the browser as fast as possible. I'm not sure it is even possible to optimize a website or page to pass every test, but these point out the major bottlenecks. Here are some of the tools I use:
Accessibility - The W3C page Web Accessibility Evaluation Tools List has 160 validators. I haven't yet made up my mind which ones I prefer to use.
Analytics - These analyze a website's traffic rather than validate it, but they are useful in their own way. For someone who runs a website it's always nice to know that people are visiting it. This site gets around 100 visitors a day, my site about HMS Gambia gets around 14, and my old offroading site is lucky to get any visitors at all. But website analytics should be much more than that. I want to know how people are finding the sites, which pages are most popular, and whether any pages are having problems. The online one I use is Google Analytics.
Web server logs can also be analyzed. There are several of these utilities, but the classics remain Analog, AWStats and Webalizer. The original Analog site, www.analog.cx, is long gone but luckily it was mirrored. C:Amie took Analog on as a maintenance project; the files can be downloaded from their site or from GitHub. The results displayed by Analog are mostly text-based, but can be "beautified" using Report Magic.
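At its core a log analyzer just counts fields pulled out of each log line. Here is a minimal sketch in Python of that counting, assuming Apache's common log format; the sample lines and paths are made up for illustration:

```python
import re
from collections import Counter

# Two sample lines in Apache common log format (made-up input).
LOG = """\
203.0.113.5 - - [08/Dec/2022:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 5120
203.0.113.7 - - [08/Dec/2022:10:16:01 +0000] "GET /hmsgambia/ HTTP/1.1" 404 312
"""

# Pull the request path and status code out of each line.
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

def tally(log_text):
    """Count hits per page and per status code, the kind of totals
    Analog, AWStats and Webalizer report."""
    pages, statuses = Counter(), Counter()
    for match in LINE_RE.finditer(log_text):
        path, status = match.groups()
        pages[path] += 1
        statuses[status] += 1
    return pages, statuses

pages, statuses = tally(LOG)
print(pages.most_common())   # hit counts per path
print(statuses)              # e.g. how many 404s were served
```

The real tools do far more (referrers, user agents, time buckets), but the 404 count alone answers one of my questions: which pages are having problems.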
Content Security Policy (CSP) - There are several tools available to check the validity of a website's Content Security Policy; among them are Content Security Policy (CSP) Validator, CSP Evaluator, Csper Policy Evaluator, Hardenize, Mozilla Observatory, Pentest-Tools Website Vulnerability Scanner, and Security Headers.
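A rough idea of what these evaluators do can be sketched in a few lines of Python: split the header into directives and flag the keywords most of them warn about, such as 'unsafe-inline'. The policy string here is hypothetical:

```python
def parse_csp(header_value):
    """Split a Content-Security-Policy header into {directive: [sources]}."""
    policy = {}
    for part in header_value.split(";"):
        tokens = part.split()
        if tokens:
            policy[tokens[0]] = tokens[1:]
    return policy

def risky_directives(policy):
    """Flag the keywords most CSP evaluators complain about."""
    flags = []
    for directive, sources in policy.items():
        if "'unsafe-inline'" in sources or "'unsafe-eval'" in sources:
            flags.append(directive)
    return flags

# A hypothetical policy for illustration.
csp = "default-src 'self'; script-src 'self' 'unsafe-inline'; img-src *"
policy = parse_csp(csp)
print(risky_directives(policy))  # ['script-src']
```

The online tools go much further (checking for missing directives, wildcard hosts, deprecated syntax), but this is the shape of the analysis.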
CSS - I use the W3C CSS Validation Service to check that I haven't made any dumb mistakes in the site's CSS file. The W3C Nu HTML Checker can also check CSS files. I minify the file using Minify, which can also handle JS files.
Editor - I've been using Adobe DreamWeaver to edit the pages for several years. The HTML hints are great. The only downside to the program is the spell checking: every other editor checks spelling automatically, but DreamWeaver never has, and it is annoying. Everyone makes spelling mistakes at some time or another, and it would be nice if checking were done as you type.
HTML - I use the W3C Nu HTML Checker and the W3C Markup Validation Service.
Images - Apart from video, images are the largest content on most websites. I optimize the images I use in Adobe Photoshop, but the Cloudinary Website Speed Test is good at showing which images are slowing a website down.
Links - Xenu Link Sleuth has not been updated in years, but it's still very good at what it does. The W3C Link Checker is good as well.
Minification - This is the process of removing unnecessary white space and line breaks from files. It reduces the file size without more aggressive compression. I use Minify, which can handle both CSS and JS files. I should add that I run my own Apache webserver, the "Server in the Cellar", and that has gzip compression (mod_deflate) turned on.
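To see why minification still helps even with gzip turned on, here is a toy sketch in Python. The crude_minify function is a stand-in for what a real minifier like Minify does far more carefully, and the sample CSS is made up:

```python
import gzip
import re

CSS = """
body {
    margin: 0;
    font-family: sans-serif;
}
/* navigation */
nav a {
    color: #336699;
    text-decoration: none;
}
"""

def crude_minify(css):
    """Strip comments and collapse whitespace -- a toy version of
    what a real minifier does."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)    # drop comments
    css = re.sub(r"\s+", " ", css)                     # collapse whitespace
    return re.sub(r"\s*([{}:;,])\s*", r"\1", css).strip()

minified = crude_minify(CSS)
raw_gz = len(gzip.compress(CSS.encode()))
min_gz = len(gzip.compress(minified.encode()))
print(len(CSS), len(minified))   # bytes before and after minifying
print(raw_gz, min_gz)            # gzipped sizes for comparison
```

Gzip shrinks both versions, but it cannot recover bytes that were never sent; starting from the smaller minified file still wins.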
Mixed Content (HTTPS vs HTTP) - There are sites that will check individual pages for mixed HTTP and HTTPS content, and others that will attempt to crawl an entire site. Two of the best I have found are JitBit SSL Check and Missing Padlock - SSL Checker. HTTPS Checker has a free downloadable checker. It works very well, but the free edition is limited to 500 pages; that is fine for my sites.
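The single-page checkers essentially parse the HTML and list any resources requested over plain http://. A minimal sketch using Python's standard-library HTML parser, run against a hypothetical page:

```python
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    """Collect src/href URLs that would be fetched over plain HTTP."""
    # (tag, attribute) pairs that trigger a fetch; link href covers stylesheets.
    WATCHED = {("img", "src"), ("script", "src"), ("link", "href"),
               ("iframe", "src"), ("audio", "src"), ("video", "src")}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if (tag, name) in self.WATCHED and value and value.startswith("http://"):
                self.insecure.append(value)

# A made-up page with two insecure resources.
PAGE = """<html><head>
<link rel="stylesheet" href="https://example.com/site.css">
<script src="http://example.com/old-widget.js"></script>
</head><body>
<img src="http://example.com/photo.jpg" alt="">
</body></html>"""

finder = MixedContentFinder()
finder.feed(PAGE)
print(finder.insecure)  # the two http:// resources
```

A browser would block or warn about both of those on an HTTPS page, which is exactly what the padlock checkers are hunting for.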
Mobile Friendly - Some tests are designed to check whether pages are mobile friendly. Google has two tools for doing this, Google PageSpeed and the Mobile-Friendly Test.
Page Loading Times - Some tests measure everything about a page and give results based on an analysis of the whole thing. The tests I use are: Catchpoint Web Page Test, GTMetrix, Solarwinds Pingdom, Whats My Ip's HTTP Compression Test, Yellow Lab Tools, and of course Google PageSpeed. As mentioned above, my own Apache webserver, the "Server in the Cellar", has gzip compression turned on, and the CSS and JS files are minified.
When I ran these tests, the results showed one major thing I should change: Apache's caching. The other things slowing the page loading time down were not actually part of the page, but rather the scripts added by Google Analytics and Microsoft Clarity.
Before optimizing Apache's cache the following scores were obtained: CatchPoint: "Not Bad"; GTMetrix: A; PageSpeed: Mobile 78, Desktop 100; SolarWinds: C (77); Yellow Lab: B (72).
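The caching fix amounts to telling browsers to keep static files for a while instead of re-requesting them on every visit. A minimal sketch of what that looks like in Apache, assuming mod_expires is enabled; the lifetimes here are illustrative, not the exact values I use:

```apacheconf
# Requires mod_expires (on Debian/Ubuntu: a2enmod expires).
<IfModule mod_expires.c>
    ExpiresActive On
    # Static assets rarely change, so let browsers cache them.
    ExpiresByType text/css         "access plus 1 month"
    ExpiresByType text/javascript  "access plus 1 month"
    ExpiresByType image/jpeg       "access plus 1 month"
    ExpiresByType image/png        "access plus 1 month"
    # Everything else gets a short lifetime.
    ExpiresDefault                 "access plus 1 day"
</IfModule>
```

The trade-off is that a cached file won't refresh until it expires, which is one reason some people add version numbers to their CSS and JS filenames.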
Public Availability - There are two types of tests to show whether a site or individual pages are publicly available. For a simple site test there are places like Down for Everyone or Just Me, Freshping, and Is It Down Right Now. To see what an individual page looks like, there are services such as GeoPeeker, LocaBrowser, and WebPageTest. These last three are good for checking pages that should be private but for some reason are public.
Search Engines - In June 2021, according to StatCounter, Google was used for 92.47% of worldwide searches while Bing was used for just 2.31%; Statista gave Google an 87.76% share and Bing 5.56%. Google Search Console offers useful insights, and I use its reports to look for pages that are giving 404 page not found errors. Although Google is by far the most used search engine, there is also Bing Webmaster Tools.
Server - As I run my own Apache webserver, the "Server in the Cellar", I'm interested in tools that can help optimize and protect it. One of the better ones I've found is Detectify; the company once offered free scans for a single site, and although the offer was withdrawn years ago they still scan mine. Ionos Website Checker, Mozilla Observatory, and Pentest-Tools Website Vulnerability Scanner are also useful.
Server Status - I have enabled Apache's mod_status module on my site, which produces a server-status web page. Other sites also have this page enabled; to find them, search the internet for unique, static text from the page, such as "total entries stored since starting:".
Site Information - By default, a web server gives away a lot of information about itself. Sites like BuiltWith, Netcraft, and W3Techs can display much of this information for any site.
Sitemap Validators - A malformed sitemap.xml file means that search engines and other utilities cannot index the site properly. A simple typo can invalidate every line after it, sometimes affecting thousands of pages. When using these tools it may be necessary to look at the entries before the line reported as the error to find the malformed entry. Three utilities I have found useful are My Sitemap Generator, Website Planet, and XML Sitemaps.
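A quick local sanity check before reaching for the online validators is to run the sitemap through an XML parser; Python's standard library reports the line of the first well-formedness error. In this made-up example the unclosed &lt;url&gt; on line 4 is only reported at line 5, which is exactly why it pays to look at the entries before the reported line:

```python
import xml.etree.ElementTree as ET

# A hypothetical sitemap with a typo: the second <url> is never closed.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about.html</loc>
</urlset>"""

def check_sitemap(text):
    """Return the list of <loc> URLs, or the parse error message."""
    try:
        root = ET.fromstring(text)
    except ET.ParseError as err:
        return f"malformed: {err}"   # err includes line and column
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in root.iterfind(".//sm:loc", ns)]

print(check_sitemap(SITEMAP))  # parse error pointing at the </urlset> line
```

This only catches well-formedness problems; the online validators also check the sitemap schema itself (required tags, URL limits, and so on).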
SSL Certificates - I use Let's Encrypt to supply the SSL certificates, and there is a range of Automated Certificate Management Environment (ACME) clients to manage them. Qualys SSL Labs can check the certificates on a website, as can the Ionos SSL Certificate Checker.
This page created February 7, 2021; last modified December 8, 2022