Everyone has their own idea of what makes a great web host: loads of features, powerful servers, great support, low prices, or some combination of these and other factors that matter to the individual.
Some specialist web hosting review websites also place heavy weight on their own web hosting benchmarks, though, especially speed and host uptime (the proportion of time a website is available online). That seems like the logical thing to do, but does it actually make sense?
There’s nothing wrong with running these tests (indeed, we’ve done them ourselves), and the results can feel like valuable information. No one wants a slow or unreliable website, and anything which highlights a good host, or warns you about a poor one, has to be welcome.
However, you must be careful how you interpret these figures. Although the high-precision results make them look objective, they’re often based on a number of very subjective judgements, and that can significantly affect their reliability. In this article we’ll look at some of the issues you need to think about.
[ul]
[li]Best web hosting services of 2021: Top host providers for your website[/li][/ul]
[HEADING=1]Understanding the tests[/HEADING]
Measuring web host speed and uptime is complicated, with many factors involved. Which servers are checked? Which sites? How often? Is the speed figure a server response time, the “time to first byte” (the delay between requesting site content and receiving the first byte of the response), the time to load a sample site, or something else?
It’s like browsing a chart comparing the top 20 electric cars. The rankings might change radically depending on the test driving style, environment, traffic conditions, weather, temperature and more, and until you understand the details, there’s no way to tell how relevant they might be to you.
When a hosting review presents you with an uptime or speed figure, don’t take it at face value. Read the review in full and look for any explanation of how it’s calculated. If you don’t see anything, look for a “How we test” site-level article with more details.
Just reading that explanation can tell you a lot. Is it clear exactly what’s being checked and how the tests are run? Do you feel there’s enough information that you could carry out the same tests yourself?
If it’s all a little (or extremely) vague, or there’s no explanation at all, that alone makes the figures almost meaningless, as you’ve no idea what’s being checked. Fortunately, most review sites do have a decent explanation of what they’re doing, but that can also raise many more issues.
[IMG alt="Bitcatcha Sample Speed Test Report"]https://cdn.mos.cms.futurecdn.net/oD...x4FtFio857.jpg
(Image credit: Bitcatcha)
[HEADING=1]Speed tests[/HEADING]
The first question to consider with any web host speed test is what, exactly, is being tested?
We typically benchmark a shared hosting plan, and almost all web hosting review sites do the same. That’s a reasonable starting point, but it doesn’t indicate what sort of performance you’ll see from a host’s managed WordPress, VPS or dedicated hosting plans.
Even if you’re shopping for shared hosting, there are complications. For example, many hosts have multiple levels of shared hosting, where each plan gets a different level of system resources (CPU, RAM and so on). The test designer must decide which level of shared hosting to include in the benchmarks.
The simplest option is to pick the cheapest shared hosting product in any range, but the trouble is, that penalizes providers who also offer basic consumer hosting. Provider X may have some of the best high-end hosting products around, for instance, but because it also has a very basic $1-a-month starter plan (which should be a plus), it’s likely to drop down the speed test rankings.
A fairer approach is to choose equivalent hosting packages from each provider, so the test compares similar products. But does “equivalent” mean a similar price, or features, or some mix of the two? There’s a lot of subjective judgement involved in figuring that out. And even if the tester somehow comes up with the perfect choice of comparable products, that might all change the very next day if the host updates its feature list or prices.
This doesn’t mean the results have no value at all. If host X tops the current list for baseline shared hosting speeds, and host Y trails behind in a distant last place, then that’s useful information. And it’s certainly better than having no information at all. Just keep in mind that it might not accurately represent the speeds you’ll see with your preferred product.
[IMG alt="UptimeRobot.com Speed Chart"]https://cdn.mos.cms.futurecdn.net/wA...kf5DqEujZT.jpg
(Image credit: UptimeRobot.com)
[HEADING=1]What is “speed” anyway?[/HEADING]
The next benchmarking issue to consider is how the test measures speed. There are two common methods.
The first checks server response time, or how long the server takes to respond to a request. That’s a simple statistic and easy to compare, but it’s mostly about network speed, and doesn’t cover a long list of relevant factors. If your server is short on CPU power or RAM, or has slow storage devices, for instance, that’s not going to be properly reflected in a response time, as the test isn’t loading a full website page.
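To make the first method concrete, here’s a minimal Python sketch of a time-to-first-byte check (a hypothetical helper for illustration, not any review site’s actual tooling):

```python
import http.client
import time
from urllib.parse import urlparse

def time_to_first_byte(url, timeout=10):
    """Rough TTFB: seconds from issuing a GET request (including DNS
    lookup and TCP connect, which http.client performs lazily here)
    until the first bytes of the response arrive."""
    parts = urlparse(url)
    Conn = (http.client.HTTPSConnection if parts.scheme == "https"
            else http.client.HTTPConnection)
    conn = Conn(parts.netloc, timeout=timeout)
    try:
        start = time.perf_counter()
        conn.request("GET", parts.path or "/")
        conn.getresponse()  # blocks until the status line (first bytes) arrives
        return time.perf_counter() - start
    finally:
        conn.close()
```

Note how this one number folds together DNS, connection and server think-time, which is exactly why a bare TTFB says little about how a full page would load.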
The second option measures the load time for a test site, perhaps a simple WordPress template. That’s an improvement, as it takes account of more performance factors, but it’s still most unlikely to reflect your situation. Your own site is probably very different to the test template, maybe runs on a different CMS, with your own custom plugins, and none of that will be reflected in the test results.
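The second method can be sketched just as briefly. This hypothetical helper drains the whole response instead of stopping at the first byte, so slow disks or an overloaded CPU start to show up in the timing:

```python
import time
import urllib.request

def full_load_time(url, timeout=10):
    """Seconds to fetch one URL completely: connect, first byte, and
    the entire response body. Still only a rough proxy: it ignores the
    images, CSS, JavaScript and rendering a real visitor waits for."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        body = response.read()  # drain the whole body, not just the headers
    return time.perf_counter() - start, len(body)
```

A real template-based test goes further still, loading the page in a browser so all those extra assets are timed too.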
Template-based speed tests are almost always based on a web host’s default setup, too. Does the host automatically enable Cloudflare or some other CDN, say? Is it using the latest and fastest version of PHP, and the most speed-optimized PHP settings?
Doing it this way has some value for first-timers who’ll accept the default settings and never change anything, but it’s not much use for anyone else. If you might ever integrate a CDN yourself, change your PHP version, or make a single speed tweak anywhere in your hosting control panel, that could be enough to radically change your host’s speed-test ranking.
[IMG alt="Picture of the Earth with a web of links over the surface"]https://cdn.mos.cms.futurecdn.net/Fr...xwEAKQhdpk.jpg
(Image credit: Shutterstock / NicoElNino)
[HEADING=1]Location, location, location…[/HEADING]
Another complication with any speed test is figuring out the locations involved. Where in the world is the test site, and where are the servers running the tests? It can make a huge difference.
Many web hosts have several data centers, for instance, and it’s most unlikely that they’ll offer the same performance. If the test site is in Los Angeles, but you’ll choose New Jersey, or London, or Brisbane, or somewhere else, that’s likely to have a significant effect on the results. (Most of the review sites we checked didn’t even mention the issue, so you’re probably not going to find out.)
The best speed tests are typically carried out by an automated service which runs simultaneous performance checks from multiple locations. For example, Bitcatcha runs tests from the US (east and west coasts), London, Singapore, Sao Paulo, Mumbai, Sydney, Japan, Canada and Germany, and gives you a rating based on the overall results.
The advantage of this approach is that it lets you compare the worldwide performance of web hosts based anywhere. The problem is that if your site doesn’t have a worldwide audience (most visitors are from your home country, say), then the figures that matter are those for your visitors’ locations only, and they could give you a very different rating.
As we’ve said above, this doesn’t mean the figures have no value. Any test results are welcome. But don’t take any single speed rating as a cast-iron guaranteed measure of a host’s overall performance; it could take some thought to figure out what the data actually means for you.
[IMG alt="HRank.com Reports Website Uptimes"]https://cdn.mos.cms.futurecdn.net/KD...NggYQxF3qJ.jpg
(Image credit: HRank.com)
[HEADING=1]What is “uptime” anyway?[/HEADING]
At first glance, measuring website uptime seems relatively easy. It’s just the amount of time your site is up and running, expressed as a percentage. If a web host gives you 99.9% uptime annually, for instance, that translates to 8 hours 46 minutes of downtime over the year. Simple.
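That conversion is simple arithmetic, and it’s worth being able to do it yourself whenever a host quotes a percentage. A standalone sketch:

```python
def annual_downtime_minutes(uptime_percent):
    """Convert an annual uptime percentage into minutes of downtime."""
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a (non-leap) year
    return (100.0 - uptime_percent) / 100.0 * minutes_per_year

# 99.9% uptime leaves about 8 hours 46 minutes of downtime a year
hours, mins = divmod(annual_downtime_minutes(99.9), 60)
print(f"{int(hours)}h {mins:.0f}m")  # → 8h 46m
```

Each extra "nine" cuts that figure by a factor of ten: 99.99% works out to roughly 53 minutes a year.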
Except, well, it really isn’t. Web hosts often define “uptime” as meaning your server is accessible, not your site. For example, HostGator’s Uptime Guarantee page says: “Just because your website does not work, this does not mean your server has downtime. As long as the server is available to deliver your content, then the guarantee is met.”
Many review and testing sites also focus largely on server availability. For instance, UptimeRobot says it detects downtime by sending HTTP requests to a website, and looking for HTTP status error responses, or no response at all.
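A probe of the kind UptimeRobot describes can be sketched in a few lines of Python (a simplified, hypothetical version, not UptimeRobot’s actual implementation):

```python
import urllib.request
import urllib.error

def is_up(url, timeout=10):
    """Count the site as 'up' if the server answers with a non-error
    HTTP status. A 4xx/5xx status, or no response at all, counts as
    down. Note this says nothing about whether the page works."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status < 400
    except (urllib.error.URLError, OSError):
        return False
```

A broken page that still returns HTTP 200 passes this check, which is precisely how “server up” and “site working” come apart.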
The problem is there are many situations where the server might be up and running, but a website is close to unusable. Just think of all the times you’ve seen this. You visit a site but see strange error messages, maybe some features don’t work at all, or speed is so poor you give up and go somewhere else.
Issues like this are arguably the worst you can get. If your site is inaccessible, people might wonder if it’s some ISP or network issue. If they can reach your site, but it doesn’t work, they’re far more likely to blame you. And yet, if your server is available and can return a page (even if it says “sorry, we’ve got problems, come back later”), it’s possible that none of this will be reflected in the uptime figures.
[IMG alt="Uptime.com Uptime Report"]https://cdn.mos.cms.futurecdn.net/oE...tVpkdsbPMe.jpg
(Image credit: Uptime.com)
[HEADING=1]More uptime complications[/HEADING]
There are plenty of other potential uptime testing complications. As with the speed tests, for instance, it’s important to understand which servers are included in the benchmarks. If they’re covering a specific product only (the cheapest shared hosting), they won’t necessarily tell you anything at all about the rest of the range.
Uptime checks can sometimes falsely report a site is down, too. This happens often enough that Uptime.com has an FAQ page on the topic, where it lists quite a few potential causes: “Most likely candidates include local issues, firewalls, blacklists, timeouts, and load balancer issues.”
These problems may not be common, but they’re still something to consider. The difference between uptimes of 99.99% and 99.98% is only around 53 minutes a year, for instance; if the test checks a site every minute, that represents only 53 misleading fails out of 525,600 tests over a year, or roughly one error for every 10,000 attempts.
None of these issues mean you should ignore uptime and speed results entirely. Even if they’re covering basic shared hosting and you’re after a VPS, say, it’s interesting to see if a provider is racing ahead of the competition, or lagging far behind.
But don’t assume the figures give you a complete and accurate picture, either. They’ll give you a general idea of how the web host performs in some areas, but those may not be the areas most relevant to you, and they certainly don’t give you the full performance story.
[HEADING=1]What TechRadar advises[/HEADING]
Always take uptime and speed tests with a pinch of salt. Because they are often the only objective-looking numerical tests that web hosting review sites can perform, they tend to be put in the limelight and placed on a very high pedestal. In our opinion, though, they should only be considered secondary, minor parameters when choosing a web hosting company, and that is reflected in our review process.
[ul]
[li]Check out the best free web hosting[/li][/ul]