Webmasters: Improve Your Performance

I came across some interesting data from Google on the size, resource counts and other metrics of pages on the web. The numbers are worth noting here because they suggest that many webmasters are taking advantage of only some of the power of the web, and in the process are limiting themselves and the companies they represent (often their own).

The data comes from Googlebot, Google’s crawl and indexing pipeline. (Googlebot processes not only the main HTML of a page, but also all embedded resources such as images, scripts and stylesheets.)

  • The average web page takes up 320 KB on the wire.
  • Only two-thirds of the compressible material on a page is actually compressed.
  • In 80% of pages, 10 or more resources are loaded from a single host.
  • The most popular sites could eliminate more than 8 HTTP requests per page if they combined all scripts on the same host into one and all stylesheets on the same host into one (a quick way to check your own pages is sketched after this list).
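If you want a rough sense of where your own pages stand on these points, here is a minimal sketch (stdlib-only Python, not anything from Google or Monitis) that fetches a page, counts how many resources each host serves, and flags resources that come back without gzip compression. The URL at the bottom is a placeholder.

```python
# Minimal audit sketch: resources per host and gzip usage for one page.
# Placeholder URL; swap in your own page before running.
import urllib.request
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class ResourceCollector(HTMLParser):
    """Collects src/href URLs of scripts, stylesheets and images."""

    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.resources.append(attrs["href"])
        elif tag == "img" and attrs.get("src"):
            self.resources.append(attrs["src"])


def audit(page_url):
    html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")
    parser = ResourceCollector()
    parser.feed(html)

    hosts = Counter()
    uncompressed = []
    for ref in parser.resources:
        url = urljoin(page_url, ref)
        hosts[urlparse(url).netloc] += 1

        # Explicitly ask for gzip; a server that compresses the resource
        # answers with a Content-Encoding: gzip header.
        req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
        with urllib.request.urlopen(req) as resp:
            if "gzip" not in (resp.headers.get("Content-Encoding", "") or ""):
                uncompressed.append(url)

    print("Resources per host:")
    for host, count in hosts.most_common():
        print(f"  {host}: {count}")
    print(f"{len(uncompressed)} resources served without gzip compression")


if __name__ == "__main__":
    audit("https://example.com/")  # placeholder page
```

Checking the Content-Encoding header after explicitly asking for gzip is the simplest way to see whether a server compresses a given resource; a real audit would also batch the requests and cover pages beyond the home page.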

Also, this:

  • The mean number of hosts per page is 7 across all sites Googlebot looks at; the median is 5 and the max is an incredible 374.
  • Mean KB per page is just over 320 across all sites; the median is about 177.5 and the max is just over 517,026.
  • Mean KB per host is 45.69; the median is more than 13 and the max is 441,631.71.

Do you want your website to run faster? The first step is to make sure you’re monitoring your site. Consider trying Monitis’ free plan for website, server and traffic monitoring. We have an established track record of robust execution and reliable alert delivery, and we help many website owners reach high uptime and availability at no cost. Monitis is a professional, premium all-in-one monitoring service that integrates application performance monitoring with back-end infrastructure and cloud monitoring.
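To see what even the most basic check looks like, here is a minimal sketch of the kind of test a monitoring service automates around the clock: request a page, time the response, and flag it when it is slow or unreachable. The URL and the 2-second threshold are placeholder values, not Monitis settings.

```python
# Bare-bones availability and response-time check for a single URL.
import time
import urllib.request
from urllib.error import URLError


def check(url, slow_threshold=2.0):
    """Fetch the URL once and report status, timing and slowness."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            elapsed = time.monotonic() - start
            status = resp.status
    except URLError as exc:
        print(f"DOWN  {url}: {exc.reason}")
        return
    label = "SLOW" if elapsed > slow_threshold else "OK"
    print(f"{label}  {url}: HTTP {status} in {elapsed:.2f}s")


if __name__ == "__main__":
    check("https://example.com/")  # placeholder site
```

A script like this only tells you how the site looks from one machine at one moment; a monitoring service runs checks continuously from multiple locations and sends alerts when they fail.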