According to the web site optimization page, this blog is too big and loads too slowly:
Connection Rate    Download Time
14.4K              70.21 seconds
28.8K              36.81 seconds
33.6K              32.03 seconds
56K                20.58 seconds
ISDN 128K           8.66 seconds
T1 1.44Mbps         3.86 seconds
For readers on a good cable/DSL connection, this shouldn’t be an issue; but the thought that it might take 20 seconds to load this page on dialup is sobering. I currently have the blog set to show six past days of content. I could set it to fewer in order to speed things up. Should I?
OK, the web site optimization page is sort of suspect, as it seems designed to get you to buy a book. Also, this week isn’t quite a typical one, as I have a 21K graphic element bulking up the total to 86K. But still, if it’s true…twenty seconds (or even sixteen) is forever….
Update: Based on suggestions in the comments, I cut a couple of days of posts off the tail, removed four small graphic images (although I like the “XML” boxes myself), and tried it again. Didn’t seem to make much difference:
Connection Rate    Download Time
14.4K              61.58 seconds
28.8K              32.19 seconds
33.6K              27.99 seconds
56K                17.92 seconds
ISDN 128K           7.43 seconds
T1 1.44Mbps         3.20 seconds
*Note that these download times are based on the full connection
rate for ISDN and T1 connections. Modem connections (56Kbps or less)
are corrected by a packet loss factor of 0.7. All download times
include delays due to round-trip latency with an average of 0.2 seconds
per object. With 14 total objects for this page, that computes to a
total lag time due to latency of 2.8 seconds.
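Out of curiosity, the footnote's model is simple enough to check in a few lines of Python. The 86K page weight is this week's figure from above; the assumption that only modem-speed connections get the 0.7 loss factor is mine, taken from the footnote's wording, so this only approximates the report's numbers:

```python
# Sketch of the report's stated model: modem rates are scaled by a
# packet-loss factor of 0.7, and each of the 14 objects adds an
# average 0.2 s of round-trip latency.

PAGE_KB = 86            # this week's total page weight, per the post
OBJECTS = 14            # object count, per the report's footnote
LATENCY_PER_OBJECT = 0.2
LOSS_FACTOR = 0.7       # applied to 56Kbps-and-slower connections only

def estimated_seconds(rate_kbps, is_modem):
    """Transfer time at the (possibly degraded) line rate, plus latency."""
    effective_bps = rate_kbps * 1000 * (LOSS_FACTOR if is_modem else 1.0)
    transfer = PAGE_KB * 1024 * 8 / effective_bps
    return transfer + OBJECTS * LATENCY_PER_OBJECT

print(f"56K modem: {estimated_seconds(56, True):.1f} s")
print(f"T1:        {estimated_seconds(1440, False):.1f} s")
```

The 56K figure comes out within a few tenths of a second of the report's 20.58; the T1 figure comes out lower than the report's 3.86, which suggests the report applies some extra overhead to fast connections that the footnote doesn't spell out.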
But I do think there’s something suspicious here. Not only does my host have gzip loaded, but if you do the back-of-the-envelope arithmetic, some 70K of content should not take anywhere near twenty seconds to download over a 56K connection, even with moderate packet loss.
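Here is the envelope math spelled out. The 4:1 compression ratio is a guess that is typical for HTML, not a measured figure for this page:

```python
# Back-of-envelope: what gzip should do to the transfer time for 70K
# of content on a 56K modem, keeping the report's 0.7 loss factor.

PAGE_KB = 70
MODEM_BPS = 56_000 * 0.7       # 56K line, degraded by the 0.7 factor

uncompressed = PAGE_KB * 1024 * 8 / MODEM_BPS
compressed = uncompressed / 4  # assumed 4:1 gzip ratio for HTML text

print(f"uncompressed: {uncompressed:.1f} s")
print(f"gzipped:      {compressed:.1f} s")
```

Even uncompressed that is under fifteen seconds of raw transfer; gzipped it should be a few seconds, which is why the report's eighteen-to-twenty-second figures look high.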
The report does not take into account that you have gzip enabled. The one thing that I would change, if you are interested in load times, is the feed images. I personally do not like them aesthetically, and I think a simple text link is just as good and would be faster. I have the same thoughts on the Creative Commons image, but I grant that the CC image is a little more useful than the XML images…
You are quite right. With a fast connection, your page size is OK.
The problem is, I only sometimes have a fast connection.
Other people are in the same boat, or have no fast connection at all.
During a download, I do something else. Many of your potential
readers will not bother to complete the download of your page at all.
While putting `dead time’ in front of people is not so bad as putting
dead bodies in front of them, for a visit that should be safe, neither
act is encouraging.
Please emulate the BBC and provide a `low graphics’ or `no graphics’
page as well as a `high graphics’ page.
Also, please provide only the last two days of content, and provide a
listing with links to the rest. That way, you can become more
I took a quick look at your page and noticed your problem.
You don’t have a baseref= and have about 175 hrefs to URLs starting with https://www.discourse.net. Adding the baseref and killing the extra text (about 5K) could save you a couple of seconds on a 56K line, uncompressed.
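A quick sanity check on that 5K figure, using the 175 count and the URL prefix from the comment above (the assumption that each href saves exactly the prefix length is mine):

```python
# Rough estimate of the bytes a base href would save: 175 absolute
# hrefs, each carrying a site prefix that a base href makes redundant.

PREFIX = "https://www.discourse.net"
HREF_COUNT = 175

saved_bytes = HREF_COUNT * len(PREFIX)
print(f"~{saved_bytes / 1024:.1f} KB saved")
```

That works out to a bit over 4 KB, in the ballpark of the ~5K the comment cites.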
Of course, all of this is noise compared to the speed hit you take if you have a PHP element that has to regenerate on every hit.
On a related note, I was getting terrible slowdown when using your site about two weeks ago. Now it’s loading very quickly and running like a champ.
I suspect that the difference is that I fixed up the stylesheets.
On the baseref issue, I think it would take some serious tweaks to the MT templating system to get it to stop giving the whole URL. I don’t think I have the guts, as it would mess up the upgrade path. And version 3.0 is just around the corner…