Automated Requests (i.e. Downstream Server)


Jesdisciple


I guess I should first ask whether this has been done yet: is there a (type of) server application for measuring the download time of web pages? If not, how do sites such as http://w3tableless.com/ (if it were up...) go about acting as clients to their users' servers?

Until I'm done with my current project this is just out of curiosity, but I think I might write something like that next.


Is there a server application (type) for measuring the download speed of web pages?
Pages don't have a download speed, only a total file size. A single request has a download speed because it has a response size and a total time. The web server has a certain amount of bandwidth available, which changes over time, and each client has a certain amount of bandwidth available, which also changes over time.

You would only be able to measure the speed at which a given server sends out a page if you have a client with more bandwidth than the server; otherwise you're measuring the time the client takes to download it rather than the time the server takes to serve it.

I'm not sure what your question is about that other site, because it's offline.
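In other words, the throughput you observe for a request is capped by whichever side has less bandwidth. A minimal sketch of that point (in Python; the figures are purely illustrative):

```python
def measured_throughput(server_kbps, client_kbps):
    """The observed download speed is limited by the slower side of the link."""
    return min(server_kbps, client_kbps)

# A server that can push 100 kB/s to a 5 kB/s dial-up client:
# you end up measuring the client's bandwidth, not the server's.
print(measured_throughput(100, 5))  # 5
```

So unless the measuring client is faster than the server, the number you get says more about the client than about the server.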

Regarding 'speed' vs. 'time', believe it or not I edited that mistake right before reading my email notification of your reply.

The other site parses pages (entire sites, I think) and determines whether they conform to the W3C's recommendation (or whatever status it has) for "tableless" design, i.e. design that doesn't use tables for layout (as opposed to using them for displaying data).

I was thinking of doing something similar for 'dial-up friendliness' (an issue close to my processor), in particular reasonableness regarding images. But now that I've read your reply, I guess it would just be a measure of file size divided by bandwidth.
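That "file size divided by bandwidth" estimate can be sketched like this (Python; the 5 kB/s dial-up figure is an assumption in line with the realistic rate quoted later in the thread):

```python
def estimated_load_time(total_bytes, bandwidth_bytes_per_s):
    """Rough load-time estimate: total page weight / available bandwidth."""
    return total_bytes / bandwidth_bytes_per_s

# A 50 kB page over a realistic 5 kB/s dial-up link:
print(estimated_load_time(50 * 1024, 5 * 1024))  # 10.0 (seconds)
```

That is, the measurement becomes a static property of the page rather than something you have to time over the wire.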


Well, if you want to be dial-up friendly, remember that dial-up is around 56 kbps (optimally 7 kB/s, more realistically around 5 kB/s most of the time) and that pages really should load in 1-2 seconds, so each page (as rendered, not counting server-side code), together with its images, should be under 10 kB.

So, you could use the filesize() function in PHP (apparently it supports some URL wrappers; you will have to experiment) to get the size of a page being tested, then regex out the URLs of all the images (don't forget to check its CSS as well), filesize() them too, add them all up, and see whether the total is under 10 kB.

Edit: OK, filesize() doesn't seem to work over HTTP. :) So if you want to go ahead, you will have to write a script that transfers all the relevant files to your server and measures their size there.
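A hypothetical sketch of that approach (in Python for illustration, though the post above discusses PHP): regex the image and stylesheet URLs out of the page, fetch each one, and add up the byte counts. The fetcher is passed in so the totalling logic can be shown without real HTTP; a live version would download each URL, as the edit above suggests. All names here are made up for the example.

```python
import re

# Crude regexes for <img src="..."> and <link ... href="....css">;
# a real checker would want a proper HTML parser.
IMG_RE = re.compile(r'<img[^>]+src=["\']([^"\']+)["\']', re.IGNORECASE)
CSS_RE = re.compile(r'<link[^>]+href=["\']([^"\']+\.css)["\']', re.IGNORECASE)

def page_weight(html, fetch):
    """Total bytes for the page plus every image and CSS file it references."""
    total = len(html.encode('utf-8'))
    for url in IMG_RE.findall(html) + CSS_RE.findall(html):
        total += len(fetch(url))  # fetch(url) returns the resource's bytes
    return total

def dialup_friendly(html, fetch, limit=10 * 1024):
    """The 10 kB rule of thumb from the post above."""
    return page_weight(html, fetch) <= limit

# Usage with a stand-in fetcher instead of real HTTP:
resources = {"logo.png": b"x" * 2048, "style.css": b"y" * 512}
html = '<html><link rel="stylesheet" href="style.css"><img src="logo.png"></html>'
print(dialup_friendly(html, lambda url: resources[url]))  # True
```

Injecting the fetcher also makes it easy to swap in urllib or a caching layer later without touching the size-totalling logic.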


Oh yes - and applets and CSS files :) The list keeps growing.


Archived

This topic is now archived and is closed to further replies.
