
offline browser


gabrielebr


Hi, I would like to use an offline browser to download my web site completely, in order to verify which files are really needed (linked) and which ones have accumulated over time but are no longer used. First of all, I would like to know if an offline browser is the right tool. If it is, I would like to know which one you suggest (freeware).

Thanks and regards,
Gabriele


Hi Gabriele,

What you need to view files offline depends on your file types. You will need all the files on your site downloaded into the same folder structure on your local computer. If you are using plain HTML, all you need to do is download the files and open them in a browser window. If you are using PHP or ASP, that is another story. There isn't a special browser for viewing offline files; if you are running server-side code, you will need a local server. I will leave that for someone else to explain.

Hope this gives you some insight.
dink

Assuming your local copy is missing files from your live site, or the two are out of sync for whatever reason, you can use an FTP client like FileZilla (and maybe FireFTP) to compare the contents of folders and see the differences between them. If you are looking to test your sites locally, including the ability to run server-side code like PHP or manage a database using MySQL, then what dink suggested (commonly referred to as an AMP stack) would be required.


Thank you Thescientist and Dink, but perhaps I was not clear: the only purpose for which I propose to use an offline browser is to download my web site completely, in the hope that ONLY the used (linked) files are downloaded. Obviously I need to be sure that ALL the linked files are downloaded. I don't need to browse my site locally. Once I have the list of linked files, I want to clean the server by eliminating the old files.

Regards,
Gabriele

Ah, I getcha. Yeah, I'm not sure about that one, but I'm sure something like that exists. I know there are services that can measure your website's efficiency, SEO compatibility, transfer rates, etc., and oftentimes they check for broken links. Perhaps somewhere in such a report you could get a list of all the pages that were linked to, and compile your list of pages that way?


There are programs out there (search for "site spider") that scan your links and download them all into a folder of your choice, but keep in mind that some are dumb and can't handle "simple" stuff like images linked from CSS, dynamically inserted links and the like.

I needed to do something similar recently, and I ended up with the following solution:
1. Open up Fiddler2 (see my signature).
2. Browse all pages I want the content of.
3. Save my HTTP session data as a file (which contains all URLs, which is all you really need).
4. Give that to a custom PHP script that simply loops over the lines in the file, downloading every URL along the way "as is" (so that the site structure is preserved) into a certain folder; a sketch of such a script follows below.

I'm in the process of creating a Fiddler extension that would ease this process (merging steps 3 and 4 into the press of a button, done before step 2), but I'm not sure how much free time I'll have for that. Still... the steps above are good enough for a quick & dirty run.

[edit]Well... Trellian looks like a free site spider that operates on pretty much the same premise I described above, except it's a separate browser (with an embedded IE engine, I'd guess) rather than a separate traffic monitor (as Fiddler is).[/edit]
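For reference, here is a minimal sketch of what such a download script could look like in PHP. It is an illustration only, not the actual script described above: the file name urls.txt, the output folder mirror, and the overall structure are assumptions.

<?php
// Minimal sketch (assumed names throughout): read one URL per line from
// urls.txt and save each response under mirror/, preserving the URL path
// so that the site structure stays intact.
$listFile = 'urls.txt'; // e.g. the list of URLs exported from the Fiddler session
$outDir   = 'mirror';

foreach (file($listFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $url) {
    $parts = parse_url($url);
    if ($parts === false || empty($parts['host'])) {
        continue; // skip malformed lines
    }

    // Map http://example.com/css/style.css to mirror/example.com/css/style.css;
    // give directory URLs (ending in "/") the file name index.html.
    $path = isset($parts['path']) ? $parts['path'] : '/';
    if (substr($path, -1) === '/') {
        $path .= 'index.html';
    }
    $target = $outDir . '/' . $parts['host'] . $path;

    // Create intermediate folders as needed, then fetch the URL "as is".
    if (!is_dir(dirname($target))) {
        mkdir(dirname($target), 0777, true);
    }
    $body = file_get_contents($url); // requires allow_url_fopen to be enabled
    if ($body !== false) {
        file_put_contents($target, $body);
    }
}

A nice side effect for the original question: once the script has run, the folder listing under mirror/ doubles as the list of files that are actually linked, which can then be compared against what is on the server.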


Thank you! I'll give Trellian a try.

Regards,
Gabriele

