Better Way To Implement Large Top Menu?


thesoundsmith

Recommended Posts

I have an 80+ page site that uses a drop-down top menu. With all the content there are a lot of options on the menu, and every time the user goes to a new page it all has to reload. I am using a PHP include to get it there, but it still has to load all the text on each page.

I have side menus on most pages as well, so technically I could reduce the amount by pruning the menu structure, but I like the one-click-to-anywhere option of a full top menu. I'm wondering if there is a way to convert the menu into, for instance, a JavaScript app that remains resident while the user is on the site, but does not compromise his security and is still relatively low-maintenance when adding or deleting menu items. Or a way to 'freeze' the PHP include text stream - perhaps a floating menu bar that remains on top but calls other pages?

You can see the site at The Soundsmith


You could use frames. You could have one frame for your top menu, another for your sidebars, and another for the main content. http://w3schools.com/html/html_frames.asp
Maybe, but I would avoid frames if possible.

I would suggest (to at least make the large top menu easier to maintain) looking into PHP's include(). With this approach, your menu could be contained within one HTML file, but included in every page with one line of code, and to change the menu on every page you just have to update that one included file.

About keeping the menu constant while loading other pages, AJAX might be able to help you out with that. If not that, maybe some fancy CSS work based on positioning. I wish I could be more detailed, but I know someone on here will be able to fill in the gaps, or provide a better solution.
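Just as a rough sketch of the AJAX idea: intercept menu clicks, fetch the target page, and swap only the main content so the menu never reloads. The `id="content"` marker and `#menu` selector here are hypothetical — you'd adjust them to the site's real markup.

```javascript
// Pure helper: pull the content region out of a fetched page's HTML.
// Assumes (hypothetically) each page wraps its content in
// <div id="content">...</div>.
function extractContent(html) {
  const marker = '<div id="content">';
  const start = html.indexOf(marker);
  if (start === -1) return null; // page doesn't use the expected wrapper
  const end = html.indexOf('</div>', start);
  return html.slice(start + marker.length, end);
}

// Browser-only wiring (commented out so the helper stands alone):
// document.querySelectorAll('#menu a').forEach(a => {
//   a.addEventListener('click', async (e) => {
//     e.preventDefault();
//     const html = await (await fetch(a.href)).text();
//     document.getElementById('content').innerHTML = extractContent(html);
//   });
// });
```

The trade-off is that the browser's back button and bookmarks stop matching pages unless you also manage the URL, which is extra work.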

Soundsmith is using PHP includes already, it sounds like. But I can't imagine they're slowing down the page load very much. You can fit a lot of menu into 5K (for example) and a 5K image loads pretty fast.

If images are slowing the thing down, that should only happen once. Then they would be cached, and subsequent loads would be instantaneous.

The first thing to do is this: load a page, save the source as a file (this is just the text), and see how big it is. If it's 100K, you have a problem. More likely it's something manageable. I'm working on a module right now that has a bunch of JavaScript. The file is 21K and 500 lines long. Even over DSL it loads pretty fast.

Maybe the server processes PHP slowly. Do plain HTML files download faster than PHP files?

Then again, if your menu system contains a lot of page elements (I mean A LOT), then a slower computer will take time to build the DOM. I have a page that generates about 1500 input elements (it's basically an AJAX spreadsheet). At work, no problem. At home on my clunker, maybe 25 seconds, most of which is DOM building.

It is possible to write a JavaScript file that would get cached (and therefore downloaded just once) and have it write the menu text for you. But I just hate that solution, and normally recommend it only for people who don't have server-side scripting ability. You end up with your HTML hidden in a gazillion document.write() statements, so the code is a PITA to maintain. And it just plain offends my sense of The Way Things Should Be Done. (I get a little Zen about this stuff sometimes.) Anyway, I'd check out other possible sources of this problem before resorting to that.

How significant is the problem, anyway? 5 seconds? 30?
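For the record, the cached-JavaScript approach I just disparaged looks something like this. Every page includes one `<script src="menu.js"></script>` where the menu should appear, the browser caches the file after the first download, and the script writes the menu out. The page names here are made-up placeholders, not the real site's pages.

```javascript
// menu.js — downloaded once, cached, included on every page.
// Keeping the items in one array at least avoids the wall of
// document.write() calls; edit the array to add or remove pages.
const MENU_ITEMS = [
  { href: 'index.html',  label: 'Home' },
  { href: 'music.html',  label: 'Music' },
  { href: 'studio.html', label: 'Studio' },
];

// Build the menu markup as a single string.
function buildMenu(items) {
  const links = items
    .map(it => `<li><a href="${it.href}">${it.label}</a></li>`)
    .join('');
  return `<ul id="topmenu">${links}</ul>`;
}

// In the browser you would emit it where the script tag sits:
// document.write(buildMenu(MENU_ITEMS));
```

Maintenance is then one array edit per menu change, which is the one genuine upside of this approach.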


When I load your site, it doesn't take long - a few seconds, sure, maybe 5, but I can wait that long.


Pfft! Follow the link. Why didn't I think of that?

Yeah, the download is pretty fast. The generated HTML is 15.6K, which is not bad at all. You can trim that to 14K if you put the embedded CSS in a separate file, like the rest of your CSS.

Now, the first time downloading bg1.jpg took FOREVER. That sucker is 60K. I compressed it to 27K and could not tell a difference. This will be even more true when all that text and stuff goes in front of it. I cropped it to the edges of the colored part, and got it down to 23K. Worth a try. Especially since I just followed a menu link and noticed that every page has a new giant image. The images are slowing you down. Compress them a little more.

Of course, once they're cached, it's no big deal. Is caching enabled on your test browser?

Seriously, except for the first time downloading the big images, I am not seeing a problem here. Definitely not in your menus.


Run your page through this site's analyzer: http://www.websiteoptimization.com/services/analyze/

I don't see anything on that page that would concern me a whole lot. The object count is 13, which is at the edge of acceptable. Image loads are within tolerable limits, although that one background image could be made smaller, maybe. As I say, I would think this page is acceptable by today's standards. Yahoo also has some services that might assist, but they are more technical, and this issue would not be solved by using their tools on this page.


Thanks for the replies, everyone. I am running a pretty slow machine for my web work (the fast box goes for audio and video editing). I guess I'm just not happy with looking at the page source and having to scroll 80% of the way down to get past the menu to the first content element. But that only matters in debug mode...

Deirdre's Dad, I think you're right about the header backgrounds. I've already compressed the header images from over 100K; I'll just have to go through them again. And I have the cache down as low as I can for the web testing.

jlhaslip, thanks for the link. I forgot about these folks.


Archived

This topic is now archived and is closed to further replies.
