LittleJoe Posted April 16, 2013 (edited)
I'm working on a website that I want to be accessible to speakers of many languages, and I'm wondering how to architect it. I seem to have two options, and I hope you can help me choose:

1) Database
I could make a table that lists all the messages, and then have individual language tables with translations of those messages. Each page would then make a dozen or more queries to get all the messages/content it needs. I'm a bit worried about the effect this may have on my database and whether it's too much for it to handle on top of all the other queries (registration, login, etc.).

2) PHP array
Using PHP arrays is also a popular solution. I think both MediaWiki (which drives Wikipedia, among others) and phpMyAdmin go with this approach. It's easier to crowd-source the translation this way, since most people are familiar with plain text files. Here I'm worried about memory consumption: every translation gets loaded into memory on every page request, and at some point that becomes unviable.

Edited April 16, 2013 by LittleJoe
Ingolme Posted April 16, 2013
Not every translation is necessarily loaded into memory, though I doubt there would be trouble even if they were. What you can do is have separate language files with arrays in them and only include the one that you need. Each file creates the same array with the same identifier.
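A minimal sketch of what Ingolme describes: one file per language, each defining the same `$messages` array, with only the visitor's language included. The file names, array key, and whitelist here are illustrative, not from any particular framework; to keep the sketch runnable it writes the language files to a temp directory, whereas on a real site they would live in the codebase (e.g. `lang/en.php`, `lang/es.php`).

```php
<?php
// Sketch: one array file per language; include only the language you need.
// Setup for runnability -- in practice these files ship with the site.
$dir = sys_get_temp_dir() . '/lang_demo';
@mkdir($dir);
file_put_contents("$dir/en.php", '<?php $messages = ["greeting" => "Hello"];');
file_put_contents("$dir/es.php", '<?php $messages = ["greeting" => "Hola"];');

function load_messages(string $dir, string $lang): array
{
    $allowed = ['en', 'es'];              // whitelist: never build paths from raw input
    if (!in_array($lang, $allowed, true)) {
        $lang = 'en';                     // fall back to the default language
    }
    require "$dir/$lang.php";             // defines $messages in this scope
    return $messages;
}

echo load_messages($dir, 'es')['greeting'], "\n"; // Hola
```

The whitelist check matters if the language code comes from user input (a cookie or query string), since building an include path from raw input is a classic file-inclusion hole.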
LittleJoe Posted April 16, 2013 (Author)
Not every translation in every language is loaded, but every translation in the relevant language is. Isn't that a bit much?
Ingolme Posted April 16, 2013
I don't believe it's excessive, but if you're worried, you can subdivide the language files into one per page and then load the file based on language and website section.
justsomeguy Posted April 16, 2013
I load an entire language (several thousand strings) into memory for my application, and its memory limit is set to between 128 and 512MB depending on the script. You can use the memory_get_usage() and memory_get_peak_usage() functions if you want to run benchmarks. http://www.php.net/manual/en/function.memory-get-usage.php
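A quick benchmark along the lines justsomeguy suggests, using the two functions he links. The 5,000-string figure and the string contents are just an example to give the measurement something to chew on:

```php
<?php
// Measure how much memory a large translation array actually costs.
$before = memory_get_usage();

$messages = [];
for ($i = 0; $i < 5000; $i++) {
    $messages["msg_$i"] = "Translated string number $i, padded out for realism";
}

$after = memory_get_usage();

printf(
    "Array cost: %.2f MB, peak usage: %.2f MB\n",
    ($after - $before) / 1048576,
    memory_get_peak_usage() / 1048576
);
```

On a modern PHP this kind of array typically lands well under a megabyte per few thousand short strings, which puts the "is it too much?" question in perspective against a 128MB limit.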
LittleJoe Posted April 16, 2013 (Author)
So we're all in agreement that using a database for this is a no-no?
justsomeguy Posted April 16, 2013
We use a database to store the values. But instead of one database query per string, when someone chooses a language we fetch the entire language from the database, load it into an array, and then read individual strings from the array.
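The pattern justsomeguy describes can be sketched like this: one query per request fetches the whole language, and the strings live in an array afterwards. The table and column names are illustrative, and an in-memory SQLite database stands in for the real one so the sketch is self-contained:

```php
<?php
// Sketch: store translations in a database, but load the whole language
// with a single query instead of querying per string.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE translations (lang TEXT, msg_key TEXT, msg_text TEXT)');
$db->exec("INSERT INTO translations VALUES
           ('en', 'greeting', 'Hello'),
           ('es', 'greeting', 'Hola')");

// One query per request, not one per string:
$stmt = $db->prepare('SELECT msg_key, msg_text FROM translations WHERE lang = ?');
$stmt->execute(['es']);
$messages = $stmt->fetchAll(PDO::FETCH_KEY_PAIR); // key => text map

echo $messages['greeting'], "\n"; // Hola
```

This gets the best of both options in the thread: translators edit rows in a database, but each page request pays for a single query (which could also be cached to a file or APC so most requests pay nothing).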
LittleJoe Posted April 16, 2013 (Author)
I guess that makes more sense. Thank you, guys!