Server-side pagination


rink.attendant.6

I understand that client-side pagination is more appropriate for small data sets and server-side for large ones. With the DataTables plugin it seems a bit easier to implement server-side pagination. All this seems fine; however, I've googled many resources and nowhere does anyone define what a "large" data set is (in number of records/number of fields). So I'm wondering: approximately how large would a data set have to be to choose server-side pagination over client-side pagination?


A data set is "large" when, more often than not, the entries to be downloaded are so numerous that downloading them takes over 7 seconds, which is the point at which most users will simply quit waiting for the page to load. Depending on the size of each entry and the internet speed of your average user, the threshold from a small to a large data set can fall anywhere between 5 and 5000 entries.

Of course, you don't have to go right up to that edge. For many users, even 1 full second is slow enough that they'd start looking for competing sites (they'd still forgive you if your data is worth it, of course).
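That threshold is easy to estimate for a given audience. A quick sketch of the back-of-envelope arithmetic (the function name and the sample figures are mine, purely illustrative):

```python
def max_entries(budget_s, entry_kib, mbps):
    """Rough count of entries downloadable within a time budget.

    budget_s:  seconds the user will tolerate (e.g. the 7 s above)
    entry_kib: average payload size per entry, in KiB
    mbps:      connection speed in megabits per second
    """
    bytes_per_s = mbps * 1_000_000 / 8   # Mbps -> bytes per second
    return int(budget_s * bytes_per_s / (entry_kib * 1024))

# A 7 s budget with 8 KiB entries on an 8 Mbps DSL line:
print(max_entries(7, 8, 8))   # 854
# The same entries with a 1 s budget:
print(max_entries(1, 8, 8))   # 122
```

Anything past that count is a candidate for server-side pagination on that connection.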


Hmm... well, the site is hosted on a dedicated server with no domain. I'm not sure about the size of each entry (each entry with the index is approx. 8 kilobytes in the database, if that helps). As for the audience of the site, there would be a maximum of 10 concurrent users (that's the most extreme situation and would most likely never happen; 1-3 users is normal). Most of them have high-speed broadband connections, though a few are in remote communities with lite-speed DSL (not sure what speed that is). Everyone should have access to computers with Windows 7 and IE 9 by the time the database is deployed (again, not sure if that matters).

The expected number of records per year (for the particular table I'm concerned with) would be 1200-1500. Would it be a good idea to retrieve a whole year's worth of data (by JSON or whatever) and have DataTables paginate that, or would that be too much? I've consulted with the server administrator and he couldn't come up with an answer... so that's why I'm asking :D


It's not about how much data there is for one entry/row in your database; it's about how much data you are pulling from the database and showing on the page, in other words the overall page size the PHP is building. People usually tend to avoid scrolling huge pages, even if the user is on a high-speed connection and your server is able to handle the request.

Depending on your data and how your application is used, you can decide at which logical point to break the presentation into different pages. That may not apply if it's some kind of report generation, though.

The expected amount of records per year (for the particular table I'm concerned with) would be 1200-1500. Would it be a good idea to retrieve a whole year's worth of data (by JSON or whatever) and have DataTables paginate that or would that be too much?
If you paginate those 1200-1500 entries, that does not mean the client gets all the rows at once. You will get a subset of the data dynamically, depending on the current page, so it won't have any overhead.

Hmm... well the site is hosted on a dedicated server with no domain. I'm not sure about the size of each entry (each entry with the index is approx 8 kilobytes in the database, if that helps).
What matters is how much data flows between client and server, so the index is irrelevant here. Count only the data you send to the client (i.e. the data in all fields of a record, plus all the bytes required to format it, which are quotes, commas and brackets in the case of JSON). Let's say that in the end, you still have 8 KiB per entry on average...
The expected amount of records per year (for the particular table I'm concerned with) would be 1200-1500. Would it be a good idea to retrieve a whole year's worth of data (by JSON or whatever) and have DataTables paginate that or would that be too much? I've consulted with the server administrator and he couldn't come up with an answer... so that's why I'm asking :D
... 8 KiB * 1500 = 12,000 KiB ≈ 11.72 MiB. Even on a high-speed DSL connection (8 Mbps = 1 MiB/s), downloading the full set would take about 12 seconds, which is unacceptable. If this is an intranet site (the web server is on the same local network as its clients; for most LAN cards, that's 100 Mbps = 12.5 MiB/s), downloading the full set will take, at worst, ~3 seconds (1 second or less on average), which is sort of acceptable.
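Spelling out that arithmetic (same 8 KiB-per-entry assumption as above):

```python
ENTRY_KIB = 8      # estimated payload per entry, from above
ENTRIES = 1500     # upper end of the expected yearly count

total_mib = ENTRY_KIB * ENTRIES / 1024
print(round(total_mib, 2))         # 11.72 (MiB for the full year)

# Download time = size / throughput:
print(round(total_mib / 1.0, 1))   # 11.7 s on 8 Mbps DSL (1 MiB/s)
print(round(total_mib / 12.5, 1))  # 0.9 s on 100 Mbps LAN (12.5 MiB/s)
```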
If you paginate those 1200-1500 entries, that does not mean the client gets all the rows at once. You will get a subset of the data dynamically, depending on the current page, so it won't have any overhead.
And that is what "server-side pagination" essentially boils down to.
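A minimal sketch of what such an endpoint does on each page request, using Python's built-in sqlite3 (the `records` table and its columns are hypothetical; `start`/`length` mirror the paging parameters DataTables' server-side mode sends with each request):

```python
import sqlite3

def fetch_page(conn, start, length):
    """Return one page of rows plus the total count: the shape of
    response a server-side pagination endpoint sends back."""
    total = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
    rows = conn.execute(
        "SELECT id, name FROM records ORDER BY id LIMIT ? OFFSET ?",
        (length, start),
    ).fetchall()
    return {"recordsTotal": total, "data": rows}

# Demo with an in-memory table holding a year's worth of rows:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO records (name) VALUES (?)",
                 [("row %d" % i,) for i in range(1500)])

page = fetch_page(conn, start=50, length=25)    # third page of 25
print(page["recordsTotal"], len(page["data"]))  # 1500 25
```

Only 25 rows ever cross the wire per request, no matter how many the table holds, which is why the 1200-1500 yearly rows stop mattering once the server does the paging.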
