iwato Posted September 24, 2018 (edited)

BACKGROUND: I am using AJAX to make a call to my Matomo database. This call runs in PHP and returns a data set far too large for any single use; taken piecemeal, however, it can satisfy many different needs. Since I would like to give users the ability to build their own dynamic graphs from this very large data set, I need to plan an efficient strategy. From my still very naïve understanding of web applications, I envision two possible scenarios, only one of which is likely correct -- probably the second.

The call chain is:

1. AJAX calls PHP.
2. PHP calls the Matomo reporting API.
3. Matomo responds and fills the PHP output with data.

Scenario One: AJAX makes a new call for the generation of each dynamic graph.

Scenario Two: AJAX is called only once, and the same data set is used over and over again.

QUESTION: If Scenario Two is the preferred method, as I strongly suspect, what is the best way of dealing with the data set? Do I write its contents to a separate file and then retrieve whatever portion is required each time a new and different graph is desired? Or do I simply leave the data where it is and perform all subsequent work on it from within the AJAX success function? Please elaborate your response.

Roddy

Edited September 24, 2018 by iwato
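A minimal sketch of Scenario Two in client-side JavaScript, assuming a hypothetical PHP endpoint `matomo-report.php` that returns the full Matomo data set as a JSON array of rows (the endpoint name, field names, and metric keys below are illustrative assumptions, not part of the original post):

```javascript
// Cached copy of the full report: fetched once, reused for every graph.
let cachedReport = null;

// Fetch the report a single time; subsequent calls reuse the cached copy,
// so no further round trips to PHP (and hence Matomo) are made.
function getReport() {
  if (cachedReport !== null) {
    return Promise.resolve(cachedReport);
  }
  return fetch('matomo-report.php') // hypothetical endpoint
    .then((response) => response.json())
    .then((data) => {
      cachedReport = data;
      return cachedReport;
    });
}

// Pure helper: pull one metric out of the cached report so each new graph
// is built from the same data set held in memory.
function extractSeries(report, metric) {
  return report.map((row) => ({ label: row.label, value: row[metric] }));
}
```

With this pattern, each graph request calls `getReport().then((report) => extractSeries(report, 'nb_visits'))` (or another metric), and only the first call actually touches the server; whether this is preferable to re-querying PHP depends on how large the full data set is and how stale it may become.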