iwato

AJAX - A Question of Strategy


BACKGROUND:  I am using AJAX to make a call to my Matomo database.  This call runs in PHP and returns a data set far too large for any single use.  That said, it can satisfy a lot of different needs separately.  As I would like to give users the ability to build their own dynamic graphs from this very large data set, I must plan an efficient strategy.  From my still very naïve understanding of web applications, I envision only two possible scenarios, only one of which is likely correct -- probably the second.

  • AJAX calls PHP.
  • PHP calls the Matomo reporting API
  • Matomo reports and fills the PHP file with data
     
  • Scenario One:  AJAX makes a new call for the generation of each dynamic graph.
  • Scenario Two:  AJAX is called only once, and the same data set is used over and over again.
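For Scenario Two, the client side might look something like this sketch (in JavaScript; `fetchAll` and `makeDataSource` are names I am assuming for illustration, not actual Matomo or charting APIs):

```javascript
// Scenario Two sketched: request the large data set once, then build every
// graph from the cached copy. `fetchAll` stands in for the single AJAX call
// to the PHP script; it is an assumed name, not a real API.
function makeDataSource(fetchAll) {
  let cache = null;              // the full data set, fetched at most once
  return async function getData() {
    if (cache === null) {
      cache = await fetchAll();  // the one and only round trip to PHP
    }
    return cache;                // every later graph reuses this copy
  };
}
```

Each graph would then call `getData()` and pick out the slice it needs, without triggering another request.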

QUESTION:  If Scenario Two is the preferred method, as I strongly suspect, what is the best way of dealing with the data set?  Do I write its contents to a separate file and then retrieve the portion required each time a new and different graph is desired?  Or do I simply leave the data where it is and perform all subsequent work on the PHP file's output from within the AJAX success function?

Please elaborate in your response.

Roddy



How often does the data change -- every time it is requested, hourly, daily? If one of the latter, why not download it, store it, and then retrieve it when needed from your own server? Store it with a date, and update it whenever that date exceeds the allowed modification time.
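The date check described above could be as small as this sketch (all names are illustrative, and the allowed lifetime is whatever suits your data):

```javascript
// Reuse the stored copy until its saved date is older than the allowed
// lifetime, then re-download from Matomo. Names here are my own, chosen
// for illustration.
function needsUpdate(savedAtMs, nowMs, maxAgeMs) {
  return nowMs - savedAtMs > maxAgeMs;
}
```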


Do you mean something like setting up a CRON job that keeps the data current, but only runs once every six hours or so?

Roddy


Yes! You can take only the data you want, store it in a database table or an XML file, and then access that directly instead of going to Matomo.

42 minutes ago, dsonesuk said:

Yes! You can take only the data you want, store it in a database table or an XML file, and then access that directly instead of going to Matomo.

I've created this kind of implementation with a Currency API that has very stringent limits. I managed to set it up so that even if someone was constantly using the page that needed the API data, it would never go over -- by reusing the old data with a data-age check (storing the time at which the data was saved).

The AJAX call goes into a 'controller', which checks the database to see how old the data is. If it's fresh, I get the data from the database and return it. If it's old, I get the data from the API, update the database, and return the information. It works quite well.

It also makes subsequent calls to the 'controller' lightning fast.



DSONESUK:  The Matomo reporting API will return the data in a variety of formats depending on the value of the format parameter that I use when calling it.  I could easily have it returned in XML format, but I am wondering why you suggest this format and not another.  I have just gotten used to JSON and quite prefer it.  Also, why do you suggest that the data be entered into a database?  Is this to protect it from public exposure?

FUNCE:  Could you direct me to the page where you have implemented your model?  I am not sure that I understand it, and I would be loath to reinvent the wheel if it were not necessary.

Roddy


7 hours ago, iwato said:

Also, why do you suggest that the data be entered into a database?  Is this to protect it from public exposure?

FUNCE:  Could you direct me to the page where you have implemented your model?  I am not sure that I understand it, and I would be loath to reinvent the wheel if it were not necessary.

The most information you'll be able to gather from my page is that it updates every 5 minutes, is slow to load the first time, and works nearly instantly thereafter until the next 5-minute rotation. Here's the link: https://www.championfreight.co.nz/currency (it has a note on when it was last updated).

The slow loads are updating the database from the API before displaying the data. The fast page loads come straight from the database without updating, as the data hasn't aged enough for me to need a refresh from the API.


A very high-level overview: think of it as RAM vs. disk. Disk is slow and updates the RAM. RAM is fast and can respond to multiple commands quickly because it accesses its own data very quickly, but it needs to go back to the disk if it gets a request it isn't expecting and lacks the data required.
In this scenario, Matomo would be the disk, holding the authoritative values at any time. The RAM would be a database you create and host on your own server. It's super fast, because it is hosted in the same place, but it periodically needs to get up-to-date values from Matomo.


I store a timestamp with the values I retrieve, so I check the timestamp, and if it's too old, I get new values.

In your case:

  • AJAX asks PHP for the data
  • PHP gets the data from the database and checks when it was received
  • If it's recent enough, PHP returns the data back to AJAX
  • If it's not, PHP asks Matomo for the data
  • Matomo reports and returns the data to PHP
  • PHP updates the database with the data, plus a timestamp for checking later
  • PHP then returns the data back to AJAX
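The steps above can be sketched as follows (in JavaScript for illustration only -- the real controller would live in PHP; `db` and `matomo` are hypothetical stand-ins for your database layer and the Matomo reporting call):

```javascript
// Sketch of the controller flow: serve from the local cache while it is
// fresh, otherwise refetch from Matomo and re-cache with a new timestamp.
// All object and method names here are assumptions for illustration.
async function getReport(db, matomo, maxAgeMs, now) {
  const row = await db.load();                // { data, savedAt } or null
  if (row && now - row.savedAt <= maxAgeMs) {
    return row.data;                          // fresh enough: fast path
  }
  const data = await matomo.fetchReport();    // stale or missing: slow path
  await db.save({ data, savedAt: now });      // cache with a timestamp
  return data;
}
```

The only state the controller needs is the cached data plus the timestamp saved alongside it.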

If you don't like the idea of a slow load and your hosting provider allows scheduled tasks, just run the page on the scheduler. The page load from the scheduler will pull all the data required, and none of your users will experience any slowdown.


11 hours ago, iwato said:

I have just gotten used to JSON and quite prefer it.

JSON is used to transfer data, so you are continuously requesting the data from Matomo every time it is required, even though it will likely be the same each time. Instead, you can use JSON just once within a time limit to retrieve the data and store it; you can then filter what is wanted from what is unwanted.

Database or XML: they are both forms of data storage, and which to use depends on the amount of data -- for large amounts, use a database; for small amounts, use XML. A database is much quicker and makes it easier to read large amounts of stored data, but you could also split larger data into smaller XML files, each relating to a specific type of data for a specific page.




I would like to thank you both for the wonderful suggestions, clarifications, and implementation strategies.  This topic must now wait, however, while I attend to other matters that are equally, if not more, pressing.  Long turnaround times between questions asked and questions answered have taught me to appreciate multitasking.  In this particular instance, I discovered that some of my already implemented code was not robust: it worked in one instance but not in another.  I finally replaced it, but in the meantime several answers to questions for other tasks came in, and I put this task on hold, weary of my misfortune.

Roddy

