
Populating a PHP/MySQL Table with Daily, Dynamic Statistics from a Website?


paulmo


Scenario: statistics are posted daily in a form field or table on a third-party website. I have permission to use those statistics, and I want to retrieve them automatically every day at midnight (via PHP or JS) into a MySQL table, then present them as a dynamic graph that changes every day, and calculate and display weekly percentage increases or decreases from the statistics. Is there RSS/XML potential? I've only populated MySQL tables with form field data from my own website, so I need some direction here. Thanks in advance for suggestions.


You can use the file_get_contents() function or an HTTP library like cURL to retrieve the content from the third-party website. If they publish their statistics in RSS or another XML-based format, then of course you can use an XML library like DOMDocument to parse the results. If the information is only available on their HTML site, you can try using the XML library on the HTML anyway; if that doesn't work, your best bet would probably be regular expressions. Once you have the data you can just insert it into your database as normal. To make the script run every day you can use cron (Unix) or Scheduled Tasks (Windows).
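As a rough sketch of that fetch-and-store step (the URL, database credentials, table, and column names below are placeholders, not the real site):

<?php
// Fetch the third-party page. If the site offers RSS/XML instead,
// DOMDocument could parse that directly.
$html = file_get_contents('http://example.com/statistics.html');
if ($html === false) {
    die('Could not retrieve the page');
}

// ... extract the statistic from $html here ...
$value = 0; // placeholder for the parsed number

// Store today's value in MySQL.
$db = new PDO('mysql:host=localhost;dbname=stats', 'user', 'password');
$stmt = $db->prepare('INSERT INTO daily_stats (stat_date, stat_value) VALUES (CURDATE(), ?)');
$stmt->execute(array($value));

A crontab entry along the lines of 0 0 * * * php /path/to/fetch_stats.php would then run it every midnight.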


Thanks Synook. I can do cron through my host. Since I'll be using PHP/MySQL, what advantages, if any, would cURL or XML have over file_get_contents()? Plain PHP seems to keep it all in one script, whereas cURL/XML points to a PHP script anyway, like using JSON. What do you think? Thanks.


If your server allows you to use file_get_contents() over HTTP, and all you need to do is read the contents of the file, then that should be fine. If you need to do anything more advanced, like a POST request or following a header redirect, or your server doesn't allow fopen() to open URLs, then you need to use cURL.
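For reference, the cURL equivalent of that fetch might look something like this (the URL is a placeholder):

<?php
$ch = curl_init('http://example.com/statistics.html');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow header redirects
$html = curl_exec($ch);
if ($html === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);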


Thank you justsomeguy, good to see you still around here and moderating. file_get_contents() is working for echoing a home page, etc., but I need to parse the data into JSON columns/rows to render in Google Chart. The site I'm looking at has many tables; I need the last number, -1,226, from this one:

<td class="MRdetail" width="24%"><span class="hasHelp" id="styles64">Phase 2</span></td>		<td class="MRdetail" align="center" width="24%">-2,000</td> 		<td class="MRdetail" align="center" width="24%">1,200</td>		<td class="MRdetail" align="right" width="24%">-1,226</td>

These are all updated daily. The only thing that uniquely identifies this on the page is "styles64" inside a table: "Phase 2" appears in other tables, and "styles64" appears elsewhere, but not in other tables. How do I capture that value? Edit: I just found a linked, downloadable .csv file on the same site. Here's a section. The only problem is, again, that "Phase 2" is mentioned on different occasions, and I need the first instance of "Phase 2" (example below):

"D","NB",17"D","Phase 2",-1226"D","Highgate",-218


You can use fgetcsv() to convert the CSV data to an array and then loop through it. The first time you find what you're looking for, you can just break out of the loop if you don't want to keep going.
http://www.php.net/manual/en/function.fgetcsv.php
You can also use str_getcsv(), but it only operates on one line at a time. You would need to split the file data into lines first, then loop through them and convert each one to an array to check.
http://www.php.net/manual/en/function.str-getcsv.php
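A minimal sketch of that fgetcsv() loop, stopping at the first row whose second field is "Phase 2" (the file name is a placeholder):

<?php
$row = null;
$handle = fopen('statistics.csv', 'r');
if ($handle !== false) {
    while (($data = fgetcsv($handle)) !== false) {
        if (isset($data[1]) && $data[1] === 'Phase 2') {
            $row = $data; // e.g. array("D", "Phase 2", "-1226")
            break;        // first match only, so quit the loop
        }
    }
    fclose($handle);
}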


Thanks, that renders the .csv data. There are hundreds of lines, and this is the one I need:

5 fields in line 56: D, Phase 2, -2000, 1200, -1225

So in this last part of the fgetcsv() script, how do I echo the output of the .csv to a .json file?

for ($c = 0; $c < $num; $c++) {
    echo $data[$c] . "<br />\n"; // print each field on its own line
}

And then, of course, how do I isolate just that section of the .csv/.json data (Phase 2 above) to go in my table/charts? Thanks so much.
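One way that last step might look, assuming $row holds the matched array from the loop above (the key names and output file are placeholders; shape the structure to whatever your Google Chart code expects):

<?php
$record = array(
    'label'  => $row[1],              // "Phase 2"
    'values' => array_slice($row, 2), // the numeric columns
    'date'   => date('Y-m-d'),
);
file_put_contents('stats.json', json_encode($record));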
