
PHP Memory


Err


I'm working on a script that works for some users and fails for others. What my script does:

- A PHP script reads through log files and extracts certain lines from them.
- The script extracts IPs and dates through a series of strpos() and substr() calls.
- I execute the script through a webpage that targets a single user at a time.

The log files are sometimes several MB in size. My problem: when I try to run the script on a certain user, it gives me memory errors:

Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 725359539 bytes) in C:\wamp\www\mc\index.php on line 141

Considering that the log files are big and the massive amount of work the script has to do, I allowed it 1 GB of memory directly in the php.ini file. Now this memory error is telling me that it has exhausted all of that. So I set ini_set('memory_limit', '2048M'); in the script. When I run the script with that much memory, it doesn't generate any errors, but it simply removes the bottom half of my webpage. I always have error_reporting(E_ALL); and ini_set('display_errors', 1); set on all my pages. Line 141 contains:

$str = (!empty($dts)) ? min($dts) : $str;

That is basically me assigning the earliest date in the $dts array to a return string; otherwise it returns an error string. I can't figure out what's wrong. Could really use some help.
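Roughly, my current whole-file approach looks like this (the "login from " marker, the path, and the timestamp layout below are simplified stand-ins for what the real logs contain):

<?php
// Rough sketch of the whole-file approach; marker string, path, and
// timestamp format are illustrative placeholders, not the real code.
$log = file_get_contents('C:\\wamp\\logs\\user.log'); // entire file in memory

$str = 'No dates found'; // default/error string
$dts = [];
$offset = 0;
while (($pos = strpos($log, 'login from ', $offset)) !== false) {
    // Assume the IP follows the marker and ends at the next space.
    $start = $pos + strlen('login from ');
    $end   = strpos($log, ' ', $start);
    $ip    = substr($log, $start, $end - $start);

    // Assume a "YYYY-MM-DD HH:MM:SS" timestamp follows the IP.
    $dts[] = substr($log, $end + 1, 19);

    $offset = $end; // keep scanning after this match
}

$str = (!empty($dts)) ? min($dts) : $str; // this is line 141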


There's not much information here to go on. You may want to use fopen() and fgets() to read the file a line at a time instead of reading the entire thing, if that's what you're doing. The error says it is trying to allocate roughly 700 MB after it has already allocated 1 GB, so obviously that's a lot of memory. If the files are smaller than 1 GB, then you're probably keeping too many things in variables.
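Something along these lines (a rough sketch; the path and marker string are just placeholders):

<?php
// Sketch of line-at-a-time reading with fopen()/fgets(): only one line
// is held in memory at a time instead of the whole file.
$fh = fopen('C:\\wamp\\logs\\user.log', 'r');
if ($fh === false) {
    die('Could not open log');
}

$dts = [];
while (($line = fgets($fh)) !== false) {
    if (strpos($line, 'login from ') !== false) {
        // Extract from this single line instead of from the whole log.
        $dts[] = substr($line, 0, 19); // assumes the timestamp leads the line
    }
}
fclose($fh);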


I am reading the entire log, then searching it for certain strings and cutting them out until I have everything I want. How is reading it line by line going to help?


I see. The files are considerably smaller than 1.7 GB; they're in the 8-693 MB range. What I'm doing is looping through all the logs (there are 4 of them) and extracting the strings I need twice over, since I'm extracting two different sets of strings without stopping.


Update: I went with reading the files line by line. It takes considerably longer to process, but I don't get any memory errors or weird bugs. The script now takes on average 2 min 30 sec to complete, instead of the 30-50 sec the old way took. I don't suppose there is any way I can speed that up?
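I've been wondering whether reading in bigger chunks with fread() and splitting the lines myself would help; something like this sketch (the chunk size, path, and marker are guesses on my part):

<?php
// Sketch of chunked reading: pull ~1 MB per fread() call and split it
// into lines manually. Far fewer I/O calls than one fgets() per line,
// while memory stays bounded by the chunk size. Marker is illustrative.
$fh = fopen('C:\\wamp\\logs\\user.log', 'r');
$carry = ''; // partial line left over from the previous chunk

while (!feof($fh)) {
    $chunk = $carry . fread($fh, 1048576); // 1 MB per read
    $lines = explode("\n", $chunk);
    $carry = array_pop($lines);            // last piece may be incomplete

    foreach ($lines as $line) {
        if (strpos($line, 'login from ') !== false) {
            // ... same per-line extraction as before ...
        }
    }
}
// $carry now holds the final line if the file didn't end with "\n".
fclose($fh);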


Another update: I'm not sure why, but when running on a Linux system it reads the file A LOT faster. Instead of 2 min 30 sec per user, it runs one user every 20 seconds, which is great!


That's related to the speed of the storage system. If you're comparing your home computer against a rented server, the rented server has a much faster storage system: it would be using SCSI or even SSD storage, possibly in a RAID array, while your home computer may be using a bottom-rung 5400 RPM IDE drive.


Archived

This topic is now archived and is closed to further replies.
