Badchip Posted November 7, 2021

I want to prevent file_get_contents() from being executed multiple times at once. For example, if 100 visitors request the same PHP script simultaneously, I want file_get_contents() to run only once. Thank you in advance.
niche Posted November 7, 2021 (edited)

Set up a session for each user, so your server will hit that user's session file instead of the original file: https://www.w3schools.com/php/php_sessions.asp

EDIT: Using database tables instead of a file may eliminate your concern.
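For example, a minimal session sketch (the counter name here is just illustrative):

```php
<?php
session_start();

// Each visitor gets their own session file on the server,
// so this counter is tracked per user, not globally.
if (!isset($_SESSION['visits'])) {
    $_SESSION['visits'] = 0;
}
$_SESSION['visits']++;

echo "You have loaded this page " . $_SESSION['visits'] . " time(s).";
```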
Ingolme Posted November 7, 2021

I'm not sure what you mean. If file_get_contents() is not executed, then the user will not see anything. If you can explain why you don't want to call file_get_contents() more than once, we might be able to find a proper solution.
niche Posted November 7, 2021

Taking the question as written, using sessions would mitigate the challenge. That way, every user won't be hitting the same file all the time. Ingolme, what do you think, given only the available info?
Ingolme Posted November 7, 2021

Sessions would only prevent the same user from opening the file 100 times. They wouldn't stop 100 different users from opening the file, because each session is tied to an individual. Databases are slower than filesystem access, so it wouldn't be wise to use a database to keep track of how many times a file has been opened; it would actually be faster to use a file itself. Again, if only 1 in 100 users can access the file, then only that person will actually get to see its contents and everybody else will see nothing.
niche Posted November 7, 2021 (edited)

I don't see a way to avoid the initial touch by all 100, except to space out those initial touches, which would happen automatically with sessions. Then the problem becomes reintegration, which makes it similar to a version-control challenge, IMO.
Ingolme Posted November 7, 2021

There are ways to solve the problem, but without more information about the primary objective, I cannot recommend any specific approach.
niche Posted November 7, 2021 (edited)

Is it correct to think that sessions would spread out the initial hit?
Ingolme Posted November 7, 2021

No, not likely. I would assume a majority, if not all, of the simultaneous requests are coming from different people.
niche Posted November 7, 2021

So sessions aren't a way to spread out requests from different people?
Badchip Posted November 7, 2021 (edited)

Suppose that 100 visitors run the same PHP script at the same time (it contains a file_get_contents() call that fetches the code of another page). I'd like to allow only one outgoing request at a time. The fetched code will then be saved via file_put_contents() to a .txt file so that other users can read it from there. I want to prevent the other website from registering 100 hits in one second.
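Roughly, I imagine something like this (a sketch only; the file names are placeholders, and a local source.txt stands in for the remote page), except without the race where several requests pass the staleness check at the same moment:

```php
<?php
// Demo setup: a local file stands in for the remote page.
if (!file_exists('source.txt')) {
    file_put_contents('source.txt', "<p>remote page</p>\n");
}

$cacheFile = 'cache.txt';
$maxAge = 60; // refresh the cache at most once per minute

if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $maxAge) {
    // Only this request should hit the remote site; in my case this
    // would be file_get_contents() on the other website's URL.
    $code = file_get_contents('source.txt');
    if ($code !== false) {
        file_put_contents($cacheFile, $code);
    }
}

// Everyone else just reads the local copy.
echo file_get_contents($cacheFile);
```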
Ingolme Posted November 8, 2021

You could use the lock on an empty file to determine whether a person has permission. The flock() operation is atomic, so there won't be any race conditions. It isn't perfect, but here's a possible implementation:

define('TEMP_LOCK_FILE', 'templock.txt'); // Which file represents the lock
define('TEMP_LOCK_TIME', 1); // How long to hold the lock, in seconds

// Create the lock file if it does not exist
if (!file_exists(TEMP_LOCK_FILE)) {
    file_put_contents(TEMP_LOCK_FILE, '');
}

// Try to acquire an exclusive, non-blocking lock on the file
$fp = fopen(TEMP_LOCK_FILE, 'w');
$fail = !flock($fp, LOCK_EX | LOCK_NB, $would_block);
if ($fail || $would_block) {
    // Somebody else holds the lock, or we failed to acquire one for some other reason
    exit;
}

// We have successfully locked the file, which means we have permission to get the file contents
$data = file_get_contents('otherfile.txt');
echo $data;

// Make the other 99 users wait for a short amount of time, then release the lock.
sleep(TEMP_LOCK_TIME);
fclose($fp); // Closing the file releases the lock

The minor problem with the above script is that it makes the user who actually had permission to view the file wait approximately one second after the read operation has finished, but it guarantees that other people won't run the code within the time limit.
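A variation on the same idea writes the fetched contents into a cache file, so the other 99 users read the cached copy instead of seeing nothing (again just a sketch; the file names are placeholders, and a local source.txt stands in for the remote page):

```php
<?php
// Demo setup: a local file stands in for the remote page.
if (!file_exists('source.txt')) {
    file_put_contents('source.txt', "<p>remote page</p>\n");
}

define('CACHE_FILE', 'cached.txt');
define('CACHE_TTL', 1); // seconds the cached copy stays fresh

$fresh = file_exists(CACHE_FILE) && (time() - filemtime(CACHE_FILE)) <= CACHE_TTL;

if (!$fresh) {
    $fp = fopen(CACHE_FILE, 'c'); // 'c' opens for writing without truncating
    if ($fp && flock($fp, LOCK_EX | LOCK_NB)) {
        // We hold the lock, so we are the one request that refreshes the cache.
        $data = file_get_contents('source.txt');
        if ($data !== false) {
            ftruncate($fp, 0);
            fwrite($fp, $data);
            fflush($fp);
        }
        flock($fp, LOCK_UN);
    }
    if ($fp) {
        fclose($fp);
    }
}

// Everybody, lock holder or not, reads the cached copy.
readfile(CACHE_FILE);
```

Readers who care about catching a half-written file could additionally take a shared lock (LOCK_SH) before reading.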