How to prevent multiple requests for file_get_contents?


Badchip

Recommended Posts

Set up a SESSION for each user, so your server will hit each user's session file instead of the original server file.

https://www.w3schools.com/php/php_sessions.asp
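
For illustration, here's a rough sketch of that idea (the URL is a placeholder): cache the fetched page in each user's session, so repeat requests from the same visitor are served from the session data instead of touching the original file:

session_start(); // gives each visitor their own session file

if (!isset($_SESSION['page_copy'])) {
  // First request in this session: read the original source once
  $_SESSION['page_copy'] = file_get_contents('https://example.com/page.html');
}

// Later requests in the same session are served from the session data
echo $_SESSION['page_copy'];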

EDIT:

Using database tables instead of a file may eliminate your concern.

Edited by niche

I'm not sure what you mean. If file_get_contents() is not executed, then the user will not see anything.

If you can explain why you don't want to call file_get_contents() more than once we might be able to find a proper solution.


Taking the question as written, using a SESSION would mitigate the challenge. That way, every user won't be hitting the same file all the time. Ingolme, what do you think, given only the available info?


Sessions would only prevent the same user from opening the file 100 times. It wouldn't stop 100 different users from opening the file because each session is tied to an individual.

Databases are slower than filesystem access, so it wouldn't be wise to use a database to keep track of how many times a file has been opened; it would actually be faster to use a file itself.

Again, if only 1 in 100 users can access the file then only that person will actually get to see the contents of the file and everybody else would see nothing.
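
To illustrate the limitation, a per-session guard like the sketch below (the names are made up) only blocks repeat requests from a single visitor; every new visitor starts with an empty session and reads the file anyway:

session_start();

// Hypothetical per-user guard: only stops this visitor's repeat requests
if (isset($_SESSION['has_read_file'])) {
  exit;
}
$_SESSION['has_read_file'] = true;

// A different visitor has a different session, so all 100 users still reach this line
echo file_get_contents('otherfile.txt');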


I don't see a way to avoid the initial touch by all 100 users, except to space out those initial touches, which would happen automatically with sessions. Then the problem becomes re-integration, which would make it similar to a version control challenge, IMO.

Edited by niche

Is it correct to think that sessions would spread out the initial hit?

Edited by niche

Suppose that 100 visitors run the same PHP script at the same time (which contains the file_get_contents() call to get the code of another page).
I'd like to allow only 1 request at a time. The fetched code will be saved via file_put_contents() (to a .txt file) so that other users can access it.
I want to prevent the other website from registering 100 hits in a second.
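
In other words, something like the sketch below (the file and URL names are placeholders). The catch is that, written this naively, several overlapping requests can all pass the freshness check before the cache is refreshed, which is exactly the race the next reply addresses:

define('CACHE_FILE', 'cache.txt'); // shared copy for the other users (placeholder name)
define('CACHE_TTL', 1); // contact the other site at most once per second

// Refresh the cache only when it is missing or stale
if (!file_exists(CACHE_FILE) || time() - filemtime(CACHE_FILE) >= CACHE_TTL) {
  $code = file_get_contents('https://example.com/page.html'); // placeholder URL
  file_put_contents(CACHE_FILE, $code);
}

// Everyone reads the cached copy instead of hitting the other website
echo file_get_contents(CACHE_FILE);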

Edited by Badchip

You could use the lock on an empty file to determine whether a person has permission. The flock() operation is atomic, so there won't be any race conditions.

It isn't perfect, but here's a possible implementation:

define('TEMP_LOCK_FILE', 'templock.txt'); // Which file represents the lock
define('TEMP_LOCK_TIME', 1); // How long to hold the lock, in seconds

// Create the lock file if it does not exist
if(!file_exists(TEMP_LOCK_FILE)) {
  file_put_contents(TEMP_LOCK_FILE, '');
}

// Check the file to see if it's locked
$fp = fopen(TEMP_LOCK_FILE, 'w');
$fail = !flock($fp, LOCK_EX|LOCK_NB, $is_locked);
if ($fail || $is_locked) {
  // The file was locked by somebody else or we failed to acquire a lock for some other reason
  exit;
}

// We have successfully locked the file which means we have permission to get the file contents
$data = file_get_contents('otherfile.txt');
echo $data;

// Make the other 99 users wait for a short amount of time, then we can release the lock.
sleep(TEMP_LOCK_TIME);
fclose($fp); // Closing the file releases the lock

The minor problem with the above script is that it makes the user who actually had permission to view the file wait approximately one second after the read operation has finished, but it guarantees that other people won't run the code within the time limit.
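
If the goal is the shared .txt cache described earlier in the thread, the locked section could also write that copy. A possible variation of the read step above, with a placeholder URL and cache file name:

// Replacing the read step above: the user who wins the lock refreshes
// the shared cache, and everyone who was turned away can read cache.txt instead
$data = file_get_contents('https://example.com/page.html'); // placeholder URL
file_put_contents('cache.txt', $data); // shared copy for the other users
echo $data;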

