Write delay with sessions/cache

When trying to implement a locking mechanism to prevent duplicate processing of a longer-running script, I ran into something I wasn't aware of: PHP seems to write to sessions and cache only after a request is complete.

A simple test script should show this:

if (apc_fetch('locked'))
    die('This script is still running!');

apc_store('locked', true);
sleep(10); // simulate a long-running task
apc_delete('locked');

echo "Success!";


When I run this script at the same time in two tabs, I would expect that, as long as the script in tab A is loading, I get the die() message in tab B. That's not the case! Both scripts run, no matter whether the first script saved a lock.

The same happens if I use $_SESSION to store a locking flag.

The only explanation I have: the cached value is actually stored only after the script ends.

Does anyone know how to circumvent this?

My last idea would be to create a lock file, as file operations (hopefully) don't get delayed until the script ends.

Wow, things get even worse: it's not even possible with a lock file. I've tried this now:




$tmpfile = sys_get_temp_dir() . '/lock_test'; // path chosen for this example

if (file_exists($tmpfile)) {
    die('This script is still running in another process!');
} else {
    touch($tmpfile);
    sleep(10); // simulate a long-running task
    unlink($tmpfile);
    echo "Success!";
}


The strange thing is: run this script in two tabs. Both requests will take 10 seconds, but the output is different:

Tab A, after 10 seconds: Success!

Tab B, after 10 seconds: This script is still running in another process!

So it feels like the file_exists() call in tab B waits until the same script in tab A has completed! Isn't that really strange?

I’m confused now. Any enlightenment?

Run the second request in another browser.

Thanks, that's what I just discovered, too. Cool, so I don't have to go insane ;)

But now my original problem persists: I need to prevent processing of a second form submission as long as a previous request is still running.

I also tried flock($fh, LOCK_EX | LOCK_NB), with no luck. If the second request originates from the same browser, LOCK_NB is simply ignored and the second request always waits for the first request to finish.
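For reference, a minimal sketch of the non-blocking flock() pattern I mean (the lock file path and the sleep(10) are just placeholders for the real work):

```php
<?php
// Sketch of non-blocking file locking; path and sleep() are placeholders.
$lockFile = sys_get_temp_dir() . '/myscript.lock';
$fh = fopen($lockFile, 'c'); // create if missing, don't truncate

if (!flock($fh, LOCK_EX | LOCK_NB)) {
    // LOCK_NB makes flock() return false immediately instead of blocking
    die('This script is still running in another process!');
}

sleep(10); // the actual long-running work would go here

flock($fh, LOCK_UN); // release the lock
fclose($fh);
echo "Success!";
```

From two separate connections (e.g. two different browsers), the second request should fail the flock() call immediately instead of blocking.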


It's the same issue. You can simply put this in a script:

sleep(10); echo 'done';

The second tab (second connection) will need 20 seconds if you load both at the same time.

Hmm, that doesn't really help. Let me show you my problem, maybe you have an idea:

    public function actionLoooong()
    {
        $user = Yii::app()->user;

        if ($user->getState('complete', false)) {
            // Already processed: just render the "complete" view
            $this->render('complete');
            return;
        }

        if (isset($_POST['SomeForm']) && !$user->getState('stillrunning')) {
            $user->setState('stillrunning', true);   // LOCK PROCESSING
            // ...
            // Now process the form here, which takes some seconds.
            // The same data in $user->getState('data') must not be processed twice!
            // ...
            $user->setState('stillrunning', false);  // RELEASE LOCK AFTER PROCESSING
            $user->setState('complete', true);
            $this->render('complete');
            return;
        }

        // Render a form that asks "do you want to process your data?"
        $this->render('loooong');
    }

The data in $user->getState('data') must never be processed twice! So the basic idea is: I store a flag that indicates the data is being processed right now. If the user re-submits, he will see the form again and can try to submit again. If that happens and the first script has finished, "complete" will be set in the user state and the second form submission will just render the "complete" view. If the first script is still running, the form is rendered again.

But as stated above, this doesn't work. The second request doesn't see the "stillrunning" flag before request 1 has completed.

I'm not sure how the behavior is with CHttpSession instead of CDbHttpSession, but the latter saves the actual session data onEndRequest.

You can create your own session class (by extending your session backend class), make the writeSession() method public and call it somewhere to force the state to be stored instantly.
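In plain PHP terms, the same effect can be sketched with session_write_close(), which persists the session data immediately instead of at request shutdown; the "stillrunning" flag and the sleep() are placeholders for the real processing:

```php
<?php
// Sketch: flush a session flag to the session store right away.
session_start();
$_SESSION['stillrunning'] = true;
session_write_close(); // writes (and closes) the session immediately

sleep(10); // placeholder for the long-running processing

session_start(); // reopen the session to clear the flag
$_SESSION['stillrunning'] = false;
session_write_close();
```

Other requests that read the session after the first session_write_close() will already see the flag, because it has been written to the session store.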

But I think it’s better to use file-based locking. See my mutex for example (signature).

Thanks for your help. But looking at the source of EMutex, I see it's using flock(), too.

As I said, this doesn't work for me either: if the second request comes from the same browser, flock() just sits there and waits for the first request to finish. No matter what I try, I can't tell that another process for the same user is running.

No, flock() doesn't wait; the browser is just holding back the connection for some reason. I'm on Windows and can't sniff local network traffic; you may try and see (you're obviously on Linux).

As noted, put only this in a script:

sleep(10); echo 'done';

The second tab/request will need 20 seconds.

If you load it at the same time in two different browsers, both requests will take 10 seconds (and finish at the same time).

Ah, now I understand. You've put me on a new track here. If the browser doesn't send the next request until the response for the first one arrives, this explains a lot of odd behaviors I've had with duplicate submissions. It also makes sense. I just wasn't aware of that.

So thanks again!

Yes, it's kind of strange. Obviously the browser holds back the connection if you request the same resource URI. Requesting the same script under two different URIs will work, e.g. script.php in one tab and script.php?foo=1 in the other: both refer to the same script, but since the URIs differ, the browser sends both requests immediately.

I see, good to know.

How about using the database to write something and checking if it’s there…


Shouldn’t make any difference: main problem is that the second request will not be sent until the first response arrives.