
I have some PHP code that takes a very long time to execute.

I need to implement the following scheme:

  1. A user opens some page (page 1).
  2. That page starts my large PHP script running in the background. (Every change is written to the database.)
  3. Every N seconds we query the database to get the current status of the execution.

I don't want to use the exec command, because 1,000 users would mean 1,000 PHP processes. That's not an option for me.
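For what it's worth, the polling half of this scheme (step 3) can be sketched roughly as below. The table and column names (`jobs`, `status`, `progress`) are illustrative, not anything from the question:

```php
<?php
// Sketch of the status endpoint the browser polls every N seconds.
// The "jobs" table and its columns are assumptions for illustration.

// Turn a job row (as fetched from the database) into the JSON
// payload the polling JavaScript consumes.
function statusResponse(array $row): string {
    return json_encode([
        'status'   => $row['status'],
        'progress' => (int)$row['progress'],
    ]);
}

// A hypothetical status.php would do roughly:
//   $stmt = $db->prepare("SELECT status, progress FROM jobs WHERE id = ?");
//   $stmt->execute([$_GET['id']]);
//   echo statusResponse($stmt->fetch(PDO::FETCH_ASSOC));
// while the page's JavaScript calls it with setInterval(poll, N * 1000).
```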

  • Because this process must continue without the user's activity. Commented Jan 7, 2013 at 5:25

3 Answers


So you basically want a queue (possibly stored in a database) and a command-line script, run by cron, that processes queued items.

Clarification: I'm not sure what's unclear about my answer, but it complies with the two requirements imposed by the question:

  1. The script cannot be aborted by the client
  2. You share a single process between 1,000 clients
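A minimal sketch of this queue, assuming a `jobs` table (SQLite is used here only so the example is self-contained; any RDBMS works the same way):

```php
<?php
// DB-backed job queue sketch. Table name "jobs" and its columns are
// assumptions for illustration.

function initQueue(PDO $db): void {
    $db->exec("CREATE TABLE IF NOT EXISTS jobs (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        payload TEXT NOT NULL,
        status TEXT NOT NULL DEFAULT 'queued',
        progress INTEGER NOT NULL DEFAULT 0
    )");
}

// Page 1 calls this: it only inserts a row and returns immediately,
// so 1,000 users cost 1,000 rows, not 1,000 processes.
function enqueueJob(PDO $db, string $payload): int {
    $stmt = $db->prepare("INSERT INTO jobs (payload) VALUES (?)");
    $stmt->execute([$payload]);
    return (int)$db->lastInsertId();
}

// The single cron worker calls this in a loop: claim the oldest
// queued job and mark it running so it is not picked up twice.
function claimNextJob(PDO $db): ?array {
    $db->beginTransaction();
    $row = $db->query("SELECT * FROM jobs WHERE status = 'queued'
                       ORDER BY id LIMIT 1")->fetch(PDO::FETCH_ASSOC);
    if ($row) {
        $db->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")
           ->execute([$row['id']]);
    }
    $db->commit();
    return $row ?: null;
}
```

The worker then runs the long job, updating `progress` as it goes, which is exactly what the browser polls for.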



Use HTTP requests to the local HTTP server from within your script, in combination with PHP's ignore_user_abort() function.

That way you keep the load inside the HTTP server's worker processes, you get a natural limit, and queuing of requests comes for free.
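The worker endpoint this answer describes might look roughly like the sketch below; the endpoint name and response shape are made up:

```php
<?php
// Sketch of an internal worker endpoint (e.g. a hypothetical
// worker.php that page 1 fires a request at).

ignore_user_abort(true);  // keep running even if the client disconnects
set_time_limit(0);        // lift the execution time limit for the long job

// Build the small "job accepted" acknowledgement the caller gets back
// before the long work actually starts.
function acceptJob(): string {
    return json_encode(['accepted' => true]);
}

// The endpoint would echo acceptJob(), flush its output buffers so the
// caller returns immediately, then continue the long-running work,
// writing progress rows to the database as in the question.
```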


I would also remind him to use set_time_limit(0) and register_shutdown_function().

You can use the CLI to execute multiple PHP scripts,

or

you can try Easy Parallel Processing in PHP
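Launching a CLI worker detached from the request usually comes down to building a shell command like the one below; the `worker.php` script name is hypothetical, and note the question's caveat that spawning one process per user does not scale:

```php
<?php
// Build a shell command that runs a PHP script in the background,
// detached from the current request (Unix-style redirection and "&").
// The script path is a hypothetical example.
function buildDetachedCommand(string $script): string {
    // escapeshellarg guards the path against shell injection.
    return sprintf('php %s > /dev/null 2>&1 &', escapeshellarg($script));
}

// Usage (not executed here): exec(buildDetachedCommand('worker.php'));
```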

