How to process multiple parallel requests from one client to one PHP script

I have a webpage that, when a user opens it, instantly fires multiple (10-20) Ajax requests at a single PHP script. Depending on the parameters in each request, the script returns a different report built from highly aggregated data.

The problem is that many of the reports require heavy SQL queries to gather the necessary data, and in some cases a single report can take several seconds to load.
Because one client is sending all of these requests to the same PHP script, the reports appear on the page slowly, one at a time. In other words, report generation is not happening in parallel, so the page takes a long time to fully load.

Is there any way to get around this in PHP and make it possible for all the requests from a single client to a single PHP script to be processed in parallel so that the page and all its reports can be loaded faster?

Thank you.

asked Oct 15 '25 by HartleySan

1 Answer

As far as I know, it is possible to do multi-threading in PHP. Have a look at the pthreads extension (note that it requires a thread-safe, ZTS build of PHP).

What you could do is have the report-generation part of the script execute in parallel. Each report would then run in a thread of its own and return its results much sooner. Also, cap the number of concurrent threads at 10 or so, so that the script doesn't become a resource hog.
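As a rough sketch of that idea (this assumes a ZTS build of PHP with the pthreads extension loaded; the `ReportTask` class, the connection details, and the `$allReportTypes` list are illustrative names, not from the original post):

```php
<?php
// Sketch only: requires PHP built with ZTS and the pthreads extension.

class ReportTask extends Thread
{
    private $reportType;
    public $result;

    public function __construct(string $reportType)
    {
        $this->reportType = $reportType;
    }

    public function run()
    {
        // Each thread needs its own DB connection; PDO handles
        // cannot be shared safely across threads.
        $pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
        $stmt = $pdo->prepare(
            'SELECT /* heavy aggregate query for this report type */ 1'
        );
        $stmt->execute();
        // Serialize so the result survives being passed back
        // from the thread context.
        $this->result = serialize($stmt->fetchAll(PDO::FETCH_ASSOC));
    }
}

// Cap concurrency at 10, as suggested above.
$tasks = [];
foreach (array_slice($allReportTypes, 0, 10) as $type) {
    $task = new ReportTask($type);
    $task->start();          // begin running run() in its own thread
    $tasks[] = $task;
}

$reports = [];
foreach ($tasks as $task) {
    $task->join();           // wait for the thread to finish
    $reports[] = unserialize($task->result);
}
```

For more than 10 report types you would queue the remainder and start them as earlier threads finish (or use a pthreads `Pool` with a fixed worker count), rather than spawning one thread per request.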

Here is a basic tutorial to get you started with pthreads.

And here are a few more examples that could be of help (notably the SQLWorker example, in your case).

answered Oct 17 '25 by Niket Pathak