
Is making asynchronous HTTP requests possible with PHP?

I have a PHP script that needs to download several files from a remote server. At the moment I just have a loop downloading and processing the files with cURL, which means that it doesn't start downloading one file until the previous one is finished - this increases the script run time significantly.

Would it be possible to start several instances of cURL, for example, to asynchronously download these files at the same time without waiting for the previous one to finish? If so, how would this be accomplished?
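For reference, the kind of parallel download the question describes is possible with PHP's built-in `curl_multi` API. A minimal sketch, assuming the function name and usage are illustrative rather than taken from any answer below:

```php
// Download several URLs concurrently using PHP's built-in curl_multi API.
// The function name is illustrative, not from the original post.
function downloadAll(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until none are still active.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

All transfers run at the same time, so total run time is roughly that of the slowest download rather than the sum of all of them.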

Check out curl-easy. It supports both blocking and non-blocking requests, in parallel or one at a time. It is also unit-tested, unlike many simple or buggy libraries.

Disclosure: I am the author of this library. It has its own test suite, so I'm fairly confident it is robust.

Also, check out the example of use below:

// We will download info about 2 YouTube videos:
// http://youtu.be/XmSdTa9kaiQ and
// http://youtu.be/6dC-sm5SWiU

// Init queue of requests
$queue = new cURL\RequestsQueue;
// Set default options for all requests in queue
$queue->getDefaultOptions()
    ->set(CURLOPT_TIMEOUT, 5)
    ->set(CURLOPT_RETURNTRANSFER, true);
// Set callback function to be executed when request will be completed
$queue->addListener('complete', function (cURL\Event $event) {
    $response = $event->response;
    $json = $response->getContent(); // Returns content of response
    $feed = json_decode($json, true);
    echo $feed['entry']['title']['$t'] . "\n";
});

$request = new cURL\Request('http://gdata.youtube.com/feeds/api/videos/XmSdTa9kaiQ?v=2&alt=json');
$queue->attach($request);

$request = new cURL\Request('http://gdata.youtube.com/feeds/api/videos/6dC-sm5SWiU?v=2&alt=json');
$queue->attach($request);

// Execute queue
$queue->send();
    • Your library is absolutely remarkable. Thanks for it! I was about to give up and only then, I found it. Greaaaaaat job, mate!
    • @jreed121 no, actually it doesn't. Callback functions are executed sequentially, so there is no risk of accessing objects simultaneously.
    • Nice. Honestly, I get the concept of TS, but I'm still pretty ignorant on the details. All I know for sure is that I need to make a crapload of API calls to different sites, and this particular site is on shared hosting that is not TS. Thanks again for sharing this, and for the quick reply!


There is the multirequest PHP library (or see the archived Google Code project). It's a multi-threaded cURL library.

As another solution, you could write a script that does that in a language that supports threading, like Ruby or Python. Then, just call the script with PHP. Seems rather simple.
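A minimal sketch of that delegation approach from the PHP side, assuming a hypothetical threaded downloader script named `downloader.py` (the script name and command shape are mine, not from the answer):

```php
// Build a shell command that hands the URL list to an external threaded
// downloader. 'downloader.py' is a hypothetical helper script.
function buildDownloaderCommand(array $urls): string
{
    // escapeshellarg() protects against shell injection via the URLs.
    $args = implode(' ', array_map('escapeshellarg', $urls));
    return 'python3 downloader.py ' . $args;
}

// Then run it and collect whatever the helper prints, e.g.:
// $output = shell_exec(buildDownloaderCommand($urls));
```

The helper script does the threaded downloading; PHP only launches it and reads the output back.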


For PHP 5.5+, mpyw/co is the ultimate solution. It works much like tj/co in JavaScript.


Assume that you want to download multiple specified GitHub users' avatars. The following steps are required for each user.

  1. Get content of http://github.com/mpyw (GET HTML)
  2. Find <img class="avatar" src="..."> and request it (GET IMAGE)

---: waiting for my response
...: waiting for another response in parallel flows

Many well-known curl_multi-based scripts already provide the following flow.

        /-----------GET HTML\  /--GET IMAGE.........\
       /                     \/                      \ 
[Start] GET HTML..............----------------GET IMAGE [Finish]
       \                     /\                      /
        \-----GET HTML....../  \-----GET IMAGE....../

However, this is not efficient enough. Don't you want to reduce these worthless waiting times?

        /-----------GET HTML--GET IMAGE\
       /                                \            
[Start] GET HTML----------------GET IMAGE [Finish]
       \                                /
        \-----GET HTML-----GET IMAGE.../

Yes, it's very easy with mpyw/co. For more details, visit the repository page.
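The two-step avatar flow above can be sketched with mpyw/co's generator style. This is a sketch based on the library's documented pattern (yielding cURL handles suspends the flow until the response arrives); the regex and URLs are illustrative assumptions:

```php
// A sketch of the avatar flow: each user is one generator, and all
// generators run concurrently. The regex and URLs are illustrative.
function fetchAvatar(string $user): \Generator
{
    // Step 1: GET HTML
    $ch = curl_init("https://github.com/{$user}");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $html = yield $ch; // suspends here; other flows keep running

    // Step 2: find the avatar URL, then GET IMAGE
    preg_match('/class="avatar[^"]*"[^>]*src="([^"]+)"/', $html, $m);
    $ch = curl_init($m[1]);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    return yield $ch; // raw image bytes
}

// Driving it (requires `composer require mpyw/co`):
// $avatars = \mpyw\Co\Co::wait([
//     'mpyw' => fetchAvatar('mpyw'),
//     'user' => fetchAvatar('user'),
// ]);
```

Because each flow moves on to its image request as soon as its own HTML arrives, no flow waits on another, which is exactly the second diagram above.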


The library by @stil is so, so cool. Many thanks to him!

Still, I have written a nice utility function that makes it very easy to fetch content from multiple URLs (APIs in my case) asynchronously and to return the responses without losing track of which is which.

You simply call it with a key => value array as input, and it returns a key => response array as the result :- )

    /**
     * This function runs multiple GET requests in parallel.<br />
     * @param array $urlsArray needs to be in format:<br />
     * <i>array(<br />
     * [url_unique_id_1] => [url_for_request_1],<br />
     * [url_unique_id_2] => [url_for_request_2],<br />
     * [url_unique_id_3] => [url_for_request_3]<br />
     * )</i><br />
     * e.g. input like:<br />
     *  <i>array(<br />
     * &nbsp; "myemail@gmail.com" =>
     * &nbsp; "http://someapi.com/results?search=easylife",<br />
     * &nbsp; "michael@gmail.com" =>
     * &nbsp; "http://someapi.com/results?search=safelife"<br />
     * )</i>
     * @return array An array where for every <i>url_unique_id</i> response to this request 
     * is returned e.g.<br />
     * <i>array(<br />
     * &nbsp; "myemail@gmail.com" => <br />
     * &nbsp; "Work less, enjoy more",<br />
     * &nbsp; "michael@gmail.com" => <br />
     * &nbsp; "Study, work, pay taxes"<br />
     * )</i>
     *  */
    public function getResponsesFromUrlsAsynchronously(array $urlsArray, $timeout = 8) {
        $queue = new \cURL\RequestsQueue;

        // Set default options for all requests in queue
        $queue->getDefaultOptions()
                ->set(CURLOPT_TIMEOUT, $timeout)
                ->set(CURLOPT_RETURNTRANSFER, true);

        // =========================================================================
        // Define some extra variables to be used in callback

        global $requestUidToUserUrlIdentifiers;
        $requestUidToUserUrlIdentifiers = array();

        global $userIdentifiersToResponses;
        $userIdentifiersToResponses = array();

        // =========================================================================

        // Set function to be executed when request will be completed
        $queue->addListener('complete', function (\cURL\Event $event) {

            // Define user identifier for this url
            global $requestUidToUserUrlIdentifiers;
            $requestId = $event->request->getUID();
            $userIdentifier = $requestUidToUserUrlIdentifiers[$requestId];

            // =========================================================================

            $response = $event->response;
            $json = $response->getContent(); // Returns content of response

            $apiResponseAsArray = json_decode($json, true);
            $apiResponseAsArray = $apiResponseAsArray['jobs'];

            // =========================================================================
            // Store this response in proper structure
            global $userIdentifiersToResponses;
            $userIdentifiersToResponses[$userIdentifier] = $apiResponseAsArray;
        });

        // =========================================================================

        // Add all requests to queue
        foreach ($urlsArray as $userUrlIdentifier => $url) {
            $request = new \cURL\Request($url);
            $requestUidToUserUrlIdentifiers[$request->getUID()] = $userUrlIdentifier;
            $queue->attach($request);
        }

        // =========================================================================

        // Execute queue
        $queue->send();

        // =========================================================================

        return $userIdentifiersToResponses;
    }
    • I'm glad you made use of it. But there is a simpler solution to track "which request is which": $request = new \cURL\Request($url); $request->_myId = 1; and then check on the complete event: $event->request->_myId. It's a popular problem, so I'll see how to implement it in the library without declaring dynamic properties on the request object.
    • Ah, yeah. Well, that's a bit easier. Actually, I thought about it, but like you said, I wanted to avoid a dynamic field being assigned to an external class. Thanks for the tip and for the library! :- )

In PHP 7.0 with Apache 2.0, as noted in the PHP exec documentation, redirecting the output by adding " &> /dev/null &" at the end of the command makes it run in the background; just remember to wrap the command correctly.

$time = microtime(true);
$command = '/usr/bin/curl -H \'Content-Type: application/json\' -d \'' . $curlPost . '\' --url \'' . $wholeUrl . '\' >> /dev/shm/request.log 2> /dev/null &';
exec($command);
echo (microtime(true) - $time) * 1000 . ' ms';

The above works well for me and takes only 3 ms, but the following won't work and takes 1500 ms.

$time = microtime(true);
$command = '/usr/bin/curl -H \'Content-Type: application/json\' -d \'' . $curlPost . '\' --url ' . $wholeUrl;
exec($command . ' >> /dev/shm/request.log 2> /dev/null &');
echo (microtime(true) - $time) * 1000 . ' ms';

In short, adding " &> /dev/null &" at the end of your command may help; just remember to WRAP your command properly.
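The fire-and-forget pattern above can be wrapped in a small helper. A sketch, assuming a POSIX shell; the function name is mine, not from the original answer:

```php
// Fire-and-forget: run a shell command in the background so exec()
// returns immediately instead of waiting for the command to finish.
// Assumes a POSIX shell; the function name is illustrative.
function fireAndForget(string $command): void
{
    // Redirect stdout/stderr and background the job; without the
    // redirection and trailing '&', exec() blocks until the command exits.
    exec($command . ' > /dev/null 2>&1 &');
}

// Returns in a few milliseconds even if the request itself takes seconds:
// fireAndForget('/usr/bin/curl --url http://example.com/slow-endpoint');
```

Note that you lose the command's output and exit status this way, so log to a file (as in the snippets above) if you need to inspect the result later.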


This article is reproduced from Stack Exchange / Stack Overflow.
