I have a PHP script that posts a request to a remote API. If the response takes longer than about 200 seconds to come back, then I just get a content-length of zero in the response. I am trying to figure out why that is happening.

In an attempt to resolve this issue, I have set every conceivable variable in Apache's and PHP's config files to longer than 300 seconds, as recommended by the first answer below. The things I have set to 300 seconds (a quick way to verify the PHP values at runtime is sketched after this list):

  • Apache timeout
  • Apache keep_alive time
  • PHP max_response_time
  • PHP session.cache_expire time
  • PHP max_execution_time
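
For reference, here is a quick way to double-check at runtime what PHP itself thinks these values are (a minimal sketch; the Apache Timeout and KeepAlive values have to be checked in the Apache configuration, not from PHP):

    // Print the timeout-related php.ini values PHP actually sees at runtime.
    // ini_get() returns false for directives PHP does not recognize (there is
    // no standard "max_response_time" directive, for example).
    foreach (array('max_execution_time', 'session.cache_expire', 'max_response_time') as $key) {
        echo $key . ' = ' . var_export(ini_get($key), true) . PHP_EOL;
    }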

Despite that, I still consistently get zero-content-length responses right around the 200-second mark. However, if the response comes back in less than 200 seconds, the problem does not occur.

Below I describe how our code is set up.

What happens is that crontab runs a shell script on our server, which calls a localhost URI using /usr/bin/curl. The localhost URI is served by Apache and is a PHP file that itself contains the below code, which in turn uses cURL to call out to the remote API. We POST about 10KB of XML and expect to receive about 135KB back, in chunks.

Here is the request code:

        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $this->_xml_url);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_VERBOSE, true);
        curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
        curl_setopt($ch, CURLOPT_POSTFIELDS, $xml_str);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $output = curl_exec($ch);        
        curl_close($ch);

I turned on debugging in our Apache logging, and below is what we get. In this example the request was sent at 19:48:00 and the response came back at 19:51:23, just over 200 seconds later.

* About to connect() to api.asdf.com port 443 (#0)
*   Trying 555.555.555.555... * connected
* successfully set certificate verify locations:
*   CAfile: none
  CApath: /etc/ssl/certs
* SSL connection using RC4-SHA
* Server certificate:
*    subject: snip
*    start date: 2014-03-12 10:22:02 GMT
*    expire date: 2015-04-16 12:32:58 GMT
*    subjectAltName: api.asdf.com matched
*    issuer: C=BE; O=GlobalSign nv-sa; CN=GlobalSign Organization Validation CA - G2
*    SSL certificate verify ok.
> POST /xmlservlet HTTP/1.1

Host: api.asdf.com
Accept: */*
Content-Type: text/xml
Content-Length: 10773
Expect: 100-continue

< HTTP/1.1 100 Continue
< HTTP/1.1 200 OK
< Cache-Control: private
< Content-Type: text/xml
< Server: Microsoft-IIS/7.5
< X-AspNet-Version: 4.0.30319
< X-Powered-By: ASP.NET
< Date: Thu, 06 Nov 2014 19:51:23 GMT
< Content-Length: 0
< 

* Connection #0 to host api.asdf.com left intact
* Closing connection #0
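
For reference, a trace like the one above can also be captured straight from PHP by pointing cURL's verbose output at a file handle. This is only a minimal sketch (the log path is a made-up example, not the path used here):

    // Hypothetical log path - adjust to wherever your application writes logs.
    $trace = fopen('/tmp/curl_trace.log', 'a');

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $this->_xml_url);
    // ... same POST options as in the request code above ...
    curl_setopt($ch, CURLOPT_VERBOSE, true);   // emit the handshake/header trace
    curl_setopt($ch, CURLOPT_STDERR, $trace);  // ... and write it to the file handle

    $output = curl_exec($ch);
    curl_close($ch);
    fclose($trace);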

I would like to know if there is something wrong with this code or something I may have missed in the server settings that could cause the content length to come back zero after 200 seconds.

    • It might be related to the 100 Continue status. Try disabling it in the cURL request: curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:', 'Content-Type: text/xml')); (a fuller sketch follows below)
    • I think the issue is not on your side; it's related to the remote host you are sending data to. Most probably their server-side script timeout is set to 200s.
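
For illustration, the request code from the question with the first suggestion applied would look roughly like this (a sketch of the suggested change, not the asker's actual fix):

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $this->_xml_url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_VERBOSE, true);
    // Sending an empty "Expect:" header stops cURL from adding
    // "Expect: 100-continue" and waiting for the interim "100 Continue" reply.
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:', 'Content-Type: text/xml'));
    curl_setopt($ch, CURLOPT_POSTFIELDS, $xml_str);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $output = curl_exec($ch);
    curl_close($ch);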

Sounds very much like a timeout on your remote API server - try to access it manually e.g. via browser or wget.

    • I think you win. I tried manually accessing via curl -m 1800 -v -S -H "Content-Type: application/xml" --data @test.xml {MYURI} and I'm having the same issues, with no PHP or Apache involved. So unless there is some arcane curl option I'm not considering here, then I think you are correct -- especially since googling around about Microsoft-IIS/7.5, I find that 200 seconds is the value they recommend for timeouts... I'm not saying it was Microsoft, but it was Microsoft. Your suggestion to try accessing it manually is what led me to eliminate enough other possibilities to be 99% sure on this.
    • Alright, I tested a few more curl configuration options, but no matter what I try we get the same 200 seconds and out, unless I reduce the size of my query, in which case the response starts to come back in under 200 seconds. What you and Iserni are saying seems highly likely to be correct, then, based on the preponderance of the evidence. :D
    • BTW for the record, I tried wget instead of curl, just to be sure. Same issue. And every conceivable curl argument.

What is very likely to be happening is that the remote server ($this->_xml_url) is actually a load balancer or other front-end block that has a timeout of 200s and calls a backend server.

If the backend server does not answer within the time frame, the front end server closes the backend connection and continues executing (which is arguably the wrong thing to do) and sends to your script a "success" answer with the content it has got.

Which is nothing. Hence content length of zero.

The best you can do is recognize the problem by checking the data length or XML conformance, and either retry or notify the user. I'd use a timeout lower than 200s - let's say 180s - in order to be sure that you are the one raising the error, i.e. the remote server will never hang up on you; you will hang up on it.
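
As a rough illustration of that approach, here is a minimal sketch built on the question's request code (the 180-second timeout and the retry count of 3 are arbitrary choices for the example, not values from the original code):

    $max_attempts = 3;   // arbitrary retry budget for this sketch
    $output       = false;

    for ($attempt = 1; $attempt <= $max_attempts; $attempt++) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $this->_xml_url);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
        curl_setopt($ch, CURLOPT_POSTFIELDS, $xml_str);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 180);   // hang up before the remote end does

        $output = curl_exec($ch);
        $error  = curl_error($ch);
        curl_close($ch);

        // Treat a cURL failure, an empty body, or non-well-formed XML as "no answer".
        if ($output !== false && strlen($output) > 0 && @simplexml_load_string($output) !== false) {
            break;   // looks like a real response
        }

        error_log("API call attempt $attempt failed: " . ($error ?: 'empty or invalid XML'));
        $output = false;
    }

    if ($output === false) {
        // All attempts failed: notify the user, reschedule the cron job, etc.
    }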

Try also notifying the API server maintainers in case there's some way of speeding up the request (a different API? Different encoding? Possibility of caching results? More expensive SLAs? etc.)

    • Well... we shall see if they do anything different this time. It is what it is. I just wanted to make sure our server wasn't the reason. Thanks.
    • With all due respect, your answer is correct too, but Tyron was technically first and suggested trying a manual method to diagnose the issue and eliminate other possibilities. The problem with notifying the API server maintainers is that these particular guys never respond to open tickets, offer no other SLAs, no possibility of caching results, and no other encodings. Your tip about the load balancer may be correct, and I've filed a ticket with them to maybe clue them in, although I'm pretty sure that ticket is going straight to their "circular file".
    • That is absolutely okay. I am also upvoting Tyron for conciseness! Sorry to hear about the maintainers' lack of support...

Have you tried using set_time_limit in PHP?

http://php.net/manual/en/function.set-time-limit.php

The default for PHP is usually 30 seconds and is found in php.ini.
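
For completeness, this is the generic usage (a sketch, not the asker's code; the 900-second value is just an example):

    // Allow the script to run for up to 15 minutes instead of the php.ini default.
    // Note that set_time_limit() only raises PHP's own execution limit; it does
    // not change Apache's Timeout or cURL's timeouts.
    set_time_limit(900);

    // ... long-running cURL request goes here ...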


For configuring Apache, see the Timeout, KeepAlive, KeepAliveTimeout and MaxKeepAliveRequests directives.

http://users.cis.fiu.edu/~downeyt/cgs4854/timeout


Also see http://www.devside.net/wamp-server/apache-and-php-limits-and-timeouts for a good overall tutorial.

I've been able to run requests with Apache and PHP for 15 minutes, so you can extend it for quite a long time.

    • I tried set_time_limit, but that gets overridden automatically by max_execution_time, which I have set to 300000. As for Apache, timeout is set to 300, so that cannot possibly be the problem. Meanwhile keepalivetimeout only applies to requests that come into Apache from the outside. My script uses cURL in PHP to send the request, but cURL's default is not to set any limit on how long to wait for the response. It gets a 200 OK response to the POST request initially, but then just gets a blank string as the response after 200 seconds. Why? There is no curl config file anywhere on my system.
    • There is a default timeout for curl, set to 300 secs (not 200); you can override it with curl_setopt($ch, CURLOPT_TIMEOUT, 400);. Anyway, if curl's timeout were reached, curl_exec would return false, not an empty server response (see the sketch below).
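
To illustrate the distinction drawn in the comment above, here is a minimal sketch (not the original code) of telling a cURL-side timeout apart from an empty 200 response sent by the server:

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $this->_xml_url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
    curl_setopt($ch, CURLOPT_POSTFIELDS, $xml_str);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 400);   // explicit cURL-side timeout

    $output = curl_exec($ch);

    if ($output === false) {
        // cURL itself gave up (timeout, connection error, ...); curl_error() says why.
        error_log('cURL error: ' . curl_error($ch));
    } elseif ($output === '') {
        // The server answered but sent an empty body (Content-Length: 0) - the
        // situation described in the question; cURL reports no error here.
        error_log('Empty body, HTTP status ' . curl_getinfo($ch, CURLINFO_HTTP_CODE));
    }

    curl_close($ch);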

Make sure you are using the full URL to the external resource you are trying to reach (since we cannot verify that from your code example), and more importantly, make sure your cron command executes from the path your script resides in, like this:

1 1 * * 0 /path/to/your/script php < yourscript.php

Doing so will ensure anything referenced by a relative location path is found correctly.
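
Alternatively (a general technique, not something taken from the question's code), the script itself can pin its working directory so that relative paths resolve the same way whether it is run by cron, Apache, or the CLI:

    // Make relative paths resolve against the script's own directory,
    // regardless of the working directory cron or the shell happens to use.
    chdir(__DIR__);

    require_once 'config.php';   // hypothetical relative include, for illustration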

Keep in mind that when you test something via the browser your Apache log is fine, but since you are cronning the script, cron-related errors will be sent to your cron output file.

    • We don't use the PHP CLI for accessing the script. The cron job calls curl to access it. Output from cron is sent to /dev/null in our crontab. Apache's error.log is where output goes for us.
    • By default, cron (natively) should run longer than 200 seconds as long as it gets the initial handshake response, but you may want to pass a max-timeout length (in seconds) in case your server is set to less than 200. I would try using the php cli approach as well, instead of looping through your shell script, since you are essentially nesting cron calls both from your server to itself, and then to the outside world.
    • As noted in other comments, I bypassed cron, php, and Apache entirely, and just manually ran the curl request (see my comment to Tyron's answer). I got the same results meaning it has nothing to do with anything other than the remote server.
