Websites make requests to other websites all the time, whether for something as simple as an image or for more complex information such as search data or social media sharing data. Often, information is pulled into your website by a cURL request or a similar method. These calls can slow or block page loads for several reasons. On a commerce website, for instance:
- The website currently initiates a call to another system with the order details.
- This call, if synchronous, will cause the system to appear slow since the user must wait for the data to return for the order to complete.
- You can use cron to push order data to the external system every five minutes, but this may not meet business constraints for processing time. It can also create other problems due to the heavy use of cron.
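As a rough illustration, the cron-based push described in the last bullet can be sketched with Drupal 7's queue API. The module name, callback names, and endpoint URL below are all placeholders, not part of any existing module:

```php
<?php
// At checkout time: enqueue the order instead of calling the remote system
// inline, so the user is not blocked waiting on the external service.
function mymodule_order_complete($order) {
  DrupalQueue::get('mymodule_order_push')->createItem($order);
}

// Declare the queue worker; cron drains the queue in the background.
function mymodule_cron_queue_info() {
  return array(
    'mymodule_order_push' => array(
      'worker callback' => 'mymodule_push_order',
      'time' => 30, // maximum seconds to spend per cron run
    ),
  );
}

// The worker makes the external call with an explicit timeout.
function mymodule_push_order($order) {
  drupal_http_request('https://example.com/api/orders', array(
    'method' => 'POST',
    'data' => drupal_json_encode($order),
    'timeout' => 15,
  ));
}
```

As the bullet notes, this still ties delivery latency to the cron interval, so it only fits when the business can tolerate that delay.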
Finding and fixing external calls that aren't working properly is the best way to ensure stability. A common Drupal 7 culprit is `drupal_http_request`. In Drupal 8, `drupal_http_request` has been replaced by the Guzzle HTTP library. The Drupal.org page has examples of its implementation.
- If you have a New Relic Pro subscription, examine the External Calls section, which will give you some idea of where the calls are coming from.
- Use grep in the codebase to look for `\Drupal::httpClient()` (Drupal 8 and newer), `drupal_http_request` (Drupal 7 or older), and `file_get_contents`. As an example:

find . -regex ".*\.\(php\|module\|inc\|install\)" | xargs grep --color=always -in 'drupal_http_request'

Inspect the output, paying particular attention to custom modules.
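To cover all three patterns in one pass, you can extend the same find-and-grep approach with an alternation. The fixture directory created below is purely illustrative; run the final pipeline from your own docroot instead:

```shell
# Create a tiny fixture tree just to demonstrate the search.
mkdir -p /tmp/callscan/sites/all/modules/custom/mymodule
cat > /tmp/callscan/sites/all/modules/custom/mymodule/mymodule.module <<'EOF'
<?php
function mymodule_fetch() {
  $result = drupal_http_request('https://example.com/api');
}
EOF

# Search .php/.module/.inc/.install files for all three external-call patterns.
cd /tmp/callscan
find . -regex ".*\.\(php\|module\|inc\|install\)" \
  | xargs grep -in 'drupal_http_request\|file_get_contents\|httpClient'
```

The output lists each matching file with the line number of the call, so you can work through custom modules one by one.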
- Ensure you are not making calls to your own load balancers. In Drupal 7 this usually comes from a
- If there's a long-running request making external calls, `drupal_http_request` will have a long wait time in xhprof. You can use that to trace where the call is coming from. `drupal_http_request` will also sometimes display as `User Agent` in logs.
cURL, Drupal, and other PHP libraries have similar timeout settings built into their systems, and these can be used for similar reasons. The proper values are highly dependent on the type of external call and the resources being accessed.
cURL has parameters that limit the amount of time a command waits for a response before terminating. Setting these parameters to a reasonable limit can preserve your website's performance when the external website you are attempting to reach is unavailable. We suggest reviewing at least these two options for `curl_setopt`; setting them can prevent a call from hanging, which keeps your website from responding slowly when a remote server is having issues:
- `CURLOPT_CONNECTTIMEOUT` - The number of seconds to wait while trying to connect. Set the value to `0` to wait indefinitely.
- `CURLOPT_TIMEOUT` - The maximum number of seconds to allow cURL functions to execute. The default value is `0`, meaning cURL never times out on its own; the request is then bounded only by PHP's `max_execution_time` variable. On Acquia Cloud, this variable can be changed in your PHP settings.
As an example, you could set these options as part of the function like this:

curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // connect timeout in seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 30); // total timeout in seconds
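Putting the two options together, here is a minimal plain-PHP sketch. The endpoint URL is a placeholder, and logging is simplified to `error_log`:

```php
<?php
// Placeholder endpoint; substitute the service you actually call.
$ch = curl_init('https://example.com/api/orders');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // give up connecting after 10 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // cap the whole transfer at 30 seconds

$body = curl_exec($ch);
if ($body === FALSE) {
  // The remote side is down or slow; fail fast instead of hanging the page.
  error_log('External call failed: ' . curl_error($ch));
}
curl_close($ch);
```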
You can change the timeout options in Drupal, depending on the version that you're using:
- Drupal 8
For Drupal 8, when you are using the Guzzle-based HTTP client (note that `\Drupal::httpClient()` takes no arguments):

$client = \Drupal::httpClient();

You can pass a timeout with the request:

$response = $client->request('GET', $url, ['timeout' => 5]);
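A slightly fuller sketch with error handling, assuming a Drupal 8+ environment; `$url` and the `mymodule` logger channel are placeholders, while `timeout` and `connect_timeout` are standard Guzzle request options:

```php
<?php
$client = \Drupal::httpClient();
try {
  $response = $client->request('GET', $url, [
    'timeout' => 5,          // cap the whole request at 5 seconds
    'connect_timeout' => 2,  // give up connecting after 2 seconds
  ]);
  $data = (string) $response->getBody();
}
catch (\GuzzleHttp\Exception\RequestException $e) {
  // Timed out or failed; log it and continue without the remote data.
  \Drupal::logger('mymodule')->warning('External call failed: @message', ['@message' => $e->getMessage()]);
}
```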
- Drupal 7
If you're using `drupal_http_request`, you can include a timeout in the options array:

drupal_http_request($url, array('timeout' => 5));
`drupal_http_request` uses HTTP/1.0. This may cause external requests to remain in an open, waiting state unless the server response includes Content-Length or Transfer-Encoding headers. In this case, we recommend testing the Curl HTTP Request module as a drop-in replacement for `drupal_http_request`.
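In Drupal 7 you can also check the result object so a failed or timed-out call degrades gracefully. In this sketch, `mymodule` and the endpoint URL are placeholders:

```php
<?php
$result = drupal_http_request('https://example.com/api/orders', array('timeout' => 5));
if (isset($result->error)) {
  // Timed out or failed; log it and carry on without the remote data.
  watchdog('mymodule', 'External call failed: %error', array('%error' => $result->error), WATCHDOG_WARNING);
}
else {
  $payload = $result->data;
}
```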
A website can cache the results of an external request for a specific amount of time. This removes the need to make the call on every webpage request. Caching an external call for even a few minutes can help alleviate the performance issue.
Caching or background processing are really the only tools for taking load off your side of the request. Optimizing the other end of the service request would be the other solution, but these external services are often outside the user's control.
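Caching the call for a few minutes can be sketched in Drupal 7 with the cache API. The function name, cache ID, and endpoint below are illustrative:

```php
<?php
function mymodule_get_remote_data() {
  $cid = 'mymodule:remote_data';
  // Serve the cached copy if we fetched recently.
  if ($cache = cache_get($cid)) {
    return $cache->data;
  }
  $result = drupal_http_request('https://example.com/api/data', array('timeout' => 5));
  if (isset($result->error)) {
    return NULL;
  }
  // Keep the response for five minutes so later page loads skip the call.
  cache_set($cid, $result->data, 'cache', REQUEST_TIME + 300);
  return $result->data;
}
```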
Using background processes
The Background Process module offers one way to continue processing a webpage, even without the external asset. When set up properly, it can:
- Issue a call to a remote service at the beginning of request processing (non-blocking)
- Continue webpage processing
- When the result is needed, check whether it has already arrived; if not, wait for it with a timeout
- Optionally cache the response
This helps ensure that webpages load, even if assets are missing. If the asset isn't needed for that webpage, it can also add the request to a queue and process it in the background. However, it can also increase server load, because it spawns separate processes to handle each request.