Server crashes when using cURL in PHP

by ram07
4 replies
Hi guys,

I have spent the whole day debugging this but can't find a solution... Please, can anyone help me?

I am trying to fetch data from a site using cURL. My PHP version is 4.3.9.

Here is the code I have used:

function ipaddress($ip)
{
    $url = 'http://www.example.com?x=' . urlencode($ip);
    $curl = curl_init($url) or die('curl_init() failed');
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // return the response instead of printing it
    curl_setopt($curl, CURLOPT_TIMEOUT, 5);           // give up after 5 seconds
    $data = curl_exec($curl);
    curl_close($curl);
    return $data; // hand the result back to the caller
}

I call this function about 60 times in a loop with a different IP address on a single page load. When I do this, the server often crashes and my site stops loading. It only loads again after I restart the server.
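One change worth trying regardless of the crash cause: reuse a single cURL handle for all 60 lookups instead of creating and destroying one per IP, so the connection can be kept alive between requests. A rough sketch, assuming the same example URL as above and an $ips array pulled from your database:

```php
<?php
// Sketch only: one cURL handle shared across the whole batch.
// Reusing the handle lets cURL keep the connection to the remote
// site open, which cuts per-request overhead in a 60-iteration loop.
function lookup_all(array $ips)
{
    $results = array();
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_TIMEOUT, 5);        // overall cap per request
    curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 5); // cap on connecting alone

    foreach ($ips as $ip) {
        // Only the URL changes between requests; the handle is reused.
        curl_setopt($curl, CURLOPT_URL, 'http://www.example.com?x=' . urlencode($ip));
        $results[$ip] = curl_exec($curl);
    }

    curl_close($curl); // one close, after the whole batch
    return $results;
}
```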

I searched some forums, and they all talk about the execution timeout and the connection timeout...

If anyone is familiar with this, please help me or suggest an alternative way to do this...

Thank you...
#crashed #curl #php #server
  • walkerj
    Store the 60 IPs in a database. Write a script that processes a single IP each time and then reloads itself (reload the page via a JavaScript timeout or a meta refresh).
    • ram07
      Thanks for your reply.

      The IPs are already stored in the database, and I pass each one to the function as I retrieve them from the database in a while loop...

      So on each iteration the function is called and contacts that site. Any ideas on reducing these timeouts?

      Thanks.
  • paulpalm
    walkerj's suggestion is about cutting down your memory usage.

    If you run your program with just one URL as a test, you should get to the point where there are no crashes. walkerj's simple solution therefore tests the memory theory by letting your PHP script be cleared from memory before JavaScript calls it again.

    Or you can examine your code, making sure you free up any objects etc. before the next loop iteration.

    The only other cURL timeout is CURLOPT_CONNECTTIMEOUT,
    e.g. curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 5);

    And if you want the processing to happen faster, you would have to look into the rather more complex topic of parallel requests via the curl_multi functions.

    By the way, I would ask your host to upgrade your PHP to at least version 5.2. Developing on PHP 4 today is like developing on Windows 98 - you just wouldn't. ;-)
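For reference, a rough sketch of the parallel approach with the curl_multi functions (PHP 5+, so only after the upgrade; the URL and query parameter are the placeholders from the original post):

```php
<?php
// Sketch only: fire all lookups in parallel with curl_multi instead
// of one blocking request at a time. Each IP gets its own easy handle;
// the multi handle drives them all concurrently.
function lookup_parallel(array $ips)
{
    $multi = curl_multi_init();
    $handles = array();
    foreach ($ips as $ip) {
        $curl = curl_init('http://www.example.com?x=' . urlencode($ip));
        curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curl, CURLOPT_TIMEOUT, 5);
        curl_multi_add_handle($multi, $curl);
        $handles[$ip] = $curl;
    }

    // Drive the transfers until every one has finished.
    do {
        $status = curl_multi_exec($multi, $running);
    } while ($status === CURLM_CALL_MULTI_PERFORM);

    while ($running && $status === CURLM_OK) {
        if (curl_multi_select($multi) === -1) {
            usleep(100000); // select failed; back off briefly
        }
        do {
            $status = curl_multi_exec($multi, $running);
        } while ($status === CURLM_CALL_MULTI_PERFORM);
    }

    // Collect the responses and release every handle.
    $results = array();
    foreach ($handles as $ip => $curl) {
        $results[$ip] = curl_multi_getcontent($curl);
        curl_multi_remove_handle($multi, $curl);
        curl_close($curl);
    }
    curl_multi_close($multi);
    return $results;
}
```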
  • stma
    I'd clean the whole mess up and turn it into a cron job probably.

    Not sure how often you need the data, but let's say it was once a day.

    Hour 1, Day 1: Get the IPs from the database and store them in a temporary table, or just mark them all as not crawled.

    Hour 1.3, Day 1: Grab the first IP from the DB that hasn't been crawled yet. Do your thing. Mark that IP as crawled.

    Hour 1.4, Day 1: Grab the second IP... etc.

    Sounds like you're just storing too much right now, though. You aren't clearing the results off the server, and loading the page 60 times while keeping all that info around is a lot for any server to handle at once.

    Let it take a breath, and upgrade your PHP version right away.
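The crawled-flag logic stma describes might look roughly like this; the database is stood in for by a plain array here so the batching idea is visible (the `ip` and `crawled` fields are made-up names):

```php
<?php
// Sketch only: in the real cron job each $rows entry would be a
// database row with a `crawled` flag, and marking it crawled would
// be an UPDATE. The point is that each run handles exactly one IP.
function next_uncrawled(array &$rows)
{
    foreach ($rows as $i => $row) {
        if (!$row['crawled']) {
            $rows[$i]['crawled'] = true; // mark it so the next run skips it
            return $row['ip'];
        }
    }
    return null; // everything has been processed for this cycle
}

// Each cron invocation: fetch the next pending IP, crawl it, stop.
// When next_uncrawled() returns null, the whole batch is done and the
// flags can be reset for the next day.
```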