How to make my PHP script keep on running?

9 replies
I made a script that backs up and transfers database records to another server. The problem is that if the backup file is too big, the PHP script stops because it runs too long. I tried set_time_limit(), but it is not working. How can I override this?
#programming #make #php #running #script
  • Beyond the time limit your script may be hitting other limits as configured by your server's PHP and Apache.

    If it's processing a large backup you could be hitting memory limits.

    Also, double-check that your set_time_limit() call is actually having an effect, as not all configurations allow settings to be changed that way.

    Another thing to look into is re-designing your backup process so it does it in parts. One database table at a time, or for large tables, cutting it into chunks of records. Many processes make light work.

    -Ryan
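    The chunking idea above can be sketched roughly like this. This is a minimal illustration, not the poster's actual code; the table name, column, and helper are placeholders.

```php
<?php
// Split $total rows into [offset, limit] chunks of at most $size rows,
// so each query handles a bounded slice of the table.
function chunkRanges(int $total, int $size): array {
    $ranges = [];
    for ($offset = 0; $offset < $total; $offset += $size) {
        $ranges[] = [$offset, min($size, $total - $offset)];
    }
    return $ranges;
}

// Usage sketch (assumes a mysqli connection in $db and a hypothetical
// appendToBackupFile() helper):
// foreach (chunkRanges($rowCount, 1000) as [$offset, $limit]) {
//     $res = $db->query("SELECT * FROM records LIMIT $limit OFFSET $offset");
//     appendToBackupFile($res);
// }
```

    Backing up one bounded slice per query keeps any single operation short, whatever the total table size.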
    What happens most of the time is that the browser times out because it isn't receiving any output.
    Look up PHP's flush(). If that doesn't work, I've also used some Ajax to keep the connection alive and refresh the screen.
    • This might be the one you need - it's free:
      MySQLDumper - Backup your MySQL-Database (e.g. forum, guestbooks and online shops)

      I haven't used it before - just read about it - but here is a quote from the website:
      " A PHP script has a maximum execution time that is usually set to 30 seconds on most server installations. A script running longer than this limit will simply stop working. This behavior makes backing up large databases impossible. Maybe you already had this specific problem when using other tools.
      MySQLDumper fills a gap ...

      MySQLDumper uses a proprietary technique to avoid this problem. It only reads and saves a certain amount of data, then calls itself recursively via JavaScript and remembers how far in the backup process it was. The script then resumes backing up from that point. "
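    The chunk-and-resume idea the quote describes might look something like this in outline. This is an illustrative sketch of the general technique, not MySQLDumper's actual code; the script name, parameter, and backupSlice() helper are made up.

```php
<?php
// Given the current offset, compute the next offset and whether
// the backup is finished after this slice.
function nextState(int $offset, int $chunk, int $total): array {
    $next = $offset + $chunk;
    return [$next, $next >= $total];
}

// Request-handler sketch: each request backs up one slice, then
// reloads itself so the next slice runs in a fresh request with a
// fresh execution-time budget.
// $offset = (int)($_GET['offset'] ?? 0);
// backupSlice($offset, 1000);                       // hypothetical helper
// [$next, $done] = nextState($offset, 1000, $totalRows);
// if (!$done) {
//     echo "<script>location='backup.php?offset=$next';</script>";
// }
```

    Because progress is carried in the URL (or a session/file), no single request ever runs long enough to hit the limit.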
  • rahul222, try adding this after set_time_limit(0);

    ignore_user_abort(true);
  • Add set_time_limit(0); at the beginning of the script (works for me).
    If that doesn't work, it means your host doesn't allow you to run the script that long... you'll have to use the shell for large SQL files.
  • Add set_time_limit(0);. Even then you may get a maximum of around 300 seconds - that's the limit in the default PHP and Apache settings. If you don't have access to php.ini and the Apache configuration, you're stuck with this.
    RJP is giving you correct advice: you need to split your data into smaller parts if it's too big, and that's it.
  • You can use Ajax to resubmit the form once every 30 seconds.
  • To backup a DB you might try this:

    Code:
    exec("/usr/bin/mysqldump --opt --host=DBHOST --user=DBUSER --password=PASSWORD DBNAME > backup_" . date("Y-m-d_H-i-s") . ".sql");
    DBHOST, DBUSER, PASSWORD, and DBNAME are what you need to change to your own info... :p
    Turn off PHP safe mode.
    The filename will be something like backup_2009-02-01_12-00-00.sql.
    Just paste it in between the <?php ?> tags.

    I use a cron command to run it once a day.
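    A variant of the exec() call above with the shell arguments escaped, in case the credentials contain special characters. The host, user, and database names are placeholders, as in the original.

```php
<?php
// Build a mysqldump command with every user-supplied value passed
// through escapeshellarg(), so quotes/spaces in credentials are safe.
function dumpCommand(string $host, string $user, string $pass,
                     string $db, string $file): string {
    return "/usr/bin/mysqldump --opt"
        . " --host=" . escapeshellarg($host)
        . " --user=" . escapeshellarg($user)
        . " --password=" . escapeshellarg($pass)
        . " " . escapeshellarg($db)
        . " > " . escapeshellarg($file);
}

// Usage sketch:
// exec(dumpCommand('DBHOST', 'DBUSER', 'PASSWORD', 'DBNAME',
//                  'backup_' . date('Y-m-d_H-i-s') . '.sql'));
```
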