8 replies
Hello.
I have a problem with a script that I am building now.

I'm trying to make the application (built entirely in PHP) compress the contents of a folder (via tar on Linux or ZIP on Windows), but when it comes to transferring the file via SCP (I don't use FTP), I can't settle on the best library to do so.
The archive is about 100 - 500 MB in size.

First, there is phpseclib, a great pure-PHP implementation that does a terrific job with the SSH part. However, when it comes to SCP it's very, very slow at transferring the file, hogs the system's CPU, and loads the entire file into memory.

Secondly, there's PHP's ssh2_scp_send, which is truly remarkable: it handles huge files with very low memory and CPU usage and sends at the full speed of the connection. However, it requires libssh2-php to be installed, which isn't an option on shared accounts.

Do you have any recommendations for another library I could use?

I was thinking of first checking for the ssh2_* functions and using them if available, otherwise falling back to phpseclib. But most people will end up on the fallback, which makes the server slow and unresponsive, and with large files it times out or exceeds the memory limit (either the physical limit or, on shared accounts, one enforced by the sysadmins).
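As a rough illustration of that detection idea, a capability check could look something like this (the function name is made up; the actual transfer code is omitted):

```php
<?php
// Hypothetical helper: prefer the native ssh2 extension when it exists,
// otherwise fall back to the pure-PHP phpseclib implementation.
function chooseScpBackend(): string
{
    if (function_exists('ssh2_connect') && function_exists('ssh2_scp_send')) {
        return 'ssh2';      // native libssh2: fast, low memory and CPU
    }
    return 'phpseclib';     // pure PHP: always available, but slower
}
```

The application would then instantiate whichever transfer wrapper corresponds to the chosen backend.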

Thanks!
#library #php #scp
  • Reply from ovnign
    phpseclib doesn't have to load the whole file into a string. You can have it read the file in chunks directly from the file system by doing this:

    $sftp->put('filename.remote', 'filename.local', NET_SFTP_LOCAL_FILE);

    You're probably trying to do this:

    $sftp->put('filename.remote', file_get_contents('filename.local'));

    Same thing for get(). Do this:

    $sftp->get('filename.remote', 'filename.local');
  • Reply from TopicSpan
    Just to second what ovnign said: stay away from reading anything in and out of memory unless it's chunked. file_get_contents() is inherently evil for anything bigger than about 1 MB, unless it runs so sporadically that it doesn't matter. On a system with, say, 2 GB of RAM, most of the available memory can easily be tied up by the OS and other processes (mysql, I'm looking at you). Calling file_get_contents() will still work, but it will fall over into paging, which grinds everything to a halt for large operations.
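To illustrate the chunked alternative, here's a minimal sketch (the helper name is made up) that copies a file 1 MB at a time, so peak memory stays near the chunk size rather than the file size:

```php
<?php
// Minimal sketch of chunked streaming: read and write 1 MB at a time so
// memory usage stays near the chunk size regardless of file size.
function streamCopy(string $src, string $dst, int $chunkSize = 1048576): int
{
    $in    = fopen($src, 'rb');
    $out   = fopen($dst, 'wb');
    $total = 0;
    while (!feof($in)) {
        $chunk = fread($in, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        $total += fwrite($out, $chunk);
    }
    fclose($in);
    fclose($out);
    return $total;  // bytes written
}
```

The same pattern applies whether the destination is a local file or a remote stream.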
  • Reply from narcispap
    Hello and thanks for the response.

    I am using $sftp->put('filename.remote', 'filename.local', NET_SFTP_LOCAL_FILE), as that is the best-performing variant, but having benchmarked the library, it looks like the bottleneck isn't really the memory loading; it's the on-the-fly, native-PHP encryption of the transferred content.

    I've found a temporary workaround: splitting each transferred folder into a separate tar.gz file. That improved the transfer speed with the library, but it increases the application's total processing time by about 15%.
  • Reply from ovnign
    Does your server have mcrypt installed? If not, installing it should speed things up.

    Alternatively... does it have openssl installed? openssl supports symmetric-key crypto as well. phpseclib doesn't currently utilize openssl, but I could create a branch on git to make it do so and then send the author a pull request.
  • Reply from narcispap
    It's a web application, so it has to work with whatever the user's server provides.
    I'd prefer the native PECL SSH2 library, but that requires libssh2-php to be installed on the server.

    I'm trying to find the best possible fallback for when there's no native SSH2 support, which has so far proven to be quite a challenge.

    I'm also targeting shared hosting accounts, so I can't use too much memory or CPU without risking the client getting temporarily suspended by the hosting company.

    Thanks!
  • Reply from jaasmit
    You seem to be an expert programmer. There's nothing I can add in this section; I can only read and learn from what you all say and do.
  • Reply from ovnign
    I'm also targeting shared hosting accounts, so I can't use too much memory or CPU without risking the client getting temporarily suspended by the hosting company.
    That's why I was asking about openssl.

    phpseclib aims to be as fast as possible by using whatever extensions are available. Sometimes truly fast isn't achievable in pure PHP, but it will try nonetheless by taking advantage of extensions whenever they're present.

    If, in your testing, some people have openssl but not mcrypt, openssl support would be a way to make phpseclib even faster. If they don't have openssl installed either, I'm not sure what more could be done, but openssl is one idea that could speed things up.

    Something else you could do...

    If mcrypt isn't installed, use AJAX to upload the file in chunks: upload the first 2 MB and kill the SSH connection, then send an XmlHttpRequest to start it back up for the next 2 MB. Rinse and repeat until you're done.
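A rough sketch of the receiving side of that idea (names hypothetical; the AJAX and SSH plumbing is omitted): each request delivers one slice plus its byte offset, and the server appends it in place:

```php
<?php
// Hypothetical receiver for chunked uploads: write one slice of the file
// at the offset the client reports, without truncating what's already there.
function appendChunk(string $target, string $chunk, int $offset): void
{
    $fp = fopen($target, 'cb');  // 'c' = create if missing, never truncate
    fseek($fp, $offset);         // position at this slice's offset
    fwrite($fp, $chunk);
    fclose($fp);
}
```

Because each slice carries its own offset, out-of-order or retried requests still land in the right place.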
  • Reply from narcispap
    ovnign, thanks for your help. In the end, I built a system similar to the one you described.

    I used two parallel AJAX requests, started immediately after splitting the file in two. Each request transfers its half of the complete archive in chunks of 20% of that half's size.

    The receiving server is part of my own network farm, so I was able to create the archive, preprocess it, and save it as required.
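For what it's worth, the split step described above can be sketched like this (a hypothetical helper; it streams each half to disk rather than loading it into memory):

```php
<?php
// Hypothetical sketch of the split step: cut an archive into two halves
// so each half can be sent by its own parallel request.
function splitInTwo(string $src): array
{
    $half  = (int) ceil(filesize($src) / 2);
    $in    = fopen($src, 'rb');
    $parts = [];
    foreach ([0, 1] as $i) {
        $part = $src . '.part' . $i;
        $out  = fopen($part, 'wb');
        // Copies at most $half bytes, streaming internally in small chunks
        stream_copy_to_stream($in, $out, $half);
        fclose($out);
        $parts[] = $part;
    }
    fclose($in);
    return $parts;
}
```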

    Thanks!
