I have a large amount of data to move between two PHP scripts: one on the client side, run from the command line, and the other behind Apache. I POST the data to the server side and use the php://input stream to save it on the web-server end. To avoid hitting memory limits, the data is split into 500 kB chunks, one per POST request. All this works fine.
Now, to save bandwidth and speed things up, I want to compress the data before sending and decompress it when received on the other end. I found 3 pairs of functions that can do the job, but I cannot decide which one to use:
All of these can be used. There are subtle differences between the three:
gzencode() produces the GZIP file format, the same one written by the gzip command-line tool. This format has a header containing optional metadata, DEFLATE-compressed data, and a footer containing a CRC32 checksum and a length check.
All three use the same DEFLATE compression algorithm under the hood.
gzencode() adds the ability to include the original file name and other environmental data (this is unused when just compressing a string).
gzencode() and gzcompress() both add a checksum, so the integrity of the archive can be verified, which is useful over unreliable transmission and storage methods. If everything is stored locally and you don't need any additional metadata, then gzdeflate() would suffice. For portability I'd recommend gzencode() (GZIP format), which is probably better supported than gzcompress() (ZLIB format) among other tools.
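To illustrate, here is a minimal round-trip sketch for your chunked-POST setup. Each compression function has its own decompression counterpart: gzencode()/gzdecode(), gzcompress()/gzuncompress(), and gzdeflate()/gzinflate(). The sample data and compression level 9 are arbitrary choices for the example.

```php
<?php
// A chunk of data such as one 500 kB slice you would POST.
$chunk = str_repeat("some data to move ", 1000);

// GZIP format: recommended for portability; carries a CRC32 checksum.
$compressed = gzencode($chunk, 9);   // 9 = maximum compression level
$restored   = gzdecode($compressed); // on the server, after reading php://input
assert($restored === $chunk);

// ZLIB format: also checksummed (ADLER32), slightly smaller header.
assert(gzuncompress(gzcompress($chunk, 9)) === $chunk);

// Raw DEFLATE: no header or checksum, smallest overhead.
assert(gzinflate(gzdeflate($chunk, 9)) === $chunk);

echo strlen($chunk) . " bytes -> " . strlen($compressed) . " bytes compressed\n";
```

The key point is simply to use matching pairs on the two ends; whichever format you pick, the compressed chunk can be sent in the POST body and decompressed after reading php://input.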