I have a PHP website. Since I'm using a template engine and always generate the HTML in one shot, I know the size of the HTML document up front, so I decided to set the Content-Length header for better performance. If I don't set it, the document is transferred using chunked encoding.
The PHP code for the HTML output looks like this:
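(The original snippet is not preserved in this copy of the question; the following is only a hedged sketch of what such output code typically looks like — the $content string and its value are assumptions, not the asker's actual code:)

```php
<?php
// Hypothetical sketch: assume the template engine has already rendered
// the entire page into one string.
$content = '<!DOCTYPE html><html><body>Hello</body></html>';

header('Content-Type: text/html; charset=UTF-8');
// strlen() counts bytes in PHP, which is what Content-Length needs --
// but note this is the *uncompressed* size (see the answer below).
header('Content-Length: ' . strlen($content));
echo $content;
```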
wget dev.site.com/ --server-response -O /dev/null
--2013-11-09 01:32:37-- http://dev.site.com/
Resolving dev.site.com... 127.0.0.1
Connecting to dev.site.com|127.0.0.1|:80... connected.
HTTP request sent, awaiting response...
HTTP/1.1 200 OK
Date: Fri, 08 Nov 2013 23:32:37 GMT
Set-Cookie: lng=en; expires=Wed, 07-May-2014 23:32:37 GMT; path=/; domain=dev.site.com
Last-Modified: Fri, 08 Nov 2013 23:32:37 GMT
Cache-Control: must-revalidate, post-check=0, pre-check=0
Set-Cookie: PHPSESSID=8a1e9b871474b882e1eef4ca0dfea0fc; expires=Thu, 06-Feb-2014 23:32:37 GMT; path=/
Set-Cookie: hc=1518952; expires=Mon, 17-Nov-2036 00:38:00 GMT; path=/; domain=dev.site.com
Keep-Alive: timeout=15, max=100
Content-Type: text/html; charset=UTF-8
Length: 16970 (17K) [text/html]
Saving to: “/dev/null”
100%[===================================================================================================================================================================================================>] 16,970 --.-K/s in 0.1s
2013-11-09 01:32:37 (152 KB/s) - “/dev/null” saved [16970/16970]
The answer is already there.
Content-Length has to be the size of what is actually sent over the wire, which is the size after $content is compressed. The size you see in view-source is naturally the decompressed size.
The connection does not stall, either: your browser is waiting for more data, but the compressed response is smaller than the Content-Length you announced. If the server eventually times out the connection, the browser assumes it has received all the data and renders the page. It works with wget and similar tools because they don't send an Accept-Encoding header, so the server sends the response uncompressed.
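The size mismatch is easy to see locally. This standalone sketch (a fake repetitive "page", not your server's actual numbers) compares the raw byte count against the gzipped byte count:

```shell
# Build a repetitive fake "page" and compare raw vs gzipped byte counts.
page=$(printf 'hello world %.0s' $(seq 1 500))
raw=$(printf '%s' "$page" | wc -c)
gz=$(printf '%s' "$page" | gzip -c | wc -c)
echo "raw=$raw bytes, gzipped=$gz bytes"
```

A browser that was promised the raw size but receives the gzipped byte stream will sit waiting for the missing difference.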
If you must, you could disable the automatic compression, compress $content manually, and send it with the appropriate Content-Encoding and Content-Length headers.
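A minimal sketch of that manual approach (assuming PHP's automatic output compression is off and reusing an assumed $content string; a real handler would also check the client's Accept-Encoding header first):

```php
<?php
// Sketch of manual compression. Assumes zlib.output_compression and
// ob_gzhandler are disabled, so nothing else re-compresses the output.
$content = '<!DOCTYPE html><html><body>Hello</body></html>';

$compressed = gzencode($content, 6); // gzip-compress the page body

header('Content-Type: text/html; charset=UTF-8');
header('Content-Encoding: gzip');
// Content-Length must describe the bytes actually sent: the compressed size.
header('Content-Length: ' . strlen($compressed));
echo $compressed;
```

In practice you would only take this branch when $_SERVER['HTTP_ACCEPT_ENCODING'] advertises gzip, and fall back to the plain body otherwise.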
Another option is to download the compressed page without decompressing it (send an Accept-Encoding: gzip header with wget; I guess wget won't decompress the response, since compression support isn't enabled by default, though it might support it after all, I don't know — and I know cURL doesn't decompress, so you can use it), then take the size of the response minus the headers (which means only the size of the data after the \r\n\r\n header-end sequence) and use that size when sending Content-Length. But of course changing the compression level, or maybe the implementation (different web servers/modules, or different versions of the same web server/modules), will change the size of the resulting compressed data, so this is a very fragile way to do it.
Why are you setting Content-Length yourself anyway? PHP or the web server is supposed to handle that.