brooksaar - 27 days ago
C# Question

C# WebResponse Stream losing bytes?

So I have a function like so:

private String SendRequest(String jsonRequest)
{
    WebRequest webRequest = WebRequest.Create(_url);
    byte[] paramBytes = Encoding.UTF8.GetBytes(jsonRequest);
    byte[] responseBytes;

    webRequest.Method = "POST";
    webRequest.ContentType = "application/json";
    webRequest.ContentLength = paramBytes.Length;
    webRequest.Headers.Add("X-Transmission-Session-Id", _sessionId);

    using (Stream oStream = webRequest.GetRequestStream())
    {
        oStream.Write(paramBytes, 0, paramBytes.Length);
    }

    WebResponse webResponse = webRequest.GetResponse();

    using (Stream iStream = webResponse.GetResponseStream())
    {
        responseBytes = new byte[webResponse.ContentLength];
        iStream.Read(responseBytes, 0, (int) webResponse.ContentLength);
    }

    return Encoding.UTF8.GetString(responseBytes);
}

The problem is, at the iStream.Read() stage, some of the bytes are lost. Wireshark shows that all the bytes arrive at this machine, but .NET is losing them somewhere along the way. In my current debugging session, for example, webResponse.ContentLength = 4746, and byte[3949] through byte[4745] are all 0's when they should be populated. As a result, the UTF8 JSON string cuts off early and I can't deserialise my JSON.

I thought the code was pretty clear-cut; I can't see where it's going wrong and losing those bytes.

Thanks for any help!


When reading from the stream you can get fewer bytes than you requested. This is documented behaviour of Stream.Read:

The total number of bytes read into the buffer. This can be less than the number of bytes requested if that many bytes are not currently available, or zero (0) if the end of the stream has been reached.

MSDN example for WebResponse.GetResponseStream():
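The usual fix is to call Read in a loop, advancing an offset into the buffer until it is full or the stream ends. A minimal sketch of that pattern (the helper name ReadAllBytes and the MemoryStream demo are mine, not from the MSDN page):

```csharp
using System;
using System.IO;
using System.Text;

class ReadLoopDemo
{
    // Read up to 'count' bytes, looping because a single Stream.Read
    // call may return fewer bytes than requested.
    static byte[] ReadAllBytes(Stream stream, int count)
    {
        byte[] buffer = new byte[count];
        int total = 0;
        while (total < count)
        {
            int read = stream.Read(buffer, total, count - total);
            if (read == 0)
                break; // end of stream reached before the buffer was full
            total += read;
        }
        return buffer;
    }

    static void Main()
    {
        // Simulate a response stream with a MemoryStream.
        byte[] data = Encoding.UTF8.GetBytes("{\"result\":\"success\"}");
        using (var ms = new MemoryStream(data))
        {
            byte[] result = ReadAllBytes(ms, data.Length);
            Console.WriteLine(Encoding.UTF8.GetString(result));
        }
    }
}
```

Alternatively, if you just want the body as a string, wrapping the response stream in a StreamReader and calling ReadToEnd() avoids the manual loop entirely.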