DanielG - 1 year ago
C# Question

Json.Net deserialize out of memory issue

I have a JSON object which contains, among other fields, a data field that stores a base64-encoded string.
This JSON is serialized and sent to a client.

On the client side, the Newtonsoft Json.NET deserializer is used to read the JSON back.
However, if the data field becomes large (~400 MB), the deserializer throws an out-of-memory exception: "Array dimensions exceeded supported range."
I can also see in Task Manager that memory consumption grows very quickly.

Any ideas why this is? Is there a maximum size for json fields or something?

Code example (simplified):

HttpResponseMessage responseTemp = client.PostAsJsonAsync(client.BaseAddress, message).Result;

string jsonContent = responseTemp.Content.ReadAsStringAsync().Result;
Result result = JsonConvert.DeserializeObject<Result>(jsonContent);

Result class:

public class Result
{
    public string Message { get; set; }
    public byte[] Data { get; set; }
}



I think my problem is not the serializer itself, but simply trying to handle such a huge string in memory.
At the point where I read the string into memory, the application's memory consumption explodes, and every operation on that string does the same. At the moment, I think I have to find a way to work with streams and stop reading the whole content into memory at once.
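The streaming direction hinted at above could be sketched roughly like this with Json.NET: read the response as a stream and let the serializer consume it directly, so the full JSON text is never materialized as one string. This is a hedged sketch, not a tested solution; `client`, `message`, and `Result` are the same objects as in the code above, and note that `HttpClient` may still buffer the response internally unless told otherwise (e.g. via `HttpCompletionOption.ResponseHeadersRead`).

```csharp
using (HttpResponseMessage response =
           client.PostAsJsonAsync(client.BaseAddress, message).Result)
using (Stream stream = response.Content.ReadAsStreamAsync().Result)
using (var streamReader = new StreamReader(stream))
using (var jsonReader = new JsonTextReader(streamReader))
{
    // Deserialize straight from the stream; the whole JSON document
    // is never held in memory as a single string.
    var serializer = new JsonSerializer();
    Result result = serializer.Deserialize<Result>(jsonReader);
}
```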

dbc
Answer Source

I assume you are using 64 bit. If not, switch.

Having done so, if you are using .NET 4.5 or later, enable gcAllowVeryLargeObjects. It allows arrays with up to int.MaxValue entries even when that would cause the underlying memory buffer to be larger than 2 GB. You will still be unable to read a single JSON token longer than 2^31 characters, however, since JsonTextReader buffers the full contents of each single token in a private `char[] _chars` array, and in .NET an array can hold at most int.MaxValue items.
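For reference, gcAllowVeryLargeObjects is a runtime setting enabled in the application's configuration file (standard .NET 4.5+ element; it only has an effect in 64-bit processes):

```xml
<configuration>
  <runtime>
    <!-- Allow arrays whose total size exceeds 2 GB (64-bit only) -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```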
