Andrew · 4 years ago
JSON Question

What is better to use when parsing dynamic JSON data: JToken or the C# built-in dynamic type?

I could not find any proper answer while searching the internet...

I have JSON data received from a server. Its format can vary each time, so I have to use dynamic objects. Currently in our project we are using the JToken type from the Newtonsoft.Json library, but if we look at its internal structure... I'm afraid it involves a lot of boxing/unboxing, which is not good: we should use as little memory as possible because it's a mobile application.

So the question is: is it better to use the dynamic type from C# itself (yes, despite its boxing/unboxing too), or is there no difference between them? How do they compare in memory usage? Has anyone run benchmark tests on them?

Maybe there is some good alternative to both of them?

Thanks in advance

Answer Source

It seems that no one has done anything related to my question, so I ran my own tests, and now I want to share the results. I hope they will be really helpful.

First, I want to say that I was really shocked and surprised by some of the results.

First things first and my test machine configuration:

  • HP ProBook 450 G3 laptop with an Intel Core i5-6200 CPU at 2.40 GHz
  • 8 GB of RAM
  • 256 GB Patriot Ignite M.2 SSD, up to 560 MB/s read & 320 MB/s write
  • 1000 GB Seagate SATA HDD
  • Windows 10 Pro x64

The tests are pretty simple. There is a JSON string stored in a 4.54 kB file. I ran 100 tests; each test creates 1000 objects of the specified type (dynamic/JToken/JContainer). During each test I measure the actual amount of memory allocated by these 1000 objects and the time needed to parse the JSON string into the specified object type. After all tests are completed I compute the averages by the simple formula S = sum of all per-test counters, n = number of tests (100), a = S / n. Memory usage is measured via Process and GC.
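A minimal sketch of such a measurement loop (this is not the author's exact code; the file name "sample.json" is an assumption, memory is sampled with GC.GetTotalMemory and timing with Stopwatch):

```csharp
using System;
using System.Diagnostics;
using System.IO;
using Newtonsoft.Json.Linq;

class Benchmark
{
    const int Tests = 100;
    const int ObjectsPerTest = 1000;

    static void Main()
    {
        string json = File.ReadAllText("sample.json"); // ~4.54 kB payload
        long totalBytes = 0, totalMs = 0;

        for (int t = 0; t < Tests; t++)
        {
            long before = GC.GetTotalMemory(forceFullCollection: true);
            var sw = Stopwatch.StartNew();

            var objects = new JToken[ObjectsPerTest];
            for (int i = 0; i < ObjectsPerTest; i++)
                objects[i] = JToken.Parse(json);

            sw.Stop();
            long after = GC.GetTotalMemory(forceFullCollection: false);

            totalBytes += after - before;
            totalMs += sw.ElapsedMilliseconds;
            GC.KeepAlive(objects); // keep the 1000 objects alive until measured
        }

        // Simple average: a = S / n
        Console.WriteLine($"Avg. Memory Usage: {totalBytes / Tests / 1024.0 / 1024.0:F1} MB");
        Console.WriteLine($"Avg. Parsing Time: {totalMs / Tests} ms");
    }
}
```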

Now the most interesting part (IMO): the results of my benchmarking tests :)

First I ran tests using the Newtonsoft.Json built-in methods JToken.Parse() and JContainer.Parse(). Both test sets produced the same results, as expected (because JContainer derives from JToken):

JToken/JContainer

Avg. Memory Usage: 25.7 MB

Avg. Parsing Time: 223 ms

Pretty simple. Next I ran tests using the JsonConvert.DeserializeObject<T>(string json) method, deserializing my JSON data into the same JToken/JContainer type, and I was a little surprised:

JsonConvert.DeserializeObject<T>(string json) to JToken/JContainer

Avg. Memory Usage: 22.0 MB

Avg. Parsing Time: 223 ms

So, for me personally, it was a small surprise that the Parse() method for JToken/JContainer produces objects that allocate more memory than the same deserialization done with JsonConvert.DeserializeObject<T>(). Really unexpected.
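For clarity, the two code paths being compared look like this (a sketch; the JSON literal is just a stand-in for the real payload):

```csharp
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

class ParseVsDeserialize
{
    static void Main()
    {
        string json = "{\"name\":\"Andrew\",\"items\":[1,2,3]}";

        // Path 1: the static Parse() method on JToken
        JToken parsed = JToken.Parse(json);

        // Path 2: the generic deserializer targeting the same type
        JToken deserialized = JsonConvert.DeserializeObject<JToken>(json);

        // Both paths yield an equivalent token tree
        System.Console.WriteLine(JToken.DeepEquals(parsed, deserialized)); // True
    }
}
```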

The last test set for this part uses the dynamic type with the same JsonConvert.DeserializeObject<T>(string json) method:

JsonConvert.DeserializeObject<T>(string json) to dynamic

Avg. Memory Usage: 22.1 MB

Avg. Parsing Time: 224 ms

To be honest I was expecting different results, but it turns out that deserializing to dynamic just gives you a JToken under the hood, which is why both use the same amount of memory and take the same parsing time.
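This matches how Json.NET behaves: asking for dynamic hands back a LINQ-to-JSON container (a JObject for a JSON object), so the dynamic path is JToken underneath. A quick check (sketch, with a made-up payload):

```csharp
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

class DynamicCheck
{
    static void Main()
    {
        dynamic d = JsonConvert.DeserializeObject<dynamic>("{\"id\": 42}");

        // The runtime type is a LINQ-to-JSON container, not some separate dynamic object
        System.Console.WriteLine(d is JObject); // True
        System.Console.WriteLine((int)d.id);    // 42
    }
}
```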

Everything could have ended there, but there is always a "but". The JSON data I used for these tests was small (of course, we always try to send as little data over the network as possible). The real object I have to work with is much larger, so to really satisfy myself I ran another set of tests with real data: an object of 270 kB (compared to 4.54 kB in the first sets). The results really shocked me. The JToken/JContainer Parse() method failed! I got an OutOfMemoryException on my 8 GB RAM machine! With Task Manager open next to me, I watched my test program's memory climb past 2 GB! I thought that was the end of my tests, but I'm an ambitious guy, so I continued. Running the tests with JsonConvert.DeserializeObject<T>() was more successful:

270 kB JsonConvert.DeserializeObject<T>()

Avg. Memory Usage: 1.63 GB

Avg. Parsing Time: 20 s

Yes, it's true: 1.63 GB and 20 seconds to parse 1000 objects into the JToken/JContainer type. Working with dynamic gave the same results, so I won't repeat them.

These tests would not be complete without trying typed object parsing as well. So I created a class that describes my JSON structure and used the JsonConvert.DeserializeObject<T>() method:

JsonConvert.DeserializeObject<T>() parsing to a POCO object

Avg. Memory Usage: 72 MB

Avg. Parsing Time: 7.5 s

72 MB for 1000 POCO objects, compared to 1.63 GB for dynamic data.
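The typed version looks like this (the Order class below is a made-up illustration, not the real 270 kB schema):

```csharp
using Newtonsoft.Json;

// Hypothetical POCO standing in for the real JSON structure
class Order
{
    public int Id { get; set; }
    public string Customer { get; set; }
    public double[] Amounts { get; set; }
}

class PocoDemo
{
    static void Main()
    {
        string json = "{\"Id\":1,\"Customer\":\"ACME\",\"Amounts\":[9.99,5.0]}";

        // Deserialize straight into the typed object: no JToken tree is kept around
        Order order = JsonConvert.DeserializeObject<Order>(json);
        System.Console.WriteLine(order.Customer); // ACME
    }
}
```

Because the deserializer writes directly into typed fields, no intermediate token tree of boxed JValues has to stay alive, which is where the memory savings come from.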

PS. Instead of a conclusion... Of course, we always try to work with typed objects in our code, but sometimes we have to deal with dynamics. Now you can see how efficient typed objects are in C#. Moreover, I suggest you never use JToken.Parse(); save yourself a little memory. And IMO, OOP was introduced to really help developers and let us sleep well at night. It's working! :)

PPS. Anyone interested in seeing my test solution is welcome to check it out on GitHub :)
