I could not find any proper answer while searching the internet...
So it seems that no one has done anything related to my question, so I ran my own tests, and now I want to share my results. I hope they will be helpful.
First of all, I want to say that I was really shocked and surprised by some of the results.
First things first, my test machine configuration:
The tests are pretty simple: there is a JSON string, 4.54 kB in size, stored in a file. I run 100 tests, and each test creates 1,000 objects of the specified type (dynamic/JToken/JContainer). During each test I measure the actual amount of memory these 1,000 objects allocate and the time needed to parse the JSON string into the specified object type. After all tests are completed, I compute the average memory usage and the average time for each test set. The averages are computed by the simple formula
S = c(1) + c(2) + ... + c(n), a = S/n, where
S - the sum of all counters,
n - the number of tests (100),
a - the average counter. Memory usage is counted by
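The measurement loop described above can be sketched roughly as follows. This is only an illustration of the methodology, not my exact harness: the `parseOne` delegate is a placeholder for whichever parsing call a given test set uses, and I assume here that memory is sampled with the standard .NET `GC.GetTotalMemory` and time with `Stopwatch`:

```csharp
using System;
using System.Diagnostics;

static class Benchmark
{
    // Runs `runs` iterations, each parsing `count` objects with the supplied
    // parser, and returns the average allocated bytes and elapsed milliseconds.
    public static (double avgBytes, double avgMs) Run(
        Func<object> parseOne, int runs = 100, int count = 1000)
    {
        double totalBytes = 0, totalMs = 0;
        for (int i = 0; i < runs; i++)
        {
            var results = new object[count];          // keep parsed objects alive
            long before = GC.GetTotalMemory(forceFullCollection: true);
            var sw = Stopwatch.StartNew();
            for (int j = 0; j < count; j++)
                results[j] = parseOne();
            sw.Stop();
            long after = GC.GetTotalMemory(forceFullCollection: false);
            totalBytes += after - before;             // S = c(1) + ... + c(n)
            totalMs += sw.Elapsed.TotalMilliseconds;
            GC.KeepAlive(results);
        }
        return (totalBytes / runs, totalMs / runs);   // a = S / n
    }
}
```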
Now the most interesting part (IMO) - the results of my benchmarking tests :)
First I ran tests using Newtonsoft.Json's built-in
JToken.Parse() method, parsing into both JToken and JContainer. Both of these sets produced the same results, as expected (because
JContainer is derived from JToken):
Avg. Memory Usage: 25.7 MB
Avg. Parsing Time: 223 ms
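For reference, this first test set parses the string roughly like this (a sketch; `json` stands for the 4.54 kB test string):

```csharp
using Newtonsoft.Json.Linq;

// Parse() builds the full LINQ-to-JSON tree in one call.
JToken token = JToken.Parse(json);

// JContainer derives from JToken, so the same call covers both test sets.
JContainer container = (JContainer)JToken.Parse(json);
```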
Pretty simple. Next I ran tests using the
JsonConvert.DeserializeObject<T>(string json) method, deserializing my JSON data into the same
JToken/JContainer types, and I was a little surprised:
JsonConvert.DeserializeObject(string json) to JToken/JContainer
Avg. Memory Usage: 22.0 MB
Avg. Parsing Time: 223 ms
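This second set goes through the serializer pipeline instead of the direct tree builder; a minimal sketch, again assuming `json` holds the test string:

```csharp
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// Deserializing with an explicit JToken type parameter routes the input
// through JsonSerializer rather than JToken.Parse().
JToken token = JsonConvert.DeserializeObject<JToken>(json);
```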
So, for me personally, it was a small surprise that the
Parse() method for
JToken/JContainer produces objects that allocate more memory than deserializing the same data with the
JsonConvert.DeserializeObject<T>() method. Really unexpected.
The last set of tests in this part uses the
dynamic type with the same
JsonConvert.DeserializeObject<T>(string json) method:
JsonConvert.DeserializeObject(string json) to dynamic
Avg. Memory Usage: 22.1 MB
Avg. Parsing Time: 224 ms
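The dynamic test set looks roughly like this. Note that the member name `data.name` below is made up purely for illustration; the real property names depend on the JSON:

```csharp
using Newtonsoft.Json;

// Without a type argument the deserializer returns a JToken, which `dynamic`
// then accesses through runtime binding.
dynamic data = JsonConvert.DeserializeObject(json);

// Property access is resolved at runtime, e.g. (hypothetical field):
// string name = data.name;
```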
To be honest I was expecting different results, but it seems that deserializing to
dynamic actually produces a
JToken under the hood, which is why they use the same amount of memory and take the same parsing time.
All this could be good enough, but there is always a "but". The JSON data I used for my tests was small (of course, we always try to send as little data through the network as possible). The real object I have to work with is larger, so to be really confident about my real data I ran another set of tests, this time with the real object of 270 kB (compared to 4.54 kB in the first sets). The results were really shocking to me.
The Parse() method failed! I got an OutOfMemoryException on my 8 GB RAM machine! With Task Manager open beside me, I watched the memory of my test program climb past 2 GB! I thought that was the end of my tests, but I'm an ambitious guy, so I continued. Running the tests with
JsonConvert.DeserializeObject<T>() brought more success:
Avg. Memory Usage: 1.63 GB
Avg. Parsing Time: 20 s
Yes, it's true: 1.63 GB and 20 seconds to parse 1,000 objects into the
JToken/JContainer type. Working with
dynamic gave the same results, so I won't copy them again.
These tests would not be complete if I had not also tried parsing my JSON into a typed object. So I created a class describing my JSON structure and used it:
JsonConvert.DeserializeObject<T>() parsing to a POCO object
Avg. Memory Usage: 72 MB
Avg. Parsing Time: 7.5 s
72 MB for 1,000 POCO objects, compared to 1.63 GB for dynamic data.
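The POCO test set boils down to a typed deserialization call. The `Order` class below and its properties are invented for illustration; the real class mirrors my actual JSON structure:

```csharp
using Newtonsoft.Json;

// Hypothetical POCO matching the JSON shape; replace with your own fields.
public class Order
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The serializer writes directly into strongly typed properties,
// avoiding the overhead of a full JToken tree per object.
Order order = JsonConvert.DeserializeObject<Order>(json);
```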
PS. Instead of a conclusion... Of course we always try to use typed objects to work with data in our code, but sometimes we have to deal with dynamics. Now you can see how efficient typed objects are in C#. Moreover, I suggest you avoid
JToken.Parse() and save yourself a little memory. And IMO, OOP was introduced to really help developers and let us sleep well at night. It works! :)
PPS. Anyone interested in seeing my test solution is welcome to check it out on GitHub :)