When I use the Python Decimal object simply with something like:
from decimal import *
dec = Decimal('3.432')
You get a Decimal('3.432') in your JSON object? Weird... how?
>>> from decimal import *
>>> import json
>>> json.dumps(Decimal('3.432'))
...
TypeError: Decimal('3.432') is not JSON serializable
In any case, if you are using a Decimal instead of a float, you probably don't want to lose precision by converting it to a float. If that's true, then you have to manage the process yourself: first ensure you preserve all the data (for example, by serializing the Decimal as a string), and then deal with the fact that JSON doesn't understand the Decimal type:
>>> j = json.dumps(str(Decimal('3.000')))
>>> j
'"3.000"'
>>> Decimal(json.loads(j))
Decimal('3.000')
Of course, if you don't really care about the precision (e.g. if some library routine gives you a Decimal), just convert to a float first, as JSON can handle that. You still won't get a Decimal back later unless you manually convert again from float, though...
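A minimal sketch of that float round trip, showing both the encoding and what type actually comes back on decoding:

```python
import json
from decimal import Decimal

dec = Decimal('3.432')

# Encode by converting to float first; JSON serializes floats natively.
j = json.dumps(float(dec))
print(j)  # '3.432'

# Decoding gives you a plain float back, not a Decimal.
value = json.loads(j)
print(type(value))  # <class 'float'>

# To get a Decimal again you must convert manually (via str, to avoid
# picking up binary-float noise in the digits).
restored = Decimal(str(value))
```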
Edit: Devin Jeanpierre points out the existing support in the json module for decoding float strings as something other than float, via the parse_float argument to loads() and custom JSONDecoders. While that would require you first to convert your Decimals to floats when encoding (losing precision), it could solve the second half of the problem a little more cleanly than my more manual approach above (note that it would apply to all floats in your JSON data). You should also be able to create a JSONEncoder subclass that writes out the digits of a Decimal without first converting it to a float, which would let you avoid the loss of precision.