I have a big information table and I want to export it in JSON format for another purpose. I could export it in XML format using the command below:
mysql -u root -p --xml -e "SELECT * FROM db_name.tbl_name" > d:\export.xml
But the same command with --json fails:
mysql -u root -p --json -e "SELECT * FROM db_name.tbl_name" > d:\export.json
unknown option '--json'
The mysql client cannot output JSON directly, so you have two options:
1) export in XML, then use a tool to convert the XML to JSON (a tool that can handle large tables, of course)
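A minimal sketch of option (1) in Python, assuming the XML layout that the mysql client's --xml mode produces (a resultset element containing row elements, each with field children carrying a name attribute); iterparse streams the file so a large export does not have to fit in memory:

```python
import json
import xml.etree.ElementTree as ET

def xml_to_json(xml_path, json_path):
    """Stream-convert a `mysql --xml` dump into a JSON array of objects."""
    with open(json_path, "w", encoding="utf-8") as out:
        out.write("[")
        first = True
        # iterparse yields elements as they are closed, one row at a time
        for event, elem in ET.iterparse(xml_path, events=("end",)):
            tag = elem.tag.split("}")[-1]  # drop any XML namespace prefix
            if tag == "row":
                record = {
                    f.get("name"): f.text
                    for f in elem
                    if f.tag.split("}")[-1] == "field"
                }
                if not first:
                    out.write(",")
                out.write(json.dumps(record))
                first = False
                elem.clear()  # release memory used by the processed row
        out.write("]")
```

Note that all values come out as strings (or None for NULL fields), since the XML dump carries no type information.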
2) write a small script (for example in PHP) that fetches the data from the DB and writes it to a file in JSON
If you choose option (2) and the table holds a lot of data, you may have trouble loading the whole table, converting it to JSON, and saving it to a file in a single "atomic" step (try giving PHP a lot of memory in the php.ini configuration first).
However, you can break the task into steps. A table serialized as JSON is basically an array of objects, each object representing a single record:
1. Open a connection to the DB.
2. Start writing the output file and open the array by writing an opening square bracket [.
3. Execute your query, fetching n (1 < n < 1000) records at a time. To do that, sort the table by some field (e.g. id) and use the LIMIT clause with an increasing offset.
4. Convert each record with json_encode, write the string to the file, then write a comma, unless you have just written the last record.
5. Go back to 3 until you reach the last record.
6. Write a closing square bracket ] to the file (closing the array).
7. Close the file and the DB connection.
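The steps above can be sketched in Python (rather than PHP); here fetch_batch is a hypothetical stand-in for a real query such as SELECT * FROM tbl_name ORDER BY id LIMIT n OFFSET m, so the streaming logic can be shown without a live database:

```python
import json

def dump_table_to_json(fetch_batch, out_path, batch_size=500):
    """Write all rows returned by fetch_batch(limit, offset) as one JSON array.

    fetch_batch must return a list of dict rows, and an empty list
    once the offset has moved past the last record.
    """
    with open(out_path, "w", encoding="utf-8") as out:
        out.write("[")                              # step 2: open the array
        offset = 0
        first = True
        while True:
            rows = fetch_batch(batch_size, offset)  # step 3: n records at a time
            if not rows:
                break                               # step 5: past the last record
            for row in rows:
                if not first:
                    out.write(",")                  # comma between records, never after the last
                out.write(json.dumps(row))          # step 4: json.dumps ~ PHP's json_encode
                first = False
            offset += len(rows)
        out.write("]")                              # step 6: close the array
```

With a real database you would implement fetch_batch on top of your driver's cursor (e.g. mysql-connector or PDO in PHP), keeping everything else the same; only one batch of rows is ever held in memory.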
This requires a little more coding, but it's not rocket science.
...and maybe you can find something online that already does this.
A script that fetches the data from the DB and writes it to a file in JSON can be found here on GitHub: Dump-table-to-JSON