dan martin - 3 months ago
Python Question

Python - write x rows of csv file to json file

I have a csv file which I need to write out to json files in chunks of 1,000 rows. The csv file has around 9,000 rows, so ideally I'd end up with 9 separate json files of consecutive data.

I know how to write a csv file to json - what I've been doing:

import csv
import json

csvfile = open("C:\\Users\\Me\\Desktop\\data\\data.csv", 'r', encoding="utf8")

reader = csv.DictReader(csvfile, delimiter=",")
out = json.dumps([row for row in reader])

with open("C:\\Users\\Me\\Desktop\\data\\data.json", 'w', encoding="utf8") as f:
    f.write(out)


which works great. But I need the output split across 9 separate json files. I'm assuming I would either:

1) count rows as I go and start a new file every time the count reaches 1,000, or

2) write the csv file to a single json file, then open the json and attempt to split it somehow.

I'm pretty lost on how to accomplish this - any help appreciated!

Answer

This reads data.csv once and creates separate json files named data_1.json through data_9.json, since there are 9,000 rows.

Also, as long as the number of rows in data.csv is a multiple of 1,000, it will create number_of_rows/1000 files without any change to the code.

import csv
import json

csvfile = open("C:\\Users\\Me\\Desktop\\data\\data.csv", 'r', encoding="utf8")

reader = csv.DictReader(csvfile, delimiter=",")

r = []          # rows collected for the current chunk
counter = 0     # number of rows in the current chunk
fileid = 1      # suffix for the next output file

for row in reader:
    r.append(row)
    counter += 1
    if counter == 1000:
        out = json.dumps(r)
        fname = "C:\\Users\\Me\\Desktop\\data\\data_" + str(fileid) + ".json"
        with open(fname, 'w', encoding="utf8") as f:
            f.write(out)

        # resetting & updating variables
        fileid += 1
        counter = 0
        r = []
        out = None

csvfile.close()
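
If the row count isn't an exact multiple of 1,000, the loop above never writes the final partial chunk. A minimal sketch of a trailing flush, assuming the same r, fileid, and path pattern as in the loop:

# write any leftover rows (fewer than 1,000) to one last file
if r:
    fname = "C:\\Users\\Me\\Desktop\\data\\data_" + str(fileid) + ".json"
    with open(fname, 'w', encoding="utf8") as f:
        f.write(json.dumps(r))

With that in place the code handles any number of rows, e.g. 9,250 rows would produce data_1.json through data_9.json with 1,000 rows each plus data_10.json with the remaining 250.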