Question

How can I increase the speed of loading CSV data into SQL Server?

I am using NiFi 0.6.1 with a combination of GetFile + SplitText + ReplaceText processors to split a 30 MB CSV file (about 300,000 rows).

GetFile passes the 30 MB file to SplitText very quickly.

SplitText + ReplaceText take a long time (25 minutes) to split that data into JSON.

Just 30 MB of data takes 25 minutes to get from CSV into SQL Server. The conversion seems to run byte by byte, which is why it takes so long.

I have tried the Concurrent Tasks option on the processors. It speeds things up a little, but it still takes too long and drives CPU usage to 100%.

How can I get CSV data into SQL Server quickly?

I appreciate any help. Thanks.

Answer

You mention splitting the data into JSON, but you're using SplitText and ReplaceText. What does your incoming data look like? Are you trying to convert to JSON to use ConvertJSONtoSQL?

If you have CSV incoming, and you know the columns, SplitText should pretty quickly split the lines, and ReplaceText can be used to create an INSERT statement for use by PutSQL.
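For example, if the split lines have a known three-column layout, a ReplaceText configuration along these lines could rewrite each line into an INSERT for PutSQL (the table and column names here are placeholders, and the regex assumes no embedded commas or quotes in the fields):

    Replacement Strategy: Regex Replace
    Search Value:         (.+),(.+),(.+)
    Replacement Value:    INSERT INTO mytable (col1, col2, col3) VALUES ('$1', '$2', '$3')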

Alternatively, as @Tomalak mentioned, you could put the CSV file somewhere SQL Server can access it, then use PutSQL to issue a BULK INSERT statement.
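As a rough sketch, the statement sent through PutSQL might look like the following; the table name, file path, and the header-row assumption (FIRSTROW = 2) are placeholders, and the file must be at a path the SQL Server instance itself can read:

    BULK INSERT dbo.mytable
    FROM 'C:\data\input.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);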

If neither of these is sufficient, you could use ExecuteScript to perform the split, column parsing, and translation to SQL statement(s).
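Here is a minimal ExecuteScript sketch in Jython that rewrites the whole CSV content as one multi-row INSERT for PutSQL. The table and column names are placeholders, the CSV parsing is naive (no quoted fields or header handling), and in practice you would batch the rows, since SQL Server caps a single VALUES list at 1000 rows:

    from org.apache.commons.io import IOUtils
    from java.nio.charset import StandardCharsets
    from org.apache.nifi.processor.io import StreamCallback

    class CsvToInsert(StreamCallback):
        def __init__(self):
            pass

        def process(self, inputStream, outputStream):
            # Read the entire flowfile content as text
            text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
            rows = []
            for line in text.splitlines():
                if not line.strip():
                    continue
                # Naive comma split; escape single quotes for SQL
                values = [v.strip().replace("'", "''") for v in line.split(',')]
                rows.append("('%s', '%s', '%s')" % (values[0], values[1], values[2]))
            sql = "INSERT INTO mytable (col1, col2, col3) VALUES " + ", ".join(rows)
            # Replace the flowfile content with the generated SQL statement
            outputStream.write(bytearray(sql.encode('utf-8')))

    flowFile = session.get()
    if flowFile is not None:
        flowFile = session.write(flowFile, CsvToInsert())
        session.transfer(flowFile, REL_SUCCESS)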
