code_Assasin - 28 days ago
Question

Alternative to batch importer for neo4j for large datasets

I am trying to import a large dataset into Neo4j. I wrote a Python script that reads an .xls file, writes the corresponding Cypher queries into a .cql file, and then runs them through neo4j-shell. This worked for a small dataset, but when I increased the dataset's size my system crashed. I have seen suggestions to use batch importers, but they are usually JVM-based (e.g., Groovy), and that's something I'm not comfortable using. So is there any alternative for batch inserting, or at least a way to batch insert via Python?

Answer Source

You could try Neo4j's `LOAD CSV` Cypher command. It is very flexible, and combined with `USING PERIODIC COMMIT` it can handle very large datasets: the periodic commits keep each transaction small, which avoids the memory problems you ran into and speeds up the import.
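A minimal sketch of what such an import could look like, assuming a file `nodes.csv` with a header row and hypothetical `name`/`age` columns (the label, property names, and batch size are all placeholders to adapt to your data):

```cypher
// Commit every 1000 rows instead of holding the whole import in one transaction.
USING PERIODIC COMMIT 1000
LOAD CSV WITH HEADERS FROM "file:///nodes.csv" AS row
CREATE (:Person {name: row.name, age: toInt(row.age)});
```

Note that `LOAD CSV` reads every value as a string, so numeric properties need an explicit conversion such as `toInt` above.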

The only prerequisite is that you are able to export your original data in CSV format (section 8.6).
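Since you already read the .xls file from Python, the same script can emit a CSV with a header row instead of a .cql file. This is a hedged sketch using only the standard library; the file name, column names, and sample rows are assumptions standing in for your spreadsheet's contents:

```python
import csv

def write_nodes_csv(rows, path):
    """Write rows (a list of dicts sharing the same keys) to a CSV file
    with a header row, the format LOAD CSV WITH HEADERS expects."""
    fieldnames = list(rows[0].keys())
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()       # header row: name,age
        writer.writerows(rows)     # one CSV line per dict

# Hypothetical data standing in for the parsed spreadsheet rows.
write_nodes_csv([{"name": "Alice", "age": "34"},
                 {"name": "Bob", "age": "40"}], "nodes.csv")
```

If you are already using a library like pandas to parse the .xls file, `pandas.read_excel("data.xls").to_csv("nodes.csv", index=False)` does the same conversion in one line.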