I have over 1 million nodes in my Neo4j database. I tried to query all of them in a single query and it threw an "Out of Memory" exception.
Hey @BrunoPeres. I have left the JVM/RAM configs at their defaults. I wanted to understand the common patterns, or what people have done when they need to query large datasets from Neo4j.
If you really want to work with all 1+ million nodes at the same time, I believe there is nothing else to do: you will need to increase your available hardware. Otherwise, you can use SKIP and LIMIT for a pagination-like approach.
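A minimal sketch of the SKIP/LIMIT approach (the `Person` label and page size of 10,000 are just placeholders; adjust them to your model):

```cypher
// Page 0: first 10,000 nodes
MATCH (p:Person)
RETURN p
ORDER BY id(p)      // a stable sort keeps pages consistent between queries
SKIP 0 LIMIT 10000;

// Page 1: next 10,000 nodes
MATCH (p:Person)
RETURN p
ORDER BY id(p)
SKIP 10000 LIMIT 10000;
```

One caveat: the cost of SKIP grows with the offset, since Neo4j still has to walk past the skipped rows. For very deep pages, keyset-style pagination (e.g. `WHERE id(p) > $lastSeenId ... LIMIT 10000`) tends to scale better.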
I could try customizing the RAM used by the JVM, but that sounds like a "hack"...? What if tomorrow I have 10 million nodes?
Well, if your node count has grown from 1 million to 10 million, that means your hardware requirements have grown too.
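For reference, this tuning is not a JVM flag hack; heap and page-cache sizes are first-class settings in `neo4j.conf`. The keys below are the Neo4j 4.x names, and the values are purely illustrative, not a sizing recommendation:

```
# neo4j.conf (values are examples only; size to your machine and dataset)
dbms.memory.heap.initial_size=2g
dbms.memory.heap.max_size=4g

# Page cache holds the store files in RAM; ideally sized to fit your graph on disk
dbms.memory.pagecache.size=2g
```

Neo4j also ships a `neo4j-admin memrec` command that suggests these values based on the available RAM, which is a more principled starting point than guessing.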