python - Batch insertion into Neo4j - best option?


I've been trying to import a relatively large dataset into Neo4j: approximately 50 million nodes and relationships.

I first experimented with Cypher via py2neo. It works, but becomes slow if you need to use CREATE UNIQUE or MERGE.
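For reference, the batched-Cypher pattern in question looks roughly like this (a minimal sketch: the `rows` data, the `Person` label, and the property names are invented for illustration, and the actual py2neo call is left as a comment since it needs a running database):

```python
def chunks(rows, size):
    """Yield successive batches of `size` items from `rows`."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# A parameterized UNWIND + MERGE avoids re-parsing the query for every
# row, but MERGE still pays an index lookup per node, which is what
# makes this approach slow at tens of millions of nodes.
MERGE_QUERY = """
UNWIND $batch AS row
MERGE (n:Person {id: row.id})
SET n.name = row.name
"""

rows = [{"id": i, "name": "person-%d" % i} for i in range(100000)]

for batch in chunks(rows, 40000):
    # With py2neo this would be something along the lines of:
    # graph.run(MERGE_QUERY, batch=batch)
    pass
```

Each batch runs in its own transaction, so memory stays bounded, but the per-node MERGE cost remains.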

I'm looking at other batch import methods, and I'm wondering if there are any recommendations as to which of these approaches is best for general workflow and speed:

  • The Neo4j docs mention a batch insertion facility, which appears to be Java-based and part of the Neo4j distribution;
  • there is the batch inserter by Michael Hunger on GitHub, though I'm not sure how similar or different it is to the one included in the distribution;
  • then there is load2neo, which I'm currently testing;
  • and there is the LOAD CSV functionality that is part of Neo4j 2's Cypher, though I'm not sure whether it is just a convenience, or whether its performance is similar to executing Cypher queries in batches of, say, 40,000 via a Cypher transaction.

I would appreciate any comments on the functionality, workflow, and speed differences between these options.

If you can use the latest version of Neo4j, the recommended way is to use the new LOAD CSV statement in Cypher: http://docs.neo4j.org/chunked/stable/cypherdoc-importing-csv-files-with-cypher.html
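To make that concrete, the usual pattern is to write the data out as a CSV file and then load it with a periodic-commit LOAD CSV statement. Below is a sketch: the file path, the `Person` label, and the column names are invented for illustration, and the Cypher string is meant to be run in the Neo4j shell or browser, not from Python.

```python
import csv
import os
import tempfile

# Write the nodes out as a CSV file that Neo4j can read directly.
path = os.path.join(tempfile.gettempdir(), "people.csv")
with open(path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name"])  # header row used by WITH HEADERS
    for i in range(5):
        writer.writerow([i, "person-%d" % i])

# Cypher to run against the database (Neo4j 2.x syntax);
# USING PERIODIC COMMIT flushes the transaction every 10,000 rows,
# which keeps memory use flat on large imports.
LOAD_CSV = """
USING PERIODIC COMMIT 10000
LOAD CSV WITH HEADERS FROM 'file://%s' AS row
MERGE (p:Person {id: toInt(row.id)})
SET p.name = row.name
""" % path
```

Since LOAD CSV streams from the file and commits periodically, it avoids both the driver round-trips and the giant-transaction problem of client-side batching.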

