Excellent results for PostgreSQL. However, for ArangoDB they seem incomparable, since the benchmark does not use a high-performance bulk import operation similar to PostgreSQL's COPY.
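For context, this is the kind of COPY-based load the comment is referring to on the PostgreSQL side. A minimal sketch using psycopg2; the connection string, table name (`items`), and generated rows are placeholders, not the benchmark's actual code:

```python
import io
import psycopg2

# Assumes a database "benchmark" with a table: items (id int, value int)
conn = psycopg2.connect("dbname=benchmark user=postgres")
with conn, conn.cursor() as cur:
    # Stream rows through COPY ... FROM STDIN instead of issuing many INSERTs.
    buf = io.StringIO("".join(f"{i}\t{i * 2}\n" for i in range(100_000)))
    cur.copy_expert("COPY items (id, value) FROM STDIN", buf)
```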
Hi @apapacy. Thank you for the suggestion; I didn't know about that API. When I find the time, I will extend the benchmark to use import_bulk for ArangoDB and update the numbers.
Hi @apapacy. I reran the insert test with import_bulk but could not find a significant speed improvement over using insert_many (basically just swapped out the method calls in arangodb.py). I'll have to see why, as according to the docs, import_bulk should be faster.
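For readers following along, a minimal sketch of what that swap looks like with the python-arango driver. The host, credentials, database/collection names, and generated documents are placeholders, not the benchmark's actual arangodb.py:

```python
from arango import ArangoClient

client = ArangoClient(hosts="http://localhost:8529")
db = client.db("benchmark", username="root", password="")
coll = db.collection("items")

docs = [{"_key": str(i), "value": i} for i in range(100_000)]

# Baseline: insert documents through the regular document API.
coll.truncate()
coll.insert_many(docs)

# Alternative: import_bulk goes through ArangoDB's bulk import endpoint,
# which is what the linked Stack Overflow answer recommends.
coll.truncate()
result = coll.import_bulk(docs, on_duplicate="error")
print(result)  # summary dict with counts such as created/errors
```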
https://stackoverflow.com/questions/61345323/bulk-import-of-json-files-in-arangodb-with-python