Comments (9)

a2800276 commented on May 17, 2024

I would really like to see something like this:

While the CSV and JSON importers are useful, they are not generic enough to import arbitrary data: I was recently playing around and wanted to do a bulk import of roughly 2 GB of data. It would be nice to be able to load and process such a file directly from within Arango, but the File.read function in the fs module (is this module even officially there? I had to dig around quite a lot to find it) always reads the entire file.

It would be great to have a more versatile read that can be given a length, a buffer, or a callback. That way, the import tools could be rewritten directly in Arango (I seem to recall the JSON and CSV importers import via the HTTP API?).
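
Purely as a hypothetical sketch of the kind of API being requested here (none of these signatures exist in the fs module):

```
var fs = require("fs");

// hypothetical: read a fixed-size chunk starting at a given offset
var chunk = fs.read("dump.sql", { offset: 0, length: 65536 });

// hypothetical: stream the file through a callback, chunk by chunk,
// so a multi-GB file never has to be held in memory at once
fs.readChunks("dump.sql", 65536, function (chunk) {
  // parse/transform the chunk and save records here
});
```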

rotatingJazz commented on May 17, 2024

I pre-process my raw data to CSV and then use the importer. Works fine. ;) What would the benefit be of moving this into ArangoDB?

frankmayer commented on May 17, 2024

Hi, I would like to chime in, but I am not sure what you mean by "What would the benefit be of moving this into ArangoDB?" πŸ˜„ Can you elaborate on what you want to do, what you did, and what the last sentence means? πŸ˜ƒ

rotatingJazz commented on May 17, 2024

Hi Frank,

If I understood correctly, @a2800276 wants to be able to process his raw data and insert it into the DB from within arangosh.

I was wondering why the devs should invest time in this feature, since one can easily process raw data into CSV/JSON (via any language, say PHP, Python, or Bash) and use the already working importer.
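
For illustration, a minimal Node-style sketch of that kind of preprocessing step (the file names and the raw line format are made up for the example):

```
var fs = require("fs");
var readline = require("readline");

// read the raw dump line by line and write a CSV file the importer can consume
var rl = readline.createInterface({ input: fs.createReadStream("raw.txt") });
var out = fs.createWriteStream("links.csv");
out.write("from,to\n");

rl.on("line", function (line) {
  // assumed raw format per line: "source -> target"
  var parts = line.split(" -> ");
  if (parts.length === 2) {
    out.write(parts[0] + "," + parts[1] + "\n");
  }
});
rl.on("close", function () { out.end(); });
```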

frankmayer commented on May 17, 2024

Oh yes, I didn't notice the different users πŸ˜„. Yes, of course, I totally agree with @rotatingJazz on that.
@a2800276, is there some specific reason not to use the importer? Is there some edge case that you're trying to tackle?
I have had no issues importing external data so far, so I am interested in your edge case πŸ˜„

a2800276 commented on May 17, 2024

I pre-process my raw data to CSV and then use the importer.

To me it seems very elaborate to take data that may or may not be in a form suitable for CSV/JSON, transform it into a different format, and throw that against a (functionally restricted) import script, which then uses HTTP to import individual records into the database.

When instead:

I could be reading and transforming arbitrarily formatted files from within the DB and have a much more efficient workflow, both from the "programmer efficiency" point of view and in terms of performance.

What I was trying to do concretely:

re-implement in Arango a toy project, which I have working for neo4j, to play with graph functionality. I'd like to import the Wikipedia inter-page links and play around with that dataset. The dump of that data is 4 GB (in the form of MySQL INSERT statements). If I can avoid it, I don't want to preprocess 4 GB of data into 3 GB of some other data that I can then import, when I could import it directly in about half the time.

More generally:

Since Arango wants to become a general-purpose deployment platform with Foxx, it will certainly need some rudimentary file I/O implementation. As currently implemented, File.read is utterly useless for anything beyond reading tiny toy files.
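
To illustrate the limitation being described (the file name is made up; this assumes the module is loaded as require("fs") and the whole-file read is fs.read, as the comments above suggest):

```
var fs = require("fs");

// this reads the ENTIRE file into a single string; for a multi-GB dump
// it is unusable, and there is no way to request just a slice of the file
var content = fs.read("enwiki-pagelinks.sql");
```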

mulderp commented on May 17, 2024

It might be interesting to have some reference data with which one could try to get a feel for, or confirm, the problem with the current importer; maybe the problem can be shown with similar data, e.g. from here:

jsteemann commented on May 17, 2024

We'll eventually have an implementation of Buffer, which will allow us to read binary files and process them in chunks from JavaScript.

Until that's available, I think there are at least two alternatives for processing CSV and JSON files.
They work incrementally and process the input file line-wise (not quite true for CSV, but think of them as working line-wise). They allow supplying a callback function that is executed whenever a record has been read. You can then use the callback to process the data and put it into the database.

Example invocation for CSV files, passing explicit options:

```
var internal = require("internal");
var options = { separator: ",", quote: "\"" };
internal.processCsvFile("test.csv", function (data) { internal.print(data); }, options);
```

And for plain CSV files without options:

```
var internal = require("internal");
internal.processCsvFile("test.csv", function (data) { internal.print(data); });
```

Processing JSON files is similar:

```
var internal = require("internal");
internal.processJsonFile("test.json", function (data) { internal.print(data); });
```
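
To make that concrete, here is a minimal sketch of using the callback to store each record instead of printing it (the file name and the "pages" collection are assumptions; the collection has to exist already, e.g. created via db._create("pages")):

```
var internal = require("internal");
var db = require("org/arangodb").db;

// save each parsed JSON record into the pre-created "pages" collection
internal.processJsonFile("pages.json", function (data) {
  db.pages.save(data);
});
```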


Note that the above functions aren't general-purpose file-processing functions, but are targeted at handling UTF-8 encoded CSV and JSON data.
For arbitrary file formats, we'll need an implementation of Buffer in JavaScript.

fceller commented on May 17, 2024

Closed because processCsvFile and processJsonFile do what I intended.
