
mingo's Introduction

mingo

MongoDB query language for in-memory objects


Install

$ npm install mingo

Features

For more documentation on how to use operators, see the MongoDB documentation.

Browse the package docs for modules.

Usage

// Use as es6 module
import mingo from "mingo";

// or vanilla nodeJS
const mingo = require("mingo");

The main module exports Aggregator, Query, aggregate(), find(), and remove(). Only Query and Projection operators are loaded by default when you require the main module. This is done via the side-effect module mingo/init/basic, which also automatically includes the pipeline operators $project, $skip, $limit, and $sort.
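The default setup is already enough for basic filtering, projection, and sorting. A minimal sketch (the documents here are illustrative):

import { find, aggregate } from "mingo";

const docs = [
  { type: "homework", score: 70 },
  { type: "quiz", score: 50 }
];

// query and projection operators work out of the box
find(docs, { type: "homework" }, { score: 1 }).all(); // [ { score: 70 } ]

// $sort and $limit are included by the basic init module
aggregate(docs, [{ $sort: { score: -1 } }, { $limit: 1 }]); // [ { type: "homework", score: 70 } ]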

Loading Operators

The MongoDB query language is huge, and you may not need all of its operators. If you are using this library on the server side, where bundle size is not a concern, you can load all operators as shown below.

// Note: this effectively imports the entire library into your bundle; unused operators cannot be tree-shaken
import "mingo/init/system";

Or from the Node.js CLI:

node -r 'mingo/init/system' myscript.js

To support tree-shaking for client side bundles, you can import and register specific operators that will be used in your application.

ES6

import { useOperators, OperatorType } from "mingo/core";
import { $trunc } from "mingo/operators/expression";
import { $bucket } from "mingo/operators/pipeline";

useOperators(OperatorType.EXPRESSION, { $trunc });
useOperators(OperatorType.PIPELINE, { $bucket });

CommonJS

const core = require("mingo/core");
const $trunc = require("mingo/operators/expression").$trunc;
const $bucket = require("mingo/operators/pipeline").$bucket;
const useOperators = core.useOperators;
const OperatorType = core.OperatorType;

useOperators(OperatorType.EXPRESSION, { $trunc: $trunc });
useOperators(OperatorType.PIPELINE, { $bucket: $bucket });
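Once registered, the operators behave like the built-ins. For example, the $trunc operator registered above can be used in a $project stage (illustrative data):

const { aggregate } = require("mingo");

// truncate the price to one decimal place
aggregate(
  [{ price: 7.25 }],
  [{ $project: { price: { $trunc: ["$price", 1] } } }]
); // [ { price: 7.2 } ]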

Using query to test objects

import { Query } from "mingo";

// create a query with criteria
// find all grades for homework with score >= 50
let query = new Query({
  type: "homework",
  score: { $gte: 50 }
});

// test if an object matches query
query.test(doc);
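// for example (illustrative documents):
query.test({ type: "homework", score: 75 }); // true
query.test({ type: "homework", score: 30 }); // false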

Searching and Filtering

import { Query } from "mingo";

// input is either an Array or any iterable source (i.e. Object{next:Function}), including ES6 generators.
let criteria = { score: { $gt: 10 } };

let query = new Query(criteria);

// filter collection with find()
let cursor = query.find(collection);

// alternatively use shorthand
// cursor = mingo.find(collection, criteria)

// sort, skip and limit by chaining
cursor.sort({ student_id: 1, score: -1 }).skip(100).limit(100);

// count matches. exhausts cursor
cursor.count();

// classic cursor iterator (old school)
while (cursor.hasNext()) {
  console.log(cursor.next());
}

// ES6 iterators (new cool)
for (let value of cursor) {
  console.log(value);
}

// all() to retrieve matched objects. exhausts cursor
cursor.all();

Using $jsonSchema operator

To use the $jsonSchema operator, you must register your own JsonSchemaValidator in the options. No default implementation is provided out of the box, so users can use a library with their preferred schema format.

The example below uses Ajv to implement schema validation.

import { RawObject } from "mingo/types"
import { JsonSchemaValidator } from "mingo/core"
import Ajv, { Schema } from "ajv"

const jsonSchemaValidator: JsonSchemaValidator = (s: RawObject) => {
  const ajv = new Ajv();
  const v = ajv.compile(s as Schema);
  return (o: RawObject) => (v(o) ? true : false);
};

const schema = {
  type: "object",
  required: ["item", "qty", "instock"],
  properties: {
    item: { type: "string" },
    qty: { type: "integer" },
    size: {
      type: "object",
      required: ["uom"],
      properties: {
        uom: { type: "string" },
        h: { type: "number" },
        w: { type: "number" },
      },
    },
    instock: { type: "boolean" },
  },
};

// queries documents using schema validation
find(docs, { $jsonSchema: schema }, {}, { jsonSchemaValidator }).all();

Note: an error is thrown when the $jsonSchema operator is used without the jsonSchemaValidator option configured.

Aggregation Pipeline

import { Aggregator } from "mingo/aggregator";
import { useOperators, OperatorType } from "mingo/core";
import { $match, $group } from "mingo/operators/pipeline";
import { $min } from "mingo/operators/accumulator";

// ensure the required operators are preloaded prior to using them.
useOperators(OperatorType.PIPELINE, { $match, $group });
useOperators(OperatorType.ACCUMULATOR, { $min });

let agg = new Aggregator([
  { $match: { type: "homework" } },
  { $group: { _id: "$student_id", score: { $min: "$score" } } },
  { $sort: { _id: 1, score: 1 } }
]);

// return an iterator for streaming results
let stream = agg.stream(collection);

// return all results. same as `stream.all()`
let result = agg.run(collection);

Options

Query and aggregation operations can be configured with options to enable different features or customize how documents are processed. Some options are only relevant to specific operators and need not be specified if not required. A usage sketch follows the list below.

• idKey: The key used to look up the ID value of a document. Default: "_id".
• collation: Collation specification for string sorting operations. Default: none. See Intl.Collator.
• processingMode: Determines copy rules for inputs and outputs. Default: CLONE_OFF, which turns off cloning and modifies the input collection as needed. This mode may also return output objects with shared paths in their graph when specific operators are used. It provides the greatest speedup by minimizing cloning. When using the aggregation pipeline, you can use the $out operator to collect immutable intermediate results.
• useStrictMode: Enforces strict MongoDB compatibility. Default: true. When disabled, behaviour changes as follows:
  • $elemMatch returns all matching nested documents instead of only the first.
  • The empty string "" is coerced to false during boolean checks in supported operators, which is consistent with JavaScript semantics.
• scriptEnabled: Enable or disable custom script execution. Default: true. When disabled, operators that execute custom code are disallowed: $where, $accumulator, and $function.
• hashFunction: Custom hash function to replace the default, which is based on the "Effective Java" hashCode. Default: built-in. Expects function (value: unknown) => number.
• collectionResolver: Function to resolve strings to arrays, for use with operators that reference other collections such as $lookup, $out, and $merge. Default: none. Expects function (name: string) => RawObject[].
• jsonSchemaValidator: JSON schema validator to use for the $jsonSchema operator. Default: none. Expects function (schema: RawObject) => (document: RawObject) => boolean. The $jsonSchema operation will fail if a validator is not provided.
• variables: Global variables to pass to all operators. Default: none.
• context: Container to use for loading operators. Default: none. This option allows users to load only the desired operators or to register custom operators which need not be available globally.
• useGlobalContext: Allow falling back to the global context when an operator is not found in the provided context. Default: true. This allows users to strictly enforce which operators may be used.
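Options are passed as a plain object to the query and aggregation entry points. A minimal sketch (the option values shown are illustrative):

import { find } from "mingo";
import { Aggregator } from "mingo/aggregator";

const options = {
  idKey: "id",          // documents use "id" instead of "_id"
  scriptEnabled: false, // disallow $where, $function, and $accumulator
  useStrictMode: true
};

// query with options
find(collection, { score: { $gte: 50 } }, {}, options).all();

// aggregation with options
new Aggregator([{ $sort: { score: -1 } }], options).run(collection);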

Adding Custom Operators

Custom operators can be registered using a Context via the context option, which is the recommended approach as of 6.4.2. A Context is a container for operators that the execution engine uses to process queries. Previously, the useOperators(...) function was used to register operators globally, but that is no longer preferred. The difference between the two is that a globally registered operator cannot be overwritten, whereas a new context may be created and used at any time.

NB: The execution engine will first try to find an operator in the provided context and fall back to the global context when it is not found.

Each operator type has a specific interface to which an implementation must conform to be valid.

Pre-loaded operators cannot be overridden.

NB: Update operators are not supported in Context.
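The sketch below registers a custom expression operator globally with useOperators, as shown earlier. The (obj, expr, options) operator signature and the computeValue helper used here are assumptions; consult the package docs for the exact interfaces.

import { useOperators, OperatorType, computeValue } from "mingo/core";
import { aggregate } from "mingo";

// hypothetical custom operator: reverses the string its argument resolves to
const $reverseString = (obj, expr, options) =>
  String(computeValue(obj, expr, null, options)).split("").reverse().join("");

useOperators(OperatorType.EXPRESSION, { $reverseString });

aggregate([{ name: "mingo" }], [{ $project: { name: { $reverseString: "$name" } } }]);
// [ { name: "ognim" } ]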

Updating Documents

An update operation can be performed with the updateObject function from the mingo/updater module. Unlike other operations in the library, this works only on a single object. The query and aggregation operators are powerful enough for transforming arrays of documents and should be preferred when dealing with multiple objects. updateObject returns an array of all paths that were updated, and it supports arrayFilters for applicable operators. To detect whether a change occurred, check the length of the returned array.

All operators as of MongoDB 5.0 are supported except the positional array operator $.

Examples

import { updateObject } from "mingo/updater";
// all update operators are automatically loaded.

const obj = {
  firstName: "John",
  lastName: "Wick",
  age: 40,
  friends: ["Scooby", "Shagy", "Fred"]
};

// returns an array of modified paths if a value changed.
updateObject(obj, { $set: { firstName: "Bob", lastName: "Doe" } }); // ["firstName", "lastName"]

// remove the last element of an array.
updateObject(obj, { $pop: { friends: 1 } }); // ["friends"] => friends: ["Scooby", "Shagy"]
// update a nested value path.
updateObject(obj, { $unset: { "friends.1": "" } }); // ["friends.1"] => friends: ["Scooby", null]
// conditional update using arrayFilters.
updateObject(obj, { $set: { "friends.$[e]": "Velma" } }, [{ e: null }]); // ["friends"] => friends: ["Scooby", "Velma"]
// an empty array is returned if the value has not changed.
updateObject(obj, { $set: { firstName: "Bob" } }); // [] => no change to object.

You can also create a preconfigured updater function.

import { createUpdater } from "mingo/updater";

// configure updater to deep clone passed values. clone mode defaults to "copy".
const updateObject = createUpdater({ cloneMode: "deep" });

const state = { people: ["Fred", "John"] };
const newPeople = ["Amy", "Mark"];

console.log(state.people); // ["Fred", "John"]

updateObject(state, { $set: { people: newPeople } });

newPeople.push("Jason");

console.log(state.people); // ["Amy", "Mark"]
console.log(newPeople); // ["Amy", "Mark", "Jason"]

Differences from MongoDB

This list describes how this library differs from the full MongoDB query engine.

1. There is no concept of a collection. Input data is either an array of objects or a generator function, to support streaming.
2. Does not support server-specific operators, e.g. $collStats, $planCacheStats, $listSessions.
3. Does not support geometry query operators. No support is planned.
4. Does not support query operators that depend on persistent storage: $comment, $meta, $text.
5. Does not support the positional query or update operator $.
6. Does not support server-specific expression operators: $toObjectId, $binarySize, $bsonSize.
7. The aggregation pipeline operator $merge enforces a unique constraint on the lookup field at runtime.
8. The custom function evaluation operators $where, $function, and $accumulator do not accept strings as the function body.
9. Custom function evaluation operators are enabled by default. They can be disabled with the scriptEnabled option.
10. The custom function evaluation operator $accumulator does not support the merge option.
11. The $jsonSchema operator requires the user to register their own validator using the jsonSchemaValidator configuration.

Benefits

• Declarative, data-driven API.
• Usable on both the frontend and backend.
• Provides an alternative to writing custom code for transforming objects.
• Validate MongoDB queries without running a server.
• Well documented. The MongoDB query language is among the best available and has great documentation.

Contributing

• Squash changes into one commit.
• Run npm test to build and run the unit tests.
• Submit a pull request.

To validate correct behaviour and semantics of operators, you may also test against mongoplayground.net. Credit to the author.

A big thank you to all users and contributors of this library.

License

MIT


    mingo's Issues

    Clarification

    It would be nice to know a little more about this project without having to dig into the code. The project description is only six words, and the language makes the project seem like just another Mongo Node driver.

    It's not a Mongo Node driver per se, but more specifically a Mongo emulator. There's no MongoDB backend required because it's built right in (as I understand it). To me this wasn't clear from the description or the examples. It took a conversation on Reddit to clear up this fact.

    Exclusive select projections seem to be broken

    Example data model:

    var user = {
      Name: "Mingo",
      Age: 20,
      _id: 1
    }

    Using something along the lines of var query = new Mingo.Cursor(collection, {}, {Name: 0}), I expect the result to be:

    {
      Age: 20,
      _id: 1
    }

    instead it is:

    {
      _id: 1
    }

    Date comparison doesn't work with strings

    I was using a selector that filters based on dates:

    {
      "postedAt":{
        "$gte":"2017-01-04T15:00:00.000Z",
        "$lt":"2017-01-14T14:59:59.999Z"
      }
    }
    

    As far as I can tell, this doesn't work properly when the postedAt field contains date strings (the resulting cursor just came up empty), but it did work when I converted postedAt to date objects.

    Is this the expected behavior? Or am I missing something?
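    A minimal repro of the behaviour described (hypothetical documents; results as reported above):

    // postedAt stored as an ISO string: the cursor came up empty
    mingo.find([{ postedAt: "2017-01-05T10:00:00.000Z" }],
               { postedAt: { $gte: "2017-01-04T15:00:00.000Z" } }).all();

    // postedAt converted to a Date object (with Date bounds): the document matches
    mingo.find([{ postedAt: new Date("2017-01-05T10:00:00.000Z") }],
               { postedAt: { $gte: new Date("2017-01-04T15:00:00.000Z") } }).all();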

    More Info Please

    This looks like a great project, just need some more intro info, like what the purpose of your API is and how it compares. Does it work on all collection types or any deserialized JSON? Can it be used to query Firebase documents? Can it augment the Firebase query API?

    Wow nice work, but I cannot compile it :(

    Hi there

    I'm trying to use your beautiful library with the Jurassic c# library for executing javascript. It keeps telling me:

    TypeError: undefined cannot be converted to an object - This is at line 40 in the library.

    Does this absolutely need to work in the browser context?

    I'm running it in a pure JavaScript environment (kinda), with no browser or browser objects present...

    Help!

    Add bower.json

    It would be great to manage this library with Bower.

    Thanks for sharing the library :)

    Group by multiple fields not supported?

    Hi,

    I am playing with your framework, so far I love the way it works.. Anyway I want to "report" the following issue.

    On the mongo shell I execute the following query:
    db.XXX.aggregate([{"$match":{}},{"$group":{"_id":{"hour":"$date_buckets.hour","keyword":"$Keyword"},"total":{"$sum":1}}},{"$sort":{"total":-1}},{"$limit":5},{"$project":{"_id":0,"hour":"$_id.hour","keyword":"$_id.keyword","total":1}}])

    • Notice the multiple field grouping.
      And this is the result:
      { "total" : 2, "keyword" : "Bathroom Cleaning Tips" }
      { "total" : 1, "keyword" : "best way to clean a bathroom" }
      { "total" : 1, "keyword" : "Cleaning Bathroom Tips" }
      { "total" : 1, "keyword" : "unclog bathtub drain" }
      { "total" : 1, "keyword" : "Drain Clogs" }

    Same query applied using mingo
    var agg = new Mingo.Aggregator([
      { $match: {} },
      {
        $group: {
          _id: { hour: "$date_buckets.hour", keyword: "$Keyword" },
          total: { $sum: 1 }
        }
      },
      { $sort: { total: -1 } },
      { $limit: 5 },
      {
        $project: {
          _id: 0,
          hour: "$_id.hour",
          keyword: "$_id.keyword",
          total: 1
        }
      }
    ]);

    And this is the result:

    { "total" : 6, "keyword" : "Bathroom Cleaning Tips" }
    { "total" : 6, "keyword" : "best way to clean a bathroom" }
    { "total" : 6, "keyword" : "Cleaning Bathroom Tips" }
    { "total" : 6, "keyword" : "unclog bathtub drain" }
    { "total" : 6, "keyword" : "Drain Clogs" }

    This is the data I am using it contains 6 rows:
    [{"date_buckets":{"date":"2015-04-29T00:17:03.107Z","day":28,"hour":18,"minute":17,"sec":3,"hour_minute":"18:17"},"Keyword ID":"sr3_4VzRD3sp","Creative ID":"5184986203","Keyword":"Bathroom Cleaning Tips","Match Type":"be","Device":"m","Conversions":[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0],"Revenues":[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0],"account_id":"baron"},{"date_buckets":{"date":"2015-04-29T00:17:03.107Z","day":28,"hour":18,"minute":17,"sec":3,"hour_minute":"18:17"},"Keyword ID":"sr3_K1iQOeXy","Creative ID":"5184986241","Keyword":"Cleaning Bathroom Tips","Match Type":"bb","Device":"c","Conversions":[5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,5,0,0,0,0,0,0],"Revenues":[5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,5,0,0,0,0,0,0],"account_id":"baron"},{"date_buckets":{"date":"2015-04-29T00:17:03.108Z","day":28,"hour":18,"minute":17,"sec":3,"hour_minute":"18:17"},"Keyword ID":"sr3_sl0C3VAYk","Creative ID":"44210589597","Keyword":"best way to clean a bathroom","Match Type":"b","Device":"c","Conversions":[4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,4,0,0,0,0,0,0],"Revenues":[4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,4,0,0,0,0,0,0],"account_id":"baron"},{"date_buckets":{"date":"2015-04-29T00:17:03.108Z","day":28,"hour":18,"minute":17,"sec":3,"hour_minute":"18:17"},"Keyword ID":"sr3_4VzRD3sp","Creative ID":"5184986204","Keyword":"Bathroom Cleaning Tips","Match Type":"be","Device":"c","Conversions":[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0],"Revenues":[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0],"account_id":"baron"},{"date_buckets":{"date":"2015-04-29T00:17:03.107Z","day":28,"hour":18,"minute":17,"sec":3,"hour_minute":"18:17"},"Keyword ID":"sr3_HZAarvKy","Creative ID":"6074827333","Keyword":"Drain Clogs","Match Type":"bp","Device":"c","Conversions":[1,0,0,1,0,0,0,0,0],"Revenues":[5,0,0,5,0,0,0,0,0],"account_id":"baron"},{"date_buckets":{"date":"2015-04-29T00:17:03.107Z","day":28,"hour":18,"minute":17,"sec":3,"hour_minute":"18:17"},"Keyword ID":"sr3_irU8fFk0","Creative ID":"6074827289","Keyword":"unclog bathtub drain","Match Type":"bp","Device":"c","Conversions":[1,0,0,1,0,0,0,0,0],"Revenues":[5,0,0,5,0,0,0,0,0],"account_id":"baron"}]

    I will try to figure out what's going on, but you are probably faster than me.
    Thanks, Robert Baron.

    find with string functions

    I started using mingo. It's been really great so far.
    What I am missing (or don't understand to do) is, finding documents with string functions like: contains, startsWith, endsWith.

    How do I do this more elegantly (contains):

    var result = Query.find(npm.rows, {$where: 'this.id.indexOf("express") !== -1'}).all();
    // or
    var result = Query.find(npm.rows, {id: /mocha/g}).all();

    Streaming Aggregation

    It seems that there's no reason why aggregations can't be streamed, is there a chance support can be added for this?

    $concatArrays should not limit to 2 elements

    Hello,

    According to the MongoDB documentation, the $concatArrays operator allows multiple elements.

    But mingo only allows 2 elements:

    assert(isArray(arr) && arr.length === 2, '$concatArrays expression must resolve to an array of 2 elements')

    repro step:

    const data = [{professionalTags: ['tag1', 'tag2'], personalTags: ['tag2'], publicationTags: ['tag2']}];
    mingo.aggregate(data, [
      { $project: {
        tags: {$concatArrays: ['$professionalTags', '$personalTags', '$publicationTags'] }
      }}
    ]);
    

    this throws an exception:

    $concatArrays expression must resolve to an array of 2 elements
    
          at err (node_modules/mingo/dist/mingo.js:140:9)
          at assert (node_modules/mingo/dist/mingo.js:40:26)
          at Object.$concatArrays (node_modules/mingo/dist/mingo.js:2438:5)
          at computeValue (node_modules/mingo/dist/mingo.js:3805:38)
          at node_modules/mingo/dist/mingo.js:3852:23
          at each (node_modules/mingo/dist/mingo.js:182:16)
          at computeValue (node_modules/mingo/dist/mingo.js:3851:7)
          at node_modules/mingo/dist/mingo.js:950:19
          at each (node_modules/mingo/dist/mingo.js:177:14)
          at node_modules/mingo/dist/mingo.js:914:5
          at each (node_modules/mingo/dist/mingo.js:177:14)
          at Object.$project (node_modules/mingo/dist/mingo.js:904:3)
          at node_modules/mingo/dist/mingo.js:1248:48
          at each (node_modules/mingo/dist/mingo.js:177:14)
          at Aggregator.run (node_modules/mingo/dist/mingo.js:1241:9)
          at Object.aggregate (node_modules/mingo/dist/mingo.js:1267:35)
          at Object.<anonymous>.exports.getAllSkills.elementals (src/selectors/index.js:10:26)
          at Object.it (src/selectors/__tests__/index.test.js:7:38)
          at Promise.resolve.then.el (node_modules/p-map/index.js:42:16)
          at process._tickCallback (internal/process/next_tick.js:109:7)
    

    Note that removing one element ($publicationTags for example) doesn't throw.

    Incorrect matching of deeply nested values and whole objects

    From experiments, MongoDB only allows one level of nesting when matching values in an array without explicit indexes in the selector.

    > db.test.insert({
    ...     key0: [{
    ...       key1: [[[{key2: [{a:"value2"}, {a: "dummy"}, {b:20}]}]], {"key2": "value"}],
    ...       key1a: {key2a: "value2a"}
    ...     }]
    ...   })
    WriteResult({ "nInserted" : 1 })
    > // Test: query deeply nested value
    > // 1. specify no array index
    > db.test.find({"key0.key1.key2.a": "value2"}) // no match
    >
    > // 2. specify partial array index selector for nested value
    > db.test.find({"key0.key1.0.key2.a": "value2"}) // no match
    >
    > // 3. specify full array index selector to deeply nested value
    > db.test.find({"key0.key1.0.0.key2.a": "value2"}) // match
    { "_id" : ObjectId("583fad3a1828419a48d35fe1"), "key0" : [ { "key1" : [ [ [ { "key2" : [ { "a" : "value2" }, { "a" : "dummy" }, { "b" : 20 } ] } ] ], { "key2" : "value" } ], "key1a" : { "key2a" : "value2a" } } ] }
    
    > // Test: query shallow nested value
    > // 1. specify no array index
    > db.test.find({"key0.key1.key2": "value"}) // match
    { "_id" : ObjectId("583fad3a1828419a48d35fe1"), "key0" : [ { "key1" : [ [ [ { "key2" : [ { "a" : "value2" }, { "a" : "dummy" }, { "b" : 20 } ] } ] ], { "key2" : "value" } ], "key1a" : { "key2a" : "value2a" } } ] }
    >
    > // 2. specify array index
    > db.test.find({"key0.key1.1.key2": "value"}) // match
    { "_id" : ObjectId("583fad3a1828419a48d35fe1"), "key0" : [ { "key1" : [ [ [ { "key2" : [ { "a" : "value2" }, { "a" : "dummy" }, { "b" : 20 } ] } ] ], { "key2" : "value" } ], "key1a" : { "key2a" : "value2a" } } ] }

    Matching whole object is also broken.

    > // Test: match whole objects
    > db.test.find({"key0.key1": [[{key2: [{a:"value2"}, {a: "dummy"}, {b:20}]}]]}) // match
    { "_id" : ObjectId("583fad3a1828419a48d35fe1"), "key0" : [ { "key1" : [ [ [ { "key2" : [ { "a" : "value2" }, { "a" : "dummy" }, { "b" : 20 } ] } ] ], { "key2" : "value" } ], "key1a" : { "key2a" : "value2a" } } ] }
    
    > db.test.find({"key0.key1.0": [[{key2: [{a:"value2"}, {a: "dummy"}, {b:20}]}]]}) // match
    { "_id" : ObjectId("583fad3a1828419a48d35fe1"), "key0" : [ { "key1" : [ [ [ { "key2" : [ { "a" : "value2" }, { "a" : "dummy" }, { "b" : 20 } ] } ] ], { "key2" : "value" } ], "key1a" : { "key2a" : "value2a" } } ] }
    
    > db.test.find({"key0.key1.0.0": [{key2: [{a:"value2"}, {a: "dummy"}, {b:20}]}]}) // match
    { "_id" : ObjectId("583fad3a1828419a48d35fe1"), "key0" : [ { "key1" : [ [ [ { "key2" : [ { "a" : "value2" }, { "a" : "dummy" }, { "b" : 20 } ] } ] ], { "key2" : "value" } ], "key1a" : { "key2a" : "value2a" } } ] }
    >
    > db.test.find({"key0.key1.0.0.key2": [{a:"value2"}, {a: "dummy"}, {b:20}]}) // match
    { "_id" : ObjectId("583fad3a1828419a48d35fe1"), "key0" : [ { "key1" : [ [ [ { "key2" : [ { "a" : "value2" }, { "a" : "dummy" }, { "b" : 20 } ] } ] ], { "key2" : "value" } ], "key1a" : { "key2a" : "value2a" } } ] }
    >
    > db.test.find({"key0.key1.0.0.key2": {b:20}}) // match
    { "_id" : ObjectId("583fad3a1828419a48d35fe1"), "key0" : [ { "key1" : [ [ [ { "key2" : [ { "a" : "value2" }, { "a" : "dummy" }, { "b" : 20 } ] } ] ], { "key2" : "value" } ], "key1a" : { "key2a" : "value2a" } } ] }

    Tested with [email protected].

    We currently do not handle these cases correctly. Using the sample data.

    var Mingo = require("mingo");
    var _ = Mingo._internal();
    
    var fixtures = [
        [{"key0.key1.key2.a": "value2"}, [], "should not match without array index selector to nested value "],
        [{"key0.key1.0.key2.a": "value2"}, [], "should not match without enough depth for array index selector to nested value"],
        [{"key0.key1.0.0.key2.a": "value2"}, data, "should match with full array index selector to deeply nested value"],
        [{"key0.key1.0.0.key2": {b:20}}, data, "should match with array index selector to nested value at depth 1"],
        [{"key0.key1.1.key2": "value"}, data, "should match with full array index selector to nested value"],
        [{"key0.key1.key2": "value"}, data, "should match without array index selector to nested value at depth 1"],
        [{"key0.key1.1.key2": "value"}, data, "should match shallow nested value with array index selector"]
      ];
    
    console.log("#Test 1: matching deep nested objects");
    fixtures.forEach(function (f) {
       var query = f[0],
           expect = f[1],
           msg = f[2];
       var result = Mingo.find(data, query).all();
       console.log(_.isEqual(result, expect) + "> " + msg);
    });
    console.log(">>>>>>>>>>>>> finish test 1 <<<<<<<<<<<<<");
    
    console.log("#Test 2: matching whole objects");
    fixtures = [
        [{"key0.key1": [[{key2: [{a:"value2"}, {a: "dummy"}, {b:20}]}]]}, "should match full key selector"],
        [{"key0.key1.0": [[{key2: [{a:"value2"}, {a: "dummy"}, {b:20}]}]]}, "should match with key<-->index selector"],
        [{"key0.key1.0.0": [{key2: [{a:"value2"}, {a: "dummy"}, {b:20}]}]}, "should match with key<-->multi-index selector"],
        [{"key0.key1.0.0.key2": [{a:"value2"}, {a: "dummy"}, {b:20}]}, "should match with key<-->multi-index<-->key selector"]
      ];
    
    fixtures.forEach(function (f) {
       var query = f[0],
           msg = f[1];
       var result = Mingo.find(data, query).all();
       console.log(_.isEqual(result, data) + "> " + msg);
    });
    console.log(">>>>>>>>>>>>> finish test 2 <<<<<<<<<<<<<");
    // output below
    /* #Test 1: matching deep nested objects 
    false> should not match without array index selector to nested value
    false> should not match without enough depth for array index selector to nested value
    true> should match with full array index selector to deeply nested value
    true> should match with array index selector to nested value at depth 1
    false> should match with full array index selector to nested value
    true> should match without array index selector to nested value at depth 1
    false> should match shallow nested value with array index selector
    >>>>>>>>>>>>> finish test 1 <<<<<<<<<<<<<
    #Test 2: matching whole objects
    
    /Users/francis/workspace/mingo/mingo.js:505
            throw new Error("Invalid query operator '" + operator + "' detected");
                  ^
    Error: Invalid query operator '0' detected
        at Object.Mingo.Query._processOperator (/Users/francis/workspace/mingo/mingo.js:505:15)
        at Object.Mingo.Query._compile (/Users/francis/workspace/mingo/mingo.js:493:22)
        at Object.Mingo.Query (/Users/francis/workspace/mingo/mingo.js:472:10)
        at Object.Mingo.find (/Users/francis/workspace/mingo/mingo.js:820:13)
        at /Users/francis/workspace/mingo/test/test.js:44:23
        at Array.forEach (native)
        at Object.<anonymous> (/Users/francis/workspace/mingo/test/test.js:41:10)
        at Module._compile (module.js:456:26)
        at Object.Module._extensions..js (module.js:474:10)
        at Module.load (module.js:356:32)
    */

    Test 1 does not pass all equality checks
    Test 2 throws an exception

    Tested with [email protected].

    $add is not compliant with dates

    From mongo documentation:

    "Adds numbers together or adds numbers and a date. If one of the arguments is a date, $add treats the other arguments as milliseconds to add to the date." documentation link.

    The current implementation just sums the values; the change below would make it work as intended:

      /**
       * Adds numbers together, or adds numbers and a date. If one of the arguments
       * is a date, the other arguments are treated as milliseconds to add to the date.
       *
       * @param obj
       * @param expr
       * @returns {Object}
       */
      $add (obj, expr) {
        var args = computeValue(obj, expr)
        var date = args.find(function(argument) {
          return argument instanceof Date;
        })
        if (date) {
          args = args.map(function(argument) {
            if (argument instanceof Date) {
              return argument.getTime()
            }
            return argument
          });
        }
        var reduced = reduce(args, function (acc, num) {
          return acc + num
        }, 0);
        if (date) {
          return new Date(reduced);
        }
        return reduced;
      },

    webpack critical dependencies warnings

    Seeing this warning with webpack:

    webpackHotDevClient.js:196./~/mingo/mingo.js
    Critical dependencies:
    24:59-66 require function is used in a way in which dependencies cannot be statically extracted
     @ ./~/mingo/mingo.js 24:59-66
    
    webpackHotDevClient.js:196./~/mingo/dist/mingo.min.js
    Critical dependencies:
    1:5762-5769 require function is used in a way in which dependencies cannot be statically extracted
     @ ./~/mingo/dist/mingo.min.js 1:5762-5769
    

    possibly related to webpack/webpack#2675 (comment) ?

    Loving this project by the way! Thanks!

    runTest tests are generating false-positives

    I found out that this line is causing the problem: some runTest tests that are supposed to fail are not failing because you end up comparing 0 with 0, since isNaN('a string') and isNaN([1, 2]) both return true. All the tests that compare strings or arrays are therefore not valid.

    repro step:
    Try to modify any runTest test in test/string_operators.js or test/array_operators.js to make it fail (hint: it won't!).

    I think the proper way to fix it is to do the following:

    -          if (isNaN(actual) && isNaN(expected)) actual = expected = 0
    +          if (actual !== actual && expected !== expected) actual = expected = 0

    because NaN === NaN returns false.

    There are 3 tests that fail with the suggested change, 2 of them related to unicode characters and string operators, which I am not sure how to solve.

    webpack+UglifyJs build fail because of picking of the es6 version of mingo

    After mingo's package.json added "module": "index.js", webpack seems to pick index.js as the entry point, which is ES6.

    However, UglifyJS complains because it is not ES6-ready:

    ERROR in app-b0821b8c5ea25b9a55ff.js from UglifyJs
    Unexpected token operator «=», expected punc «,» [app-b0821b8c5ea25b9a55ff.js:9081,28]
    

    After digging into line 9081, I found that UglifyJS complains about the default argument value (= null):

    function each (obj, fn, ctx = null) {

    I tried using babili-webpack-plugin instead of uglifyjs-webpack-plugin, which takes forever to compile my code.

    My webpack setting

        rules: [
          {
            test: /\.jsx?$/,
            exclude: /(node_modules|bower_components)/,
            loader: 'babel-loader',
            query: {
              presets: [
                [
                  'env',
                  {
                    targets: {
                      browsers: ['>1%', 'last 2 ChromeAndroid versions'],
                      uglify: true,
                    },
                  },
                ],
              ],
            },
          },

    I have also tried including all of node_modules in my Babel transpile, but some other packages fail to transpile because some devDependency Babel plugins are not installed.

    I am no expert on importing or exporting ES6 modules, and it seems webpack is not perfect at choosing between the ES5 and ES6 versions. I am not sure how to enforce the ES5 version for the browser and the ES6 version for Node.

    Any suggestion on how to use mingo with webpack?

    Thanks

    • I have not yet tried Rollup, but that would require converting all my webpack configs...

    cloneDeep is not cloning all properties when cloning a boolean with false value

    From the $addFields MongoDB documentation:

    Adds new fields to documents. $addFields outputs documents that contain all existing fields from the input documents and newly added fields.

    const { Aggregator } = require('mingo');
    const oper = [{
      $addFields:{
        accountInfo:{
          $arrayElemAt:["$accounts", 0]
        }
      }
    }];
    const col = [{
      _id: "59c52580809dd0032d75238a",
      email: "[email protected]",
      accounts: [{
        createdAt: "2017-09-22T15:00:17.418Z",
        updatedAt: "2017-09-22T15:00:17.418Z",
      }],
    }];
    const aggregatedCollection = new Aggregator([oper]).run(col);
    const expected = [{
      _id: "59c52580809dd0032d75238a",
      email: "[email protected]",
      accountInfo: {
        createdAt: "2017-09-22T15:00:17.418Z",
        updatedAt: "2017-09-22T15:00:17.418Z",
      },
      accounts: [{
        createdAt: "2017-09-22T15:00:17.418Z",
        updatedAt: "2017-09-22T15:00:17.418Z",
      }],
    }];
    
    // no accounts array
    const got = [{
      _id: "59c52580809dd0032d75238a",
      email: "[email protected]",
      accountInfo: {
        createdAt: "2017-09-22T15:00:17.418Z",
        updatedAt: "2017-09-22T15:00:17.418Z",
      },
    }];

    ObjectId comparison

    this returns wrong result:

    const ObjectId = require('mongoose').Types.ObjectId
    const { Query } = require('mingo')
    const id = ObjectId()
    
    const q = new Query({ id })
    
    console.log(q.test({ id: id.toString() })) //false
    console.log(q.test({ id: ObjectId(id.toString()) })) //false

    I expect both checks to return true.

    Is there anyway to check equality of ObjectIds?

    Partial use

    The minified file of the library is currently 30+ kB. Is there a way to use only the operators and functions I need? Can I compile my own version of the library with the help of tools like Rollup or webpack?

    It would be good to provide an ES6 version of the lib, so people will be able to use tree shaking and remove code they don't use.

    Query for Null or Missing Fields

    Thanks for your package first. That is really amazing.

    I found that the behaviour of find({ someField: null }) is not the same as MongoDB's:

    https://docs.mongodb.com/manual/tutorial/query-for-null-fields/

    in which null should also match documents where the field is missing.
    Should mingo align with the MongoDB spec, or is this the expected behaviour?
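    A minimal illustration of the difference (hypothetical data; results as described above):

    // MongoDB's find({ a: null }) matches both documents: the explicit null and the missing field.
    // mingo was observed to match only the document with an explicit null.
    mingo.find([{ a: null }, { b: 1 }], { a: null }).all();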

    BTW, I have done a little benchmark and found that mingo is much faster than sift and on the same level as nedb and lokijs. Thanks a lot.

    Nested projections do not function correctly

    If you have a nested projection, it fails to match:

    var Mingo = require('mingo');
    
    var db = [{ "name": "Steve", "age": 15, "features": { "hair": "brown", "eyes": "brown" } } ];
    
    Mingo.find(db, {}, { "features.hair": 1 }).all(); // [ { } ]
    

    In MongoDB, this would return [ { "_id" : ObjectId("57b86248bc6b3372ac511267"), "features" : { "hair" : "brown" } } ].

    Passing Criteria as a parameter to query.test

    Hi,

    This works (on NodeJS 7.7):

      let lowScore = {type: "homework", score: 30};
      let highScore = {type: "homework", score: 80};
      mingo.setup({});
      let query = new mingo.Query({
        type: "homework",
        score: { $gte: 50 }
      });
      let isLowScore = query.test(lowScore);
      console.log("Score Above 50 ", isLowScore);
      let isHighScore = query.test(highScore);
      console.log("Score Above 50 ", isHighScore);

    and produces:

    Score Above 50  false
    Score Above 50  true
    

    The below throws the error "Criteria must be of type Object". The issue is obviously that the criteria is initialized as a string and not an object. In my use case I need to read the criteria from a database, and it comes back as a string. Is it possible to do this? I know it might be a trivial question.

      let lowScore = {type: "homework", score: 30};
      let highScore = {type: "homework", score: 80};
      let criteria = '{type: "homework", score: { $gte: 50 } }';
      mingo.setup({});
      let query = new mingo.Query(criteria);
      let isLowScore = query.test(lowScore);
      console.log("Score Above 50 ", isLowScore);
      let isHighScore = query.test(highScore);
      console.log("Score Above 50 ", isHighScore);

    Thanks
    Leon

    Feature request: defining custom Operators.

    One thing I find myself wishing for when using Mingo is the ability to define custom operators.

    Without the ability to define custom operators, I find myself wrapping aggregators in my own functions with aggregator-like APIs so that I can perform unsupported mutations or actions on data in such a way that the logic is centralized in my application and I don't repeat myself needlessly. If given the ability to define custom operators, the need for wrapping of Mingo's core functions in other functions may remove itself, and Mingo can be used as an implementation of a domain specific language that's very MongoDB-like.

    I've given a look at the source code, and it appears that exposing the Ops and *Operators objects via a nice API should do the trick. Of course, users of the extension API need to realize that they might break things severely, but ensuring that they don't would be their own responsibility.

    I was wondering what your thoughts were on this. Do you think this is within scope of the project?

    Please support query with mongoose ObjectId

    Please consider the following situation:

    // users
    [
      {
        _id : ObjectId("1"),
        someKey : "someValue"
      }
    ]
    
        var query = new Mingo.Query({_id : 1});
        var cursor = query.find(users);
        return cursor.all(); // [] empty array, but it should contain {_id : ObjectId("1"), someKey : "someValue"}
    

    Looking forward to this!

    $regex not working as expected with arrays

    When trying to find() a property that is nested in an array of arrays, the record is not returned if the child array has more than one element and one or more of those elements match.

    Repro steps:

    // no regex - returns expected list: 1 element - ok
    mingo.find([{l1: [{ tags: ['tag1', 'tag2'] }, {'notags': 'yep'}]}], {'l1.tags': 'tag1'}).all()
    
    // with regex - but searched property is not an array: ok
    mingo.find([{l1: [{ tags: 'tag1'}, {'notags': 'yep'}]}], {'l1.tags': {$regex: '.*tag.*', $options: 'i'}}).all()
    
    // with regex - but searched property is an array, with all elements matching: not ok - expected 1, returned 0
    mingo.find([{l1: [{ tags: ['tag1', 'tag2'] }, {'tags': ['tag66']}]}], {'l1.tags': {$regex: 'tag', $options: 'i'}}).all();
    
    // with regex - but searched property is an array, only one element matching: not ok - returns 0 elements - expected 1
    mingo.find([{l1: [{ tags: ['tag1', 'tag2'] }, {'notags': 'yep'}]}], {'l1.tags': {$regex: 'tag', $options: 'i'}}).all();
    
    // 
    

    Option to modify the pipeline documents for $lookup

    There should be an option to modify the input objects and add a reference to the lookup data, rather than return new (cloned) objects.

    This way, if there has been a $project stage, we can safely opt-out of cloning again.

    See #59.

    internal.computeValue() throws if keys is not present under certain circumstances

    Repro step:

    // this works
    mingo._internal().computeValue({l1: {l2: [{l3: [{ a: 666}]}]}}, '$l1.l2.l3.a');
    
    // this throws an exception
    mingo._internal().computeValue({l1: {l2: [{l3: [{ a: 666}]}, {l4: 'some prop'}]}}, '$l1.l2.l3.a');
    

    Thrown exception:

    TypeError: Cannot read property 'a' of undefined
        at getValue (path/to/project/node_modules/mingo/dist/mingo.js:3545:13)
        at _loop (path/to/project/node_modules/mingo/dist/mingo.js:3617:15)
        at resolve (path/to/project/node_modules/mingo/dist/mingo.js:3625:16)
        at path/to/project/node_modules/mingo/dist/mingo.js:3605:16
        at Array.map (native)
        at _loop (path/to/project/node_modules/mingo/dist/mingo.js:3604:21)
        at resolve (path/to/project/node_modules/mingo/dist/mingo.js:3625:16)
        at Object.computeValue (path/to/project/node_modules/mingo/dist/mingo.js:3840:12)
        at repl:1:19
        at sigintHandlersWrap (vm.js:22:35)
    

    It seems like computeValue is trying to find a property a in 'some prop'.

    BUG: $all and $elemMatch fails on nested elements

    Hey there. First off: awesome stuff you did with mingo. However, I encountered a bug while querying a nested array using $all and $elemMatch. I set up a test case to illustrate the issue:

    test("elemMatch on nested elements", function () {
    
      var testobject = [{
          user: { username: 'User1', projects: [{ name: "Project 1", rating: { complexity: 6 }}, { name: "Project 2", rating: { complexity: 2 }}] }
      },{
          user: { username: 'User2', projects: [{ name: "Project 1", rating: { complexity: 6 }}, { name: "Project 2", rating: { complexity: 8 }}] }
      }],
      query = {
        'user.projects': {
          "$all": [{
            "$elemMatch": {
              'rating.complexity': { '$gt' : 6 }
            }
          }]
        }
      };
      // It should return one user object
      ok(Mingo.compile(query).test(testobject));
    
    });

    Expected behavior: http://docs.mongodb.org/manual/reference/operator/query/all/#use-all-with-elemmatch

    Nested Arrays

    Hi,

    It looks like mingo doesn't correctly return values in nested arrays. For example, with this data set:

    var data = [{
        key0: [{
            key1: [ {key2: "value2"} ],
            key1a: {key2a: "value2a"}
        }]
    }];

    The expression {"key0.key1a.key2a": "value2a"} returns the object, while the expression {"key0.key1.key2": "value2"} returns an empty array. Am I constructing the query wrong?

    Thanks,
    Dan

    $group operators should work in parallel

    $group operations are insanely slow. Grouping 3000 documents to 25 properties from memory takes about 15 seconds on my laptop. This operation takes 0.1 second from disk in mongodb.

    One of the problems seems to be that each OP_GROUP operator ($sum, $push, custom operators, etc.) iterates over the collection by itself. This means that 3000 documents are iterated 25 times.

    The collection should be iterated only once, and every operator should work within that iteration.

    Documentation: what's a collection?

    This might seem obvious, but the documentation uses the term collection everywhere without ever giving an example of what a collection should look like. I think having a quick illustration of what you mean (array? JS object? something else?) would be useful.
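    An illustration along the lines of the Usage section above, which accepts an array (or any iterable) of plain objects, would cover this:

    var collection = [
      { _id: 1, type: "homework", score: 70 },
      { _id: 2, type: "quiz", score: 50 }
    ];

    Mingo.find(collection, { type: "homework" }).all(); // [ { _id: 1, type: "homework", score: 70 } ]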

    Consider moving to use Lodash rather than Underscore

    There are several places where Lodash has functions that Underscore doesn't, which could be used internally, and none the other way around.

    E.g. the _.result causing #25 would be _.get in Lodash but Underscore has no equivalent.

    Is there a possibility Underscore could be replaced?

    Thoughts on hashing optimizations

    Your hashing function looks like the hashCode algorithm. Is that correct?

    Hypothetically

    Would you be willing to have any dependencies?
    Would you be willing to have different dependencies for the npm/node.js version?

    • murmurhash3 (pure javascript) is about 140% as fast as hashCode.
    • cityhash (c++ node.js module) is about 280% as fast as hashCode

    $lookup should reference join collection instead of clone data.

    I'm not sure what's happening in lookup.js:

    each(joinColl, (obj, i) => {
        let k = hashCode(obj[foreignField])
        hash[k] = hash[k] || []
        hash[k].push(i)
      })
    
      each(collection, (obj) => {
        let k = hashCode(obj[localField])
        let indexes = hash[k] || []
        obj[asField] = map(indexes, (i) => clone(joinColl[i]))
        result.push(obj)
      })
    

    I'm confused by this specifically:

    obj[asField] = map(indexes, (i) => clone(joinColl[i]))

    Is the $lookup data being cloned, or added by reference?
    Cloning, of course, is trivial for small datasets but huge (unnecessary) overhead on big datasets.

    $slice works in a different way compared to mongodb

    Expected: calling $slice with a single argument in MongoDB returns that many items from the array, beginning at position zero.

    Actual: calling mingo's $slice with a single argument takes all items counting from that position in the array.
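    An illustrative comparison of the two behaviours (hypothetical data):

    // MongoDB: { $slice: ["$a", 2] } returns the first two items => [1, 2]
    // mingo (as reported): returns the items starting at position 2 => [3, 4]
    mingo.aggregate([{ a: [1, 2, 3, 4] }],
                    [{ $project: { a: { $slice: ["$a", 2] } } }]);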

    aggregate `$project` mongoDB discrepancy

    I noticed a difference between mongodb and mingo behaviour:

    // mongoDB
    db.test.insert({l1: [{l2: [{ l3: 'level3', l3bis: 'level3bis'}]}, {l2bis: 'level2bis'}]});
    db.test.aggregate([{$project: { l3: '$l1.l2.l3'}}]);
    // ==> { "_id" : ObjectId("..."), "l3" : [ [ "level3" ] ] }
    
    // mingo
    mingo.aggregate(
        [{_id: 'some-id', l1: [{l2: [{ l3: 'level3', l3bis: 'level3bis'}]}, {l2bis: 'level2bis'}]}],
        [{$project: { l3: '$l1.l2.l3'}}]
    );
    // ==> [ { _id: 'some-id', l3: 'level3' } ]
    

    Notice that l3 is not an array of array.

    It is probably related to this line.

    I don't consider this a problematic issue, I just wanted to let you know...

    Mingo.setup() removed

    I use setup to change the default key from _id. Since the ES6 refactor this is no longer possible.

    assert(typeof require('mingo').setup === 'function') fails.

    The readme still shows setup(). If it was intended to be removed, the readme should be updated.

    Feature request: Test function

    Hi there

    I would love a feature to do the following: test an arbitrary object to see if it would pass or fail a "find"-style expression. I'm sure you have it internally; would it be hard to expose it in the API?

    var Mingo = require('mingo');
    // or just access Mingo global in browser

    // setup the key field for your collection
    Mingo.setup({
      key: '_id' // default
    });

    // create a query with criteria
    // find all grades for homework with score >= 50
    var result = Mingo.Test({ type: "homework", score: 55 }, { score: { $gte: 50 } });

    if (result)
    alert("Passed!");
    else
    alert("Failed!");

    Is this possible?

    Meteor minimongo vs mingo

    Meteor also has a pure-js implementation of (a large part of) the mongo API.

    I know it's not as feature-complete (e.g. no aggregation), but I was wondering if there has been a performance comparison between the two?
