logdna-bunyan's People

Contributors

darinspivey, logdnabot, lvilya, mdeltito, racbart, respectus, s100, smusali, vilyapilya

logdna-bunyan's Issues

Use of deprecated package "logdna"

I see that the logdna package has recently been deprecated:

logdna has been renamed to @logdna/logger. Please install the latest version of @logdna/logger instead of logdna

Will this package be updated to depend on @logdna/logger? Or is there a replacement for this package we should use instead?

Thanks.

Message sequence issue

Hello,

We tried LogDNA in our staging environment and found that, in some cases, log messages from different microservices were mixed out of sequence. We thought we had found the core of the issue here:

// LogDNA adds their own - lets assume the time delta is trivial
// record['timestamp'] = record.time;

In log-heavy cases the delta is definitely not trivial. :)

But that did not fix our issue. :(
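
For reference, here is a minimal sketch of one possible workaround: wrap the LogDNA stream in a raw stream that copies bunyan's time onto a timestamp field of a cloned record before delegating, mirroring the commented-out line above. Whether the LogDNA client actually honours a timestamp field set this way is an assumption, not something the library documents.

const bunyan = require('bunyan');
const { BunyanStream } = require('logdna-bunyan');

const logDNA = new BunyanStream({ key: '...apikey...' });

// Raw stream wrapper: clone the record so other streams are unaffected,
// and forward bunyan's original `time` as `timestamp` (the field the
// commented-out line above would have set). Assumes the LogDNA client
// picks up `timestamp` from the record.
const timestampedLogDNA = {
  write(record) {
    logDNA.write(Object.assign({}, record, { timestamp: record.time }));
  },
};

const logger = bunyan.createLogger({
  name: 'service',
  streams: [{ stream: timestampedLogDNA, type: 'raw' }],
});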

Unformatted JSON does not get logged and freezes the whole application

NOTE: Issue moved from logdna/nodejs#24

Node.js: 8.9.1
logdna-bunyan: ^1.0.0
bunyan: ^1.8.12
sequelize: ^4.8.4

How to reproduce:

  1. Run an application that uses Bunyan with LogDNA's BunyanStream sub-module for logging, and Sequelize for the DB connection.
  2. Try to log a raw, unformatted Sequelize object.

What happens:
The log call never completes (the information is not logged and never reaches LogDNA).
The server app freezes.

More info:
This is the unformatted Sequelize object being logged:

[ registration {
    dataValues:
     { id: '1',
       customerId: '1',
       date: '2017-10-10',
       userId: '1',
       amount: 1800,
       unit: 'kg',
       currency: 'DKK',
       kgPerLiter: 15,
       cost: '5000',
       comment: 'Hello test',
       manual: true,
       scale: 'true',
       updatedAt: 2017-10-09T10:30:42.000Z,
       createdAt: 2017-10-09T10:30:42.000Z,
       deletedAt: null,
       areaId: '2',
       productId: '2',
       area_id: '2',
       product_id: '2' },
    _previousDataValues:
     { id: '1',
       customerId: '1',
       date: '2017-10-10',
       userId: '1',
       amount: 1800,
       unit: 'kg',
       currency: 'DKK',
       kgPerLiter: 15,
       cost: '5000',
       comment: 'Hello test',
       manual: true,
       scale: 'true',
       updatedAt: 2017-10-09T10:30:42.000Z,
       createdAt: 2017-10-09T10:30:42.000Z,
       deletedAt: null,
       areaId: '2',
       productId: '2',
       area_id: '2',
       product_id: '2' },
    _changed: {},
    _modelOptions:
     { timestamps: true,
       validate: {},
       freezeTableName: true,
       underscored: false,
       underscoredAll: false,
       paranoid: true,
       rejectOnEmpty: false,
       whereCollection: [Object],
       schema: null,
       schemaDelimiter: '',
       defaultScope: {},
       scopes: [],
       hooks: {},
       indexes: [Object],
       name: [Object],
       omitNull: false,
       getterMethods: [Object],
       sequelize: [Object],
       uniqueKeys: {} },
    _options:
     { isNewRecord: false,
       _schema: null,
       _schemaDelimiter: '',
       raw: true,
       attributes: [Object] },
    __eagerlyLoadedAssociations: [],
    isNewRecord: false } ]

It does not get logged and the whole app freezes, becoming unresponsive to further requests.
However, if I format the object into the plain data that actually needs to be logged:

[
    {
        "date": "2017-10-10",
        "createdAt": "2017-10-09 10:30:42",
        "updatedAt": "2017-10-09 10:30:42",
        "id": "1",
        "customerId": "1",
        "userId": "1",
        "amount": 1800,
        "unit": "kg",
        "currency": "DKK",
        "kgPerLiter": 15,
        "cost": "5000",
        "comment": "Hello test",
        "manual": true,
        "scale": "true",
        "deletedAt": null,
        "areaId": "2",
        "productId": "2",
        "area_id": "2",
        "product_id": "2"
    }
]

Then everything works fine.
I perform the formatting (as suggested by Sequelize) with JSON.parse(JSON.stringify(object)).

Of course, I don't want to log the unformatted object, because it contains a lot of redundant data, and I am going to format my objects before logging them.
I just thought it'd be nice to let you know about this issue.
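
For reference, a minimal sketch of how the same JSON.parse(JSON.stringify(...)) formatting can be applied automatically with a bunyan serializer, so Sequelize internals never reach the LogDNA stream. The field name rows is just an example:

const bunyan = require('bunyan');

const logger = bunyan.createLogger({
  name: 'app',
  serializers: {
    // Flatten Sequelize instances to plain data before any stream sees them.
    rows: (value) => JSON.parse(JSON.stringify(value)),
  },
});

// Usage: log query results under the serialized field
// (`registrations` is a placeholder for the result shown above).
// logger.info({ rows: registrations }, 'Fetched registrations');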

Use latest logdna version

The logdna-bunyan library currently depends on logdna ^2.0.0, but that package is now on v3.

Can logdna-bunyan move to logdna ^3.0.0 to pick up the latest fixes and features?

Stream's write function mutates external record objects, making them unusable for other bunyan streams

The LogDNA stream deletes properties from the record object, and the modified object is then passed to any other raw bunyan stream declared after the LogDNA stream, resulting in incomplete data being logged by those streams.

This happens because JavaScript passes objects to functions by reference: modifying an argument inside the function modifies the caller's object, not a local copy (there is no local copy). A stream driver should not mutate the arguments it receives.

Example:

const bunyan = require('bunyan');

/* Basic raw stream that prints each record to the console */
class SimpleStream {
  write(record) {
    console.log(record);
  }
}

const LogDNAStream = require('logdna-bunyan').BunyanStream;
const logDNA = new LogDNAStream({
  key: '...apikey...'
});

const logger = bunyan.createLogger({
  name: 'test',
  streams: [
    { stream: new SimpleStream(), type: 'raw' }, // first: basic stream
    { stream: logDNA, type: 'raw' },             // second: logdna
    { stream: new SimpleStream(), type: 'raw' }, // third: basic stream
  ]
});

logger.warn('This is a warning');

The above logger instance uses three streams. The first prints to the console, the second is a LogDNAStream, the third prints to the console.

Expected behaviour: the first and third streams print identical objects to the console.

Actual behaviour:

{ name: 'test',
  hostname: 'hostname',
  pid: 3296,
  level: 40,
  msg: 'This is a warning',
  time: 2018-04-21T10:36:05.294Z,
  v: 0 }
{ hostname: 'hostname',
  pid: 3296,
  time: 2018-04-21T10:36:05.294Z,
  v: 0 }

Result: the LogDNA stream deleted some properties, so later streams receive an incomplete record. Worse, the most important fields are removed, the level and the message itself, which makes the record useless for the remaining streams.
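
For illustration, a minimal sketch of how a raw stream can avoid this: shallow-copy the record before removing anything, so deletions stay local to the stream. The fields removed here (msg, level, name) mirror the ones missing from the output above; the actual internals of logdna-bunyan may differ.

class NonMutatingStream {
  write(record) {
    // Work on a shallow copy so the caller's record stays intact for other streams.
    const local = Object.assign({}, record);
    const message = local.msg;
    delete local.msg;
    delete local.level;
    delete local.name;
    // ...forward `message` with `local` as metadata to the logging backend...
  }
}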
