Comments (14)

ofrobots commented on August 30, 2024

Thanks for opening the issue. Yeah, this is problematic – but working as designed in the serverless execution model. This is related to the fundamental issue with the serverless execution model where background work may be quiesced unpredictably. bunyan itself doesn't expose an interface to users to know when logs are "done".

On Google Cloud Functions, console logging will actually work better because it is scraped through a different, runtime-independent process. The scraper is able to process JSON structured LogEntry records correctly; but it would be inconvenient for users to format the logs that way manually.

I think a proper way to fix this would be for us to switch from the gRPC-based network transport to a console-writing transport and rely on scraping inside the container to ship the log data off to Stackdriver. This feature has been on the back burner for a while; maybe it is time to prioritize it.

npomfret commented on August 30, 2024

I had some success with @retrohacker's technique of closing the bunyan stream, applied inside an exit handler (using async-exit-hook) for the function. When the function dies, my custom error handler logs to bunyan as normal, the exit hook then closes the stream, and I see my errors logged in Stackdriver.
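
A minimal sketch of that setup, assuming the async-exit-hook package and the stream object returned by LoggingBunyan; the logger name here is illustrative:

'use strict'

const exitHook = require('async-exit-hook')
const bunyan = require('bunyan')
const { LoggingBunyan } = require('@google-cloud/logging-bunyan')

const stackdriver = (new LoggingBunyan()).stream('info')
const log = bunyan.createLogger({ name: 'logtest', streams: [stackdriver] })

// When the process is about to exit, end the transport stream and only
// signal completion (done) once buffered entries have been flushed.
exitHook((done) => {
  stackdriver.stream.end(done)
})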

npomfret commented on August 30, 2024

No sorry, I couldn't get it to work. Tried absolutely everything I could think of. Logging in cloud functions seems very broken. The discussion is moving here: https://issuetracker.google.com/issues/149668307. Please chip in with your problems.

losalex commented on August 30, 2024

I believe that the best solution here is to use console logging, as mentioned before by @ofrobots. The nodejs-logging library introduced a LogSync class which supports structured logging output to the console. From the console, log records are picked up by the agents running on Cloud Functions nodes and delivered to Cloud Logging. However, the LogSync class is not integrated with nodejs-logging-bunyan and requires a code change. Please thumbs-up this issue if you think we should consider integrating LogSync in this library.
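
For reference, a minimal sketch of using LogSync directly from @google-cloud/logging; the log name and payload are illustrative:

const { Logging } = require('@google-cloud/logging')

const logging = new Logging()
const log = logging.logSync('logtest')

exports.handler = (req, res) => {
  // write() serializes the entry as structured JSON to stdout, where the
  // logging agent picks it up -- there is no network flush to wait for.
  const entry = log.entry({ severity: 'INFO' }, { message: 'hello world' })
  log.write(entry)
  res.send('ok')
}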

retrohacker commented on August 30, 2024

The scraper is able to process JSON structured LogEntry records correctly;

I may be misunderstanding what you are saying here. I switched to this logger from using pino directly because pino's logs were showing up as text in Stackdriver.

Is there a difference in a cloud function between the console.* methods and process.stdout?

ofrobots commented on August 30, 2024

@retrohacker apologies for not being clear. It is possible to write JSON structured according to the LogEntry message schema. These entries are supposed to be parsed and presented as structured log entries. Now, personally, I have not experimented with this much, so there might be details that are missing. As part of addressing this issue we can flesh this out in more detail. In the meanwhile, feel free to give it a shot with pino and let us know how it goes.
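
To make that concrete, a hypothetical example of emitting one LogEntry-shaped JSON line to stdout; the field names follow Cloud Logging's structured-logging conventions, and the payload is illustrative:

// Each line of stdout becomes one log record; severity, message and
// timestamp are recognized LogEntry fields.
process.stdout.write(JSON.stringify({
  severity: 'INFO',
  message: 'hello world',
  timestamp: new Date().toISOString()
}) + '\n')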

ofrobots commented on August 30, 2024

@retrohacker

Is there a difference in a cloud function between the console.* methods and process.stdout?

Yes, the console.* functions on GCF have special logic to format the entire payload as a LogEntry with a textPayload. This makes multi-line log messages work better; process.stdout streams don't have this extra logic.
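
To illustrate the difference described above (illustrative only; the exact output shapes depend on the GCF runtime):

// On GCF, console.log wraps the whole payload into a single LogEntry
// (textPayload), so a multi-line message stays together as one entry.
console.log('line one\nline two')

// Raw stdout writes bypass that wrapping, so the lines may be split
// into separate, unstructured entries.
process.stdout.write('line one\nline two\n')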

ofrobots commented on August 30, 2024

I switched to this logger from using pino directly because pino's logs were showing up as text in Stackdriver.

@retrohacker would it be possible for you to show an example/screenshot of what you were observing with pino?

retrohacker commented on August 30, 2024

I seem to have been misremembering. Pino logs aren't showing up at all with the default destination.

I've actually managed to get reliable logs by doing:

'use strict'

const { LoggingBunyan } = require('@google-cloud/logging-bunyan')
const bunyan = require('bunyan')

exports.logtest = function logtest (message, event, cb) {
  // Stand up the transport and logger per invocation so the stream can
  // be torn down (and flushed) before the function returns.
  const stackdriver = (new LoggingBunyan()).stream('info')
  const log = bunyan.createLogger({
    name: 'logtest',
    level: 'info',
    streams: [
      stackdriver
    ]
  })
  log.info({ message, event }, 'processing')
  log.info({ a: Math.random(), b: Math.random() })
  // Ending the stream flushes buffered entries; the callback fires only
  // after the flush completes.
  stackdriver.stream.end(cb)
}

A bit of a bummer since it requires standing up and tearing down the whole logging framework on every request, but it gets the job done.

retrohacker commented on August 30, 2024

This seems to be a decent pattern if we can find a lighter-touch way to stand up/tear down the stream:

// Attach the Stackdriver transport to the supplied bunyan config and
// expose a helper that flushes the stream before invoking a callback.
function initLogger (config) {
  const stackdriver = (new LoggingBunyan()).stream('info')
  config.streams = config.streams || []
  config.streams.push(stackdriver)
  const log = bunyan.createLogger(config)
  // Wraps a callback, ensuring logs are flushed prior to invoking it
  log.callback = (cb) => () => stackdriver.stream.end(cb)
  return log
}

exports.logtest = function logtest (message, event, cb) {
  const log = initLogger({
    level: 'info',
    name: 'logtest'
  })
  const callback = log.callback(cb)
  log.info('hello world!')
  callback()
}

ofrobots commented on August 30, 2024

The scraper is able to process JSON structured LogEntry records correctly; but it would be inconvenient for users to format the logs that way manually.

What I have found out is that GCF is slightly different from the other serverless environments: it doesn't have a scraper; instead, the console.* functions are modified to write the logs. The problem is that they write them as textPayload, so the approach of writing JSON that I presented earlier will not work. The GCF team is working on adding structured logging support: https://issuetracker.google.com/132647512

We also have the problem that output written to process.stdout is not propagated. This is also being investigated.

Once we have structured logging support, we can switch the bunyan and winston transport libraries to use console-based logging when appropriate. That will fix this problem reliably.

kanekv commented on August 30, 2024

@npomfret what did you do exactly?

rymnc commented on August 30, 2024

Currently facing this issue with cron jobs that complete quickly (~15 seconds). I've tried the method highlighted here, right before a process.exit(0), but it seems like the promise is never resolved. Has anyone been able to find a permanent solution to this?

losalex commented on August 30, 2024

Thanks a lot for your patience - I believe this issue is addressed by #605. Feel free to reopen or comment if you have more concerns.
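
For anyone landing here later, a minimal sketch of the console-based setup that change enables, assuming a version of @google-cloud/logging-bunyan that supports the redirectToStdout option; the logger name is illustrative:

const bunyan = require('bunyan')
const { LoggingBunyan } = require('@google-cloud/logging-bunyan')

// redirectToStdout makes the transport emit structured JSON to stdout
// instead of calling the Cloud Logging API over gRPC, so there is no
// background buffer to flush before the function exits.
const loggingBunyan = new LoggingBunyan({ redirectToStdout: true })

const log = bunyan.createLogger({
  name: 'logtest',
  streams: [loggingBunyan.stream('info')]
})

log.info('hello world')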
