Async logging doesn't complete before function termination #304
Comments
Thanks for opening the issue. Yeah, this is problematic, but working as designed: it is a fundamental issue with the serverless execution model, where background work may be quiesced unpredictably. On Google Cloud Functions, console logging actually works better because it is scraped by a separate, runtime-independent process, and the scraper is able to process structured JSON. I think a proper way to fix this would be for us to switch from the gRPC-based network transport to a console-writing transport and rely on scraping inside the container to ship the log data off to Stackdriver. This feature has been on the back-burner for a while; maybe it is time to prioritize it.
I may be misunderstanding what you are saying here. I switched to this logger from using pino directly because pino's logs were showing up as plain text in Stackdriver. Is there a difference in a cloud function between the console.* methods and process.stdout?
@retrohacker apologies for not being clear. It is possible to write JSON structured according to the …
Yes, the …
@retrohacker would it be possible for you to show an example/screenshot of what you were observing with …
I seem to have been misremembering. Pino logs aren't showing up at all with the default destination. I've actually managed to get reliable logs by doing:

```js
'use strict'

const { LoggingBunyan } = require('@google-cloud/logging-bunyan')
const bunyan = require('bunyan')

exports.logtest = function logtest (message, event, cb) {
  const stackdriver = (new LoggingBunyan()).stream('info')
  const log = bunyan.createLogger({
    name: 'logtest',
    level: 'info',
    streams: [
      stackdriver
    ]
  })
  log.info({ message, event }, 'processing')
  log.info({ a: Math.random(), b: Math.random() })
  stackdriver.stream.end(cb)
}
```

A bit of a bummer since it requires standing up and tearing down the whole logging framework on every request, but it gets the job done.
This seems to be a decent pattern if we can find a lighter-touch way to stand up and tear down the stream:

```js
function initLogger (config) {
  const stackdriver = (new LoggingBunyan()).stream('info')
  config.streams = config.streams || []
  config.streams.push(stackdriver)
  const log = bunyan.createLogger(config)
  // Wrap a callback; ensures logs are flushed prior to invoking the callback
  log.callback = (cb) => () => stackdriver.stream.end(cb)
  return log
}

exports.logtest = function logtest (message, event, cb) {
  const log = initLogger({
    level: 'info',
    name: 'logtest'
  })
  const callback = log.callback(cb)
  log.info('hello world!')
  callback()
}
```
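The wrapping trick generalizes beyond bunyan: given any transport object that exposes an `end(callback)` method (as the LoggingBunyan stream does), the function's callback can be deferred until the transport has drained. A dependency-free sketch of the same pattern (`fakeTransport` is a made-up stand-in used only to illustrate the ordering guarantee):

```javascript
// Wrap `cb` so that `transport.end` runs (and drains) before `cb` fires.
function flushThen (transport, cb) {
  return (...args) => transport.end(() => cb(...args))
}

// Made-up in-memory transport standing in for a real logging stream.
const fakeTransport = {
  ended: false,
  end (done) {
    this.ended = true // pretend all buffered log entries were shipped
    done()
  }
}

const callback = flushThen(fakeTransport, () => {
  // By the time we get here, fakeTransport.ended is true.
})
callback()
```

The design point is the same as `log.callback` above: the serverless runtime only needs the *wrapped* callback to be the last thing invoked, so the flush is guaranteed to complete before the platform considers the invocation finished.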
What I have found out is that GCF is slightly different from the other serverless environments: it doesn't have a scraper, but instead … We also have the problem that … Once we have structured logging support, we can switch the …
I had some success combining @retrohacker's technique of closing the Bunyan stream inside an exit handler (using …)
@npomfret what did you do exactly?
No, sorry, I couldn't get it to work. I tried absolutely everything I could think of. Logging in cloud functions seems very broken. The discussion is moving here: https://issuetracker.google.com/issues/149668307. Please chip in with your problems.
Currently facing this issue with cron jobs that complete quickly (~15 seconds). I've tried the method highlighted here, right before a process.exit(0), but it seems like the promise never resolves. Has anyone been able to find a permanent solution to this?
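For what it's worth, calling `process.exit(0)` while the flush promise is still pending tears the process down before the promise can resolve. A minimal sketch of the ordering that avoids that race (`flushLogs` is a hypothetical stand-in for whatever flush/end call your transport exposes, not a real library API):

```javascript
// Hypothetical stand-in for the transport's real flush mechanism
// (e.g. a promisified stream.end); resolves once logs are shipped.
function flushLogs () {
  return new Promise(resolve => setTimeout(() => resolve('flushed'), 50))
}

// Await the flush BEFORE exiting; calling process.exit(0) first would
// terminate the process while the promise is still pending.
async function shutdown (exit = process.exit) {
  const result = await flushLogs()
  exit(0)
  return result
}

shutdown(() => { /* replace the no-op with process.exit in a real job */ })
```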
I believe that the best solution here is to use console logging, as mentioned before by @ofrobots. The nodejs-logging library introduced a LogSync class which supports structured logging output to the console. From the console, log records are picked up by the agents running on Cloud Functions nodes and delivered to Cloud Logging. However, …
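For anyone landing here: the agent-based approach means a function can log with plain synchronous stdout writes, so nothing needs to be flushed over the network before termination. A minimal hand-rolled sketch of that output shape (one JSON object per line, field names following Google's structured-logging conventions; this is an illustration of the format, not the LogSync API itself):

```javascript
// Write one JSON object per line to stdout. The logging agent on the node
// parses fields such as `severity` and `message` and forwards the record
// to Cloud Logging; the write is synchronous, so no log entry is lost
// when the function terminates.
function logStructured (severity, message, extra = {}) {
  const entry = { severity, message, ...extra }
  process.stdout.write(JSON.stringify(entry) + '\n')
  return entry
}

logStructured('INFO', 'processing request', { requestId: 'abc123' })
logStructured('ERROR', 'something went wrong')
```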
Thanks a lot for your patience - I believe this issue is addressed by #605. Feel free to reactivate or comment if you have more concerns.
When using the bunyan wrapper to log from a Google Cloud Function, there doesn't appear to be a way to tell when the logs have been flushed to Stackdriver. If you don't give the logger time to flush, the logs won't show up.
Environment details

- @google-cloud/logging-bunyan version: ^0.10.1

Steps to reproduce

If you run this, the logs don't show up. If you change

```js
return cb()
```

to

```js
return setTimeout(cb, 1000)
```
the logs usually show up.