
Implement IAsyncDisposable and use FlushAsync on TelemetryClient #228

Merged 1 commit into serilog-contrib:dev on Oct 31, 2024

Conversation

@agehrke (Contributor) commented Oct 30, 2024:

Currently the docs have a whole section on How, When and Why to Flush Messages Manually. We can help consumers limit the risk of losing log messages when the application is shutting down by making use of the TelemetryClient.FlushAsync() method. By implementing IAsyncDisposable in ApplicationInsightsSink, we can call FlushAsync(), which ensures all log messages are flushed to App Insights; the Flush() method only enqueues a flush.
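
A minimal sketch of the idea, assuming a sink that owns a TelemetryClient (type and member names are simplified for illustration and are not the actual ApplicationInsightsSink code):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.ApplicationInsights;
using Serilog.Core;
using Serilog.Events;

// Sketch only: a sink that flushes asynchronously when disposed asynchronously.
public class ExampleApplicationInsightsSink : ILogEventSink, IDisposable, IAsyncDisposable
{
    private readonly TelemetryClient _telemetryClient;

    public ExampleApplicationInsightsSink(TelemetryClient telemetryClient)
        => _telemetryClient = telemetryClient;

    public void Emit(LogEvent logEvent)
    {
        // Convert the log event to telemetry and track it (omitted here).
    }

    public void Dispose()
    {
        // Flush() only enqueues the buffered telemetry for transmission.
        _telemetryClient.Flush();
    }

    public async ValueTask DisposeAsync()
    {
        // FlushAsync() awaits the transmission hand-off, so buffered events
        // are less likely to be lost when the host exits immediately afterwards.
        await _telemetryClient.FlushAsync(CancellationToken.None).ConfigureAwait(false);
    }
}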

This PR also adds a few tests around the channel being flushed when the logger is disposed.
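
Roughly the kind of assertion such tests make, sketched here against the Application Insights SDK types only (the hypothetical RecordingChannel and test below are for illustration and are not the actual test code from this PR, which exercises the sink through logger disposal):

using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.Extensibility;
using Xunit;

// Hypothetical stub channel that records whether Flush() was called.
class RecordingChannel : ITelemetryChannel
{
    public bool Flushed { get; private set; }
    public bool? DeveloperMode { get; set; }
    public string EndpointAddress { get; set; }

    public void Send(ITelemetry item) { }
    public void Flush() => Flushed = true;
    public void Dispose() { }
}

public class FlushTests
{
    [Fact]
    public void Flushing_the_client_reaches_the_channel()
    {
        var channel = new RecordingChannel();
        var configuration = new TelemetryConfiguration
        {
            TelemetryChannel = channel,
            // Dummy key, used only so tracked items are accepted in this sketch.
            ConnectionString = "InstrumentationKey=00000000-0000-0000-0000-000000000000"
        };
        var client = new TelemetryClient(configuration);

        client.TrackTrace("Hello");
        client.Flush();

        Assert.True(channel.Flushed);
    }
}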

@agehrke changed the title from "Implement IAsyncDisposable and use FlushAsync on channel" to "Implement IAsyncDisposable and use FlushAsync on TelemetryClient" on Oct 30, 2024
@nblumhardt (Contributor) commented:

Thanks for the PR! The changes look good; just so the purpose of this one is clear - are you experiencing message loss in the current version without this change? The telemetry client is flushed (non-asynchronously) in Dispose(), so although the async version is a welcome improvement, I don't think it will have any effect on whether messages are flushed or not. Am I understanding this correctly? Thanks!

@agehrke (Contributor, Author) commented Oct 30, 2024:

Thanks for the fast feedback, Nicholas. Yes, we are experiencing loss of log messages in our App Insights sink when the host is shutting down, while the File sink contains everything. At first I had a quick look at the dispose logic in ApplicationInsightsSink and concluded it looked fine, but digging into the App Insights code revealed the following:

  1. Our call to TelemetryClient.Flush() eventually calls ITelemetryChannel.Flush()
  2. I believe by default the ServerTelemetryChannel is used in ASP.NET Core, and its Flush() implementation only differs slightly from FlushAsync():
// ServerTelemetryChannel.Flush(): blocks on the parameterless buffer flush, no cancellation token
this.TelemetryBuffer.FlushAsync().ConfigureAwait(false).GetAwaiter().GetResult();

// ServerTelemetryChannel.FlushAsync(CancellationToken): forwards the cancellation token
this.TelemetryBuffer.FlushAsync(cancellationToken);

Note the use of the cancellation token in the async version.

  3. Digging into the TelemetryBuffer.FlushAsync() methods, you start to see the horror :)
// TelemetryBuffer.FlushAsync(): offloads the synchronous Serialize call to a background task
await Task.Run(() => this.serializer.Serialize(telemetryToFlush)).ConfigureAwait(false);

// TelemetryBuffer.FlushAsync(CancellationToken): uses the serializer's asynchronous path
this.serializer.SerializeAsync(telemetryToFlush, cancellationToken);
  4. Going deeper into TelemetrySerializer, you realize what is happening:
// TelemetrySerializer.Serialize: the transmission is only enqueued on the transmitter
var transmission = new Transmission(this.EndpointAddress, items) { TransmissionStatusEvent = this.TransmissionStatusEvent };
this.Transmitter.Enqueue(transmission); // OMG

// TelemetrySerializer.SerializeAsync: the transmission is actually flushed through the transmitter
var transmission = new Transmission() { TransmissionStatusEvent = this.TransmissionStatusEvent };
return Task.Run(() => this.SerializeTransmissionAndEnqueue(transmission, items, cancellationToken), cancellationToken);

private Task<bool> SerializeTransmissionAndEnqueue(Transmission transmission, ICollection<ITelemetry> items, CancellationToken cancellationToken)
{
    transmission.Serialize(this.EndpointAddress, items);
    return this.Transmitter.FlushAsync(transmission, cancellationToken);
}

So TelemetryClient.Flush() is just "enqueuing" a flush, which actually matches the docs on flushing:

When you use Flush(), we recommend this pattern:

// Allow some time for flushing before shutdown.
System.Threading.Thread.Sleep(5000);

When you use FlushAsync(), we recommend this pattern:

await telemetryClient.FlushAsync();
// No need to sleep
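
For consumers, the practical follow-up (a hedged sketch, assuming Serilog 2.12 or later, where Log.CloseAndFlushAsync() and asynchronous sink disposal are available) is to close the logger asynchronously at shutdown so the sink's DisposeAsync(), and therefore TelemetryClient.FlushAsync(), actually runs:

using Serilog;

// Sketch: flush Serilog asynchronously before the process exits.
try
{
    // ... run the application ...
}
finally
{
    // Disposes sinks asynchronously; a sink implementing IAsyncDisposable gets DisposeAsync(),
    // which here awaits TelemetryClient.FlushAsync(), so no Thread.Sleep is needed before exit.
    await Log.CloseAndFlushAsync();
}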

@nblumhardt (Contributor) commented:

Wow, that's mind-boggling! Thanks for the explanation 👍

@nblumhardt merged commit 110dde7 into serilog-contrib:dev on Oct 31, 2024 (1 check passed)
@agehrke (Contributor, Author) commented Nov 1, 2024:

Thank you for a great library! And thanks again for the fast feedback.
