Add content validation result metrics #796
Conversation
`validate_content` is used in so many places that it seems like it would be easy to miss one of them during a refactor or a new feature. Maybe we could pass the metric object into the validator and increment it there once, instead of at every call site.
Or maybe add a utility method that calls the validator, records the metric, and returns the result, so the match calls can be replaced? Then we would have to make sure that no new direct calls to `validate_content` are created, though.
GitHub tells me there are 9 call sites altogether, and I only see 4 places where the PR reports the validation result. So maybe there are still some missing?
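To make the wrapper idea concrete, here is a minimal, self-contained sketch. Every name in it (`OverlayService`, `ValidationMetrics`, `Validator`, `validate_and_report`) is a placeholder for illustration, not Trin's actual API:

```rust
// Sketch of the "wrapper" idea: a single helper that calls the validator,
// records the outcome once, and returns the result, so individual call
// sites never touch the metric directly. All types here are placeholders.
use std::sync::atomic::{AtomicU64, Ordering};

/// Placeholder metrics object; the real one would wrap prometheus counters.
#[derive(Default)]
struct ValidationMetrics {
    success: AtomicU64,
    failure: AtomicU64,
}

impl ValidationMetrics {
    fn report_validation(&self, ok: bool) {
        if ok {
            self.success.fetch_add(1, Ordering::Relaxed);
        } else {
            self.failure.fetch_add(1, Ordering::Relaxed);
        }
    }
}

/// Placeholder validator with a trivial validity rule.
struct Validator;

impl Validator {
    fn validate_content(&self, _key: &[u8], content: &[u8]) -> Result<(), String> {
        if content.is_empty() {
            Err("empty content".to_string())
        } else {
            Ok(())
        }
    }
}

struct OverlayService {
    validator: Validator,
    metrics: ValidationMetrics,
}

impl OverlayService {
    /// The wrapper: every call site goes through here, so the metric can
    /// never be forgotten during a refactor or a new feature.
    fn validate_and_report(&self, key: &[u8], content: &[u8]) -> Result<(), String> {
        let result = self.validator.validate_content(key, content);
        self.metrics.report_validation(result.is_ok());
        result
    }
}
```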
portalnet/src/metrics/overlay.rs
Outdated
 format!(
-    "offers={}/{}, accepts={}/{}",
+    "offers={}/{}, accepts={}/{}, successful_validations={}/{}",
This log line gets quite long, so maybe we can lean towards shorter labels here, like:
"offers={}/{}, accepts={}/{}, successful_validations={}/{}", | |
"offers={}/{}, accepts={}/{}, validations={}/{}", |
Unless I'm missing something, the other call sites are within tests, in which case it doesn't seem worthwhile to manage metrics there.
I agree, it's not ideal, and it's easy to miss something; I do see this as a concern. But this is consistent with how we handle all other metrics, with respect to the idea of passing the metric object into the validator.
I went with a bit of a hybrid solution here: added a utility method that calls the validator and records the metric. I'm going to go ahead with the merge here since you gave this PR a 👍, but I'm happy to come back to this if you have any further thoughts.
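Continuing the placeholder sketch above, a call site would then shrink from a match plus a manual metric update to a single call on the wrapper; again, all names are hypothetical:

```rust
// Before the change, every call site had to remember the metric itself:
//
//     match service.validator.validate_content(&key, &content) {
//         Ok(_) => service.metrics.report_validation(true),
//         Err(_) => service.metrics.report_validation(false),
//     }
//
// With the utility method, the call site only does this:
fn handle_incoming_content(service: &OverlayService, key: &[u8], content: &[u8]) {
    if service.validate_and_report(key, content).is_ok() {
        // store / gossip the content, etc.
    }
}
```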
What was wrong?
Add metrics reporting whether or not received content is successfully validated.
How was it fixed?
Added metric reporting logic. An update to the Grafana dashboard template will come in a following PR, along with other dashboard changes.
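As a rough illustration of what the reporting logic could look like with the `prometheus` crate (the metric name, labels, and helper functions below are assumptions, not necessarily what this PR implements):

```rust
// Sketch of a validation-result counter, labeled by outcome, plus the short
// summary string used in the overlay log line. Names are hypothetical.
use prometheus::{IntCounterVec, Opts, Registry};

fn register_validation_counter(registry: &Registry) -> prometheus::Result<IntCounterVec> {
    let counter = IntCounterVec::new(
        Opts::new(
            "trin_validation_total", // hypothetical metric name
            "Number of content validations, labeled by outcome",
        ),
        &["success"],
    )?;
    registry.register(Box::new(counter.clone()))?;
    Ok(counter)
}

fn report_validation(counter: &IntCounterVec, ok: bool) {
    // Record each validation under a "success" label of "true" or "false".
    counter
        .with_label_values(&[if ok { "true" } else { "false" }])
        .inc();
}

/// Build the short summary for the log line, e.g. "validations=4/5"
/// (successful / total).
fn validation_summary(counter: &IntCounterVec) -> String {
    let ok = counter.with_label_values(&["true"]).get();
    let failed = counter.with_label_values(&["false"]).get();
    format!("validations={}/{}", ok, ok + failed)
}
```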
To-Do