
Deprecate aggregate_logging_outputs API (use reduce_metrics instead) #1611

Closed
wants to merge 1 commit

Conversation


@myleott myleott commented Jan 11, 2020

In 1e324a5 we introduced fairseq.metrics, which allows metrics to be logged and aggregated based on their context. For example:

with metrics.aggregate("train"):
    for step, batch in enumerate(epoch):
        with metrics.aggregate("train_inner") as agg:
            loss = get_loss(batch)
            metrics.log_scalar("loss", loss)
            if step % log_interval == 0:
                # print the average loss over this log_interval
                print(agg.get_smoothed_value("loss"))
                agg.reset()
# print the average loss over the whole epoch
print(metrics.get_smoothed_value("train", "loss"))

This interface allows one to log metrics from anywhere, without having to pass the values up the call stack, resulting in a much simpler train.py.
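The actual `fairseq.metrics` module is more elaborate (typed meters, smoothing, histories), but the nesting behavior described above can be sketched in a few dozen lines of plain Python. Everything here (`MeterDict`, `_active`, the module-level `aggregate`/`log_scalar`) is illustrative naming, not fairseq's implementation:

```python
import contextlib
from collections import defaultdict

class MeterDict:
    """Accumulates running sums and counts so averages can be smoothed."""
    def __init__(self):
        self.sums = defaultdict(float)
        self.counts = defaultdict(int)

    def log_scalar(self, key, value):
        self.sums[key] += value
        self.counts[key] += 1

    def get_smoothed_value(self, key):
        return self.sums[key] / max(self.counts[key], 1)

    def reset(self):
        self.sums.clear()
        self.counts.clear()

_aggregators = {}  # name -> MeterDict, so contexts can be re-entered by name
_active = []       # stack of currently active aggregators

@contextlib.contextmanager
def aggregate(name):
    agg = _aggregators.setdefault(name, MeterDict())
    _active.append(agg)
    try:
        yield agg
    finally:
        _active.pop()

def log_scalar(key, value):
    # A value logged anywhere is recorded by every active (enclosing) context,
    # which is what lets "train" and "train_inner" both see the same losses.
    for agg in _active:
        agg.log_scalar(key, value)
```

With this sketch, resetting the inner `train_inner` aggregator at each log interval leaves the outer `train` aggregator untouched, so the epoch-level average still covers every step.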

To make the transition smoother, we introduce FairseqTask.reduce_metrics and FairseqCriterion.reduce_metrics to replace the old aggregate_logging_outputs method, while maintaining backward compatibility for old Tasks/Criterions.
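The shape of that migration, in a criterion, looks roughly like the following. This is a hypothetical `MyCriterion` with a stub standing in for `fairseq.metrics` so the sketch runs standalone; the real base class, method signatures, and `log_scalar` keyword arguments may differ in fairseq itself:

```python
class _MetricsStub:
    """Stand-in for fairseq.metrics so this sketch is self-contained."""
    def __init__(self):
        self.logged = {}

    def log_scalar(self, key, value, weight=1):
        self.logged[key] = value

metrics = _MetricsStub()

class MyCriterion:
    # Old style: aggregate per-batch logging outputs and return a dict,
    # which the caller then has to pass back up the call stack.
    @staticmethod
    def aggregate_logging_outputs(logging_outputs):
        loss_sum = sum(out["loss"] for out in logging_outputs)
        ntokens = sum(out["ntokens"] for out in logging_outputs)
        return {"loss": loss_sum / ntokens, "ntokens": ntokens}

    # New style: log directly into the active metrics context and return
    # nothing; whatever aggregation scope is active picks the values up.
    @staticmethod
    def reduce_metrics(logging_outputs):
        loss_sum = sum(out["loss"] for out in logging_outputs)
        ntokens = sum(out["ntokens"] for out in logging_outputs)
        metrics.log_scalar("loss", loss_sum / ntokens, weight=ntokens)
```

Both methods compute the same reduction; the difference is only where the result goes, which is why a backward-compatibility shim for old Tasks/Criterions is straightforward.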

Summary: Pull Request resolved: fairinternal/fairseq-py#974

Differential Revision: D19292402

Pulled By: myleott

fbshipit-source-id: 6375677ce8bb371fb9b8cfae60cd26156c88a506
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D19292402

@facebook-github-bot
Contributor

@myleott merged this pull request in 8679339.

moussaKam pushed a commit to moussaKam/language-adaptive-pretraining that referenced this pull request Sep 29, 2020
…acebookresearch#1611)

Summary:
Pull Request resolved: facebookresearch#1611

Pull Request resolved: fairinternal/fairseq-py#974

Differential Revision: D19292402

Pulled By: myleott

fbshipit-source-id: d51327584e048d3e39c133e9ef57a791e0329a66
yzpang pushed a commit to yzpang/gold-off-policy-text-gen-iclr21 that referenced this pull request Feb 19, 2021
…(#1611)

Summary:
Pull Request resolved: facebookresearch/fairseq#1611

Pull Request resolved: fairinternal/fairseq-py#974

Differential Revision: D19292402

Pulled By: myleott

fbshipit-source-id: d51327584e048d3e39c133e9ef57a791e0329a66