This repository was archived by the owner on Feb 7, 2025. It is now read-only.

Add warning when initialising the patchgan discriminator with batchnorm in a distributed environment #454

Merged · 3 commits · Jan 8, 2024
7 changes: 7 additions & 0 deletions generative/networks/nets/patchgan_discriminator.py
@@ -11,6 +11,7 @@

from __future__ import annotations

import warnings
from collections.abc import Sequence

import torch
@@ -218,6 +219,12 @@ def __init__(
        )

        self.apply(self.initialise_weights)
        if norm.lower() == "batch" and torch.distributed.is_initialized():
            warnings.warn(
                "WARNING: Discriminator is using BatchNorm and a distributed training environment has been detected. "
                "To train with DDP, convert discriminator to SyncBatchNorm using "
                "torch.nn.SyncBatchNorm.convert_sync_batchnorm(model)."
            )

    def forward(self, x: torch.Tensor) -> list[torch.Tensor]:
        """
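The conversion the warning recommends can be sketched as follows. This is a minimal illustration, not code from the PR: the small `nn.Sequential` stand-in is hypothetical and merely mimics a discriminator that contains a BatchNorm layer; `torch.nn.SyncBatchNorm.convert_sync_batchnorm` is the standard PyTorch utility that swaps every `BatchNorm*d` module for `SyncBatchNorm` before the model is wrapped in DDP.

```python
import torch.nn as nn

# Hypothetical stand-in for a discriminator that uses BatchNorm.
discriminator = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=4, stride=2, padding=1),
    nn.BatchNorm2d(8),
    nn.LeakyReLU(0.2),
)

# Replace every BatchNorm*d module with SyncBatchNorm, as the warning advises.
# This is safe to call before torch.distributed is initialized; the converted
# layers synchronize statistics across ranks once DDP training starts.
discriminator = nn.SyncBatchNorm.convert_sync_batchnorm(discriminator)

print(type(discriminator[1]).__name__)  # SyncBatchNorm
```

Without this conversion, each DDP rank would compute batch statistics from its local shard only, which can destabilize adversarial training; `SyncBatchNorm` aggregates statistics across all processes instead.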