torchsnapshot


This library is currently in Alpha and does not yet have a stable release. The API may change and may not be backward compatible. If you have suggestions for improvements, please open a GitHub issue. We'd love to hear your feedback.

A light-weight library for adding fault tolerance to large-scale PyTorch distributed training workloads.

Install

Requires Python >= 3.7 and PyTorch >= 1.11

From pip:

pip install --pre torchsnapshot-nightly

From source:

git clone https://github.com/facebookresearch/torchsnapshot
cd torchsnapshot
pip install -r requirements.txt
python setup.py install

Concepts

  • Stateful object - an object whose state can be obtained via .state_dict() and restored via .load_state_dict(). Most PyTorch components (e.g. Module, Optimizer, LRScheduler) already implement this protocol; see the sketch after this list.
  • App state - the application state described using multiple stateful objects.
  • Snapshot - the persisted app state.
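
Any object implementing this protocol can participate in the app state. Below is a minimal sketch of a custom stateful object; the class name and fields are illustrative and not part of torchsnapshot:

class RunningStats:
    # Illustrative stateful object; not part of the library.
    def __init__(self) -> None:
        self.epoch = 0
        self.best_loss = float("inf")

    def state_dict(self) -> dict:
        # Capture everything needed to reconstruct this object.
        return {"epoch": self.epoch, "best_loss": self.best_loss}

    def load_state_dict(self, state_dict: dict) -> None:
        # Restore the state captured by state_dict().
        self.epoch = state_dict["epoch"]
        self.best_loss = state_dict["best_loss"]

An instance of such an object can then be included in the app state alongside PyTorch components, e.g. app_state = {"model": model, "stats": stats}.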

Basic Usage

Describing the application state with multiple stateful objects:

app_state = {"model": model, "optimizer": optimizer}

Taking a snapshot of the application state:

from torchsnapshot import Snapshot

# File System
snapshot = Snapshot.take(path="/foo/bar/baz", app_state=app_state)

# S3
snapshot = Snapshot.take(path="s3://foo/bar", app_state=app_state)

# Google Cloud Storage
snapshot = Snapshot.take(path="gcs://foo/bar", app_state=app_state)

Referencing an existing snapshot:

snapshot = Snapshot(path="/foo/bar/baz")

Restoring the application state from a snapshot:

snapshot.restore(app_state=app_state)
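
Putting the pieces together, here is a sketch of a checkpoint-and-resume flow. Only Snapshot.take, Snapshot, and restore come from the snippets above; the model, optimizer, and path are illustrative assumptions:

import torch
from torchsnapshot import Snapshot

model = torch.nn.Linear(4, 2)                            # illustrative model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # illustrative optimizer
app_state = {"model": model, "optimizer": optimizer}

# Persist the app state, e.g. at the end of an epoch (path is illustrative).
Snapshot.take(path="/tmp/run0/epoch-0", app_state=app_state)

# After a failure or restart, reference the persisted snapshot and
# restore the app state in place.
snapshot = Snapshot(path="/tmp/run0/epoch-0")
snapshot.restore(app_state=app_state)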

More end-to-end examples can be found in the example directory.

License

torchsnapshot is BSD licensed, as found in the LICENSE file.
