
Releases: siddk/voltron-robotics

Official v1.1.0 Release - PyTorch GPU DDP Pretraining & Cleanup

25 Apr 09:22

Upgrades to PyTorch 2.0 and adds native DistributedDataParallel (DDP) GPU pretraining (1+ GPUs) for all models. The preprocessing flow (raw Something-Something-v2 data --> pretrained models) is now fully documented and cleaned up.

Key Examples:
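As a rough illustration of the new multi-GPU path, here is a minimal single-node DDP pretraining sketch; it is a hypothetical stand-in for the repository's actual pretraining script (model, data, and loss are placeholders), launched via `torchrun`:

```python
# Minimal single-node DDP sketch (hypothetical; not the repo's pretraining script).
# Launch with, e.g.:  torchrun --standalone --nproc-per-node=8 ddp_sketch.py
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler


def main() -> None:
    # torchrun sets LOCAL_RANK / RANK / WORLD_SIZE; use NCCL for GPU collectives.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model + data, standing in for a Voltron backbone and the
    # preprocessed Something-Something-v2 clips.
    model = DDP(torch.nn.Linear(512, 512).cuda(local_rank), device_ids=[local_rank])
    dataset = TensorDataset(torch.randn(1024, 512))
    sampler = DistributedSampler(dataset)            # shards the data across ranks
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for epoch in range(2):
        sampler.set_epoch(epoch)                     # reshuffle shards each epoch
        for (batch,) in loader:
            batch = batch.cuda(local_rank, non_blocking=True)
            loss = model(batch).pow(2).mean()        # stand-in objective
            optimizer.zero_grad()
            loss.backward()                          # DDP all-reduces gradients here
            optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```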

What's Changed

  • Release 1.1.0 - Native GPU/DDP Pretraining by @siddk in #14

Full Changelog: v1.0.0...v1.1.0

Official v1.0.0 Release - Preprocessing & XLA Pretraining Pipeline

07 Mar 01:44
ed19236

Completes the preprocessing -> pretraining -> model usage pipeline by adding the full reference scripts for preprocessing the entire Sth-Sth-v2 dataset (for pretraining MVP, R3M, and Voltron models), as well as the PyTorch XLA pretraining script (again, for all models).

A "standard" PyTorch GPU/DDP pretraining implementation is in the works, but hopefully the logic transfers!

What's Changed

  • Add Preprocessing Pipeline for Sth-Sth-v2 by @siddk in #8
  • Add XLA Pretraining Script by @siddk in #10

Full Changelog: v0.0.1...v1.0.0

Initial Release - v0.0.1

06 Mar 10:45

Initial Voltron model release (marking as 0.0.1 because I don't know how to SemVer / want to wait until the full pretraining reference exists). Contains pretrained checkpoints and automatic load functionality for:

  • V-Cond (ViT-Small) on Sth-Sth-v2
  • V-Dual (ViT-Small) on Sth-Sth-v2
  • V-Gen (ViT-Small) on Sth-Sth-v2
  • V-Cond (ViT-Base) on Sth-Sth-v2

As well as for our "data-locked" reproductions:

  • R3M (ViT-Small) on Sth-Sth-v2
  • R3M (ResNet-50) on Sth-Sth-v2
  • MVP (ViT-Small) on Sth-Sth-v2

This accompanies the initial release of the Voltron Evaluation Suite – complete with usage examples!
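For reference, loading one of these checkpoints looks roughly like the sketch below; the `load()` helper, the "v-cond" model ID, and the exact signature are assumptions here and may differ from the packaged release:

```python
# Usage sketch (assumed API; the "v-cond" model ID and the exact load() signature
# are illustrative and may not match the packaged release).
import torch
from voltron import load

device = "cuda" if torch.cuda.is_available() else "cpu"

# Fetch a pretrained backbone and its matching image preprocessor by name.
vcond, preprocessor = load("v-cond", device=device, freeze=True)
print(f"Loaded V-Cond with {sum(p.numel() for p in vcond.parameters()):,} parameters")
```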