Actions: microsoft/DeepSpeed

Formatting

5,180 workflow runs
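The same run history can also be retrieved programmatically. Below is a minimal sketch using the GitHub REST API endpoint for workflow runs; the workflow file name (formatting.yml) is an assumption inferred from the workflow's display name, and an unauthenticated request is subject to rate limits.

# Minimal sketch (not part of the original page): list recent runs of the
# "Formatting" workflow for microsoft/DeepSpeed via the GitHub REST API.
# Assumption: the workflow file is .github/workflows/formatting.yml.
import requests

url = "https://api.github.com/repos/microsoft/DeepSpeed/actions/workflows/formatting.yml/runs"
resp = requests.get(
    url,
    params={"per_page": 20},
    headers={"Accept": "application/vnd.github+json"},
)
resp.raise_for_status()

for run in resp.json()["workflow_runs"]:
    # run_number, event, head_branch, and created_at are standard fields
    # of a workflow run object in the Actions API response.
    print(run["run_number"], run["event"], run["head_branch"], run["created_at"])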

Formatting
Formatting #16055: Scheduled
January 17, 2025 00:20 · 1m 20s · master
Update import for torchvision.transformers
Formatting #16054: Pull request #6958 synchronize by loadams
January 16, 2025 21:22 · 1m 18s · loadams/torchvision-imports
Update import for torchvision.transformers
Formatting #16053: Pull request #6958 opened by loadams
January 16, 2025 21:19 · 1m 20s · loadams/torchvision-imports
Update torch.norm to torch.linalg.norm and torch.linalg.vector_norm
Formatting #16052: Pull request #6931 synchronize by loadams
January 16, 2025 20:47 · 1m 19s · loadams/fix-torch-issues
[DEBUG] Add diagnostics for cpu-torch-latest intermittent hang
Formatting #16051: Pull request #6942 synchronize by loadams
January 16, 2025 18:19 · 1m 20s · loadams/cpu-runner-debug
Formatting
Formatting #16050: Merge group checks requested
January 16, 2025 18:11 · 1m 17s
generalize deepspeed linear and implement it for non cuda systems
Formatting #16049: Pull request #6932 synchronize by oelayan7
January 16, 2025 15:42 · 1m 18s · oelayan7:linear
generalize deepspeed linear and implement it for non cuda systems
Formatting #16048: Pull request #6932 synchronize by oelayan7
January 16, 2025 15:40 · Action required · oelayan7:linear
[FPDT] Support FPDT Based on Intel Backend
Formatting #16047: Pull request #6956 synchronize by YizhouZ
January 16, 2025 08:39 · Action required · YizhouZ:yizhou/support_fdpt
[FPDT] Support FPDT Based on Intel Backend
Formatting #16046: Pull request #6956 opened by YizhouZ
January 16, 2025 08:38 · Action required · YizhouZ:yizhou/support_fdpt
support autoTP with weight only quantization in DS inference path
Formatting #16045: Pull request #4750 synchronize by ftian1
January 16, 2025 08:10 · 1m 30s · ftian1:master
Update torch.norm to torch.linalg.norm and torch.linalg.vector_norm
Formatting #16043: Pull request #6931 synchronize by loadams
January 16, 2025 00:40 · 1m 25s · loadams/fix-torch-issues
generalize deepspeed linear and implement it for non cuda systems
Formatting #16042: Pull request #6932 synchronize by loadams
January 16, 2025 00:23 · 1m 18s · oelayan7:linear
Formatting
Formatting #16041: Scheduled
January 16, 2025 00:20 · 1m 22s · master
Pin numpy version
Formatting #16040: Pull request #6953 opened by BLOrange-AMD
January 15, 2025 23:43 · 1m 15s · ROCm:pin_numpy
Formatting
Formatting #16039: Merge group checks requested
January 15, 2025 22:09 · 1m 29s
Update sharded_moe.py to support top2 gate with Tutel
Formatting #16038: Pull request #6948 synchronize by loadams
January 15, 2025 21:15 · 2m 14s · xenshinu:patch-1
Unpin tests that previously used a pinned version of transformers
Formatting #16037: Pull request #6387 synchronize by loadams
January 15, 2025 19:46 · 1m 21s · loadams/transformers-fixes
Update sharded_moe.py to support top2 gate with Tutel
Formatting #16036: Pull request #6948 synchronize by xenshinu
January 15, 2025 19:40 · Action required · xenshinu:patch-1
Formatting
Formatting #16035: Merge group checks requested
January 15, 2025 19:25 · 1m 15s
warn to warning
Formatting #16034: Pull request #6952 opened by qgallouedec
January 15, 2025 18:32 · 1m 28s · qgallouedec:warn_to_warning
Addressing ipg Buffer Data Race Condition in Zero Stage2
Formatting #16033: Pull request #3727 synchronize by loadams
January 15, 2025 17:09 · Action required · xxr3376:master
[inf] Add config var to enable keeping module on host
Formatting #16032: Pull request #6846 synchronize by loadams
January 15, 2025 17:06 · 1m 35s · oelayan7:keep_module_on_host
generalize deepspeed linear and implement it for non cuda systems
Formatting #16031: Pull request #6932 synchronize by loadams
January 15, 2025 16:24 · 1m 20s · oelayan7:linear