
Pull requests: deepspeedai/DeepSpeed

Pull requests list

Add engine.coalesce_grad_reduction() for ZeRO 1/2/3 multi-backward
#7992 opened May 5, 2026 by roycho96 (Contributor)

fix gemma4 num attention head bugs (from #7975)
#7990 opened May 2, 2026 by delock (Collaborator)

Fix eigenvalue monitor logging
#7987 opened Apr 28, 2026 by heurry (Contributor)

Add Qwen 3.5 preset to AutoTP
#7978 opened Apr 16, 2026 by tohtana (Collaborator, Draft)

fix gemma4 num attention head bugs
#7975 opened Apr 15, 2026 by mingxiang1006

[Blog] Muon Optimizer Support in DeepSpeed
#7962 opened Apr 8, 2026 by delock (Collaborator)

Fix/warnings stacklevel mvapich runner
#7949 opened Apr 2, 2026 by nathon-lee (Contributor, Draft)

Refactor/torch autocast encapsulate global state
#7946 opened Apr 2, 2026 by nathon-lee (Contributor)

Add AutoEP
#7938 opened Mar 31, 2026 by tohtana (Collaborator, Draft)

[Feature] Enable AutoEP Compatibility with ZeRO-3
#7928 opened Mar 28, 2026 by nathon-lee (Contributor)

Add torch_xla TPU support for ZeRO-1/2
#7917 opened Mar 21, 2026 by PKUWZP (Collaborator)

fix: add setup_context for torch.func compatibility
#7916 opened Mar 21, 2026 by roycho96 (Contributor)

doc: Remove suggestion to build extensions in parallel
#7899 opened Mar 12, 2026 by Flamefire (Contributor)