Conversation


@dependabot dependabot bot commented on behalf of github Dec 2, 2025

Bumps pytorch-lightning from 1.9.5 to 2.6.0.

Release notes

Sourced from pytorch-lightning's releases.

Lightning v2.6.0

Changes in 2.6.0

PyTorch Lightning

  • Added WeightAveraging callback that wraps the PyTorch AveragedModel class (#20545)
  • Added Torch-TensorRT integration with LightningModule (#20808)
  • Added time-based validation support through val_check_interval (#21071)
  • Added attributes to access stopping reason in EarlyStopping callback (#21188)
  • Added support for variable batch size in ThroughputMonitor (#20236)
  • Added EMAWeightAveraging callback that wraps Lightning's WeightAveraging class (#21260)
  • Expose weights_only argument for Trainer.{fit,validate,test,predict} and let torch handle default value (#21072)
  • Default to RichProgressBar and RichModelSummary if the rich package is available. Fall back to TQDMProgressBar and ModelSummary otherwise (#20896)
  • Add MPS accelerator support for mixed precision (#21209)
  • Fixed edge case when max_trials is reached in Tuner.scale_batch_size (#21187)
  • Fixed case where LightningCLI could not be initialized with trainer_default containing callbacks (#21192)
  • Fixed missing reset when ModelPruning is applied with lottery ticket hypothesis (#21191)
  • Prevented recursive symlink creation when save_last='link' and save_top_k=-1 (#21186)
  • Fixed last.ckpt being created and not linked to another checkpoint (#21244)
  • Fixed bug that prevented BackboneFinetuning from being used together with LearningRateFinder (#21224)
  • Fixed ModelPruning sparsity logging bug that caused incorrect sparsity percentages (#21223)
  • Fixed LightningCLI loading of hyperparameters from ckpt_path failing for subclass model mode (#21246)
  • Fixed init-argument checking so it applies only when the given frames are in an __init__ method (#21227)
  • Fixed how ThroughputMonitor calculated training time (#21291)
  • Fixed synchronization of gradients in manual optimization with DDPStrategy(static_graph=True) (#21251)
  • Fixed FSDP mixed precision semantics and added user warning (#21361)
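Among the additions above, the EMAWeightAveraging callback maintains an exponential moving average (EMA) of model weights; per the release notes it builds on WeightAveraging, which wraps PyTorch's torch.optim.swa_utils.AveragedModel. As a rough illustration of the underlying update rule only (this toy `ema_update` helper is hypothetical and is not Lightning's implementation), the averaging step looks like:

```python
# Conceptual sketch of an EMA weight update: blend the current weights
# into a running average with a decay factor. Pure Python for clarity;
# the real callback operates on torch parameters via AveragedModel.
def ema_update(avg, current, decay=0.99):
    """Return the new running average after one EMA step."""
    return [decay * a + (1 - decay) * c for a, c in zip(avg, current)]

avg = [0.0, 0.0]
for step_weights in ([1.0, 2.0], [1.0, 2.0], [1.0, 2.0]):
    avg = ema_update(avg, step_weights, decay=0.5)
print(avg)  # converges toward the current weights: [0.875, 1.75]
```

With a decay close to 1.0 (the typical setting), the averaged weights change slowly and smooth out noise from individual optimizer steps, which is the point of using the averaged model for evaluation.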

Lightning Fabric

  • Expose weights_only argument for Trainer.{fit,validate,test,predict} and let torch handle default value (#21072)
  • Set _DeviceDtypeModuleMixin._device from torch's default device function (#21164)
  • Added kwargs-filtering for Fabric.call to support different callback method signatures (#21258)
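The kwargs-filtering change above means Fabric.call can invoke callbacks whose methods declare different subsets of the available keyword arguments. A simplified sketch of that idea (illustrative only, not Fabric's actual code; the `filtered_call` and `on_train_end` names are made up here) using the standard library's inspect module:

```python
# Sketch of kwargs-filtering: pass a callback method only the keyword
# arguments its signature actually declares, dropping the rest.
import inspect

def filtered_call(fn, **kwargs):
    """Call fn with only the kwargs that appear in its signature."""
    params = inspect.signature(fn).parameters
    accepted = {k: v for k, v in kwargs.items() if k in params}
    return fn(**accepted)

def on_train_end(trainer):  # a callback with a narrower signature
    return f"done: {trainer}"

# 'extra' is silently filtered out instead of raising a TypeError.
print(filtered_call(on_train_end, trainer="t1", extra=123))  # done: t1
```

This lets callbacks with different method signatures coexist under one dispatch call without each one having to accept **kwargs.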

... (truncated)


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [pytorch-lightning](https://github.com/Lightning-AI/lightning) from 1.9.5 to 2.6.0.
- [Release notes](https://github.com/Lightning-AI/lightning/releases)
- [Commits](Lightning-AI/pytorch-lightning@1.9.5...2.6.0)

---
updated-dependencies:
- dependency-name: pytorch-lightning
  dependency-version: 2.6.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added dependabot test-guided-notebooks Run PR check to verify Guided notebooks labels Dec 2, 2025
@openshift-ci openshift-ci bot requested review from dimakis and pawelpaszki December 2, 2025 01:04
@codeflare-machine-account codeflare-machine-account added lgtm Indicates that a PR is ready to be merged. approved Indicates a PR has been approved by an approver from all required OWNERS files. labels Dec 2, 2025

openshift-ci bot commented Dec 2, 2025

[APPROVALNOTIFIER] This PR is APPROVED

Approval requirements bypassed by manually added approval.

This pull-request has been approved by:

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment
