
Conversation

@emapco (Contributor) commented Dec 22, 2025

What does this PR do?

This fix improves accuracy and prevents existing ModernBERT models from yielding NaN tensors. Previously, when a pretrained ModernBERT model was loaded from a checkpoint, ModernBertUnpaddedRotaryEmbedding's non-persistent inv_freq buffer was not initialized, resulting in NaN loss and logits.
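For context, a non-persistent buffer is excluded from a module's state dict, so it is never written to a checkpoint and cannot be restored by `load_state_dict`; it has to be recomputed at load time. A minimal PyTorch sketch of this behavior, using a hypothetical `RotaryStub` module rather than the actual ModernBERT code:

```python
import torch
from torch import nn

class RotaryStub(nn.Module):
    def __init__(self, dim: int = 8, base: float = 10000.0):
        super().__init__()
        inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2, dtype=torch.float32) / dim))
        # persistent=False keeps inv_freq out of the state dict, so it is
        # never saved to (or restored from) a checkpoint.
        self.register_buffer("inv_freq", inv_freq, persistent=False)

module = RotaryStub()
print("inv_freq" in module.state_dict())  # False: load_state_dict cannot restore it
```

Since transformers materializes a loaded model from the checkpoint via `_init_weights` rather than re-running each module's `__init__`, a non-persistent buffer that `_init_weights` does not cover is left uninitialized.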

Without fix:

  • `ModernBertModelIntegrationTest.test_inference_sequence_classification_flash_attention_2` (ModernBertForSequenceClassification) rarely returns NaN tensors, but consistently fails due to inaccurate logits
  • `ModernBertModelIntegrationTest.test_inference_sequence_classification_flash_attention_2_modernbert_base` always fails due to NaN logits.

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@ArthurZucker @Cyrilvallez

…ding from a checkpoint

This fix improves accuracy and prevents existing ModernBert models from yielding NaN tensors.

Without fix:
- `ModernBertModelIntegrationTest.test_inference_sequence_classification_flash_attention_2` rarely returns NaN tensors but instead fails due to inaccurate logits
- `ModernBertModelIntegrationTest.test_inference_sequence_classification_flash_attention_2_modernbert_base` always fails due to NaN tensors
Copilot AI review requested due to automatic review settings December 22, 2025 11:30
@github-actions (Contributor) commented

[For maintainers] Suggested jobs to run (before merge)

run-slow: modernbert

Copilot AI left a comment

Pull request overview

This PR fixes a critical bug in ModernBERT models where loading from a checkpoint with flash_attention_2 results in NaN tensors due to an uninitialized inv_freq buffer in ModernBertUnpaddedRotaryEmbedding. The fix ensures proper initialization during model loading and adds comprehensive tests to verify the bug is resolved.

Key changes:

  • Initializes ModernBertUnpaddedRotaryEmbedding's non-persistent inv_freq buffer and cache attributes in _init_weights (see the sketch after this list)
  • Adds two new integration tests specifically for flash_attention_2 implementation to ensure NaN values are not produced
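As a rough sketch of what the first change might look like, the buffer recomputation could be factored as below. The attribute names (`dim`, `base`, `_seq_len_cached`, `_cos_cached`, `_sin_cached`) are assumptions modeled on typical rotary-embedding implementations, not the exact merged diff:

```python
import torch

# Hypothetical sketch of the branch added to ModernBertPreTrainedModel._init_weights
# for ModernBertUnpaddedRotaryEmbedding instances; attribute names are assumptions.
def init_unpadded_rotary_embedding(module) -> None:
    inv_freq = 1.0 / (
        module.base ** (torch.arange(0, module.dim, 2, dtype=torch.float32) / module.dim)
    )
    # Re-create the non-persistent buffer that is absent from the checkpoint.
    module.register_buffer("inv_freq", inv_freq, persistent=False)
    # Invalidate any cached cos/sin tables so they are rebuilt from the fresh inv_freq.
    module._seq_len_cached = 0
    module._cos_cached = None
    module._sin_cached = None
```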

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.

| File | Description |
| --- | --- |
| src/transformers/models/modernbert/modular_modernbert.py | Adds initialization logic for ModernBertUnpaddedRotaryEmbedding in the `_init_weights` method to compute and register the inv_freq buffer and initialize the cache attributes |
| src/transformers/models/modernbert/modeling_modernbert.py | Auto-generated from the modular file; contains the same initialization logic as modular_modernbert.py |
| tests/models/modernbert/test_modeling_modernbert.py | Adds two new slow tests for flash_attention_2 sequence classification with NaN checks, covering both the tiny-random and base models |
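Illustratively, the new tests presumably boil down to a NaN check along these lines; the checkpoint name and exact assertions here are assumptions, not the merged test code:

```python
import torch
from transformers import AutoTokenizer, ModernBertForSequenceClassification

checkpoint = "answerdotai/ModernBERT-base"  # assumed checkpoint for the "base" test
model = ModernBertForSequenceClassification.from_pretrained(
    checkpoint,
    attn_implementation="flash_attention_2",  # the code path this PR fixes
    torch_dtype=torch.bfloat16,
).to("cuda")
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

inputs = tokenizer("Hello, world!", return_tensors="pt").to("cuda")
with torch.no_grad():
    logits = model(**inputs).logits

# The classifier head is freshly initialized here, so NaNs would have to come
# from the encoder body, i.e. from an uninitialized inv_freq buffer.
assert not torch.isnan(logits).any(), "NaN logits: inv_freq buffer was not initialized"
```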
