fix: https://github.com/NVIDIA/Model-Optimizer/issues/981#983
Conversation
ℹ️ Recent review info
No actionable comments were generated in the recent review. 🎉
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID:
📒 Files selected for processing (1)
🚧 Files skipped from review as they are similar to previous changes (1)
📝 Walkthrough
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 2 failed (2 warnings)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@modelopt/torch/opt/plugins/mcore_dist_checkpointing.py`:
- Around line 151-157: Remove the stray pre-try conversion so the fallback runs:
delete the initial `config[k] = str(v)` before the try block and leave the
try/except that attempts `str(v)` and on `AttributeError`/`TypeError` sets
`config[k] = repr(type(v))`; update the block around `config`, `k`, and `v` in
`mcore_dist_checkpointing.py` so only the `try: config[k] = str(v)` / `except
(AttributeError, TypeError): config[k] = repr(type(v))` sequence exists.
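The sequence the prompt describes can be sketched as a minimal standalone example. Assumptions: `config`, `k`, and `v` mirror the names used in `mcore_dist_checkpointing.py`, but `BadStr` and `serialize_fields` are hypothetical stand-ins for illustration, not the module's real API:

```python
# Minimal sketch of the suggested fix: a single try/except with no stray
# `config[k] = str(v)` left before the try block, so the fallback can run.

class BadStr:
    """Stand-in for a TransformerConfig field whose str() raises TypeError."""

    def __str__(self):
        raise TypeError("unstringifiable field")

def serialize_fields(fields):
    config = {}
    for k, v in fields.items():
        try:
            config[k] = str(v)  # normal path
        except (AttributeError, TypeError):
            config[k] = repr(type(v))  # fallback: record the field's type
    return config

result = serialize_fields({"num_layers": 12, "weird": BadStr()})
print(result["num_layers"])  # "12"
print(result["weird"])       # e.g. "<class '__main__.BadStr'>"
```

With the stray pre-try assignment removed, the `TypeError` is raised inside the `try` block and the `except` clause actually gets a chance to substitute `repr(type(v))`.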
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 782435f7-d420-4661-8805-3bf2d93883dd
📒 Files selected for processing (1)
modelopt/torch/opt/plugins/mcore_dist_checkpointing.py
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Signed-off-by: Keval Morabia <28916987+kevalmorabia97@users.noreply.github.com>
Codecov Report
✅ All modified and coverable lines are covered by tests.

Coverage Diff (main vs #983):

|          | main   | #983   | +/-    |
|----------|--------|--------|--------|
| Coverage | 72.10% | 72.12% | +0.02% |
| Files    | 209    | 209    |        |
| Lines    | 23629  | 23628  | -1     |
| Hits     | 17037  | 17042  | +5     |
| Misses   | 6592   | 6586   | -6     |

☔ View full report in Codecov by Sentry.
Signed-off-by: Chenhan Yu <chenhany@nvidia.com>
    try:
        config[k] = str(v)
    except (AttributeError, TypeError):
        config[k] = repr(type(v))
        print("Warning: TransformerConfig.{} does not have _repr_ implemented.")
Did you mean to add something to the empty {}?
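If the intent was to name the offending field, the placeholder needs an argument. A hedged sketch of what the completed message might look like, using the loop variable `k` from the surrounding snippet (the field name below is only illustrative):

```python
# Hypothetical completion: pass the field name into the `{}` placeholder.
# `k` plays the role of the loop variable; "init_method" is an example value.
k = "init_method"
print("Warning: TransformerConfig.{} does not have __repr__ implemented.".format(k))

# Equivalent f-string form:
print(f"Warning: TransformerConfig.{k} does not have __repr__ implemented.")
```

As written in the PR, the empty `{}` is printed literally because no `.format()` argument (or f-string prefix) supplies a value for it.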
What does this PR do?
Type of change: Bug fix
An issue is reported in #981 where `str(v)` on some `TransformerConfig` fields will raise `TypeError`. The fix is to catch this error and do `repr(type(v))` instead.

Usage
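A minimal reproduction of the failure mode described above. `Unstringifiable` is an illustrative stand-in for the affected `TransformerConfig` field, not the actual type from issue #981:

```python
# Simulate a config field for which str(v) raises TypeError.
class Unstringifiable:
    def __str__(self):
        raise TypeError("cannot stringify this field")

v = Unstringifiable()
try:
    s = str(v)
except (AttributeError, TypeError):
    s = repr(type(v))  # the fix: fall back to the type's repr

print(s)  # e.g. "<class '__main__.Unstringifiable'>"
```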
# Add a code snippet demonstrating how to use this

Testing
Before your PR is "Ready for review"
Make sure you read and follow Contributor guidelines and your commits are signed (`git commit -s -S`).
Make sure you read and follow the Security Best Practices (e.g. avoiding hardcoded `trust_remote_code=True`, using `torch.load(..., weights_only=True)`, avoiding `pickle`, etc.).

Additional Information
Summary by CodeRabbit