From 866b849f6778be1f784522aea4af41c3f160943b Mon Sep 17 00:00:00 2001
From: Eduardo Patrocinio
Date: Mon, 24 Nov 2025 09:58:33 -0500
Subject: [PATCH] Fix retain_grad() documentation for leaf vs non-leaf tensors

The documentation incorrectly stated that calling retain_grad() on a
non-leaf node results in a no-op. This is misleading because:

- retain_grad() on a non-leaf tensor with requires_grad=True correctly
  retains gradients (not a no-op)
- retain_grad() on a leaf tensor is a no-op (already retains by default)
- retain_grad() on a tensor with requires_grad=False throws an error

Updated the documentation and summary table to accurately reflect this
behavior.
---
 .../understanding_leaf_vs_nonleaf_tutorial.py | 12 +++++++-----
 1 file changed, 7 insertions(+), 5 deletions(-)

diff --git a/beginner_source/understanding_leaf_vs_nonleaf_tutorial.py b/beginner_source/understanding_leaf_vs_nonleaf_tutorial.py
index 740c4d4bd76..6c8fa91a011 100644
--- a/beginner_source/understanding_leaf_vs_nonleaf_tutorial.py
+++ b/beginner_source/understanding_leaf_vs_nonleaf_tutorial.py
@@ -265,8 +265,10 @@
 #
 # Computational graph after backward pass
 #
-# If you call ``retain_grad()`` on a non-leaf node, it results in a no-op.
-# If we call ``retain_grad()`` on a node that has ``requires_grad=False``,
+# If you call ``retain_grad()`` on a leaf tensor, it results in a no-op
+# since leaf tensors already retain their gradients by default (when
+# ``requires_grad=True``).
+# If we call ``retain_grad()`` on a tensor that has ``requires_grad=False``,
 # PyTorch actually throws an error, since it can’t store the gradient if
 # it is never calculated.
 #
@@ -298,13 +300,13 @@
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
 # | ``is_leaf``    | ``requires_grad``      | ``retains_grad``       | ``require_grad()``                                | ``retain_grad()``                   |
 # +================+========================+========================+===================================================+=====================================+
-# | ``True``       | ``False``              | ``False``              | sets ``requires_grad`` to ``True`` or ``False``   | no-op                               |
+# | ``True``       | ``False``              | ``False``              | sets ``requires_grad`` to ``True`` or ``False``   | throws error                        |
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
-# | ``True``       | ``True``               | ``False``              | sets ``requires_grad`` to ``True`` or ``False``   | no-op                               |
+# | ``True``       | ``True``               | ``False``              | sets ``requires_grad`` to ``True`` or ``False``   | no-op (already retains)             |
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
 # | ``False``      | ``True``               | ``False``              | no-op                                             | sets ``retains_grad`` to ``True``   |
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
-# | ``False``      | ``True``               | ``True``               | no-op                                             | no-op                               |
+# | ``False``      | ``True``               | ``True``               | no-op                                             | no-op (already retains)             |
 # +----------------+------------------------+------------------------+---------------------------------------------------+-------------------------------------+
 #
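
The following minimal sketch (not part of the patch) illustrates the three
retain_grad() behaviors documented above; the tensor names a, b, c, and d are
made up for illustration:

    import torch

    # Leaf tensor with requires_grad=True: retain_grad() is a no-op,
    # since leaf tensors already retain their gradient by default.
    a = torch.tensor([2.0], requires_grad=True)
    a.retain_grad()

    # Non-leaf tensor with requires_grad=True: retain_grad() sets
    # retains_grad=True so b.grad is populated during backward().
    b = a * 3
    b.retain_grad()

    c = (b ** 2).sum()
    c.backward()
    print(a.grad)  # tensor([36.]) -- leaf, retained by default
    print(b.grad)  # tensor([12.]) -- non-leaf, retained only because of retain_grad()

    # Tensor with requires_grad=False: retain_grad() raises a RuntimeError,
    # since no gradient will ever be computed for it.
    d = torch.tensor([1.0])
    try:
        d.retain_grad()
    except RuntimeError as err:
        print(err)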