
Feedback about Optimizing Model Parameters Page #3507

@madhaven

Description

There is an issue on this page: https://docs.pytorch.org/tutorials/beginner/basics/optimization_tutorial.html

In the Full Implementation section, the training loop does not call optimizer.zero_grad() before the backpropagation block, as recommended in the paragraph preceding that section.

Actual code:

# Backpropagation
loss.backward()
optimizer.step()
optimizer.zero_grad()

Recommended code:

# Backpropagation
optimizer.zero_grad()
loss.backward()
optimizer.step()
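
For reference, here is how the complete training loop would read with the recommended ordering. This is a sketch following the names used in the tutorial (train_loop, dataloader, model, loss_fn, optimizer), not a verbatim copy of the tutorial code:

def train_loop(dataloader, model, loss_fn, optimizer):
    model.train()
    for X, y in dataloader:
        # Compute prediction and loss
        pred = model(X)
        loss = loss_fn(pred, y)

        # Backpropagation: reset gradients before computing new ones,
        # so gradients from the previous batch do not accumulate
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

Zeroing first makes explicit that each batch's gradients are computed fresh, which matches the explanation given earlier on the page about gradients adding up by default.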

If you could tell me how to make this change to the documentation, I would be glad to do so.
