Conversation

ailzhang (Contributor) commented Mar 5, 2021

Stack from ghstack:

RFC: pytorch/rfcs#17

Differential Revision: D26973911

facebook-github-bot (Contributor) commented Mar 5, 2021

💊 CI failures summary and remediations

As of commit 09f5448 (more details on the Dr. CI page):


  • 1/1 failures possibly* introduced in this PR
    • 1/1 non-scanned failure(s)

This comment was automatically generated by Dr. CI.

ailzhang changed the title from "Implement public API AutoInferenceMode and its error handling" to "[WIP] Implement public API AutoInferenceMode and its error handling" on Mar 5, 2021
ailzhang pushed commits that referenced this pull request between Mar 5 and Mar 10, 2021
ezyang (Contributor) commented Mar 24, 2021

I was reviewing the spec some more and I think we can probably do some more safe relaxations:

  • Taking a view of an inference-mode tensor outside of inference mode is safe; you just need to produce an inference tensor in this case as well (currently this is an error).
  • In-place modifying an inference-mode tensor outside of inference mode is safe, and you can skip the version-counter (VC) bump.

(Don't block this PR on it)
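The two relaxations above can be sketched as a toy model. The names here (`TensorKind`, `view_of`, `inplace_update`) are illustrative only, not PyTorch API; the sketch just makes the proposed rules concrete.

```python
from enum import Enum, auto

class TensorKind(Enum):
    NORMAL = auto()
    INFERENCE = auto()

def view_of(src: TensorKind, in_inference_mode: bool) -> TensorKind:
    # Relaxation 1: taking a view of an inference tensor outside
    # inference mode is safe; the view is simply an inference tensor
    # too, instead of erroring as the PR currently does.
    if src is TensorKind.INFERENCE:
        return TensorKind.INFERENCE
    return TensorKind.INFERENCE if in_inference_mode else TensorKind.NORMAL

def inplace_update(kind: TensorKind, version_counter: int) -> int:
    # Relaxation 2: in-place ops on an inference tensor outside
    # inference mode can skip the version-counter (VC) bump, since
    # autograd never tracks inference tensors.
    if kind is TensorKind.INFERENCE:
        return version_counter
    return version_counter + 1
```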

bhosmer left a comment

This LGTM per Ed's analysis here and offline conversation 😁. One note about setting requires_grad, but if we want to do anything about it, that can definitely wait for a follow-up.

ASSERT_TRUE(is_inference_tensor(c));

torch::Tensor tmp = torch::ones({1, 2, 3}).set_requires_grad(true);
ASSERT_TRUE(tmp.requires_grad()); // requires_grad is silently ignored when it's an inference tensor.

Did we decide there was a reason we wanted to silently ignore this, like code reuse? It seems clear it would be saner devex to actually error if you set requires_grad=true on an inference tensor.

A contributor replied:

The code got edited above, but just to be super explicit: it's not silently ignored; we DO set requires_grad=True (just like in no-grad mode). The idea behind not erroring is that code that initializes parameters can run in inference mode without triggering errors.

ailzhang (author) replied:

Yea, code reuse is the main reason. Another reason is that an inference tensor can actually have requires_grad=true (and gradients), as shown in the second PR. :D
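To pin down the behavior this thread settles on, here is a toy sketch (the `ToyTensor` class and its fields are illustrative, not PyTorch internals): the flag is honored rather than ignored or rejected, mirroring no-grad mode, which is what lets parameter-initialization code run unchanged under InferenceMode.

```python
class ToyTensor:
    def __init__(self, is_inference: bool):
        self.is_inference = is_inference
        self.requires_grad = False
        self.grad_fn = None  # inference tensors never record a grad_fn

    def set_requires_grad(self, flag: bool) -> "ToyTensor":
        # Honor the flag even for inference tensors (no error, no
        # silent ignore), just as no-grad mode does; this keeps
        # parameter-initialization code working inside InferenceMode.
        self.requires_grad = flag
        return self

# Mirrors the test snippet quoted above: the flag sticks, but no
# autograd graph is ever recorded for the inference tensor.
tmp = ToyTensor(is_inference=True).set_requires_grad(True)
```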

@ailzhang ailzhang requested a review from bdhirsh March 25, 2021 18:20
albanD (Collaborator) left a comment

One case that I don't see mentioned here is:

a = torch.rand(10, requires_grad=True)
with inference_mode():
    b = a.view_as(a)
with torch.no_grad():
    c = b.view_as(b)
c += 1

I guess this is fine, but just want to make sure

/// Handles correctly propagating CreationMeta when a new view is created from a previous view.
/// In general, we don't want the new view to be _less_ restrictive than the previous view
/// (it's okay to be _more_ restrictive). A CreationMeta value of DEFAULT is currently the least
/// restrictive, as the behavior for all other CreationMeta values is to error out for in-place ops.
/// If this changes, the logic here will need to be updated to properly handle the new semantics.
inline CreationMeta propagate_creation_meta(CreationMeta prev_view_creation_meta, CreationMeta new_view_creation_meta) {
  return (new_view_creation_meta == CreationMeta::DEFAULT) ? prev_view_creation_meta : new_view_creation_meta;
}

The propagate_creation_meta helper above handles that case properly.
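Transcribed into Python (the enum is trimmed to the values relevant here and is illustrative, not the real autograd enum), the helper and albanD's example look like:

```python
from enum import Enum

class CreationMeta(Enum):
    # Trimmed to the values relevant to the example; the real enum
    # has additional members.
    DEFAULT = 0
    NO_GRAD_MODE = 1
    INFERENCE_MODE = 2

def propagate_creation_meta(prev: CreationMeta, new: CreationMeta) -> CreationMeta:
    # A new view may only get MORE restrictive: DEFAULT (the least
    # restrictive value) defers to whatever the previous view recorded.
    return prev if new is CreationMeta.DEFAULT else new

# b = a.view_as(a) under inference_mode (a is a plain leaf, so DEFAULT)
b_meta = propagate_creation_meta(CreationMeta.DEFAULT, CreationMeta.INFERENCE_MODE)
# c = b.view_as(b) under no_grad
c_meta = propagate_creation_meta(b_meta, CreationMeta.NO_GRAD_MODE)
# c_meta is still non-DEFAULT, so per the doc comment above the
# in-place `c += 1` is still governed by the restrictive error path.
```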

Also, after looking at all the examples, I am now convinced that we should present inference mode as "no-grad, but more restrictive", since most of the behavior is actually the same.

@ailzhang ailzhang requested a review from albanD March 26, 2021 17:06
albanD (Collaborator) left a comment

LGTM!
Thanks for all the updates!

@ailzhang merged this pull request in 7caa464.

This pull request has been reverted by 263180d.

ailzhang pushed a commit to ailzhang/pytorch that referenced this pull request Mar 31, 2021

Summary:
Pull Request resolved: pytorch#55008

Reland of pytorch#53343.

For easier review, here's a diff between the version before revert. https://www.internalfb.com/phabricator/paste/view/P361764610

Test Plan: Imported from OSS

Differential Revision: D27443229

Pulled By: ailzhang

fbshipit-source-id: faeaff3b6165b933c9f354d5f0344e38269fbb12
@facebook-github-bot facebook-github-bot deleted the gh/ailzhang/51/head branch March 31, 2021 14:17
facebook-github-bot pushed a commit that referenced this pull request Mar 31, 2021
Summary:
https://www.internalfb.com/phabricator/paste/view/P360377337

Pull Request resolved: #53343

For easier review, here's a diff between the version before revert. https://www.internalfb.com/phabricator/paste/view/P360750919

Pull Request resolved: #55008

Test Plan: Imported from OSS

Pulled By: ailzhang

Reviewed By: bhosmer

Differential Revision: D27443229

fbshipit-source-id: 01b03446a1f6373f43dd5c7170d26226b50f363c