Update validation, remove slicing logic from classes #1660
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchtune/1660
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 3dfd1dd with merge base bf93806.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Minor nits. I worry about the logic for checking whether the cache is enabled; my intuition is that it's not robust enough.
I would like to see a test confirming that these errors are properly raised. Do you think that's possible?
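For illustration, a minimal sketch of such a test, assuming a `decoder` fixture along the lines of the existing `TransformerDecoder` fixtures in the test suite; the fixture name and the exact `setup_caches` signature are assumptions, not the actual test:

```python
import pytest
import torch


def test_forward_raises_without_mask_when_caches_enabled(decoder):
    # Once KV-caches are enabled, forward() should demand an explicit causal mask.
    decoder.setup_caches(batch_size=2, dtype=torch.float32)
    tokens = torch.randint(0, 100, (2, 8))
    with pytest.raises(ValueError, match="causal masks must be provided"):
        decoder(tokens)
```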
        return self.decoder_max_cache_seq_len is not None

    def caches_are_enabled(self) -> bool:
        """Check if the key value caches are setup."""
        return self.layers[0].cache_enabled
I see that in the old logic we checked encoder/decoder separately. It seems that this is not necessary and we can always check layer 0.
Is that enough? It makes sense to me, just double-checking.
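As a rough sketch of why layer 0 is representative (illustrative pseudocode, not the actual torchtune implementation; method and argument names are assumptions): caches are only ever enabled for all layers in one pass, so the per-layer flags cannot diverge.

```python
def setup_caches(self, batch_size, dtype):
    # Every layer is configured in the same loop, so cache_enabled is
    # either True for all layers or False for all layers.
    for layer in self.layers:
        layer.setup_cache(batch_size, dtype)


def caches_are_enabled(self):
    # Layer 0 is therefore a valid proxy for the whole stack.
    return self.layers[0].cache_enabled
```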
torchtune/modules/transformer.py (Outdated)
        layers which have cache enabled."""
        return self.decoder_max_cache_seq_len is not None

    def caches_are_enabled(self) -> bool:
        """Check if the key value caches are setup."""
nit: fine as it is, but maybe add context about how/when this is used. Something like: "useful during inference to xyz". Feel free to ignore
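One possible wording, purely illustrative:

```python
def caches_are_enabled(self) -> bool:
    """Check if the key value caches are set up. Useful during inference,
    e.g. to decide whether an explicit causal mask must be passed to
    ``forward`` and whether cache positions need to be advanced."""
    return self.layers[0].cache_enabled
```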
"KV-caches for cross-attention/fusion layers are setup for inference mode, causal masks must be provided!" | ||
" Use the `encoder_mask` arg to provide a causal mask." | ||
"KV-caches for cross-attention/fusion layers are setup for inference mode and you seem to be using" | ||
" encoder_input, causal masks must be provided! Use the `encoder_mask` arg to provide a causal mask." |
nit: prob a period instead of comma would be better
torchtune/modules/transformer.py (Outdated)
            if mask is None:
                raise ValueError(
                    "KV-caches for self-attention layers are setup for inference mode, causal masks must be provided!"
                    " Use the `mask` arg to provide a causal mask."
                )
if self.encoder_caches_are_enabled():
if encoder_mask is None:
if encoder_input is None and encoder_mask:
Is there some good default for encoder_mask, like a causal mask? If not, just mark as resolved.
I plan on removing / re-working most of this validation logic to have better defaults, but that will be a follow-up PR.
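For reference, the simplest default would be a lower-triangular causal mask; a sketch under the assumption of a square decoder-only case, with a hypothetical `default_causal_mask` helper that is not part of torchtune. For cross-attention the mask relates decoder positions to encoder positions, so a square causal mask is only meaningful when the two lengths match, which is presumably why better defaults are deferred to the follow-up.

```python
import torch


def default_causal_mask(seq_len: int, device=None) -> torch.Tensor:
    # Lower-triangular boolean mask: position i may attend to positions <= i.
    return torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool, device=device))
```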
stamping to unblock, but IMO adding a small test before merging would be much better
Context
What is the purpose of this PR? Is it to
Please link to any issues this PR addresses.
Changelog
What are the changes made in this PR?
*
Test plan
Please make sure to do each of the following if applicable to your PR. If you're unsure about any one of these, just ask and we will happily help. We also have a contributing page for some guidance on contributing.
pre-commit install
pytest tests
pytest tests -m integration_test
UX
If your function changed a public API, please add a dummy example of what the user experience will look like when calling it.
Here is a docstring example
and a tutorial example
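A sketch of the user-facing behavior after this change (illustrative only: `model` stands in for any cache-enabled TransformerDecoder, and the error text is taken from the diff above):

```python
import torch

model.setup_caches(batch_size=1, dtype=torch.float32)
tokens = torch.tensor([[1, 2, 3]])

model(tokens)
# ValueError: KV-caches for self-attention layers are setup for inference mode,
# causal masks must be provided! Use the `mask` arg to provide a causal mask.

causal_mask = torch.tril(torch.ones(1, 3, 3, dtype=torch.bool))
model(tokens, mask=causal_mask)  # succeeds once a causal mask is supplied
```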