data monitor callbacks #285
Conversation
Codecov Report
@@ Coverage Diff @@
## master #285 +/- ##
==========================================
- Coverage 83.86% 83.12% -0.74%
==========================================
Files 91 92 +1
Lines 4858 5066 +208
==========================================
+ Hits 4074 4211 +137
- Misses 784 855 +71
@mock.patch("pl_bolts.callbacks.data_monitor.TrainingDataMonitor.log_histogram")
def test_training_data_monitor(log_histogram, tmpdir):
Could this be done in 3 tests that differ only in their parameters?
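The reviewer's suggestion can be sketched with `pytest.mark.parametrize`: rather than three near-identical test functions, a single parametrized test covers all variants. The parameter values and the `should_log` helper below are illustrative, not taken from the PR.

```python
import pytest

def should_log(batch_idx, log_every_n_steps):
    """Log on every n-th batch (0-indexed), mirroring a typical monitor interval check."""
    return (batch_idx + 1) % log_every_n_steps == 0

# One test body, three parameter sets, instead of three copied test functions.
@pytest.mark.parametrize("log_every_n_steps,expected_calls", [(1, 10), (2, 5), (5, 2)])
def test_log_interval(log_every_n_steps, expected_calls):
    # count how many of 10 batches would trigger a log call
    calls = sum(should_log(i, log_every_n_steps) for i in range(10))
    assert calls == expected_calls
```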
What does this PR do?
I'd like to contribute two callbacks, TrainingDataMonitor and ModuleDataMonitor, as proposed in #194 and in https://github.com/awaelchli/pytorch-lightning-snippets#callbacks:
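The core idea behind a data monitor callback can be sketched without any Lightning machinery: bucket each training batch's values into a histogram so that shifts in the input distribution become visible in the logger. The function below is a minimal, self-contained illustration, not the pl_bolts implementation.

```python
def batch_histogram(values, n_bins=5, lo=0.0, hi=1.0):
    """Bucket `values` into `n_bins` equal-width bins over [lo, hi].

    Assumes all values lie within [lo, hi]; a real monitor would derive
    the range from the batch and hand the counts to the trainer's logger.
    """
    counts = [0] * n_bins
    width = (hi - lo) / n_bins
    for v in values:
        # clamp v == hi into the last bin
        idx = min(int((v - lo) / width), n_bins - 1)
        counts[idx] += 1
    return counts
```

A callback like TrainingDataMonitor would compute something along these lines per batch and log it at a configurable step interval.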