Remove support for logging multiple metrics together #16389
Conversation
LGTM!
What does this PR do?
Removes support for `self.log("key", {"something": 123})`, i.e. logging a dictionary of metrics under a single key.
If the logger instance supports this, we suggest logging directly to it instead of going through the `self.log` mechanism. Keys logged this way could not be monitored by `ModelCheckpoint` or `EarlyStopping` anyway.
The objective is to simplify the logging internals, as this is a very niche feature.
Does your PR introduce any breaking changes? If yes, please list them.
Removes the above.
cc @justusschock @awaelchli @carmocca @Blaizzy @Borda