
Overview¶

Comet automatically tracks a broad set of logs for your Experiment, and also lets you add any custom logging you need through built-in methods.

Custom logging¶

The Experiment object lets you keep track of virtually any custom experiment attribute of interest through its logging methods.
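For instance, here is a minimal sketch of custom logging with the Python SDK; the API key, project name, and the logged names and values are placeholders:

```python
from comet_ml import Experiment

# Create an Experiment; credentials can also come from environment
# variables or a Comet config file instead of being passed here.
experiment = Experiment(
    api_key="YOUR_API_KEY",      # placeholder
    project_name="my-project",   # placeholder
)

# Log a single hyperparameter and a metric at a given step.
experiment.log_parameter("learning_rate", 0.001)
experiment.log_metric("train_loss", 0.42, step=1)

# Log any other custom attribute of interest as a key-value pair.
experiment.log_other("dataset_version", "v2")

experiment.end()
```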

This section provides detailed documentation for the most commonly used log types.

For other supported logs, refer to the Experiment reference.

Automated logging¶

When an Experiment is initialized, Comet automatically turns on logging for the following attributes:

  • Script code and file name, or Jupyter Notebook history
  • Git metadata and patch
  • Model graph representation (see below)
  • Model weights and biases (see below)
  • Model hyperparameters (see below)
  • Training metrics (see below)
  • Command-line arguments to script
  • Console and Jupyter Notebook standard output and error
  • Environment GPU, CPU, host name, and more
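
Each of these attributes can also be toggled individually when the Experiment is created. The following is a minimal sketch, assuming the standard comet_ml constructor arguments; check the Experiment reference for the authoritative list of parameters and their defaults:

```python
from comet_ml import Experiment

# Automated logging is on by default; the keyword arguments below
# illustrate how individual attributes map to constructor switches.
experiment = Experiment(
    project_name="my-project",      # placeholder
    log_code=True,                  # script code / notebook history
    log_graph=True,                 # model graph representation
    auto_param_logging=True,        # model hyperparameters
    auto_metric_logging=True,       # training metrics
    parse_args=True,                # command-line arguments
    auto_output_logging="default",  # console / notebook output
    log_env_gpu=True,               # environment GPU details
    log_env_cpu=True,               # environment CPU details
    log_env_host=True,              # host name and environment details
    log_git_metadata=True,          # Git metadata
    log_git_patch=True,             # Git patch
)
```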

Additionally, Comet offers extended automated logging for the frameworks that have integrations, as summarized below.

| Framework | Logged items |
| --- | --- |
| fast.ai | All PyTorch items, plus epochs and metrics. See examples. |
| Keras | Graph description, steps, metrics, hyperparameters, weights and biases as histograms, optimizer config, and number of trainable parameters. See examples. |
| MLflow | Hyperparameters, assets, models, plus lower-level framework items (for example, TensorFlow metrics, TensorBoard summaries). |
| Prophet | Hyperparameters, model, and figures. |
| PyTorch Lightning | Loss and accuracy. See examples. |
| PyTorch | Graph description, steps, and loss. See examples. |
| Ray Train | Distributed system metrics. See examples. |
| Scikit-learn | Hyperparameters. See examples. |
| TensorBoard | Summary scalars (as metrics) and summary histograms. |
| TensorFlow Model Analysis | Time series, plots, and slicing metrics. |
| XGBoost | Metrics and hyperparameters. See examples. |
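
As an illustration of an extended integration, here is a minimal Keras sketch; the model, data, and project name are placeholders. Importing comet_ml before the ML framework and creating an Experiment is typically all that is needed for metrics, hyperparameters, and the model graph to be captured during training:

```python
from comet_ml import Experiment  # import comet_ml before the ML framework

import numpy as np
from tensorflow import keras

experiment = Experiment(project_name="keras-demo")  # placeholder project name

# A tiny placeholder model and dataset.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(64, 4)
y = np.random.rand(64, 1)

# With the integration active, metrics, hyperparameters, and the model
# graph are logged automatically during fit().
model.fit(x, y, epochs=2, batch_size=8)

experiment.end()
```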