Comet Command-Line Utilities¶
When you install comet_ml, you also install a collection of utilities for the command-line interface (CLI).
- comet upload - for uploading OfflineExperiments
- comet optimize - for easy running of Optimizer scripts in parallel or serial
- comet python - for injecting "import comet_ml" into your scripts
- comet offline - for exploring offline experiment ZIP files
- comet check - for checking and debugging your environment
- comet models - for listing and downloading Registered Models
- comet init - for creating example scripts from cookiecutter recipes
You can interactively get general help on these utilities with:
```bash
comet --help
```
and specific help with any of the following commands:
```bash
comet upload --help
comet optimize --help
comet python --help
comet offline --help
comet check --help
comet models --help
comet models list --help
comet models download --help
comet init --help
```
You can also easily see the comet_ml version by using:
```bash
comet --version
```
The rest of this page describes these utilities.
comet upload¶
The comet upload utility is used to upload OfflineExperiments to Comet.ml. Consider the following command line:
```bash
$ comet upload /tmp/comet/5da271fcb60b4652a51dfc0decbe7cd9.zip
```
This script is installed along with comet_ml. If it fails for any reason, you can try this more direct invocation, using the same Python interpreter that you used when running your script:
```bash
$ python -m comet_ml.scripts.comet_upload /tmp/comet/5da271fcb60b4652a51dfc0decbe7cd9.zip
```
Don’t forget to include your API Key and update the experiment path to the one displayed at the end of your OfflineExperiment script run. For more details on configuring Python, please see Comet config file.
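For context, the ZIP archive in the example above is what an OfflineExperiment writes out when it finishes. Below is a minimal sketch of such a run; the project name, metric, and offline directory are illustrative values, not requirements:

```python
# Sketch of an offline run that produces a ZIP archive for `comet upload`
from comet_ml import OfflineExperiment

experiment = OfflineExperiment(
    project_name="general",          # illustrative project name
    offline_directory="/tmp/comet",  # directory where the ZIP archive is written
)
experiment.log_metric("accuracy", 0.95)  # illustrative metric
experiment.end()  # on completion, the path of the ZIP to upload is reported
```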
Sending multiple offline experiments is easy. To do so, execute the same comet upload command as before, but replace the single path with a pattern that matches your experiment archives, like so:
```bash
$ comet upload /path/to/*.zip
```
or:
```bash
$ python -m comet_ml.scripts.comet_upload /path/to/*.zip
```
Debugging¶
If you encounter any bugs with either the OfflineExperiment class or the upload process, please run the uploader with debug logging enabled:
```bash
$ COMET_LOGGING_FILE_LEVEL=debug \
  COMET_LOGGING_FILE=/tmp/comet.debug.log \
  COMET_API_KEY=MY_API_KEY \
  comet upload /path/to/experiments/*.zip
```
or:
```bash
$ COMET_LOGGING_FILE_LEVEL=debug \
  COMET_LOGGING_FILE=/tmp/comet.debug.log \
  COMET_API_KEY=MY_API_KEY \
  python -m comet_ml.scripts.comet_upload /path/to/experiments/*.zip
```
The debug logs will be in /tmp/comet.debug.log. This log will show details on all of the steps of the process. If you still have problems, please share this file with us via the Comet.ml Slack channel.
comet optimize¶
The comet optimize utility runs the Comet.ml optimizer in parallel or in serial. The format of the command line is:
```bash
$ comet optimize [options] [PYTHON_SCRIPT] OPTIMIZER
```
where OPTIMIZER is a JSON config file or an optimizer ID, and PYTHON_SCRIPT is a regular Python script that takes the optimizer config file or optimizer ID as an argument. If PYTHON_SCRIPT is not included, then an optimizer is created and its optimizer ID is displayed.
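For reference, here is a minimal sketch of what such a PYTHON_SCRIPT can look like. The parameter name and the dummy objective are placeholders; a real script would train a model and log its actual metric:

```python
# script.py (sketch) -- run with: comet optimize script.py opt.json
import sys

from comet_ml import Optimizer

# The last command-line argument is the optimizer config file (or optimizer ID)
# that `comet optimize` passes through to the script.
opt = Optimizer(sys.argv[-1])

# Each experiment comes with a suggested parameter set drawn from the
# search space defined in the optimizer config.
for experiment in opt.get_experiments():
    lr = experiment.get_parameter("learning_rate")  # placeholder parameter name
    loss = (lr - 0.01) ** 2                         # dummy objective for illustration
    experiment.log_metric("loss", loss)
    experiment.end()
```

The optimizer config itself (opt.json above) defines the search algorithm, the parameter space, and the metric to optimize; see the Comet Optimizer documentation for its schema.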
Positional arguments:
- PYTHON_SCRIPT - the name of the script to run
- OPTIMIZER - optimizer JSON file or optimizer ID
Optional arguments:
```
-h, --help            show this help message and exit
-j PARALLEL, --parallel PARALLEL
                      number of parallel runs
-t TRIALS, --trials TRIALS
                      number of trials per parameter configuration
-e EXECUTABLE, --executable EXECUTABLE
                      Run using an executable other than Python
-d DUMP, --dump DUMP  Dump the parameters to given filename
```
Note that comet optimize requires your COMET_API_KEY to be pre-configured in one of the supported ways, for example in an environment variable or in your .comet.config file.
Examples of calling comet optimize:
```bash
$ export COMET_API_KEY=a287c4e3374d3645f3465346cc5
$ export COMET_OPTIMIZER_ID=$(comet optimize opt.json)
$ comet optimize script.py opt.json
$ comet optimize -j 4 script.py opt.json
```
To use an executable other than python, use -e, like so:
```bash
$ comet optimize -e "run-on-cluster.sh" script.py opt.json
```
If you want to dedicate particular GPUs to particular processes (or apply similar per-process logic), you also have access to the following environment variables:
- COMET_OPTIMIZER_PROCESS_ID - the current job number (starting with 0, up to but not including j)
- COMET_OPTIMIZER_PROCESS_JOBS - the total number of parallel jobs (that is, j)

For more details, see Comet environment variables.
For example, you could call your script as defined above:
```shell
$ comet optimize -j 4 script.py optimize.json
```
In the script, you can access COMET_OPTIMIZER_PROCESS_ID and COMET_OPTIMIZER_PROCESS_JOBS and use particular GPU configurations:
```python
# script.py
import os

# setup as per above

# Environment variables are strings, so convert them to integers
process_id = int(os.environ["COMET_OPTIMIZER_PROCESS_ID"])
process_jobs = int(os.environ["COMET_OPTIMIZER_PROCESS_JOBS"])

# Handle process_id's 0 through process_jobs - 1
if process_id == 0:
    pass  # handle j == 0
elif process_id == 1:
    pass  # handle j == 1
elif process_id == 2:
    pass  # handle j == 2
elif process_id == 3:
    pass  # handle j == 3
```
comet python¶
The comet python utility is used to execute a Python script and import comet_ml automatically. Although you still need to import comet_ml in your script, you do not need to import comet_ml before your machine learning libraries anymore.
Usage:
```bash
comet python [-h] [-p PYTHON] [-m MODULE] python_script
```
Positional arguments:
- python_script: the python script to launch
Optional arguments:
```
-h, --help            show this help message and exit
-p PYTHON, --python PYTHON
                      Which Python interpreter to use
-m MODULE, --module MODULE
                      Run library module as a script
```
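For example, a script launched with comet python can import its machine learning libraries before comet_ml. This is a minimal, illustrative sketch; train.py, the keras import, and the project name are placeholders:

```python
# train.py (illustrative) -- launch with: comet python train.py
from tensorflow import keras  # ML framework imported first

import comet_ml  # still imported, but the ordering no longer matters

experiment = comet_ml.Experiment(project_name="my-project")  # placeholder project
experiment.log_parameter("hidden_units", 64)
experiment.end()
```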
comet offline¶
The comet offline utility is used to explore offline experiment archives.
Usage:
```
comet offline [-h] [--csv] [--section SECTION] [--level LEVEL]
              [--name NAME] [--output OUTPUT] [--raw-size]
              [--no-header]
              archives [archives ...]
```
This command line displays summaries of offline experiments:
```bash
$ comet offline *.zip
```
You may also display the ZIP details in CSV (Comma-Separated Values) format. This format shows an experiment's data as one row per item, with columns in the following order:
- Workspace
- Project
- Experiment
- Level
- Section
- Name
- Value
where:
- Workspace: the name of a specific workspace, or DEFAULT
- Project: the name of a specific project, or "general"
- Experiment: the experiment key for this experiment
- Level: detail, maximum, or minimum
- Section: metric, param, log_other, etc.
- Name: name of metric, param, etc.
```bash
$ comet offline --csv *.zip
```
You may use the optional flags --level, --section, or --name to filter the rows. For example, to show only the detail-level rows, use this command line:
```bash
$ comet offline --level detail *.zip
```
Note that using --level, --section, or --name implies --csv.
Positional arguments:
- archives: the offline experiment archives to display
Optional arguments:
```
-h, --help         show this help message and exit
--csv              output details in csv format
--section SECTION  output specific section in csv format, including param,
                   metric, log_other, data, etc.
--level LEVEL      output specific summary level in csv format, including
                   minimum, maximum, detail
--name NAME        output specific name in csv format, including items like
                   loss, acc, etc.
--output OUTPUT    output filename for csv format
--raw-size         Use bytes for file sizes
--no-header        Use this flag to suppress CSV header
```
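If you want to post-process the CSV output from Python, the column order documented above maps directly onto a standard csv.reader. This is a small sketch under the assumption that the output was saved with --output summary.csv (a hypothetical filename):

```python
# Sketch: read the output of `comet offline --csv --output summary.csv *.zip`
import csv

with open("summary.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader, None)  # skip the header row (not present if --no-header was used)
    for workspace, project, experiment, level, section, name, value in reader:
        if section == "metric" and level == "maximum":
            print(f"{experiment}: max {name} = {value}")
```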
comet check¶
The comet check command checks whether your environment is set up properly to use Comet.
Usage:
```
comet check [-h] [--debug]
```
The simplest use is:
```bash
$ comet check
COMET INFO: ================================================================================
COMET INFO: Checking connectivity to server...
COMET INFO: ================================================================================
COMET INFO: Configured server address 'https://www.comet.ml/clientlib/'
COMET INFO: Server address was configured in INI file '/home/user/.comet.config'
COMET INFO: Server connection is ok

COMET INFO: ================================================================================
COMET INFO: Checking connectivity to Rest API...
COMET INFO: ================================================================================
COMET INFO: Configured Rest API address 'https://www.comet.ml/api/rest/v2/'
COMET INFO: Rest API address was configured in INI file '/home/user/.comet.config'
COMET INFO: REST API connection is ok

COMET INFO: ================================================================================
COMET INFO: Checking connectivity to Websocket Server
COMET INFO: ================================================================================
COMET WARNING: No WS address configured on client side, fallbacking on default WS address
'wss://www.comet.ml/ws/logger-ws'. If that's incorrect set the WS url through the
comet.ws_url_override config key.
COMET INFO: Configured WS address 'wss://www.comet.ml/ws/logger-ws'
COMET INFO: Websocket connection is ok

COMET INFO: ================================================================================
COMET INFO: Checking connectivity to Optimizer Server
COMET INFO: ================================================================================
COMET INFO: Configured Optimizer address 'https://www.comet.ml/optimizer/'
COMET INFO: Optimizer address was configured in INI file '/home/user/.comet.config'
COMET INFO: Optimizer connection is ok

COMET INFO: Summary
COMET INFO: --------------------------------------------------------------------------------
COMET INFO: Server connectivity              True
COMET INFO: Rest API connectivity            True
COMET INFO: WS server connectivity           True
COMET INFO: Optimizer server connectivity    True
```
Running with the --debug flag will provide additional details. This is quite handy for tracking down issues, especially with a new environment, or on an on-prem installation.
comet models¶
The comet models command is used to list registered models and download them to your local file system.
Usage:
```
comet models download [-h]
                      --workspace WORKSPACE
                      --model-name MODEL_NAME
                      (--model-version MODEL_VERSION | --model-stage MODEL_STAGE)
                      [--output OUTPUT]
```
or:
```
comet models list [-h] --workspace WORKSPACE
```
For downloading a model, you must provide the name of the workspace and the registered model name. You must also provide a specific version or stage.
For example, to download a registry model named "My Model" from the workspace "My Workspace" at version 1.0.0, you can run:
```bash
$ comet models download \
    --workspace "My Workspace" \
    --model-name "My Model" \
    --model-version "1.0.0"
```
The registry model files will be downloaded to a directory named "model". You can choose a different output directory by using the "--output" flag.
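If you prefer to download a registered model from Python instead of the command line, the comet_ml API object offers a similar capability. This is a sketch only; the workspace and model names are placeholders, and the exact keyword arguments may vary between comet_ml versions:

```python
# Sketch: download a registered model via the Python API
from comet_ml.api import API

api = API()  # uses the API key from your Comet config or environment
api.download_registry_model(
    "my-workspace",         # placeholder workspace name
    "my-model",             # placeholder registered model name
    version="1.0.0",        # or stage="production"
    output_path="./model",  # directory to write the model files into
)
```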
Optional arguments:
```
-h, --help            show this help message and exit
-w WORKSPACE, --workspace WORKSPACE
                      the workspace name of the registry model to download
--model-name MODEL_NAME
                      the name of the registry model to download
--model-version MODEL_VERSION
                      the semantic version of the registry model to download
                      (for example: 1.0.0)
--model-stage MODEL_STAGE
                      the stage of the registry model to download (for
                      example: production)
--output OUTPUT       the output directory where to download the model,
                      defaults to `model`
```
comet init¶
You can use comet init to:

- create a Comet configuration file with your API key; OR
- create a new project directory with sample code based on a template

You may wish to do both, in this order.
The first is used in the terminal like this:
```
$ comet init --api-key
```
This will ask you for your Comet API key. You can also do this programmatically; see Comet Installation for information on using comet_ml.init().
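As a sketch of the programmatic route (the key shown is a placeholder, not a real key):

```python
# Programmatic alternative to `comet init --api-key` (sketch)
import comet_ml

# Prompts interactively if no key is supplied or already configured;
# the value below is a placeholder.
comet_ml.init(api_key="YOUR-API-KEY")
```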
The second is used to create a new project directory with a Python script and dependency file that show how to incorporate Comet with various ML libraries. It is called like this:
```
$ comet init
```
This usage of the comet init command creates example scripts using the cookiecutter recipe system. It currently supports creating example scripts in python and r, which you can select with the --language flag (the default is python).
For example, here is how to create a keras example with a confusion matrix, embedding visualizations, and histograms, using the Comet Optimizer:
```
% comet init
Building Comet example script from recipe...
==================================================
Please answer the following questions:
project_slug [my_project]: my_project
Select online_or_offline:
1 - Online
2 - Offline
Choose from 1, 2 [1]: 1
Select framework:
1 - keras
Choose from 1 [1]: 1
Select confusion_matrix:
1 - Yes
2 - No
Choose from 1, 2 [1]: 1
Select histogram:
1 - Yes
2 - No
Choose from 1, 2 [1]: 1
Select embedding:
1 - Yes
2 - No
Choose from 1, 2 [1]: 1
Select optimizer:
1 - No
2 - Yes
Choose from 1, 2 [1]: 2
```
At this point there should now be an example script in my_project/comet-keras-example.py.
We will continually add example components to the recipes. If you have questions or pull requests, you can submit them at github.com/comet-ml/comet-recipes.
Optional arguments:
```
-h, --help            show this help message and exit
-a, --api-key         Create a ~/.comet.config file with Comet API key
-l LANGUAGE, --language LANGUAGE
                      The language of example script to generate
-r, --replay          Replay the last comet init
-f, --force           Force overwrite output directory if it exists
-o OUTPUT, --output OUTPUT
                      Output directory for scripts to go to
```