Use third-party optimizers

For special use cases where the Comet Optimizer does not support your tuning requirements, you can use your own or any third-party optimizer for hyperparameter tuning and still track your tuning runs while taking advantage of Comet Experiment Management.

How to log custom tuning runs to Comet

You can use Experiment.log_optimization() to log any custom optimization data to Comet. These attributes are logged in the Other tab of the Single Experiment page with the prefix "optimizer_".
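
For example, a single trial's result can be logged with a call like the following (the values shown are illustrative; the signature matches the examples later on this page):

import comet_ml

exp = comet_ml.start()

# Each argument is logged to the Other tab with the "optimizer_" prefix
exp.log_optimization(
    optimization_id="my-search-001",      # groups all trials of one search run
    objective="minimize",                 # direction of the search
    metric_name="loss",                   # metric the search optimizes
    metric_value=0.42,                    # value obtained by this trial
    parameters={"learning_rate": 0.001},  # hyperparameters used in this trial
)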

Your custom tuning script should:

  1. Create a Comet Experiment.
  2. Obtain optimization parameters from your custom or third-party optimizer.
  3. Train (and optionally evaluate) your model with the selected parameters.
  4. Log the optimization data to Comet, along with any relevant metrics, other parameters, and assets.

This logic is showcased in the pseudocode example below:

import comet_ml
import datetime

# Define static parameters which are not returned by the optimizer
static_parameters = {}

def make_id():
    # Create a unique optimizer id
    now = datetime.datetime.now()
    return now.strftime("%Y-%m-%d %H:%M:%S")

OPT_ID = make_id()
OBJECTIVE = "minimize"
METRIC = "loss"

def trial(...):

    # Create a Comet Experiment
    exp = comet_ml.start(...)

    # Obtain optimization parameters from your custom or third-party optimizer
    learning_rate, n_layers = my_optimizer.get_parameters(...)

    # Train and evaluate the model
    loss, accuracy = train_and_evaluate_model(learning_rate, n_layers, static_parameters, ...)

    # Log the optimization data to Comet
    exp.log_optimization(
        optimization_id=OPT_ID,
        objective=OBJECTIVE,
        metric_name=METRIC,
        metric_value=loss,
        parameters={"learning_rate": learning_rate, "n_layers": n_layers},
    )

    # Log relevant metrics, other parameters, and assets to Comet
    exp.log_metric("accuracy", accuracy)
    exp.log_parameters(static_parameters)

    # End the experiment before starting the next trial
    exp.end()

By logging optimization data, you gain access to the same Comet UI functionality as with the Comet Optimizer. Discover more on the Analyze hyperparameter tuning results page.

Warning

The Experiment.log_optimization() method is available in version 3.33.10 and later of the Comet Python SDK.
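
If your environment may be running an older SDK, a quick guard like this can verify the requirement (a sketch, assuming the packaging library is installed):

import comet_ml
from packaging.version import Version

# Experiment.log_optimization() requires Comet Python SDK >= 3.33.10
if Version(comet_ml.__version__) < Version("3.33.10"):
    raise RuntimeError("comet_ml is too old; run: pip install --upgrade comet_ml")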

End-to-end example

Below is an end-to-end example for custom optimization with Comet.

import comet_ml
import datetime
import itertools
import os
import random

# Initialize the Comet SDK
comet_ml.login()

# Define your search space
hyperparameters = {
    "learning-rate": [0.001, 0.1, 0.2, 0.5, 0.9],
    "batch-size": [16, 32, 64],
    "hidden-layer-size": [5, 10, 15],
    "optimizer": ["adam", "sgd"],
}

# Define your tuning config
max_n_trials = 5

# Define support methods for tuning
def my_optimizer(params, shuffle=True):
    if shuffle:
        combinations = []
        for values in itertools.product(*params.values()):
            combinations.append(dict(zip(params, values)))
        random.shuffle(combinations)
        for combo in combinations:
            yield combo
    else:
        for values in itertools.product(*params.values()):
            yield dict(zip(params, values))


def train(**params):
    # Dummy stand-in for real model training: returns a random metric value
    return random.random()

def make_id():
    # Create a unique optimizer id
    now = datetime.datetime.now()
    return now.strftime("%Y-%m-%d %H:%M:%S")

OPT_ID = make_id()
OBJECTIVE = "minimize"
METRIC = "dummy_metric"

# Run tuning with the custom optimizer
# Instantiate the generator once so each trial gets a distinct combination
parameter_generator = my_optimizer(hyperparameters)

for count in range(max_n_trials):

    # Create a Comet Experiment
    exp = comet_ml.start(project_name="dummy-custom-optimization")

    # Obtain optimization parameters from your custom or third-party optimizer
    parameters = next(parameter_generator)

    # Train and evaluate the model
    metric_value = train(**parameters)

    # Log the optimization data to Comet
    exp.log_optimization(
        optimization_id=OPT_ID,
        objective=OBJECTIVE,
        metric_name=METRIC,
        metric_value=metric_value,
        parameters=parameters,
    )

    # Log relevant metrics, other parameters, and assets to Comet
    exp.log_other("optimizer_name", "dummy-optimizer-001")
    exp.log_other("optimizer_version", "dummy-1.0")
    exp.log_other("optimizer_process", os.getpid())
    exp.log_other("optimizer_run_n", count)

    # End the experiment so the next trial starts with a fresh one
    exp.end()

Note that this example custom optimizer has many limitations, including:

  • Shuffling requires building the full list of combinations in memory, which can exhaust memory (and potentially crash the process) for large search spaces; a memory-friendly alternative is sketched after this list.
  • If a trial fails to complete (for example, due to a crash), there is no built-in mechanism to retry that combination, which can leave results incomplete or inconsistent.
  • Only a single trial is run per combination, which may not capture the run-to-run variability of a real-world model training job.
  • There is no support for distributed tuning, which would require a centralized server to hand out combinations to workers, so scalability and efficiency are limited.
  • The tuning approach is a simple shuffled grid search that does not adapt based on the results of previous trials.
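
As a sketch of one way around the first limitation, you can draw each trial's combination on the fly instead of materializing the full Cartesian product, at the cost of possibly sampling the same combination twice:

import random

def sample_combination(params):
    # Pick one value per hyperparameter; the full product is never built
    return {name: random.choice(values) for name, values in params.items()}

# Example: draw one combination from the search space defined above
parameters = sample_combination(hyperparameters)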

You can substitute this simple version with any custom hyperparameter search, or use the Comet Optimizer directly, which addresses all of the issues listed above.
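
For comparison, the same search with the Comet Optimizer looks roughly like this (a sketch reusing the train() stub from the example above; see the Comet Optimizer documentation for the full configuration reference):

import comet_ml

comet_ml.login()

# Search space and tuning config expressed as a Comet Optimizer config
config = {
    "algorithm": "random",
    "parameters": {
        "learning-rate": {"type": "discrete", "values": [0.001, 0.1, 0.2, 0.5, 0.9]},
        "batch-size": {"type": "discrete", "values": [16, 32, 64]},
    },
    "spec": {"metric": "dummy_metric", "objective": "minimize", "maxCombo": 5},
}

opt = comet_ml.Optimizer(config)

for exp in opt.get_experiments(project_name="dummy-custom-optimization"):
    # The Optimizer supplies the parameters and tracks the metric for you
    parameters = {
        name: exp.get_parameter(name) for name in ["learning-rate", "batch-size"]
    }
    metric_value = train(**parameters)
    exp.log_metric("dummy_metric", metric_value)
    exp.end()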
