Hyperparameter Optimization (HPO) with Combinations#
The Combinations API provides a powerful foundation for various Hyperparameter Optimization (HPO) techniques. By generating all valid configurations, you can systematically explore the hyperparameter space using different search strategies.
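Conceptually, "all valid combinations" is just the Cartesian product of each hyperparameter's choices. A minimal, hypster-free sketch of that idea (the parameter names and values here mirror the example below and are purely illustrative):

```python
from itertools import product

# Illustrative search space mirroring the example below
search_space = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [32, 64, 128],
    "num_layers": [2, 3, 4],
}

# Cartesian product of all choices -> one dict per configuration
keys = list(search_space)
combinations = [dict(zip(keys, values)) for values in product(*search_space.values())]

print(len(combinations))  # 3 * 3 * 3 = 27
print(combinations[0])    # {'learning_rate': 0.001, 'batch_size': 32, 'num_layers': 2}
```

Note how the total number of configurations (27 here) grows multiplicatively with each added parameter, which is why exhaustive Grid Search becomes expensive in high-dimensional spaces.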
Grid Search#
Grid Search exhaustively tries all possible combinations. This is straightforward with the Combinations API:
```python
from hypster import HP, config

@config
def model_config(hp: HP):
    learning_rate = hp.select([0.001, 0.01, 0.1])
    batch_size = hp.select([32, 64, 128])
    num_layers = hp.select([2, 3, 4])

# Generate all combinations
combinations = model_config.get_combinations()

# Mock evaluation function
def evaluate_model(params):
    # This is a mock function that returns a random accuracy.
    # In a real scenario, you would train and evaluate your model here.
    import random
    return random.uniform(0, 1)

# Find the best parameters
best_params = None
best_performance = float("-inf")
for params in combinations:
    performance = evaluate_model(params)
    if performance > best_performance:
        best_performance = performance
        best_params = params

print(f"Best parameters: {best_params}")
print(f"Best performance: {best_performance}")
```
Random Search#
Random Search samples randomly from the possible combinations, which can be more efficient for high-dimensional spaces:
```python
import random

num_trials = 20  # Number of random configurations to try
random_combinations = random.sample(combinations, min(num_trials, len(combinations)))

best_params = None
best_performance = float("-inf")
for params in random_combinations:
    performance = evaluate_model(params)
    if performance > best_performance:
        best_performance = performance
        best_params = params

print(f"Best parameters: {best_params}")
print(f"Best performance: {best_performance}")
```
Best parameters: {'learning_rate': 0.1, 'batch_size': 64, 'num_layers': 2}
Best performance: 0.9656099547110238
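Because Random Search draws its trial configurations (and this example's mock evaluator draws its scores) from Python's `random` module, the output above will differ between runs. Seeding the generator makes a run repeatable; a small self-contained sketch, using a stand-in list of configurations rather than the hypster objects above:

```python
import random

combinations = [{"trial": i} for i in range(100)]  # stand-in for model_config.get_combinations()

random.seed(42)  # fix the RNG state before sampling
first_run = random.sample(combinations, 5)

random.seed(42)  # same seed -> same sequence of draws
second_run = random.sample(combinations, 5)

assert first_run == second_run  # identical sampled configurations
```

Seeding is also useful when comparing search strategies, since it ensures each strategy is evaluated against the same sequence of random draws.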