# HP Object Methods in Hypster
The `HP` object provides several methods for defining hyperparameters in your configurations. This guide covers the main methods: `select`, `text`, and `number`.
## 1. The `select` Method

```python
hp.select(options, default=None, name=None)
```

The `select` method allows you to choose from a list or dictionary of options.
### 1.1 Dictionary Options

When using a dictionary, keys must be of type `str`, `int`, `bool`, or `float`. Values can be of any type.
```python
optimizer = hp.select({
    'adam': torch.optim.Adam,
    'sgd': torch.optim.SGD,
    2: 'custom_optimizer',  # int key
    True: lambda lr: torch.optim.AdamW(model.parameters(), lr=lr)  # bool key
}, default='adam')
```

Note that `True == 1` in Python, so a dictionary cannot hold both `1` and `True` as distinct keys; the example therefore uses `2` for the integer key.
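Conceptually, a select resolves the chosen key (or the default) to the value stored under it. The sketch below is a stand-in for intuition only; `resolve` is an illustrative name, not a hypster function:

```python
options = {'adam': 'Adam', 'sgd': 'SGD', 2: 'custom_optimizer'}

def resolve(options, selection=None, default=None):
    # Pick the explicit selection if given, otherwise fall back to the default,
    # then look the key up in the options mapping.
    key = selection if selection is not None else default
    if key not in options:
        raise ValueError(f"Unknown option: {key!r}")
    return options[key]

print(resolve(options, selection='sgd'))  # 'SGD'
print(resolve(options, default='adam'))   # 'Adam'
```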
### 1.2 List Options

For lists, values must be of type `str`, `int`, `bool`, or `float`. Lists are internally converted to dictionaries.
```python
activation = hp.select(['relu', 'tanh', 'sigmoid'], default='relu')
```
This is equivalent to:
```python
activation = hp.select({'relu': 'relu', 'tanh': 'tanh', 'sigmoid': 'sigmoid'}, default='relu')
```
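The documented list-to-dictionary equivalence can be pictured in a few lines of plain Python; `list_to_options` is an illustrative name, not part of hypster:

```python
def list_to_options(values):
    # Map each list entry to itself, mirroring the documented equivalence,
    # after checking the allowed option types.
    allowed = (str, int, bool, float)
    for v in values:
        if not isinstance(v, allowed):
            raise TypeError(f"Unsupported option type: {type(v).__name__}")
    return {v: v for v in values}

print(list_to_options(['relu', 'tanh', 'sigmoid']))
# {'relu': 'relu', 'tanh': 'tanh', 'sigmoid': 'sigmoid'}
```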
## 2. The `text` Method

```python
hp.text(default=None, name=None)
```

Defines a text input with an optional default value.
```python
model_name = hp.text(default='my_awesome_model')
```
Future implementations will include runtime type checking for string values.
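Until then, callers can guard against wrong types themselves. The helper below is hypothetical, not part of hypster's API:

```python
def check_text(value):
    # Hypothetical helper: reject non-string values before passing them
    # to hp.text. Not part of hypster's API.
    if not isinstance(value, str):
        raise TypeError(f"Expected str, got {type(value).__name__}")
    return value

check_text('my_awesome_model')  # passes
```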
## 3. The `number` Method

```python
hp.number(default=None, name=None)
```

Defines a numeric input with an optional default value.
```python
learning_rate = hp.number(default=0.001)
```
Future implementations will include runtime type checking for numeric values (integers or floats).
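A sketch of what such a numeric check might look like, written as a hypothetical helper rather than hypster's actual implementation. Note that `bool` is a subclass of `int` in Python, so a strict check has to exclude it explicitly:

```python
def check_number(value):
    # Hypothetical helper: accept int or float but reject bool, which is a
    # subclass of int in Python. Not part of hypster's API.
    if isinstance(value, bool) or not isinstance(value, (int, float)):
        raise TypeError(f"Expected int or float, got {type(value).__name__}")
    return value

check_number(0.001)  # passes
```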
## 4. Comprehensive Example

Here's an example demonstrating all `HP` methods:
```python
from hypster import HP, config

@config
def hp_methods_example(hp: HP):
    import torch

    activation = hp.select(['relu', 'tanh', 'sigmoid'], default='relu')
    optimizer = hp.select({
        'adam': torch.optim.Adam,
        'sgd': torch.optim.SGD,
        2: 'custom_optimizer',  # int key (True == 1, so 2 avoids a key collision)
        True: lambda lr: torch.optim.AdamW(torch.nn.Linear(10, 10).parameters(), lr=lr)  # bool key
    }, default='adam')
    model_name = hp.text(default='my_awesome_model')
    learning_rate = hp.number(default=0.001)

    # Using the selected values
    act_func = {
        'relu': torch.nn.ReLU(),
        'tanh': torch.nn.Tanh(),
        'sigmoid': torch.nn.Sigmoid()
    }[activation]

    if isinstance(optimizer, str):
        opt = optimizer  # it's a custom optimizer name
    elif isinstance(optimizer, type):
        # It's an optimizer class such as Adam or SGD. This check must come
        # before the generic callable check, because classes are callable too.
        opt = optimizer(torch.nn.Linear(10, 10).parameters(), lr=learning_rate)
    else:
        opt = optimizer(learning_rate)  # it's a lambda wrapping an optimizer

    print(f"Configured {model_name} with {activation} activation and {opt.__class__.__name__} optimizer")

# Usage
result = hp_methods_example(selections={'activation': 'tanh', 'optimizer': 'sgd'})
```
This example showcases all `HP` methods: selections from lists and from dictionaries with various key types, plus text and number inputs.