Advanced Hypster Features#

Saving and Loading Configurations#

Hypster provides functionality to save and load configurations, making it easy to persist and reuse your setups.

Saving Configurations#

You can save a Hypster configuration using the hypster.save() function. This function strips the @config decorator and adds the required imports, making the saved file standalone.

Example:

import hypster
from hypster import HP


@hypster.config
def my_config(hp: HP):
    chunking_strategy = hp.select(["paragraph", "semantic", "fixed"], default="paragraph")

    llm_model = hp.select(
        {"haiku": "claude-3-haiku-20240307", "sonnet": "claude-3-5-sonnet-20240620", "gpt-4o-mini": "gpt-4o-mini"},
        default="gpt-4o-mini",
    )

    llm_config = {"temperature": hp.number(0), "max_tokens": hp.number(64)}

    system_prompt = hp.text("You are a helpful assistant. Answer with one word only")

hypster.save(my_config, "my_config.py")

This will:

  1. Save the configuration to a file named my_config.py

  2. Remove the @config decorator from the function definition

  3. Add the necessary import, namely: from hypster import HP

These steps make the saved file standalone and portable for later use with hypster.load(), as sketched below.
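
For illustration, the saved my_config.py would look roughly like this (a sketch of the cleaned output based on the steps above, not its exact contents):

from hypster import HP


def my_config(hp: HP):
    chunking_strategy = hp.select(["paragraph", "semantic", "fixed"], default="paragraph")
    # ... remaining parameters exactly as defined above, with the decorator removed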

Loading Configurations#

To load a saved configuration, use the hypster.load() function:

import hypster

my_config = hypster.load("my_config.py")
my_config()
{'chunking_strategy': 'paragraph',
 'llm_model': 'gpt-4o-mini',
 'llm_config': {'temperature': 0, 'max_tokens': 64},
 'system_prompt': 'You are a helpful assistant. Answer with one word only'}

This loads the configuration from my_config.py and allows you to use it in your current setup.
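
As a quick illustration of working with the returned dictionary (the variable names below are just for the example):

import hypster

my_config = hypster.load("my_config.py")
results = my_config()  # instantiate with defaults; returns the dict of configured values shown above

# Pull individual values out of the result for use in your own code
model_name = results["llm_model"]                    # "gpt-4o-mini"
temperature = results["llm_config"]["temperature"]   # 0
print(model_name, temperature)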