ParamFlow is a lightweight and versatile library for managing hyperparameters and configurations, tailored for machine learning projects and applications requiring layered parameter handling. It merges parameters from multiple sources, generates command-line argument parsers, and simplifies parameter overrides, providing a seamless and efficient experience.
- Layered configuration: Merge parameters from files, environment variables, and command-line arguments.
- Immutable dictionary: Provides a read-only dictionary with attribute-style access.
- Profile support: Manage multiple sets of parameters with profile-based layering.
- Layered meta-parameters: `paramflow` configures itself using a layered approach.
- Automatic type conversion: Converts types during merging based on target parameter types.
- Command-line argument parsing: Automatically generates an `argparse` parser from parameter definitions.
- Nested configuration: Allows for nested configuration and merging.
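Taken together, nested configuration and attribute-style access suggest usage like the following. This is a minimal sketch: the file contents are hypothetical, and chained attribute access into nested tables is an assumption, not documented behavior.

```python
import paramflow as pf

# Hypothetical params.toml:
# [default]
# model = { hidden_size = 256, dropout = 0.1 }

params = pf.load('params.toml')

# Assumed: nested tables are wrapped in the same read-only,
# attribute-access dictionary, so chained access works.
print(params.model.hidden_size)  # 256
```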
Install with pip:

```bash
pip install paramflow
```

Install with .env support:

```bash
pip install "paramflow[dotenv]"
```

Create a `params.toml` file:

```toml
[default]
learning_rate = 0.001
batch_size = 64
```

Load it and read parameters with attribute-style access:

```python
import paramflow as pf

params = pf.load('params.toml')
print(params.learning_rate)  # 0.001
```
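The returned dictionary is read-only, so attempts to mutate it fail. A minimal sketch; the exact exception type raised is an assumption:

```python
import paramflow as pf

params = pf.load('params.toml')

# Assumed: assigning to the frozen dictionary raises an error;
# the exact exception type is not documented here.
try:
    params.learning_rate = 0.1
except Exception as err:
    print(f'read-only: {err}')
```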
Running the script with `--help` displays both meta-parameters and parameters:

```bash
python app.py --help
```

Meta-parameters control how `paramflow.load` reads its own configuration. They are layered in the following order (later layers override earlier ones):
- `paramflow.load` arguments
- Environment variables (default prefix: `P_`)
- Command-line arguments (`argparse`)
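The profile meta-parameter, for example, can be set at any of these layers. The lowest layer is an argument to `load` itself; a sketch, where the `profile` keyword is an assumption inferred from the layering above:

```python
import paramflow as pf

# Lowest layer: meta-parameters passed directly to load().
# P_PROFILE or --profile are assumed to override this choice.
params = pf.load('params.toml', profile='dqn-adam')  # 'profile' kwarg is an assumption
```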
Via command-line:

```bash
python print_params.py --profile dqn-adam
```

Via environment variable:

```bash
P_PROFILE=dqn-adam python print_params.py
```

Parameters are merged from multiple sources in the following order (later sources override earlier ones):
- Configuration files (`.toml`, `.yaml`, `.ini`, `.json`, `.env`)
- Environment variables (default prefix: `P_`)
- Command-line arguments (`argparse`)
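For instance, an environment variable with the `P_` prefix should override a value from the file. A minimal sketch, where the exact `P_BATCH_SIZE` → `batch_size` name mapping is an assumption:

```python
import os
import paramflow as pf

# Assumed name mapping: P_BATCH_SIZE -> batch_size. The string '128'
# should be converted to int based on the target parameter's type.
os.environ['P_BATCH_SIZE'] = '128'

params = pf.load('params.toml')
print(params.batch_size)  # expected: 128
```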
You can specify the order explicitly (`env` and `args` are reserved names):

```python
params = pf.load('params.toml', 'env', '.env', 'args')
```

Override parameters via command-line arguments; the string value is converted to the target parameter's type (here, float):

```bash
python print_params.py --profile dqn-adam --learning_rate 0.0002
```

Profiles group related parameter sets in a single file. For example, `params.toml`:

```toml
[default]
learning_rate = 0.00025
batch_size = 32
optimizer_class = 'torch.optim.RMSprop'
optimizer_kwargs = { momentum = 0.95 }
random_seed = 13
[adam]
learning_rate = 1e-4
optimizer_class = 'torch.optim.Adam'
optimizer_kwargs = {}
```

Run with the `adam` profile:

```bash
python app.py --profile adam
```

This overrides:

- `learning_rate` → `1e-4`
- `optimizer_class` → `torch.optim.Adam`
- `optimizer_kwargs` → `{}`
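The merged result can be checked in code. A sketch, assuming the TOML above is saved as `params.toml` and that `load` accepts a `profile` keyword:

```python
import paramflow as pf

params = pf.load('params.toml', profile='adam')  # 'profile' kwarg is an assumption

print(params.learning_rate)    # 1e-4, overridden by [adam]
print(params.batch_size)       # 32, inherited from [default]
print(params.optimizer_class)  # torch.optim.Adam, overridden by [adam]
```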
Profiles can also be used to manage configurations for different environments:

```toml
[default]
debug = true
database_url = "mysql://localhost:3306/myapp"
[dev]
database_url = "mysql://dev:3306/myapp"
[prod]
debug = false
database_url = "mysql://prod:3306/myapp"export P_PROFILE=dev
python app.py
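With `P_PROFILE=dev` set, the `[dev]` values overlay `[default]`. A sketch, assuming the file above is saved as `params.toml`:

```python
import paramflow as pf

# Run with P_PROFILE=dev set in the environment.
params = pf.load('params.toml')

print(params.debug)         # True, inherited from [default]
print(params.database_url)  # mysql://dev:3306/myapp, from [dev]
```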