91 changes: 91 additions & 0 deletions code_snippets/GNBG-II/README.md
# GNBG

Evaluates functions from the **GNBG benchmark suite**, a set of structured multimodal optimization problems defined via parameter files (`.mat`) and evaluated through a Python implementation.

## Quick Start

1. Install [uv](https://docs.astral.sh/uv/) if you don't have it yet:

```bash
pip install uv
```

No extra setup is needed beyond having `uv` installed. The required dependencies are:

* `numpy`
* `scipy`

These are resolved automatically via the script header.

## Setup

1. Download or clone the GNBG repository:

```bash
git clone https://github.com/rohitsalgotra/GNBG-II.git
```

2. Extract the Python instances:

```
GNBG_Instances.Python-main.zip
```

3. Place the extracted folder in your project directory:

```
project/
├── gnbg.py
├── call_gnbg.py
└── GNBG_Instances.Python-main/
├── f1.mat
├── ...
```

## Usage

```bash
uv run call_gnbg.py
```

## What the Snippet Does

The script:

1. Loads a GNBG problem instance (e.g. `f1.mat`)
2. Constructs the corresponding benchmark function
3. Evaluates it at a given point
4. Prints the result

The implementation is split into:

* **`gnbg.py`** — contains the GNBG class and loader (reusable)
* **`call_gnbg.py`** — minimal runner script

`call_gnbg.py` depends on `gnbg.py`, so both files must be present in the same directory.

## Key Parameters

Edit these in `call_gnbg.py`:

* **`problem_index`** — which GNBG function to load (`1`–`24`)
* **`repo_dir`** — path to the folder containing the `.mat` files
* **`eval_point`** — evaluation point

Example:

```python
problem_index = 1
eval_point = np.zeros(problem.Dimension)
```
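For intuition, the value each GNBG component contributes can be sketched in plain NumPy. This is an illustrative standalone version only: the nonlinear `Mu`/`Omega` transform from `gnbg.py` is omitted (i.e. treated as the identity), and all parameter values below are made up for the example.

```python
import numpy as np

# Simplified single-component GNBG value (transform omitted):
#   f_k(x) = sigma_k + ( (x - m_k)^T R^T diag(h_k) R (x - m_k) )^lambda_k
def component_value(x, m, R, h, sigma, lam):
    z = R @ (np.asarray(x, float) - m)  # rotate the shifted point
    quad = z @ np.diag(h) @ z           # quadratic form with diagonal H_k
    return sigma + quad ** lam

d = 3
m = np.zeros(d)   # component minimum position
R = np.eye(d)     # rotation matrix
h = np.ones(d)    # diagonal of H_k
print(component_value(np.zeros(d), m, R, h, sigma=-100.0, lam=0.25))  # at the minimum: sigma
```

At the component's minimum position the quadratic form vanishes and the value is exactly `sigma`; the full landscape in `gnbg.py` is the pointwise minimum over all `CompNum` such components.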

## Resources

* GNBG repository:
[https://github.com/rohitsalgotra/GNBG-II](https://github.com/rohitsalgotra/GNBG-II)

* Benchmark description:
Generalized Numerical Benchmark Generator (GNBG)


23 changes: 23 additions & 0 deletions code_snippets/GNBG-II/call_gnbg.py
# /// script
# dependencies = [
# "numpy",
# "scipy",
# ]
# ///

import numpy as np
from gnbg import load_gnbg_instance

### input
repo_dir = "GNBG_Instances.Python-main"
problem_index = 1

### prepare
problem = load_gnbg_instance(repo_dir, problem_index)
print(problem)

### evaluation point
eval_point = np.zeros(problem.Dimension)

### go
print(problem(eval_point))
86 changes: 86 additions & 0 deletions code_snippets/GNBG-II/gnbg.py
import os
import numpy as np
from scipy.io import loadmat


class GNBG:
def __init__(self, Dimension, CompNum, CompMinPos, CompSigma,
CompH, Mu, Omega, Lambda, RotationMatrix):
self.Dimension = int(Dimension)
self.CompNum = int(CompNum)
self.CompMinPos = np.asarray(CompMinPos, dtype=float)
self.CompSigma = np.asarray(CompSigma, dtype=float).reshape(-1)
self.CompH = np.asarray(CompH, dtype=float)
self.Mu = np.asarray(Mu, dtype=float)
self.Omega = np.asarray(Omega, dtype=float)
self.Lambda = np.asarray(Lambda, dtype=float).reshape(-1)
self.RotationMatrix = np.asarray(RotationMatrix, dtype=float)

    def transform(self, X, Alpha, Beta):
        """Apply GNBG's nonlinear modality/symmetry transform elementwise."""
        X = np.asarray(X, dtype=float)
        Alpha = np.ravel(Alpha)
        Beta = np.ravel(Beta)

        Y = X.copy()

        # Positive entries: perturb in log space, then map back with exp.
        pos = X > 0
        Y[pos] = np.log(X[pos])
        Y[pos] = np.exp(
            Y[pos] + Alpha[0] * (np.sin(Beta[0] * Y[pos]) + np.sin(Beta[1] * Y[pos]))
        )

        # Negative entries: same perturbation applied to log(-x), sign restored.
        neg = X < 0
        Y[neg] = np.log(-X[neg])
        Y[neg] = -np.exp(
            Y[neg] + Alpha[1] * (np.sin(Beta[2] * Y[neg]) + np.sin(Beta[3] * Y[neg]))
        )

        return Y

    def fitness(self, X):
        X = np.asarray(X, dtype=float).reshape(-1, 1)
        f = np.full(self.CompNum, np.nan)

        for k in range(self.CompNum):
            # Per-component rotation matrix (a single shared matrix if only one is stored).
            R = self.RotationMatrix[:, :, k] if self.RotationMatrix.ndim == 3 else self.RotationMatrix
            shift = self.CompMinPos[k, :].reshape(-1, 1)

            a = self.transform(
                (X - shift).T @ R.T,
                self.Mu[k, :],
                self.Omega[k, :],
            )
            b = self.transform(
                R @ (X - shift),
                self.Mu[k, :],
                self.Omega[k, :],
            )

            # Quadratic form a^T diag(H_k) b, then the component's sigma/lambda shaping.
            quad = float(np.squeeze(a @ np.diag(np.ravel(self.CompH[k, :])) @ b))
            sigma_k = float(self.CompSigma[k])
            lambda_k = float(self.Lambda[k])

            f[k] = sigma_k + quad ** lambda_k

        # The overall landscape is the pointwise minimum over all components.
        return float(np.min(f))

__call__ = fitness


def load_gnbg_instance(repo_dir, idx):
    # loadmat wraps MATLAB struct fields in nested object arrays,
    # so each field needs a couple of layers of unwrapping below.
    data = loadmat(os.path.join(repo_dir, f"f{idx}.mat"))["GNBG"]

    def get_scalar(field):
        return np.array([item[0] for item in data[field].flatten()])[0, 0]

return GNBG(
Dimension=get_scalar("Dimension"),
CompNum=get_scalar("o"),
CompMinPos=np.array(data["Component_MinimumPosition"][0, 0], dtype=float),
CompSigma=np.atleast_1d(np.array(data["ComponentSigma"][0, 0], dtype=float)).reshape(-1),
CompH=np.array(data["Component_H"][0, 0], dtype=float),
Mu=np.array(data["Mu"][0, 0], dtype=float),
Omega=np.array(data["Omega"][0, 0], dtype=float),
Lambda=np.atleast_1d(np.array(data["lambda"][0, 0], dtype=float)).reshape(-1),
RotationMatrix=np.array(data["RotationMatrix"][0, 0], dtype=float),
)
64 changes: 64 additions & 0 deletions code_snippets/IOHClustering/README.md
# IOH Clustering

Evaluates a function from the [IOHclustering](https://github.com/IOHprofiler/IOHClustering) benchmark suite — a set of continuous black-box optimization problems derived from data clustering tasks, integrated with the [IOHprofiler](https://github.com/IOHprofiler) framework.

Each problem encodes a k-means-style clustering objective: the search vector represents `k` cluster centers in a 2D feature space (after PCA or feature selection), so the problem dimensionality is `k × 2`. All variables are bounded in `[0, 1]`.
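The encoding above can be sketched in plain NumPy. The function and variable names below are illustrative, not the IOHclustering internals: a flat solution vector of length `k × 2` is reshaped into `k` centers, and the objective is the mean squared distance of each data point to its nearest center.

```python
import numpy as np

# Hypothetical decoding of a flat solution vector into (k, 2) cluster centers.
def decode_centers(x, k, n_features=2):
    return np.asarray(x, float).reshape(k, n_features)

# Illustrative k-means-style objective: mean squared distance to nearest center.
def clustering_objective(x, data, k):
    centers = decode_centers(x, k)
    d2 = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).mean()

data = np.array([[0.1, 0.1], [0.2, 0.2], [0.8, 0.9], [0.9, 0.8]])
x = [0.15, 0.15, 0.85, 0.85]  # k=2 centers -> dimensionality 4
print(round(clustering_objective(x, data, k=2), 4))  # -> 0.005
```

This also makes the dimensionality table below obvious: with 2D features, `k` centers always give a search space of `k × 2` variables.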

## Quick Start

1. Install [uv](https://docs.astral.sh/uv/) if you don't have it yet:

```bash
pip install uv
```

No extra setup is needed beyond having `uv` installed. The `ioh` and `iohclustering` packages are resolved automatically.

> **Note:** This snippet requires **Python 3.10**. The inline metadata enforces this via `requires-python = "==3.10"`. Make sure a Python 3.10 interpreter is available on your system.

## Usage

```bash
uv run call_IOHClustering.py
```

## What the Snippet Does

The script creates clustering problem 5 (`iris_pca`) with `k=2` clusters, evaluates it at the origin, and prints the result. You can adjust the behavior by editing these variables in the script:

- **`fid`** — problem ID (integer) or dataset name (string) passed to `get_problem()` (default: `5`)
- **`k`** — number of cluster centers (default: `2`; available values: `2`, `3`, `5`, `10`)
- **`instance`** — problem instance for transformation-based generalization (default: `1`)
- **`eval_point`** — the point at which the function is evaluated (default: all zeros)

> **Note:** `get_problem()` returns a tuple `(problem, retransform)`. The `retransform` function converts a solution vector back into cluster center coordinates in the original data space.
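A retransform-style mapping essentially undoes the `[0, 1]` normalization of the feature space. The sketch below is hypothetical (the real function is whatever `get_problem()` returns); it shows the idea with a hand-rolled min-max inverse.

```python
import numpy as np

# Hypothetical sketch of a retransform: map normalized [0, 1] center
# coordinates back to the original data scale via the inverse of
# min-max normalization. Illustrative only.
def make_retransform(data_min, data_max, k):
    span = data_max - data_min
    def retransform(x):
        centers = np.asarray(x, float).reshape(k, -1)
        return centers * span + data_min
    return retransform

# Made-up per-feature bounds for a 2D dataset:
retransform = make_retransform(np.array([4.0, 2.0]), np.array([8.0, 4.5]), k=2)
print(retransform([0.0, 0.0, 1.0, 1.0]))  # corners of the data range
```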

### Available Datasets

| ID | Name |
|----|------|
| 1 | breast_pca |
| 2 | diabetes_pca |
| 3 | german_postal_selected |
| 4 | glass_pca |
| 5 | iris_pca |
| 6 | kc1_pca |
| 7 | mfeat-fourier_pca |
| 8 | ruspini_selected |
| 9 | segment_pca |
| 10 | wine_pca |

### Available Values of k

| k | Dimensionality |
|---|----------------|
| 2 | 4 |
| 3 | 6 |
| 5 | 10 |
| 10 | 20 |

## Resources

- [IOHClustering GitHub repository](https://github.com/IOHprofiler/IOHClustering)
- [IOHexperimenter GitHub repository](https://github.com/IOHprofiler/IOHexperimenter)
- [Benchmark paper (arXiv)](https://arxiv.org/abs/2505.09233)
20 changes: 20 additions & 0 deletions code_snippets/IOHClustering/call_IOHClustering.py
# /// script
# requires-python = "==3.10"
# dependencies = [
# "ioh",
# "iohclustering",
# ]
# ///

from iohclustering import get_problem

# Get benchmark problem by its ID (e.g., ID=5) with k=2 clusters
# Alternatively, by name (e.g., "iris_pca")
clustering_problem, retransform = get_problem(fid=5, k=2)

### evaluation point
dim = clustering_problem.meta_data.n_variables
eval_point = [0.0]*dim

### print function value for eval_point
print(clustering_problem(eval_point))
79 changes: 79 additions & 0 deletions code_snippets/MA-BBOB/README.md
# IOH MA-BBOB (ManyAffine)

Evaluates a function from the [IOHexperimenter](https://github.com/IOHprofiler/IOHexperimenter) **MA-BBOB** problem generator — a method for creating arbitrary affine combinations of the 24 noiseless BBOB test functions, supported on [-5, 5]^n.

MA-BBOB extends the classic BBOB suite by blending its base functions with random weights, shifts, and per-function instance transformations. The `instance` parameter seeds this generation procedure, so the same instance ID always produces the same function.
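The blending idea can be sketched with a toy example. This is not the actual MA-BBOB formula (which combines log-scaled function values over all 24 BBOB functions); it just illustrates how normalized random weights, a seeded generator, and a shared shifted optimum combine two made-up base functions.

```python
import numpy as np

# Two toy base functions (stand-ins for BBOB members):
def sphere(x):    return np.sum(x ** 2)
def ellipsoid(x): return np.sum(10 ** np.linspace(0, 2, x.size) * x ** 2)

# Toy affine combination: normalized weights, shared shifted optimum.
def many_affine(x, weights, xopt, bases=(sphere, ellipsoid)):
    w = np.asarray(weights, float)
    w = w / w.sum()                  # normalize the weights
    z = np.asarray(x, float) - xopt  # shift toward the sampled optimum
    return sum(wi * f(z) for wi, f in zip(w, bases))

rng = np.random.default_rng(1)       # instance ID as RNG seed: reproducible
xopt = rng.uniform(-5, 5, size=5)
weights = rng.random(2)
print(many_affine(xopt, weights, xopt))  # value at the optimum: 0.0
```

Because the generator is seeded, the same "instance" always yields the same weights and optimum, which is exactly why the same MA-BBOB instance ID always produces the same function.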

## Quick Start

1. Install [uv](https://docs.astral.sh/uv/) if you don't have it yet:

```bash
pip install uv
```

No extra setup is needed beyond having `uv` installed. The `ioh` package is resolved automatically.

> **Note:** This snippet requires **Python 3.10**. The inline metadata enforces this via `requires-python = "==3.10"`. Make sure a Python 3.10 interpreter is available on your system.

## Usage

```bash
uv run call_mabbob.py
```

## What the Snippet Does

The script creates an MA-BBOB problem (instance 1) in 5 dimensions, evaluates it at the origin, and prints the result. You can adjust the behavior by editing these variables in the script:

- **`instance`** — seeds the random generation of weights, sub-problem instances, and optimum location (default: `1`)
- **`n_variables`** — problem dimensionality (default: `5`; the GECCO competition uses `2` and `5`)
- **`eval_point`** — the point at which the function is evaluated (default: all zeros)

### Advanced Constructor

In addition to the simple `(instance, n_variables)` constructor, `ManyAffine` also accepts explicit control over the combination:

```python
ioh.problem.ManyAffine(
    xopt,           # list[float] — location of the optimum
    weights,        # list[float], length 24 — weight per BBOB base function
    instances,      # list[int], length 24 — instance ID per base function
    n_variables,    # int — search space dimensionality
    scale_factors,  # list[float], length 24 — (optional) scaling per base function
)
```

### Readable Properties

| Property | Description |
|---|---|
| `weights` | The 24 combination weights |
| `instances` | The 24 sub-problem instance IDs |
| `scale_factors` | The 24 per-function scaling factors |
| `sub_problems` | The 24 underlying BBOB problem objects |
| `function_values` | Current function values of the sub-problems |

### Underlying BBOB Base Functions

| ID | Name | ID | Name |
|----|------|----|------|
| 1 | Sphere | 13 | SharpRidge |
| 2 | Ellipsoid | 14 | DifferentPowers |
| 3 | Rastrigin | 15 | RastriginRotated |
| 4 | BuecheRastrigin | 16 | Weierstrass |
| 5 | LinearSlope | 17 | Schaffers10 |
| 6 | AttractiveSector | 18 | Schaffers1000 |
| 7 | StepEllipsoid | 19 | GriewankRosenbrock |
| 8 | Rosenbrock | 20 | Schwefel |
| 9 | RosenbrockRotated | 21 | Gallagher101 |
| 10 | EllipsoidRotated | 22 | Gallagher21 |
| 11 | Discus | 23 | Katsuura |
| 12 | BentCigar | 24 | LunacekBiRastrigin |

## Resources

- [MA-BBOB paper (arXiv:2312.11083)](https://arxiv.org/abs/2312.11083)
- [GECCO 2025 MA-BBOB Competition](https://iohprofiler.github.io/competitions/mabbob25)
- [Example notebook](https://github.com/IOHprofiler/IOHexperimenter/blob/master/example/Competitions/MA-BBOB/Example_MABBOB.ipynb)
- [IOHexperimenter GitHub repository](https://github.com/IOHprofiler/IOHexperimenter)
17 changes: 17 additions & 0 deletions code_snippets/MA-BBOB/call_mabbob.py
# /// script
# requires-python = "==3.10"
# dependencies = [
# "ioh",
# ]
# ///

import ioh

problem = ioh.problem.ManyAffine(instance = 1, n_variables = 5)

### evaluation point
dim = problem.meta_data.n_variables
eval_point = [0.0]*dim

### print function value for eval_point
print(problem(eval_point))