
Cleanup and docs #63

Merged
merged 5 commits into from Sep 8, 2023

Changes from 4 commits
10 changes: 0 additions & 10 deletions .flake8

This file was deleted.

1 change: 0 additions & 1 deletion MANIFEST.in
@@ -5,6 +5,5 @@

# Include individual files
include LICENSE.txt
include requirements.txt
prune tests
prune examples
36 changes: 26 additions & 10 deletions README.md
@@ -25,7 +25,10 @@ pip install -e DEHB # -e stands for editable, lets you modify the code and rerun things
To run the PyTorch example (*note additional requirements*):
```bash
python examples/03_pytorch_mnist_hpo.py \
--min_budget 1 --max_budget 3 --verbose --runtime 60
--min_budget 1 \
--max_budget 3 \
--runtime 60 \
--verbose
```

### Running DEHB in a parallel setting
@@ -58,8 +61,13 @@ to it by that DEHB run.

To run the PyTorch MNIST example on a single node using 2 workers:
```bash
python examples/03_pytorch_mnist_hpo.py --min_budget 1 --max_budget 3 \
--verbose --runtime 60 --n_workers 2 --single_node_with_gpus
python examples/03_pytorch_mnist_hpo.py \
--min_budget 1 \
--max_budget 3 \
--runtime 60 \
--n_workers 2 \
--single_node_with_gpus \
--verbose
```
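
The same run can also be configured programmatically. Below is a minimal sketch that mirrors these flags using the `DEHB` constructor shown in the docs; the toy search space and objective function are stand-ins for the ones defined in the MNIST example script:

```python
# Minimal sketch of a 2-worker DEHB run (toy stand-ins for the MNIST example).
import numpy as np
from ConfigSpace import ConfigurationSpace
from dehb import DEHB

cs = ConfigurationSpace({"x0": (3.0, 10.0)})  # toy 1-D search space

def objective_function(x, budget, **kwargs):
    # DEHB minimizes "fitness"; "cost" records what the evaluation cost.
    return {"fitness": x["x0"] + np.random.uniform(), "cost": float(budget)}

optimizer = DEHB(
    f=objective_function,
    cs=cs,
    dimensions=len(cs.get_hyperparameters()),
    min_budget=1,
    max_budget=3,
    eta=3,
    n_workers=2,  # evaluations are distributed over 2 Dask workers
    output_path="./logs",
)
traj, runtime, history = optimizer.run(brackets=1, verbose=True)
```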

#### Multi-node runs
@@ -80,10 +88,18 @@ manner on clusters managed by SLURM. (*not expected to work off-the-shelf*)

To run the PyTorch MNIST example on a multi-node setup using 4 workers:
```bash
bash utils/run_dask_setup.sh -f dask_dump/scheduler.json -e env_name -n 4
bash utils/run_dask_setup.sh \
-f dask_dump/scheduler.json \
-e env_name \
-n 4

sleep 5
python examples/03_pytorch_mnist_hpo.py --min_budget 1 --max_budget 3 \
--verbose --runtime 60 --scheduler_file dask_dump/scheduler.json
python examples/03_pytorch_mnist_hpo.py \
--min_budget 1 \
--max_budget 3 \
--runtime 60 \
--scheduler_file dask_dump/scheduler.json \
--verbose
```

### DEHB Hyperparameters
@@ -127,8 +143,8 @@ represents the *mutation* strategy while `bin` represents the *binomial crossover*
}

@online{Awad-arXiv-2023,
title = {MO-DEHB: Evolutionary-based Hyperband for Multi-Objective Optimization},
author = {Noor Awad and Ayushi Sharma and Frank Hutter},
year = {2023},
keywords = {}
}
Empty file removed __init__.py
Empty file.
111 changes: 69 additions & 42 deletions docs/index.md
@@ -13,60 +13,72 @@ To start using the `dehb` package, you can install it via pip. You can either install it from PyPI or as an editable install from GitHub.
```bash
# Install from pypi
pip install dehb

# Install as editable from github
git clone https://github.com/automl/DEHB.git
pip install -e DEHB # -e stands for editable, lets you modify the code and rerun things
```

!!! note "From Source"

To install directly from source:

```bash
git clone https://github.com/automl/DEHB.git
pip install -e DEHB # -e stands for editable, lets you modify the code and rerun things
```

## Getting Started

In the following sections, we provide a basic exemplary setup for running DEHB, both with a single worker and in a multi-worker setup.

### Basic single worker setup
A basic optimization setup can be done as follows. Please note that this example is only meant to show a simple setup of `dehb`; more in-depth examples can be found in the [examples folder](../examples/). First, we need to set up a `ConfigurationSpace`, from which configurations will be sampled:

```python
import ConfigSpace
```python exec="true" source="material-block" result="python" title="Configuration Space" session="someid"
from ConfigSpace import ConfigurationSpace, Configuration

cs = ConfigSpace.ConfigurationSpace()
cs.add_hyperparameter(ConfigSpace.UniformFloatHyperparameter("x0", lower=3, upper=10, log=False))
cs = ConfigurationSpace({"x0": (3.0, 10.0), "x1": ["red", "green"]})
print(cs)
```

Next, we need an `objective_function`, which we are aiming to optimize:
```python
```python exec="true" source="material-block" result="python" title="Configuration Space" session="someid"
import numpy as np
def objective_function(x, budget, **kwargs):
"""Toy objective function.

Args:
x (ConfigSpace.Configuration): Configuration to evaluate
budget (float): Budget to evaluate x on

Returns:
dict: Result dictionary
"""
# This obviously does not make sense in a real world example. Replace this with your actual objective value (y) and cost.
y = np.random.uniform()
cost = 5
result = {
"fitness": y,
"cost": cost
}
return result

def objective_function(x: Configuration, budget: float, **kwargs):
# Replace this with your actual objective value (y) and cost.
cost = (10 if x["x1"] == "red" else 100) + budget
y = x["x0"] + np.random.uniform()
return {"fitness": y, "cost": cost}

sample_config = cs.sample_configuration()
print(sample_config)

result = objective_function(sample_config, budget=10)
print(result)
```
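
Note that DEHB *minimizes* the returned `fitness`, while `cost` is recorded as the expense of that evaluation (here a function of the budget), so both keys should be present in the result dictionary.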

Finally, we can set up our optimizer and run DEHB:

```python
```python exec="true" source="material-block" result="python" title="Configuration Space" session="someid"
from dehb import DEHB

dim = len(cs.get_hyperparameters())
optimizer = DEHB(f=objective_function, cs=cs, dimensions=dim, min_budget=3, output_path="./logs",
max_budget=27, eta=3, n_workers=1)

# Run optimization for 10 brackets. Output files will be save to ./logs
traj, runtime, history = opt.run(brackets=10, verbose=True)
optimizer = DEHB(
f=objective_function,
cs=cs,
dimensions=dim,
min_budget=3,
max_budget=27,
eta=3,
n_workers=1,
output_path="./logs",
)

# Run optimization for 1 bracket. Output files will be saved to ./logs
traj, runtime, history = optimizer.run(brackets=1, verbose=True)
config, fitness, runtime, budget, _ = history[0]
print("config", config)
print("fitness", fitness)
print("runtime", runtime)
print("budget", budget)
```
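
Since each `history` entry unpacks as above, the incumbent can be recovered directly from it. A small sketch (DEHB minimizes fitness, so lower is better):

```python
# Recover the best (lowest-fitness) entry from the run history.
best = min(history, key=lambda entry: entry[1])  # entry[1] is the fitness
_, best_fitness, _, best_budget, _ = best
print(f"best fitness {best_fitness:.4f} found at budget {best_budget}")
```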

### Running DEHB in a parallel setting
@@ -99,8 +111,13 @@ to it by that DEHB run.

To run the PyTorch MNIST example on a single node using 2 workers:
```bash
python examples/03_pytorch_mnist_hpo.py --min_budget 1 --max_budget 3 \
--verbose --runtime 60 --n_workers 2 --single_node_with_gpus
python examples/03_pytorch_mnist_hpo.py \
--min_budget 1 \
--max_budget 3 \
--runtime 60 \
--n_workers 2 \
--single_node_with_gpus \
--verbose
```

#### Multi-node runs
@@ -121,10 +138,20 @@ manner on clusters managed by SLURM. (*not expected to work off-the-shelf*)

To run the PyTorch MNIST example on a multi-node setup using 4 workers:
```bash
bash utils/run_dask_setup.sh -f dask_dump/scheduler.json -e env_name -n 4
# The scheduler file passed via -f is how the workers will be discovered by DEHB
bash utils/run_dask_setup.sh \
-n 4 \
-f dask_dump/scheduler.json \
-e env_name

# Make sure to sleep to allow the workers to setup properly
sleep 5
python examples/03_pytorch_mnist_hpo.py --min_budget 1 --max_budget 3 \
--verbose --runtime 60 --scheduler_file dask_dump/scheduler.json

python examples/03_pytorch_mnist_hpo.py \
--min_budget 1 \
--max_budget 3 \
--runtime 60 \
--scheduler_file dask_dump/scheduler.json \
--verbose
```
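
The scheduler file acts as the rendezvous point: the SLURM-launched workers register with the scheduler it describes, and the example script connects to that same scheduler. A minimal sketch of the client-side connection, assuming plain `dask.distributed` (the example script may wire this up differently):

```python
# Sketch: connect to the Dask scheduler brought up by run_dask_setup.sh.
from distributed import Client

client = Client(scheduler_file="dask_dump/scheduler.json")
print(client)  # shows the scheduler address and currently connected workers
```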

## To cite the paper or code
@@ -142,9 +169,9 @@ If you use DEHB in one of your research projects, please cite our paper(s):
}

@online{Awad-arXiv-2023,
title = {MO-DEHB: Evolutionary-based Hyperband for Multi-Objective Optimization},
author = {Noor Awad and Ayushi Sharma and Frank Hutter},
year = {2023},
keywords = {}
}
```
31 changes: 31 additions & 0 deletions mkdocs.yml
@@ -45,8 +45,39 @@ theme:
icon: material/eye
name: Switch to dark mode

markdown_extensions:
- admonition
- tables
- attr_list
- md_in_html
- toc:
permalink: "#"
- pymdownx.highlight:
anchor_linenums: true
- pymdownx.magiclink:
hide_protocol: true
repo_url_shortener: true
repo_url_shorthand: true
user: automl
repo: DEHB
- pymdownx.highlight
- pymdownx.inlinehilite
- pymdownx.snippets
- pymdownx.details
- pymdownx.tabbed:
alternate_style: true
- pymdownx.superfences:
custom_fences:
- name: mermaid
class: mermaid
format: !!python/name:pymdownx.superfences.fence_code_format
- pymdownx.emoji:
emoji_index: !!python/name:materialx.emoji.twemoji
emoji_generator: !!python/name:materialx.emoji.to_svg

plugins:
- search
- markdown-exec
- mkdocstrings:
default_handler: python
enable_inventory: true
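The `markdown-exec` plugin added above is what powers the `exec="true"` code fences in `docs/index.md`: those blocks are executed at docs build time and their printed output is embedded into the rendered page.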
70 changes: 60 additions & 10 deletions pyproject.toml
@@ -1,33 +1,83 @@
# For TOML reference
# https://learnxinyminutes.com/docs/toml/
[project]
urls = { Documentation = "https://automl.github.io/DEHB/", Github = "https://github.com/automl/DEHB" }

name = "DEHB"
version = "0.7.0"
dependencies = [
"numpy>=1.18.2",
"loguru>=0.5.3",
"dask>=2.27.0",
"distributed>=2.27.0",
"ConfigSpace>=0.4.16",
]
classifiers = [
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Natural Language :: English",
"Intended Audience :: Developers",
"Intended Audience :: Education",
"Intended Audience :: Science/Research",
"Topic :: Scientific/Engineering",
"Topic :: Scientific/Engineering :: Artificial Intelligence",
]
license = { file = "LICENSE.txt" }
readme = "README.md"
description = "Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization"
authors = [
{ name = "Neeratyoy Mallik", email = "[email protected]" },
{ name = "Noor Awad" },
{ name = "Frank Hutter" },
{ name = "Janis Fix", email = "[email protected]" },
]
requires-python = ">=3.8"
[project.optional-dependencies]
dev = [
# Test
"pytest>=4.6",
"pytest-cov",
"pytest-xdist",
"pytest-timeout",
# Docs
"mkdocs",
"mkdocs-material",
"mkdocstrings[python]",
"markdown-exec[ansi]",
# Others
"ruff",
"black",
"pre-commit",
]

[tool.pytest.ini_options]
testpaths = ["tests"]  # path to the test directory
minversion = "3.8"
addopts = "--cov=src --cov-report=lcov" # Should be package name
pythonpath = ["."]

[tool.coverage.run]
branch = true
context = "dehb" # Should be package name
omit = [
"dehb/__init__.py", # Has variables only needed for setup.py
"dehb/__init__.py", # Has variables only needed for setup.py
]

[tool.coverage.report]
show_missing = true
skip_covered = true
exclude_lines = [
"pragma: no cover",
'\.\.\.',
"raise NotImplementedError",
"if TYPE_CHECKING",
"pragma: no cover",
'\.\.\.',
"raise NotImplementedError",
"if TYPE_CHECKING",
] # These are lines to exclude from coverage

[tool.black]
target-version = ['py38']
line-length = 100

# https://github.com/charliermarsh/ruff
[tool.ruff]
@@ -183,4 +233,4 @@ warn_return_any = true
module = ["tests.*"]
disallow_untyped_defs = false # Sometimes we just want to ignore verbose types
disallow_untyped_decorators = false # Test decorators are not properly typed
disallow_incomplete_defs = false # Sometimes we just want to ignore verbose types
5 changes: 0 additions & 5 deletions requirements.txt

This file was deleted.
