Website clean #120

Status: Open. Wants to merge 99 commits into base branch `website`.

Changes from all 99 commits:
- `083fe91` cleanup (mdouze, Jun 8, 2023)
- `437c480` added neurips23 index file (harsha-simhadri, Jun 19, 2023)
- `72350a4` updated neurips23 (harsha-simhadri, Jun 19, 2023)
- `41ca983` Create CNAME (harsha-simhadri, Jun 22, 2023)
- `90f9be9` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `eb5c7e5` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `eaf6b35` Add files via upload (harsha-simhadri, Jun 22, 2023)
- `8cb478e` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `86c9bcb` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `6667503` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `dbdf1d1` Add files via upload (harsha-simhadri, Jun 22, 2023)
- `ec28499` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `5d6d28d` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `4880377` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `b88a264` Add files via upload (harsha-simhadri, Jun 22, 2023)
- `6c72834` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `9821430` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `bda1581` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `dcf51a5` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `5360233` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `ab29aab` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `a5dd793` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `0e0dce0` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `ed8f542` Update neurips23.html (harsha-simhadri, Jun 22, 2023)
- `8901bc1` Update neurips23.html (mdouze, Jun 23, 2023)
- `4d7014d` update neurips23.html (ingberam, Jun 28, 2023)
- `e59a196` Update neurips23.html (harsha-simhadri, Jul 6, 2023)
- `b1b7815` edited neurips23 page links (harsha-simhadri, Jul 6, 2023)
- `89333e4` updated website (harsha-simhadri, Jul 6, 2023)
- `99b8394` point index to neurip23 html (harsha-simhadri, Jul 6, 2023)
- `77e75f7` moved schedule down the neurips 21 webpage (harsha-simhadri, Jul 6, 2023)
- `8ed6008` copy index to neurips21 html page (harsha-simhadri, Jul 6, 2023)
- `c40c2ad` fixed the page (harsha-simhadri, Jul 6, 2023)
- `428e2f2` switch from slack to discord (harsha-simhadri, Jul 6, 2023)
- `4237406` switch from slack to discord (harsha-simhadri, Jul 6, 2023)
- `685ca70` add discord logo (harsha-simhadri, Jul 6, 2023)
- `ac819d5` fix typo (harsha-simhadri, Jul 6, 2023)
- `6e216b6` moved icons to navbar (harsha-simhadri, Jul 6, 2023)
- `3135d85` update instructions (harsha-simhadri, Jul 6, 2023)
- `d433071` update instructions (harsha-simhadri, Jul 6, 2023)
- `b248359` update instructions (harsha-simhadri, Jul 6, 2023)
- `6e18090` rebuild page (carlosgauci, Jul 9, 2023)
- `6999e27` Merge pull request #1 from tejchilli/page-rebuild (tejchilli, Jul 9, 2023)
- `7d8f855` centered the table (Jul 9, 2023)
- `5260a91` update page (carlosgauci, Jul 11, 2023)
- `2406a6a` Merge pull request #2 from tejchilli/page-updates (tejchilli, Jul 11, 2023)
- `e74534c` changed cta copy (Jul 11, 2023)
- `2d34d88` resolved copy conflicts (Jul 11, 2023)
- `e6ead2b` more merge conflict resolutions (Jul 11, 2023)
- `e21b9f6` update (harsha-simhadri, Jul 13, 2023)
- `c5cb628` updated links (harsha-simhadri, Jul 13, 2023)
- `91f667c` Update neurips23.html (harsha-simhadri, Jul 20, 2023)
- `63d1077` Update neurips23.html (harsha-simhadri, Jul 20, 2023)
- `6a311aa` added og image (kylermintah, Aug 7, 2023)
- `67192e7` Merge pull request #155 from harsha-simhadri/kylermintah/page-rebuild (kylermintah, Aug 7, 2023)
- `93387f7` index og image (kylermintah, Aug 7, 2023)
- `f84bb8f` missing slash (kylermintah, Aug 7, 2023)
- `6a1d84a` Merge pull request #156 from harsha-simhadri/kylermintah/page-rebuild (kylermintah, Aug 7, 2023)
- `44f6da7` full path (kylermintah, Aug 7, 2023)
- `f4b336f` Merge pull request #157 from harsha-simhadri/kylermintah/page-rebuild (kylermintah, Aug 7, 2023)
- `052517e` update deadline (harsha-simhadri, Aug 17, 2023)
- `b8cad88` update build deadline (harsha-simhadri, Aug 17, 2023)
- `59590f6` Update neurips21.html (maumueller, Jul 24, 2023)
- `4b3208b` move schedule down the page (harsha-simhadri, Sep 30, 2023)
- `602a75a` update final runbook (harsha-simhadri, Oct 18, 2023)
- `5fd334e` Update sponsors (#180) (fzliu, Oct 20, 2023)
- `a5dd8f4` Update neurips23.html (harsha-simhadri, Oct 20, 2023)
- `33f170b` Delete CNAME (harsha-simhadri, Nov 30, 2023)
- `0a179ca` Create CNAME (harsha-simhadri, Nov 30, 2023)
- `040126b` Update CNAME (harsha-simhadri, Dec 5, 2023)
- `bff48ca` Update CNAME (harsha-simhadri, Dec 5, 2023)
- `2bfafd1` add results (harsha-simhadri, Dec 5, 2023)
- `9fe0c80` add results (harsha-simhadri, Dec 5, 2023)
- `e669a09` add results (harsha-simhadri, Dec 5, 2023)
- `9a0eff2` add results (harsha-simhadri, Dec 6, 2023)
- `8616820` reformat sections and add links to code (harsha-simhadri, Dec 7, 2023)
- `267f064` reformat sections and add links to code (harsha-simhadri, Dec 7, 2023)
- `905a177` reformat sections and add links to code (harsha-simhadri, Dec 7, 2023)
- `2543971` change link to leaderboard (harsha-simhadri, Dec 7, 2023)
- `0a70fe0` typo (harsha-simhadri, Dec 7, 2023)
- `33dddd4` Update neurips23.html (harsha-simhadri, Dec 7, 2023)
- `ea6b85d` update styling (#256) (carlosgauci, Dec 7, 2023)
- `4ef2d94` reformat sections and add links to code (harsha-simhadri, Dec 7, 2023)
- `6c19112` fix entry details (harsha-simhadri, Dec 8, 2023)
- `78922a7` fix entry details (harsha-simhadri, Dec 8, 2023)
- `9b58f3a` fix entry details (harsha-simhadri, Dec 8, 2023)
- `ae46e27` fix entry details (harsha-simhadri, Dec 8, 2023)
- `908b72d` Update neurips23.html (harsha-simhadri, Dec 8, 2023)
- `5c6c28e` Update neurips23.html (harsha-simhadri, Dec 8, 2023)
- `acdd71f` Update neurips23.html (harsha-simhadri, Dec 8, 2023)
- `281ae9b` Update neurips23.html (harsha-simhadri, Dec 8, 2023)
- `93a6487` Update neurips23.html (harsha-simhadri, Dec 11, 2023)
- `4ef922e` Update neurips23.html (#257) (ingberam, Dec 13, 2023)
- `95f1d58` add neurips'23 slides (harsha-simhadri, Dec 18, 2023)
- `99b88d2` more slides (harsha-simhadri, Dec 18, 2023)
- `492690a` add linkes to slidesg (harsha-simhadri, Dec 18, 2023)
- `25f528d` ongoing leaderboard (#271) (ingberam, Jan 23, 2024)
- `9537af2` new leaderboard announcement (#277) (ingberam, Feb 5, 2024)
- `d1ab374` leaderboard update (#285) (ingberam, Mar 5, 2024)
4 changes: 0 additions & 4 deletions .dockerignore

This file was deleted.

122 changes: 61 additions & 61 deletions .github/workflows/benchmarks.yml
The diff is whitespace-only (blank lines removed). The workflow file, reindented:

```yaml
# Contributed by @GuilhemN in https://github.com/erikbern/ann-benchmarks/pull/233
name: Billion-Scale ANN Benchmarks

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include:
          - algorithm: faiss-ivf
            library: faissconda
            dataset: random-xs
          - algorithm: faiss-t1
            dataset: random-xs
            library: faissconda
          - algorithm: faiss-t1
            dataset: random-range-xs
            library: faissconda
          - algorithm: diskann-t2
            dataset: random-xs
            library: diskann
          - algorithm: diskann-t2
            dataset: random-range-xs
            library: diskann
          - algorithm: httpann_example
            dataset: random-xs
            library: httpann_example
          - algorithm: httpann_example
            dataset: random-range-xs
            library: httpann_example
      fail-fast: false

    steps:
      - uses: actions/checkout@v2 # Pull the repository

      - name: Set up Python 3.6
        uses: actions/setup-python@v2
        with:
          python-version: 3.6

      - name: Install dependencies
        run: |
          pip install -r requirements.txt
          python install.py
        env:
          LIBRARY: ${{ matrix.library }}
          DATASET: ${{ matrix.dataset }}

      - name: Run the benchmark
        run: |
          python create_dataset.py --dataset $DATASET
          python run.py --algorithm $ALGORITHM --max-n-algorithms 2 --dataset $DATASET --timeout 600
          sudo chmod -R 777 results/
          python plot.py --dataset $DATASET --output plot.png
          python data_export.py --output test.csv
        env:
          ALGORITHM: ${{ matrix.algorithm }}
          DATASET: ${{ matrix.dataset }}
```
Binary file added GitHub_Logo_White.png
21 changes: 0 additions & 21 deletions LICENSE

This file was deleted.

21 changes: 0 additions & 21 deletions MSFT-Turing-ANNS-terms.txt

This file was deleted.

154 changes: 77 additions & 77 deletions README.md
# Billion-Scale ANN

<http://big-ann-benchmarks.com/>

## Install

The only prerequisites are Python (tested with 3.6) and Docker. Newer versions of Python work as well but probably require an updated `requirements.txt` on the host. (Suggestion: copy `requirements.txt` to `requirements${PYTHON_VERSION}.txt` and remove all fixed versions; `requirements.txt` has to be kept for the Docker containers.)

1. Clone the repo.
2. Run `pip install -r requirements.txt`. (Use `requirements_py38.txt` if you have Python 3.8.)
3. Install Docker by following the instructions [here](https://docs.docker.com/engine/install/ubuntu/).
   You might also want to follow the post-install steps for running Docker in non-root user mode.
4. Run `python install.py` to build all the libraries inside Docker containers.

## Storing Data

The framework assumes that all data is stored in `data/`.
Please use a symlink if your datasets and indices are supposed to be stored somewhere else.
The location of the linked folder matters a great deal for SSD-based search performance in T2.
A local SSD such as the one found on Azure Ls-series VMs is better than remote disks, even premium ones.
See [T1/T2](t1_t2/README.md) for more details.
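A minimal sketch of this symlink setup (the `/mnt/local_ssd` mount point below is an assumption for illustration, not a path the repo prescribes):

```shell
# Assumption: SSD_DIR is a directory on a fast local SSD, e.g. the
# ephemeral disk of an Azure Ls-series VM; adjust the path to your machine.
SSD_DIR=/mnt/local_ssd/big-ann-data
mkdir -p "$SSD_DIR"
# Point the framework's expected data/ directory at the SSD-backed location
# (-s symbolic, -f replace an existing link, -n don't follow an existing one).
ln -sfn "$SSD_DIR" data
```

After this, all dataset downloads and index files written under `data/` land on the SSD.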

## Data sets

See <http://big-ann-benchmarks.com/> for details on the different datasets.

### Dataset Preparation

Before running experiments, datasets have to be downloaded. All preparation can be carried out by calling

```bash
python create_dataset.py --dataset [bigann-1B | deep-1B | text2image-1B | ssnpp-1B | msturing-1B | msspacev-1B]
```

Note that downloading the datasets can potentially take many hours.

For local testing, there are smaller random datasets `random-xs` and `random-range-xs`.
Furthermore, most datasets have 1M, 10M, and 100M versions; run `python create_dataset.py -h` to get an overview.


## Running the benchmark

Run `python run.py --dataset $DS --algorithm $ALGO`, where `DS` is the dataset you are running on
and `ALGO` is the name of the algorithm. (Use `python run.py --list-algorithms` to get an overview.)
`python run.py -h` provides further options.

The parameters used by the implementation to build and query the index can be found in `algos.yaml`.
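As a sketch, the invocation above can be looped over the small local test datasets; the loop below only prints each command (drop the `echo` to actually execute them, which requires the built Docker containers):

```shell
# Sketch: print the run.py command for each small random test dataset.
# faiss-t1 is one of the baseline algorithm names used elsewhere in this README;
# substitute any name reported by `python run.py --list-algorithms`.
ALGO=faiss-t1
for DS in random-xs random-range-xs; do
  echo python run.py --dataset "$DS" --algorithm "$ALGO"
done
```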

## Running the track 1 baseline
After running the installation, we can evaluate the baseline as follows.

```bash

for DS in bigann-1B deep-1B text2image-1B ssnpp-1B msturing-1B msspacev-1B;
do
python run.py --dataset $DS --algorithm faiss-t1;
done
```

On a 28-core Xeon E5-2690 v4 that provided 100MB/s downloads, carrying out the baseline experiments took roughly 7 days.

To evaluate the results, run
```bash
sudo chmod -R 777 results/
python data_export.py --output res.csv
python3.8 eval/show_operating_points.py --algorithm faiss-t1 --threshold 10000
```

## Including your algorithm and Evaluating the Results

See [Track T1/T2](t1_t2/README.md) for more details on evaluation for Tracks T1 and T2.

See [Track T3](t3/README.md) for more details on evaluation for Track T3.

# Credits

This project is a version of [ann-benchmarks](https://github.com/erikbern/ann-benchmarks) by [Erik Bernhardsson](https://erikbern.com/) and contributors, targeting billion-scale datasets.
(The updated README differs from the original only in removed blank lines.)