This repository is the official implementation of the paper *Pure Exploration in Kernel and Neural Bandits*.
To install requirements:

The third-party packages used are numpy, scipy, scikit-learn (imported as sklearn), and torch. The remaining modules used in this folder (functools, math, sys, logging, itertools, pickle, gzip) are part of the Python standard library and require no installation.
This paper includes results on two synthetic datasets and two real datasets.
To run the model(s) in the paper, run this command:
bash run.sh
mnist.pkl contains the raw data of the MNIST dataset.
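The exact layout of mnist.pkl is not documented here. A minimal loading sketch, assuming the file is a pickle that may or may not be gzip-compressed (the pickle and gzip dependencies listed above suggest both forms appear in the repository); the helper name is ours:

```python
import gzip
import pickle


def load_pickle(path):
    """Load a pickle file, transparently handling gzip compression.

    Gzip files start with the magic bytes 0x1f 0x8b, so peeking at the
    first two bytes lets the same helper open either mnist.pkl or a
    compressed mnist.pkl.gz.
    """
    with open(path, "rb") as f:
        magic = f.read(2)
    opener = gzip.open if magic == b"\x1f\x8b" else open
    with opener(path, "rb") as f:
        return pickle.load(f)
```

This is only a sketch: the object returned depends on how the dataset was pickled, so inspect it (e.g. with `type()` and `len()`) before assuming a particular train/test split layout.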
method_list = [neural_elim, kernel_elim, linear_elim]
For each method in method_list, run:

python run_mnist.py method seed

For example, to run Alg. 2 (NeuralEmbedding), use:

python run_mnist.py neural_elim 43
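The loop over methods above can be scripted. A sketch that builds and launches the exact commands shown (the script name and method strings come from this README; the helper names and the sequential-launch choice are ours):

```python
import subprocess
import sys

# Methods accepted by run_mnist.py, per the method_list above.
METHODS = ["neural_elim", "kernel_elim", "linear_elim"]


def build_command(method: str, seed: int) -> list:
    """Build the argument list mirroring `python run_mnist.py method seed`."""
    if method not in METHODS:
        raise ValueError(f"unknown method: {method}")
    return [sys.executable, "run_mnist.py", method, str(seed)]


def run_all(seed: int = 43) -> None:
    """Launch every method sequentially with the same seed."""
    for method in METHODS:
        subprocess.run(build_command(method, seed), check=True)


# Usage (from the repository root, where run_mnist.py lives):
#   run_all(seed=43)
```

Running the methods sequentially with a shared seed keeps the comparison between embeddings on the same random draw; parallel launches would also work if the scripts do not contend for a GPU.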
- For the synthetic datasets, run: python run_linear_data.py method
Our model achieves the following performance on the MNIST dataset and the [Yahoo dataset](https://webscope.sandbox.yahoo.com/?guccounter=1):
| Dataset | Neural Elimination | Kernel Embedding | Linear Embedding | RAGE | Action Elimination |
|---|---|---|---|---|---|
| MNIST Dataset | 98% | 100% | 100% | 100% | 100% |
| Yahoo Dataset | 100% | 98% | 88% | 90% | 100% |
We would like to thank the authors of *Sequential Experimental Design for Transductive Linear Bandits* for their open-source code.

Please consider citing our paper if this project helps your research. The BibTeX reference is as follows.
@article{zhu2021pure,
title={Pure Exploration in Kernel and Neural Bandits},
author={Zhu, Yinglun and Zhou, Dongruo and Jiang, Ruoxi and Gu, Quanquan and Willett, Rebecca and Nowak, Robert},
journal={arXiv preprint arXiv:2106.12034},
year={2021}
}