[onert/test] Remove nnpackage and modelfile options (#13569)
This commit removes the "--nnpackage" and "--modelfile" options from the `onert_run` and `onert_train` test drivers.
The nnpackage or model file is now passed as the last positional parameter and handled automatically, without an option.
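
For example, an invocation that previously selected the input via an option now passes the path as the final argument (the paths below are illustrative, not taken from this commit):

```bash
# Before (option-based):
onert_run --nnpackage /path/to/nnpackage_dir
onert_train --modelfile /path/to/model.circle --epoch 5

# After (positional; the last argument is handled automatically):
onert_run /path/to/nnpackage_dir
onert_train --epoch 5 /path/to/model.circle
```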

ONE-DCO-1.0-Signed-off-by: Hyeongseok Oh <[email protected]>
Co-authored-by: Jiyoung Giuliana Yun <[email protected]>
hseok-oh and jyoungyun authored Aug 1, 2024
1 parent b7f78f3 commit 28de5e3
Showing 8 changed files with 31 additions and 92 deletions.
2 changes: 1 addition & 1 deletion docs/howto/how-to-build-runtime-tizen-gbs-rpi4.md
@@ -301,7 +301,7 @@ Assume `mobilenet_v2_1.4_224` nnpackage is already copied to

```
sh-3.2# BACKENDS="cpu" Product/out/bin/onert_run \
- --nnpackage /opt/usr/home/owner/media/models/mobilenet_v2_1.4_224
+ /opt/usr/home/owner/media/models/mobilenet_v2_1.4_224
Package Filename /opt/usr/home/owner/media/models/mobilenet_v2_1.4_224
===================================
4 changes: 2 additions & 2 deletions docs/runtime/transfer_learning.md
@@ -98,7 +98,6 @@ The crucial here is proper choosing value of `num_of_trainable_ops` to achieve t

```bash
onert_train \
- --modelfile customized_mobilenetv2.circle \
--epoch 5 \
--loss 1 \ # mean squared error
--loss_reduction_type 1 \ # sum over batch size
@@ -108,7 +107,8 @@ onert_train \
--num_of_trainable_ops 5 \
--load_expected:raw cats_and_dogs.output.bin \
--load_input:raw cats_and_dogs.input.bin \
- --export_path customized_mobilenetv2_trained.circle
+ --export_path customized_mobilenetv2_trained.circle \
+ customized_mobilenetv2.circle
```
The result of training:
```
2 changes: 1 addition & 1 deletion tests/scripts/benchmark.sh
@@ -92,7 +92,7 @@ $BRIDGE shell tar -zxf $TEST_ROOT/nnpkg.tar.gz -C $TEST_ROOT/nnpkg
$BRIDGE shell rm $TEST_ROOT/nnpkg.tar.gz

# 1. Run
- $BRIDGE shell LD_LIBRARY_PATH=$TEST_ROOT/Product/out/lib TRACING_MODE=1 WORKSPACE_DIR=$TEST_ROOT BACKENDS=$BACKENDS $TEST_ROOT/Product/out/bin/onert_run --nnpackage $NNPKG_PATH_TARGET -r $NUM_RUNS
+ $BRIDGE shell LD_LIBRARY_PATH=$TEST_ROOT/Product/out/lib TRACING_MODE=1 WORKSPACE_DIR=$TEST_ROOT BACKENDS=$BACKENDS $TEST_ROOT/Product/out/bin/onert_run -r $NUM_RUNS $NNPKG_PATH_TARGET

# 2. Pull result file
echo "Pulling data from target to trace.json"
4 changes: 2 additions & 2 deletions tests/scripts/command/nnpkg-test
@@ -85,9 +85,9 @@ dumped="$outdir/$tcname".out.h5
echo -n "[ Run ] $nnpkg "

if $nnpkg_run \
- --nnpackage "$nnpkg" \
--load "$nnpkg/metadata/tc/input.h5" \
- --dump "$dumped" >& /dev/null > "$dumped.log" 2>&1 ; then
+ --dump "$dumped" \
+ "$nnpkg" >& /dev/null > "$dumped.log" 2>&1 ; then
echo -e "\tPass"
rm "$dumped.log"
else
30 changes: 0 additions & 30 deletions tests/tools/onert_run/src/args.cc
@@ -153,10 +153,6 @@ void Args::Initialize(void)
.nargs(0)
.default_value(false)
.help("Print version and exit immediately");
- _arser.add_argument("--nnpackage")
- .type(arser::DataType::STR)
- .help("NN Package file(directory) name");
- _arser.add_argument("--modelfile").type(arser::DataType::STR).help("NN Model filename");
#if defined(ONERT_HAVE_HDF5) && ONERT_HAVE_HDF5 == 1
_arser.add_argument("--dump", "-d").type(arser::DataType::STR).help("Output filename");
_arser.add_argument("--load", "-l").type(arser::DataType::STR).help("Input filename");
@@ -269,32 +265,6 @@
return;
}

- // Require modelfile, nnpackage, or path
- if (!_arser["--nnpackage"] && !_arser["--modelfile"] && !_arser["path"])
- {
- std::cerr << "Require one of options modelfile, nnpackage, or path." << std::endl;
- exit(1);
- }
-
- // Cannot use both single model file and nnpackage at once
- if (_arser["--nnpackage"] && _arser["--modelfile"])
- {
- std::cerr << "Cannot use both single model file and nnpackage at once." << std::endl;
- exit(1);
- }
-
- if (_arser["--nnpackage"])
- {
- std::cout << "Package Filename " << _package_filename << std::endl;
- _package_filename = _arser.get<std::string>("--nnpackage");
- }
-
- if (_arser["--modelfile"])
- {
- std::cout << "Model Filename " << _model_filename << std::endl;
- _model_filename = _arser.get<std::string>("--modelfile");
- }

if (_arser["path"])
{
auto path = _arser.get<std::string>("path");
48 changes: 24 additions & 24 deletions tests/tools/onert_train/README.md
@@ -18,27 +18,27 @@ sudo apt install -y libhdf5-dev libboost-program-options-dev

## Usage

You could train your model using a command like the one below.

```bash
onert_train \
- --path [circle file or nnpackage] \
--load_input:raw [training input data] \
--load_expected:raw [training output data] \
--batch_size 32 \
--epoch 5 \
--optimizer 1 \ # sgd
--learning_rate 0.01 \
--loss 2 \ # categorical crossentropy
- --loss_reduction_type 1 # sum over batch size
- --num_of_trainable_ops 30 # number of operations to be trained from the back
+ --loss_reduction_type 1 \ # sum over batch size
+ --num_of_trainable_ops 30 \ # number of operations to be trained from the back
+ [circle file or nnpackage]
```

`onert_train --help` would help you to set each parameter.

## Example

To give a quick insight into using `onert_train`, let's train a simple mnist model. You could get the mnist tensorflow model code from [here](https://www.kaggle.com/code/amyjang/tensorflow-mnist-cnn-tutorial).

Before using `onert_train`, training data files and a model file have to be ready.

@@ -49,17 +49,17 @@ For convenience, we provide a tool([tf dataset convert](https://github.com/Samsu

You could use the tool like this. For detailed usage, please refer [here](https://github.com/Samsung/ONE/tree/master/tools/generate_datafile/tf_dataset_converter#readme).
```bash
# Move to tf_dataset_convert directory
$ cd ONE/tools/generate_datafile/tf_dataset_converter

# install prerequisites
$ pip3 install -r requirements.txt

# generate binary data files
$ python3 main.py \
--dataset-name mnist \
--prefix-name mnist \
--model mnist

# check data files are generated
# There are 'mnist.train.input.1000.bin' and 'mnist.train.output.1000.bin'
@@ -70,28 +70,28 @@ $ tree out

`onert_train` uses a `*.circle` file or an nnpackage as input. <br/>

<!-- This readme is for the ONE developers, so they might know the onecc usage.-->
You could convert a tf/tflite/onnx model file into a circle file using [`onecc`](https://github.com/Samsung/ONE/tree/master/compiler/one-cmds). <br/>
If you start from tensorflow code, you could first save it in the SavedModel format and then convert it to a circle file using `onecc`, as sketched below.
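
A minimal sketch of that flow, assuming a Keras model and a config-driven `onecc` run (the file names and the config are placeholders; check the one-cmds documentation linked above for the exact options of your `onecc` version):

```bash
# Hypothetical conversion flow -- adjust the names and onecc options to your setup.
# 1. From TensorFlow/Keras, export the trained model as a SavedModel, e.g. in Python:
#      model.save("mnist_saved_model")
# 2. Convert the SavedModel to a circle file with onecc, typically driven by a
#    config file that describes the import (tf -> circle) step:
onecc -C mnist_to_circle.cfg
ls mnist.circle   # converted model, ready to be passed to onert_train
```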

<!--TODO : Add how to inject training parameter in the circle model -->

### Run onert_train
Now you're ready to run `onert_train`. <br/>
- Please pass your model file to `--modelfile` and data files to `--load_input:raw` and `--load_expected:raw`. <br/>
+ Please pass your model file as argument and data files with `--load_input:raw` and `--load_expected:raw` options. <br/>
Also, you could set training parameters using options like `--batch_size`, `--epoch`, etc.
Please pay special attention to `num_of_trainable_ops`, which determines the number of operations to be trained from the back.

```bash
$ onert_train \
- --modelfile mnist.circle \
--load_input:raw mnist.train.input.1000.bin \
--load_expected:raw mnist.train.output.1000.bin \
--batch_size 32 \
--epoch 5 \
--optimizer 2 \ # adam
--learning_rate 0.001 \
--loss 2 \ # categorical crossentropy
- --loss_reduction_type 1 # sum over batch size
- --num_of_trainable_ops 10
+ --loss_reduction_type 1 \ # sum over batch size
+ --num_of_trainable_ops 10 \
+ mnist.circle
```
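
For quick reference, the numeric codes used in the examples above map to the following settings, as given by the inline comments (the list is not exhaustive):

```bash
# Option codes as used in the examples above:
#   --optimizer 1            -> sgd
#   --optimizer 2            -> adam
#   --loss 1                 -> mean squared error
#   --loss 2                 -> categorical crossentropy
#   --loss_reduction_type 1  -> sum over batch size
onert_train --help   # check here for the full set of accepted values
```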
30 changes: 0 additions & 30 deletions tests/tools/onert_train/src/args.cc
@@ -146,10 +146,6 @@ void Args::Initialize(void)
.nargs(0)
.default_value(false)
.help("Print version and exit immediately");
- _arser.add_argument("--nnpackage")
- .type(arser::DataType::STR)
- .help("NN Package file(directory) name");
- _arser.add_argument("--modelfile").type(arser::DataType::STR).help("NN Model filename");
_arser.add_argument("--export_circle").type(arser::DataType::STR).help("Path to export circle");
_arser.add_argument("--export_circleplus")
.type(arser::DataType::STR)
@@ -220,32 +216,6 @@ void Args::Parse(const int argc, char **argv)
return;
}

- // Require modelfile, nnpackage, or path
- if (!_arser["--nnpackage"] && !_arser["--modelfile"] && !_arser["path"])
- {
- std::cerr << "Require one of options modelfile, nnpackage, or path." << std::endl;
- exit(1);
- }
-
- // Cannot use both single model file and nnpackage at once
- if (_arser["--nnpackage"] && _arser["--modelfile"])
- {
- std::cerr << "Cannot use both single model file and nnpackage at once." << std::endl;
- exit(1);
- }
-
- if (_arser["--nnpackage"])
- {
- std::cout << "Package Filename " << _package_filename << std::endl;
- _package_filename = _arser.get<std::string>("--nnpackage");
- }
-
- if (_arser["--modelfile"])
- {
- std::cout << "Model Filename " << _model_filename << std::endl;
- _model_filename = _arser.get<std::string>("--modelfile");
- }

if (_arser["path"])
{
auto path = _arser.get<std::string>("path");
3 changes: 1 addition & 2 deletions tools/stab/remote.py
@@ -79,9 +79,8 @@ def profile_backend(self, backend, backend_op_list):
cmd += [f"RUY_THREADS={self.num_threads}"]
cmd += [f"BACKENDS=\'{';'.join(['cpu', backend])}\'"]
cmd += [f"{nnpkg_run_path}"]
- cmd += [f"--nnpackage"]
- cmd += [f"{nnpkg_path}"]
cmd += [f"-w5 -r50"]
+ cmd += [f"{nnpkg_path}"]
logging.debug(f"SSH command : {' '.join(cmd)}")
subprocess.call(cmd)

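For context, after this reordering the profiling command assembled above takes roughly the following shape; every concrete value below is a placeholder, and parts of the command built outside this hunk are omitted:

```bash
# Rough shape of the remote profiling command after this change (placeholders only):
ssh <target> \
  "BACKENDS='cpu;<backend>' RUY_THREADS=<num_threads> \
   <test_root>/Product/out/bin/onert_run -w5 -r50 <nnpkg_path>"
```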
