WekaDeeplearning4j ships with an automated JUnit test suite; however, some bugs only surface in the GUI version of WEKA, so it's important to manually test a range of scenarios in the GUI before declaring the package ready for release.
These scenarios can be tested either:

- Manually, following the instructions below, or
- Automatically, by running the parenthesised `.sh` script from `weka-run-test-scripts/`. This folder also contains `run_all_tests.sh`, which runs all of these tests in sequence (you should still check the output to ensure parameters are applied correctly). Check out the README in that folder for more info.
The following are run with randomly generated data (using the default `Generate` window):

- (`Dl4jMlpClassifier_random_default.sh`) Run with all default parameters
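
For reference, this scenario amounts to evaluating a stock `Dl4jMlpClassifier`. Below is a minimal Java sketch of the same check, assuming a small nominal-class ARFF file stands in for the GUI's `Generate` window (the file name is a placeholder):

```java
import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.functions.Dl4jMlpClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class RandomDefaultTest {
    public static void main(String[] args) throws Exception {
        // Placeholder for the data the GUI's Generate window would produce.
        Instances data = DataSource.read("generated.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // All default parameters, matching the GUI scenario.
        Dl4jMlpClassifier clf = new Dl4jMlpClassifier();

        // 10-fold cross-validation, the Explorer's default test option.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(clf, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}
```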
The following are run with the `mnist-minimal` dataset loaded, with the `ImageInstanceIterator` using the following settings:

- `directory of images` pointing to the `mnist-minimal/` image folder
- `desired width` = 224
- `desired height` = 224
- `desired number of channels` = 3

You may like to set `number of epochs` to something smaller and set `Test options` to `Percentage split` to make these runs quicker.
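
As a reference for what those GUI settings correspond to, here is a sketch using the package's Java API; the setters mirror the documented `ImageInstanceIterator` properties, and the dataset paths are placeholders:

```java
import java.io.File;

import weka.classifiers.functions.Dl4jMlpClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.dl4j.iterators.instance.ImageInstanceIterator;

public class MnistMinimalSetup {
    public static void main(String[] args) throws Exception {
        // Meta ARFF listing image file names and classes (placeholder path).
        Instances data = DataSource.read("mnist-minimal/mnist.meta.minimal.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Mirrors the GUI settings listed above.
        ImageInstanceIterator iterator = new ImageInstanceIterator();
        iterator.setImagesLocation(new File("mnist-minimal")); // directory of images
        iterator.setWidth(224);      // desired width
        iterator.setHeight(224);     // desired height
        iterator.setNumChannels(3);  // desired number of channels

        Dl4jMlpClassifier clf = new Dl4jMlpClassifier();
        clf.setInstanceIterator(iterator);
        clf.setNumEpochs(3); // smaller epoch count for a quicker smoke test

        clf.buildClassifier(data);
    }
}
```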
- (`Dl4jMlpClassifier_mnist_default.sh`) Run model with all default parameters
- (`Dl4jMlpClassifier_mnist_extraLayer.sh`) Run model with an added `DenseLayer` with `nOut = 32`
- (`Dl4jMlpClassifier_mnist_AlexNet.sh`) Run model with `Dl4jAlexNet` as the zoo model
- (`Dl4jMlpClassifier_mnist_ResNet50.sh`) Run model with `Dl4jResNet50` as the zoo model
- (`Dl4jMlpClassifier_mnist_EffNetB2.sh`) Run model with `KerasEfficientNet` as the zoo model, with `EFFICIENTNET_B2` as the variation
    - Ensure that the model actually uses the variation; you should see a log message something like `...Using cached model at /home/rhys/.deeplearning4j/models/keras_efficientnet/KerasEfficientNetB2.zip...` (a sketch of setting the variation via the Java API follows this list)
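
The EfficientNet case is the one most likely to silently fall back to a default variation, hence the log check. A minimal sketch of the equivalent Java setup, assuming the zoo-model classes in `weka.dl4j.zoo` and the variation enum in `weka.dl4j.zoo.keras` as documented (verify the names against the release under test):

```java
import weka.classifiers.functions.Dl4jMlpClassifier;
import weka.dl4j.zoo.KerasEfficientNet;
import weka.dl4j.zoo.keras.EfficientNet;

public class EffNetB2Setup {
    public static Dl4jMlpClassifier build() {
        // Select the B2 variation explicitly; if this is not applied, the
        // classifier uses the zoo model's default variation instead.
        KerasEfficientNet zooModel = new KerasEfficientNet();
        zooModel.setVariation(EfficientNet.VARIATION.EFFICIENTNET_B2);

        Dl4jMlpClassifier clf = new Dl4jMlpClassifier();
        clf.setZooModel(zooModel);
        // After buildClassifier(), the log should mention
        // KerasEfficientNetB2.zip, confirming the variation was applied.
        return clf;
    }
}
```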
The following are run with the `mnist-minimal` dataset loaded. You'll need to click `Undo` after each test to revert the instances.
- (`Dl4jMlpFilter_mnist_default.sh`) Run with default filter settings (uses `Dl4jResNet50` as the model)
- (`Dl4jMlpFilter_mnist_extraLayer.sh`) Run with `Dl4jResNet50` as the zoo model, set `Use default feature layer` to `False`, and add `res4a_branch2b` to the `feature extraction layers`
    - Ensure that the logging output contains something like `...Getting features from layers: [res4a_branch2b, flatten_1]` and that there are attributes from both feature layers (i.e., named `res4a_branch2b` and `flatten_1`)
- (`Dl4jMlpFilter_mnist_DenseNet.sh`) Run with `KerasDenseNet` as the zoo model and `Use default feature extraction layer` set to `True`
- (`Dl4jMlpFilter_mnist_ResNet101v2.sh`) Run with `KerasResNet` as the zoo model, variation set to `RESNET101V2`, and `Use default feature extraction layer` set to `True`
    - Ensure that `RESNET101V2` is actually used as the variation; you should see a logging message something like `...Using cached model at /home/rhys/.deeplearning4j/models/keras_resnet/KerasResNet101V2.zip`
- (`Dl4jMlpFilter_mnist_EffNetB1ExtraLayer.sh`) Run with `KerasEfficientNet` as the zoo model, variation set to `EFFICIENTNET_B1`, `Use default feature layer` set to `False`, and add `block4c_expand_conv` to the `feature extraction layers`
    - Again, ensure the variation is properly set and that the resulting attributes contain features from both layers (see the filter sketch after this list)
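
For reference, the extra-layer scenario maps onto the filter's Java API roughly as follows. This is a sketch: `setZooModelType` and `setUseDefaultFeatureLayer` mirror the GUI property names, while the layer-name setter used here (`setTransformationLayerNames`) is an assumption you should check against the filter's Javadoc for the release under test:

```java
import java.io.File;

import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.dl4j.iterators.instance.ImageInstanceIterator;
import weka.dl4j.zoo.Dl4jResNet50;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.Dl4jMlpFilter;

public class FilterExtraLayerTest {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("mnist-minimal/mnist.meta.minimal.arff");
        data.setClassIndex(data.numAttributes() - 1);

        ImageInstanceIterator iterator = new ImageInstanceIterator();
        iterator.setImagesLocation(new File("mnist-minimal"));
        iterator.setWidth(224);
        iterator.setHeight(224);
        iterator.setNumChannels(3);

        Dl4jMlpFilter filter = new Dl4jMlpFilter();
        filter.setInstanceIterator(iterator);
        filter.setZooModelType(new Dl4jResNet50());
        filter.setUseDefaultFeatureLayer(false);
        // Assumed setter name -- the GUI property is "feature extraction
        // layers"; verify against the Javadoc before relying on this.
        filter.setTransformationLayerNames(new String[] {"res4a_branch2b"});

        filter.setInputFormat(data);
        Instances featurized = Filter.useFilter(data, filter);
        // Expect attributes from both res4a_branch2b and flatten_1.
        System.out.println(featurized.numAttributes() + " attributes");
    }
}
```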
The following are run using the `mnist_784` convolutional dataset (`src/test/resources/nominal/mnist_784_train_minimal.arff`):
- (`Dl4jMlpFilter_mnist_LeNet.sh`) Run with `Dl4jLeNet` as the zoo model, `Use default feature layer` set to `True`, and a default `ConvolutionInstanceIterator` as the `instance iterator` (see the sketch below)
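
A `ConvolutionInstanceIterator` treats each row's 784 pixel attributes as a 28x28 single-channel image rather than loading images from disk. A sketch of this scenario, with the dimensions set explicitly for clarity (the GUI test simply keeps the iterator's defaults):

```java
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.dl4j.iterators.instance.ConvolutionInstanceIterator;
import weka.dl4j.zoo.Dl4jLeNet;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.Dl4jMlpFilter;

public class LeNetFilterTest {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read(
                "src/test/resources/nominal/mnist_784_train_minimal.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Interpret each instance's 784 attributes as a 28x28x1 image.
        ConvolutionInstanceIterator iterator = new ConvolutionInstanceIterator();
        iterator.setWidth(28);
        iterator.setHeight(28);
        iterator.setNumChannels(1);

        Dl4jMlpFilter filter = new Dl4jMlpFilter();
        filter.setInstanceIterator(iterator);
        filter.setZooModelType(new Dl4jLeNet());
        // "Use default feature layer" stays True, so no extra layers are set.

        filter.setInputFormat(data);
        Instances featurized = Filter.useFilter(data, filter);
        System.out.println(featurized.numAttributes() + " attributes");
    }
}
```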