Summary of the LSTM sequence classifier (an LSTM over 2048-dim frame features, then Dense layers down to the 101 classes):
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
lstm_1 (LSTM) (None, 2048) 33562624
_________________________________________________________________
dense_1 (Dense) (None, 512) 1049088
_________________________________________________________________
dropout_1 (Dropout) (None, 512) 0
_________________________________________________________________
dense_2 (Dense) (None, 101) 51813
=================================================================
Total params: 34,663,525
Trainable params: 34,663,525
Non-trainable params: 0
Summary of the CNN feature extractor: InceptionV3 with ImageNet weights, topped with global average pooling, a 1024-unit Dense layer and a 101-way softmax:
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) (None, None, None, 3 0
__________________________________________________________________________________________________
conv2d_1 (Conv2D) (None, None, None, 3 864 input_1[0][0]
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, None, None, 3 96 conv2d_1[0][0]
__________________________________________________________________________________________________
activation_1 (Activation) (None, None, None, 3 0 batch_normalization_1[0][0]
__________________________________________________________________________________________________
conv2d_2 (Conv2D) (None, None, None, 3 9216 activation_1[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, None, None, 3 96 conv2d_2[0][0]
__________________________________________________________________________________________________
activation_2 (Activation) (None, None, None, 3 0 batch_normalization_2[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, None, None, 6 18432 activation_2[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, None, None, 6 192 conv2d_3[0][0]
__________________________________________________________________________________________________
activation_3 (Activation) (None, None, None, 6 0 batch_normalization_3[0][0]
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D) (None, None, None, 6 0 activation_3[0][0]
__________________________________________________________________________________________________
conv2d_4 (Conv2D) (None, None, None, 8 5120 max_pooling2d_1[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, None, None, 8 240 conv2d_4[0][0]
__________________________________________________________________________________________________
activation_4 (Activation) (None, None, None, 8 0 batch_normalization_4[0][0]
__________________________________________________________________________________________________
conv2d_5 (Conv2D) (None, None, None, 1 138240 activation_4[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, None, None, 1 576 conv2d_5[0][0]
__________________________________________________________________________________________________
activation_5 (Activation) (None, None, None, 1 0 batch_normalization_5[0][0]
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D) (None, None, None, 1 0 activation_5[0][0]
__________________________________________________________________________________________________
conv2d_9 (Conv2D) (None, None, None, 6 12288 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, None, None, 6 192 conv2d_9[0][0]
__________________________________________________________________________________________________
activation_9 (Activation) (None, None, None, 6 0 batch_normalization_9[0][0]
__________________________________________________________________________________________________
conv2d_7 (Conv2D) (None, None, None, 4 9216 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
conv2d_10 (Conv2D) (None, None, None, 9 55296 activation_9[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, None, None, 4 144 conv2d_7[0][0]
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, None, None, 9 288 conv2d_10[0][0]
__________________________________________________________________________________________________
activation_7 (Activation) (None, None, None, 4 0 batch_normalization_7[0][0]
__________________________________________________________________________________________________
activation_10 (Activation) (None, None, None, 9 0 batch_normalization_10[0][0]
__________________________________________________________________________________________________
average_pooling2d_1 (AveragePoo (None, None, None, 1 0 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
conv2d_6 (Conv2D) (None, None, None, 6 12288 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
conv2d_8 (Conv2D) (None, None, None, 6 76800 activation_7[0][0]
__________________________________________________________________________________________________
conv2d_11 (Conv2D) (None, None, None, 9 82944 activation_10[0][0]
__________________________________________________________________________________________________
conv2d_12 (Conv2D) (None, None, None, 3 6144 average_pooling2d_1[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, None, None, 6 192 conv2d_6[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, None, None, 6 192 conv2d_8[0][0]
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, None, None, 9 288 conv2d_11[0][0]
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, None, None, 3 96 conv2d_12[0][0]
__________________________________________________________________________________________________
activation_6 (Activation) (None, None, None, 6 0 batch_normalization_6[0][0]
__________________________________________________________________________________________________
activation_8 (Activation) (None, None, None, 6 0 batch_normalization_8[0][0]
__________________________________________________________________________________________________
activation_11 (Activation) (None, None, None, 9 0 batch_normalization_11[0][0]
__________________________________________________________________________________________________
activation_12 (Activation) (None, None, None, 3 0 batch_normalization_12[0][0]
__________________________________________________________________________________________________
mixed0 (Concatenate) (None, None, None, 2 0 activation_6[0][0]
activation_8[0][0]
activation_11[0][0]
activation_12[0][0]
__________________________________________________________________________________________________
conv2d_16 (Conv2D) (None, None, None, 6 16384 mixed0[0][0]
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, None, None, 6 192 conv2d_16[0][0]
__________________________________________________________________________________________________
activation_16 (Activation) (None, None, None, 6 0 batch_normalization_16[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, None, None, 4 12288 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_17 (Conv2D) (None, None, None, 9 55296 activation_16[0][0]
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, None, None, 4 144 conv2d_14[0][0]
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, None, None, 9 288 conv2d_17[0][0]
__________________________________________________________________________________________________
activation_14 (Activation) (None, None, None, 4 0 batch_normalization_14[0][0]
__________________________________________________________________________________________________
activation_17 (Activation) (None, None, None, 9 0 batch_normalization_17[0][0]
__________________________________________________________________________________________________
average_pooling2d_2 (AveragePoo (None, None, None, 2 0 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_13 (Conv2D) (None, None, None, 6 16384 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_15 (Conv2D) (None, None, None, 6 76800 activation_14[0][0]
__________________________________________________________________________________________________
conv2d_18 (Conv2D) (None, None, None, 9 82944 activation_17[0][0]
__________________________________________________________________________________________________
conv2d_19 (Conv2D) (None, None, None, 6 16384 average_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, None, None, 6 192 conv2d_13[0][0]
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, None, None, 6 192 conv2d_15[0][0]
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, None, None, 9 288 conv2d_18[0][0]
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, None, None, 6 192 conv2d_19[0][0]
__________________________________________________________________________________________________
activation_13 (Activation) (None, None, None, 6 0 batch_normalization_13[0][0]
__________________________________________________________________________________________________
activation_15 (Activation) (None, None, None, 6 0 batch_normalization_15[0][0]
__________________________________________________________________________________________________
activation_18 (Activation) (None, None, None, 9 0 batch_normalization_18[0][0]
__________________________________________________________________________________________________
activation_19 (Activation) (None, None, None, 6 0 batch_normalization_19[0][0]
__________________________________________________________________________________________________
mixed1 (Concatenate) (None, None, None, 2 0 activation_13[0][0]
activation_15[0][0]
activation_18[0][0]
activation_19[0][0]
__________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, None, None, 6 18432 mixed1[0][0]
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, None, None, 6 192 conv2d_23[0][0]
__________________________________________________________________________________________________
activation_23 (Activation) (None, None, None, 6 0 batch_normalization_23[0][0]
__________________________________________________________________________________________________
conv2d_21 (Conv2D) (None, None, None, 4 13824 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, None, None, 9 55296 activation_23[0][0]
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, None, None, 4 144 conv2d_21[0][0]
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, None, None, 9 288 conv2d_24[0][0]
__________________________________________________________________________________________________
activation_21 (Activation) (None, None, None, 4 0 batch_normalization_21[0][0]
__________________________________________________________________________________________________
activation_24 (Activation) (None, None, None, 9 0 batch_normalization_24[0][0]
__________________________________________________________________________________________________
average_pooling2d_3 (AveragePoo (None, None, None, 2 0 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_20 (Conv2D) (None, None, None, 6 18432 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, None, None, 6 76800 activation_21[0][0]
__________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, None, None, 9 82944 activation_24[0][0]
__________________________________________________________________________________________________
conv2d_26 (Conv2D) (None, None, None, 6 18432 average_pooling2d_3[0][0]
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, None, None, 6 192 conv2d_20[0][0]
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, None, None, 6 192 conv2d_22[0][0]
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, None, None, 9 288 conv2d_25[0][0]
__________________________________________________________________________________________________
batch_normalization_26 (BatchNo (None, None, None, 6 192 conv2d_26[0][0]
__________________________________________________________________________________________________
activation_20 (Activation) (None, None, None, 6 0 batch_normalization_20[0][0]
__________________________________________________________________________________________________
activation_22 (Activation) (None, None, None, 6 0 batch_normalization_22[0][0]
__________________________________________________________________________________________________
activation_25 (Activation) (None, None, None, 9 0 batch_normalization_25[0][0]
__________________________________________________________________________________________________
activation_26 (Activation) (None, None, None, 6 0 batch_normalization_26[0][0]
__________________________________________________________________________________________________
mixed2 (Concatenate) (None, None, None, 2 0 activation_20[0][0]
activation_22[0][0]
activation_25[0][0]
activation_26[0][0]
__________________________________________________________________________________________________
conv2d_28 (Conv2D) (None, None, None, 6 18432 mixed2[0][0]
__________________________________________________________________________________________________
batch_normalization_28 (BatchNo (None, None, None, 6 192 conv2d_28[0][0]
__________________________________________________________________________________________________
activation_28 (Activation) (None, None, None, 6 0 batch_normalization_28[0][0]
__________________________________________________________________________________________________
conv2d_29 (Conv2D) (None, None, None, 9 55296 activation_28[0][0]
__________________________________________________________________________________________________
batch_normalization_29 (BatchNo (None, None, None, 9 288 conv2d_29[0][0]
__________________________________________________________________________________________________
activation_29 (Activation) (None, None, None, 9 0 batch_normalization_29[0][0]
__________________________________________________________________________________________________
conv2d_27 (Conv2D) (None, None, None, 3 995328 mixed2[0][0]
__________________________________________________________________________________________________
conv2d_30 (Conv2D) (None, None, None, 9 82944 activation_29[0][0]
__________________________________________________________________________________________________
batch_normalization_27 (BatchNo (None, None, None, 3 1152 conv2d_27[0][0]
__________________________________________________________________________________________________
batch_normalization_30 (BatchNo (None, None, None, 9 288 conv2d_30[0][0]
__________________________________________________________________________________________________
activation_27 (Activation) (None, None, None, 3 0 batch_normalization_27[0][0]
__________________________________________________________________________________________________
activation_30 (Activation) (None, None, None, 9 0 batch_normalization_30[0][0]
__________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D) (None, None, None, 2 0 mixed2[0][0]
__________________________________________________________________________________________________
mixed3 (Concatenate) (None, None, None, 7 0 activation_27[0][0]
activation_30[0][0]
max_pooling2d_3[0][0]
__________________________________________________________________________________________________
conv2d_35 (Conv2D) (None, None, None, 1 98304 mixed3[0][0]
__________________________________________________________________________________________________
batch_normalization_35 (BatchNo (None, None, None, 1 384 conv2d_35[0][0]
__________________________________________________________________________________________________
activation_35 (Activation) (None, None, None, 1 0 batch_normalization_35[0][0]
__________________________________________________________________________________________________
conv2d_36 (Conv2D) (None, None, None, 1 114688 activation_35[0][0]
__________________________________________________________________________________________________
batch_normalization_36 (BatchNo (None, None, None, 1 384 conv2d_36[0][0]
__________________________________________________________________________________________________
activation_36 (Activation) (None, None, None, 1 0 batch_normalization_36[0][0]
__________________________________________________________________________________________________
conv2d_32 (Conv2D) (None, None, None, 1 98304 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_37 (Conv2D) (None, None, None, 1 114688 activation_36[0][0]
__________________________________________________________________________________________________
batch_normalization_32 (BatchNo (None, None, None, 1 384 conv2d_32[0][0]
__________________________________________________________________________________________________
batch_normalization_37 (BatchNo (None, None, None, 1 384 conv2d_37[0][0]
__________________________________________________________________________________________________
activation_32 (Activation) (None, None, None, 1 0 batch_normalization_32[0][0]
__________________________________________________________________________________________________
activation_37 (Activation) (None, None, None, 1 0 batch_normalization_37[0][0]
__________________________________________________________________________________________________
conv2d_33 (Conv2D) (None, None, None, 1 114688 activation_32[0][0]
__________________________________________________________________________________________________
conv2d_38 (Conv2D) (None, None, None, 1 114688 activation_37[0][0]
__________________________________________________________________________________________________
batch_normalization_33 (BatchNo (None, None, None, 1 384 conv2d_33[0][0]
__________________________________________________________________________________________________
batch_normalization_38 (BatchNo (None, None, None, 1 384 conv2d_38[0][0]
__________________________________________________________________________________________________
activation_33 (Activation) (None, None, None, 1 0 batch_normalization_33[0][0]
__________________________________________________________________________________________________
activation_38 (Activation) (None, None, None, 1 0 batch_normalization_38[0][0]
__________________________________________________________________________________________________
average_pooling2d_4 (AveragePoo (None, None, None, 7 0 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_31 (Conv2D) (None, None, None, 1 147456 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_34 (Conv2D) (None, None, None, 1 172032 activation_33[0][0]
__________________________________________________________________________________________________
conv2d_39 (Conv2D) (None, None, None, 1 172032 activation_38[0][0]
__________________________________________________________________________________________________
conv2d_40 (Conv2D) (None, None, None, 1 147456 average_pooling2d_4[0][0]
__________________________________________________________________________________________________
batch_normalization_31 (BatchNo (None, None, None, 1 576 conv2d_31[0][0]
__________________________________________________________________________________________________
batch_normalization_34 (BatchNo (None, None, None, 1 576 conv2d_34[0][0]
__________________________________________________________________________________________________
batch_normalization_39 (BatchNo (None, None, None, 1 576 conv2d_39[0][0]
__________________________________________________________________________________________________
batch_normalization_40 (BatchNo (None, None, None, 1 576 conv2d_40[0][0]
__________________________________________________________________________________________________
activation_31 (Activation) (None, None, None, 1 0 batch_normalization_31[0][0]
__________________________________________________________________________________________________
activation_34 (Activation) (None, None, None, 1 0 batch_normalization_34[0][0]
__________________________________________________________________________________________________
activation_39 (Activation) (None, None, None, 1 0 batch_normalization_39[0][0]
__________________________________________________________________________________________________
activation_40 (Activation) (None, None, None, 1 0 batch_normalization_40[0][0]
__________________________________________________________________________________________________
mixed4 (Concatenate) (None, None, None, 7 0 activation_31[0][0]
activation_34[0][0]
activation_39[0][0]
activation_40[0][0]
__________________________________________________________________________________________________
conv2d_45 (Conv2D) (None, None, None, 1 122880 mixed4[0][0]
__________________________________________________________________________________________________
batch_normalization_45 (BatchNo (None, None, None, 1 480 conv2d_45[0][0]
__________________________________________________________________________________________________
activation_45 (Activation) (None, None, None, 1 0 batch_normalization_45[0][0]
__________________________________________________________________________________________________
conv2d_46 (Conv2D) (None, None, None, 1 179200 activation_45[0][0]
__________________________________________________________________________________________________
batch_normalization_46 (BatchNo (None, None, None, 1 480 conv2d_46[0][0]
__________________________________________________________________________________________________
activation_46 (Activation) (None, None, None, 1 0 batch_normalization_46[0][0]
__________________________________________________________________________________________________
conv2d_42 (Conv2D) (None, None, None, 1 122880 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_47 (Conv2D) (None, None, None, 1 179200 activation_46[0][0]
__________________________________________________________________________________________________
batch_normalization_42 (BatchNo (None, None, None, 1 480 conv2d_42[0][0]
__________________________________________________________________________________________________
batch_normalization_47 (BatchNo (None, None, None, 1 480 conv2d_47[0][0]
__________________________________________________________________________________________________
activation_42 (Activation) (None, None, None, 1 0 batch_normalization_42[0][0]
__________________________________________________________________________________________________
activation_47 (Activation) (None, None, None, 1 0 batch_normalization_47[0][0]
__________________________________________________________________________________________________
conv2d_43 (Conv2D) (None, None, None, 1 179200 activation_42[0][0]
__________________________________________________________________________________________________
conv2d_48 (Conv2D) (None, None, None, 1 179200 activation_47[0][0]
__________________________________________________________________________________________________
batch_normalization_43 (BatchNo (None, None, None, 1 480 conv2d_43[0][0]
__________________________________________________________________________________________________
batch_normalization_48 (BatchNo (None, None, None, 1 480 conv2d_48[0][0]
__________________________________________________________________________________________________
activation_43 (Activation) (None, None, None, 1 0 batch_normalization_43[0][0]
__________________________________________________________________________________________________
activation_48 (Activation) (None, None, None, 1 0 batch_normalization_48[0][0]
__________________________________________________________________________________________________
average_pooling2d_5 (AveragePoo (None, None, None, 7 0 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_41 (Conv2D) (None, None, None, 1 147456 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_44 (Conv2D) (None, None, None, 1 215040 activation_43[0][0]
__________________________________________________________________________________________________
conv2d_49 (Conv2D) (None, None, None, 1 215040 activation_48[0][0]
__________________________________________________________________________________________________
conv2d_50 (Conv2D) (None, None, None, 1 147456 average_pooling2d_5[0][0]
__________________________________________________________________________________________________
batch_normalization_41 (BatchNo (None, None, None, 1 576 conv2d_41[0][0]
__________________________________________________________________________________________________
batch_normalization_44 (BatchNo (None, None, None, 1 576 conv2d_44[0][0]
__________________________________________________________________________________________________
batch_normalization_49 (BatchNo (None, None, None, 1 576 conv2d_49[0][0]
__________________________________________________________________________________________________
batch_normalization_50 (BatchNo (None, None, None, 1 576 conv2d_50[0][0]
__________________________________________________________________________________________________
activation_41 (Activation) (None, None, None, 1 0 batch_normalization_41[0][0]
__________________________________________________________________________________________________
activation_44 (Activation) (None, None, None, 1 0 batch_normalization_44[0][0]
__________________________________________________________________________________________________
activation_49 (Activation) (None, None, None, 1 0 batch_normalization_49[0][0]
__________________________________________________________________________________________________
activation_50 (Activation) (None, None, None, 1 0 batch_normalization_50[0][0]
__________________________________________________________________________________________________
mixed5 (Concatenate) (None, None, None, 7 0 activation_41[0][0]
activation_44[0][0]
activation_49[0][0]
activation_50[0][0]
__________________________________________________________________________________________________
conv2d_55 (Conv2D) (None, None, None, 1 122880 mixed5[0][0]
__________________________________________________________________________________________________
batch_normalization_55 (BatchNo (None, None, None, 1 480 conv2d_55[0][0]
__________________________________________________________________________________________________
activation_55 (Activation) (None, None, None, 1 0 batch_normalization_55[0][0]
__________________________________________________________________________________________________
conv2d_56 (Conv2D) (None, None, None, 1 179200 activation_55[0][0]
__________________________________________________________________________________________________
batch_normalization_56 (BatchNo (None, None, None, 1 480 conv2d_56[0][0]
__________________________________________________________________________________________________
activation_56 (Activation) (None, None, None, 1 0 batch_normalization_56[0][0]
__________________________________________________________________________________________________
conv2d_52 (Conv2D) (None, None, None, 1 122880 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_57 (Conv2D) (None, None, None, 1 179200 activation_56[0][0]
__________________________________________________________________________________________________
batch_normalization_52 (BatchNo (None, None, None, 1 480 conv2d_52[0][0]
__________________________________________________________________________________________________
batch_normalization_57 (BatchNo (None, None, None, 1 480 conv2d_57[0][0]
__________________________________________________________________________________________________
activation_52 (Activation) (None, None, None, 1 0 batch_normalization_52[0][0]
__________________________________________________________________________________________________
activation_57 (Activation) (None, None, None, 1 0 batch_normalization_57[0][0]
__________________________________________________________________________________________________
conv2d_53 (Conv2D) (None, None, None, 1 179200 activation_52[0][0]
__________________________________________________________________________________________________
conv2d_58 (Conv2D) (None, None, None, 1 179200 activation_57[0][0]
__________________________________________________________________________________________________
batch_normalization_53 (BatchNo (None, None, None, 1 480 conv2d_53[0][0]
__________________________________________________________________________________________________
batch_normalization_58 (BatchNo (None, None, None, 1 480 conv2d_58[0][0]
__________________________________________________________________________________________________
activation_53 (Activation) (None, None, None, 1 0 batch_normalization_53[0][0]
__________________________________________________________________________________________________
activation_58 (Activation) (None, None, None, 1 0 batch_normalization_58[0][0]
__________________________________________________________________________________________________
average_pooling2d_6 (AveragePoo (None, None, None, 7 0 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_51 (Conv2D) (None, None, None, 1 147456 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_54 (Conv2D) (None, None, None, 1 215040 activation_53[0][0]
__________________________________________________________________________________________________
conv2d_59 (Conv2D) (None, None, None, 1 215040 activation_58[0][0]
__________________________________________________________________________________________________
conv2d_60 (Conv2D) (None, None, None, 1 147456 average_pooling2d_6[0][0]
__________________________________________________________________________________________________
batch_normalization_51 (BatchNo (None, None, None, 1 576 conv2d_51[0][0]
__________________________________________________________________________________________________
batch_normalization_54 (BatchNo (None, None, None, 1 576 conv2d_54[0][0]
__________________________________________________________________________________________________
batch_normalization_59 (BatchNo (None, None, None, 1 576 conv2d_59[0][0]
__________________________________________________________________________________________________
batch_normalization_60 (BatchNo (None, None, None, 1 576 conv2d_60[0][0]
__________________________________________________________________________________________________
activation_51 (Activation) (None, None, None, 1 0 batch_normalization_51[0][0]
__________________________________________________________________________________________________
activation_54 (Activation) (None, None, None, 1 0 batch_normalization_54[0][0]
__________________________________________________________________________________________________
activation_59 (Activation) (None, None, None, 1 0 batch_normalization_59[0][0]
__________________________________________________________________________________________________
activation_60 (Activation) (None, None, None, 1 0 batch_normalization_60[0][0]
__________________________________________________________________________________________________
mixed6 (Concatenate) (None, None, None, 7 0 activation_51[0][0]
activation_54[0][0]
activation_59[0][0]
activation_60[0][0]
__________________________________________________________________________________________________
conv2d_65 (Conv2D) (None, None, None, 1 147456 mixed6[0][0]
__________________________________________________________________________________________________
batch_normalization_65 (BatchNo (None, None, None, 1 576 conv2d_65[0][0]
__________________________________________________________________________________________________
activation_65 (Activation) (None, None, None, 1 0 batch_normalization_65[0][0]
__________________________________________________________________________________________________
conv2d_66 (Conv2D) (None, None, None, 1 258048 activation_65[0][0]
__________________________________________________________________________________________________
batch_normalization_66 (BatchNo (None, None, None, 1 576 conv2d_66[0][0]
__________________________________________________________________________________________________
activation_66 (Activation) (None, None, None, 1 0 batch_normalization_66[0][0]
__________________________________________________________________________________________________
conv2d_62 (Conv2D) (None, None, None, 1 147456 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_67 (Conv2D) (None, None, None, 1 258048 activation_66[0][0]
__________________________________________________________________________________________________
batch_normalization_62 (BatchNo (None, None, None, 1 576 conv2d_62[0][0]
__________________________________________________________________________________________________
batch_normalization_67 (BatchNo (None, None, None, 1 576 conv2d_67[0][0]
__________________________________________________________________________________________________
activation_62 (Activation) (None, None, None, 1 0 batch_normalization_62[0][0]
__________________________________________________________________________________________________
activation_67 (Activation) (None, None, None, 1 0 batch_normalization_67[0][0]
__________________________________________________________________________________________________
conv2d_63 (Conv2D) (None, None, None, 1 258048 activation_62[0][0]
__________________________________________________________________________________________________
conv2d_68 (Conv2D) (None, None, None, 1 258048 activation_67[0][0]
__________________________________________________________________________________________________
batch_normalization_63 (BatchNo (None, None, None, 1 576 conv2d_63[0][0]
__________________________________________________________________________________________________
batch_normalization_68 (BatchNo (None, None, None, 1 576 conv2d_68[0][0]
__________________________________________________________________________________________________
activation_63 (Activation) (None, None, None, 1 0 batch_normalization_63[0][0]
__________________________________________________________________________________________________
activation_68 (Activation) (None, None, None, 1 0 batch_normalization_68[0][0]
__________________________________________________________________________________________________
average_pooling2d_7 (AveragePoo (None, None, None, 7 0 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_61 (Conv2D) (None, None, None, 1 147456 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_64 (Conv2D) (None, None, None, 1 258048 activation_63[0][0]
__________________________________________________________________________________________________
conv2d_69 (Conv2D) (None, None, None, 1 258048 activation_68[0][0]
__________________________________________________________________________________________________
conv2d_70 (Conv2D) (None, None, None, 1 147456 average_pooling2d_7[0][0]
__________________________________________________________________________________________________
batch_normalization_61 (BatchNo (None, None, None, 1 576 conv2d_61[0][0]
__________________________________________________________________________________________________
batch_normalization_64 (BatchNo (None, None, None, 1 576 conv2d_64[0][0]
__________________________________________________________________________________________________
batch_normalization_69 (BatchNo (None, None, None, 1 576 conv2d_69[0][0]
__________________________________________________________________________________________________
batch_normalization_70 (BatchNo (None, None, None, 1 576 conv2d_70[0][0]
__________________________________________________________________________________________________
activation_61 (Activation) (None, None, None, 1 0 batch_normalization_61[0][0]
__________________________________________________________________________________________________
activation_64 (Activation) (None, None, None, 1 0 batch_normalization_64[0][0]
__________________________________________________________________________________________________
activation_69 (Activation) (None, None, None, 1 0 batch_normalization_69[0][0]
__________________________________________________________________________________________________
activation_70 (Activation) (None, None, None, 1 0 batch_normalization_70[0][0]
__________________________________________________________________________________________________
mixed7 (Concatenate) (None, None, None, 7 0 activation_61[0][0]
activation_64[0][0]
activation_69[0][0]
activation_70[0][0]
__________________________________________________________________________________________________
conv2d_73 (Conv2D) (None, None, None, 1 147456 mixed7[0][0]
__________________________________________________________________________________________________
batch_normalization_73 (BatchNo (None, None, None, 1 576 conv2d_73[0][0]
__________________________________________________________________________________________________
activation_73 (Activation) (None, None, None, 1 0 batch_normalization_73[0][0]
__________________________________________________________________________________________________
conv2d_74 (Conv2D) (None, None, None, 1 258048 activation_73[0][0]
__________________________________________________________________________________________________
batch_normalization_74 (BatchNo (None, None, None, 1 576 conv2d_74[0][0]
__________________________________________________________________________________________________
activation_74 (Activation) (None, None, None, 1 0 batch_normalization_74[0][0]
__________________________________________________________________________________________________
conv2d_71 (Conv2D) (None, None, None, 1 147456 mixed7[0][0]
__________________________________________________________________________________________________
conv2d_75 (Conv2D) (None, None, None, 1 258048 activation_74[0][0]
__________________________________________________________________________________________________
batch_normalization_71 (BatchNo (None, None, None, 1 576 conv2d_71[0][0]
__________________________________________________________________________________________________
batch_normalization_75 (BatchNo (None, None, None, 1 576 conv2d_75[0][0]
__________________________________________________________________________________________________
activation_71 (Activation) (None, None, None, 1 0 batch_normalization_71[0][0]
__________________________________________________________________________________________________
activation_75 (Activation) (None, None, None, 1 0 batch_normalization_75[0][0]
__________________________________________________________________________________________________
conv2d_72 (Conv2D) (None, None, None, 3 552960 activation_71[0][0]
__________________________________________________________________________________________________
conv2d_76 (Conv2D) (None, None, None, 1 331776 activation_75[0][0]
__________________________________________________________________________________________________
batch_normalization_72 (BatchNo (None, None, None, 3 960 conv2d_72[0][0]
__________________________________________________________________________________________________
batch_normalization_76 (BatchNo (None, None, None, 1 576 conv2d_76[0][0]
__________________________________________________________________________________________________
activation_72 (Activation) (None, None, None, 3 0 batch_normalization_72[0][0]
__________________________________________________________________________________________________
activation_76 (Activation) (None, None, None, 1 0 batch_normalization_76[0][0]
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, None, None, 7 0 mixed7[0][0]
__________________________________________________________________________________________________
mixed8 (Concatenate) (None, None, None, 1 0 activation_72[0][0]
activation_76[0][0]
max_pooling2d_4[0][0]
__________________________________________________________________________________________________
conv2d_81 (Conv2D) (None, None, None, 4 573440 mixed8[0][0]
__________________________________________________________________________________________________
batch_normalization_81 (BatchNo (None, None, None, 4 1344 conv2d_81[0][0]
__________________________________________________________________________________________________
activation_81 (Activation) (None, None, None, 4 0 batch_normalization_81[0][0]
__________________________________________________________________________________________________
conv2d_78 (Conv2D) (None, None, None, 3 491520 mixed8[0][0]
__________________________________________________________________________________________________
conv2d_82 (Conv2D) (None, None, None, 3 1548288 activation_81[0][0]
__________________________________________________________________________________________________
batch_normalization_78 (BatchNo (None, None, None, 3 1152 conv2d_78[0][0]
__________________________________________________________________________________________________
batch_normalization_82 (BatchNo (None, None, None, 3 1152 conv2d_82[0][0]
__________________________________________________________________________________________________
activation_78 (Activation) (None, None, None, 3 0 batch_normalization_78[0][0]
__________________________________________________________________________________________________
activation_82 (Activation) (None, None, None, 3 0 batch_normalization_82[0][0]
__________________________________________________________________________________________________
conv2d_79 (Conv2D) (None, None, None, 3 442368 activation_78[0][0]
__________________________________________________________________________________________________
conv2d_80 (Conv2D) (None, None, None, 3 442368 activation_78[0][0]
__________________________________________________________________________________________________
conv2d_83 (Conv2D) (None, None, None, 3 442368 activation_82[0][0]
__________________________________________________________________________________________________
conv2d_84 (Conv2D) (None, None, None, 3 442368 activation_82[0][0]
__________________________________________________________________________________________________
average_pooling2d_8 (AveragePoo (None, None, None, 1 0 mixed8[0][0]
__________________________________________________________________________________________________
conv2d_77 (Conv2D) (None, None, None, 3 409600 mixed8[0][0]
__________________________________________________________________________________________________
batch_normalization_79 (BatchNo (None, None, None, 3 1152 conv2d_79[0][0]
__________________________________________________________________________________________________
batch_normalization_80 (BatchNo (None, None, None, 3 1152 conv2d_80[0][0]
__________________________________________________________________________________________________
batch_normalization_83 (BatchNo (None, None, None, 3 1152 conv2d_83[0][0]
__________________________________________________________________________________________________
batch_normalization_84 (BatchNo (None, None, None, 3 1152 conv2d_84[0][0]
__________________________________________________________________________________________________
conv2d_85 (Conv2D) (None, None, None, 1 245760 average_pooling2d_8[0][0]
__________________________________________________________________________________________________
batch_normalization_77 (BatchNo (None, None, None, 3 960 conv2d_77[0][0]
__________________________________________________________________________________________________
activation_79 (Activation) (None, None, None, 3 0 batch_normalization_79[0][0]
__________________________________________________________________________________________________
activation_80 (Activation) (None, None, None, 3 0 batch_normalization_80[0][0]
__________________________________________________________________________________________________
activation_83 (Activation) (None, None, None, 3 0 batch_normalization_83[0][0]
__________________________________________________________________________________________________
activation_84 (Activation) (None, None, None, 3 0 batch_normalization_84[0][0]
__________________________________________________________________________________________________
batch_normalization_85 (BatchNo (None, None, None, 1 576 conv2d_85[0][0]
__________________________________________________________________________________________________
activation_77 (Activation) (None, None, None, 3 0 batch_normalization_77[0][0]
__________________________________________________________________________________________________
mixed9_0 (Concatenate) (None, None, None, 7 0 activation_79[0][0]
activation_80[0][0]
__________________________________________________________________________________________________
concatenate_1 (Concatenate) (None, None, None, 7 0 activation_83[0][0]
activation_84[0][0]
__________________________________________________________________________________________________
activation_85 (Activation) (None, None, None, 1 0 batch_normalization_85[0][0]
__________________________________________________________________________________________________
mixed9 (Concatenate) (None, None, None, 2 0 activation_77[0][0]
mixed9_0[0][0]
concatenate_1[0][0]
activation_85[0][0]
__________________________________________________________________________________________________
conv2d_90 (Conv2D) (None, None, None, 4 917504 mixed9[0][0]
__________________________________________________________________________________________________
batch_normalization_90 (BatchNo (None, None, None, 4 1344 conv2d_90[0][0]
__________________________________________________________________________________________________
activation_90 (Activation) (None, None, None, 4 0 batch_normalization_90[0][0]
__________________________________________________________________________________________________
conv2d_87 (Conv2D) (None, None, None, 3 786432 mixed9[0][0]
__________________________________________________________________________________________________
conv2d_91 (Conv2D) (None, None, None, 3 1548288 activation_90[0][0]
__________________________________________________________________________________________________
batch_normalization_87 (BatchNo (None, None, None, 3 1152 conv2d_87[0][0]
__________________________________________________________________________________________________
batch_normalization_91 (BatchNo (None, None, None, 3 1152 conv2d_91[0][0]
__________________________________________________________________________________________________
activation_87 (Activation) (None, None, None, 3 0 batch_normalization_87[0][0]
__________________________________________________________________________________________________
activation_91 (Activation) (None, None, None, 3 0 batch_normalization_91[0][0]
__________________________________________________________________________________________________
conv2d_88 (Conv2D) (None, None, None, 3 442368 activation_87[0][0]
__________________________________________________________________________________________________
conv2d_89 (Conv2D) (None, None, None, 3 442368 activation_87[0][0]
__________________________________________________________________________________________________
conv2d_92 (Conv2D) (None, None, None, 3 442368 activation_91[0][0]
__________________________________________________________________________________________________
conv2d_93 (Conv2D) (None, None, None, 3 442368 activation_91[0][0]
__________________________________________________________________________________________________
average_pooling2d_9 (AveragePoo (None, None, None, 2 0 mixed9[0][0]
__________________________________________________________________________________________________
conv2d_86 (Conv2D) (None, None, None, 3 655360 mixed9[0][0]
__________________________________________________________________________________________________
batch_normalization_88 (BatchNo (None, None, None, 3 1152 conv2d_88[0][0]
__________________________________________________________________________________________________
batch_normalization_89 (BatchNo (None, None, None, 3 1152 conv2d_89[0][0]
__________________________________________________________________________________________________
batch_normalization_92 (BatchNo (None, None, None, 3 1152 conv2d_92[0][0]
__________________________________________________________________________________________________
batch_normalization_93 (BatchNo (None, None, None, 3 1152 conv2d_93[0][0]
__________________________________________________________________________________________________
conv2d_94 (Conv2D) (None, None, None, 1 393216 average_pooling2d_9[0][0]
__________________________________________________________________________________________________
batch_normalization_86 (BatchNo (None, None, None, 3 960 conv2d_86[0][0]
__________________________________________________________________________________________________
activation_88 (Activation) (None, None, None, 3 0 batch_normalization_88[0][0]
__________________________________________________________________________________________________
activation_89 (Activation) (None, None, None, 3 0 batch_normalization_89[0][0]
__________________________________________________________________________________________________
activation_92 (Activation) (None, None, None, 3 0 batch_normalization_92[0][0]
__________________________________________________________________________________________________
activation_93 (Activation) (None, None, None, 3 0 batch_normalization_93[0][0]
__________________________________________________________________________________________________
batch_normalization_94 (BatchNo (None, None, None, 1 576 conv2d_94[0][0]
__________________________________________________________________________________________________
activation_86 (Activation) (None, None, None, 3 0 batch_normalization_86[0][0]
__________________________________________________________________________________________________
mixed9_1 (Concatenate) (None, None, None, 7 0 activation_88[0][0]
activation_89[0][0]
__________________________________________________________________________________________________
concatenate_2 (Concatenate) (None, None, None, 7 0 activation_92[0][0]
activation_93[0][0]
__________________________________________________________________________________________________
activation_94 (Activation) (None, None, None, 1 0 batch_normalization_94[0][0]
__________________________________________________________________________________________________
mixed10 (Concatenate) (None, None, None, 2 0 activation_86[0][0]
mixed9_1[0][0]
concatenate_2[0][0]
activation_94[0][0]
__________________________________________________________________________________________________
global_average_pooling2d_1 (Glo (None, 2048) 0 mixed10[0][0]
__________________________________________________________________________________________________
dense_1 (Dense) (None, 1024) 2098176 global_average_pooling2d_1[0][0]
__________________________________________________________________________________________________
dense_2 (Dense) (None, 101) 103525 dense_1[0][0]
==================================================================================================
Total params: 24,004,485
Trainable params: 23,970,053
Non-trainable params: 34,432
__________________________________________________________________________________________________
None
Found 1788425 images belonging to 101 classes.
Found 697865 images belonging to 101 classes.
Loading network from ImageNet weights.
Windows 10 environment setup:
Anaconda Python 3.6.6 + tensorflow-gpu 1.11 + CUDA 9 + cuDNN 9.0
conda create -n tensorflow python=3.5
pip install -r requirements.txt
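The new environment has to be activated before installing the requirements; a hedged reminder of the missing command (environment name taken from the conda create line above):
activate tensorflow    (or: conda activate tensorflow on newer conda versions)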
Summarize the videos, their class, train/test status and frame count in a CSV we’ll reference throughout our training.
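Each CSV row describes one video; a hedged example of the layout, with the column order assumed from the scripts linked below (train/test split, class, filename without extension, frame count):
train,ApplyEyeMakeup,v_ApplyEyeMakeup_g08_c01,121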
Comparison with the UCF-101 videos:
Dataset | Size | FPS | Length | Images
---|---|---|---|---
UCF-101 | 320×240 | 25 | 4~10 s | 44
RTVC2018 | 540×960, 480×684, 1280×720, ... | 30 | 10 s | 50
0. Generate the train/test list files
@see: https://raw.githubusercontent.com/LouiValley/AI-Challenge-RTVC/master/mlsv2018/0_train_test_files.py
1. Split all the videos into train/test folders
@see: https://raw.githubusercontent.com/LouiValley/AI-Challenge-RTVC/master/mlsv2018/1_move_files.py
2. Extract a JPEG of each frame of each video
@see: https://raw.githubusercontent.com/LouiValley/AI-Challenge-RTVC/master/mlsv2018/2_extract_files.py
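A minimal sketch of this step, assuming ffmpeg is on the PATH (the real logic lives in 2_extract_files.py linked above; the helper name and example paths are just for illustration):

import os
from subprocess import call

def extract_frames(video_path, dest_dir):
    # Dump every frame of one video as numbered JPEGs, e.g. name-0001.jpg
    os.makedirs(dest_dir, exist_ok=True)
    name = os.path.splitext(os.path.basename(video_path))[0]
    dest_pattern = os.path.join(dest_dir, name + '-%04d.jpg')
    call(["ffmpeg", "-i", video_path, dest_pattern])

extract_frames("train/ApplyEyeMakeup/v_ApplyEyeMakeup_g08_c01.avi", "train/ApplyEyeMakeup")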
3. Extract image features from the frames
@see: https://raw.githubusercontent.com/LouiValley/AI-Challenge-RTVC/master/mlsv2018/3_extract_features.py
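A minimal sketch of per-frame feature extraction with Keras' InceptionV3, producing the 2048-dim vectors the LSTM consumes (the pooling choice and the file names are assumptions; the actual code is in 3_extract_features.py linked above):

import numpy as np
from keras.applications.inception_v3 import InceptionV3, preprocess_input
from keras.preprocessing import image

# ImageNet-pretrained InceptionV3 without its classification head;
# global average pooling turns each frame into one 2048-dim feature vector.
model = InceptionV3(weights='imagenet', include_top=False, pooling='avg')

def extract_feature(img_path):
    img = image.load_img(img_path, target_size=(299, 299))
    x = image.img_to_array(img)
    x = preprocess_input(np.expand_dims(x, axis=0))
    return model.predict(x)[0]  # shape (2048,)

# One training sample per video: a (seq_length, 2048) stack of frame features.
sequence = np.array([extract_feature(p) for p in ["frame-0001.jpg", "frame-0002.jpg"]])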
4. Train the model and validate it
@see: https://raw.githubusercontent.com/LouiValley/AI-Challenge-RTVC/master/mlsv2018/train.py
5. Prediction demo
@see: https://raw.githubusercontent.com/LouiValley/AI-Challenge-RTVC/master/mlsv2018/demo.py
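A minimal sketch of what the prediction demo can look like (the checkpoint path, feature file and class handling are assumptions; see demo.py linked above for the actual flow):

import numpy as np
from keras.models import load_model

model = load_model('data/checkpoints/lstm-features.hdf5')  # assumed checkpoint path

# features: a (seq_length, 2048) array built as in step 3
features = np.load('sample_sequence.npy')
probs = model.predict(np.expand_dims(features, axis=0))[0]  # softmax over the 101 classes
print("predicted class index:", int(np.argmax(probs)))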
Issues encountered and fixes:
1. GPU CUDA_ERROR_OUT_OF_MEMORY
import tensorflow as tf

config = tf.ConfigProto()
config.gpu_options.allow_growth = True  # grab GPU memory on demand instead of all at once
sess = tf.Session(config=config)
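With Keras on the TF1 backend, the session above only takes effect once it is registered as the backend session; a minimal addition (assuming Keras 2.x):

from keras import backend as K
K.set_session(sess)  # make Keras train inside the growth-enabled session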
2. Loading our data from the CSV file returns rows with blank values; change
with open(os.path.join('data', 'data_file.csv'), 'r') as fin:
    reader = csv.reader(fin)
    data = list(reader)
    print(data)
to
with open(os.path.join('data', 'data_file.csv'), 'r') as fin:
    data = pandas.read_csv(fin, header=None)
    print(data.values.tolist())
3. Get the path parts on Windows 10 (the paths in the list files use forward slashes, so split on "/" instead of os.path.sep); change
parts = video.split(os.path.sep)
to
parts = video.split("/")
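A hedged alternative that works on both Linux and Windows is to normalize the separators before splitting:

parts = video.replace("\\", "/").split("/")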
In 1_move_files.py on Windows 10, change
if not os.path.exists(filename):
    print("Can't find %s to move. Skipping." % (filename))
    continue

# Move it.
dest = os.path.join(group, classname, filename)
print("Moving %s to %s" % (filename, dest))
os.rename(filename, dest)
to
fullfilename = "UCF-101/" + classname + "/" + filename
print(fullfilename)
if not os.path.exists(fullfilename):
    print("Can't find %s to move. Skipping." % (fullfilename))
    continue

# Move it.
dest = os.path.join(group, classname, filename)
print("Moving %s to %s" % (fullfilename, dest))
os.rename(fullfilename, dest)
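One caveat worth noting (an assumption, not from the original notes): os.rename fails if the destination folder does not exist yet, so it may be necessary to create it first:

import os
os.makedirs(os.path.join(group, classname), exist_ok=True)  # group/classname as in the snippet above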
4. MemoryError while training (train_cnn.py)
Epoch 1/1000
6/100 [>.............................] - ETA: 2:08 - loss: 1.4358 - acc: 0.6094 - top_k_categorical_accuracy: 0.8542Traceback (most recent call last):
File "train_cnn.py", line 142, in <module>
main(weights_file)
File "train_cnn.py", line 138, in main
[checkpointer, early_stopper, tensorboard])
File "train_cnn.py", line 119, in train_model
callbacks=callbacks)
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\site-packages\keras\legacy\interfaces.py", line 91, in wrapper
return func(*args, **kwargs)
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\site-packages\keras\engine\training.py", line 1418, in fit_generator
initial_epoch=initial_epoch)
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\site-packages\keras\engine\training_generator.py", line 180, in fit_generator
generator_output = next(output_generator)
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\site-packages\keras\utils\data_utils.py", line 601, in get
six.reraise(*sys.exc_info())
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\site-packages\six.py", line 693, in reraise
raise value
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\site-packages\keras\utils\data_utils.py", line 595, in get
inputs = self.queue.get(block=True).get()
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\multiprocessing\pool.py", line 644, in get
raise self._value
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\multiprocessing\pool.py", line 119, in worker
result = (True, func(*args, **kwds))
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\site-packages\keras\utils\data_utils.py", line 401, in get_index
return _SHARED_SEQUENCES[uid][i]
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\site-packages\keras_preprocessing\image.py", line 1441, in __getitem__
return self._get_batches_of_transformed_samples(index_array)
File "C:\Users\Administrator\AppData\Local\conda\conda\envs\tensorflow_gpu\lib\site-packages\keras_preprocessing\image.py", line 1916, in _get_batches_of_transformed_samples
dtype=self.dtype)
MemoryError
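The notes do not record how this was resolved; common mitigations (hedged suggestions only, with made-up variable names such as datagen) are a smaller generator batch size, smaller target images, or single-process data loading:

train_generator = datagen.flow_from_directory(
    "data/train",
    target_size=(224, 224),   # smaller images -> less memory per batch
    batch_size=16,            # reduce further if the MemoryError persists
    class_mode="categorical")

model.fit_generator(
    train_generator,
    steps_per_epoch=100,
    epochs=1000,
    workers=1,                    # a single worker avoids extra copies in the queue
    use_multiprocessing=False,
    callbacks=[checkpointer, early_stopper, tensorboard])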
train_cnn.py
seq_length = 40
class_limit = 4 # Number of classes to extract. Can be 1-101 or None for all.
Epoch 00025: val_loss did not improve from 1.46369
Epoch 26/1000
100/100 [==============================] - 66s 655ms/step - loss: 0.6564 - acc: 0.8159 - top_k_categorical_accuracy: 0.9556 - val_loss: 1.6311 - val_acc: 0.5312 - val_top_k_categorical_accuracy: 0.8281
Epoch 00026: val_loss did not improve from 1.46369
seq_length = 40
class_limit = 101 # Number of classes to extract. Can be 1-101 or None for all.
100/100 [==============================] - 64s 644ms/step - loss: 0.6991 - acc: 0.8144 - top_k_categorical_accuracy: 0.9541 - val_loss: 1.6719 - val_acc: 0.5594 - val_top_k_categorical_accuracy: 0.8313
Epoch 00018: val_loss did not improve from 1.47632
train.py
lstm
Epoch 00021: val_loss did not improve from 1.15622
Epoch 22/1000
262/262 [==============================] - 117s 447ms/step - loss: 1.0282 - acc: 0.7091 - top_k_categorical_accuracy: 0.9294 - val_loss: 1.1878 - val_acc: 0.6477 - val_top_k_categorical_accuracy: 0.9094
Epoch 00022: val_loss did not improve from 1.15622
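For reference, a minimal sketch of an lstm model consistent with the layer sizes the training script prints (LSTM 2048, Dense 512, Dropout, Dense 101 softmax, over seq_length = 40 frames of 2048-d features); the dropout rate, activation, and optimizer are assumptions:

from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout

model = Sequential([
    LSTM(2048, input_shape=(40, 2048)),   # 33,562,624 params
    Dense(512, activation="relu"),        #  1,049,088 params
    Dropout(0.5),
    Dense(101, activation="softmax"),     #     51,813 params
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy", "top_k_categorical_accuracy"])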
lrcn
262/262 [==============================] - 116s 443ms/step - loss: 0.6100 - acc: 0.8206 - top_k_categorical_accuracy: 0.9724 - val_loss: 1.1782 - val_acc: 0.7055 - val_top_k_categorical_accuracy: 0.9117
Epoch 00037: val_loss did not improve from 1.08926
mlp
262/262 [==============================] - 117s 446ms/step - loss: 0.6700 - acc: 0.8040 - top_k_categorical_accuracy: 0.9656 - val_loss: 1.1547 - val_acc: 0.7156 - val_top_k_categorical_accuracy: 0.9187
Epoch 00034: val_loss did not improve from 1.11599
c3d
262/262 [==============================] - 118s 449ms/step - loss: 1.0766 - acc: 0.7011 - top_k_categorical_accuracy: 0.9216 - val_loss: 1.2615 - val_acc: 0.6703 - val_top_k_categorical_accuracy: 0.9008
Epoch 00021: val_loss did not improve from 1.14744
conv_3d
262/262 [==============================] - 119s 455ms/step - loss: 0.8180 - acc: 0.7654 - top_k_categorical_accuracy: 0.9507 - val_loss: 1.1457 - val_acc: 0.6969 - val_top_k_categorical_accuracy: 0.9109
Epoch 00029: val_loss did not improve from 1.05913