scriptDir: /tmp/bitcraze/aideck-gap8-examples/tools/build
example_path: examples/ai/classification/
make_args: clean model build image
full_path: /tmp/bitcraze/aideck-gap8-examples/tools/build/../../examples/ai/classification/
Requirement already satisfied: numpy==1.22.3 in /home/user/.local/lib/python3.8/site-packages (1.22.3)
Building GAP8 mode with 8 bit quantization
script model/nntool_script
GEN ... /gap_sdk/tools/autotiler_v3/CNN_Generators/CNN_Generator_Util.c /gap_sdk/tools/autotiler_v3/CNN_Generators/CNN_Copy_Generators.c /gap_sdk/tools/autotiler_v3/CNN_Generators/SSD_Generators.c /gap_sdk/tools/autotiler_v3/Generators/BilinearResizes/ResizeGenerator.c /gap_sdk/tools/autotiler_v3/DSP_Generators/DSP_Generators.c /gap_sdk/tools/autotiler_v3/CNN_Generators_SQ8/CNN_Generators_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Generators_SQ8/RNN_Generators_SQ8.c
APP_SRCS... classification.c ../../../lib/cpx/src/com.c ../../../lib/cpx/src/cpx.c BUILD_MODEL_SQ8BIT/classificationKernels.c /gap_sdk/libs/gap_lib/img_io/ImgIO.c /gap_sdk/tools/autotiler_v3/CNN_Libraries/SSD_BasicKernels.c /gap_sdk/tools/autotiler_v3/Generators/BilinearResizes/ResizeBasicKernels.c /gap_sdk/tools/autotiler_v3/CNN_Libraries/CNN_Copy.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_AT_Misc.c /gap_sdk/tools/autotiler_v3/DSP_Libraries/CmplxFunctions.c /gap_sdk/tools/autotiler_v3/DSP_Libraries/MatMulDSP.c /gap_sdk/tools/autotiler_v3/DSP_Libraries/FFT_Library.c /gap_sdk/tools/autotiler_v3/DSP_Libraries/MfccBasicKernels.c /gap_sdk/tools/autotiler_v3/DSP_Libraries/PreProcessing.c /gap_sdk/tools/autotiler_v3/DSP_Libraries/math_funcs.c /gap_sdk/tools/autotiler_v3/DSP_Libraries/pulp_dsp/plp_cos_f32s_xpulpv2.c /gap_sdk/tools/autotiler_v3/DSP_Libraries/pulp_dsp/plp_sin_f32s_xpulpv2.c /gap_sdk/tools/autotiler_v3/DSP_Libraries/pulp_dsp/plp_common_tables.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_Activation_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_Activation_HWC_SQ8.c
/gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_Bias_Linear_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_Conv_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_MatMul_Conv_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_Pooling_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_Conv_DW_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_Conv_DW_Red_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_MatAlgebra_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/CNN_SoftMax_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8/RNN_SQ8.c
APP_CFLAGS... -DMODEL_QUANTIZED -g -Os -mno-memcpy -fno-tree-loop-distribute-patterns -I. -I/gap_sdk/libs/gap_lib/include -I/gap_sdk/tools/autotiler_v3/Emulation -I/gap_sdk/tools/autotiler_v3/Autotiler -I/gap_sdk/tools/autotiler_v3/Generators/BilinearResizes -I/gap_sdk/tools/autotiler_v3/CNN_Libraries -I/gap_sdk/tools/autotiler_v3/DSP_Libraries -I/gap_sdk/tools/autotiler_v3/CNN_Libraries_fp16 -I/gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8 -I/tmp/bitcraze/aideck-gap8-examples/examples/ai/classification/BUILD_MODEL_SQ8BIT -DPERF -DAT_MODEL_PREFIX=classification -DAT_INPUT_HEIGHT= -DAT_INPUT_WIDTH= -DAT_INPUT_COLORS= -DSTACK_SIZE=6096 -DSLAVE_STACK_SIZE=1024 -DconfigUSE_TIMERS=1 -DINCLUDE_xTimerPendFunctionCall=1 -DFS_PARTITIONTABLE_OFFSET=0x40000 -DFREQ_FC=200 -DFREQ_CL=170 -DTXQ_SIZE=5 -DRXQ_SIZE=5
rm -f -rf BUILD_MODEL_SQ8BIT
rm -f /tmp/bitcraze/aideck-gap8-examples/examples/ai/classification/BUILD/GAP8_V2/GCC_RISCV_FREERTOS/flash.img
mkdir BUILD_MODEL_SQ8BIT
cp model/classification_q.tflite BUILD_MODEL_SQ8BIT/classification.tflite
echo "GENERATING NNTOOL STATE FILE"
GENERATING NNTOOL STATE FILE
nntool -s model/nntool_script BUILD_MODEL_SQ8BIT/classification.tflite -q
/usr/local/lib/python3.8/dist-packages/sklearn/utils/multiclass.py:13: DeprecationWarning: Please use `spmatrix` from the `scipy.sparse` namespace, the `scipy.sparse.base` namespace is deprecated.
  from scipy.sparse.base import spmatrix
/usr/local/lib/python3.8/dist-packages/sklearn/linear_model/least_angle.py:30: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here. Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  method='lar', copy_X=True, eps=np.finfo(np.float).eps,
/usr/local/lib/python3.8/dist-packages/sklearn/utils/optimize.py:18: DeprecationWarning: Please use `line_search_wolfe2` from the `scipy.optimize` namespace, the `scipy.optimize.linesearch` namespace is deprecated.
  from scipy.optimize.linesearch import line_search_wolfe2, line_search_wolfe1
/usr/local/lib/python3.8/dist-packages/sklearn/utils/optimize.py:18: DeprecationWarning: Please use `line_search_wolfe1` from the `scipy.optimize` namespace, the `scipy.optimize.linesearch` namespace is deprecated.
  from scipy.optimize.linesearch import line_search_wolfe2, line_search_wolfe1
/usr/local/lib/python3.8/dist-packages/sklearn/decomposition/online_lda.py:29: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  EPS = np.finfo(np.float).eps
settings - set log level to INFO
log_level - was: 'INFO' now: 'INFO'
open - opening graph file BUILD_MODEL_SQ8BIT/classification.tflite load_quantization = True
tflite - Importing TFLITE model version 3
quantize_mixin - removing (de)quantize node QUANTIZE_0_0 with no effect
quantize_mixin - removing (de)quantize node QUANTIZE_0_75 with no effect
nngraph - update graph dimensions
nngraph - update graph dimensions
/gap_sdk/tools/nntool/nntool/quantization/multiplicative/quantizers/filter_mult.py:241: RuntimeWarning: divide by zero encountered in log2
  biases_bits = np.ceil(np.log2(np.abs(biases_q.quantize(biases_node.dqvalue))))
unified_quantization_handler - indicating change of FULLY_CONNECTED_0_73 input from c, out_cin_c, out_c to chw, out_cin_chw, out_c order - rerun adjust command
unified_quantization_handler - indicating change of FULLY_CONNECTED_0_73 output from c to chw order - rerun adjust command
nngraph - update graph dimensions
nngraph - update graph dimensions
nngraph - update graph dimensions
fuse_pad - adding padding from: PAD_0_8 to filter: DEPTHWISE_CONV_2D_0_9 - has 0 reshapes
fuse_pad - adding padding from: PAD_0_16 to filter: DEPTHWISE_CONV_2D_0_17 - has 0 reshapes
fuse_pad - adding padding from: PAD_0_28 to filter: DEPTHWISE_CONV_2D_0_29 - has 0 reshapes
fuse_pad - adding padding from: PAD_0_55 to filter: DEPTHWISE_CONV_2D_0_56 - has 0 reshapes
nngraph - update graph dimensions
debug - was: False now: True
adjust_order - adding transposes to correct tensor order for AT kernels
global_pool - global pool fusion MEAN_0_72: inserting transpose before operation
eliminate_transposes - found elimination for CONV_2D_0_10_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_10_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_10_trans_out0 downwards - 4 eliminated
eliminate_transposes - found elimination for CONV_2D_0_11_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_11_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_13_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_13_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_15_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_15_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_18_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_18_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_18_trans_out0 downwards - 6 eliminated
eliminate_transposes - found elimination for CONV_2D_0_19_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_19_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_21_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_21_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_23_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_23_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_25_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_25_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_27_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_27_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_2_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_2_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_2_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_30_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_30_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_30_trans_out0 downwards - 8 eliminated
eliminate_transposes - found elimination for CONV_2D_0_31_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_31_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_33_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_33_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_35_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_35_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_37_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_37_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_39_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_39_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_41_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_41_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_43_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_43_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_45_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_45_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_45_trans_out0 downwards - 6 eliminated
eliminate_transposes - found elimination for CONV_2D_0_46_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_46_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_48_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_48_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_4_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_4_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_4_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_50_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_50_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_52_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_52_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_54_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_54_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_57_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_57_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_57_trans_out0 downwards - 6 eliminated
eliminate_transposes - found elimination for CONV_2D_0_58_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_58_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_60_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_60_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_62_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_62_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_64_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_64_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_66_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_66_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_68_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_68_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_68_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_69_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_69_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_6_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_6_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_6_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_71_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_71_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_71_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_7_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for CONV_2D_0_7_trans_out0 downwards - 2 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_12_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_17_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_1_trans_in0 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_1_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_20_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_24_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_29_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_32_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_36_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_40_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_44_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_47_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_51_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_56_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_59_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_5_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_63_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_67_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_70_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_9_trans_in1 upwards - 1 eliminated
eliminate_transposes - eliminate transposes
nngraph - update graph dimensions
nngraph - update graph dimensions
eliminate_transposes - no transposes to eliminate found
nngraph - update graph dimensions
nngraph - update graph dimensions
eliminate_transposes - no further transpose sequences found
nngraph - adjusted order
nngraph - update graph dimensions
remove_noops - removing QUANTIZE_0_0 that does nothing
remove_noops - removing QUANTIZE_0_75 that does nothing
matcher - ++ fusion
remove_noops modified graph
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_1,DEPTHWISE_CONV_2D_0_1_activation into DEPTHWISE_CONV_2D_0_1_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_4,CONV_2D_0_4_activation into CONV_2D_0_4_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_5,DEPTHWISE_CONV_2D_0_5_activation into DEPTHWISE_CONV_2D_0_5_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_7,CONV_2D_0_7_activation into CONV_2D_0_7_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_9,DEPTHWISE_CONV_2D_0_9_activation into DEPTHWISE_CONV_2D_0_9_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_11,CONV_2D_0_11_activation into CONV_2D_0_11_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_12,DEPTHWISE_CONV_2D_0_12_activation into DEPTHWISE_CONV_2D_0_12_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_15,CONV_2D_0_15_activation into CONV_2D_0_15_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_17,DEPTHWISE_CONV_2D_0_17_activation into DEPTHWISE_CONV_2D_0_17_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_19,CONV_2D_0_19_activation into CONV_2D_0_19_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_20,DEPTHWISE_CONV_2D_0_20_activation into DEPTHWISE_CONV_2D_0_20_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_23,CONV_2D_0_23_activation into CONV_2D_0_23_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_24,DEPTHWISE_CONV_2D_0_24_activation into DEPTHWISE_CONV_2D_0_24_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_27,CONV_2D_0_27_activation into CONV_2D_0_27_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_29,DEPTHWISE_CONV_2D_0_29_activation into DEPTHWISE_CONV_2D_0_29_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_31,CONV_2D_0_31_activation into CONV_2D_0_31_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_32,DEPTHWISE_CONV_2D_0_32_activation into DEPTHWISE_CONV_2D_0_32_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_35,CONV_2D_0_35_activation into CONV_2D_0_35_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_36,DEPTHWISE_CONV_2D_0_36_activation into DEPTHWISE_CONV_2D_0_36_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_39,CONV_2D_0_39_activation into CONV_2D_0_39_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_40,DEPTHWISE_CONV_2D_0_40_activation into DEPTHWISE_CONV_2D_0_40_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_43,CONV_2D_0_43_activation into CONV_2D_0_43_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_44,DEPTHWISE_CONV_2D_0_44_activation into DEPTHWISE_CONV_2D_0_44_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_46,CONV_2D_0_46_activation into CONV_2D_0_46_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_47,DEPTHWISE_CONV_2D_0_47_activation into DEPTHWISE_CONV_2D_0_47_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_50,CONV_2D_0_50_activation into CONV_2D_0_50_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_51,DEPTHWISE_CONV_2D_0_51_activation into DEPTHWISE_CONV_2D_0_51_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_54,CONV_2D_0_54_activation into CONV_2D_0_54_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_56,DEPTHWISE_CONV_2D_0_56_activation into DEPTHWISE_CONV_2D_0_56_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_58,CONV_2D_0_58_activation into CONV_2D_0_58_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_59,DEPTHWISE_CONV_2D_0_59_activation into DEPTHWISE_CONV_2D_0_59_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_62,CONV_2D_0_62_activation into CONV_2D_0_62_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_63,DEPTHWISE_CONV_2D_0_63_activation into DEPTHWISE_CONV_2D_0_63_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_66,CONV_2D_0_66_activation into CONV_2D_0_66_fusion
fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_67,DEPTHWISE_CONV_2D_0_67_activation into DEPTHWISE_CONV_2D_0_67_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_69,CONV_2D_0_69_activation into CONV_2D_0_69_fusion
fuse_gap_convs - fusing nodes CONV_2D_0_71,CONV_2D_0_71_activation into CONV_2D_0_71_fusion
matcher -
++ fusion fuse_gap_convs modified graph
matcher - ++ fusion scaled_match_group modified graph
/gap_sdk/tools/nntool/nntool/quantization/multiplicative/quantizers/filter_mult.py:241: RuntimeWarning: divide by zero encountered in log2
  biases_bits = np.ceil(np.log2(np.abs(biases_q.quantize(biases_node.dqvalue))))
nngraph - update graph dimensions
adjust_order - adding transposes to correct tensor order for AT kernels
eliminate_transposes - no transposes to eliminate found
nngraph - update graph dimensions
nngraph - update graph dimensions
eliminate_transposes - no further transpose sequences found
nngraph - adjusted order
| Step | Step name | Operation | Input Dims (cxhxw) | Output Dims (cxhxw) | Inputs | Active size | Params size | Ops | Params | Hints |
| 0 | input_1 | input | 1x324x244 | 1x324x244 | | 79056 | 0 | | I 1x324x244 | in: cxhxw out: cxhxw |
| 3 | DEPTHWISE_CONV_2D_0_1_fusion | conv_fusion_conv_active | 1x324x244, 1x1x1x1, 1 | 1x162x122 | 0/0, 1/0, 2/0 | 98822 | 0 | 19.76K | F 1x1x1x1 S 2x2 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero, Activation relu | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 6 | CONV_2D_0_2 | conv2d | 1x162x122, 3x1x1x1, 3 | 3x162x122 | 3/0, 4/0, 5/0 | 79062 | 0 | 59.29K | F 3x1x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 7 | RESIZE_BILINEAR_0_3 | bilinear | 3x162x122 | 3x96x96 | 6/0 | 86940 | 0 | 175.10K | Resizer bilinear | in: cxhxw out: cxhxw |
| 10 | CONV_2D_0_4_fusion | conv_fusion_conv_active | 3x96x96, 16x3x3x3, 16 | 16x48x48 | 7/0, 8/0, 9/0 | 64960 | 0 | 995.33K | F 16x3x3x3 S 2x2 D 1x1 G 1 M 1 P (0, 1)x(0, 1) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 13 | DEPTHWISE_CONV_2D_0_5_fusion | conv_fusion_conv_active | 16x48x48, 16x1x3x3, 16 | 16x48x48 | 10/0, 11/0, 12/0 | 73888 | 0 | 331.78K | F 16x1x3x3 S 1x1 D 1x1 G 16 M 1 P (1, 1)x(1, 1) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 16 | CONV_2D_0_6 | conv2d | 16x48x48, 8x16x1x1, 8 | 8x48x48 | 13/0, 14/0, 15/0 | 55432 | 0 | 294.91K | F 8x16x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 19 | CONV_2D_0_7_fusion | conv_fusion_conv_active | 8x48x48, 48x8x1x1, 48 | 48x48x48 | 16/0, 17/0, 18/0 | 129456 | 0 | 884.74K | F 48x8x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 22 | DEPTHWISE_CONV_2D_0_9_fusion | conv_fusion_conv_active | 48x48x48, 48x1x3x3, 48 | 48x24x24 | 19/0, 20/0, 21/0 | 138720 | 0 | 248.83K | F 48x1x3x3 S 2x2 D 1x1 G 48 M 1 P (0, 1)x(0, 1) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 25 | CONV_2D_0_10 | conv2d | 48x24x24, 8x48x1x1, 8 | 8x24x24 | 22/0, 23/0, 24/0 | 32648 | 0 | 221.18K | F 8x48x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 28 | CONV_2D_0_11_fusion | conv_fusion_conv_active | 8x24x24, 48x8x1x1, 48 | 48x24x24 | 25/0, 26/0, 27/0 | 32688 | 0 | 221.18K | F 48x8x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 31 | DEPTHWISE_CONV_2D_0_12_fusion | conv_fusion_conv_active | 48x24x24, 48x1x3x3, 48 | 48x24x24 | 28/0, 29/0, 30/0 | 60384 | 0 | 248.83K | F 48x1x3x3 S 1x1 D 1x1 G 48 M 1 P (1, 1)x(1, 1) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 34 | CONV_2D_0_13 | conv2d | 48x24x24, 8x48x1x1, 8 | 8x24x24 | 31/0, 32/0, 33/0 | 37256 | 0 | 221.18K | F 8x48x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 35 | ADD_0_14 | add | 8x24x24, 8x24x24 | 8x24x24 | 25/0, 34/0 | 13824 | 0 | 9.22K | add | in: none out: none |
| 38 | CONV_2D_0_15_fusion | conv_fusion_conv_active | 8x24x24, 48x8x1x1, 48 | 48x24x24 | 35/0, 36/0, 37/0 | 32688 | 0 | 221.18K | F 48x8x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 41 | DEPTHWISE_CONV_2D_0_17_fusion | conv_fusion_conv_active | 48x24x24, 48x1x3x3, 48 | 48x12x12 | 38/0, 39/0, 40/0 | 35040 | 0 | 62.21K | F 48x1x3x3 S 2x2 D 1x1 G 48 M 1 P (0, 1)x(0, 1) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 44 | CONV_2D_0_18 | conv2d | 48x12x12, 16x48x1x1, 16 | 16x12x12 | 41/0, 42/0, 43/0 | 10000 | 0 | 110.59K | F 16x48x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 47 | CONV_2D_0_19_fusion | conv_fusion_conv_active | 16x12x12, 96x16x1x1, 96 | 96x12x12 | 44/0, 45/0, 46/0 | 17760 | 0 | 221.18K | F 96x16x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 50 | DEPTHWISE_CONV_2D_0_20_fusion | conv_fusion_conv_active | 96x12x12, 96x1x3x3, 96 | 96x12x12 | 47/0, 48/0, 49/0 | 30912 | 0 | 124.42K | F 96x1x3x3 S 1x1 D 1x1 G 96 M 1 P (1, 1)x(1, 1) zero, Activation relu6 | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw |
| 53 | CONV_2D_0_21 | conv2d | 96x12x12, 16x96x1x1, 16 | 16x12x12 | 50/0, 51/0 | 19984 | 0 | 221.18K | F 16x96x1x1 S 1x1 D 1x1 G 1 M 1 P (0, 0)x(0, 0) | in: cxhxw,out_cxin_cxhxw,out_c out: cxhxw
52/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 54 | ADD_0_22 | add | 16x12x12 | 16x12x12 | 44/0 | 6912 | 0 | 4.61K | add | in: none out: none | | | | | 16x12x12 | | 53/0 | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 57 | CONV_2D_0_23_fusion | conv_fusion_conv_active | 16x12x12 | 96x12x12 | 54/0 | 17760 | 0 | 221.18K | F 96x16x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 96x16x1x1 | | 55/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 96 | | 56/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 60 | DEPTHWISE_CONV_2D_0_24_f | conv_fusion_conv_active | 96x12x12 | 96x12x12 | 57/0 | 30912 | 0 | 124.42K | F 96x1x3x3 S 1x1 D 1x1 G | in: cxhxw,out_cxin_cxhx | | | usion | | 96x1x3x3 | | 58/0 | | | | 96 M 1 P (1, 1)x(1, 1) | w,out_c out: cxhxw | | | | | 96 | | 59/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 63 | CONV_2D_0_25 | conv2d | 96x12x12 | 16x12x12 | 60/0 | 19984 | 0 | 221.18K | F 16x96x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 16x96x1x1 | | 61/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 16 | | 62/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 64 | ADD_0_26 | add | 16x12x12 | 16x12x12 | 54/0 | 6912 | 0 | 4.61K | add | in: 
none out: none | | | | | 16x12x12 | | 63/0 | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 67 | CONV_2D_0_27_fusion | conv_fusion_conv_active | 16x12x12 | 96x12x12 | 64/0 | 17760 | 0 | 221.18K | F 96x16x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 96x16x1x1 | | 65/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 96 | | 66/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 70 | DEPTHWISE_CONV_2D_0_29_f | conv_fusion_conv_active | 96x12x12 | 96x6x6 | 67/0 | 18240 | 0 | 31.10K | F 96x1x3x3 S 2x2 D 1x1 G | in: cxhxw,out_cxin_cxhx | | | usion | | 96x1x3x3 | | 68/0 | | | | 96 M 1 P (0, 1)x(0, 1) | w,out_c out: cxhxw | | | | | 96 | | 69/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 73 | CONV_2D_0_30 | conv2d | 96x6x6 | 24x6x6 | 70/0 | 6648 | 0 | 82.94K | F 24x96x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 24x96x1x1 | | 71/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 24 | | 72/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 76 | CONV_2D_0_31_fusion | conv_fusion_conv_active | 24x6x6 | 144x6x6 | 73/0 | 9648 | 0 | 124.42K | F 144x24x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 144x24x1x1 | | 74/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 144 | | 75/0 | | | | zero, Activation relu6 | | 
+------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 79 | DEPTHWISE_CONV_2D_0_32_f | conv_fusion_conv_active | 144x6x6 | 144x6x6 | 76/0 | 12672 | 0 | 46.66K | F 144x1x3x3 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | usion | | 144x1x3x3 | | 77/0 | | | | G 144 M 1 P (1, 1)x(1, | w,out_c out: cxhxw | | | | | 144 | | 78/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 82 | CONV_2D_0_33 | conv2d | 144x6x6 | 24x6x6 | 79/0 | 10392 | 0 | 124.42K | F 24x144x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 24x144x1x1 | | 80/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 24 | | 81/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 83 | ADD_0_34 | add | 24x6x6 | 24x6x6 | 73/0 | 2592 | 0 | 1.73K | add | in: none out: none | | | | | 24x6x6 | | 82/0 | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 86 | CONV_2D_0_35_fusion | conv_fusion_conv_active | 24x6x6 | 144x6x6 | 83/0 | 9648 | 0 | 124.42K | F 144x24x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 144x24x1x1 | | 84/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 144 | | 85/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 89 | DEPTHWISE_CONV_2D_0_36_f | conv_fusion_conv_active | 144x6x6 | 144x6x6 | 86/0 | 
12672 | 0 | 46.66K | F 144x1x3x3 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | usion | | 144x1x3x3 | | 87/0 | | | | G 144 M 1 P (1, 1)x(1, | w,out_c out: cxhxw | | | | | 144 | | 88/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 92 | CONV_2D_0_37 | conv2d | 144x6x6 | 24x6x6 | 89/0 | 10392 | 0 | 124.42K | F 24x144x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 24x144x1x1 | | 90/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 24 | | 91/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 93 | ADD_0_38 | add | 24x6x6 | 24x6x6 | 83/0 | 2592 | 0 | 1.73K | add | in: none out: none | | | | | 24x6x6 | | 92/0 | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 96 | CONV_2D_0_39_fusion | conv_fusion_conv_active | 24x6x6 | 144x6x6 | 93/0 | 9648 | 0 | 124.42K | F 144x24x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 144x24x1x1 | | 94/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 144 | | 95/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 99 | DEPTHWISE_CONV_2D_0_40_f | conv_fusion_conv_active | 144x6x6 | 144x6x6 | 96/0 | 12672 | 0 | 46.66K | F 144x1x3x3 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | usion | | 144x1x3x3 | | 97/0 | | | | G 144 M 1 P (1, 1)x(1, | w,out_c out: cxhxw | | | | | 144 | | 98/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | 
+------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 102 | CONV_2D_0_41 | conv2d | 144x6x6 | 24x6x6 | 99/0 | 10392 | 0 | 124.42K | F 24x144x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 24x144x1x1 | | 100/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 24 | | 101/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 103 | ADD_0_42 | add | 24x6x6 | 24x6x6 | 93/0 | 2592 | 0 | 1.73K | add | in: none out: none | | | | | 24x6x6 | | 102/0 | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 106 | CONV_2D_0_43_fusion | conv_fusion_conv_active | 24x6x6 | 144x6x6 | 103/0 | 9648 | 0 | 124.42K | F 144x24x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 144x24x1x1 | | 104/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 144 | | 105/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 109 | DEPTHWISE_CONV_2D_0_44_f | conv_fusion_conv_active | 144x6x6 | 144x6x6 | 106/0 | 11808 | 0 | 46.66K | F 144x1x3x3 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | usion | | 144x1x3x3 | | 107/0 | | | | G 144 M 1 P (1, 1)x(1, | w,out_c out: cxhxw | | | | | 144 | | 108/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 112 | CONV_2D_0_45 | conv2d | 144x6x6 | 32x6x6 | 109/0 | 10976 | 0 | 
165.89K | F 32x144x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 32x144x1x1 | | 110/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 32 | | 111/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 115 | CONV_2D_0_46_fusion | conv_fusion_conv_active | 32x6x6 | 192x6x6 | 112/0 | 14400 | 0 | 221.18K | F 192x32x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 192x32x1x1 | | 113/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 192 | | 114/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 118 | DEPTHWISE_CONV_2D_0_47_f | conv_fusion_conv_active | 192x6x6 | 192x6x6 | 115/0 | 16896 | 0 | 62.21K | F 192x1x3x3 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | usion | | 192x1x3x3 | | 116/0 | | | | G 192 M 1 P (1, 1)x(1, | w,out_c out: cxhxw | | | | | 192 | | 117/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 121 | CONV_2D_0_48 | conv2d | 192x6x6 | 32x6x6 | 118/0 | 15392 | 0 | 221.18K | F 32x192x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 32x192x1x1 | | 119/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 32 | | 120/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 122 | ADD_0_49 | add | 32x6x6 | 32x6x6 | 112/0 | 3456 | 0 | 2.30K | add | in: none out: none | | | | | 32x6x6 | | 121/0 | | | | | | 
+------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 125 | CONV_2D_0_50_fusion | conv_fusion_conv_active | 32x6x6 | 192x6x6 | 122/0 | 14400 | 0 | 221.18K | F 192x32x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 192x32x1x1 | | 123/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 192 | | 124/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 128 | DEPTHWISE_CONV_2D_0_51_f | conv_fusion_conv_active | 192x6x6 | 192x6x6 | 125/0 | 16896 | 0 | 62.21K | F 192x1x3x3 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | usion | | 192x1x3x3 | | 126/0 | | | | G 192 M 1 P (1, 1)x(1, | w,out_c out: cxhxw | | | | | 192 | | 127/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 131 | CONV_2D_0_52 | conv2d | 192x6x6 | 32x6x6 | 128/0 | 15392 | 0 | 221.18K | F 32x192x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 32x192x1x1 | | 129/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 32 | | 130/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 132 | ADD_0_53 | add | 32x6x6 | 32x6x6 | 122/0 | 3456 | 0 | 2.30K | add | in: none out: none | | | | | 32x6x6 | | 131/0 | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 135 | CONV_2D_0_54_fusion | conv_fusion_conv_active | 32x6x6 | 
192x6x6 | 132/0 | 14400 | 0 | 221.18K | F 192x32x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 192x32x1x1 | | 133/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 192 | | 134/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 138 | DEPTHWISE_CONV_2D_0_56_f | conv_fusion_conv_active | 192x6x6 | 192x3x3 | 135/0 | 10560 | 0 | 15.55K | F 192x1x3x3 S 2x2 D 1x1 | in: cxhxw,out_cxin_cxhx | | | usion | | 192x1x3x3 | | 136/0 | | | | G 192 M 1 P (0, 1)x(0, | w,out_c out: cxhxw | | | | | 192 | | 137/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 141 | CONV_2D_0_57 | conv2d | 192x3x3 | 56x3x3 | 138/0 | 13040 | 0 | 96.77K | F 56x192x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 56x192x1x1 | | 139/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 56 | | 140/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 144 | CONV_2D_0_58_fusion | conv_fusion_conv_active | 56x3x3 | 336x3x3 | 141/0 | 22680 | 0 | 169.34K | F 336x56x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 336x56x1x1 | | 142/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 336 | | 143/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 147 | DEPTHWISE_CONV_2D_0_59_f | conv_fusion_conv_active | 336x3x3 | 336x3x3 | 144/0 | 9912 | 0 | 27.22K | F 336x1x3x3 S 1x1 D 1x1 | in: 
cxhxw,out_cxin_cxhx | | | usion | | 336x1x3x3 | | 145/0 | | | | G 336 M 1 P (1, 1)x(1, | w,out_c out: cxhxw | | | | | 336 | | 146/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 150 | CONV_2D_0_60 | conv2d | 336x3x3 | 56x3x3 | 147/0 | 22904 | 0 | 169.34K | F 56x336x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 56x336x1x1 | | 148/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 56 | | 149/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 151 | ADD_0_61 | add | 56x3x3 | 56x3x3 | 141/0 | 1512 | 0 | 1.01K | add | in: none out: none | | | | | 56x3x3 | | 150/0 | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 154 | CONV_2D_0_62_fusion | conv_fusion_conv_active | 56x3x3 | 336x3x3 | 151/0 | 22680 | 0 | 169.34K | F 336x56x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 336x56x1x1 | | 152/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 336 | | 153/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 157 | DEPTHWISE_CONV_2D_0_63_f | conv_fusion_conv_active | 336x3x3 | 336x3x3 | 154/0 | 9912 | 0 | 27.22K | F 336x1x3x3 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | usion | | 336x1x3x3 | | 155/0 | | | | G 336 M 1 P (1, 1)x(1, | w,out_c out: cxhxw | | | | | 336 | | 156/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | 
+------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 160 | CONV_2D_0_64 | conv2d | 336x3x3 | 56x3x3 | 157/0 | 22904 | 0 | 169.34K | F 56x336x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 56x336x1x1 | | 158/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 56 | | 159/0 | | | | zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 161 | ADD_0_65 | add | 56x3x3 | 56x3x3 | 151/0 | 1512 | 0 | 1.01K | add | in: none out: none | | | | | 56x3x3 | | 160/0 | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 164 | CONV_2D_0_66_fusion | conv_fusion_conv_active | 56x3x3 | 336x3x3 | 161/0 | 22680 | 0 | 169.34K | F 336x56x1x1 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 336x56x1x1 | | 162/0 | | | | G 1 M 1 P (0, 0)x(0, 0) | w,out_c out: cxhxw | | | | | 336 | | 163/0 | | | | zero, Activation relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 167 | DEPTHWISE_CONV_2D_0_67_f | conv_fusion_conv_active | 336x3x3 | 336x3x3 | 164/0 | 9408 | 0 | 27.22K | F 336x1x3x3 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | usion | | 336x1x3x3 | | 165/0 | | | | G 336 M 1 P (1, 1)x(1, | w,out_c out: cxhxw | | | | | 336 | | 166/0 | | | | 1) zero, Activation | | | | | | | | | | | | relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 170 | CONV_2D_0_68 | conv2d | 336x3x3 | 112x3x3 | 167/0 | 41776 | 0 | 
338.69K | F 112x336x1x1 S 1x1 D | in: cxhxw,out_cxin_cxhx | | | | | 112x336x1x1 | | 168/0 | | | | 1x1 G 1 M 1 P (0, 0)x(0, | w,out_c out: cxhxw | | | | | 112 | | 169/0 | | | | 0) zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 173 | CONV_2D_0_69_fusion | conv_fusion_conv_active | 112x3x3 | 1280x3x3 | 170/0 | 157168 | 0 | 1.29M | F 1280x112x1x1 S 1x1 D | in: cxhxw,out_cxin_cxhx | | | | | 1280x112x1x1 | | 171/0 | | | | 1x1 G 1 M 1 P (0, 0)x(0, | w,out_c out: cxhxw | | | | | 1280 | | 172/0 | | | | 0) zero, Activation | | | | | | | | | | | | relu6 | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 176 | DEPTHWISE_CONV_2D_0_70 | conv2d | 1280x3x3 | 1280x1x1 | 173/0 | 25600 | 0 | 11.52K | F 1280x1x3x3 S 1x1 D 1x1 | in: cxhxw,out_cxin_cxhx | | | | | 1280x1x3x3 | | 174/0 | | | | G 1280 M 1 P (0, 0)x(0, | w,out_c out: cxhxw | | | | | 1280 | | 175/0 | | | | 0) zero | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 179 | CONV_2D_0_71_fusion | conv_fusion_conv_active | 1280x1x1 | 32x1x1 | 176/0 | 42304 | 0 | 40.96K | F 32x1280x1x1 S 1x1 D | in: cxhxw,out_cxin_cxhx | | | | | 32x1280x1x1 | | 177/0 | | | | 1x1 G 1 M 1 P (0, 0)x(0, | w,out_c out: cxhxw | | | | | 32 | | 178/0 | | | | 0) zero, Activation relu | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 180 | MEAN_0_72 | global_average_pool | 32x1x1 | 32 | 179/0 | 64 | 0 | 33 | average A [1, 2] | in: none out: none | 
+------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 183 | FULLY_CONNECTED_0_73 | linear | 32 | 2 | 180/0 | 100 | 0 | 64 | F 2x32 B 1 | in: none out: none | | | | | 2x32 | | 181/0 | | | | | | | | | | 2 | | 182/0 | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 184 | SOFTMAX_0_74 | softmax | 2 | 2 | 183/0 | 4 | 0 | 4 | Beta 0.0 Axis 0 Op: | in: none out: none | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | 185 | output_1 | output | 2 | 2 | 184/0 | 2 | 0 | | O 2 | in: none out: none | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | | Totals (#) | | | | | 157168 | 442954 | 10.99M | | | | | Max active/Total params | | | | | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ | | Totals (#) | | | | | | 600122 | 10.99M | | | | | Max mem usage | | | | | | | | | | +------+--------------------------+-------------------------+--------------+-------------+--------+--------+--------+---------+--------------------------+-------------------------+ save_state - saved state to BUILD_MODEL_SQ8BIT/classification.json echo "GENERATING AUTOTILER MODEL" GENERATING AUTOTILER MODEL nntool -g -M BUILD_MODEL_SQ8BIT -m classificationModel.c -T BUILD_MODEL_SQ8BIT/tensors -H classificationInfo.h BUILD_MODEL_SQ8BIT/classification.json /usr/local/lib/python3.8/dist-packages/sklearn/utils/multiclass.py:13: 
DeprecationWarning: Please use `spmatrix` from the `scipy.sparse` namespace, the `scipy.sparse.base` namespace is deprecated.
  from scipy.sparse.base import spmatrix
/usr/local/lib/python3.8/dist-packages/sklearn/linear_model/least_angle.py:30: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here. Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  method='lar', copy_X=True, eps=np.finfo(np.float).eps,
/usr/local/lib/python3.8/dist-packages/sklearn/utils/optimize.py:18: DeprecationWarning: Please use `line_search_wolfe2` from the `scipy.optimize` namespace, the `scipy.optimize.linesearch` namespace is deprecated.
  from scipy.optimize.linesearch import line_search_wolfe2, line_search_wolfe1
/usr/local/lib/python3.8/dist-packages/sklearn/utils/optimize.py:18: DeprecationWarning: Please use `line_search_wolfe1` from the `scipy.optimize` namespace, the `scipy.optimize.linesearch` namespace is deprecated.
  from scipy.optimize.linesearch import line_search_wolfe2, line_search_wolfe1
settings - set log level to INFO
log_level - was: 'INFO' now: 'INFO'
open - opening graph file BUILD_MODEL_SQ8BIT/classification.tflite load_quantization = True
tflite - Importing TFLITE model version 3
quantize_mixin - removing (de)quantize node QUANTIZE_0_0 with no effect
quantize_mixin - removing (de)quantize node QUANTIZE_0_75 with no effect
nngraph - update graph dimensions
nngraph - update graph dimensions
/gap_sdk/tools/nntool/nntool/quantization/multiplicative/quantizers/filter_mult.py:241: RuntimeWarning: divide by zero encountered in log2
  biases_bits = np.ceil(np.log2(np.abs(biases_q.quantize(biases_node.dqvalue))))
unified_quantization_handler - indicating change of FULLY_CONNECTED_0_73 input from c, out_cin_c, out_c to chw, out_cin_chw, out_c order - rerun adjust command
unified_quantization_handler - indicating change of FULLY_CONNECTED_0_73 output from c to chw order - rerun adjust command
nngraph - update graph dimensions
nngraph - update graph dimensions
nngraph - update graph dimensions
fuse_pad - adding padding from: PAD_0_8 to filter: DEPTHWISE_CONV_2D_0_9 - has 0 reshapes
fuse_pad - adding padding from: PAD_0_16 to filter: DEPTHWISE_CONV_2D_0_17 - has 0 reshapes
fuse_pad - adding padding from: PAD_0_28 to filter: DEPTHWISE_CONV_2D_0_29 - has 0 reshapes
fuse_pad - adding padding from: PAD_0_55 to filter: DEPTHWISE_CONV_2D_0_56 - has 0 reshapes
nngraph - update graph dimensions
debug - was: False now: True
adjust_order - adding transposes to correct tensor order for AT kernels
global_pool - global pool fusion MEAN_0_72: inserting transpose before operation
eliminate_transposes - found elimination for CONV_2D_0_10_trans_in0 upwards - 2 eliminated
eliminate_transposes - found elimination for CONV_2D_0_10_trans_in1 upwards - 1 eliminated
eliminate_transposes - found elimination for
CONV_2D_0_10_trans_out0 downwards - 4 eliminated eliminate_transposes - found elimination for CONV_2D_0_11_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_11_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_13_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_13_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_15_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_15_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_18_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_18_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_18_trans_out0 downwards - 6 eliminated eliminate_transposes - found elimination for CONV_2D_0_19_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_19_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_21_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_21_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_23_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_23_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_25_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_25_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_27_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_27_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_2_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_2_trans_in1 upwards - 1 eliminated eliminate_transposes - 
found elimination for CONV_2D_0_2_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_30_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_30_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_30_trans_out0 downwards - 8 eliminated eliminate_transposes - found elimination for CONV_2D_0_31_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_31_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_33_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_33_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_35_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_35_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_37_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_37_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_39_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_39_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_41_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_41_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_43_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_43_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_45_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_45_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_45_trans_out0 downwards - 6 eliminated eliminate_transposes - found elimination for CONV_2D_0_46_trans_in1 upwards - 1 eliminated 
eliminate_transposes - found elimination for CONV_2D_0_46_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_48_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_48_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_4_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_4_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_4_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_50_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_50_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_52_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_52_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_54_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_54_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_57_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_57_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_57_trans_out0 downwards - 6 eliminated eliminate_transposes - found elimination for CONV_2D_0_58_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_58_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_60_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_60_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_62_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_62_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_64_trans_in0 
upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_64_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_66_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_66_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_68_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_68_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_68_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_69_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_69_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_6_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_6_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_6_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_71_trans_in0 upwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_71_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_71_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for CONV_2D_0_7_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for CONV_2D_0_7_trans_out0 downwards - 2 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_12_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_17_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_1_trans_in0 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_1_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_20_trans_in1 upwards - 1 eliminated 
eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_24_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_29_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_32_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_36_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_40_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_44_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_47_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_51_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_56_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_59_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_5_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_63_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_67_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_70_trans_in1 upwards - 1 eliminated eliminate_transposes - found elimination for DEPTHWISE_CONV_2D_0_9_trans_in1 upwards - 1 eliminated eliminate_transposes - eliminate transposes nngraph - update graph dimensions nngraph - update graph dimensions eliminate_transposes - no transposes to eliminate found nngraph - update graph dimensions nngraph - update graph dimensions eliminate_transposes - no further transpose sequences found nngraph - adjusted order nngraph - update graph dimensions remove_noops - removing QUANTIZE_0_0 that does nothing remove_noops - removing QUANTIZE_0_75 that does nothing matcher - ++ fusion 
remove_noops modified graph fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_1,DEPTHWISE_CONV_2D_0_1_activation into DEPTHWISE_CONV_2D_0_1_fusion fuse_gap_convs - fusing nodes CONV_2D_0_4,CONV_2D_0_4_activation into CONV_2D_0_4_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_5,DEPTHWISE_CONV_2D_0_5_activation into DEPTHWISE_CONV_2D_0_5_fusion fuse_gap_convs - fusing nodes CONV_2D_0_7,CONV_2D_0_7_activation into CONV_2D_0_7_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_9,DEPTHWISE_CONV_2D_0_9_activation into DEPTHWISE_CONV_2D_0_9_fusion fuse_gap_convs - fusing nodes CONV_2D_0_11,CONV_2D_0_11_activation into CONV_2D_0_11_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_12,DEPTHWISE_CONV_2D_0_12_activation into DEPTHWISE_CONV_2D_0_12_fusion fuse_gap_convs - fusing nodes CONV_2D_0_15,CONV_2D_0_15_activation into CONV_2D_0_15_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_17,DEPTHWISE_CONV_2D_0_17_activation into DEPTHWISE_CONV_2D_0_17_fusion fuse_gap_convs - fusing nodes CONV_2D_0_19,CONV_2D_0_19_activation into CONV_2D_0_19_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_20,DEPTHWISE_CONV_2D_0_20_activation into DEPTHWISE_CONV_2D_0_20_fusion fuse_gap_convs - fusing nodes CONV_2D_0_23,CONV_2D_0_23_activation into CONV_2D_0_23_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_24,DEPTHWISE_CONV_2D_0_24_activation into DEPTHWISE_CONV_2D_0_24_fusion fuse_gap_convs - fusing nodes CONV_2D_0_27,CONV_2D_0_27_activation into CONV_2D_0_27_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_29,DEPTHWISE_CONV_2D_0_29_activation into DEPTHWISE_CONV_2D_0_29_fusion fuse_gap_convs - fusing nodes CONV_2D_0_31,CONV_2D_0_31_activation into CONV_2D_0_31_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_32,DEPTHWISE_CONV_2D_0_32_activation into DEPTHWISE_CONV_2D_0_32_fusion fuse_gap_convs - fusing nodes CONV_2D_0_35,CONV_2D_0_35_activation into CONV_2D_0_35_fusion fuse_gap_convs - fusing nodes 
DEPTHWISE_CONV_2D_0_36,DEPTHWISE_CONV_2D_0_36_activation into DEPTHWISE_CONV_2D_0_36_fusion fuse_gap_convs - fusing nodes CONV_2D_0_39,CONV_2D_0_39_activation into CONV_2D_0_39_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_40,DEPTHWISE_CONV_2D_0_40_activation into DEPTHWISE_CONV_2D_0_40_fusion fuse_gap_convs - fusing nodes CONV_2D_0_43,CONV_2D_0_43_activation into CONV_2D_0_43_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_44,DEPTHWISE_CONV_2D_0_44_activation into DEPTHWISE_CONV_2D_0_44_fusion fuse_gap_convs - fusing nodes CONV_2D_0_46,CONV_2D_0_46_activation into CONV_2D_0_46_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_47,DEPTHWISE_CONV_2D_0_47_activation into DEPTHWISE_CONV_2D_0_47_fusion fuse_gap_convs - fusing nodes CONV_2D_0_50,CONV_2D_0_50_activation into CONV_2D_0_50_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_51,DEPTHWISE_CONV_2D_0_51_activation into DEPTHWISE_CONV_2D_0_51_fusion fuse_gap_convs - fusing nodes CONV_2D_0_54,CONV_2D_0_54_activation into CONV_2D_0_54_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_56,DEPTHWISE_CONV_2D_0_56_activation into DEPTHWISE_CONV_2D_0_56_fusion fuse_gap_convs - fusing nodes CONV_2D_0_58,CONV_2D_0_58_activation into CONV_2D_0_58_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_59,DEPTHWISE_CONV_2D_0_59_activation into DEPTHWISE_CONV_2D_0_59_fusion fuse_gap_convs - fusing nodes CONV_2D_0_62,CONV_2D_0_62_activation into CONV_2D_0_62_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_63,DEPTHWISE_CONV_2D_0_63_activation into DEPTHWISE_CONV_2D_0_63_fusion fuse_gap_convs - fusing nodes CONV_2D_0_66,CONV_2D_0_66_activation into CONV_2D_0_66_fusion fuse_gap_convs - fusing nodes DEPTHWISE_CONV_2D_0_67,DEPTHWISE_CONV_2D_0_67_activation into DEPTHWISE_CONV_2D_0_67_fusion fuse_gap_convs - fusing nodes CONV_2D_0_69,CONV_2D_0_69_activation into CONV_2D_0_69_fusion fuse_gap_convs - fusing nodes CONV_2D_0_71,CONV_2D_0_71_activation into CONV_2D_0_71_fusion matcher - 
++ fusion fuse_gap_convs modified graph matcher - ++ fusion scaled_match_group modified graph /gap_sdk/tools/nntool/nntool/quantization/multiplicative/quantizers/filter_mult.py:241: RuntimeWarning: divide by zero encountered in log2 biases_bits = np.ceil(np.log2(np.abs(biases_q.quantize(biases_node.dqvalue)))) nngraph - update graph dimensions adjust_order - adding transposes to correct tensor order for AT kernels eliminate_transposes - no transposes to eliminate found nngraph - update graph dimensions nngraph - update graph dimensions eliminate_transposes - no further transpose sequences found nngraph - adjusted order generator - Saving model to BUILD_MODEL_SQ8BIT/classificationModel.c generator - Writing constants to BUILD_MODEL_SQ8BIT echo "COMPILING AUTOTILER MODEL" COMPILING AUTOTILER MODEL gcc -g -o BUILD_MODEL_SQ8BIT/GenTile -I. -I/gap_sdk/tools/autotiler_v3/Autotiler -I/gap_sdk/tools/autotiler_v3/Emulation -I/gap_sdk/tools/autotiler_v3/CNN_Generators -I/gap_sdk/tools/autotiler_v3/Generators/BilinearResizes -I/gap_sdk/tools/autotiler_v3/CNN_Libraries -I/gap_sdk/tools/autotiler_v3/CNN_Libraries_fp16 -I/gap_sdk/tools/autotiler_v3/DSP_Generators -I/gap_sdk/tools/autotiler_v3/CNN_Generators_SQ8 -I/gap_sdk/tools/autotiler_v3/Generators/BilinearResizes -I/gap_sdk/tools/autotiler_v3/CNN_Libraries -I/gap_sdk/tools/autotiler_v3/DSP_Libraries -I/gap_sdk/tools/autotiler_v3/CNN_Libraries_fp16 -I/gap_sdk/tools/autotiler_v3/CNN_Libraries_SQ8 /gap_sdk/tools/autotiler_v3/CNN_Generators/CNN_Generator_Util.c /gap_sdk/tools/autotiler_v3/CNN_Generators/CNN_Copy_Generators.c /gap_sdk/tools/autotiler_v3/CNN_Generators/SSD_Generators.c /gap_sdk/tools/autotiler_v3/Generators/BilinearResizes/ResizeGenerator.c /gap_sdk/tools/autotiler_v3/DSP_Generators/DSP_Generators.c /gap_sdk/tools/autotiler_v3/CNN_Generators_SQ8/CNN_Generators_SQ8.c /gap_sdk/tools/autotiler_v3/CNN_Generators_SQ8/RNN_Generators_SQ8.c BUILD_MODEL_SQ8BIT/classificationModel.c 
/gap_sdk/tools/autotiler_v3/Autotiler/LibTile.a -lSDL2 -lSDL2_ttf echo "RUNNING AUTOTILER MODEL" RUNNING AUTOTILER MODEL BUILD_MODEL_SQ8BIT/GenTile -o BUILD_MODEL_SQ8BIT -c BUILD_MODEL_SQ8BIT --L1 46736 --L2 270000 --L3 8000000 Generating Code For User Kernel: S3_Conv2d_1x1x1x1_Relu Generating Code For User Kernel: S6_Conv2d_3x1x1x1 Generating Code For User Kernel: S7_Op_RESIZE_BILINEAR_0_3 Generating Code For User Kernel: S10_Conv2d_16x3x3x3_Relu6 Generating Code For User Kernel: S13_Conv2d_16x1x3x3_Relu6 Generating Code For User Kernel: S16_Conv2d_8x16x1x1 Generating Code For User Kernel: S19_Conv2d_48x8x1x1_Relu6 Generating Code For User Kernel: S22_Conv2d_48x1x3x3_Relu6 Generating Code For User Kernel: S25_Conv2d_8x48x1x1 Generating Code For User Kernel: S28_Conv2d_48x8x1x1_Relu6 Generating Code For User Kernel: S31_Conv2d_48x1x3x3_Relu6 Generating Code For User Kernel: S34_Conv2d_8x48x1x1 Generating Code For User Kernel: S35_MatAdd_8x24x24 Generating Code For User Kernel: S38_Conv2d_48x8x1x1_Relu6 Generating Code For User Kernel: S41_Conv2d_48x1x3x3_Relu6 Generating Code For User Kernel: S44_Conv2d_16x48x1x1 Generating Code For User Kernel: S47_Conv2d_96x16x1x1_Relu6 Generating Code For User Kernel: S50_Conv2d_96x1x3x3_Relu6 Generating Code For User Kernel: S53_Conv2d_16x96x1x1 Generating Code For User Kernel: S54_MatAdd_16x12x12 Generating Code For User Kernel: S57_Conv2d_96x16x1x1_Relu6 Generating Code For User Kernel: S60_Conv2d_96x1x3x3_Relu6 Generating Code For User Kernel: S63_Conv2d_16x96x1x1 Generating Code For User Kernel: S64_MatAdd_16x12x12 Generating Code For User Kernel: S67_Conv2d_96x16x1x1_Relu6 Generating Code For User Kernel: S70_Conv2d_96x1x3x3_Relu6 Generating Code For User Kernel: S73_Conv2d_24x96x1x1 Generating Code For User Kernel: S76_Conv2d_144x24x1x1_Relu6 Generating Code For User Kernel: S79_Conv2d_144x1x3x3_Relu6 Generating Code For User Kernel: S82_Conv2d_24x144x1x1 Generating Code For User Kernel: S83_MatAdd_24x6x6 Generating Code 
For User Kernel: S86_Conv2d_144x24x1x1_Relu6 Generating Code For User Kernel: S89_Conv2d_144x1x3x3_Relu6 Generating Code For User Kernel: S92_Conv2d_24x144x1x1 Generating Code For User Kernel: S93_MatAdd_24x6x6 Generating Code For User Kernel: S96_Conv2d_144x24x1x1_Relu6 Generating Code For User Kernel: S99_Conv2d_144x1x3x3_Relu6 Generating Code For User Kernel: S102_Conv2d_24x144x1x1 Generating Code For User Kernel: S103_MatAdd_24x6x6 Generating Code For User Kernel: S106_Conv2d_144x24x1x1_Relu6 Generating Code For User Kernel: S109_Conv2d_144x1x3x3_Relu6 Generating Code For User Kernel: S112_Conv2d_32x144x1x1 Generating Code For User Kernel: S115_Conv2d_192x32x1x1_Relu6 Generating Code For User Kernel: S118_Conv2d_192x1x3x3_Relu6 Generating Code For User Kernel: S121_Conv2d_32x192x1x1 Generating Code For User Kernel: S122_MatAdd_32x6x6 Generating Code For User Kernel: S125_Conv2d_192x32x1x1_Relu6 Generating Code For User Kernel: S128_Conv2d_192x1x3x3_Relu6 Generating Code For User Kernel: S131_Conv2d_32x192x1x1 Generating Code For User Kernel: S132_MatAdd_32x6x6 Generating Code For User Kernel: S135_Conv2d_192x32x1x1_Relu6 Generating Code For User Kernel: S138_Conv2d_192x1x3x3_Relu6 Generating Code For User Kernel: S141_Conv2d_56x192x1x1 Generating Code For User Kernel: S144_Conv2d_336x56x1x1_Relu6 Generating Code For User Kernel: S147_Conv2d_336x1x3x3_Relu6 Generating Code For User Kernel: S150_Conv2d_56x336x1x1 Generating Code For User Kernel: S151_MatAdd_56x3x3 Generating Code For User Kernel: S154_Conv2d_336x56x1x1_Relu6 Generating Code For User Kernel: S157_Conv2d_336x1x3x3_Relu6 Generating Code For User Kernel: S160_Conv2d_56x336x1x1 Generating Code For User Kernel: S161_MatAdd_56x3x3 Generating Code For User Kernel: S164_Conv2d_336x56x1x1_Relu6 Generating Code For User Kernel: S167_Conv2d_336x1x3x3_Relu6 Generating Code For User Kernel: S170_Conv2d_112x336x1x1 Generating Code For User Kernel: S173_Conv2d_1280x112x1x1_Relu6 Generating Code For User Kernel: 
S176_Conv2d_1280x1x3x3 Generating Code For User Kernel: S179_Conv2d_32x1280x1x1_Relu Generating Code For User Kernel: S180_Op_MEAN_0_72 Generating Code For User Kernel: S183_Linear_2x32 Generating Code For User Kernel: S184_SoftMax Flash image classification_L3_Flash_Const.dat (size 485864) for device AT_MEM_L3_HFLASH successfuly generated Shared L1 Memory size (Bytes) : Given: 46736, Used: 46696 L2 Memory size (Bytes) : Given: 270000, Used: 270000 HyperRam Memory size (Bytes) : Given: 8000000, Used: 461168 HyperFlash Memory size (Bytes) : Given: 67108864, Used: 485864 L3 Memory bandwidth for 1 graph run : 414865 Bytes L2 Memory bandwidth for 1 graph run : 1883361 Bytes Sum of all Kernels arguments size : 1877793 Bytes Tiling Bandwith overhead : 1.002965 Move/KerArgSize Sum of baseline bandwidth : 18132056 Bytes Percentage of baseline BW for L2 : 10.3869 % Percentage of baseline BW for L3 : 2.28802 % Sum of all Kernels operations : 11345090 Operations Total amount of flash coefficients : 485864 Bytes Basic kernels library : Gap.h : classification.h : CNN_BasicKernels_SQ8.h : ResizeBasicKernels.h Output Directory : BUILD_MODEL_SQ8BIT The following files have been generated: classificationKernels.c Generated C code for the user kernels and the user kernels groups classificationKernels.h Header file for the generated C code classification_L3_Flash_Const.dat Flash content for Graph constants CC classification.c classification.c: In function 'RunNetwork': classification.c:68:4: warning: pointer targets in passing argument 1 of 'classificationCNN' differ in signedness [-Wpointer-sign] (cameraBuffer, Output_1); ^~~~~~~~~~~~ In file included from classification.c:32:0: /tmp/bitcraze/aideck-gap8-examples/examples/ai/classification/BUILD_MODEL_SQ8BIT/classificationKernels.h:535:12: note: expected 'signed char * restrict' but argument is of type 'unsigned char *' extern int classificationCNN( ^~~~~~~~~~~~~~~~~ CC com.c CC cpx.c CC classificationKernels.c CC ImgIO.c 
/gap_sdk/libs/gap_lib/img_io/ImgIO.c: In function 'ReadImageFromFile':
/gap_sdk/libs/gap_lib/img_io/ImgIO.c:347:5: warning: 'ImageShort' may be used uninitialized in this function [-Wmaybe-uninitialized]
     if (ImageShort){
     ^
CC SSD_BasicKernels.c
CC ResizeBasicKernels.c
CC CNN_Copy.c
CC CNN_AT_Misc.c
CC CmplxFunctions.c
CC MatMulDSP.c
CC FFT_Library.c
CC MfccBasicKernels.c
CC PreProcessing.c
CC math_funcs.c
CC plp_cos_f32s_xpulpv2.c
CC plp_sin_f32s_xpulpv2.c
CC plp_common_tables.c
CC CNN_Activation_SQ8.c
CC CNN_Activation_HWC_SQ8.c
CC CNN_Bias_Linear_SQ8.c
CC CNN_Conv_SQ8.c
CC CNN_MatMul_Conv_SQ8.c
CC CNN_Pooling_SQ8.c
CC CNN_Conv_DW_SQ8.c
CC CNN_Conv_DW_Red_SQ8.c
CC CNN_MatAlgebra_SQ8.c
CC CNN_SoftMax_SQ8.c
CC RNN_SQ8.c
ASM gap8_iet.S
ASM startup_gap8.S
ASM cluster_core.S
ASM asm_util.S
ASM port_asm.S
CC gap8_it.c
CC system_gap8.c
CC gap_io.c
CC stdlib.c
CC string.c
CC errno.c
CC cl_malloc.c
CC cl_to_fc_delegate.c
CC fc_to_cl_delegate.c
CC cl_team.c
CC hyperbus_cl_internal.c
CC uart_cl_internal.c
CC cl_dma_irq.c
CC fc_event.c
CC fll.c
CC gpio.c
CC pad.c
CC pmu.c
CC pmu_internal.c
CC pwm.c
CC pwm_internal.c
CC rtc.c
CC rtc_internal.c
CC timer.c
CC hyperbus.c
CC hyperbus_internal.c
CC cpi.c
CC cpi_internal.c
CC dmacpy.c
CC dmacpy_internal.c
CC i2c.c
CC i2c_internal.c
CC i2s.c
CC i2s_internal.c
CC spi.c
CC spi_internal.c
CC uart.c
CC uart_internal.c
CC flash.c
CC partition.c
CC flash_partition.c
CC md5.c
CC fs.c
CC lfs.c
CC lfs_util.c
CC pi_lfs.c
CC read_fs.c
CC host_fs.c
CC ota.c
CC ota_utility.c
CC updater.c
CC bootloader_utility.c
CC ai_deck.c
CC camera.c
CC himax.c
CC hyperflash.c
CC hyperram.c
CC spiram.c
CC spiflash.c
/tmp/cco5eVDS.s: Assembler messages:
/tmp/cco5eVDS.s:2865: Warning: ignoring changed section attributes for .data
CC ram.c
CC alloc_extern.c
CC pi_log.c
CC event_kernel.c
CC mem_slab.c
CC cl_l1_malloc.c
CC fc_l1_malloc.c
CC l2_malloc.c
CC malloc_external.c
CC malloc_internal.c
CC pi_malloc.c
CC device.c
CC pmsis_task.c
CC pmsis_backend_native_task_api.c
CC port.c
CC printf.c
CC list.c
CC queue.c
CC tasks.c
CC timers.c
CC event_groups.c
CC stream_buffer.c
CC FreeRTOS_util.c
gapy --target=ai_deck --platform=board --work-dir=/tmp/bitcraze/aideck-gap8-examples/examples/ai/classification/BUILD/GAP8_V2/GCC_RISCV_FREERTOS --config-ini=/tmp/bitcraze/aideck-gap8-examples/examples/ai/classification/config.ini --config-opt=**/flash/content/partitions/readfs/files=/tmp/bitcraze/aideck-gap8-examples/examples/ai/classification/BUILD_MODEL_SQ8BIT/classification_L3_Flash_Const.dat run --image --binary=/tmp/bitcraze/aideck-gap8-examples/examples/ai/classification/BUILD/GAP8_V2/GCC_RISCV_FREERTOS/classification