Removing the last Softmax and FullyConnected layers

We will proceed by looking into the MobileNet V2 structure to understand which layers we need to keep and which we need to remove. We do this by first running the print function on the neural network architecture. The last few layers of the output look as follows:

Op:Pooling, Name=pool6
Inputs:
        arg[0]=relu6_4(0)
Attrs:
        global_pool=True
        kernel=(1, 1)
        pool_type=avg
        pooling_convention=full
Variable:fc7_weight
Variable:fc7_bias
--------------------
Op:Convolution, Name=fc7
Inputs:
        arg[0]=pool6(0)
        arg[1]=fc7_weight(0) version=0
        arg[2]=fc7_bias(0) version=0
Attrs:
        kernel=(1, 1)
        no_bias=False
        num_filter=1000
        pad=(0, 0)
        stride=(1, 1)
--------------------
Op:Flatten, Name=fc7
Inputs:
        arg[0]=fc7(0)
Variable:prob_label
--------------------
Op:SoftmaxOutput, ...
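
As a minimal sketch of how this inspection and truncation might be done with MXNet.jl (assuming that is the framework in use; the checkpoint file names are placeholders, and the pool6 layer name is taken from the printout above):

using MXNet

# Load the pretrained symbol and its parameters (file names are assumed).
nnet = mx.load("mobilenetv2-symbol.json", mx.SymbolicNode)
nnet_params = mx.load("mobilenetv2-0000.params", mx.NDArray)

# Print the architecture so we can see which layers sit at the end.
print(nnet)

# Keep everything up to the global pooling layer (pool6) and drop the
# final fc7 Convolution, the Flatten, and the SoftmaxOutput layers.
layers = mx.get_internals(nnet)
feature_layer = layers[:pool6_output]

The truncated feature_layer symbol now outputs the pooled features instead of the 1,000-class probabilities, so it can be reused as a feature extractor or extended with a new classifier head.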
