CAR DAMAGE DETECTION USING MASK-RCNN

Abstract

Car damage reporting and penalty calculation have always been a challenging issue for leasing companies and used-car selling companies. This paper deals with the quantitative analysis of such damages, enabling unbiased pricing by using Mask R-CNN, the state-of-the-art technique for instance segmentation. The paper is a business extension of existing technologies to detect and quantify car scratches, addressing the problems faced by the used-car industry and car rental companies. It will help businesses eliminate middlemen and pave the way for a more objective system of pricing and insurance in the vehicle dealership market.

Introduction

Instance segmentation is the task of detecting and delineating each distinct object of interest appearing in an image. Mask R-CNN is the current state-of-the-art technique for highly accurate mask detection of RoIs (Regions of Interest). In this project we train a Mask R-CNN model to detect the effective damage area in an image or video. The project is a business extension of existing technologies to detect car scratches and quantify the damage, in order to tackle the problems faced by the used-car industry and car rental companies in automating the penalties incurred due to such damage.

This Jupyter notebook contains data visualization of car damage images and an automated car damage detection example. First we need to import all the packages, including the custom functions from Matterport's Mask R-CNN repository.

import os
import sys
import itertools
import math
import logging
import json
import re
import random
from collections import OrderedDict
import numpy as np
import matplotlib
import matplotlib.pyplot as plt
import matplotlib.patches as patches
import matplotlib.lines as lines
from matplotlib.patches import Polygon
# Import Mask RCNN
#sys.path.append(ROOT_DIR)  # To find local version of the library
from mrcnn import utils
from mrcnn import visualize
from mrcnn.visualize import display_images
from mrcnn import model
import mrcnn.model as modellib
from mrcnn.model import log
import cv2
import custom, custom_1
import imgaug, h5py, IPython
%matplotlib inline
C:\Users\Sourish\Anaconda3\lib\site-packages\dask\config.py:168: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  data = yaml.load(f.read()) or {}
Using TensorFlow backend.

Setting up the configuration

Setting up the root directory, the data path, the log file path and the model object (weight matrix) for inference (prediction).

# Root directory of the project
ROOT_DIR = os.getcwd()
sys.path.append(ROOT_DIR)  # To find local version of the library
MODEL_DIR = os.path.join(ROOT_DIR, "logs")
custom_WEIGHTS_PATH = "mask_rcnn_scratch_0013.h5"  # TODO: update this path for best performing iteration weights
config = custom.CustomConfig()
custom_DIR = os.path.join(ROOT_DIR, "custom/")
custom_DIR
'C:\\Users\\Sourish\\Mask_RCNN\\custom/'

Loading the Data

# Load dataset
dataset = custom_1.CustomDataset()
dataset.load_custom(custom_DIR, "train")
# Must call before using the dataset
dataset.prepare()
print("Image Count: {}".format(len(dataset.image_ids)))
print("Class Count: {}".format(dataset.num_classes))
for i, info in enumerate(dataset.class_info):
    print("{:3}. {:50}".format(i, info['name']))
Image Count: 49
Class Count: 2
  0. BG                                                
  1. scratch     
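The CustomDataset class lives in the project's custom_1 module and is not shown in this notebook. For context, the sketch below illustrates how load_mask is typically implemented for VIA polygon annotations, following the pattern of Matterport's balloon sample; the actual implementation in custom_1.py may differ.

import numpy as np
import skimage.draw

def load_mask_sketch(image_info):
    # image_info is one entry of dataset.image_info with 'height', 'width'
    # and the VIA 'polygons' (shape_attributes) stored when the image was added
    polygons = image_info['polygons']
    mask = np.zeros((image_info['height'], image_info['width'], len(polygons)), dtype=bool)
    for i, p in enumerate(polygons):
        # fill each annotated polygon to get one boolean mask per damage instance
        rr, cc = skimage.draw.polygon(p['all_points_y'], p['all_points_x'])
        mask[rr, cc, i] = True
    # every instance belongs to the single 'scratch' class (class id 1)
    return mask, np.ones(len(polygons), dtype=np.int32)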

We will visualize a few car damage (scratch) images.

# Load and display random samples
image_ids = np.random.choice(dataset.image_ids, 5)
for image_id in image_ids:
    image = dataset.load_image(image_id)
    mask, class_ids = dataset.load_mask(image_id)
    visualize.display_top_masks(image, mask, class_ids, dataset.class_names)

Mask_RCNN_car_damage_prediction_7_0.png

Mask_RCNN_car_damage_prediction_7_1.png

Mask_RCNN_car_damage_prediction_7_2.png

Mask_RCNN_car_damage_prediction_7_3.png

Mask_RCNN_car_damage_prediction_7_4.png

Next we will see the bounding box (BB) and the annotated damage mask for a typical car image.

image_id = random.choice(dataset.image_ids)
image = dataset.load_image(image_id)
mask, class_ids = dataset.load_mask(image_id)
# Compute Bounding box
bbox = utils.extract_bboxes(mask)
# Display image and additional stats
print("image_id ", image_id, dataset.image_reference(image_id))
log("image", image)
log("mask", mask)
log("class_ids", class_ids)
log("bbox", bbox)
# Display image and instances
visualize.display_instances(image, bbox, mask, class_ids, dataset.class_names)
image_id  32 C:\Users\Sourish\Mask_RCNN\custom/train\image35.jpg
image                    shape: (224, 225, 3)         min:    0.00000  max:  255.00000  uint8
mask                     shape: (224, 225, 2)         min:    0.00000  max:    1.00000  bool
class_ids                shape: (2,)                  min:    1.00000  max:    1.00000  int32
bbox                     shape: (2, 4)                min:   81.00000  max:  199.00000  int32

Mask_RCNN_car_damage_prediction_9_1.png
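The bounding boxes above are derived directly from the instance masks. The helper below is a minimal sketch of that computation for a single mask, equivalent in spirit to utils.extract_bboxes and shown here only for illustration.

import numpy as np

def bbox_from_mask(single_mask):
    # single_mask: boolean array of shape [height, width] for one instance
    rows = np.any(single_mask, axis=1)
    cols = np.any(single_mask, axis=0)
    y1, y2 = np.where(rows)[0][[0, -1]]
    x1, x2 = np.where(cols)[0][[0, -1]]
    # y2/x2 are made exclusive, matching the [y1, x1, y2, x2] convention above
    return np.array([y1, x1, y2 + 1, x2 + 1], dtype=np.int32)

# e.g. bbox_from_mask(mask[:, :, 0]) gives the box of the first damage instance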

We now look at the components of the image annotations. Mainly each annotation has the x and y coordinates of all labeled damages ('polygon') and the class name (here 'scratch') for the respective car image.

#Annotation file load
annotations1 = json.load(open(os.path.join(ROOT_DIR, "via_region_data.json"),encoding="utf8"))
annotations = list(annotations1.values()) 
annotations = [a for a in annotations if a['regions']]
annotations[0]
{'fileref': '',
 'size': 46041,
 'filename': 'image2.jpg',
 'base64_img_data': '',
 'file_attributes': {},
 'regions': {'0': {'shape_attributes': {'name': 'polygon',
    'all_points_x': [428,
     429,
     480,
     518,
     557,
     577,
     610,
     660,
     642,
     578,
     579,
     585,
     590,
     574,
     580,
     516,
     507,
     474,
     427,
     426,
     412,
     412,
     430,
     470,
     452,
     428],
    'all_points_y': [232,
     216,
     198,
     193,
     212,
     238,
     237,
     242,
     248,
     248,
     260,
     292,
     343,
     409,
     417,
     441,
     443,
     427,
     413,
     381,
     324,
     301,
     288,
     249,
     231,
     232]},
   'region_attributes': {'Scratch': 'scratch'}},
  '1': {'shape_attributes': {'name': 'polygon',
    'all_points_x': [470, 500, 578, 718, 670, 594, 553, 510, 469, 448, 470],
    'all_points_y': [516, 548, 562, 557, 569, 595, 587, 600, 576, 552, 516]},
   'region_attributes': {'Scratch': 'scratch'}}}}

If we have to quantify a car damage, we need to know the x and y coordinates of the polygon so that we can calculate the area of the marked/detected damage. Below we extract them for the first damage polygon of the next annotated image.

annotations[1]['regions']['0']['shape_attributes']
l = []
for d in annotations[1]['regions']['0']['shape_attributes'].values():
    l.append(d)
display('x co-ordinates of the damage:', l[1])
display('y co-ordinates of the damage:', l[2])
'x co-ordinates of the damage:'
[293, 360, 349, 308, 293]
'y co-ordinates of the damage:'
[303, 330, 314, 302, 303]
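With these coordinate lists, the annotated damage area (in pixels) can be estimated with the shoelace formula. The helper below is an illustrative sketch and is not part of the original training or inference code.

import numpy as np

def polygon_area(xs, ys):
    # Shoelace formula: 0.5 * |sum(x_i * y_(i+1) - x_(i+1) * y_i)|
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    return 0.5 * np.abs(np.dot(xs, np.roll(ys, -1)) - np.dot(ys, np.roll(xs, -1)))

# e.g. polygon_area(l[1], l[2]) returns the pixel area of the damage polygon shown above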

For prediction, i.e. damage detection, we need to use the model in inference mode. The model configuration consists of important information such as the CNN backbone name ('resnet101'), the ROI detection confidence threshold (0.9, as defined), the configuration description, the weights of the different loss components, the mask shape, WEIGHT_DECAY, etc.

Get Inferences

config = custom.CustomConfig()
ROOT_DIR = 'C:/Users/Sourish/Mask_RCNN'
CUSTOM_DIR = os.path.join(ROOT_DIR + "/custom/")
print(CUSTOM_DIR)
class InferenceConfig(config.__class__):
    # Run detection on one image at a time
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1
config = InferenceConfig()
config.display()
# Device to load the neural network on.
# Useful if you're training a model on the same 
# machine, in which case use CPU and leave the
# GPU for training.
DEVICE = "/cpu:0"  # /cpu:0 or /gpu:0
# Inspect the model in training or inference modes
# values: 'inference' or 'training'
# TODO: code for 'training' test mode not ready yet
TEST_MODE = "inference"
C:/Users/Sourish/Mask_RCNN/custom/
Configurations:
BACKBONE                       resnet101
BACKBONE_STRIDES               [4, 8, 16, 32, 64]
BATCH_SIZE                     1
BBOX_STD_DEV                   [0.1 0.1 0.2 0.2]
COMPUTE_BACKBONE_SHAPE         None
DETECTION_MAX_INSTANCES        100
DETECTION_MIN_CONFIDENCE       0.9
DETECTION_NMS_THRESHOLD        0.3
FPN_CLASSIF_FC_LAYERS_SIZE     1024
GPU_COUNT                      1
GRADIENT_CLIP_NORM             5.0
IMAGES_PER_GPU                 1
IMAGE_CHANNEL_COUNT            3
IMAGE_MAX_DIM                  1024
IMAGE_META_SIZE                14
IMAGE_MIN_DIM                  800
IMAGE_MIN_SCALE                0
IMAGE_RESIZE_MODE              square
IMAGE_SHAPE                    [1024 1024    3]
LEARNING_MOMENTUM              0.9
LEARNING_RATE                  0.001
LOSS_WEIGHTS                   {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0}
MASK_POOL_SIZE                 14
MASK_SHAPE                     [28, 28]
MAX_GT_INSTANCES               100
MEAN_PIXEL                     [123.7 116.8 103.9]
MINI_MASK_SHAPE                (56, 56)
NAME                           damage
NUM_CLASSES                    2
POOL_SIZE                      7
POST_NMS_ROIS_INFERENCE        1000
POST_NMS_ROIS_TRAINING         2000
PRE_NMS_LIMIT                  6000
ROI_POSITIVE_RATIO             0.33
RPN_ANCHOR_RATIOS              [0.5, 1, 2]
RPN_ANCHOR_SCALES              (32, 64, 128, 256, 512)
RPN_ANCHOR_STRIDE              1
RPN_BBOX_STD_DEV               [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD              0.7
RPN_TRAIN_ANCHORS_PER_IMAGE    256
STEPS_PER_EPOCH                100
TOP_DOWN_PYRAMID_SIZE          256
TRAIN_BN                       False
TRAIN_ROIS_PER_IMAGE           200
USE_MINI_MASK                  True
USE_RPN_ROIS                   True
VALIDATION_STEPS               50
WEIGHT_DECAY                   0.0001

Helper functions

To visualize the predicted damage masks and to load the model weights for prediction.

def get_ax(rows=1, cols=1, size=16):
    """Return a Matplotlib Axes array to be used in
    all visualizations in the notebook. Provide a
    central point to control graph sizes.
    Adjust the size attribute to control how big to render images
    """
    _, ax = plt.subplots(rows, cols, figsize=(size*cols, size*rows))
    return ax
from importlib import reload  # was constantly changing the visualization code, so reload the module instead of restarting the notebook
reload(visualize)
# Create model in inference mode
import tensorflow as tf
with tf.device(DEVICE):
    model = modellib.MaskRCNN(mode="inference", model_dir=MODEL_DIR,
                              config=config)
# load the last best model you trained
# weights_path = model.find_last()[1]
custom_WEIGHTS_PATH = 'C:/Users/Sourish/Mask_RCNN/logs/scratch20190612T2046/mask_rcnn_scratch_0013.h5'
# Load weights
print("Loading weights ", custom_WEIGHTS_PATH)
model.load_weights(custom_WEIGHTS_PATH, by_name=True)    
Loading weights  C:/Users/Sourish/Mask_RCNN/logs/scratch20190612T2046/mask_rcnn_scratch_0013.h5
Re-starting from epoch 13

Loading the validation dataset for prediction

dataset = custom_1.CustomDataset()
dataset.load_custom(CUSTOM_DIR,'val')
dataset.prepare()
print('Images: {}\nclasses: {}'.format(len(dataset.image_ids), dataset.class_names))
Images: 6
classes: ['BG', 'scratch']
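Beyond visual inspection, the same validation images can be scored quantitatively. The snippet below is a minimal sketch (not executed in this notebook) of computing mean average precision (mAP) with the standard Matterport utility utils.compute_ap, reusing the model, config and dataset objects defined above.

APs = []
for image_id in dataset.image_ids:
    # load ground truth and run detection on each validation image
    image, image_meta, gt_class_id, gt_bbox, gt_mask = \
        modellib.load_image_gt(dataset, config, image_id, use_mini_mask=False)
    r = model.detect([image], verbose=0)[0]
    AP, precisions, recalls, overlaps = utils.compute_ap(
        gt_bbox, gt_class_id, gt_mask,
        r['rois'], r['class_ids'], r['scores'], r['masks'])
    APs.append(AP)
print("mAP over the validation set:", np.mean(APs))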

Visualize model weights

Weight matrix descriptive statistics (shape, min/max and standard deviation of each tensor)

visualize.display_weight_stats(model)
WEIGHT NAME SHAPE MIN MAX STD
conv1/kernel:0 (7, 7, 3, 64) -0.8616 +0.8451 +0.1315
conv1/bias:0 (64,) -0.0002 +0.0004 +0.0001
bn_conv1/gamma:0 (64,) +0.0835 +2.6411 +0.5091
bn_conv1/beta:0 (64,) -2.3931 +5.3610 +1.9781
bn_conv1/moving_mean:0 (64,) -173.0470 +116.3013 +44.5654
bn_conv1/moving_variance:0 *** Overflow? (64,) +0.0000 +146335.3594 +21847.9668
res2a_branch2a/kernel:0 (1, 1, 64, 64) -0.6574 +0.3179 +0.0764
res2a_branch2a/bias:0 (64,) -0.0022 +0.0082 +0.0018
bn2a_branch2a/gamma:0 (64,) +0.2169 +1.8489 +0.4116
bn2a_branch2a/beta:0 (64,) -2.1180 +3.7332 +1.1786
bn2a_branch2a/moving_mean:0 (64,) -6.1235 +7.2220 +2.2789
bn2a_branch2a/moving_variance:0 (64,) +0.0000 +8.9258 +2.0314
res2a_branch2b/kernel:0 (3, 3, 64, 64) -0.3878 +0.5070 +0.0323
res2a_branch2b/bias:0 (64,) -0.0037 +0.0026 +0.0010
bn2a_branch2b/gamma:0 (64,) +0.3165 +1.7010 +0.3042
bn2a_branch2b/beta:0 (64,) -1.9348 +4.5429 +1.5113
bn2a_branch2b/moving_mean:0 (64,) -6.7752 +4.5769 +2.2594
bn2a_branch2b/moving_variance:0 (64,) +0.0000 +5.5085 +1.0835
res2a_branch2c/kernel:0 (1, 1, 64, 256) -0.4468 +0.3615 +0.0410
res2a_branch2c/bias:0 (256,) -0.0041 +0.0052 +0.0016
res2a_branch1/kernel:0 (1, 1, 64, 256) -0.8674 +0.7588 +0.0703
res2a_branch1/bias:0 (256,) -0.0034 +0.0025 +0.0009
bn2a_branch2c/gamma:0 (256,) -0.5782 +3.1806 +0.6192
bn2a_branch2c/beta:0 (256,) -1.1422 +1.4273 +0.4229
bn2a_branch2c/moving_mean:0 (256,) -4.2602 +3.0864 +1.0168
bn2a_branch2c/moving_variance:0 (256,) +0.0000 +2.6688 +0.3827
bn2a_branch1/gamma:0 (256,) +0.2411 +3.4973 +0.6241
bn2a_branch1/beta:0 (256,) -1.1422 +1.4274 +0.4229
bn2a_branch1/moving_mean:0 (256,) -8.0883 +8.6554 +2.0289
bn2a_branch1/moving_variance:0 (256,) +0.0000 +8.7306 +1.5526
res2b_branch2a/kernel:0 (1, 1, 256, 64) -0.2536 +0.2319 +0.0358
res2b_branch2a/bias:0 (64,) -0.0027 +0.0028 +0.0012
bn2b_branch2a/gamma:0 (64,) +0.2032 +1.7708 +0.3812
bn2b_branch2a/beta:0 (64,) -2.0546 +1.6670 +0.8851
bn2b_branch2a/moving_mean:0 (64,) -1.5484 +1.7334 +0.7177
bn2b_branch2a/moving_variance:0 (64,) +0.0000 +2.7921 +0.7575
res2b_branch2b/kernel:0 (3, 3, 64, 64) -0.5226 +0.3397 +0.0356
res2b_branch2b/bias:0 (64,) -0.0047 +0.0033 +0.0015
bn2b_branch2b/gamma:0 (64,) +0.5213 +1.4725 +0.2252
bn2b_branch2b/beta:0 (64,) -2.4533 +2.7526 +1.1960
bn2b_branch2b/moving_mean:0 (64,) -1.8186 +0.8886 +0.5529
bn2b_branch2b/moving_variance:0 (64,) +0.0808 +1.1064 +0.2187
res2b_branch2c/kernel:0 (1, 1, 64, 256) -0.3382 +0.3298 +0.0415
res2b_branch2c/bias:0 (256,) -0.0075 +0.0103 +0.0020
bn2b_branch2c/gamma:0 (256,) -0.0363 +1.7920 +0.4227
bn2b_branch2c/beta:0 (256,) -1.2938 +0.9636 +0.3430
bn2b_branch2c/moving_mean:0 (256,) -2.4192 +2.0440 +0.5019
bn2b_branch2c/moving_variance:0 (256,) +0.0000 +0.1844 +0.0315
res2c_branch2a/kernel:0 (1, 1, 256, 64) -0.3012 +0.2199 +0.0415
res2c_branch2a/bias:0 (64,) -0.0009 +0.0024 +0.0008
bn2c_branch2a/gamma:0 (64,) +0.2659 +1.8204 +0.2834
bn2c_branch2a/beta:0 (64,) -2.0168 +0.8445 +0.7879
bn2c_branch2a/moving_mean:0 (64,) -4.5208 +1.6091 +1.2391
bn2c_branch2a/moving_variance:0 (64,) +0.0000 +3.4581 +0.7942
res2c_branch2b/kernel:0 (3, 3, 64, 64) -0.2007 +0.2176 +0.0378
res2c_branch2b/bias:0 (64,) -0.0030 +0.0058 +0.0018
bn2c_branch2b/gamma:0 (64,) +0.6267 +1.5415 +0.2137
bn2c_branch2b/beta:0 (64,) -2.4090 +1.8192 +0.6302
bn2c_branch2b/moving_mean:0 (64,) -1.4737 +0.0594 +0.2559
bn2c_branch2b/moving_variance:0 (64,) +0.2314 +2.1085 +0.3072
res2c_branch2c/kernel:0 (1, 1, 64, 256) -0.2935 +0.2596 +0.0434
res2c_branch2c/bias:0 (256,) -0.0041 +0.0184 +0.0029
bn2c_branch2c/gamma:0 (256,) -0.0217 +2.3695 +0.5250
bn2c_branch2c/beta:0 (256,) -1.6829 +1.0992 +0.4280
bn2c_branch2c/moving_mean:0 (256,) -1.2568 +0.7135 +0.2851
bn2c_branch2c/moving_variance:0 (256,) +0.0010 +0.5712 +0.0975
res3a_branch2a/kernel:0 (1, 1, 256, 128) -0.4997 +0.6191 +0.0305
res3a_branch2a/bias:0 (128,) -0.0025 +0.0020 +0.0009
bn3a_branch2a/gamma:0 (128,) +0.4899 +1.3306 +0.1884
bn3a_branch2a/beta:0 (128,) -1.8391 +2.5643 +0.7573
bn3a_branch2a/moving_mean:0 (128,) -4.0452 +1.7707 +0.8690
bn3a_branch2a/moving_variance:0 (128,) +0.0620 +7.9964 +1.2851
res3a_branch2b/kernel:0 (3, 3, 128, 128) -0.3225 +0.4509 +0.0223
res3a_branch2b/bias:0 (128,) -0.0011 +0.0019 +0.0006
bn3a_branch2b/gamma:0 (128,) +0.4666 +1.8240 +0.2182
bn3a_branch2b/beta:0 (128,) -1.9434 +1.8963 +0.7859
bn3a_branch2b/moving_mean:0 (128,) -5.8993 +3.3426 +2.0065
bn3a_branch2b/moving_variance:0 (128,) +0.0001 +6.9908 +0.9404
res3a_branch2c/kernel:0 (1, 1, 128, 512) -0.4949 +0.3345 +0.0283
res3a_branch2c/bias:0 (512,) -0.0063 +0.0078 +0.0013
res3a_branch1/kernel:0 (1, 1, 256, 512) -0.4556 +0.6877 +0.0290
res3a_branch1/bias:0 (512,) -0.0055 +0.0039 +0.0008
bn3a_branch2c/gamma:0 (512,) -0.0039 +3.7005 +0.6168
bn3a_branch2c/beta:0 (512,) -0.9616 +1.4438 +0.3693
bn3a_branch2c/moving_mean:0 (512,) -1.6188 +1.3639 +0.3736
bn3a_branch2c/moving_variance:0 (512,) +0.0002 +0.9085 +0.1065
bn3a_branch1/gamma:0 (512,) -0.0158 +2.6945 +0.4766
bn3a_branch1/beta:0 (512,) -0.9616 +1.4437 +0.3693
bn3a_branch1/moving_mean:0 (512,) -3.5990 +2.8529 +0.7936
bn3a_branch1/moving_variance:0 (512,) +0.0030 +6.5634 +0.6189
res3b_branch2a/kernel:0 (1, 1, 512, 128) -0.2015 +0.1914 +0.0252
res3b_branch2a/bias:0 (128,) -0.0015 +0.0020 +0.0008
bn3b_branch2a/gamma:0 (128,) +0.5928 +1.5316 +0.1783
bn3b_branch2a/beta:0 (128,) -3.9542 +0.6799 +0.6433
bn3b_branch2a/moving_mean:0 (128,) -2.6765 +1.1148 +0.6228
bn3b_branch2a/moving_variance:0 (128,) +0.2431 +3.5601 +0.5766
res3b_branch2b/kernel:0 (3, 3, 128, 128) -0.2265 +0.2805 +0.0240
res3b_branch2b/bias:0 (128,) -0.0027 +0.0051 +0.0013
bn3b_branch2b/gamma:0 (128,) +0.4900 +1.4915 +0.2334
bn3b_branch2b/beta:0 (128,) -2.4206 +1.4218 +0.6774
bn3b_branch2b/moving_mean:0 (128,) -2.1795 +1.2802 +0.4907
bn3b_branch2b/moving_variance:0 (128,) +0.0892 +1.4424 +0.2128
res3b_branch2c/kernel:0 (1, 1, 128, 512) -0.3113 +0.4752 +0.0289
res3b_branch2c/bias:0 (512,) -0.0061 +0.0141 +0.0020
bn3b_branch2c/gamma:0 (512,) -0.0431 +2.0087 +0.4044
bn3b_branch2c/beta:0 (512,) -1.5772 +1.1600 +0.3742
bn3b_branch2c/moving_mean:0 (512,) -1.0651 +0.6899 +0.2256
bn3b_branch2c/moving_variance:0 (512,) +0.0002 +0.1858 +0.0288
res3c_branch2a/kernel:0 (1, 1, 512, 128) -0.2672 +0.2625 +0.0284
res3c_branch2a/bias:0 (128,) -0.0017 +0.0035 +0.0008
bn3c_branch2a/gamma:0 (128,) +0.5933 +1.4906 +0.1891
bn3c_branch2a/beta:0 (128,) -2.8070 +0.7289 +0.5408
bn3c_branch2a/moving_mean:0 (128,) -3.0308 +1.6105 +0.8393
bn3c_branch2a/moving_variance:0 (128,) +0.2414 +4.0907 +0.8334
res3c_branch2b/kernel:0 (3, 3, 128, 128) -0.2250 +0.2017 +0.0233
res3c_branch2b/bias:0 (128,) -0.0055 +0.0081 +0.0022
bn3c_branch2b/gamma:0 (128,) +0.4480 +1.5784 +0.2838
bn3c_branch2b/beta:0 (128,) -1.4159 +1.3200 +0.5555
bn3c_branch2b/moving_mean:0 (128,) -1.0064 +0.5542 +0.2663
bn3c_branch2b/moving_variance:0 (128,) +0.1152 +0.8630 +0.1393
res3c_branch2c/kernel:0 (1, 1, 128, 512) -0.3069 +0.3883 +0.0264
res3c_branch2c/bias:0 (512,) -0.0075 +0.0120 +0.0020
bn3c_branch2c/gamma:0 (512,) -0.0409 +1.8960 +0.3768
bn3c_branch2c/beta:0 (512,) -1.5428 +0.8270 +0.3608
bn3c_branch2c/moving_mean:0 (512,) -0.8480 +0.7275 +0.1809
bn3c_branch2c/moving_variance:0 (512,) +0.0002 +0.1614 +0.0254
res3d_branch2a/kernel:0 (1, 1, 512, 128) -0.2583 +0.2893 +0.0306
res3d_branch2a/bias:0 (128,) -0.0014 +0.0018 +0.0007
bn3d_branch2a/gamma:0 (128,) +0.6395 +1.4617 +0.1923
bn3d_branch2a/beta:0 (128,) -2.9768 +0.6138 +0.6397
bn3d_branch2a/moving_mean:0 (128,) -3.4373 +2.0843 +0.9585
bn3d_branch2a/moving_variance:0 (128,) +0.0032 +3.2415 +0.5276
res3d_branch2b/kernel:0 (3, 3, 128, 128) -0.1592 +0.2480 +0.0237
res3d_branch2b/bias:0 (128,) -0.0025 +0.0062 +0.0017
bn3d_branch2b/gamma:0 (128,) +0.6485 +3.2665 +0.2892
bn3d_branch2b/beta:0 (128,) -1.6517 +1.5628 +0.5605
bn3d_branch2b/moving_mean:0 (128,) -0.9797 +0.3526 +0.2822
bn3d_branch2b/moving_variance:0 (128,) +0.2176 +1.4907 +0.1816
res3d_branch2c/kernel:0 (1, 1, 128, 512) -0.2404 +0.3462 +0.0271
res3d_branch2c/bias:0 (512,) -0.0042 +0.0048 +0.0015
bn3d_branch2c/gamma:0 (512,) -0.0272 +1.9218 +0.5163
bn3d_branch2c/beta:0 (512,) -1.0555 +0.9748 +0.2711
bn3d_branch2c/moving_mean:0 (512,) -1.1079 +0.4287 +0.2314
bn3d_branch2c/moving_variance:0 (512,) +0.0002 +0.3496 +0.0543
res4a_branch2a/kernel:0 (1, 1, 512, 256) -0.2763 +0.2629 +0.0150
res4a_branch2a/bias:0 (256,) -0.0012 +0.0013 +0.0004
bn4a_branch2a/gamma:0 (256,) +0.4785 +1.4612 +0.1493
bn4a_branch2a/beta:0 (256,) -1.9415 +1.1200 +0.3882
bn4a_branch2a/moving_mean:0 (256,) -3.8936 +1.1756 +0.6395
bn4a_branch2a/moving_variance:0 (256,) +0.0515 +2.6553 +0.2808
res4a_branch2b/kernel:0 (3, 3, 256, 256) -0.1716 +0.1965 +0.0106
res4a_branch2b/bias:0 (256,) -0.0033 +0.0037 +0.0007
bn4a_branch2b/gamma:0 (256,) +0.4768 +1.5810 +0.1980
bn4a_branch2b/beta:0 (256,) -2.5978 +1.1149 +0.4805
bn4a_branch2b/moving_mean:0 (256,) -2.7021 +2.6603 +0.5277
bn4a_branch2b/moving_variance:0 (256,) +0.1003 +1.1930 +0.1722
res4a_branch2c/kernel:0 (1, 1, 256, 1024) -0.2861 +0.1943 +0.0141
res4a_branch2c/bias:0 (1024,) -0.0049 +0.0115 +0.0012
res4a_branch1/kernel:0 (1, 1, 512, 1024) -0.3615 +0.3428 +0.0159
res4a_branch1/bias:0 (1024,) -0.0015 +0.0015 +0.0004
bn4a_branch2c/gamma:0 (1024,) -0.0104 +2.8173 +0.4544
bn4a_branch2c/beta:0 (1024,) -0.5242 +2.0439 +0.2862
bn4a_branch2c/moving_mean:0 (1024,) -0.4020 +0.2339 +0.0729
bn4a_branch2c/moving_variance:0 (1024,) +0.0000 +0.1119 +0.0107
bn4a_branch1/gamma:0 (1024,) +0.1723 +3.9846 +0.7125
bn4a_branch1/beta:0 (1024,) -0.5242 +2.0441 +0.2862
bn4a_branch1/moving_mean:0 (1024,) -4.9091 +2.9439 +0.7998
bn4a_branch1/moving_variance:0 (1024,) +0.0413 +6.4599 +0.5613
res4b_branch2a/kernel:0 (1, 1, 1024, 256) -0.1251 +0.1742 +0.0082
res4b_branch2a/bias:0 (256,) -0.0007 +0.0006 +0.0002
bn4b_branch2a/gamma:0 (256,) +0.4161 +1.5930 +0.1866
bn4b_branch2a/beta:0 (256,) -2.2049 +2.0415 +0.4853
bn4b_branch2a/moving_mean:0 (256,) -3.9798 +2.5647 +0.9971
bn4b_branch2a/moving_variance:0 (256,) +0.1146 +9.1091 +1.1798
res4b_branch2b/kernel:0 (3, 3, 256, 256) -0.0968 +0.1622 +0.0073
res4b_branch2b/bias:0 (256,) -0.0022 +0.0018 +0.0006
bn4b_branch2b/gamma:0 (256,) +0.4992 +1.4646 +0.1878
bn4b_branch2b/beta:0 (256,) -1.6821 +0.5865 +0.4319
bn4b_branch2b/moving_mean:0 (256,) -5.5636 +1.5920 +0.8391
bn4b_branch2b/moving_variance:0 (256,) +0.0140 +1.2341 +0.1852
res4b_branch2c/kernel:0 (1, 1, 256, 1024) -0.2025 +0.2962 +0.0104
res4b_branch2c/bias:0 (1024,) -0.0057 +0.0110 +0.0017
bn4b_branch2c/gamma:0 (1024,) -0.0006 +3.1556 +0.3625
bn4b_branch2c/beta:0 (1024,) -1.0586 +0.9457 +0.1802
bn4b_branch2c/moving_mean:0 (1024,) -0.3958 +0.3607 +0.0831
bn4b_branch2c/moving_variance:0 (1024,) +0.0000 +0.1685 +0.0150
res4c_branch2a/kernel:0 (1, 1, 1024, 256) -0.1010 +0.1236 +0.0083
res4c_branch2a/bias:0 (256,) -0.0006 +0.0006 +0.0002
bn4c_branch2a/gamma:0 (256,) +0.5716 +1.7534 +0.1407
bn4c_branch2a/beta:0 (256,) -0.9249 +1.3189 +0.3732
bn4c_branch2a/moving_mean:0 (256,) -3.9561 +1.9207 +1.0270
bn4c_branch2a/moving_variance:0 (256,) +0.2736 +4.1165 +0.5892
res4c_branch2b/kernel:0 (3, 3, 256, 256) -0.1008 +0.1099 +0.0075
res4c_branch2b/bias:0 (256,) -0.0015 +0.0021 +0.0005
bn4c_branch2b/gamma:0 (256,) +0.5034 +1.2173 +0.1483
bn4c_branch2b/beta:0 (256,) -1.4417 +0.5756 +0.3369
bn4c_branch2b/moving_mean:0 (256,) -2.9200 +1.5946 +0.5483
bn4c_branch2b/moving_variance:0 (256,) +0.0281 +1.2198 +0.1510
res4c_branch2c/kernel:0 (1, 1, 256, 1024) -0.1328 +0.1666 +0.0108
res4c_branch2c/bias:0 (1024,) -0.0057 +0.0144 +0.0018
bn4c_branch2c/gamma:0 (1024,) +0.0043 +2.2694 +0.2649
bn4c_branch2c/beta:0 (1024,) -1.1019 +0.7349 +0.1791
bn4c_branch2c/moving_mean:0 (1024,) -0.3293 +0.1280 +0.0515
bn4c_branch2c/moving_variance:0 (1024,) +0.0001 +0.0869 +0.0065
res4d_branch2a/kernel:0 (1, 1, 1024, 256) -0.1169 +0.1507 +0.0104
res4d_branch2a/bias:0 (256,) -0.0010 +0.0006 +0.0003
bn4d_branch2a/gamma:0 (256,) +0.5686 +1.4401 +0.1488
bn4d_branch2a/beta:0 (256,) -1.3452 +0.4773 +0.3038
bn4d_branch2a/moving_mean:0 (256,) -2.9391 +2.3041 +0.8307
bn4d_branch2a/moving_variance:0 (256,) +0.2651 +4.1963 +0.5838
res4d_branch2b/kernel:0 (3, 3, 256, 256) -0.1036 +0.0993 +0.0088
res4d_branch2b/bias:0 (256,) -0.0035 +0.0054 +0.0014
bn4d_branch2b/gamma:0 (256,) +0.4286 +1.5386 +0.1594
bn4d_branch2b/beta:0 (256,) -1.4343 +0.3851 +0.2820
bn4d_branch2b/moving_mean:0 (256,) -1.1160 +0.5873 +0.2232
bn4d_branch2b/moving_variance:0 (256,) +0.0355 +0.4098 +0.0649
res4d_branch2c/kernel:0 (1, 1, 256, 1024) -0.2382 +0.1296 +0.0120
res4d_branch2c/bias:0 (1024,) -0.0070 +0.0199 +0.0023
bn4d_branch2c/gamma:0 (1024,) +0.0461 +2.8471 +0.3289
bn4d_branch2c/beta:0 (1024,) -1.3527 +0.5924 +0.2292
bn4d_branch2c/moving_mean:0 (1024,) -0.2602 +0.0767 +0.0430
bn4d_branch2c/moving_variance:0 (1024,) +0.0013 +0.0854 +0.0057
res4e_branch2a/kernel:0 (1, 1, 1024, 256) -0.1474 +0.1154 +0.0103
res4e_branch2a/bias:0 (256,) -0.0006 +0.0009 +0.0003
bn4e_branch2a/gamma:0 (256,) +0.6414 +1.3680 +0.1230
bn4e_branch2a/beta:0 (256,) -1.0867 +0.3564 +0.2688
bn4e_branch2a/moving_mean:0 (256,) -3.8987 +1.4863 +1.0202
bn4e_branch2a/moving_variance:0 (256,) +0.2908 +4.1538 +0.5407
res4e_branch2b/kernel:0 (3, 3, 256, 256) -0.0862 +0.0939 +0.0091
res4e_branch2b/bias:0 (256,) -0.0029 +0.0051 +0.0010
bn4e_branch2b/gamma:0 (256,) +0.5490 +1.2861 +0.1311
bn4e_branch2b/beta:0 (256,) -1.2790 +0.2216 +0.2528
bn4e_branch2b/moving_mean:0 (256,) -1.0716 +0.6271 +0.2514
bn4e_branch2b/moving_variance:0 (256,) +0.0388 +0.5987 +0.0750
res4e_branch2c/kernel:0 (1, 1, 256, 1024) -0.2014 +0.1777 +0.0121
res4e_branch2c/bias:0 (1024,) -0.0053 +0.0107 +0.0020
bn4e_branch2c/gamma:0 (1024,) +0.0251 +1.7328 +0.1828
bn4e_branch2c/beta:0 (1024,) -1.0507 +0.4076 +0.1765
bn4e_branch2c/moving_mean:0 (1024,) -0.2043 +0.0873 +0.0359
bn4e_branch2c/moving_variance:0 (1024,) +0.0007 +0.0373 +0.0033
res4f_branch2a/kernel:0 (1, 1, 1024, 256) -0.0860 +0.1289 +0.0106
res4f_branch2a/bias:0 (256,) -0.0006 +0.0010 +0.0003
bn4f_branch2a/gamma:0 (256,) +0.6962 +1.3808 +0.0996
bn4f_branch2a/beta:0 (256,) -1.1552 +0.3890 +0.2462
bn4f_branch2a/moving_mean:0 (256,) -4.3923 +2.0997 +1.0476
bn4f_branch2a/moving_variance:0 (256,) +0.3235 +3.9428 +0.6106
res4f_branch2b/kernel:0 (3, 3, 256, 256) -0.1023 +0.1056 +0.0097
res4f_branch2b/bias:0 (256,) -0.0038 +0.0037 +0.0010
bn4f_branch2b/gamma:0 (256,) +0.4326 +1.2403 +0.1092
bn4f_branch2b/beta:0 (256,) -1.4296 +0.5155 +0.2444
bn4f_branch2b/moving_mean:0 (256,) -1.6409 +1.4733 +0.3083
bn4f_branch2b/moving_variance:0 (256,) +0.0403 +0.5428 +0.0756
res4f_branch2c/kernel:0 (1, 1, 256, 1024) -0.1396 +0.1322 +0.0121
res4f_branch2c/bias:0 (1024,) -0.0073 +0.0118 +0.0025
bn4f_branch2c/gamma:0 (1024,) +0.1776 +1.6173 +0.1734
bn4f_branch2c/beta:0 (1024,) -0.9800 +0.2974 +0.1410
bn4f_branch2c/moving_mean:0 (1024,) -0.1488 +0.0678 +0.0309
bn4f_branch2c/moving_variance:0 (1024,) +0.0008 +0.0160 +0.0021
res4g_branch2a/kernel:0 (1, 1, 1024, 256) -0.1154 +0.2504 +0.0109
res4g_branch2a/bias:0 (256,) -0.0008 +0.0007 +0.0003
bn4g_branch2a/gamma:0 (256,) +0.6048 +1.2029 +0.1063
bn4g_branch2a/beta:0 (256,) -1.2676 +0.2347 +0.2817
bn4g_branch2a/moving_mean:0 (256,) -4.1649 +1.3777 +0.9897
bn4g_branch2a/moving_variance:0 (256,) +0.2755 +3.4157 +0.6010
res4g_branch2b/kernel:0 (3, 3, 256, 256) -0.1271 +0.1231 +0.0098
res4g_branch2b/bias:0 (256,) -0.0044 +0.0032 +0.0012
bn4g_branch2b/gamma:0 (256,) +0.4751 +1.7336 +0.1333
bn4g_branch2b/beta:0 (256,) -1.2614 +0.1900 +0.2693
bn4g_branch2b/moving_mean:0 (256,) -0.9138 +0.8013 +0.2403
bn4g_branch2b/moving_variance:0 (256,) +0.0337 +0.5631 +0.0711
res4g_branch2c/kernel:0 (1, 1, 256, 1024) -0.1536 +0.2370 +0.0120
res4g_branch2c/bias:0 (1024,) -0.0083 +0.0083 +0.0024
bn4g_branch2c/gamma:0 (1024,) +0.0907 +1.8097 +0.1972
bn4g_branch2c/beta:0 (1024,) -0.9016 +0.2926 +0.1411
bn4g_branch2c/moving_mean:0 (1024,) -0.1636 +0.0711 +0.0339
bn4g_branch2c/moving_variance:0 (1024,) +0.0009 +0.0321 +0.0033
res4h_branch2a/kernel:0 (1, 1, 1024, 256) -0.1293 +0.1603 +0.0118
res4h_branch2a/bias:0 (256,) -0.0009 +0.0007 +0.0003
bn4h_branch2a/gamma:0 (256,) +0.6202 +1.2079 +0.0980
bn4h_branch2a/beta:0 (256,) -1.4124 +0.0712 +0.2610
bn4h_branch2a/moving_mean:0 (256,) -3.4425 +2.5030 +0.8797
bn4h_branch2a/moving_variance:0 (256,) +0.3146 +2.6701 +0.4704
res4h_branch2b/kernel:0 (3, 3, 256, 256) -0.0999 +0.1200 +0.0103
res4h_branch2b/bias:0 (256,) -0.0056 +0.0069 +0.0016
bn4h_branch2b/gamma:0 (256,) +0.4724 +1.4768 +0.1470
bn4h_branch2b/beta:0 (256,) -1.5817 +0.4324 +0.2521
bn4h_branch2b/moving_mean:0 (256,) -0.6978 +0.9014 +0.1644
bn4h_branch2b/moving_variance:0 (256,) +0.0297 +0.4195 +0.0563
res4h_branch2c/kernel:0 (1, 1, 256, 1024) -0.1493 +0.2391 +0.0123
res4h_branch2c/bias:0 (1024,) -0.0071 +0.0081 +0.0024
bn4h_branch2c/gamma:0 (1024,) +0.0532 +2.4672 +0.2958
bn4h_branch2c/beta:0 (1024,) -0.7953 +0.3313 +0.1519
bn4h_branch2c/moving_mean:0 (1024,) -0.1890 +0.1145 +0.0347
bn4h_branch2c/moving_variance:0 (1024,) +0.0006 +0.0432 +0.0041
res4i_branch2a/kernel:0 (1, 1, 1024, 256) -0.1308 +0.2974 +0.0132
res4i_branch2a/bias:0 (256,) -0.0008 +0.0009 +0.0003
bn4i_branch2a/gamma:0 (256,) +0.3549 +1.0498 +0.1261
bn4i_branch2a/beta:0 (256,) -1.4706 +0.4588 +0.2771
bn4i_branch2a/moving_mean:0 (256,) -3.8994 +2.5413 +1.0756
bn4i_branch2a/moving_variance:0 (256,) +0.6003 +6.0203 +0.5194
res4i_branch2b/kernel:0 (3, 3, 256, 256) -0.1302 +0.1541 +0.0093
res4i_branch2b/bias:0 (256,) -0.0063 +0.0112 +0.0028
bn4i_branch2b/gamma:0 (256,) +0.5664 +1.6229 +0.1404
bn4i_branch2b/beta:0 (256,) -1.4855 +0.4776 +0.2638
bn4i_branch2b/moving_mean:0 (256,) -0.3930 +0.1225 +0.0744
bn4i_branch2b/moving_variance:0 (256,) +0.0113 +0.0744 +0.0117
res4i_branch2c/kernel:0 (1, 1, 256, 1024) -0.1554 +0.1568 +0.0120
res4i_branch2c/bias:0 (1024,) -0.0069 +0.0131 +0.0017
bn4i_branch2c/gamma:0 (1024,) +0.0400 +2.1618 +0.1852
bn4i_branch2c/beta:0 (1024,) -0.5914 +0.7033 +0.1253
bn4i_branch2c/moving_mean:0 (1024,) -0.3092 +0.1096 +0.0572
bn4i_branch2c/moving_variance:0 (1024,) +0.0007 +0.0708 +0.0049
res4j_branch2a/kernel:0 (1, 1, 1024, 256) -0.1264 +0.1794 +0.0126
res4j_branch2a/bias:0 (256,) -0.0010 +0.0006 +0.0003
bn4j_branch2a/gamma:0 (256,) +0.5069 +1.2823 +0.1149
bn4j_branch2a/beta:0 (256,) -1.9055 +0.2154 +0.2870
bn4j_branch2a/moving_mean:0 (256,) -4.5494 +1.8513 +0.9530
bn4j_branch2a/moving_variance:0 (256,) +0.5073 +5.0598 +0.4980
res4j_branch2b/kernel:0 (3, 3, 256, 256) -0.1036 +0.2351 +0.0103
res4j_branch2b/bias:0 (256,) -0.0067 +0.0059 +0.0020
bn4j_branch2b/gamma:0 (256,) +0.4131 +1.4478 +0.1323
bn4j_branch2b/beta:0 (256,) -1.9463 +0.4984 +0.2980
bn4j_branch2b/moving_mean:0 (256,) -0.7600 +0.4444 +0.1658
bn4j_branch2b/moving_variance:0 (256,) +0.0380 +0.3044 +0.0348
res4j_branch2c/kernel:0 (1, 1, 256, 1024) -0.1406 +0.1723 +0.0121
res4j_branch2c/bias:0 (1024,) -0.0062 +0.0091 +0.0025
bn4j_branch2c/gamma:0 (1024,) +0.0288 +2.0876 +0.1976
bn4j_branch2c/beta:0 (1024,) -0.8279 +0.1712 +0.1242
bn4j_branch2c/moving_mean:0 (1024,) -0.1671 +0.0815 +0.0318
bn4j_branch2c/moving_variance:0 (1024,) +0.0002 +0.0240 +0.0027
res4k_branch2a/kernel:0 (1, 1, 1024, 256) -0.1369 +0.1895 +0.0115
res4k_branch2a/bias:0 (256,) -0.0009 +0.0007 +0.0003
bn4k_branch2a/gamma:0 (256,) +0.5384 +1.1987 +0.1222
bn4k_branch2a/beta:0 (256,) -1.7274 +0.3939 +0.3094
bn4k_branch2a/moving_mean:0 (256,) -5.5573 +2.2690 +1.0985
bn4k_branch2a/moving_variance:0 (256,) +0.3051 +3.5934 +0.4903
res4k_branch2b/kernel:0 (3, 3, 256, 256) -0.0799 +0.1296 +0.0095
res4k_branch2b/bias:0 (256,) -0.0060 +0.0041 +0.0014
bn4k_branch2b/gamma:0 (256,) +0.4960 +1.2266 +0.1243
bn4k_branch2b/beta:0 (256,) -1.2792 +0.2008 +0.2593
bn4k_branch2b/moving_mean:0 (256,) -0.8575 +1.2273 +0.2591
bn4k_branch2b/moving_variance:0 (256,) +0.0168 +0.3693 +0.0569
res4k_branch2c/kernel:0 (1, 1, 256, 1024) -0.1247 +0.2241 +0.0114
res4k_branch2c/bias:0 (1024,) -0.0067 +0.0081 +0.0024
bn4k_branch2c/gamma:0 (1024,) +0.1148 +1.9941 +0.2081
bn4k_branch2c/beta:0 (1024,) -1.6103 +0.1858 +0.1599
bn4k_branch2c/moving_mean:0 (1024,) -0.1446 +0.0679 +0.0319
bn4k_branch2c/moving_variance:0 (1024,) +0.0010 +0.0395 +0.0042
res4l_branch2a/kernel:0 (1, 1, 1024, 256) -0.2041 +0.1931 +0.0135
res4l_branch2a/bias:0 (256,) -0.0011 +0.0008 +0.0003
bn4l_branch2a/gamma:0 (256,) +0.4169 +1.5267 +0.1211
bn4l_branch2a/beta:0 (256,) -1.8435 +0.3071 +0.2976
bn4l_branch2a/moving_mean:0 (256,) -4.0608 +2.0131 +1.0383
bn4l_branch2a/moving_variance:0 (256,) +0.5797 +5.8934 +0.5997
res4l_branch2b/kernel:0 (3, 3, 256, 256) -0.1075 +0.1778 +0.0101
res4l_branch2b/bias:0 (256,) -0.0089 +0.0092 +0.0025
bn4l_branch2b/gamma:0 (256,) +0.4045 +1.3411 +0.1290
bn4l_branch2b/beta:0 (256,) -1.4767 +0.4854 +0.2710
bn4l_branch2b/moving_mean:0 (256,) -0.3820 +0.2590 +0.0978
bn4l_branch2b/moving_variance:0 (256,) +0.0194 +0.1928 +0.0195
res4l_branch2c/kernel:0 (1, 1, 256, 1024) -0.1198 +0.1714 +0.0123
res4l_branch2c/bias:0 (1024,) -0.0098 +0.0088 +0.0023
bn4l_branch2c/gamma:0 (1024,) +0.0792 +1.6725 +0.1760
bn4l_branch2c/beta:0 (1024,) -1.0323 +0.7015 +0.1526
bn4l_branch2c/moving_mean:0 (1024,) -0.1346 +0.0818 +0.0317
bn4l_branch2c/moving_variance:0 (1024,) +0.0007 +0.0367 +0.0027
res4m_branch2a/kernel:0 (1, 1, 1024, 256) -0.0814 +0.1539 +0.0118
res4m_branch2a/bias:0 (256,) -0.0008 +0.0006 +0.0003
bn4m_branch2a/gamma:0 (256,) +0.5300 +1.1776 +0.0988
bn4m_branch2a/beta:0 (256,) -1.2997 +0.3071 +0.2241
bn4m_branch2a/moving_mean:0 (256,) -6.0726 +1.2736 +1.0509
bn4m_branch2a/moving_variance:0 (256,) +0.4134 +5.9330 +0.5904
res4m_branch2b/kernel:0 (3, 3, 256, 256) -0.1037 +0.1375 +0.0091
res4m_branch2b/bias:0 (256,) -0.0072 +0.0071 +0.0020
bn4m_branch2b/gamma:0 (256,) +0.5814 +1.1881 +0.1004
bn4m_branch2b/beta:0 (256,) -1.3691 +0.1966 +0.2357
bn4m_branch2b/moving_mean:0 (256,) -0.6768 +0.5120 +0.1552
bn4m_branch2b/moving_variance:0 (256,) +0.0234 +0.3098 +0.0412
res4m_branch2c/kernel:0 (1, 1, 256, 1024) -0.1469 +0.1552 +0.0113
res4m_branch2c/bias:0 (1024,) -0.0083 +0.0104 +0.0025
bn4m_branch2c/gamma:0 (1024,) +0.1858 +1.7955 +0.1699
bn4m_branch2c/beta:0 (1024,) -0.7632 +0.5809 +0.1474
bn4m_branch2c/moving_mean:0 (1024,) -0.1689 +0.0692 +0.0349
bn4m_branch2c/moving_variance:0 (1024,) +0.0007 +0.0343 +0.0037
res4n_branch2a/kernel:0 (1, 1, 1024, 256) -0.1217 +0.1600 +0.0128
res4n_branch2a/bias:0 (256,) -0.0012 +0.0006 +0.0003
bn4n_branch2a/gamma:0 (256,) +0.4676 +1.0852 +0.1114
bn4n_branch2a/beta:0 (256,) -1.2727 +0.0440 +0.2325
bn4n_branch2a/moving_mean:0 (256,) -3.7141 +1.9744 +0.9055
bn4n_branch2a/moving_variance:0 (256,) +0.4856 +2.9867 +0.3987
res4n_branch2b/kernel:0 (3, 3, 256, 256) -0.1221 +0.1551 +0.0089
res4n_branch2b/bias:0 (256,) -0.0099 +0.0089 +0.0028
bn4n_branch2b/gamma:0 (256,) +0.4873 +1.1907 +0.1089
bn4n_branch2b/beta:0 (256,) -1.0325 +0.5976 +0.2187
bn4n_branch2b/moving_mean:0 (256,) -0.3456 +0.0662 +0.0728
bn4n_branch2b/moving_variance:0 (256,) +0.0111 +0.2385 +0.0187
res4n_branch2c/kernel:0 (1, 1, 256, 1024) -0.0992 +0.1609 +0.0109
res4n_branch2c/bias:0 (1024,) -0.0078 +0.0087 +0.0021
bn4n_branch2c/gamma:0 (1024,) +0.1917 +1.6763 +0.1232
bn4n_branch2c/beta:0 (1024,) -0.7562 +0.6426 +0.1316
bn4n_branch2c/moving_mean:0 (1024,) -0.2036 +0.1072 +0.0411
bn4n_branch2c/moving_variance:0 (1024,) +0.0013 +0.0381 +0.0031
res4o_branch2a/kernel:0 (1, 1, 1024, 256) -0.0879 +0.1375 +0.0125
res4o_branch2a/bias:0 (256,) -0.0009 +0.0009 +0.0003
bn4o_branch2a/gamma:0 (256,) +0.4154 +1.0786 +0.1032
bn4o_branch2a/beta:0 (256,) -1.5070 +0.1578 +0.2357
bn4o_branch2a/moving_mean:0 (256,) -6.4399 +2.0957 +1.2424
bn4o_branch2a/moving_variance:0 (256,) +0.5261 +5.2096 +0.5908
res4o_branch2b/kernel:0 (3, 3, 256, 256) -0.0939 +0.1264 +0.0090
res4o_branch2b/bias:0 (256,) -0.0093 +0.0069 +0.0027
bn4o_branch2b/gamma:0 (256,) +0.5379 +1.2134 +0.1078
bn4o_branch2b/beta:0 (256,) -1.2517 +0.4139 +0.2248
bn4o_branch2b/moving_mean:0 (256,) -0.3864 +0.3207 +0.0934
bn4o_branch2b/moving_variance:0 (256,) +0.0164 +0.1712 +0.0202
res4o_branch2c/kernel:0 (1, 1, 256, 1024) -0.1602 +0.1698 +0.0111
res4o_branch2c/bias:0 (1024,) -0.0073 +0.0096 +0.0021
bn4o_branch2c/gamma:0 (1024,) +0.2356 +1.9104 +0.1531
bn4o_branch2c/beta:0 (1024,) -0.8014 +0.5613 +0.1387
bn4o_branch2c/moving_mean:0 (1024,) -0.2040 +0.0855 +0.0429
bn4o_branch2c/moving_variance:0 (1024,) +0.0009 +0.0544 +0.0049
res4p_branch2a/kernel:0 (1, 1, 1024, 256) -0.1453 +0.2050 +0.0138
res4p_branch2a/bias:0 (256,) -0.0008 +0.0009 +0.0003
bn4p_branch2a/gamma:0 (256,) +0.5041 +1.0460 +0.0900
bn4p_branch2a/beta:0 (256,) -1.4744 +0.0466 +0.2374
bn4p_branch2a/moving_mean:0 (256,) -3.5993 +2.5332 +1.0418
bn4p_branch2a/moving_variance:0 (256,) +0.6268 +3.2764 +0.5098
res4p_branch2b/kernel:0 (3, 3, 256, 256) -0.0963 +0.1146 +0.0102
res4p_branch2b/bias:0 (256,) -0.0117 +0.0087 +0.0026
bn4p_branch2b/gamma:0 (256,) +0.4508 +1.3897 +0.1299
bn4p_branch2b/beta:0 (256,) -1.4155 +0.4056 +0.2478
bn4p_branch2b/moving_mean:0 (256,) -0.2807 +0.1532 +0.0755
bn4p_branch2b/moving_variance:0 (256,) +0.0209 +0.1309 +0.0195
res4p_branch2c/kernel:0 (1, 1, 256, 1024) -0.1161 +0.1738 +0.0122
res4p_branch2c/bias:0 (1024,) -0.0087 +0.0081 +0.0020
bn4p_branch2c/gamma:0 (1024,) +0.1803 +1.7117 +0.1949
bn4p_branch2c/beta:0 (1024,) -1.0347 +0.3854 +0.1619
bn4p_branch2c/moving_mean:0 (1024,) -0.1642 +0.0812 +0.0336
bn4p_branch2c/moving_variance:0 (1024,) +0.0010 +0.0413 +0.0038
res4q_branch2a/kernel:0 (1, 1, 1024, 256) -0.1236 +0.2559 +0.0137
res4q_branch2a/bias:0 (256,) -0.0008 +0.0012 +0.0003
bn4q_branch2a/gamma:0 (256,) +0.3504 +1.0037 +0.1050
bn4q_branch2a/beta:0 (256,) -1.5841 +0.3542 +0.2878
bn4q_branch2a/moving_mean:0 (256,) -5.4757 +2.7636 +1.1594
bn4q_branch2a/moving_variance:0 (256,) +0.4812 +10.5219 +0.8778
res4q_branch2b/kernel:0 (3, 3, 256, 256) -0.1804 +0.2048 +0.0089
res4q_branch2b/bias:0 (256,) -0.0106 +0.0080 +0.0028
bn4q_branch2b/gamma:0 (256,) +0.6510 +1.4631 +0.1185
bn4q_branch2b/beta:0 (256,) -1.1869 +0.3730 +0.2479
bn4q_branch2b/moving_mean:0 (256,) -0.2944 +0.1277 +0.0664
bn4q_branch2b/moving_variance:0 (256,) +0.0110 +0.0925 +0.0102
res4q_branch2c/kernel:0 (1, 1, 256, 1024) -0.1754 +0.1839 +0.0118
res4q_branch2c/bias:0 (1024,) -0.0073 +0.0053 +0.0015
bn4q_branch2c/gamma:0 (1024,) +0.0368 +2.1137 +0.2127
bn4q_branch2c/beta:0 (1024,) -0.7801 +0.3531 +0.1493
bn4q_branch2c/moving_mean:0 (1024,) -0.3336 +0.1393 +0.0609
bn4q_branch2c/moving_variance:0 (1024,) +0.0006 +0.0498 +0.0058
res4r_branch2a/kernel:0 (1, 1, 1024, 256) -0.1730 +0.2589 +0.0137
res4r_branch2a/bias:0 (256,) -0.0009 +0.0010 +0.0003
bn4r_branch2a/gamma:0 (256,) +0.2862 +0.9191 +0.1058
bn4r_branch2a/beta:0 (256,) -1.3459 +0.2720 +0.2652
bn4r_branch2a/moving_mean:0 (256,) -2.5019 +3.6722 +1.0108
bn4r_branch2a/moving_variance:0 (256,) +0.6803 +5.8562 +0.5767
res4r_branch2b/kernel:0 (3, 3, 256, 256) -0.1322 +0.1870 +0.0086
res4r_branch2b/bias:0 (256,) -0.0092 +0.0095 +0.0031
bn4r_branch2b/gamma:0 (256,) +0.5149 +1.4533 +0.1367
bn4r_branch2b/beta:0 (256,) -0.9097 +0.6818 +0.2112
bn4r_branch2b/moving_mean:0 (256,) -0.2223 +0.1066 +0.0609
bn4r_branch2b/moving_variance:0 (256,) +0.0062 +0.0611 +0.0083
res4r_branch2c/kernel:0 (1, 1, 256, 1024) -0.0890 +0.1848 +0.0112
res4r_branch2c/bias:0 (1024,) -0.0115 +0.0075 +0.0017
bn4r_branch2c/gamma:0 (1024,) +0.1361 +1.7486 +0.1512
bn4r_branch2c/beta:0 (1024,) -0.7476 +0.3266 +0.1342
bn4r_branch2c/moving_mean:0 (1024,) -0.2405 +0.1581 +0.0556
bn4r_branch2c/moving_variance:0 (1024,) +0.0019 +0.0666 +0.0043
res4s_branch2a/kernel:0 (1, 1, 1024, 256) -0.1326 +0.1900 +0.0130
res4s_branch2a/bias:0 (256,) -0.0007 +0.0008 +0.0003
bn4s_branch2a/gamma:0 (256,) +0.3468 +0.9720 +0.1053
bn4s_branch2a/beta:0 (256,) -1.2850 +0.5307 +0.2683
bn4s_branch2a/moving_mean:0 (256,) -9.8440 +2.2720 +1.2494
bn4s_branch2a/moving_variance:0 (256,) +0.5810 +17.3876 +1.1579
res4s_branch2b/kernel:0 (3, 3, 256, 256) -0.2008 +0.1875 +0.0085
res4s_branch2b/bias:0 (256,) -0.0109 +0.0088 +0.0028
bn4s_branch2b/gamma:0 (256,) +0.5201 +1.1966 +0.1040
bn4s_branch2b/beta:0 (256,) -1.1253 +0.3305 +0.2241
bn4s_branch2b/moving_mean:0 (256,) -0.3961 +0.1227 +0.0815
bn4s_branch2b/moving_variance:0 (256,) +0.0097 +0.0778 +0.0102
res4s_branch2c/kernel:0 (1, 1, 256, 1024) -0.1133 +0.1737 +0.0112
res4s_branch2c/bias:0 (1024,) -0.0104 +0.0076 +0.0019
bn4s_branch2c/gamma:0 (1024,) +0.1482 +1.7350 +0.1424
bn4s_branch2c/beta:0 (1024,) -0.7700 +0.4369 +0.1229
bn4s_branch2c/moving_mean:0 (1024,) -0.1928 +0.1375 +0.0475
bn4s_branch2c/moving_variance:0 (1024,) +0.0015 +0.0487 +0.0035
res4t_branch2a/kernel:0 (1, 1, 1024, 256) -0.1664 +0.1841 +0.0131
res4t_branch2a/bias:0 (256,) -0.0013 +0.0016 +0.0004
bn4t_branch2a/gamma:0 (256,) +0.4422 +1.2399 +0.1050
bn4t_branch2a/beta:0 (256,) -1.1249 +0.2680 +0.2436
bn4t_branch2a/moving_mean:0 (256,) -6.2349 +2.5628 +1.3091
bn4t_branch2a/moving_variance:0 (256,) +0.4907 +5.1029 +0.5986
res4t_branch2b/kernel:0 (3, 3, 256, 256) -0.1324 +0.1087 +0.0094
res4t_branch2b/bias:0 (256,) -0.0096 +0.0074 +0.0024
bn4t_branch2b/gamma:0 (256,) +0.4694 +1.2439 +0.1069
bn4t_branch2b/beta:0 (256,) -1.1027 +0.6878 +0.2195
bn4t_branch2b/moving_mean:0 (256,) -0.6517 +0.2553 +0.1409
bn4t_branch2b/moving_variance:0 (256,) +0.0288 +0.2597 +0.0352
res4t_branch2c/kernel:0 (1, 1, 256, 1024) -0.1675 +0.1682 +0.0118
res4t_branch2c/bias:0 (1024,) -0.0144 +0.0073 +0.0019
bn4t_branch2c/gamma:0 (1024,) +0.2195 +1.9902 +0.1718
bn4t_branch2c/beta:0 (1024,) -0.7889 +0.3014 +0.1334
bn4t_branch2c/moving_mean:0 (1024,) -0.1941 +0.1983 +0.0427
bn4t_branch2c/moving_variance:0 (1024,) +0.0010 +0.0383 +0.0034
res4u_branch2a/kernel:0 (1, 1, 1024, 256) -0.1064 +0.1593 +0.0122
res4u_branch2a/bias:0 (256,) -0.0010 +0.0009 +0.0003
bn4u_branch2a/gamma:0 (256,) +0.2925 +1.1103 +0.1152
bn4u_branch2a/beta:0 (256,) -1.2993 +0.5482 +0.2517
bn4u_branch2a/moving_mean:0 (256,) -8.9484 +3.0622 +1.2511
bn4u_branch2a/moving_variance:0 (256,) +0.3403 +10.5238 +1.0425
res4u_branch2b/kernel:0 (3, 3, 256, 256) -0.1250 +0.1247 +0.0080
res4u_branch2b/bias:0 (256,) -0.0125 +0.0058 +0.0024
bn4u_branch2b/gamma:0 (256,) +0.6188 +1.3505 +0.1055
bn4u_branch2b/beta:0 (256,) -0.9702 +0.4346 +0.2031
bn4u_branch2b/moving_mean:0 (256,) -0.4856 +0.3687 +0.1192
bn4u_branch2b/moving_variance:0 (256,) +0.0157 +0.1070 +0.0150
res4u_branch2c/kernel:0 (1, 1, 256, 1024) -0.1426 +0.2122 +0.0106
res4u_branch2c/bias:0 (1024,) -0.0084 +0.0064 +0.0016
bn4u_branch2c/gamma:0 (1024,) +0.0975 +1.7799 +0.1396
bn4u_branch2c/beta:0 (1024,) -0.7753 +0.6956 +0.1624
bn4u_branch2c/moving_mean:0 (1024,) -0.2670 +0.1501 +0.0620
bn4u_branch2c/moving_variance:0 (1024,) +0.0017 +0.0830 +0.0062
res4v_branch2a/kernel:0 (1, 1, 1024, 256) -0.1540 +0.2322 +0.0125
res4v_branch2a/bias:0 (256,) -0.0011 +0.0015 +0.0003
bn4v_branch2a/gamma:0 (256,) +0.3923 +1.0099 +0.0927
bn4v_branch2a/beta:0 (256,) -1.2259 +0.4520 +0.2694
bn4v_branch2a/moving_mean:0 (256,) -5.8968 +2.1904 +1.1998
bn4v_branch2a/moving_variance:0 (256,) +0.5714 +5.2581 +0.6591
res4v_branch2b/kernel:0 (3, 3, 256, 256) -0.1413 +0.1647 +0.0085
res4v_branch2b/bias:0 (256,) -0.0130 +0.0072 +0.0022
bn4v_branch2b/gamma:0 (256,) +0.6167 +1.1526 +0.0908
bn4v_branch2b/beta:0 (256,) -1.0165 +0.8713 +0.1890
bn4v_branch2b/moving_mean:0 (256,) -0.4301 +0.1983 +0.1069
bn4v_branch2b/moving_variance:0 (256,) +0.0213 +0.2294 +0.0224
res4v_branch2c/kernel:0 (1, 1, 256, 1024) -0.1078 +0.1724 +0.0112
res4v_branch2c/bias:0 (1024,) -0.0045 +0.0057 +0.0016
bn4v_branch2c/gamma:0 (1024,) +0.2332 +1.6640 +0.1350
bn4v_branch2c/beta:0 (1024,) -0.9275 +0.5133 +0.1990
bn4v_branch2c/moving_mean:0 (1024,) -0.2657 +0.2240 +0.0562
bn4v_branch2c/moving_variance:0 (1024,) +0.0014 +0.0536 +0.0044
res4w_branch2a/kernel:0 (1, 1, 1024, 256) -0.1421 +0.2230 +0.0128
res4w_branch2a/bias:0 (256,) -0.0011 +0.0017 +0.0003
bn4w_branch2a/gamma:0 (256,) +0.2562 +1.0847 +0.1115
bn4w_branch2a/beta:0 (256,) -1.4639 +0.3603 +0.2947
bn4w_branch2a/moving_mean:0 (256,) -13.4450 +3.0168 +1.9482
bn4w_branch2a/moving_variance:0 (256,) +0.5124 +13.2866 +1.0325
res4w_branch2b/kernel:0 (3, 3, 256, 256) -0.1053 +0.1691 +0.0084
res4w_branch2b/bias:0 (256,) -0.0078 +0.0080 +0.0024
bn4w_branch2b/gamma:0 (256,) +0.7056 +1.4043 +0.0986
bn4w_branch2b/beta:0 (256,) -0.9674 +0.3868 +0.2011
bn4w_branch2b/moving_mean:0 (256,) -0.2898 +0.2128 +0.0745
bn4w_branch2b/moving_variance:0 (256,) +0.0105 +0.1042 +0.0124
res4w_branch2c/kernel:0 (1, 1, 256, 1024) -0.1479 +0.1984 +0.0111
res4w_branch2c/bias:0 (1024,) -0.0042 +0.0050 +0.0014
bn4w_branch2c/gamma:0 (1024,) +0.0221 +1.5448 +0.1517
bn4w_branch2c/beta:0 (1024,) -0.8512 +0.5036 +0.1815
bn4w_branch2c/moving_mean:0 (1024,) -0.3939 +0.1908 +0.1037
bn4w_branch2c/moving_variance:0 (1024,) +0.0007 +0.1042 +0.0077
res5a_branch2a/kernel:0 (1, 1, 1024, 512) -0.1763 +0.2315 +0.0143
res5a_branch2a/bias:0 (512,) -0.0014 +0.0011 +0.0003
bn5a_branch2a/gamma:0 (512,) +0.5024 +1.2280 +0.1224
bn5a_branch2a/beta:0 (512,) -1.4505 +0.4926 +0.3033
bn5a_branch2a/moving_mean:0 (512,) -13.2482 +6.5317 +1.7947
bn5a_branch2a/moving_variance:0 (512,) +0.8438 +12.3391 +1.0853
res5a_branch2b/kernel:0 (3, 3, 512, 512) -0.2433 +0.3231 +0.0091
res5a_branch2b/bias:0 (512,) -0.0018 +0.0043 +0.0008
bn5a_branch2b/gamma:0 (512,) +0.3045 +1.4122 +0.1359
bn5a_branch2b/beta:0 (512,) -1.6459 +0.7115 +0.3145
bn5a_branch2b/moving_mean:0 (512,) -1.7575 +1.3904 +0.2868
bn5a_branch2b/moving_variance:0 (512,) +0.0838 +0.8812 +0.0821
res5a_branch2c/kernel:0 (1, 1, 512, 2048) -0.2880 +0.3244 +0.0122
res5a_branch2c/bias:0 (2048,) -0.0106 +0.0227 +0.0013
res5a_branch1/kernel:0 (1, 1, 1024, 2048) -0.3702 +0.4639 +0.0105
res5a_branch1/bias:0 (2048,) -0.0008 +0.0019 +0.0002
bn5a_branch2c/gamma:0 (2048,) +0.6632 +2.6859 +0.2373
bn5a_branch2c/beta:0 (2048,) -1.8489 +1.4639 +0.2354
bn5a_branch2c/moving_mean:0 (2048,) -0.4234 +0.5529 +0.0570
bn5a_branch2c/moving_variance:0 (2048,) +0.0023 +0.1430 +0.0061
bn5a_branch1/gamma:0 (2048,) +0.8580 +4.8497 +0.5097
bn5a_branch1/beta:0 (2048,) -1.8488 +1.4641 +0.2354
bn5a_branch1/moving_mean:0 (2048,) -8.1453 +5.4726 +1.0711
bn5a_branch1/moving_variance:0 (2048,) +0.2737 +5.2058 +0.3888
res5b_branch2a/kernel:0 (1, 1, 2048, 512) -0.1497 +0.2567 +0.0107
res5b_branch2a/bias:0 (512,) -0.0015 +0.0040 +0.0004
bn5b_branch2a/gamma:0 (512,) +0.3823 +1.1338 +0.0958
bn5b_branch2a/beta:0 (512,) -1.1818 +0.6068 +0.1976
bn5b_branch2a/moving_mean:0 (512,) -3.0110 +4.9119 +0.5962
bn5b_branch2a/moving_variance:0 (512,) +0.5778 +4.9932 +0.4898
res5b_branch2b/kernel:0 (3, 3, 512, 512) -0.1338 +0.2394 +0.0079
res5b_branch2b/bias:0 (512,) -0.0105 +0.0073 +0.0013
bn5b_branch2b/gamma:0 (512,) +0.5321 +1.1694 +0.1049
bn5b_branch2b/beta:0 (512,) -1.8375 +0.5450 +0.2784
bn5b_branch2b/moving_mean:0 (512,) -1.2009 +1.5489 +0.2116
bn5b_branch2b/moving_variance:0 (512,) +0.0530 +0.6918 +0.0500
res5b_branch2c/kernel:0 (1, 1, 512, 2048) -0.1345 +0.1962 +0.0106
res5b_branch2c/bias:0 (2048,) -0.0181 +0.0196 +0.0018
bn5b_branch2c/gamma:0 (2048,) +0.5622 +2.4065 +0.2234
bn5b_branch2c/beta:0 (2048,) -2.3543 +0.1656 +0.2122
bn5b_branch2c/moving_mean:0 (2048,) -0.2994 +0.9054 +0.0436
bn5b_branch2c/moving_variance:0 (2048,) +0.0017 +0.1444 +0.0042
res5c_branch2a/kernel:0 (1, 1, 2048, 512) -0.1803 +0.3580 +0.0115
res5c_branch2a/bias:0 (512,) -0.0037 +0.0055 +0.0005
bn5c_branch2a/gamma:0 (512,) +0.1743 +1.1331 +0.0973
bn5c_branch2a/beta:0 (512,) -1.3940 +0.9286 +0.2532
bn5c_branch2a/moving_mean:0 (512,) -1.6788 +4.0125 +0.4136
bn5c_branch2a/moving_variance:0 (512,) +0.4097 +6.2837 +0.4842
res5c_branch2b/kernel:0 (3, 3, 512, 512) -0.0940 +0.0945 +0.0071
res5c_branch2b/bias:0 (512,) -0.0092 +0.0111 +0.0019
bn5c_branch2b/gamma:0 (512,) +0.4880 +1.1432 +0.0915
bn5c_branch2b/beta:0 (512,) -1.4251 +0.3417 +0.2747
bn5c_branch2b/moving_mean:0 (512,) -0.6788 +0.0926 +0.1057
bn5c_branch2b/moving_variance:0 (512,) +0.0341 +0.2969 +0.0293
res5c_branch2c/kernel:0 (1, 1, 512, 2048) -0.1305 +0.1288 +0.0103
res5c_branch2c/bias:0 (2048,) -0.0031 +0.0070 +0.0011
bn5c_branch2c/gamma:0 (2048,) +0.5999 +2.5360 +0.2191
bn5c_branch2c/beta:0 (2048,) -4.0101 -0.6658 +0.2211
bn5c_branch2c/moving_mean:0 (2048,) -0.2560 +0.1848 +0.0335
bn5c_branch2c/moving_variance:0 (2048,) +0.0021 +0.0345 +0.0027
fpn_c5p5/kernel:0 (1, 1, 2048, 256) -0.0663 +0.0641 +0.0080
fpn_c5p5/bias:0 (256,) -0.0149 +0.0132 +0.0053
fpn_c4p4/kernel:0 (1, 1, 1024, 256) -0.1159 +0.0827 +0.0101
fpn_c4p4/bias:0 (256,) -0.0049 +0.0042 +0.0015
fpn_c3p3/kernel:0 (1, 1, 512, 256) -0.0507 +0.0581 +0.0076
fpn_c3p3/bias:0 (256,) -0.0064 +0.0058 +0.0021
fpn_c2p2/kernel:0 (1, 1, 256, 256) -0.0370 +0.0534 +0.0064
fpn_c2p2/bias:0 (256,) -0.0056 +0.0067 +0.0021
fpn_p5/kernel:0 (3, 3, 256, 256) -0.0349 +0.0396 +0.0059
fpn_p5/bias:0 (256,) -0.0089 +0.0080 +0.0038
fpn_p2/kernel:0 (3, 3, 256, 256) -0.0302 +0.0326 +0.0055
fpn_p2/bias:0 (256,) -0.0069 +0.0060 +0.0023
fpn_p3/kernel:0 (3, 3, 256, 256) -0.0248 +0.0277 +0.0048
fpn_p3/bias:0 (256,) -0.0040 +0.0041 +0.0015
fpn_p4/kernel:0 (3, 3, 256, 256) -0.0274 +0.0281 +0.0051
fpn_p4/bias:0 (256,) -0.0041 +0.0038 +0.0017
rpn_conv_shared/kernel:0 (3, 3, 256, 512) -0.0407 +0.0367 +0.0027
rpn_conv_shared/bias:0 (512,) -0.0049 +0.0074 +0.0012
rpn_class_raw/kernel:0 (1, 1, 512, 6) -0.1370 +0.1370 +0.0170
rpn_class_raw/bias:0 (6,) -0.0246 +0.0246 +0.0162
rpn_bbox_pred/kernel:0 (1, 1, 512, 12) -0.1086 +0.2623 +0.0242
rpn_bbox_pred/bias:0 (12,) -0.0504 +0.0705 +0.0348
mrcnn_class_conv1/kernel:0 (7, 7, 256, 1024) -0.0270 +0.0269 +0.0033
mrcnn_class_conv1/bias:0 (1024,) -0.0016 +0.0005 +0.0003
mrcnn_class_bn1/gamma:0 (1024,) +0.9535 +1.0465 +0.0092
mrcnn_class_bn1/beta:0 (1024,) -0.0325 +0.0055 +0.0036
mrcnn_class_bn1/moving_mean:0 (1024,) -13.7297 +4.7460 +1.4822
mrcnn_class_bn1/moving_variance:0 (1024,) +1.6954 +30.5685 +2.7805
mrcnn_class_conv2/kernel:0 (1, 1, 1024, 1024) -0.0782 +0.0445 +0.0059
mrcnn_class_conv2/bias:0 (1024,) -0.0212 +0.0286 +0.0053
mrcnn_class_bn2/gamma:0 (1024,) +0.9765 +1.0521 +0.0101
mrcnn_class_bn2/beta:0 (1024,) -0.0147 +0.0292 +0.0045
mrcnn_class_bn2/moving_mean:0 (1024,) -0.7209 +0.9867 +0.1617
mrcnn_class_bn2/moving_variance:0 (1024,) +0.0094 +0.3570 +0.0415
mrcnn_class_logits/kernel:0 (1024, 2) -0.0778 +0.0790 +0.0439
mrcnn_class_logits/bias:0 (2,) -0.0004 +0.0004 +0.0004
mrcnn_bbox_fc/kernel:0 (1024, 8) -0.0771 +0.0770 +0.0432
mrcnn_bbox_fc/bias:0 (8,) -0.0006 +0.0004 +0.0003
mrcnn_mask_conv1/kernel:0 (3, 3, 256, 256) -0.0690 +0.0600 +0.0050
mrcnn_mask_conv1/bias:0 (256,) -0.0033 +0.0034 +0.0010
mrcnn_mask_bn1/gamma:0 (256,) +0.9733 +1.1360 +0.0178
mrcnn_mask_bn1/beta:0 (256,) -0.0205 +0.0023 +0.0036
mrcnn_mask_bn1/moving_mean:0 (256,) -2.4143 +1.1433 +0.5678
mrcnn_mask_bn1/moving_variance:0 (256,) +0.6905 +4.3261 +0.5650
mrcnn_mask_conv2/kernel:0 (3, 3, 256, 256) -0.0757 +0.1481 +0.0050
mrcnn_mask_conv2/bias:0 (256,) -0.0069 +0.0069 +0.0024
mrcnn_mask_bn2/gamma:0 (256,) +0.9765 +1.0491 +0.0119
mrcnn_mask_bn2/beta:0 (256,) -0.0182 +0.0021 +0.0035
mrcnn_mask_bn2/moving_mean:0 (256,) -0.6003 +0.1184 +0.1236
mrcnn_mask_bn2/moving_variance:0 (256,) +0.0437 +0.4538 +0.0551
mrcnn_mask_conv3/kernel:0 (3, 3, 256, 256) -0.0550 +0.0605 +0.0048
mrcnn_mask_conv3/bias:0 (256,) -0.0118 +0.0102 +0.0038
mrcnn_mask_bn3/gamma:0 (256,) +0.9797 +1.0433 +0.0099
mrcnn_mask_bn3/beta:0 (256,) -0.0307 +0.0014 +0.0046
mrcnn_mask_bn3/moving_mean:0 (256,) -0.5812 +0.2401 +0.1452
mrcnn_mask_bn3/moving_variance:0 (256,) +0.0137 +0.3976 +0.0564
mrcnn_mask_conv4/kernel:0 (3, 3, 256, 256) -0.0425 +0.0374 +0.0043
mrcnn_mask_conv4/bias:0 (256,) -0.0022 +0.0051 +0.0010
mrcnn_mask_bn4/gamma:0 (256,) +0.9905 +1.0759 +0.0209
mrcnn_mask_bn4/beta:0 (256,) +0.0045 +0.0464 +0.0113
mrcnn_mask_bn4/moving_mean:0 (256,) -0.1459 +0.6716 +0.1304
mrcnn_mask_bn4/moving_variance:0 (256,) +0.0331 +0.2902 +0.0494
mrcnn_mask_deconv/kernel:0 (2, 2, 256, 256) -0.0395 +0.0652 +0.0057
mrcnn_mask_deconv/bias:0 (256,) -0.0030 +0.0770 +0.0112
mrcnn_mask/kernel:0 (1, 1, 256, 2) -0.1647 +0.1596 +0.0868
mrcnn_mask/bias:0 (2,) +0.0000 +0.0048 +0.0024
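The table produced by visualize.display_weight_stats also flags suspicious tensors (e.g. the possible overflow in bn_conv1/moving_variance above). The same statistics can be computed directly from the underlying Keras model; the snippet below is a minimal sketch of that idea and is not part of the original notebook.

from keras import backend as K

for w in model.keras_model.weights:
    v = K.get_value(w)
    # name, shape, min, max and standard deviation of every weight tensor
    print("{:45} {!s:20} {:+.4f} {:+.4f} {:+.4f}".format(
        w.name, v.shape, v.min(), v.max(), v.std()))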

Prediction on a Random Validation Image

image_id = random.choice(dataset.image_ids)
image, image_meta, gt_class_id, gt_bbox, gt_mask =\
    modellib.load_image_gt(dataset, config, image_id, use_mini_mask=False)
info = dataset.image_info[image_id]
print("image ID: {}.{} ({}) {}".format(info["source"], info["id"], image_id, 
                                       dataset.image_reference(image_id)))
# Run object detection
results = model.detect([image], verbose=1)
# Display results
ax = get_ax(1)
r = results[0]
visualize.display_instances(image, r['rois'], r['masks'], r['class_ids'], 
                            dataset.class_names, r['scores'], ax=ax,
                            title="Predictions")
log("gt_class_id", gt_class_id)
log("gt_bbox", gt_bbox)
log("gt_mask", gt_mask)
print('The car has:{} damages'.format(len(dataset.image_info[image_id]['polygons'])))
image ID: scratch.image53.jpeg (2) C:/Users/Sourish/Mask_RCNN/custom/val\image53.jpeg
Processing 1 images
image                    shape: (1024, 1024, 3)       min:    0.00000  max:  255.00000  uint8
molded_images            shape: (1, 1024, 1024, 3)    min: -123.70000  max:  151.10000  float64
image_metas              shape: (1, 14)               min:    0.00000  max: 1024.00000  int32
anchors                  shape: (1, 261888, 4)        min:   -0.35390  max:    1.29134  float32
gt_class_id              shape: (2,)                  min:    1.00000  max:    1.00000  int32
gt_bbox                  shape: (2, 4)                min:  315.00000  max:  728.00000  int32
gt_mask                  shape: (1024, 1024, 2)       min:    0.00000  max:    1.00000  bool
The car has:2 damages

Mask_RCNN_car_damage_prediction_23_1.png

image_id = random.choice(dataset.image_ids)
image, image_meta, gt_class_id, gt_bbox, gt_mask =\
    modellib.load_image_gt(dataset, config, image_id, use_mini_mask=False)
info = dataset.image_info[image_id]
print("image ID: {}.{} ({}) {}".format(info["source"], info["id"], image_id, 
                                       dataset.image_reference(image_id)))
# Run object detection
results = model.detect([image], verbose=1)
# Display results
ax = get_ax(1)
r = results[0]
visualize.display_instances(image, r['rois'], r['masks'], r['class_ids'], 
                            dataset.class_names, r['scores'], ax=ax,
                            title="Predictions")
log("gt_class_id", gt_class_id)
log("gt_bbox", gt_bbox)
log("gt_mask", gt_mask)
print('The car has:{} damages'.format(len(dataset.image_info[image_id]['polygons'])))
image ID: scratch.image52.jpeg (1) C:/Users/Sourish/Mask_RCNN/custom/val\image52.jpeg
Processing 1 images
image                    shape: (1024, 1024, 3)       min:    0.00000  max:  255.00000  uint8
molded_images            shape: (1, 1024, 1024, 3)    min: -123.70000  max:  141.10000  float64
image_metas              shape: (1, 14)               min:    0.00000  max: 1024.00000  int32
anchors                  shape: (1, 261888, 4)        min:   -0.35390  max:    1.29134  float32
gt_class_id              shape: (1,)                  min:    1.00000  max:    1.00000  int32
gt_bbox                  shape: (1, 4)                min:  272.00000  max:  930.00000  int32
gt_mask                  shape: (1024, 1024, 1)       min:    0.00000  max:    1.00000  bool
The car has:1 damages

Mask_RCNN_car_damage_prediction_25_1.png

These are pretty decent predictions, considering the model was trained with only 49 images for 15 epochs.
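As a next step towards the quantification goal described in the abstract, the predicted masks can be reduced to simple per-instance area estimates. The helper below is an illustrative sketch (an assumption, not part of the trained pipeline), using the result dictionary r returned by model.detect above.

def damage_area_fractions(result, image_shape):
    # result['masks'] has shape [height, width, num_instances]
    masks = result['masks']
    image_area = image_shape[0] * image_shape[1]
    return [float(masks[:, :, i].sum()) / image_area for i in range(masks.shape[-1])]

# e.g. damage_area_fractions(r, image.shape) gives one area fraction per detected scratch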