In Faster R-CNN, the Stage 1 Fast R-CNN phase always produces bbox_loss = -nan. What could be the cause? The code/log is pasted below. I'm using the ZF network, and the iteration counts are set very small just for testing.
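
For reference, here is a minimal check I plan to run myself (my own sketch, not part of py-faster-rcnn) against the cached gt roidb and the stage-1 proposals pickle. It assumes the proposals .pkl holds a list of per-image (N, 4) box arrays and that each roidb entry keeps its ground-truth boxes under a 'boxes' key (my reading of the repo layout); it only looks for non-finite values and inverted boxes, which I understand can drive the smooth-L1 bbox loss to NaN:

    # Diagnostic sketch (Python 2, to match py-faster-rcnn). Not part of the repo.
    import cPickle as pickle
    import numpy as np

    # Paths are relative to the ROOT_DIR printed in the log below.
    gt_roidb_pkl = 'data/cache/voc_2007_trainval_gt_roidb.pkl'
    proposals_pkl = 'output/default/voc_2007_trainval/zf_rpn_stage1_iter_100_proposals.pkl'

    def check_boxes(name, boxes):
        # Count non-finite coordinates and boxes whose corners are inverted
        # (x2 < x1 or y2 < y1), assuming [x1, y1, x2, y2] ordering.
        boxes = np.asarray(boxes, dtype=np.float64)
        if boxes.size == 0:
            print '%s: no boxes' % name
            return
        non_finite = np.count_nonzero(~np.isfinite(boxes))
        inverted = np.count_nonzero((boxes[:, 2] < boxes[:, 0]) |
                                    (boxes[:, 3] < boxes[:, 1]))
        print '%s: %d boxes, %d non-finite values, %d inverted boxes' % (
            name, boxes.shape[0], non_finite, inverted)

    # Assumption: gt roidb is a list of dicts with a 'boxes' field,
    # and the proposals pickle is a list of per-image box arrays.
    with open(gt_roidb_pkl, 'rb') as f:
        for i, entry in enumerate(pickle.load(f)):
            check_boxes('gt %d' % i, entry['boxes'])

    with open(proposals_pkl, 'rb') as f:
        for i, boxes in enumerate(pickle.load(f)):
            check_boxes('proposals %d' % i, boxes)

The full log follows.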

Logging output to /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/experiments/logs/faster_rcnn_alt_opt_qwl1.txt
+ cd /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/tools/

real    0m0.003s
user    0m0.000s
sys    0m0.004s
+ python train_faster_rcnn_alt_opt.py --net_name ZF
Called with args:
Namespace(cfg_file=None, imdb_name='voc_2007_trainval', net_name='ZF', pretrained_model=None, set_cfgs=None)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Stage 1 RPN, init from ImageNet model
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Init model: None
Using config:
{'DATA_DIR': '/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/data',
 'DEDUP_BOXES': 0.0625,
 'EPS': 1e-14,
 'EXP_DIR': 'default',
 'GPU_ID': 0,
 'MATLAB': 'matlab',
 'MODELS_DIR': '/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/models/pascal_voc',
 'PIXEL_MEANS': array([[[ 102.9801,  115.9465,  122.7717]]]),
 'RNG_SEED': 3,
 'ROOT_DIR': '/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master',
 'TEST': {'BBOX_REG': True,
          'HAS_RPN': False,
          'MAX_SIZE': 1000,
          'NMS': 0.3,
          'PROPOSAL_METHOD': 'selective_search',
          'RPN_MIN_SIZE': 16,
          'RPN_NMS_THRESH': 0.7,
          'RPN_POST_NMS_TOP_N': 300,
          'RPN_PRE_NMS_TOP_N': 6000,
          'SCALES': [48],
          'SVM': False},
 'TRAIN': {'ASPECT_GROUPING': True,
           'BATCH_SIZE': 128,
           'BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
           'BBOX_NORMALIZE_MEANS': [0.0, 0.0, 0.0, 0.0],
           'BBOX_NORMALIZE_STDS': [0.1, 0.1, 0.2, 0.2],
           'BBOX_NORMALIZE_TARGETS': True,
           'BBOX_NORMALIZE_TARGETS_PRECOMPUTED': False,
           'BBOX_REG': False,
           'BBOX_THRESH': 0.5,
           'BG_THRESH_HI': 0.5,
           'BG_THRESH_LO': 0.1,
           'FG_FRACTION': 0.25,
           'FG_THRESH': 0.5,
           'HAS_RPN': True,
           'IMS_PER_BATCH': 1,
           'MAX_SIZE': 700,
           'PROPOSAL_METHOD': 'gt',
           'RPN_BATCHSIZE': 256,
           'RPN_BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
           'RPN_CLOBBER_POSITIVES': False,
           'RPN_FG_FRACTION': 0.5,
           'RPN_MIN_SIZE': 16,
           'RPN_NEGATIVE_OVERLAP': 0.3,
           'RPN_NMS_THRESH': 0.7,
           'RPN_POSITIVE_OVERLAP': 0.7,
           'RPN_POSITIVE_WEIGHT': -1.0,
           'RPN_POST_NMS_TOP_N': 2000,
           'RPN_PRE_NMS_TOP_N': 12000,
           'SCALES': [256],
           'SNAPSHOT_INFIX': 'stage1',
           'SNAPSHOT_ITERS': 10000,
           'USE_FLIPPED': True,
           'USE_PREFETCH': False},
 'USE_GPU_NMS': False}
Loaded dataset `voc_2007_trainval` for training
Set proposal method: gt
Appending horizontally-flipped training examples...
wrote gt roidb to /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/data/cache/voc_2007_trainval_gt_roidb.pkl
done
Preparing training data...
done
roidb len: 418
Output will be saved to `/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/output/default/voc_2007_trainval`
Filtered 0 roidb entries: 418 -> 418
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0109 00:06:19.632952  3343 solver.cpp:48] Initializing solver from parameters: 
train_net: "/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/models/pascal_voc/ZF/faster_rcnn_alt_opt/stage1_rpn_train.pt"
base_lr: 0
display: 20
lr_policy: "step"
gamma: 0.1
momentum: 0.9
weight_decay: 0.0005
stepsize: 400
snapshot: 0
snapshot_prefix: "zf_rpn"
average_loss: 100
I0109 00:06:19.632994  3343 solver.cpp:81] Creating training net from train_net file: /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/models/pascal_voc/ZF/faster_rcnn_alt_opt/stage1_rpn_train.pt
I0109 00:06:19.633535  3343 net.cpp:49] Initializing net from parameters: 
name: "ZF"
state {
  phase: TRAIN
}
layer {
  name: "input-data"
  type: "Python"
  top: "data"
  top: "im_info"
  top: "gt_boxes"
  python_param {
    module: "roi_data_layer.layer"
    layer: "RoIDataLayer"
    param_str: "\'num_classes\': 2"
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 96
    pad: 3
    kernel_size: 7
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "conv1"
  top: "norm1"
  lrn_param {
    local_size: 3
    alpha: 5e-05
    beta: 0.75
    norm_region: WITHIN_CHANNEL
    engine: CAFFE
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "norm1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
    pad: 1
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    stride: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "conv2"
  top: "norm2"
  lrn_param {
    local_size: 3
    alpha: 5e-05
    beta: 0.75
    norm_region: WITHIN_CHANNEL
    engine: CAFFE
  }
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "norm2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
    pad: 1
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "rpn_conv1"
  type: "Convolution"
  bottom: "conv5"
  top: "rpn_conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "rpn_relu1"
  type: "ReLU"
  bottom: "rpn_conv1"
  top: "rpn_conv1"
}
layer {
  name: "rpn_cls_score"
  type: "Convolution"
  bottom: "rpn_conv1"
  top: "rpn_cls_score"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 18
    pad: 0
    kernel_size: 1
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "rpn_bbox_pred"
  type: "Convolution"
  bottom: "rpn_conv1"
  top: "rpn_bbox_pred"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 36
    pad: 0
    kernel_size: 1
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "rpn_cls_score_reshape"
  type: "Reshape"
  bottom: "rpn_cls_score"
  top: "rpn_cls_score_reshape"
  reshape_param {
    shape {
      dim: 0
      dim: 2
      dim: -1
      dim: 0
    }
  }
}
layer {
  name: "rpn-data"
  type: "Python"
  bottom: "rpn_cls_score"
  bottom: "gt_boxes"
  bottom: "im_info"
  bottom: "data"
  top: "rpn_labels"
  top: "rpn_bbox_targets"
  top: "rpn_bbox_inside_weights"
  top: "rpn_bbox_outside_weights"
  python_param {
    module: "rpn.anchor_target_layer"
    layer: "AnchorTargetLayer"
    param_str: "\'feat_stride\': 16"
  }
}
layer {
  name: "rpn_loss_cls"
  type: "SoftmaxWithLoss"
  bottom: "rpn_cls_score_reshape"
  bottom: "rpn_labels"
  top: "rpn_cls_loss"
  loss_weight: 1
  propagate_down: true
  propagate_down: false
  loss_param {
    ignore_label: -1
    normalize: true
  }
}
layer {
  name: "rpn_loss_bbox"
  type: "SmoothL1Loss"
  bottom: "rpn_bbox_pred"
  bottom: "rpn_bbox_targets"
  bottom: "rpn_bbox_inside_weights"
  bottom: "rpn_bbox_outside_weights"
  top: "rpn_loss_bbox"
  loss_weight: 1
  smooth_l1_loss_param {
    sigma: 3
  }
}
layer {
  name: "dummy_roi_pool_conv5"
  type: "DummyData"
  top: "dummy_roi_pool_conv5"
  dummy_data_param {
    data_filler {
      type: "gaussian"
      std: 0.01
    }
    shape {
      dim: 1
      dim: 9216
    }
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "dummy_roi_pool_conv5"
  top: "fc6"
  param {
    lr_mult: 0
    decay_mult: 0
  }
  param {
    lr_mult: 0
    decay_mult: 0
  }
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  param {
    lr_mult: 0
    decay_mult: 0
  }
  param {
    lr_mult: 0
    decay_mult: 0
  }
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "silence_fc7"
  type: "Silence"
  bottom: "fc7"
}
I0109 00:06:19.633636  3343 layer_factory.hpp:77] Creating layer input-data
I0109 00:06:19.704708  3343 net.cpp:106] Creating Layer input-data
I0109 00:06:19.704731  3343 net.cpp:411] input-data -> data
I0109 00:06:19.704743  3343 net.cpp:411] input-data -> im_info
I0109 00:06:19.704751  3343 net.cpp:411] input-data -> gt_boxes
RoiDataLayer: name_to_top: {'gt_boxes': 2, 'data': 0, 'im_info': 1}
I0109 00:06:19.705282  3343 net.cpp:150] Setting up input-data
I0109 00:06:19.705297  3343 net.cpp:157] Top shape: 1 3 256 700 (537600)
I0109 00:06:19.705304  3343 net.cpp:157] Top shape: 1 3 (3)
I0109 00:06:19.705309  3343 net.cpp:157] Top shape: 1 4 (4)
I0109 00:06:19.705314  3343 net.cpp:165] Memory required for data: 2150428
I0109 00:06:19.705322  3343 layer_factory.hpp:77] Creating layer data_input-data_0_split
I0109 00:06:19.705333  3343 net.cpp:106] Creating Layer data_input-data_0_split
I0109 00:06:19.705339  3343 net.cpp:454] data_input-data_0_split <- data
I0109 00:06:19.705348  3343 net.cpp:411] data_input-data_0_split -> data_input-data_0_split_0
I0109 00:06:19.705358  3343 net.cpp:411] data_input-data_0_split -> data_input-data_0_split_1
I0109 00:06:19.705374  3343 net.cpp:150] Setting up data_input-data_0_split
I0109 00:06:19.705380  3343 net.cpp:157] Top shape: 1 3 256 700 (537600)
I0109 00:06:19.705386  3343 net.cpp:157] Top shape: 1 3 256 700 (537600)
I0109 00:06:19.705391  3343 net.cpp:165] Memory required for data: 6451228
I0109 00:06:19.705396  3343 layer_factory.hpp:77] Creating layer conv1
I0109 00:06:19.705409  3343 net.cpp:106] Creating Layer conv1
I0109 00:06:19.705412  3343 net.cpp:454] conv1 <- data_input-data_0_split_0
I0109 00:06:19.705420  3343 net.cpp:411] conv1 -> conv1
I0109 00:06:19.705508  3343 net.cpp:150] Setting up conv1
I0109 00:06:19.705516  3343 net.cpp:157] Top shape: 1 96 128 350 (4300800)
I0109 00:06:19.705520  3343 net.cpp:165] Memory required for data: 23654428
I0109 00:06:19.705534  3343 layer_factory.hpp:77] Creating layer relu1
I0109 00:06:19.705543  3343 net.cpp:106] Creating Layer relu1
I0109 00:06:19.705546  3343 net.cpp:454] relu1 <- conv1
I0109 00:06:19.705552  3343 net.cpp:397] relu1 -> conv1 (in-place)
I0109 00:06:19.705562  3343 net.cpp:150] Setting up relu1
I0109 00:06:19.705569  3343 net.cpp:157] Top shape: 1 96 128 350 (4300800)
I0109 00:06:19.705572  3343 net.cpp:165] Memory required for data: 40857628
I0109 00:06:19.705576  3343 layer_factory.hpp:77] Creating layer norm1
I0109 00:06:19.705586  3343 net.cpp:106] Creating Layer norm1
I0109 00:06:19.705590  3343 net.cpp:454] norm1 <- conv1
I0109 00:06:19.705595  3343 net.cpp:411] norm1 -> norm1
I0109 00:06:19.705631  3343 net.cpp:150] Setting up norm1
I0109 00:06:19.705639  3343 net.cpp:157] Top shape: 1 96 128 350 (4300800)
I0109 00:06:19.705643  3343 net.cpp:165] Memory required for data: 58060828
I0109 00:06:19.705647  3343 layer_factory.hpp:77] Creating layer pool1
I0109 00:06:19.705654  3343 net.cpp:106] Creating Layer pool1
I0109 00:06:19.705659  3343 net.cpp:454] pool1 <- norm1
I0109 00:06:19.705665  3343 net.cpp:411] pool1 -> pool1
I0109 00:06:19.705672  3343 net.cpp:150] Setting up pool1
I0109 00:06:19.705679  3343 net.cpp:157] Top shape: 1 96 65 176 (1098240)
I0109 00:06:19.705683  3343 net.cpp:165] Memory required for data: 62453788
I0109 00:06:19.705687  3343 layer_factory.hpp:77] Creating layer conv2
I0109 00:06:19.705696  3343 net.cpp:106] Creating Layer conv2
I0109 00:06:19.705701  3343 net.cpp:454] conv2 <- pool1
I0109 00:06:19.705708  3343 net.cpp:411] conv2 -> conv2
I0109 00:06:19.706637  3343 net.cpp:150] Setting up conv2
I0109 00:06:19.706646  3343 net.cpp:157] Top shape: 1 256 33 88 (743424)
I0109 00:06:19.706650  3343 net.cpp:165] Memory required for data: 65427484
I0109 00:06:19.706658  3343 layer_factory.hpp:77] Creating layer relu2
I0109 00:06:19.706666  3343 net.cpp:106] Creating Layer relu2
I0109 00:06:19.706671  3343 net.cpp:454] relu2 <- conv2
I0109 00:06:19.706677  3343 net.cpp:397] relu2 -> conv2 (in-place)
I0109 00:06:19.706686  3343 net.cpp:150] Setting up relu2
I0109 00:06:19.706691  3343 net.cpp:157] Top shape: 1 256 33 88 (743424)
I0109 00:06:19.706694  3343 net.cpp:165] Memory required for data: 68401180
I0109 00:06:19.706698  3343 layer_factory.hpp:77] Creating layer norm2
I0109 00:06:19.706707  3343 net.cpp:106] Creating Layer norm2
I0109 00:06:19.706712  3343 net.cpp:454] norm2 <- conv2
I0109 00:06:19.706718  3343 net.cpp:411] norm2 -> norm2
I0109 00:06:19.706737  3343 net.cpp:150] Setting up norm2
I0109 00:06:19.706744  3343 net.cpp:157] Top shape: 1 256 33 88 (743424)
I0109 00:06:19.706748  3343 net.cpp:165] Memory required for data: 71374876
I0109 00:06:19.706753  3343 layer_factory.hpp:77] Creating layer pool2
I0109 00:06:19.706759  3343 net.cpp:106] Creating Layer pool2
I0109 00:06:19.706763  3343 net.cpp:454] pool2 <- norm2
I0109 00:06:19.706770  3343 net.cpp:411] pool2 -> pool2
I0109 00:06:19.706779  3343 net.cpp:150] Setting up pool2
I0109 00:06:19.706785  3343 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:06:19.706790  3343 net.cpp:165] Memory required for data: 72158236
I0109 00:06:19.706794  3343 layer_factory.hpp:77] Creating layer conv3
I0109 00:06:19.706800  3343 net.cpp:106] Creating Layer conv3
I0109 00:06:19.706804  3343 net.cpp:454] conv3 <- pool2
I0109 00:06:19.706812  3343 net.cpp:411] conv3 -> conv3
I0109 00:06:19.708349  3343 net.cpp:150] Setting up conv3
I0109 00:06:19.708359  3343 net.cpp:157] Top shape: 1 384 17 45 (293760)
I0109 00:06:19.708361  3343 net.cpp:165] Memory required for data: 73333276
I0109 00:06:19.708369  3343 layer_factory.hpp:77] Creating layer relu3
I0109 00:06:19.708376  3343 net.cpp:106] Creating Layer relu3
I0109 00:06:19.708381  3343 net.cpp:454] relu3 <- conv3
I0109 00:06:19.708387  3343 net.cpp:397] relu3 -> conv3 (in-place)
I0109 00:06:19.708395  3343 net.cpp:150] Setting up relu3
I0109 00:06:19.708401  3343 net.cpp:157] Top shape: 1 384 17 45 (293760)
I0109 00:06:19.708405  3343 net.cpp:165] Memory required for data: 74508316
I0109 00:06:19.708406  3343 layer_factory.hpp:77] Creating layer conv4
I0109 00:06:19.708415  3343 net.cpp:106] Creating Layer conv4
I0109 00:06:19.708418  3343 net.cpp:454] conv4 <- conv3
I0109 00:06:19.708425  3343 net.cpp:411] conv4 -> conv4
I0109 00:06:19.710436  3343 net.cpp:150] Setting up conv4
I0109 00:06:19.710448  3343 net.cpp:157] Top shape: 1 384 17 45 (293760)
I0109 00:06:19.710451  3343 net.cpp:165] Memory required for data: 75683356
I0109 00:06:19.710458  3343 layer_factory.hpp:77] Creating layer relu4
I0109 00:06:19.710467  3343 net.cpp:106] Creating Layer relu4
I0109 00:06:19.710472  3343 net.cpp:454] relu4 <- conv4
I0109 00:06:19.710479  3343 net.cpp:397] relu4 -> conv4 (in-place)
I0109 00:06:19.710487  3343 net.cpp:150] Setting up relu4
I0109 00:06:19.710494  3343 net.cpp:157] Top shape: 1 384 17 45 (293760)
I0109 00:06:19.710497  3343 net.cpp:165] Memory required for data: 76858396
I0109 00:06:19.710501  3343 layer_factory.hpp:77] Creating layer conv5
I0109 00:06:19.710510  3343 net.cpp:106] Creating Layer conv5
I0109 00:06:19.710515  3343 net.cpp:454] conv5 <- conv4
I0109 00:06:19.710520  3343 net.cpp:411] conv5 -> conv5
I0109 00:06:19.711848  3343 net.cpp:150] Setting up conv5
I0109 00:06:19.711882  3343 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:06:19.711884  3343 net.cpp:165] Memory required for data: 77641756
I0109 00:06:19.711901  3343 layer_factory.hpp:77] Creating layer relu5
I0109 00:06:19.711918  3343 net.cpp:106] Creating Layer relu5
I0109 00:06:19.711923  3343 net.cpp:454] relu5 <- conv5
I0109 00:06:19.711930  3343 net.cpp:397] relu5 -> conv5 (in-place)
I0109 00:06:19.711941  3343 net.cpp:150] Setting up relu5
I0109 00:06:19.711946  3343 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:06:19.711951  3343 net.cpp:165] Memory required for data: 78425116
I0109 00:06:19.711953  3343 layer_factory.hpp:77] Creating layer rpn_conv1
I0109 00:06:19.711972  3343 net.cpp:106] Creating Layer rpn_conv1
I0109 00:06:19.711977  3343 net.cpp:454] rpn_conv1 <- conv5
I0109 00:06:19.711982  3343 net.cpp:411] rpn_conv1 -> rpn_conv1
I0109 00:06:19.723140  3343 net.cpp:150] Setting up rpn_conv1
I0109 00:06:19.723172  3343 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:06:19.723176  3343 net.cpp:165] Memory required for data: 79208476
I0109 00:06:19.723183  3343 layer_factory.hpp:77] Creating layer rpn_relu1
I0109 00:06:19.723192  3343 net.cpp:106] Creating Layer rpn_relu1
I0109 00:06:19.723196  3343 net.cpp:454] rpn_relu1 <- rpn_conv1
I0109 00:06:19.723202  3343 net.cpp:397] rpn_relu1 -> rpn_conv1 (in-place)
I0109 00:06:19.723211  3343 net.cpp:150] Setting up rpn_relu1
I0109 00:06:19.723214  3343 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:06:19.723217  3343 net.cpp:165] Memory required for data: 79991836
I0109 00:06:19.723218  3343 layer_factory.hpp:77] Creating layer rpn_conv1_rpn_relu1_0_split
I0109 00:06:19.723225  3343 net.cpp:106] Creating Layer rpn_conv1_rpn_relu1_0_split
I0109 00:06:19.723227  3343 net.cpp:454] rpn_conv1_rpn_relu1_0_split <- rpn_conv1
I0109 00:06:19.723232  3343 net.cpp:411] rpn_conv1_rpn_relu1_0_split -> rpn_conv1_rpn_relu1_0_split_0
I0109 00:06:19.723239  3343 net.cpp:411] rpn_conv1_rpn_relu1_0_split -> rpn_conv1_rpn_relu1_0_split_1
I0109 00:06:19.723254  3343 net.cpp:150] Setting up rpn_conv1_rpn_relu1_0_split
I0109 00:06:19.723258  3343 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:06:19.723263  3343 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:06:19.723268  3343 net.cpp:165] Memory required for data: 81558556
I0109 00:06:19.723270  3343 layer_factory.hpp:77] Creating layer rpn_cls_score
I0109 00:06:19.723284  3343 net.cpp:106] Creating Layer rpn_cls_score
I0109 00:06:19.723287  3343 net.cpp:454] rpn_cls_score <- rpn_conv1_rpn_relu1_0_split_0
I0109 00:06:19.723296  3343 net.cpp:411] rpn_cls_score -> rpn_cls_score
I0109 00:06:19.723414  3343 net.cpp:150] Setting up rpn_cls_score
I0109 00:06:19.723428  3343 net.cpp:157] Top shape: 1 18 17 45 (13770)
I0109 00:06:19.723433  3343 net.cpp:165] Memory required for data: 81613636
I0109 00:06:19.723438  3343 layer_factory.hpp:77] Creating layer rpn_cls_score_rpn_cls_score_0_split
I0109 00:06:19.723450  3343 net.cpp:106] Creating Layer rpn_cls_score_rpn_cls_score_0_split
I0109 00:06:19.723453  3343 net.cpp:454] rpn_cls_score_rpn_cls_score_0_split <- rpn_cls_score
I0109 00:06:19.723458  3343 net.cpp:411] rpn_cls_score_rpn_cls_score_0_split -> rpn_cls_score_rpn_cls_score_0_split_0
I0109 00:06:19.723462  3343 net.cpp:411] rpn_cls_score_rpn_cls_score_0_split -> rpn_cls_score_rpn_cls_score_0_split_1
I0109 00:06:19.723467  3343 net.cpp:150] Setting up rpn_cls_score_rpn_cls_score_0_split
I0109 00:06:19.723471  3343 net.cpp:157] Top shape: 1 18 17 45 (13770)
I0109 00:06:19.723474  3343 net.cpp:157] Top shape: 1 18 17 45 (13770)
I0109 00:06:19.723476  3343 net.cpp:165] Memory required for data: 81723796
I0109 00:06:19.723479  3343 layer_factory.hpp:77] Creating layer rpn_bbox_pred
I0109 00:06:19.723489  3343 net.cpp:106] Creating Layer rpn_bbox_pred
I0109 00:06:19.723493  3343 net.cpp:454] rpn_bbox_pred <- rpn_conv1_rpn_relu1_0_split_1
I0109 00:06:19.723500  3343 net.cpp:411] rpn_bbox_pred -> rpn_bbox_pred
I0109 00:06:19.723692  3343 net.cpp:150] Setting up rpn_bbox_pred
I0109 00:06:19.723701  3343 net.cpp:157] Top shape: 1 36 17 45 (27540)
I0109 00:06:19.723703  3343 net.cpp:165] Memory required for data: 81833956
I0109 00:06:19.723711  3343 layer_factory.hpp:77] Creating layer rpn_cls_score_reshape
I0109 00:06:19.723718  3343 net.cpp:106] Creating Layer rpn_cls_score_reshape
I0109 00:06:19.723723  3343 net.cpp:454] rpn_cls_score_reshape <- rpn_cls_score_rpn_cls_score_0_split_0
I0109 00:06:19.723732  3343 net.cpp:411] rpn_cls_score_reshape -> rpn_cls_score_reshape
I0109 00:06:19.723744  3343 net.cpp:150] Setting up rpn_cls_score_reshape
I0109 00:06:19.723752  3343 net.cpp:157] Top shape: 1 2 153 45 (13770)
I0109 00:06:19.723755  3343 net.cpp:165] Memory required for data: 81889036
I0109 00:06:19.723759  3343 layer_factory.hpp:77] Creating layer rpn-data
I0109 00:06:19.726662  3343 net.cpp:106] Creating Layer rpn-data
I0109 00:06:19.726673  3343 net.cpp:454] rpn-data <- rpn_cls_score_rpn_cls_score_0_split_1
I0109 00:06:19.726680  3343 net.cpp:454] rpn-data <- gt_boxes
I0109 00:06:19.726685  3343 net.cpp:454] rpn-data <- im_info
I0109 00:06:19.726689  3343 net.cpp:454] rpn-data <- data_input-data_0_split_1
I0109 00:06:19.726697  3343 net.cpp:411] rpn-data -> rpn_labels
I0109 00:06:19.726707  3343 net.cpp:411] rpn-data -> rpn_bbox_targets
I0109 00:06:19.726718  3343 net.cpp:411] rpn-data -> rpn_bbox_inside_weights
I0109 00:06:19.726727  3343 net.cpp:411] rpn-data -> rpn_bbox_outside_weights
I0109 00:06:19.727392  3343 net.cpp:150] Setting up rpn-data
I0109 00:06:19.727406  3343 net.cpp:157] Top shape: 1 1 153 45 (6885)
I0109 00:06:19.727411  3343 net.cpp:157] Top shape: 1 36 17 45 (27540)
I0109 00:06:19.727416  3343 net.cpp:157] Top shape: 1 36 17 45 (27540)
I0109 00:06:19.727421  3343 net.cpp:157] Top shape: 1 36 17 45 (27540)
I0109 00:06:19.727425  3343 net.cpp:165] Memory required for data: 82247056
I0109 00:06:19.727432  3343 layer_factory.hpp:77] Creating layer rpn_loss_cls
I0109 00:06:19.727442  3343 net.cpp:106] Creating Layer rpn_loss_cls
I0109 00:06:19.727447  3343 net.cpp:454] rpn_loss_cls <- rpn_cls_score_reshape
I0109 00:06:19.727453  3343 net.cpp:454] rpn_loss_cls <- rpn_labels
I0109 00:06:19.727459  3343 net.cpp:411] rpn_loss_cls -> rpn_cls_loss
I0109 00:06:19.727468  3343 layer_factory.hpp:77] Creating layer rpn_loss_cls
I0109 00:06:19.727504  3343 net.cpp:150] Setting up rpn_loss_cls
I0109 00:06:19.727510  3343 net.cpp:157] Top shape: (1)
I0109 00:06:19.727514  3343 net.cpp:160]     with loss weight 1
I0109 00:06:19.727530  3343 net.cpp:165] Memory required for data: 82247060
I0109 00:06:19.727535  3343 layer_factory.hpp:77] Creating layer rpn_loss_bbox
I0109 00:06:19.727540  3343 net.cpp:106] Creating Layer rpn_loss_bbox
I0109 00:06:19.727545  3343 net.cpp:454] rpn_loss_bbox <- rpn_bbox_pred
I0109 00:06:19.727547  3343 net.cpp:454] rpn_loss_bbox <- rpn_bbox_targets
I0109 00:06:19.727551  3343 net.cpp:454] rpn_loss_bbox <- rpn_bbox_inside_weights
I0109 00:06:19.727556  3343 net.cpp:454] rpn_loss_bbox <- rpn_bbox_outside_weights
I0109 00:06:19.727565  3343 net.cpp:411] rpn_loss_bbox -> rpn_loss_bbox
I0109 00:06:19.727793  3343 net.cpp:150] Setting up rpn_loss_bbox
I0109 00:06:19.727800  3343 net.cpp:157] Top shape: (1)
I0109 00:06:19.727804  3343 net.cpp:160]     with loss weight 1
I0109 00:06:19.727809  3343 net.cpp:165] Memory required for data: 82247064
I0109 00:06:19.727813  3343 layer_factory.hpp:77] Creating layer dummy_roi_pool_conv5
I0109 00:06:19.727823  3343 net.cpp:106] Creating Layer dummy_roi_pool_conv5
I0109 00:06:19.727829  3343 net.cpp:411] dummy_roi_pool_conv5 -> dummy_roi_pool_conv5
I0109 00:06:19.727840  3343 net.cpp:150] Setting up dummy_roi_pool_conv5
I0109 00:06:19.727845  3343 net.cpp:157] Top shape: 1 9216 (9216)
I0109 00:06:19.727849  3343 net.cpp:165] Memory required for data: 82283928
I0109 00:06:19.727852  3343 layer_factory.hpp:77] Creating layer fc6
I0109 00:06:19.727859  3343 net.cpp:106] Creating Layer fc6
I0109 00:06:19.727864  3343 net.cpp:454] fc6 <- dummy_roi_pool_conv5
I0109 00:06:19.727867  3343 net.cpp:411] fc6 -> fc6
I0109 00:06:19.770541  3343 net.cpp:150] Setting up fc6
I0109 00:06:19.770612  3343 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:06:19.770617  3343 net.cpp:165] Memory required for data: 82300312
I0109 00:06:19.770642  3343 layer_factory.hpp:77] Creating layer relu6
I0109 00:06:19.770656  3343 net.cpp:106] Creating Layer relu6
I0109 00:06:19.770661  3343 net.cpp:454] relu6 <- fc6
I0109 00:06:19.770671  3343 net.cpp:397] relu6 -> fc6 (in-place)
I0109 00:06:19.770686  3343 net.cpp:150] Setting up relu6
I0109 00:06:19.770690  3343 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:06:19.770692  3343 net.cpp:165] Memory required for data: 82316696
I0109 00:06:19.770694  3343 layer_factory.hpp:77] Creating layer fc7
I0109 00:06:19.770701  3343 net.cpp:106] Creating Layer fc7
I0109 00:06:19.770704  3343 net.cpp:454] fc7 <- fc6
I0109 00:06:19.770712  3343 net.cpp:411] fc7 -> fc7
I0109 00:06:19.790748  3343 net.cpp:150] Setting up fc7
I0109 00:06:19.790848  3343 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:06:19.790851  3343 net.cpp:165] Memory required for data: 82333080
I0109 00:06:19.790870  3343 layer_factory.hpp:77] Creating layer silence_fc7
I0109 00:06:19.790885  3343 net.cpp:106] Creating Layer silence_fc7
I0109 00:06:19.790891  3343 net.cpp:454] silence_fc7 <- fc7
I0109 00:06:19.790900  3343 net.cpp:150] Setting up silence_fc7
I0109 00:06:19.790902  3343 net.cpp:165] Memory required for data: 82333080
I0109 00:06:19.790907  3343 net.cpp:228] silence_fc7 does not need backward computation.
I0109 00:06:19.790911  3343 net.cpp:228] fc7 does not need backward computation.
I0109 00:06:19.790915  3343 net.cpp:228] relu6 does not need backward computation.
I0109 00:06:19.790917  3343 net.cpp:228] fc6 does not need backward computation.
I0109 00:06:19.790920  3343 net.cpp:228] dummy_roi_pool_conv5 does not need backward computation.
I0109 00:06:19.790923  3343 net.cpp:226] rpn_loss_bbox needs backward computation.
I0109 00:06:19.790927  3343 net.cpp:226] rpn_loss_cls needs backward computation.
I0109 00:06:19.790931  3343 net.cpp:226] rpn-data needs backward computation.
I0109 00:06:19.790936  3343 net.cpp:226] rpn_cls_score_reshape needs backward computation.
I0109 00:06:19.790941  3343 net.cpp:226] rpn_bbox_pred needs backward computation.
I0109 00:06:19.790946  3343 net.cpp:226] rpn_cls_score_rpn_cls_score_0_split needs backward computation.
I0109 00:06:19.790948  3343 net.cpp:226] rpn_cls_score needs backward computation.
I0109 00:06:19.790951  3343 net.cpp:226] rpn_conv1_rpn_relu1_0_split needs backward computation.
I0109 00:06:19.790956  3343 net.cpp:226] rpn_relu1 needs backward computation.
I0109 00:06:19.790959  3343 net.cpp:226] rpn_conv1 needs backward computation.
I0109 00:06:19.790962  3343 net.cpp:226] relu5 needs backward computation.
I0109 00:06:19.790964  3343 net.cpp:226] conv5 needs backward computation.
I0109 00:06:19.790967  3343 net.cpp:226] relu4 needs backward computation.
I0109 00:06:19.790971  3343 net.cpp:226] conv4 needs backward computation.
I0109 00:06:19.790974  3343 net.cpp:226] relu3 needs backward computation.
I0109 00:06:19.790977  3343 net.cpp:226] conv3 needs backward computation.
I0109 00:06:19.790982  3343 net.cpp:226] pool2 needs backward computation.
I0109 00:06:19.790987  3343 net.cpp:226] norm2 needs backward computation.
I0109 00:06:19.790989  3343 net.cpp:226] relu2 needs backward computation.
I0109 00:06:19.790993  3343 net.cpp:226] conv2 needs backward computation.
I0109 00:06:19.790998  3343 net.cpp:226] pool1 needs backward computation.
I0109 00:06:19.791002  3343 net.cpp:226] norm1 needs backward computation.
I0109 00:06:19.791007  3343 net.cpp:226] relu1 needs backward computation.
I0109 00:06:19.791010  3343 net.cpp:226] conv1 needs backward computation.
I0109 00:06:19.791018  3343 net.cpp:228] data_input-data_0_split does not need backward computation.
I0109 00:06:19.791023  3343 net.cpp:228] input-data does not need backward computation.
I0109 00:06:19.791026  3343 net.cpp:270] This network produces output rpn_cls_loss
I0109 00:06:19.791031  3343 net.cpp:270] This network produces output rpn_loss_bbox
I0109 00:06:19.791066  3343 net.cpp:283] Network initialization done.
I0109 00:06:19.791240  3343 solver.cpp:60] Solver scaffolding done.
Solving...
I0109 00:06:20.781101  3343 solver.cpp:229] Iteration 0, loss = 0.828117
I0109 00:06:20.781138  3343 solver.cpp:245]     Train net output #0: rpn_cls_loss = 0.693146 (* 1 = 0.693146 loss)
I0109 00:06:20.781146  3343 solver.cpp:245]     Train net output #1: rpn_loss_bbox = 0.134971 (* 1 = 0.134971 loss)
I0109 00:06:20.781152  3343 sgd_solver.cpp:106] Iteration 0, lr = 0
I0109 00:06:39.866544  3343 solver.cpp:229] Iteration 20, loss = 0.828117
I0109 00:06:39.866582  3343 solver.cpp:245]     Train net output #0: rpn_cls_loss = 0.693146 (* 1 = 0.693146 loss)
I0109 00:06:39.866588  3343 solver.cpp:245]     Train net output #1: rpn_loss_bbox = 0.134971 (* 1 = 0.134971 loss)
I0109 00:06:39.866611  3343 sgd_solver.cpp:106] Iteration 20, lr = 0
I0109 00:06:58.521415  3343 solver.cpp:229] Iteration 40, loss = 0.827898
I0109 00:06:58.521447  3343 solver.cpp:245]     Train net output #0: rpn_cls_loss = 0.693146 (* 1 = 0.693146 loss)
I0109 00:06:58.521453  3343 solver.cpp:245]     Train net output #1: rpn_loss_bbox = 0.134751 (* 1 = 0.134751 loss)
I0109 00:06:58.521458  3343 sgd_solver.cpp:106] Iteration 40, lr = 0
I0109 00:07:17.182181  3343 solver.cpp:229] Iteration 60, loss = 0.827898
I0109 00:07:17.182215  3343 solver.cpp:245]     Train net output #0: rpn_cls_loss = 0.693146 (* 1 = 0.693146 loss)
I0109 00:07:17.182221  3343 solver.cpp:245]     Train net output #1: rpn_loss_bbox = 0.134751 (* 1 = 0.134751 loss)
I0109 00:07:17.182243  3343 sgd_solver.cpp:106] Iteration 60, lr = 0
I0109 00:07:35.854626  3343 solver.cpp:229] Iteration 80, loss = 0.827898
I0109 00:07:35.854660  3343 solver.cpp:245]     Train net output #0: rpn_cls_loss = 0.693146 (* 1 = 0.693146 loss)
I0109 00:07:35.854665  3343 solver.cpp:245]     Train net output #1: rpn_loss_bbox = 0.134751 (* 1 = 0.134751 loss)
I0109 00:07:35.854671  3343 sgd_solver.cpp:106] Iteration 80, lr = 0
Wrote snapshot to: /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/output/default/voc_2007_trainval/zf_rpn_stage1_iter_100.caffemodel
done solving
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Stage 1 RPN, generate proposals
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
RPN model: /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/output/default/voc_2007_trainval/zf_rpn_stage1_iter_100.caffemodel
Using config:
{'DATA_DIR': '/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/data',
 'DEDUP_BOXES': 0.0625,
 'EPS': 1e-14,
 'EXP_DIR': 'default',
 'GPU_ID': 0,
 'MATLAB': 'matlab',
 'MODELS_DIR': '/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/models/pascal_voc',
 'PIXEL_MEANS': array([[[ 102.9801,  115.9465,  122.7717]]]),
 'RNG_SEED': 3,
 'ROOT_DIR': '/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master',
 'TEST': {'BBOX_REG': True,
          'HAS_RPN': False,
          'MAX_SIZE': 1000,
          'NMS': 0.3,
          'PROPOSAL_METHOD': 'selective_search',
          'RPN_MIN_SIZE': 16,
          'RPN_NMS_THRESH': 0.7,
          'RPN_POST_NMS_TOP_N': 2000,
          'RPN_PRE_NMS_TOP_N': -1,
          'SCALES': [48],
          'SVM': False},
 'TRAIN': {'ASPECT_GROUPING': True,
           'BATCH_SIZE': 128,
           'BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
           'BBOX_NORMALIZE_MEANS': [0.0, 0.0, 0.0, 0.0],
           'BBOX_NORMALIZE_STDS': [0.1, 0.1, 0.2, 0.2],
           'BBOX_NORMALIZE_TARGETS': True,
           'BBOX_NORMALIZE_TARGETS_PRECOMPUTED': False,
           'BBOX_REG': True,
           'BBOX_THRESH': 0.5,
           'BG_THRESH_HI': 0.5,
           'BG_THRESH_LO': 0.1,
           'FG_FRACTION': 0.25,
           'FG_THRESH': 0.5,
           'HAS_RPN': False,
           'IMS_PER_BATCH': 1,
           'MAX_SIZE': 700,
           'PROPOSAL_METHOD': 'selective_search',
           'RPN_BATCHSIZE': 256,
           'RPN_BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
           'RPN_CLOBBER_POSITIVES': False,
           'RPN_FG_FRACTION': 0.5,
           'RPN_MIN_SIZE': 16,
           'RPN_NEGATIVE_OVERLAP': 0.3,
           'RPN_NMS_THRESH': 0.7,
           'RPN_POSITIVE_OVERLAP': 0.7,
           'RPN_POSITIVE_WEIGHT': -1.0,
           'RPN_POST_NMS_TOP_N': 2000,
           'RPN_PRE_NMS_TOP_N': 12000,
           'SCALES': [256],
           'SNAPSHOT_INFIX': 'stage1',
           'SNAPSHOT_ITERS': 10000,
           'USE_FLIPPED': True,
           'USE_PREFETCH': False},
 'USE_GPU_NMS': False}
Loaded dataset `voc_2007_trainval` for proposal generation
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0109 00:07:56.457439  3357 net.cpp:49] Initializing net from parameters: 
name: "ZF"
input: "data"
input: "im_info"
state {
  phase: TEST
}
input_shape {
  dim: 1
  dim: 3
  dim: 224
  dim: 224
}
input_shape {
  dim: 1
  dim: 3
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    pad: 3
    kernel_size: 7
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "conv1"
  top: "norm1"
  lrn_param {
    local_size: 3
    alpha: 5e-05
    beta: 0.75
    norm_region: WITHIN_CHANNEL
    engine: CAFFE
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "norm1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
    pad: 1
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    stride: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "conv2"
  top: "norm2"
  lrn_param {
    local_size: 3
    alpha: 5e-05
    beta: 0.75
    norm_region: WITHIN_CHANNEL
    engine: CAFFE
  }
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "norm2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
    pad: 1
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "rpn_conv1"
  type: "Convolution"
  bottom: "conv5"
  top: "rpn_conv1"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "rpn_relu1"
  type: "ReLU"
  bottom: "rpn_conv1"
  top: "rpn_conv1"
}
layer {
  name: "rpn_cls_score"
  type: "Convolution"
  bottom: "rpn_conv1"
  top: "rpn_cls_score"
  convolution_param {
    num_output: 18
    pad: 0
    kernel_size: 1
    stride: 1
  }
}
layer {
  name: "rpn_bbox_pred"
  type: "Convolution"
  bottom: "rpn_conv1"
  top: "rpn_bbox_pred"
  convolution_param {
    num_output: 36
    pad: 0
    kernel_size: 1
    stride: 1
  }
}
layer {
  name: "rpn_cls_score_reshape"
  type: "Reshape"
  bottom: "rpn_cls_score"
  top: "rpn_cls_score_reshape"
  reshape_param {
    shape {
      dim: 0
      dim: 2
      dim: -1
      dim: 0
    }
  }
}
layer {
  name: "rpn_cls_prob"
  type: "Softmax"
  bottom: "rpn_cls_score_reshape"
  top: "rpn_cls_prob"
}
layer {
  name: "rpn_cls_prob_reshape"
  type: "Reshape"
  bottom: "rpn_cls_prob"
  top: "rpn_cls_prob_reshape"
  reshape_param {
    shape {
      dim: 0
      dim: 18
      dim: -1
      dim: 0
    }
  }
}
layer {
  name: "proposal"
  type: "Python"
  bottom: "rpn_cls_prob_reshape"
  bottom: "rpn_bbox_pred"
  bottom: "im_info"
  top: "rois"
  top: "scores"
  python_param {
    module: "rpn.proposal_layer"
    layer: "ProposalLayer"
    param_str: "\'feat_stride\': 16"
  }
}
I0109 00:07:56.457671  3357 net.cpp:413] Input 0 -> data
I0109 00:07:56.457710  3357 net.cpp:413] Input 1 -> im_info
I0109 00:07:56.457751  3357 layer_factory.hpp:77] Creating layer conv1
I0109 00:07:56.457783  3357 net.cpp:106] Creating Layer conv1
I0109 00:07:56.457795  3357 net.cpp:454] conv1 <- data
I0109 00:07:56.457840  3357 net.cpp:411] conv1 -> conv1
I0109 00:07:56.457993  3357 net.cpp:150] Setting up conv1
I0109 00:07:56.458012  3357 net.cpp:157] Top shape: 1 96 112 112 (1204224)
I0109 00:07:56.458022  3357 net.cpp:165] Memory required for data: 4816896
I0109 00:07:56.458042  3357 layer_factory.hpp:77] Creating layer relu1
I0109 00:07:56.458056  3357 net.cpp:106] Creating Layer relu1
I0109 00:07:56.458061  3357 net.cpp:454] relu1 <- conv1
I0109 00:07:56.458070  3357 net.cpp:397] relu1 -> conv1 (in-place)
I0109 00:07:56.458084  3357 net.cpp:150] Setting up relu1
I0109 00:07:56.458091  3357 net.cpp:157] Top shape: 1 96 112 112 (1204224)
I0109 00:07:56.458096  3357 net.cpp:165] Memory required for data: 9633792
I0109 00:07:56.458101  3357 layer_factory.hpp:77] Creating layer norm1
I0109 00:07:56.458117  3357 net.cpp:106] Creating Layer norm1
I0109 00:07:56.458124  3357 net.cpp:454] norm1 <- conv1
I0109 00:07:56.458137  3357 net.cpp:411] norm1 -> norm1
I0109 00:07:56.458211  3357 net.cpp:150] Setting up norm1
I0109 00:07:56.458228  3357 net.cpp:157] Top shape: 1 96 112 112 (1204224)
I0109 00:07:56.458235  3357 net.cpp:165] Memory required for data: 14450688
I0109 00:07:56.458243  3357 layer_factory.hpp:77] Creating layer pool1
I0109 00:07:56.458256  3357 net.cpp:106] Creating Layer pool1
I0109 00:07:56.458264  3357 net.cpp:454] pool1 <- norm1
I0109 00:07:56.458272  3357 net.cpp:411] pool1 -> pool1
I0109 00:07:56.458286  3357 net.cpp:150] Setting up pool1
I0109 00:07:56.458293  3357 net.cpp:157] Top shape: 1 96 57 57 (311904)
I0109 00:07:56.458297  3357 net.cpp:165] Memory required for data: 15698304
I0109 00:07:56.458302  3357 layer_factory.hpp:77] Creating layer conv2
I0109 00:07:56.458313  3357 net.cpp:106] Creating Layer conv2
I0109 00:07:56.458318  3357 net.cpp:454] conv2 <- pool1
I0109 00:07:56.458325  3357 net.cpp:411] conv2 -> conv2
I0109 00:07:56.459810  3357 net.cpp:150] Setting up conv2
I0109 00:07:56.459827  3357 net.cpp:157] Top shape: 1 256 29 29 (215296)
I0109 00:07:56.459834  3357 net.cpp:165] Memory required for data: 16559488
I0109 00:07:56.459852  3357 layer_factory.hpp:77] Creating layer relu2
I0109 00:07:56.459867  3357 net.cpp:106] Creating Layer relu2
I0109 00:07:56.459877  3357 net.cpp:454] relu2 <- conv2
I0109 00:07:56.459892  3357 net.cpp:397] relu2 -> conv2 (in-place)
I0109 00:07:56.459903  3357 net.cpp:150] Setting up relu2
I0109 00:07:56.459913  3357 net.cpp:157] Top shape: 1 256 29 29 (215296)
I0109 00:07:56.459920  3357 net.cpp:165] Memory required for data: 17420672
I0109 00:07:56.459928  3357 layer_factory.hpp:77] Creating layer norm2
I0109 00:07:56.459941  3357 net.cpp:106] Creating Layer norm2
I0109 00:07:56.459951  3357 net.cpp:454] norm2 <- conv2
I0109 00:07:56.459965  3357 net.cpp:411] norm2 -> norm2
I0109 00:07:56.460013  3357 net.cpp:150] Setting up norm2
I0109 00:07:56.460027  3357 net.cpp:157] Top shape: 1 256 29 29 (215296)
I0109 00:07:56.460034  3357 net.cpp:165] Memory required for data: 18281856
I0109 00:07:56.460042  3357 layer_factory.hpp:77] Creating layer pool2
I0109 00:07:56.460052  3357 net.cpp:106] Creating Layer pool2
I0109 00:07:56.460055  3357 net.cpp:454] pool2 <- norm2
I0109 00:07:56.460062  3357 net.cpp:411] pool2 -> pool2
I0109 00:07:56.460074  3357 net.cpp:150] Setting up pool2
I0109 00:07:56.460081  3357 net.cpp:157] Top shape: 1 256 15 15 (57600)
I0109 00:07:56.460085  3357 net.cpp:165] Memory required for data: 18512256
I0109 00:07:56.460089  3357 layer_factory.hpp:77] Creating layer conv3
I0109 00:07:56.460104  3357 net.cpp:106] Creating Layer conv3
I0109 00:07:56.460114  3357 net.cpp:454] conv3 <- pool2
I0109 00:07:56.460125  3357 net.cpp:411] conv3 -> conv3
I0109 00:07:56.461786  3357 net.cpp:150] Setting up conv3
I0109 00:07:56.461817  3357 net.cpp:157] Top shape: 1 384 15 15 (86400)
I0109 00:07:56.461823  3357 net.cpp:165] Memory required for data: 18857856
I0109 00:07:56.461850  3357 layer_factory.hpp:77] Creating layer relu3
I0109 00:07:56.461868  3357 net.cpp:106] Creating Layer relu3
I0109 00:07:56.461879  3357 net.cpp:454] relu3 <- conv3
I0109 00:07:56.461894  3357 net.cpp:397] relu3 -> conv3 (in-place)
I0109 00:07:56.461911  3357 net.cpp:150] Setting up relu3
I0109 00:07:56.461925  3357 net.cpp:157] Top shape: 1 384 15 15 (86400)
I0109 00:07:56.461931  3357 net.cpp:165] Memory required for data: 19203456
I0109 00:07:56.461941  3357 layer_factory.hpp:77] Creating layer conv4
I0109 00:07:56.461956  3357 net.cpp:106] Creating Layer conv4
I0109 00:07:56.461966  3357 net.cpp:454] conv4 <- conv3
I0109 00:07:56.461976  3357 net.cpp:411] conv4 -> conv4
I0109 00:07:56.464504  3357 net.cpp:150] Setting up conv4
I0109 00:07:56.464545  3357 net.cpp:157] Top shape: 1 384 15 15 (86400)
I0109 00:07:56.464551  3357 net.cpp:165] Memory required for data: 19549056
I0109 00:07:56.464576  3357 layer_factory.hpp:77] Creating layer relu4
I0109 00:07:56.464598  3357 net.cpp:106] Creating Layer relu4
I0109 00:07:56.464630  3357 net.cpp:454] relu4 <- conv4
I0109 00:07:56.464663  3357 net.cpp:397] relu4 -> conv4 (in-place)
I0109 00:07:56.464680  3357 net.cpp:150] Setting up relu4
I0109 00:07:56.464694  3357 net.cpp:157] Top shape: 1 384 15 15 (86400)
I0109 00:07:56.464701  3357 net.cpp:165] Memory required for data: 19894656
I0109 00:07:56.464710  3357 layer_factory.hpp:77] Creating layer conv5
I0109 00:07:56.464727  3357 net.cpp:106] Creating Layer conv5
I0109 00:07:56.464735  3357 net.cpp:454] conv5 <- conv4
I0109 00:07:56.464743  3357 net.cpp:411] conv5 -> conv5
I0109 00:07:56.467422  3357 net.cpp:150] Setting up conv5
I0109 00:07:56.467447  3357 net.cpp:157] Top shape: 1 256 15 15 (57600)
I0109 00:07:56.467454  3357 net.cpp:165] Memory required for data: 20125056
I0109 00:07:56.467483  3357 layer_factory.hpp:77] Creating layer relu5
I0109 00:07:56.467500  3357 net.cpp:106] Creating Layer relu5
I0109 00:07:56.467512  3357 net.cpp:454] relu5 <- conv5
I0109 00:07:56.467528  3357 net.cpp:397] relu5 -> conv5 (in-place)
I0109 00:07:56.467545  3357 net.cpp:150] Setting up relu5
I0109 00:07:56.467556  3357 net.cpp:157] Top shape: 1 256 15 15 (57600)
I0109 00:07:56.467561  3357 net.cpp:165] Memory required for data: 20355456
I0109 00:07:56.467567  3357 layer_factory.hpp:77] Creating layer rpn_conv1
I0109 00:07:56.467579  3357 net.cpp:106] Creating Layer rpn_conv1
I0109 00:07:56.467586  3357 net.cpp:454] rpn_conv1 <- conv5
I0109 00:07:56.467594  3357 net.cpp:411] rpn_conv1 -> rpn_conv1
I0109 00:07:56.469519  3357 net.cpp:150] Setting up rpn_conv1
I0109 00:07:56.469542  3357 net.cpp:157] Top shape: 1 256 15 15 (57600)
I0109 00:07:56.469547  3357 net.cpp:165] Memory required for data: 20585856
I0109 00:07:56.469565  3357 layer_factory.hpp:77] Creating layer rpn_relu1
I0109 00:07:56.469583  3357 net.cpp:106] Creating Layer rpn_relu1
I0109 00:07:56.469594  3357 net.cpp:454] rpn_relu1 <- rpn_conv1
I0109 00:07:56.469609  3357 net.cpp:397] rpn_relu1 -> rpn_conv1 (in-place)
I0109 00:07:56.469626  3357 net.cpp:150] Setting up rpn_relu1
I0109 00:07:56.469641  3357 net.cpp:157] Top shape: 1 256 15 15 (57600)
I0109 00:07:56.469648  3357 net.cpp:165] Memory required for data: 20816256
I0109 00:07:56.469653  3357 layer_factory.hpp:77] Creating layer rpn_conv1_rpn_relu1_0_split
I0109 00:07:56.469671  3357 net.cpp:106] Creating Layer rpn_conv1_rpn_relu1_0_split
I0109 00:07:56.469678  3357 net.cpp:454] rpn_conv1_rpn_relu1_0_split <- rpn_conv1
I0109 00:07:56.469688  3357 net.cpp:411] rpn_conv1_rpn_relu1_0_split -> rpn_conv1_rpn_relu1_0_split_0
I0109 00:07:56.469702  3357 net.cpp:411] rpn_conv1_rpn_relu1_0_split -> rpn_conv1_rpn_relu1_0_split_1
I0109 00:07:56.469715  3357 net.cpp:150] Setting up rpn_conv1_rpn_relu1_0_split
I0109 00:07:56.469727  3357 net.cpp:157] Top shape: 1 256 15 15 (57600)
I0109 00:07:56.469738  3357 net.cpp:157] Top shape: 1 256 15 15 (57600)
I0109 00:07:56.469745  3357 net.cpp:165] Memory required for data: 21277056
I0109 00:07:56.469755  3357 layer_factory.hpp:77] Creating layer rpn_cls_score
I0109 00:07:56.469774  3357 net.cpp:106] Creating Layer rpn_cls_score
I0109 00:07:56.469806  3357 net.cpp:454] rpn_cls_score <- rpn_conv1_rpn_relu1_0_split_0
I0109 00:07:56.469822  3357 net.cpp:411] rpn_cls_score -> rpn_cls_score
I0109 00:07:56.469879  3357 net.cpp:150] Setting up rpn_cls_score
I0109 00:07:56.469898  3357 net.cpp:157] Top shape: 1 18 15 15 (4050)
I0109 00:07:56.469907  3357 net.cpp:165] Memory required for data: 21293256
I0109 00:07:56.469924  3357 layer_factory.hpp:77] Creating layer rpn_bbox_pred
I0109 00:07:56.469941  3357 net.cpp:106] Creating Layer rpn_bbox_pred
I0109 00:07:56.469950  3357 net.cpp:454] rpn_bbox_pred <- rpn_conv1_rpn_relu1_0_split_1
I0109 00:07:56.469959  3357 net.cpp:411] rpn_bbox_pred -> rpn_bbox_pred
I0109 00:07:56.470029  3357 net.cpp:150] Setting up rpn_bbox_pred
I0109 00:07:56.470046  3357 net.cpp:157] Top shape: 1 36 15 15 (8100)
I0109 00:07:56.470057  3357 net.cpp:165] Memory required for data: 21325656
I0109 00:07:56.470072  3357 layer_factory.hpp:77] Creating layer rpn_cls_score_reshape
I0109 00:07:56.470088  3357 net.cpp:106] Creating Layer rpn_cls_score_reshape
I0109 00:07:56.470098  3357 net.cpp:454] rpn_cls_score_reshape <- rpn_cls_score
I0109 00:07:56.470119  3357 net.cpp:411] rpn_cls_score_reshape -> rpn_cls_score_reshape
I0109 00:07:56.470156  3357 net.cpp:150] Setting up rpn_cls_score_reshape
I0109 00:07:56.470170  3357 net.cpp:157] Top shape: 1 2 135 15 (4050)
I0109 00:07:56.470175  3357 net.cpp:165] Memory required for data: 21341856
I0109 00:07:56.470182  3357 layer_factory.hpp:77] Creating layer rpn_cls_prob
I0109 00:07:56.470196  3357 net.cpp:106] Creating Layer rpn_cls_prob
I0109 00:07:56.470208  3357 net.cpp:454] rpn_cls_prob <- rpn_cls_score_reshape
I0109 00:07:56.470222  3357 net.cpp:411] rpn_cls_prob -> rpn_cls_prob
I0109 00:07:56.470247  3357 net.cpp:150] Setting up rpn_cls_prob
I0109 00:07:56.470259  3357 net.cpp:157] Top shape: 1 2 135 15 (4050)
I0109 00:07:56.470264  3357 net.cpp:165] Memory required for data: 21358056
I0109 00:07:56.470269  3357 layer_factory.hpp:77] Creating layer rpn_cls_prob_reshape
I0109 00:07:56.470288  3357 net.cpp:106] Creating Layer rpn_cls_prob_reshape
I0109 00:07:56.470299  3357 net.cpp:454] rpn_cls_prob_reshape <- rpn_cls_prob
I0109 00:07:56.470311  3357 net.cpp:411] rpn_cls_prob_reshape -> rpn_cls_prob_reshape
I0109 00:07:56.470329  3357 net.cpp:150] Setting up rpn_cls_prob_reshape
I0109 00:07:56.470343  3357 net.cpp:157] Top shape: 1 18 15 15 (4050)
I0109 00:07:56.470352  3357 net.cpp:165] Memory required for data: 21374256
I0109 00:07:56.470361  3357 layer_factory.hpp:77] Creating layer proposal
I0109 00:07:56.948462  3357 net.cpp:106] Creating Layer proposal
I0109 00:07:56.948508  3357 net.cpp:454] proposal <- rpn_cls_prob_reshape
I0109 00:07:56.948524  3357 net.cpp:454] proposal <- rpn_bbox_pred
I0109 00:07:56.948535  3357 net.cpp:454] proposal <- im_info
I0109 00:07:56.948551  3357 net.cpp:411] proposal -> rois
I0109 00:07:56.948573  3357 net.cpp:411] proposal -> scores
I0109 00:07:56.950920  3357 net.cpp:150] Setting up proposal
I0109 00:07:56.950966  3357 net.cpp:157] Top shape: 1 5 (5)
I0109 00:07:56.950978  3357 net.cpp:157] Top shape: 1 1 1 1 (1)
I0109 00:07:56.950987  3357 net.cpp:165] Memory required for data: 21374280
I0109 00:07:56.950999  3357 net.cpp:228] proposal does not need backward computation.
I0109 00:07:56.951014  3357 net.cpp:228] rpn_cls_prob_reshape does not need backward computation.
I0109 00:07:56.951023  3357 net.cpp:228] rpn_cls_prob does not need backward computation.
I0109 00:07:56.951035  3357 net.cpp:228] rpn_cls_score_reshape does not need backward computation.
I0109 00:07:56.951045  3357 net.cpp:228] rpn_bbox_pred does not need backward computation.
I0109 00:07:56.951055  3357 net.cpp:228] rpn_cls_score does not need backward computation.
I0109 00:07:56.951066  3357 net.cpp:228] rpn_conv1_rpn_relu1_0_split does not need backward computation.
I0109 00:07:56.951076  3357 net.cpp:228] rpn_relu1 does not need backward computation.
I0109 00:07:56.951084  3357 net.cpp:228] rpn_conv1 does not need backward computation.
I0109 00:07:56.951093  3357 net.cpp:228] relu5 does not need backward computation.
I0109 00:07:56.951102  3357 net.cpp:228] conv5 does not need backward computation.
I0109 00:07:56.951112  3357 net.cpp:228] relu4 does not need backward computation.
I0109 00:07:56.951120  3357 net.cpp:228] conv4 does not need backward computation.
I0109 00:07:56.951129  3357 net.cpp:228] relu3 does not need backward computation.
I0109 00:07:56.951138  3357 net.cpp:228] conv3 does not need backward computation.
I0109 00:07:56.951148  3357 net.cpp:228] pool2 does not need backward computation.
I0109 00:07:56.951156  3357 net.cpp:228] norm2 does not need backward computation.
I0109 00:07:56.951165  3357 net.cpp:228] relu2 does not need backward computation.
I0109 00:07:56.951174  3357 net.cpp:228] conv2 does not need backward computation.
I0109 00:07:56.951184  3357 net.cpp:228] pool1 does not need backward computation.
I0109 00:07:56.951192  3357 net.cpp:228] norm1 does not need backward computation.
I0109 00:07:56.951202  3357 net.cpp:228] relu1 does not need backward computation.
I0109 00:07:56.951211  3357 net.cpp:228] conv1 does not need backward computation.
I0109 00:07:56.951220  3357 net.cpp:270] This network produces output rois
I0109 00:07:56.951232  3357 net.cpp:270] This network produces output scores
I0109 00:07:56.951283  3357 net.cpp:283] Network initialization done.
I0109 00:07:57.099508  3357 net.cpp:816] Ignoring source layer input-data
I0109 00:07:57.099534  3357 net.cpp:816] Ignoring source layer data_input-data_0_split
I0109 00:07:57.105355  3357 net.cpp:816] Ignoring source layer rpn_cls_score_rpn_cls_score_0_split
I0109 00:07:57.105379  3357 net.cpp:816] Ignoring source layer rpn-data
I0109 00:07:57.105382  3357 net.cpp:816] Ignoring source layer rpn_loss_cls
I0109 00:07:57.105386  3357 net.cpp:816] Ignoring source layer rpn_loss_bbox
I0109 00:07:57.105388  3357 net.cpp:816] Ignoring source layer dummy_roi_pool_conv5
I0109 00:07:57.105391  3357 net.cpp:816] Ignoring source layer fc6
I0109 00:07:57.105393  3357 net.cpp:816] Ignoring source layer relu6
I0109 00:07:57.105396  3357 net.cpp:816] Ignoring source layer fc7
I0109 00:07:57.105398  3357 net.cpp:816] Ignoring source layer silence_fc7
Output will be saved to `/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/output/default/voc_2007_trainval`
im_proposals: 1/209 0.029s
im_proposals: 2/209 0.026s
im_proposals: 3/209 0.025s
[... im_proposals 4/209 through 208/209 omitted; per-image time stays at roughly 0.022s-0.025s throughout ...]
im_proposals: 209/209 0.022s
Wrote RPN proposals to /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/output/default/voc_2007_trainval/zf_rpn_stage1_iter_100_proposals.pkl
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Stage 1 Fast R-CNN using RPN proposals, init from ImageNet model
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Init model: None
RPN proposals: /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/output/default/voc_2007_trainval/zf_rpn_stage1_iter_100_proposals.pkl
Using config:
{'DATA_DIR': '/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/data',
 'DEDUP_BOXES': 0.0625,
 'EPS': 1e-14,
 'EXP_DIR': 'default',
 'GPU_ID': 0,
 'MATLAB': 'matlab',
 'MODELS_DIR': '/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/models/pascal_voc',
 'PIXEL_MEANS': array([[[ 102.9801,  115.9465,  122.7717]]]),
 'RNG_SEED': 3,
 'ROOT_DIR': '/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master',
 'TEST': {'BBOX_REG': True,
          'HAS_RPN': False,
          'MAX_SIZE': 1000,
          'NMS': 0.3,
          'PROPOSAL_METHOD': 'selective_search',
          'RPN_MIN_SIZE': 16,
          'RPN_NMS_THRESH': 0.7,
          'RPN_POST_NMS_TOP_N': 300,
          'RPN_PRE_NMS_TOP_N': 6000,
          'SCALES': [48],
          'SVM': False},
 'TRAIN': {'ASPECT_GROUPING': True,
           'BATCH_SIZE': 128,
           'BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
           'BBOX_NORMALIZE_MEANS': [0.0, 0.0, 0.0, 0.0],
           'BBOX_NORMALIZE_STDS': [0.1, 0.1, 0.2, 0.2],
           'BBOX_NORMALIZE_TARGETS': True,
           'BBOX_NORMALIZE_TARGETS_PRECOMPUTED': False,
           'BBOX_REG': True,
           'BBOX_THRESH': 0.5,
           'BG_THRESH_HI': 0.5,
           'BG_THRESH_LO': 0.1,
           'FG_FRACTION': 0.25,
           'FG_THRESH': 0.5,
           'HAS_RPN': False,
           'IMS_PER_BATCH': 1,
           'MAX_SIZE': 700,
           'PROPOSAL_METHOD': 'rpn',
           'RPN_BATCHSIZE': 256,
           'RPN_BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
           'RPN_CLOBBER_POSITIVES': False,
           'RPN_FG_FRACTION': 0.5,
           'RPN_MIN_SIZE': 16,
           'RPN_NEGATIVE_OVERLAP': 0.3,
           'RPN_NMS_THRESH': 0.7,
           'RPN_POSITIVE_OVERLAP': 0.7,
           'RPN_POSITIVE_WEIGHT': -1.0,
           'RPN_POST_NMS_TOP_N': 2000,
           'RPN_PRE_NMS_TOP_N': 12000,
           'SCALES': [256],
           'SNAPSHOT_INFIX': 'stage1',
           'SNAPSHOT_ITERS': 10000,
           'USE_FLIPPED': True,
           'USE_PREFETCH': False},
 'USE_GPU_NMS': False}
Loaded dataset `voc_2007_trainval` for training
Set proposal method: rpn
Appending horizontally-flipped training examples...
voc_2007_trainval gt roidb loaded from /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/data/cache/voc_2007_trainval_gt_roidb.pkl
loading /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/output/default/voc_2007_trainval/zf_rpn_stage1_iter_100_proposals.pkl
done
Preparing training data...
done
Output will be saved to `/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/output/default/voc_2007_trainval`
Filtered 0 roidb entries: 418 -> 418
Computing bounding-box regression targets...
bbox target means:
[[ 0.  0.  0.  0.]
 [ 0.  0.  0.  0.]]
[ 0.  0.  0.  0.]
bbox target stdevs:
[[ 0.  0.  0.  0.]
 [ 0.  0.  0.  0.]]
[ 0.  0.  0.  0.]
Normalizing targets
/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/tools/../lib/roi_data_layer/roidb.py:101: RuntimeWarning: invalid value encountered in divide
  roidb[im_i]['bbox_targets'][cls_inds, 1:] /= stds[cls, :]
done
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0109 00:08:03.185263  3361 solver.cpp:48] Initializing solver from parameters: 
train_net: "/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/models/pascal_voc/ZF/faster_rcnn_alt_opt/stage1_fast_rcnn_train.pt"
base_lr: 0
display: 20
lr_policy: "step"
gamma: 0.1
momentum: 0.9
weight_decay: 0.0005
stepsize: 150
snapshot: 0
snapshot_prefix: "zf_fast_rcnn"
average_loss: 100
I0109 00:08:03.185319  3361 solver.cpp:81] Creating training net from train_net file: /media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/models/pascal_voc/ZF/faster_rcnn_alt_opt/stage1_fast_rcnn_train.pt
I0109 00:08:03.220324  3361 net.cpp:49] Initializing net from parameters: 
name: "ZF"
state {
  phase: TRAIN
}
layer {
  name: "data"
  type: "Python"
  top: "data"
  top: "rois"
  top: "labels"
  top: "bbox_targets"
  top: "bbox_inside_weights"
  top: "bbox_outside_weights"
  python_param {
    module: "roi_data_layer.layer"
    layer: "RoIDataLayer"
    param_str: "\'num_classes\': 2"
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 96
    pad: 3
    kernel_size: 7
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "conv1"
  top: "norm1"
  lrn_param {
    local_size: 3
    alpha: 5e-05
    beta: 0.75
    norm_region: WITHIN_CHANNEL
    engine: CAFFE
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "norm1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
    pad: 1
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    stride: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "conv2"
  top: "norm2"
  lrn_param {
    local_size: 3
    alpha: 5e-05
    beta: 0.75
    norm_region: WITHIN_CHANNEL
    engine: CAFFE
  }
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "norm2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
    pad: 1
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    stride: 1
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "roi_pool_conv5"
  type: "ROIPooling"
  bottom: "conv5"
  bottom: "rois"
  top: "roi_pool_conv5"
  roi_pooling_param {
    pooled_h: 6
    pooled_w: 6
    spatial_scale: 0.0625
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "roi_pool_conv5"
  top: "fc6"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
    scale_train: false
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
    scale_train: false
  }
}
layer {
  name: "cls_score"
  type: "InnerProduct"
  bottom: "fc7"
  top: "cls_score"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "bbox_pred"
  type: "InnerProduct"
  bottom: "fc7"
  top: "bbox_pred"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 8
    weight_filler {
      type: "gaussian"
      std: 0.001
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "loss_cls"
  type: "SoftmaxWithLoss"
  bottom: "cls_score"
  bottom: "labels"
  top: "cls_loss"
  loss_weight: 1
  propagate_down: true
  propagate_down: false
  loss_param {
    ignore_label: -1
    normalize: true
  }
}
layer {
  name: "loss_bbox"
  type: "SmoothL1Loss"
  bottom: "bbox_pred"
  bottom: "bbox_targets"
  bottom: "bbox_inside_weights"
  bottom: "bbox_outside_weights"
  top: "bbox_loss"
  loss_weight: 1
}
layer {
  name: "rpn_conv1"
  type: "Convolution"
  bottom: "conv5"
  top: "rpn_conv1"
  param {
    lr_mult: 0
    decay_mult: 0
  }
  param {
    lr_mult: 0
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "rpn_relu1"
  type: "ReLU"
  bottom: "rpn_conv1"
  top: "rpn_conv1"
}
layer {
  name: "rpn_cls_score"
  type: "Convolution"
  bottom: "rpn_conv1"
  top: "rpn_cls_score"
  param {
    lr_mult: 0
    decay_mult: 0
  }
  param {
    lr_mult: 0
    decay_mult: 0
  }
  convolution_param {
    num_output: 18
    pad: 0
    kernel_size: 1
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "rpn_bbox_pred"
  type: "Convolution"
  bottom: "rpn_conv1"
  top: "rpn_bbox_pred"
  param {
    lr_mult: 0
    decay_mult: 0
  }
  param {
    lr_mult: 0
    decay_mult: 0
  }
  convolution_param {
    num_output: 36
    pad: 0
    kernel_size: 1
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "silence_rpn_cls_score"
  type: "Silence"
  bottom: "rpn_cls_score"
}
layer {
  name: "silence_rpn_bbox_pred"
  type: "Silence"
  bottom: "rpn_bbox_pred"
}
I0109 00:08:03.220799  3361 layer_factory.hpp:77] Creating layer data
I0109 00:08:03.324904  3361 net.cpp:106] Creating Layer data
I0109 00:08:03.324925  3361 net.cpp:411] data -> data
I0109 00:08:03.324936  3361 net.cpp:411] data -> rois
I0109 00:08:03.324944  3361 net.cpp:411] data -> labels
I0109 00:08:03.324951  3361 net.cpp:411] data -> bbox_targets
I0109 00:08:03.324959  3361 net.cpp:411] data -> bbox_inside_weights
I0109 00:08:03.324967  3361 net.cpp:411] data -> bbox_outside_weights
RoiDataLayer: name_to_top: {'bbox_inside_weights': 4, 'labels': 2, 'rois': 1, 'bbox_targets': 3, 'bbox_outside_weights': 5, 'data': 0}
I0109 00:08:03.325460  3361 net.cpp:150] Setting up data
I0109 00:08:03.325474  3361 net.cpp:157] Top shape: 1 3 256 700 (537600)
I0109 00:08:03.325481  3361 net.cpp:157] Top shape: 1 5 (5)
I0109 00:08:03.325487  3361 net.cpp:157] Top shape: 1 (1)
I0109 00:08:03.325492  3361 net.cpp:157] Top shape: 1 8 (8)
I0109 00:08:03.325497  3361 net.cpp:157] Top shape: 1 8 (8)
I0109 00:08:03.325502  3361 net.cpp:157] Top shape: 1 8 (8)
I0109 00:08:03.325505  3361 net.cpp:165] Memory required for data: 2150520
I0109 00:08:03.325510  3361 layer_factory.hpp:77] Creating layer conv1
I0109 00:08:03.325522  3361 net.cpp:106] Creating Layer conv1
I0109 00:08:03.325529  3361 net.cpp:454] conv1 <- data
I0109 00:08:03.325537  3361 net.cpp:411] conv1 -> conv1
I0109 00:08:03.325631  3361 net.cpp:150] Setting up conv1
I0109 00:08:03.325639  3361 net.cpp:157] Top shape: 1 96 128 350 (4300800)
I0109 00:08:03.325642  3361 net.cpp:165] Memory required for data: 19353720
I0109 00:08:03.325655  3361 layer_factory.hpp:77] Creating layer relu1
I0109 00:08:03.325664  3361 net.cpp:106] Creating Layer relu1
I0109 00:08:03.325670  3361 net.cpp:454] relu1 <- conv1
I0109 00:08:03.325675  3361 net.cpp:397] relu1 -> conv1 (in-place)
I0109 00:08:03.325686  3361 net.cpp:150] Setting up relu1
I0109 00:08:03.325692  3361 net.cpp:157] Top shape: 1 96 128 350 (4300800)
I0109 00:08:03.325696  3361 net.cpp:165] Memory required for data: 36556920
I0109 00:08:03.325701  3361 layer_factory.hpp:77] Creating layer norm1
I0109 00:08:03.325707  3361 net.cpp:106] Creating Layer norm1
I0109 00:08:03.325711  3361 net.cpp:454] norm1 <- conv1
I0109 00:08:03.325717  3361 net.cpp:411] norm1 -> norm1
I0109 00:08:03.325752  3361 net.cpp:150] Setting up norm1
I0109 00:08:03.325759  3361 net.cpp:157] Top shape: 1 96 128 350 (4300800)
I0109 00:08:03.325763  3361 net.cpp:165] Memory required for data: 53760120
I0109 00:08:03.325767  3361 layer_factory.hpp:77] Creating layer pool1
I0109 00:08:03.325774  3361 net.cpp:106] Creating Layer pool1
I0109 00:08:03.325779  3361 net.cpp:454] pool1 <- norm1
I0109 00:08:03.325785  3361 net.cpp:411] pool1 -> pool1
I0109 00:08:03.325798  3361 net.cpp:150] Setting up pool1
I0109 00:08:03.325803  3361 net.cpp:157] Top shape: 1 96 65 176 (1098240)
I0109 00:08:03.325808  3361 net.cpp:165] Memory required for data: 58153080
I0109 00:08:03.325811  3361 layer_factory.hpp:77] Creating layer conv2
I0109 00:08:03.325819  3361 net.cpp:106] Creating Layer conv2
I0109 00:08:03.325824  3361 net.cpp:454] conv2 <- pool1
I0109 00:08:03.325831  3361 net.cpp:411] conv2 -> conv2
I0109 00:08:03.326500  3361 net.cpp:150] Setting up conv2
I0109 00:08:03.326508  3361 net.cpp:157] Top shape: 1 256 33 88 (743424)
I0109 00:08:03.326513  3361 net.cpp:165] Memory required for data: 61126776
I0109 00:08:03.326521  3361 layer_factory.hpp:77] Creating layer relu2
I0109 00:08:03.326529  3361 net.cpp:106] Creating Layer relu2
I0109 00:08:03.326534  3361 net.cpp:454] relu2 <- conv2
I0109 00:08:03.326539  3361 net.cpp:397] relu2 -> conv2 (in-place)
I0109 00:08:03.326545  3361 net.cpp:150] Setting up relu2
I0109 00:08:03.326550  3361 net.cpp:157] Top shape: 1 256 33 88 (743424)
I0109 00:08:03.326553  3361 net.cpp:165] Memory required for data: 64100472
I0109 00:08:03.326557  3361 layer_factory.hpp:77] Creating layer norm2
I0109 00:08:03.326563  3361 net.cpp:106] Creating Layer norm2
I0109 00:08:03.326568  3361 net.cpp:454] norm2 <- conv2
I0109 00:08:03.326575  3361 net.cpp:411] norm2 -> norm2
I0109 00:08:03.326592  3361 net.cpp:150] Setting up norm2
I0109 00:08:03.326599  3361 net.cpp:157] Top shape: 1 256 33 88 (743424)
I0109 00:08:03.326603  3361 net.cpp:165] Memory required for data: 67074168
I0109 00:08:03.326607  3361 layer_factory.hpp:77] Creating layer pool2
I0109 00:08:03.326616  3361 net.cpp:106] Creating Layer pool2
I0109 00:08:03.326618  3361 net.cpp:454] pool2 <- norm2
I0109 00:08:03.326624  3361 net.cpp:411] pool2 -> pool2
I0109 00:08:03.326632  3361 net.cpp:150] Setting up pool2
I0109 00:08:03.326639  3361 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:08:03.326643  3361 net.cpp:165] Memory required for data: 67857528
I0109 00:08:03.326647  3361 layer_factory.hpp:77] Creating layer conv3
I0109 00:08:03.326655  3361 net.cpp:106] Creating Layer conv3
I0109 00:08:03.326661  3361 net.cpp:454] conv3 <- pool2
I0109 00:08:03.326668  3361 net.cpp:411] conv3 -> conv3
I0109 00:08:03.327708  3361 net.cpp:150] Setting up conv3
I0109 00:08:03.327714  3361 net.cpp:157] Top shape: 1 384 17 45 (293760)
I0109 00:08:03.327718  3361 net.cpp:165] Memory required for data: 69032568
I0109 00:08:03.327725  3361 layer_factory.hpp:77] Creating layer relu3
I0109 00:08:03.327733  3361 net.cpp:106] Creating Layer relu3
I0109 00:08:03.327738  3361 net.cpp:454] relu3 <- conv3
I0109 00:08:03.327744  3361 net.cpp:397] relu3 -> conv3 (in-place)
I0109 00:08:03.327751  3361 net.cpp:150] Setting up relu3
I0109 00:08:03.327755  3361 net.cpp:157] Top shape: 1 384 17 45 (293760)
I0109 00:08:03.327757  3361 net.cpp:165] Memory required for data: 70207608
I0109 00:08:03.327761  3361 layer_factory.hpp:77] Creating layer conv4
I0109 00:08:03.327769  3361 net.cpp:106] Creating Layer conv4
I0109 00:08:03.327774  3361 net.cpp:454] conv4 <- conv3
I0109 00:08:03.327780  3361 net.cpp:411] conv4 -> conv4
I0109 00:08:03.329535  3361 net.cpp:150] Setting up conv4
I0109 00:08:03.329551  3361 net.cpp:157] Top shape: 1 384 17 45 (293760)
I0109 00:08:03.329553  3361 net.cpp:165] Memory required for data: 71382648
I0109 00:08:03.329563  3361 layer_factory.hpp:77] Creating layer relu4
I0109 00:08:03.329573  3361 net.cpp:106] Creating Layer relu4
I0109 00:08:03.329577  3361 net.cpp:454] relu4 <- conv4
I0109 00:08:03.329583  3361 net.cpp:397] relu4 -> conv4 (in-place)
I0109 00:08:03.329591  3361 net.cpp:150] Setting up relu4
I0109 00:08:03.329597  3361 net.cpp:157] Top shape: 1 384 17 45 (293760)
I0109 00:08:03.329601  3361 net.cpp:165] Memory required for data: 72557688
I0109 00:08:03.329604  3361 layer_factory.hpp:77] Creating layer conv5
I0109 00:08:03.329614  3361 net.cpp:106] Creating Layer conv5
I0109 00:08:03.329619  3361 net.cpp:454] conv5 <- conv4
I0109 00:08:03.329625  3361 net.cpp:411] conv5 -> conv5
I0109 00:08:03.330552  3361 net.cpp:150] Setting up conv5
I0109 00:08:03.330564  3361 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:08:03.330567  3361 net.cpp:165] Memory required for data: 73341048
I0109 00:08:03.330585  3361 layer_factory.hpp:77] Creating layer relu5
I0109 00:08:03.330593  3361 net.cpp:106] Creating Layer relu5
I0109 00:08:03.330598  3361 net.cpp:454] relu5 <- conv5
I0109 00:08:03.330606  3361 net.cpp:397] relu5 -> conv5 (in-place)
I0109 00:08:03.330613  3361 net.cpp:150] Setting up relu5
I0109 00:08:03.330618  3361 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:08:03.330621  3361 net.cpp:165] Memory required for data: 74124408
I0109 00:08:03.330626  3361 layer_factory.hpp:77] Creating layer conv5_relu5_0_split
I0109 00:08:03.330634  3361 net.cpp:106] Creating Layer conv5_relu5_0_split
I0109 00:08:03.330637  3361 net.cpp:454] conv5_relu5_0_split <- conv5
I0109 00:08:03.330644  3361 net.cpp:411] conv5_relu5_0_split -> conv5_relu5_0_split_0
I0109 00:08:03.330652  3361 net.cpp:411] conv5_relu5_0_split -> conv5_relu5_0_split_1
I0109 00:08:03.330667  3361 net.cpp:150] Setting up conv5_relu5_0_split
I0109 00:08:03.330672  3361 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:08:03.330677  3361 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:08:03.330680  3361 net.cpp:165] Memory required for data: 75691128
I0109 00:08:03.330684  3361 layer_factory.hpp:77] Creating layer roi_pool_conv5
I0109 00:08:03.330703  3361 net.cpp:106] Creating Layer roi_pool_conv5
I0109 00:08:03.330706  3361 net.cpp:454] roi_pool_conv5 <- conv5_relu5_0_split_0
I0109 00:08:03.330713  3361 net.cpp:454] roi_pool_conv5 <- rois
I0109 00:08:03.330720  3361 net.cpp:411] roi_pool_conv5 -> roi_pool_conv5
I0109 00:08:03.330729  3361 roi_pooling_layer.cpp:30] Spatial scale: 0.0625
I0109 00:08:03.330750  3361 net.cpp:150] Setting up roi_pool_conv5
I0109 00:08:03.330760  3361 net.cpp:157] Top shape: 1 256 6 6 (9216)
I0109 00:08:03.330763  3361 net.cpp:165] Memory required for data: 75727992
I0109 00:08:03.330767  3361 layer_factory.hpp:77] Creating layer fc6
I0109 00:08:03.330775  3361 net.cpp:106] Creating Layer fc6
I0109 00:08:03.330780  3361 net.cpp:454] fc6 <- roi_pool_conv5
I0109 00:08:03.330786  3361 net.cpp:411] fc6 -> fc6
I0109 00:08:03.367540  3361 net.cpp:150] Setting up fc6
I0109 00:08:03.367574  3361 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:08:03.367578  3361 net.cpp:165] Memory required for data: 75744376
I0109 00:08:03.367588  3361 layer_factory.hpp:77] Creating layer relu6
I0109 00:08:03.367595  3361 net.cpp:106] Creating Layer relu6
I0109 00:08:03.367599  3361 net.cpp:454] relu6 <- fc6
I0109 00:08:03.367604  3361 net.cpp:397] relu6 -> fc6 (in-place)
I0109 00:08:03.367614  3361 net.cpp:150] Setting up relu6
I0109 00:08:03.367617  3361 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:08:03.367619  3361 net.cpp:165] Memory required for data: 75760760
I0109 00:08:03.367624  3361 layer_factory.hpp:77] Creating layer drop6
I0109 00:08:03.367640  3361 net.cpp:106] Creating Layer drop6
I0109 00:08:03.367641  3361 net.cpp:454] drop6 <- fc6
I0109 00:08:03.367645  3361 net.cpp:397] drop6 -> fc6 (in-place)
I0109 00:08:03.367653  3361 net.cpp:150] Setting up drop6
I0109 00:08:03.367656  3361 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:08:03.367658  3361 net.cpp:165] Memory required for data: 75777144
I0109 00:08:03.367661  3361 layer_factory.hpp:77] Creating layer fc7
I0109 00:08:03.367674  3361 net.cpp:106] Creating Layer fc7
I0109 00:08:03.367678  3361 net.cpp:454] fc7 <- fc6
I0109 00:08:03.367684  3361 net.cpp:411] fc7 -> fc7
I0109 00:08:03.380499  3361 net.cpp:150] Setting up fc7
I0109 00:08:03.380533  3361 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:08:03.380537  3361 net.cpp:165] Memory required for data: 75793528
I0109 00:08:03.380549  3361 layer_factory.hpp:77] Creating layer relu7
I0109 00:08:03.380560  3361 net.cpp:106] Creating Layer relu7
I0109 00:08:03.380583  3361 net.cpp:454] relu7 <- fc7
I0109 00:08:03.380599  3361 net.cpp:397] relu7 -> fc7 (in-place)
I0109 00:08:03.380612  3361 net.cpp:150] Setting up relu7
I0109 00:08:03.380616  3361 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:08:03.380620  3361 net.cpp:165] Memory required for data: 75809912
I0109 00:08:03.380623  3361 layer_factory.hpp:77] Creating layer drop7
I0109 00:08:03.380645  3361 net.cpp:106] Creating Layer drop7
I0109 00:08:03.380648  3361 net.cpp:454] drop7 <- fc7
I0109 00:08:03.380657  3361 net.cpp:397] drop7 -> fc7 (in-place)
I0109 00:08:03.380666  3361 net.cpp:150] Setting up drop7
I0109 00:08:03.380671  3361 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:08:03.380673  3361 net.cpp:165] Memory required for data: 75826296
I0109 00:08:03.380678  3361 layer_factory.hpp:77] Creating layer fc7_drop7_0_split
I0109 00:08:03.380686  3361 net.cpp:106] Creating Layer fc7_drop7_0_split
I0109 00:08:03.380689  3361 net.cpp:454] fc7_drop7_0_split <- fc7
I0109 00:08:03.380695  3361 net.cpp:411] fc7_drop7_0_split -> fc7_drop7_0_split_0
I0109 00:08:03.380702  3361 net.cpp:411] fc7_drop7_0_split -> fc7_drop7_0_split_1
I0109 00:08:03.380712  3361 net.cpp:150] Setting up fc7_drop7_0_split
I0109 00:08:03.380715  3361 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:08:03.380719  3361 net.cpp:157] Top shape: 1 4096 (4096)
I0109 00:08:03.380722  3361 net.cpp:165] Memory required for data: 75859064
I0109 00:08:03.380726  3361 layer_factory.hpp:77] Creating layer cls_score
I0109 00:08:03.380734  3361 net.cpp:106] Creating Layer cls_score
I0109 00:08:03.380738  3361 net.cpp:454] cls_score <- fc7_drop7_0_split_0
I0109 00:08:03.380744  3361 net.cpp:411] cls_score -> cls_score
I0109 00:08:03.380920  3361 net.cpp:150] Setting up cls_score
I0109 00:08:03.380928  3361 net.cpp:157] Top shape: 1 2 (2)
I0109 00:08:03.380930  3361 net.cpp:165] Memory required for data: 75859072
I0109 00:08:03.380936  3361 layer_factory.hpp:77] Creating layer bbox_pred
I0109 00:08:03.380944  3361 net.cpp:106] Creating Layer bbox_pred
I0109 00:08:03.380965  3361 net.cpp:454] bbox_pred <- fc7_drop7_0_split_1
I0109 00:08:03.380972  3361 net.cpp:411] bbox_pred -> bbox_pred
I0109 00:08:03.381531  3361 net.cpp:150] Setting up bbox_pred
I0109 00:08:03.381538  3361 net.cpp:157] Top shape: 1 8 (8)
I0109 00:08:03.381541  3361 net.cpp:165] Memory required for data: 75859104
I0109 00:08:03.381554  3361 layer_factory.hpp:77] Creating layer loss_cls
I0109 00:08:03.381579  3361 net.cpp:106] Creating Layer loss_cls
I0109 00:08:03.381583  3361 net.cpp:454] loss_cls <- cls_score
I0109 00:08:03.381588  3361 net.cpp:454] loss_cls <- labels
I0109 00:08:03.381593  3361 net.cpp:411] loss_cls -> cls_loss
I0109 00:08:03.381603  3361 layer_factory.hpp:77] Creating layer loss_cls
I0109 00:08:03.381620  3361 net.cpp:150] Setting up loss_cls
I0109 00:08:03.381640  3361 net.cpp:157] Top shape: (1)
I0109 00:08:03.381642  3361 net.cpp:160]     with loss weight 1
I0109 00:08:03.381669  3361 net.cpp:165] Memory required for data: 75859108
I0109 00:08:03.381670  3361 layer_factory.hpp:77] Creating layer loss_bbox
I0109 00:08:03.381675  3361 net.cpp:106] Creating Layer loss_bbox
I0109 00:08:03.381676  3361 net.cpp:454] loss_bbox <- bbox_pred
I0109 00:08:03.381680  3361 net.cpp:454] loss_bbox <- bbox_targets
I0109 00:08:03.381683  3361 net.cpp:454] loss_bbox <- bbox_inside_weights
I0109 00:08:03.381685  3361 net.cpp:454] loss_bbox <- bbox_outside_weights
I0109 00:08:03.381706  3361 net.cpp:411] loss_bbox -> bbox_loss
I0109 00:08:03.381719  3361 net.cpp:150] Setting up loss_bbox
I0109 00:08:03.381724  3361 net.cpp:157] Top shape: (1)
I0109 00:08:03.381727  3361 net.cpp:160]     with loss weight 1
I0109 00:08:03.381733  3361 net.cpp:165] Memory required for data: 75859112
I0109 00:08:03.381737  3361 layer_factory.hpp:77] Creating layer rpn_conv1
I0109 00:08:03.381744  3361 net.cpp:106] Creating Layer rpn_conv1
I0109 00:08:03.381747  3361 net.cpp:454] rpn_conv1 <- conv5_relu5_0_split_1
I0109 00:08:03.381752  3361 net.cpp:411] rpn_conv1 -> rpn_conv1
I0109 00:08:03.391804  3361 net.cpp:150] Setting up rpn_conv1
I0109 00:08:03.391883  3361 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:08:03.391909  3361 net.cpp:165] Memory required for data: 76642472
I0109 00:08:03.391927  3361 layer_factory.hpp:77] Creating layer rpn_relu1
I0109 00:08:03.391963  3361 net.cpp:106] Creating Layer rpn_relu1
I0109 00:08:03.391970  3361 net.cpp:454] rpn_relu1 <- rpn_conv1
I0109 00:08:03.391989  3361 net.cpp:397] rpn_relu1 -> rpn_conv1 (in-place)
I0109 00:08:03.392007  3361 net.cpp:150] Setting up rpn_relu1
I0109 00:08:03.392014  3361 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:08:03.392016  3361 net.cpp:165] Memory required for data: 77425832
I0109 00:08:03.392019  3361 layer_factory.hpp:77] Creating layer rpn_conv1_rpn_relu1_0_split
I0109 00:08:03.392035  3361 net.cpp:106] Creating Layer rpn_conv1_rpn_relu1_0_split
I0109 00:08:03.392045  3361 net.cpp:454] rpn_conv1_rpn_relu1_0_split <- rpn_conv1
I0109 00:08:03.392060  3361 net.cpp:411] rpn_conv1_rpn_relu1_0_split -> rpn_conv1_rpn_relu1_0_split_0
I0109 00:08:03.392074  3361 net.cpp:411] rpn_conv1_rpn_relu1_0_split -> rpn_conv1_rpn_relu1_0_split_1
I0109 00:08:03.392089  3361 net.cpp:150] Setting up rpn_conv1_rpn_relu1_0_split
I0109 00:08:03.392098  3361 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:08:03.392105  3361 net.cpp:157] Top shape: 1 256 17 45 (195840)
I0109 00:08:03.392110  3361 net.cpp:165] Memory required for data: 78992552
I0109 00:08:03.392114  3361 layer_factory.hpp:77] Creating layer rpn_cls_score
I0109 00:08:03.392143  3361 net.cpp:106] Creating Layer rpn_cls_score
I0109 00:08:03.392156  3361 net.cpp:454] rpn_cls_score <- rpn_conv1_rpn_relu1_0_split_0
I0109 00:08:03.392169  3361 net.cpp:411] rpn_cls_score -> rpn_cls_score
I0109 00:08:03.392406  3361 net.cpp:150] Setting up rpn_cls_score
I0109 00:08:03.392422  3361 net.cpp:157] Top shape: 1 18 17 45 (13770)
I0109 00:08:03.392426  3361 net.cpp:165] Memory required for data: 79047632
I0109 00:08:03.392442  3361 layer_factory.hpp:77] Creating layer rpn_bbox_pred
I0109 00:08:03.392498  3361 net.cpp:106] Creating Layer rpn_bbox_pred
I0109 00:08:03.392508  3361 net.cpp:454] rpn_bbox_pred <- rpn_conv1_rpn_relu1_0_split_1
I0109 00:08:03.392518  3361 net.cpp:411] rpn_bbox_pred -> rpn_bbox_pred
I0109 00:08:03.392783  3361 net.cpp:150] Setting up rpn_bbox_pred
I0109 00:08:03.392791  3361 net.cpp:157] Top shape: 1 36 17 45 (27540)
I0109 00:08:03.392796  3361 net.cpp:165] Memory required for data: 79157792
I0109 00:08:03.392801  3361 layer_factory.hpp:77] Creating layer silence_rpn_cls_score
I0109 00:08:03.392804  3361 net.cpp:106] Creating Layer silence_rpn_cls_score
I0109 00:08:03.392807  3361 net.cpp:454] silence_rpn_cls_score <- rpn_cls_score
I0109 00:08:03.392810  3361 net.cpp:150] Setting up silence_rpn_cls_score
I0109 00:08:03.392812  3361 net.cpp:165] Memory required for data: 79157792
I0109 00:08:03.392814  3361 layer_factory.hpp:77] Creating layer silence_rpn_bbox_pred
I0109 00:08:03.392817  3361 net.cpp:106] Creating Layer silence_rpn_bbox_pred
I0109 00:08:03.392839  3361 net.cpp:454] silence_rpn_bbox_pred <- rpn_bbox_pred
I0109 00:08:03.392843  3361 net.cpp:150] Setting up silence_rpn_bbox_pred
I0109 00:08:03.392844  3361 net.cpp:165] Memory required for data: 79157792
I0109 00:08:03.392846  3361 net.cpp:228] silence_rpn_bbox_pred does not need backward computation.
I0109 00:08:03.392848  3361 net.cpp:228] silence_rpn_cls_score does not need backward computation.
I0109 00:08:03.392850  3361 net.cpp:228] rpn_bbox_pred does not need backward computation.
I0109 00:08:03.392853  3361 net.cpp:228] rpn_cls_score does not need backward computation.
I0109 00:08:03.392855  3361 net.cpp:228] rpn_conv1_rpn_relu1_0_split does not need backward computation.
I0109 00:08:03.392858  3361 net.cpp:228] rpn_relu1 does not need backward computation.
I0109 00:08:03.392863  3361 net.cpp:228] rpn_conv1 does not need backward computation.
I0109 00:08:03.392884  3361 net.cpp:226] loss_bbox needs backward computation.
I0109 00:08:03.392889  3361 net.cpp:226] loss_cls needs backward computation.
I0109 00:08:03.392894  3361 net.cpp:226] bbox_pred needs backward computation.
I0109 00:08:03.392899  3361 net.cpp:226] cls_score needs backward computation.
I0109 00:08:03.392902  3361 net.cpp:226] fc7_drop7_0_split needs backward computation.
I0109 00:08:03.392907  3361 net.cpp:226] drop7 needs backward computation.
I0109 00:08:03.392910  3361 net.cpp:226] relu7 needs backward computation.
I0109 00:08:03.392913  3361 net.cpp:226] fc7 needs backward computation.
I0109 00:08:03.392915  3361 net.cpp:226] drop6 needs backward computation.
I0109 00:08:03.392917  3361 net.cpp:226] relu6 needs backward computation.
I0109 00:08:03.392920  3361 net.cpp:226] fc6 needs backward computation.
I0109 00:08:03.392922  3361 net.cpp:226] roi_pool_conv5 needs backward computation.
I0109 00:08:03.392925  3361 net.cpp:226] conv5_relu5_0_split needs backward computation.
I0109 00:08:03.392928  3361 net.cpp:226] relu5 needs backward computation.
I0109 00:08:03.392931  3361 net.cpp:226] conv5 needs backward computation.
I0109 00:08:03.392933  3361 net.cpp:226] relu4 needs backward computation.
I0109 00:08:03.392936  3361 net.cpp:226] conv4 needs backward computation.
I0109 00:08:03.392938  3361 net.cpp:226] relu3 needs backward computation.
I0109 00:08:03.392940  3361 net.cpp:226] conv3 needs backward computation.
I0109 00:08:03.392942  3361 net.cpp:226] pool2 needs backward computation.
I0109 00:08:03.392946  3361 net.cpp:226] norm2 needs backward computation.
I0109 00:08:03.392947  3361 net.cpp:226] relu2 needs backward computation.
I0109 00:08:03.392949  3361 net.cpp:226] conv2 needs backward computation.
I0109 00:08:03.392952  3361 net.cpp:226] pool1 needs backward computation.
I0109 00:08:03.392954  3361 net.cpp:226] norm1 needs backward computation.
I0109 00:08:03.392957  3361 net.cpp:226] relu1 needs backward computation.
I0109 00:08:03.392976  3361 net.cpp:226] conv1 needs backward computation.
I0109 00:08:03.392980  3361 net.cpp:228] data does not need backward computation.
I0109 00:08:03.392982  3361 net.cpp:270] This network produces output bbox_loss
I0109 00:08:03.392984  3361 net.cpp:270] This network produces output cls_loss
I0109 00:08:03.393026  3361 net.cpp:283] Network initialization done.
I0109 00:08:03.393139  3361 solver.cpp:60] Solver scaffolding done.
Solving...
/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/tools/../lib/roi_data_layer/minibatch.py:100: VisibleDeprecationWarning: using a non-integer number instead of an integer will result in an error in the future
  fg_inds, size=fg_rois_per_this_image, replace=False)
/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/tools/../lib/roi_data_layer/minibatch.py:120: VisibleDeprecationWarning: using a non-integer number instead of an integer will result in an error in the future
  labels[fg_rois_per_this_image:] = 0
/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/tools/../lib/roi_data_layer/minibatch.py:176: VisibleDeprecationWarning: using a non-integer number instead of an integer will result in an error in the future
  bbox_targets[ind, start:end] = bbox_target_data[ind, 1:]
/media/qwl/qwl/TPfiles/DL/RCNN/faster_rcnn-master/py-faster-rcnn-master/py-faster-rcnn-master/tools/../lib/roi_data_layer/minibatch.py:177: VisibleDeprecationWarning: using a non-integer number instead of an integer will result in an error in the future
  bbox_inside_weights[ind, start:end] = cfg.TRAIN.BBOX_INSIDE_WEIGHTS
I0109 00:08:04.427263  3361 solver.cpp:229] Iteration 0, loss = -nan
I0109 00:08:04.427296  3361 solver.cpp:245]     Train net output #0: bbox_loss = -nan (* 1 = -nan loss)
I0109 00:08:04.427302  3361 solver.cpp:245]     Train net output #1: cls_loss = 0.693147 (* 1 = 0.693147 loss)
I0109 00:08:04.427309  3361 sgd_solver.cpp:106] Iteration 0, lr = 0

hunterlew

Upvoted by:

I ran into this once; lowering the learning rate fixed it.
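
For reference, the solver dump in the log already shows base_lr: 0 (and lr = 0 at iteration 0), so if you go this route the value to check is base_lr in the stage-1 Fast R-CNN solver prototxt. Below is a minimal sketch of inspecting and lowering it with pycaffe's protobuf bindings; the solver path is an assumption, so point it at whichever file your train_faster_rcnn_alt_opt.py setup actually reads.

# Minimal sketch: inspect/lower base_lr in a Caffe solver prototxt.
# The path below is an assumption -- adjust it to the stage1 fast_rcnn solver
# your setup actually uses.
from caffe.proto import caffe_pb2
from google.protobuf import text_format

solver_path = 'models/pascal_voc/ZF/faster_rcnn_alt_opt/stage1_fast_rcnn_solver30k40k.pt'

solver = caffe_pb2.SolverParameter()
with open(solver_path) as f:
    text_format.Merge(f.read(), solver)

print('current base_lr: %s' % solver.base_lr)   # the log above shows 0
solver.base_lr = 0.0001                         # e.g. lower than the usual 0.001

with open(solver_path, 'w') as f:
    f.write(text_format.MessageToString(solver))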

lwslove911

Upvoted by:

You didn't load a pretrained model, did you (pretrained_model=None)? And training from scratch, this small amount of data is probably not enough either.
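
If you do want the pretrained initialization, the command in the log was launched without it. Here is a minimal sketch of relaunching with the ImageNet ZF weights; the --weights flag name and the data/imagenet_models/ZF.v2.caffemodel path follow the stock py-faster-rcnn layout (fetched by data/scripts/fetch_imagenet_models.sh), so treat both as assumptions and adjust them to your checkout.

# Minimal sketch: rerun alternating-optimization training with ImageNet weights
# instead of pretrained_model=None.  Flag names and paths assume the stock
# py-faster-rcnn layout; adjust to your checkout.
import subprocess

subprocess.check_call([
    'python', 'tools/train_faster_rcnn_alt_opt.py',
    '--gpu', '0',
    '--net_name', 'ZF',
    '--weights', 'data/imagenet_models/ZF.v2.caffemodel',  # ImageNet-pretrained ZF
    '--imdb', 'voc_2007_trainval',
])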

yp41155

Upvoted by:

I don't know whether you are still following this, but if you are, here is the cause I just found: when I annotated my XML files I started the coordinates from 0, while RBG's code assumes they start from 1, so it subtracts 1 when loading the ground truth. A log operation comes later, and with a 0 coordinate you effectively end up taking the log of a negative value, so right after the network is initialized you get
RuntimeWarning: invalid value encountered in log
  targets_dw = np.log(gt_widths / ex_widths)
Fix it the way the linked answer describes and the error goes away. Changing the learning rate does NOT help! Don't be misled by that; it only delays when the nan appears (at least for this kind of mistake).
http://stackoverflow.com/quest ... asets
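
To make the mechanism concrete, here is a small sketch (all box coordinates are made up) of what happens when 0-based annotations pass through the "- 1" in lib/datasets/pascal_voc.py, which stores boxes in a uint16 array, and the corrupted width then reaches np.log(gt_widths / ex_widths) in lib/fast_rcnn/bbox_transform.py:

# Sketch of the failure mode: a 0-based xmin becomes -1 after the "- 1" in
# _load_pascal_annotation(), wraps around in the uint16 box array, and the
# resulting garbage width turns the regression targets into nan.
import numpy as np

x1, y1, x2, y2 = 0.0, 10.0, 50.0, 60.0           # made-up 0-based annotation

# pascal_voc.py subtracts 1 (it assumes 1-based VOC coordinates) and stores the
# result in a uint16 array, so the -1 wraps around to 65535:
gt_box = np.array([x1 - 1, y1 - 1, x2 - 1, y2 - 1]).astype(np.uint16)
print(gt_box)                                    # x1 has wrapped to 65535

# bbox_transform() later computes widths from these boxes:
gt_width = gt_box[2].astype(np.float64) - gt_box[0] + 1.0   # huge negative "width"
ex_width = 51.0                                  # width of an ordinary proposal box

targets_dw = np.log(gt_width / ex_width)         # "invalid value encountered in log"
print(gt_width, targets_dw)                      # nan -> bbox targets and bbox_loss go nan

# Fix: annotate coordinates starting from 1 (the VOC convention), or drop the
# "- 1" in _load_pascal_annotation() when your annotations are already 0-based.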

南七骄傲 - a post-90s IT guy

Upvoted by:

You are training on your own dataset, right?
