Loss goes to infinity

I built a network in Caffe out of three InnerProduct layers: the input is 3600-dimensional, ip1 outputs 300, ip2 outputs 50, and ip3 outputs 3600, with EuclideanLoss as the loss. The goal is to learn something like signal smoothing, where the output has the same number of points as the input. I have 500 training samples. Why does the loss blow up to infinity after only a few iterations (running via the MATLAB interface)? Where did I go wrong?
name: "FhrNet"
layer {
  name: "data"
  type: "Input"
  top: "fhr"
  top: "fhr_baseline"
  include {
    phase: TRAIN
  }
  input_param {
    shape {
      dim: 1
      dim: 1
      dim: 1
      dim: 3600
    }
    shape {
      dim: 1
      dim: 1
      dim: 1
      dim: 3600
    }
  }
}

layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "fhr"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 300
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}

layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 50
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}

layer {
  name: "ip3"
  type: "InnerProduct"
  bottom: "ip2"
  top: "ip3"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 3600
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}

layer {
  name: "loss"
  type: "EuclideanLoss"
  bottom: "ip3"
  bottom: "fhr_baseline"
  top: "loss"
}
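For reference, here is a minimal NumPy sketch of what this net computes (my own illustration, not generated from the prototxt): the 3600 → 300 → 50 → 3600 forward pass with no activation between the layers, and Caffe's EuclideanLoss, which is the sum of squared differences divided by 2N (N = batch size):

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier(fan_in, fan_out):
    # Caffe's "xavier" filler: uniform in [-sqrt(3/fan_in), sqrt(3/fan_in)]
    scale = np.sqrt(3.0 / fan_in)
    return rng.uniform(-scale, scale, size=(fan_in, fan_out))

W1 = xavier(3600, 300)   # ip1
W2 = xavier(300, 50)     # ip2
W3 = xavier(50, 3600)    # ip3

def forward(x):
    # No nonlinearity between layers, exactly as in the prototxt above
    return ((x @ W1) @ W2) @ W3

def euclidean_loss(pred, target):
    # Caffe EuclideanLoss: sum((a - b)^2) / (2 * batch_size)
    n = pred.shape[0]
    return np.sum((pred - target) ** 2) / (2.0 * n)

x = rng.standard_normal((1, 3600))
target = rng.standard_normal((1, 3600))
print(euclidean_loss(forward(x), target))
```

Two things this makes visible: with no activation functions the three InnerProduct layers collapse into a single linear map, and with a 3600-dimensional output the per-sample loss already starts in the thousands at initialization, so its gradients are correspondingly large.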


My solver settings:

net: "H:/caffe/matlab/fhr/fhr_train.prototxt"
base_lr: 0.1
momentum: 0.9
weight_decay: 0.0005
lr_policy: "step"
gamma: 0.1
stepsize: 10
max_iter: 500
solver_mode: CPU
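For what it's worth, under lr_policy: "step" the effective learning rate is base_lr * gamma^floor(iter / stepsize) (formula from the Caffe solver documentation; values taken from the config above):

```python
base_lr, gamma, stepsize = 0.1, 0.1, 10

def step_lr(it):
    # Caffe "step" policy: base_lr * gamma ^ floor(iter / stepsize)
    return base_lr * gamma ** (it // stepsize)

for it in (0, 9, 10, 50, 100):
    print(it, step_lr(it))
```

With stepsize: 10 the rate drops by 10x every 10 iterations, so it has effectively vanished long before max_iter: 500; and if the loss overflows within the first few iterations, the schedule never even gets a chance to kick in before the initial base_lr: 0.1 applies.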