
Setting the regularization_type for all layers

2014-10-17 14:30
Many papers report that applying L2 normalization to the features can improve performance considerably, so I looked for ways to achieve this in Caffe. In caffe.proto, SolverParameter has an optional parameter named regularization_type, which can be switched between L1 and L2. There are two ways to set it: 1) in the solver prototxt file, set regularization_type: "L2" (or "L1"), just as you would set stepsize: 100000; 2) in caffe.proto itself, change the default value of regularization_type, which then applies whenever the parameter is not set in the solver prototxt file.
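
For the first approach, a minimal solver.prototxt sketch might look like the following (all values other than regularization_type are illustrative placeholders, not from an actual setup):

net: "train_val.prototxt"     # hypothetical net definition file
base_lr: 0.01
lr_policy: "step"
stepsize: 100000
weight_decay: 0.0005          # strength of the regularization term
regularization_type: "L2"     # switch to "L1" for L1 regularization
max_iter: 450000
solver_mode: GPU

For the second approach, the corresponding field in caffe.proto (inside SolverParameter) looks roughly like this; changing its default makes that type apply whenever solver.prototxt omits regularization_type (the field number may differ between Caffe versions):

optional string regularization_type = 29 [default = "L2"];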

The setting above mainly concerns the hidden layers. If we want to use an SVM-style classifier and change the norm it uses, we first need to declare the loss layer as HINGE_LOSS, as in the snippet below. SVMs tend to benefit from L2 normalization; a possible explanation is that, after normalization, the inner product corresponds to the cosine similarity.

layers {
  name: "loss"
  type: HINGE_LOSS
  bottom: "ip1"
  bottom: "label"
  top: "loss"
  hinge_loss_param {
    norm: L2  # ---> use the L2 norm instead of the default L1
  }
}
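
For reference, the hinge_loss_param block is defined in caffe.proto roughly as below (check your own copy of the file, as details may differ between versions); the value 2 written in the original snippet corresponds to the enum name L2 used in prototxt:

message HingeLossParameter {
  enum Norm {
    L1 = 1;
    L2 = 2;
  }
  // Norm used by the hinge loss; L1 is the default.
  optional Norm norm = 1 [default = L1];
}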