
The "ReQU" unit

Here, we'll implement a made-up activation function that we'll call the Rectified Quadratic Unit (ReQU). Like the sigmoid, the ReLU, and several others, it is applied element-wise to all of its inputs:

z_i = \mathbb{I}[x_i > 0]\, x_i^2 = \begin{cases} x_i^2 & \text{if } x_i > 0 \\ 0 & \text{otherwise} \end{cases}
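As a quick sanity check of the definition, the same element-wise rule can be sketched in NumPy (the function name `requ` here is ours, not part of any library):

```python
import numpy as np

def requ(x):
    # Square the positive entries and zero out the rest:
    # z_i = x_i^2 if x_i > 0, else 0
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x * x, 0.0)

# Only 0.5 and 3.0 pass the threshold, giving 0.25 and 9.0
print(requ([-2.0, -0.5, 0.0, 0.5, 3.0]))
```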

require 'nn'

local ReQU = torch.class('nn.ReQU', 'nn.Module')

function ReQU:updateOutput(input)
  -- Square the positive entries and zero out the rest:
  -- z_i = x_i^2 if x_i > 0, else 0.
  self.output:resizeAs(input):copy(input)
  self.output[torch.lt(self.output, 0)] = 0
  self.output:pow(2)
  return self.output
end

function ReQU:updateGradInput(input, gradOutput)
  -- dz_i/dx_i = 2*x_i where x_i > 0 and 0 otherwise, so by the chain rule
  -- gradInput_i = gradOutput_i * 2 * input_i wherever input_i > 0.
  self.gradInput:resizeAs(gradOutput):copy(gradOutput)
  self.gradInput[torch.lt(input, 0)] = 0
  self.gradInput:mul(2):cmul(input)
  return self.gradInput
end
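A quick way to convince yourself that the backward pass is correct is a finite-difference gradient check. The sketch below redoes the forward/backward math in NumPy (the names `requ_forward` and `requ_backward` are ours) and compares the analytic gradient, `2 * x * grad_out` masked to `x > 0`, against a central difference:

```python
import numpy as np

def requ_forward(x):
    # z_i = x_i^2 if x_i > 0, else 0
    return np.where(x > 0, x * x, 0.0)

def requ_backward(x, grad_out):
    # dz_i/dx_i = 2*x_i where x_i > 0, else 0; chain rule with grad_out
    return np.where(x > 0, 2.0 * x * grad_out, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
grad_out = rng.standard_normal(5)
analytic = requ_backward(x, grad_out)

eps = 1e-6
numeric = np.empty_like(x)
for i in range(len(x)):
    xp, xm = x.copy(), x.copy()
    xp[i] += eps
    xm[i] -= eps
    # d(sum(z * grad_out))/dx_i estimated by central differences
    numeric[i] = (np.sum(requ_forward(xp) * grad_out)
                  - np.sum(requ_forward(xm) * grad_out)) / (2 * eps)

# The two gradients should agree to within floating-point noise
print(np.max(np.abs(analytic - numeric)))
```

The same check can be run against the Torch module itself with `nn.Jacobian`, which is how these practicals typically verify a hand-written `updateGradInput`.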