A Simple BP Neural Network Implementation
2016-07-03 16:27
The cost function of a simple BP (backpropagation) neural network, from the Coursera Machine Learning programming assignment.
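For reference, the function below implements the standard regularized cross-entropy cost from the assignment, with the bias columns of Theta1 and Theta2 excluded from the regularization term:

$$
J(\Theta) = \frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}\Big[-y_k^{(i)}\log\big(a_{3,k}^{(i)}\big)-\big(1-y_k^{(i)}\big)\log\big(1-a_{3,k}^{(i)}\big)\Big] + \frac{\lambda}{2m}\Big[\sum_{j,k}\big(\Theta^{(1)}_{j,k}\big)^2+\sum_{j,k}\big(\Theta^{(2)}_{j,k}\big)^2\Big]
$$

Here $a_3$ is the output of the forward pass and $K$ equals num_labels.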
```matlab
function [J, grad] = nnCostFunction(nn_params, ...
                                    input_layer_size, ...
                                    hidden_layer_size, ...
                                    num_labels, ...
                                    X, y, lambda)
% nn_params is the unrolled parameter vector; reshape it back into
% the weight matrices Theta1 and Theta2.
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));
Theta2 = reshape(nn_params((1 + (hidden_layer_size * (input_layer_size + 1))):end), ...
                 num_labels, (hidden_layer_size + 1));

% Setup some useful variables
m = size(X, 1);

% You need to return the following variables correctly
J = 0;
Theta1_grad = zeros(size(Theta1));
Theta2_grad = zeros(size(Theta2));

%% Forward propagation
sigmoid = @(z)(1 ./ (1 + exp(-z)));
a1 = [ones(m, 1) X];
z2 = a1 * Theta1';
a2 = sigmoid(z2);
a2 = [ones(size(a2, 1), 1) a2];
z3 = a2 * Theta2';
a3 = sigmoid(z3);

%% Convert the labels y into one-hot row vectors, e.g. y = 5 becomes a
%% vector with a 1 in position 5 and zeros everywhere else.
y_classes = zeros(m, num_labels);
for i = 1:m
    y_classes(i, y(i)) = 1;
end

% Cost without regularization
J = sum(sum(-y_classes .* log(a3) - (1 - y_classes) .* log(1 - a3)) / m);

% The bias columns are not regularized
Theta1_no_bias = Theta1;
Theta1_no_bias(:, 1) = 0;
Theta2_no_bias = Theta2;
Theta2_no_bias(:, 1) = 0;

% Add the regularization term
J = J + (sum(sum(Theta1_no_bias .* Theta1_no_bias)) + ...
         sum(sum(Theta2_no_bias .* Theta2_no_bias))) * lambda / 2 / m;

%% Backpropagation: compute the error at each layer, then the gradients.
delta3 = a3 - y_classes;
delta2 = delta3 * Theta2(:, 2:end) .* sigmoidGradient(z2);
Theta1(:, 1) = 0;   % zero the bias columns so they are not regularized
Theta2(:, 1) = 0;
Theta1_grad = delta2' * a1 / m + lambda / m * Theta1;
Theta2_grad = delta3' * a2 / m + lambda / m * Theta2;

% Unroll gradients
grad = [Theta1_grad(:); Theta2_grad(:)];

end
```
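The function above calls sigmoidGradient, which the assignment provides as a separate helper file. A minimal version consistent with how it is used here (the element-wise derivative of the sigmoid) would be:

```matlab
function g = sigmoidGradient(z)
% Element-wise derivative of the sigmoid: g'(z) = sigmoid(z) .* (1 - sigmoid(z))
s = 1 ./ (1 + exp(-z));
g = s .* (1 - s);
end
```

And a sketch of how nnCostFunction is typically wired into training, assuming the assignment's data matrices X and y are loaded and its randInitializeWeights helper and fmincg optimizer are on the path; the layer sizes, lambda, and MaxIter below are illustrative values only:

```matlab
input_layer_size  = 400;   % e.g. 20x20 pixel input images
hidden_layer_size = 25;
num_labels        = 10;
lambda            = 1;

% Randomly initialize the weights and unroll them into a single vector
initial_Theta1 = randInitializeWeights(input_layer_size, hidden_layer_size);
initial_Theta2 = randInitializeWeights(hidden_layer_size, num_labels);
initial_nn_params = [initial_Theta1(:); initial_Theta2(:)];

% Wrap the cost function so the optimizer only sees the parameter vector
costFunction = @(p) nnCostFunction(p, input_layer_size, hidden_layer_size, ...
                                   num_labels, X, y, lambda);

options = optimset('MaxIter', 50);
[nn_params, cost] = fmincg(costFunction, initial_nn_params, options);
```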