UFLDL Tutorial Exercise Answers (4): Softmax Regression
2016-11-21 13:13
Tutorial: http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutorial
Exercise: http://deeplearning.stanford.edu/wiki/index.php/Exercise:Softmax_Regression
Code
Step 0: Initialize constants and parameters (code provided)
Step 1: Load data (code provided)
Adjust the load paths below to match your own directory layout and file names:
If the .mat file is in the current directory (next to the .m files): load ***.mat;
If it is in a subdirectory: load .\subdirectory_name\***.mat;
If it is in the parent directory: load ..\***.mat;
If it is in a sibling folder: load ..\sibling_folder\***.mat;
images = loadMNISTImages('./mnist/train-images.idx3-ubyte');
labels = loadMNISTLabels('./mnist/train-labels.idx1-ubyte');
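As a quick sanity check, you can confirm the shapes of what was loaded. This is a minimal sketch assuming the tutorial's MNIST helper functions, which return images as a pixels-by-examples double matrix (784 x 60000) and labels as a column vector; the provided script also remaps label 0 to 10 so that labels run from 1 to numClasses, as assumed in Step 5.
% Sanity check on the loaded MNIST data (assumes the UFLDL loader conventions:
% images is 784 x 60000 in [0,1], labels is 60000 x 1 with classes 1..10 after remapping).
assert(size(images, 2) == numel(labels), 'image/label count mismatch');
fprintf('Loaded %d examples with %d pixels each.\n', size(images, 2), size(images, 1));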
Step 2: Implement softmaxCost (softmaxCost.m)
%% ---------- YOUR CODE HERE --------------------------------------
% Instructions: Compute the cost and gradient for softmax regression.
% You need to compute thetagrad and cost.
% The groundTruth matrix might come in handy.
M1 = theta*data;                          % numClasses x numCases
M2 = bsxfun(@minus, M1, max(M1,[],1));    % max(M1,[],1) is a row vector holding each column's maximum;
                                          % subtracting it from every entry of that column prevents
                                          % overflow in exp without changing the resulting probabilities
M3 = exp(M2);
p = bsxfun(@rdivide, M3, sum(M3));        % probability matrix: the hypothesis, obtained by dividing each column by its sum
cost = -1/numCases*sum(sum(groundTruth.*log(p))) + lambda/2*sum(sum(theta.^2));
thetagrad = -1/numCases*((groundTruth - p)*data') + lambda*theta;   % numClasses x inputSize
% ------------------------------------------------------------------
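For reference, the vectorized code above corresponds to the tutorial's softmax cost with weight decay, where m = numCases, k = numClasses, and groundTruth(j,i) = 1{y^(i) = j} (in the provided softmaxCost.m this matrix is already built, typically as full(sparse(labels, 1:numCases, 1))):

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{j=1}^{k} 1\{y^{(i)} = j\} \log \frac{e^{\theta_j^T x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^T x^{(i)}}} + \frac{\lambda}{2} \sum_{i,j} \theta_{ij}^2

\nabla_{\theta_j} J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \Big[ x^{(i)} \big( 1\{y^{(i)} = j\} - p(y^{(i)} = j \mid x^{(i)}; \theta) \big) \Big] + \lambda \theta_j

Subtracting each column's maximum before exponentiating leaves these probabilities unchanged while keeping exp from overflowing, which is exactly what the M2 line does.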
Step 3: Gradient checking (code provided)
This step verifies that the analytic gradient is computed correctly. Set DEBUG to "true" while debugging; the check runs only when DEBUG is "true" and is skipped during actual training and prediction.
Step 4: Learning parameters (code provided)
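For orientation, Step 4 in the provided softmaxExercise.m trains the model with L-BFGS via minFunc. A rough sketch of the call is below; the softmaxTrain signature, option name, and result field are taken from the starter code and may differ slightly in your copy.
options.maxIter = 100;   % number of L-BFGS iterations used by minFunc
softmaxModel = softmaxTrain(inputSize, numClasses, lambda, inputData, labels, options);
% softmaxModel.optTheta holds the learned numClasses x inputSize parameter matrix
% that Step 5 passes to softmaxPredict.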
Step 5: Testing (softmaxPredict.m)
%% ---------- YOUR CODE HERE --------------------------------------
% Instructions: Compute pred using theta assuming that the labels start
% from 1.
[~,pred] = max(theta * data);   % pred holds the row index of each column's maximum, i.e. the predicted class of each sample
% ---------------------------------------------------------------------
Note: set DEBUG to "false" when actually running training and testing.
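Once pred is returned, the exercise script measures accuracy by comparing it against the test labels; a minimal version of that check (assuming labels and pred have the same length) is:
acc = mean(labels(:) == pred(:));        % fraction of correctly classified test examples
fprintf('Accuracy: %0.3f%%\n', acc * 100);
The exercise page reports an accuracy of about 92.6% for a correct implementation with the default settings.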