function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta

predictions = 1 ./ (1 + exp(-X*theta));                 % sigmoid hypothesis h_theta(x)
J = (-y'*log(predictions) - (1 - y')*log(1 - predictions)) / m;   % unregularized cost
reglutheta = theta(2:size(theta,1), 1);                 % exclude theta(1) from regularization
J = J + (lambda/(2*m)) * sum(reglutheta.^2);            % add the regularization term
grad = (X' * (predictions - y)) ./ m;                   % unregularized gradient
grad = [grad(1,:); grad(2:size(grad,1),:) + (lambda/m)*reglutheta];  % regularize all but theta(1)

% =============================================================

end
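For a quick sanity check, the function can be called directly from the Octave/MATLAB prompt. The sketch below uses a small made-up dataset (the values are illustrative only, not the exercise's real data); with theta initialized to zeros, every prediction is 0.5, so the cost should come out to about 0.693 (that is, -log(0.5)) and the regularization term contributes nothing.

% Hypothetical toy data for a quick check (not the exercise dataset)
X = [ones(3,1), [1 2; 2 3; 3 4]];     % 3 examples: intercept column plus 2 features
y = [0; 1; 1];                        % binary labels
initial_theta = zeros(size(X, 2), 1); % start at theta = 0
lambda = 1;
[J, grad] = costFunctionReg(initial_theta, X, y, lambda);
fprintf('Cost at initial theta (zeros): %f\n', J);   % expect about 0.693147
fprintf('Gradient at initial theta:\n');
fprintf(' %f\n', grad);

In the assignment itself, this cost/gradient pair is typically handed to an optimizer such as fminunc via a function handle like @(t) costFunctionReg(t, X, y, lambda), which is why the function returns both J and grad.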
Andrew Ng's Machine Learning programming assignment 6: costFunctionReg (regularized cost function)
Reposted from blog.csdn.net/melon__/article/details/80732732