Advanced NumPy Learning

Preface: While writing code for the cs231n homework assignments, I found I didn't really understand vectorized computation and couldn't produce working code. My knowledge of the NumPy library was only surface-level.

Numpy

1. Solving a system of linear equations with numpy

To solve \(Ax=b\), compute \(x=A^{-1}b\):

import numpy as np

# example: A must be square and invertible for solve
A = np.array([[1, 2], [3, 4]])
b = np.transpose(np.array([[2, 1]]))
x = np.linalg.solve(A, b)
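When A is not square (more equations than unknowns, or fewer), np.linalg.solve raises an error; np.linalg.lstsq returns a least-squares solution instead. A minimal sketch with made-up numbers:

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
b = np.array([[5.0], [11.0], [17.0]])  # constructed as A @ [[1.], [2.]]

# lstsq returns (solution, residuals, rank, singular values)
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x)  # close to [[1.], [2.]]
```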

2. Multiple linear regression

The simplest case: ordinary least-squares linear regression.

Find a vector \(\beta\) such that \(X\beta\) is close to \(y\):
\(y = X\beta\)
\(\beta = (X^{T}X)^{-1}X^{T}y\)

# Solve the normal equations (X^T X) beta = X^T y
Xt = np.transpose(X)
XtX = np.dot(Xt,X)
Xty = np.dot(Xt,y)
beta = np.linalg.solve(XtX,Xty)
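As a sanity check on the normal-equation code above, here is a small self-contained run on synthetic data (the design matrix and coefficient values below are made up for illustration; with noise-free y, the true coefficients are recovered exactly):

```python
import numpy as np

rng = np.random.default_rng(0)
# Design matrix: intercept column of ones plus 2 random features
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
true_beta = np.array([[3.0], [1.5], [-2.0]])
y = X @ true_beta  # no noise, so beta should match true_beta

Xt = np.transpose(X)
beta = np.linalg.solve(np.dot(Xt, X), np.dot(Xt, y))
print(beta)  # close to [[3.], [1.5], [-2.]]
```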

A hands-on example project:

import csv
import numpy as np

def readData():
    X = []
    y = []
    with open('Housing.csv') as f:
        rdr = csv.reader(f)
        # Skip the header row
        next(rdr)
        # Read X and y
        for line in rdr:
            xline = [1.0]  # leading 1.0 is the intercept term
            for s in line[:-1]:
                xline.append(float(s))
            X.append(xline)
            y.append(float(line[-1]))
    return (X,y)

X0,y0 = readData()
# Convert all but the last 10 rows of the raw data to numpy arrays
d = len(X0)-10
X = np.array(X0[:d])
y = np.transpose(np.array([y0[:d]]))

# Compute beta
Xt = np.transpose(X)
XtX = np.dot(Xt,X)
Xty = np.dot(Xt,y)
beta = np.linalg.solve(XtX,Xty)
print(beta)

# Make predictions for the last 10 rows in the data set
for data,actual in zip(X0[d:],y0[d:]):
    x = np.array([data])
    prediction = np.dot(x,beta)
    print('prediction = '+str(prediction[0,0])+' actual = '+str(actual))
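The per-row prediction loop above can be vectorized, which is exactly the skill mentioned in the preface: stack the held-out rows into one matrix and do a single matrix product. A sketch with stand-in values (in the script above, the rows would come from X0[d:] and beta from the solve):

```python
import numpy as np

# Stand-in values for illustration; replace with X0[d:] and the fitted beta
X_test = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
beta = np.array([[1.0], [0.5]])

# One matrix product replaces the per-row loop
predictions = np.dot(X_test, beta)  # shape (3, 1), one prediction per row
print(predictions.ravel())  # [2.  2.5 3. ]
```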


Reprinted from www.cnblogs.com/GeekDanny/p/10458885.html