
Building a Linear Regression Model with TensorFlow

A simple demo of a linear regression model implemented with TensorFlow.

import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

# Generate some sample points scattered around the line y = 0.1*x + 0.3
num_points = 1000
vectors_set = []
for i in range(num_points):
    x1 = np.random.normal(0.0, 0.55)
    y1 = x1 * 0.1 + 0.3 + np.random.normal(0.0, 0.03)
    vectors_set.append([x1, y1])

x_data = [v[0] for v in vectors_set]
y_data = [v[1] for v in vectors_set]
plt.scatter(x_data, y_data, c='r')
plt.show()
# Create a 1-D W variable, initialized with a random value in [-1, 1)
W = tf.Variable(tf.random_uniform([1], -1.0, 1.0), name='W')
# Initialize b to zero
b = tf.Variable(tf.zeros([1]), name='b')
y = W * x_data + b

# Use the mean squared error between the prediction y and the actual
# y_data as the loss: tf.square computes the squared differences,
# tf.reduce_mean averages them
loss = tf.reduce_mean(tf.square(y - y_data), name='loss')
# Optimize W and b with gradient descent; 0.5 is the learning rate
optimizer = tf.train.GradientDescentOptimizer(0.5)
# Training means letting the optimizer minimize the loss defined above
train = optimizer.minimize(loss, name='train')

# With the graph defined, create a session to actually run it
sess = tf.Session()
# Initialize the global variables
init = tf.global_variables_initializer()
sess.run(init)
# Print the initial W and b
print("W =", sess.run(W), "b =", sess.run(b), "loss =", sess.run(loss))
# Run 20 training steps
for step in range(20):
    sess.run(train)
    # Print the trained W and b after each step
    print("W =", sess.run(W), "b =", sess.run(b), "loss =", sess.run(loss))
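To make the optimizer's role concrete, here is a minimal plain-Python sketch (an assumption-laden illustration, not part of the original demo) of what one `optimizer.minimize(loss)` step does for this model: gradient descent on the mean squared error with learning rate 0.5. The data generation mirrors the demo; all names and the seed are illustrative.

```python
import random

# Data drawn from the same distribution as the demo:
# x ~ N(0, 0.55), y = 0.1*x + 0.3 + N(0, 0.03).  Seed is illustrative.
random.seed(0)
xs = [random.gauss(0.0, 0.55) for _ in range(1000)]
ys = [x * 0.1 + 0.3 + random.gauss(0.0, 0.03) for x in xs]

W, b, lr, n = 0.0, 0.0, 0.5, len(xs)
for _ in range(20):
    # Gradients of mean((W*x + b - y)^2) with respect to W and b
    dW = sum(2 * (W * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (W * x + b - y) for x, y in zip(xs, ys)) / n
    # One gradient-descent step, as GradientDescentOptimizer would take
    W -= lr * dW
    b -= lr * db

print(W, b)  # should approach 0.1 and 0.3
```

This is exactly the update rule TensorFlow applies under the hood, just computed by hand instead of via automatic differentiation.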
           

Output:

W = [0.24886155] b =  [0.] loss =  0.097556226
W = [0.20717728] b =  [0.30054685] loss =  0.0041697198
W = [0.17645806] b =  [0.30068] loss =  0.0025523046
W = [0.15452673] b =  [0.30077815] loss =  0.0017279172
W = [0.13886932] b =  [0.30084822] loss =  0.0013077314
W = [0.12769103] b =  [0.30089822] loss =  0.0010935644
W = [0.11971053] b =  [0.30093396] loss =  0.0009844047
W = [0.11401301] b =  [0.30095944] loss =  0.0009287667
W = [0.10994539] b =  [0.30097765] loss =  0.00090040814
W = [0.10704139] b =  [0.30099064] loss =  0.000885954
W = [0.10496815] b =  [0.3009999] loss =  0.0008785868
W = [0.103488] b =  [0.30100656] loss =  0.00087483175
W = [0.10243127] b =  [0.30101126] loss =  0.0008729178
W = [0.10167685] b =  [0.30101466] loss =  0.0008719423
W = [0.10113824] b =  [0.30101708] loss =  0.0008714451
W = [0.10075372] b =  [0.30101877] loss =  0.00087119173
W = [0.10047919] b =  [0.30102] loss =  0.0008710625
W = [0.1002832] b =  [0.3010209] loss =  0.00087099674
W = [0.10014328] b =  [0.30102152] loss =  0.00087096303
W = [0.10004338] b =  [0.30102196] loss =  0.000870946
W = [0.09997206] b =  [0.3010223] loss =  0.00087093737
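As a sanity check (not part of the original demo), the gradient-descent result can be compared against the closed-form least-squares fit, for instance with `np.polyfit`, on data drawn from the same distribution; both should recover a slope near 0.1 and an intercept near 0.3. The seed below is illustrative.

```python
import numpy as np

# Draw data from the same distribution as the demo:
# x ~ N(0, 0.55), y = 0.1*x + 0.3 + N(0, 0.03)
rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.55, size=1000)
y = x * 0.1 + 0.3 + rng.normal(0.0, 0.03, size=1000)

# A degree-1 polyfit returns [slope, intercept]: the closed-form
# least-squares solution to the same problem
slope, intercept = np.polyfit(x, y, 1)
print("slope =", slope, "intercept =", intercept)
```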
           

Plotting the fit:

# W converges toward 0.1, b toward 0.3, and the loss toward its noise floor
# Plot the fitted line over the scatter
plt.scatter(x_data, y_data, c='r')
plt.plot(x_data, sess.run(W) * x_data + sess.run(b))
plt.show()
           
