
TensorFlow linear regression on house prices


I'm trying to solve a linear regression problem with a neural network, but my loss is on the order of powers of ten and does not decrease during training. I'm using the house price prediction dataset () and can't figure out what's wrong. Can someone please help?

import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(df2, y, test_size=0.2)
X_tr=np.array(X_train)
y_tr=np.array(y_train)

X_te=np.array(X_test)
y_te=np.array(y_test)

def get_weights(shape,name): # shape = (no. of input columns, no. of neurons)
    s=tf.truncated_normal(shape)
    w=tf.Variable(s,name=name)
    return w

def get_bias(number,name):
    s=tf.truncated_normal([number])
    b=tf.Variable(s,name=name)
    return b

x=tf.placeholder(tf.float32,name="input")
w=get_weights([34,100],'layer1')
b=get_bias(100,'bias1')

op=tf.matmul(x,w)+b
a=tf.nn.relu(op)

fl=get_weights([100,1],'output')
b2=get_bias(1,'bias2')


op2=tf.matmul(a,fl)+b2


y=tf.placeholder(tf.float32,name='target')
loss=tf.losses.mean_squared_error(y,op2)
optimizer = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    for i in range(0,1000):

        sess.run(tf.global_variables_initializer())
        _,l=sess.run([optimizer,loss],feed_dict={x:X_tr,y:y_tr})
        print(l)

You are randomly re-initializing the variables on every training step. Move
sess.run(tf.global_variables_initializer())
out of the loop and call it once, before training starts.
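A minimal sketch of the corrected loop. It mirrors the graph above but calls the initializer once, outside the loop; `tf.compat.v1` is used so it also runs under TensorFlow 2.x, and synthetic data, a smaller initializer stddev, and a 0.01 learning rate stand in for the original setup (those values are assumptions, not from the question):

```python
# Sketch: same 34 -> 100 -> 1 network, with variables initialized ONCE
# before the training loop instead of on every step.
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Synthetic stand-in for the housing data (assumption for the sketch).
rng = np.random.default_rng(0)
X_tr = rng.normal(size=(200, 34)).astype(np.float32)
true_w = rng.normal(size=(34, 1)).astype(np.float32)
y_tr = X_tr @ true_w + 0.1 * rng.normal(size=(200, 1)).astype(np.float32)

x = tf.compat.v1.placeholder(tf.float32, [None, 34], name="input")
y = tf.compat.v1.placeholder(tf.float32, [None, 1], name="target")

w = tf.Variable(tf.random.truncated_normal([34, 100], stddev=0.1), name="layer1")
b = tf.Variable(tf.zeros([100]), name="bias1")
a = tf.nn.relu(tf.matmul(x, w) + b)

fl = tf.Variable(tf.random.truncated_normal([100, 1], stddev=0.1), name="output")
b2 = tf.Variable(tf.zeros([1]), name="bias2")
op2 = tf.matmul(a, fl) + b2

loss = tf.compat.v1.losses.mean_squared_error(y, op2)
train = tf.compat.v1.train.GradientDescentOptimizer(0.01).minimize(loss)

losses = []
with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())  # once, before the loop
    for i in range(200):
        _, l = sess.run([train, loss], feed_dict={x: X_tr, y: y_tr})
        losses.append(l)
```

With the initializer inside the loop, every step wipes out the update the optimizer just made, so the loss can never go anywhere; moved outside, the loss decreases step by step.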

When I do that, I start getting "inf nan nan" as my loss.
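For reference, a loss that goes inf and then nan usually means gradient descent is diverging: with unscaled house-price features, a 0.1 learning rate easily overshoots, so scaling the inputs and lowering the rate is the usual remedy. The mechanism can be seen on a one-dimensional quadratic (a toy illustration, not the original model):

```python
# Toy illustration of divergence: gradient descent on f(w) = w**2.
# The update is w <- w - lr * 2*w = w * (1 - 2*lr); when |1 - 2*lr| > 1
# the iterates grow geometrically, overflow to inf, and inf - inf then
# yields nan -- the same inf -> nan pattern as the loss above.
import numpy as np

def descend(lr, steps=200, w0=1.0):
    w = np.float64(w0)
    with np.errstate(over="ignore", invalid="ignore"):
        for _ in range(steps):
            w = w - lr * 2 * w
    return w

print(descend(0.1))    # small step: converges toward 0
print(descend(600.0))  # oversized step: blows up to inf, then nan
```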