
Neural network on the California housing data in Python


I am trying to write a neural network that is trained on the California housing dataset, which I got from Aurélien Géron's GitHub. But when I run the code, the network does not train and the loss is nan. Can someone tell me what I am doing wrong? Best regards, Robin

Link to the csv file:

My code:

import numpy
import pandas as pd
from keras.models import Sequential
from keras.layers import Dense


# load dataset
df = pd.read_csv("housing.csv", delimiter=",", header=0)
# split into input (X) and output (Y) variables
Y = df["median_house_value"].values
X = df.drop("median_house_value", axis=1)
# Inland / Not Inland -> True / False = 1 / 0
X["ocean_proximity"] = X["ocean_proximity"]== "INLAND"
X=X.values


X= X.astype(float)
Y= Y.astype(float)

model = Sequential()
model.add(Dense(100, activation="relu", input_dim=9))
model.add(Dense(1, activation="linear"))
# Compile model
model.compile(loss="mean_squared_error", optimizer="adam")


model.fit(X, Y, epochs=50, batch_size=1000, verbose=1)

I found the error: there is a missing value in the "total_bedrooms" column.
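
For reference, a quick way to confirm which column contains the missing values (a minimal sketch, assuming the same housing.csv as in the question):

import pandas as pd

# Count the missing values per column; in this dataset only
# total_bedrooms usually contains NaNs.
df = pd.read_csv("housing.csv", delimiter=",", header=0)
print(df.isnull().sum())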

You need to remove the NaN values from the data.

After a quick look at the data, you also need to normalize it (as always when working with neural networks, to help convergence).


For this you can use a StandardScaler, MinMaxScaler, etc.
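
A minimal sketch of that preprocessing with scikit-learn (assuming the same housing.csv and column names as in the question; dropping the affected rows and using StandardScaler are just one possible choice):

import pandas as pd
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("housing.csv", delimiter=",", header=0)
# Drop the rows with missing values (only total_bedrooms is affected here)
df = df.dropna()

Y = df["median_house_value"].values.astype(float)
X = df.drop("median_house_value", axis=1)
X["ocean_proximity"] = X["ocean_proximity"] == "INLAND"
X = X.values.astype(float)

# Standardize the features to zero mean and unit variance to help convergence
scaler = StandardScaler()
X = scaler.fit_transform(X)

Whether or not you also scale the target only changes the magnitude of the reported loss; the important part is that the features are on a comparable scale before training.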

The NaN values in the dataframe cause this behavior. Remove the rows with NaN values and normalize the data:

df = df[~df.isnull().any(axis=1)]
df.iloc[:, :-1] = (df.iloc[:, :-1] - df.iloc[:, :-1].min()) / (df.iloc[:, :-1].max() - df.iloc[:, :-1].min())
You will get:

Epoch 1/50
 1000/20433 [>.............................] - ETA: 3s - loss: 0.1732
20433/20433 [==============================] - 0s 11us/step - loss: 0.1001
Epoch 2/50
 1000/20433 [>.............................] - ETA: 0s - loss: 0.0527
20433/20433 [==============================] - 0s 3us/step - loss: 0.0430
Epoch 3/50
 1000/20433 [>.............................] - ETA: 0s - loss: 0.0388
20433/20433 [==============================] - 0s 2us/step - loss: 0.0338
Epoch 4/50
 1000/20433 [>.............................] - ETA: 0s - loss: 0.0301
20433/20433 [==============================] - 0s 2us/step - loss: 0.0288
Epoch 5/50
 1000/20433 [>.............................] - ETA: 0s - loss: 0.0300
20433/20433 [==============================] - 0s 2us/step - loss: 0.0259
Epoch 6/50
 1000/20433 [>.............................] - ETA: 0s - loss: 0.0235
20433/20433 [==============================] - 0s 3us/step - loss: 0.0238
Epoch 7/50
 1000/20433 [>.............................] - ETA: 0s - loss: 0.0242
20433/20433 [==============================] - 0s 2us/step - loss: 0.0225
Epoch 8/50
 1000/20433 [>.............................] - ETA: 0s - loss: 0.0213
20433/20433 [==============================] - 0s 2us/step - loss: 0.0218
Epoch 9/50
 1000/20433 [>.............................] - ETA: 0s - loss: 0.0228
20433/20433 [==============================] - 0s 2us/step - loss: 0.0214
Epoch 10/50
 1000/20433 [>.............................] - ETA: 0s - loss: 0.0206
20433/20433 [==============================] - 0s 2us/step - loss: 0.0211