
Python KeyError: 0 when fitting with an sklearn pipeline / Keras


I have searched everywhere, but nothing has helped with this "KeyError: 0".

What is trainInput? Check with:
type(trainInput)
If trainInput is a pd.DataFrame, then appending .values converts it to an np.array. — trainInput has already been cast to a numpy array via np.array(); I did also try .values just to see whether it would make a difference, and unfortunately it doesn't.
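The distinction the comment is pointing at can be shown with a small, self-contained example (the DataFrame here is hypothetical, standing in for `data`): on a DataFrame with named columns, `df[0]` is a *column* lookup and raises exactly this KeyError, while the same integer index on the underlying numpy array selects the first row.

```python
import pandas as pd

# Hypothetical DataFrame with string column names, standing in for `data`.
df = pd.DataFrame({"a": [1.0, 2.0], "b": [3.0, 4.0]})

# df[0] looks up a COLUMN named 0, not the first row -> KeyError: 0
try:
    df[0]
except KeyError as e:
    print("KeyError:", e)

# On the numpy array, integer indexing is positional and works as intended.
arr = df.values
print(arr[0])           # first row
print(len(arr[0]) - 1)  # number of columns minus one, as used in the question
```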
Using TensorFlow backend.
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/indexes/base.py", line 3078, in get_loc
    return self._engine.get_loc(key)
  File "pandas/_libs/index.pyx", line 140, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/index.pyx", line 162, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/hashtable_class_helper.pxi", line 1492, in pandas._libs.hashtable.PyObjectHashTable.get_item
  File "pandas/_libs/hashtable_class_helper.pxi", line 1500, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 0

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "eval.py", line 99, in <module>
    pipeline.fit(trainInput[:,0:len(trainInput[0])-1],trainInput[:,len(trainInput[0])-1])
  File "/home/sillerd/.local/lib/python3.5/site-packages/sklearn/pipeline.py", line 250, in fit
    self._final_estimator.fit(Xt, y, **fit_params)
  File "/home/sillerd/.local/lib/python3.5/site-packages/keras/wrappers/scikit_learn.py", line 141, in fit
    self.model = self.build_fn(**self.filter_sk_params(self.build_fn))
  File "eval.py", line 79, in baseline_model
    model.add(Dense(9,input_dim=len(data[0])-1,kernel_initializer='normal',activation='tanh'))
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/frame.py", line 2688, in __getitem__
    return self._getitem_column(key)
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/frame.py", line 2695, in _getitem_column
    return self._get_item_cache(key)
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/generic.py", line 2489, in _get_item_cache
    values = self._data.get(item)
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/internals.py", line 4115, in get
    loc = self.items.get_loc(item)
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/indexes/base.py", line 3080, in get_loc
    return self._engine.get_loc(self._maybe_cast_indexer(key))
  File "pandas/_libs/index.pyx", line 140, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/index.pyx", line 162, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/hashtable_class_helper.pxi", line 1492, in pandas._libs.hashtable.PyObjectHashTable.get_item
  File "pandas/_libs/hashtable_class_helper.pxi", line 1500, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 0

The error above results from fitting items taken from pandas, both when using iloc and when casting the items into what appears to be a properly formatted 2D array.
data = data.interpolate(method="nearest")
data = data.fillna(0)
currData = currData.interpolate(method="nearest")
currData = currData.fillna(0)

def baseline_model():
    model = Sequential()
    model.add(Dense(9,input_dim=len(data[0])-1,kernel_initializer='normal',activation='tanh'))
    model.add(Dense(4,kernel_initializer='normal',activation='tanh'))
    model.add(Dense(1,kernel_initializer='normal',activation='tanh'))
    sgd1 = SGD(lr=1, decay=0.001, momentum=0.9, nesterov=True)
    model.compile(loss='mean_squared_error',optimizer=sgd1)
    return model

estimators = []
estimators.append(('standardize', StandardScaler()))
estimators.append(('mlp', KerasRegressor(build_fn=baseline_model, epochs=1000, batch_size=5, verbose=1)))
pipeline = Pipeline(estimators)

numColumns = len(data.columns.values.tolist())

trainInput = np.array(data)
inputs     = np.array(currData)


pipeline.fit(trainInput[:,0:len(trainInput[0])-1],trainInput[:,len(trainInput[0])-1])
currData["predictions"] = pipeline.predict(inputs[:,0:len(trainInput[0])-1])
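Note that the traceback points into `baseline_model`, not into the `pipeline.fit` call itself: `baseline_model()` reads the *global* `data`, which is still a DataFrame, so `len(data[0])` performs a column lookup for the key `0` and raises the KeyError even though `trainInput` was converted with `np.array()`. A minimal sketch of one possible fix (the DataFrame below is a hypothetical stand-in, and deriving `input_dim` from `shape` is my assumption, not from the original post) is to compute the input dimension from the array's shape instead of indexing the DataFrame:

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for `data`: 3 feature columns plus 1 target column.
data = pd.DataFrame(np.random.rand(10, 4), columns=list("abcd"))

# This is what the question's code does; the global DataFrame is untouched.
trainInput = np.array(data)

# len(data[0]) would do a column lookup on the DataFrame -> KeyError: 0.
# Using the converted array's shape avoids indexing the DataFrame entirely.
input_dim = trainInput.shape[1] - 1
print(input_dim)  # 3
```

With this change, `baseline_model` would use `input_dim=trainInput.shape[1] - 1` (or `len(data.columns) - 1`, which the code already computes as `numColumns`) rather than `len(data[0]) - 1`.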