Python: Using a TensorFlow Estimator and getting the error "InvalidArgumentError: assertion failed:"; tried seemingly similar solutions but none worked

Tags: python, machine-learning, google-colaboratory, tensorflow2.0

Here is the error message I am getting:

InvalidArgumentError                      Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/tensorflow/python/client/session.py in _do_call(self, fn, *args)
   1364     try:
-> 1365       return fn(*args)
   1366     except errors.OpError as e:

InvalidArgumentError: assertion failed: [Labels must be
import tensorflow as tf
from IPython.display import clear_output  # used below to clear the cell output (Colab/IPython)

def make_input_fn(data_df, label_df, num_epochs=10, shuffle=True, batch_size=32):
  def input_function():  # inner function, this will be returned
    ds = tf.data.Dataset.from_tensor_slices((dict(data_df), label_df))  # create tf.data.Dataset object with data and its label
    if shuffle:
      ds = ds.shuffle(1000)  # randomize order of data
    ds = ds.batch(batch_size).repeat(num_epochs)  # split dataset into batches and repeat for the number of epochs
    return ds  # return a batch of the dataset
  return input_function  # return a function object for use

train_input_fn = make_input_fn(dftrain, y_train)  # returns an input_function the estimator will call to get a tf.data.Dataset
eval_input_fn = make_input_fn(dfeval, y_eval, num_epochs=1, shuffle=False)

# Create Estimator
linear_est = tf.estimator.LinearClassifier(feature_columns=feature_columns, n_classes=len(y_train))

linear_est.train(train_input_fn)  # train
result = linear_est.evaluate(eval_input_fn)  # get model metrics/stats by testing on the evaluation data

# clears console output
clear_output()  

# the result variable is simply a dict of stats about our model
print(result['accuracy'])
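
For context, the snippet above assumes that dftrain, dfeval, y_train, y_eval and feature_columns were defined earlier; the code closely follows TensorFlow's Titanic LinearClassifier tutorial. A minimal sketch of that earlier setup, assuming the standard Titanic CSVs hosted by TensorFlow and that dataset's column names, would look roughly like this:

import pandas as pd
import tensorflow as tf

# Titanic survival dataset used in the TensorFlow estimator tutorial (assumed source of dftrain/dfeval)
dftrain = pd.read_csv('https://storage.googleapis.com/tf-datasets/titanic/train.csv')
dfeval = pd.read_csv('https://storage.googleapis.com/tf-datasets/titanic/eval.csv')
y_train = dftrain.pop('survived')  # labels are 0/1
y_eval = dfeval.pop('survived')

# assumed column split for this dataset
CATEGORICAL_COLUMNS = ['sex', 'n_siblings_spouses', 'parch', 'class', 'deck', 'embark_town', 'alone']
NUMERIC_COLUMNS = ['age', 'fare']

feature_columns = []
for name in CATEGORICAL_COLUMNS:
  vocab = dftrain[name].unique()  # distinct values observed in the training data
  feature_columns.append(tf.feature_column.categorical_column_with_vocabulary_list(name, vocab))
for name in NUMERIC_COLUMNS:
  feature_columns.append(tf.feature_column.numeric_column(name, dtype=tf.float32))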