Python TensorFlow CSV reader error: "Quote inside a string has to be escaped by another quote"


I have a CSV file which I'm fairly sure contains no quote characters, and I'm trying to read it with the following code:

filename_queue = tf.train.string_input_producer(["../data/train_no_empty_rows.txt"])
# train_no_empty_rows

reader = tf.TextLineReader()
key, value = reader.read(filename_queue)


record_defaults = [tf.constant(['p'], dtype=tf.string),    # Column 0
                   tf.constant(['p'], dtype=tf.string),    # Column 1
                   tf.constant(['p'], dtype=tf.string)]    # Column 2


col1, col2, col3 = tf.decode_csv(
    value, record_defaults=record_defaults, field_delim=" ")

features = tf.pack([col2, col3])
with tf.Session() as sess:
  # Start populating the filename queue.
  coord = tf.train.Coordinator()
  threads = tf.train.start_queue_runners(coord=coord)
  for i in range(1200):
    # Retrieve a single instance:
    example, label = sess.run([features, col1])

  coord.request_stop()
  coord.join(threads)
But when I run it, I get the following error:

InvalidArgumentError: Quote inside a string has to be escaped by another quote
 [[Node: DecodeCSV_25 = DecodeCSV[OUT_TYPE=[DT_STRING, DT_STRING, DT_STRING], 
field_delim=" ", 
_device="/job:localhost/replica:0/task:0/cpu:0"]
(ReaderRead_25:1, Const_75, Const_76, Const_77)]]
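(For context: tf.decode_csv expects RFC 4180-style quoting, where a literal double quote inside a field must itself be doubled inside a quoted field. Python's standard csv module follows the same convention, which makes for a quick illustration of what the reader does and does not accept:)

```python
import csv
import io

# Inside a quoted field, a literal " is written as "" (RFC 4180).
valid = 'a "say ""hi""" c\n'
row = next(csv.reader(io.StringIO(valid), delimiter=" "))
print(row)  # → ['a', 'say "hi"', 'c']

# A lone, unescaped " in the middle of a field is what trips the reader.
```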
I figured I could debug this, but the error doesn't tell me where the problematic entry is in the CSV file. It's a fairly large file, and the first 100 entries don't have this problem. As I said, I can't find any quote characters, and the entries seem to parse fine when I test them. Is there any way to find the troublesome entry?


Thanks

One way to find the troublesome entry is to add a tf.Print() op before the tf.decode_csv() op.

The last record printed before the failure should indicate which input is invalid. Hopefully the root cause becomes obvious once you make this modification:

# ...

# Prints out the contents of `key` and `value` every time the op executes.
value = tf.Print(value, [key, value])

col1, col2, col3 = tf.decode_csv(
    value, record_defaults=record_defaults, field_delim=" ")

# ...
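Alternatively, you could locate the bad row without running TensorFlow at all by scanning the file for quote characters with a few lines of plain Python. This is a hedged sketch (the helper name is made up, and the demo writes a small space-delimited file shaped like yours; in practice you would pass your "../data/train_no_empty_rows.txt" path instead):

```python
import os
import tempfile

def find_quote_lines(path):
    """Return (line_number, line) pairs for every line containing a double quote."""
    hits = []
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            if '"' in line:
                hits.append((lineno, line.rstrip("\n")))
    return hits

# Demo on a small space-delimited file with a stray quote on line 3.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write('a b c\nd e f\ng "h i\nj k l\n')
    demo_path = tmp.name

print(find_quote_lines(demo_path))  # → [(3, 'g "h i')]
os.remove(demo_path)
```

If the scan comes back empty, the offending byte may not be an ASCII `"` at all (e.g. a smart quote or other non-ASCII character), in which case the tf.Print() approach above is the more reliable way to pinpoint the record.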