TF Dataset: ERROR:tensorflow: Error reported to Coordinator: No gradients provided for any variable (TensorFlow 2.2.0)

When I run the code below, I get the error
INFO:tensorflow:Error reported to Coordinator: No gradients provided for any variable:

You will need the following to run the code (a quick environment check is sketched after the list):

  • TensorFlow 2.2.0
  • efficientnet
  • keras-bert
  • numpy
  • pandas
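
A minimal way to confirm those packages are available, assuming they were installed from PyPI under the names efficientnet and keras-bert (e.g. pip install tensorflow==2.2.0 efficientnet keras-bert numpy pandas):

import tensorflow, efficientnet, keras_bert, numpy, pandas
print('TensorFlow version =', tensorflow.__version__)  # should print 2.2.0
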
You will also need to download the pre-trained BERT model weights (uncased_L-12_H-768_A-12).

The code uses the TensorFlow Dataset API to generate the data on the fly.

import tensorflow
print('TensorFlow version =', tensorflow.__version__)

AUTO = tensorflow.data.experimental.AUTOTUNE

import efficientnet.tfkeras as efn
from efficientnet.tfkeras import preprocess_input

from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D, Dropout, Input, Embedding, LSTM, Add
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.preprocessing.image import img_to_array as img_to_array
from tensorflow.keras.preprocessing.image import load_img as load_img
from tensorflow.keras.optimizers import SGD

import codecs
from keras_bert import load_trained_model_from_checkpoint
import ast
import pandas as pd
from keras_bert import Tokenizer
import numpy as np
import os
import random

os.environ['TF_KERAS'] = '1'  # tell keras-bert to use tf.keras

pretrained_path = '../Data/BERT/uncased_L-12_H-768_A-12'
config_path = os.path.join(pretrained_path, 'bert_config.json')
checkpoint_path = os.path.join(pretrained_path, 'bert_model.ckpt')
vocab_path = os.path.join(pretrained_path, 'vocab.txt')
SEQ_LEN = 128

token_dict = {}
with codecs.open(vocab_path, 'r', 'utf8') as reader:
    for line in reader:
        token = line.strip()
        token_dict[token] = len(token_dict)
tokenizer = Tokenizer(token_dict)

EPOCHS = 5
NUM_CLASSES = 10

def get_model(base_model, bert_model, NUM_CLASSES, embedding_size=768):
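    # Image branch: EfficientNet features -> Dropout -> pooling -> Dense layers down to embedding_size;
    # this is summed with BERT's NSP-Dense output and fed to a softmax classifier.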
    # add a global spatial average pooling layer
    x = base_model.output

    x = Dropout(0.05)(x)
    x = GlobalAveragePooling2D()(x)
    x = Dense(1024, activation='relu')(x)
    x = Dense(embedding_size, activation='relu')(x)

    # sequence model
    dense = bert_model.get_layer('NSP-Dense').output

    # decoder model
    decoder1 = Add()([x, dense])
    decoder2 = Dense(embedding_size, activation='relu')(decoder1)
    output = Dense(NUM_CLASSES, activation='softmax', name='output')(decoder2)

    # tie it together 
    model = Model(inputs={'input_1': base_model.input, \
                         'Input-Token': bert_model.inputs[0],\
                         'Input-Segment': bert_model.inputs[1]}, \
                  outputs={'output': output})
    return model

gpus = tensorflow.config.list_physical_devices('GPU')
print(gpus)
if len(gpus) == 1:
    strategy = tensorflow.distribute.OneDeviceStrategy(device="/gpu:0")
else:
    strategy = tensorflow.distribute.MirroredStrategy()

max_length = 20

DIM = 224

with strategy.scope():
    base_model = efn.EfficientNetB4(weights='imagenet', include_top=False, input_shape=(DIM,DIM,3))  # or weights='noisy-student'

    bert_model = load_trained_model_from_checkpoint(
            config_path,
            checkpoint_path,
            training=True,
            trainable=True,
            seq_len=SEQ_LEN,
        )

    model = get_model(base_model, bert_model, NUM_CLASSES)
    model.compile(optimizer=SGD(lr=.00001, momentum = 0.9), loss ='categorical_crossentropy', metrics=['categorical_accuracy'])

def doaugmentation(img, rand_num=None):
    
    if rand_num is None:
        rand_num = random.randint(0, 2)
    
    if rand_num == 0 :
        return img
    elif rand_num == 1: # brightness
        return tensorflow.image.random_brightness( img, 0.4, seed=1 )
    else:
        return img
        
def get_dataset(csv_path, mode, batch_size, data_path, debug=False):
    if debug: print ('[+] Inside the data function')
    df = pd.read_csv(csv_path)
    if debug: print ('[+] Read the csv file; shape=', df.shape)
    image_paths = df.apply(lambda x: os.path.join(data_path, x['image_name']), axis=1).tolist()
    if debug: print ('[+] Image paths received')
    descriptions = df['text'].apply(lambda x: x.lower()).tolist()
    if debug: print ('[+] Descriptions lower cased')
    
    if mode != 'test': ## output
        if debug: print ('[+] Mode= {}'.format(mode))
        output = df['output'].apply(lambda x: ast.literal_eval(x)).tolist()
        if debug: print ('[+] All Ids received')
        dataset = tensorflow.data.Dataset.from_tensor_slices((image_paths, descriptions, output))
        if debug: print ('[+] Tensor to slices done')
        dataset = dataset.shuffle(len(df))
        if debug: print ('[+] Dataset shuffled')
    else:
        dataset = tensorflow.data.Dataset.from_tensor_slices((image_paths, descriptions, [None]*len(image_paths)))
              
    dataset = dataset.batch(batch_size)
    if debug: print ('[+] Batch generated')
    dataset = dataset.map(lambda img_path, description, output: tensorflow.py_function(process_data,\
                          [img_path, description, output],\
                          [tensorflow.float32, tensorflow.float32, tensorflow.float32, tensorflow.int32]), num_parallel_calls=AUTO)
    if debug: print ('[+] Final Map done')
    dataset = dataset.map(split, num_parallel_calls=AUTO)
    if debug: print ('[+] Prefetching now...')
    dataset = dataset.prefetch(AUTO)
    return dataset

def split(image, description, description_like, output):
    # repack the py_function outputs into named inputs/outputs for model.fit
    return {'input_2': image, 'Input-Token': description, 'Input-Segment': description_like, 'output': output}

def process_data(img_paths, descriptions, output):
    global DIM
    images = [process_image(img_path, DIM) for img_path in img_paths.numpy()]
    description = [process_text(d)[0] for d in descriptions]
    description_like = [process_text(d)[1] for d in descriptions]
    if output[0].numpy().any() == None:
        return images, description, description_like
    return images, description, description_like, output

def process_image(img_path, im_size):
    image_string = tensorflow.io.read_file(img_path)
    image = tensorflow.image.decode_jpeg(image_string, channels=3)
    image = tensorflow.image.convert_image_dtype(image, tensorflow.float32)
    image = tensorflow.image.resize(image, [im_size, im_size])
    return image

def process_text(text):
    global tokenizer, SEQ_LEN
    description = tokenizer.encode(tensorflow.compat.as_str_any(text.numpy()), max_len=SEQ_LEN)[0]
    description_like = np.zeros_like(description)  # segment ids: all zeros for a single segment
    return description, description_like

batch_size  = 1
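
# file_name and dir_path are used below but never defined in the posted code; the values
# here are hypothetical placeholders for the CSV shown further down and for the directory
# containing picture1.jpeg / picture2.png.
file_name = 'data.csv'
dir_path = '.'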

dataset_train = get_dataset(file_name, 'train', batch_size, dir_path, True)
dataset_val = get_dataset(file_name, 'val', batch_size, dir_path, True)

H = model.fit(x=dataset_train,
              validation_data=dataset_val,
              verbose=1,
              epochs=1)

You will also need this CSV file in the same directory:

text,image_name,output
Honeywell MN Series Portable Air Conditioner with Dehumidifier & Fan for Rooms Up To 450 Sq. Ft.,picture1.jpeg,"[1, 0, 0, 0, 0, 0, 0]"
"TCL 10,000 BTU White Window Air Conditioner with Wi-Fi",picture2.png,"[1, 0, 0, 0, 0, 0, 0]"
Honeywell MN Series Portable Air Conditioner with Dehumidifier & Fan for Rooms Up To 450 Sq. Ft.,picture1.jpeg,"[1, 0, 0, 0, 0, 0, 0]"
"TCL 10,000 BTU White Window Air Conditioner with Wi-Fi",picture2.png,"[1, 0, 0, 0, 0, 0, 0]"
Honeywell MN Series Portable Air Conditioner with Dehumidifier & Fan for Rooms Up To 450 Sq. Ft.,picture1.jpeg,"[1, 0, 0, 0, 0, 0, 0]"
"TCL 10,000 BTU White Window Air Conditioner with Wi-Fi",picture2.png,"[1, 0, 0, 0, 0, 0, 0]"
Honeywell MN Series Portable Air Conditioner with Dehumidifier & Fan for Rooms Up To 450 Sq. Ft.,picture1.jpeg,"[1, 0, 0, 0, 0, 0, 0]"
"TCL 10,000 BTU White Window Air Conditioner with Wi-Fi",picture2.png,"[1, 0, 0, 0, 0, 0, 0]"
as well as the images it references (picture1.jpeg and picture2.png).
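
The original pictures are not included; just to exercise the data pipeline, two placeholder images with the expected file names could be generated with the packages already listed (a sketch, not part of the original code):

import numpy as np
import tensorflow
dummy = tensorflow.constant(np.zeros((224, 224, 3), dtype=np.uint8))
tensorflow.io.write_file('picture1.jpeg', tensorflow.io.encode_jpeg(dummy))
tensorflow.io.write_file('picture2.png', tensorflow.io.encode_png(dummy))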

This is the full error I get:


---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-31-f9c0025d2202> in <module>
      2               validation_data=dataset_val,
      3                     verbose=1,
----> 4                     epochs=1)

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\training.py in _method_wrapper(self, *args, **kwargs)
     64   def _method_wrapper(self, *args, **kwargs):
     65     if not self._in_multi_worker_mode():  # pylint: disable=protected-access
---> 66       return method(self, *args, **kwargs)
     67 
     68     # Running inside `run_distribute_coordinator` already.

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_batch_size, validation_freq, max_queue_size, workers, use_multiprocessing)
    846                 batch_size=batch_size):
    847               callbacks.on_train_batch_begin(step)
--> 848               tmp_logs = train_function(iterator)
    849               # Catch OutOfRangeError for Datasets of unknown size.
    850               # This blocks until the batch has finished executing.

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\eager\def_function.py in __call__(self, *args, **kwds)
    578         xla_context.Exit()
    579     else:
--> 580       result = self._call(*args, **kwds)
    581 
    582     if tracing_count == self._get_tracing_count():

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\eager\def_function.py in _call(self, *args, **kwds)
    625       # This is the first call of __call__, so we have to initialize.
    626       initializers = []
--> 627       self._initialize(args, kwds, add_initializers_to=initializers)
    628     finally:
    629       # At this point we know that the initialization is complete (or less

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\eager\def_function.py in _initialize(self, args, kwds, add_initializers_to)
    504     self._concrete_stateful_fn = (
    505         self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
--> 506             *args, **kwds))
    507 
    508     def invalid_creator_scope(*unused_args, **unused_kwds):

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\eager\function.py in _get_concrete_function_internal_garbage_collected(self, *args, **kwargs)
   2444       args, kwargs = None, None
   2445     with self._lock:
-> 2446       graph_function, _, _ = self._maybe_define_function(args, kwargs)
   2447     return graph_function
   2448 

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\eager\function.py in _maybe_define_function(self, args, kwargs)
   2775 
   2776       self._function_cache.missed.add(call_context_key)
-> 2777       graph_function = self._create_graph_function(args, kwargs)
   2778       self._function_cache.primary[cache_key] = graph_function
   2779       return graph_function, args, kwargs

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\eager\function.py in _create_graph_function(self, args, kwargs, override_flat_arg_shapes)
   2665             arg_names=arg_names,
   2666             override_flat_arg_shapes=override_flat_arg_shapes,
-> 2667             capture_by_value=self._capture_by_value),
   2668         self._function_attributes,
   2669         # Tell the ConcreteFunction to clean up its graph once it goes out of

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\framework\func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
    979         _, original_func = tf_decorator.unwrap(python_func)
    980 
--> 981       func_outputs = python_func(*func_args, **func_kwargs)
    982 
    983       # invariant: `func_outputs` contains only Tensors, CompositeTensors,

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\eager\def_function.py in wrapped_fn(*args, **kwds)
    439         # __wrapped__ allows AutoGraph to swap in a converted function. We give
    440         # the function a weak reference to itself to avoid a reference cycle.
--> 441         return weak_wrapped_fn().__wrapped__(*args, **kwds)
    442     weak_wrapped_fn = weakref.ref(wrapped_fn)
    443 

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\framework\func_graph.py in wrapper(*args, **kwargs)
    966           except Exception as e:  # pylint:disable=broad-except
    967             if hasattr(e, "ag_error_metadata"):
--> 968               raise e.ag_error_metadata.to_exception(e)
    969             else:
    970               raise

ValueError: in user code:

    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\training.py:571 train_function  *
        outputs = self.distribute_strategy.run(
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\distribute\distribute_lib.py:951 run  **
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\distribute\distribute_lib.py:2290 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\distribute\mirrored_strategy.py:770 _call_for_each_replica
        fn, args, kwargs)
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\distribute\mirrored_strategy.py:201 _call_for_each_replica
        coord.join(threads)
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\training\coordinator.py:389 join
        six.reraise(*self._exc_info_to_raise)
    C:\Users\i24009\Anaconda3\envs\py36TF2x1\lib\site-packages\six.py:703 reraise
        raise value
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\training\coordinator.py:297 stop_on_exception
        yield
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\distribute\mirrored_strategy.py:998 run
        self.main_result = self.main_fn(*self.main_args, **self.main_kwargs)
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\training.py:541 train_step  **
        self.trainable_variables)
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\training.py:1804 _minimize
        trainable_variables))
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\optimizer_v2\optimizer_v2.py:521 _aggregate_gradients
        filtered_grads_and_vars = _filter_grads(grads_and_vars)
    C:\Users\i24009\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\optimizer_v2\optimizer_v2.py:1219 _filter_grads
        ([v.name for _, v in grads_and_vars],))
