Python: removing warnings from a Keras model
While building the model I get warnings — at first I assumed they didn't matter; perhaps the situation has changed. Model:
# Imports assumed by this snippet (modern tf.keras paths)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense
from tensorflow.keras.callbacks import ModelCheckpoint

# Initialise Sequential model
regressor = Sequential()
# units is the output dimensionality
# return_sequences=True returns the full sequence,
# which is required as input to the next LSTM
# as a rough rule of thumb, use fewer than 10 layers, perhaps 1 per endog plus 1 for all exog
# also see: https://stats.stackexchange.com/questions/181/how-to-choose-the-number-of-hidden-layers-and-nodes-in-a-feedforward-neural-netw/1097#1097
alphaNh = len(columns) if len(columns) < 10 else 10 # 2-10, with 2 or 5 being common
sample_frames = n
nh = int(sample_frames/alphaNh*dim)
dropout = 0.2
print('nh', nh)
# input shape will need only the last 2 dimensions
# of your input
################# 1st layer #######################
regressor.add(LSTM(units=nh, return_sequences=True,
                   input_shape=(timesteps, dim)))
# add Dropout for regularization
# standard practice is to use 20%
# regressor.add(Dropout(dropout))
layers = (len(endog) + 1) if len(endog) > 1 else 2
print('layers', layers)
for i in range(1, layers):
    # After the first layer, it's not required to
    # specify the input_shape
    ################# layer #######################
    # if i > 5:
    #     break
    if i < layers - 1:
        cell = LSTM(units=nh, return_sequences=True)
    else:
        cell = LSTM(units=nh)
    regressor.add(cell)
################# Dropout layer #################
# After the recurrent layers we apply some dropout.
# another option is to put this after each LSTM
# layer (above)
#
# standard practice is to use 20%
regressor.add(Dropout(dropout))
################# Last layer ####################
# Last layer would be the fully connected layer,
# or the Dense layer
#
# The output has one value per input dimension,
# hence units=dim
regressor.add(Dense(units=dim))
# Compiling the RNN
# The loss function for a classification problem is
# cross-entropy; since this is a regression problem,
# the loss function will be mean squared error
regressor.compile(optimizer='adam', loss='mean_squared_error')
### src: https://keras.io/callbacks/
# saves the model weights after each epoch if the monitored loss decreased
###
checkpointer = ModelCheckpoint(filepath='weights.hdf5', verbose=1, monitor='loss', mode='min', save_best_only=True)
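The save_best_only=True flag with monitor='loss', mode='min' means a checkpoint is only written when the monitored loss improves on the best value seen so far. A minimal sketch of that bookkeeping in plain Python (hypothetical class name, not the actual Keras implementation):

```python
import math

class BestOnlyCheckpoint:
    """Sketch of ModelCheckpoint's save_best_only bookkeeping (mode='min')."""

    def __init__(self):
        self.best = math.inf   # best monitored value seen so far
        self.saved = []        # epochs at which a "save" happened

    def on_epoch_end(self, epoch, monitored_value):
        # Save only when the monitored quantity improves (mode='min')
        if monitored_value < self.best:
            self.best = monitored_value
            self.saved.append(epoch)  # the real callback writes weights here

ckpt = BestOnlyCheckpoint()
for epoch, loss in enumerate([0.9, 0.7, 0.8, 0.5]):
    ckpt.on_epoch_end(epoch, loss)
# Saves at epochs 0, 1 and 3; epoch 2 (loss went up) is skipped
```

In the real model, the checkpointer is passed to training via the callbacks argument of fit.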
How do I modernise this (get rid of the warnings)?

Warnings in TensorFlow can be managed by the
tf.logging
module. To silence the warnings you can use
tf.logging.set_verbosity(tf.logging.ERROR)
In TensorFlow 2, it is
tf.get_logger().setLevel('ERROR')
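tf.get_logger() returns a standard Python logging.Logger, so setLevel('ERROR') behaves exactly as it does on any stdlib logger. A minimal sketch using the stdlib logging module directly (no TensorFlow needed to see the mechanism):

```python
import logging

# tf.get_logger() returns a standard logging.Logger named "tensorflow";
# a plain stdlib logger demonstrates the same mechanism
logger = logging.getLogger("tensorflow")
logger.setLevel("ERROR")  # string level names are accepted, as in the TF2 answer

# WARNING-level records are now filtered out; ERROR-level ones still pass
assert not logger.isEnabledFor(logging.WARNING)
assert logger.isEnabledFor(logging.ERROR)
```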
You can try the following:
tf.logging.set_verbosity(tf.logging.ERROR)
@ShubhamPanchal do you want to submit this as an answer? Otherwise I may delete this question.

What do the warnings mean? I mean, should I be worried that I'm doing something wrong?

A deprecation notice means it works, but will soon be removed. So if you are going to maintain this code, you should replace it. For a one-off investigation, though, it's fine. These warnings are about how Keras uses TensorFlow; the people maintaining Keras are the ones who can fix Keras.
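Deprecation notices that come through Python's warnings machinery (rather than TF's logger) can be silenced separately with the stdlib warnings module. A minimal sketch, assuming the noise is a DeprecationWarning and using a hypothetical noisy() stand-in for a deprecated API call:

```python
import warnings

def noisy():
    # stands in for a call into a deprecated API
    warnings.warn("old API, use the new one", DeprecationWarning)
    return 42

# Capture warnings to show the filter working
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore", DeprecationWarning)
    result = noisy()

assert result == 42
assert caught == []  # the deprecation warning was suppressed
```

Note this hides the notice without fixing the underlying deprecated call, so it is only appropriate for one-off work.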