Keras TypeError: relu() missing 1 required positional argument: 'x'
I am getting this error and I don't know why it occurs. Can someone help me?
import warnings
warnings.filterwarnings('ignore',category=FutureWarning)
import tensorflow as tf
import keras
from keras.layers.convolutional import Conv2D, AtrousConvolution2D
from keras.layers import Activation, Dense, Input, Conv2DTranspose, Dense, Flatten
from keras.layers import Dropout, Concatenate, BatchNormalization, Reshape
from keras.layers.advanced_activations import LeakyReLU
from keras.models import Model, model_from_json
from keras.optimizers import Adam
from keras.layers.convolutional import UpSampling2D
import keras.backend as K
from keras.activations import relu
from keras_contrib.layers import InstanceNormalization  # InstanceNormalization is used below but was not imported
def g_build_conv(layer_input, filter_size, kernel_size=4, strides=2, activation='leakyrelu',
                 dropout_rate=g_dropout, norm='inst', dilation=1):
    c = AtrousConvolution2D(filter_size, kernel_size=kernel_size, strides=strides,
                            atrous_rate=(dilation, dilation), padding='same')(layer_input)
    if activation == 'leakyrelu':
        c = relu()(c)
    if dropout_rate:
        c = Dropout(dropout_rate)(c)
    if norm == 'inst':
        c = InstanceNormalization()(c)
    return c
Warning (from warnings module): File
"C:\Users\xyz\AppData\Local\Programs\Python\35\lib\site-packages\keras\legacy\layers.py",
line 762
UserWarning: The AtrousConvolution2D layer is deprecated. Use the Conv2D layer with a dilation_rate instead.
Traceback (most recent call last):
  File "D:\Image Outpaining\outpaint.py", line 146, in <module>
    GEN = build_generator()
  File "D:\Image Outpaining\outpaint.py", line 120, in build_generator
    g1 = g_build_conv(g_input, 64, 5, strides=1)
  File "D:\Image Outpaining\outpaint.py", line 102, in g_build_conv
    c = relu()(c)
TypeError: relu() missing 1 required positional argument: 'x'
keras.activations.relu
is a function, not a layer, so you are calling it incorrectly. To add ReLU as a layer, do the following:
from keras.layers import Activation
if activation == 'leakyrelu':
c = Activation("relu")(c)
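Since the activation argument is 'leakyrelu', the original intent was probably a LeakyReLU layer (which the question already imports) rather than plain ReLU. Below is a minimal corrected sketch of g_build_conv, assuming a modern tf.keras environment: it replaces the deprecated AtrousConvolution2D with Conv2D(dilation_rate=...), applies the activation as a layer, and omits InstanceNormalization (a keras_contrib dependency) to stay self-contained.

```python
import tensorflow as tf
from tensorflow.keras.layers import Conv2D, Dropout, LeakyReLU, Activation, Input
from tensorflow.keras.models import Model

def g_build_conv(layer_input, filter_size, kernel_size=4, strides=2,
                 activation='leakyrelu', dropout_rate=0.0, dilation=1):
    # Conv2D with dilation_rate supersedes the deprecated AtrousConvolution2D
    c = Conv2D(filter_size, kernel_size=kernel_size, strides=strides,
               dilation_rate=(dilation, dilation), padding='same')(layer_input)
    if activation == 'leakyrelu':
        c = LeakyReLU()(c)          # LeakyReLU is a layer, so calling it like this is valid
    else:
        c = Activation('relu')(c)   # Activation wraps the relu *function* as a layer
    if dropout_rate:
        c = Dropout(dropout_rate)(c)
    return c

# Usage: build a tiny model to confirm the block wires up correctly
inp = Input(shape=(32, 32, 3))
out = g_build_conv(inp, 64, kernel_size=5, strides=1)
model = Model(inp, out)
```

With strides=1 and padding='same', the spatial size is preserved, so the model's output shape is (None, 32, 32, 64).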