Why does the R session abort when running a 1D-CNN?

Tags: r, tensorflow, keras, rstudio, conv-neural-network

I am trying to run a 1D-CNN on a small dataset of just over 3000 samples. The lookback parameter for the generator is set to 7. After a few seconds of training, the R session aborts.

However, when the lookback parameter is set to any value other than 7 (for example 5, 9, or 10), the 1D-CNN model works. I have also tried GRU, LSTM, and fully connected layers with lookback set to 7, and everything runs fine. My environment is R 3.6.1, RStudio 1.5.5001, TensorFlow 2.0.0, Keras 2.2.5.0. The code is shown below:

library(keras)

# data-windowing hyperparameters
lookback <- 7       # number of past timesteps in each sample window
step <- 1           # sampling rate within the window
delay <- 1          # how many timesteps ahead the target lies
batch_size <- 32
nbfeature <- 31     # number of predictor columns (data columns 60:90)

# Generator: returns a closure that yields one batch of (samples, targets) per call
generator_WeeklyTotal <- function(data, lookback, delay, min_index, max_index,
                      shuffle = FALSE, batch_size, step = 1) {
  if (is.null(max_index)) max_index <- nrow(data) - delay - 1
  i <- min_index + lookback
  function() {
    if (shuffle) {
      rows <- sample(c((min_index+lookback):max_index), size = batch_size)
    } else {
      if (i + batch_size >= max_index)
        i <<- min_index + lookback
      rows <- c(i:min(i+batch_size, max_index))
      i <<- i + length(rows)
    }
    # samples: (rows in batch) x (lookback / step timesteps) x (nbfeature features)
    samples <- array(0, dim = c(length(rows),
                                lookback / step,
                                nbfeature))
    targets <- array(0, dim = c(length(rows)))
    for (j in 1:length(rows)) {
      indices <- seq(rows[[j]] - lookback, rows[[j]],
                     length.out = dim(samples)[[2]])
      samples[j,,] <- data[indices, 60:90]         # predictor columns
      targets[[j]] <- data[rows[[j]] + delay, 6]   # target column, delay steps ahead
    }
    list(samples, targets)
  }
}

train_gen <- generator_WeeklyTotal(
  data = DLtrain,
  lookback = lookback,
  delay = delay,
  min_index = 1+lookback,
  max_index = 2922+lookback,
  shuffle = FALSE,
  step = step,
  batch_size = batch_size
)

val_gen = generator_WeeklyTotal(
  data = DLtrain,
  lookback = lookback,
  delay = delay,
  min_index = 2923+lookback, 
  max_index = 3287+lookback, 
  step = step,
  batch_size = batch_size
)

test_gen <- generator_WeeklyTotal(
  data = DLtrain,
  lookback = lookback,
  delay = delay,
  min_index = 3288+lookback,
  max_index = 3652+lookback,
  step = step,
  batch_size = batch_size
)
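# Sanity check (not part of the original question): build a throwaway generator
# with the same arguments as train_gen, draw one batch, and inspect its shape.
# This assumes DLtrain is a numeric matrix with at least 91 columns, as implied
# by data[indices, 60:90] above; the names check_gen / check_batch are only
# illustrative.
check_gen <- generator_WeeklyTotal(
  data = DLtrain,
  lookback = lookback,
  delay = delay,
  min_index = 1 + lookback,
  max_index = 2922 + lookback,
  step = step,
  batch_size = batch_size
)
check_batch <- check_gen()
dim(check_batch[[1]])      # (rows in batch) x (lookback / step) x nbfeature
length(check_batch[[2]])   # one target per row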

# number of generator draws needed to cover the validation and test sets
val_steps <- (3287+lookback - 2923+lookback - lookback) / batch_size
test_steps <- (3652+lookback - 3288+lookback - lookback) / batch_size
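# For reference (not in the original question): with lookback = 7 and
# batch_size = 32, both step counts above are evaluated left to right and
# come out to the same non-integer value:
#   (3287 + 7 - 2923 + 7 - 7) / 32  =  371 / 32  =  11.59375
#   (3652 + 7 - 3288 + 7 - 7) / 32  =  371 / 32  =  11.59375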

# 1D-CNN: stacked conv / max-pooling layers, global max pooling, single-unit output
model_conv1d <- keras_model_sequential() %>%
  layer_conv_1d(filters = 64, kernel_size = 5, activation = "relu", padding = "same",
                input_shape = list(NULL, nbfeature)) %>%
  layer_max_pooling_1d(pool_size = 3) %>%
  layer_conv_1d(filters = 64, kernel_size = 5, activation = "relu", padding = "same") %>%
  layer_max_pooling_1d(pool_size = 3) %>%
  layer_conv_1d(filters = 64, kernel_size = 5, activation = "relu", padding = "same") %>%
  layer_global_max_pooling_1d() %>%
  layer_dense(units = 1)
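
The question does not show how the model is compiled and trained, but since training starts before the crash, a call along these lines presumably follows. This is only a sketch: the optimizer, loss, steps_per_epoch, and epochs below are assumptions, not values from the question.

model_conv1d %>% compile(
  optimizer = optimizer_rmsprop(),
  loss = "mae"                       # assumed loss for the single-unit output
)

history <- model_conv1d %>% fit_generator(
  train_gen,
  steps_per_epoch = 100,             # assumed; not given in the question
  epochs = 20,                       # assumed; not given in the question
  validation_data = val_gen,
  validation_steps = val_steps
)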

Comment: Did you get any error message? — No, the R session aborted before I got any message.