R text2vec GloVe: pre-initialize weights using the initial parameter of fit_transform


I want to pre-initialize the GloVe word vectors and biases using the initial argument of fit_transform. The function's documentation says to pass the "w_i, w_j, b_i, b_j" values as a named list - the initial word vectors and biases.

So I fit a model and extract them. Then I create a new GloVe instance and pass the extracted values to it (via the initial argument). While I would expect training to "continue" from where the first fit_transform left off, the cost always explodes, which suggests that either I am not doing this the right way or it is not supported.

I have tried passing initial to GloVe$new only, to glove_model$fit_transform only, and to both. Every time initial is used, the error/cost explodes.

library(text2vec)  # provides vocab_vectorizer(), create_tcm() and GloVe

# A. make vectorizer and tcm
# (it_train and vocab are assumed to come from earlier itoken() / create_vocabulary() steps)
vectorizer <- vocab_vectorizer(vocab)
tcm <- create_tcm(it_train, vectorizer, skip_grams_window = 2, skip_grams_window_context = "left")
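For context, the first pass that produces the 10-epoch log shown near the end of this post is not included in the snippets; a step B along these lines would sit between sections A and C. This is only a sketch - the call simply mirrors the second fit in section D with the initial argument dropped, so the exact settings are assumptions.

# B. (sketch, not in the original post) create a GloVe model and run the first fit_transform
glove_model <- GloVe$new(word_vectors_size = 300, vocabulary = vocab, x_max = 10)
wv1 <- glove_model$fit_transform(tcm, n_iter = 10, progressbar = FALSE,
                                 shuffle = FALSE, learning_rate = 0.01, lambda = 1e-5)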
On the second pass, the cost explodes from 0.0574 to 1062 (the warning log is reproduced after the code further down).

I expected the cost to resume from 0.0574, but it does not :(

The arguments stated in the documentation appear to match the source code.

Many thanks for any help.

Weighting the initial values by 0.95, iterating for only 5 epochs at a time, and updating the initial values only when the cost reached is better than before seems to solve the exploding cost while still making some use of the previously computed weights and biases. (The parameter extraction, second fit and log output from the original attempt are reproduced below for reference; a rough sketch of the weighting scheme follows after the logs.)
# C. extract parameters from glove model into a named list
initialisationParamsNames <- c("w_i", "w_j", "b_i", "b_j")
initialParam <- lapply(initialisationParamsNames, function(x)glove_model$.__enclos_env__$private[[x]])
names(initialParam) <- initialisationParamsNames
# D. fit transform using the initial parameters from the first pass
glove_model <- GloVe$new(word_vectors_size = 300, vocabulary = vocab, x_max = 10, initial = initialParam)
wv2 <- glove_model$fit_transform(tcm, n_iter = 10, progressbar = FALSE, shuffle = FALSE,
                                 learning_rate = 0.01, lambda = 1e-5, initial = initialParam)  # convergence_tol = 0.01,
Output of the first pass (step B), for reference:
INFO [2019-10-12 12:23:52] 2019-10-12 12:23:52 - epoch 1, expected cost 0.3355
INFO [2019-10-12 12:24:00] 2019-10-12 12:24:00 - epoch 2, expected cost 0.1273
INFO [2019-10-12 12:24:08] 2019-10-12 12:24:08 - epoch 3, expected cost 0.0930
INFO [2019-10-12 12:24:16] 2019-10-12 12:24:16 - epoch 4, expected cost 0.0804
INFO [2019-10-12 12:24:24] 2019-10-12 12:24:24 - epoch 5, expected cost 0.0735
INFO [2019-10-12 12:24:32] 2019-10-12 12:24:32 - epoch 6, expected cost 0.0686
INFO [2019-10-12 12:24:40] 2019-10-12 12:24:40 - epoch 7, expected cost 0.0648
INFO [2019-10-12 12:24:48] 2019-10-12 12:24:48 - epoch 8, expected cost 0.0618
INFO [2019-10-12 12:24:55] 2019-10-12 12:24:55 - epoch 9, expected cost 0.0594
INFO [2019-10-12 12:25:03] 2019-10-12 12:25:03 - epoch 10, expected cost 0.0574
Output of the second pass (step D, warm-started via initial):
Warning in glove_model$fit_transform(tcm, n_iter = 10, progressbar = FALSE,  :
  Cost is too big, probably something goes wrong... try smaller learning rate
INFO [2019-10-12 12:27:49] 2019-10-12 12:27:49 - epoch 1, expected cost 1018.4479
Warning in glove_model$fit_transform(tcm, n_iter = 10, progressbar = FALSE,  :
  Cost is too big, probably something goes wrong... try smaller learning rate
INFO [2019-10-12 12:27:57] 2019-10-12 12:27:57 - epoch 2, expected cost 1062.0293
Warning in glove_model$fit_transform(tcm, n_iter = 10, progressbar = FALSE,  :
  Cost is too big, probably something goes wrong... try smaller learning rate
INFO [2019-10-12 12:28:05] 2019-10-12 12:28:05 - epoch 3, expected cost 1062.0293
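Below is a minimal sketch of the scheme described in the answer: damp the previously learned parameters by 0.95 before passing them back through initial, run only 5 epochs per round, and only carry the parameters forward when the logged cost has improved. The extraction of w_i/w_j/b_i/b_j mirrors section C; the loop structure and the manual cost check are assumptions about how the description could be implemented, not a confirmed text2vec recipe.

library(text2vec)

# assumes vocab and tcm from section A above
param_names <- c("w_i", "w_j", "b_i", "b_j")

# first short run (as in steps A/B, but only 5 epochs)
glove_model <- GloVe$new(word_vectors_size = 300, vocabulary = vocab, x_max = 10)
wv <- glove_model$fit_transform(tcm, n_iter = 5, progressbar = FALSE,
                                shuffle = FALSE, learning_rate = 0.01, lambda = 1e-5)

# extract the learned parameters (as in section C) and damp them by 0.95
initialParam <- lapply(param_names, function(x) 0.95 * glove_model$.__enclos_env__$private[[x]])
names(initialParam) <- param_names

# warm-start another short run from the damped parameters; if the "expected cost"
# reported in the log beats the previous round, extract and damp the parameters
# again and repeat this block, otherwise keep the previous initialParam
glove_model <- GloVe$new(word_vectors_size = 300, vocabulary = vocab, x_max = 10,
                         initial = initialParam)
wv <- glove_model$fit_transform(tcm, n_iter = 5, progressbar = FALSE,
                                shuffle = FALSE, learning_rate = 0.01, lambda = 1e-5,
                                initial = initialParam)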