How does bnlearn calculate BIC for continuous data?


I am using the bnlearn package in R, and I would like to know how the package computes BIC-g (the BIC score for Gaussian networks).

Let's build a structure; I can obtain its BIC score as follows:

library(bnlearn)
X = iris[, 1:3]
names(X) = c("A", "B", "C")
Network = empty.graph(names(X))
bnlearn::score(Network, X, type="bic-g")
bnlearn can give me more detailed information on how this score is computed:

bnlearn::score(Network, X, type="bic-g", debug=TRUE)
which prints:

----------------------------------------------------------------
* processing node A.
  > loglikelihood is -184.041441.
  > penalty is 2.505318 x 2 = 5.010635.
----------------------------------------------------------------
* processing node B.
  > loglikelihood is -87.777815.
  > penalty is 2.505318 x 2 = 5.010635.
----------------------------------------------------------------
* processing node C.
  > loglikelihood is -297.588727.
  > penalty is 2.505318 x 2 = 5.010635.
[1] -584.4399
I know how to compute BIC for discrete data in a Bayesian network (see reference), but I do not know how it generalizes to the joint Gaussian (multivariate normal) case.

Apparently it is related to the (approximate) likelihood and a penalty term; the package seems to compute a likelihood and a penalty for each node and then sum them:

bnlearn::score(Network, X, type="loglik-g", debug=TRUE)
But I would like to know how exactly to compute the likelihood and the penalty myself, given the data.

I found a formula explaining the Laplace approximation (see page 57), but I could not relate it to the output above.


Could someone help me?

BIC is calculated as follows:

BIC=-2*logLik+nparams*log(nobs)

but in bnlearn this is rescaled by -2 (see ?score) to give

BIC=logLik-0.5*nparams*log(nobs)
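As a quick sanity check in plain base R, taking the three per-node log-likelihoods shown in the empty-graph debug output above, the two conventions differ only by the factor -1/2:

```r
# Per-node log-likelihoods from the empty-graph debug output above
logLik  <- -184.041441 + -87.777815 + -297.588727
nparams <- 6    # 2 per node: intercept + residual standard error
nobs    <- 150  # nrow(iris)

bic_standard <- -2 * logLik + nparams * log(nobs)   # the usual definition
bic_bnlearn  <- logLik - 0.5 * nparams * log(nobs)  # bnlearn's rescaling

all.equal(bic_bnlearn, -0.5 * bic_standard)  # TRUE
round(bic_bnlearn, 4)                        # -584.4399, as reported above
```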

So in your example, with no arcs, the likelihood is computed from the marginal means and standard deviations, and more generally the number of parameters for each node is given by summing 1 (intercept) + 1 (residual standard error) + the number of parents. E.g. here the log-likelihood is calculated by

sum(sapply(X, function(i) sum(dnorm(i, mean(i), sd(i), log=TRUE))))
#[1] -569.408

and the penalty term by

0.5 * log(nrow(X)) * 6
#[1] 15.03191

that is, 0.5 * log(number of observations) * number of parameters.

If there are arcs, the log-likelihood is computed using the fitted values and residuals. For the network:

Network = set.arc(Network, "A", "B")
we need the log-likelihood components for nodes A and C:

(llA = with(X, sum(dnorm(A, mean(A), sd(A), log=TRUE))))
#[1] -184.0414
(llC = with(X, sum(dnorm(C, mean(C), sd(C), log=TRUE))))
#[1] -297.5887
and we obtain the conditional distribution of B from a linear regression:

m = lm(B ~ A, X)
(llB = with(X, sum(dnorm(B, fitted(m), stats::sigma(m), log=TRUE))))
#[1] -86.73894
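Note that stats::sigma(m) here is the residual standard error of the regression, i.e. the root of the residual sum of squares divided by the residual degrees of freedom. A minimal base-R check, using the raw iris columns that A and B were renamed from:

```r
# B ~ A above corresponds to Sepal.Width ~ Sepal.Length in the raw iris data
m <- lm(Sepal.Width ~ Sepal.Length, data = iris)
s_manual <- sqrt(sum(residuals(m)^2) / df.residual(m))
all.equal(stats::sigma(m), s_manual)  # TRUE
```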
giving

(ll = llA + llB + llC)
#[1] -568.3691
(penalty = 0.5* log(nrow(X))* 7)
#[1] 17.53722
ll - penalty
#[1] -585.9063 

#  bnlearn::score(Network, X, type="bic-g", debug=TRUE)
# ----------------------------------------------------------------
# * processing node A.
#    loglikelihood is -184.041441.
#    penalty is 2.505318 x 2 = 5.010635.
# ----------------------------------------------------------------
# * processing node B.
#    loglikelihood is -86.738936.
#    penalty is 2.505318 x 3 = 7.515953.
# ----------------------------------------------------------------
# * processing node C.
#    loglikelihood is -297.588727.
#    penalty is 2.505318 x 2 = 5.010635.
# [1] -585.9063
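The per-node penalties in this debug output can likewise be reproduced directly: each node pays 0.5 * log(nobs) per parameter, and B now has three parameters (intercept, residual standard error, and one parent). A base-R sketch:

```r
nobs <- 150                        # nrow(iris)
nparams <- c(A = 2, B = 3, C = 2)  # B gains a parameter for its parent A
penalties <- 0.5 * log(nobs) * nparams
round(penalties, 6)      # A: 5.010635, B: 7.515953, C: 5.010635
round(sum(penalties), 5) # 17.53722, the total penalty used above
```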
