Python: applying a "clipping" transform to MvNormal in PyMC3

I defined a "clipping" transform as follows:

from pymc3.distributions.transforms import ElemwiseTransform

import aesara.tensor as at
import numpy as np


class MvClippingTransform(ElemwiseTransform):
    name = "MvClippingTransform"

    def __init__(self, lower=None, upper=None):
        if lower is None:
            lower = float("-inf")
        if upper is None:
            upper = float("inf")

        self.lower = lower
        self.upper = upper

    def backward(self, x):
        return x

    def forward(self, x):
        return at.clip(x, self.lower, self.upper)

    def forward_val(self, x, point=None):
        return np.clip(x, self.lower, self.upper)

    def jacobian_det(self, x):
        # The backward transformation of clipping as I've defined it is the
        # identity function (perhaps that will change). The Jacobian
        # determinant of the identity is 1, so log(abs(1)) = 0.
        return at.zeros(x.shape)
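As a quick sanity check of the NumPy path (forward_val), np.clip with a per-dimension upper bound behaves the way the transform assumes. This is a minimal sketch with made-up bounds, independent of PyMC3/Aesara:

```python
import numpy as np

# Mirror MvClippingTransform.forward_val with lower=None (i.e. -inf)
lower = float("-inf")
upper = np.array([1.0, 2.0, 0.5])  # hypothetical per-dimension upper bounds

x = np.array([[0.3, 2.7, 1.1],
              [-4.0, 1.5, 0.2]])

# Each column is pinned to its own upper bound; nothing is clipped
# from below because lower is -inf.
clipped = np.clip(x, lower, upper)
```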
I then applied it to a multivariate normal with an LKJ Cholesky prior, as follows:

import importlib
import pymc3 as pm

import clipping  # the module containing MvClippingTransform above
importlib.reload(clipping)

with pm.Model() as m:

    # Taken from https://docs.pymc.io/pymc-examples/examples/case_studies/LKJ.html
    chol, corr, stds = pm.LKJCholeskyCov(
        # compute_corr=True also unpacks the Cholesky matrix in the returns
        # (otherwise we'd have to unpack it ourselves)
        "chol", n=3, eta=2.0, sd_dist=pm.Exponential.dist(1.0), compute_corr=True
    )
    cov = pm.Deterministic("cov", chol.dot(chol.T))

    μ = pm.Uniform("μ", -10, 10, shape=3, testval=samples.mean(axis=0))

    # Renamed so the transform instance doesn't shadow the `clipping` module
    clip_transform = clipping.MvClippingTransform(lower=None, upper=upper_truncation)

    mv = pm.MvNormal("mv", mu=μ, chol=chol, shape=3,
                     transform=clip_transform, observed=samples)

    trace = pm.sample(random_seed=44, init="adapt_diag",
                      return_inferencedata=True, target_accept=0.9)

    ppc = pm.sample_posterior_predictive(
        trace, var_names=["mv"], random_seed=42
    )

(upper_truncation is a NumPy array.)

Now, I generated simulated data by defining a covariance matrix for the multivariate normal and applying the clipping to it:
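The simulated-data step can be sketched like this (the mean, covariance, and truncation values here are placeholders, not the ones I actually used):

```python
import numpy as np

rng = np.random.default_rng(44)

# Hypothetical ground-truth parameters
mu = np.array([0.0, 1.0, -1.0])
cov = np.array([[1.0, 0.5, 0.2],
                [0.5, 2.0, 0.3],
                [0.2, 0.3, 1.5]])
upper_truncation = np.array([1.0, 2.0, 0.0])

# Draw from the multivariate normal, then clip from above only
raw = rng.multivariate_normal(mu, cov, size=1000)
samples = np.clip(raw, None, upper_truncation)
```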

But when I sample from the PPC, I get draws that are not clipped at all:

Even if I set the clipping bounds to [0, 0, 0], it still has no effect.
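What I expect is the behaviour of a plain post-hoc clip: applying np.clip to the PPC draws afterwards does enforce the bound, which is exactly the step I want the transform to perform during sampling. Shown here on stand-in data, since the real ppc["mv"] array is not reproduced:

```python
import numpy as np

# Stand-in for ppc["mv"] draws, shape (draws, observations, 3);
# not real PPC output, just random normals for illustration
ppc_mv = np.random.default_rng(42).normal(size=(100, 50, 3))
upper_truncation = np.array([1.0, 2.0, 0.0])

# Post-hoc clipping enforces the upper bound on every draw
clipped_ppc = np.clip(ppc_mv, None, upper_truncation)
```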

Why doesn't the PPC (or the parameter sampling) reflect the clipping transform?