Project a 2D point into 3D using a depth value. Maya Python API

I'd like to know how to project a 2D point back into 3D, and I want to be able to give it a depth value to project to. Does anyone have a Maya example? Thanks. Here is the best I've been able to do:
import maya.OpenMaya as OpenMaya

def screenToWorld(point2D=None,
                  depth=None,
                  viewMatrix=None,
                  projectionMatrix=None,
                  width=None,
                  height=None):
    '''
    @param point2D - 2D point.
    @param depth - Depth value to project to (currently unused).
    @param viewMatrix - MMatrix of the modelViewMatrix (world inverse of the camera).
    @param projectionMatrix - MMatrix of the camera's projectionMatrix.
    @param width - Resolution width of the camera.
    @param height - Resolution height of the camera.
    Returns a worldspace MPoint.
    '''
    point3D = OpenMaya.MPoint()
    point3D.x = (2.0 * (point2D[0] / width)) - 1.0
    point3D.y = (2.0 * (point2D[1] / height)) - 1.0

    viewProjectionMatrix = (viewMatrix * projectionMatrix)
    point3D.z = viewProjectionMatrix(3, 2)
    point3D.w = viewProjectionMatrix(3, 3)
    point3D.x = point3D.x * point3D.w
    point3D.y = point3D.y * point3D.w

    point3D = point3D * viewProjectionMatrix.inverse()
    return point3D
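To illustrate where a depth value would enter the math, here is a minimal, Maya-free sketch of the same screen-to-world mapping for a hypothetical pinhole camera sitting at the origin and looking down -Z. The camera setup (vertical field of view, aspect from resolution) is an assumption for illustration, not the asker's actual camera; with a real Maya camera you would use its view and projection matrices instead.

```python
import math

def screen_to_world(px, py, depth, width, height, fov_y_deg):
    # Map the pixel to normalized device coordinates in [-1, 1].
    ndc_x = 2.0 * (px / width) - 1.0
    ndc_y = 2.0 * (py / height) - 1.0

    # Half-angle of the vertical field of view and the frame aspect ratio.
    tan_half = math.tan(math.radians(fov_y_deg) / 2.0)
    aspect = width / height

    # Scale the NDC coordinates out to the requested depth along -Z.
    x = ndc_x * tan_half * aspect * depth
    y = ndc_y * tan_half * depth
    return (x, y, -depth)
```

The key point is that depth simply scales the ray through the pixel: the center pixel always lands at (0, 0, -depth), and off-center pixels are pushed out proportionally.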
As you can see, it doesn't use the depth value. I don't know how to incorporate one using the projectionMatrix and the viewMatrix.

Any help is greatly appreciated!

-Chris

So I think I found a solution:
import maya.OpenMaya as OpenMaya

def projectPoint(worldPnt, camPnt, depth):
    '''
    @param worldPnt - MPoint of the point to project (worldspace).
    @param camPnt - MPoint of the camera position (worldspace).
    @param depth - Float distance value.
    Returns a list of 3 floats.
    '''
    # Get the vector from the camera to the point and normalize it.
    mVec_pointVec = worldPnt - camPnt
    mVec_pointVec.normalize()

    # Scale it by the depth, then offset it back to the camera position.
    mVec_pointVec *= depth
    mVec_pointVec += OpenMaya.MVector(camPnt.x, camPnt.y, camPnt.z)

    return [mVec_pointVec.x, mVec_pointVec.y, mVec_pointVec.z]
I didn't actually need to convert to 2D and back to 3D. I just needed to extend a vector from the camera.

Have you tried a worldspace Z with W = 1? Multiply it by the inverse of the matrix, then divide the result by W, and see if that gives you what you want. What if the camera is orthographic, or has a film offset? I don't think it works with an orthographic camera, but a film offset should work.
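The vector math in the accepted answer can be restated without the Maya API at all, which makes it easy to verify: normalize the camera-to-point direction, scale it by the depth, and add the camera position back. This is a plain-Python sketch of the same idea, with points as 3-element lists instead of MPoint/MVector.

```python
import math

def project_point(world_pnt, cam_pnt, depth):
    # Direction from the camera to the point.
    v = [w - c for w, c in zip(world_pnt, cam_pnt)]

    # Normalize it (equivalent to MVector.normalize()).
    length = math.sqrt(sum(x * x for x in v))
    v = [x / length for x in v]

    # Walk `depth` units along that direction, starting at the camera.
    return [c + x * depth for c, x in zip(cam_pnt, v)]
```

For example, a point 10 units in front of a camera at the origin, projected to depth 5, lands 5 units along the same ray.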