Why doesn't epipolar geometry / SfM give the correct values in Python OpenCV?
I am trying to find the rotation/translation between two images. For the simplest case, I use two identical images and check whether it gives zero translation and an identity rotation matrix. However, it does not give the result I expect. Why? I use ORB features and the ten best matches to find the essential matrix and R/t. The result is strange (even though they are the same image). What I expect is:
t = [[0, 0, 0]]
r = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
Why does it give such strange results?
import cv2
import numpy as np

orb = cv2.ORB_create()
img1 = self.img1
img2 = self.img2
gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)
gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
kpts1, descs1 = orb.detectAndCompute(gray1, None)
kpts2, descs2 = orb.detectAndCompute(gray2, None)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = bf.match(descs1, descs2)
dmatches = sorted(matches, key=lambda x: x.distance)
src_pts = np.float32([kpts1[m.queryIdx].pt for m in dmatches]).reshape(-1, 1, 2)
src_pts = src_pts[0:10]
dst_pts = np.float32([kpts2[m.trainIdx].pt for m in dmatches]).reshape(-1, 1, 2)
dst_pts = dst_pts[0:10]
K = np.array([[842.102288, 0., 263.697271],
[0., 833.300569, 536.024168],
[0., 0., 1.]])
E, mask2 = cv2.findEssentialMat(src_pts, dst_pts, K, cv2.RANSAC, 0.999, 1.0)
points, R, t, mask = cv2.recoverPose(E, src_pts, dst_pts)
Test images (image1 = image2 = img)
Sometimes we need the images themselves to reproduce a problem, so it would be very helpful if you posted links to them. That said, there are a few things you should know:

- In OpenCV, the ORB detector detects at most 500 keypoints by default.
- findEssentialMat returns a mask marking inliers and outliers, but your code does not take it into account, which affects the result of recoverPose.
- recoverPose also needs the camera matrix.
orb = cv2.ORB_create(nfeatures = 2000)
img1 = self.img1
img2 = self.img2
gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)
gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
kpts1, descs1 = orb.detectAndCompute(gray1, None)
kpts2, descs2 = orb.detectAndCompute(gray2, None)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = bf.match(descs1, descs2)
dmatches = sorted(matches, key=lambda x: x.distance)
src_pts = np.float32([kpts1[m.queryIdx].pt for m in dmatches]).reshape(-1, 1, 2)
dst_pts = np.float32([kpts2[m.trainIdx].pt for m in dmatches]).reshape(-1, 1, 2)
K = np.array([[842.102288, 0., 263.697271],
[0., 833.300569, 536.024168],
[0., 0., 1.]])
E, inliers_mask_E = cv2.findEssentialMat(src_pts, dst_pts, K, method = cv2.RANSAC, prob = 0.999, threshold = 1.0)
points, R, t, inliers_mask_RP = cv2.recoverPose(E, src_pts, dst_pts, cameraMatrix = K, mask = inliers_mask_E)
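As a side note on the mask handling above: the mask returned by findEssentialMat is an (N, 1) uint8 array. If you want to inspect or filter the inlier correspondences yourself, plain NumPy boolean indexing works (a minimal sketch with made-up points and a made-up mask, not the actual output of the code above):

```python
import numpy as np

# A made-up mask in the shape findEssentialMat returns:
# (N, 1) uint8, 1 for RANSAC inliers, 0 for outliers.
inliers_mask_E = np.array([[1], [0], [1], [1]], dtype=np.uint8)

# Matched points in the (N, 1, 2) layout used above.
src_pts = np.float32([[10, 20], [30, 40], [50, 60], [70, 80]]).reshape(-1, 1, 2)

# Keep only the correspondences RANSAC marked as inliers.
src_inliers = src_pts[inliers_mask_E.ravel() == 1]

print(src_inliers.shape)  # (3, 1, 2)
```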
Thank you for your reply. I tried your code, but it still gives a strange result: the rotation matrix is the identity (what I expected, correct), but the translation is still strange: (-0.57, 0.57, -0.57)!
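One likely explanation for the (-0.57, 0.57, -0.57) value: the essential matrix only determines translation up to scale, so recoverPose returns a unit-length direction vector. With two identical images the baseline is zero and the direction is degenerate, so any unit vector can come out; -0.57 is just 1/sqrt(3) rounded. A quick check of the norm (using the reported values, not a rerun of the pipeline):

```python
import numpy as np

# The translation reported in the comment above.
t = np.array([-0.57, 0.57, -0.57])

# recoverPose only recovers the translation direction, normalized to
# unit length, so any answer it gives will have norm ~1 -- including this one.
print(np.linalg.norm(t))  # ~0.987 (0.57 is 1/sqrt(3) rounded)
```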