C++ — why do I always get NaN values with a small focal length?
Whenever I use my camera's real focal length in OpenCV's cvPOSIT function, I get NaN for every output value. With a much larger focal length the NaNs go away, but then the resulting numbers are wrong. My model is a pyramid: three points on one plane and one point centered above them. Does anyone know how to handle this, or have a link to a good reference?
int depth = -18;
//create the model points
std::vector<CvPoint3D32f> modelPoints;
modelPoints.push_back(cvPoint3D32f( 0.0f, 0.0f, 0.0f)); //(the first must be 0,0,0) middle center diode
modelPoints.push_back(cvPoint3D32f(-18.0f, -30.0f, depth)); //bottom left diode
modelPoints.push_back(cvPoint3D32f( 18.0f, -30.0f, depth)); //bottom right diode
modelPoints.push_back(cvPoint3D32f( 0.0f, 30.0f, depth)); //top center diode
system("cls");
//create the image points
std::vector<CvPoint2D32f> srcImagePoints;
cout << "Source Image Points:" << endl;
for( size_t i = 0; i < circles.size(); i++ )
{
cout << "x: " << cvRound(circles[i][0]) << " y: " << cvRound(circles[i][1]) << endl;
//228, 278
//291, 346 (Pixel coordinates of the points on the image)
//371, 346
//228, 206
srcImagePoints.push_back( cvPoint2D32f(cvRound(circles[i][0]), cvRound(circles[i][1])) );
}
cout << endl;
//create the POSIT object with the model points
CvPOSITObject* positObject = cvCreatePOSITObject( &modelPoints[0], (int)modelPoints.size() );
//calculate the orientation
float* rotation_matrix = new float[9];
float* translation_vector = new float[3];
CvTermCriteria criteria = cvTermCriteria(CV_TERMCRIT_EPS | CV_TERMCRIT_ITER, 100, 1.0e-4f);
double FOCAL_LENGTH = 16; //16.0; //760.0;
cvPOSIT( positObject, &srcImagePoints[0], FOCAL_LENGTH, criteria, rotation_matrix, translation_vector );
//calculate rotation angles
//calculate rotation angles (ZYX Euler decomposition of the 3x3 rotation matrix)
double beta = atan2((double)(-rotation_matrix[2]), (double)(sqrt(pow(rotation_matrix[0], 2) + pow(rotation_matrix[3], 2))));
double alpha = atan2((rotation_matrix[3]/cos(beta)), (rotation_matrix[0]/cos(beta)));
double gamma = atan2((rotation_matrix[7]/cos(beta)), (rotation_matrix[8]/cos(beta)));
//release the POSIT object and the output buffers
cvReleasePOSITObject( &positObject );
delete[] rotation_matrix;
delete[] translation_vector;
I really need help with POSIT — I just can't get it to work.

I'm not sure where the problem lies, but I assume it is mostly numerical error (inverting a near-singular matrix, rounding errors, and so on). You could try tracing through the computation to see where it becomes numerically unstable. Is the smaller focal length the correct one?

The smaller focal length is correct, yes. I've since moved away from POSIT because I read that it has problems with unknown depth, so I consider my question answered.