Matrix absolute sensor orientation
I am working with data provided in a MISB metadata stream and the NASA WorldWind Matrix operations. The MISB data provides the platform's orientation in yaw, pitch, and roll, as well as the sensor's orientation relative to the platform in yaw, pitch, and roll. I am trying to compute the sensor's absolute orientation (yaw, pitch, roll) relative to north from the platform orientation and the sensor's relative orientation. I currently build the platform rotation matrix and the sensor relative rotation matrix and multiply the two, but the resulting rotation matrix does not appear to be correct. According to section 6.2.4 of the MISB document, the order of Euler-angle operations is yaw, pitch, then roll. What is the correct way to combine the rotation matrices to obtain the absolute rotation?
//use transpose for clockwise rotation
Matrix mpYaw = Matrix.fromRotationZ(pYaw).getTranspose();
Matrix mpPitch = Matrix.fromRotationY(pPitch).getTranspose();
Matrix mpRoll = Matrix.fromRotationX(pRoll).getTranspose();
Matrix msYaw = Matrix.fromRotationZ(sYaw).getTranspose();
Matrix msPitch = Matrix.fromRotationY(sPitch).getTranspose();
Matrix msRoll = Matrix.fromRotationX(sRoll).getTranspose();
Matrix mpRot = mpYaw.multiply(mpPitch).multiply(mpRoll); //platform
Matrix msRot = msYaw.multiply(msPitch).multiply(msRoll); //sensor
Matrix maRot = mpRot.multiply(msRot); //absolute
MISB data sample:
Platform Heading Angle:175.66308079652094
Platform Pitch Angle:3.4296700949125647
Platform Roll Angle:-0.3982665486617634
Sensor Rel. Az. Angle:326.08593764856596
Sensor Rel. El. Angle:-21.60937493741949
Sensor Rel. Roll Angle:0.0
Sensor Latitude:33.03482410173622
Sensor Longitude:-114.45451377632772
Sensor True Altitude:1022.4368657969026
Frame Center Lat.:33.01531312661958
Frame Center Lon.:-114.4367867216639
Frame Center El.:79.58953231097883
Slant Range:2883.640118614687
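As a quick sanity check on the sample values above (my addition, not part of the original post): when the platform pitch and roll are small, as they are here, the absolute azimuth is approximately the platform heading plus the sensor relative azimuth, modulo 360. A minimal sketch of that rough check:

```java
public class AzimuthCheck {
    /**
     * Rough absolute azimuth in degrees; only a good approximation when
     * platform pitch and roll are near zero.
     */
    public static double approxAbsoluteAzimuth(double platformHeadingDeg, double sensorRelAzDeg) {
        double az = (platformHeadingDeg + sensorRelAzDeg) % 360.0;
        return az < 0 ? az + 360.0 : az;
    }

    public static void main(String[] args) {
        // Values from the MISB sample above
        double az = approxAbsoluteAzimuth(175.66308079652094, 326.08593764856596);
        System.out.println(az); // roughly 141.75 degrees
    }
}
```

This gives a ballpark figure to compare the full matrix result against; it will not match exactly because it ignores the small pitch/roll coupling.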
Edit 1:
After applying the fix suggested by @anjruu, the result looks close, but is still slightly off. I computed the target position's local NED coordinates by multiplying the rotation matrix's forward vector by the target range provided by MISB. I then computed the local NED coordinates of the target position provided by MISB (using WorldWind's coordinate transforms, shown below), with the origin set to the provided platform position; the results differ slightly.
Matrix mpYaw = Matrix.fromRotationZ(pYaw).getTranspose();
Matrix mpPitch = Matrix.fromRotationY(pPitch).getTranspose();
Matrix mpRoll = Matrix.fromRotationX(pRoll).getTranspose();
Matrix msYaw = Matrix.fromRotationZ(sYaw).getTranspose();
Matrix msPitch = Matrix.fromRotationY(sPitch).getTranspose();
Matrix msRoll = Matrix.fromRotationX(sRoll).getTranspose();
Matrix mpRot = mpRoll.multiply(mpPitch).multiply(mpYaw); //platform
Matrix msRot = msRoll.multiply(msPitch).multiply(msYaw); //sensor
Matrix maRot = msRot.multiply(mpRot); //absolute
Globe globe = new Earth();
Position pPlatform = Position.fromDegrees(33.03482410173622, -114.45451377632772, 1022.4368657969026);
Position pTarget = Position.fromDegrees(33.01531312661958, -114.4367867216639, 79.58953231097883);
double targetRange = 2883.640118614687;
Vec4 vTarNED = new Vec4(1,0,0).transformBy3(maRot.getTranspose()).multiply3(targetRange);
//NED = (-2165.935747907422, 1656.9597179630864, 937.3298046411029, 1.0)
Matrix localENU = ViewUtil.computePositionTransform(globe, pPlatform);
Vec4 vTarENU = globe.computePointFromPosition(pTarget).transformBy4(localENU);
//ENU = (1656.3846316600684, -2163.7501770820236, -943.4305881811306, 1.0)
//NED = (-2163.7501770820236, 1656.3846316600684, 943.4305881811306, 1.0)
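The axis swap between the two comment lines above can be made explicit. ENU (east, north, up) and NED (north, east, down) are related by swapping the first two components and negating the third; a minimal sketch of the conversion (my addition, plain arrays rather than WorldWind's Vec4):

```java
public class EnuToNed {
    /** Convert an ENU vector (east, north, up) to NED (north, east, down). */
    public static double[] enuToNed(double[] enu) {
        return new double[] { enu[1], enu[0], -enu[2] };
    }

    public static void main(String[] args) {
        // ENU vector from the test case above
        double[] enu = { 1656.3846316600684, -2163.7501770820236, -943.4305881811306 };
        double[] ned = enuToNed(enu);
        // -> (-2163.7501770820236, 1656.3846316600684, 943.4305881811306)
        System.out.printf("NED = (%f, %f, %f)%n", ned[0], ned[1], ned[2]);
    }
}
```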
For future researchers: I faced the same problem. The main issue is the sensor error; to set the view position and orientation directly from the sensor data, you would need to compute this error and add it as an offset value. However, we can let World Wind handle most of the computation for us.

With any 3D engine you do not actually need any of the given angle information, because you already have the eye and look-at positions. You can compute the necessary orientation values from those positions, either manually or automatically.

Here is my function that sets the camera position from the given MISB KLV data:
public void setCameraPosition(BTelemetryData pData) {
    // Get platform location information
    Angle tPlatformLatitude = Angle.fromDegrees(Double.parseDouble(pData.getAlternatePlatformLatitude()));
    Angle tPlatformLongitude = Angle.fromDegrees(Double.parseDouble(pData.getAlternatePlatformLongitude()));
    double tPlatformAltitude = Double.parseDouble(pData.getPlatformGPSAltitude());
    Position tPlatformPosition = new Position(tPlatformLatitude, tPlatformLongitude, tPlatformAltitude);
    // Get look-at location information
    Angle tLookAtLatitude = Angle.fromDegrees(Double.parseDouble(pData.getFrameCenterLatitude()));
    Angle tLookAtLongitude = Angle.fromDegrees(Double.parseDouble(pData.getFrameCenterLongitude()));
    // Note: must take into account the surface elevation at the given lat/lon.
    double tLookAtAltitude = getWwd().getModel().getGlobe().getElevation(tLookAtLatitude, tLookAtLongitude);
    Position tLookAtPosition = new Position(tLookAtLatitude, tLookAtLongitude, tLookAtAltitude);
    // First things first, we need to set the field of view
    getView().setFieldOfView(Angle.fromDegrees(Double.parseDouble(pData.getSensorHorizontalFieldofView())));
    if (useAutoCameraPosition())
        setCameraPositionAutomatically(tLookAtPosition, tPlatformPosition);
    else
        calculateAndSetCameraPosition(tLookAtPosition, tPlatformPosition);
    getView().firePropertyChange(AVKey.VIEW, null, getView());
}
public void setCameraPositionAutomatically(Position pLookAtPosition, Position pEyePosition) {
    getView().setEyePosition(pEyePosition);
    getView().setOrientation(pEyePosition, pLookAtPosition);
}
public void calculateAndSetCameraPosition(Position pLookAtPosition, Position pEyePosition) {
    double tPitch = getPitchAngleBetweenPositionInDegrees(pLookAtPosition, pEyePosition);
    double tHeading = getHeadingAngleBetweenPositionInDegrees(pLookAtPosition, pEyePosition);
    getView().setEyePosition(pEyePosition);
    getView().setHeading(Angle.fromDegrees(tHeading));
    getView().setPitch(Angle.fromDegrees(tPitch));
}
public double getPitchAngleBetweenPositionInDegrees(Position pLookAt, Position pEyePosition) {
    // Radius of the globe at the look-at position
    double tRadius = getWwd().getModel().getGlobe().getRadiusAt(pLookAt);
    // Arc length along the surface between the two positions
    double tRadialDistance = Position.greatCircleDistance(pLookAt, pEyePosition).getRadians() * tRadius;
    // Central angle subtended by that arc
    double tTheta = tRadialDistance / tRadius;
    // Surface elevation of the look-at position
    double tLookAtElevation = pLookAt.getElevation();
    // Altitude of the given eye position
    double tEyeAltitude = pEyePosition.getAltitude();
    // Cartesian offsets of the look-at point relative to the globe center
    double tDeltaX = (tRadius + tLookAtElevation) * Math.cos(tTheta);
    double tDeltaY = (tRadius + tLookAtElevation) * Math.sin(tTheta);
    double tDeltaZ = tRadius + tEyeAltitude - tDeltaX;
    double alpha = Math.atan(tDeltaZ / tDeltaY) - tTheta;
    // Convert to the World Wind convention: pitch = 90 - elevation angle.
    double degrees = 90 - Math.toDegrees(alpha);
    System.out.println("Elevation Angle Between Positions = " + degrees);
    return degrees;
}
public double getHeadingAngleBetweenPositionInDegrees(Position pLookAtPosition, Position pEyePosition) {
    double tLatEye = pEyePosition.getLatitude().radians;
    double tLatLookAt = pLookAtPosition.getLatitude().radians;
    double tLonLookAt = pLookAtPosition.getLongitude().radians;
    double tLonEye = pEyePosition.getLongitude().radians;
    double dLon = tLonLookAt - tLonEye;
    // Initial great-circle bearing from the eye toward the look-at point
    double y = Math.sin(dLon) * Math.cos(tLatLookAt);
    double x = Math.cos(tLatEye) * Math.sin(tLatLookAt)
             - Math.sin(tLatEye) * Math.cos(tLatLookAt) * Math.cos(dLon);
    double tBearing = Math.toDegrees(Math.atan2(y, x));
    // Normalize the angle to [0, 360)
    tBearing = (tBearing + 360) % 360;
    // Note: World Wind takes the heading clockwise; to make it counter-clockwise,
    // subtract it from 360 degrees:
    // tBearing = 360 - tBearing;
    return tBearing;
}
In my code I do not set the roll angle, but according to the documentation you can simply add the sensor and platform roll angles together and set that as the roll.
Note that World Wind has two different view classes, BasicOrbitView and BasicFlyView; to simulate the given data you must use BasicFlyView. The reason is that a FlyView keeps the camera position fixed when you set the angles, whereas an OrbitView keeps the look-at position fixed and changes the camera position according to those angles. If the accuracy is sufficient for you, you can simply use the setOrientation method.
Happy coding :)

Can you include the Matrix class, or at least the specification of multiply? My guess is that Matrix::multiply is a right multiplication and the camera pose is relative to the platform pose, which would mean it should be msRot.multiply(mpRot), and the multiplication chains for obtaining mpRot and msRot should be reversed; but without knowing what multiply actually does, I cannot say.

@anjruu The NASA WorldWind link in the post points to the Matrix class. It is this one, sorry.

Yes, A.multiply(B) is A*B (which is the normal convention), so I think it should be Matrix mpRot = mpRoll.multiply(mpPitch).multiply(mpYaw), and similarly for msRot and maRot. Also, see the linked reference.

@anjruu I added a test case to verify your change, but I am still getting slightly incorrect results.
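To make the composition order discussed in the comments concrete without depending on WorldWind's Matrix class, here is a minimal, self-contained sketch (my own, using plain 3x3 arrays and standard right-handed rotation matrices; WorldWind's fromRotationZ may use a different sign convention, which is why the question transposes its matrices, so treat this as an illustration of the ordering rather than a drop-in replacement):

```java
public class RotationDemo {
    // Rotation about Z (yaw); angle in radians.
    public static double[][] rotZ(double a) {
        double c = Math.cos(a), s = Math.sin(a);
        return new double[][] { { c, -s, 0 }, { s, c, 0 }, { 0, 0, 1 } };
    }
    // Rotation about Y (pitch).
    public static double[][] rotY(double a) {
        double c = Math.cos(a), s = Math.sin(a);
        return new double[][] { { c, 0, s }, { 0, 1, 0 }, { -s, 0, c } };
    }
    // Rotation about X (roll).
    public static double[][] rotX(double a) {
        double c = Math.cos(a), s = Math.sin(a);
        return new double[][] { { 1, 0, 0 }, { 0, c, -s }, { 0, s, c } };
    }
    /** 3x3 matrix product a*b. */
    public static double[][] multiply(double[][] a, double[][] b) {
        double[][] r = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    r[i][j] += a[i][k] * b[k][j];
        return r;
    }
    /** Apply matrix m to column vector v. */
    public static double[] apply(double[][] m, double[] v) {
        double[] r = new double[3];
        for (int i = 0; i < 3; i++)
            for (int k = 0; k < 3; k++)
                r[i] += m[i][k] * v[k];
        return r;
    }

    public static void main(String[] args) {
        // Sanity check: a pure +90 degree yaw takes the x axis onto the y axis.
        double[] v = apply(rotZ(Math.toRadians(90)), new double[] { 1, 0, 0 });
        System.out.printf("(%.3f, %.3f, %.3f)%n", v[0], v[1], v[2]); // (0.000, 1.000, 0.000)

        // Compose roll * pitch * yaw (the order suggested in the comments)
        // using the platform angles from the sample data, then rotate the
        // forward vector.
        double[][] rot = multiply(rotX(Math.toRadians(-0.3982665486617634)),
                         multiply(rotY(Math.toRadians(3.4296700949125647)),
                                  rotZ(Math.toRadians(175.66308079652094))));
        double[] fwd = apply(rot, new double[] { 1, 0, 0 });
        System.out.printf("forward = (%.3f, %.3f, %.3f)%n", fwd[0], fwd[1], fwd[2]);
    }
}
```

Because each factor is orthogonal, the transpose of the composed matrix is its inverse, which is the property the question's getTranspose() calls rely on.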