C++: How can I capture real-time video from my cameras?


I am using USB 3.0 Basler acA640-750uc cameras to capture video. Below is the program that uses two cameras and grabs frames:

The problem is that when I run this program, my computer captures video from both cameras, but the video lags about 2 seconds behind my actual movement. The video is slower than real time, and I want to capture in real time. How can I fix this?

I have tried changing the increment in the for (size_t i = 0; i < ...) loop from ++i to i++, but it didn't work.

#include <pylon/PylonIncludes.h>
#ifdef PYLON_WIN_BUILD
#include <pylon/PylonGUI.h>
#endif

// Namespace for using pylon objects.
using namespace Pylon;

// Namespace for using cout.
using namespace std;

// Number of images to be grabbed.
static const uint32_t c_countOfImagesToGrab = 1000;

// Limits the amount of cameras used for grabbing.
// It is important to manage the available bandwidth when grabbing with
// multiple cameras.
// This applies, for instance, if two GigE cameras are connected to the
// same network adapter via a switch.
// To manage the bandwidth, the GevSCPD interpacket delay parameter and
// the GevSCFTD transmission delay parameter can be set for each GigE
// camera device.
// The "Controlling Packet Transmission Timing with the Interpacket and
// Frame Transmission Delays on Basler GigE Vision Cameras" Application
// Notes (AW000649xx000) provide more information about this topic.
// The bandwidth used by a FireWire camera device can be limited by
// adjusting the packet size.
static const size_t c_maxCamerasToUse = 2;

int main(int argc, char* argv[])
{
    // The exit code of the sample application.
    int exitCode = 0;

    // Before using any pylon methods, the pylon runtime must be initialized.
    PylonInitialize();

    try
    {
        // Get the transport layer factory.
        CTlFactory& tlFactory = CTlFactory::GetInstance();

        // Get all attached devices and exit application if no device is found.
        DeviceInfoList_t devices;
        if (tlFactory.EnumerateDevices(devices) == 0)
        {
            throw RUNTIME_EXCEPTION("No camera present.");
        }

        // Create an array of instant cameras for the found devices and avoid
        // exceeding a maximum number of devices.
        CInstantCameraArray cameras(min(devices.size(), c_maxCamerasToUse));

        // Create and attach all Pylon Devices.
        for (size_t i = 0; i < cameras.GetSize(); ++i)
        {
            cameras[i].Attach(tlFactory.CreateDevice(devices[i]));

            // Print the model name of the camera.
            cout << "Using device " << cameras[i].GetDeviceInfo().GetModelName() << endl;
        }

        // Starts grabbing for all cameras starting with index 0. The grabbing
        // is started for one camera after the other. That's why the images of
        // all cameras are not taken at the same time.
        // However, a hardware trigger setup can be used to cause all cameras
        // to grab images synchronously.
        // According to their default configuration, the cameras are
        // set up for free-running continuous acquisition.
        cameras.StartGrabbing();

        // This smart pointer will receive the grab result data.
        CGrabResultPtr ptrGrabResult;

        // Grab c_countOfImagesToGrab from the cameras.
        for (uint32_t i = 0; i < c_countOfImagesToGrab && cameras.IsGrabbing(); ++i)
        {
            cameras.RetrieveResult(5000, ptrGrabResult, TimeoutHandling_ThrowException);

            // When the cameras in the array are created the camera context value
            // is set to the index of the camera in the array.
            // The camera context is a user settable value.
            // This value is attached to each grab result and can be used
            // to determine the camera that produced the grab result.
            intptr_t cameraContextValue = ptrGrabResult->GetCameraContext();

#ifdef PYLON_WIN_BUILD
            // Show the image acquired by each camera in the window related to
            // each camera.
            Pylon::DisplayImage(cameraContextValue, ptrGrabResult);
#endif

            // Print the index and the model name of the camera.
            cout << "Camera " << cameraContextValue << ": "
                 << cameras[cameraContextValue].GetDeviceInfo().GetModelName() << endl;

            // Now, the image data can be processed.
            cout << "GrabSucceeded: " << ptrGrabResult->GrabSucceeded() << endl;
            cout << "SizeX: " << ptrGrabResult->GetWidth() << endl;
            cout << "SizeY: " << ptrGrabResult->GetHeight() << endl;
            const uint8_t* pImageBuffer = (uint8_t*) ptrGrabResult->GetBuffer();
            cout << "Gray value of first pixel: " << (uint32_t) pImageBuffer[0] << endl << endl;
        }
    }
    catch (const GenericException& e)
    {
        // Error handling
        cerr << "An exception occurred." << endl
             << e.GetDescription() << endl;
        exitCode = 1;
    }

    // Comment the following two lines to disable waiting on exit.
    cerr << endl << "Press Enter to exit." << endl;
    while (cin.get() != '\n');

    // Releases all pylon resources.
    PylonTerminate();

    return exitCode;
}
I have no experience with this hardware, but changing ++i to i++ obviously cannot solve your problem, because in a loop such as for (size_t i = 0; i < ...) the two forms are equivalent: the value of the increment expression is discarded, so only its side effect matters.
I am not sure, but judging from the comments in the code, you may need to configure the cameras manually (the two cameras may be configured differently):

Also, read these comments from the code carefully and check whether your network and parameters are configured correctly. I suggest you first try with a single camera:

// Limits the amount of cameras used for grabbing.
// It is important to manage the available bandwidth when grabbing with 
// multiple cameras.
// This applies, for instance, if two GigE cameras are connected to the 
// same network adapter via a switch.
// To manage the bandwidth, the GevSCPD interpacket delay parameter and 
// the GevSCFTD transmission delay
// parameter can be set for each GigE camera device.
// The "Controlling Packet Transmission Timing with the Interpacket and 
// Frame Transmission Delays on Basler GigE Vision Cameras"
// Application Notes (AW000649xx000)
// provide more information about this topic.
// The bandwidth used by a FireWire camera device can be limited by 
// adjusting the packet size.
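A common cause of a growing multi-second lag is buffering rather than bandwidth: the default grab strategy queues frames one after another, so if display/processing is slower than acquisition, the displayed image falls further behind. A minimal sketch of a possible fix, assuming the Basler pylon SDK's GrabStrategy_LatestImageOnly strategy (not part of the original post, and it requires attached cameras to actually run):

```cpp
// Sketch only: needs the Basler pylon SDK and connected cameras.
#include <pylon/PylonIncludes.h>

using namespace Pylon;

int main()
{
    PylonInitialize();

    CInstantCameraArray cameras(2);
    // ... attach devices exactly as in the question's code ...

    // Keep only the newest image per camera instead of queuing a backlog.
    // Older frames are dropped, which trades frame completeness for the
    // low display latency the question asks about.
    cameras.StartGrabbing(GrabStrategy_LatestImageOnly);

    CGrabResultPtr result;
    while (cameras.IsGrabbing())
    {
        cameras.RetrieveResult(5000, result, TimeoutHandling_ThrowException);
        // ... display or process the latest frame here ...
    }

    PylonTerminate();
    return 0;
}
```

With the default strategy, every grabbed frame waits its turn in the output queue; with the latest-image-only strategy, RetrieveResult always delivers the most recent frame, so the display cannot drift behind real time.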

Comments:

- When I tried each camera one at a time, it worked; it also worked in the Pylon Viewer. But when I use both cameras at the same time, it does not work.
- OK. What about your network? How are the cameras connected in your local network?
- @Farhad Rahmanifard: my two cameras are connected to my computer with two USB 3.0 cables, so there is no network bottleneck of that kind.
- I am not sure I can help here, but I may be able to help you find the root of the problem. Run two separate programs, each connected to one camera, and see how they behave. If they run smoothly, something must be wrong in the source code that handles two cameras; if not, at least we know the source code is not the problem.
- Sorry for the late reply; I was busy those days. But I am happy to tell you that it worked: I fixed my source code by trying what you suggested above. Thanks!!