C++ NV12 to RGB32 with Microsoft Media Foundation

I want to get a camera preview through Media Foundation, following an example I found on YouTube. I have already enumerated my devices and I can get a preview from my camera, but the data comes in NV12 format and I need it in RGB32. This is the class that holds the information about the device:

class Media : public IMFSourceReaderCallback
{
    CRITICAL_SECTION criticalSection;
    long referenceCount;
    WCHAR *wSymbolicLink;
    UINT32 cchSymbolicLink;
    IMFSourceReader* sourceReader;

public:
    LONG stride;
    int bytesPerPixel;
    GUID videoFormat;
    UINT height;
    UINT width;
    WCHAR deviceNameString[2048];
    BYTE* rawData;

    HRESULT CreateCaptureDevice();
    HRESULT SetSourceReader(IMFActivate *device);
    HRESULT IsMediaTypeSupported(IMFMediaType* type);
    HRESULT GetDefaultStride(IMFMediaType *pType, LONG *plStride);
    HRESULT Close();

    Media();
    ~Media();

    // the class must implement the methods from IUnknown
    STDMETHODIMP QueryInterface(REFIID iid, void** ppv);
    STDMETHODIMP_(ULONG) AddRef();
    STDMETHODIMP_(ULONG) Release();

    // the class must implement the methods from IMFSourceReaderCallback
    STDMETHODIMP OnReadSample(HRESULT status, DWORD streamIndex, DWORD streamFlags, LONGLONG timeStamp, IMFSample *sample);
    STDMETHODIMP OnEvent(DWORD, IMFMediaEvent *);
    STDMETHODIMP OnFlush(DWORD);
};

This is the method that creates the device:

HRESULT Media::CreateCaptureDevice()
{
    HRESULT hr = S_OK;

    //this is important!!
    hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED | COINIT_DISABLE_OLE1DDE);

    UINT32 count = 0;
    IMFAttributes *attributes = NULL;
    IMFActivate **devices = NULL;

    if (FAILED(hr)) { CLEAN_ATTRIBUTES() }
    // Create an attribute store to specify enumeration parameters.
    hr = MFCreateAttributes(&attributes, 1);

    if (FAILED(hr)) { CLEAN_ATTRIBUTES() }

    //The attribute to be requested is devices that can capture video
    hr = attributes->SetGUID(
        MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
        MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID
    );
    if (FAILED(hr)) { CLEAN_ATTRIBUTES() }
    //Enumerate the video capture devices
    hr = MFEnumDeviceSources(attributes, &devices, &count);

    if (FAILED(hr)) { CLEAN_ATTRIBUTES() }
    //if there are any available devices
    if (count > 0)
    {
        /*If you actually need to select one of the available devices
        this is the place to do it. For this example the first device
        is selected
        */
        //Get a source reader from the first available device
        SetSourceReader(devices[0]);

        WCHAR *nameString = NULL;
        // Get the human-friendly name of the device
        UINT32 cchName;
        hr = devices[0]->GetAllocatedString(
            MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME,
            &nameString, &cchName);

        if (SUCCEEDED(hr))
        {
            //allocate a byte buffer for the raw pixel data
            bytesPerPixel = abs(stride) / width;
            rawData = new BYTE[width*height * bytesPerPixel];
            wcscpy(deviceNameString,nameString);
        }
        CoTaskMemFree(nameString);
    }

    //clean
    CLEAN_ATTRIBUTES()
}
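The CLEAN_ATTRIBUTES() macro used above is not defined in the question; a plausible sketch of it, assuming it simply releases the enumeration objects and returns the current hr:

// Hypothetical cleanup macro matching how the question uses it:
// release the attribute store and the device list, then bail out with hr.
#define CLEAN_ATTRIBUTES() \
    if (attributes) { attributes->Release(); attributes = NULL; } \
    if (devices) \
    { \
        for (DWORD i = 0; i < count; i++) \
        { \
            if (devices[i]) { devices[i]->Release(); devices[i] = NULL; } \
        } \
        CoTaskMemFree(devices); devices = NULL; \
    } \
    return hr;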
This is the method that sets up the source reader:

HRESULT Media::SetSourceReader(IMFActivate *device)
{
    HRESULT hr = S_OK;

    IMFMediaSource *source = NULL;
    IMFAttributes *attributes = NULL;
    IMFMediaType *mediaType = NULL;

    EnterCriticalSection(&criticalSection);

    hr = device->ActivateObject(__uuidof(IMFMediaSource), (void**)&source);

    //get symbolic link for the device
    if(SUCCEEDED(hr))
        hr = device->GetAllocatedString(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK, &wSymbolicLink, &cchSymbolicLink);
    //Allocate attributes
    if (SUCCEEDED(hr))
        hr = MFCreateAttributes(&attributes, 2);
    //do not let the source reader insert format converters
    if (SUCCEEDED(hr))
        hr = attributes->SetUINT32(MF_READWRITE_DISABLE_CONVERTERS, TRUE);
    // Set the callback pointer.
    if (SUCCEEDED(hr))
        hr = attributes->SetUnknown(MF_SOURCE_READER_ASYNC_CALLBACK,this);
    //Create the source reader
    if (SUCCEEDED(hr))
        hr = MFCreateSourceReaderFromMediaSource(source,attributes,&sourceReader);
    // Try to find a suitable output type.
    if (SUCCEEDED(hr))
    {
        for (DWORD i = 0; ; i++)
        {
            hr = sourceReader->GetNativeMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,i,&mediaType);
            if (FAILED(hr)) { break; }

            hr = IsMediaTypeSupported(mediaType);
            if (FAILED(hr)) { break; }
            //Get width and height
            MFGetAttributeSize(mediaType, MF_MT_FRAME_SIZE, &width, &height);
            if (mediaType) 
            { mediaType->Release(); mediaType = NULL; }

            if (SUCCEEDED(hr))// Found an output type.
                break;
        }
    }
    if (SUCCEEDED(hr))
    {
        // Ask for the first sample.
        hr = sourceReader->ReadSample((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,   0, NULL, NULL,NULL,NULL);
    }

    if (FAILED(hr))
    {
        if (source)
        {
            source->Shutdown(); 
        }
        Close();
    }
    if (source) { source->Release(); source = NULL; }
    if (attributes) { attributes->Release(); attributes = NULL; }
    if (mediaType) { mediaType->Release(); mediaType = NULL; }

    LeaveCriticalSection(&criticalSection);
    return hr;
}
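For context, the class declares OnReadSample but its body is not shown in the question. A minimal sketch of such a callback, assuming the rawData buffer allocated in CreateCaptureDevice is large enough for one frame (for NV12 that is width * height * 3 / 2 bytes):

// Sketch: copy each incoming frame into rawData and request the next sample.
// Illustrative only; a real callback would also handle stride and stream events.
HRESULT Media::OnReadSample(HRESULT status, DWORD streamIndex, DWORD streamFlags, LONGLONG timeStamp, IMFSample *sample)
{
    EnterCriticalSection(&criticalSection);

    HRESULT hr = status;
    if (SUCCEEDED(hr) && sample != NULL)
    {
        IMFMediaBuffer *buffer = NULL;
        // collapse the sample into one contiguous buffer
        hr = sample->ConvertToContiguousBuffer(&buffer);
        if (SUCCEEDED(hr))
        {
            BYTE *data = NULL;
            DWORD currentLength = 0;
            hr = buffer->Lock(&data, NULL, &currentLength);
            if (SUCCEEDED(hr))
            {
                // raw NV12 frame bytes; assumes rawData can hold currentLength bytes
                memcpy(rawData, data, currentLength);
                buffer->Unlock();
            }
            buffer->Release();
        }
    }

    // request the next frame so the stream keeps flowing
    if (SUCCEEDED(hr))
        hr = sourceReader->ReadSample((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, NULL, NULL, NULL, NULL);

    LeaveCriticalSection(&criticalSection);
    return hr;
}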
Usually, when going from YUV (NV12) to RGB (RGB32):

BYTE *p_rgb_data = (((height / 2) + height) * width) * 3; // no alpha

EDIT

My answer is not correct. You want to get RGB format directly from the camera, right?

You have to enumerate the MediaTypes from the SourceReader:
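A minimal sketch of that enumeration, assuming the sourceReader member from the class above and using the first video stream (the RGB32 check is illustrative):

// Sketch: walk the native media types on the first video stream
// and look for an RGB32 entry. Error handling trimmed.
IMFMediaType *nativeType = NULL;
GUID subtype = GUID_NULL;
for (DWORD i = 0; ; i++)
{
    HRESULT hr = sourceReader->GetNativeMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, i, &nativeType);
    if (FAILED(hr))
        break;                                   // no more native types

    hr = nativeType->GetGUID(MF_MT_SUBTYPE, &subtype);
    nativeType->Release();
    nativeType = NULL;

    if (SUCCEEDED(hr) && subtype == MFVideoFormat_RGB32)
        break;                                   // found a native RGB32 type
}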

When a MediaType is RGB, set it as the current MediaType:

hr = pReader->SetCurrentMediaType(dwStreamIndex, NULL, pMediaType);
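A sketch of building such an RGB32 type for the source reader, assuming the sourceReader member from the class above. Note that if the camera only exposes NV12/YUY2 natively, this can only succeed when the reader is allowed to insert a converter, i.e. MF_READWRITE_DISABLE_CONVERTERS is not set to TRUE and (on Windows 8 and later) MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING is enabled:

// Sketch: ask the source reader for RGB32 on the first video stream;
// the reader inserts a format converter if it is allowed to.
IMFMediaType *rgbType = NULL;
HRESULT hr = MFCreateMediaType(&rgbType);
if (SUCCEEDED(hr))
    hr = rgbType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (SUCCEEDED(hr))
    hr = rgbType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
if (SUCCEEDED(hr))
    hr = sourceReader->SetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, rgbType);
if (rgbType)
{ rgbType->Release(); rgbType = NULL; }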

EDIT...

Yes, from:

BYTE *p_rgb_data = (((height / 2) + height) * width) * 3; // no alpha

-> it should be: BYTE *p_rgb_Size = (((height / 2) + height) * width) * 3; // no alpha

I missed the conversion. Something like:

BYTE GetR(const int bY, const int bU){

    int iR = bY + (int)(1.402f * bU);
    iR = iR > 255 ? 255 : iR < 0 ? 0 : iR;   // clamp to 0..255

    return iR;
}

BYTE GetG(const int bY, const int bU, const int bV){

    int iG = bY - (int)(0.344f * bV + 0.714f * bU);
    iG = iG > 255 ? 255 : iG < 0 ? 0 : iG;   // clamp to 0..255

    return iG;
}

BYTE GetB(const int bY, const int bV){

    int iB = bY + (int)(1.772f * bV);
    iB = iB > 255 ? 255 : iB < 0 ? 0 : iB;   // clamp to 0..255

    return iB;
}


If you want the complete conversion between an NV12 buffer and an RGB buffer, just ask.
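As a reference point, a minimal sketch of such a full-frame conversion built on the GetR/GetG/GetB helpers above. It assumes a tightly packed NV12 buffer (no extra stride), even width and height, and RGB32 output in BGRX byte order; note that in the helpers the 1.402 factor belongs to V and the 1.772 factor to U, so V is passed where the parameter is named bU and vice versa:

// Sketch: convert one tightly packed NV12 frame to RGB32 (BGRX byte order).
void Nv12ToRgb32(const BYTE *pNV12, BYTE *pRGB32, UINT width, UINT height)
{
    const BYTE *pY  = pNV12;                    // full-resolution Y plane
    const BYTE *pUV = pNV12 + width * height;   // interleaved U/V plane, half resolution

    for (UINT y = 0; y < height; y++)
    {
        for (UINT x = 0; x < width; x++)
        {
            const int iY = pY[y * width + x];
            // each U/V pair covers a 2x2 block of Y samples; center the values around 0
            const int iU = pUV[(y / 2) * width + (x / 2) * 2]     - 128;
            const int iV = pUV[(y / 2) * width + (x / 2) * 2 + 1] - 128;

            BYTE *pixel = pRGB32 + (y * width + x) * 4;
            pixel[0] = GetB(iY, iU);        // blue  = Y + 1.772 * U
            pixel[1] = GetG(iY, iV, iU);    // green = Y - 0.344 * U - 0.714 * V
            pixel[2] = GetR(iY, iV);        // red   = Y + 1.402 * V
            pixel[3] = 0xFF;                // X / alpha byte
        }
    }
}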

It can handle the color conversion; it is fairly simple to set up and use since it is a synchronous transform.
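One synchronous transform on Windows that can do this is the Color Converter DSP (CLSID_CColorConvertDMO); whether that is the component the comment refers to is an assumption. A rough sketch of driving it directly for one sample (error handling trimmed; the exact attributes required on the media types may vary):

// Sketch: NV12 sample in, RGB32 sample out, via the Color Converter DSP.
// Needs wmcodecdsp.h for CLSID_CColorConvertDMO and the wmcodecdspuuid library.
HRESULT ConvertSampleNv12ToRgb32(IMFSample *inSample, UINT32 width, UINT32 height, IMFSample **outSample)
{
    IMFTransform *converter = NULL;
    HRESULT hr = CoCreateInstance(CLSID_CColorConvertDMO, NULL, CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&converter));

    // Describe the input (NV12) and output (RGB32) formats.
    IMFMediaType *inType = NULL, *outType = NULL;
    if (SUCCEEDED(hr)) hr = MFCreateMediaType(&inType);
    if (SUCCEEDED(hr)) hr = inType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    if (SUCCEEDED(hr)) hr = inType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_NV12);
    if (SUCCEEDED(hr)) hr = inType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
    if (SUCCEEDED(hr)) hr = MFSetAttributeSize(inType, MF_MT_FRAME_SIZE, width, height);
    if (SUCCEEDED(hr)) hr = converter->SetInputType(0, inType, 0);

    if (SUCCEEDED(hr)) hr = MFCreateMediaType(&outType);
    if (SUCCEEDED(hr)) hr = outType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    if (SUCCEEDED(hr)) hr = outType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
    if (SUCCEEDED(hr)) hr = outType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
    if (SUCCEEDED(hr)) hr = MFSetAttributeSize(outType, MF_MT_FRAME_SIZE, width, height);
    if (SUCCEEDED(hr)) hr = converter->SetOutputType(0, outType, 0);

    // The caller provides the output sample and buffer (RGB32 = 4 bytes per pixel).
    IMFSample *result = NULL;
    IMFMediaBuffer *outBuffer = NULL;
    if (SUCCEEDED(hr)) hr = MFCreateSample(&result);
    if (SUCCEEDED(hr)) hr = MFCreateMemoryBuffer(width * height * 4, &outBuffer);
    if (SUCCEEDED(hr)) hr = result->AddBuffer(outBuffer);

    // Synchronous conversion: push the input, pull the output.
    if (SUCCEEDED(hr)) hr = converter->ProcessInput(0, inSample, 0);
    MFT_OUTPUT_DATA_BUFFER outputData = { 0, result, 0, NULL };
    DWORD status = 0;
    if (SUCCEEDED(hr)) hr = converter->ProcessOutput(0, 1, &outputData, &status);

    if (SUCCEEDED(hr)) { *outSample = result; result = NULL; }

    if (outBuffer) outBuffer->Release();
    if (result)    result->Release();
    if (inType)    inType->Release();
    if (outType)   outType->Release();
    if (converter) converter->Release();
    return hr;
}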

Webcams typically support RGB output. You may need to enumerate the device media types and pick and set the most suitable RGB32 type (width/height/fps) on the preview stream.

Could you explain that? If there are no data samples coming from NV12, how can the data be just a combination of height and width? Sorry, I just don't get it. Thanks anyway!

Not really. I have already tried checking the media type's subtype, but the only formats my camera offers are NV12 and YUY2, so I am trying to convert from NV12 to RGB32.

I enumerated the media types and checked subtype == MFVideoFormat_XXX; only NV12 and YUY2 work :(