FFmpeg: converting NV12 to RGB/YUV420P with libswscale


I am developing an application that needs to convert NV12 frames coming from the h264_cuvid decoder to RGB so that I can modify those frames. I have looked into it, but I do not know the correct "stride" values.

My code is as follows:

uint8_t *inData[2] = { videoFrame->data[0], videoFrame->data[0] + videoFrame->width * videoFrame->height };
int inLinesize[2] = { videoFrame->width, videoFrame->width };

sws_scale(convert_yuv_to_rgb, inData, inLinesize, 0, videoFrame->height, aux_frame->data, aux_frame->linesize);

But it does not work. The problem seems to be with the colors, since I can see the luminance plane correctly.
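The likely culprit is the chroma offset: in an NV12 AVFrame the decoder already fills `data[1]` (the interleaved UV plane) and both `linesize` entries, and hardware decoders such as h264_cuvid typically pad each row, so `linesize[0]` can be larger than `width`. A minimal sketch of the offset arithmetic, assuming a hypothetical 1920x1080 frame padded to a 2048-byte stride:

```c
#include <stddef.h>

/* NV12 layout: a luma plane followed by an interleaved UV plane.
 * The UV plane starts stride * height bytes after the luma start,
 * where stride (linesize[0]) may exceed the visible width because
 * decoders pad rows for alignment. */
size_t nv12_chroma_offset(size_t stride, size_t height) {
    return stride * height;
}

/* Using width * height instead (as in the code above) points into the
 * row padding whenever stride != width: the luma still looks right,
 * but every chroma read is shifted, which matches the symptom of
 * correct brightness with wrong colors. */
```

In practice there is no need to rebuild these arrays by hand: passing `videoFrame->data` and `videoFrame->linesize` directly to `sws_scale` uses the pointers and strides the decoder already computed.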

In the end, I solved it by using a video filter for the color conversion.


I created another one in a similar way to convert RGB back to YUV420P before encoding.

Comment: The chroma data should be stored in videoFrame->data[1], not at videoFrame->data[0] + videoFrame->width * videoFrame->height.

Reply: @OldC The first RGB -> YUV420P conversion works, but the second conversion does not when the original frame is NV12. I verified that the RGB frames are correct by saving each one to a PNG file.
char args[512];
int ret;
AVFilter *buffersrc = avfilter_get_by_name("buffer");
AVFilter *buffersink = avfilter_get_by_name("buffersink");
AVFilterInOut *outputs = avfilter_inout_alloc();
AVFilterInOut *inputs = avfilter_inout_alloc();
AVFilterGraph *filter_graph = avfilter_graph_alloc();
AVBufferSinkParams *buffersink_params;
enum AVPixelFormat pix_fmts[] = { AV_PIX_FMT_RGB32, AV_PIX_FMT_NONE };

/* buffer video source: the decoded frames from the decoder will be inserted here. */
snprintf(args, sizeof(args),
         "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
         inStream.width, inStream.height, inStream.pix_fmt,
         inStream.time_base.num, inStream.time_base.den,
         inStream.sample_aspect_ratio.num, inStream.sample_aspect_ratio.den);
ret = avfilter_graph_create_filter(&buffersrc_ctx_to_rgb_, buffersrc, "in", args, NULL, filter_graph);

if (ret < 0) {
    throw SVSException(QString("Could not create filter graph, error: %1").arg(svsAvErrorToFormattedString(ret)));
}

/* buffer video sink: to terminate the filter chain. */
buffersink_params = av_buffersink_params_alloc();
buffersink_params->pixel_fmts = pix_fmts;
ret = avfilter_graph_create_filter(&buffersink_ctx_to_rgb_, buffersink, "out", NULL, buffersink_params, filter_graph);
av_free(buffersink_params); /* no longer needed once the filter is created */

if (ret < 0) {
    throw SVSException(QString("Cannot create buffer sink, error: %1").arg(svsAvErrorToFormattedString(ret)));
}

/* Endpoints for the filter graph. */
outputs->name = av_strdup("in");
outputs->filter_ctx = buffersrc_ctx_to_rgb_;
outputs->pad_idx = 0;
outputs->next = NULL;

inputs->name = av_strdup("out");
inputs->filter_ctx = buffersink_ctx_to_rgb_;
inputs->pad_idx = 0;
inputs->next = NULL;

QString filter_description = "format=pix_fmts=rgb32";
if ((ret = avfilter_graph_parse_ptr(filter_graph, filter_description.toStdString().c_str(), &inputs, &outputs, NULL)) < 0) {
    svsCritical("", QString("Could not add the filter to graph, error: %1").arg(svsAvErrorToFormattedString(ret)));
}

if ((ret = avfilter_graph_config(filter_graph, NULL)) < 0) {
    svsCritical("", QString("Could not configure the graph, error: %1").arg(svsAvErrorToFormattedString(ret)));
}

return;
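The args string handed to the "buffer" source is just a formatted description of the incoming frames. A self-contained sketch of the same snprintf call with hypothetical values (1920x1080, pixel format enum value 23, 1/25 time base, square pixels — these numbers are illustrative, not taken from the code above):

```c
#include <stdio.h>
#include <stddef.h>

/* Formats the "buffer" source arguments the same way the snippet
 * above does; every numeric value is supplied by the caller. */
void build_buffersrc_args(char *out, size_t n,
                          int w, int h, int pix_fmt,
                          int tb_num, int tb_den,
                          int sar_num, int sar_den) {
    snprintf(out, n,
             "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
             w, h, pix_fmt, tb_num, tb_den, sar_num, sar_den);
}
```

Keeping the formatting in a helper like this makes it easy to check the string against the syntax the buffer source expects before wiring up the graph.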