GStreamer Two-Pipeline Synchronization


I have developed an application that reads video from a file, adds metadata to each frame, and streams the frames to a client over RTP on UDP.

On the receiving end, the client strips that metadata and plays the video.

On the server side I use:

pipeline1 :: gst-launch-1.0 filesrc ! videoparse ! appsink
At the appsink I add the metadata and push the resulting buffer to an appsrc.

pipeline2 :: gst-launch-1.0 appsrc ! rtpgstpay ! udpsink

And on the client side:

pipeline1 :: gst-launch-1.0 udpsrc ! rtpgstdepay ! appsink
At the appsink I strip the metadata and push the buffer to an appsrc.

pipeline2 :: gst-launch-1.0 appsrc ! videoparse ! autovideoconvert ! autovideosink
The problem is that on the receiving end I do not receive all the frames, and the video does not play properly: it shows a single frame, stops, then shows another single frame.

Can anyone suggest a solution or point me in the right direction?

/* Server-side frame processing. This runs in a function installed with
   g_timeout_add_seconds(0, new_sample, NULL); a "new-sample" signal
   callback on the appsink would avoid polling, but the logic is the same. */

GstSample *sample = NULL;
g_signal_emit_by_name(sink, "pull-sample", &sample);
FrameBuffer = gst_sample_get_buffer(sample);

/* "header" is a struct. Wrapping its address directly, as the original
   gst_buffer_new_wrapped(&header, sizeof(header)) did, makes the buffer
   try to g_free() stack memory later; wrap a heap copy instead. */
buf = gst_buffer_new_wrapped(g_memdup2(&header, sizeof(header)),
                             sizeof(header));

/* gst_buffer_append() takes ownership of both arguments, but FrameBuffer
   still belongs to the sample, so pass an extra reference. */
buffer = gst_buffer_append(buf, gst_buffer_ref(FrameBuffer));

g_signal_emit_by_name(appsrc2, "push-buffer", buffer, &ret);
gst_buffer_unref(buffer); /* the push-buffer signal does not take ownership */
gst_sample_unref(sample);

if (ret != GST_FLOW_OK)
{
        g_printerr("Failed to push buffer\n");
        return FALSE;
}
g_print("Successfully pushed.\n");

/* End of server-side frame processing. */
/* Receiver-side frame processing. This also runs in a function installed
   with g_timeout_add_seconds(0, new_sample, NULL). */
if (!gst_app_sink_is_eos((GstAppSink *)sink))
{
        GstSample *sample = NULL;
        g_signal_emit_by_name(sink, "pull-sample", &sample);
        buf = gst_sample_get_buffer(sample);

        /* Read back the 8-byte metadata header. */
        gst_buffer_extract(buf, 0, temp, 8);
        if (frameid != temp->FrameID)
        {
                frameid = temp->FrameID; /* remember the last frame seen;
                                            missing in the original snippet */
                gst_element_set_state(pipeline2, GST_STATE_PLAYING);
                g_print("Frame Header :: %d , Frame ID :: %d\n",
                        temp->FrameHeader, temp->FrameID);

                /* Strip the header: copy everything after the first 8 bytes.
                   The flags argument must be a GstBufferCopyFlags value such
                   as GST_BUFFER_COPY_ALL, not GST_BUFFER_OFFSET_NONE as in
                   the original. */
                gsize size = gst_buffer_get_size(buf);
                buffer = gst_buffer_copy_region(buf, GST_BUFFER_COPY_ALL,
                                                8, size - 8);
                g_print("Size :: -- %" G_GSIZE_FORMAT "\n", size);

                g_signal_emit_by_name(appsrc, "push-buffer", buffer, &ret);
                gst_buffer_unref(buffer);
                if (ret != GST_FLOW_OK)
                {
                        g_printerr("Failed to push buffer\n");
                        gst_sample_unref(sample);
                        return FALSE;
                }
                g_print("Successfully pushed.\n");
        }
        gst_sample_unref(sample);
}
Comments:

"What about setting is-live=TRUE on the appsrc? Also check the need-data / enough-data signals on the appsrc. It would be very helpful if you gave the pipeline details in gst-launch style (element ! element2 ! ...)."

"Pipeline one :: gst-launch-1.0 udpsrc ! queue ! rtpgstdepay ! appsink. At the appsink I modify the buffer, and the modified buffer is sent to the appsrc via the push-buffer signal. Pipeline two :: gst-launch-1.0 appsrc ! queue ! videoparse ! autovideoconvert ! autovideosink."

"Please update the question itself so others can see this at a glance; that gives you a better chance of an answer :) Could you add some code showing how you process the frames? What format is the video (are you sure about videoparse; don't you need a demuxer, e.g. qtdemux for mp4)? Does the metadata handling still work correctly when you connect the sender's pipelines 1 and 2 directly to the receiver's pipelines 1 and 2?"
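Following up on the is-live suggestion: the play-one-frame-then-stall symptom is commonly caused by the receiving appsrc pushing buffers with no caps and no timestamps, so the sink renders the first buffer and then waits forever for a valid running time. A hedged sketch of the receiver's second pipeline in the gst-launch notation used above, with explicit appsrc properties; the caps values are placeholders that must be replaced with the stream's real format, resolution, and framerate:

```
gst-launch-1.0 appsrc name=src is-live=true format=time do-timestamp=true \
    caps="video/x-raw,format=I420,width=640,height=480,framerate=30/1" \
  ! videoparse ! autovideoconvert ! autovideosink
```

With `format=time` and `do-timestamp=true`, appsrc stamps each pushed buffer with the pipeline clock; alternatively, copy `GST_BUFFER_PTS`/`GST_BUFFER_DURATION` from the pulled sample's buffer onto the buffer you push, which preserves the sender's original timing.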