iOS: How to get bytes from a CMSampleBufferRef to send over the network


With the help of Apple's documentation, I am capturing video using the AVFoundation framework.

Here is what I have done so far:

1. Created the video capture device.
2. Created an AVCaptureDeviceInput and set videoCaptureDevice on it.
3. Created an AVCaptureVideoDataOutput and implemented its delegate.
4. Created an AVCaptureSession, setting the input to the AVCaptureDeviceInput and the output to the AVCaptureVideoDataOutput. (A sketch of the full setup follows step 5 below.)

5. In the AVCaptureVideoDataOutput delegate method

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
I get the CMSampleBuffer, convert it to a UIImage, and display it in my image view using

[self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];
Everything works fine up to this point.
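For context, here is a minimal sketch of steps 1 through 5 in one place. It is not the question's actual code: the queue name, the BGRA pixel-format setting, and the imageFromSampleBuffer: helper (the standard conversion from Apple's AV Foundation guide) are assumptions, and self.session / self.imageView are assumed properties of the owning view controller.

#import <AVFoundation/AVFoundation.h>

- (void)setupCaptureSession {
    // 1. Video capture device
    AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // 2. Device input wrapping the capture device
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];

    // 3. Data output that delivers frames to the delegate; BGRA gives a
    //    single packed plane, which the answer below relies on
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self queue:dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL)];

    // 4. Session wiring input to output
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canAddInput:input]) [session addInput:input];
    if ([session canAddOutput:output]) [session addOutput:output];

    // the session must outlive this method; self.session is an assumed strong property
    self.session = session;
    [session startRunning];
}

// 5. Delegate callback: convert the frame and push it to the UI
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer]; // assumed helper
    [self.imageView performSelectorOnMainThread:@selector(setImage:)
                                     withObject:image
                                  waitUntilDone:YES];
}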

My problem is that I need to send the video frames over a UDP socket. I tried converting the UIImage to NSData and sending that through the UDP socket, but the video processing had a huge delay, mostly because of the UIImage-to-NSData conversion.

So please suggest a solution to my problem:

1) Is there a way to convert the CMSampleBuffer or CVImageBuffer to NSData?
2) Is there anything like Audio Queue Services, but for video: a queue to store the UIImages, convert them, and send them?

If I am going about this the wrong way, please point me in the right direction.


Thanks in advance.

Use CMSampleBufferGetImageBuffer to get a CVImageBufferRef from the sample buffer, then get the bitmap data from it with CVPixelBufferGetBaseAddress. This avoids needlessly copying the image.

Below is the code for getting at the buffer. It assumes a flat (packed, non-planar) image such as BGRA.

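The snippet itself did not survive the page extraction, so what follows is a reconstruction of what the answer describes rather than its verbatim code: lock the pixel buffer, wrap the base address in an NSData, unlock.

// inside the didOutputSampleBuffer: delegate method
CVImageBufferRef imgBuf = CMSampleBufferGetImageBuffer(sampleBuffer);

// lock the buffer so its base address stays valid while we read it
CVPixelBufferLockBaseAddress(imgBuf, 0);

// pointer to the first pixel
void *imgBufAddr = CVPixelBufferGetBaseAddress(imgBuf);

// total byte count; bytes-per-row accounts for any row padding
size_t length = CVPixelBufferGetBytesPerRow(imgBuf) * CVPixelBufferGetHeight(imgBuf);

// copy the raw pixels out -- this is the payload to send over the socket
NSData *data = [NSData dataWithBytes:imgBufAddr length:length];

// unlock again before the sample buffer goes back to the system
CVPixelBufferUnlockBaseAddress(imgBuf, 0);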
A more efficient approach would be to use an NSMutableData or a buffer pool instead of allocating a fresh NSData for every frame.
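One way to read the NSMutableData suggestion, sketched on the assumption that the frame size stays fixed for the whole session (frameBuffer is an illustrative property, not something from the answer):

// reuse one allocation across frames instead of a fresh NSData per frame
@property (nonatomic, strong) NSMutableData *frameBuffer;

// in the delegate, after locking the pixel buffer:
size_t length = CVPixelBufferGetBytesPerRow(imgBuf) * CVPixelBufferGetHeight(imgBuf);
if (self.frameBuffer.length != length) {
    self.frameBuffer = [NSMutableData dataWithLength:length];
}
memcpy(self.frameBuffer.mutableBytes, CVPixelBufferGetBaseAddress(imgBuf), length);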


Sending a 480x360 image every second requires a 4.1 Mbps connection (assuming 3 color channels).
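(The arithmetic behind that figure: 480 × 360 pixels × 3 bytes × 8 bits per byte = 4,147,200 bits, about 4.1 Mb for a single uncompressed frame, so even one frame per second fills a 4.1 Mbps link.)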

Thanks for the reply. I already use CVPixelBufferGetBaseAddress, but can I treat what it returns as the bytes? Will that alone let me draw the image on the receiving end?

Yes, those are the bytes. There are CVPixelBufferGetHeight() * CVPixelBufferGetBytesPerRow() of them. If you do it right, you can reconstruct the image on the other end.

Is it the same for planar formats, or do I need to use CVPixelBufferGetHeightOfPlane and CVPixelBufferGetBytesPerRowOfPlane?

You should use the planar versions. That saves you from decoding the CVPlanarPixelBufferInfo_YCbCrBiPlanar struct yourself.

Thanks for the reply. CVImageBuffer to NSData works now. One small doubt remains: on the receiver I cannot get the CVImageBuffer back, because that requires the height, width and bytes per row. How do I convert the data back into an image buffer?

This code gives a data length of 3686400; when I NSLog(@"%i", [data length]) it always logs 3686400, even for an image that should only be around 3000.

Using the code above to get the NSData, how do you get back to a CVImageBuffer with CGImageCreate? I do not see the connection. Or is there another way?
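The thread never resolves the receiver side. One plausible route, sketched here on the assumption that the sender transmits the width, height and bytes-per-row along with the pixels (the function and its names are illustrative, not from the discussion), is CVPixelBufferCreateWithBytes, which wraps existing bytes without another copy:

#import <CoreVideo/CoreVideo.h>
#import <Foundation/Foundation.h>

// release callback: balances the CFBridgingRetain taken on the backing NSData below
static void ReleasePixelData(void *releaseRefCon, const void *baseAddress)
{
    CFRelease((CFDataRef)releaseRefCon);
}

// rebuild a pixel buffer from raw BGRA bytes received over the socket
CVPixelBufferRef PixelBufferFromBytes(NSData *data, size_t width, size_t height, size_t bytesPerRow)
{
    CVPixelBufferRef pixelBuffer = NULL;
    CFDataRef retained = (CFDataRef)CFBridgingRetain(data); // keep the bytes alive until released
    CVReturn result = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                   width, height,
                                                   kCVPixelFormatType_32BGRA,
                                                   (void *)[data bytes],
                                                   bytesPerRow,
                                                   ReleasePixelData,
                                                   (void *)retained,
                                                   NULL, // no buffer attributes
                                                   &pixelBuffer);
    return (result == kCVReturnSuccess) ? pixelBuffer : NULL;
}

Sending those three integers in a small header in front of each frame is the simplest way to give the receiver what it needs; for planar formats you would carry the per-plane values from CVPixelBufferGetBytesPerRowOfPlane and CVPixelBufferGetHeightOfPlane instead.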