iOS 8 iPad AVCaptureMovieFileOutput drops / loses / never gets the audio track after 13-14 seconds of recording


I have the following code, which works on iOS 6 and 7.x.

In iOS 8.1 I am seeing a strange problem: if you capture a session for roughly 13 seconds or longer, the resulting AVAsset has only one track (video); the audio track is simply not there.

If you record for a shorter time, the AVAsset has the expected two tracks (video and audio). I have plenty of disk space, and the app has permission to use the camera and the microphone.

I created a new project with minimal code that reproduces the issue.

Any ideas would be greatly appreciated.

#import "ViewController.h"

@interface ViewController () <AVCaptureFileOutputRecordingDelegate>

@end

@implementation ViewController
{
    enum RecordingState { Recording, Stopped };
    enum RecordingState recordingState;

    AVCaptureSession *session;
    AVCaptureMovieFileOutput *output;
    AVPlayer *player;
    AVPlayerLayer *playerLayer;
    bool audioGranted;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    [self setupAV];
    recordingState = Stopped;
}

-(void)setupAV
{
    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    AVCaptureDevice *videoDevice = nil;

    for ( AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] ) {
        if ( device.position == AVCaptureDevicePositionBack ) {
            videoDevice = device;
            break;
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (videoDevice && audioDevice)
    {
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
        [session addInput:input];

        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
        [session addInput:audioInput];

        NSURL *recordURL = [self tempUrlForRecording];
        [[NSFileManager defaultManager] removeItemAtURL:recordURL error:nil];

        output= [[AVCaptureMovieFileOutput alloc] init];
        output.maxRecordedDuration = CMTimeMake(45, 1);
        output.maxRecordedFileSize = 1028 * 1028 * 1000;
        [session addOutput:output];
    }
    [session commitConfiguration];
}

- (IBAction)recordingButtonClicked:(id)sender {
    if(recordingState == Stopped)
    {
        [self startRecording];
    }
    else
    {
        [self stopRecording];
    }
}

-(void)startRecording
{
    recordingState = Recording;
    [session startRunning];
    [output startRecordingToOutputFileURL:[self tempUrlForRecording] recordingDelegate:self];

}

-(void)stopRecording
{
    recordingState = Stopped;
    [output stopRecording];
    [session stopRunning];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    AVAsset *cameraInput = [AVAsset assetWithURL:[self tempUrlForRecording]];
    //DEPENDING ON HOW LONG RECORDED THIS DIFFERS (<14 SECS - 2 Tracks, >14 SECS - 1 Track)
    NSLog(@"Number of tracks: %i", cameraInput.tracks.count);
}

-(NSURL *)tempUrlForRecording
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];

    NSString *path = @"camerabuffer.mp4";
    NSString *pathCameraInput =[documentsDirectoryPath stringByAppendingPathComponent: path];
    NSURL *urlCameraInput = [NSURL fileURLWithPath:pathCameraInput];

    return urlCameraInput;
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end

This will help you fix it:

[movieOutput setMovieFragmentInterval:kCMTimeInvalid];

I think this is a bug. The documentation says that the sample table is not written if the recording does not complete successfully, which implies that it should be written automatically when the recording does complete successfully. That no longer seems to be the case.


Any ideas why?
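
For reference, here is a minimal sketch of where that line could go in the setupAV method from the question (reusing the question's session and output variables; an illustration, not a guaranteed drop-in fix):

output = [[AVCaptureMovieFileOutput alloc] init];
// Disable movie fragment writing; with the default 10-second fragment
// interval, iOS 8 recordings longer than ~13 seconds can come back
// without an audio track.
output.movieFragmentInterval = kCMTimeInvalid;
[session addOutput:output];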

I had this problem, and here is how I solved it in Swift 4:

  • Do not set movieFileOutput.maxRecordedDuration. There appears to be a bug where, if you set it, any video you record that is longer than 12-13 seconds will have no audio.

  • Instead, use a timer to stop the recording, and set movieFragmentInterval like this:

movieFileOutput.movieFragmentInterval = CMTime.invalid

Here is a complete piece of code showing how I did it:

var seconds = 20
var timer = Timer()
var movieFileOutput = AVCaptureMovieFileOutput()

func startRecording(){
    movieFileOutput.movieFragmentInterval = CMTime.invalid
    movieFileOutput.startRecording(to: URL(fileURLWithPath: getVideoFileLocation()), recordingDelegate: self)
    startTimer()
}

func stopRecording(){
    movieFileOutput.stopRecording()
    timer.invalidate()
}

func startTimer(){
    timer = Timer.scheduledTimer(timeInterval: 1, target: self, selector: (#selector(updateTimer)), userInfo: nil, repeats: true)
}

@objc func updateTimer(){
    seconds -= 1
    if(seconds == 0){
        stopRecording()
    }
}

func getVideoFileLocation() -> String {
    return NSTemporaryDirectory().appending("myrecording.mp4")
}


extension FTVideoReviewViewController : AVCaptureFileOutputRecordingDelegate{
    public func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        print("Finished recording: \(outputFileURL)")
        // do stuff here when recording is finished
    }
}

I should also mention that no error is reported (error is nil) in didFinishRecordingToOutputFileAtURL, and that I tried setting the fragment interval to be longer than the recording that will take place, i.e. CMTime fragmentInterval = CMTimeMake(5, 1); [movieOutput setMovieFragmentInterval:fragmentInterval]; but I am sure I should not need this.

What happens if you don't use maxRecordedDuration and stop the recording manually after 45 seconds?

I had the same problem. I found that if you transcode the stream with ffmpeg and explicitly set the volume (i.e. ffmpeg -i movie.mp4 -vol 256 movie2.mp4), you get the sound back.

Wow, that worked. I have been struggling with this bug. For others' reference, I did not have a max duration or size set.

I did have a max duration and size. I was previously using movieOutput.movieFragmentInterval = CMTime(value: 2, timescale: 1) and would occasionally get audio but no video. Setting it to kCMTimeInvalid solved 40% of my problems; my videos are 8 seconds long, so fragments are not needed.

Hi @atlex2, could you show me a demo? I am not seeing this problem. I would be happy to share a sample offline.

This bug was part of my trouble too: self.movieOutput!.movieFragmentInterval = kCMTimeInvalid (in Swift).
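
For anyone verifying one of these workarounds, here is a small diagnostic sketch (the helper name is mine, not from the thread) that checks whether a finished recording actually contains an audio track, which is the symptom discussed above. It could be called from the captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: callback in place of the plain track-count log in the question:

- (BOOL)recordingHasAudioTrack:(NSURL *)fileURL
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
    // A healthy recording has one video track and one audio track;
    // the bug shows up as zero audio tracks on longer recordings.
    NSLog(@"video tracks: %lu, audio tracks: %lu",
          (unsigned long)videoTracks.count, (unsigned long)audioTracks.count);
    return audioTracks.count > 0;
}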