
Objective-C: Setting a time range on AVAssetReader causes a freeze

Tags: objective-c, ios, avassetreader

So, I'm trying to run a simple calculation over previously recorded audio (from an AVAsset) in order to create a waveform visual. I currently do this by averaging a set of samples, the size of which is determined by dividing the audio file size by the resolution I want for the waveform.

This all works fine, except for one problem... it's too slow. Running on a 3GS, processing an audio file takes about 3% of the time it takes to play it, which is way too slow (for example, a 1-hour audio file takes roughly 2.5 minutes to process). I've tried to optimize the method as much as possible, but it hasn't helped enough. I'll post the code I use to process the file; maybe someone can help with that, but what I really want is a way to process the file without having to look at every single byte. So, say I'm given a resolution of 2,000: I want to access the file and take a sample at each of those 2,000 points. I think this would be a lot quicker, especially for larger files. But the only way I know to get at the raw data is to read the audio file linearly. Any ideas? My processing code is posted further down (note that all class variables begin with '_').
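For reference, the "take one sample at each of N points" idea boils down to computing evenly spaced start times from the asset's duration. A minimal sketch, not the poster's code (the helper name and the resolution parameter are illustrative):

// Illustrative sketch only: compute the start time of the point-th of
// 'resolution' evenly spaced sample windows across an asset.
#import <CoreMedia/CoreMedia.h>

static CMTime StartTimeForPoint(CMTime assetDuration, NSUInteger point, NSUInteger resolution) {
    // Each window begins point/resolution of the way through the asset,
    // expressed in the asset's own timescale.
    int64_t interval = assetDuration.value / (int64_t)resolution;
    return CMTimeMake(interval * (int64_t)point, assetDuration.timescale);
}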

So I've changed this question entirely. I belatedly realized that AVAssetReader has a timeRange property for "seeking", which is exactly what I was looking for (see the original question above). Furthermore, that question has already been asked and answered (I just hadn't found it before), so I don't want to duplicate it. However, I still have a problem. My app freezes for a while, and then eventually crashes, whenever I try to copyNextSampleBuffer. I can't tell what's going on. I don't seem to be caught in any kind of recursive loop; the call simply never returns. Checking the logs shows me the following error:

Exception Type:  00000020
Exception Codes: 0x8badf00d
Highlighted Thread:  0

Application Specific Information:
App[10570] has active assertions beyond permitted time: 
{(
    <SBProcessAssertion: 0xddd9300> identifier: Suspending process: App[10570] permittedBackgroundDuration: 10.000000 reason: suspend owner pid:52 preventSuspend  preventThrottleDownCPU  preventThrottleDownUI 
)}
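For context on the log above: exception code 0x8badf00d is the iOS watchdog, and "has active assertions beyond permitted time" means the process held a background-task assertion past the permitted 10 seconds while being suspended, which is what a hung copyNextSampleBuffer will cause. This is separate from the root cause found at the end of the post, but long-running processing that may continue while the app leaves the foreground is typically wrapped in an expiring background task. A rough sketch, assuming processSampleData is the method shown further down:

// Rough sketch, not the fix for the hang itself: hold a background task while
// processing so the suspension watchdog does not kill the app mid-work.
#import <UIKit/UIKit.h>

- (void)processSampleDataInBackground {
    __block UIBackgroundTaskIdentifier taskID =
        [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
            // Time is up: release the assertion before the watchdog fires.
            [[UIApplication sharedApplication] endBackgroundTask:taskID];
            taskID = UIBackgroundTaskInvalid;
        }];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self processSampleData];                                      // the method shown below
        [[UIApplication sharedApplication] endBackgroundTask:taskID];  // release the assertion
        taskID = UIBackgroundTaskInvalid;
    });
}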
I ran a time profiler on the app, and yes, it just sits there doing minimal processing. I can't figure out what's happening. It's important to note that this does not happen if I don't set the AVAssetReader's timeRange property. I've checked, and the values of the timeRange are valid, but setting it causes the problem for some reason. Here is my processing code:

- (void) processSampleData{
    if (!_asset || CMTimeGetSeconds(_asset.duration) <= 0) return;
    NSError *error = nil;
    AVAssetTrack *songTrack = _asset.tracks.firstObject;
    if (!songTrack) return;
    NSDictionary *outputSettingsDict = [[NSDictionary alloc] initWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kAudioFormatLinearPCM],AVFormatIDKey,
                                        [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                                        [NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
                                        [NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
                                        [NSNumber numberWithBool:NO],AVLinearPCMIsNonInterleaved,
                                        nil];

    UInt32 sampleRate = 44100.0; 
    _channelCount = 1;

    NSArray *formatDesc = songTrack.formatDescriptions;
    for(unsigned int i = 0; i < [formatDesc count]; ++i) {
        CMAudioFormatDescriptionRef item = (__bridge_retained CMAudioFormatDescriptionRef)[formatDesc objectAtIndex:i];
        const AudioStreamBasicDescription* fmtDesc = CMAudioFormatDescriptionGetStreamBasicDescription (item);
        if(fmtDesc ) { 
            sampleRate = fmtDesc->mSampleRate;
            _channelCount = fmtDesc->mChannelsPerFrame;
        }
        CFRelease(item);
    }

    UInt32 bytesPerSample = 2 * _channelCount; //Bytes are hard coded by AVLinearPCMBitDepthKey
    _normalizedMax = 0;
    _sampledData = [[NSMutableData alloc] init];

    SInt16 *channels[_channelCount];
    char *sampleRef;
    SInt16 *samples;
    NSInteger sampleTally = 0;
    SInt16 cTotal;
    _sampleCount = DefaultSampleSize * [UIScreen mainScreen].scale;
    NSTimeInterval intervalBetweenSamples = _asset.duration.value / _sampleCount;
    NSTimeInterval sampleSize = fmax(100, intervalBetweenSamples / _sampleCount);
    double assetTimeScale = _asset.duration.timescale;
    CMTimeRange timeRange = CMTimeRangeMake(CMTimeMake(0, assetTimeScale), CMTimeMake(sampleSize, assetTimeScale));

    SInt16 totals[_channelCount];
    @autoreleasepool {
        for (int i = 0; i < _sampleCount; i++) {
            AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:_asset error:&error];
            AVAssetReaderTrackOutput *trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:songTrack outputSettings:outputSettingsDict];
            [reader addOutput:trackOutput];
            reader.timeRange = timeRange;
            [reader startReading];
            while (reader.status == AVAssetReaderStatusReading) {
                CMSampleBufferRef sampleBufferRef = [trackOutput copyNextSampleBuffer];
                if (sampleBufferRef){
                    CMBlockBufferRef blockBufferRef = CMSampleBufferGetDataBuffer(sampleBufferRef);
                    size_t length = CMBlockBufferGetDataLength(blockBufferRef);
                    int sampleCount = length / bytesPerSample;
                    for (int i = 0; i < sampleCount ; i += _channelCount) {
                        CMBlockBufferAccessDataBytes(blockBufferRef, i * bytesPerSample, _channelCount, channels, &sampleRef);
                        samples = (SInt16 *)sampleRef;
                        for (int channel = 0; channel < _channelCount; channel++)
                            totals[channel] += samples[channel];
                        sampleTally++;
                    }
                    CMSampleBufferInvalidate(sampleBufferRef);
                    CFRelease(sampleBufferRef);
                }
            }
            for (int i = 0; i < _channelCount; i++){
                cTotal = abs(totals[i] / sampleTally);
                if (cTotal > _normalizedMax) _normalizedMax = cTotal;
                [_sampledData appendBytes:&cTotal length:sizeof(cTotal)];
                totals[i] = 0;
            }
            sampleTally = 0;
            timeRange.start = CMTimeMake((intervalBetweenSamples * (i + 1)) - sampleSize, assetTimeScale); //Take the sample just before the interval
        }

    }
    _assetNeedsProcessing = NO;
}

I finally figured out the cause. Apparently there is some sort of "minimum" duration you can specify for AVAssetReader's timeRange. I'm not sure exactly what that minimum is; it's somewhere between 1,000 and 5,000. It's possible the minimum changes with the duration of the asset... honestly, I'm not sure. Instead, I kept the duration the same (infinite) and simply changed the start time.
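A minimal sketch of that workaround, assuming the same _asset, track, and output settings as in the listing above (the method name and the cancel-after-one-buffer detail are illustrative, not the poster's exact final code):

// Minimal sketch of the workaround: leave the timeRange duration unbounded
// and only move 'start' for each window.
- (CMSampleBufferRef)copyBufferAtTime:(CMTime)start
                             forTrack:(AVAssetTrack *)track
                       outputSettings:(NSDictionary *)outputSettingsDict {
    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:_asset error:&error];
    if (!reader) return NULL;

    AVAssetReaderTrackOutput *trackOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track
                                                   outputSettings:outputSettingsDict];
    [reader addOutput:trackOutput];

    // Duration stays the same (positive infinity) for every window; only the
    // start changes between reads, which avoids the too-short-duration hang.
    reader.timeRange = CMTimeRangeMake(start, kCMTimePositiveInfinity);
    [reader startReading];

    CMSampleBufferRef buffer = [trackOutput copyNextSampleBuffer]; // first buffer at 'start'
    [reader cancelReading];  // only the window beginning at 'start' is needed
    return buffer;           // caller releases with CFRelease()
}

A fresh reader is still created per window, since an AVAssetReader's timeRange cannot be changed after startReading has been called.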