iOS: why does generating frames with AVAssetImageGenerator crash the app after about 40 frames?

I am trying to use generateCGImagesAsynchronouslyForTimes: to extract 2 frames per second from a video, but my app crashes. I am monitoring memory usage and it never goes above 14 MB.

The code is as follows:
- (void)createImagesFromVideoURL:(NSURL *)videoUrl atFPS:(int)reqiuredFPS completionBlock:(void (^)(NSMutableArray *frames, CGSize frameSize))block
{
    NSMutableArray *requiredFrames = [[NSMutableArray alloc] init];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.requestedTimeToleranceAfter = kCMTimeZero;
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.appliesPreferredTrackTransform = YES;

    // Grab one frame synchronously just to learn the frame size.
    UIImage *sampleGeneratedImage;
    for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * reqiuredFPS; i++)
    {
        CMTime time = CMTimeMake(i, reqiuredFPS);
        NSError *err;
        CMTime actualTime;
        CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
        if (!err)
        {
            sampleGeneratedImage = [[UIImage alloc] initWithCGImage:image];
            CGImageRelease(image); // copyCGImageAtTime: returns a +1 reference; release it to avoid a leak
            break;
        }
    }

    // Get the maximum size from that 1 frame
    generator.maximumSize = [self getMultipleOf16AspectRatioForCurrentFrameSize:sampleGeneratedImage.size];

    NSMutableArray *requestedFrameTimes = [[NSMutableArray alloc] init];
    for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * reqiuredFPS; i++)
    {
        CMTime time = CMTimeMake(i, reqiuredFPS);
        [requestedFrameTimes addObject:[NSValue valueWithCMTime:time]];
    }

    [generator generateCGImagesAsynchronouslyForTimes:[requestedFrameTimes copy] completionHandler:^(CMTime requestedTime, CGImageRef _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
        if (image)
        {
            // The handler does not own this CGImageRef, so no release here;
            // but every frame is retained as a UIImage in requiredFrames.
            UIImage *generatedImage = [UIImage imageWithCGImage:image];
            [requiredFrames addObject:generatedImage];
        }
        if (CMTimeCompare(requestedTime, [[requestedFrameTimes lastObject] CMTimeValue]) == 0)
        {
            NSLog(@"Image processing complete");
            dispatch_async(dispatch_get_main_queue(), ^{
                block(requiredFrames, generator.maximumSize);
            });
        }
        else
        {
            NSLog(@"Getting frame at %lld", actualTime.value / actualTime.timescale);
        }
    }];
}
Comments:

What crash are you getting, and which line causes it? It does not crash for me when extracting frame 33 or 34, and there are no errors.

This doesn't answer your question, but if you call copyCGImageAtTime: you should release the CGImage. Also, you cannot retain all of these images/frames without running out of memory; make sure you keep only a few at a time.

But I only call copyCGImageAtTime: once, to get the frame size. For the remaining frames, generateCGImagesAsynchronouslyForTimes: is used.