Objective-C: How can I apply dynamic visual effects to a playing video in iOS?


I want to change the visual effect of a running video dynamically. I am using the GPUImage framework to apply the effects, and I downloaded its sample project. From GPUImage I chose the SimpleVideoFileFilter example. That example runs with a single filter; I modified the code so it now supports 10 filters. My problem: while a video file is playing in the GPUImageView, I select another filter. The effect changes immediately, but the video restarts from the beginning. I want to change the filter on the currently playing video without restarting it.
My code is:

#pragma mark - Play Video with Effects

- (void)getVideo:(NSURL *)url
{
    movieFile = [[GPUImageMovie alloc] initWithURL:url];

    movieFile.runBenchmark = YES;
    movieFile.playAtActualSpeed = YES;
    //    filter = [[GPUImagePixellateFilter alloc] init];


    [movieFile addTarget:filter];

    // Only rotate the video for display, leave orientation the same for recording
    filterView = (GPUImageView *)self.view;
    [filter addTarget:filterView];

    // In addition to displaying to the screen, write out a processed version of the movie to disk
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL1 = [NSURL fileURLWithPath:pathToMovie];

    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL1 size:CGSizeMake(640.0, 480.0)];
    [filter addTarget:movieWriter];

    // Configure this for video from the movie file, where we want to preserve all video frames and audio samples
    movieWriter.shouldPassthroughAudio = YES;
    movieFile.audioEncodingTarget = movieWriter;
    [movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];

    [movieWriter startRecording];
    [movieFile startProcessing];

    [movieWriter setCompletionBlock:^{
        [filter removeTarget:movieWriter];
        [movieWriter finishRecording];
        UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
    }];
}

- (void)event:(UIButton*)sender
{
    [filter removeTarget:filterView];
    UIButton *selectedBtn = sender;
    [movieFile removeTarget:filter];
    switch (selectedBtn.tag)
    {
        case 0:
            filter = [[GPUImageBrightnessFilter alloc] init];
            break;
        case 1:
            filter = [[GPUImageGrayscaleFilter alloc] init];
            break;
        case 2:
            filter = [[GPUImageSketchFilter alloc] init];
            break;
        case 3:
            filter = [[GPUImageToonFilter alloc] init];
            break;
        case 4:
            filter = [[GPUImageMonochromeFilter alloc] init];
            break;
        case 5:
            filter = [[GPUImagePixellateFilter alloc] init];
            break;
        case 6:
            filter = [[GPUImageCrosshatchFilter alloc] init];
            break;
        case 7:
            filter = [[GPUImageVignetteFilter alloc] init];
            break;
        case 8:
            filter = [[GPUImageColorInvertFilter alloc] init];
            break;
        case 9:
            filter = [[GPUImageLevelsFilter alloc] init];
            [(GPUImageLevelsFilter *)filter setRedMin:1.0 gamma:1.0 max:0.0 minOut:0.5 maxOut:0.5];
            break;

        default:
            break;        
    }
    [self getVideo:movieURL];
}

Please help me solve this problem.

I found the answer myself. The solution is:

- (void)event:(UIButton *)sender
{
    // isMoviePlayCompleted = NO;
    if (btnTag != sender.tag)
    {
        btnTag = (int)sender.tag;
        NSLog(@"tag:%d", btnTag);
        [self applyFilter:sender.tag];
    }
}
Applying the filter:

- (void)applyFilter:(NSInteger)tag
{
    [[NSFileManager defaultManager] removeItemAtURL:saveTempUrl error:nil];
    recording = NO;
    switch (tag)
    {
        case 0:
            filter = [[GPUImagePixellateFilter alloc] init];
            [(GPUImagePixellateFilter *)filter setFractionalWidthOfAPixel:0.0];
            break;
        case 1:
            filter = [[GPUImageGrayscaleFilter alloc] init];
            break;
        case 2:
            filter = [[GPUImageSketchFilter alloc] init];
            break;
        case 3:
            filter = [[GPUImageToonFilter alloc] init];
            break;
        case 4:
            filter = [[GPUImageMonochromeFilter alloc] init];
            break;
        case 5:
            filter = [[GPUImageVignetteFilter alloc] init];
            break;
        default:
            break;
    }

    [self getVideo:movieURL];
}
Playing the video with effects:

- (void)getVideo:(NSURL *)url
{
    // Tear down the previous pipeline before building a new one
    [filter removeAllTargets];
    movieFile.audioEncodingTarget = nil;
    [movieWriter cancelRecording];
    [movieFile cancelProcessing];
    [movieWriter finishRecording];
    movieWriter = nil;
    movieFile = nil;
    filterView = nil;

    recording = YES;
    anAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
    movieFile = [[GPUImageMovie alloc] initWithURL:url];
    movieFile.delegate = self;
    movieFile.runBenchmark = NO;
    movieFile.playAtActualSpeed = YES;

    [movieFile addTarget:filter];

    // Only rotate the video for display, leave orientation the same for recording
    filterView = (GPUImageView *)self.view;
    [filter addTarget:filterView];

    // In addition to displaying to the screen, write out a processed version of the movie to disk
    NSString *pathToMovie = [NSTemporaryDirectory() stringByAppendingPathComponent:@"Doc.MOV"];
    NSFileManager *fileTmp = [[NSFileManager alloc] init];
    if ([fileTmp fileExistsAtPath:pathToMovie]) {
        [fileTmp removeItemAtPath:pathToMovie error:nil];
    }
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    saveTempUrl = [NSURL fileURLWithPath:pathToMovie];

    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:saveTempUrl size:size];
    [filter addTarget:movieWriter];
    [movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
    [movieWriter startRecording];
    [movieFile startProcessing];

    [movieWriter setCompletionBlock:^{
        NSLog(@"write completed");
        [filter removeTarget:movieWriter];
        [movieWriter finishRecording];
        movieWriter = nil;
        movieFile = nil;
        filterView = nil;
        recording = NO;
        if (saveFilter)
        {
            saveFilter = NO;
            UISaveVideoAtPathToSavedPhotosAlbum([saveTempUrl path], self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
            shareFilter = YES;
        }
    }];
}

That's it. Now whenever I select a filter, the whole pipeline is recreated, so the memory problem is solved and my app works correctly.

Comments:

Why do you run [self getVideo:movieURL] every time you change a filter setting? Of course that restarts your video. Just change the filter properties, or swap filters without regenerating the movie instance. You can easily pause the movie, change the filter, and restart the movie to change these effects dynamically.

Hi @BradLarson, first of all thanks for your valuable GPUImage framework. I'm actually stuck on this problem; please help me with it. I don't know how to swap the filter dynamically. Please edit the above code and post it as an answer.

@BradLarson how do you pause a running movie? I'm facing the same problem; please post a solution.

@QUserS posted the solution.

Sorry for the late reply, and thanks for your solution. I just tried your code but didn't get what I expected. I need the video to resume whenever the filter is changed; at the moment it starts playing from the beginning each time a filter is applied. Please advise.
// Use this code

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(movieFinished)
                                             name:MPMoviePlayerPlaybackDidFinishNotification
                                           object:videoPlayer];
[videoPlayer play];


- (void)movieFinished
{
    [videoPlayer play];
}


- (void)playTheVideo:(NSURL *)videoURL
{
    NSTimeInterval time = videoPlayer.currentPlaybackTime;
    UIView *parentView = imageViewFiltered; // adjust as needed
    CGRect bounds = parentView.bounds;      // get bounds of parent view
    videoPlayer.view.frame = CGRectInset(bounds, 0, 0);
    videoPlayer.view.autoresizingMask = (UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight);
    [parentView addSubview:videoPlayer.view];
    videoPlayer.contentURL = videoURL;
    [videoPlayer setCurrentPlaybackTime:time]; // resume from the saved position
    [videoPlayer stop];
    NSLog(@"Video player stopped; restarting in this view");
    [videoPlayer play];
    self.showLoading = NO;
}