
How can I produce an effect similar to the iOS 7 blur view?


I'm trying to replicate this blurred background from Apple's publicly released iOS 7 example screen:

Applying a CIFilter to the content below has been suggested, but that's a completely different approach. It's obvious that iOS 7 doesn't capture the contents of the views underneath, for many reasons:

  • In some rough testing, capturing a screenshot of the views below and applying a CIGaussianBlur filter with a radius large enough to mimic iOS 7's blur style takes 1-2 seconds, even in the simulator (a sketch of that test follows this list).
  • The iOS 7 blur view is able to blur over dynamic content, such as video or animations, with no noticeable lag.

Can anyone hypothesize what frameworks Apple could be using to create this effect, and whether it's possible to create a similar effect with the current public APIs?
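
For reference, the kind of test described in the first point looks roughly like this (a sketch only; snapshot is an illustrative UIImage of the captured view hierarchy):

    #import <CoreImage/CoreImage.h>

    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *input = [CIImage imageWithCGImage:snapshot.CGImage];
    CIFilter *gaussian = [CIFilter filterWithName:@"CIGaussianBlur"];
    [gaussian setValue:input forKey:kCIInputImageKey];
    [gaussian setValue:@10 forKey:kCIInputRadiusKey];
    // createCGImage:fromRect: forces the render; this is the slow step being timed.
    CGImageRef cgOut = [context createCGImage:gaussian.outputImage fromRect:input.extent];
    UIImage *blurred = [UIImage imageWithCGImage:cgOut];
    CGImageRelease(cgOut);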

    Edit: (from the comments) We don't know exactly how Apple is doing it, but can we make some basic assumptions? We can assume they are using hardware, right?

    Is the effect self-contained in each view, such that the effect doesn't actually know what's behind it? Or must the content behind the blur be taken into account, based on how blurs work?


    If the content behind the effect is relevant, can we assume Apple is receiving a "feed" of the content below and continually rendering it with a blur?

    Actually, I'd wager this would be fairly simple to achieve. It probably wouldn't operate or look exactly like what Apple has going on, but it could be very close.

    First of all, you'd need to determine the CGRect of the UIView you'll be presenting. Once you've determined that, you just need to grab an image of that part of the UI so it can be blurred. Something like this:

    - (UIImage *)getBlurredImage {
        // You will want to calculate this in code based on the view you will be presenting.
        CGSize size = CGSizeMake(200, 200);

        UIGraphicsBeginImageContext(size);
        // view is the view you are grabbing the screenshot of, i.e. the view to be blurred.
        [view drawViewHierarchyInRect:(CGRect){CGPointZero, size} afterScreenUpdates:YES];
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        // Gaussian blur
        image = [image applyLightEffect];

        // Box blur
        // image = [image boxblurImageWithBlur:0.2f];

        return image;
    }
    
    Gaussian blur (recommended): using the Apple-provided UIImage+ImageEffects category, you'll get a Gaussian blur that looks very much like the blur in iOS 7.

    Box blur: you could also use a box blur with the boxBlurImageWithBlur: UIImage category below. It is based on an algorithm you can find here:

    #import <Accelerate/Accelerate.h>

    @implementation UIImage (Blur)

    - (UIImage *)boxblurImageWithBlur:(CGFloat)blur {
        if (blur < 0.f || blur > 1.f) {
            blur = 0.5f;
        }
        // Map the 0..1 blur amount to an odd box-kernel size.
        int boxSize = (int)(blur * 50);
        boxSize = boxSize - (boxSize % 2) + 1;

        CGImageRef img = self.CGImage;

        vImage_Buffer inBuffer, outBuffer;
        vImage_Error error;
        void *pixelBuffer;

        // Wrap the source bitmap in a vImage buffer.
        CGDataProviderRef inProvider = CGImageGetDataProvider(img);
        CFDataRef inBitmapData = CGDataProviderCopyData(inProvider);

        inBuffer.width = CGImageGetWidth(img);
        inBuffer.height = CGImageGetHeight(img);
        inBuffer.rowBytes = CGImageGetBytesPerRow(img);
        inBuffer.data = (void *)CFDataGetBytePtr(inBitmapData);

        pixelBuffer = malloc(CGImageGetBytesPerRow(img) * CGImageGetHeight(img));
        if (pixelBuffer == NULL) {
            NSLog(@"No pixelbuffer");
        }

        outBuffer.data = pixelBuffer;
        outBuffer.width = CGImageGetWidth(img);
        outBuffer.height = CGImageGetHeight(img);
        outBuffer.rowBytes = CGImageGetBytesPerRow(img);

        error = vImageBoxConvolve_ARGB8888(&inBuffer, &outBuffer, NULL, 0, 0,
                                           boxSize, boxSize, NULL, kvImageEdgeExtend);
        if (error) {
            NSLog(@"JFDepthView: error from convolution %ld", error);
        }

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(outBuffer.data,
                                                 outBuffer.width,
                                                 outBuffer.height,
                                                 8,
                                                 outBuffer.rowBytes,
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipLast);
        CGImageRef imageRef = CGBitmapContextCreateImage(ctx);
        UIImage *returnImage = [UIImage imageWithCGImage:imageRef];

        // Clean up
        CGContextRelease(ctx);
        CGColorSpaceRelease(colorSpace);
        free(pixelBuffer);
        CFRelease(inBitmapData);
        CGImageRelease(imageRef);

        return returnImage;
    }

    @end
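
    For example (snapshot is an illustrative UIImage captured from the view hierarchy, as in getBlurredImage above):

        UIImage *blurred = [snapshot boxblurImageWithBlur:0.2f]; // 0.0-1.0 controls the kernel size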
    
    Now that you're calculating the region of the screen to be blurred, passing it into the blur category, and receiving back a blurred UIImage, all that's left is to set that blurred image as the background of the view you're presenting. As I said, this won't be a perfect match for what Apple is doing, but it should still look pretty cool.
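
    A minimal sketch of that last step (presentedView is just an illustrative name for the view being shown):

        // Put the blurred snapshot behind the presented content.
        UIImageView *background = [[UIImageView alloc] initWithImage:[self getBlurredImage]];
        background.frame = presentedView.bounds;
        [presentedView insertSubview:background atIndex:0];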


    Hope that helps.

    Rumor has it that Apple engineers claimed that, to make this performant, they read directly out of the GPU buffer, which raises security issues, and that is why there is no public API for it yet.

    Core Background implements the desired iOS 7 effect.


    Disclaimer: I am the author of this project.

    The code Apple released at WWDC as a category on UIImage includes this functionality. If you have a developer account, you can grab the UIImage category (and the rest of the sample code) through this link: browse to session 226 and click on details. I haven't played with it yet, but I expect the effect to be a lot slower on iOS 6; iOS 7 added enhancements that make grabbing the initial screenshot used as input to the blur much faster.


    Direct link:

    Here's a really easy way of doing it:

    Just copy the layer of a UIToolbar and you're done; AMBlurView does it for you. Okay, it's not as blurry as Control Center, but it's blurry enough.


    Keep in mind that iOS 7 is under NDA.

    Why bother replicating the effect? Just draw a UIToolbar behind your view:

    myView.backgroundColor = [UIColor clearColor];
    UIToolbar* bgToolbar = [[UIToolbar alloc] initWithFrame:myView.frame];
    bgToolbar.barStyle = UIBarStyleDefault;
    [myView.superview insertSubview:bgToolbar belowSubview:myView];
    

    You can find your solution in Apple's demo on this page: look for and download the UIImageEffects sample code.

    Then use @Jeremy Fox's code. I changed it to:

    - (UIImage*)getDarkBlurredImageWithTargetView:(UIView *)targetView
    {
        CGSize size = targetView.frame.size;
    
        UIGraphicsBeginImageContext(size);
        CGContextRef c = UIGraphicsGetCurrentContext();
        CGContextTranslateCTM(c, 0, 0);
        [targetView.layer renderInContext:c]; // targetView is the view you are grabbing the screenshot of, i.e. the view to be blurred.
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return [image applyDarkEffect];
    }
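
    A hypothetical call site (backgroundImageView is an illustrative name, not part of the original answer):

        UIImage *blurred = [self getDarkBlurredImageWithTargetView:self.view];
        self.backgroundImageView.image = blurred;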
    

    Hope this helps you.

    You can try my custom view, which can blur its background. It does this by faking it: it snapshots the view behind it and blurs that image.
    
        //Screen capture.
        UIGraphicsBeginImageContext(self.view.bounds.size);
    
        CGContextRef c = UIGraphicsGetCurrentContext();
        CGContextTranslateCTM(c, 0, 0);
        [self.view.layer renderInContext:c];
    
        UIImage* viewImage = UIGraphicsGetImageFromCurrentImageContext();
        viewImage = [viewImage applyLightEffect];
    
        UIGraphicsEndImageContext();
    
        //.h FILE
        #import <UIKit/UIKit.h>
    
        @interface UIImage (ImageEffects)
    
       - (UIImage *)applyLightEffect;
       - (UIImage *)applyExtraLightEffect;
       - (UIImage *)applyDarkEffect;
       - (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;
    
       - (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius tintColor:(UIColor *)tintColor saturationDeltaFactor:(CGFloat)saturationDeltaFactor maskImage:(UIImage *)maskImage;
    
       @end
    
        //.m FILE
        #import "cGaussianEffect.h"
        #import <Accelerate/Accelerate.h>
        #import <float.h>
    
    
         @implementation UIImage (ImageEffects)
    
    
        - (UIImage *)applyLightEffect
        {
            UIColor *tintColor = [UIColor colorWithWhite:1.0 alpha:0.3];
            return [self applyBlurWithRadius:1 tintColor:tintColor saturationDeltaFactor:1.8 maskImage:nil];
        }
    
    
        - (UIImage *)applyExtraLightEffect
        {
            UIColor *tintColor = [UIColor colorWithWhite:0.97 alpha:0.82];
            return [self applyBlurWithRadius:1 tintColor:tintColor saturationDeltaFactor:1.8 maskImage:nil];
        }
    
    
        - (UIImage *)applyDarkEffect
        {
            UIColor *tintColor = [UIColor colorWithWhite:0.11 alpha:0.73];
            return [self applyBlurWithRadius:1 tintColor:tintColor saturationDeltaFactor:1.8 maskImage:nil];
        }
    
    
        - (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor
        {
            const CGFloat EffectColorAlpha = 0.6;
            UIColor *effectColor = tintColor;
            int componentCount = CGColorGetNumberOfComponents(tintColor.CGColor);
            if (componentCount == 2) {
                CGFloat b;
                if ([tintColor getWhite:&b alpha:NULL]) {
                    effectColor = [UIColor colorWithWhite:b alpha:EffectColorAlpha];
                }
            }
            else {
                CGFloat r, g, b;
                if ([tintColor getRed:&r green:&g blue:&b alpha:NULL]) {
                    effectColor = [UIColor colorWithRed:r green:g blue:b alpha:EffectColorAlpha];
                }
            }
            return [self applyBlurWithRadius:10 tintColor:effectColor saturationDeltaFactor:-1.0 maskImage:nil];
        }
    
    
        - (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius tintColor:(UIColor *)tintColor saturationDeltaFactor:(CGFloat)saturationDeltaFactor maskImage:(UIImage *)maskImage
        {
            if (self.size.width < 1 || self.size.height < 1) {
                NSLog (@"*** error: invalid size: (%.2f x %.2f). Both dimensions must be >= 1: %@", self.size.width, self.size.height, self);
                return nil;
            }
            if (!self.CGImage) {
                NSLog (@"*** error: image must be backed by a CGImage: %@", self);
                return nil;
            }
            if (maskImage && !maskImage.CGImage) {
                NSLog (@"*** error: maskImage must be backed by a CGImage: %@", maskImage);
                return nil;
            }
    
            CGRect imageRect = { CGPointZero, self.size };
            UIImage *effectImage = self;
    
            BOOL hasBlur = blurRadius > __FLT_EPSILON__;
            BOOL hasSaturationChange = fabs(saturationDeltaFactor - 1.) > __FLT_EPSILON__;
            if (hasBlur || hasSaturationChange) {
                UIGraphicsBeginImageContextWithOptions(self.size, NO, [[UIScreen mainScreen] scale]);
                CGContextRef effectInContext = UIGraphicsGetCurrentContext();
                CGContextScaleCTM(effectInContext, 1.0, -1.0);
                CGContextTranslateCTM(effectInContext, 0, -self.size.height);
                CGContextDrawImage(effectInContext, imageRect, self.CGImage);
    
                vImage_Buffer effectInBuffer;
                effectInBuffer.data     = CGBitmapContextGetData(effectInContext);
                effectInBuffer.width    = CGBitmapContextGetWidth(effectInContext);
                effectInBuffer.height   = CGBitmapContextGetHeight(effectInContext);
                effectInBuffer.rowBytes = CGBitmapContextGetBytesPerRow(effectInContext);
    
                UIGraphicsBeginImageContextWithOptions(self.size, NO, [[UIScreen mainScreen] scale]);
                CGContextRef effectOutContext = UIGraphicsGetCurrentContext();
                vImage_Buffer effectOutBuffer;
                effectOutBuffer.data     = CGBitmapContextGetData(effectOutContext);
                effectOutBuffer.width    = CGBitmapContextGetWidth(effectOutContext);
                effectOutBuffer.height   = CGBitmapContextGetHeight(effectOutContext);
                effectOutBuffer.rowBytes = CGBitmapContextGetBytesPerRow(effectOutContext);
    
                if (hasBlur) {
                    CGFloat inputRadius = blurRadius * [[UIScreen mainScreen] scale];
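                    // A three-pass box blur approximates a Gaussian; convert the
                    // Gaussian radius into the equivalent box-kernel size (forced odd below).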
                    NSUInteger radius = floor(inputRadius * 3. * sqrt(2 * M_PI) / 4 + 0.5);
                    if (radius % 2 != 1) {
                        radius += 1;
                    }
                    vImageBoxConvolve_ARGB8888(&effectInBuffer, &effectOutBuffer, NULL, 0, 0, radius, radius, 0, kvImageEdgeExtend);
                    vImageBoxConvolve_ARGB8888(&effectOutBuffer, &effectInBuffer, NULL, 0, 0, radius, radius, 0, kvImageEdgeExtend);
                    vImageBoxConvolve_ARGB8888(&effectInBuffer, &effectOutBuffer, NULL, 0, 0, radius, radius, 0, kvImageEdgeExtend);
                }
                BOOL effectImageBuffersAreSwapped = NO;
                if (hasSaturationChange) {
                    CGFloat s = saturationDeltaFactor;
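                    // Saturation matrix built from the Rec. 709 luma coefficients
                    // (0.2126 R, 0.7152 G, 0.0722 B), blended toward identity by s.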
                    CGFloat floatingPointSaturationMatrix[] = {
                        0.0722 + 0.9278 * s,  0.0722 - 0.0722 * s,  0.0722 - 0.0722 * s,  0,
                        0.7152 - 0.7152 * s,  0.7152 + 0.2848 * s,  0.7152 - 0.7152 * s,  0,
                        0.2126 - 0.2126 * s,  0.2126 - 0.2126 * s,  0.2126 + 0.7873 * s,  0,
                                      0,                    0,                    0,  1,
                    };
                    const int32_t divisor = 256;
                    NSUInteger matrixSize = sizeof(floatingPointSaturationMatrix)/sizeof(floatingPointSaturationMatrix[0]);
                    int16_t saturationMatrix[matrixSize];
                    for (NSUInteger i = 0; i < matrixSize; ++i) {
                        saturationMatrix[i] = (int16_t)roundf(floatingPointSaturationMatrix[i] * divisor);
                    }
                    if (hasBlur) {
                        vImageMatrixMultiply_ARGB8888(&effectOutBuffer, &effectInBuffer, saturationMatrix, divisor, NULL, NULL, kvImageNoFlags);
                        effectImageBuffersAreSwapped = YES;
                    }
                    else {
                        vImageMatrixMultiply_ARGB8888(&effectInBuffer, &effectOutBuffer, saturationMatrix, divisor, NULL, NULL, kvImageNoFlags);
                    }
                }
                if (!effectImageBuffersAreSwapped)
                    effectImage = UIGraphicsGetImageFromCurrentImageContext();
                UIGraphicsEndImageContext();
    
                if (effectImageBuffersAreSwapped)
                    effectImage = UIGraphicsGetImageFromCurrentImageContext();
                UIGraphicsEndImageContext();
            }
    
            UIGraphicsBeginImageContextWithOptions(self.size, NO, [[UIScreen mainScreen] scale]);
            CGContextRef outputContext = UIGraphicsGetCurrentContext();
            CGContextScaleCTM(outputContext, 1.0, -1.0);
            CGContextTranslateCTM(outputContext, 0, -self.size.height);
    
            CGContextDrawImage(outputContext, imageRect, self.CGImage);
    
            if (hasBlur) {
                CGContextSaveGState(outputContext);
                if (maskImage) {
                    CGContextClipToMask(outputContext, imageRect, maskImage.CGImage);
                }
                CGContextDrawImage(outputContext, imageRect, effectImage.CGImage);
                CGContextRestoreGState(outputContext);
            }
    
            if (tintColor) {
                CGContextSaveGState(outputContext);
                CGContextSetFillColorWithColor(outputContext, tintColor.CGColor);
                CGContextFillRect(outputContext, imageRect);
                CGContextRestoreGState(outputContext);
            }
    
            UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
    
            return outputImage;
        }

        @end
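
    A hedged example of calling the workhorse method directly (snapshot stands in for any screen capture from the snippets above; the tint and saturation values mirror applyLightEffect, with a larger radius):

        UIImage *custom = [snapshot applyBlurWithRadius:20
                                              tintColor:[UIColor colorWithWhite:1.0 alpha:0.3]
                                  saturationDeltaFactor:1.8
                                              maskImage:nil];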
    
    #import "UIImage+Utils.h"
    
    #import "GPUImagePicture.h"
    #import "GPUImageSolidColorGenerator.h"
    #import "GPUImageAlphaBlendFilter.h"
    #import "GPUImageBoxBlurFilter.h"
    
    @implementation UIImage (Utils)
    
    - (UIImage*) GPUBlurredImage
    {
        GPUImagePicture *source =[[GPUImagePicture alloc] initWithImage:self];
    
        CGSize size = CGSizeMake(self.size.width * self.scale, self.size.height * self.scale);
    
        GPUImageBoxBlurFilter *blur = [[GPUImageBoxBlurFilter alloc] init];
        [blur setBlurRadiusInPixels:4.0f];
        [blur setBlurPasses:2];
        [blur forceProcessingAtSize:size];
        [source addTarget:blur];
    
        GPUImageSolidColorGenerator * white = [[GPUImageSolidColorGenerator alloc] init];
    
        [white setColorRed:1.0f green:1.0f blue:1.0f alpha:0.1f];
        [white forceProcessingAtSize:size];
    
        GPUImageAlphaBlendFilter * blend = [[GPUImageAlphaBlendFilter alloc] init];
        blend.mix = 0.9f;
    
        [blur addTarget:blend];
        [white addTarget:blend];
    
        [blend forceProcessingAtSize:size];
        [source processImage];
    
        return [blend imageFromCurrentlyProcessedOutput];
    }
    
    @end
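
    A hypothetical call site (assuming GPUImage is linked; backgroundImageView is an illustrative outlet):

        // Snapshot the current screen, then blur it on the GPU.
        UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
        [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:NO];
        UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        self.backgroundImageView.image = [snapshot GPUBlurredImage];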