iPhone: draggable UIImageView, partly transparent & irregularly shaped

.m file

@interface UIDraggableImageView : UIImageView {

}
The code for dragging the image view was taken from the web

Problem: the image is an irregular shape with transparent areas, and tapping a transparent area also drags it.

Desired solution: how to make the transparent areas non-interactive / non-draggable.

As one attempt I will try masking the image and will post the results, but any workaround or suggestion is welcome.

Following MiRAGe's suggestion: I tried merging the code into a single class file, since the image property is already available on UIImageView and it is easier in Interface Builder to attach the class to any UIImageView and experiment. The problem remains, though: the transparent areas are still draggable, and the hitTest method is called multiple times for a single tap. Any suggestions?

- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event {
// Retrieve the touch point
CGPoint point = [[touches anyObject] locationInView:self];
startLocation = point;
[[self superview] bringSubviewToFront:self];
}

- (void)touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event {
// Move relative to the original touch point
CGPoint point = [[touches anyObject] locationInView:self];
CGRect frame = [self frame];
frame.origin.x += point.x - startLocation.x;
frame.origin.y += point.y - startLocation.y;
[self setFrame:frame];
}

You can easily detect the alpha region and make it non-draggable. Here is some code that lets you detect the alpha areas. It may carry some overhead for you, but it's the best I could do.

I subclassed UIImageView and put this code in the implementation file:

#import "UIImageViewDraggable.h"

@implementation UIImageViewDraggable

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
// Retrieve the touch point
CGPoint point = [[touches anyObject] locationInView:self];
startLocation = point;
[[self superview] bringSubviewToFront:self];
}

- (void)touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event {
// Move relative to the original touch point
CGPoint point = [[touches anyObject] locationInView:self];
CGRect frame = [self frame];
frame.origin.x += point.x - startLocation.x;
frame.origin.y += point.y - startLocation.y;
[self setFrame:frame];
}

- (NSData *)alphaData {
// Build an alpha-only bitmap: one byte per pixel, so bytesPerRow == pixelsWide
CGContextRef    cgctx = NULL;
void *          bitmapData;
size_t          bitmapByteCount;

size_t pixelsWide = CGImageGetWidth(self.image.CGImage);
size_t pixelsHigh = CGImageGetHeight(self.image.CGImage);

bitmapByteCount     = (pixelsWide * pixelsHigh);

bitmapData = malloc( bitmapByteCount );
if (bitmapData == NULL) 
    return nil;

cgctx = CGBitmapContextCreate (bitmapData,
                               pixelsWide,
                               pixelsHigh,
                               8,
                               pixelsWide,
                               NULL,
                               kCGImageAlphaOnly);
if (cgctx == NULL) {
    free (bitmapData);
    fprintf (stderr, "Context not created!");

    return nil;
}

CGRect rect = {{0,0},{pixelsWide,pixelsHigh}}; 
CGContextDrawImage(cgctx, rect, self.image.CGImage); 

unsigned char *data = CGBitmapContextGetData(cgctx);

CGContextRelease(cgctx);

if (!data) {
    free(bitmapData);
    return nil;
}

size_t dataSize = pixelsWide * pixelsHigh;

NSData *alphaData = [NSData dataWithBytes:data length:dataSize];

free(bitmapData);
return alphaData;
}    

- (BOOL)isTransparentLocation:(CGPoint)point withData:(NSData *)data {
if (data == nil)
    NSLog(@"data was nil");

// Note: this assumes the touch point maps 1:1 onto image pixels,
// i.e. the bitmap width equals [self.image size].width (no Retina scaling).
NSUInteger index = point.x + (point.y * [self.image size].width);
unsigned char *rawDataBytes = (unsigned char *)[data bytes];

return (rawDataBytes[index] == 0);
}

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
NSLog(@"test");
NSAutoreleasePool *pool = [NSAutoreleasePool new];

// view responding to the hit test. note that self may respond too.
UIView *anyViewResponding = [super hitTest:point withEvent:event];  
if( anyViewResponding == nil || anyViewResponding == self ) {
    // convert the point in the image, to a global point.
    CGPoint framePoint = [self.superview convertPoint:point fromView:self];
    // if the point is in the image frame, and there is an image, see if we need to let the touch through or not
    if(self.image != nil && CGRectContainsPoint([self frame], framePoint)) {
        NSData *imageData = [self alphaData];         

        // check if the point touched is transparent in the image
        if( imageData != nil && [self isTransparentLocation:point withData:imageData]) {               
            // return nil, so the touch will not arrive at this view
            anyViewResponding = nil;
        }
    }
}

[pool drain];
return anyViewResponding;
}
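The alpha lookup above boils down to simple index arithmetic on a one-byte-per-pixel buffer. A minimal plain-C sketch of that check (assuming, as the code above does, that bytesPerRow equals the pixel width and that the point is already in image pixel coordinates; the function name is illustrative):

```c
#include <stddef.h>

// Returns 1 if the pixel at (x, y) is fully transparent in an
// alpha-only bitmap laid out like the one alphaData builds:
// one byte per pixel, bytesPerRow == width.
static int is_transparent(const unsigned char *alpha,
                          size_t width, size_t height,
                          size_t x, size_t y) {
    if (x >= width || y >= height)
        return 1; // treat points outside the image as transparent
    return alpha[x + y * width] == 0;
}
```

For example, in a 4x2 buffer, the pixel at (2, 1) lives at index 2 + 1 * 4 = 6. Guarding the bounds first also avoids the out-of-range read that the Objective-C version above would perform if the touch point fell outside the bitmap.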

You can perform hit detection to determine whether the CGPoint representing the tap gesture lies within the shape defined by a CGPath.

The bad news is that building a path for a complex shape is tedious. The good news is that, since you're dealing with finger taps, a rough outline is probably sufficient. This also lets the touch target extend beyond the opaque parts of the image in case you need some extra room for usability, and it allows transparency inside the touch target, so you can use more complex images if necessary.


You can get the boundary points you need quite easily by just plotting them out in an image editor.

Anyway, can I merge all of the above code into a single class and link the UIImageView to that class? I made an attempt; the code is in my original question.
#import <CoreGraphics/CoreGraphics.h>

- (NSData *)alphaData
{
    // Build an alpha-only bitmap: one byte per pixel, so bytesPerRow == pixelsWide
    CGContextRef    cgctx = NULL;
    void *          bitmapData;
    size_t          bitmapByteCount;

    size_t pixelsWide = CGImageGetWidth(self.CGImage);
    size_t pixelsHigh = CGImageGetHeight(self.CGImage);

    bitmapByteCount     = (pixelsWide * pixelsHigh);

    bitmapData = malloc( bitmapByteCount );
    if (bitmapData == NULL) 
        return nil;

    cgctx = CGBitmapContextCreate (bitmapData,
                                   pixelsWide,
                                   pixelsHigh,
                                   8,
                                   pixelsWide,
                                   NULL,
                                   kCGImageAlphaOnly);
    if (cgctx == NULL)
    {
        free (bitmapData);
        fprintf (stderr, "Context not created!");

        return nil;
    }

    CGRect rect = {{0,0},{pixelsWide,pixelsHigh}}; 
    CGContextDrawImage(cgctx, rect, self.CGImage); 

    unsigned char *data = CGBitmapContextGetData(cgctx);

    CGContextRelease(cgctx);

    if (!data)
    {
        free(bitmapData);
        return nil;
    }

    size_t dataSize = pixelsWide * pixelsHigh;

    NSData *alphaData = [NSData dataWithBytes:data length:dataSize];

    free(bitmapData);
    return alphaData;
}    

- (BOOL)isTransparentLocation:(CGPoint)point withData:(NSData *)data
{   
    if (data == nil)
        NSLog(@"data was nil");

    NSUInteger index = point.x + (point.y * [self size].width);
    unsigned char *rawDataBytes = (unsigned char *)[data bytes];

    return (rawDataBytes[index] == 0);
}
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    NSAutoreleasePool *pool = [NSAutoreleasePool new];

    // view responding to the hit test. note that self may respond too.
    UIView *anyViewResponding = [super hitTest:point withEvent:event];  
    if( anyViewResponding == nil || anyViewResponding == self )
    {
        // convert the point in the image, to a global point.
        CGPoint framePoint = [self.superview convertPoint:point fromView:self];
        // if the point is in the image frame, and there is an image, see if we need to let the touch through or not
        if( self.image != nil && CGRectContainsPoint([self frame], framePoint) )
        {
            NSData *imageData = [self.image alphaData];         

            // check if the point touched is transparent in the image
            if( imageData != nil && [self.image isTransparentLocation:point withData:imageData] )
            {               
                // return nil, so the touch will not arrive at this view
                anyViewResponding = nil;
            }
        }
    }

    [pool drain];
    return anyViewResponding;
}
- (id)initWithFrame:(CGRect)frame {
    ...
    outline = CGPathCreateMutable(); // outline: a CGMutablePathRef ivar, released in dealloc
    CGPathMoveToPoint(outline, NULL, 20, 20);
    // Build up path
    ...
}

- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    if (CGPathContainsPoint(outline, NULL, point, false)) {
        ...
        dragIsRespected = YES;
    }
}

- (void)touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event {
    if (dragIsRespected) {
        ...
    }
}

- (void)dealloc {
    CGPathRelease(outline);
    ...
}
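CGPathContainsPoint does the containment test for you; for a rough polygonal outline like the one built above, the underlying even-odd test can be illustrated in plain C (the polygon arrays and function name here are illustrative, not part of the original code):

```c
#include <stddef.h>

// Even-odd (ray-crossing) test: cast a horizontal ray from (px, py)
// and count how many polygon edges it crosses. An odd count means
// the point is inside the polygon given by vertices (xs[i], ys[i]).
static int point_in_polygon(const double *xs, const double *ys, size_t n,
                            double px, double py) {
    int inside = 0;
    for (size_t i = 0, j = n - 1; i < n; j = i++) {
        if (((ys[i] > py) != (ys[j] > py)) &&
            (px < (xs[j] - xs[i]) * (py - ys[i]) / (ys[j] - ys[i]) + xs[i]))
            inside = !inside;
    }
    return inside;
}
```

For a square outline with corners (0,0), (10,0), (10,10), (0,10), a tap at (5,5) is inside and one at (15,5) is not; a drag handler would set its `dragIsRespected` flag only in the first case.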