iOS: UITouch points incorrect after CGAffineTransformTranslate and CGAffineTransformScale
I am building a simple drawing app with UIKit, based on ideas shared in a tutorial. The difference is that I need a feature that lets the user zoom/scale into the drawing to draw finer lines. I am able to zoom the UIImage with CGAffineTransformScale (driven by a UIPinchGestureRecognizer, of course) and move the UIImage around with CGAffineTransformTranslate. The problem is that once the image is zoomed, the detected UITouch points have a huge offset from the actual touch points, and the offset grows as I keep scaling the image.

In the code:

drawingImage - the image the user interacts with
savingImage - stores the drawn lines
transform_translate - a CGAffineTransform
lastScale - a CGFloat storing the previous scale value
lastPoint - a CGPoint storing the last touch point
lastPointForPinch - a CGPoint storing the last pinch point

The pinch gesture is set up in viewDidLoad as:
pinchGestureRecognizer = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinchGestureDetected:)];
[self.drawingImage addGestureRecognizer:pinchGestureRecognizer];
The handler for the pinch gesture is:
- (void)pinchGestureDetected:(UIPinchGestureRecognizer *)recognizer
{
    if ([recognizer state] == UIGestureRecognizerStateBegan) {
        // Reset the last scale; necessary if there are multiple objects with different scales
        lastScale = [recognizer scale];
        lastPointForPinch = [recognizer locationInView:self.drawingImage];
    }
    if ([recognizer state] == UIGestureRecognizerStateBegan ||
        [recognizer state] == UIGestureRecognizerStateChanged) {
        CGFloat currentScale = [[[recognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];
        // Constants to adjust the max/min values of zoom
        const CGFloat kMaxScale = 2.0;
        const CGFloat kMinScale = 1.0;
        CGFloat newScale = 1 - (lastScale - [recognizer scale]);
        newScale = MIN(newScale, kMaxScale / currentScale);
        newScale = MAX(newScale, kMinScale / currentScale);
        CGAffineTransform transform = CGAffineTransformScale([[recognizer view] transform], newScale, newScale);
        self.savingImage.transform = transform;
        self.drawingImage.transform = transform;
        lastScale = [recognizer scale]; // Store the previous scale factor for the next pinch gesture call
        CGPoint point = [recognizer locationInView:self.drawingImage];
        transform_translate = CGAffineTransformTranslate([[recognizer view] transform], point.x - lastPointForPinch.x, point.y - lastPointForPinch.y);
        self.savingImage.transform = transform_translate;
        self.drawingImage.transform = transform_translate;
        lastPointForPinch = [recognizer locationInView:self.drawingImage];
    }
}
The methods that draw the lines are included below for reference (this is fairly standard code taken from the tutorial mentioned above; I am posting it in case I made a mistake in it).
I have tried applying CGPointApplyAffineTransform(point, transform_translate), but the huge offset is still there.
I hope my question is explained clearly and someone can help me; I have been struggling to make progress on this. Thanks in advance.

I finally found the solution - a silly mistake made over and over: locationInView needs to be called on self.view, not on the image.

@davidkonard thanks for the suggestion - indeed, I had not realized that (in the context of a drawing app) the user touches the screen with the intent of drawing at exactly that point, so even when the UIImageView has been moved, the user still expects the point/line/whatever to appear right under their finger. So locationInView should use self.view (and in my case self.view is never transformed). I hope this explains why I made the mistake and how I came up with the fix.

You should post the solution as an answer with a code example - from an "I found the solution" comment, others who come here later cannot learn anything.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = NO;
    UITouch *touch = [touches anyObject];
    lastPoint = [touch locationInView:self.drawingImage];
    UIGraphicsBeginImageContext(self.savingImage.frame.size);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.drawingImage];
    CGContextRef ctxt = UIGraphicsGetCurrentContext();
    CGContextMoveToPoint(ctxt, lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(ctxt, currentPoint.x, currentPoint.y);
    CGContextSetLineCap(ctxt, kCGLineCapRound);
    CGContextSetLineWidth(ctxt, brush);
    CGContextSetRGBStrokeColor(ctxt, red, green, blue, opacity);
    CGContextSetBlendMode(ctxt, kCGBlendModeNormal);
    CGContextSetShouldAntialias(ctxt, YES);
    CGContextSetAllowsAntialiasing(ctxt, YES);
    CGContextStrokePath(ctxt);
    self.drawingImage.image = UIGraphicsGetImageFromCurrentImageContext();
    lastPoint = currentPoint;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!mouseSwiped) {
        UIGraphicsEndImageContext();
        UITouch *touch = [touches anyObject];
        CGPoint currentPoint = [touch locationInView:self.drawingImage];
        UIGraphicsBeginImageContext(self.drawingImage.frame.size);
        CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
        CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
        CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), red, green, blue, opacity);
        CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
        CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
        CGContextStrokePath(UIGraphicsGetCurrentContext());
        self.drawingImage.image = UIGraphicsGetImageFromCurrentImageContext();
        [self.drawingImage.image drawInRect:CGRectMake(0, 0, self.drawingImage.frame.size.width, self.drawingImage.frame.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
    }
    UIGraphicsEndImageContext();
    UIGraphicsBeginImageContext(self.savingImage.frame.size);
    [self.savingImage.image drawInRect:CGRectMake(0, 0, self.savingImage.frame.size.width, self.savingImage.frame.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
    [self.drawingImage.image drawInRect:CGRectMake(0, 0, self.drawingImage.frame.size.width, self.drawingImage.frame.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
    self.savingImage.image = UIGraphicsGetImageFromCurrentImageContext();
    self.drawingImage.image = nil;
    UIGraphicsEndImageContext();
}