Objective-C: getting individual pixels from touch points


Is it possible to detect every pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points under the finger? How can I tell the difference between the user drawing with their thumb and drawing with their fingertip? I would like to vary the brush depending on how the user touches the screen, and I also want to track every pixel that gets touched.

I am currently using the following code from the GLPaint sample on the Apple developer site:

The sample code draws with a predefined brush size and tracks the x-y coordinates along the way. How can I change the brush depending on how the user touches the screen, and track all the pixels touched over time?

// Draws a line onscreen based on where the user touches
- (void)renderLineFromPoint:(CGPoint)start toPoint:(CGPoint)end
{
    NSLog(@"x:%f   y:%f", start.x, start.y);

    static GLfloat*   vertexBuffer = NULL;
    static NSUInteger vertexMax = 64;
    NSUInteger        vertexCount = 0, count, i;

    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);

    // Convert locations from Points to Pixels
    CGFloat scale = self.contentScaleFactor;
    start.x *= scale;
    start.y *= scale;
    end.x *= scale;
    end.y *= scale;

    // Allocate vertex array buffer
    if (vertexBuffer == NULL)
        vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));

    // Add points to the buffer so there are drawing points every X pixels
    count = MAX(ceilf(sqrtf((end.x - start.x) * (end.x - start.x) + (end.y - start.y) * (end.y - start.y)) / kBrushPixelStep), 1);
    for (i = 0; i < count; ++i) {
        if (vertexCount == vertexMax) {
            vertexMax = 2 * vertexMax;
            vertexBuffer = realloc(vertexBuffer, vertexMax * 2 * sizeof(GLfloat));
        }

        vertexBuffer[2 * vertexCount + 0] = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
        vertexBuffer[2 * vertexCount + 1] = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
        vertexCount += 1;
    }

    // Render the vertex array
    glVertexPointer(2, GL_FLOAT, 0, vertexBuffer);
    glDrawArrays(GL_POINTS, 0, vertexCount);

    // Display the buffer
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}


// Handles the start of a touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect   bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];
    firstTouch = YES;

    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    location = [touch locationInView:self];
    location.y = bounds.size.height - location.y;
}

// Handles the continuation of a touch.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect   bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];

    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    } else {
        location = [touch locationInView:self];
        location.y = bounds.size.height - location.y;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    }

    // Render the stroke
    [self renderLineFromPoint:previousLocation toPoint:location];
}



// Handles the end of a touch event when the touch is a tap.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect   bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
        [self renderLineFromPoint:previousLocation toPoint:location];
    }
}


// Handles the end of a touch event.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // If appropriate, add code necessary to save the state of the application.
    // This application is not saving state.
}
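
Regarding the "track all the pixels touched over time" part: one straightforward approach is to record every interpolated point as the loop in renderLineFromPoint:toPoint: produces it. The helper below is only a sketch; the touchedPoints property and the recordTouchedPointX:y: method are assumptions for illustration, not part of the GLPaint sample.

// Hypothetical addition, not part of the GLPaint sample. Assumes the painting
// view declares a property in which to accumulate points across strokes:
//
//   @property (nonatomic, strong) NSMutableArray *touchedPoints;

- (void)recordTouchedPointX:(CGFloat)x y:(CGFloat)y
{
    if (self.touchedPoints == nil) {
        self.touchedPoints = [NSMutableArray array];
    }
    // Box the CGPoint in an NSValue so it can be stored in an NSArray.
    [self.touchedPoints addObject:[NSValue valueWithCGPoint:CGPointMake(x, y)]];
}

Calling [self recordTouchedPointX:vertexBuffer[2 * vertexCount + 0] y:vertexBuffer[2 * vertexCount + 1]] just before vertexCount is incremented in the loop above would capture every drawing point of the stroke, in pixel coordinates (start and end have already been multiplied by contentScaleFactor at that stage).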
AFAIK there is no API for accessing the touch area of a touch. Given the limitations of capacitive touchscreens, I'm not even sure that what you want is physically possible. I remember a recent presentation at Cocoa Heads demonstrating that some of this information is available on OS X (via a private API) for the trackpad, but not on iOS.

I believe this is one of the reasons graphics tablets use a special stylus with built-in sensor technology.


For a drawing application, a partial workaround might be to simulate "inking" the way some desktop apps do: if the user's touch lingers on a given spot, draw as though ink were flowing out of the "pen" and gradually spreading into the "paper" (a rough sketch of this idea follows below).
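
A minimal sketch of that "ink" idea, assumed to live in the same painting-view class as the code in the question. The dwellPoint/dwellStart ivars and the three constants are invented for illustration; the point is simply to grow the brush while the touch stays roughly in one place.

// Rough sketch only. Hypothetical ivars on the painting view:
//
//   CGPoint        dwellPoint;   // where the touch last "settled"
//   NSTimeInterval dwellStart;   // when it settled there

static const CGFloat kBaseBrushSize      = 16.0f; // assumed base brush size, in pixels
static const CGFloat kDwellTolerance     = 2.0f;  // jitter allowed before the dwell resets, in points
static const CGFloat kInkGrowthPerSecond = 8.0f;  // how fast the simulated ink spreads

- (CGFloat)brushSizeForTouch:(UITouch *)touch
{
    CGPoint p = [touch locationInView:self];
    CGFloat dx = p.x - dwellPoint.x;
    CGFloat dy = p.y - dwellPoint.y;

    if (sqrtf(dx * dx + dy * dy) > kDwellTolerance) {
        // The finger genuinely moved: restart the dwell clock at the new spot.
        dwellPoint = p;
        dwellStart = touch.timestamp;
    }

    // The longer the touch sits (roughly) still, the larger the brush gets,
    // which reads on screen as ink soaking into the paper.
    NSTimeInterval dwell = touch.timestamp - dwellStart;
    return kBaseBrushSize + (CGFloat)dwell * kInkGrowthPerSecond;
}

Calling something like glPointSize([self brushSizeForTouch:touch]) from touchesMoved: before rendering the stroke would then make a stationary touch bleed outward; glPointSize is the OpenGL ES 1.1 call used to size a point-sprite brush of the kind the GLPaint code above draws with.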

As for what the hardware can see: the Broadcomm hardware in the iPad scans the screen at 64 Hz. It does so by successively putting a 400 µs signal on each of the 39 transparent conductive layers that make up the touchscreen electrodes. If your finger moves more than one pixel distance within those 0.015625 seconds (quite likely), the hardware cannot detect it, because it is busy measuring other parts of the screen for further touch events.

This is the same whether you are on iOS or Android. Cheap Android tablets and large screens have reduced scan rates, so their touch events are spaced even further apart.