How can I dynamically change pixel colors in the iPhone camera preview window?
I am using UIImagePickerController to take pictures on the iPhone. I would like to adjust the photo on the fly. It seems I can use UIImagePickerController to adjust the shape of the photo on the fly, but I can't find a way to change its colors on the fly, for example, converting all colors to black and white.
Thanks

You can overlay a view on top of the image and change its blend mode to produce the black-and-white effect.
Check out Apple's demo, particularly the blend-mode examples in that demo. Another approach is to use AVFoundation to transform each frame. I don't have much experience with this, but the WWDC 2010 video "Session 409 - Using the Camera with AVFoundation" and its sample projects should help you work it out.
That is, of course, if you can use iOS 4 classes. The best approach is to use an AVCaptureSession object. I am doing exactly what you describe in my free app "Live Effects Cam".
There are several code samples online that will also help you implement this. Here is a sample snippet that may help:
- (void) activateCameraFeed
{
    videoSettings = nil;

#if USE_32BGRA
    pixelFormatCode = [[NSNumber alloc] initWithUnsignedInt:(unsigned int)kCVPixelFormatType_32BGRA];
    pixelFormatKey = [[NSString alloc] initWithString:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    videoSettings = [[NSDictionary alloc] initWithObjectsAndKeys:pixelFormatCode, pixelFormatKey, nil];
#endif

    videoDataOutputQueue = dispatch_queue_create("com.jellyfilledstudios.ImageCaptureQueue", NULL);

    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureVideoOutput setAlwaysDiscardsLateVideoFrames:YES];
    [captureVideoOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    [captureVideoOutput setVideoSettings:videoSettings];
    [captureVideoOutput setMinFrameDuration:kCMTimeZero];

    dispatch_release(videoDataOutputQueue); // AVCaptureVideoDataOutput uses dispatch_retain() & dispatch_release(), so we can dispatch_release() our reference now

    if ( useFrontCamera )
    {
        currentCameraDeviceIndex = frontCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationLeftMirrored;
    }
    else
    {
        currentCameraDeviceIndex = backCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationRight;
    }

    selectedCamera = [[AVCaptureDevice devices] objectAtIndex:(NSUInteger)currentCameraDeviceIndex];

    captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:selectedCamera error:nil];

    captureSession = [[AVCaptureSession alloc] init];

    [captureSession beginConfiguration];

    [self setCaptureConfiguration];

    [captureSession addInput:captureVideoInput];
    [captureSession addOutput:captureVideoOutput];

    [captureSession commitConfiguration];

    [captureSession startRunning];
}
// AVCaptureVideoDataOutputSampleBufferDelegate
// AVCaptureAudioDataOutputSampleBufferDelegate
//
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    if ( captureOutput == captureVideoOutput )
    {
        [self performImageCaptureFrom:sampleBuffer fromConnection:connection];
    }

    [pool drain];
}
- (void) performImageCaptureFrom:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer;

    if ( CMSampleBufferGetNumSamples(sampleBuffer) != 1 )
        return;
    if ( !CMSampleBufferIsValid(sampleBuffer) )
        return;
    if ( !CMSampleBufferDataIsReady(sampleBuffer) )
        return;

    imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if ( CVPixelBufferGetPixelFormatType(imageBuffer) != kCVPixelFormatType_32BGRA )
        return;

    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t bufferSize = bytesPerRow * height;

    uint8_t *tempAddress = malloc( bufferSize );
    memcpy( tempAddress, baseAddress, bufferSize );

    baseAddress = tempAddress;

    //
    // Apply effects to the pixels stored in (uint32_t *)baseAddress
    //
    // example: grayScale( (uint32_t *)baseAddress, width, height );
    // example: sepia( (uint32_t *)baseAddress, width, height );
    //

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = nil;

    if ( cameraDeviceSetting != CameraDeviceSetting640x480 )        // not an iPhone 4 or iTouch 5th gen
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);
    else
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    CGImageRef newImage = CGBitmapContextCreateImage( newContext );
    CGColorSpaceRelease( colorSpace );
    CGContextRelease( newContext );

    free( tempAddress );
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    if ( newImage == nil )
    {
        return;
    }

    // To display the CGImageRef newImage in your UI you need to hand it off
    // like this, because this delegate runs on a background queue.
    // (newCameraImageNotification: should CGImageRelease() the image when it is done with it.)
    //
    [self performSelectorOnMainThread:@selector(newCameraImageNotification:) withObject:(id)newImage waitUntilDone:YES];
}
Tried your Live Effects Cam and it looks great; it has a lot of the features I want to implement. Nice work! I'm just surprised it's free.

Thanks. At $0.99 I was getting fewer than 50 downloads a day; for free I average over 1,500 downloads a day. I released an update that offers the most-requested new features as in-app purchases. I'd recommend a free app with in-app purchases to anyone developing a new app today.