iOS: How can I stream a video while it is being recorded?
OK, so in my app I have a ViewController that handles recording video from the camera and then saving it to the Documents directory of my app's sandbox. What I would like to do now is simultaneously upload parts of the current file to a server while the video is still being recorded (I'm new to this area, but I'm guessing an HTTP server). The reason I'm doing this is that I want to add support for streaming to a Chromecast while the video is being shot. This should be possible, since another app already performs similar functionality.

I have already worked out how to upload a video to an HTTP server, how to send a video from the HTTP server to the Chromecast, and how to actually record the video, using the following sources:
Chromecast:
HTTP server:
Recording from the iDevice camera:
To cast the video I obviously have to be connected already (the app requires a connection before it lets me into the recording view), so the code that simply plays back a .mp4 video looks like this:
-(void)startCasting
{
    [self establishServer];
    self.mediaControlChannel = [[GCKMediaControlChannel alloc] init];
    self.mediaControlChannel.delegate = self;
    [self.deviceManager addChannel:self.mediaControlChannel];
    [self.mediaControlChannel requestStatus];

    NSString *path = [NSString stringWithFormat:@"http://%@:%hu/%@",
                      [self getIPAddress], [httpServer listeningPort], @"Movie.mp4"];

    self.metadata = [[GCKMediaMetadata alloc] init];
    NSString *image = @""; // Image URL here
    [self.metadata setString:@"" forKey:kGCKMetadataKeySubtitle]; // Description here
    NSString *type = @"video/mp4"; // MIME type
    [self.metadata setString:[NSString stringWithFormat:@"Casting %@", @"Movie.mp4"]
                      forKey:kGCKMetadataKeyTitle]; // Title here

    // Define the media information
    GCKMediaInformation *mediaInformation =
        [[GCKMediaInformation alloc] initWithContentID:path
                                            streamType:GCKMediaStreamTypeNone
                                           contentType:type
                                              metadata:self.metadata
                                        streamDuration:0
                                            customData:nil];

    // Cast the video
    [self.mediaControlChannel loadMedia:mediaInformation autoplay:TRUE playPosition:0];
}
- (NSString *)getIPAddress
{
    // Requires: #include <ifaddrs.h> and #include <arpa/inet.h>
    NSString *address = @"error";
    struct ifaddrs *interfaces = NULL;
    struct ifaddrs *temp_addr = NULL;

    // getifaddrs returns 0 on success
    if (getifaddrs(&interfaces) == 0) {
        // Walk the linked list of interfaces
        temp_addr = interfaces;
        while (temp_addr != NULL) {
            if (temp_addr->ifa_addr->sa_family == AF_INET) {
                // en0 is the Wi-Fi interface on the iPhone
                if ([[NSString stringWithUTF8String:temp_addr->ifa_name] isEqualToString:@"en0"]) {
                    // Convert the C string to an NSString
                    address = [NSString stringWithUTF8String:
                               inet_ntoa(((struct sockaddr_in *)temp_addr->ifa_addr)->sin_addr)];
                }
            }
            temp_addr = temp_addr->ifa_next;
        }
    }
    // Free the interface list
    freeifaddrs(interfaces);
    return address;
}
Now, before casting, I need to get my HTTP server running. That's simple: after adding CocoaHTTPServer to your project only a tiny amount of code is needed. My server start-up code looks like this:
static const int ddLogLevel = LOG_LEVEL_VERBOSE;

-(void)establishServer
{
    [httpServer stop];

    // Configure the logging framework.
    // To keep things simple and fast, just log to the Xcode console.
    [DDLog addLogger:[DDTTYLogger sharedInstance]];

    // Create the server
    httpServer = [[HTTPServer alloc] init];

    // Tell the server to broadcast its presence via Bonjour.
    // This allows browsers such as Safari to automatically discover the service.
    [httpServer setType:@"_http._tcp."];

    // Normally there's no need to run the server on any specific port;
    // technologies like Bonjour allow clients to discover the port at runtime.
    // However, for easy testing you may want to force a certain port:
    // [httpServer setPort:12345];

    // Serve files from the app's Documents folder
    NSString *webPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/"];
    DDLogInfo(@"Setting document root: %@", webPath);
    [httpServer setDocumentRoot:webPath];

    [self startServer];
}
- (void)startServer
{
    // Start the server (and check for problems)
    NSError *error;
    if ([httpServer start:&error])
    {
        DDLogInfo(@"Started HTTP server on port %hu", [httpServer listeningPort]);
    }
    else
    {
        DDLogError(@"Error starting HTTP server: %@", error);
    }
}
Finally, I display and record from the iPhone camera using the following code:
- (void)viewDidLoad
{
    [super viewDidLoad];

    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionFront];
    // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionBack];
    // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1920x1080 cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    videoCamera.horizontallyMirrorFrontFacingCamera = NO;
    videoCamera.horizontallyMirrorRearFacingCamera = NO;

    // filter = [[GPUImageSepiaFilter alloc] init];
    // filter = [[GPUImageTiltShiftFilter alloc] init];
    // [(GPUImageTiltShiftFilter *)filter setTopFocusLevel:0.65];
    // [(GPUImageTiltShiftFilter *)filter setBottomFocusLevel:0.85];
    // [(GPUImageTiltShiftFilter *)filter setBlurSize:1.5];
    // [(GPUImageTiltShiftFilter *)filter setFocusFallOffRate:0.2];
    // filter = [[GPUImageSketchFilter alloc] init];
    filter = [[GPUImageFilter alloc] init];
    // filter = [[GPUImageSmoothToonFilter alloc] init];
    // GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRightFlipVertical];

    [videoCamera addTarget:filter];
    GPUImageView *filterView = (GPUImageView *)self.view;
    // filterView.fillMode = kGPUImageFillModeStretch;
    // filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;

    // Record a movie and store it in /Documents, visible via iTunes file sharing
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mp4"];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
    movieWriter.encodingLiveVideo = YES;
    // movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(640.0, 480.0)];
    // movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(720.0, 1280.0)];
    // movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(1080.0, 1920.0)];

    [filter addTarget:movieWriter];
    [filter addTarget:filterView];

    [videoCamera startCameraCapture];
}
bool recording;

- (IBAction)Record:(id)sender
{
    if (recording == YES)
    {
        Record.titleLabel.text = @"Record";
        recording = NO;

        double delayInSeconds = 0.1;
        dispatch_time_t stopTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
        dispatch_after(stopTime, dispatch_get_main_queue(), ^(void){
            [filter removeTarget:movieWriter];
            videoCamera.audioEncodingTarget = nil;
            [movieWriter finishRecording];
            NSLog(@"Movie completed");
            // [videoCamera.inputCamera lockForConfiguration:nil];
            // [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOff];
            // [videoCamera.inputCamera unlockForConfiguration];
        });

        UIAlertView *message = [[UIAlertView alloc] initWithTitle:@"Do You Wish To Store This Footage?"
                                                          message:@"Recording has finished. Do you wish to store this video in your camera roll?"
                                                         delegate:self
                                                cancelButtonTitle:nil
                                                otherButtonTitles:@"Yes", @"No", nil];
        [message show];
        [self dismissViewControllerAnimated:YES completion:nil];
    }
    else
    {
        double delayToStartRecording = 0.5;
        dispatch_time_t startTime = dispatch_time(DISPATCH_TIME_NOW, delayToStartRecording * NSEC_PER_SEC);
        dispatch_after(startTime, dispatch_get_main_queue(), ^(void){
            NSLog(@"Start recording");
            videoCamera.audioEncodingTarget = movieWriter;
            [movieWriter startRecording];
            // NSError *error = nil;
            // if (![videoCamera.inputCamera lockForConfiguration:&error])
            // {
            //     NSLog(@"Error locking for configuration: %@", error);
            // }
            // [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOn];
            // [videoCamera.inputCamera unlockForConfiguration];
            recording = YES;
            Record.titleLabel.text = @"Stop";
        });
        [self startCasting];
    }
}
Now, as you can probably see, I'm trying to serve the movie that is currently being recorded by pointing the server straight at its path while recording is still in progress. This doesn't work, I believe because the file isn't actually written to that path until the stop button is pressed. How do I fix this? Can anyone help?
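One direction that might work (a sketch only, not something I've tested against this project) is to stop writing one monolithic Movie.mp4 and instead record in short, fixed-length segments: each time a segment finishes it becomes a complete, playable file inside the server's document root, so the Chromecast can fetch segments that already exist while the camera keeps recording into the next one. The segment naming, the `segmentIndex` ivar, and the `rotateSegment` method below are all my own invention for illustration; only the GPUImageMovieWriter calls are real API:

```objectivec
// Hypothetical sketch: rotate GPUImageMovieWriter instances so that every
// few seconds a *complete* .mp4 appears in Documents/, which the embedded
// HTTP server can serve immediately. `segmentIndex` and `rotateSegment`
// are illustrative names, not part of GPUImage. Call rotateSegment from a
// repeating timer while recording.
- (void)rotateSegment
{
    GPUImageMovieWriter *finishedWriter = movieWriter;

    // Start a new writer for the next segment before closing the old one.
    NSString *path = [NSHomeDirectory() stringByAppendingFormat:
                      @"/Documents/segment%ld.mp4", (long)++segmentIndex];
    unlink([path UTF8String]); // AVAssetWriter refuses to overwrite
    movieWriter = [[GPUImageMovieWriter alloc]
                      initWithMovieURL:[NSURL fileURLWithPath:path]
                                  size:CGSizeMake(480.0, 640.0)];
    movieWriter.encodingLiveVideo = YES;

    [filter removeTarget:finishedWriter];
    [filter addTarget:movieWriter];
    videoCamera.audioEncodingTarget = movieWriter;
    [movieWriter startRecording];

    // Once finishRecording completes, the previous segment file is fully
    // written and can be fetched over HTTP while recording continues.
    [finishedWriter finishRecording];
}
```

Serving raw .mp4 segments this way is crude; a playlist-based protocol such as HLS (an .m3u8 index plus media segments) is the standard approach to the same idea, but the Chromecast receiver would then need to be given the HLS playlist URL rather than a single mp4 URL.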
Media types supported by Chromecast:
Comments:

"Use the Android mirroring feature, which allows a camera app to stream live to a Chromecast device."

"@LeonNichols I don't know if you've noticed, but I'm not developing on Android. I'm developing for iOS, and converting that code to Objective-C is not as easy as you might think, especially if the platform supports certain methods that iOS doesn't. If you think the question isn't appropriate, you could write constructive comments instead; that's the benefit of constructive criticism."

"Hi, did you ever find a solution to this? I want to do something similar: I'm able to upload a video shot with GPUImage to my server, but I don't understand how the 'streaming' part works. Would I upload 10 seconds at a time and then splice the clips back together on my server?"