
Javascript: How to record and save video using HTML5 WebRTC

Tags: javascript, angular, html, angular6, webrtc

Run the code snippet first, then read the description... it will give you the structure.

I want to record, play and save the video in the second <video> element. The problem I am facing is: the stream is running in the first <video> element, but I am not able to record and save the video.

.video {
  border: 1px solid gray;
  box-shadow: 3px 4px lightgray;
}

[Snippet markup: a "Welcome to WebRTC" heading, the two <video> elements, and Start / Stop / Play buttons]
You are not "really" recording the stream, you have only copied the stream object, not the event data coming from the stream.

Use the stream and pass it as a constructor argument. Get the video blobs from the ondataavailable event handler. Join the array of recorded Blobs into a new Blob. From there you can get a URL with createObjectURL(blob).

The following snippet is pseudo code:

**TypeScript does not recognize 'MediaRecorder', so you will have to find a way to give MediaRecorder an any type.
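(Aside, not from the original answer: until proper typings are installed, one minimal way to satisfy the compiler is an ambient declaration; the @types package mentioned further down is the cleaner fix.)

// hypothetical typings.d.ts entry: loosely types the browser-global constructor
declare var MediaRecorder: any;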


mediaRecorder: any;
recordedBlobs: Blob[];
downloadUrl: string;

handleDataAvailable(event) {
    if (event.data && event.data.size > 0) {
      this.recordedBlobs.push(event.data);
    }
}

handleStop(event) {
    console.log('Recorder stopped: ', event);
    const videoBuffer = new Blob(this.recordedBlobs, {type: 'video/webm'});
    this.downloadUrl = window.URL.createObjectURL(videoBuffer); // you can download with <a> tag
    this.recordVideoElement.src = this.downloadUrl;
}

startRecording(stream) {
    let options = {mimeType: 'video/webm'};
    this.recordedBlobs = [];
    try {
        this.mediaRecorder = new MediaRecorder(stream, options);
    } catch (e0) {
        console.log('Try different mimeType');
    }
    console.log('Created MediaRecorder', this.mediaRecorder, 'with options', options);
    this.mediaRecorder.onstop = this.handleStop;
    this.mediaRecorder.ondataavailable = this.handleDataAvailable;
    this.mediaRecorder.start(100); // collect 100ms of data
    console.log('MediaRecorder started', this.mediaRecorder);
}

stopRecording() {
  this.mediaRecorder.stop();
  console.log('Recorded Blobs: ', this.recordedBlobs);
  this.recordVideoElement.controls = true;
}

playRecording() {
  if (!this.recordedBlobs.length) {
      console.log('cannot play.');
      return;
  }
  this.recordVideoElement.play();
}

async ngOnInit() {
  navigator.mediaDevices.getUserMedia({ video: { width: 360 } }).then(stream => {
    this.videoElement.srcObject = stream
    this.startRecording(stream);
  })
}
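The "// you can download with <a> tag" comment above refers to the usual object-URL download pattern. A minimal sketch (method and file name are illustrative, not from the answer):

downloadRecording() {
    // point a temporary anchor at the Blob URL and click it so the browser saves the file
    const a = document.createElement('a');
    a.href = this.downloadUrl;        // set in handleStop() via URL.createObjectURL
    a.download = 'recording.webm';    // hypothetical file name
    document.body.appendChild(a);
    a.click();
    document.body.removeChild(a);
}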
Complete working code to record video in Angular 6

RecordComponent.ts
import { Component, ElementRef, OnInit, ViewChild } from '@angular/core'

// The decorator metadata below is assumed for completeness; the original answer
// only showed the class body.
@Component({
  selector: 'app-record',
  templateUrl: './record.component.html'
})
export class RecordComponent implements OnInit {

  @ViewChild('recordedVideo') recordVideoElementRef: ElementRef
  @ViewChild('video') videoElementRef: ElementRef

  videoElement: HTMLVideoElement
  recordVideoElement: HTMLVideoElement
  mediaRecorder: MediaRecorder
  recordedBlobs: Blob[]
  isRecording: boolean = false
  downloadUrl: string
  stream: MediaStream

  constructor() {
  }

  async ngOnInit() {
    this.videoElement = this.videoElementRef.nativeElement
    this.recordVideoElement = this.recordVideoElementRef.nativeElement

    navigator.mediaDevices.getUserMedia({
      video: {
        width: 360
      }
    }).then(stream => {
      this.stream = stream
      this.videoElement.srcObject = this.stream
    })
  }

  startRecording() {
    this.recordedBlobs = []
    let options: MediaRecorderOptions = { mimeType: 'video/webm' }

    try {
      this.mediaRecorder = new MediaRecorder(this.stream, options)
    } catch (err) {
      console.log(err)
    }

    this.mediaRecorder.start() // without a timeslice argument the Blob is delivered on stop()
    this.isRecording = !this.isRecording
    this.onDataAvailableEvent()
    this.onStopRecordingEvent()
  }

  stopRecording() {
    this.mediaRecorder.stop()
    this.isRecording = !this.isRecording
    console.log('Recorded Blobs: ', this.recordedBlobs)
  }

  playRecording() {
    if (!this.recordedBlobs || !this.recordedBlobs.length) {
      console.log('cannot play.')
      return
    }
    this.recordVideoElement.play()
  }

  onDataAvailableEvent() {
    try {
      this.mediaRecorder.ondataavailable = (event: BlobEvent) => {
        if (event.data && event.data.size > 0) {
          this.recordedBlobs.push(event.data)
        }
      }
    } catch (error) {
      console.log(error)
    }
  }

  onStopRecordingEvent() {
    try {
      this.mediaRecorder.onstop = (event: Event) => {
        const videoBuffer = new Blob(this.recordedBlobs, { type: 'video/webm' })
        this.downloadUrl = window.URL.createObjectURL(videoBuffer) // you can download with <a> tag
        this.recordVideoElement.src = this.downloadUrl
      }
    } catch (error) {
      console.log(error)
    }
  }

}

And the HTML template:

<div style="text-align:center">
    <h1>Welcome to WebRTC</h1>
    <video class="video" #video autoplay controls></video>
    <span class="m-1"></span>
    <video class="video" style="width:360 !important;" controls #recordedVideo></video>
    <br>
    <button class="btn btn-primary btn-lg" *ngIf="!isRecording" (click)="startRecording()">Start Recording</button>
    <button class="btn btn-warning btn-lg" *ngIf="isRecording" (click)="stopRecording()">Stop Recording</button>
  </div>

If TypeScript cannot find the MediaRecorder types (MediaRecorder, MediaRecorderOptions, BlobEvent, etc.), then do this:

npm i @types/dom-mediacapture-record

Be sure to update your Chrome browser.
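As an aside (not from the original answer): where MediaRecorder is available, its static isTypeSupported() method lets you probe a container/codec before constructing the recorder, instead of relying only on the try/catch fallback used above. A minimal sketch inside startRecording():

// probe a few webm variants and fall back to the browser default if none match
const candidates = ['video/webm;codecs=vp9', 'video/webm;codecs=vp8', 'video/webm'];
const supported = candidates.find(t => MediaRecorder.isTypeSupported(t));
this.mediaRecorder = supported
  ? new MediaRecorder(this.stream, { mimeType: supported })
  : new MediaRecorder(this.stream);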


Have a nice day.

Thanks for your reply. I am trying to implement the solution you provided, but the problem I am facing is that on every event call of the MediaStream the objects I created come back null; for example in the handleDataAvailable event, this.recordedBlobs, which I declared in the class constructor as this.recordedBlobs = new Array(), is undefined.

handleDataAvailable pushes the blobs coming from mediaRecorder.ondataavailable into recordedBlobs. recordedBlobs should not be undefined: we assign it only once with this.recordedBlobs = new Array(), and that assignment should not happen inside handleDataAvailable. Could you add the new code you tried to the bottom of the question?

@WasiF Added the console.log I wrote; it shows the log.

I guess you tried to push the data into an array; make sure it actually exists. Did you assign this.recordedBlobs = new Array() before the data event?

Can you tell me how to record the screen? I have tried facingMode / mediaSource: 'screen', but it did not work.

This does not work in mobile browsers. Any suggestions?

@ChhaiyaHarshad Sorry, I do not have experience with that.

Can anyone help me?
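For what it is worth, a likely cause of the "recordedBlobs is undefined" symptom discussed above (my reading, not confirmed in the thread) is that assigning a class method such as this.handleDataAvailable directly as a handler loses the component's this: inside the callback, this is the MediaRecorder, so this.recordedBlobs is undefined. Binding the handlers, or using arrow functions as the complete working code does, keeps the component instance:

// either bind the methods once when wiring the recorder ...
this.mediaRecorder.ondataavailable = this.handleDataAvailable.bind(this);
this.mediaRecorder.onstop = this.handleStop.bind(this);

// ... or wrap them in arrow functions so `this` stays the component
this.mediaRecorder.ondataavailable = (event: BlobEvent) => this.handleDataAvailable(event);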