
Javascript: How do I convert an ArrayBuffer stream to video?


I am generating a video stream from the webcam on my Raspberry Pi with ffmpeg, using the following command:

ffmpeg -nostats -loglevel level+info -f v4l2 -video_size 640x480 -i /dev/video0 -f mpegts -codec:v mpeg1video -s 640x480 -b:v 500k -bf 0 http://192.168.10.6:3030/stream
My API server receives the stream data and forwards it to my (React) client via socket.io. I can receive the data on the client, but it arrives as an ArrayBuffer; I don't know how to convert that data so the actual video shows up on a canvas. Any ideas?
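Since the ffmpeg command outputs -f mpegts with mpeg1video, the chunks arriving at the client are MPEG transport-stream packets, not image files, which is why treating them as a JPEG cannot work. A quick way to confirm what you are actually receiving is to check the TS sync byte: every transport packet is 188 bytes long and begins with 0x47. A minimal sketch (note this naive check assumes a chunk starts on a packet boundary, which HTTP chunking does not guarantee):

```typescript
// Sanity-check that a received chunk looks like MPEG-TS:
// every transport packet is 188 bytes and starts with the sync byte 0x47.
function looksLikeMpegTs(chunk: Uint8Array): boolean {
  if (chunk.length < 188) return false
  // Check the sync byte at the start of each complete 188-byte packet.
  for (let offset = 0; offset + 188 <= chunk.length; offset += 188) {
    if (chunk[offset] !== 0x47) return false
  }
  return true
}
```

If this returns true for your chunks, the data is a transport stream and needs a real MPEG decoder on the client rather than an image conversion.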

The API route that handles the stream:

import logger from '@src/startup/logging'
import { Router } from 'express'
import { io } from '@src/index'

const stream = Router()

stream.post('/', (req, res) => {
  logger.info('received')

  req.on('data', (chunk: Buffer) => {
    logger.info(chunk.length)
    io.sockets.emit('data', chunk)
  })

  req.on('end', () => {
    logger.info('Stream Terminated')
    res.end()
  })

  // res.status(200).send('video')
})

export default stream
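One thing to watch in this route: because no encoding is set on the request, Node emits each chunk as a Buffer, not a string, so the original `chunk: string` annotation is misleading. socket.io forwards a Buffer as binary, which is why the browser sees an ArrayBuffer. If you ever need to hand over the raw bytes yourself, remember that a Buffer is a Uint8Array view that may sit inside a larger shared pool, so its underlying ArrayBuffer can contain extra bytes. A copying helper (a sketch; the helper name is my own) avoids leaking pool bytes:

```typescript
// A Node Buffer is a Uint8Array view into a (possibly shared) memory pool,
// so chunk.buffer can contain more bytes than the chunk itself.
// Copying the view yields an ArrayBuffer holding exactly the chunk's bytes.
function toExactArrayBuffer(chunk: Uint8Array) {
  const copy = new Uint8Array(chunk.byteLength)
  copy.set(chunk) // copies only the chunk's own bytes, not the pool
  return copy.buffer
}
```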
This is an example of the data the client receives (console.log):

Comments from the thread:

"I did something similar. The frames were MPEG images sent one after the other, so I built an image for each of them, just as if I were reading the data from files."
"How do I build the image?"
"First work out how you are receiving the image data. Can you elaborate? Once you know how the images are sent and received, the rest is straightforward."
"Thanks, I've added more detail to my question!"

The client page that is supposed to turn the data into an image (not working):
import { useEffect, useState } from 'react'
import socketIoClient from './socketIoClient'

export default function App(): JSX.Element {
  const [srcBlob, setSrcBlob] = useState<string | null>(null)

  useEffect(() => {
    const handleSocketIo = async () => {
      const socket = await socketIoClient()

      socket.on('greeting', (data: string) => {
        console.log(data)
      })

      socket.on('data', (data: ArrayBuffer) => {
        console.log(data)
        // I just copied this from another example, but it does not display anything
        const arrayBufferView = new Uint8Array(data)
        const blob = new Blob([arrayBufferView], { type: 'image/jpeg' })
        const urlCreator = window.URL || window.webkitURL
        setSrcBlob(urlCreator.createObjectURL(blob))
      })

      socket.emit('greeting', 'hello from client')
    }

    handleSocketIo()
  }, [])

  if (!srcBlob) {
    return <div>Loading...</div>
  }

  return <img src={srcBlob} />
}
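The Blob-as-image/jpeg approach above cannot display anything because the bytes are MPEG-TS, not a JPEG. Following the approach from the comments (one image per frame), one option is to have ffmpeg emit JPEG frames instead, e.g. with -f mjpeg -codec:v mjpeg (an alternative command, not the one used above), and split the received byte stream on the JPEG start/end markers before feeding each frame into the existing Blob code. A sketch of such a splitter (naive: the FF D9 sequence can also occur inside entropy-coded data, so a robust version would parse the JPEG segment lengths):

```typescript
// Split a concatenated MJPEG byte stream into individual JPEG frames.
// A JPEG begins with the SOI marker FF D8 and ends with the EOI marker FF D9.
function splitJpegFrames(data: Uint8Array): Uint8Array[] {
  const frames: Uint8Array[] = []
  let start = -1
  for (let i = 0; i + 1 < data.length; i++) {
    if (start < 0 && data[i] === 0xff && data[i + 1] === 0xd8) {
      start = i // found SOI: a frame begins here
    } else if (start >= 0 && data[i] === 0xff && data[i + 1] === 0xd9) {
      frames.push(data.subarray(start, i + 2)) // include the EOI marker
      start = -1
      i++ // skip the second EOI byte
    }
  }
  return frames
}
```

Each returned frame is then a complete JPEG that the Blob/createObjectURL code above can display. The other route, if you want to keep the mpegts output, is a JavaScript MPEG1 decoder on the client (the JSMpeg library targets exactly this ffmpeg setup), typically fed over a plain WebSocket rather than socket.io.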
The ffmpeg log output on the Pi:

[info] Input #0, video4linux2,v4l2, from '/dev/video0':
[info]   Duration: N/A, start: 174842.704640, bitrate: 147456 kb/s
[info]     Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
[info] Stream mapping:
[info]   Stream #0:0 -> #0:0 (rawvideo (native) -> mpeg1video (native))
[info] Press [q] to stop, [?] for help
[info] Output #0, mpegts, to 'http://192.168.10.6:3030/stream':
[info]   Metadata:
[info]     encoder         : Lavf58.20.100
[info]     Stream #0:0: Video: mpeg1video, yuv420p, 640x480, q=2-31, 500 kb/s, 30 fps, 90k tbn, 30 tbc
[info]     Metadata:
[info]       encoder         : Lavc58.35.100 mpeg1video
[info]     Side data:
[info]       cpb: bitrate max/min/avg: 0/0/500000 buffer size: 0 vbv_delay: -1