Recommended Java library for creating video programmatically


Can anyone recommend a Java library that would let me create videos programmatically? Specifically, it would do the following:

  • Take a series of BufferedImages as the frames
  • Allow a background WAV/MP3 to be added
  • Allow "incidental" WAV/MP3s to be added at arbitrary, programmatically specified points
  • Output the video in a common format (MPEG, etc.)

Can anyone recommend anything? For the picture/sound mixing, I would even settle for something that takes a series of frames and, for each frame, requires me to supply the raw bytes of the uncompressed sound data associated with that frame.


Also, it doesn't even have to be a "third-party library" if the Java Media Framework has calls to achieve the above, although from my vague recollection I don't think it does.

I have accomplished items #1, #2 and #4 from the requirements list in pure Java using the code mentioned below. It is worth a look, and you could probably work out how to include #3.
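
For requirement #3 specifically, a minimal sketch of one possible pure-Java approach is shown below: it mixes an "incidental" WAV into a background WAV at a chosen offset using javax.sound.sampled. The file names, the offset and the assumption of 16-bit signed little-endian PCM are illustrative only, not taken from the code referred to above.

import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.IOException;

import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.UnsupportedAudioFileException;

public class WavMixSketch {
    public static void main(String[] args) throws IOException, UnsupportedAudioFileException {
        // Hypothetical file names: mix incidental.wav into background.wav 2.5 seconds in
        mix(new File("background.wav"), new File("incidental.wav"), new File("mixed.wav"), 2.5);
    }

    // Assumes both files share the same 16-bit signed little-endian PCM format
    static void mix(File background, File overlay, File out, double offsetSeconds)
            throws IOException, UnsupportedAudioFileException {
        AudioInputStream bg = AudioSystem.getAudioInputStream(background);
        AudioInputStream ov = AudioSystem.getAudioInputStream(overlay);
        AudioFormat fmt = bg.getFormat();

        byte[] bgBytes = bg.readAllBytes();
        byte[] ovBytes = ov.readAllBytes();

        // Byte offset of the insertion point, aligned to a whole frame
        int offsetBytes = (int) (offsetSeconds * fmt.getFrameRate()) * fmt.getFrameSize();

        // Sum the 16-bit samples, clamping to avoid wrap-around on overflow
        for (int i = 0; i + 1 < ovBytes.length && offsetBytes + i + 1 < bgBytes.length; i += 2) {
            int a = (short) ((bgBytes[offsetBytes + i + 1] << 8) | (bgBytes[offsetBytes + i] & 0xFF));
            int b = (short) ((ovBytes[i + 1] << 8) | (ovBytes[i] & 0xFF));
            int mixed = Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, a + b));
            bgBytes[offsetBytes + i] = (byte) mixed;
            bgBytes[offsetBytes + i + 1] = (byte) (mixed >> 8);
        }

        AudioInputStream result = new AudioInputStream(
                new ByteArrayInputStream(bgBytes), fmt, bgBytes.length / fmt.getFrameSize());
        AudioSystem.write(result, AudioFileFormat.Type.WAVE, out);
    }
}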


I found a tool called ffmpeg that can convert multimedia files from one format to another. ffmpeg includes a filter library called libavfilter, the replacement for vhook, which allows video/audio to be modified or inspected between the decoder and the encoder, so it should be possible to feed it raw frames and produce a video. I also looked for Java implementations of ffmpeg and found a page describing a Java wrapper around ffmpeg built with JNA.
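
As a minimal sketch of that approach, even without a wrapper library, ffmpeg can simply be invoked from Java as an external process. This assumes ffmpeg is on the PATH; the frame-number pattern and file names are illustrative only.

import java.io.IOException;

public class FfmpegEncodeSketch {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Turn a PNG image sequence plus a background WAV into an H.264 MP4
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-y",
                "-framerate", "25", "-i", "frame%05d.png", // the image sequence
                "-i", "background.wav",                    // the background audio track
                "-c:v", "libx264", "-pix_fmt", "yuv420p",  // encode the video as H.264
                "-c:a", "aac",                             // encode the audio as AAC
                "-shortest", "out.mp4");
        pb.inheritIO();                                    // show ffmpeg's console output
        int exit = pb.start().waitFor();
        System.out.println("ffmpeg exited with code " + exit);
    }
}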

Try JavaFX.

JavaFX supports rendering images in a number of formats, and supports playback of audio and video on every platform where JavaFX is available.

There is a tutorial on manipulating images.

There is a tutorial on creating slideshows, timelines, and scenes.

There is an FAQ entry on adding sound.

Most of this material targets JavaFX 1.3; JavaFX 2.0 has now been released.
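
As a minimal sketch of the playback side, the snippet below plays an audio clip with the JavaFX 2 media API; the file name is illustrative, and note that JavaFX plays media back but does not itself encode video files.

import java.io.File;

import javafx.application.Application;
import javafx.application.Platform;
import javafx.scene.media.Media;
import javafx.scene.media.MediaPlayer;
import javafx.stage.Stage;

public class PlaySoundSketch extends Application {
    @Override
    public void start(Stage stage) {
        // Media expects a URI string, so convert the local file path first
        Media media = new Media(new File("incidental.mp3").toURI().toString());
        MediaPlayer player = new MediaPlayer(media);
        player.setOnEndOfMedia(Platform::exit); // quit once playback finishes
        player.play();
    }

    public static void main(String[] args) {
        launch(args);
    }
}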

Why not use FFMPEG?

It appears to have a Java wrapper:

Here is an example of how various media sources can be compiled into a single video with FFMPEG:
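
As a rough, hedged sketch of such a pipeline, the invocation below muxes an image sequence with a background track and mixes in a second, delayed sound via the adelay and amix filters; the file names, the 5-second offset and the filter settings are assumptions for illustration, not taken from the linked example.

import java.io.IOException;

public class FfmpegMixSketch {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-y",
                "-framerate", "25", "-i", "frame%05d.png", // input 0: the image sequence
                "-i", "background.wav",                    // input 1: background audio
                "-i", "incidental.wav",                    // input 2: incidental sound
                // delay the incidental sound by 5000 ms, then mix it into the background
                "-filter_complex",
                "[2:a]adelay=5000|5000[d];[1:a][d]amix=inputs=2:duration=first[a]",
                "-map", "0:v", "-map", "[a]",
                "-c:v", "libx264", "-pix_fmt", "yuv420p",
                "-c:a", "aac",
                "out.mp4");
        pb.inheritIO();
        int exit = pb.start().waitFor();
        System.out.println("ffmpeg exited with code " + exit);
    }
}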

And finally, the documentation:


You can try a pure-Java codec library called JCodec.
It has a very basic H.264 (AVC) encoder and an MP4 muxer. Below is the complete sample code taken from their samples --

private static void png2avc(String pattern, String out) throws IOException {
    FileChannel sink = null;
    try {
        sink = new FileOutputStream(new File(out)).getChannel();
        H264Encoder encoder = new H264Encoder();
        RgbToYuv420 transform = new RgbToYuv420(0, 0);

        int i;
        for (i = 0; i < 10000; i++) {
            File nextImg = new File(String.format(pattern, i));
            if (!nextImg.exists())
                continue;
            // Read the next frame, convert it from RGB to YUV 4:2:0 and H.264-encode it
            BufferedImage rgb = ImageIO.read(nextImg);
            Picture yuv = Picture.create(rgb.getWidth(), rgb.getHeight(), ColorSpace.YUV420);
            transform.transform(AWTUtil.fromBufferedImage(rgb), yuv);
            ByteBuffer buf = ByteBuffer.allocate(rgb.getWidth() * rgb.getHeight() * 3);

            ByteBuffer ff = encoder.encodeFrame(buf, yuv);
            sink.write(ff);
        }
        if (i == 1) {
            System.out.println("Image sequence not found");
            return;
        }
    } finally {
        if (sink != null)
            sink.close();
    }
}
This sample is more elaborate and actually shows how the encoded frames are multiplexed into an MP4 file:

private static void prores2avc(String in, String out, ProresDecoder decoder, RateControl rc) throws IOException {
    SeekableByteChannel sink = null;
    SeekableByteChannel source = null;
    try {
        sink = writableFileChannel(out);
        source = readableFileChannel(in);

        MP4Demuxer demux = new MP4Demuxer(source);
        MP4Muxer muxer = new MP4Muxer(sink, Brand.MOV);

        Transform transform = new Yuv422pToYuv420p(0, 2);

        H264Encoder encoder = new H264Encoder(rc);

        MP4DemuxerTrack inTrack = demux.getVideoTrack();
        CompressedTrack outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, (int) inTrack.getTimescale());

        VideoSampleEntry ine = (VideoSampleEntry) inTrack.getSampleEntries()[0];
        Picture target1 = Picture.create(ine.getWidth(), ine.getHeight(), ColorSpace.YUV422_10);
        Picture target2 = null;
        ByteBuffer _out = ByteBuffer.allocate(ine.getWidth() * ine.getHeight() * 6);

        ArrayList<ByteBuffer> spsList = new ArrayList<ByteBuffer>();
        ArrayList<ByteBuffer> ppsList = new ArrayList<ByteBuffer>();
        Packet inFrame;
        int totalFrames = (int) inTrack.getFrameCount();
        long start = System.currentTimeMillis();
        for (int i = 0; (inFrame = inTrack.getFrames(1)) != null && i < 100; i++) {
            Picture dec = decoder.decodeFrame(inFrame.getData(), target1.getData());
            if (target2 == null) {
                target2 = Picture.create(dec.getWidth(), dec.getHeight(), ColorSpace.YUV420);
            }
            transform.transform(dec, target2);
            _out.clear();
            ByteBuffer result = encoder.encodeFrame(_out, target2);
            if (rc instanceof ConstantRateControl) {
                int mbWidth = (dec.getWidth() + 15) >> 4;
                int mbHeight = (dec.getHeight() + 15) >> 4;
                result.limit(((ConstantRateControl) rc).calcFrameSize(mbWidth * mbHeight));
            }
            spsList.clear();
            ppsList.clear();
            H264Utils.encodeMOVPacket(result, spsList, ppsList);
            outTrack.addFrame(new MP4Packet((MP4Packet) inFrame, result));
            if (i % 100 == 0) {
                long elapse = System.currentTimeMillis() - start;
                System.out.println((i * 100 / totalFrames) + "%, " + (i * 1000 / elapse) + "fps");
            }
        }
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        muxer.writeHeader();
    } finally {
        if (sink != null)
            sink.close();
        if (source != null)
            source.close();
    }
}