
Java: setting the image duration with jcodec on Android

Tags: java, android, jcodec

I'm building an app that makes a video out of a handful of pictures. My problem is that a video made from 20 or 30 images only lasts one second. I create the encoder in my onCreate method and use it in a timer:

 encoder.encodeNativeFrame(pic);
The timer runs once every second.

When I press the finish button, I call:

 encoder.finish();
But when I play the video back, all the images flash by within a single second.

Is there a way to set the duration, for example so that each image is shown for one second?
Thanks in advance.
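For context, the setup described above presumably looks something like the sketch below (the Timer, the encoder and pic variables, and the 1000 ms period are assumptions added for illustration, not code from the question):

import java.io.IOException;
import java.util.Timer;
import java.util.TimerTask;

// Rough sketch of the questioner's setup: a Timer fires once per second and
// hands the next image to the encoder. 'encoder' and 'pic' stand for the
// fields described in the question.
Timer timer = new Timer();
timer.scheduleAtFixedRate(new TimerTask() {
    @Override
    public void run() {
        try {
            encoder.encodeNativeFrame(pic);   // add one frame per tick
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}, 0, 1000);                                  // first run now, then every 1000 ms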

You just have to call:

SequenceEncoder encoder = new SequenceEncoder(Outputfile, Constants.VIDEO_WIDTH, Constants.VIDEO_HEIGHT, durationInSeconds);
It works for me.
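Note that the class posted below only exposes a SequenceEncoder(File out) constructor, so a four-argument call like the one above would need a small extension of that class. As a rough sketch (the secondsPerImage field, the extra constructor, and the changed addFrame call are illustrative assumptions, not part of the original answer or of stock jcodec), the per-image display time can be fed into the MP4 packet timing like this:

// Sketch of additions inside the SequenceEncoder class below; the names here
// are illustrative assumptions, not stock jcodec API.
private int secondsPerImage;

public SequenceEncoder(File out, int width, int height, int secondsPerImage)
        throws IOException {
    this(out);                                       // reuse the original setup
    this.secondsPerImage = secondsPerImage;
    _out = ByteBuffer.allocate(width * height * 6);  // size the buffer to the video
}

// ...and in encodeNativeFrame(), give every packet an explicit duration.
// The track below is created with a timescale of 1 (one tick = one second),
// so a duration of 'secondsPerImage' keeps each image on screen that long:
outTrack.addFrame(new MP4Packet(result,
        frameNo * secondsPerImage,   // presentation time, in seconds
        1,                           // timescale: ticks per second
        secondsPerImage,             // display duration of this frame, in seconds
        frameNo, true, null,
        frameNo * secondsPerImage, 0));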

// Imports below assume the jcodec-android 0.1.9 package layout.
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayList;

import android.graphics.Bitmap;

import org.jcodec.codecs.h264.H264Encoder;
import org.jcodec.codecs.h264.H264Utils;
import org.jcodec.common.NIOUtils;
import org.jcodec.common.SeekableByteChannel;
import org.jcodec.common.model.ColorSpace;
import org.jcodec.common.model.Picture;
import org.jcodec.containers.mp4.Brand;
import org.jcodec.containers.mp4.MP4Packet;
import org.jcodec.containers.mp4.TrackType;
import org.jcodec.containers.mp4.muxer.FramesMP4MuxerTrack;
import org.jcodec.containers.mp4.muxer.MP4Muxer;
import org.jcodec.scale.RgbToYuv420;

public class SequenceEncoder {
    private SeekableByteChannel ch;
    private Picture toEncode;
    private RgbToYuv420 transform;
    private H264Encoder encoder;
    private ArrayList<ByteBuffer> spsList;
    private ArrayList<ByteBuffer> ppsList;
    private FramesMP4MuxerTrack outTrack;
    private ByteBuffer _out;
    private int frameNo;
    private MP4Muxer muxer;
    public SequenceEncoder(File out) throws IOException {
        this.ch = NIOUtils.writableFileChannel(out);

        // Transform to convert between RGB and YUV
        transform = new RgbToYuv420(0, 0);

        // Muxer that will store the encoded frames
        muxer = new MP4Muxer(ch, Brand.MP4);

        // Add video track to muxer
         outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 1);

        // Allocate a buffer big enough to hold output frames
        _out = ByteBuffer.allocate(1920 * 1080 * 6);

        // Create an instance of encoder
        encoder = new H264Encoder();

        // Encoder extra data ( SPS, PPS ) to be stored in a special place of
        // MP4
        spsList = new ArrayList<ByteBuffer>();
        ppsList = new ArrayList<ByteBuffer>();

    }

    public void encodeImage(Bitmap bi) throws IOException {
        // encodeNativeFrame(AWTUtil.fromBufferedImage(bi));
        encodeNativeFrame(fromBitmap(bi));
    }

    public void encodeNativeFrame(Picture pic) throws IOException {
        if (toEncode == null) {
            toEncode = Picture.create(pic.getWidth(), pic.getHeight(),
                    ColorSpace.YUV420);
        }

        // Perform conversion
        transform.transform(pic, toEncode);

        // Encode image into H.264 frame, the result is stored in '_out' buffer
        _out.clear();
        ByteBuffer result = encoder.encodeFrame(_out, toEncode);

        // Based on the frame above form correct MP4 packet
        spsList.clear();
        ppsList.clear();
        H264Utils.encodeMOVPacket(result, spsList, ppsList);

        // Add packet to video track
        outTrack.addFrame(new MP4Packet(result, frameNo, 1, 5, frameNo, true,
                null, frameNo, 0));
        frameNo++;
    }

    public void finish() throws IOException {
        // Push saved SPS/PPS to a special storage in MP4
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        // Write MP4 header and finalize recording
        muxer.writeHeader();
        NIOUtils.closeQuietly(ch);
    }

    public static Picture fromBitmap(Bitmap src) {
        Picture dst = Picture.create((int) src.getWidth(),
                (int) src.getHeight(), ColorSpace.RGB);
        fromBitmap(src, dst);
        return dst;
    }

    public static void fromBitmap(Bitmap src, Picture dst) {
        int[] dstData = dst.getPlaneData(0);
        int[] packed = new int[src.getWidth() * src.getHeight()];

        src.getPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(),
                src.getHeight());

        for (int i = 0, srcOff = 0, dstOff = 0; i < src.getHeight(); i++) {
            for (int j = 0; j < src.getWidth(); j++, srcOff++, dstOff += 3) {
                int rgb = packed[srcOff];
                dstData[dstOff] = (rgb >> 16) & 0xff;
                dstData[dstOff + 1] = (rgb >> 8) & 0xff;
                dstData[dstOff + 2] = rgb & 0xff;
            }
        }
    }

}
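For reference, here is a minimal usage sketch of the class above (the method name, output file, and bitmap list are placeholders added for illustration): build one encoder, push one bitmap per frame, then call finish() so the SPS/PPS and the MP4 header are written out.

import java.io.File;
import java.io.IOException;
import java.util.List;

import android.graphics.Bitmap;

// Usage sketch for the SequenceEncoder class above; names are placeholders.
public static void writeSlideshow(File outputFile, List<Bitmap> images)
        throws IOException {
    SequenceEncoder encoder = new SequenceEncoder(outputFile);
    for (Bitmap bitmap : images) {
        encoder.encodeImage(bitmap);   // converts the Bitmap and encodes one H.264 frame
    }
    encoder.finish();                  // stores SPS/PPS, writes the MP4 header, closes the file
}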
You have to use frame animation to do that.
No, I don't need animation.
I added compile 'org.jcodec:jcodec-android:0.1.9' to the app's build.gradle, but I can't access some of the classes.