
Java: Rendering from MediaCodec into a GL texture

Tags: java, android, opengl-es, bytebuffer, android-mediacodec

I am trying to render video frames from Android's MediaCodec into a texture. The video plays and it seems to work; however, the buffer appears to be scrambled. (See the image below.)

        while (!Thread.interrupted()) {
            if (!isEOS) {
                int inIndex = decoder.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer buffer = inputBuffers[inIndex];
                    int sampleSize = extractor.readSampleData(buffer, 0);
                    if (sampleSize < 0) {
                        // We shouldn't stop the playback at this point, just pass the EOS
                        // flag to the decoder; we will get it again from
                        // dequeueOutputBuffer
                        Log.d("DecodeActivity", "InputBuffer BUFFER_FLAG_END_OF_STREAM");
                        decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        isEOS = true;
                    } else {
                        decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }

            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            switch (outIndex) {
                case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                    Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
                    outputBuffers = decoder.getOutputBuffers();
                    break;
                case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                    Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
                    break;
                case MediaCodec.INFO_TRY_AGAIN_LATER:
                    Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
                    break;
                default:
                    ByteBuffer buffer = outputBuffers[outIndex];

                    Log.d(TAG, "Dimension output: " + videoHeight * videoWidth + " buffer size: " + info.size);

                    if (mImageWidth != videoWidth) {
                        mImageWidth = videoWidth;
                        mImageHeight = videoHeight;
                        adjustImageScaling();
                    }

                    buffer.position(info.offset);
                    buffer.limit(info.offset + info.size);

                    Log.d(TAG, "offset: " + info.offset + " size: " + info.size);

                    final byte[] ba = new byte[buffer.remaining()];
                    buffer.get(ba);

                    if (mGLRgbBuffer == null) {
                        mGLRgbBuffer = IntBuffer.allocate(videoHeight
                                * videoWidth);
                    }

                    if (mRunOnDraw.isEmpty()) {
                        runOnDraw(new Runnable() {
                            @Override
                            public void run() {
                                GPUImageNativeLibrary.YUVtoRBGA(ba, videoWidth,
                                        videoHeight, mGLRgbBuffer.array());
                                mGLTextureId = OpenGlUtils.loadTexture(mGLRgbBuffer,
                                        videoWidth, videoHeight, mGLTextureId);
                            }
                        });
                    }

                    Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + buffer);

                    // We use a very simple clock to keep the video FPS, or the video
                    // playback will be too fast
                    while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                        try {
                            sleep(10);
                        } catch (InterruptedException e) {
                            e.printStackTrace();
                            break;
                        }
                    }
                    decoder.releaseOutputBuffer(outIndex, true);
                    break;
            }

            // All decoded frames have been rendered, we can stop playing now
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                Log.d("DecodeActivity", "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
                break;
            }
        }
Here is a screenshot of the GLSurfaceView it is rendered to:

I found this answer:
but none of those solutions seem to work.

I think your YUV-to-RGB conversion is wrong. The fact that you never pass a color format as a parameter to the conversion function is what prompts that guess. Is there any particular reason you can't decode to a SurfaceTexture? (Various examples in .) If you are using a Qualcomm device, see

Thanks for the help, fadden! I'll give it a try. I know the video's color format is MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420. However, I don't know exactly what the conversion function in the code is doing. As for why I can't render to a SurfaceTexture: I am applying a fragment shader to the video, and for that I need a GLSurfaceTexture. Overall, I am trying to use Android's GPUImage on video, and I have almost got it working...

SurfaceTexture converts a graphics buffer (such as a frame from a decoded video) into a texture. That seems to be exactly what you are trying to do here, but you are doing it in software, which is much slower and harder to do portably. The fragment shader is applied when you render the texture onto a surface. Note that SurfaceTexture is very different from SurfaceView, which is used for displaying content. See, for example, for a sample that filters live video from the camera with a fragment shader.

Sorry, I mixed up the wording there. I have tried using SurfaceTexture before. For that, I need to use the GL_OES_EGL_image_external texture target. Using GL_OES_EGL_image_external
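For reference, here is a minimal sketch of the SurfaceTexture route described above. It is an assumption of how the pieces fit together, not code from the question: the decoder is configured with a Surface wrapped around a SurfaceTexture that is bound to a GL_TEXTURE_EXTERNAL_OES texture, so releaseOutputBuffer(outIndex, true) sends the decoded frame straight into that texture and no software YUV-to-RGB conversion is needed. The frameAvailable, decoder and format names are placeholders.

        // Sketch only: assumes an EGL context is current on the GL thread, that
        // frameAvailable is a volatile boolean field, and that decoder/format are the
        // MediaCodec and the MediaFormat obtained from the MediaExtractor.
        // Uses android.graphics.SurfaceTexture, android.view.Surface,
        // android.opengl.GLES20 and android.opengl.GLES11Ext.

        // 1) Create the external OES texture on the GL thread.
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        // 2) Wrap it in a SurfaceTexture and a Surface for the decoder.
        SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
        surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture st) {
                frameAvailable = true;
            }
        });
        Surface decoderSurface = new Surface(surfaceTexture);

        // 3) Configure the decoder with the Surface instead of reading ByteBuffer output.
        decoder.configure(format, decoderSurface, null, 0);
        decoder.start();

        // 4) In the decode loop, render the output buffer (outIndex comes from
        //    dequeueOutputBuffer, as in the code above) to the Surface...
        decoder.releaseOutputBuffer(outIndex, true);

        // 5) ...and on the GL thread, latch the frame into the OES texture before
        //    drawing with a samplerExternalOES fragment shader (see below).
        if (frameAvailable) {
            frameAvailable = false;
            surfaceTexture.updateTexImage();
        }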
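The matching fragment shader has to declare the GL_OES_EGL_image_external extension and sample the texture with samplerExternalOES instead of sampler2D; that is the texture target the last comment refers to. The constant below is only an illustrative sketch (the vTextureCoord and sTexture names are assumptions), and any GPUImage-style filter logic would replace the plain texture lookup:

        // Minimal fragment shader for an external (SurfaceTexture-backed) texture.
        private static final String EXTERNAL_OES_FRAGMENT_SHADER =
                "#extension GL_OES_EGL_image_external : require\n" +
                "precision mediump float;\n" +
                "varying vec2 vTextureCoord;\n" +
                "uniform samplerExternalOES sTexture;\n" +
                "void main() {\n" +
                "    gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
                "}\n";

When drawing, the texture is bound with glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]) rather than GL_TEXTURE_2D.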