OpenGL ES: why do I get surfaceTexture.updateTexImage() on a null object reference?

Tags: opengl-es, decode, android-mediacodec, glsurfaceview, render-to-texture

I am trying to decode an mp4 with MediaCodec and render it with a GLSurfaceView. To MediaCodec's configure() I pass a Surface created from a SurfaceTexture. On the SurfaceTexture I set an OnFrameAvailableListener, and in onDrawFrame() I synchronize on a lock object and call surfaceTexture.updateTexImage(); the result is surfaceTexture.updateTexImage() being invoked on a null object reference. I would like to know how that is possible: the listener has just signalled that a frame is available, so how can the reference be null? I should add that I have had trouble configuring MediaCodec with OpenGL ES before. At first I thought it was the Samsung device's decoder, but when I decode to a SurfaceView instead, it works fine.

// Create a SurfaceTexture backed by the external OES texture id, and wrap it in a
// Surface that is later passed to MediaCodec.configure().
SurfaceTexture surfaceTexture = new SurfaceTexture(textures[1]);
surface = new Surface(surfaceTexture);

// The decoder thread renders into the Surface; this listener fires once per new frame.
surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        synchronized (mediaPlayerObject) {
            mediaPlayeUpdate = true;
            mediaPlayerObject.notifyAll();
        }
    }
});
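
For reference (this part is not in the original post): the texture id handed to the SurfaceTexture constructor is normally a GL_TEXTURE_EXTERNAL_OES texture generated on the GL thread, for example in onSurfaceCreated(). A minimal sketch, assuming the same textures array; the helper name is made up:

// Hypothetical helper: creates the external texture that textures[1] above refers to.
// Uses android.opengl.GLES20 and android.opengl.GLES11Ext.
private int createExternalTexture() {
    GLES20.glGenTextures(2, textures, 0);
    // SurfaceTexture requires the GL_TEXTURE_EXTERNAL_OES target, not GL_TEXTURE_2D.
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[1]);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    return textures[1];
}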


public void onDrawFrame(GL10 gl) {
    GLES20.glClear(GL10.GL_COLOR_BUFFER_BIT);

    // Latch the most recent decoded frame into the external texture.
    synchronized (mediaPlayerObject) {
        surfaceTexture.updateTexImage();
        mediaPlayeUpdate = false;
    }
}
I do this in surfaceChanged(GL…). The decoder runs in its own thread; its run() method is listed after the answer below.


You need to keep a strong reference to the SurfaceTexture so that it does not get garbage collected.

Follow-up comments:

- What do you mean? Isn't SurfaceTexture surfaceTexture = new SurfaceTexture(...) a strong reference? Is that the mistake in my code above?
- That depends on where you declare the variable. If it is declared inside a method, it is not enough: the reference is local to that method and can be garbage collected once the method returns.
- Hey, you're right. I made the SurfaceTexture a member of the class and it works. I'm accepting your answer!
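
A minimal sketch of the fix described above, assuming a GLSurfaceView.Renderer implementation; the class name and the places where fields are initialised are illustrative, not taken from the post. The point is that surfaceTexture and surface live in fields rather than in local variables of the method that creates them:

import android.graphics.SurfaceTexture;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.view.Surface;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class VideoRenderer implements GLSurfaceView.Renderer {

    private final Object mediaPlayerObject = new Object();
    private boolean mediaPlayeUpdate = false;
    private final int[] textures = new int[2];

    // Class-level (strong) references: these keep the SurfaceTexture alive for the
    // lifetime of the renderer, so the decoder can keep writing into its Surface.
    private SurfaceTexture surfaceTexture;
    private Surface surface;

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        GLES20.glGenTextures(2, textures, 0);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        // Assign to the fields; a local "SurfaceTexture surfaceTexture = ..." here would
        // only be reachable until this method returns, which is the problem the answer describes.
        surfaceTexture = new SurfaceTexture(textures[1]);
        surface = new Surface(surfaceTexture);
        surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture st) {
                synchronized (mediaPlayerObject) {
                    mediaPlayeUpdate = true;
                    mediaPlayerObject.notifyAll();
                }
            }
        });
        // The decoder thread would be started here, with "surface" passed to
        // MediaCodec.configure(format, surface, null, 0).
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        synchronized (mediaPlayerObject) {
            // Only latch a frame when the listener has actually signalled one.
            if (mediaPlayeUpdate) {
                surfaceTexture.updateTexImage();
                mediaPlayeUpdate = false;
            }
        }
        // ... drawing of the external texture omitted ...
    }
}

With the references held in fields, the onFrameAvailable callback and onDrawFrame() always see the same, non-null SurfaceTexture.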
// Decoder thread: demuxes the file with MediaExtractor and feeds the video track to MediaCodec.
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN)
public void run() {
    mediaExtractor = new MediaExtractor();
    try {
        mediaExtractor.setDataSource("sdcard/nagranie2.mp4");
    } catch (IOException e) {
        e.printStackTrace();
    }

    // Select the first video track and create a decoder for it.
    for (int i = 0; i < mediaExtractor.getTrackCount(); i++) {
        MediaFormat format = mediaExtractor.getTrackFormat(i);
        String mime = format.getString(MediaFormat.KEY_MIME);
        if (mime.startsWith("video/")) {
            mediaExtractor.selectTrack(i);
            try {
                decoder = MediaCodec.createDecoderByType(mime);
            } catch (IOException e) {
                e.printStackTrace();
            }
            // Decoded output goes straight to the Surface created from the SurfaceTexture.
            decoder.configure(format, surface, null, 0);
            break;
        }
    }
    if (decoder == null) {
        Log.e("DecodeActivity", "Can't find video info!");
        return;
    }
    decoder.start();


    // Input/output buffer arrays (pre-API-21 MediaCodec buffer API).
    ByteBuffer[] decoderInputBuffers = decoder.getInputBuffers();
    ByteBuffer[] decoderOutputBuffers = decoder.getOutputBuffers();

    MediaCodec.BufferInfo info1 = new MediaCodec.BufferInfo();
    boolean isOES = false;   // true once the end-of-stream flag has been queued
    long startMs = System.currentTimeMillis();


    while (!Thread.interrupted()) {
        // Feed compressed samples from the extractor until end of stream has been queued.
        if (!isOES) {
            int intBufIndex = decoder.dequeueInputBuffer(10000);
            if (intBufIndex >= 0) {
                ByteBuffer inputBuffer = decoderInputBuffers[intBufIndex];
                int chunkSize = mediaExtractor.readSampleData(inputBuffer, 0);
                if (chunkSize < 0) {
                    // No more samples: queue an empty buffer with the end-of-stream flag.
                    decoder.queueInputBuffer(intBufIndex, 0, 0, 0L, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    isOES = true;

                } else {

                    decoder.queueInputBuffer(intBufIndex, 0, chunkSize, mediaExtractor.getSampleTime(), 0);
                    mediaExtractor.advance();

                }
            }
        }

        // Drain decoded output; frames reach the Surface via releaseOutputBuffer(..., true).
        int decoderStatus = decoder.dequeueOutputBuffer(info1, 10000);
        switch (decoderStatus)
        {
            case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                decoderOutputBuffers = decoder.getOutputBuffers();
                break;

            case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                break;

            case MediaCodec.INFO_TRY_AGAIN_LATER:
                break;
            default:
                ByteBuffer byteBuffer = decoderOutputBuffers[decoderStatus];
             /*   while (info1.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                    try {
                    } catch (InterruptedException e) {
                        e.printStackTrace();

                    }
                }*/
                // "true" renders this buffer to the output Surface, which in turn fires
                // onFrameAvailable() on the SurfaceTexture.
                decoder.releaseOutputBuffer(decoderStatus, true);
                break;

        }



        if ((info1.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            break;
        }

    }

    decoder.stop();
    decoder.release();
    mediaExtractor.release();
}
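
One more note on the commented-out loop in the default case above: it looks like the usual presentation-time throttle from the standard MediaCodec decode examples, but the body of the try block is missing. A working version, assuming the same info1 and startMs variables, would sit just before releaseOutputBuffer() and look roughly like this:

// Wait until this frame's presentation time has been reached, so playback runs at
// roughly real-time speed instead of as fast as the decoder can produce frames.
while (info1.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
    try {
        Thread.sleep(10);                     // back off briefly, then re-check
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();   // keep the interrupt for the outer loop
        break;
    }
}

The Surface and SurfaceTexture created by the renderer should also be released (surface.release(), surfaceTexture.release()) once playback is finished.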