How to draw a Buffer[] onto a TextureView on Android?
I am using JavaCV to retrieve frames from a video file. Its FFmpegFrameGrabber returns a Frame that essentially contains a Buffer[] holding the image pixels of the video frame.

Since performance is my top priority, I would like to use OpenGL ES to display this Buffer[] directly, without converting it into a Bitmap.

The view to display takes up less than half of the screen, and per the OpenGL ES documentation:

"Developers who want to incorporate OpenGL ES graphics in a small portion of their layouts should take a look at TextureView."

So I guess TextureView is the right choice for this task. However, I haven't found many resources about it (most of them are camera preview examples).

I would like to ask how to draw a Buffer[] onto a TextureView. If this is not the most efficient way to go, I am open to your alternatives.
Update: So currently I have a setup like this:
In my VideoActivity, I repeatedly extract a Frame of the video, which holds a ByteBuffer, and send it to my MyGLRenderer2 to be converted into an OpenGL ES texture:
...
mGLSurfaceView = (GLSurfaceView) findViewById(R.id.gl_surface_view);
mGLSurfaceView.setEGLContextClientVersion(2);
mRenderer = new MyGLRenderer2(this);
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
...

private void grabCurrentFrame(final long currentPosition) {
    if (mCanSeek) {
        new AsyncTask() {
            @Override
            protected void onPreExecute() {
                super.onPreExecute();
                mCanSeek = false;
            }

            @Override
            protected Object doInBackground(Object[] params) {
                try {
                    Frame frame = mGrabber.grabImage();
                    setCurrentFrame((ByteBuffer) frame.image[0]);
                }
                catch (Exception e) {
                    e.printStackTrace();
                }
                return null;
            }

            @Override
            protected void onPostExecute(Object o) {
                super.onPostExecute(o);
                mCanSeek = true;
            }
        }.execute();
    }
}

private void setCurrentFrame(ByteBuffer buffer) {
    mRenderer.setTexture(buffer);
}
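One subtlety in the handoff above: setTexture() stores only a duplicate() of the grabber's buffer, and duplicate() shares the underlying memory, which FFmpegFrameGrabber reuses for subsequent frames. If the GL thread uploads the buffer while the grabber is already decoding the next frame, the displayed frame can come out torn or garbled. A minimal sketch of copying each frame into a buffer the renderer owns (the FrameCopier class name and its API are illustrative, not part of the original code):

```java
import java.nio.ByteBuffer;

public class FrameCopier {
    private ByteBuffer ownedBuffer;

    // Copy the grabber's (reused) buffer into a buffer this class owns,
    // so the decoder can overwrite its buffer without corrupting the
    // frame the GL thread is about to upload.
    public synchronized void setTexture(ByteBuffer source) {
        ByteBuffer src = source.duplicate(); // independent position/limit, shared data
        src.rewind();
        if (ownedBuffer == null || ownedBuffer.capacity() < src.remaining()) {
            // A direct buffer can later be handed to glTexImage2D without another copy.
            ownedBuffer = ByteBuffer.allocateDirect(src.remaining());
        }
        ownedBuffer.clear();
        ownedBuffer.put(src);
        ownedBuffer.flip();
    }

    public synchronized ByteBuffer currentFrame() {
        return ownedBuffer;
    }
}
```

The extra copy costs a memcpy per frame, but it decouples the grabber thread from the GL thread; the `synchronized` methods keep the two from racing on the same buffer.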
MyGLRenderer2 looks like this:
public class MyGLRenderer2 implements GLSurfaceView.Renderer {
    private static final String TAG = "MyGLRenderer2";

    private FullFrameTexture mFullFrameTexture;
    private ByteBuffer mCurrentBuffer;
    private int[] textureHandles = new int[1];
    private int textureHandle;

    public MyGLRenderer2(Context context) {
        super();
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
        GLES20.glClearColor(0, 0, 0, 1);
        mFullFrameTexture = new FullFrameTexture();
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        createFrameTexture(mCurrentBuffer, 1280, 720, GLES20.GL_RGB); // should not need to be a power of 2 since I use GL_CLAMP_TO_EDGE
        mFullFrameTexture.draw(textureHandle);
        if (mCurrentBuffer != null) {
            mCurrentBuffer.clear();
        }
    }

    public void setTexture(ByteBuffer buffer) {
        mCurrentBuffer = buffer.duplicate();
        mCurrentBuffer.position(0);
    }

    public void createFrameTexture(ByteBuffer data, int width, int height, int format) {
        GLES20.glGenTextures(1, textureHandles, 0);
        textureHandle = textureHandles[0];
        GlUtil.checkGlError("glGenTextures");

        // Bind the texture handle to the 2D texture target.
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle);

        // Configure min/mag filtering, i.e. what scaling method we use if what
        // we're rendering is smaller or larger than the source image.
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GlUtil.checkGlError("loadImageTexture");

        // Load the data from the buffer into the texture handle.
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, /*level*/ 0, format,
                width, height, /*border*/ 0, format, GLES20.GL_UNSIGNED_BYTE, data);
        GlUtil.checkGlError("loadImageTexture");
    }
}
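A likely cause of the crash after a short while: createFrameTexture() runs on every onDrawFrame() and calls glGenTextures() each time without ever deleting the previous texture, so a new 1280×720 texture allocation piles up per frame until the GL driver gives up. A common pattern, sketched below under the assumption that the frame size and format never change (the uploadFrame/mFrameTexture names are mine, not from the original code), is to allocate the texture once and update it in place with glTexSubImage2D():

```java
// Sketch only: allocate the texture once, then update it in place each frame.
// Must be called on the GL thread (e.g. from onDrawFrame()).
private int mFrameTexture = -1;

private void uploadFrame(ByteBuffer data, int width, int height, int format) {
    if (mFrameTexture < 0) {
        int[] handles = new int[1];
        GLES20.glGenTextures(1, handles, 0);
        mFrameTexture = handles[0];
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mFrameTexture);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        // Allocate storage once; passing null data is legal and leaves it uninitialized.
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, format, width, height, 0,
                format, GLES20.GL_UNSIGNED_BYTE, null);
    }
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mFrameTexture);
    // Replace the texel data without reallocating the texture object.
    GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, width, height,
            format, GLES20.GL_UNSIGNED_BYTE, data);
}
```

The incorrect colors are plausibly a channel-order issue rather than a GL bug: FFmpeg frequently decodes to BGR, while GLES 2.0 accepts only GL_RGB/GL_RGBA uploads, so red and blue end up swapped. If so, configuring the grabber to emit RGB (if the JavaCV version at hand supports it) or swizzling `.bgr` in the fragment shader would be the usual fixes.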
And FullFrameTexture looks like this:
public class FullFrameTexture {
    private static final String VERTEXT_SHADER =
            "uniform mat4 uOrientationM;\n" +
            "uniform mat4 uTransformM;\n" +
            "attribute vec2 aPosition;\n" +
            "varying vec2 vTextureCoord;\n" +
            "void main() {\n" +
            "gl_Position = vec4(aPosition, 0.0, 1.0);\n" +
            "vTextureCoord = (uTransformM * ((uOrientationM * gl_Position + 1.0) * 0.5)).xy;" +
            "}";

    private static final String FRAGMENT_SHADER =
            "precision mediump float;\n" +
            "uniform sampler2D sTexture;\n" +
            "varying vec2 vTextureCoord;\n" +
            "void main() {\n" +
            "gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
            "}";

    private final byte[] FULL_QUAD_COORDINATES = {-1, 1, -1, -1, 1, 1, 1, -1};

    private ShaderProgram shader;
    private ByteBuffer fullQuadVertices;

    private final float[] orientationMatrix = new float[16];
    private final float[] transformMatrix = new float[16];

    public FullFrameTexture() {
        shader = new ShaderProgram(EglUtil.getInstance());
        shader.create(VERTEXT_SHADER, FRAGMENT_SHADER);

        fullQuadVertices = ByteBuffer.allocateDirect(4 * 2);
        fullQuadVertices.put(FULL_QUAD_COORDINATES).position(0);

        Matrix.setRotateM(orientationMatrix, 0, 0, 0f, 0f, 1f);
        Matrix.setIdentityM(transformMatrix, 0);
    }

    public void release() {
        shader = null;
        fullQuadVertices = null;
    }

    public void draw(int textureId) {
        shader.use();

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);

        int uOrientationM = shader.getAttributeLocation("uOrientationM");
        int uTransformM = shader.getAttributeLocation("uTransformM");
        GLES20.glUniformMatrix4fv(uOrientationM, 1, false, orientationMatrix, 0);
        GLES20.glUniformMatrix4fv(uTransformM, 1, false, transformMatrix, 0);

        // Trigger actual rendering.
        renderQuad(shader.getAttributeLocation("aPosition"));

        shader.unUse();
    }

    private void renderQuad(int aPosition) {
        GLES20.glVertexAttribPointer(aPosition, 2, GLES20.GL_BYTE, false, 0, fullQuadVertices);
        GLES20.glEnableVertexAttribArray(aPosition);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
}
Currently I can display some frames for a short period before the app crashes (and the colors are not correct either).

The most efficient way is to convert the pixels to an OpenGL ES texture and render that onto the TextureView. The function to use is glTexImage2D(). You can find some examples in Grafika, which uses the function to upload some generated textures. Take a look. If your app doesn't have GLES code in it yet, Grafika's gles package may be useful.
FWIW, it would be more efficient to decode the video frames directly onto a Surface created from the TextureView's SurfaceTexture, but I don't know whether JavaCV supports that.
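For context, the direct-to-Surface path mentioned above looks roughly like the sketch below when using Android's own MediaCodec rather than JavaCV. This is a sketch, not a complete player: `videoPath` and `R.id.texture_view` are placeholders, track 0 is assumed to be the video track, and it assumes the TextureView's SurfaceTexture is already available (in practice, wait for SurfaceTextureListener.onSurfaceTextureAvailable()).

```java
// Sketch: hand the decoder a Surface backed by the TextureView, so decoded
// frames never pass through application memory.
TextureView textureView = (TextureView) findViewById(R.id.texture_view);
Surface surface = new Surface(textureView.getSurfaceTexture());

MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(videoPath);
extractor.selectTrack(0); // assumes track 0 is the video track
MediaFormat format = extractor.getTrackFormat(0);

MediaCodec decoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
decoder.configure(format, surface, null, 0); // output goes straight to the Surface
decoder.start();
// ...then feed input buffers from the extractor in a loop;
// releaseOutputBuffer(index, true) renders each decoded frame onto the TextureView.
```

Since the decoder writes into graphics buffers the compositor can consume directly, this path avoids both the ByteBuffer copy and the glTexImage2D upload entirely.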
EDIT: If you don't mind using the NDK, another approach is to use ANativeWindow. Take the TextureView's SurfaceTexture, pass it to native code, and call ANativeWindow_fromSurface() to get an ANativeWindow. Use ANativeWindow_setBuffersGeometry() to set the size and color format. Lock the buffer, copy the pixels in, then unlock the buffer to publish it. I don't believe this requires an extra data copy internally, and it may have some advantages over the glTexImage2D() approach.

Thanks a lot for the guidance; it is really helpful. I will try OpenGL ES first and get back as soon as I can. Would you mind elaborating on "render an OpenGL ES texture onto a TextureView"?

Maybe I am mistaken, but even when I use the demo code GeneratedTexture.createTestTexture(GeneratedTexture.Image.FINE), the output is always 0.

Make sure you are calling it from a thread with an active EGL context. See e.g. TextureViewGLActivity for an example of rendering to a TextureView with GLES from a dedicated renderer thread.

Yes, I did follow the TextureViewGLActivity example. I tried replacing all the GL_SCISSOR_TEST bits in Renderer.doAnimation() with a simple GeneratedTexture.createTestTexture(GeneratedTexture.Image.FINE), but I still get a black screen.

Hi @fadden, I think I have to abandon this approach because it may not meet my performance requirements. I have been playing with your ExtractMpegFramesTest. The most expensive operation is the PNG conversion, which I don't need. Would it be possible to implement video scrubbing with ExtractMpegFramesTest? Say I decode 30 frames ahead of time and let the user scrub through them while the next 30 are being decoded. Another question is whether it requires consecutive frames, because I may not need that many; one decoded frame out of every 3 would be perfect.