Java: GLSL shaders with Processing for Pi on a Raspberry Pi 3 B+

Tags: java, raspberry-pi, glsl, processing, shader

This is my first time porting a GLSL shader from Processing on OSX to Processing for Pi running on a Raspberry Pi 3 B+. I have a very basic shader that dissolves between two videos as they play. It runs perfectly fine on my Mac, but once it's ported to Processing for Pi and updated to use the GLVideo video library for Processing, it breaks.

The shader was originally converted from a ShaderToy post, but I rewrote it in straight GLSL to make sure there were no compatibility issues. I've looked around and haven't found anything specific that I think would cause this, so any references, pointers, or help would be greatly appreciated.

I've tried a few other things: I resized the videos, bumped the Pi's GPU memory up to 256 MB, and so on. I made sure it still works on OSX, but when it runs on the Raspberry Pi 3 B+ the sketch is just an empty white screen.
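
One quick way to narrow down the white screen is to bypass the PGraphics and the shader entirely and draw a single GLMovie straight to the default renderer. If that is also blank on the Pi, the problem is in video decoding/playback rather than in the shader. A minimal sketch along those lines, assuming the same file path as the full sketch below and the available()/read() pattern used in the GLVideo examples:

import gohai.glvideo.*;

GLMovie movie;

void setup() {
  size(640, 360, P2D);
  // same clip as in the full sketch below
  movie = new GLMovie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie.loop();
}

void draw() {
  background(0);
  // read new frames as they become available
  if (movie.available()) {
    movie.read();
  }
  // no shader, no PGraphics: just the decoded video
  image(movie, 0, 0, width, height);
}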

I'm wondering whether the Pi handles GLSL somewhat differently? Or whether there are limits on 2D textures in the Pi's GPU? More on the Processing side, and a bit beyond my understanding, maybe setting a PGraphics as a sampler2D texture in the shader isn't supported in Processing for Pi? Or maybe something is different about GLVideo images when they're set as a texture. It could also be that I'm mixing up how frag/texture and color shaders work; at the moment I believe I'm using a Processing color shader.
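
The other half of that check is the PGraphics-as-sampler2D path with no video involved at all: fill the two PGraphics with plain colours each frame and let the shader mix them. If this shows a red/blue crossfade on the Pi, then binding a PGraphics to a sampler2D uniform works there and the problem lies elsewhere. A rough sketch of that idea, using the same fadeshader.glsl listed below and nothing else:

PShader mixShader;

PGraphics pg;
PGraphics pg2;

void setup() {
  size(640, 360, P2D);
  noSmooth();

  pg = createGraphics(width, height, P2D);
  pg2 = createGraphics(width, height, P2D);

  mixShader = loadShader("fadeshader.glsl");
  mixShader.set("iResolution", float(width), float(height));
  mixShader.set("iChannel0", pg);
  mixShader.set("iChannel1", pg2);
}

void draw() {
  // two solid colours instead of video frames
  pg.beginDraw();
  pg.background(255, 0, 0);
  pg.endDraw();

  pg2.beginDraw();
  pg2.background(0, 0, 255);
  pg2.endDraw();

  // update the time uniform so the crossfade actually animates
  mixShader.set("iTime", millis() / 1000.0);

  shader(mixShader);
  rect(0, 0, width, height);
}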

The only output in the console is:

Final caps: video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)640, height=(int)360, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, texture-target=(string)2D
Final caps: video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)640, height=(int)360, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, texture-target=(string)2D
shaderDisolveGLSL.pde

//import processing.video.*;
import gohai.glvideo.*;

PShader mixShader;  

PGraphics pg;
PGraphics pg2;

//Movie movie;
//Movie movie2;

GLMovie movie;
GLMovie movie2;

void setup() {
  size(640, 360, P2D);
  noSmooth();
  pg = createGraphics(640, 360, P2D);

  //movie = new Movie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie = new GLMovie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie.loop();

  //movie2 = new Movie(this, "_sm/LabspaceFireblur2.mp4");
  movie2 = new GLMovie(this, "_sm/LabspaceFireblur2.mp4");
  movie2.loop();

  pg = createGraphics(width, height, P2D);
  pg2 = createGraphics(width, height, P2D);

  mixShader = loadShader("fadeshader.glsl");
  mixShader.set("iResolution", float(width), float(height));
  mixShader.set("iTime", millis()/1000.);

  mixShader.set("iChannel0", pg);
  mixShader.set("iChannel1", pg2);

}  

//void movieEvent(Movie m) {
void movieEvent(GLMovie m) {
  m.read();
  redraw();
}

void draw() {

  pg.beginDraw();
    pg.image(movie, 0, 0, width, height);
  pg.endDraw();

  pg2.beginDraw();
    pg2.image(movie2, 0, 0, width, height);
  pg2.endDraw();

  shader(mixShader);
  rect(0, 0, width, height);

}
fadeshader.glsl

// Type of shader expected by Processing
#define PROCESSING_COLOR_SHADER

uniform float iTime;
uniform sampler2D iChannel0;
uniform sampler2D iChannel1;
uniform vec2 iResolution;

void main() {

    vec2 uv = gl_FragCoord.xy / iResolution.xy;
    vec4 mixColor = vec4(0.0);
    vec4 color0 = vec4(uv.x,uv.y,0.0,1.0);
    vec4 color1 = vec4(uv.x,uv.y,0.0,1.0);

    color0 = texture2D(iChannel0, uv);
    color1 = texture2D(iChannel1, uv);

    float duration = 10.0;
    float t = mod(float(iTime), duration) / duration;

    mixColor = mix(color0, color1, t);
    gl_FragColor = mixColor;
}
In case anyone is curious, I've updated a newer version of the example sketch with smaller videos here:

Any suggestions or ideas about what might be going wrong, or where to start debugging, would be greatly appreciated.


Thanks

I'm not 100% sure, but the error may be related to the video encoding and what the GLVideo library (which relies on GStreamer) can decode on the Raspberry Pi.

I've already run into errors on OSX on an older system: the sketch froze on a grey screen for a few seconds, then crashed without any warning or error.

I'd recommend re-encoding the videos, dropping the audio channel if you don't need it, and using the same or a similar H.264 encoder as the transit video that ships with Processing (e.g. Examples > Libraries > Video > Movie > Loop).

Transit video details:

ffprobe -i /Users/George/Desktop/shaderDisolveGLSL/data/transit.mov 
ffprobe version 3.3.3 Copyright (c) 2007-2017 the FFmpeg developers
  built with Apple LLVM version 7.0.0 (clang-700.0.72)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/3.3.3 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libmp3lame --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-opencl --disable-lzma --enable-vda
  libavutil      55. 58.100 / 55. 58.100
  libavcodec     57. 89.100 / 57. 89.100
  libavformat    57. 71.100 / 57. 71.100
  libavdevice    57.  6.100 / 57.  6.100
  libavfilter     6. 82.100 /  6. 82.100
  libavresample   3.  5.  0 /  3.  5.  0
  libswscale      4.  6.100 /  4.  6.100
  libswresample   2.  7.100 /  2.  7.100
  libpostproc    54.  5.100 / 54.  5.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/Users/George/Desktop/shaderDisolveGLSL/data/transit.mov':
  Metadata:
    major_brand     : qt  
    minor_version   : 537199360
    compatible_brands: qt  
    creation_time   : 2012-08-31T20:17:39.000000Z
  Duration: 00:00:12.38, start: 0.000000, bitrate: 731 kb/s
    Stream #0:0(eng): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt709), 640x360, 727 kb/s, 29.97 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
    Metadata:
      creation_time   : 2012-08-31T20:17:44.000000Z
      handler_name    : Apple Alias Data Handler
      encoder         : H.264
What seems to work:

Stream #0:0(eng): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt709), 640x360, 727 kb/s, 29.97 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
What seems to crash:

Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt709), 640x360 [SAR 1:1 DAR 16:9], 119 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
I also got some intermittent JOGL errors when using movieEvent: moving the .read() calls into draw() seems to do the trick.

Here's the tweaked version of the code that runs on OSX for me:

import processing.video.*;
//import gohai.glvideo.*;

PShader mixShader;  

PGraphics pg;
PGraphics pg2;

Movie movie;
Movie movie2;

//GLMovie movie;
//GLMovie movie2;

void setup() {
  size(640, 360, P2D);
  noSmooth();
  noStroke();

  //movie = new Movie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie = new Movie(this, "transit.mov");
  //movie = new GLMovie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie.loop();

  //movie2 = new Movie(this, "_sm/LabspaceFireblur2.mp4");
  movie2 = new Movie(this, "transit2.mov");
  //movie2 = new GLMovie(this, "_sm/LabspaceFireblur2.mp4");
  movie2.loop();

  pg = createGraphics(width, height, P2D);
  pg2 = createGraphics(width, height, P2D);

  mixShader = loadShader("fadeshader.glsl");
  mixShader.set("iResolution", float(width), float(height));
  mixShader.set("iChannel0", pg);
  mixShader.set("iChannel1", pg2);

}  

//void movieEvent(Movie m) {
//void movieEvent(GLMovie m) {
  //m.read();
  //redraw();
//}

void draw() {
  if(movie.available()){   movie.read(); }
  if(movie2.available()){   movie2.read(); }
  
  pg.beginDraw();
    // for testing only since both movies are the same
    movie.filter(GRAY);
    pg.image(movie, 0, 0, width, height);
  pg.endDraw();

  pg2.beginDraw();
    pg2.image(movie2, 0, 0, width, height);
  pg2.endDraw();
  
  // don't forget to update time
  mixShader.set("iTime", millis() * 0.01);
  
  shader(mixShader);
  rect(0, 0, width, height);
}
Hopefully this will also work with the transit .mov on the Raspberry Pi, to test the codec. Once that runs smoothly, re-encode your videos (Handbrake may help) and try again.

@jshaw3 I'd need to test on an RPi3. The videos do seem to be the problem. If you don't need the audio, you may be able to use an image sequence instead; there's a library that can make this easier. Bear in mind it should initialize within 5 seconds to avoid a timeout on the P3D/GL side (otherwise lazy-load it on the first draw() frame):

import com.hirschandmann.image.*;

PShader mixShader;  

PGraphics pg;
PGraphics pg2;

ISPlayer movie1;
ISPlayer movie2;

boolean loadTriggered = false;

void setup() {
  size(640, 360, P2D);
  noSmooth();
  pg = createGraphics(640, 360, P2D);

  pg = createGraphics(width, height, P2D);
  pg2 = createGraphics(width, height, P2D);

  mixShader = loadShader("fadeshader.glsl");
  mixShader.set("iResolution", float(width), float(height));
  mixShader.set("iTime", millis()/1000.);

  mixShader.set("iChannel0", pg);
  mixShader.set("iChannel1", pg2);

}  

void draw() {
 
  if(!loadTriggered){
    movie1 = new ISPlayer(this,dataPath("_sm/LabspaceDawnv1blur2Frames"));
    movie1.loop();
    
    movie2 = new ISPlayer(this,dataPath("_sm/LabspaceFireblur2Frames"));
    movie2.loop();
    
    loadTriggered = true;
  }
  
  pg.beginDraw();
    if(movie1 != null) pg.image(movie1, 0, 0, width, height);
  pg.endDraw();

  pg2.beginDraw();
    if(movie2 != null) pg2.image(movie2, 0, 0, width, height);
  pg2.endDraw();
  
  // don't forget to update time so the crossfade animates
  mixShader.set("iTime", millis()/1000.0);

  shader(mixShader);
  rect(0, 0, width, height);
  
}

Note that the above assumes you've already converted the .mp4 files to image sequences (e.g. LabspaceDawnv1blur2.mp4 -> LabspaceDawnv1blur2Frames). Here's an ffmpeg example:

ffmpeg -i LabspaceDawnv1blur2.mp4 -vf fps=1/60 LabspaceDawnv1blur2Frames/frame_%04d.png