How to decode H.264 video frames in Java
Does anyone know how to decode H.264 video frames in Java? My network camera supports RTP/RTSP streaming: it serves standard RTP/RTSP and also "RTP/RTSP over HTTP". RTSP: TCP 554; RTP start port: UDP 5000.

Take a look at the Java Media Framework (JMF). I used it a while back; it was somewhat immature then, but they may have strengthened it since. Another option is a library that works with RTP, RTMP, HTTP, and other protocols, can decode and encode H.264 and most other codecs, and is actively maintained, free, and open source (LGPL).

You can also use a pure-Java library called JCodec.
Decoding a single H.264 frame is straightforward:
ByteBuffer bb = ... // Your frame data is stored in this buffer
H264Decoder decoder = new H264Decoder();
Picture out = Picture.create(1920, 1088, ColorSpace.YUV_420); // Allocate output frame of max size
Picture real = decoder.decodeFrame(bb, out.getData());
BufferedImage bi = JCodecUtil.toBufferedImage(real); // If you prefer an AWT image
If you are reading from a container such as MP4, you can use the convenient helper class FrameGrab:
int frameNumber = 150;
BufferedImage frame = FrameGrab.getFrame(new File("filename.mp4"), frameNumber);
ImageIO.write(frame, "png", new File("frame_150.png"));
Finally, here is a full, more involved example:
private static void avc2png(String in, String out) throws IOException {
    SeekableByteChannel sink = null;
    SeekableByteChannel source = null;
    try {
        source = readableFileChannel(in);
        sink = writableFileChannel(out);
        MP4Demuxer demux = new MP4Demuxer(source);
        H264Decoder decoder = new H264Decoder();
        Transform transform = new Yuv420pToRgb(0, 0);

        MP4DemuxerTrack inTrack = demux.getVideoTrack();

        VideoSampleEntry ine = (VideoSampleEntry) inTrack.getSampleEntries()[0];
        Picture target1 = Picture.create((ine.getWidth() + 15) & ~0xf, (ine.getHeight() + 15) & ~0xf,
                ColorSpace.YUV420);
        Picture rgb = Picture.create(ine.getWidth(), ine.getHeight(), ColorSpace.RGB);
        ByteBuffer _out = ByteBuffer.allocate(ine.getWidth() * ine.getHeight() * 6);
        BufferedImage bi = new BufferedImage(ine.getWidth(), ine.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
        AvcCBox avcC = Box.as(AvcCBox.class, Box.findFirst(ine, LeafBox.class, "avcC"));

        decoder.addSps(avcC.getSpsList());
        decoder.addPps(avcC.getPpsList());

        Packet inFrame;
        int totalFrames = (int) inTrack.getFrameCount();
        for (int i = 0; (inFrame = inTrack.getFrames(1)) != null; i++) {
            ByteBuffer data = inFrame.getData();
            Picture dec = decoder.decodeFrame(splitMOVPacket(data, avcC), target1.getData());
            transform.transform(dec, rgb);
            _out.clear();

            AWTUtil.toBufferedImage(rgb, bi);
            ImageIO.write(bi, "png", new File(format(out, i)));
            if (i % 100 == 0)
                System.out.println((i * 100 / totalFrames) + "%");
        }
    } finally {
        if (sink != null)
            sink.close();
        if (source != null)
            source.close();
    }
}
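The `(width + 15) & ~0xf` expressions above round the picture dimensions up to the next multiple of 16, because H.264 codes pictures in 16x16 macroblocks and the decoder needs the padded size. A self-contained check of that rounding (the `align16` helper and class name are mine, not part of JCodec):

```java
public class Align16Demo {
    // Round n up to the next multiple of 16 (the H.264 macroblock size).
    static int align16(int n) {
        return (n + 15) & ~0xf;
    }

    public static void main(String[] args) {
        // 1920 is already a multiple of 16; 1080 is padded up to 1088,
        // which is why decoders allocate 1920x1088 buffers for 1080p video.
        System.out.println(align16(1920)); // 1920
        System.out.println(align16(1080)); // 1088
        System.out.println(align16(768));  // 768
    }
}
```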
I think the best solution is JNI + ffmpeg. In my current project I need to play several full-screen videos at the same time in a libGDX-based Java OpenGL game. I tried almost every free library, but none of them had acceptable performance, so in the end I wrote my own JNI C code to call ffmpeg. Here is the final performance on my laptop:

- Environment: CPU: Core i7 Q740 @ 1.73 GHz, GPU: nVidia GeForce GT 435M, OS: Windows 7 64-bit, Java: Java 7u60 64-bit
- Video: h264rgb/h264 encoded, no sound, resolution 1366x768
- Solution: decode with JNI + ffmpeg v2.2.2; upload to the GPU by updating an OpenGL texture via LWJGL
- Performance: decoding speed 700-800 FPS; texture upload about 1 ms per frame
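The transparent-video scheme described next decodes an RGB stream and a separate alpha stream and merges them before uploading to the GPU. The merge step can be sketched in pure Java; the `mergeArgb` helper, class name, and byte layouts are my assumptions for illustration, not the author's actual JNI code:

```java
public class AlphaMerge {
    /**
     * Merge one decoded RGB frame with one decoded alpha frame into packed
     * ARGB pixels. rgb is assumed to hold 3 bytes per pixel (R, G, B) and
     * alpha 1 byte per pixel, both in row-major order.
     */
    static int[] mergeArgb(byte[] rgb, byte[] alpha, int width, int height) {
        int[] argb = new int[width * height];
        for (int i = 0; i < argb.length; i++) {
            int a = alpha[i] & 0xff;
            int r = rgb[3 * i] & 0xff;
            int g = rgb[3 * i + 1] & 0xff;
            int b = rgb[3 * i + 2] & 0xff;
            argb[i] = (a << 24) | (r << 16) | (g << 8) | b;
        }
        return argb;
    }

    public static void main(String[] args) {
        byte[] rgb = { (byte) 0xff, 0x00, 0x00 }; // one red pixel
        byte[] alpha = { (byte) 0x80 };           // half transparent
        int[] out = mergeArgb(rgb, alpha, 1, 1);
        System.out.printf("%08x%n", out[0]); // 80ff0000
    }
}
```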
Most of the videos in my game have a transparent background. Such a transparent video is an MP4 file with two video streams: one stores h264rgb-encoded RGB data and the other stores h264-encoded alpha data. To play an alpha video I decode both streams, merge them, and then upload the result to the GPU. So my game can play several transparent HD videos on top of an opaque HD video at the same time.

I found a very simple and straightforward solution. The library plays streaming media by wrapping ffmpeg in Java. How do you use it? First, download and install the library with Maven or Gradle. Here is a
StreamingClient class, which calls a SimplePlayer class with a thread that plays the video:
public class StreamingClient extends Application implements GrabberListener
{
    public static void main(String[] args)
    {
        launch(args);
    }

    private Stage primaryStage;
    private ImageView imageView;
    private SimplePlayer simplePlayer;

    @Override
    public void start(Stage stage) throws Exception
    {
        String source = "rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov"; // the video is weird for 1 minute then becomes stable

        primaryStage = stage;
        imageView = new ImageView();

        StackPane root = new StackPane();
        root.getChildren().add(imageView);

        imageView.fitWidthProperty().bind(primaryStage.widthProperty());
        imageView.fitHeightProperty().bind(primaryStage.heightProperty());

        Scene scene = new Scene(root, 640, 480);
        primaryStage.setTitle("Streaming Player");
        primaryStage.setScene(scene);
        primaryStage.show();

        simplePlayer = new SimplePlayer(source, this);
    }

    @Override
    public void onMediaGrabbed(int width, int height)
    {
        primaryStage.setWidth(width);
        primaryStage.setHeight(height);
    }

    @Override
    public void onImageProcessed(Image image)
    {
        LogHelper.e(TAG, "image: " + image);
        Platform.runLater(() -> {
            imageView.setImage(image);
        });
    }

    @Override
    public void onPlaying() {}

    @Override
    public void onGainControl(FloatControl gainControl) {}

    @Override
    public void stop() throws Exception
    {
        simplePlayer.stop();
    }
}
The SimplePlayer class uses FFmpegFrameGrabber to decode the frames, which are converted to images and displayed in the background:
public class SimplePlayer
{
    private static volatile Thread playThread;
    private AnimationTimer timer;
    private SourceDataLine soundLine;
    private int counter;

    public SimplePlayer(String source, GrabberListener grabberListener)
    {
        if (grabberListener == null) return;
        if (source.isEmpty()) return;

        counter = 0;

        playThread = new Thread(() -> {
            try {
                FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(source);
                grabber.start();

                grabberListener.onMediaGrabbed(grabber.getImageWidth(), grabber.getImageHeight());

                if (grabber.getSampleRate() > 0 && grabber.getAudioChannels() > 0) {
                    AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);
                    DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
                    soundLine = (SourceDataLine) AudioSystem.getLine(info);
                    soundLine.open(audioFormat);
                    soundLine.start();
                }

                Java2DFrameConverter converter = new Java2DFrameConverter();

                while (!Thread.interrupted()) {
                    Frame frame = grabber.grab();
                    if (frame == null) {
                        break;
                    }
                    if (frame.image != null) {
                        Image image = SwingFXUtils.toFXImage(converter.convert(frame), null);
                        Platform.runLater(() -> {
                            grabberListener.onImageProcessed(image);
                        });
                    } else if (frame.samples != null) {
                        // Pack the 16-bit samples into a big-endian byte buffer...
                        ShortBuffer channelSamplesShortBuffer = (ShortBuffer) frame.samples[0];
                        channelSamplesShortBuffer.rewind();

                        ByteBuffer outBuffer = ByteBuffer.allocate(channelSamplesShortBuffer.capacity() * 2);
                        for (int i = 0; i < channelSamplesShortBuffer.capacity(); i++) {
                            short val = channelSamplesShortBuffer.get(i);
                            outBuffer.putShort(val);
                        }

                        // ...and feed them to the sound line so the audio is played
                        if (soundLine != null) {
                            soundLine.write(outBuffer.array(), 0, outBuffer.capacity());
                        }
                    }
                }

                grabber.stop();
                grabber.release();
                Platform.exit();
            } catch (Exception exception) {
                System.exit(1);
            }
        });
        playThread.start();
    }

    public void stop()
    {
        playThread.interrupt();
    }
}
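The audio branch above packs the 16-bit samples into a big-endian byte array before writing them to the SourceDataLine, matching the AudioFormat created with bigEndian = true. That conversion can be exercised on its own; the class name, helper, and sample values here are mine, for illustration:

```java
import java.nio.ByteBuffer;
import java.nio.ShortBuffer;

public class SampleConversionDemo {
    // Convert 16-bit PCM samples to the big-endian byte layout expected by
    // an AudioFormat constructed with bigEndian = true.
    static byte[] toBigEndianBytes(short[] samples) {
        ByteBuffer out = ByteBuffer.allocate(samples.length * 2); // big-endian by default
        ShortBuffer in = ShortBuffer.wrap(samples);
        while (in.hasRemaining()) {
            out.putShort(in.get());
        }
        return out.array();
    }

    public static void main(String[] args) {
        byte[] bytes = toBigEndianBytes(new short[] { 0x1234, -1 });
        for (byte b : bytes) {
            System.out.printf("%02x ", b); // 12 34 ff ff
        }
        System.out.println();
    }
}
```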
JMF was abandoned and has been dead for years, so relying on it in a long-term project is not a good idea. But if this is a one-off thing, I agree JMF is a fine solution, although I believe JMF only supports H.263. If JMF is dead, what can be used as its replacement?