FFmpeg on Android
I have compiled FFmpeg (libffmpeg.so) on Android. Now I have to build an application like RockPlayer, or use the existing Android multimedia framework to invoke FFmpeg.

Do you have steps / procedures / code / an example for integrating FFmpeg on Android / StageFright?

Can you tell me how I can use this library for multimedia playback?

I have a requirement where I already have audio and video transport streams, which I need to feed into FFmpeg and have it decode/render them. Since the IOMX API is OMX-based and FFmpeg cannot be plugged in there, how can I do this on Android?
Here are the steps I went through to get ffmpeg working on Android:

Run make. You will also need to pull bionic (libc) and zlib (libz) out of the Android build, since the ffmpeg libraries depend on them:

LOCAL_STATIC_LIBRARIES := libavcodec libavformat libavutil libc libz
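An Android.mk along these lines would pull in those static libraries (a minimal sketch; the module name and the JNI source file name are placeholders, not from the original answer):

```makefile
# Sketch of an NDK module that links the prebuilt ffmpeg static libraries
# together with bionic (libc) and zlib (libz) from the Android build.
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg_jni          # placeholder module name
LOCAL_SRC_FILES := ffmpeg_jni.c     # placeholder JNI wrapper source
LOCAL_STATIC_LIBRARIES := libavcodec libavformat libavutil libc libz
include $(BUILD_SHARED_LIBRARY)
```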
Good luck :)

I made a small project that configures and builds X264 and FFMPEG using the Android NDK. The main thing missing is a decent JNI interface to make it accessible via Java, but that is the relatively easy part. When I get around to making the JNI interface good for my own uses, I'll push that in.

The benefit over olvaffe's build system is that it doesn't require Android.mk files to build the libraries; it just uses the regular makefiles and the toolchain. This makes it much less likely to stop working when you pull in new changes from FFMPEG or X264.
For various reasons, multimedia was never, and still isn't, easy to get done without compromising efficiency. ffmpeg is an effort that improves day by day. It supports codecs and containers of many different formats.

Now, to answer the question of how to use this library: I would say that it is not so simple to write it all up here, but I can guide you as follows.

1) In the ffmpeg source directory you have output_example.c or api_example.c. There you can see the code where encoding/decoding is done, and you will learn which APIs in ffmpeg you should call. This should be your first step.

2) Dolphin Player is an open-source project for Android. It currently has bugs, but the developers are working on it continuously. In that project you have the whole setup ready, which you can use to continue your investigation. Here is the link to the project from code.google.com, or run the "git clone" command in a terminal. You can see two projects named P and P86; you can use either of them.

An extra tip I would like to offer: when you build the ffmpeg code, in build.sh you need to enable the muxers/demuxers/encoders/decoders of the formats you want to use. Otherwise the corresponding code will not be included in the libraries. It took me a lot of time to realize this, so I thought I would share it with you.

A few basics:

When we say a video file, e.g. AVI, it is a combination of audio and video.

Video file = Video + Audio
Video = Codec + Muxer + Demuxer
Codec = Encoder + Decoder
=> Video = Encoder + Decoder + Muxer + Demuxer (e.g. Mpeg4 + Mpeg4 + avi + avi, for the AVI container)

Audio = Codec + Muxer + Demuxer
Codec = Encoder + Decoder
=> Audio = Encoder + Decoder + Muxer + Demuxer (e.g. mp2 + mp2 + avi + avi, for the AVI container)
A codec (the name comes from a combination of en*co*der / *dec*oder) is just the part of a format that defines the algorithm used to encode/decode frames. AVI is not a codec; it is a container that uses the Mpeg4 video codec and the mp2 audio codec.

A muxer/demuxer is used to merge/separate frames into/from the file used when encoding/decoding.

So if you want to use the AVI format, you need to enable the video components + the audio components.

For AVI, for example, you need to enable the following: mpeg4 encoder, mpeg4 decoder, mp2 encoder, mp2 decoder, avi muxer, avi demuxer.

PHEWWWWWWWW

In terms of code, build.sh should contain:
--enable-muxer=avi --enable-demuxer=avi (generic for both audio/video; generally specific to a container)
--enable-encoder=mpeg4 --enable-decoder=mpeg4 (for video support)
--enable-encoder=mp2 --enable-decoder=mp2 (for audio support)
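The flags above can be sketched as a single configure invocation (a minimal example; the `--disable-everything` baseline and the file protocol are my assumptions, and a real Android build also needs the NDK cross-compile flags, which are omitted here):

```shell
# Sketch: build ffmpeg with only AVI container + MPEG-4 video + MP2 audio support.
./configure \
  --disable-everything \
  --enable-muxer=avi     --enable-demuxer=avi \
  --enable-encoder=mpeg4 --enable-decoder=mpeg4 \
  --enable-encoder=mp2   --enable-decoder=mp2 \
  --enable-protocol=file
make
```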
I hope I haven't confused you even more after all this.
Thanks; if you need any help, please let me know.

The easiest-to-build, easiest-to-use implementation I found was made by the Guardian Project team. Inspired by the many other FFmpeg-on-Android implementations out there (most of them poorly supported), I found a solution (lame and FFmpeg).

To invoke FFmpeg:
new Thread(new Runnable() {
    @Override
    public void run() {
        Looper.prepare();
        FfmpegController ffmpeg = null;
        try {
            ffmpeg = new FfmpegController(context);
        } catch (IOException ioe) {
            Log.e(DEBUG_TAG, "Error loading ffmpeg. " + ioe.getMessage());
        }
        ShellDummy shell = new ShellDummy();
        String mp3BitRate = "192";
        try {
            ffmpeg.extractAudio(in, out, audio, mp3BitRate, shell);
        } catch (IOException e) {
            Log.e(DEBUG_TAG, "IOException running ffmpeg " + e.getMessage());
        } catch (InterruptedException e) {
            Log.e(DEBUG_TAG, "InterruptedException running ffmpeg " + e.getMessage());
        }
        Looper.loop();
    }
}).start();
And to handle the console output:
private class ShellDummy implements ShellCallback {

    @Override
    public void shellOut(String shellLine) {
        if (someCondition) {
            doSomething(shellLine);
        }
        Utils.logger("d", shellLine, DEBUG_TAG);
    }

    @Override
    public void processComplete(int exitValue) {
        if (exitValue == 0) {
            // Audio job OK, do your stuff:
            // e.g. write id3 tags, call the media scanner, etc.
        }
    }

    @Override
    public void processNotStartedCheck(boolean started) {
        if (!started) {
            // Audio job error, as above.
        }
    }
}
Strangely, this project hasn't been mentioned:
For lazy people like me, it has very detailed step-by-step instructions you can copy/paste into the command line.
implementation 'com.writingminds:FFmpegAndroid:0.3.2'
FFmpeg ffmpeg;

private void trimVideo(ProgressDialog progressDialog) {
    outputAudioMux = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES).getAbsolutePath()
            + "/VidEffectsFilter" + "/" + new SimpleDateFormat("ddMMyyyy_HHmmss").format(new Date())
            + "filter_apply.mp4";

    if (startTrim.equals("")) {
        startTrim = "00:00:00";
    }
    if (endTrim.equals("")) {
        endTrim = timeTrim(player.getDuration());
    }

    String[] cmd = new String[]{"-ss", startTrim + ".00", "-t", endTrim + ".00", "-noaccurate_seek", "-i", videoPath, "-codec", "copy", "-avoid_negative_ts", "1", outputAudioMux};
    execFFmpegBinary1(cmd, progressDialog);
}
private void execFFmpegBinary1(final String[] command, ProgressDialog prpg) {
    final ProgressDialog progressDialog = prpg;
    try {
        ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
            @Override
            public void onFailure(String s) {
                progressDialog.dismiss();
                Toast.makeText(PlayerTestActivity.this, "Failed to generate video", Toast.LENGTH_SHORT).show();
                Log.d(TAG, "FAILED with output : " + s);
            }

            @Override
            public void onSuccess(String s) {
                Log.d(TAG, "SUCCESS with output : " + s);
                String finalPath = outputAudioMux;
                videoPath = outputAudioMux;
                Toast.makeText(PlayerTestActivity.this, "Storage Path = " + finalPath, Toast.LENGTH_SHORT).show();
                Intent intent = new Intent(PlayerTestActivity.this, ShareVideoActivity.class);
                intent.putExtra("pathGPU", finalPath);
                startActivity(intent);
                finish();
                MediaScannerConnection.scanFile(PlayerTestActivity.this, new String[]{finalPath}, new String[]{"video/mp4"}, null);
            }

            @Override
            public void onProgress(String s) {
                Log.d(TAG, "Progress : " + s);
                progressDialog.setMessage("Please wait, trimming video...");
            }

            @Override
            public void onStart() {
                Log.d(TAG, "Started command : ffmpeg " + Arrays.toString(command));
            }

            @Override
            public void onFinish() {
                Log.d(TAG, "Finished command : ffmpeg " + Arrays.toString(command));
                progressDialog.dismiss();
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        // do nothing for now
    }
}
private void loadFFMpegBinary() {
    try {
        if (ffmpeg == null) {
            ffmpeg = FFmpeg.getInstance(this);
        }
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onFailure() {
                showUnsupportedExceptionDialog();
            }

            @Override
            public void onSuccess() {
                Log.d("dd", "ffmpeg : loaded correctly");
            }
        });
    } catch (FFmpegNotSupportedException e) {
        showUnsupportedExceptionDialog();
    } catch (Exception e) {
        Log.d("dd", "Unhandled exception : " + e);
    }
}
private void showUnsupportedExceptionDialog() {
    new AlertDialog.Builder(this)
            .setIcon(android.R.drawable.ic_dialog_alert)
            .setTitle("Not Supported")
            .setMessage("Device Not Supported")
            .setCancelable(false)
            .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                @Override
                public void onClick(DialogInterface dialog, int which) {
                    finish();
                }
            })
            .create()
            .show();
}
public String timeTrim(long milliseconds) {
    String finalTimerString = "";
    String minuteString = "";
    String secondsString = "";

    // Convert the total duration into hours, minutes and seconds
    int hours = (int) (milliseconds / (1000 * 60 * 60));
    int minutes = (int) (milliseconds % (1000 * 60 * 60)) / (1000 * 60);
    int seconds = (int) ((milliseconds % (1000 * 60 * 60)) % (1000 * 60) / 1000);

    // Prepend 0 to hours if it is one digit
    if (hours < 10) {
        finalTimerString = "0" + hours + ":";
    } else {
        finalTimerString = hours + ":";
    }

    // Prepend 0 to minutes if it is one digit
    if (minutes < 10) {
        minuteString = "0" + minutes;
    } else {
        minuteString = "" + minutes;
    }

    // Prepend 0 to seconds if it is one digit
    if (seconds < 10) {
        secondsString = "0" + seconds;
    } else {
        secondsString = "" + seconds;
    }

    finalTimerString = finalTimerString + minuteString + ":" + secondsString;
    return finalTimerString;
}
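The conversion above can be condensed with String.format, which handles the zero-padding in one step. A minimal standalone sketch (class and method names are mine, not from the original answer):

```java
public class TimeFormat {
    // Same conversion as timeTrim() above: a millisecond duration
    // formatted as zero-padded HH:MM:SS.
    static String hhmmss(long ms) {
        long hours = ms / 3_600_000;
        long minutes = (ms % 3_600_000) / 60_000;
        long seconds = (ms % 60_000) / 1_000;
        return String.format("%02d:%02d:%02d", hours, minutes, seconds);
    }

    public static void main(String[] args) {
        // 3,725,000 ms = 1 h, 2 min, 5 s
        System.out.println(hhmmss(3_725_000));
    }
}
```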
===> Merge audio into video
String[] cmd = new String[]{"-i", yourRealPath, "-i", arrayList.get(posmusic).getPath(), "-map", "1:a", "-map", "0:v", "-codec", "copy", "-shortest", outputcrop};
===> Flip vertically:
String[] cm = new String[]{"-i", yourRealPath, "-vf", "vflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};
===> Flip horizontally:
String[] cm = new String[]{"-i", yourRealPath, "-vf", "hflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};
===> Rotate 90 degrees clockwise:
String[] cm=new String[]{"-i", yourRealPath, "-c", "copy", "-metadata:s:v:0", "rotate=90", outputcrop1};
===> Compress video
String[] complexCommand = {"-y", "-i", yourRealPath, "-strict", "experimental", "-vcodec", "libx264", "-preset", "ultrafast", "-crf", "24", "-acodec", "aac", "-ar", "22050", "-ac", "2", "-b", "360k", "-s", "1280x720", outputcrop1};
===> Speed the video up or down
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=2.0*PTS[v];[0:a]atempo=0.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=1.0*PTS[v];[0:a]atempo=1.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.75*PTS[v];[0:a]atempo=1.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.5*PTS[v];[0:a]atempo=2.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
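In the four commands above, the setpts and atempo factors are reciprocals of each other: setpts scales the video timestamps by 1/speed while atempo scales the audio tempo by speed. A small sketch that derives the filter string from a single speed factor (class and method names are mine; note that ffmpeg's atempo filter only accepts factors between 0.5 and 2.0 per stage):

```java
import java.util.Locale;

public class SpeedFilter {
    // Build the -filter_complex argument for a given playback speed.
    // speed = 2.0 plays twice as fast: video timestamps are scaled by
    // 1/speed (setpts) and audio tempo by speed (atempo).
    static String filterFor(double speed) {
        return String.format(Locale.US,
                "[0:v]setpts=%.2f*PTS[v];[0:a]atempo=%.2f[a]",
                1.0 / speed, speed);
    }

    public static void main(String[] args) {
        // Double speed: half the timestamps, twice the tempo.
        System.out.println(filterFor(2.0));
    }
}
```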
===> Concatenate two mp3 files
StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0]concat=n=2:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()
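One caveat with building the command via split(" "): any file path that contains a space gets broken into several arguments. A sketch that builds the argument array directly avoids this (class, method, and file names here are placeholders):

```java
import java.util.Arrays;

public class ConcatCmd {
    // Build the ffmpeg argument array for concatenating two audio files,
    // without going through a single space-joined string. Paths with
    // spaces stay intact as single arguments.
    static String[] concatTwo(String first, String second, String out) {
        return new String[]{
                "-i", first,
                "-i", second,
                "-filter_complex", "[0:0][1:0]concat=n=2:v=0:a=1[out]",
                "-map", "[out]",
                out
        };
    }

    public static void main(String[] args) {
        String[] cmd = concatTwo("my song.mp3", "voice over.mp3", "mixed.mp3");
        System.out.println(Arrays.toString(cmd));
    }
}
```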
===> Concatenate three mp3 files
StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(firstSngname);
sb.append(" -i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0][2:0]concat=n=3:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()