Android: two VideoViews, one MediaController

I want to display two VideoViews side by side. I'd rather not implement a custom MediaController, because the default one works well, but no matter what I try I cannot control both videos at the same time:

    val mediaController = MediaController(requireContext())
    mediaController.setAnchorView(videoViewF)
    videoViewF.setMediaController(mediaController)
    videoViewR.setMediaController(mediaController)
How can I achieve this? Can I get a callback from the MediaController, or from the first VideoView, when the progress is changed / paused / resumed? Or is there another way?

Dual VideoView for playing 3gp videos in Android: this example explains how to include two VideoViews in a layout so that two different 3gp files play at the same time.

Algorithm:

1.) Create a new project via File -> New -> Android Project and give it a name.

2.) Put the following into main.xml:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical" >

    <TextView
        android:layout_width="fill_parent"
        android:layout_height="wrap_content"
        android:text="Dual VideoView" />

    <LinearLayout
        android:orientation="vertical"
        android:layout_width="fill_parent"
        android:layout_height="match_parent">

        <VideoView
            android:id="@+id/myvideoview"
            android:layout_width="fill_parent"
            android:layout_height="wrap_content" />

        <VideoView
            android:id="@+id/myvideoview2"
            android:layout_width="fill_parent"
            android:layout_height="wrap_content" />
    </LinearLayout>
</LinearLayout>
3.) Compile and build the project.

Note: You can also stream the 3gp files directly from the internet instead of loading them from the raw folder as shown in this example.
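For reference, the tutorial's Java step is not reproduced above, so here is a minimal sketch of what the accompanying Activity might look like. The class name and the R.raw.video_a / R.raw.video_b resources are placeholders for whatever 3gp files you bundle:

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class DualVideoViewActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        // Each VideoView gets its own 3gp file from res/raw; an http URL can be
        // passed to setVideoURI() instead to stream directly from the internet.
        VideoView first = (VideoView) findViewById(R.id.myvideoview);
        VideoView second = (VideoView) findViewById(R.id.myvideoview2);

        first.setVideoURI(Uri.parse(
                "android.resource://" + getPackageName() + "/" + R.raw.video_a));
        second.setVideoURI(Uri.parse(
                "android.resource://" + getPackageName() + "/" + R.raw.video_b));

        // Calling start() before the players are prepared is fine: VideoView
        // remembers the request and begins playback once each video is ready.
        first.start();
        second.start();
    }
}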

You haven't given much detail about what exactly you've tried and which part is giving you trouble, so I just put together a small test to see whether I could reproduce what you describe.

I don't have any conclusive findings, but I can at least confirm that my Galaxy Nexus (Android 4.0.2) plays three videos simultaneously without any problems. On the other hand, an old Samsung Galaxy Spica (Android 2.1-update1) I have lying around only plays a single file at a time; it always appears to be the first SurfaceView.

I looked into the different API levels a bit further by setting up emulators for Android 3.0, 2.3.3 and 2.2. All of these platforms appear to handle playback of multiple video files on different surface views just fine. I did one final test with an emulator running 2.1-update1 and, interestingly, unlike the actual phone, it also played the test case without problems. I did notice some slight differences in how the layouts were rendered, though.

This behaviour leads me to suspect that there is no real software limitation on what you're after; rather, it appears to depend on whether the hardware supports playing back multiple video files at the same time. Support for this scenario will therefore differ from device to device. From an empirical point of view, it would certainly be interesting to test this assumption on more physical devices.
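Since support apparently varies per device, one small hardening step (not part of the test code below, just a suggestion) is to register an OnErrorListener on each player, so that a device which cannot open several decoders at once fails gracefully rather than with a hard error:

// Could be called wherever each MediaPlayer is created (e.g. in surfaceCreated()).
// TAG is the same log tag used by the test code below.
private void attachErrorHandler(final MediaPlayer player, final int index) {
    player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
        @Override
        public boolean onError(MediaPlayer mp, int what, int extra) {
            Log.w(TAG, "MediaPlayer(" + index + "): error what=" + what + " extra=" + extra
                    + "; this device may not support multiple simultaneous decoders");
            mp.reset();  // back to the idle state; the other players keep going
            return true; // consume the error so OnCompletionListener is not invoked
        }
    });
}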

Some details about the implementation, just for reference:

- I set up two slightly different implementations: one based on three MediaPlayer instances in a single Activity, and one in which they are split into three separate fragments, each with its own MediaPlayer object. (I didn't find any playback differences between the two implementations, by the way.)
- A single 3gp file in the assets folder (thanks for that, Apple) was used for playback by all players.
- The code for both implementations is attached below and is largely based on Google's MediaPlayerDemo_Video sample implementation; I stripped out some code that wasn't needed for the actual test. The result is by no means complete or suitable for use in live apps.

Activity-based implementation:

public class MultipleVideoPlayActivity extends Activity implements
OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {

private static final String TAG = "MediaPlayer";
private static final int[] SURFACE_RES_IDS = { R.id.video_1_surfaceview, R.id.video_2_surfaceview, R.id.video_3_surfaceview };

private MediaPlayer[] mMediaPlayers = new MediaPlayer[SURFACE_RES_IDS.length];
private SurfaceView[] mSurfaceViews = new SurfaceView[SURFACE_RES_IDS.length];
private SurfaceHolder[] mSurfaceHolders = new SurfaceHolder[SURFACE_RES_IDS.length];
private boolean[] mSizeKnown = new boolean[SURFACE_RES_IDS.length];
private boolean[] mVideoReady = new boolean[SURFACE_RES_IDS.length];

@Override public void onCreate(Bundle icicle) {
    super.onCreate(icicle);
    setContentView(R.layout.multi_videos_layout);

    // create surface holders
    for (int i=0; i<mSurfaceViews.length; i++) {
        mSurfaceViews[i] = (SurfaceView) findViewById(SURFACE_RES_IDS[i]);
        mSurfaceHolders[i] = mSurfaceViews[i].getHolder();
        mSurfaceHolders[i].addCallback(this);
        mSurfaceHolders[i].setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }
}

public void onBufferingUpdate(MediaPlayer player, int percent) {
    Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onBufferingUpdate percent: " + percent);
}

public void onCompletion(MediaPlayer player) {
    Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onCompletion called");
}

public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
    Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): onVideoSizeChanged called");
    if (width == 0 || height == 0) {
        Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
        return;
    }

    int index = indexOf(player);
    if (index == -1) return; // sanity check; should never happen
    mSizeKnown[index] = true;
    if (mVideoReady[index] && mSizeKnown[index]) {
        startVideoPlayback(player);
    }
}

public void onPrepared(MediaPlayer player) {
    Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onPrepared called");

    int index = indexOf(player);
    if (index == -1) return; // sanity check; should never happen
    mVideoReady[index] = true;
    if (mVideoReady[index] && mSizeKnown[index]) {
        startVideoPlayback(player);
    }
}

public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
    Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceChanged called");
}

public void surfaceDestroyed(SurfaceHolder holder) {
    Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceDestroyed called");
}


public void surfaceCreated(SurfaceHolder holder) {
    Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceCreated called");

    int index = indexOf(holder);
    if (index == -1) return; // sanity check; should never happen
    try { 
        mMediaPlayers[index] = new MediaPlayer();
        AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
        mMediaPlayers[index].setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength()); 
        mMediaPlayers[index].setDisplay(mSurfaceHolders[index]);
        mMediaPlayers[index].prepare();
        mMediaPlayers[index].setOnBufferingUpdateListener(this);
        mMediaPlayers[index].setOnCompletionListener(this);
        mMediaPlayers[index].setOnPreparedListener(this);
        mMediaPlayers[index].setOnVideoSizeChangedListener(this);
        mMediaPlayers[index].setAudioStreamType(AudioManager.STREAM_MUSIC);
    }
    catch (Exception e) { e.printStackTrace(); }
}

@Override protected void onPause() {
    super.onPause();
    releaseMediaPlayers();
}

@Override protected void onDestroy() {
    super.onDestroy();
    releaseMediaPlayers();
}

private void releaseMediaPlayers() {
    for (int i=0; i<mMediaPlayers.length; i++) {
        if (mMediaPlayers[i] != null) {
            mMediaPlayers[i].release();
            mMediaPlayers[i] = null;
        }
    }
}


private void startVideoPlayback(MediaPlayer player) {
    Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): startVideoPlayback");
    player.start();
}

private int indexOf(MediaPlayer player) {
    for (int i=0; i<mMediaPlayers.length; i++) if (mMediaPlayers[i] == player) return i;
    return -1;  
}

private int indexOf(SurfaceHolder holder) {
    for (int i=0; i<mSurfaceHolders.length; i++) if (mSurfaceHolders[i] == holder) return i;
    return -1;  
}
}

R.layout.multi_videos_layout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent" android:layout_height="match_parent"
android:orientation="vertical">

<SurfaceView android:id="@+id/video_1_surfaceview"
    android:layout_width="fill_parent" android:layout_height="0dp"
    android:layout_weight="1" />

<SurfaceView android:id="@+id/video_2_surfaceview"
    android:layout_width="fill_parent" android:layout_height="0dp"
    android:layout_weight="1" />

<SurfaceView android:id="@+id/video_3_surfaceview"
    android:layout_width="fill_parent" android:layout_height="0dp"
    android:layout_weight="1" />

 </LinearLayout>

Fragment-based implementation:

public class MultipleVideoPlayFragmentActivity extends FragmentActivity {

private static final String TAG = "MediaPlayer";

@Override public void onCreate(Bundle icicle) {
    super.onCreate(icicle);
    setContentView(R.layout.multi_videos_activity_layout);
}

public static class VideoFragment extends Fragment implements
    OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {

    private MediaPlayer mMediaPlayer;
    private SurfaceView mSurfaceView;
    private SurfaceHolder mSurfaceHolder;
    private boolean mSizeKnown;
    private boolean mVideoReady;

    @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        return inflater.inflate(R.layout.multi_videos_fragment_layout, container, false);
    }

    @Override public void onActivityCreated(Bundle savedInstanceState) {
        super.onActivityCreated(savedInstanceState);
        mSurfaceView = (SurfaceView) getView().findViewById(R.id.video_surfaceview);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    public void onBufferingUpdate(MediaPlayer player, int percent) {
        Log.d(TAG, "onBufferingUpdate percent: " + percent);
    }

    public void onCompletion(MediaPlayer player) {
        Log.d(TAG, "onCompletion called");
    }

    public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
        Log.v(TAG, "onVideoSizeChanged called");
        if (width == 0 || height == 0) {
            Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
            return;
        }

        mSizeKnown = true;
        if (mVideoReady && mSizeKnown) {
            startVideoPlayback();
        }
    }

    public void onPrepared(MediaPlayer player) {
        Log.d(TAG, "onPrepared called");

        mVideoReady = true;
        if (mVideoReady && mSizeKnown) {
            startVideoPlayback();
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
        Log.d(TAG, "surfaceChanged called");
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "surfaceDestroyed called");
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "surfaceCreated called");

        try { 
            mMediaPlayer = new MediaPlayer();
            AssetFileDescriptor afd = getActivity().getAssets().openFd("sample.3gp");
            mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength()); 
            mMediaPlayer.setDisplay(mSurfaceHolder);
            mMediaPlayer.prepare();
            mMediaPlayer.setOnBufferingUpdateListener(this);
            mMediaPlayer.setOnCompletionListener(this);
            mMediaPlayer.setOnPreparedListener(this);
            mMediaPlayer.setOnVideoSizeChangedListener(this);
            mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        }
        catch (Exception e) { e.printStackTrace(); }
    }

    @Override public void onPause() {
        super.onPause();
        releaseMediaPlayer();
    }

    @Override public void onDestroy() {
        super.onDestroy();
        releaseMediaPlayer();
    }

    private void releaseMediaPlayer() {
        if (mMediaPlayer != null) {
            mMediaPlayer.release();
            mMediaPlayer = null;
        }
    }

    private void startVideoPlayback() {
        Log.v(TAG, "startVideoPlayback");
        mMediaPlayer.start();
    }
}
}

R.layout.multi_videos_activity_layout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent" android:layout_height="match_parent"
android:orientation="vertical">

<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
    android:id="@+id/video_1_fragment" android:layout_width="fill_parent"
    android:layout_height="0dp" android:layout_weight="1" />

<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
    android:id="@+id/video_2_fragment" android:layout_width="fill_parent"
    android:layout_height="0dp" android:layout_weight="1" />

<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
    android:id="@+id/video_3_fragment" android:layout_width="fill_parent"
    android:layout_height="0dp" android:layout_weight="1" />

</LinearLayout>

R.layout.multi_videos_fragment_layout:

<?xml version="1.0" encoding="utf-8"?>
<SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/video_surfaceview" android:layout_width="fill_parent"
android:layout_height="fill_parent" />


Update: Although it has been around for a while now, I think it's worth pointing out that Google's Grafika project demonstrates a feature that "decodes two video streams simultaneously to two TextureViews". I'm not sure how well it scales to more than two video files, but it is nevertheless relevant to the original question.
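For context, the building block behind that demo is rendering a MediaPlayer into a TextureView through its SurfaceTexture. A minimal, hedged sketch (the asset name sample.3gp matches the test code above; the class itself is purely illustrative):

import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.view.Surface;
import android.view.TextureView;

// Attaches one MediaPlayer to one TextureView; doing this twice, with two
// TextureViews in the layout, gives the "double decode" setup.
public class TextureVideoStarter implements TextureView.SurfaceTextureListener {

    private final Context context;
    private MediaPlayer player;

    public TextureVideoStarter(Context context, TextureView textureView) {
        this.context = context;
        textureView.setSurfaceTextureListener(this);
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
        try {
            player = new MediaPlayer();
            AssetFileDescriptor afd = context.getAssets().openFd("sample.3gp");
            player.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            player.setSurface(new Surface(texture)); // render into the TextureView
            player.prepare();
            player.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
        if (player != null) {
            player.release();
            player = null;
        }
        return true;
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) { }

    @Override public void onSurfaceTextureUpdated(SurfaceTexture texture) { }
}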

One option might be to subclass the existing MediaController so that it holds two MediaController objects internally (each associated with a different VideoView), and then route every command of the main controller to both internal ones. That would only work if MediaController has no static dependencies, so check that first.

That's a good idea, but it is hardly simpler than building a custom media controller. I suspect the main controller object would quickly run into situations where it depends on the state of the child controllers, so you would effectively end up writing a custom controller implementation anyway. The only "win" would be that the default behaviour still works "as expected".

Hi @Dim, did any of the answers work for your project or help you? I'm waiting for feedback.
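A related approach (a sketch only, not something proposed in the thread above) is to keep a single stock MediaController and hand it a custom MediaController.MediaPlayerControl that forwards every command to both VideoViews. The class and field names are illustrative:

import android.widget.MediaController;
import android.widget.VideoView;

// One MediaController drives both VideoViews: play/pause/seek are forwarded to
// both, while position and duration are read from the first ("master") view.
public class DualVideoControl implements MediaController.MediaPlayerControl {

    private final VideoView master;
    private final VideoView slave;

    public DualVideoControl(VideoView master, VideoView slave) {
        this.master = master;
        this.slave = slave;
    }

    @Override public void start() { master.start(); slave.start(); }
    @Override public void pause() { master.pause(); slave.pause(); }
    @Override public void seekTo(int pos) { master.seekTo(pos); slave.seekTo(pos); }

    @Override public int getDuration() { return master.getDuration(); }
    @Override public int getCurrentPosition() { return master.getCurrentPosition(); }
    @Override public boolean isPlaying() { return master.isPlaying(); }
    @Override public int getBufferPercentage() { return master.getBufferPercentage(); }
    @Override public boolean canPause() { return master.canPause() && slave.canPause(); }
    @Override public boolean canSeekBackward() { return master.canSeekBackward() && slave.canSeekBackward(); }
    @Override public boolean canSeekForward() { return master.canSeekForward() && slave.canSeekForward(); }
    @Override public int getAudioSessionId() { return master.getAudioSessionId(); }
}

The controller would then be wired up once, for example mediaController.setAnchorView(videoViewF) followed by mediaController.setMediaPlayer(new DualVideoControl(videoViewF, videoViewR)), instead of calling setMediaController() on each VideoView. Whether the stock controller's show/hide behaviour is acceptable in this setup is something that would need testing.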