
Android Camera2: Hit timeout for jpeg callback


I am building an app on Android 5.0.2 using the new camera API (camera2). The app takes a picture every 2.5 seconds for 3 hours (4320 pictures in total). As you can see in the code below, I implemented the repetition with a Timer and there is no reference to a preview. I am testing on a Nexus 7 2013 16G running 5.0.2. It works fine for the first 200-300 pictures, then fails with the error messages below. The failure always starts with "E/RequestThread-1: Hit timeout for jpeg callback!", so something must be triggering it. Can anyone help me get rid of this trigger? Or will this be fixed in 5.1.0, if it is an Android bug?

03-30 15:46:04.472  11432-11432/com.example.android.camera2basic V/yo click﹕ ----   174 ---- click
03-30 15:46:05.026  11432-11537/com.example.android.camera2basic E/RequestThread-1﹕ Hit timeout for jpeg callback!
03-30 15:46:05.027  11432-11537/com.example.android.camera2basic W/CaptureCollector﹕ Jpeg buffers dropped for request: 173
03-30 15:46:05.076  11432-11480/com.example.android.camera2basic E/CameraDevice-JV-1﹕ Lost output buffer reported for frame 173
03-30 15:46:05.090  11432-11537/com.example.android.camera2basic W/LegacyRequestMapper﹕ convertRequestMetadata - control.awbRegions setting is not supported, ignoring value
03-30 15:46:05.090  11432-11537/com.example.android.camera2basic W/LegacyRequestMapper﹕ Only received metering rectangles with weight 0.
03-30 15:46:05.091  11432-11537/com.example.android.camera2basic W/LegacyMetadataMapper﹕ convertAfModeToLegacy - ignoring unsupported mode 4, defaulting to fixed
03-30 15:46:05.091  11432-11537/com.example.android.camera2basic W/LegacyRequestMapper﹕ convertRequestToMetadata - Ignoring android.lens.focusDistance false, only 0.0f is supported
03-30 15:46:05.098  11432-11537/com.example.android.camera2basic E/AndroidRuntime﹕ FATAL EXCEPTION: RequestThread-1
Process: com.example.android.camera2basic, PID: 11432
java.lang.RuntimeException: startPreview failed
        at android.hardware.Camera.startPreview(Native Method)
        at android.hardware.camera2.legacy.RequestThreadManager.startPreview(RequestThreadManager.java:275)
        at android.hardware.camera2.legacy.RequestThreadManager.doJpegCapturePrepare(RequestThreadManager.java:288)
        at android.hardware.camera2.legacy.RequestThreadManager.access$1700(RequestThreadManager.java:61)
        at android.hardware.camera2.legacy.RequestThreadManager$5.handleMessage(RequestThreadManager.java:767)
        at android.os.Handler.dispatchMessage(Handler.java:98)
        at android.os.Looper.loop(Looper.java:135)
        at android.os.HandlerThread.run(HandlerThread.java:61)
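
Side note: the LegacyRequestMapper / RequestThreadManager frames in the log mean the capture is going through camera2's LEGACY compatibility layer on this Nexus 7, i.e. each camera2 request gets translated onto the old android.hardware.Camera API. A minimal sketch for checking this at runtime (a hypothetical helper, not part of the code below) would be:

    private boolean isLegacyCamera(String cameraId) throws CameraAccessException {
        CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
        Integer level = chars.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
        // LEGACY means camera2 is emulated on top of the old Camera API,
        // which is where the LegacyRequestMapper warnings above come from.
        return level != null
                && level == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY;
    }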
Here is my code:

public class CameraActivity extends Activity {

    Timer mTimer = null;
    Handler mHandler = new Handler();

    private ImageReader imageReader;
    private Handler backgroundHandler;
    private HandlerThread backgroundThread;
    private String cameraId;
    private CameraDevice cameraDevice;
    private CameraCaptureSession cameraCaptureSession;

    static int count = 0;
    static int count2 = 0;

    /**
     * Conversion from screen rotation to JPEG orientation.
     */
    private static final SparseIntArray ORIENTATIONS = new SparseIntArray();

    static {
        ORIENTATIONS.append(Surface.ROTATION_0, 90);
        ORIENTATIONS.append(Surface.ROTATION_90, 0);
        ORIENTATIONS.append(Surface.ROTATION_180, 270);
        ORIENTATIONS.append(Surface.ROTATION_270, 180);
    }




@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    //setContentView(R.layout.activity_camera);
    setContentView(R.layout.activity_main);

    Button takePicture = (Button)findViewById(R.id.takepic);
    takePicture.setOnClickListener(onClickPicture);

    //(1) setting up camera but stop before camera createCaptureRequest
    setupCamera2();
}

private void setupCamera2() {
    CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);

    try {
        for (String cameraId : manager.getCameraIdList()) {
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);

            //if (characteristics.get(CameraCharacteristics.LENS_FACING) != CameraCharacteristics.LENS_FACING_BACK) {
            if (characteristics.get(CameraCharacteristics.LENS_FACING) != CameraCharacteristics.LENS_FACING_FRONT) {
                continue;
            }

            StreamConfigurationMap configs = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);

            this.cameraId = cameraId;

            manager.openCamera(this.cameraId, cameraStateCallback, backgroundHandler);

            Size[] sizes = configs.getOutputSizes(ImageFormat.JPEG);

            int picWidth = 640;   //1920;
            int picHeight = 480;  //1080;

            imageReader = ImageReader.newInstance(picWidth, picHeight, ImageFormat.JPEG, 2);
            imageReader.setOnImageAvailableListener(onImageAvailableListener, backgroundHandler);
        }

    } catch (CameraAccessException | NullPointerException e) {
        e.printStackTrace();
    }
}

private final CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
    @Override
    public void onOpened(CameraDevice device) {
        cameraDevice = device;
    //(2) Camera capture session
        createCameraCaptureSession();
    }

    @Override
    public void onDisconnected(CameraDevice cameraDevice) {}

    @Override
    public void onError(CameraDevice cameraDevice, int error) {}

};



//private void createCaptureSession() {
private void createCameraCaptureSession() {
    List<Surface> outputSurfaces = new LinkedList<>();
    outputSurfaces.add(imageReader.getSurface());

    Log.v("-yo(2)-", "in createcameraCaptureSession now");

    try {
        cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                cameraCaptureSession = session;
                //commented out to invoked from button
                //createCaptureRequest();
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {}
        }, null);

    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}


private final ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        //createCaptureRequest();

        Log.v("yo ireader ", "----   " + (count2++) + " ---- ireader");

        //Image mImage = imageReader.acquireLatestImage();
        Image mImage = reader.acquireLatestImage();
        File mFile = new File(Environment.getExternalStorageDirectory() + "/yP2PTEST/0P2Pimage.jpg");

        Log.v("--yo--", "In ImageReader now writing to " + mFile);
        /////////////////////////////////////
        ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
        byte[] bytes = new byte[buffer.remaining()];
        buffer.get(bytes);
        FileOutputStream output = null;
        try {
            output = new FileOutputStream(mFile);
            output.write(bytes);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            mImage.close();
            if (null != output) {
                try {
                    output.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
        /////////////////////////////////////

        ImageView curPic = (ImageView) findViewById(R.id.imageView1);
        Bitmap mCurrentBitmap = BitmapFactory.decodeFile(mFile.getPath());
        curPic.setImageBitmap(mCurrentBitmap);
    }
};



private void createCaptureRequest() {

    Log.v("-yo(3)-", "in createCaptureRequest now");

    try {

        CaptureRequest.Builder requestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        requestBuilder.addTarget(imageReader.getSurface());

        // Focus
        requestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);

        // Orientation
        //yo int rotation = windowManager.getDefaultDisplay().getRotation();
        int rotation = this.getWindowManager().getDefaultDisplay().getRotation();
        requestBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));

       // cameraCaptureSession.capture(requestBuilder.build(), camera2Callback, null);
        cameraCaptureSession.capture(requestBuilder.build(), mCaptureCallback, null);

    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}


    CameraCaptureSession.CaptureCallback mCaptureCallback
            = new CameraCaptureSession.CaptureCallback() {

        @Override
        public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
                                       TotalCaptureResult result) {
            //showToast("JPEG Saved : ");
            //Log.v("yo save","- saved JPEG -");
            //unlockFocus();
        }
    };




    private Handler mMessageHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {

            if (this != null) {
                Toast.makeText(CameraActivity.this, (String) msg.obj, Toast.LENGTH_SHORT).show();
            }
        }
    };

    private void showToast(String text) {
        // We show a Toast by sending request message to mMessageHandler. This makes sure that the
        // Toast is shown on the UI thread.
        Message message = Message.obtain();
        message.obj = text;
        mMessageHandler.sendMessage(message);
    }



//------------------------------------------------------------//


    public View.OnClickListener onClickPicture = new View.OnClickListener() {
        public void onClick(View v) {

            /*-------  camera2   --------------*/
            mTimer = null;
            mTimer = new Timer(true);
            mTimer.schedule(new TimerTask() {
                @Override
                public void run() {
                    /*------------------------*/
                    mHandler.post(new Runnable() {
                        public void run() {
                            createCaptureRequest();
                            Log.v("yo click ", "----   " + (count++) + " ---- click");
                        }
                    });
                }
            }, 1000, 2500); //1500, 1600, 1800 etc
        }
    };

}
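
A note on the code above: backgroundThread and backgroundHandler are declared but never started, so openCamera() and setOnImageAvailableListener() receive a null handler and their callbacks run on the main thread's Looper. A minimal sketch of starting and stopping the background thread (assuming it is meant to host those callbacks, e.g. from onResume()/onPause()):

    private void startBackgroundThread() {
        backgroundThread = new HandlerThread("CameraBackground");
        backgroundThread.start();
        backgroundHandler = new Handler(backgroundThread.getLooper());
    }

    private void stopBackgroundThread() {
        if (backgroundThread == null) return;
        backgroundThread.quitSafely();      // let queued messages finish, then quit
        try {
            backgroundThread.join();        // wait for the thread to terminate
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        backgroundThread = null;
        backgroundHandler = null;
    }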

I had the same problem with a modified Google sample. I got this error after taking two pictures, so you just need to increase the last parameter:

    imageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),ImageFormat.JPEG, 2);
But keep in mind:

@param maxImages The maximum number of images the user will want to access simultaneously. This should be as small as possible to limit memory use. Once maxImages images have been obtained by the user, one of them has to be released before a new image can be accessed through {@link #acquireLatestImage()} or {@link #acquireNextImage()}. Must be greater than 0.

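A minimal sketch of that suggestion, with the listener closing each Image in a finally block so one of the maxImages slots is freed before the next capture (the value 4 is only illustrative, not taken from the answer):

    imageReader = ImageReader.newInstance(picWidth, picHeight, ImageFormat.JPEG, /* maxImages */ 4);
    imageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            Image image = reader.acquireLatestImage();
            if (image == null) {
                return; // nothing available, e.g. the buffer was dropped
            }
            try {
                // ... copy the JPEG bytes out of image.getPlanes()[0].getBuffer() ...
            } finally {
                image.close(); // releases the slot back to the ImageReader
            }
        }
    }, backgroundHandler);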