
Android camera createCaptureSession analysis

After openCamera() successfully opens the camera, a CameraDevice object is handed back to the application through the onOpened() method of the CameraDevice.StateCallback interface. That CameraDevice is actually a CameraDeviceImpl, and the createCaptureSession call that follows is implemented by it.
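As a quick app-side sketch of that entry point (the SessionStarter class and its createSession() helper are illustrative names, not framework code), the device delivered by onOpened() is the object on which createCaptureSession() is later invoked; the framework implementation below picks up from that call.

import android.hardware.camera2.CameraDevice;

/** Sketch of the app-side entry point; class and method names are illustrative. */
class SessionStarter {
    private CameraDevice mCameraDevice;

    final CameraDevice.StateCallback deviceCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice device) {
            // Under the hood this object is a CameraDeviceImpl.
            mCameraDevice = device;
            createSession(device);          // will call device.createCaptureSession(...)
        }

        @Override
        public void onDisconnected(CameraDevice device) { device.close(); }

        @Override
        public void onError(CameraDevice device, int error) { device.close(); }
    };

    void createSession(CameraDevice device) {
        // See the createCaptureSession() sketch later in this article.
    }
}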

/frameworks/base/core/java/android/hardware/camera2/impl/CameraDeviceImpl.java

@Override
520    public void createCaptureSession(List<Surface> outputs,
521            CameraCaptureSession.StateCallback callback, Handler handler)
522            throws CameraAccessException {
523        List<OutputConfiguration> outConfigurations = new ArrayList<>(outputs.size());
524        for (Surface surface : outputs) {
525            outConfigurations.add(new OutputConfiguration(surface));
526        }
527        createCaptureSessionInternal(null, outConfigurations, callback,
528                checkAndWrapHandler(handler), /*operatingMode*/ICameraDeviceUser.NORMAL_MODE,
529                /*sessionParams*/ null);
    }
           

The first parameter is a List of Surface objects. These Surfaces are what the streams are created from; in the common case, with no special requirements, only two are needed: one for preview and one for still capture. The preview Surface backs the preview area on screen, and during buffer rotation the preview buffers are obtained from it, so this Surface must be valid, otherwise session creation fails and the preview goes black. The capture Surface usually comes from an ImageReader, a system-provided class that creates a Surface internally which we use directly as the capture target. When a capture completes, the onImageAvailable() callback of the ImageReader.OnImageAvailableListener inner interface fires with the ImageReader; acquire an Image from it, take the first Plane from getPlanes(), and call getBuffer() to obtain the byte data of the captured photo.
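A minimal sketch of that capture path, assuming an illustrative CaptureTarget class, a 4000x3000 JPEG configuration, and a caller-supplied background Handler (none of these are mandated by the API):

import android.graphics.ImageFormat;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import android.view.Surface;

import java.nio.ByteBuffer;

/** Sketch: obtain the capture Surface from an ImageReader and read back the JPEG bytes. */
class CaptureTarget {
    private final ImageReader mImageReader;

    CaptureTarget(Handler backgroundHandler) {
        // 4000x3000, JPEG, at most 2 in-flight images: example values only.
        mImageReader = ImageReader.newInstance(4000, 3000, ImageFormat.JPEG, 2);
        mImageReader.setOnImageAvailableListener(reader -> {
            try (Image image = reader.acquireNextImage()) {
                // A JPEG image carries a single plane; copy its buffer out.
                ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                byte[] jpegBytes = new byte[buffer.remaining()];
                buffer.get(jpegBytes);
                // ... hand jpegBytes off to storage or decoding ...
            }
        }, backgroundHandler);
    }

    Surface getCaptureSurface() {
        // This Surface is what we pass into createCaptureSession() as the capture target.
        return mImageReader.getSurface();
    }
}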

The second parameter, callback, is of type CameraCaptureSession.StateCallback, an inner class of frameworks/base/core/java/android/hardware/camera2/CameraCaptureSession.java. Just as with openCamera, once the session has been created the framework returns a CameraCaptureSession object through this callback's onConfigured() method, and the concrete implementation is a CameraCaptureSessionImpl. We use it for most session work: capture() to take a picture, setRepeatingRequest() to start preview, stopRepeating() to stop preview, and abortCaptures() to discard in-flight captures when tearing the session down. The design here mirrors openCamera exactly.

The third parameter, the Handler, plays the same role as in openCamera: it pins the callbacks to a known thread. Whichever worker thread of the application calls createCaptureSession, the framework posts its callbacks through this handler onto that handler's Looper, so they are delivered on the handler's thread.
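Putting the three parameters together, a hedged app-level sketch follows; previewSurface is assumed to come from a SurfaceView/TextureView, captureSurface from the ImageReader above, and the SessionConfigurator class name is made up for illustration.

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.os.Handler;
import android.os.HandlerThread;
import android.view.Surface;

import java.util.Arrays;

/** Sketch: create a session with one preview Surface and one capture Surface. */
class SessionConfigurator {
    // Background thread whose Looper will receive the session callbacks.
    private final HandlerThread mSessionThread = new HandlerThread("CameraSession");
    private final Handler mSessionHandler;

    SessionConfigurator() {
        mSessionThread.start();
        mSessionHandler = new Handler(mSessionThread.getLooper());
    }

    void createSession(CameraDevice device, Surface previewSurface, Surface captureSurface)
            throws CameraAccessException {
        device.createCaptureSession(
                Arrays.asList(previewSurface, captureSurface),
                new CameraCaptureSession.StateCallback() {
                    @Override
                    public void onConfigured(CameraCaptureSession session) {
                        // session is a CameraCaptureSessionImpl; setRepeatingRequest(),
                        // capture(), stopRepeating() and abortCaptures() all live here.
                    }

                    @Override
                    public void onConfigureFailed(CameraCaptureSession session) {
                        // Stream configuration was rejected; the preview would stay black.
                    }
                },
                mSessionHandler);
    }
}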

createCaptureSession() simply delegates to createCaptureSessionInternal(). The Surface list we passed in is first wrapped into OutputConfiguration objects; the first argument of createCaptureSessionInternal, inputConfig, is normally null, so only the output configurations matter here.
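For reference, the public API also exposes this OutputConfiguration form directly through createCaptureSessionByOutputConfigurations(); a small sketch, reusing the assumed surfaces, callback, and handler from above:

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.params.OutputConfiguration;
import android.os.Handler;
import android.view.Surface;

import java.util.Arrays;
import java.util.List;

/** Sketch: pass OutputConfigurations yourself instead of a raw Surface list. */
class OutputConfigSession {
    void createSession(CameraDevice device, Surface previewSurface, Surface captureSurface,
            CameraCaptureSession.StateCallback callback, Handler handler)
            throws CameraAccessException {
        List<OutputConfiguration> outputs = Arrays.asList(
                new OutputConfiguration(previewSurface),
                new OutputConfiguration(captureSurface));
        // Both overloads end up in the same createCaptureSessionInternal() path shown below.
        device.createCaptureSessionByOutputConfigurations(outputs, callback, handler);
    }
}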

649    private void createCaptureSessionInternal(InputConfiguration inputConfig,
650            List<OutputConfiguration> outputConfigurations,
651            CameraCaptureSession.StateCallback callback, Executor executor,
652            int operatingMode, CaptureRequest sessionParams) throws CameraAccessException {
653        synchronized(mInterfaceLock) {
654            if (DEBUG) {
655                Log.d(TAG, "createCaptureSessionInternal");
656            }
657
658            checkIfCameraClosedOrInError();
659
660            boolean isConstrainedHighSpeed =
661                    (operatingMode == ICameraDeviceUser.CONSTRAINED_HIGH_SPEED_MODE);
662            if (isConstrainedHighSpeed && inputConfig != null) {
663                throw new IllegalArgumentException("Constrained high speed session doesn't support"
664                        + " input configuration yet.");
665            }
666
667            // Notify current session that it's going away, before starting camera operations
668            // After this call completes, the session is not allowed to call into CameraDeviceImpl
669            if (mCurrentSession != null) {
670                mCurrentSession.replaceSessionClose();
671            }
672
673            // TODO: dont block for this
674            boolean configureSuccess = true;
675            CameraAccessException pendingException = null;
676            Surface input = null;
677            try {
678                // configure streams and then block until IDLE

// Perform the surface configuration -- the key step
679                configureSuccess = configureStreamsChecked(inputConfig, outputConfigurations,
680                        operatingMode, sessionParams);
681                if (configureSuccess == true && inputConfig != null) {
682                    input = mRemoteDevice.getInputSurface();
683                }
684            } catch (CameraAccessException e) {
685                configureSuccess = false;
686                pendingException = e;
687                input = null;
688                if (DEBUG) {
689                    Log.v(TAG, "createCaptureSession - failed with exception ", e);
690                }
691            }
692
693            // Fire onConfigured if configureOutputs succeeded, fire onConfigureFailed otherwise.
694            CameraCaptureSessionCore newSession = null;
695            if (isConstrainedHighSpeed) {
696                ArrayList<Surface> surfaces = new ArrayList<>(outputConfigurations.size());
697                for (OutputConfiguration outConfig : outputConfigurations) {
698                    surfaces.add(outConfig.getSurface());
699                }
700                StreamConfigurationMap config =
701                    getCharacteristics().get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
702                SurfaceUtils.checkConstrainedHighSpeedSurfaces(surfaces, /*fpsRange*/null, config);
703
704                newSession = new CameraConstrainedHighSpeedCaptureSessionImpl(mNextSessionId++,
705                        callback, executor, this, mDeviceExecutor, configureSuccess,
706                        mCharacteristics);
707            } else {

// Create the CameraCaptureSessionImpl object
708                newSession = new CameraCaptureSessionImpl(mNextSessionId++, input,
709                        callback, executor, this, mDeviceExecutor, configureSuccess);
710            }
711
712            // TODO: wait until current session closes, then create the new session
713            mCurrentSession = newSession;
714
715            if (pendingException != null) {
716                throw pendingException;
717            }
718
719            mSessionStateCallback = mCurrentSession.getDeviceStateCallback();
720        }
721    }
           

/frameworks/base/core/java/android/hardware/camera2/impl/CameraCaptureSessionImpl.java

The CameraCaptureSessionImpl constructor calls mStateCallback.onConfigured(this) to notify the application layer that the session has been created successfully, handing the current impl object back to the application.

Now let's look at the implementation of configureStreamsChecked():

/**
382     * Attempt to configure the input and outputs; the device goes to idle and then configures the
383     * new input and outputs if possible.
384     *
385     * <p>The configuration may gracefully fail, if input configuration is not supported,
386     * if there are too many outputs, if the formats are not supported, or if the sizes for that
387     * format is not supported. In this case this function will return {@code false} and the
388     * unconfigured callback will be fired.</p>
389     *
390     * <p>If the configuration succeeds (with 1 or more outputs with or without an input),
391     * then the idle callback is fired. Unconfiguring the device always fires the idle callback.</p>
392     *
393     * @param inputConfig input configuration or {@code null} for no input
394     * @param outputs a list of one or more surfaces, or {@code null} to unconfigure
395     * @param operatingMode If the stream configuration is for a normal session,
396     *     a constrained high speed session, or something else.
397     * @param sessionParams Session parameters.
398     * @return whether or not the configuration was successful
399     *
400     * @throws CameraAccessException if there were any unexpected problems during configuration
     */
    public boolean configureStreamsChecked(InputConfiguration inputConfig,
403            List<OutputConfiguration> outputs, int operatingMode, CaptureRequest sessionParams)
404                    throws CameraAccessException {
405        // Treat a null input the same an empty list
406        if (outputs == null) {
407            outputs = new ArrayList<OutputConfiguration>();
408        }
409        if (outputs.size() == 0 && inputConfig != null) {
410            throw new IllegalArgumentException("cannot configure an input stream without " +
411                    "any output streams");
412        }
413
414        checkInputConfiguration(inputConfig);
415
416        boolean success = false;
417
418        synchronized(mInterfaceLock) {
419            checkIfCameraClosedOrInError();
420            // Streams to create
421            HashSet<OutputConfiguration> addSet = new HashSet<OutputConfiguration>(outputs);
422            // Streams to delete
423            List<Integer> deleteList = new ArrayList<Integer>();
424
425            // Determine which streams need to be created, which to be deleted
426            for (int i = 0; i < mConfiguredOutputs.size(); ++i) {
427                int streamId = mConfiguredOutputs.keyAt(i);
428                OutputConfiguration outConfig = mConfiguredOutputs.valueAt(i);
429
430                if (!outputs.contains(outConfig) || outConfig.isDeferredConfiguration()) {
431                    // Always delete the deferred output configuration when the session
432                    // is created, as the deferred output configuration doesn't have unique surface
433                    // related identifies.
434                    deleteList.add(streamId);
435                } else {
436                    addSet.remove(outConfig);  // Don't create a stream previously created
437                }
438            }

// Post a CallOnBusy message onto the device handler's Looper to signal that surface
// configuration (a busy state) is about to start, then call stopRepeating() to stop preview
440            mDeviceExecutor.execute(mCallOnBusy);
441            stopRepeating();
442
443            try {
444                waitUntilIdle();

// Notify the binder server object in the cameraserver process to begin configuration.
// mRemoteDevice is effectively the CameraDeviceClient object.
446                mRemoteDevice.beginConfigure();
447
448                // reconfigure the input stream if the input configuration is different.
449                InputConfiguration currentInputConfig = mConfiguredInput.getValue();
450                if (inputConfig != currentInputConfig &&
451                        (inputConfig == null || !inputConfig.equals(currentInputConfig))) {
452                    if (currentInputConfig != null) {
453                        mRemoteDevice.deleteStream(mConfiguredInput.getKey());
454                        mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
455                                REQUEST_ID_NONE, null);
456                    }
457                    if (inputConfig != null) {
458                        int streamId = mRemoteDevice.createInputStream(inputConfig.getWidth(),
459                                inputConfig.getHeight(), inputConfig.getFormat());
460                        mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
461                                streamId, inputConfig);
462                    }
463                }
464
465                // Delete all streams first (to free up HW resources)
466                for (Integer streamId : deleteList) {
467                    mRemoteDevice.deleteStream(streamId);
468                    mConfiguredOutputs.delete(streamId);
469                }
470
471                // Add all new streams
472                for (OutputConfiguration outConfig : outputs) {
473                    if (addSet.contains(outConfig)) {

// Configure the surface (create the output stream)
474                        int streamId = mRemoteDevice.createStream(outConfig);
475                        mConfiguredOutputs.put(streamId, outConfig);
476                    }
477                }
478
479                int offlineStreamIds[];
480                if (sessionParams != null) {

// Tell CameraDeviceClient to finish the configuration
481                    offlineStreamIds = mRemoteDevice.endConfigure(operatingMode,
482                            sessionParams.getNativeCopy());
483                } else {
484                    offlineStreamIds = mRemoteDevice.endConfigure(operatingMode, null);
485                }
486
487                mOfflineSupport.clear();
488                if ((offlineStreamIds != null) && (offlineStreamIds.length > 0)) {
489                    for (int offlineStreamId : offlineStreamIds) {
490                        mOfflineSupport.add(offlineStreamId);
491                    }
492                }
493
494                success = true;
495            } catch (IllegalArgumentException e) {
496                // OK. camera service can reject stream config if it's not supported by HAL
497                // This is only the result of a programmer misusing the camera2 api.
498                Log.w(TAG, "Stream configuration failed due to: " + e.getMessage());
499                return false;
500            } catch (CameraAccessException e) {
501                if (e.getReason() == CameraAccessException.CAMERA_IN_USE) {
502                    throw new IllegalStateException("The camera is currently busy." +
503                            " You must wait until the previous operation completes.", e);
504                }
505                throw e;
506            } finally {
507                if (success && outputs.size() > 0) {
508                    mDeviceExecutor.execute(mCallOnIdle);
509                } else {
510                    // Always return to the 'unconfigured' state if we didn't hit a fatal error
511                    mDeviceExecutor.execute(mCallOnUnconfigured);
512                }
513            }
514        }
515
516        return success;
    }
           

Let's look at createStream first:

/frameworks/av/services/camera/libcameraservice/api2/CameraDeviceClient.cpp

954binder::Status CameraDeviceClient::createStream(
955        const hardware::camera2::params::OutputConfiguration &outputConfiguration,
956        /*out*/
957        int32_t* newStreamId) {
958    ATRACE_CALL();
959
960    binder::Status res;
961    if (!(res = checkPidStatus(__FUNCTION__)).isOk()) return res;
962
963    Mutex::Autolock icl(mBinderSerializationLock);
964
965    const std::vector<sp<IGraphicBufferProducer>>& bufferProducers =
966            outputConfiguration.getGraphicBufferProducers();
967    size_t numBufferProducers = bufferProducers.size();
968    bool deferredConsumer = outputConfiguration.isDeferred();
969    bool isShared = outputConfiguration.isShared();
970    String8 physicalCameraId = String8(outputConfiguration.getPhysicalCameraId());
971    bool deferredConsumerOnly = deferredConsumer && numBufferProducers == 0;
972
973    res = checkSurfaceType(numBufferProducers, deferredConsumer,
974            outputConfiguration.getSurfaceType());
975    if (!res.isOk()) {
976        return res;
977    }
978
979    if (!mDevice.get()) {
980        return STATUS_ERROR(CameraService::ERROR_DISCONNECTED, "Camera device no longer alive");
981    }
982    std::vector<std::string> physicalCameraIds;
983    mProviderManager->isLogicalCamera(mCameraIdStr.string(), &physicalCameraIds);
984    res = checkPhysicalCameraId(physicalCameraIds, physicalCameraId, mCameraIdStr);
985    if (!res.isOk()) {
986        return res;
987    }
988
989    std::vector<sp<Surface>> surfaces;
990    std::vector<sp<IBinder>> binders;
991    status_t err;
992
993    // Create stream for deferred surface case.
994    if (deferredConsumerOnly) {
995        return createDeferredSurfaceStreamLocked(outputConfiguration, isShared, newStreamId);
996    }
997
998    OutputStreamInfo streamInfo;
999    bool isStreamInfoValid = false;

// Create a Surface for each IGraphicBufferProducer obtained from
// outputConfiguration.getGraphicBufferProducers() above, validate each one, add the valid ones
// to the surfaces vector, then call mDevice->createStream to actually create the stream.
1000    for (auto& bufferProducer : bufferProducers) {
1001        // Don't create multiple streams for the same target surface
1002        sp<IBinder> binder = IInterface::asBinder(bufferProducer);
1003        ssize_t index = mStreamMap.indexOfKey(binder);
1004        if (index != NAME_NOT_FOUND) {
1005            String8 msg = String8::format("Camera %s: Surface already has a stream created for it "
1006                    "(ID %zd)", mCameraIdStr.string(), index);
1007            ALOGW("%s: %s", __FUNCTION__, msg.string());
1008            return STATUS_ERROR(CameraService::ERROR_ALREADY_EXISTS, msg.string());
1009        }
1010
1011        sp<Surface> surface;
1012        res = createSurfaceFromGbp(streamInfo, isStreamInfoValid, surface, bufferProducer,
1013                mCameraIdStr, mDevice->infoPhysical(physicalCameraId));
1014
1015        if (!res.isOk())
1016            return res;
1017
1018        if (!isStreamInfoValid) {
1019            isStreamInfoValid = true;
1020        }
1021
1022        binders.push_back(IInterface::asBinder(bufferProducer));
1023        surfaces.push_back(surface);
1024    }
1025
1026    int streamId = camera3::CAMERA3_STREAM_ID_INVALID;
1027    std::vector<int> surfaceIds;
1028    bool isDepthCompositeStream = camera3::DepthCompositeStream::isDepthCompositeStream(surfaces[0]);
1029    bool isHeicCompisiteStream = camera3::HeicCompositeStream::isHeicCompositeStream(surfaces[0]);
1030    if (isDepthCompositeStream || isHeicCompisiteStream) {
1031        sp<CompositeStream> compositeStream;
1032        if (isDepthCompositeStream) {
1033            compositeStream = new camera3::DepthCompositeStream(mDevice, getRemoteCallback());
1034        } else {
1035            compositeStream = new camera3::HeicCompositeStream(mDevice, getRemoteCallback());
1036        }
1037
1038        err = compositeStream->createStream(surfaces, deferredConsumer, streamInfo.width,
1039                streamInfo.height, streamInfo.format,
1040                static_cast<camera3_stream_rotation_t>(outputConfiguration.getRotation()),
1041                &streamId, physicalCameraId, &surfaceIds, outputConfiguration.getSurfaceSetID(),
1042                isShared);
1043        if (err == OK) {
1044            mCompositeStreamMap.add(IInterface::asBinder(surfaces[0]->getIGraphicBufferProducer()),
1045                    compositeStream);
1046        }
1047    } else {
1048        err = mDevice->createStream(surfaces, deferredConsumer, streamInfo.width,
1049                streamInfo.height, streamInfo.format, streamInfo.dataSpace,
1050                static_cast<camera3_stream_rotation_t>(outputConfiguration.getRotation()),
1051                &streamId, physicalCameraId, &surfaceIds, outputConfiguration.getSurfaceSetID(),
1052                isShared);
1053    }
1054
1055    if (err != OK) {
1056        res = STATUS_ERROR_FMT(CameraService::ERROR_INVALID_OPERATION,
1057                "Camera %s: Error creating output stream (%d x %d, fmt %x, dataSpace %x): %s (%d)",
1058                mCameraIdStr.string(), streamInfo.width, streamInfo.height, streamInfo.format,
1059                streamInfo.dataSpace, strerror(-err), err);
1060    } else {
1061        int i = 0;
1062        for (auto& binder : binders) {
1063            ALOGV("%s: mStreamMap add binder %p streamId %d, surfaceId %d",
1064                    __FUNCTION__, binder.get(), streamId, i);
1065            mStreamMap.add(binder, StreamSurfaceId(streamId, surfaceIds[i]));
1066            i++;
1067        }
1068
1069        mConfiguredOutputs.add(streamId, outputConfiguration);
1070        mStreamInfoMap[streamId] = streamInfo;
1071
1072        ALOGV("%s: Camera %s: Successfully created a new stream ID %d for output surface"
1073                    " (%d x %d) with format 0x%x.",
1074                  __FUNCTION__, mCameraIdStr.string(), streamId, streamInfo.width,
1075                  streamInfo.height, streamInfo.format);
1076
1077        // Set transform flags to ensure preview to be rotated correctly.
1078        res = setStreamTransformLocked(streamId);
1079
1080        *newStreamId = streamId;
1081    }
1082
1083    return res;
1084}
           

/frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp

status_t Camera3Device::createStream(const std::vector<sp<Surface>>& consumers,
1337        bool hasDeferredConsumer, uint32_t width, uint32_t height, int format,
1338        android_dataspace dataSpace, camera3_stream_rotation_t rotation, int *id,
1339        const String8& physicalCameraId,
1340        std::vector<int> *surfaceIds, int streamSetId, bool isShared, uint64_t consumerUsage) {
1341    ATRACE_CALL();
1342
1343    Mutex::Autolock il(mInterfaceLock);
1344    nsecs_t maxExpectedDuration = getExpectedInFlightDuration();
1345    Mutex::Autolock l(mLock);
1346    ALOGV("Camera %s: Creating new stream %d: %d x %d, format %d, dataspace %d rotation %d"
1347            " consumer usage %" PRIu64 ", isShared %d, physicalCameraId %s", mId.string(),
1348            mNextStreamId, width, height, format, dataSpace, rotation, consumerUsage, isShared,
1349            physicalCameraId.string());
1350
1351    status_t res;
1352    bool wasActive = false;
1353
1354    switch (mStatus) {
1355        case STATUS_ERROR:
1356            CLOGE("Device has encountered a serious error");
1357            return INVALID_OPERATION;
1358        case STATUS_UNINITIALIZED:
1359            CLOGE("Device not initialized");
1360            return INVALID_OPERATION;
1361        case STATUS_UNCONFIGURED:
1362        case STATUS_CONFIGURED:
1363            // OK
1364            break;
1365        case STATUS_ACTIVE:
1366            ALOGV("%s: Stopping activity to reconfigure streams", __FUNCTION__);
1367            res = internalPauseAndWaitLocked(maxExpectedDuration);
1368            if (res != OK) {
1369                SET_ERR_L("Can't pause captures to reconfigure streams!");
1370                return res;
1371            }
1372            wasActive = true;
1373            break;
1374        default:
1375            SET_ERR_L("Unexpected status: %d", mStatus);
1376            return INVALID_OPERATION;
1377    }
1378    assert(mStatus != STATUS_ACTIVE);
1379
1380    sp<Camera3OutputStream> newStream;
1381
1382    if (consumers.size() == 0 && !hasDeferredConsumer) {
1383        ALOGE("%s: Number of consumers cannot be smaller than 1", __FUNCTION__);
1384        return BAD_VALUE;
1385    }
1386
1387    if (hasDeferredConsumer && format != HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED) {
1388        ALOGE("Deferred consumer stream creation only support IMPLEMENTATION_DEFINED format");
1389        return BAD_VALUE;
1390    }
1391
1392    if (format == HAL_PIXEL_FORMAT_BLOB) {
1393        ssize_t blobBufferSize;
1394        if (dataSpace == HAL_DATASPACE_DEPTH) {
1395            blobBufferSize = getPointCloudBufferSize();
1396            if (blobBufferSize <= 0) {
1397                SET_ERR_L("Invalid point cloud buffer size %zd", blobBufferSize);
1398                return BAD_VALUE;
1399            }
1400        } else if (dataSpace == static_cast<android_dataspace>(HAL_DATASPACE_JPEG_APP_SEGMENTS)) {
1401            blobBufferSize = width * height;
1402        } else {
1403            blobBufferSize = getJpegBufferSize(width, height);
1404            if (blobBufferSize <= 0) {
1405                SET_ERR_L("Invalid jpeg buffer size %zd", blobBufferSize);
1406                return BAD_VALUE;
1407            }
1408        }
1409        newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
1410                width, height, blobBufferSize, format, dataSpace, rotation,
1411                mTimestampOffset, physicalCameraId, streamSetId);
1412    } else if (format == HAL_PIXEL_FORMAT_RAW_OPAQUE) {
1413        ssize_t rawOpaqueBufferSize = getRawOpaqueBufferSize(width, height);
1414        if (rawOpaqueBufferSize <= 0) {
1415            SET_ERR_L("Invalid RAW opaque buffer size %zd", rawOpaqueBufferSize);
1416            return BAD_VALUE;
1417        }
1418        newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
1419                width, height, rawOpaqueBufferSize, format, dataSpace, rotation,
1420                mTimestampOffset, physicalCameraId, streamSetId);
1421    } else if (isShared) {
1422        newStream = new Camera3SharedOutputStream(mNextStreamId, consumers,
1423                width, height, format, consumerUsage, dataSpace, rotation,
1424                mTimestampOffset, physicalCameraId, streamSetId,
1425                mUseHalBufManager);
1426    } else if (consumers.size() == 0 && hasDeferredConsumer) {
1427        newStream = new Camera3OutputStream(mNextStreamId,
1428                width, height, format, consumerUsage, dataSpace, rotation,
1429                mTimestampOffset, physicalCameraId, streamSetId);
1430    } else {
1431        newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
1432                width, height, format, dataSpace, rotation,
1433                mTimestampOffset, physicalCameraId, streamSetId);
1434    }
1435
1436    size_t consumerCount = consumers.size();
1437    for (size_t i = 0; i < consumerCount; i++) {
1438        int id = newStream->getSurfaceId(consumers[i]);
1439        if (id < 0) {
1440            SET_ERR_L("Invalid surface id");
1441            return BAD_VALUE;
1442        }
1443        if (surfaceIds != nullptr) {
1444            surfaceIds->push_back(id);
1445        }
1446    }
1447
1448    newStream->setStatusTracker(mStatusTracker);
1449
1450    newStream->setBufferManager(mBufferManager);
1451
1452    res = mOutputStreams.add(mNextStreamId, newStream);
1453    if (res < 0) {
1454        SET_ERR_L("Can't add new stream to set: %s (%d)", strerror(-res), res);
1455        return res;
1456    }
1457
1458    *id = mNextStreamId++;
1459    mNeedConfig = true;
1460
1461    // Continue captures if active at start
1462    if (wasActive) {
1463        ALOGV("%s: Restarting activity to reconfigure streams", __FUNCTION__);
1464        // Reuse current operating mode and session parameters for new stream config
1465        res = configureStreamsLocked(mOperatingMode, mSessionParams);
1466        if (res != OK) {
1467            CLOGE("Can't reconfigure device for new stream %d: %s (%d)",
1468                    mNextStreamId, strerror(-res), res);
1469            return res;
1470        }
1471        internalResumeLocked();
1472    }
1473    ALOGV("Camera %s: Created new stream", mId.string());
1474    return OK;
}
           

At this point the createStream logic is complete. Keep in mind that createStream is invoked from the for loop inside configureStreamsChecked() in the framework, so each call configures a single surface; with multiple surfaces this path runs multiple times, and the corresponding Camera3OutputStream creation logs are printed once per stream.

Back in configureStreamsChecked() of /frameworks/base/core/java/android/hardware/camera2/impl/CameraDeviceImpl.java, once createStream() has finished, mRemoteDevice.endConfigure() is executed:

binder::Status CameraDeviceClient::endConfigure(int operatingMode,
473        const hardware::camera2::impl::CameraMetadataNative& sessionParams,
474        std::vector<int>* offlineStreamIds /*out*/) {
475    ATRACE_CALL();
476    ALOGV("%s: ending configure (%d input stream, %zu output surfaces)",
477            __FUNCTION__, mInputStream.configured ? 1 : 0,
478            mStreamMap.size());
479
480    binder::Status res;
481    if (!(res = checkPidStatus(__FUNCTION__)).isOk()) return res;
482
483    if (offlineStreamIds == nullptr) {
484        String8 msg = String8::format("Invalid offline stream ids");
485        ALOGE("%s: %s", __FUNCTION__, msg.string());
486        return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
487    }
488
489    Mutex::Autolock icl(mBinderSerializationLock);
490
491    if (!mDevice.get()) {
492        return STATUS_ERROR(CameraService::ERROR_DISCONNECTED, "Camera device no longer alive");
493    }
494
495    res = checkOperatingMode(operatingMode, mDevice->info(), mCameraIdStr);
496    if (!res.isOk()) {
497        return res;
498    }
499
500    status_t err = mDevice->configureStreams(sessionParams, operatingMode);
501    if (err == BAD_VALUE) {
502        String8 msg = String8::format("Camera %s: Unsupported set of inputs/outputs provided",
503                mCameraIdStr.string());
504        ALOGE("%s: %s", __FUNCTION__, msg.string());
505        res = STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
506    } else if (err != OK) {
507        String8 msg = String8::format("Camera %s: Error configuring streams: %s (%d)",
508                mCameraIdStr.string(), strerror(-err), err);
509        ALOGE("%s: %s", __FUNCTION__, msg.string());
510        res = STATUS_ERROR(CameraService::ERROR_INVALID_OPERATION, msg.string());
511    } else {
512        offlineStreamIds->clear();
513        mDevice->getOfflineStreamIds(offlineStreamIds);
514
515        for (size_t i = 0; i < mCompositeStreamMap.size(); ++i) {
516            err = mCompositeStreamMap.valueAt(i)->configureStream();
517            if (err != OK) {
518                String8 msg = String8::format("Camera %s: Error configuring composite "
519                        "streams: %s (%d)", mCameraIdStr.string(), strerror(-err), err);
520                ALOGE("%s: %s", __FUNCTION__, msg.string());
521                res = STATUS_ERROR(CameraService::ERROR_INVALID_OPERATION, msg.string());
522                break;
523            }
524
525            // Composite streams can only support offline mode in case all individual internal
526            // streams are also supported.
527            std::vector<int> internalStreams;
528            mCompositeStreamMap.valueAt(i)->insertCompositeStreamIds(&internalStreams);
529            offlineStreamIds->erase(
530                    std::remove_if(offlineStreamIds->begin(), offlineStreamIds->end(),
531                    [&internalStreams] (int streamId) {
532                        auto it = std::find(internalStreams.begin(), internalStreams.end(),
533                                streamId);
534                        if (it != internalStreams.end()) {
535                            internalStreams.erase(it);
536                            return true;
537                        }
538
539                        return false;}), offlineStreamIds->end());
540            if (internalStreams.empty()) {
541                offlineStreamIds->push_back(mCompositeStreamMap.valueAt(i)->getStreamId());
542            }
543        }
544
545        for (const auto& offlineStreamId : *offlineStreamIds) {
546            mStreamInfoMap[offlineStreamId].supportsOffline = true;
547        }
548    }
549
550    return res;
}
           

Next, mDevice->configureStreams invokes the configureStreams() method in /frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp, which in turn calls configureStreamsLocked() to do the actual work:

status_t Camera3Device::configureStreamsLocked(int operatingMode,
2433        const CameraMetadata& sessionParams, bool notifyRequestThread) {
2434    ATRACE_CALL();
2435    status_t res;
2436
2437    if (mStatus != STATUS_UNCONFIGURED && mStatus != STATUS_CONFIGURED) {
2438        CLOGE("Not idle");
2439        return INVALID_OPERATION;
2440    }
2441
2442    if (operatingMode < 0) {
2443        CLOGE("Invalid operating mode: %d", operatingMode);
2444        return BAD_VALUE;
2445    }
2446
2447    bool isConstrainedHighSpeed =
2448            static_cast<int>(StreamConfigurationMode::CONSTRAINED_HIGH_SPEED_MODE) ==
2449            operatingMode;
2450
2451    if (mOperatingMode != operatingMode) {
2452        mNeedConfig = true;
2453        mIsConstrainedHighSpeedConfiguration = isConstrainedHighSpeed;
2454        mOperatingMode = operatingMode;
2455    }
2456
2457    // In case called from configureStreams, abort queued input buffers not belonging to
2458    // any pending requests.
2459    if (mInputStream != NULL && notifyRequestThread) {
2460        while (true) {
2461            camera3_stream_buffer_t inputBuffer;
2462            status_t res = mInputStream->getInputBuffer(&inputBuffer,
2463                    /*respectHalLimit*/ false);
2464            if (res != OK) {
2465                // Exhausted acquiring all input buffers.
2466                break;
2467            }
2468
2469            inputBuffer.status = CAMERA3_BUFFER_STATUS_ERROR;
2470            res = mInputStream->returnInputBuffer(inputBuffer);
2471            if (res != OK) {
2472                ALOGE("%s: %d: couldn't return input buffer while clearing input queue: "
2473                        "%s (%d)", __FUNCTION__, __LINE__, strerror(-res), res);
2474            }
2475        }
2476    }
2477
2478    if (!mNeedConfig) {
2479        ALOGV("%s: Skipping config, no stream changes", __FUNCTION__);
2480        return OK;
2481    }
2482
2483    // Workaround for device HALv3.2 or older spec bug - zero streams requires
2484    // adding a dummy stream instead.
2485    // TODO: Bug: 17321404 for fixing the HAL spec and removing this workaround.
2486    if (mOutputStreams.size() == 0) {
2487        addDummyStreamLocked();
2488    } else {
2489        tryRemoveDummyStreamLocked();
2490    }
2491
2492    // Start configuring the streams
2493    ALOGV("%s: Camera %s: Starting stream configuration", __FUNCTION__, mId.string());
2494
2495    mPreparerThread->pause();
2496
2497    camera3_stream_configuration config;
2498    config.operation_mode = mOperatingMode;
2499    config.num_streams = (mInputStream != NULL) + mOutputStreams.size();
2500
2501    Vector<camera3_stream_t*> streams;
2502    streams.setCapacity(config.num_streams);
2503    std::vector<uint32_t> bufferSizes(config.num_streams, 0);
2504
2505
2506    if (mInputStream != NULL) {
2507        camera3_stream_t *inputStream;
2508        inputStream = mInputStream->startConfiguration();
2509        if (inputStream == NULL) {
2510            CLOGE("Can't start input stream configuration");
2511            cancelStreamsConfigurationLocked();
2512            return INVALID_OPERATION;
2513        }
2514        streams.add(inputStream);
2515    }
2516
2517    for (size_t i = 0; i < mOutputStreams.size(); i++) {
2518
2519        // Don't configure bidi streams twice, nor add them twice to the list
2520        if (mOutputStreams[i].get() ==
2521            static_cast<Camera3StreamInterface*>(mInputStream.get())) {
2522
2523            config.num_streams--;
2524            continue;
2525        }
2526
2527        camera3_stream_t *outputStream;
2528        outputStream = mOutputStreams[i]->startConfiguration();
2529        if (outputStream == NULL) {
2530            CLOGE("Can't start output stream configuration");
2531            cancelStreamsConfigurationLocked();
2532            return INVALID_OPERATION;
2533        }
2534        streams.add(outputStream);
2535
2536        if (outputStream->format == HAL_PIXEL_FORMAT_BLOB) {
2537            size_t k = i + ((mInputStream != nullptr) ? 1 : 0); // Input stream if present should
2538                                                                // always occupy the initial entry.
2539            if (outputStream->data_space == HAL_DATASPACE_V0_JFIF) {
2540                bufferSizes[k] = static_cast<uint32_t>(
2541                        getJpegBufferSize(outputStream->width, outputStream->height));
2542            } else if (outputStream->data_space ==
2543                    static_cast<android_dataspace>(HAL_DATASPACE_JPEG_APP_SEGMENTS)) {
2544                bufferSizes[k] = outputStream->width * outputStream->height;
2545            } else {
2546                ALOGW("%s: Blob dataSpace %d not supported",
2547                        __FUNCTION__, outputStream->data_space);
2548            }
2549        }
2550    }
2551
2552    config.streams = streams.editArray();
2553
2554    // Do the HAL configuration; will potentially touch stream
2555    // max_buffers, usage, and priv fields, as well as data_space and format
2556    // fields for IMPLEMENTATION_DEFINED formats.
2557
2558    const camera_metadata_t *sessionBuffer = sessionParams.getAndLock();
2559    res = mInterface->configureStreams(sessionBuffer, &config, bufferSizes);
2560    sessionParams.unlock(sessionBuffer);
2561
2562    if (res == BAD_VALUE) {
2563        // HAL rejected this set of streams as unsupported, clean up config
2564        // attempt and return to unconfigured state
2565        CLOGE("Set of requested inputs/outputs not supported by HAL");
2566        cancelStreamsConfigurationLocked();
2567        return BAD_VALUE;
2568    } else if (res != OK) {
2569        // Some other kind of error from configure_streams - this is not
2570        // expected
2571        SET_ERR_L("Unable to configure streams with HAL: %s (%d)",
2572                strerror(-res), res);
2573        return res;
2574    }
2575
2576    // Finish all stream configuration immediately.
2577    // TODO: Try to relax this later back to lazy completion, which should be
2578    // faster
2579
2580    if (mInputStream != NULL && mInputStream->isConfiguring()) {
2581        bool streamReConfigured = false;
2582        res = mInputStream->finishConfiguration(&streamReConfigured);
2583        if (res != OK) {
2584            CLOGE("Can't finish configuring input stream %d: %s (%d)",
2585                    mInputStream->getId(), strerror(-res), res);
2586            cancelStreamsConfigurationLocked();
2587            if ((res == NO_INIT || res == DEAD_OBJECT) && mInputStream->isAbandoned()) {
2588                return DEAD_OBJECT;
2589            }
2590            return BAD_VALUE;
2591        }
2592        if (streamReConfigured) {
2593            mInterface->onStreamReConfigured(mInputStream->getId());
2594        }
2595    }
2596
2597    for (size_t i = 0; i < mOutputStreams.size(); i++) {
2598        sp<Camera3OutputStreamInterface> outputStream = mOutputStreams[i];
2599        if (outputStream->isConfiguring() && !outputStream->isConsumerConfigurationDeferred()) {
2600            bool streamReConfigured = false;
2601            res = outputStream->finishConfiguration(&streamReConfigured);
2602            if (res != OK) {
2603                CLOGE("Can't finish configuring output stream %d: %s (%d)",
2604                        outputStream->getId(), strerror(-res), res);
2605                cancelStreamsConfigurationLocked();
2606                if ((res == NO_INIT || res == DEAD_OBJECT) && outputStream->isAbandoned()) {
2607                    return DEAD_OBJECT;
2608                }
2609                return BAD_VALUE;
2610            }
2611            if (streamReConfigured) {
2612                mInterface->onStreamReConfigured(outputStream->getId());
2613            }
2614        }
2615    }
2616
2617    // Request thread needs to know to avoid using repeat-last-settings protocol
2618    // across configure_streams() calls
2619    if (notifyRequestThread) {
2620        mRequestThread->configurationComplete(mIsConstrainedHighSpeedConfiguration, sessionParams);
2621    }
2622
2623    char value[PROPERTY_VALUE_MAX];
2624    property_get("camera.fifo.disable", value, "0");
2625    int32_t disableFifo = atoi(value);
2626    if (disableFifo != 1) {
2627        // Boost priority of request thread to SCHED_FIFO.
2628        pid_t requestThreadTid = mRequestThread->getTid();
2629        res = requestPriority(getpid(), requestThreadTid,
2630                kRequestThreadPriority, /*isForApp*/ false, /*asynchronous*/ false);
2631        if (res != OK) {
2632            ALOGW("Can't set realtime priority for request processing thread: %s (%d)",
2633                    strerror(-res), res);
2634        } else {
2635            ALOGD("Set real time priority for request queue thread (tid %d)", requestThreadTid);
2636        }
2637    }
2638
2639    // Update device state
2640    const camera_metadata_t *newSessionParams = sessionParams.getAndLock();
2641    const camera_metadata_t *currentSessionParams = mSessionParams.getAndLock();
2642    bool updateSessionParams = (newSessionParams != currentSessionParams) ? true : false;
2643    sessionParams.unlock(newSessionParams);
2644    mSessionParams.unlock(currentSessionParams);
2645    if (updateSessionParams)  {
2646        mSessionParams = sessionParams;
2647    }
2648
2649    mNeedConfig = false;
2650
2651    internalUpdateStatusLocked((mDummyStreamId == NO_STREAM) ?
2652            STATUS_CONFIGURED : STATUS_UNCONFIGURED);
2653
2654    ALOGV("%s: Camera %s: Stream configuration complete", __FUNCTION__, mId.string());
2655
2656    // tear down the deleted streams after configure streams.
2657    mDeletedStreams.clear();
2658
2659    auto rc = mPreparerThread->resume();
2660    if (rc != OK) {
2661        SET_ERR_L("%s: Camera %s: Preparer thread failed to resume!", __FUNCTION__, mId.string());
2662        return rc;
2663    }
2664
2665    if (mDummyStreamId == NO_STREAM) {
2666        mRequestBufferSM.onStreamsConfigured();
2667    }
2668
2669    return OK;
}
           

In this method, res = mInterface->configureStreams(sessionBuffer, &config, bufferSizes) hands the configuration down to the HAL. When that configuration succeeds, outputStream->finishConfiguration() is called on each stream to finish the configuration:

/frameworks/av/services/camera/libcameraservice/device3/Camera3Stream.cpp

status_t Camera3Stream::finishConfiguration(/*out*/bool* streamReconfigured) {
304    ATRACE_CALL();
305    if (streamReconfigured != nullptr) {
306        *streamReconfigured = false;
307    }
308    Mutex::Autolock l(mLock);
309    switch (mState) {
310        case STATE_ERROR:
311            ALOGE("%s: In error state", __FUNCTION__);
312            return INVALID_OPERATION;
313        case STATE_IN_CONFIG:
314        case STATE_IN_RECONFIG:
315            // OK
316            break;
317        case STATE_CONSTRUCTED:
318        case STATE_CONFIGURED:
319            ALOGE("%s: Cannot finish configuration that hasn't been started",
320                    __FUNCTION__);
321            return INVALID_OPERATION;
322        case STATE_IN_IDLE:
323            //Skip configuration in this state
324            return OK;
325        default:
326            ALOGE("%s: Unknown state", __FUNCTION__);
327            return INVALID_OPERATION;
328    }
329
330    // Register for idle tracking
331    sp<StatusTracker> statusTracker = mStatusTracker.promote();
332    if (statusTracker != 0 && mStatusId == StatusTracker::NO_STATUS_ID) {
333        mStatusId = statusTracker->addComponent();
334    }
335
336    // Check if the stream configuration is unchanged, and skip reallocation if
337    // so. As documented in hardware/camera3.h:configure_streams().
338    if (mState == STATE_IN_RECONFIG &&
339            mOldUsage == mUsage &&
340            mOldMaxBuffers == camera3_stream::max_buffers &&
341            mOldDataSpace == camera3_stream::data_space &&
342            mOldFormat == camera3_stream::format) {
343        mState = STATE_CONFIGURED;
344        return OK;
345    }
346
347    // Reset prepared state, since buffer config has changed, and existing
348    // allocations are no longer valid
349    mPrepared = false;
350    mPrepareBlockRequest = true;
351    mStreamUnpreparable = false;
352
353    bool reconfiguring = (mState == STATE_IN_RECONFIG);
354    status_t res;
355    res = configureQueueLocked();
356    // configureQueueLocked could return error in case of abandoned surface.
357    // Treat as non-fatal error.
358    if (res == NO_INIT || res == DEAD_OBJECT) {
359        ALOGE("%s: Unable to configure stream %d queue (non-fatal): %s (%d)",
360                __FUNCTION__, mId, strerror(-res), res);
361        mState = STATE_ABANDONED;
362        return res;
363    } else if (res != OK) {
364        ALOGE("%s: Unable to configure stream %d queue: %s (%d)",
365                __FUNCTION__, mId, strerror(-res), res);
366        mState = STATE_ERROR;
367        return res;
368    }
369
370    if (reconfiguring && streamReconfigured != nullptr) {
371        *streamReconfigured = true;
372    }
373    mState = STATE_CONFIGURED;
374
375    return res;
}
           

Here configureQueueLocked() is called, which drops back into the subclass:

/frameworks/av/services/camera/libcameraservice/device3/Camera3OutputStream.cpp

status_t Camera3OutputStream::configureQueueLocked() {
345    status_t res;
346
347    mTraceFirstBuffer = true;
348    if ((res = Camera3IOStreamBase::configureQueueLocked()) != OK) {
349        return res;
350    }
351
352    if ((res = configureConsumerQueueLocked()) != OK) {
353        return res;
354    }
355
356    // Set dequeueBuffer/attachBuffer timeout if the consumer is not hw composer or hw texture.
357    // We need skip these cases as timeout will disable the non-blocking (async) mode.
358    if (!(isConsumedByHWComposer() || isConsumedByHWTexture())) {
359        if (mUseBufferManager) {
360            // When buffer manager is handling the buffer, we should have available buffers in
361            // buffer queue before we calls into dequeueBuffer because buffer manager is tracking
362            // free buffers.
363            // There are however some consumer side feature (ImageReader::discardFreeBuffers) that
364            // can discard free buffers without notifying buffer manager. We want the timeout to
365            // happen immediately here so buffer manager can try to update its internal state and
366            // try to allocate a buffer instead of waiting.
367            mConsumer->setDequeueTimeout(0);
368        } else {
369            mConsumer->setDequeueTimeout(kDequeueBufferTimeout);
370        }
371    }
372
373    return OK;
}
           
status_t Camera3OutputStream::configureConsumerQueueLocked() {
377    status_t res;
378
379    mTraceFirstBuffer = true;
380
381    ALOG_ASSERT(mConsumer != 0, "mConsumer should never be NULL");
382
383    // Configure consumer-side ANativeWindow interface. The listener may be used
384    // to notify buffer manager (if it is used) of the returned buffers.
385    res = mConsumer->connect(NATIVE_WINDOW_API_CAMERA,
386            /*reportBufferRemoval*/true,
387            /*listener*/mBufferProducerListener);
388    if (res != OK) {
389        ALOGE("%s: Unable to connect to native window for stream %d",
390                __FUNCTION__, mId);
391        return res;
392    }
393
394    mConsumerName = mConsumer->getConsumerName();
395
396    res = native_window_set_usage(mConsumer.get(), mUsage);
397    if (res != OK) {
398        ALOGE("%s: Unable to configure usage %" PRIu64 " for stream %d",
399                __FUNCTION__, mUsage, mId);
400        return res;
401    }
402
403    res = native_window_set_scaling_mode(mConsumer.get(),
404            NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);
405    if (res != OK) {
406        ALOGE("%s: Unable to configure stream scaling: %s (%d)",
407                __FUNCTION__, strerror(-res), res);
408        return res;
409    }
410
411    if (mMaxSize == 0) {
412        // For buffers of known size
413        res = native_window_set_buffers_dimensions(mConsumer.get(),
414                camera3_stream::width, camera3_stream::height);
415    } else {
416        // For buffers with bounded size
417        res = native_window_set_buffers_dimensions(mConsumer.get(),
418                mMaxSize, 1);
419    }
420    if (res != OK) {
421        ALOGE("%s: Unable to configure stream buffer dimensions"
422                " %d x %d (maxSize %zu) for stream %d",
423                __FUNCTION__, camera3_stream::width, camera3_stream::height,
424                mMaxSize, mId);
425        return res;
426    }
427    res = native_window_set_buffers_format(mConsumer.get(),
428            camera3_stream::format);
429    if (res != OK) {
430        ALOGE("%s: Unable to configure stream buffer format %#x for stream %d",
431                __FUNCTION__, camera3_stream::format, mId);
432        return res;
433    }
434
435    res = native_window_set_buffers_data_space(mConsumer.get(),
436            camera3_stream::data_space);
437    if (res != OK) {
438        ALOGE("%s: Unable to configure stream dataspace %#x for stream %d",
439                __FUNCTION__, camera3_stream::data_space, mId);
440        return res;
441    }
442
443    int maxConsumerBuffers;
444    res = static_cast<ANativeWindow*>(mConsumer.get())->query(
445            mConsumer.get(),
446            NATIVE_WINDOW_MIN_UNDEQUEUED_BUFFERS, &maxConsumerBuffers);
447    if (res != OK) {
448        ALOGE("%s: Unable to query consumer undequeued"
449                " buffer count for stream %d", __FUNCTION__, mId);
450        return res;
451    }
452
453    ALOGV("%s: Consumer wants %d buffers, HAL wants %d", __FUNCTION__,
454            maxConsumerBuffers, camera3_stream::max_buffers);
455    if (camera3_stream::max_buffers == 0) {
456        ALOGE("%s: Camera HAL requested max_buffer count: %d, requires at least 1",
457                __FUNCTION__, camera3_stream::max_buffers);
458        return INVALID_OPERATION;
459    }
460
461    mTotalBufferCount = maxConsumerBuffers + camera3_stream::max_buffers;
462    mHandoutTotalBufferCount = 0;
463    mFrameCount = 0;
464    mLastTimestamp = 0;
465    mUseMonoTimestamp = (isConsumedByHWComposer() | isVideoStream());
466
467    res = native_window_set_buffer_count(mConsumer.get(),
468            mTotalBufferCount);
469    if (res != OK) {
470        ALOGE("%s: Unable to set buffer count for stream %d",
471                __FUNCTION__, mId);
472        return res;
473    }
474
475    res = native_window_set_buffers_transform(mConsumer.get(),
476            mTransform);
477    if (res != OK) {
478        ALOGE("%s: Unable to configure stream transform to %x: %s (%d)",
479                __FUNCTION__, mTransform, strerror(-res), res);
480        return res;
481    }
482
483    /**
484     * Camera3 Buffer manager is only supported by HAL3.3 onwards, as the older HALs requires
485     * buffers to be statically allocated for internal static buffer registration, while the
486     * buffers provided by buffer manager are really dynamically allocated. Camera3Device only
487     * sets the mBufferManager if device version is > HAL3.2, which guarantees that the buffer
488     * manager setup is skipped in below code. Note that HAL3.2 is also excluded here, as some
489     * HAL3.2 devices may not support the dynamic buffer registeration.
490     * Also Camera3BufferManager does not support display/texture streams as they have its own
491     * buffer management logic.
492     */
493    if (mBufferManager != 0 && mSetId > CAMERA3_STREAM_SET_ID_INVALID &&
494            !(isConsumedByHWComposer() || isConsumedByHWTexture())) {
495        uint64_t consumerUsage = 0;
496        getEndpointUsage(&consumerUsage);
497        StreamInfo streamInfo(
498                getId(), getStreamSetId(), getWidth(), getHeight(), getFormat(), getDataSpace(),
499                mUsage | consumerUsage, mTotalBufferCount,
500                /*isConfigured*/true);
501        wp<Camera3OutputStream> weakThis(this);
502        res = mBufferManager->registerStream(weakThis,
503                streamInfo);
504        if (res == OK) {
505            // Disable buffer allocation for this BufferQueue, buffer manager will take over
506            // the buffer allocation responsibility.
507            mConsumer->getIGraphicBufferProducer()->allowAllocation(false);
508            mUseBufferManager = true;
509        } else {
510            ALOGE("%s: Unable to register stream %d to camera3 buffer manager, "
511                  "(error %d %s), fall back to BufferQueue for buffer management!",
512                  __FUNCTION__, mId, res, strerror(-res));
513        }
514    }
515
516    return OK;
}
           

One call here is particularly important:

res = mConsumer->connect(NATIVE_WINDOW_API_CAMERA, /*reportBufferRemoval*/true, /*listener*/mBufferProducerListener);
           

mConsumer is the Surface we created when configuring the stream, and connect() is performed on it here. If the Surface's IGraphicBufferProducer is already connected by some other producer, this configuration fails as well, so pay attention to it; a failure here also produces log output that helps locate the problem. ANativeWindow is the window type defined for OpenGL; on Android it generally has two implementations: Surface on the application side, and the framebuffer window used by SurfaceFlinger.

This completes the createCaptureSession flow. It is the most important prerequisite for camera preview: if the session is created successfully, preview normally works; if session creation fails, the preview is guaranteed to be black, so a black camera preview should make this path the first suspect. After the session is configured, the framework calls back to the application through CameraCaptureSession.StateCallback.onConfigured() to report success; from that callback we can take the CameraCaptureSession parameter and call its setRepeatingRequest() method to start preview. Once that runs, the camera preview is up.
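As a closing sketch, assuming the preview Surface and background Handler from the earlier examples, starting preview from inside onConfigured() looks roughly like this:

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;

/** Sketch: kick off preview once onConfigured() reports that the session is ready. */
class PreviewStarter {
    void startPreview(CameraDevice device, CameraCaptureSession session,
            Surface previewSurface, Handler handler) throws CameraAccessException {
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        // A repeating request keeps the preview stream flowing; a null CaptureCallback is fine.
        session.setRepeatingRequest(builder.build(), /*callback*/ null, handler);
    }
}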