FAQ

How Can I Set Up and Preview Camera Feeds?

The Android SDK offers two camera implementations: VisionCameraSource and CameraXSource. (We recommend that you use CameraXSource.) A preview (CameraXSourcePreview or VisionCameraSourcePreview) should be added to the view hierarchy and passed into the camera source's constructor.

Relevant sample code can be found in the createCameraSource method within the sample app.
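
For illustration, a minimal sketch of the CameraX-based setup is shown below. The constructor arguments and the start() call are assumptions and may differ from the shipped sample; the authoritative wiring is in createCameraSource.

    // Sketch only: the CameraXSource constructor arguments and start() call are
    // assumptions; refer to createCameraSource in the sample app for the exact API.
    CameraXSourcePreview preview = findViewById(R.id.camera_preview); // preview added to the view hierarchy
    CameraXSource cameraSource = new CameraXSource(preview);          // preview is passed into the constructor
    cameraSource.start();                                             // assumed start call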

What Do I Do If a Device Is Experiencing Low Performance? (Few Detections Per Second)

This can happen on lower-end devices, in which case you can try the following:

  1. Reduce the camera frame resolution. Loop through the camera profiles returned in createCameraSource and select a lower resolution (see the sketch after this list).

    final CameraProfile cp = cameraProfiles.get(facing).get(0); // Select a lower-resolution profile instead of the first one
  2. If you're using a High Sensitivity detector, try reducing the detector input size.

    settings.setRetinaInputSize(...) // 0 - Extra small; 1 - Small; 2 - Normal; 3 - Large
  3. Switch to the Standard detector. (Note that this will adversely impact the detector's accuracy.)

    //faceDetectorServiceConfig.detectorModel = FDRuntimeOptions.FD_STANDARD; // Uncomment to enable
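
As a sketch of step 1, a lower resolution can be selected by iterating over the returned profiles instead of taking the first entry. The getWidth()/getHeight() accessor names below are assumptions and may differ in the actual CameraProfile API.

    // Sketch: pick the profile with the smallest resolution instead of the first one.
    // getWidth()/getHeight() are assumed accessor names.
    final List<CameraProfile> profiles = cameraProfiles.get(facing);
    CameraProfile cp = profiles.get(0);
    for (CameraProfile candidate : profiles) {
        if (candidate.getWidth() * candidate.getHeight() < cp.getWidth() * cp.getHeight()) {
            cp = candidate;
        }
    }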

How Can I Extract and Save Detected Faces?

A face bitmap is obtained from DetectedObject, specifically via the DetectedObject.getThumbnail() method. The method returns a downscaled bitmap whose size depends on the face's aspect ratio (the smaller side will be 240 px).

If a different size is needed, DetectedObject also exposes the face's pixel bounds, which can be used to extract the face from the original frame.

The sample code can be found in ShareManger.share(), where the face thumbnail is saved to local cache storage and shared via an Android share intent.
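
As a rough sketch of both approaches, the built-in thumbnail can be saved directly, or a custom-sized crop can be taken from the original frame using the detection bounds. The getBounds() accessor and the frameBitmap variable are assumptions here; only getThumbnail() is confirmed above.

    // Sketch: save the downscaled thumbnail (smaller side = 240 px) to the cache directory.
    Bitmap thumbnail = detectedObject.getThumbnail();
    try (FileOutputStream out = new FileOutputStream(new File(getCacheDir(), "face.png"))) {
        thumbnail.compress(Bitmap.CompressFormat.PNG, 100, out);
    } catch (IOException e) {
        Log.e("FaceExport", "Failed to save thumbnail", e);
    }

    // Sketch: crop a full-resolution face from the original frame using the pixel bounds.
    // getBounds() returning an android.graphics.Rect and frameBitmap are assumptions.
    Rect bounds = detectedObject.getBounds();
    Bitmap face = Bitmap.createBitmap(frameBitmap, bounds.left, bounds.top,
            bounds.width(), bounds.height());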

How Can I Use RGB Liveness Detection?

  1. Initialize the livenessActionRecognizer.

    livenessActionRecognizer = new LivenessActionRecognizer(settings);
    livenessActionRecognizer.setDelegate(this);
  2. Pass tracking results to the liveness recognizer.

    if (livenessActionRecognizer != null) {
        livenessActionRecognizer.update(trackingResult);
    }
  3. Read liveness info once received.

    final Boolean isLivenessConfirmed = trackedObject.isLivenessConfirmed();
    if (isLivenessConfirmed) {
        // trackedObject.getLivenessConfidence() can be queried here if needed
    }
  4. Pass the liveness object action to the object event log.

    @Override
    public void didRecognizeObjectAction(ObjectActionRecognizer objectActionRecognizer, ObjectAction objectAction, TrackedObject trackedObject) {
        if (objectActionRecognizer == livenessActionRecognizer) {
            DispatchQueue.Main.async(() -> overlay.onFaceLivenessConfirmed(trackedObject));
    
            // Propagate the objectAction to the objectEventLog in order to report it
            if (objectEventLog != null) {
                objectEventLog.update(trackedObject, objectAction);
            }
        }
    }

How Can I Play rtsp:// Streams?

Use AglVideoView to play rtsp:// streams.

The sample code can be found in VidiaFragment.java. Search for aglView references.
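
A hypothetical sketch of the playback setup is shown below; the setSource() and start() method names are purely illustrative assumptions, so take the exact calls from the aglView code in VidiaFragment.java.

    // Hypothetical sketch - method names are assumptions; see aglView usage in VidiaFragment.java.
    AglVideoView aglView = findViewById(R.id.agl_view);
    aglView.setSource("rtsp://<host>:554/<stream>"); // assumed setter for the stream URL
    aglView.start();                                 // assumed playback start call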

How Can I Configure AglVideoView to Use the HW Decoder?

Set the decoder type option to AglOptions.DECODER_MEDIACODEC.

The sample code can be found in VidiaFragment.java. Search for aglView references.
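
A hypothetical sketch, assuming the decoder type is set through an options object (only the AglOptions.DECODER_MEDIACODEC constant is confirmed above; the accessor and setter names are assumptions):

    // Hypothetical sketch - the options accessor and setter names are assumptions.
    AglOptions options = aglView.getOptions();
    options.setDecoderType(AglOptions.DECODER_MEDIACODEC); // select the hardware (MediaCodec) decoder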

How Can I Enable Event Reporting?

Initialize ObjectEventLog and PeopleIndexer, then pass the TrackingResult to ObjectEventLog.update().

The sample code can be found in VidiaFragment.java. Search for objectEventLog references.
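
A condensed sketch is shown below; the constructor arguments are assumptions, so take the exact wiring from the objectEventLog code in VidiaFragment.java.

    // Sketch - constructor arguments are assumptions; see objectEventLog usage in VidiaFragment.java.
    peopleIndexer = new PeopleIndexer(settings);
    objectEventLog = new ObjectEventLog(settings);

    // Feed each tracking result into the event log so events get reported.
    objectEventLog.update(trackingResult);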

How Can I Process rtsp:// Streams in the Background?

The demo app provides an example service that plays rtsp:// streams and feeds frames into the ObjectTracker for face detection. A notification is posted when a person from the stream is recognized.

Usually, the service should be started and stopped from the Activity's onPause() and onResume() callbacks, respectively.

This isn't SDK-related, but there is sample code that schedules the start/stop of the service, located in VidiaFragment.java. Search for CameraProcessorService references.
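
A minimal sketch of the scheduling part, using the standard Android startService()/stopService() calls (the processing logic itself lives in CameraProcessorService and its configuration in VidiaFragment.java):

    // Start background processing when the UI goes away, stop it when the UI returns.
    @Override
    protected void onPause() {
        super.onPause();
        startService(new Intent(this, CameraProcessorService.class));
    }

    @Override
    protected void onResume() {
        super.onResume();
        stopService(new Intent(this, CameraProcessorService.class));
    }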

See Also