Android SDK Objects

Cloud Environment and Settings

This section describes the SAFR SDK objects which are used to control which SAFR Cloud environment and account should be used by the object recognition functionality implemented by the SDK.

Settings and ModeSettings

Stores the global, SDK-wide configuration in SharedPreferences. Note that this object is just a store; new settings are not applied automatically.

For every SDK object, there is an appropriate factory method which returns its configuration. So when a setting is changed, any potentially affected SDK object should be re-created. (In some cases, only its configuration needs to be updated.)

A client app can register a callback that is invoked when a single or bulk settings change is made, at which point the client app should re-create the affected objects or update their configurations.
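For instance, a minimal sketch of registering such a callback; the registration method name here is an assumption, while the onSettingsChange(HashSet<String>) signature matches the callback shown at the end of this section:

// Sketch only: the registration method name is an assumption.
settings.registerOnSettingsChangeListener(changedKeys -> {
    // changedKeys is the HashSet<String> of settings keys that changed.
    // Re-create affected SDK objects or update their configurations here.
});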

The difference between Settings and ModeSettings is that the former is the same across all modes, whereas the latter may be configurable per mode/preset.

ModeSettings.Identifier

A configuration preset (mode) for a specific SDK use case. For example, recognition is a configuration optimized for simple face recognition without event reporting.

CloudEnvironment

The CloudEnvironment object encapsulates the cloud service to which ArgusKit should connect to execute cloud-based functions like face recognition.

CloudAccount

The CloudAccount object stores the account name and password for the cloud account that should be used for cloud-based functionality like face recognition.

In most cases, it should not be accessed directly but should instead be accessed via the Settings object.

Tracking Objects in a Video

This section describes the SDK objects which you use to detect and recognize objects. The most important SDK object in this category is the object tracker.

The object tracker allows you to detect and recognize faces, get information about them, and it takes care of tracking their location and identity over time.

ObjectTracker.Configuration

An object tracker configuration stores all required and optional settings for an object tracker instance. You create a configuration instance and fill it in with the desired options before you create the object tracker.
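For example, a minimal sketch of that flow; the constructor signatures here are assumptions, not the confirmed API:

// Sketch only: constructor signatures are assumptions.
ObjectTracker.Configuration configuration = new ObjectTracker.Configuration();
// ... set the desired required/optional options on the configuration ...
ObjectTracker objectTracker = new ObjectTracker(configuration, delegate);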

ObjectTracker

The object tracker is the heart of ArgusKit. It receives a stream of video frames which it analyzes to find objects, and it tracks the objects it finds for as long as they remain visible in the video stream. Object detection, recognition, and tracking are executed in real time. An object tracker is connected to a delegate which you pass to it at creation time. The ObjectTracker informs its delegate on every frame about the current state of the objects which it is tracking. The delegate can then use the tracked object APIs to learn what kind of objects the tracker found and what their current spatial location, size, and state are.

The objectTrackerDidTrack() callback is invoked on every frame update. The result of processing a given frame is wrapped in a TrackingResult object as lists of TrackedObjects.

In essence, a TrackedObject just holds the details about the object being tracked; at the moment, it simply represents a face. To differentiate what has actually changed within a given frame, TrackingResult provides APIs for the objects that have appeared, have been updated, and have disappeared, as described below.

TrackingResult

A TrackingResult object contains a snapshot of the current state of the object tracker. The object tracker delivers a new tracking result at every frame boundary. The tracking result contains lists of the tracked objects which have appeared in the current frame, which have disappeared, and which have changed their state in the current frame. For example, if a tracked object was detected in a previous frame and the object tracker has now been able to recognize it as a specific identity, then the tracked object is included in the list of updated tracked objects.
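A hedged sketch of consuming these lists in the tracking callback; the accessor names and the exact delegate signature are assumptions, while the three lists themselves are described above:

// Sketch only: the accessor names and delegate signature are assumptions.
@Override
public void objectTrackerDidTrack(ObjectTracker objectTracker, TrackingResult trackingResult) {
    for (TrackedObject appeared : trackingResult.getAppeared()) {
        // A new object entered the scene in this frame.
    }
    for (TrackedObject updated : trackingResult.getUpdated()) {
        // An existing object changed state, e.g. it was recognized as a specific identity.
    }
    for (TrackedObject disappeared : trackingResult.getDisappeared()) {
        // The object is no longer visible in the video stream.
    }
}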

DetectedObject

Common interface for all types of detected objects.

Source
/**
 * Common interface for all types of detected objects.
 */
public interface DetectedObject {

    /**
     * The type of the object.
     *
     * @return the object type
     */
    DetectedObjectType getObjectType();

    /**
     * The local ID of the object. Local IDs for detected objects are only unique with
     * respect to the single frame in which the object was detected.
     *
     * @return the local ID
     */
    Long getLocalId();

    /**
     * The bounding box of the object inside of the image in which it was detected.
     * Note that the bounding box is in normalized coordinates [0, 1] x [0, 1].
     *
     * @return the normalized bounding box
     */
    RectF getNormalizedBounds();

    /**
     * Returns the expansion factor by which the object bounds have been expanded
     * to cut out the object thumbnail. Note that the object thumbnail is usually
     * bigger than the area enclosed by the normalized bounds; this is the factor
     * by which it was expanded.
     *
     * @return the thumbnail bounds expansion factor
     */
    float getThumbnailBoundsExpansionFactor();

    /**
     * How confident the object detector is that this is actually an object.
     *
     * @return the detection confidence
     */
    float getConfidence();

    /**
     * How suitable this object is for object recognition.
     *
     * @return the center pose quality
     */
    double getCenterPoseQuality();

    /**
     * The yaw of the object, measured as the object turns left/right.
     *
     * @return -1 for left, 0 for center, and +1 for right
     */
    double getYaw();

    /**
     * The pitch of the object, measured as the object moves up/down.
     *
     * @return -1 for straight down, 0 for center, and +1 for straight up
     */
    double getPitch();

    /**
     * The roll of the object, measured as the object tilts left/right.
     *
     * @return -1 for left, 0 for center, and +1 for right
     */
    double getRoll();

    /**
     * The sharpness score of the image that was used for detection.
     *
     * @return the image sharpness quality
     */
    double getImageSharpnessQuality();

    /**
     * The contrast quality of the image from which this object was taken.
     *
     * @return the image contrast quality
     */
    double getImageContrastQuality();

    /**
     * The object thumbnail.
     *
     * @return the thumbnail image
     */
    Bitmap getThumbnail();

    /**
     * The scene image - the image of the scene from which the object thumbnail was extracted.
     *
     * @return the scene image
     */
    Bitmap getSceneThumbnail();

    /**
     * Data that may be passed to the object recognizer to speed up recognition.
     *
     * @return the recognizer hint
     */
    String getRecognizerHint();

    /**
     * The maximum ratio of clipping on any side. A non-zero value indicates a
     * potentially partial object whose bounding box extends beyond the video
     * frame boundaries.
     *
     * @return the clip ratio
     */
    float getClipRatio();

    /**
     * The bounding box before normalization, in pixels.
     *
     * @return the pixel bounding box
     */
    RectF getPixelBounds();

    enum DetectedObjectType {
        badge,
        face,
        recognizedObject
    }
}

TrackedObject

A tracked object represents a single and unique instance of an object that the object tracker has detected in the video stream and is actively tracking. A tracked object has a type that indicates whether it is a badge (not yet supported) or the face of a person. A tracked object also has an axis-aligned bounding box which tells you where in the input video frame the tracked object can be found and what its size is.

TrackedObject provides the getDetectedObject() method, whose return type is DetectedObject, the common interface for all types of detected objects; it is used to get the face information.

This can be referenced in the sample app's GraphicOverlay class, under the drawFace() method.
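For example, a minimal sketch along the lines of that drawFace() method; the DetectedObject accessors used here come from the interface above, while the drawing itself is elided:

void drawFace(TrackedObject trackedObject) {
    DetectedObject face = trackedObject.getDetectedObject();
    RectF bounds = face.getNormalizedBounds(); // normalized [0, 1] x [0, 1]
    float confidence = face.getConfidence();
    Bitmap thumbnail = face.getThumbnail();
    // ... map the normalized bounds to view coordinates and draw ...
}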

TrackedFace

A TrackedFace represents a human face. A TrackedFace may be linked to a person: if the object tracker is able to recognize the face as belonging to a specific person, then the TrackedFace provides a reference to the corresponding Person object.

Person

A Person object provides information about a person that has been registered with the ArgusKit face recognition service. Each person has a unique identifier. You may assign a name and a set of tags to a person with the help of a TrackedFaceChange object and the ObjectTracker.apply() function.

TrackedFaceChange

A TrackedFaceChange object stores attributes that should be applied to the person record on file in the ArgusKit face recognition service. This object allows you to change a person's name, tags, age, or gender information.
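A hedged sketch of that flow; ObjectTracker.apply() is named above, but the TrackedFaceChange constructor and setter names are assumptions:

// Sketch only: the constructor and setter names are assumptions;
// ObjectTracker.apply() is named in the text above.
TrackedFaceChange change = new TrackedFaceChange(trackedFace);
change.setName("Jane Doe");
change.setTags(tags);
objectTracker.apply(change);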

VideoFrame

A VideoFrame object encapsulates a single decoded video frame which should be passed to an object tracker instance.

Event Reporting

This section describes the SDK objects which you use to generate events from object tracker results and to post them to the event server in the SAFR Cloud.

CloudEventStore

This object acts as a cloud store for events. It is needed by the PeopleIndexer object which is the main interface to the event reporting system.

ObjectEventDataStore

This object acts as an intermediate local store for events. It is needed by the PeopleIndexer object which is the main interface to the event reporting system.

ObjectEventLog

The object event log records object tracker results. You should invoke the update() method with the current object tracker result every time the object tracker invokes your didTrack() event handler.
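For example (update() is described above; the delegate method name is an assumption based on the didTrack() handler mentioned in the text):

@Override
public void objectTrackerDidTrack(ObjectTracker objectTracker, TrackingResult trackingResult) {
    // Record the current tracker result so events can be generated from it.
    objectEventLog.update(trackingResult);
}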

PeopleIndexer

The PeopleIndexer object ties together the object event store and the object event log. In essence, it coordinates the event reporting mechanism. Pass your configuration object when you create an instance of the people indexer.

The PeopleIndexer provides hooks/callbacks which are invoked when an event changes its state or when it's done processing events.

interface PeopleIndexerDelegate {
    /**
     * Called when the event is started.
     */
    fun peopleIndexerDidStartEvent(peopleIndexer: PeopleIndexer?, personEvent: PersonEvent?)
 
    /**
     * Called when the event is updated.
     */
    fun peopleIndexerDidUpdateEvent(
        peopleIndexer: PeopleIndexer?,
        personEvent: PersonEvent?,
        updatedProperties: PersonEventUpdatableProperties?
    )
 
    /**
     * Called when the event has ended.
     */
    fun peopleIndexerDidEndEvent(peopleIndexer: PeopleIndexer?, personEvent: PersonEvent?)
 
    /**
     * Called when there are no ongoing events
     */
    fun peopleIndexerDidEndProcessingEvents(peopleIndexer: PeopleIndexer?)
}

PeopleIndexer.Configuration

The PeopleIndexer.Configuration object stores all relevant configuration information for the event reporting system. Create an instance of this object to get a default configuration and then set the cloud account and environment information to enable reporting to the SAFR Cloud event server.
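A minimal sketch of the create-then-configure flow described above; the setter names are assumptions:

// Sketch only: setter names are assumptions.
PeopleIndexer.Configuration configuration = new PeopleIndexer.Configuration();
configuration.setCloudAccount(cloudAccount);
configuration.setCloudEnvironment(cloudEnvironment);
PeopleIndexer peopleIndexer = new PeopleIndexer(configuration);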

Analyzing Images

This section describes the image analyzer object which is used to find faces in an image.

ImageAnalyzer

A utility class that performs a one-shot analysis of a single image, processing it in order to find faces. Object detection and recognition are executed in real time.

ImageAnalysisResult

An ImageAnalysisResult object contains a snapshot of what was detected in an image. The image analyzer result contains a list of objects that have appeared in the current image.
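A hedged sketch of a one-shot analysis; the analyze() and getDetectedObjects() method names are assumptions, not confirmed API:

// Sketch only: the method names below are assumptions.
ImageAnalyzer imageAnalyzer = new ImageAnalyzer();
ImageAnalysisResult analysisResult = imageAnalyzer.analyze(bitmap);
for (DetectedObject detectedObject : analysisResult.getDetectedObjects()) {
    // Inspect each detected face, e.g. its bounds and confidence.
}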

Action Recognition

LivenessActionRecognizer

Used for RGB liveness detection. Feed it the TrackingResult objects produced by the ObjectTracker. The delegate's didRecognizeObjectAction callback is invoked when the liveness state for a given TrackedObject is concluded.

SmileActionRecognizer

Used for smile detection. Feed it the TrackingResult objects produced by the ObjectTracker. The delegate's didRecognizeObjectAction callback is invoked when the detectedSmile state for a given TrackedObject is concluded.
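A minimal sketch of the feeding step; update() is an assumed method name, while didRecognizeObjectAction is named above:

// Sketch only: update() is an assumed method name for feeding results.
smileActionRecognizer.update(trackingResult);
// When the detectedSmile (or liveness) state for a tracked object is
// concluded, the recognizer invokes its delegate's
// didRecognizeObjectAction callback.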

Settings and ModeSettings Configuration Options

Generally, Settings should be configured before initializing/creating other SDK objects. The Settings object provides an option to register a callback, which is invoked when any of the settings are changed.

To prevent multiple callback invocations, settings should be changed in bulk mode:

// Begin bulk change
settings.beginBulkChange();
// update configuration
 
settings.setUserIdentifier("user");
settings.setUserPassword("password");
settings.setUserDirectory("directory");
ModeSettings modeSettings = settings.getModeSettings();
 
modeSettings.setFaceRecognizerDetectIdentity(true);  
modeSettings.setFaceRecognizerDetectGender(true);
modeSettings.setFaceRecognizerDetectAge(false);
modeSettings.setFaceRecognizerDetectSentiment(true);
modeSettings.setFaceRecognizerDetectOcclusion(false);
 
// End bulk change
settings.endBulkChange();   

Once done, the callback is invoked with the set of changed keys:

@Override
public void onSettingsChange(@NonNull HashSet<String> hashSet) {
 
    // e.g. re-create SDK objects
 
    // Or restart activity
    new Handler().post(() -> getActivity().recreate());
}

See Also