Class FeatureTrackerConfig

Nested Relationships

Nested Types

struct CornerDetector
struct CornerDetector::Thresholds
struct FeatureMaintainer
struct MotionEstimator
struct MotionEstimator::OpticalFlow

Inheritance Relationships

Base Type

public dai::Buffer

Class Documentation

class FeatureTrackerConfig : public dai::Buffer

FeatureTrackerConfig message. Carries the configuration for the feature tracking algorithm.
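For orientation, here is a minimal sketch of assembling a config message with the setters documented below. Only the FeatureTrackerConfig calls are taken from this page; the dai namespace qualification and the umbrella include are assumptions, and the chosen values are illustrative. Every setter returns a reference to the message, so calls can be chained.

    #include "depthai/depthai.hpp"  // assumed umbrella header

    int main() {
        dai::FeatureTrackerConfig cfg;
        cfg.setCornerDetector(dai::FeatureTrackerConfig::CornerDetector::Type::HARRIS)
           .setNumTargetFeatures(256)    // target number of features to detect
           .setOpticalFlow()             // Lucas-Kanade optical flow for motion estimation
           .setFeatureMaintainer(true);  // keep maintaining features between frames
        return 0;
    }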

Public Functions

FeatureTrackerConfig() = default

Construct FeatureTrackerConfig message.

virtual ~FeatureTrackerConfig()
FeatureTrackerConfig &setCornerDetector(CornerDetector::Type cornerDetector)

Set corner detector algorithm type.

Parameters:

cornerDetector – Corner detector type, HARRIS or SHI_THOMASI

FeatureTrackerConfig &setCornerDetector(CornerDetector config)

Set corner detector full configuration.

Parameters:

config – Corner detector configuration
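As a sketch of the struct-based overload, a CornerDetector can be populated using the members documented further down on this page and passed in whole; the dai namespace is assumed and the values are illustrative, not recommendations.

    dai::FeatureTrackerConfig::CornerDetector detector;
    detector.type = dai::FeatureTrackerConfig::CornerDetector::Type::SHI_THOMASI;
    detector.cellGridDimension = 2;    // split the image into 2x2 cells
    detector.numTargetFeatures = 512;  // target number of features to detect
    detector.enableSobel = true;       // smooth gradients with the 3x3 Sobel operator
    detector.enableSorting = true;     // sort detected features by score

    dai::FeatureTrackerConfig cfg;
    cfg.setCornerDetector(detector);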

FeatureTrackerConfig &setOpticalFlow()

Set optical flow as motion estimation algorithm type.

FeatureTrackerConfig &setOpticalFlow(MotionEstimator::OpticalFlow config)

Set optical flow full configuration.

Parameters:

config – Optical flow configuration
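A sketch of the full optical flow overload, using the OpticalFlow members documented in the nested struct below; values are illustrative.

    dai::FeatureTrackerConfig::MotionEstimator::OpticalFlow flow;
    flow.pyramidLevels = 4;      // fix the pyramid depth instead of AUTO
    flow.searchWindowWidth = 7;  // odd, at most 9; larger patches track larger motion per level
    flow.searchWindowHeight = 7;
    flow.epsilon = 0.05f;        // looser convergence criterion, shorter refinement
    flow.maxIterations = 6;      // cap refinement iterations per pyramid level

    dai::FeatureTrackerConfig cfg;
    cfg.setOpticalFlow(flow);    // selects optical flow and applies this configuration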

FeatureTrackerConfig &setHwMotionEstimation()

Set hardware accelerated motion estimation using block matching. Faster than optical flow (software implementation) but might not be as accurate.
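For comparison, switching the same message to the hardware block matcher is a single call (a sketch; trade-off as described above):

    dai::FeatureTrackerConfig cfg;
    cfg.setHwMotionEstimation();  // block-matching hardware estimator: faster, possibly less accurate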

FeatureTrackerConfig &setNumTargetFeatures(std::int32_t numTargetFeatures)

Set number of target features to detect.

Parameters:

numTargetFeatures – Number of features

FeatureTrackerConfig &setMotionEstimator(bool enable)

Enable or disable motion estimator.

Parameters:

enable – True to enable the motion estimator, false to disable it

FeatureTrackerConfig &setMotionEstimator(MotionEstimator config)

Set motion estimator full configuration.

Parameters:

config – Motion estimator configuration
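A sketch of configuring the motion estimator through the nested struct documented below; only the fields shown on this page are used, and the values are illustrative.

    dai::FeatureTrackerConfig::MotionEstimator estimator;
    estimator.enable = true;
    estimator.type = dai::FeatureTrackerConfig::MotionEstimator::Type::LUCAS_KANADE_OPTICAL_FLOW;
    estimator.opticalFlow.pyramidLevels = 3;  // only read when type is LUCAS_KANADE_OPTICAL_FLOW

    dai::FeatureTrackerConfig cfg;
    cfg.setMotionEstimator(estimator);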

FeatureTrackerConfig &setFeatureMaintainer(bool enable)

Enable or disable feature maintainer.

Parameters:

enable – True to enable the feature maintainer, false to disable it

FeatureTrackerConfig &setFeatureMaintainer(FeatureMaintainer config)

Set feature maintainer full configuration.

Parameters:

config – Feature maintainer configuration
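A sketch of the struct-based overload for the feature maintainer, using the FeatureMaintainer members documented below; values are illustrative.

    dai::FeatureTrackerConfig::FeatureMaintainer maintainer;
    maintainer.enable = true;
    maintainer.minimumDistanceBetweenFeatures = 100;  // squared pixel distance between kept features
    maintainer.lostFeatureErrorThreshold = 50000;     // drop features whose tracking error exceeds this
    maintainer.trackedFeatureThreshold = 200000;      // drop tracked features whose score becomes too weak

    dai::FeatureTrackerConfig cfg;
    cfg.setFeatureMaintainer(maintainer);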

virtual void serialize(std::vector<std::uint8_t> &metadata, DatatypeEnum &datatype) const override
DEPTHAI_SERIALIZE(FeatureTrackerConfig, cornerDetector, motionEstimator, featureMaintainer)

Public Members

CornerDetector cornerDetector

Corner detector configuration. Used for feature detection.

MotionEstimator motionEstimator

Motion estimator configuration. Used for feature reidentification between current and previous features.

FeatureMaintainer featureMaintainer

FeatureMaintainer configuration. Used for feature maintaining.

Public Static Attributes

static constexpr const std::int32_t AUTO = 0
struct CornerDetector

Corner detector configuration structure.

Public Types

enum class Type : std::int32_t

Values:

enumerator HARRIS

Harris corner detector.

enumerator SHI_THOMASI

Shi-Tomasi corner detector.

Public Members

Type type = Type::HARRIS

Corner detector algorithm type.

std::int32_t cellGridDimension = 4

Ensures distributed feature detection across the image. The image is divided into horizontal and vertical cells; each cell has a target feature count of numTargetFeatures / cellGridDimension and its own feature threshold. A value of 4 means the image is divided into 4x4 cells of equal width/height. Maximum 4, minimum 1.

std::int32_t numTargetFeatures = 320

Target number of features to detect. Maximum number of features is determined at runtime based on algorithm type.

std::int32_t numMaxFeatures = AUTO

Hard limit for the maximum number of features that can be detected. 0 means auto, will be set to the maximum value based on memory constraints.

bool enableSobel = true

Enable the 3x3 Sobel operator to smooth the image whose gradient is to be computed. If disabled, a simple 1D row/column differentiator is used for the gradient.

bool enableSorting = true

Enable or disable sorting of detected features based on their score.

Thresholds thresholds

Threshold settings. These are advanced settings, suitable for debugging/special cases.

struct Thresholds

Threshold settings structure for corner detector.

Public Functions

DEPTHAI_SERIALIZE(Thresholds, initialValue, min, max, decreaseFactor, increaseFactor)

Public Members

float initialValue = AUTO

Minimum strength of a feature which will be detected. 0 means the threshold is updated automatically, which is recommended so the tracker can adapt to different scenes/textures. Each cell has its own threshold. Empirical value.

float min = AUTO

Minimum limit for the threshold. Applicable when automatic threshold update is enabled. 0 means auto: 6000000 for HARRIS, 1200 for SHI_THOMASI. Empirical value.

float max = AUTO

Maximum limit for threshold. Applicable when automatic threshold update is enabled. 0 means auto. Empirical value.

float decreaseFactor = 0.9f

When the number of detected features in a cell exceeds the maximum, the threshold is lowered by multiplying its value by this factor.

float increaseFactor = 1.1f

When the number of detected features in a cell does not exceed the maximum, the threshold is increased by multiplying its value by this factor.
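No dedicated setter for the thresholds is documented here, but they are reachable through the public cornerDetector member listed above. A sketch with illustrative values (the AUTO defaults are usually appropriate):

    dai::FeatureTrackerConfig cfg;
    cfg.cornerDetector.thresholds.initialValue = 2000.0f;  // start from a fixed threshold instead of AUTO
    cfg.cornerDetector.thresholds.min = 500.0f;            // floor for the automatic update
    cfg.cornerDetector.thresholds.max = 8000.0f;           // ceiling for the automatic update
    cfg.cornerDetector.thresholds.decreaseFactor = 0.8f;   // lower the threshold faster when a cell exceeds its maximum
    cfg.cornerDetector.thresholds.increaseFactor = 1.2f;   // raise it faster when a cell stays within its maximum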

struct FeatureMaintainer

FeatureMaintainer configuration structure.

Public Members

bool enable = true

Enable or disable feature maintaining.

float minimumDistanceBetweenFeatures = 50

Used to filter out detected feature points that are too close to each other. Requires sorting to be enabled in the detector. The unit of measurement is squared Euclidean distance in pixels.

float lostFeatureErrorThreshold = 50000

Optical flow measures the tracking error for every feature. If a point cannot be tracked or leaves the image, the error is set to a maximum value. This threshold defines the level at which tracking accuracy is considered too poor to keep the point.

float trackedFeatureThreshold = 200000

Once a feature is detected and tracking has started, its Harris score is updated on each image, because a feature point can disappear or become too weak to be tracked. This threshold defines the point at which such a feature must be dropped. Since the goal of the algorithm is to provide longer tracks, strong points are added and tracked until they become untrackable, which is why this value is usually smaller than the detection threshold.

struct MotionEstimator

Used for feature reidentification between current and previous features.

Public Types

enum class Type : std::int32_t

Values:

enumerator LUCAS_KANADE_OPTICAL_FLOW

Using the pyramidal Lucas-Kanade optical flow method.

enumerator HW_MOTION_ESTIMATION

Using a dense motion estimation hardware block (Block matcher).

Public Functions

DEPTHAI_SERIALIZE(MotionEstimator, enable, type, opticalFlow)

Public Members

bool enable = true

Enable or disable motion estimation.

Type type = Type::LUCAS_KANADE_OPTICAL_FLOW

Motion estimator algorithm type.

OpticalFlow opticalFlow

Optical flow configuration. Takes effect only if the MotionEstimator algorithm type is set to LUCAS_KANADE_OPTICAL_FLOW.

struct OpticalFlow

Optical flow configuration structure.

Public Members

std::int32_t pyramidLevels = AUTO

Number of pyramid levels, only for optical flow. AUTO means the value is chosen based on input resolution: 3 if the image width <= 640, otherwise 4. Valid values are 3 or 4 for VGA resolution, and 4 for 720p and above.

std::int32_t searchWindowWidth = 5

Image patch width used to track features. Must be an odd number, at most 9. A value of N means the algorithm can track motion of at most (N-1)/2 pixels in each direction per pyramid level. Increasing this number increases runtime.

std::int32_t searchWindowHeight = 5

Image patch height used to track features. Must be an odd number, at most 9. A value of N means the algorithm can track motion of at most (N-1)/2 pixels in each direction per pyramid level. Increasing this number increases runtime.

float epsilon = 0.01f

Feature tracking termination criteria. Optical flow will refine the feature position on each pyramid level until the displacement between two refinements is smaller than this value. Decreasing this number increases runtime.

std::int32_t maxIterations = 9

Feature tracking termination criteria. Optical flow will refine the feature position at most this many times on each pyramid level. If the epsilon criterion described above is not met after this number of iterations, the algorithm continues with the currently calculated value. Increasing this number increases runtime.
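The same optical flow parameters are also reachable directly through the public motionEstimator member documented above, which can be convenient for tweaking a single field; a sketch with illustrative values:

    dai::FeatureTrackerConfig cfg;
    cfg.motionEstimator.type = dai::FeatureTrackerConfig::MotionEstimator::Type::LUCAS_KANADE_OPTICAL_FLOW;
    cfg.motionEstimator.opticalFlow.searchWindowWidth = 9;   // widest allowed patch, tracks larger motion per level
    cfg.motionEstimator.opticalFlow.searchWindowHeight = 9;
    cfg.motionEstimator.opticalFlow.maxIterations = 12;      // more refinement iterations, higher runtime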