Class DetectionNetwork
Defined in File DetectionNetwork.hpp
Inheritance Relationships
Base Type
public dai::DeviceNodeGroup (Class DeviceNodeGroup)
Class Documentation
-
class DetectionNetwork : public dai::DeviceNodeGroup
DetectionNetwork, base for different network specializations.
Public Functions
-
std::shared_ptr<DetectionNetwork> build(Node::Output &input, const NNArchive &nnArchive)
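A minimal sketch of how this builder might be used when constructing a pipeline. The ColorCamera source, its preview output, and the archive path are illustrative assumptions, not part of this class:

#include "depthai/depthai.hpp"

dai::Pipeline pipeline;
// Hypothetical frame source; any Node::Output can be passed as the input.
auto camRgb = pipeline.create<dai::node::ColorCamera>();
// Placeholder path; NNArchive wraps a model archive loaded from disk.
dai::NNArchive nnArchive("yolov6n.rvc2.tar.xz");
auto detectionNetwork = pipeline.create<dai::node::DetectionNetwork>()->build(camRgb->preview, nnArchive);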
-
void setNNArchive(const NNArchive &nnArchive)
Set NNArchive for this Node. If the archive’s type is SUPERBLOB, the default number of shaves is used.
- Parameters:
nnArchive – NNArchive to set
-
void setNNArchive(const NNArchive &nnArchive, int numShaves)
Set NNArchive for this Node with an explicit number of shaves; throws if the archive’s type is not SUPERBLOB.
- Parameters:
nnArchive – NNArchive to set
numShaves – Number of shaves to use
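A short sketch of the two overloads; the archive path is a placeholder and the node is assumed to have been created on a pipeline as in the build() example above:

// assuming: auto nn = pipeline.create<dai::node::DetectionNetwork>();
dai::NNArchive nnArchive("yolov6n.superblob.tar.xz");  // placeholder path to a SUPERBLOB archive
nn->setNNArchive(nnArchive);     // uses the default number of shaves
nn->setNNArchive(nnArchive, 6);  // explicit shave count; throws unless the archive is a SUPERBLOB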
-
void setFromModelZoo(NNModelDescription description, bool useCached = true)
Download model from zoo and set it for this Node.
- Parameters:
description – Model description to download
useCached – Use cached model if available
-
void setFromModelZoo(NNModelDescription description, int numShaves, bool useCached = true)
Download model from zoo and set it for this Node.
- Parameters:
description – Model description to download
numShaves – Number of shaves to use
useCached – Use cached model if available
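A hedged sketch of the model-zoo setters; the exact fields of NNModelDescription and the model slug used here are assumptions:

// assuming: auto nn = pipeline.create<dai::node::DetectionNetwork>();
dai::NNModelDescription description;
description.model = "yolov6-nano";           // hypothetical model slug
nn->setFromModelZoo(description);            // reuse a cached download if available
nn->setFromModelZoo(description, 6, false);  // 6 shaves, force a fresh download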
-
void setBlobPath(const std::filesystem::path &path)
Load network blob into assets and use it once the pipeline is started.
- Throws:
Error – if file doesn’t exist or isn’t a valid network blob.
- Parameters:
path – Path to network blob
-
void setBlob(OpenVINO::Blob blob)
Load network blob into assets and use it once the pipeline is started.
- Parameters:
blob – Network blob
-
void setBlob(const std::filesystem::path &path)
Same functionality as setBlobPath(): load the network blob into assets and use it once the pipeline is started.
- Throws:
Error – if file doesn’t exist or isn’t a valid network blob.
- Parameters:
path – Path to network blob
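Both calls below load a compiled OpenVINO blob and throw if the file is missing or invalid; the paths are placeholders, and constructing the Blob directly from a path is an assumption here:

// assuming: auto nn = pipeline.create<dai::node::DetectionNetwork>();
nn->setBlobPath("mobilenet-ssd.blob");                   // load by path
nn->setBlob(dai::OpenVINO::Blob("mobilenet-ssd.blob"));  // equivalent, via an OpenVINO::Blob object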
-
void setModelPath(const std::filesystem::path &modelPath)
Load network model into assets.
- Parameters:
modelPath – Path to the model file.
-
void setNumPoolFrames(int numFrames)
Specifies how many frames will be available in the pool.
- Parameters:
numFrames – How many frames the pool will have
-
void setNumInferenceThreads(int numThreads)
Specifies how many threads the node should use to run the network.
- Parameters:
numThreads – Number of threads to dedicate to this node
-
void setNumNCEPerInferenceThread(int numNCEPerThread)
Specifies how many Neural Compute Engines a single thread should use for inference.
- Parameters:
numNCEPerThread – Number of NCE per thread
-
void setNumShavesPerInferenceThread(int numShavesPerThread)
Specifies how many shaves a single thread should use for inference.
- Parameters:
numShavesPerThread – Number of shaves per thread
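A sketch combining the pool and threading knobs; the specific values are illustrative only:

// assuming: auto nn = pipeline.create<dai::node::DetectionNetwork>();
nn->setNumPoolFrames(4);                // frames available in the pool
nn->setNumInferenceThreads(2);          // 0 = AUTO, otherwise 1 or 2
nn->setNumNCEPerInferenceThread(1);     // Neural Compute Engines per thread
nn->setNumShavesPerInferenceThread(6);  // shaves per thread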
-
void setBackend(std::string backend)
Specifies the backend to use.
- Parameters:
backend – String specifying backend to use
-
void setBackendProperties(std::map<std::string, std::string> properties)
Sets backend properties.
- Parameters:
properties – Backend properties map
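A sketch of the backend setters; the backend name and property keys shown are placeholders, not values documented here:

// assuming: auto nn = pipeline.create<dai::node::DetectionNetwork>();
nn->setBackend("snpe");  // hypothetical backend name
nn->setBackendProperties({{"runtime", "dsp"}, {"performance_profile", "default"}});  // hypothetical keys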
-
int getNumInferenceThreads()
Returns how many inference threads will be used to run the network.
- Returns:
Number of threads: 0, 1 or 2. Zero means AUTO.
-
void setConfidenceThreshold(float thresh)
Specifies the confidence threshold below which detections are filtered out.
- Parameters:
thresh – Detection confidence must be greater than the specified threshold to be added to the list
-
float getConfidenceThreshold() const
Retrieves the confidence threshold used to filter detections.
- Returns:
Detection confidence
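For example, keeping only detections above 50% confidence:

// assuming: auto nn = pipeline.create<dai::node::DetectionNetwork>();
nn->setConfidenceThreshold(0.5f);              // drop detections with confidence <= 0.5
float current = nn->getConfidenceThreshold();  // read back the active threshold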
-
virtual std::vector<std::pair<Input&, std::shared_ptr<Capability>>> getRequiredInputs() override
-
std::optional<std::vector<std::string>> getClasses() const
-
virtual void buildInternal() override
Function called from within the create function to build the node. This function is useful for initialization and for setting up inputs and outputs, i.e. work that cannot be performed in the constructor.
Public Members
-
Subnode<NeuralNetwork> neuralNetwork = {*this, "neuralNetwork"}
-
Subnode<DetectionParser> detectionParser = {*this, "detectionParser"}
-
Output &out
Outputs ImgDetections message that carries parsed detection results. Overrides NeuralNetwork ‘out’ with ImgDetections output message type.
-
Output &outNetwork
Outputs unparsed inference results.
-
Input &input
Input message with data to be inferred upon. Default queue is blocking with size 5.
-
Output &passthrough
Passthrough message on which the inference was performed.
Suitable for cases when the input queue is set to non-blocking behavior.
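A sketch of consuming these outputs from the host; the queue calls follow the usual DepthAI pattern (createOutputQueue(), typed get()), but treat the exact queue API as an assumption:

// assuming: detectionNetwork and pipeline created as in the build() example above
auto detectionsQueue = detectionNetwork->out.createOutputQueue();
auto passthroughQueue = detectionNetwork->passthrough.createOutputQueue();
pipeline.start();
auto detections = detectionsQueue->get<dai::ImgDetections>();  // parsed detection results
auto frame = passthroughQueue->get<dai::ImgFrame>();           // frame the inference ran on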
Public Static Functions
-
std::shared_ptr<DetectionNetwork> build(Node::Output &input, const NNArchive &nnArchive)