Optimize Statistical Results

The topics in this section present useful tips and tricks for improving the statistical performance of your VisionPro Deep Learning application.

 

Optimize Precision, Recall, F-Score

Each of the VisionPro Deep Learning tools has a tolerance that can be adjusted, which allows the tool to be more or less "picky" in its predictions.

Blue Read Tool
  • Where to adjust: Feature location, Feature score, Match data
  • What's being adjusted: Feature Score

Blue Locate Tool
  • Where to adjust: Feature location, Feature score, Match data
  • What's being adjusted: Feature Score

Green Classify Tool
  • Where to adjust: Match scores
  • What's being adjusted: Class Score

Red Analyze Tool
  • Where to adjust: Defect area, Score
  • What's being adjusted: Defect Probability

In addition, you can adjust each tool's Threshold parameter.

Changing the Threshold parameter affects Recall and Precision in the following ways:

Threshold                  Recall       Precision
Lowering the Threshold     Increases    Decreases
Increasing the Threshold   Decreases    Increases

Changing the Threshold parameter affects False Positives and False Negatives in the following ways:

Threshold                  False Positives    False Negatives
Lowering the Threshold     Increases          Decreases
Increasing the Threshold   Decreases          Increases
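As an illustration of both tables, the following minimal sketch (plain Python; the scores and labels are invented for this example and are not produced by any VisionPro Deep Learning API) counts False Positives and False Negatives at a low and a high threshold, and derives Recall and Precision from the same counts.

```python
# Hypothetical prediction scores and ground-truth labels (1 = true positive case).
# These values are illustrative only; they do not come from the tool.
scores = [0.15, 0.35, 0.55, 0.62, 0.71, 0.88, 0.93]
labels = [0,    0,    1,    0,    1,    1,    1   ]

def confusion_counts(threshold):
    """Count TP, FP, FN when everything at or above the threshold is accepted."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s <  threshold and y == 1)
    return tp, fp, fn

for threshold in (0.3, 0.7):  # a low and a high threshold
    tp, fp, fn = confusion_counts(threshold)
    recall    = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    print(f"threshold={threshold}: FP={fp} FN={fn} "
          f"recall={recall:.2f} precision={precision:.2f}")
```

In this toy run the lower threshold produces more False Positives and fewer False Negatives (higher Recall, lower Precision), while the higher threshold does the reverse, matching the two tables above.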

 

If you are seeking a balanced rate of False Positive (FP) and False Negative (FN) results, optimize the F-Score (both overall and averaged).
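To make the F-Score criterion concrete, the short sketch below (plain Python with invented scores and labels, not drawn from any VisionPro Deep Learning API) sweeps a set of candidate thresholds and reports the one with the highest F-Score, using F = 2 × Precision × Recall / (Precision + Recall).

```python
# Illustrative scores and ground-truth labels (not produced by the tool).
scores = [0.15, 0.35, 0.55, 0.62, 0.71, 0.88, 0.93]
labels = [0,    0,    1,    0,    1,    1,    1   ]

def f_score(threshold):
    """F-Score obtained when everything at or above the threshold is accepted."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s <  threshold and y == 1)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall    = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Sweep candidate thresholds and keep the one with the best F-Score.
best_f, best_t = max((f_score(t), t) for t in [i / 20 for i in range(1, 20)])
print(f"best F-Score {best_f:.2f} at threshold {best_t:.2f}")
```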

Note: For Blue Locate, a low precision score is less problematic if you have a Model to filter out false positives.

 

Optimize Red Analyze Results

To optimize the performance of the Red Analyze tool, keep in mind the following:

  • Prefer the AUC of untrained items (samples that were not used for training).
  • Prefer the region-based AUC.

    • This requires that you mark all bad samples.
    • It is usually much lower than the view-based AUC.
  • The Threshold setting depends on the requirements of the application.

    • Use a high threshold to avoid false positives.
    • Use a low threshold to avoid false negatives.
    • Otherwise, a good compromise is the threshold where %FP = %FN (see the sketch after this list).
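
As a sketch of these last ideas, the example below computes a simple view-level AUC and then the threshold where %FP ≈ %FN, the compromise mentioned in the last bullet. The defect probabilities and ground-truth labels are invented for illustration; they are not produced by the Red Analyze tool, and this generic view-level AUC is not the tool's region-based AUC.

```python
# Hypothetical defect probabilities with ground truth (1 = truly defective view).
# These values are illustrative only; they are not Red Analyze output.
probs  = [0.05, 0.12, 0.30, 0.42, 0.58, 0.66, 0.81, 0.95]
labels = [0,    0,    0,    1,    0,    1,    1,    1   ]

# Rank-based AUC: probability that a defective view scores above a good view.
pairs = [(b, g) for b, yb in zip(probs, labels) if yb == 1
                for g, yg in zip(probs, labels) if yg == 0]
auc = sum(1.0 if b > g else 0.5 if b == g else 0.0 for b, g in pairs) / len(pairs)
print(f"view-level AUC = {auc:.2f}")

def rates(threshold):
    """False-positive rate on good views and false-negative rate on bad views."""
    fp = sum(1 for p, y in zip(probs, labels) if p >= threshold and y == 0)
    fn = sum(1 for p, y in zip(probs, labels) if p <  threshold and y == 1)
    return fp / labels.count(0), fn / labels.count(1)

# Pick the threshold where %FP and %FN are closest to each other.
candidates = [i / 20 for i in range(1, 20)]
best = min(candidates, key=lambda t: abs(rates(t)[0] - rates(t)[1]))
fp_rate, fn_rate = rates(best)
print(f"threshold = {best:.2f}  %FP = {fp_rate:.0%}  %FN = {fn_rate:.0%}")
```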