Training Tool Parameters

The Training tool parameters control the training process.


Training Set

Launches the Training Set dialog, which is used to specify the sample sets and percentage of labeled images to randomly select as training samples for the neural network each time a new training is initiated.
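The random selection described above can be sketched as follows. This is a minimal illustration of picking a percentage of labeled images as training samples; the function name and seed handling are hypothetical, not part of the tool's API:

```python
import random

def select_training_samples(labeled_images, percentage, seed=None):
    """Randomly select `percentage` percent of the labeled images as
    training samples, as happens each time a new training is initiated.
    (Hypothetical helper for illustration only.)"""
    rng = random.Random(seed)
    k = max(1, round(len(labeled_images) * percentage / 100))
    return rng.sample(labeled_images, k)

images = [f"img_{i:03}.png" for i in range(100)]
samples = select_training_samples(images, 80, seed=1)
print(len(samples))  # 80
```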

Epoch Count

Specifies the number of optimization iterations done during training. This setting can be reduced when your application has limited complexity, or when a lower quality model is acceptable while testing different parameter settings.

An epoch is the term for passing your entire Training Set into the neural network.
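The relationship between epochs and optimization steps can be sketched as a nested loop: each epoch feeds every sample in the Training Set through the network once. This is a conceptual sketch; `step_fn` is a hypothetical stand-in for one optimization step:

```python
def train(step_fn, training_set, epoch_count):
    """Run `epoch_count` epochs; each epoch is one full pass of the
    entire training set through the network. Returns the total number
    of optimization steps performed (illustrative sketch only)."""
    steps = 0
    for _ in range(epoch_count):
        for sample in training_set:  # one full pass = one epoch
            step_fn(sample)
            steps += 1
    return steps

# 3 training images seen 50 times each:
print(train(lambda s: None, ["a.png", "b.png", "c.png"], 50))  # 150
```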

The tools typically need to see the training image set on the order of 50 times. This is the default setting (located in the Fine Tuning area) and is typically sufficient for most standard applications.

Using too few epochs can leave the neural network stuck in learning, or unable to accurately solve the problem, while too many epochs can result in overfitting (also known as over-training): the network learns only the trained images, and anything outside of the trained images is considered invalid. Train the network to the point where it generalizes from your training image set; increasing the epochs beyond that point risks over-training and overfitting the training image set data.

Tip:
  • If your training image set contains a large number of images, or the size of the images in your training image set are large compared to the Feature Size, you may need to increase the Epoch Count setting.
  • There is a risk of overfitting if you have a very small training image set. If you have a large training image set, and your network is addressing all of the samples in the set, the network is much more general than if you have a smaller training image set and the network is keyed into that small sample set.
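One common way to judge whether an epoch count is in the right range is to track a validation metric per epoch and find where it stops improving: losses that rise again afterward suggest overfitting, while a curve still falling at the end suggests too few epochs. A minimal sketch, assuming you can record per-epoch validation losses (this workflow is illustrative, not a feature of the tool):

```python
def best_epoch(val_losses):
    """Return the 1-based epoch with the lowest validation loss.
    Training past this point tends toward overfitting; a curve with
    no clear minimum suggests the epoch count is too low."""
    best = min(range(len(val_losses)), key=val_losses.__getitem__)
    return best + 1

# Synthetic curve: improves, bottoms out at epoch 5, then overfits.
curve = [0.9, 0.6, 0.4, 0.35, 0.33, 0.36, 0.41, 0.5]
print(best_epoch(curve))  # 5
```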

Capacity

Specifies the capacity of the model to account for different degrees of visual complexity. For very simple patterns, use a lower capacity setting to prevent overfitting. For complex images or objects, use a higher capacity setting to prevent underfitting.

Note: The Capacity parameter is only available for the Red Analyze Tool in Unsupervised Mode.
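The intuition behind capacity can be sketched by counting trainable weights: a higher capacity setting corresponds to a larger model with more parameters, which can fit more complex patterns but also overfit simple ones. The mapping below is purely illustrative and is not the tool's actual architecture:

```python
def parameter_count(capacity, input_pixels=64 * 64, outputs=2, base_width=32):
    """Illustrative only: a higher capacity setting widens the model,
    so the number of trainable weights grows with it."""
    hidden = base_width * capacity  # hypothetical hidden-layer width
    return input_pixels * hidden + hidden * outputs

print(parameter_count(1))  # 131136 weights
print(parameter_count(4))  # 524544 weights
```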

Training Passes

Specifies the number of iterative trainings performed to better cope with statistically difficult problems whose complexity is distributed very unevenly across images, or within different areas of the same image. This parameter helps guide the tool to the complex areas of the view, which is useful when the view contains multiple, distinctive structured areas. It causes the algorithm to ignore areas of low complexity (e.g. white backgrounds with no details) and focus on areas of higher complexity (e.g. lines and other fine details).

Note: The Training Passes parameter is only available for the Red Analyze Tool in Unsupervised Mode.

(Image legend: markers indicate the Complex Areas and Simple Areas in the example view.)

With a higher Training Passes setting, the tool will focus on the complex areas in the image. Training time will increase, but the processing time will remain the same.
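The idea of skipping flat regions and concentrating on detailed ones can be sketched with a simple complexity score, using pixel variance as a stand-in for visual complexity. All names and the allocation rule here are hypothetical illustrations, not the tool's actual algorithm:

```python
def region_complexity(pixels):
    """Variance of pixel values as a simple stand-in for visual complexity."""
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def allocate_passes(regions, total_passes):
    """Give more training passes to regions with higher complexity;
    flat regions (e.g. a plain white background) receive none."""
    scores = {name: region_complexity(px) for name, px in regions.items()}
    total = sum(scores.values()) or 1
    return {name: round(total_passes * s / total) for name, s in scores.items()}

regions = {
    "white_background": [255] * 16,  # no detail -> complexity 0
    "fine_detail": [0, 255] * 8,     # high-contrast lines
}
print(allocate_passes(regions, 4))  # {'white_background': 0, 'fine_detail': 4}
```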

Low Precision

Specifies whether or not the tool will create a low precision model after training. The low precision model is useful for applications where speed optimization is a priority during runtime operation. Results may differ between the standard and low precision models.

When enabled, a white lightning bolt icon is added to the tool's icon.

Note:
  • The Low Precision parameter is only available for the Red Analyze Tool and Green Classify Tool.
  • If the tool was trained before the check box was selected, you will be prompted to retrain the tool. Likewise, if you clear the check box after enabling the low precision model, you will need to retrain the tool.
  • The low precision model requires the Standard or Advanced license, and also requires a GPU with CUDA Compute Capability 6.1 or higher (for more information, visit the NVIDIA website).
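The reason low precision results can differ from standard results is that weights and activations lose precision when stored in fewer bits. The stdlib sketch below round-trips values through IEEE 754 half precision as a stand-in for a low precision model (the actual numeric format the tool uses is not specified here):

```python
import struct

def to_low_precision(weights):
    """Round-trip float values through IEEE 754 half precision
    ('e' format). Illustrates why a low precision model can produce
    slightly different results than the full precision model."""
    packed = struct.pack(f"{len(weights)}e", *weights)
    return list(struct.unpack(f"{len(weights)}e", packed))

w = [0.1234567, -3.14159, 2.5]
lp = to_low_precision(w)
print(lp[2] == 2.5)   # True: 2.5 is exactly representable in half precision
print(lp[0] == w[0])  # False: precision was lost
```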