|Parameter name and description|Parameter values|
|---|---|
|1. Name|Arbitrary string|
|2. Target feature|List of source features with numeric data types|
|3. Computed metric| |
|4. Reference feature|List of reference source features with numeric data types|
Relative entropy in Validio is an adapted implementation of the symmetrised Kullback-Leibler divergence.
Relative entropy is used to detect distribution shifts between a target set and a reference set. It produces a non-negative numerical metric: zero implies identical empirical distributions, and the value grows as the two distributions become increasingly different.
Don't have experience with relative entropy but still want to monitor distribution shifts? No worries! Apply a Smart Alert and monitor changes in relative entropy without having to worry about what the absolute value means.
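To make the metric concrete, here is a minimal sketch of a symmetrised Kullback-Leibler divergence over two numeric samples, estimated from shared histogram bins. This is an illustration only, not Validio's actual implementation; the bin count and the smoothing epsilon are assumptions.

```python
import math
from collections import Counter

def symmetrised_kl(target, reference, bins=10):
    """Estimate symmetrised KL divergence KL(p||q) + KL(q||p)
    between two numeric samples, using shared histogram bins.
    (Sketch only -- not Validio's exact implementation.)"""
    lo = min(min(target), min(reference))
    hi = max(max(target), max(reference))
    width = (hi - lo) / bins or 1.0  # guard against all-equal data

    def hist(xs):
        counts = Counter(min(int((x - lo) / width), bins - 1) for x in xs)
        eps = 1e-9  # smoothing so empty bins never produce log(0)
        total = len(xs) + eps * bins
        return [(counts.get(b, 0) + eps) / total for b in range(bins)]

    p, q = hist(target), hist(reference)
    kl = lambda a, b: sum(ai * math.log(ai / bi) for ai, bi in zip(a, b))
    return kl(p, q) + kl(q, p)
```

Identical samples yield a divergence of zero, and the value increases as the target drifts away from the reference, which is exactly the behaviour a Smart Alert can monitor for sudden changes.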
Calculates the difference in max, min, mean, or standard deviation between the two datasets:
Difference = target metric - reference metric
Calculates the ratio in max, min, mean, or standard deviation between the two datasets:
Ratio = target metric / reference metric
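The two formulas above can be sketched directly; the function names and the default choice of mean are illustrative assumptions, not Validio's API.

```python
from statistics import mean

def metric_difference(target, reference, metric=mean):
    # Difference = target metric - reference metric
    # `metric` can be max, min, mean, or a standard-deviation function.
    return metric(target) - metric(reference)

def metric_ratio(target, reference, metric=mean):
    # Ratio = target metric / reference metric
    return metric(target) / metric(reference)
```

For example, with target [2, 4, 6] and reference [1, 2, 3], the difference in max is 6 - 3 = 3 and the ratio of means is 4 / 2 = 2.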