@Reference(authors="J. Lin",
    title="Divergence measures based on the Shannon entropy",
    booktitle="IEEE Transactions on Information Theory 37(1)",
    url="https://doi.org/10.1109/18.61115",
    bibkey="DBLP:journals/tit/Lin91")
@Reference(authors="D. M. Endres, J. E. Schindelin",
    title="A new metric for probability distributions",
    booktitle="IEEE Transactions on Information Theory 49(7)",
    url="https://doi.org/10.1109/TIT.2003.813506",
    bibkey="DBLP:journals/tit/EndresS03")
@Reference(authors="M.-M. Deza, E. Deza",
    title="Dictionary of distances",
    booktitle="Dictionary of distances",
    url="https://doi.org/10.1007/978-3-642-00234-2",
    bibkey="doi:10.1007/978-3-642-00234-2")
public class JensenShannonDivergenceDistanceFunction
extends JeffreyDivergenceDistanceFunction
Jensen-Shannon divergence for NumberVectors is a symmetric, smoothed version of the KullbackLeiblerDivergenceAsymmetricDistanceFunction. It is essentially the same as JeffreyDivergenceDistanceFunction, only scaled by half. For completeness, we include both.
\[JS(\vec{x},\vec{y}):=\tfrac12\sum\nolimits_i \left(x_i\log\tfrac{2x_i}{x_i+y_i}+y_i\log\tfrac{2y_i}{x_i+y_i}\right) = \tfrac12 KL(\vec{x},\tfrac12(\vec{x}+\vec{y})) + \tfrac12 KL(\vec{y},\tfrac12(\vec{x}+\vec{y}))\]
There exists a generalized definition in which the two vectors are weighted with \(\beta\) and \(1-\beta\); for the common choice \(\beta=\tfrac12\), it reduces to this version.
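As a minimal illustration of the formula above, the following standalone sketch (not the ELKI implementation) computes the divergence for two non-negative arrays of equal length; the class and method names are hypothetical:

```java
/**
 * Illustrative computation of the Jensen-Shannon divergence as defined above.
 * Sketch only, not the ELKI implementation: assumes non-negative vectors of
 * equal length and uses the convention 0 * log 0 = 0.
 */
public final class JensenShannonSketch {
  public static double jensenShannon(double[] x, double[] y) {
    double agg = 0.;
    for(int i = 0; i < x.length; i++) {
      final double xi = x[i], yi = y[i], mi = xi + yi;
      if(xi > 0) {
        agg += xi * Math.log(2. * xi / mi); // x_i log(2 x_i / (x_i + y_i))
      }
      if(yi > 0) {
        agg += yi * Math.log(2. * yi / mi); // y_i log(2 y_i / (x_i + y_i))
      }
    }
    return .5 * agg; // factor 1/2 in front of the sum
  }

  public static void main(String[] args) {
    double[] x = { 0.2, 0.5, 0.3 };
    double[] y = { 0.1, 0.4, 0.5 };
    System.out.println(jensenShannon(x, y)); // small positive value; 0 iff x == y
  }
}
```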
References:
J. Lin: Divergence measures based on the Shannon entropy. IEEE Transactions on Information Theory 37(1). https://doi.org/10.1109/18.61115
D. M. Endres, J. E. Schindelin: A new metric for probability distributions. IEEE Transactions on Information Theory 49(7). https://doi.org/10.1109/TIT.2003.813506
M.-M. Deza, E. Deza: Dictionary of distances. https://doi.org/10.1007/978-3-642-00234-2
Modifier and Type | Class and Description |
---|---|
static class | JensenShannonDivergenceDistanceFunction.Parameterizer: Parameterization class, using the static instance. |
Modifier and Type | Field and Description |
---|---|
static JensenShannonDivergenceDistanceFunction | STATIC: Static instance. |
Constructor and Description |
---|
JensenShannonDivergenceDistanceFunction(): Deprecated. Use static instance! |
Modifier and Type | Method and Description |
---|---|
double | distance(NumberVector v1, NumberVector v2): Computes the distance between two given DatabaseObjects according to this distance function. |
double | minDist(SpatialComparable mbr1, SpatialComparable mbr2): Computes the distance between the two given MBRs according to this distance function. |
java.lang.String | toString() |
public static final JensenShannonDivergenceDistanceFunction STATIC
@Deprecated public JensenShannonDivergenceDistanceFunction()
public double distance(NumberVector v1, NumberVector v2)

Description copied from interface: PrimitiveDistanceFunction
Computes the distance between two given DatabaseObjects according to this distance function.
Specified by:
distance in interface NumberVectorDistanceFunction<NumberVector>
Specified by:
distance in interface PrimitiveDistanceFunction<NumberVector>
Overrides:
distance in class JeffreyDivergenceDistanceFunction
Parameters:
v1 - first DatabaseObject
v2 - second DatabaseObject

public double minDist(SpatialComparable mbr1, SpatialComparable mbr2)

Description copied from interface: SpatialPrimitiveDistanceFunction
Computes the distance between the two given MBRs according to this distance function.
Specified by:
minDist in interface SpatialPrimitiveDistanceFunction<NumberVector>
Overrides:
minDist in class JeffreyDivergenceDistanceFunction
Parameters:
mbr1 - the first MBR object
mbr2 - the second MBR object

public java.lang.String toString()

Overrides:
toString in class JeffreyDivergenceDistanceFunction
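For orientation, a hedged usage sketch: only STATIC, distance(...) and minDist(...) are documented on this page; the package paths and the DoubleVector and HyperBoundingBox classes (and their constructors) are assumptions about the ELKI 0.7.x API, not confirmed by this page.

```java
// Usage sketch; package paths, DoubleVector and HyperBoundingBox (and their
// constructors) are assumed from ELKI 0.7.x and not confirmed by this page.
import de.lmu.ifi.dbs.elki.data.DoubleVector;
import de.lmu.ifi.dbs.elki.data.HyperBoundingBox;
import de.lmu.ifi.dbs.elki.distance.distancefunction.probabilistic.JensenShannonDivergenceDistanceFunction;

public class JSDivergenceUsage {
  public static void main(String[] args) {
    // Prefer the static instance over the deprecated constructor.
    JensenShannonDivergenceDistanceFunction df = JensenShannonDivergenceDistanceFunction.STATIC;

    // Distance between two vectors (DoubleVector is assumed to implement NumberVector).
    DoubleVector v1 = new DoubleVector(new double[] { 0.2, 0.5, 0.3 });
    DoubleVector v2 = new DoubleVector(new double[] { 0.1, 0.4, 0.5 });
    double d = df.distance(v1, v2);

    // Lower bound between two minimum bounding rectangles, e.g. for index-accelerated search.
    HyperBoundingBox mbr1 = new HyperBoundingBox(new double[] { 0., 0., 0. }, new double[] { .5, .5, .5 });
    HyperBoundingBox mbr2 = new HyperBoundingBox(new double[] { .6, .6, .6 }, new double[] { 1., 1., 1. });
    double lb = df.minDist(mbr1, mbr2);

    System.out.println("distance = " + d + ", minDist = " + lb);
  }
}
```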