@Reference(authors="M. Meil\u0103", title="Comparing clusterings by the variation of information", booktitle="Learning theory and kernel machines", url="https://doi.org/10.1007/978-3-540-45167-9_14", bibkey="DBLP:conf/colt/Meila03") public class Entropy extends java.lang.Object
References:
M. Meilă
Comparing clusterings by the variation of information
Learning theory and kernel machines
Modifier and Type | Field and Description
---|---
protected double | entropyFirst: Entropy of the first clustering.
protected double | entropyJoint: Joint entropy of both clusterings.
protected double | entropySecond: Entropy of the second clustering.
Modifier | Constructor and Description
---|---
protected | Entropy(ClusterContingencyTable table): Constructor.
Modifier and Type | Method and Description
---|---
double | entropyConditionalFirst(): Get the conditional entropy of the first clustering.
double | entropyConditionalSecond(): Get the conditional entropy of the second clustering.
double | entropyFirst(): Get the entropy of the first clustering, using log base 2.
double | entropyJoint(): Get the joint entropy of both clusterings (not normalized).
double | entropyMutualInformation(): Get the mutual information (not normalized, 0 = independent).
double | entropyNMIJoint(): Get the joint-normalized mutual information (normalized, 0 = unequal).
double | entropyNMIMax(): Get the max-normalized mutual information (normalized, 0 = unequal).
double | entropyNMIMin(): Get the min-normalized mutual information (normalized, 0 = unequal).
double | entropyNMISqrt(): Get the sqrt-normalized mutual information (normalized, 0 = unequal).
double | entropyNMISum(): Get the sum-normalized mutual information (normalized, 0 = unequal).
double | entropyPowers(): Get the Powers entropy (normalized, 0 = equal); Powers = 1 - NMI_Sum.
double | entropySecond(): Get the entropy of the second clustering, using log base 2.
double | normalizedVariationOfInformation(): Get the normalized variation of information (normalized, 0 = equal); NVI = 1 - NMI_Joint.
double | variationOfInformation(): Get the variation of information (not normalized, 0 = equal).
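All of these measures are simple functions of the entropies H(X), H(Y), and H(X,Y) of the two clusterings: MI = H(X) + H(Y) - H(X,Y), VI = H(X,Y) - MI, NVI = 1 - NMI_Joint = VI / H(X,Y), and Powers = 1 - NMI_Sum. The following self-contained sketch illustrates these definitions on a small contingency matrix, using base-2 logarithms as in entropyFirst(); it is an illustration of the math only, not the ELKI implementation, and the class name and sample data are invented for the example.

```java
import java.util.Arrays;

// Sketch of the entropy-based comparison measures above, computed from a
// contingency matrix counts[i][j] = number of objects in cluster i of the
// first clustering and cluster j of the second clustering.
public class EntropySketch {
  static double log2(double x) {
    return Math.log(x) / Math.log(2);
  }

  public static void main(String[] args) {
    long[][] counts = { { 5, 1 }, { 1, 5 } }; // example data, n = 12
    long n = Arrays.stream(counts).flatMapToLong(Arrays::stream).sum();

    double hFirst = 0, hSecond = 0, hJoint = 0;
    for (long[] row : counts) { // row marginals -> entropy of first clustering
      double p = Arrays.stream(row).sum() / (double) n;
      hFirst -= p > 0 ? p * log2(p) : 0;
    }
    for (int j = 0; j < counts[0].length; j++) { // column marginals -> second
      long cj = 0;
      for (long[] row : counts) {
        cj += row[j];
      }
      double p = cj / (double) n;
      hSecond -= p > 0 ? p * log2(p) : 0;
    }
    for (long[] row : counts) { // all non-empty cells -> joint entropy
      for (long c : row) {
        double p = c / (double) n;
        hJoint -= p > 0 ? p * log2(p) : 0;
      }
    }

    double mi = hFirst + hSecond - hJoint;       // mutual information
    double vi = hJoint - mi;                     // variation of information
    double nmiJoint = mi / hJoint;               // NMI, joint-normalized
    double nmiSum = 2 * mi / (hFirst + hSecond); // NMI, sum-normalized
    System.out.printf("MI=%.4f VI=%.4f NVI=%.4f Powers=%.4f%n",
        mi, vi, 1 - nmiJoint, 1 - nmiSum);
  }
}
```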
protected double entropyFirst
Entropy of the first clustering.
protected double entropySecond
Entropy of the second clustering.
protected double entropyJoint
Joint entropy of both clusterings.
protected Entropy(ClusterContingencyTable table)
Parameters:
table - Contingency table
public double entropyFirst()
public double entropySecond()
public double entropyJoint()
public double entropyConditionalFirst()
public double entropyConditionalSecond()
public double entropyPowers()
public double entropyMutualInformation()
public double entropyNMIJoint()
public double entropyNMIMin()
public double entropyNMIMax()
public double entropyNMISum()
public double entropyNMISqrt()
public double variationOfInformation()
@Reference(authors="X. V. Nguyen, J. Epps, J. Bailey", title="Information theoretic measures for clusterings comparison: is a correction for chance necessary?", booktitle="Proc. 26th Ann. Int. Conf. on Machine Learning (ICML \'09)", url="https://doi.org/10.1145/1553374.1553511", bibkey="DBLP:conf/icml/NguyenEB09") public double normalizedVariationOfInformation()
X. V. Nguyen, J. Epps, J. Bailey
Information theoretic measures for clusterings comparison: is a correction
for chance necessary?
Proc. 26th Ann. Int. Conf. on Machine Learning (ICML '09)
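Because the constructor is protected, an Entropy instance is normally obtained through a ClusterContingencyTable rather than constructed directly. Below is a minimal sketch, assuming the process(...) and getEntropy() accessors of ClusterContingencyTable from the same ELKI package and its two boolean constructor flags (self-pairing and noise-cluster handling); verify the names and flag order against the ELKI version in use.

```java
import de.lmu.ifi.dbs.elki.data.Clustering;
import de.lmu.ifi.dbs.elki.evaluation.clustering.ClusterContingencyTable;
import de.lmu.ifi.dbs.elki.evaluation.clustering.Entropy;

// Hedged sketch: compare two clusterings via a contingency table.
// The accessor names and constructor flags are assumptions; check the
// ClusterContingencyTable documentation of your ELKI version.
public class CompareClusterings {
  public static double variationOfInformation(Clustering<?> reference, Clustering<?> result) {
    ClusterContingencyTable table = new ClusterContingencyTable(true, false);
    table.process(reference, result);     // fill the table from both clusterings
    Entropy entropy = table.getEntropy(); // assumed lazy accessor
    return entropy.variationOfInformation();
  }
}
```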
Copyright © 2019 ELKI Development Team.