@Reference(authors="Meil\u0103, M.", title="Comparing clusterings by the variation of information", booktitle="Learning theory and kernel machines", url="http://dx.doi.org/10.1007/978-3-540-45167-9_14") public class Entropy extends Object
Meilă, M.: Comparing clusterings by the variation of information. In: Learning theory and kernel machines.
Modifier and Type | Field and Description |
---|---|
`protected double` | `entropyFirst`: Entropy of the first clustering |
`protected double` | `entropyJoint`: Joint entropy of both clusterings |
`protected double` | `entropySecond`: Entropy of the second clustering |
Modifier | Constructor and Description |
---|---|
`protected` | `Entropy(ClusterContingencyTable table)`: Constructor. |
Modifier and Type | Method and Description |
---|---|
`double` | `entropyConditionalFirst()`: Get the conditional entropy of the first clustering given the second. |
`double` | `entropyConditionalSecond()`: Get the conditional entropy of the second clustering given the first. |
`double` | `entropyFirst()`: Get the entropy of the first clustering, using log base 2. |
`double` | `entropyJoint()`: Get the joint entropy of both clusterings (not normalized). |
`double` | `entropyMutualInformation()`: Get the mutual information (not normalized; 0 = independent clusterings). |
`double` | `entropyNMIJoint()`: Get the joint-normalized mutual information (normalized; 0 = unrelated, 1 = identical). |
`double` | `entropyNMIMax()`: Get the max-normalized mutual information (normalized; 0 = unrelated, 1 = identical). |
`double` | `entropyNMIMin()`: Get the min-normalized mutual information (normalized; 0 = unrelated, 1 = identical). |
`double` | `entropyNMISqrt()`: Get the sqrt-normalized mutual information (normalized; 0 = unrelated, 1 = identical). |
`double` | `entropyNMISum()`: Get the sum-normalized mutual information (normalized; 0 = unrelated, 1 = identical). |
`double` | `entropyPowers()`: Get the Powers entropy (normalized; 0 = identical): Powers = 1 - NMI_Sum. |
`double` | `entropySecond()`: Get the entropy of the second clustering, using log base 2. |
`double` | `normalizedVariationOfInformation()`: Get the normalized variation of information (normalized; 0 = identical): NVI = 1 - NMI_Joint (Nguyen et al., see below). |
`double` | `variationOfInformation()`: Get the variation of information (not normalized; 0 = identical). |
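To make the relationships between these methods concrete, here is a minimal, self-contained sketch that computes the basic quantities directly from a raw contingency matrix. It deliberately does not use the ELKI API (the `Entropy` constructor above is protected and takes a `ClusterContingencyTable`); the class name `EntropySketch` and all variable names are illustrative only.

```java
import java.util.Arrays;

/**
 * Illustrative, stand-alone computation of the entropy measures from a
 * contingency table. This is a sketch, not the ELKI implementation.
 */
public class EntropySketch {
  /** Contribution -p * log2(p) of one probability mass, with 0 log 0 := 0. */
  static double h(double p) {
    return p > 0 ? -p * Math.log(p) / Math.log(2) : 0.0;
  }

  public static void main(String[] args) {
    // counts[i][j]: objects in cluster i of the first clustering and
    // cluster j of the second clustering (hypothetical example data).
    int[][] counts = { { 5, 1 }, { 0, 4 } };
    int n = Arrays.stream(counts).flatMapToInt(Arrays::stream).sum();

    double entropyFirst = 0, entropySecond = 0, entropyJoint = 0;
    // Row marginals give the entropy of the first clustering.
    for (int[] row : counts) {
      entropyFirst += h(Arrays.stream(row).sum() / (double) n);
    }
    // Column marginals give the entropy of the second clustering.
    for (int j = 0; j < counts[0].length; j++) {
      int colsum = 0;
      for (int[] row : counts) {
        colsum += row[j];
      }
      entropySecond += h(colsum / (double) n);
    }
    // All cells together give the joint entropy.
    for (int[] row : counts) {
      for (int c : row) {
        entropyJoint += h(c / (double) n);
      }
    }

    // Derived measures, using the identities listed in the method details.
    double mi = entropyFirst + entropySecond - entropyJoint;
    double vi = entropyJoint - mi;   // variation of information
    double nvi = 1.0 - mi / entropyJoint; // normalized variation of information
    System.out.printf("H(U)=%.4f H(V)=%.4f H(U,V)=%.4f I=%.4f VI=%.4f NVI=%.4f%n",
        entropyFirst, entropySecond, entropyJoint, mi, vi, nvi);
  }
}
```

The derived measures at the end follow the same identities spelled out in the method details below.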
protected double entropyFirst
protected double entropySecond
protected double entropyJoint
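For reference, these three fields correspond to the standard entropy definitions over the contingency table. The notation below is an assumption, as it is not given on this page: U and V denote the first and second clustering, n_ij is the count of objects in cluster i of U and cluster j of V, a_i and b_j are the row and column sums, and N is the total number of objects.

```latex
H(U)   = -\sum_i \frac{a_i}{N}\,\log_2\frac{a_i}{N}, \qquad
H(V)   = -\sum_j \frac{b_j}{N}\,\log_2\frac{b_j}{N}, \qquad
H(U,V) = -\sum_{i,j} \frac{n_{ij}}{N}\,\log_2\frac{n_{ij}}{N}
```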
protected Entropy(ClusterContingencyTable table)
Parameters:
table - Contingency table

public double entropyFirst()
public double entropySecond()
public double entropyJoint()
public double entropyConditionalFirst()
public double entropyConditionalSecond()
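Assuming the standard information-theoretic identities (consistent with the summary descriptions above, though not spelled out on this page), the conditional entropies reduce to differences of the stored quantities:

```latex
H(U \mid V) = H(U, V) - H(V), \qquad H(V \mid U) = H(U, V) - H(U)
```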
public double entropyPowers()
public double entropyMutualInformation()
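The mutual information follows from the marginal and joint entropies; this is the standard identity, assumed to match the implementation:

```latex
I(U, V) = H(U) + H(V) - H(U, V)
```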
public double entropyNMIJoint()
public double entropyNMIMin()
public double entropyNMIMax()
public double entropyNMISum()
public double entropyNMISqrt()
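The five NMI variants differ only in how I(U, V) is normalized. The formulas below are the usual ones from the clustering-comparison literature (e.g. the Nguyen et al. reference cited further down); that the implementation uses exactly these is an assumption. The Powers entropy above is then 1 - NMI_sum.

```latex
\mathrm{NMI}_{\text{joint}} = \frac{I(U,V)}{H(U,V)}, \quad
\mathrm{NMI}_{\min} = \frac{I(U,V)}{\min(H(U),H(V))}, \quad
\mathrm{NMI}_{\max} = \frac{I(U,V)}{\max(H(U),H(V))}, \quad
\mathrm{NMI}_{\text{sum}} = \frac{2\,I(U,V)}{H(U)+H(V)}, \quad
\mathrm{NMI}_{\text{sqrt}} = \frac{I(U,V)}{\sqrt{H(U)\,H(V)}}
```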
public double variationOfInformation()
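The variation of information (from the Meilă reference cited at the top of this page) is the metric

```latex
\mathrm{VI}(U, V) = H(U \mid V) + H(V \mid U) = H(U, V) - I(U, V)
```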
@Reference(authors="Nguyen, X. V. and Epps, J. and Bailey, J.", title="Information theoretic measures for clusterings comparison: is a correction for chance necessary?", booktitle="Proc. ICML \'09 Proceedings of the 26th Annual International Conference on Machine Learning", url="http://dx.doi.org/10.1145/1553374.1553511") public double normalizedVariationOfInformation()
Nguyen, X. V., Epps, J., Bailey, J.: Information theoretic measures for clusterings comparison: is a correction for chance necessary? In: Proc. ICML '09, the 26th Annual International Conference on Machine Learning.
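Combining the identities above with NVI = 1 - NMI_Joint from the method summary gives

```latex
\mathrm{NVI}(U, V) = 1 - \frac{I(U, V)}{H(U, V)}
```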