Package | Description |
---|---|
de.lmu.ifi.dbs.elki.algorithm | Algorithms suitable as a task for the KDDTask main routine. |
de.lmu.ifi.dbs.elki.algorithm.clustering | Clustering algorithms. Clustering algorithms are supposed to implement the Algorithm interface. |
de.lmu.ifi.dbs.elki.algorithm.clustering.em | Expectation-Maximization clustering algorithm. |
de.lmu.ifi.dbs.elki.algorithm.clustering.gdbscan.parallel | Parallel versions of Generalized DBSCAN. |
de.lmu.ifi.dbs.elki.algorithm.clustering.hierarchical.extraction | Extraction of partitional clusterings from hierarchical results. |
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans | K-means clustering and variations. |
de.lmu.ifi.dbs.elki.algorithm.clustering.subspace | Axis-parallel subspace clustering algorithms. The clustering algorithms in this package are instances of both projected and subspace clustering algorithms, according to the classical but somewhat obsolete classification schema of clustering algorithms for axis-parallel subspaces. |
de.lmu.ifi.dbs.elki.algorithm.outlier | Outlier detection algorithms. |
de.lmu.ifi.dbs.elki.algorithm.outlier.lof | LOF family of outlier detection algorithms. |
de.lmu.ifi.dbs.elki.database.datastore | General data store layer API, along the lines of Map<DBID, T>; use it everywhere (see the sketch after this table). |
de.lmu.ifi.dbs.elki.database.datastore.memory | Memory data store implementation for ELKI. |
de.lmu.ifi.dbs.elki.index.preprocessed | Index structures based on preprocessors. |
de.lmu.ifi.dbs.elki.index.preprocessed.knn | Indexes providing KNN and rKNN data. |
de.lmu.ifi.dbs.elki.parallel.processor | Processor API of ELKI, and some essential shared processors. |
tutorial.clustering | Classes from the tutorial on implementing a custom k-means variation. |
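The de.lmu.ifi.dbs.elki.database.datastore entry above likens the data store layer to a Map<DBID, T>. Below is a minimal sketch of that usage, based on the DataStoreUtil.makeStorage signature documented further down; the class name, the hint combination, and the loop are illustrative assumptions, not ELKI code.

```java
import de.lmu.ifi.dbs.elki.database.datastore.DataStoreFactory;
import de.lmu.ifi.dbs.elki.database.datastore.DataStoreUtil;
import de.lmu.ifi.dbs.elki.database.datastore.WritableDataStore;
import de.lmu.ifi.dbs.elki.database.ids.DBIDIter;
import de.lmu.ifi.dbs.elki.database.ids.DBIDs;

public class DataStoreSketch {
  /** Allocate a per-object double[k] storage, analogous to a Map from DBID to double[]. */
  public static WritableDataStore<double[]> allocate(DBIDs ids, int k) {
    // HINT_TEMP | HINT_HOT: short-lived storage with frequent accesses (assumed hint choice).
    WritableDataStore<double[]> probs = DataStoreUtil.makeStorage(ids,
        DataStoreFactory.HINT_TEMP | DataStoreFactory.HINT_HOT, double[].class);
    for(DBIDIter iter = ids.iter(); iter.valid(); iter.advance()) {
      probs.put(iter, new double[k]); // associate one array with each DBID
    }
    return probs;
  }
}
```

Stores of this kind back most of the fields and method parameters listed in the tables below, e.g. the probClusterIGivenX arrays used by EM and P3C.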
Modifier and Type | Method and Description |
---|---|
WritableDataStore<KNNList> | KNNJoin.run(Relation<V> relation, DBIDs ids): Inner run method. |
WritableDataStore<KNNList> | KNNJoin.run(SpatialIndexTree<N,E> index, DBIDs ids): Inner run method. |
Modifier and Type | Field and Description |
---|---|
private WritableDataStore<Assignment> | GriDBSCAN.Instance.clusterids: Cluster assignments. |
Modifier and Type | Method and Description |
---|---|
protected void | GriDBSCAN.Instance.mergeClusterInformation(ModifiableDBIDs cellids, WritableIntegerDataStore temporary, WritableDataStore<Assignment> clusterids): Merge cluster information. |
Modifier and Type | Method and Description |
---|---|
static double | EM.assignProbabilitiesToInstances(Relation<? extends NumberVector> relation, java.util.List<? extends EMClusterModel<?>> models, WritableDataStore<double[]> probClusterIGivenX): Assigns the current probability values to the instances in the database and computes the expectation value of the current mixture of distributions. |
static void | EM.recomputeCovarianceMatrices(Relation<? extends NumberVector> relation, WritableDataStore<double[]> probClusterIGivenX, java.util.List<? extends EMClusterModel<?>> models, double prior): Recompute the covariance matrices. |
Modifier and Type | Field and Description |
---|---|
private WritableDataStore<Assignment> | ParallelGeneralizedDBSCAN.Instance.clusterids: Cluster assignment storage. |
Modifier and Type | Method and Description |
---|---|
private void | ClustersWithNoiseExtraction.Instance.mergeClusters(WritableDataStore<ArrayModifiableDBIDs> clusters, DBIDRef it, DBIDRef succ): Merge two clusters. |
Modifier and Type | Field and Description |
---|---|
(package private) WritableDataStore<double[]> | KMeansSimplifiedElkan.Instance.lower: Lower bounds. |
Modifier and Type | Field and Description |
---|---|
private WritableDataStore<long[]> | DiSH.Instance.commonPreferenceVectors: Shared preference vectors. |
private WritableDataStore<long[]> | DiSH.DiSHClusterOrder.commonPreferenceVectors: Preference vectors. |
private WritableDataStore<long[]> | HiSC.Instance.commonPreferenceVectors: Shared preference vectors. |
private WritableDataStore<long[]> | DiSH.Instance.tmpPreferenceVectors: Temporary storage for new preference vectors. |
Modifier and Type | Method and Description |
---|---|
private void | P3C.assignUnassigned(Relation<V> relation, WritableDataStore<double[]> probClusterIGivenX, java.util.List<MultivariateGaussianModel> models, ModifiableDBIDs unassigned): Assign unassigned objects to the best candidate based on the shortest Mahalanobis distance. |
private void | P3C.computeFuzzyMembership(Relation<V> relation, java.util.ArrayList<P3C.Signature> clusterCores, ModifiableDBIDs unassigned, WritableDataStore<double[]> probClusterIGivenX, java.util.List<MultivariateGaussianModel> models, int dim): Computes a fuzzy membership with weights based on which cluster cores each data point is part of. |
private java.util.ArrayList<P3C.ClusterCandidate> | P3C.hardClustering(WritableDataStore<double[]> probClusterIGivenX, java.util.List<P3C.Signature> clusterCores, DBIDs dbids): Creates a hard clustering from the specified soft membership matrix. |
Constructor and Description |
---|
DiSHClusterOrder(java.lang.String name, java.lang.String shortname, ArrayModifiableDBIDs ids, WritableDoubleDataStore reachability, WritableDBIDDataStore predecessor, WritableIntegerDataStore corrdim, WritableDataStore<long[]> commonPreferenceVectors): Constructor. |
Modifier and Type | Method and Description |
---|---|
private void | DWOF.clusterData(DBIDs ids, RangeQuery<O> rnnQuery, WritableDoubleDataStore radii, WritableDataStore<ModifiableDBIDs> labels): Applies a density-based clustering algorithm. |
private int | DWOF.updateSizes(DBIDs ids, WritableDataStore<ModifiableDBIDs> labels, WritableIntegerDataStore newSizes): Updates each object's cluster size after the clustering step. |
Modifier and Type | Method and Description |
---|---|
protected void | INFLO.computeINFLO(Relation<O> relation, ModifiableDBIDs pruned, KNNQuery<O> knnq, WritableDataStore<ModifiableDBIDs> rNNminuskNNs, WritableDoubleDataStore inflos, DoubleMinMax inflominmax): Compute the final INFLO scores. |
private void | INFLO.computeNeighborhoods(Relation<O> relation, DataStore<SetDBIDs> knns, ModifiableDBIDs pruned, WritableDataStore<ModifiableDBIDs> rNNminuskNNs): Compute the reverse kNN minus the kNN. |
protected void | KDEOS.computeOutlierScores(KNNQuery<O> knnq, DBIDs ids, WritableDataStore<double[]> densities, WritableDoubleDataStore kdeos, DoubleMinMax minmax): Compute the final KDEOS scores. |
protected void | KDEOS.estimateDensities(Relation<O> rel, KNNQuery<O> knnq, DBIDs ids, WritableDataStore<double[]> densities): Perform the kernel density estimation step. |
protected void | LOCI.precomputeInterestingRadii(DBIDs ids, RangeQuery<O> rangeQuery, WritableDataStore<LOCI.DoubleIntArrayList> interestingDistances): Preprocessing step: determine the radii of interest for each point. |
Modifier and Type | Interface and Description |
---|---|
interface | WritableDBIDDataStore: Data store specialized for DBIDs. |
interface | WritableDoubleDataStore: Data store specialized for doubles (see the sketch after this table). |
interface | WritableIntegerDataStore: Data store specialized for integers. |
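The specialized interfaces above store per-object primitives (or DBIDs) without boxing; WritableDoubleDataStore, for instance, adds putDouble/doubleValue accessors. A minimal sketch, assuming DataStoreUtil.makeDoubleStorage as the allocator; the class name and hint choice are illustrative, not ELKI code.

```java
import de.lmu.ifi.dbs.elki.database.datastore.DataStoreFactory;
import de.lmu.ifi.dbs.elki.database.datastore.DataStoreUtil;
import de.lmu.ifi.dbs.elki.database.datastore.WritableDoubleDataStore;
import de.lmu.ifi.dbs.elki.database.ids.DBIDIter;
import de.lmu.ifi.dbs.elki.database.ids.DBIDs;

public class ScoreStoreSketch {
  /** Allocate and zero-initialize a per-object double store, e.g. for outlier scores. */
  public static WritableDoubleDataStore initScores(DBIDs ids) {
    WritableDoubleDataStore scores = DataStoreUtil.makeDoubleStorage(ids,
        DataStoreFactory.HINT_HOT | DataStoreFactory.HINT_DB);
    for(DBIDIter iter = ids.iter(); iter.valid(); iter.advance()) {
      scores.putDouble(iter, 0.0); // primitive write, no Double boxing
    }
    return scores;
  }
}
```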
Modifier and Type | Method and Description |
---|---|
<T> WritableDataStore<T> | WritableRecordStore.getStorage(int col, java.lang.Class<? super T> datatype): Get a WritableDataStore instance for a particular record column (see the sketch after this table). |
static <T> WritableDataStore<T> | DataStoreUtil.makeStorage(DBIDs ids, int hints, java.lang.Class<? super T> dataclass): Make a new storage to associate the given ids with an object of class dataclass. |
<T> WritableDataStore<T> | DataStoreFactory.makeStorage(DBIDs ids, int hints, java.lang.Class<? super T> dataclass): Make a new storage to associate the given ids with an object of class dataclass. |
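The getStorage method above returns one typed column view of a multi-column record store. A small sketch follows, assuming DataStoreUtil.makeRecordStorage(ids, hints, classes...) as the record-store allocator; that call, the class name, and the column layout are assumptions for illustration only.

```java
import de.lmu.ifi.dbs.elki.database.datastore.DataStoreFactory;
import de.lmu.ifi.dbs.elki.database.datastore.DataStoreUtil;
import de.lmu.ifi.dbs.elki.database.datastore.WritableDataStore;
import de.lmu.ifi.dbs.elki.database.datastore.WritableRecordStore;
import de.lmu.ifi.dbs.elki.database.ids.DBIDs;

public class RecordStoreSketch {
  public static void columns(DBIDs ids) {
    // Two columns per object: a String label and a double[] vector.
    WritableRecordStore store = DataStoreUtil.makeRecordStorage(ids,
        DataStoreFactory.HINT_TEMP, String.class, double[].class);
    // Each column is exposed as its own typed WritableDataStore view.
    WritableDataStore<String> labels = store.getStorage(0, String.class);
    WritableDataStore<double[]> vectors = store.getStorage(1, double[].class);
  }
}
```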
Modifier and Type | Class and Description |
---|---|
class | ArrayDBIDStore: A class to answer representation queries using the stored Array. |
class | ArrayDoubleStore: A class to answer representation queries using the stored Array. |
class | ArrayIntegerStore: A class to answer representation queries using the stored Array. |
protected class | ArrayRecordStore.StorageAccessor<T>: Access a single record in the given data. |
class | ArrayStore<T>: A class to answer representation queries using the stored Array. |
class | MapIntegerDBIDDBIDStore: Writable data store for DBID values. |
class | MapIntegerDBIDDoubleStore: Writable data store for double values. |
class | MapIntegerDBIDIntegerStore: Writable data store for integer values. |
protected class | MapIntegerDBIDRecordStore.StorageAccessor<T>: Access a single record in the given data. |
class | MapIntegerDBIDStore<T>: A class to answer representation queries using a map. |
protected class | MapRecordStore.StorageAccessor<T>: Access a single record in the given data. |
(package private) class | MapStore<T>: A class to answer representation queries using a map. |
Modifier and Type | Method and Description |
---|---|
<T> WritableDataStore<T> | MapRecordStore.getStorage(int col, java.lang.Class<? super T> datatype) |
<T> WritableDataStore<T> | ArrayRecordStore.getStorage(int col, java.lang.Class<? super T> datatype) |
<T> WritableDataStore<T> | MapIntegerDBIDRecordStore.getStorage(int col, java.lang.Class<? super T> datatype) |
<T> WritableDataStore<T> | MemoryDataStoreFactory.makeStorage(DBIDs ids, int hints, java.lang.Class<? super T> dataclass) |
Modifier and Type | Field and Description |
---|---|
protected WritableDataStore<R> | AbstractPreprocessorIndex.storage: The data store. |
Modifier and Type | Field and Description |
---|---|
private WritableDataStore<java.util.TreeSet<DoubleDBIDPair>> | MaterializeKNNAndRKNNPreprocessor.materialized_RkNN: Additional data storage for RkNN. |
(package private) WritableDataStore<int[]> | NaiveProjectedKNNPreprocessor.positions: Curve position storage. |
(package private) WritableDataStore<int[]> | SpacefillingKNNPreprocessor.positions: Curve position storage. |
private WritableDataStore<KNNHeap> | NNDescent.store: Store for neighbors. |
Modifier and Type | Method and Description |
---|---|
private void | NNDescent.addpair(WritableDataStore<HashSetModifiableDBIDs> newNeighbors, DBIDRef o1, DBIDRef o2) |
private void | NNDescent.clearAll(DBIDs ids, WritableDataStore<HashSetModifiableDBIDs> sets): Clear (but reuse) all sets in the given storage. |
private int | NNDescent.processNewNeighbors(WritableDataStore<HashSetModifiableDBIDs> flag, HashSetModifiableDBIDs newFwd, HashSetModifiableDBIDs oldFwd, HashSetModifiableDBIDs newRev, HashSetModifiableDBIDs oldRev): Process new neighbors. |
private void | NNDescent.reverse(WritableDataStore<HashSetModifiableDBIDs> sampleNewHash, WritableDataStore<HashSetModifiableDBIDs> newReverseNeighbors, WritableDataStore<HashSetModifiableDBIDs> oldReverseNeighbors): Calculates new and old reverse neighbors for the database. |
private int | NNDescent.sampleNew(DBIDs ids, WritableDataStore<HashSetModifiableDBIDs> sampleNewNeighbors, WritableDataStore<HashSetModifiableDBIDs> newNeighborHash, int items): Samples new neighbors for every object. |
Modifier and Type | Field and Description |
---|---|
(package private) WritableDataStore<T> | WriteDataStoreProcessor.store: Store to write to. |
Constructor and Description |
---|
WriteDataStoreProcessor(WritableDataStore<T> store): Constructor. |
Modifier and Type | Method and Description |
---|---|
protected WritableDataStore<SameSizeKMeansAlgorithm.Meta> | SameSizeKMeansAlgorithm.initializeMeta(Relation<V> relation, double[][] means): Initialize the metadata storage. |
Modifier and Type | Method and Description |
---|---|
protected ArrayModifiableDBIDs | SameSizeKMeansAlgorithm.initialAssignment(java.util.List<ModifiableDBIDs> clusters, WritableDataStore<SameSizeKMeansAlgorithm.Meta> metas, DBIDs ids) |
protected double[][] | SameSizeKMeansAlgorithm.refineResult(Relation<V> relation, double[][] means, java.util.List<ModifiableDBIDs> clusters, WritableDataStore<SameSizeKMeansAlgorithm.Meta> metas, ArrayModifiableDBIDs tids): Perform k-means style iterations to improve the clustering result. |
protected void | SameSizeKMeansAlgorithm.transfer(WritableDataStore<SameSizeKMeansAlgorithm.Meta> metas, SameSizeKMeansAlgorithm.Meta meta, ModifiableDBIDs src, ModifiableDBIDs dst, DBIDRef id, int dstnum): Transfer a single element from one cluster to another. |
protected void | SameSizeKMeansAlgorithm.updateDistances(Relation<V> relation, double[][] means, WritableDataStore<SameSizeKMeansAlgorithm.Meta> metas, NumberVectorDistanceFunction<? super V> df): Compute the distances of each object to all means. |
Copyright © 2019 ELKI Development Team.