public class DoubleOpenHashBigSet extends AbstractDoubleSet implements Serializable, Cloneable, Hash, Size64
Instances of this class use a hash table to represent a big set: the number of elements in the set is limited only by the amount of core memory. The table is backed by a big array and is enlarged as needed by doubling its size when new entries are created, but it is never made smaller (not even on a clear()). A family of trimming methods lets you control the size of the table; this is particularly useful if you reuse instances of this class.

The methods of this class are about 30% slower than those of the corresponding non-big set.
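The growth policy described above (the table doubles when the load factor is exceeded, and clear() never shrinks it) can be sketched with a simplified stdlib-only model. This is an illustration of the documented behavior only, not fastutil's actual implementation:

```java
// Simplified model of the documented growth policy: the table doubles when
// the number of elements exceeds capacity * loadFactor, and clearing resets
// the element count but never the capacity. Illustration only, not fastutil code.
class GrowthModel {
    long capacity;
    long size;
    final float loadFactor;

    GrowthModel(long initialCapacity, float loadFactor) {
        this.capacity = initialCapacity;
        this.loadFactor = loadFactor;
    }

    void add() {
        // enlarge by doubling once the load factor is exceeded
        if (++size > capacity * loadFactor) capacity <<= 1;
    }

    void clear() {
        size = 0;  // the table itself is never made smaller
    }

    public static void main(String[] args) {
        GrowthModel m = new GrowthModel(16, 0.75f);
        for (int i = 0; i < 100; i++) m.add();
        System.out.println(m.capacity);  // 256: grew by doubling
        m.clear();
        System.out.println(m.capacity);  // 256: unchanged by clear()
    }
}
```

This is why the class documentation recommends the trimming methods when reusing instances: in this model, as in the real class, only an explicit trim ever reduces the capacity.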
See Also: Hash, HashCommon, Serialized Form

Nested classes inherited from interface Hash: Hash.Strategy<K>

Fields inherited from interface Hash: DEFAULT_GROWTH_FACTOR, DEFAULT_INITIAL_SIZE, DEFAULT_LOAD_FACTOR, FAST_LOAD_FACTOR, FREE, OCCUPIED, PRIMES, REMOVED, VERY_FAST_LOAD_FACTOR
| Constructor | Description |
|---|---|
| `DoubleOpenHashBigSet()` | Creates a new hash big set with initial expected `Hash.DEFAULT_INITIAL_SIZE` elements and `Hash.DEFAULT_LOAD_FACTOR` as load factor. |
| `DoubleOpenHashBigSet(Collection<? extends Double> c)` | Creates a new hash big set with `Hash.DEFAULT_LOAD_FACTOR` as load factor, copying a given collection. |
| `DoubleOpenHashBigSet(Collection<? extends Double> c, float f)` | Creates a new hash big set copying a given collection. |
| `DoubleOpenHashBigSet(double[] a)` | Creates a new hash big set with `Hash.DEFAULT_LOAD_FACTOR` as load factor, copying the elements of an array. |
| `DoubleOpenHashBigSet(double[] a, float f)` | Creates a new hash big set copying the elements of an array. |
| `DoubleOpenHashBigSet(double[] a, int offset, int length)` | Creates a new hash big set with `Hash.DEFAULT_LOAD_FACTOR` as load factor and fills it with the elements of a given array. |
| `DoubleOpenHashBigSet(double[] a, int offset, int length, float f)` | Creates a new hash big set and fills it with the elements of a given array. |
| `DoubleOpenHashBigSet(DoubleCollection c)` | Creates a new hash big set with `Hash.DEFAULT_LOAD_FACTOR` as load factor, copying a given type-specific collection. |
| `DoubleOpenHashBigSet(DoubleCollection c, float f)` | Creates a new hash big set copying a given type-specific collection. |
| `DoubleOpenHashBigSet(DoubleIterator i)` | Creates a new hash big set with `Hash.DEFAULT_LOAD_FACTOR` as load factor using elements provided by a type-specific iterator. |
| `DoubleOpenHashBigSet(DoubleIterator i, float f)` | Creates a new hash big set using elements provided by a type-specific iterator. |
| `DoubleOpenHashBigSet(Iterator<?> i)` | Creates a new hash big set with `Hash.DEFAULT_LOAD_FACTOR` as load factor using elements provided by an iterator. |
| `DoubleOpenHashBigSet(Iterator<?> i, float f)` | Creates a new hash big set using elements provided by an iterator. |
| `DoubleOpenHashBigSet(long expected)` | Creates a new hash big set with `Hash.DEFAULT_LOAD_FACTOR` as load factor. |
| `DoubleOpenHashBigSet(long expected, float f)` | Creates a new hash big set. |
| Modifier and Type | Method and Description |
|---|---|
| `boolean` | `add(double k)` |
| `void` | `clear()` |
| `DoubleOpenHashBigSet` | `clone()` Returns a deep copy of this big set. |
| `boolean` | `contains(double k)` |
| `int` | `hashCode()` Returns a hash code for this set. |
| `boolean` | `isEmpty()` |
| `DoubleIterator` | `iterator()` Returns a type-specific iterator on the elements of this collection. |
| `boolean` | `rehash()` Deprecated. A no-op. |
| `boolean` | `remove(double k)` Removes an element from this set. |
| `int` | `size()` Deprecated. |
| `long` | `size64()` Returns the size of this data structure as a long. |
| `boolean` | `trim()` Rehashes this set, making the table as small as possible. |
| `boolean` | `trim(long n)` Rehashes this set if the table is too large. |
Methods inherited from class AbstractDoubleSet: equals, rem, remove

Methods inherited from class AbstractDoubleCollection: add, addAll, addAll, contains, containsAll, containsAll, doubleIterator, rem, removeAll, removeAll, retainAll, retainAll, toArray, toArray, toArray, toDoubleArray, toDoubleArray, toString

Methods inherited from interface DoubleCollection: addAll, containsAll, doubleIterator, removeAll, retainAll, toArray, toArray, toDoubleArray, toDoubleArray
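The constructor detail below documents that the actual table size is the least power of two greater than expected/f. Reading the documented rule literally, the computation can be sketched as follows; this is a hedged illustration only, not fastutil's internal sizing code, which also clamps to minimum and maximum table sizes:

```java
// Least power of two strictly greater than expected / f, as the constructor
// documentation states. Sketch only: fastutil's own sizing code additionally
// clamps the result to legal minimum and maximum table sizes.
class TableSize {
    static long tableSize(long expected, float f) {
        long needed = (long) Math.ceil(expected / (double) f);
        // 1 << (64 - nlz(x)) is the least power of two strictly greater than x
        return 1L << (64 - Long.numberOfLeadingZeros(needed));
    }

    public static void main(String[] args) {
        System.out.println(tableSize(1000, 0.75f));  // 2048
        System.out.println(tableSize(16, 0.5f));     // 64
    }
}
```

For example, with expected = 1000 and the default load factor 0.75, expected/f is about 1333.3, so the table starts at 2048 slots.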
public DoubleOpenHashBigSet(long expected, float f)
Creates a new hash big set. The actual table size will be the least power of two greater than expected/f.
Parameters:
    expected - the expected number of elements in the set.
    f - the load factor.

public DoubleOpenHashBigSet(long expected)
Creates a new hash big set with Hash.DEFAULT_LOAD_FACTOR as load factor.
Parameters:
    expected - the expected number of elements in the hash big set.

public DoubleOpenHashBigSet()
Creates a new hash big set with initial expected Hash.DEFAULT_INITIAL_SIZE elements and Hash.DEFAULT_LOAD_FACTOR as load factor.

public DoubleOpenHashBigSet(Collection<? extends Double> c, float f)
Creates a new hash big set copying a given collection.
Parameters:
    c - a Collection to be copied into the new hash big set.
    f - the load factor.

public DoubleOpenHashBigSet(Collection<? extends Double> c)
Creates a new hash big set with Hash.DEFAULT_LOAD_FACTOR as load factor, copying a given collection.
Parameters:
    c - a Collection to be copied into the new hash big set.

public DoubleOpenHashBigSet(DoubleCollection c, float f)
Creates a new hash big set copying a given type-specific collection.
Parameters:
    c - a type-specific collection to be copied into the new hash big set.
    f - the load factor.

public DoubleOpenHashBigSet(DoubleCollection c)
Creates a new hash big set with Hash.DEFAULT_LOAD_FACTOR as load factor, copying a given type-specific collection.
Parameters:
    c - a type-specific collection to be copied into the new hash big set.

public DoubleOpenHashBigSet(DoubleIterator i, float f)
Creates a new hash big set using elements provided by a type-specific iterator.
Parameters:
    i - a type-specific iterator whose elements will fill the new hash big set.
    f - the load factor.

public DoubleOpenHashBigSet(DoubleIterator i)
Creates a new hash big set with Hash.DEFAULT_LOAD_FACTOR as load factor using elements provided by a type-specific iterator.
Parameters:
    i - a type-specific iterator whose elements will fill the new hash big set.

public DoubleOpenHashBigSet(Iterator<?> i, float f)
Creates a new hash big set using elements provided by an iterator.
Parameters:
    i - an iterator whose elements will fill the new hash big set.
    f - the load factor.

public DoubleOpenHashBigSet(Iterator<?> i)
Creates a new hash big set with Hash.DEFAULT_LOAD_FACTOR as load factor using elements provided by an iterator.
Parameters:
    i - an iterator whose elements will fill the new hash big set.

public DoubleOpenHashBigSet(double[] a, int offset, int length, float f)
Creates a new hash big set and fills it with the elements of a given array.
Parameters:
    a - an array whose elements will be used to fill the new hash big set.
    offset - the first element to use.
    length - the number of elements to use.
    f - the load factor.

public DoubleOpenHashBigSet(double[] a, int offset, int length)
Creates a new hash big set with Hash.DEFAULT_LOAD_FACTOR as load factor and fills it with the elements of a given array.
Parameters:
    a - an array whose elements will be used to fill the new hash big set.
    offset - the first element to use.
    length - the number of elements to use.

public DoubleOpenHashBigSet(double[] a, float f)
Creates a new hash big set copying the elements of an array.
Parameters:
    a - an array to be copied into the new hash big set.
    f - the load factor.

public DoubleOpenHashBigSet(double[] a)
Creates a new hash big set with Hash.DEFAULT_LOAD_FACTOR as load factor, copying the elements of an array.
Parameters:
    a - an array to be copied into the new hash big set.

public boolean add(double k)
Specified by:
    add in interface DoubleCollection
Overrides:
    add in class AbstractDoubleCollection
See Also:
    Collection.add(Object)
public boolean remove(double k)
Description copied from interface: DoubleSet
Removes an element from this set. Note that the corresponding method of the type-specific collection is rem(). This unfortunate situation is caused by the clash with the similarly named index-based method in the List interface.
Specified by:
    remove in interface DoubleSet
Overrides:
    remove in class AbstractDoubleSet
See Also:
    Collection.remove(Object)
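The naming clash mentioned above is a genuine JDK ambiguity: List declares both an index-based remove(int) and a value-based remove(Object), so for boxed numeric types the two overloads are easy to confuse. A stdlib-only illustration of the clash that motivated the type-specific rem() name:

```java
import java.util.ArrayList;
import java.util.List;

class RemoveClash {
    public static void main(String[] args) {
        List<Double> l = new ArrayList<>(List.of(10.0, 20.0, 30.0));
        l.remove(1);                     // int argument: index-based, removes 20.0
        l.remove(Double.valueOf(30.0));  // Object argument: value-based, removes 30.0
        System.out.println(l);           // [10.0]
    }
}
```

A primitive-accepting remove(double) in a type-specific collection would sit next to this overload pair, which is why fastutil reserves the name rem() for the primitive removal in collections.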
public boolean contains(double k)
contains
in interface DoubleCollection
contains
in class AbstractDoubleCollection
Collection.contains(Object)
public void clear()
Specified by:
    clear in interface Collection<Double>
    clear in interface Set<Double>
Overrides:
    clear in class AbstractCollection<Double>
public DoubleIterator iterator()
Description copied from interface: DoubleCollection
Returns a type-specific iterator on the elements of this collection. Note that this specification strengthens the one given in Iterable.iterator(), which was already strengthened in the corresponding type-specific class, but was weakened by the fact that this interface extends Collection.
Specified by:
    iterator in interface DoubleCollection
    iterator in interface DoubleIterable
    iterator in interface DoubleSet
    iterator in interface Iterable<Double>
    iterator in interface Collection<Double>
    iterator in interface Set<Double>
    iterator in class AbstractDoubleSet
@Deprecated public boolean rehash()
Deprecated. A no-op. If you need to reduce the table size to fit exactly this set, use trim().
See Also:
    trim()
public boolean trim()
This method rehashes the table to the smallest size satisfying the load factor. It can be used when the set will not be changed anymore, so as to optimize access speed and size. If the table size is already the minimum possible, this method does nothing.
See Also:
    trim(long)
public boolean trim(long n)
Rehashes this set if the table is too large. Let N be the smallest table size that can hold max(n, size64()) entries, still satisfying the load factor. If the current table size is smaller than or equal to N, this method does nothing. Otherwise, it rehashes this set in a table of size N.
This method is useful when reusing sets. Clearing a set leaves the table size untouched. If you are reusing a set many times, you can call this method with a typical size to avoid keeping around a very large table just because of a few large transient sets.
Parameters:
    n - the threshold for the trimming.
See Also:
    trim()
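The trim(long) contract can be modeled in a few lines. This is a hedged sketch of the decision rule only; fastutil's actual rehashing logic differs in detail:

```java
// Model of the trim(long) decision: N is the smallest power-of-two table size
// holding max(n, size) entries while satisfying the load factor f; a rehash
// happens only when the current table is strictly larger than N. Sketch only.
class TrimRule {
    static boolean wouldRehash(long currentTableSize, long n, long size, float f) {
        long needed = Math.max(n, size);
        long N = 1;
        while (N * f < needed) N <<= 1;  // smallest size satisfying the load factor
        return currentTableSize > N;
    }

    public static void main(String[] args) {
        // A set that once held ~1M elements but now holds 100: trim(1024) shrinks it.
        System.out.println(wouldRehash(1L << 20, 1024, 100, 0.75f));  // true
        // Already at the right size for the threshold: nothing to do.
        System.out.println(wouldRehash(2048, 1024, 100, 0.75f));      // false
    }
}
```

This matches the reuse pattern described above: after clearing a set that transiently grew very large, trim with a typical expected size releases the oversized table.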
@Deprecated public int size()
Deprecated.
Description copied from interface: Size64
Returns the size of this data structure, minimized with Integer.MAX_VALUE.
Specified by:
    size in interface Size64
    size in interface Collection<Double>
    size in interface Set<Double>
Overrides:
    size in class AbstractCollection<Double>
Returns:
    the size, minimized with Integer.MAX_VALUE.
See Also:
    Collection.size()
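The deprecated size() narrows the true long size to an int, minimized with Integer.MAX_VALUE, while size64() below reports the exact count. A sketch of that narrowing:

```java
// size() is documented to return the size minimized with Integer.MAX_VALUE;
// size64() returns the exact count as a long. Sketch of the narrowing only.
class SizeNarrowing {
    static int sizeAsInt(long size64) {
        return (int) Math.min(Integer.MAX_VALUE, size64);
    }

    public static void main(String[] args) {
        System.out.println(sizeAsInt(42L));        // 42
        System.out.println(sizeAsInt(1L << 40));   // 2147483647 (Integer.MAX_VALUE)
    }
}
```

Code that may handle more than Integer.MAX_VALUE elements should therefore always call size64() rather than the deprecated size().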
public long size64()
Description copied from interface: Size64
Returns the size of this data structure as a long.
Specified by:
    size64 in interface Size64
public boolean isEmpty()
Specified by:
    isEmpty in interface Collection<Double>
    isEmpty in interface Set<Double>
Overrides:
    isEmpty in class AbstractDoubleCollection
public DoubleOpenHashBigSet clone()
Returns a deep copy of this big set.
This method performs a deep copy of this big hash set; the data stored in the set, however, is not cloned. Note that this makes a difference only for object keys.
public int hashCode()
Returns a hash code for this set. Since equals() is not overridden, it is important that the value returned by this method is the same value as the one returned by the overridden method.
Specified by:
    hashCode in interface Collection<Double>
    hashCode in interface Set<Double>
Overrides:
    hashCode in class AbstractDoubleSet