Class LimitTokenOffsetFilter

All Implemented Interfaces:
Closeable, AutoCloseable, Unwrappable<TokenStream>
By default, this filter ignores any tokens in the wrapped TokenStream once the limit has been exceeded, which can result in reset() being called prior to incrementToken() returning false. For most TokenStream implementations this should be acceptable, and faster than consuming the full stream. If you are wrapping a TokenStream which requires that the full stream of tokens be exhausted in order to function properly, use the LimitTokenOffsetFilter(TokenStream, int, boolean) option.
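The limiting behavior described above can be illustrated with a small stand-alone sketch. This is plain Java with a simple record standing in for Lucene's token and offset-attribute machinery; the names Tok and limitByStartOffset are illustrative, not part of the Lucene API:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for a token carrying a start offset; not a Lucene class.
record Tok(String text, int startOffset) {}

class LimitSketch {
    // Mirrors the filter's rule: a token passes while its start offset is at or
    // below maxStartOffset; the first token beyond the limit ends the stream,
    // and nothing after it is emitted.
    static List<Tok> limitByStartOffset(List<Tok> stream, int maxStartOffset) {
        List<Tok> out = new ArrayList<>();
        for (Tok t : stream) {
            if (t.startOffset() > maxStartOffset) {
                break; // ends the "stream"
            }
            out.add(t);
        }
        return out;
    }
}
```

In the real filter the same cutoff is applied incrementally inside incrementToken(), one token per call, rather than over a materialized list.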
Nested Class Summary

Nested classes/interfaces inherited from class org.apache.lucene.util.AttributeSource:
AttributeSource.State
Field Summary

Fields inherited from class org.apache.lucene.analysis.TokenFilter:
input

Fields inherited from class org.apache.lucene.analysis.TokenStream:
DEFAULT_TOKEN_ATTRIBUTE_FACTORY
Constructor Summary

LimitTokenOffsetFilter(TokenStream input, int maxStartOffset)
Lets all tokens pass through until it sees one with a start offset greater than maxStartOffset, which won't pass and ends the stream.

LimitTokenOffsetFilter(TokenStream input, int maxStartOffset, boolean consumeAllTokens)
Method Summary

boolean incrementToken()
Consumers (i.e., IndexWriter) use this method to advance the stream to the next token.

Methods inherited from class org.apache.lucene.analysis.TokenFilter:
close, end, reset, unwrap

Methods inherited from class org.apache.lucene.util.AttributeSource:
addAttribute, addAttributeImpl, captureState, clearAttributes, cloneAttributes, copyTo, endAttributes, equals, getAttribute, getAttributeClassesIterator, getAttributeFactory, getAttributeImplsIterator, hasAttribute, hasAttributes, hashCode, reflectAsString, reflectWith, removeAllAttributes, restoreState, toString
Field Details

offsetAttrib

maxStartOffset
private int maxStartOffset

consumeAllTokens
private final boolean consumeAllTokens
Constructor Details

LimitTokenOffsetFilter
LimitTokenOffsetFilter(TokenStream input, int maxStartOffset)
Lets all tokens pass through until it sees one with a start offset greater than maxStartOffset, which won't pass and ends the stream. It won't consume any tokens afterwards.
Parameters:
maxStartOffset - the maximum start offset allowed

LimitTokenOffsetFilter
LimitTokenOffsetFilter(TokenStream input, int maxStartOffset, boolean consumeAllTokens)
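The difference between the two constructors is whether the wrapped stream keeps being pulled after the limit is crossed. A minimal sketch of that contrast, using a counting stand-in source rather than a real TokenStream (CountingSource and ConsumeSketch are hypothetical names, not Lucene types):

```java
import java.util.Iterator;
import java.util.List;

// Stand-in token source that counts how many tokens are pulled from it.
class CountingSource {
    final Iterator<Integer> offsets; // start offsets of successive tokens
    int pulled = 0;

    CountingSource(List<Integer> startOffsets) {
        this.offsets = startOffsets.iterator();
    }

    // Returns the next token's start offset, or null at end of stream.
    Integer next() {
        if (!offsets.hasNext()) return null;
        pulled++;
        return offsets.next();
    }
}

class ConsumeSketch {
    // Returns how many tokens pass the limit. With consumeAllTokens=true the
    // source is drained to exhaustion even though the extra tokens are discarded.
    static int run(CountingSource src, int maxStartOffset, boolean consumeAllTokens) {
        int emitted = 0;
        Integer off;
        while ((off = src.next()) != null) {
            if (off <= maxStartOffset) {
                emitted++;
            } else if (!consumeAllTokens) {
                break; // default: stop pulling from the wrapped stream immediately
            } // else: keep pulling so the wrapped stream is fully exhausted
        }
        return emitted;
    }
}
```

This is why the class description recommends the three-argument constructor for wrapped streams that must see all of their tokens before end() in order to function correctly.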
Method Details

incrementToken
Description copied from class: TokenStream
Consumers (i.e., IndexWriter) use this method to advance the stream to the next token. Implementing classes must implement this method and update the appropriate AttributeImpls with the attributes of the next token.

The producer must make no assumptions about the attributes after the method has returned: the caller may arbitrarily change them. If the producer needs to preserve state for subsequent calls, it can use AttributeSource.captureState() to create a copy of the current attribute state.

This method is called for every token of a document, so an efficient implementation is crucial for good performance. To avoid calls to AttributeSource.addAttribute(Class) and AttributeSource.getAttribute(Class), references to all AttributeImpls that this stream uses should be retrieved during instantiation.

To ensure that filters and consumers know which attributes are available, the attributes must be added during instantiation. Filters and consumers are not required to check for availability of attributes in TokenStream.incrementToken().

Specified by:
incrementToken in class TokenStream
Returns:
false for end of stream; true otherwise
Throws:
IOException
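The contract above translates into the usual consumer pattern: obtain attribute references once at instantiation, then reset(), loop on incrementToken(), and finish with end() and close(), copying attribute contents out on each step because the producer may overwrite them. A stand-alone sketch with a minimal stand-in stream (SimpleStream and ConsumerSketch are illustrative, not the real Lucene classes):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal stand-in for a TokenStream over a fixed array of terms.
class SimpleStream {
    private final String[] terms;
    private int i = -1;
    // Stand-in for a term attribute: mutated in place on each increment.
    final StringBuilder termAttr = new StringBuilder();

    SimpleStream(String... terms) { this.terms = terms; }

    void reset() { i = -1; }

    // Advances to the next token, updating termAttr; false signals end of stream.
    boolean incrementToken() {
        if (i + 1 >= terms.length) return false;
        i++;
        termAttr.setLength(0);
        termAttr.append(terms[i]);
        return true;
    }

    void end()   { /* a real TokenStream would set final offset state here */ }
    void close() { /* a real TokenStream would release resources here */ }
}

class ConsumerSketch {
    static List<String> consume(SimpleStream ts) {
        // Retrieve the attribute reference once, before the loop, as the
        // documentation recommends; copy its contents out on each token,
        // since the producer may overwrite it on the next incrementToken().
        StringBuilder term = ts.termAttr;
        List<String> out = new ArrayList<>();
        ts.reset();
        while (ts.incrementToken()) {
            out.add(term.toString()); // capture a copy, not the mutable attribute
        }
        ts.end();
        ts.close();
        return out;
    }
}
```

The same loop shape applies when the stream is wrapped in LimitTokenOffsetFilter: the filter simply makes incrementToken() return false earlier.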