- abortTransaction() - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Aborts the ongoing transaction.
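A minimal sketch of the transactional produce loop this method belongs to (broker address, transactional id, topic, and record contents are hypothetical); abortTransaction() discards what was sent since beginTransaction() when the attempt fails:

    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "orders-tx-1");          // hypothetical transactional id
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
    KafkaProducer<String, String> producer = new KafkaProducer<>(props);
    producer.initTransactions();                       // fence older producers with the same transactional id
    try {
        producer.beginTransaction();
        producer.send(new ProducerRecord<>("orders", "key", "value"));
        producer.commitTransaction();                  // make the sent records visible atomically
    } catch (KafkaException e) {
        producer.abortTransaction();                   // discard everything sent in this transaction
    } finally {
        producer.close();
    }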
- abortTransaction() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- abortTransaction() - Method in interface org.apache.kafka.clients.producer.Producer
-
- AbstractConfig - Class in org.apache.kafka.common.config
-
A convenient base class for configurations to extend.
- AbstractConfig(ConfigDef, Map<?, ?>, boolean) - Constructor for class org.apache.kafka.common.config.AbstractConfig
-
- AbstractConfig(ConfigDef, Map<?, ?>) - Constructor for class org.apache.kafka.common.config.AbstractConfig
-
- AbstractNotifyingBatchingRestoreCallback - Class in org.apache.kafka.streams.processor
-
- AbstractNotifyingBatchingRestoreCallback() - Constructor for class org.apache.kafka.streams.processor.AbstractNotifyingBatchingRestoreCallback
-
- AbstractNotifyingRestoreCallback - Class in org.apache.kafka.streams.processor
-
- AbstractNotifyingRestoreCallback() - Constructor for class org.apache.kafka.streams.processor.AbstractNotifyingRestoreCallback
-
- AbstractOptions<T extends AbstractOptions> - Class in org.apache.kafka.clients.admin
-
- AbstractOptions() - Constructor for class org.apache.kafka.clients.admin.AbstractOptions
-
- AbstractProcessor<K,V> - Class in org.apache.kafka.streams.processor
-
- AbstractProcessor() - Constructor for class org.apache.kafka.streams.processor.AbstractProcessor
-
- accept(A, B) - Method in class org.apache.kafka.common.KafkaFuture.BiConsumer
-
- accepts(StateStore) - Method in interface org.apache.kafka.streams.state.QueryableStoreType
-
Called when searching for StateStores to see if they match the type expected by implementors of this interface.
- AccessControlEntry - Class in org.apache.kafka.common.acl
-
Represents an access control entry.
- AccessControlEntry(String, String, AclOperation, AclPermissionType) - Constructor for class org.apache.kafka.common.acl.AccessControlEntry
-
Create an instance of an access control entry with the provided parameters.
- AccessControlEntryFilter - Class in org.apache.kafka.common.acl
-
Represents a filter which matches access control entries.
- AccessControlEntryFilter(String, String, AclOperation, AclPermissionType) - Constructor for class org.apache.kafka.common.acl.AccessControlEntryFilter
-
Create an instance of an access control entry filter with the provided parameters.
- ACKS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
acks
- AclBinding - Class in org.apache.kafka.common.acl
-
Represents a binding between a resource and an access control entry.
- AclBinding(Resource, AccessControlEntry) - Constructor for class org.apache.kafka.common.acl.AclBinding
-
Create an instance of this class with the provided parameters.
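A minimal sketch of composing these types (principal, host, and topic name are hypothetical): an AclBinding pairs a Resource with an AccessControlEntry and can then be created through the admin client:

    AccessControlEntry entry = new AccessControlEntry(
        "User:alice", "*", AclOperation.READ, AclPermissionType.ALLOW);   // who, from where, operation, allow/deny
    AclBinding binding = new AclBinding(new Resource(ResourceType.TOPIC, "orders"), entry);
    adminClient.createAcls(Collections.singleton(binding));               // adminClient: an existing AdminClient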
- AclBindingFilter - Class in org.apache.kafka.common.acl
-
A filter which can match AclBinding objects.
- AclBindingFilter(ResourceFilter, AccessControlEntryFilter) - Constructor for class org.apache.kafka.common.acl.AclBindingFilter
-
Create an instance of this filter with the provided parameters.
- AclOperation - Enum in org.apache.kafka.common.acl
-
Represents an operation which an ACL grants or denies permission to perform.
- AclPermissionType - Enum in org.apache.kafka.common.acl
-
Represents whether an ACL grants or denies permissions.
- activeTasks() - Method in class org.apache.kafka.streams.processor.ThreadMetadata
-
- add(Header) - Method in interface org.apache.kafka.common.header.Headers
-
Adds a header (key inside) to the end, returning if the operation succeeded.
- add(String, byte[]) - Method in interface org.apache.kafka.common.header.Headers
-
Creates and adds a header to the end, returning if the operation succeeded.
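A minimal sketch (topic and header values are hypothetical) of attaching headers to an outgoing record with the add(String, byte[]) overload:

    ProducerRecord<String, String> record = new ProducerRecord<>("orders", "key", "value");
    record.headers()
          .add("trace-id", "abc123".getBytes(StandardCharsets.UTF_8))    // header key and raw value
          .add("source", "billing".getBytes(StandardCharsets.UTF_8));    // calls can be chained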
- addClientSaslSupport(ConfigDef) - Static method in class org.apache.kafka.common.config.SaslConfigs
-
- addClientSslSupport(ConfigDef) - Static method in class org.apache.kafka.common.config.SslConfigs
-
- addDeserializerToConfig(Map<String, Object>, Deserializer<?>, Deserializer<?>) - Static method in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- addDeserializerToConfig(Properties, Deserializer<?>, Deserializer<?>) - Static method in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- addErrorMessage(String) - Method in class org.apache.kafka.common.config.ConfigValue
-
- addGlobalStore(StateStoreSupplier<KeyValueStore>, String, Deserializer, Deserializer, String, String, ProcessorSupplier) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
- addGlobalStore(StateStoreSupplier<KeyValueStore>, String, TimestampExtractor, Deserializer, Deserializer, String, String, ProcessorSupplier) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
- addGlobalStore(StoreBuilder, String, String, Consumed, String, ProcessorSupplier) - Method in class org.apache.kafka.streams.StreamsBuilder
-
- addGlobalStore(StoreBuilder, String, Deserializer, Deserializer, String, String, ProcessorSupplier) - Method in class org.apache.kafka.streams.Topology
-
- addGlobalStore(StoreBuilder, String, TimestampExtractor, Deserializer, Deserializer, String, String, ProcessorSupplier) - Method in class org.apache.kafka.streams.Topology
-
- addInternalTopic(String) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Adds an internal topic.
NOTE: this function is not needed by developers working with the Processor API; it is only used by the high-level DSL parsing functionality.
- addLatencyAndThroughputSensor(String, String, String, Sensor.RecordingLevel, String...) - Method in interface org.apache.kafka.streams.StreamsMetrics
-
Add a latency and throughput sensor for a specific operation, which will include the following sensors: average latency, max latency, and throughput (num.operations / time unit).
Also create a parent sensor with the same metrics that aggregates all entities with the same operation under the same scope, if it has not been created.
- addProcessor(String, ProcessorSupplier, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new processor node that receives and processes records output by one or more predecessor source or processor nodes.
- addProcessor(String, ProcessorSupplier, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new processor node that receives and processes records output by one or more parent source or processor
node.
- addRecord(ConsumerRecord<K, V>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- addSensor(String, Sensor.RecordingLevel) - Method in interface org.apache.kafka.streams.StreamsMetrics
-
Generic method to create a sensor.
- addSensor(String, Sensor.RecordingLevel, Sensor...) - Method in interface org.apache.kafka.streams.StreamsMetrics
-
Generic method to create a sensor with parent sensors.
- addSerializerToConfig(Map<String, Object>, Serializer<?>, Serializer<?>) - Static method in class org.apache.kafka.clients.producer.ProducerConfig
-
- addSerializerToConfig(Properties, Serializer<?>, Serializer<?>) - Static method in class org.apache.kafka.clients.producer.ProducerConfig
-
- addSink(String, String, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new sink that forwards records from predecessor nodes (processors and/or sources) to the named Kafka topic.
- addSink(String, String, StreamPartitioner, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new sink that forwards records from predecessor nodes (processors and/or sources) to the named Kafka topic, using
the supplied partitioner.
- addSink(String, String, Serializer, Serializer, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new sink that forwards records from predecessor nodes (processors and/or sources) to the named Kafka topic.
- addSink(String, String, Serializer<K>, Serializer<V>, StreamPartitioner<? super K, ? super V>, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new sink that forwards records from predecessor nodes (processors and/or sources) to the named Kafka topic.
- addSink(String, String, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new sink that forwards records from upstream parent processor and/or source nodes to the named Kafka topic.
- addSink(String, String, StreamPartitioner, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new sink that forwards records from upstream parent processor and/or source nodes to the named Kafka topic,
using the supplied partitioner.
- addSink(String, String, Serializer, Serializer, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new sink that forwards records from upstream parent processor and/or source nodes to the named Kafka topic.
- addSink(String, String, Serializer<K>, Serializer<V>, StreamPartitioner<? super K, ? super V>, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new sink that forwards records from upstream parent processor and/or source nodes to the named Kafka topic.
- addSource(String, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(TopologyBuilder.AutoOffsetReset, String, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(TimestampExtractor, String, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(TopologyBuilder.AutoOffsetReset, TimestampExtractor, String, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(String, Pattern) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(TopologyBuilder.AutoOffsetReset, String, Pattern) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(TimestampExtractor, String, Pattern) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(TopologyBuilder.AutoOffsetReset, TimestampExtractor, String, Pattern) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(String, Deserializer, Deserializer, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(TopologyBuilder.AutoOffsetReset, String, TimestampExtractor, Deserializer, Deserializer, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(String, Deserializer, Deserializer, Pattern) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(TopologyBuilder.AutoOffsetReset, String, TimestampExtractor, Deserializer, Deserializer, Pattern) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(TopologyBuilder.AutoOffsetReset, String, Deserializer, Deserializer, Pattern) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(String, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(String, Pattern) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(Topology.AutoOffsetReset, String, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(Topology.AutoOffsetReset, String, Pattern) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(TimestampExtractor, String, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(TimestampExtractor, String, Pattern) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes from topics matching the given pattern
and forwards the records to child processor and/or sink nodes.
- addSource(Topology.AutoOffsetReset, TimestampExtractor, String, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(Topology.AutoOffsetReset, TimestampExtractor, String, Pattern) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes from topics matching the given pattern and forwards the records to child processor
and/or sink nodes.
- addSource(String, Deserializer, Deserializer, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(String, Deserializer, Deserializer, Pattern) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes from topics matching the given pattern and forwards the records to child processor
and/or sink nodes.
- addSource(Topology.AutoOffsetReset, String, Deserializer, Deserializer, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes the named topics and forwards the records to child processor
and/or sink nodes.
- addSource(Topology.AutoOffsetReset, String, Deserializer, Deserializer, Pattern) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes from topics matching the given pattern and forwards the records to child processor
and/or sink nodes.
- addSource(Topology.AutoOffsetReset, String, TimestampExtractor, Deserializer, Deserializer, String...) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(Topology.AutoOffsetReset, String, TimestampExtractor, Deserializer, Deserializer, Pattern) - Method in class org.apache.kafka.streams.Topology
-
Add a new source that consumes from topics matching the given pattern and forwards the records to child processor
and/or sink nodes.
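Taken together, the addSource/addProcessor/addSink methods above assemble a Processor API topology. A minimal sketch (topic names, processor logic, and props are hypothetical):

    Topology topology = new Topology();
    topology.addSource("Source", "input-topic");
    topology.addProcessor("Process", () -> new AbstractProcessor<String, String>() {
        @Override
        public void process(String key, String value) {
            context().forward(key, value.toUpperCase());   // emit the transformed record downstream
        }
    }, "Source");
    topology.addSink("Sink", "output-topic", "Process");
    KafkaStreams streams = new KafkaStreams(topology, props);   // props: hypothetical StreamsConfig properties
    streams.start();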
- addStateStore(StateStoreSupplier, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Adds a state store
- addStateStore(StoreBuilder) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Adds a state store to the underlying Topology.
- addStateStore(StoreBuilder, String...) - Method in class org.apache.kafka.streams.Topology
-
Adds a state store.
- addThroughputSensor(String, String, String, Sensor.RecordingLevel, String...) - Method in interface org.apache.kafka.streams.StreamsMetrics
-
Add a throughput sensor for a specific operation: throughput (num.operations / time unit).
Also create a parent sensor with the same metrics that aggregates all entities with the same operation under the same scope, if it has not been created.
- addWaiter(KafkaFuture.BiConsumer<? super T, ? super Throwable>) - Method in class org.apache.kafka.common.KafkaFuture
-
- AdminClient - Class in org.apache.kafka.clients.admin
-
The administrative client for Kafka, which supports managing and inspecting topics, brokers, configurations and ACLs.
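A minimal sketch (broker address and topic spec are hypothetical) of obtaining a client and creating a topic with it:

    Properties props = new Properties();
    props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    try (AdminClient admin = AdminClient.create(props)) {
        NewTopic topic = new NewTopic("orders", 3, (short) 1);        // name, partitions, replication factor
        admin.createTopics(Collections.singleton(topic)).all().get(); // block until the creation completes
    }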
- AdminClient() - Constructor for class org.apache.kafka.clients.admin.AdminClient
-
- AdminClientConfig - Class in org.apache.kafka.clients.admin
-
The AdminClient configuration class, which also contains constants for configuration entry names.
- AdminClientConfig(Map<?, ?>) - Constructor for class org.apache.kafka.clients.admin.AdminClientConfig
-
- advanceBy(long) - Method in class org.apache.kafka.streams.kstream.TimeWindows
-
Return a window definition with the original size, but advance ("hop") the window by the given interval, which
specifies by how much a window moves forward relative to the previous one.
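A minimal sketch of a hopping window built with advanceBy (sizes are hypothetical): 5-minute windows that start every minute, so consecutive windows overlap by 4 minutes:

    TimeWindows hopping = TimeWindows.of(TimeUnit.MINUTES.toMillis(5))       // window size
                                     .advanceBy(TimeUnit.MINUTES.toMillis(1)); // hop interval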
- advanceMs - Variable in class org.apache.kafka.streams.kstream.TimeWindows
-
The size of the window's advance interval in milliseconds, i.e., by how much a window moves forward relative to
the previous one.
- after(long) - Method in class org.apache.kafka.streams.kstream.JoinWindows
-
Changes the end window boundary to timeDifferenceMs but keeps the start window boundary as is.
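A minimal sketch (intervals are hypothetical): accept join partners whose timestamps are at most 5 seconds earlier or 10 seconds later than the record being joined:

    JoinWindows windows = JoinWindows.of(5_000L)      // symmetric window: [-5s, +5s]
                                     .after(10_000L); // widen only the end boundary: [-5s, +10s]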
- afterMs - Variable in class org.apache.kafka.streams.kstream.JoinWindows
-
Maximum time difference for tuples that are after the join tuple.
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Serde<VR>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Materialized<K, VR, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
Aggregate the values of records in this stream by the grouped key.
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
Aggregate the values of records in this stream by the grouped key.
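A minimal sketch (topic, store name, and aggregation logic are hypothetical) combining an Initializer, an Aggregator, and a Materialized store, the pieces the aggregate overloads above take:

    StreamsBuilder builder = new StreamsBuilder();
    KStream<String, String> words = builder.stream("words");
    KTable<String, Long> totalLength = words
        .groupByKey()
        .aggregate(
            () -> 0L,                                              // Initializer: starting aggregate per key
            (key, value, aggregate) -> aggregate + value.length(), // Aggregator: fold each record in
            Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as("length-store")
                        .withValueSerde(Serdes.Long()));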
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Serde<VR>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Windows<W>, Serde<VR>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Windows<W>, Serde<VR>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Windows<W>, StateStoreSupplier<WindowStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- aggregate(Initializer<T>, Aggregator<? super K, ? super V, T>, Merger<? super K, T>, SessionWindows, Serde<T>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- aggregate(Initializer<T>, Aggregator<? super K, ? super V, T>, Merger<? super K, T>, SessionWindows, Serde<T>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- aggregate(Initializer<T>, Aggregator<? super K, ? super V, T>, Merger<? super K, T>, SessionWindows, Serde<T>, StateStoreSupplier<SessionStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Aggregator<? super K, ? super V, VR>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Aggregator<? super K, ? super V, VR>, Materialized<K, VR, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
Aggregate the values of records of the original KTable that got mapped to the same key into a new instance of KTable, using default serializers and deserializers.
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Aggregator<? super K, ? super V, VR>) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
Aggregate the values of records of the original KTable that got mapped to the same key into a new instance of KTable, using default serializers and deserializers.
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Aggregator<? super K, ? super V, VR>, Serde<VR>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Aggregator<? super K, ? super V, VR>, Serde<VR>) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Aggregator<? super K, ? super V, VR>, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Merger<? super K, VR>) - Method in interface org.apache.kafka.streams.kstream.SessionWindowedKStream
-
Aggregate the values of records in this stream by the grouped key and defined SessionWindows.
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Merger<? super K, VR>, Materialized<K, VR, SessionStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.SessionWindowedKStream
-
Aggregate the values of records in this stream by the grouped key and defined SessionWindows.
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>) - Method in interface org.apache.kafka.streams.kstream.TimeWindowedKStream
-
Aggregate the values of records in this stream by the grouped key.
- aggregate(Initializer<VR>, Aggregator<? super K, ? super V, VR>, Materialized<K, VR, WindowStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.TimeWindowedKStream
-
Aggregate the values of records in this stream by the grouped key.
- Aggregator<K,V,VA> - Interface in org.apache.kafka.streams.kstream
-
The Aggregator interface for aggregating values of the given key.
- all() - Method in class org.apache.kafka.clients.admin.AlterConfigsResult
-
Return a future which succeeds only if all the alter configs operations succeed.
- all() - Method in class org.apache.kafka.clients.admin.AlterReplicaLogDirsResult
-
Return a future which succeeds if all the replica movements have succeeded.
- all() - Method in class org.apache.kafka.clients.admin.CreateAclsResult
-
Return a future which succeeds only if all the ACL creations succeed.
- all() - Method in class org.apache.kafka.clients.admin.CreatePartitionsResult
-
Return a future which succeeds if all the partition creations succeed.
- all() - Method in class org.apache.kafka.clients.admin.CreateTopicsResult
-
Return a future which succeeds if all the topic creations succeed.
- all() - Method in class org.apache.kafka.clients.admin.DeleteAclsResult
-
Return a future which succeeds only if all the ACL deletions succeed, and which contains all the deleted ACLs.
- all() - Method in class org.apache.kafka.clients.admin.DeleteTopicsResult
-
Return a future which succeeds only if all the topic deletions succeed.
- all() - Method in class org.apache.kafka.clients.admin.DescribeConfigsResult
-
Return a future which succeeds only if all the config descriptions succeed.
- all() - Method in class org.apache.kafka.clients.admin.DescribeLogDirsResult
-
Return a future which succeeds only if all the brokers have responded without error
- all() - Method in class org.apache.kafka.clients.admin.DescribeReplicaLogDirsResult
-
Return a future which succeeds if the log directory information of all replicas is available.
- all() - Method in class org.apache.kafka.clients.admin.DescribeTopicsResult
-
Return a future which succeeds only if all the topic descriptions succeed.
- all() - Method in interface org.apache.kafka.streams.state.ReadOnlyKeyValueStore
-
Return an iterator over all keys in this store.
- allMetadata() - Method in class org.apache.kafka.streams.KafkaStreams
-
Find all currently running KafkaStreams instances (potentially remotely) that use the same application ID as this instance (i.e., all instances that belong to the same Kafka Streams application) and return StreamsMetadata for each discovered instance.
- allMetadataForStore(String) - Method in class org.apache.kafka.streams.KafkaStreams
-
Find all currently running KafkaStreams instances (potentially remotely) that use the same application ID as this instance (i.e., all instances that belong to the same Kafka Streams application) and that contain a StateStore with the given storeName, and return StreamsMetadata for each discovered instance.
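A minimal sketch ("length-store" is a hypothetical store name; streams is an existing KafkaStreams instance) of using the discovered metadata, e.g. to route interactive queries to the hosting instance:

    Collection<StreamsMetadata> hosts = streams.allMetadataForStore("length-store");
    for (StreamsMetadata metadata : hosts) {
        System.out.println(metadata.hostInfo() + " hosts " + metadata.stateStoreNames());
    }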
- allOf(KafkaFuture<?>...) - Static method in class org.apache.kafka.common.KafkaFuture
-
Returns a new KafkaFuture that is completed when all the given futures have completed.
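A minimal sketch (admin and newTopics are hypothetical): combine the per-topic futures of a CreateTopicsResult so a single get() waits for, and reports the failure of, any of them:

    CreateTopicsResult result = admin.createTopics(newTopics);
    KafkaFuture<Void> allDone = KafkaFuture.allOf(
        result.values().values().toArray(new KafkaFuture[0]));
    allDone.get();   // throws ExecutionException if any individual creation failed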
- allPartitionsSorted(Map<String, Integer>, Map<String, PartitionAssignor.Subscription>) - Method in class org.apache.kafka.clients.consumer.RoundRobinAssignor
-
- AlreadyExistsException - Exception in org.apache.kafka.connect.errors
-
Indicates the operation tried to create an entity that already exists.
- AlreadyExistsException(String) - Constructor for exception org.apache.kafka.connect.errors.AlreadyExistsException
-
- AlreadyExistsException(String, Throwable) - Constructor for exception org.apache.kafka.connect.errors.AlreadyExistsException
-
- AlreadyExistsException(Throwable) - Constructor for exception org.apache.kafka.connect.errors.AlreadyExistsException
-
- AlterConfigPolicy - Interface in org.apache.kafka.server.policy
-
An interface for enforcing a policy on alter configs requests.
- AlterConfigPolicy.RequestMetadata - Class in org.apache.kafka.server.policy
-
Class containing the create request parameters.
- alterConfigs(Map<ConfigResource, Config>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Update the configuration for the specified resources with the default options.
- alterConfigs(Map<ConfigResource, Config>, AlterConfigsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Update the configuration for the specified resources with the specified options.
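A minimal sketch (topic name and retention value are hypothetical; admin is an existing AdminClient) of changing a topic config through alterConfigs:

    ConfigResource resource = new ConfigResource(ConfigResource.Type.TOPIC, "orders");
    Config retention = new Config(Collections.singleton(
        new ConfigEntry(TopicConfig.RETENTION_MS_CONFIG, "604800000")));   // 7 days
    admin.alterConfigs(Collections.singletonMap(resource, retention)).all().get();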
- alterConfigs(Map<ConfigResource, Config>, AlterConfigsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- AlterConfigsOptions - Class in org.apache.kafka.clients.admin
-
- AlterConfigsOptions() - Constructor for class org.apache.kafka.clients.admin.AlterConfigsOptions
-
- AlterConfigsResult - Class in org.apache.kafka.clients.admin
-
- alterReplicaLogDirs(Map<TopicPartitionReplica, String>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Change the log directory for the specified replicas.
- alterReplicaLogDirs(Map<TopicPartitionReplica, String>, AlterReplicaLogDirsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Change the log directory for the specified replicas.
- alterReplicaLogDirs(Map<TopicPartitionReplica, String>, AlterReplicaLogDirsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- AlterReplicaLogDirsOptions - Class in org.apache.kafka.clients.admin
-
- AlterReplicaLogDirsOptions() - Constructor for class org.apache.kafka.clients.admin.AlterReplicaLogDirsOptions
-
- AlterReplicaLogDirsResult - Class in org.apache.kafka.clients.admin
-
- ANONYMOUS - Static variable in class org.apache.kafka.common.security.auth.KafkaPrincipal
-
- ANY - Static variable in class org.apache.kafka.common.acl.AccessControlEntryFilter
-
Matches any access control entry.
- ANY - Static variable in class org.apache.kafka.common.acl.AclBindingFilter
-
A filter which matches any ACL binding.
- ANY - Static variable in class org.apache.kafka.common.resource.ResourceFilter
-
Matches any resource.
- ApiException - Exception in org.apache.kafka.common.errors
-
Any API exception that is part of the public protocol should be a subclass of this class and be part of this package.
- ApiException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.ApiException
-
- ApiException(String) - Constructor for exception org.apache.kafka.common.errors.ApiException
-
- ApiException(Throwable) - Constructor for exception org.apache.kafka.common.errors.ApiException
-
- ApiException() - Constructor for exception org.apache.kafka.common.errors.ApiException
-
- appConfigs() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns all the application config properties as key/value pairs.
- appConfigsWithPrefix(String) - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns all the application config properties with the given key prefix, as key/value pairs
stripping the prefix.
- APPLICATION_ID_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
application.id
- APPLICATION_SERVER_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
application.server
- applicationId() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the application id
- apply(A) - Method in class org.apache.kafka.common.KafkaFuture.Function
-
- apply(R) - Method in interface org.apache.kafka.connect.transforms.Transformation
-
Apply transformation to the record and return another record object (which may be the record itself) or null, corresponding to a map or filter operation respectively.
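A minimal sketch of the map-or-filter contract (class name and config key are hypothetical): returning the record passes it through, returning null drops it:

    public class DropByTopic<R extends ConnectRecord<R>> implements Transformation<R> {
        private String topicToDrop;

        @Override
        public void configure(Map<String, ?> configs) {
            topicToDrop = (String) configs.get("topic.to.drop");   // hypothetical config key
        }

        @Override
        public R apply(R record) {
            return topicToDrop.equals(record.topic()) ? null : record;   // null filters the record out
        }

        @Override
        public ConfigDef config() {
            return new ConfigDef().define("topic.to.drop", ConfigDef.Type.STRING,
                    ConfigDef.Importance.HIGH, "Topic whose records are dropped");
        }

        @Override
        public void close() { }
    }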
- apply(K, V, VA) - Method in interface org.apache.kafka.streams.kstream.Aggregator
-
Compute a new aggregate from the key and value of a record and the current aggregate of the same key.
- apply(K, V) - Method in interface org.apache.kafka.streams.kstream.ForeachAction
-
Perform an action for each record of a stream.
- apply() - Method in interface org.apache.kafka.streams.kstream.Initializer
-
Return the initial value for an aggregation.
- apply(K, V) - Method in interface org.apache.kafka.streams.kstream.KeyValueMapper
-
Map a record with the given key and value to a new value.
- apply(K, V, V) - Method in interface org.apache.kafka.streams.kstream.Merger
-
Compute a new aggregate from the key and two aggregates.
- apply(V, V) - Method in interface org.apache.kafka.streams.kstream.Reducer
-
Aggregate the two given values into a single one.
- apply(V1, V2) - Method in interface org.apache.kafka.streams.kstream.ValueJoiner
-
Return a joined value consisting of value1 and value2.
- apply(V) - Method in interface org.apache.kafka.streams.kstream.ValueMapper
-
Map the given value to a new value.
- approximateNumEntries() - Method in interface org.apache.kafka.streams.state.ReadOnlyKeyValueStore
-
Return an approximate count of key-value mappings in this store.
- array(Schema) - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- as(String) - Static method in class org.apache.kafka.streams.kstream.Materialized
-
- as(WindowBytesStoreSupplier) - Static method in class org.apache.kafka.streams.kstream.Materialized
-
- as(SessionBytesStoreSupplier) - Static method in class org.apache.kafka.streams.kstream.Materialized
-
- as(KeyValueBytesStoreSupplier) - Static method in class org.apache.kafka.streams.kstream.Materialized
-
- assign(Collection<TopicPartition>) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- assign(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Manually assign a list of partitions to this consumer.
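A minimal sketch (topic and partition are hypothetical) of manual assignment, which bypasses group management and subscribe():

    TopicPartition partition = new TopicPartition("orders", 0);
    consumer.assign(Collections.singletonList(partition));          // no rebalancing, no group coordination
    consumer.seekToBeginning(Collections.singletonList(partition));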
- assign(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- assign(Map<String, Integer>, Map<String, PartitionAssignor.Subscription>) - Method in class org.apache.kafka.clients.consumer.RangeAssignor
-
- assign(Map<String, Integer>, Map<String, PartitionAssignor.Subscription>) - Method in class org.apache.kafka.clients.consumer.RoundRobinAssignor
-
- assign(Map<String, Integer>, Map<String, PartitionAssignor.Subscription>) - Method in class org.apache.kafka.clients.consumer.StickyAssignor
-
- assignment() - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- assignment() - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Get the set of partitions currently assigned to this consumer.
- assignment() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- assignment() - Method in interface org.apache.kafka.connect.sink.SinkTaskContext
-
Get the current set of assigned TopicPartitions for this task.
- assignments() - Method in class org.apache.kafka.clients.admin.NewPartitions
-
The replica assignments for the new partitions, or null if the assignment will be done by the controller.
- AT_LEAST_ONCE - Static variable in class org.apache.kafka.streams.StreamsConfig
-
- atLeast(Number) - Static method in class org.apache.kafka.common.config.ConfigDef.Range
-
A numeric range that checks only the lower bound
- AuthenticationContext - Interface in org.apache.kafka.common.security.auth
-
An object representing contextual information from the authentication session.
- AuthenticationException - Exception in org.apache.kafka.common.errors
-
This exception indicates that SASL authentication has failed.
- AuthenticationException(String) - Constructor for exception org.apache.kafka.common.errors.AuthenticationException
-
- AuthenticationException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.AuthenticationException
-
- AuthorizationException - Exception in org.apache.kafka.common.errors
-
- AuthorizationException(String) - Constructor for exception org.apache.kafka.common.errors.AuthorizationException
-
- AuthorizationException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.AuthorizationException
-
- AUTO_COMMIT_INTERVAL_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
auto.commit.interval.ms
- AUTO_OFFSET_RESET_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
auto.offset.reset
- AUTO_OFFSET_RESET_DOC - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- availablePartitionsForTopic(String) - Method in class org.apache.kafka.common.Cluster
-
Get the list of available partitions for this topic
- CACHE_MAX_BYTES_BUFFERING_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
cache.max.bytes.buffering
- cachingEnabled - Variable in class org.apache.kafka.streams.kstream.Materialized
-
- Callback - Interface in org.apache.kafka.clients.producer
-
A callback interface that the user can implement to allow code to execute when the request is complete.
- cancel(boolean) - Method in class org.apache.kafka.common.KafkaFuture
-
If not already completed, completes this future with a CancellationException.
- cancel() - Method in interface org.apache.kafka.streams.processor.Cancellable
-
Cancel the scheduled operation to avoid future calls.
- Cancellable - Interface in org.apache.kafka.streams.processor
-
- CHECK_CRCS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
check.crcs
- checksum() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
- checksum() - Method in class org.apache.kafka.clients.producer.RecordMetadata
-
- cleanUp() - Method in class org.apache.kafka.streams.KafkaStreams
-
- CLEANUP_POLICY_COMPACT - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- CLEANUP_POLICY_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- CLEANUP_POLICY_DELETE - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- CLEANUP_POLICY_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- clear() - Method in class org.apache.kafka.clients.producer.MockProducer
-
Clear the stored history of sent records, consumer group offsets, and transactional state
- CLIENT_ID_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- CLIENT_ID_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
client.id
- CLIENT_ID_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
client.id
- CLIENT_ID_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
client.id
- clientAddress() - Method in interface org.apache.kafka.common.security.auth.AuthenticationContext
-
Address of the authenticated client
- clientAddress() - Method in class org.apache.kafka.common.security.auth.PlaintextAuthenticationContext
-
- clientAddress() - Method in class org.apache.kafka.common.security.auth.SaslAuthenticationContext
-
- clientAddress() - Method in class org.apache.kafka.common.security.auth.SslAuthenticationContext
-
- close() - Method in class org.apache.kafka.clients.admin.AdminClient
-
Close the AdminClient and release all associated resources.
- close(long, TimeUnit) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Close the AdminClient and release all associated resources.
- close(long, TimeUnit) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- close() - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- close(long, TimeUnit) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- close() - Method in interface org.apache.kafka.clients.consumer.ConsumerInterceptor
-
This is called when the interceptor is closed.
- close() - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Close the consumer, waiting for up to the default timeout of 30 seconds for any needed cleanup.
- close(long, TimeUnit) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Tries to close the consumer cleanly within the specified timeout.
- close() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- close(long, TimeUnit) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- close() - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Close this producer.
- close(long, TimeUnit) - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
This method waits up to timeout for the producer to complete the sending of all incomplete requests.
- close() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- close(long, TimeUnit) - Method in class org.apache.kafka.clients.producer.MockProducer
-
- close() - Method in interface org.apache.kafka.clients.producer.Partitioner
-
This is called when the partitioner is closed.
- close() - Method in interface org.apache.kafka.clients.producer.Producer
-
- close(long, TimeUnit) - Method in interface org.apache.kafka.clients.producer.Producer
-
- close() - Method in interface org.apache.kafka.clients.producer.ProducerInterceptor
-
This is called when the interceptor is closed.
- close() - Method in class org.apache.kafka.common.security.auth.DefaultPrincipalBuilder
-
Deprecated.
- close() - Method in interface org.apache.kafka.common.security.auth.PrincipalBuilder
-
Deprecated.
Closes this instance.
- close() - Method in class org.apache.kafka.common.serialization.ByteArrayDeserializer
-
- close() - Method in class org.apache.kafka.common.serialization.ByteArraySerializer
-
- close() - Method in class org.apache.kafka.common.serialization.ByteBufferDeserializer
-
- close() - Method in class org.apache.kafka.common.serialization.ByteBufferSerializer
-
- close() - Method in class org.apache.kafka.common.serialization.BytesDeserializer
-
- close() - Method in class org.apache.kafka.common.serialization.BytesSerializer
-
- close() - Method in interface org.apache.kafka.common.serialization.Deserializer
-
- close() - Method in class org.apache.kafka.common.serialization.DoubleDeserializer
-
- close() - Method in class org.apache.kafka.common.serialization.DoubleSerializer
-
- close() - Method in class org.apache.kafka.common.serialization.ExtendedDeserializer.Wrapper
-
- close() - Method in class org.apache.kafka.common.serialization.ExtendedSerializer.Wrapper
-
- close() - Method in class org.apache.kafka.common.serialization.FloatDeserializer
-
- close() - Method in class org.apache.kafka.common.serialization.FloatSerializer
-
- close() - Method in class org.apache.kafka.common.serialization.IntegerDeserializer
-
- close() - Method in class org.apache.kafka.common.serialization.IntegerSerializer
-
- close() - Method in class org.apache.kafka.common.serialization.LongDeserializer
-
- close() - Method in class org.apache.kafka.common.serialization.LongSerializer
-
- close() - Method in interface org.apache.kafka.common.serialization.Serde
-
Close this serde class, which will close the underlying serializer and deserializer.
- close() - Method in class org.apache.kafka.common.serialization.Serdes.WrapperSerde
-
- close() - Method in interface org.apache.kafka.common.serialization.Serializer
-
Close this serializer.
- close() - Method in class org.apache.kafka.common.serialization.ShortDeserializer
-
- close() - Method in class org.apache.kafka.common.serialization.ShortSerializer
-
- close() - Method in class org.apache.kafka.common.serialization.StringDeserializer
-
- close() - Method in class org.apache.kafka.common.serialization.StringSerializer
-
- close(Collection<TopicPartition>) - Method in class org.apache.kafka.connect.sink.SinkTask
-
The SinkTask uses this method to close writers for partitions that are no longer assigned to the SinkTask.
- close() - Method in interface org.apache.kafka.connect.transforms.Transformation
-
Signal that this transformation instance will no longer be used.
- close() - Method in class org.apache.kafka.streams.KafkaStreams
-
Shut down this KafkaStreams instance by signaling all the threads to stop, and then wait for them to join.
- close(long, TimeUnit) - Method in class org.apache.kafka.streams.KafkaStreams
-
Shut down this KafkaStreams by signaling all the threads to stop, and then wait up to the timeout for the threads to join.
- close() - Method in interface org.apache.kafka.streams.kstream.Transformer
-
Close this processor and clean up any resources.
- close() - Method in interface org.apache.kafka.streams.kstream.ValueTransformer
-
Close this processor and clean up any resources.
- close() - Method in class org.apache.kafka.streams.processor.AbstractProcessor
-
Close this processor and clean up any resources.
- close() - Method in interface org.apache.kafka.streams.processor.Processor
-
Close this processor and clean up any resources.
- close() - Method in interface org.apache.kafka.streams.processor.StateStore
-
Close the storage engine.
- close() - Method in interface org.apache.kafka.streams.state.KeyValueIterator
-
- close() - Method in interface org.apache.kafka.streams.state.WindowStoreIterator
-
- closed() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- closed() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- Cluster - Class in org.apache.kafka.common
-
A representation of a subset of the nodes, topics, and partitions in the Kafka cluster.
- Cluster(String, Collection<Node>, Collection<PartitionInfo>, Set<String>, Set<String>) - Constructor for class org.apache.kafka.common.Cluster
-
Create a new cluster with the given id, nodes and partitions
- Cluster(String, Collection<Node>, Collection<PartitionInfo>, Set<String>, Set<String>, Node) - Constructor for class org.apache.kafka.common.Cluster
-
Create a new cluster with the given id, nodes and partitions
- CLUSTER - Static variable in class org.apache.kafka.common.resource.Resource
-
A resource representing the whole cluster.
- CLUSTER_NAME - Static variable in class org.apache.kafka.common.resource.Resource
-
The name of the CLUSTER resource.
- ClusterAuthorizationException - Exception in org.apache.kafka.common.errors
-
- ClusterAuthorizationException(String) - Constructor for exception org.apache.kafka.common.errors.ClusterAuthorizationException
-
- ClusterAuthorizationException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.ClusterAuthorizationException
-
- clusterId() - Method in class org.apache.kafka.clients.admin.DescribeClusterResult
-
Returns a future which yields the current cluster id.
- clusterId() - Method in class org.apache.kafka.common.ClusterResource
-
Return the cluster id.
- clusterResource() - Method in class org.apache.kafka.common.Cluster
-
- ClusterResource - Class in org.apache.kafka.common
-
The ClusterResource class encapsulates metadata for a Kafka cluster.
- ClusterResource(String) - Constructor for class org.apache.kafka.common.ClusterResource
-
- ClusterResourceListener - Interface in org.apache.kafka.common
-
A callback interface that users can implement when they wish to get notified about changes in the Cluster metadata.
- code() - Method in enum org.apache.kafka.common.acl.AclOperation
-
Return the code of this operation.
- code() - Method in enum org.apache.kafka.common.acl.AclPermissionType
-
Return the code of this permission type.
- code() - Method in enum org.apache.kafka.common.resource.ResourceType
-
Return the code of this resource.
- commit() - Method in class org.apache.kafka.connect.source.SourceTask
-
- commit() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Requests a commit
- COMMIT_INTERVAL_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
commit.interval.ms
- commitAsync() - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- commitAsync(OffsetCommitCallback) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- commitAsync(Map<TopicPartition, OffsetAndMetadata>, OffsetCommitCallback) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- commitAsync() - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Commit offsets returned on the last poll() for all the subscribed topics and partitions.
- commitAsync(OffsetCommitCallback) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Commit offsets returned on the last poll() for the subscribed topics and partitions.
- commitAsync(Map<TopicPartition, OffsetAndMetadata>, OffsetCommitCallback) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Commit the specified offsets for the specified list of topics and partitions to Kafka.
- commitAsync(Map<TopicPartition, OffsetAndMetadata>, OffsetCommitCallback) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- commitAsync() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- commitAsync(OffsetCommitCallback) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- commitCount() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- CommitFailedException - Exception in org.apache.kafka.clients.consumer
-
- CommitFailedException() - Constructor for exception org.apache.kafka.clients.consumer.CommitFailedException
-
- commitRecord(SourceRecord) - Method in class org.apache.kafka.connect.source.SourceTask
-
Commit an individual SourceRecord when the callback from the producer client is received, or if a record is filtered by a transformation.
- commitSync() - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- commitSync(Map<TopicPartition, OffsetAndMetadata>) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- commitSync() - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Commit offsets returned on the last poll() for all the subscribed topics and partitions.
- commitSync(Map<TopicPartition, OffsetAndMetadata>) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Commit the specified offsets for the specified list of topics and partitions.
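A minimal sketch (consumer and records are assumed to exist) of committing explicit offsets after processing a batch; the committed offset should be the position of the next record to consume, hence the + 1:

    Map<TopicPartition, OffsetAndMetadata> offsets = new HashMap<>();
    for (ConsumerRecord<String, String> record : records) {
        offsets.put(new TopicPartition(record.topic(), record.partition()),
                    new OffsetAndMetadata(record.offset() + 1));
    }
    consumer.commitSync(offsets);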
- commitSync(Map<TopicPartition, OffsetAndMetadata>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- commitSync() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- committed(TopicPartition) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- committed(TopicPartition) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Get the last committed offset for the given partition (whether the commit happened by this process or
another).
- committed(TopicPartition) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- commitTransaction() - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Commits the ongoing transaction.
- commitTransaction() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- commitTransaction() - Method in interface org.apache.kafka.clients.producer.Producer
-
- compareTo(TaskId) - Method in class org.apache.kafka.streams.processor.TaskId
-
- complete(T) - Method in class org.apache.kafka.common.KafkaFuture
-
If not already completed, sets the value returned by get() and related methods to the given
value.
- completedFuture(U) - Static method in class org.apache.kafka.common.KafkaFuture
-
Returns a new KafkaFuture that is already completed with the given value.
- completeExceptionally(Throwable) - Method in class org.apache.kafka.common.KafkaFuture
-
If not already completed, causes invocations of get() and related methods to throw the given
exception.
- completeNext() - Method in class org.apache.kafka.clients.producer.MockProducer
-
Complete the earliest uncompleted call successfully.
- COMPRESSION_TYPE_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
compression.type
- COMPRESSION_TYPE_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- COMPRESSION_TYPE_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- ConcurrentTransactionsException - Exception in org.apache.kafka.common.errors
-
- ConcurrentTransactionsException(String) - Constructor for exception org.apache.kafka.common.errors.ConcurrentTransactionsException
-
- Config - Class in org.apache.kafka.clients.admin
-
A configuration object containing the configuration entries for a resource.
- Config(Collection<ConfigEntry>) - Constructor for class org.apache.kafka.clients.admin.Config
-
Create a configuration instance with the provided entries.
- Config - Class in org.apache.kafka.common.config
-
- Config(List<ConfigValue>) - Constructor for class org.apache.kafka.common.config.Config
-
- config() - Method in class org.apache.kafka.connect.connector.Connector
-
Define the configuration for the connector.
- config() - Method in interface org.apache.kafka.connect.transforms.Transformation
-
Configuration specification for this transformation.
- ConfigDef - Class in org.apache.kafka.common.config
-
This class is used for specifying the set of expected configurations.
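A minimal sketch (key names and defaults are hypothetical) of declaring expected configurations and parsing user-supplied values against them:

    ConfigDef def = new ConfigDef()
        .define("max.retries", ConfigDef.Type.INT, 3, ConfigDef.Range.atLeast(0),
                ConfigDef.Importance.HIGH, "How many times to retry a failed request")
        .define("greeting", ConfigDef.Type.STRING, "hello",
                ConfigDef.Importance.LOW, "Message logged at startup");
    Map<String, Object> parsed = def.parse(userProps);   // userProps: hypothetical map of raw settings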
- ConfigDef() - Constructor for class org.apache.kafka.common.config.ConfigDef
-
- ConfigDef(ConfigDef) - Constructor for class org.apache.kafka.common.config.ConfigDef
-
- configDef() - Static method in class org.apache.kafka.streams.StreamsConfig
-
Return a copy of the config definition.
- ConfigDef.ConfigKey - Class in org.apache.kafka.common.config
-
- ConfigDef.Importance - Enum in org.apache.kafka.common.config
-
The importance level for a configuration
- ConfigDef.NonEmptyString - Class in org.apache.kafka.common.config
-
- ConfigDef.Range - Class in org.apache.kafka.common.config
-
Validation logic for numeric ranges
- ConfigDef.Recommender - Interface in org.apache.kafka.common.config
-
This is used by ConfigDef.validate(Map) to get valid values for a configuration given the current configuration values, in order to perform full configuration validation and visibility modification.
- ConfigDef.Type - Enum in org.apache.kafka.common.config
-
The config types
- ConfigDef.Validator - Interface in org.apache.kafka.common.config
-
Validation logic the user may provide to perform single configuration validation.
- ConfigDef.ValidList - Class in org.apache.kafka.common.config
-
- ConfigDef.ValidString - Class in org.apache.kafka.common.config
-
- ConfigDef.Width - Enum in org.apache.kafka.common.config
-
The width of a configuration value
- ConfigEntry - Class in org.apache.kafka.clients.admin
-
A class representing a configuration entry containing name, value and additional metadata.
- ConfigEntry(String, String) - Constructor for class org.apache.kafka.clients.admin.ConfigEntry
-
Create a configuration entry with the provided values.
- ConfigEntry(String, String, boolean, boolean, boolean) - Constructor for class org.apache.kafka.clients.admin.ConfigEntry
-
Create a configuration entry with the provided values.
- ConfigException - Exception in org.apache.kafka.common.config
-
Thrown if the user supplies an invalid configuration
- ConfigException(String) - Constructor for exception org.apache.kafka.common.config.ConfigException
-
- ConfigException(String, Object) - Constructor for exception org.apache.kafka.common.config.ConfigException
-
- ConfigException(String, Object, String) - Constructor for exception org.apache.kafka.common.config.ConfigException
-
- ConfigKey(String, ConfigDef.Type, Object, ConfigDef.Validator, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, List<String>, ConfigDef.Recommender, boolean) - Constructor for class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- configKeys() - Method in class org.apache.kafka.common.config.ConfigDef
-
Get the configuration keys
- configNames() - Static method in class org.apache.kafka.clients.admin.AdminClientConfig
-
- configNames() - Static method in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- configNames() - Static method in class org.apache.kafka.clients.producer.ProducerConfig
-
- ConfigResource - Class in org.apache.kafka.common.config
-
A class representing resources that have configs.
- ConfigResource(ConfigResource.Type, String) - Constructor for class org.apache.kafka.common.config.ConfigResource
-
Create an instance of this class with the provided parameters.
- ConfigResource.Type - Enum in org.apache.kafka.common.config
-
Type of resource.
- configs(Map<String, String>) - Method in class org.apache.kafka.clients.admin.NewTopic
-
Set the configuration to use on the new topic.
- configs() - Method in class org.apache.kafka.server.policy.AlterConfigPolicy.RequestMetadata
-
Return the configs in the request.
- configs() - Method in class org.apache.kafka.server.policy.CreateTopicPolicy.RequestMetadata
-
Return topic configs in the request, not including broker defaults.
- Configurable - Interface in org.apache.kafka.common
-
A Mix-in style interface for classes that are instantiated by reflection and need to take configuration parameters
- configure(Map<String, ?>) - Method in interface org.apache.kafka.common.Configurable
-
Configure this class with the given key-value pairs
- configure(Map<String, ?>) - Method in class org.apache.kafka.common.security.auth.DefaultPrincipalBuilder
-
Deprecated.
- configure(Map<String, ?>) - Method in interface org.apache.kafka.common.security.auth.PrincipalBuilder
-
Deprecated.
Configures this class with given key-value pairs.
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.ByteArrayDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.ByteArraySerializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.ByteBufferDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.ByteBufferSerializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.BytesDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.BytesSerializer
-
- configure(Map<String, ?>, boolean) - Method in interface org.apache.kafka.common.serialization.Deserializer
-
Configure this class.
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.DoubleDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.DoubleSerializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.ExtendedDeserializer.Wrapper
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.ExtendedSerializer.Wrapper
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.FloatDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.FloatSerializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.IntegerDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.IntegerSerializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.LongDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.LongSerializer
-
- configure(Map<String, ?>, boolean) - Method in interface org.apache.kafka.common.serialization.Serde
-
Configure this class, which will configure the underlying serializer and deserializer.
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.Serdes.WrapperSerde
-
- configure(Map<String, ?>, boolean) - Method in interface org.apache.kafka.common.serialization.Serializer
-
Configure this class.
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.ShortDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.ShortSerializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.StringDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.common.serialization.StringSerializer
-
- configure(Map<String, ?>, boolean) - Method in interface org.apache.kafka.connect.storage.Converter
-
Configure this class.
- configure(Map<String, ?>, boolean) - Method in class org.apache.kafka.connect.storage.StringConverter
-
- configure(Map<String, ?>) - Method in class org.apache.kafka.streams.errors.LogAndContinueExceptionHandler
-
- configure(Map<String, ?>) - Method in class org.apache.kafka.streams.errors.LogAndFailExceptionHandler
-
- ConfigValue - Class in org.apache.kafka.common.config
-
- ConfigValue(String) - Constructor for class org.apache.kafka.common.config.ConfigValue
-
- ConfigValue(String, Object, List<Object>, List<String>) - Constructor for class org.apache.kafka.common.config.ConfigValue
-
- configValues() - Method in class org.apache.kafka.common.config.Config
-
- ConnectException - Exception in org.apache.kafka.connect.errors
-
ConnectException is the top-level exception type generated by Kafka Connect and connector implementations.
- ConnectException(String) - Constructor for exception org.apache.kafka.connect.errors.ConnectException
-
- ConnectException(String, Throwable) - Constructor for exception org.apache.kafka.connect.errors.ConnectException
-
- ConnectException(Throwable) - Constructor for exception org.apache.kafka.connect.errors.ConnectException
-
- CONNECTIONS_MAX_IDLE_MS_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
connections.max.idle.ms
- CONNECTIONS_MAX_IDLE_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
connections.max.idle.ms
- CONNECTIONS_MAX_IDLE_MS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
connections.max.idle.ms
- CONNECTIONS_MAX_IDLE_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
connections.max.idle.ms
- Connector - Class in org.apache.kafka.connect.connector
-
Connectors manage integration of Kafka Connect with another system, either as an input that ingests
data into Kafka or an output that passes data to an external system.
- Connector() - Constructor for class org.apache.kafka.connect.connector.Connector
-
- ConnectorContext - Interface in org.apache.kafka.connect.connector
-
ConnectorContext allows Connectors to proactively interact with the Kafka Connect runtime.
- ConnectorUtils - Class in org.apache.kafka.connect.util
-
Utilities that connector implementations might find useful.
- ConnectorUtils() - Constructor for class org.apache.kafka.connect.util.ConnectorUtils
-
- connectProcessorAndStateStores(String, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Connects the processor and the state stores
- connectProcessorAndStateStores(String, String...) - Method in class org.apache.kafka.streams.Topology
-
Connects the processor and the state stores.
- connectProcessors(String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Connects a list of processors.
- ConnectRecord<R extends ConnectRecord<R>> - Class in org.apache.kafka.connect.connector
-
Base class for records containing data to be copied to/from Kafka.
- ConnectRecord(String, Integer, Schema, Object, Schema, Object, Long) - Constructor for class org.apache.kafka.connect.connector.ConnectRecord
-
- ConnectSchema - Class in org.apache.kafka.connect.data
-
- ConnectSchema(Schema.Type, boolean, Object, String, Integer, String, Map<String, String>, List<Field>, Schema, Schema) - Constructor for class org.apache.kafka.connect.data.ConnectSchema
-
Construct a Schema.
- ConnectSchema(Schema.Type, boolean, Object, String, Integer, String) - Constructor for class org.apache.kafka.connect.data.ConnectSchema
-
Construct a Schema for a primitive type, setting schema parameters, struct fields, and key and value schemas to null.
- ConnectSchema(Schema.Type) - Constructor for class org.apache.kafka.connect.data.ConnectSchema
-
Construct a default schema for a primitive type.
- connectSourceStoreAndTopic(String, String) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
This is used only for KStreamBuilder: when adding a KTable from a source topic,
we need to add the topic as the KTable's materialized state store's changelog.
- Consumed<K,V> - Class in org.apache.kafka.streams
-
- Consumed(Consumed<K, V>) - Constructor for class org.apache.kafka.streams.Consumed
-
Create an instance of Consumed from an existing instance.
- Consumer<K,V> - Interface in org.apache.kafka.clients.consumer
-
- CONSUMER_PREFIX - Static variable in class org.apache.kafka.streams.StreamsConfig
-
- ConsumerConfig - Class in org.apache.kafka.clients.consumer
-
The consumer configuration keys
- consumerGroupOffsetsHistory() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- ConsumerInterceptor<K,V> - Interface in org.apache.kafka.clients.consumer
-
A plugin interface that allows you to intercept (and possibly mutate) records received by the consumer.
- consumerPrefix(String) - Static method in class org.apache.kafka.streams.StreamsConfig
-
- ConsumerRebalanceListener - Interface in org.apache.kafka.clients.consumer
-
A callback interface that the user can implement to trigger custom actions when the set of partitions assigned to the
consumer changes.
- ConsumerRecord<K,V> - Class in org.apache.kafka.clients.consumer
-
A key/value pair to be received from Kafka.
- ConsumerRecord(String, int, long, K, V) - Constructor for class org.apache.kafka.clients.consumer.ConsumerRecord
-
Creates a record to be received from a specified topic and partition (provided for
compatibility with Kafka 0.9 before the message format supported timestamps and before
serialized metadata were exposed).
- ConsumerRecord(String, int, long, long, TimestampType, long, int, int, K, V) - Constructor for class org.apache.kafka.clients.consumer.ConsumerRecord
-
Creates a record to be received from a specified topic and partition (provided for
compatibility with Kafka 0.10 before the message format supported headers).
- ConsumerRecord(String, int, long, long, TimestampType, Long, int, int, K, V, Headers) - Constructor for class org.apache.kafka.clients.consumer.ConsumerRecord
-
Creates a record to be received from a specified topic and partition
- ConsumerRecords<K,V> - Class in org.apache.kafka.clients.consumer
-
A container that holds the list of ConsumerRecord per partition for a particular topic.
- ConsumerRecords(Map<TopicPartition, List<ConsumerRecord<K, V>>>) - Constructor for class org.apache.kafka.clients.consumer.ConsumerRecords
-
- context - Variable in class org.apache.kafka.connect.connector.Connector
-
- context - Variable in class org.apache.kafka.connect.sink.SinkTask
-
- context - Variable in class org.apache.kafka.connect.source.SourceTask
-
- context() - Method in class org.apache.kafka.streams.processor.AbstractProcessor
-
- controller() - Method in class org.apache.kafka.clients.admin.DescribeClusterResult
-
Returns a future which yields the current controller id.
- controller() - Method in class org.apache.kafka.common.Cluster
-
- ControllerMovedException - Exception in org.apache.kafka.common.errors
-
- ControllerMovedException(String) - Constructor for exception org.apache.kafka.common.errors.ControllerMovedException
-
- ControllerMovedException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.ControllerMovedException
-
- Converter - Interface in org.apache.kafka.connect.storage
-
The Converter interface provides support for translating between Kafka Connect's runtime data format
and byte[].
- convertToString(Object, ConfigDef.Type) - Static method in class org.apache.kafka.common.config.ConfigDef
-
- CoordinatorLoadInProgressException - Exception in org.apache.kafka.common.errors
-
In the context of the group coordinator, the broker returns this error code for any coordinator request if
it is still loading the group metadata (e.g.
- CoordinatorLoadInProgressException(String) - Constructor for exception org.apache.kafka.common.errors.CoordinatorLoadInProgressException
-
- CoordinatorLoadInProgressException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.CoordinatorLoadInProgressException
-
- CoordinatorNotAvailableException - Exception in org.apache.kafka.common.errors
-
In the context of the group coordinator, the broker returns this error code for metadata or offset commit
requests if the group metadata topic has not been created yet.
- CoordinatorNotAvailableException(String) - Constructor for exception org.apache.kafka.common.errors.CoordinatorNotAvailableException
-
- CoordinatorNotAvailableException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.CoordinatorNotAvailableException
-
- copartitionGroups() - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Returns the copartition groups.
- copartitionSources(Collection<String>) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Asserts that the streams of the specified source nodes must be copartitioned.
- CorruptRecordException - Exception in org.apache.kafka.common.errors
-
This exception indicates a record has failed its internal CRC check; this generally indicates network or disk corruption.
- CorruptRecordException() - Constructor for exception org.apache.kafka.common.errors.CorruptRecordException
-
- CorruptRecordException(String) - Constructor for exception org.apache.kafka.common.errors.CorruptRecordException
-
- CorruptRecordException(Throwable) - Constructor for exception org.apache.kafka.common.errors.CorruptRecordException
-
- CorruptRecordException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.CorruptRecordException
-
- count() - Method in class org.apache.kafka.clients.consumer.ConsumerRecords
-
The number of records for all topics
- count(String) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- count() - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
Count the number of records in this stream by the grouped key.
- count(StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- count(Materialized<K, Long, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
Count the number of records in this stream by the grouped key.
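A minimal Streams sketch, assuming a StreamsBuilder named builder and a hypothetical input topic "clicks", showing count() materialized into a named key-value store:
    // Group by the existing record key and count occurrences per key.
    KStream<String, String> clicks = builder.stream("clicks");
    KTable<String, Long> clicksPerKey = clicks
        .groupByKey()
        .count(Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as("clicks-per-key"));
    // "clicks-per-key" becomes a queryable state store backed by a changelog topic.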
- count(Windows<W>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- count(Windows<W>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- count(Windows<W>, StateStoreSupplier<WindowStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- count(SessionWindows, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- count(SessionWindows) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- count(SessionWindows, StateStoreSupplier<SessionStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- count(String) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
- count(Materialized<K, Long, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
Count the number of records of the original KTable that got mapped to the same key into a new instance of KTable.
- count() - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
Count the number of records of the original KTable that got mapped to the same key into a new instance of KTable.
- count(StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
- count() - Method in interface org.apache.kafka.streams.kstream.SessionWindowedKStream
-
Count the number of records in this stream by the grouped key into SessionWindows.
- count(Materialized<K, Long, SessionStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.SessionWindowedKStream
-
Count the number of records in this stream by the grouped key into SessionWindows.
- count() - Method in interface org.apache.kafka.streams.kstream.TimeWindowedKStream
-
Count the number of records in this stream by the grouped key and the defined windows.
- count(Materialized<K, Long, WindowStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.TimeWindowedKStream
-
Count the number of records in this stream by the grouped key and the defined windows.
- create(Properties) - Static method in class org.apache.kafka.clients.admin.AdminClient
-
Create a new AdminClient with the given configuration.
- create(Map<String, Object>) - Static method in class org.apache.kafka.clients.admin.AdminClient
-
Create a new AdminClient with the given configuration.
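A minimal sketch of constructing a client; the broker address is hypothetical:
    Properties props = new Properties();
    props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    try (AdminClient admin = AdminClient.create(props)) {
        // issue admin requests here; close() is called automatically on exit
    }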
- create(StateStoreProvider, String) - Method in interface org.apache.kafka.streams.state.QueryableStoreType
-
Create an instance of T (usually a facade) that developers can use to query the underlying StateStores.
- create(String) - Static method in class org.apache.kafka.streams.state.Stores
-
- createAcls(Collection<AclBinding>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
- createAcls(Collection<AclBinding>, CreateAclsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Creates access control lists (ACLs) which are bound to specific resources.
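A hedged sketch, assuming an AdminClient named admin and hypothetical principal and topic names, of granting a READ ACL on a topic:
    // Bind an ALLOW READ entry for User:alice (from any host) to the topic resource.
    Resource topic = new Resource(ResourceType.TOPIC, "example-topic");
    AccessControlEntry allowRead =
        new AccessControlEntry("User:alice", "*", AclOperation.READ, AclPermissionType.ALLOW);
    admin.createAcls(Collections.singleton(new AclBinding(topic, allowRead)))
         .all().get();  // block until the broker has applied the binding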
- createAcls(Collection<AclBinding>, CreateAclsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- CreateAclsOptions - Class in org.apache.kafka.clients.admin
-
- CreateAclsOptions() - Constructor for class org.apache.kafka.clients.admin.CreateAclsOptions
-
- CreateAclsResult - Class in org.apache.kafka.clients.admin
-
- createPartitions(Map<String, NewPartitions>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Increase the number of partitions of the topics given as the keys of newPartitions
according to the corresponding values.
- createPartitions(Map<String, NewPartitions>, CreatePartitionsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Increase the number of partitions of the topics given as the keys of newPartitions
according to the corresponding values.
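A brief sketch, assuming an AdminClient named admin and a hypothetical topic, of growing that topic to six partitions:
    Map<String, NewPartitions> grow =
        Collections.singletonMap("example-topic", NewPartitions.increaseTo(6));
    admin.createPartitions(grow).all().get();  // existing data is untouched; only new partitions are added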
- createPartitions(Map<String, NewPartitions>, CreatePartitionsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- CreatePartitionsOptions - Class in org.apache.kafka.clients.admin
-
- CreatePartitionsOptions() - Constructor for class org.apache.kafka.clients.admin.CreatePartitionsOptions
-
- CreatePartitionsResult - Class in org.apache.kafka.clients.admin
-
- CreateTopicPolicy - Interface in org.apache.kafka.server.policy
-
An interface for enforcing a policy on create topics requests.
- CreateTopicPolicy.RequestMetadata - Class in org.apache.kafka.server.policy
-
Class containing the create request parameters.
- createTopics(Collection<NewTopic>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Create a batch of new topics with the default options.
- createTopics(Collection<NewTopic>, CreateTopicsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Create a batch of new topics.
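A short sketch, assuming an AdminClient named admin, creating one topic and waiting for the result (name and counts are hypothetical):
    NewTopic topic = new NewTopic("example-topic", 3, (short) 2);
    admin.createTopics(Collections.singleton(topic))
         .all().get();  // throws ExecutionException if creation fails, e.g. the topic already exists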
- createTopics(Collection<NewTopic>, CreateTopicsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- CreateTopicsOptions - Class in org.apache.kafka.clients.admin
-
- CreateTopicsOptions() - Constructor for class org.apache.kafka.clients.admin.CreateTopicsOptions
-
- CreateTopicsResult - Class in org.apache.kafka.clients.admin
-
- DataException - Exception in org.apache.kafka.connect.errors
-
Base class for all Kafka Connect data API exceptions.
- DataException(String) - Constructor for exception org.apache.kafka.connect.errors.DataException
-
- DataException(String, Throwable) - Constructor for exception org.apache.kafka.connect.errors.DataException
-
- DataException(Throwable) - Constructor for exception org.apache.kafka.connect.errors.DataException
-
- Date - Class in org.apache.kafka.connect.data
-
A date representing a calendar day with no time of day or timezone.
- Date() - Constructor for class org.apache.kafka.connect.data.Date
-
- Decimal - Class in org.apache.kafka.connect.data
-
An arbitrary-precision signed decimal number.
- Decimal() - Constructor for class org.apache.kafka.connect.data.Decimal
-
- DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
default.deserialization.exception.handler
- DEFAULT_EXCLUDE_INTERNAL_TOPICS - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- DEFAULT_FETCH_MAX_BYTES - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- DEFAULT_ISOLATION_LEVEL - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- DEFAULT_KERBEROS_KINIT_CMD - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- DEFAULT_KERBEROS_MIN_TIME_BEFORE_RELOGIN - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- DEFAULT_KERBEROS_TICKET_RENEW_JITTER - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- DEFAULT_KERBEROS_TICKET_RENEW_WINDOW_FACTOR - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- DEFAULT_KEY_SERDE_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
default key.serde
- DEFAULT_MAX_PARTITION_FETCH_BYTES - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- DEFAULT_PRINCIPAL_BUILDER_CLASS - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- DEFAULT_SASL_ENABLED_MECHANISMS - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- DEFAULT_SASL_KERBEROS_PRINCIPAL_TO_LOCAL_RULES - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- DEFAULT_SASL_MECHANISM - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- DEFAULT_SECURITY_PROTOCOL - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- DEFAULT_SSL_ENABLED_PROTOCOLS - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- DEFAULT_SSL_KEYMANGER_ALGORITHM - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- DEFAULT_SSL_KEYSTORE_TYPE - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- DEFAULT_SSL_PROTOCOL - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- DEFAULT_SSL_TRUSTMANAGER_ALGORITHM - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- DEFAULT_SSL_TRUSTSTORE_TYPE - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- DEFAULT_TIMESTAMP_EXTRACTOR_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
default timestamp.extractor
- DEFAULT_VALUE_SERDE_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
default value.serde
- defaultDeserializationExceptionHandler() - Method in class org.apache.kafka.streams.StreamsConfig
-
- defaultKeySerde() - Method in class org.apache.kafka.streams.StreamsConfig
-
- DefaultPartitionGrouper - Class in org.apache.kafka.streams.processor
-
Default implementation of the PartitionGrouper interface that groups partitions by the partition id.
- DefaultPartitionGrouper() - Constructor for class org.apache.kafka.streams.processor.DefaultPartitionGrouper
-
- DefaultPrincipalBuilder - Class in org.apache.kafka.common.security.auth
-
- DefaultPrincipalBuilder() - Constructor for class org.apache.kafka.common.security.auth.DefaultPrincipalBuilder
-
Deprecated.
- defaultTimestampExtractor() - Method in class org.apache.kafka.streams.StreamsConfig
-
- defaultValue - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- defaultValue() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- defaultValue() - Method in interface org.apache.kafka.connect.data.Schema
-
- defaultValue() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- defaultValue(Object) - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Set the default value for this schema.
- defaultValueSerde() - Method in class org.apache.kafka.streams.StreamsConfig
-
- define(ConfigDef.ConfigKey) - Method in class org.apache.kafka.common.config.ConfigDef
-
- define(String, ConfigDef.Type, Object, ConfigDef.Validator, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, List<String>, ConfigDef.Recommender) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration
- define(String, ConfigDef.Type, Object, ConfigDef.Validator, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, List<String>) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no custom recommender
- define(String, ConfigDef.Type, Object, ConfigDef.Validator, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, ConfigDef.Recommender) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no dependents
- define(String, ConfigDef.Type, Object, ConfigDef.Validator, ConfigDef.Importance, String, String, int, ConfigDef.Width, String) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no dependents and no custom recommender
- define(String, ConfigDef.Type, Object, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, List<String>, ConfigDef.Recommender) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no special validation logic
- define(String, ConfigDef.Type, Object, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, List<String>) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no special validation logic and no custom recommender
- define(String, ConfigDef.Type, Object, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, ConfigDef.Recommender) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no special validation logic and no custom recommender
- define(String, ConfigDef.Type, Object, ConfigDef.Importance, String, String, int, ConfigDef.Width, String) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no special validation logic, no dependents and no custom recommender
- define(String, ConfigDef.Type, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, List<String>, ConfigDef.Recommender) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no default value and no special validation logic
- define(String, ConfigDef.Type, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, List<String>) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no default value, no special validation logic and no custom recommender
- define(String, ConfigDef.Type, ConfigDef.Importance, String, String, int, ConfigDef.Width, String, ConfigDef.Recommender) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no default value, no special validation logic and no custom recommender
- define(String, ConfigDef.Type, ConfigDef.Importance, String, String, int, ConfigDef.Width, String) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no default value, no special validation logic, no dependents and no custom recommender
- define(String, ConfigDef.Type, Object, ConfigDef.Validator, ConfigDef.Importance, String) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no group, no order in group, no width, no display name, no dependents and no custom recommender
- define(String, ConfigDef.Type, Object, ConfigDef.Importance, String) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no special validation logic
- define(String, ConfigDef.Type, ConfigDef.Importance, String) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new configuration with no default value and no special validation logic
- defineInternal(String, ConfigDef.Type, Object, ConfigDef.Importance) - Method in class org.apache.kafka.common.config.ConfigDef
-
Define a new internal configuration.
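A small sketch of building a ConfigDef with two of the define() overloads listed above; the key names, defaults, and documentation strings are made up for illustration:
    ConfigDef def = new ConfigDef()
        // key, type, default, validator, importance, documentation
        .define("cache.max.bytes", ConfigDef.Type.LONG, 10485760L,
                ConfigDef.Range.atLeast(0), ConfigDef.Importance.MEDIUM,
                "Maximum cache size in bytes.")
        // key, type, importance, documentation (no default value, no validator)
        .define("client.tag", ConfigDef.Type.STRING, ConfigDef.Importance.LOW,
                "Free-form tag attached to this client.");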
- delete(K) - Method in interface org.apache.kafka.streams.state.KeyValueStore
-
Delete the value from the store (if there is one)
- DELETE_RETENTION_MS_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- DELETE_RETENTION_MS_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- deleteAcls(Collection<AclBindingFilter>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
- deleteAcls(Collection<AclBindingFilter>, DeleteAclsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Deletes access control lists (ACLs) according to the supplied filters.
- deleteAcls(Collection<AclBindingFilter>, DeleteAclsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- DeleteAclsOptions - Class in org.apache.kafka.clients.admin
-
- DeleteAclsOptions() - Constructor for class org.apache.kafka.clients.admin.DeleteAclsOptions
-
- DeleteAclsResult - Class in org.apache.kafka.clients.admin
-
- DeleteAclsResult.FilterResult - Class in org.apache.kafka.clients.admin
-
A class containing either the deleted ACL binding or an exception if the delete failed.
- DeleteAclsResult.FilterResults - Class in org.apache.kafka.clients.admin
-
A class containing the results of the delete ACLs operation.
- deleteTopics(Collection<String>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
- deleteTopics(Collection<String>, DeleteTopicsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Delete a batch of topics.
- deleteTopics(Collection<String>, DeleteTopicsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- DeleteTopicsOptions - Class in org.apache.kafka.clients.admin
-
- DeleteTopicsOptions() - Constructor for class org.apache.kafka.clients.admin.DeleteTopicsOptions
-
- DeleteTopicsResult - Class in org.apache.kafka.clients.admin
-
- dependents - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- describe() - Method in class org.apache.kafka.streams.Topology
-
Returns a description of the specified Topology.
- describeAcls(AclBindingFilter) - Method in class org.apache.kafka.clients.admin.AdminClient
-
- describeAcls(AclBindingFilter, DescribeAclsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Lists access control lists (ACLs) according to the supplied filter.
- describeAcls(AclBindingFilter, DescribeAclsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- DescribeAclsOptions - Class in org.apache.kafka.clients.admin
-
- DescribeAclsOptions() - Constructor for class org.apache.kafka.clients.admin.DescribeAclsOptions
-
- DescribeAclsResult - Class in org.apache.kafka.clients.admin
-
- describeCluster() - Method in class org.apache.kafka.clients.admin.AdminClient
-
Get information about the nodes in the cluster, using the default options.
- describeCluster(DescribeClusterOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Get information about the nodes in the cluster.
- describeCluster(DescribeClusterOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- DescribeClusterOptions - Class in org.apache.kafka.clients.admin
-
- DescribeClusterOptions() - Constructor for class org.apache.kafka.clients.admin.DescribeClusterOptions
-
- DescribeClusterResult - Class in org.apache.kafka.clients.admin
-
- describeConfigs(Collection<ConfigResource>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Get the configuration for the specified resources with the default options.
- describeConfigs(Collection<ConfigResource>, DescribeConfigsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Get the configuration for the specified resources.
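A hedged sketch, assuming an AdminClient named admin, of reading back the effective configuration of a hypothetical topic:
    ConfigResource resource = new ConfigResource(ConfigResource.Type.TOPIC, "example-topic");
    Config config = admin.describeConfigs(Collections.singleton(resource))
                         .all().get()
                         .get(resource);
    for (ConfigEntry entry : config.entries()) {
        System.out.println(entry.name() + " = " + entry.value());
    }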
- describeConfigs(Collection<ConfigResource>, DescribeConfigsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- DescribeConfigsOptions - Class in org.apache.kafka.clients.admin
-
- DescribeConfigsOptions() - Constructor for class org.apache.kafka.clients.admin.DescribeConfigsOptions
-
- DescribeConfigsResult - Class in org.apache.kafka.clients.admin
-
- describeLogDirs(Collection<Integer>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
- describeLogDirs(Collection<Integer>, DescribeLogDirsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Query the information of all log directories on the given set of brokers.
This operation is supported by brokers with version 1.0.0 or higher.
- describeLogDirs(Collection<Integer>, DescribeLogDirsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- DescribeLogDirsOptions - Class in org.apache.kafka.clients.admin
-
- DescribeLogDirsOptions() - Constructor for class org.apache.kafka.clients.admin.DescribeLogDirsOptions
-
- DescribeLogDirsResult - Class in org.apache.kafka.clients.admin
-
- describeReplicaLogDirs(Collection<TopicPartitionReplica>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Query the replica log directory information for the specified replicas.
- describeReplicaLogDirs(Collection<TopicPartitionReplica>, DescribeReplicaLogDirsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Query the replica log directory information for the specified replicas.
- describeReplicaLogDirs(Collection<TopicPartitionReplica>, DescribeReplicaLogDirsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- DescribeReplicaLogDirsOptions - Class in org.apache.kafka.clients.admin
-
- DescribeReplicaLogDirsOptions() - Constructor for class org.apache.kafka.clients.admin.DescribeReplicaLogDirsOptions
-
- DescribeReplicaLogDirsResult - Class in org.apache.kafka.clients.admin
-
- DescribeReplicaLogDirsResult.ReplicaLogDirInfo - Class in org.apache.kafka.clients.admin
-
- describeTopics(Collection<String>) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Describe some topics in the cluster, with the default options.
- describeTopics(Collection<String>, DescribeTopicsOptions) - Method in class org.apache.kafka.clients.admin.AdminClient
-
Describe some topics in the cluster.
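A brief sketch, assuming an AdminClient named admin, of describing a hypothetical topic and printing its partition count:
    TopicDescription description = admin.describeTopics(Collections.singleton("example-topic"))
                                        .all().get()
                                        .get("example-topic");
    System.out.println("partitions: " + description.partitions().size());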
- describeTopics(Collection<String>, DescribeTopicsOptions) - Method in class org.apache.kafka.clients.admin.KafkaAdminClient
-
- DescribeTopicsOptions - Class in org.apache.kafka.clients.admin
-
- DescribeTopicsOptions() - Constructor for class org.apache.kafka.clients.admin.DescribeTopicsOptions
-
- DescribeTopicsResult - Class in org.apache.kafka.clients.admin
-
- description() - Method in class org.apache.kafka.common.MetricName
-
- description() - Method in class org.apache.kafka.common.MetricNameTemplate
-
Get the description of the metric.
- DeserializationExceptionHandler - Interface in org.apache.kafka.streams.errors
-
Interface that specifies how an exception from source node deserialization
(e.g., reading from Kafka) should be handled.
- DeserializationExceptionHandler.DeserializationHandlerResponse - Enum in org.apache.kafka.streams.errors
-
Enumeration that describes the response from the exception handler.
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.ByteArrayDeserializer
-
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.ByteBufferDeserializer
-
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.BytesDeserializer
-
- deserialize(String, byte[]) - Method in interface org.apache.kafka.common.serialization.Deserializer
-
Deserialize a record value from a byte array into a value or object.
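As an illustration of the Deserializer contract (the class name is hypothetical), a UTF-8 string deserializer might look like this:
    import java.nio.charset.StandardCharsets;
    import java.util.Map;
    import org.apache.kafka.common.serialization.Deserializer;

    public class Utf8Deserializer implements Deserializer<String> {
        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            // no configuration needed for this example
        }

        @Override
        public String deserialize(String topic, byte[] data) {
            // a null payload (e.g. a tombstone) deserializes to null
            return data == null ? null : new String(data, StandardCharsets.UTF_8);
        }

        @Override
        public void close() {
            // nothing to release
        }
    }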
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.DoubleDeserializer
-
- deserialize(String, Headers, byte[]) - Method in interface org.apache.kafka.common.serialization.ExtendedDeserializer
-
Deserialize a record value from a byte array into a value or object.
- deserialize(String, Headers, byte[]) - Method in class org.apache.kafka.common.serialization.ExtendedDeserializer.Wrapper
-
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.ExtendedDeserializer.Wrapper
-
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.FloatDeserializer
-
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.IntegerDeserializer
-
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.LongDeserializer
-
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.ShortDeserializer
-
- deserialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.StringDeserializer
-
- Deserializer<T> - Interface in org.apache.kafka.common.serialization
-
An interface for converting bytes to objects.
- deserializer() - Method in interface org.apache.kafka.common.serialization.Serde
-
- deserializer() - Method in class org.apache.kafka.common.serialization.Serdes.WrapperSerde
-
- disableLogging() - Method in interface org.apache.kafka.streams.state.Stores.InMemoryKeyValueFactory
-
Deprecated.
Indicates that a changelog should not be created for the key-value store
- disableLogging() - Method in interface org.apache.kafka.streams.state.Stores.PersistentKeyValueFactory
-
Deprecated.
Indicates that a changelog should not be created for the key-value store
- DisconnectException - Exception in org.apache.kafka.common.errors
-
Server disconnected before a request could be completed.
- DisconnectException() - Constructor for exception org.apache.kafka.common.errors.DisconnectException
-
- DisconnectException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.DisconnectException
-
- DisconnectException(String) - Constructor for exception org.apache.kafka.common.errors.DisconnectException
-
- DisconnectException(Throwable) - Constructor for exception org.apache.kafka.common.errors.DisconnectException
-
- displayName - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- doc() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- doc() - Method in interface org.apache.kafka.connect.data.Schema
-
- doc() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- doc(String) - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Set the documentation for this schema.
- documentation - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- Double() - Static method in class org.apache.kafka.common.serialization.Serdes
-
- DoubleDeserializer - Class in org.apache.kafka.common.serialization
-
- DoubleDeserializer() - Constructor for class org.apache.kafka.common.serialization.DoubleDeserializer
-
- DoubleSerde() - Constructor for class org.apache.kafka.common.serialization.Serdes.DoubleSerde
-
- DoubleSerializer - Class in org.apache.kafka.common.serialization
-
- DoubleSerializer() - Constructor for class org.apache.kafka.common.serialization.DoubleSerializer
-
- DuplicateSequenceException - Exception in org.apache.kafka.common.errors
-
- DuplicateSequenceException(String) - Constructor for exception org.apache.kafka.common.errors.DuplicateSequenceException
-
- FailOnInvalidTimestamp - Class in org.apache.kafka.streams.processor
-
Retrieves embedded metadata timestamps from Kafka messages.
- FailOnInvalidTimestamp() - Constructor for class org.apache.kafka.streams.processor.FailOnInvalidTimestamp
-
- fenceProducer() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- fetch(K) - Method in interface org.apache.kafka.streams.state.ReadOnlySessionStore
-
Retrieve all aggregated sessions for the provided key.
- fetch(K, K) - Method in interface org.apache.kafka.streams.state.ReadOnlySessionStore
-
Retrieve all aggregated sessions for the given range of keys.
- fetch(K, long, long) - Method in interface org.apache.kafka.streams.state.ReadOnlyWindowStore
-
Get all the key-value pairs with the given key and the time range from all
the existing windows.
- fetch(K, K, long, long) - Method in interface org.apache.kafka.streams.state.ReadOnlyWindowStore
-
Get all the key-value pairs in the given key range and time range from all
the existing windows.
- FETCH_MAX_BYTES_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
fetch.max.bytes
- FETCH_MAX_WAIT_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
fetch.max.wait.ms
- FETCH_MIN_BYTES_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
fetch.min.bytes
- field(String) - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- Field - Class in org.apache.kafka.connect.data
-
A field in a Struct, consisting of a field name, index, and Schema for the field value.
- Field(String, int, Schema) - Constructor for class org.apache.kafka.connect.data.Field
-
- field(String) - Method in interface org.apache.kafka.connect.data.Schema
-
Get a field for this Schema by name.
- field(String, Schema) - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Add a field to this struct schema.
- field(String) - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- fields() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- fields() - Method in interface org.apache.kafka.connect.data.Schema
-
Get the list of fields for this Schema.
- fields() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Get the list of fields for this Schema.
- FILE_DELETE_DELAY_MS_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- FILE_DELETE_DELAY_MS_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- fillInStackTrace() - Method in exception org.apache.kafka.common.errors.ApiException
-
- fillInStackTrace() - Method in exception org.apache.kafka.common.errors.SerializationException
-
- filter(Predicate<? super K, ? super V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Create a new KStream that consists of all records of this stream which satisfy the given predicate.
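A one-line sketch, assuming a KStream<String, Long> named counts, of keeping only records whose value exceeds a threshold:
    KStream<String, Long> frequent = counts.filter((word, count) -> count != null && count > 10L);
    frequent.to("frequent-words");  // hypothetical output topic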
- filter(Predicate<? super K, ? super V>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Create a new KTable that consists of all records of this KTable which satisfy the given predicate.
- filter(Predicate<? super K, ? super V>, Materialized<K, V, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Create a new KTable that consists of all records of this KTable which satisfy the given predicate.
- filter(Predicate<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- filter(Predicate<? super K, ? super V>, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- filterNot(Predicate<? super K, ? super V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Create a new KStream that consists of all records of this stream which do not satisfy the given predicate.
- filterNot(Predicate<? super K, ? super V>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Create a new KTable that consists of all records of this KTable which do not satisfy the given predicate.
- filterNot(Predicate<? super K, ? super V>, Materialized<K, V, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Create a new KTable that consists of all records of this KTable which do not satisfy the given predicate.
- filterNot(Predicate<? super K, ? super V>, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- filterNot(Predicate<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- findIndefiniteField() - Method in class org.apache.kafka.common.acl.AccessControlEntryFilter
-
Returns a string describing an ANY or UNKNOWN field, or null if there is
no such field.
- findIndefiniteField() - Method in class org.apache.kafka.common.acl.AclBindingFilter
-
Return a string describing an ANY or UNKNOWN field, or null if there is no such field.
- findIndefiniteField() - Method in class org.apache.kafka.common.resource.ResourceFilter
-
Return a string describing an ANY or UNKNOWN field, or null if there is no such field.
- findSessions(K, long, long) - Method in interface org.apache.kafka.streams.state.SessionStore
-
Fetch any sessions with the matching key whose end is ≥ earliestSessionEndTime and whose start is ≤ latestSessionStartTime.
This iterator must be closed after use.
- findSessions(K, K, long, long) - Method in interface org.apache.kafka.streams.state.SessionStore
-
Fetch any sessions in the given range of keys whose end is ≥ earliestSessionEndTime and whose start is ≤ latestSessionStartTime.
This iterator must be closed after use.
- flatMap(KeyValueMapper<? super K, ? super V, ? extends Iterable<? extends KeyValue<? extends KR, ? extends VR>>>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Transform each record of the input stream into zero or more records in the output stream (both key and value type
can be altered arbitrarily).
- flatMapValues(ValueMapper<? super V, ? extends Iterable<? extends VR>>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Create a new KStream by transforming the value of each record in this stream into zero or more values
with the same key in the new stream.
- Float() - Static method in class org.apache.kafka.common.serialization.Serdes
-
- float32() - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- FLOAT32_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- float64() - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- FLOAT64_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- FloatDeserializer - Class in org.apache.kafka.common.serialization
-
- FloatDeserializer() - Constructor for class org.apache.kafka.common.serialization.FloatDeserializer
-
- FloatSerde() - Constructor for class org.apache.kafka.common.serialization.Serdes.FloatSerde
-
- FloatSerializer - Class in org.apache.kafka.common.serialization
-
- FloatSerializer() - Constructor for class org.apache.kafka.common.serialization.FloatSerializer
-
- flush() - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Invoking this method makes all buffered records immediately available to send (even if linger.ms is
greater than 0) and blocks on the completion of the requests associated with these records.
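A short sketch (broker address and topic are hypothetical) showing flush() used to force out buffered records before exiting:
    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    try (Producer<String, String> producer = new KafkaProducer<>(props)) {
        for (int i = 0; i < 100; i++) {
            producer.send(new ProducerRecord<>("example-topic", Integer.toString(i), "value-" + i));
        }
        producer.flush();  // blocks until every record sent above has completed
    }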
- flush() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- flush() - Method in interface org.apache.kafka.clients.producer.Producer
-
- flush(Map<TopicPartition, OffsetAndMetadata>) - Method in class org.apache.kafka.connect.sink.SinkTask
-
- flush() - Method in interface org.apache.kafka.streams.processor.StateStore
-
Flush any cached data
- FLUSH_MESSAGES_INTERVAL_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- FLUSH_MESSAGES_INTERVAL_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- FLUSH_MS_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- FLUSH_MS_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- flushed() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- foreach(ForeachAction<? super K, ? super V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Perform an action on each record of KStream.
- foreach(ForeachAction<? super K, ? super V>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- ForeachAction<K,V> - Interface in org.apache.kafka.streams.kstream
-
The ForeachAction interface for performing an action on a key-value pair.
- forId(short) - Static method in enum org.apache.kafka.common.security.auth.SecurityProtocol
-
- forName(String) - Static method in enum org.apache.kafka.common.security.auth.SecurityProtocol
-
Case-insensitive lookup by protocol name
- forward(K, V) - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Forwards a key/value pair to the downstream processors
- forward(K, V, int) - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Forwards a key/value pair to one of the downstream processors designated by childIndex
- forward(K, V, String) - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Forwards a key/value pair to one of the downstream processors designated by the downstream processor name
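A minimal Processor sketch (class name hypothetical) that uses context().forward() to pass only non-null values downstream:
    import org.apache.kafka.streams.processor.AbstractProcessor;

    public class DropNullsProcessor extends AbstractProcessor<String, String> {
        @Override
        public void process(String key, String value) {
            if (value != null) {
                context().forward(key, value);  // forward to all downstream processors
            }
            // null values are simply dropped
        }
    }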
- fromCode(byte) - Static method in enum org.apache.kafka.common.acl.AclOperation
-
Return the AclOperation with the provided code or `AclOperation.UNKNOWN` if one cannot be found.
- fromCode(byte) - Static method in enum org.apache.kafka.common.acl.AclPermissionType
-
Return the AclPermissionType with the provided code or `AclPermissionType.UNKNOWN` if one cannot be found.
- fromCode(byte) - Static method in enum org.apache.kafka.common.resource.ResourceType
-
Return the ResourceType with the provided code or `ResourceType.UNKNOWN` if one cannot be found.
- fromConnectData(String, Schema, Object) - Method in interface org.apache.kafka.connect.storage.Converter
-
Convert a Kafka Connect data object to a native object for serialization.
- fromConnectData(String, Schema, Object) - Method in class org.apache.kafka.connect.storage.StringConverter
-
- fromLogical(Schema, Date) - Static method in class org.apache.kafka.connect.data.Date
-
Convert a value from its logical format (Date) to its encoded format.
- fromLogical(Schema, BigDecimal) - Static method in class org.apache.kafka.connect.data.Decimal
-
Convert a value from its logical format (BigDecimal) to its encoded format.
- fromLogical(Schema, Date) - Static method in class org.apache.kafka.connect.data.Time
-
Convert a value from its logical format (Time) to its encoded format.
- fromLogical(Schema, Date) - Static method in class org.apache.kafka.connect.data.Timestamp
-
Convert a value from its logical format (Date) to its encoded format.
- fromString(String) - Static method in enum org.apache.kafka.common.acl.AclOperation
-
Parse the given string as an ACL operation.
- fromString(String) - Static method in enum org.apache.kafka.common.acl.AclPermissionType
-
Parse the given string as an ACL permission.
- fromString(String) - Static method in enum org.apache.kafka.common.resource.ResourceType
-
Parse the given string as an ACL resource type.
- fromString(String) - Static method in class org.apache.kafka.common.security.auth.KafkaPrincipal
-
- Function() - Constructor for class org.apache.kafka.common.KafkaFuture.Function
-
- id() - Method in class org.apache.kafka.common.Node
-
The node id of this node
- id - Variable in enum org.apache.kafka.common.security.auth.SecurityProtocol
-
The permanent and immutable id of a security protocol -- this can't change, and must match kafka.cluster.SecurityProtocol
- id - Variable in enum org.apache.kafka.streams.errors.DeserializationExceptionHandler.DeserializationHandlerResponse
-
The permanent and immutable id of an API; this can't change ever
- id() - Method in interface org.apache.kafka.streams.TopologyDescription.Subtopology
-
Internally assigned unique ID.
- idString() - Method in class org.apache.kafka.common.Node
-
String representation of the node id.
- ignore(String) - Method in class org.apache.kafka.common.config.AbstractConfig
-
- IllegalGenerationException - Exception in org.apache.kafka.common.errors
-
- IllegalGenerationException() - Constructor for exception org.apache.kafka.common.errors.IllegalGenerationException
-
- IllegalGenerationException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.IllegalGenerationException
-
- IllegalGenerationException(String) - Constructor for exception org.apache.kafka.common.errors.IllegalGenerationException
-
- IllegalGenerationException(Throwable) - Constructor for exception org.apache.kafka.common.errors.IllegalGenerationException
-
- IllegalSaslStateException - Exception in org.apache.kafka.common.errors
-
This exception indicates unexpected requests prior to SASL authentication.
- IllegalSaslStateException(String) - Constructor for exception org.apache.kafka.common.errors.IllegalSaslStateException
-
- IllegalSaslStateException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.IllegalSaslStateException
-
- IllegalWorkerStateException - Exception in org.apache.kafka.connect.errors
-
Indicates that a method has been invoked illegally or at an invalid time by a connector or task.
- IllegalWorkerStateException(String) - Constructor for exception org.apache.kafka.connect.errors.IllegalWorkerStateException
-
- IllegalWorkerStateException(String, Throwable) - Constructor for exception org.apache.kafka.connect.errors.IllegalWorkerStateException
-
- IllegalWorkerStateException(Throwable) - Constructor for exception org.apache.kafka.connect.errors.IllegalWorkerStateException
-
- importance - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- in(String...) - Static method in class org.apache.kafka.common.config.ConfigDef.ValidList
-
- in(String...) - Static method in class org.apache.kafka.common.config.ConfigDef.ValidString
-
- inactivityGap() - Method in class org.apache.kafka.streams.kstream.SessionWindows
-
Return the specified gap for the session windows in milliseconds.
- InconsistentGroupProtocolException - Exception in org.apache.kafka.common.errors
-
- InconsistentGroupProtocolException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InconsistentGroupProtocolException
-
- InconsistentGroupProtocolException(String) - Constructor for exception org.apache.kafka.common.errors.InconsistentGroupProtocolException
-
- increaseTo(int) - Static method in class org.apache.kafka.clients.admin.NewPartitions
-
Increase the partition count for a topic to the given totalCount.
- increaseTo(int, List<List<Integer>>) - Static method in class org.apache.kafka.clients.admin.NewPartitions
-
Increase the partition count for a topic to the given totalCount, assigning the new partitions according to the given newAssignments.
- index() - Method in class org.apache.kafka.connect.data.Field
-
Get the index of this field within the struct.
- INDEX_INTERVAL_BYTES_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- INDEX_INTERVAL_BYTES_DOCS - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- init(ProcessorContext) - Method in interface org.apache.kafka.streams.kstream.Transformer
-
Initialize this transformer.
- init(ProcessorContext) - Method in interface org.apache.kafka.streams.kstream.ValueTransformer
-
Initialize this transformer.
- init(ProcessorContext) - Method in class org.apache.kafka.streams.processor.AbstractProcessor
-
- init(ProcessorContext) - Method in interface org.apache.kafka.streams.processor.Processor
-
Initialize this processor with the given context.
- init(ProcessorContext, StateStore) - Method in interface org.apache.kafka.streams.processor.StateStore
-
Initializes this state store
- initialize(ConnectorContext) - Method in class org.apache.kafka.connect.connector.Connector
-
Initialize this connector, using the provided ConnectorContext to notify the runtime of
input configuration changes.
- initialize(ConnectorContext, List<Map<String, String>>) - Method in class org.apache.kafka.connect.connector.Connector
-
Initialize this connector, using the provided ConnectorContext to notify the runtime of
input configuration changes and using the provided set of Task configurations.
- initialize(SinkTaskContext) - Method in class org.apache.kafka.connect.sink.SinkTask
-
Initialize the context of this task.
- initialize(SourceTaskContext) - Method in class org.apache.kafka.connect.source.SourceTask
-
Initialize this SourceTask with the specified context object.
- Initializer<VA> - Interface in org.apache.kafka.streams.kstream
-
The Initializer interface for creating an initial value in aggregations.
- initTransactions() - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Needs to be called before any other methods when the transactional.id is set in the configuration.
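A hedged end-to-end sketch of the transactional producer flow (transactional.id, topic, and broker address are hypothetical); the error handling is deliberately simplified:
    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "example-tx-id");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    Producer<String, String> producer = new KafkaProducer<>(props);
    producer.initTransactions();                 // must run once, before the first beginTransaction()
    try {
        producer.beginTransaction();
        producer.send(new ProducerRecord<>("example-topic", "key", "value"));
        producer.commitTransaction();
    } catch (KafkaException e) {
        producer.abortTransaction();             // simplified: fatal errors would instead require close()
    } finally {
        producer.close();
    }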
- initTransactions() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- initTransactions() - Method in interface org.apache.kafka.clients.producer.Producer
-
- inMemory() - Method in interface org.apache.kafka.streams.state.Stores.KeyValueFactory
-
Keep all key-value entries in-memory, although for durability all entries are recorded in a Kafka topic that can be
read to restore the entries if they are lost.
- inMemoryKeyValueStore(String) - Static method in class org.apache.kafka.streams.state.Stores
-
- INSTANCE - Static variable in exception org.apache.kafka.common.errors.CoordinatorNotAvailableException
-
- INSTANCE - Static variable in exception org.apache.kafka.common.errors.DisconnectException
-
- INSTANCE - Static variable in exception org.apache.kafka.common.errors.GroupCoordinatorNotAvailableException
-
Deprecated.
- inSyncReplicas() - Method in class org.apache.kafka.common.PartitionInfo
-
The subset of the replicas that are in sync, that is, caught up to the leader and ready to take over as leader if
the leader should fail
- int16() - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- INT16_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- int32() - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- INT32_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- int64() - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- INT64_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- int8() - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- INT8_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- Integer() - Static method in class org.apache.kafka.common.serialization.Serdes
-
- IntegerDeserializer - Class in org.apache.kafka.common.serialization
-
- IntegerDeserializer() - Constructor for class org.apache.kafka.common.serialization.IntegerDeserializer
-
- IntegerSerde() - Constructor for class org.apache.kafka.common.serialization.Serdes.IntegerSerde
-
- IntegerSerializer - Class in org.apache.kafka.common.serialization
-
- IntegerSerializer() - Constructor for class org.apache.kafka.common.serialization.IntegerSerializer
-
- INTERCEPTOR_CLASSES_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
interceptor.classes
- INTERCEPTOR_CLASSES_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
interceptor.classes
- INTERCEPTOR_CLASSES_DOC - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- INTERCEPTOR_CLASSES_DOC - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
- InterfaceStability - Class in org.apache.kafka.common.annotation
-
Annotation to inform users of how much to rely on a particular package, class or method not changing over time.
- InterfaceStability() - Constructor for class org.apache.kafka.common.annotation.InterfaceStability
-
- InterfaceStability.Evolving - Annotation Type in org.apache.kafka.common.annotation
-
Compatibility may be broken at minor release (i.e.
- InterfaceStability.Stable - Annotation Type in org.apache.kafka.common.annotation
-
Compatibility is maintained in major, minor and patch releases with one exception: compatibility may be broken
in a major release (i.e.
- InterfaceStability.Unstable - Annotation Type in org.apache.kafka.common.annotation
-
No guarantee is provided as to reliability or stability across any level of release granularity.
- internalConfig - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- InternalConfig() - Constructor for class org.apache.kafka.streams.StreamsConfig.InternalConfig
-
- internalTopics() - Method in class org.apache.kafka.common.Cluster
-
- internalTopologyBuilder - Variable in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
NOTE: this member is not needed by developers working with the processor APIs, but is only used for internal functionality.
- InterruptException - Exception in org.apache.kafka.common.errors
-
An unchecked wrapper for InterruptedException
- InterruptException(InterruptedException) - Constructor for exception org.apache.kafka.common.errors.InterruptException
-
- InterruptException(String, InterruptedException) - Constructor for exception org.apache.kafka.common.errors.InterruptException
-
- InterruptException(String) - Constructor for exception org.apache.kafka.common.errors.InterruptException
-
- InvalidCommitOffsetSizeException - Exception in org.apache.kafka.common.errors
-
- InvalidCommitOffsetSizeException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidCommitOffsetSizeException
-
- InvalidCommitOffsetSizeException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidCommitOffsetSizeException
-
- InvalidConfigurationException - Exception in org.apache.kafka.common.errors
-
- InvalidConfigurationException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidConfigurationException
-
- InvalidConfigurationException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidConfigurationException
-
- InvalidFetchSizeException - Exception in org.apache.kafka.common.errors
-
- InvalidFetchSizeException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidFetchSizeException
-
- InvalidFetchSizeException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidFetchSizeException
-
- InvalidGroupIdException - Exception in org.apache.kafka.common.errors
-
- InvalidGroupIdException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidGroupIdException
-
- InvalidGroupIdException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidGroupIdException
-
- InvalidMetadataException - Exception in org.apache.kafka.common.errors
-
An exception that may indicate the client's metadata is out of date
- InvalidMetadataException() - Constructor for exception org.apache.kafka.common.errors.InvalidMetadataException
-
- InvalidMetadataException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidMetadataException
-
- InvalidMetadataException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidMetadataException
-
- InvalidMetadataException(Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidMetadataException
-
- InvalidOffsetException - Exception in org.apache.kafka.clients.consumer
-
Thrown when the offset for a set of partitions is invalid (either undefined or out of range),
and no reset policy has been configured.
- InvalidOffsetException(String) - Constructor for exception org.apache.kafka.clients.consumer.InvalidOffsetException
-
- InvalidOffsetException - Exception in org.apache.kafka.common.errors
-
Thrown when the offset for a set of partitions is invalid (either undefined or out of range),
and no reset policy has been configured.
- InvalidOffsetException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidOffsetException
-
- InvalidOffsetException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidOffsetException
-
- InvalidPartitionsException - Exception in org.apache.kafka.common.errors
-
- InvalidPartitionsException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidPartitionsException
-
- InvalidPartitionsException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidPartitionsException
-
- InvalidPidMappingException - Exception in org.apache.kafka.common.errors
-
- InvalidPidMappingException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidPidMappingException
-
- InvalidReplicaAssignmentException - Exception in org.apache.kafka.common.errors
-
- InvalidReplicaAssignmentException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidReplicaAssignmentException
-
- InvalidReplicaAssignmentException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidReplicaAssignmentException
-
- InvalidReplicationFactorException - Exception in org.apache.kafka.common.errors
-
- InvalidReplicationFactorException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidReplicationFactorException
-
- InvalidReplicationFactorException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidReplicationFactorException
-
- InvalidRequestException - Exception in org.apache.kafka.common.errors
-
Thrown when a request breaks basic wire protocol rules.
- InvalidRequestException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidRequestException
-
- InvalidRequestException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidRequestException
-
- InvalidRequiredAcksException - Exception in org.apache.kafka.common.errors
-
- InvalidRequiredAcksException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidRequiredAcksException
-
- InvalidSessionTimeoutException - Exception in org.apache.kafka.common.errors
-
- InvalidSessionTimeoutException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidSessionTimeoutException
-
- InvalidSessionTimeoutException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidSessionTimeoutException
-
- InvalidStateStoreException - Exception in org.apache.kafka.streams.errors
-
Indicates that there was a problem when trying to access a StateStore, i.e., the store is no longer valid because it is closed or doesn't exist anymore due to a rebalance.
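Interactive-query code typically retries on this exception until the store becomes queryable again. A hedged sketch (store name and retry policy are illustrative; the enclosing method is assumed to tolerate InterruptedException):

    ReadOnlyKeyValueStore<String, Long> store = null;
    while (store == null) {
        try {
            store = streams.store("totals-store", QueryableStoreTypes.<String, Long>keyValueStore());
        } catch (InvalidStateStoreException retriable) {
            Thread.sleep(100);   // store is migrating or not yet initialized; back off briefly and retry
        }
    }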
- InvalidStateStoreException(String) - Constructor for exception org.apache.kafka.streams.errors.InvalidStateStoreException
-
- InvalidStateStoreException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.InvalidStateStoreException
-
- InvalidStateStoreException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.InvalidStateStoreException
-
- InvalidTimestampException - Exception in org.apache.kafka.common.errors
-
Indicates that the timestamp of a record is invalid.
- InvalidTimestampException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidTimestampException
-
- InvalidTimestampException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidTimestampException
-
- InvalidTopicException - Exception in org.apache.kafka.common.errors
-
The client has attempted to perform an operation on an invalid topic.
- InvalidTopicException() - Constructor for exception org.apache.kafka.common.errors.InvalidTopicException
-
- InvalidTopicException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidTopicException
-
- InvalidTopicException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidTopicException
-
- InvalidTopicException(Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidTopicException
-
- InvalidTxnStateException - Exception in org.apache.kafka.common.errors
-
- InvalidTxnStateException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidTxnStateException
-
- InvalidTxnTimeoutException - Exception in org.apache.kafka.common.errors
-
The transaction coordinator returns this error code if the timeout received via the InitProducerIdRequest is larger than
the `max.transaction.timeout.ms` config value.
- InvalidTxnTimeoutException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.InvalidTxnTimeoutException
-
- InvalidTxnTimeoutException(String) - Constructor for exception org.apache.kafka.common.errors.InvalidTxnTimeoutException
-
- isBootstrapConfigured() - Method in class org.apache.kafka.common.Cluster
-
- isCancelled() - Method in class org.apache.kafka.common.KafkaFuture
-
Returns true if this future was cancelled before it completed normally.
- isCompletedExceptionally() - Method in class org.apache.kafka.common.KafkaFuture
-
Returns true if this future completed exceptionally, in any way.
- isDefault() - Method in class org.apache.kafka.clients.admin.ConfigEntry
-
Return whether the config value is the default or if it's been explicitly set.
- isDone() - Method in class org.apache.kafka.common.KafkaFuture
-
Returns true if completed in any fashion: normally, exceptionally, or via cancellation.
- isEmpty() - Method in class org.apache.kafka.clients.consumer.ConsumerRecords
-
- isEmpty() - Method in class org.apache.kafka.common.Node
-
Check whether this node is empty, which may be the case if noNode() is used as a placeholder
in a response payload with an error.
- isInternal() - Method in class org.apache.kafka.clients.admin.TopicDescription
-
Whether the topic is internal to Kafka.
- isInternal() - Method in class org.apache.kafka.clients.admin.TopicListing
-
Whether the topic is internal to Kafka.
- ISOLATION_LEVEL_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
isolation.level
- ISOLATION_LEVEL_DOC - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- isOpen() - Method in interface org.apache.kafka.streams.processor.StateStore
-
Is this store open for reading and writing
- isOptional() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- isOptional() - Method in interface org.apache.kafka.connect.data.Schema
-
- isOptional() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- isPrimitive() - Method in enum org.apache.kafka.connect.data.Schema.Type
-
- isr() - Method in class org.apache.kafka.common.TopicPartitionInfo
-
Return the in-sync replicas of the partition.
- isReadOnly() - Method in class org.apache.kafka.clients.admin.ConfigEntry
-
Return whether the config is read-only and cannot be updated.
- isRunning() - Method in enum org.apache.kafka.streams.KafkaStreams.State
-
- isSensitive() - Method in class org.apache.kafka.clients.admin.ConfigEntry
-
Return whether the config value is sensitive.
- isUnknown() - Method in class org.apache.kafka.common.acl.AccessControlEntry
-
Return true if this AccessControlEntry has any UNKNOWN components.
- isUnknown() - Method in class org.apache.kafka.common.acl.AccessControlEntryFilter
-
Return true if there are any UNKNOWN components.
- isUnknown() - Method in class org.apache.kafka.common.acl.AclBinding
-
Return true if this binding has any UNKNOWN components.
- isUnknown() - Method in class org.apache.kafka.common.acl.AclBindingFilter
-
Return true if this filter has any UNKNOWN components.
- isUnknown() - Method in enum org.apache.kafka.common.acl.AclOperation
-
Return true if this operation is UNKNOWN.
- isUnknown() - Method in enum org.apache.kafka.common.acl.AclPermissionType
-
Return true if this permission type is UNKNOWN.
- isUnknown() - Method in class org.apache.kafka.common.resource.Resource
-
Return true if this Resource has any UNKNOWN components.
- isUnknown() - Method in class org.apache.kafka.common.resource.ResourceFilter
-
Return true if this ResourceFilter has any UNKNOWN components.
- isUnknown() - Method in enum org.apache.kafka.common.resource.ResourceType
-
Return whether this resource type is UNKNOWN.
- isValidTransition(KafkaStreams.State) - Method in enum org.apache.kafka.streams.KafkaStreams.State
-
- iterator() - Method in class org.apache.kafka.clients.consumer.ConsumerRecords
-
- KafkaAdminClient - Class in org.apache.kafka.clients.admin
-
- KafkaClientSupplier - Interface in org.apache.kafka.streams
-
KafkaClientSupplier
can be used to provide custom Kafka clients to a
KafkaStreams
instance.
- KafkaConsumer<K,V> - Class in org.apache.kafka.clients.consumer
-
A client that consumes records from a Kafka cluster.
- KafkaConsumer(Map<String, Object>) - Constructor for class org.apache.kafka.clients.consumer.KafkaConsumer
-
A consumer is instantiated by providing a set of key-value pairs as configuration.
- KafkaConsumer(Map<String, Object>, Deserializer<K>, Deserializer<V>) - Constructor for class org.apache.kafka.clients.consumer.KafkaConsumer
-
A consumer is instantiated by providing a set of key-value pairs as configuration, and a key and a value
Deserializer
.
- KafkaConsumer(Properties) - Constructor for class org.apache.kafka.clients.consumer.KafkaConsumer
-
A consumer is instantiated by providing a Properties
object as configuration.
- KafkaConsumer(Properties, Deserializer<K>, Deserializer<V>) - Constructor for class org.apache.kafka.clients.consumer.KafkaConsumer
-
A consumer is instantiated by providing a Properties object as configuration, and a key and a value Deserializer.
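A minimal consumption loop, assuming props holds bootstrap.servers, group.id, and deserializer settings (the topic name is illustrative):

    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
    consumer.subscribe(Collections.singletonList("my-topic"));
    try {
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(100);   // timeout in milliseconds
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    } finally {
        consumer.close();
    }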
- KafkaException - Exception in org.apache.kafka.common
-
The base class of all other Kafka exceptions
- KafkaException(String, Throwable) - Constructor for exception org.apache.kafka.common.KafkaException
-
- KafkaException(String) - Constructor for exception org.apache.kafka.common.KafkaException
-
- KafkaException(Throwable) - Constructor for exception org.apache.kafka.common.KafkaException
-
- KafkaException() - Constructor for exception org.apache.kafka.common.KafkaException
-
- KafkaFuture<T> - Class in org.apache.kafka.common
-
A flexible future which supports call chaining and other asynchronous programming patterns.
- KafkaFuture() - Constructor for class org.apache.kafka.common.KafkaFuture
-
- KafkaFuture.BiConsumer<A,B> - Class in org.apache.kafka.common
-
A consumer of two different types of object.
- KafkaFuture.Function<A,B> - Class in org.apache.kafka.common
-
A function which takes objects of type A and returns objects of type B.
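For example, a KafkaFuture.Function can transform an admin-client result without blocking. A sketch (the AdminClient instance is assumed; exception handling around get() is omitted):

    KafkaFuture<Integer> topicCount = adminClient.listTopics().names()
        .thenApply(new KafkaFuture.Function<Set<String>, Integer>() {
            @Override
            public Integer apply(Set<String> names) {
                return names.size();   // map the set of topic names to its size
            }
        });
    System.out.println("topics: " + topicCount.get());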
- kafkaOffset() - Method in class org.apache.kafka.connect.sink.SinkRecord
-
- kafkaPartition() - Method in class org.apache.kafka.connect.connector.ConnectRecord
-
- KafkaPrincipal - Class in org.apache.kafka.common.security.auth
-
Principals in Kafka are defined by a type and a name.
- KafkaPrincipal(String, String) - Constructor for class org.apache.kafka.common.security.auth.KafkaPrincipal
-
- KafkaPrincipalBuilder - Interface in org.apache.kafka.common.security.auth
-
- KafkaProducer<K,V> - Class in org.apache.kafka.clients.producer
-
A Kafka client that publishes records to the Kafka cluster.
- KafkaProducer(Map<String, Object>) - Constructor for class org.apache.kafka.clients.producer.KafkaProducer
-
A producer is instantiated by providing a set of key-value pairs as configuration.
- KafkaProducer(Map<String, Object>, Serializer<K>, Serializer<V>) - Constructor for class org.apache.kafka.clients.producer.KafkaProducer
-
A producer is instantiated by providing a set of key-value pairs as configuration, a key and a value
Serializer
.
- KafkaProducer(Properties) - Constructor for class org.apache.kafka.clients.producer.KafkaProducer
-
A producer is instantiated by providing a set of key-value pairs as configuration.
- KafkaProducer(Properties, Serializer<K>, Serializer<V>) - Constructor for class org.apache.kafka.clients.producer.KafkaProducer
-
A producer is instantiated by providing a set of key-value pairs as configuration, a key and a value Serializer.
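A minimal producer sketch, assuming props holds bootstrap.servers and serializer settings (topic and values are illustrative):

    Producer<String, String> producer = new KafkaProducer<>(props);
    try {
        for (int i = 0; i < 100; i++) {
            producer.send(new ProducerRecord<>("my-topic", Integer.toString(i), "value-" + i));
        }
    } finally {
        producer.close();   // flushes any buffered records and releases resources
    }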
- KafkaStorageException - Exception in org.apache.kafka.common.errors
-
Miscellaneous disk-related IOException occurred when handling a request.
- KafkaStorageException() - Constructor for exception org.apache.kafka.common.errors.KafkaStorageException
-
- KafkaStorageException(String) - Constructor for exception org.apache.kafka.common.errors.KafkaStorageException
-
- KafkaStorageException(Throwable) - Constructor for exception org.apache.kafka.common.errors.KafkaStorageException
-
- KafkaStorageException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.KafkaStorageException
-
- KafkaStreams - Class in org.apache.kafka.streams
-
A Kafka client that allows for performing continuous computation on input coming from one or more input topics and
sends output to zero, one, or more output topics.
- KafkaStreams(TopologyBuilder, Properties) - Constructor for class org.apache.kafka.streams.KafkaStreams
-
- KafkaStreams(TopologyBuilder, StreamsConfig) - Constructor for class org.apache.kafka.streams.KafkaStreams
-
- KafkaStreams(TopologyBuilder, StreamsConfig, KafkaClientSupplier) - Constructor for class org.apache.kafka.streams.KafkaStreams
-
- KafkaStreams(Topology, Properties) - Constructor for class org.apache.kafka.streams.KafkaStreams
-
Create a KafkaStreams
instance.
- KafkaStreams(Topology, StreamsConfig) - Constructor for class org.apache.kafka.streams.KafkaStreams
-
Create a KafkaStreams
instance.
- KafkaStreams(Topology, StreamsConfig, KafkaClientSupplier) - Constructor for class org.apache.kafka.streams.KafkaStreams
-
Create a KafkaStreams
instance.
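A sketch of wiring a Topology into a KafkaStreams instance, assuming the StreamsBuilder DSL and a props object with application.id and bootstrap.servers (topic names are illustrative):

    StreamsBuilder builder = new StreamsBuilder();
    builder.<String, String>stream("input-topic")
           .mapValues(value -> value.toUpperCase())
           .to("output-topic");

    KafkaStreams streams = new KafkaStreams(builder.build(), props);   // builder.build() returns a Topology
    streams.start();
    Runtime.getRuntime().addShutdownHook(new Thread(streams::close));  // close cleanly on shutdown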
- KafkaStreams.State - Enum in org.apache.kafka.streams
-
Kafka Streams states are the possible states that a Kafka Streams instance can be in.
- KafkaStreams.StateListener - Interface in org.apache.kafka.streams
-
- key() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
The key (or null if no key is specified)
- key() - Method in class org.apache.kafka.clients.producer.ProducerRecord
-
- key() - Method in interface org.apache.kafka.common.header.Header
-
- key() - Method in class org.apache.kafka.connect.connector.ConnectRecord
-
- key - Variable in class org.apache.kafka.streams.KeyValue
-
The key of the key-value pair.
- key() - Method in class org.apache.kafka.streams.kstream.Windowed
-
Return the key of the window.
- KEY_DESERIALIZER_CLASS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
key.deserializer
- KEY_DESERIALIZER_CLASS_DOC - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- KEY_SERDE_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
- KEY_SERIALIZER_CLASS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
key.serializer
- KEY_SERIALIZER_CLASS_DOC - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
- keyDeserializer() - Method in class org.apache.kafka.streams.state.StateSerdes
-
Return the key deserializer.
- keyFrom(byte[]) - Method in class org.apache.kafka.streams.state.StateSerdes
-
Deserialize the key from raw bytes.
- keySchema() - Method in class org.apache.kafka.connect.connector.ConnectRecord
-
- keySchema() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- keySchema() - Method in interface org.apache.kafka.connect.data.Schema
-
Get the key schema for this map schema.
- keySchema() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- keySerde - Variable in class org.apache.kafka.streams.Consumed
-
- keySerde(Serde<K>) - Static method in class org.apache.kafka.streams.kstream.Joined
-
Create an instance of
Joined
with a key
Serde
.
- keySerde() - Method in class org.apache.kafka.streams.kstream.Joined
-
- keySerde - Variable in class org.apache.kafka.streams.kstream.Materialized
-
- keySerde - Variable in class org.apache.kafka.streams.kstream.Produced
-
- keySerde(Serde<K>) - Static method in class org.apache.kafka.streams.kstream.Produced
-
Create a Produced instance with provided keySerde.
- keySerde - Variable in class org.apache.kafka.streams.kstream.Serialized
-
- keySerde() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the default key serde
- keySerde() - Method in class org.apache.kafka.streams.state.StateSerdes
-
Return the key serde.
- keySerde() - Method in class org.apache.kafka.streams.StreamsConfig
-
Deprecated.
- keySerializer() - Method in class org.apache.kafka.streams.state.StateSerdes
-
Return the key serializer.
- KeyValue<K,V> - Class in org.apache.kafka.streams
-
A key-value pair defined for a single Kafka Streams record.
- KeyValue(K, V) - Constructor for class org.apache.kafka.streams.KeyValue
-
Create a new key-value pair.
- KeyValueBytesStoreSupplier - Interface in org.apache.kafka.streams.state
-
A store supplier that can be used to create one or more
KeyValueStore
instances of type <Byte, byte[]>.
- KeyValueIterator<K,V> - Interface in org.apache.kafka.streams.state
-
- KeyValueMapper<K,V,VR> - Interface in org.apache.kafka.streams.kstream
-
The
KeyValueMapper
interface for mapping a
key-value pair
to a new value of arbitrary type.
- KeyValueStore<K,V> - Interface in org.apache.kafka.streams.state
-
A key-value store that supports put/get/delete and range queries.
- keyValueStore() - Static method in class org.apache.kafka.streams.state.QueryableStoreTypes
-
- keyValueStoreBuilder(KeyValueBytesStoreSupplier, Serde<K>, Serde<V>) - Static method in class org.apache.kafka.streams.state.Stores
-
- KGroupedStream<K,V> - Interface in org.apache.kafka.streams.kstream
-
KGroupedStream
is an abstraction of a
grouped record stream of
KeyValue
pairs.
- KGroupedTable<K,V> - Interface in org.apache.kafka.streams.kstream
-
KGroupedTable
is an abstraction of a re-grouped changelog stream from a primary-keyed table,
usually on a different grouping key than the original primary key.
- KStream<K,V> - Interface in org.apache.kafka.streams.kstream
-
KStream
is an abstraction of a
record stream of
KeyValue
pairs, i.e., each record is an
independent entity/event in the real world.
- KStreamBuilder - Class in org.apache.kafka.streams.kstream
-
- KStreamBuilder() - Constructor for class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
- KTable<K,V> - Interface in org.apache.kafka.streams.kstream
-
KTable
is an abstraction of a changelog stream from a primary-keyed table.
- main(String[]) - Static method in class org.apache.kafka.clients.admin.AdminClientConfig
-
- main(String[]) - Static method in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- main(String[]) - Static method in class org.apache.kafka.clients.producer.ProducerConfig
-
- main(String[]) - Static method in class org.apache.kafka.streams.StreamsConfig
-
- maintainMs() - Method in class org.apache.kafka.streams.kstream.JoinWindows
-
Return the window maintain duration (retention time) in milliseconds.
- maintainMs() - Method in class org.apache.kafka.streams.kstream.SessionWindows
-
Return the window maintain duration (retention time) in milliseconds.
- maintainMs() - Method in class org.apache.kafka.streams.kstream.TimeWindows
-
Return the window maintain duration (retention time) in milliseconds.
- maintainMs() - Method in class org.apache.kafka.streams.kstream.UnlimitedWindows
-
Return the window maintain duration (retention time) in milliseconds.
- maintainMs() - Method in class org.apache.kafka.streams.kstream.Windows
-
Return the window maintain duration (retention time) in milliseconds.
- map(Schema, Schema) - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- map(KeyValueMapper<? super K, ? super V, ? extends KeyValue<? extends KR, ? extends VR>>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Transform each record of the input stream into a new record in the output stream (both key and value type can be
altered arbitrarily).
- mapper - Variable in class org.apache.kafka.streams.kstream.Printed
-
- mapValues(ValueMapper<? super V, ? extends VR>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Transform the value of each input record into a new value (with possible new type) of the output record.
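For example (textLines is an assumed KStream<String, String>; map changes key and value, mapValues changes only the value and so preserves partitioning):

    KStream<String, Integer> lengths = textLines.mapValues(line -> line.length());
    KStream<Integer, String> flipped = textLines.map((key, value) -> KeyValue.pair(value.length(), key));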
- mapValues(ValueMapper<? super V, ? extends VR>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Create a new KTable by transforming the value of each record in this KTable into a new value (with possible new type) in the new KTable.
- mapValues(ValueMapper<? super V, ? extends VR>, Materialized<K, VR, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Create a new KTable by transforming the value of each record in this KTable into a new value (with possible new type) in the new KTable.
- mapValues(ValueMapper<? super V, ? extends VR>, Serde<VR>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- mapValues(ValueMapper<? super V, ? extends VR>, Serde<VR>, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- matches(AccessControlEntry) - Method in class org.apache.kafka.common.acl.AccessControlEntryFilter
-
Returns true if this filter matches the given AccessControlEntry.
- matches(AclBinding) - Method in class org.apache.kafka.common.acl.AclBindingFilter
-
Return true if the resource filter matches the binding's resource and the entry filter matches binding's entry.
- matches(Resource) - Method in class org.apache.kafka.common.resource.ResourceFilter
-
Return true if this filter matches the given Resource.
- matchesAtMostOne() - Method in class org.apache.kafka.common.acl.AccessControlEntryFilter
-
Returns true if this filter could only match one ACE -- in other words, if
there are no ANY or UNKNOWN fields.
- matchesAtMostOne() - Method in class org.apache.kafka.common.acl.AclBindingFilter
-
Return true if the resource and entry filters can only match one ACE.
- matchesAtMostOne() - Method in class org.apache.kafka.common.resource.ResourceFilter
-
Return true if this filter could only match one ACE.
- Materialized<K,V,S extends StateStore> - Class in org.apache.kafka.streams.kstream
-
Used to describe how a
StateStore
should be materialized.
- Materialized(Materialized<K, V, S>) - Constructor for class org.apache.kafka.streams.kstream.Materialized
-
Copy constructor.
- MAX_BLOCK_MS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
max.block.ms
- MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
max.in.flight.requests.per.connection
- MAX_MESSAGE_BYTES_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MAX_MESSAGE_BYTES_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MAX_PARTITION_FETCH_BYTES_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
max.partition.fetch.bytes
- MAX_POLL_INTERVAL_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
max.poll.interval.ms
- MAX_POLL_RECORDS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
max.poll.records
- MAX_REQUEST_SIZE_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
max.request.size
- maxEntries(int) - Method in interface org.apache.kafka.streams.state.Stores.InMemoryKeyValueFactory
-
Deprecated.
Limits the in-memory key-value store to hold a maximum number of entries.
- maxNumPartitions(Cluster, Set<String>) - Method in class org.apache.kafka.streams.processor.DefaultPartitionGrouper
-
- merge(KStream<K, V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Merge this stream and the given stream into one larger stream.
- merge(KStream<K, V>...) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
- Merger<K,V> - Interface in org.apache.kafka.streams.kstream
-
The interface for merging aggregate values for
SessionWindows
with the given key.
- MESSAGE_FORMAT_VERSION_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MESSAGE_FORMAT_VERSION_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MESSAGE_TIMESTAMP_DIFFERENCE_MAX_MS_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MESSAGE_TIMESTAMP_DIFFERENCE_MAX_MS_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MESSAGE_TIMESTAMP_TYPE_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MESSAGE_TIMESTAMP_TYPE_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- metadata() - Method in class org.apache.kafka.clients.consumer.OffsetAndMetadata
-
- METADATA_MAX_AGE_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- METADATA_MAX_AGE_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
metadata.max.age.ms
- METADATA_MAX_AGE_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
metadata.max.age.ms
- METADATA_MAX_AGE_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
metadata.max.age.ms
- metadataForKey(String, K, Serializer<K>) - Method in class org.apache.kafka.streams.KafkaStreams
-
Find the currently running KafkaStreams instance (potentially remotely) that uses the same application ID as this instance (i.e., all instances that belong to the same Kafka Streams application), that contains a StateStore with the given storeName, and whose StateStore contains the given key, and return the StreamsMetadata for it.
- metadataForKey(String, K, StreamPartitioner<? super K, ?>) - Method in class org.apache.kafka.streams.KafkaStreams
-
Find the currently running KafkaStreams instance (potentially remotely) that uses the same application ID as this instance (i.e., all instances that belong to the same Kafka Streams application), that contains a StateStore with the given storeName, and whose StateStore contains the given key, and return the StreamsMetadata for it.
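A sketch of using this for interactive queries, assuming a running streams instance and a store named "totals-store" (store name and key are illustrative):

    StreamsMetadata metadata = streams.metadataForKey("totals-store", "alice", Serdes.String().serializer());
    if (metadata != null && !metadata.equals(StreamsMetadata.NOT_AVAILABLE)) {
        // The instance that owns this key; a caller would typically issue an RPC to that host.
        System.out.println("key lives on " + metadata.host() + ":" + metadata.port());
    }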
- Metric - Interface in org.apache.kafka.common
-
A metric tracked for monitoring purposes.
- METRIC_REPORTER_CLASSES_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- METRIC_REPORTER_CLASSES_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
metric.reporters
- METRIC_REPORTER_CLASSES_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
metric.reporters
- METRIC_REPORTER_CLASSES_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
metric.reporters
- metricName() - Method in interface org.apache.kafka.common.Metric
-
A name for this metric
- MetricName - Class in org.apache.kafka.common
-
The MetricName
class encapsulates a metric's name, logical group and its related attributes.
- MetricName(String, String, String, Map<String, String>) - Constructor for class org.apache.kafka.common.MetricName
-
Please create MetricName via the method Metrics.metricName(String, String, String, Map)
- MetricNameTemplate - Class in org.apache.kafka.common
-
A template for a MetricName.
- MetricNameTemplate(String, String, String, Set<String>) - Constructor for class org.apache.kafka.common.MetricNameTemplate
-
Create a new template.
- MetricNameTemplate(String, String, String, String...) - Constructor for class org.apache.kafka.common.MetricNameTemplate
-
Create a new template.
- metrics() - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- metrics() - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Get the metrics kept by the consumer
- metrics() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- metrics() - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Get the full set of internal metrics maintained by the producer.
- metrics() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- metrics() - Method in interface org.apache.kafka.clients.producer.Producer
-
- metrics() - Method in class org.apache.kafka.streams.KafkaStreams
-
Get read-only handle on global metrics registry.
- metrics() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns Metrics instance
- metrics() - Method in interface org.apache.kafka.streams.StreamsMetrics
-
Get read-only handle on global metrics registry.
- METRICS_NUM_SAMPLES_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- METRICS_NUM_SAMPLES_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
metrics.num.samples
- METRICS_NUM_SAMPLES_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
metrics.num.samples
- METRICS_NUM_SAMPLES_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
metrics.num.samples
- METRICS_RECORDING_LEVEL_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- METRICS_RECORDING_LEVEL_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
metrics.recording.level
- METRICS_RECORDING_LEVEL_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
metrics.recording.level
- METRICS_RECORDING_LEVEL_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
metrics.recording.level
- METRICS_SAMPLE_WINDOW_MS_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- METRICS_SAMPLE_WINDOW_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
metrics.sample.window.ms
- METRICS_SAMPLE_WINDOW_MS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
metrics.sample.window.ms
- METRICS_SAMPLE_WINDOW_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
metrics.sample.window.ms
- metricsScope() - Method in interface org.apache.kafka.streams.state.StoreSupplier
-
Return a String that is used as the scope for metrics recorded by Metered stores.
- metricValue() - Method in interface org.apache.kafka.common.Metric
-
The value of the metric, which may be measurable or a non-measurable gauge
- MIN_CLEANABLE_DIRTY_RATIO_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MIN_CLEANABLE_DIRTY_RATIO_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MIN_COMPACTION_LAG_MS_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MIN_COMPACTION_LAG_MS_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MIN_IN_SYNC_REPLICAS_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MIN_IN_SYNC_REPLICAS_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- MockConsumer<K,V> - Class in org.apache.kafka.clients.consumer
-
A mock of the
Consumer
interface you can use for testing code that uses Kafka.
- MockConsumer(OffsetResetStrategy) - Constructor for class org.apache.kafka.clients.consumer.MockConsumer
-
- MockProducer<K,V> - Class in org.apache.kafka.clients.producer
-
A mock of the producer interface you can use for testing code that uses Kafka.
- MockProducer(Cluster, boolean, Partitioner, Serializer<K>, Serializer<V>) - Constructor for class org.apache.kafka.clients.producer.MockProducer
-
Create a mock producer
- MockProducer(boolean, Serializer<K>, Serializer<V>) - Constructor for class org.apache.kafka.clients.producer.MockProducer
-
Create a new mock producer with invented metadata, the given autoComplete setting, and key/value serializers.
- MockProducer(boolean, Partitioner, Serializer<K>, Serializer<V>) - Constructor for class org.apache.kafka.clients.producer.MockProducer
-
Create a new mock producer with invented metadata, the given autoComplete setting, partitioner, and key/value serializers.
- MockProducer() - Constructor for class org.apache.kafka.clients.producer.MockProducer
-
Create a new mock producer with invented metadata.
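A unit-test sketch (the code under test, topic, and assertion are illustrative):

    MockProducer<String, String> producer =
        new MockProducer<>(true, new StringSerializer(), new StringSerializer());   // autoComplete = true

    producer.send(new ProducerRecord<>("audit-topic", "key", "value"));             // the code under test would do this

    List<ProducerRecord<String, String>> sent = producer.history();                 // everything "sent" so far
    assert sent.size() == 1;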
- name() - Method in class org.apache.kafka.clients.admin.ConfigEntry
-
Return the config name.
- name() - Method in class org.apache.kafka.clients.admin.NewTopic
-
The name of the topic to be created.
- name() - Method in class org.apache.kafka.clients.admin.TopicDescription
-
The name of the topic.
- name() - Method in class org.apache.kafka.clients.admin.TopicListing
-
The name of the topic.
- name() - Method in class org.apache.kafka.clients.consumer.RangeAssignor
-
- name() - Method in class org.apache.kafka.clients.consumer.RoundRobinAssignor
-
- name() - Method in class org.apache.kafka.clients.consumer.StickyAssignor
-
- name - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- name() - Method in class org.apache.kafka.common.config.ConfigResource
-
Return the resource name.
- name() - Method in class org.apache.kafka.common.config.ConfigValue
-
- name() - Method in class org.apache.kafka.common.MetricName
-
- name() - Method in class org.apache.kafka.common.MetricNameTemplate
-
Get the name of the metric.
- name() - Method in class org.apache.kafka.common.resource.Resource
-
Return the resource name.
- name() - Method in class org.apache.kafka.common.resource.ResourceFilter
-
Return the resource name or null.
- name - Variable in enum org.apache.kafka.common.security.auth.SecurityProtocol
-
Name of the security protocol.
- name() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- name() - Method in class org.apache.kafka.connect.data.Field
-
Get the name of this field.
- name() - Method in interface org.apache.kafka.connect.data.Schema
-
- name() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- name(String) - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Set the name of this schema.
- name - Variable in enum org.apache.kafka.streams.errors.DeserializationExceptionHandler.DeserializationHandlerResponse
-
An English description of the API; this is for debugging and can change.
- name() - Method in interface org.apache.kafka.streams.processor.StateStore
-
The name of this store.
- name() - Method in interface org.apache.kafka.streams.processor.StateStoreSupplier
-
Deprecated.
Return the name of this state store supplier.
- name() - Method in interface org.apache.kafka.streams.state.StoreBuilder
-
Return the name of this state store builder.
- name() - Method in interface org.apache.kafka.streams.state.StoreSupplier
-
Return the name of this state store supplier.
- name() - Method in interface org.apache.kafka.streams.TopologyDescription.Node
-
The name of the node.
- names() - Method in class org.apache.kafka.clients.admin.ListTopicsResult
-
Return a future which yields a collection of topic names.
- names() - Method in class org.apache.kafka.common.config.ConfigDef
-
Returns an unmodifiable set of the property names defined in this ConfigDef.
- names() - Static method in enum org.apache.kafka.common.security.auth.SecurityProtocol
-
- namesToListings() - Method in class org.apache.kafka.clients.admin.ListTopicsResult
-
Return a future which yields a map of topic names to TopicListing objects.
- NETWORK_THREAD_PREFIX - Static variable in class org.apache.kafka.clients.producer.KafkaProducer
-
- NetworkException - Exception in org.apache.kafka.common.errors
-
A misc. network-related IOException occurred when reading or writing to the network.
- NetworkException() - Constructor for exception org.apache.kafka.common.errors.NetworkException
-
- NetworkException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.NetworkException
-
- NetworkException(String) - Constructor for exception org.apache.kafka.common.errors.NetworkException
-
- NetworkException(Throwable) - Constructor for exception org.apache.kafka.common.errors.NetworkException
-
- newName(String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
This function is for internal usage only and should not be called.
- NewPartitions - Class in org.apache.kafka.clients.admin
-
- newRecord(String, Integer, Schema, Object, Schema, Object, Long) - Method in class org.apache.kafka.connect.connector.ConnectRecord
-
Generate a new record of the same type as itself, with the specified parameter values.
- newRecord(String, Integer, Schema, Object, Schema, Object, Long) - Method in class org.apache.kafka.connect.sink.SinkRecord
-
- newRecord(String, Integer, Schema, Object, Schema, Object, Long) - Method in class org.apache.kafka.connect.source.SourceRecord
-
- newStoreName(String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
This function is for internal usage only and should not be called.
- NewTopic - Class in org.apache.kafka.clients.admin
-
- NewTopic(String, int, short) - Constructor for class org.apache.kafka.clients.admin.NewTopic
-
A new topic with the specified replication factor and number of partitions.
- NewTopic(String, Map<Integer, List<Integer>>) - Constructor for class org.apache.kafka.clients.admin.NewTopic
-
A new topic with the specified replica assignment configuration.
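A sketch of creating a topic with the admin client (names, counts, and props are illustrative; exception handling around get() is omitted):

    try (AdminClient admin = AdminClient.create(props)) {
        NewTopic topic = new NewTopic("orders", 6, (short) 3);          // name, partitions, replication factor
        admin.createTopics(Collections.singleton(topic)).all().get();   // block until creation completes
    }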
- NO_DEFAULT_VALUE - Static variable in class org.apache.kafka.common.config.ConfigDef
-
A unique Java object which represents the lack of a default value.
- NO_TIMESTAMP - Static variable in class org.apache.kafka.clients.consumer.ConsumerRecord
-
- Node - Class in org.apache.kafka.common
-
Information about a Kafka node
- Node(int, String, int) - Constructor for class org.apache.kafka.common.Node
-
- Node(int, String, int, String) - Constructor for class org.apache.kafka.common.Node
-
- nodeById(int) - Method in class org.apache.kafka.common.Cluster
-
Get the node by the node id (or null if no such node exists)
- nodeGroups() - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Returns the map of node groups keyed by the topic group id.
- nodes() - Method in class org.apache.kafka.clients.admin.DescribeClusterResult
-
Returns a future which yields a collection of nodes.
- nodes() - Method in class org.apache.kafka.common.Cluster
-
- nodes() - Method in interface org.apache.kafka.streams.TopologyDescription.Subtopology
-
All nodes of this sub-topology.
- NonEmptyString() - Constructor for class org.apache.kafka.common.config.ConfigDef.NonEmptyString
-
- noNode() - Static method in class org.apache.kafka.common.Node
-
- NoOffsetForPartitionException - Exception in org.apache.kafka.clients.consumer
-
Indicates that there is no stored offset for a partition and no defined offset
reset policy.
- NoOffsetForPartitionException(TopicPartition) - Constructor for exception org.apache.kafka.clients.consumer.NoOffsetForPartitionException
-
- NoOffsetForPartitionException(Collection<TopicPartition>) - Constructor for exception org.apache.kafka.clients.consumer.NoOffsetForPartitionException
-
- NOT_AVAILABLE - Static variable in class org.apache.kafka.streams.state.StreamsMetadata
-
Sentinel to indicate that the StreamsMetadata is currently unavailable.
- NotControllerException - Exception in org.apache.kafka.common.errors
-
- NotControllerException(String) - Constructor for exception org.apache.kafka.common.errors.NotControllerException
-
- NotControllerException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.NotControllerException
-
- NotCoordinatorException - Exception in org.apache.kafka.common.errors
-
In the context of the group coordinator, the broker returns this error code if it receives an offset fetch
or commit request for a group it's not the coordinator of.
- NotCoordinatorException(String) - Constructor for exception org.apache.kafka.common.errors.NotCoordinatorException
-
- NotCoordinatorException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.NotCoordinatorException
-
- NotCoordinatorForGroupException - Exception in org.apache.kafka.common.errors
-
- NotCoordinatorForGroupException() - Constructor for exception org.apache.kafka.common.errors.NotCoordinatorForGroupException
-
Deprecated.
- NotCoordinatorForGroupException(String) - Constructor for exception org.apache.kafka.common.errors.NotCoordinatorForGroupException
-
Deprecated.
- NotCoordinatorForGroupException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.NotCoordinatorForGroupException
-
Deprecated.
- NotCoordinatorForGroupException(Throwable) - Constructor for exception org.apache.kafka.common.errors.NotCoordinatorForGroupException
-
Deprecated.
- NotEnoughReplicasAfterAppendException - Exception in org.apache.kafka.common.errors
-
Number of insync replicas for the partition is lower than min.insync.replicas. This exception is raised when the low ISR size is discovered *after* the message was already appended to the log.
- NotEnoughReplicasAfterAppendException(String) - Constructor for exception org.apache.kafka.common.errors.NotEnoughReplicasAfterAppendException
-
- NotEnoughReplicasException - Exception in org.apache.kafka.common.errors
-
Number of insync replicas for the partition is lower than min.insync.replicas
- NotEnoughReplicasException() - Constructor for exception org.apache.kafka.common.errors.NotEnoughReplicasException
-
- NotEnoughReplicasException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.NotEnoughReplicasException
-
- NotEnoughReplicasException(String) - Constructor for exception org.apache.kafka.common.errors.NotEnoughReplicasException
-
- NotEnoughReplicasException(Throwable) - Constructor for exception org.apache.kafka.common.errors.NotEnoughReplicasException
-
- NotFoundException - Exception in org.apache.kafka.connect.errors
-
Indicates that an operation attempted to modify or delete a connector or task that is not present on the worker.
- NotFoundException(String) - Constructor for exception org.apache.kafka.connect.errors.NotFoundException
-
- NotFoundException(String, Throwable) - Constructor for exception org.apache.kafka.connect.errors.NotFoundException
-
- NotFoundException(Throwable) - Constructor for exception org.apache.kafka.connect.errors.NotFoundException
-
- NotLeaderForPartitionException - Exception in org.apache.kafka.common.errors
-
This server is not the leader for the given partition
- NotLeaderForPartitionException() - Constructor for exception org.apache.kafka.common.errors.NotLeaderForPartitionException
-
- NotLeaderForPartitionException(String) - Constructor for exception org.apache.kafka.common.errors.NotLeaderForPartitionException
-
- NotLeaderForPartitionException(Throwable) - Constructor for exception org.apache.kafka.common.errors.NotLeaderForPartitionException
-
- NotLeaderForPartitionException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.NotLeaderForPartitionException
-
- NULL - Static variable in class org.apache.kafka.connect.data.SchemaAndValue
-
- NULL_CHECKSUM - Static variable in class org.apache.kafka.clients.consumer.ConsumerRecord
-
- NULL_SIZE - Static variable in class org.apache.kafka.clients.consumer.ConsumerRecord
-
- NUM_STANDBY_REPLICAS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
num.standby.replicas
- NUM_STREAM_THREADS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
num.stream.threads
- numPartitions() - Method in class org.apache.kafka.clients.admin.NewTopic
-
The number of partitions for the new topic or -1 if a replica assignment has been specified.
- numPartitions() - Method in class org.apache.kafka.server.policy.CreateTopicPolicy.RequestMetadata
-
Return the number of partitions to create or null if replicaAssignments is not null.
- of(long) - Static method in class org.apache.kafka.streams.kstream.JoinWindows
-
Specifies that records of the same key are joinable if their timestamps are within timeDifferenceMs, i.e., the timestamp of a record from the secondary stream is at most timeDifferenceMs earlier or later than the timestamp of the record from the primary stream.
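For example, a windowed stream-stream join (the input streams, joiner, and window size are illustrative):

    // Join orders with payments that arrive within 5 minutes of each other (per key).
    KStream<String, String> joined = orders.join(
        payments,
        (orderValue, paymentValue) -> orderValue + "/" + paymentValue,   // ValueJoiner
        JoinWindows.of(TimeUnit.MINUTES.toMillis(5)));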
- of(long) - Static method in class org.apache.kafka.streams.kstream.TimeWindows
-
Return a window definition with the given window size, and with the advance interval being equal to the window
size.
- of() - Static method in class org.apache.kafka.streams.kstream.UnlimitedWindows
-
Return an unlimited window starting at timestamp zero.
- offlineReplicas() - Method in class org.apache.kafka.common.PartitionInfo
-
The subset of the replicas that are offline
- offset() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
The position of this record in the corresponding Kafka partition.
- offset() - Method in class org.apache.kafka.clients.consumer.OffsetAndMetadata
-
- offset() - Method in class org.apache.kafka.clients.consumer.OffsetAndTimestamp
-
- offset() - Method in class org.apache.kafka.clients.producer.RecordMetadata
-
The offset of the record in the topic/partition.
- offset(Map<TopicPartition, Long>) - Method in interface org.apache.kafka.connect.sink.SinkTaskContext
-
Reset the consumer offsets for the given topic partitions.
- offset(TopicPartition, long) - Method in interface org.apache.kafka.connect.sink.SinkTaskContext
-
Reset the consumer offsets for the given topic partition.
- offset(Map<String, T>) - Method in interface org.apache.kafka.connect.storage.OffsetStorageReader
-
Get the offset for the specified partition.
- offset() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the offset of the current input record; could be -1 if it is not
available (for example, if this method is invoked from the punctuate call)
- OffsetAndMetadata - Class in org.apache.kafka.clients.consumer
-
The Kafka offset commit API allows users to provide additional metadata (in the form of a string)
when an offset is committed.
- OffsetAndMetadata(long, String) - Constructor for class org.apache.kafka.clients.consumer.OffsetAndMetadata
-
Construct a new OffsetAndMetadata object for committing through
KafkaConsumer
.
- OffsetAndMetadata(long) - Constructor for class org.apache.kafka.clients.consumer.OffsetAndMetadata
-
Construct a new OffsetAndMetadata object for committing through
KafkaConsumer
.
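A sketch of committing a processed position with attached metadata (partition, offset source, and metadata string are illustrative):

    TopicPartition partition = new TopicPartition("my-topic", 0);
    long lastProcessedOffset = record.offset();   // offset of the last record handled (assumed)
    consumer.commitSync(Collections.singletonMap(
        partition,
        new OffsetAndMetadata(lastProcessedOffset + 1, "processed by worker-7")));  // commit the NEXT offset to read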
- OffsetAndTimestamp - Class in org.apache.kafka.clients.consumer
-
A container class for offset and timestamp.
- OffsetAndTimestamp(long, long) - Constructor for class org.apache.kafka.clients.consumer.OffsetAndTimestamp
-
- OffsetCommitCallback - Interface in org.apache.kafka.clients.consumer
-
A callback interface that the user can implement to trigger custom actions when a commit request completes.
- OffsetMetadataTooLarge - Exception in org.apache.kafka.common.errors
-
The client has tried to save its offset with associated metadata larger than the maximum size allowed by the server.
- OffsetMetadataTooLarge() - Constructor for exception org.apache.kafka.common.errors.OffsetMetadataTooLarge
-
- OffsetMetadataTooLarge(String) - Constructor for exception org.apache.kafka.common.errors.OffsetMetadataTooLarge
-
- OffsetMetadataTooLarge(Throwable) - Constructor for exception org.apache.kafka.common.errors.OffsetMetadataTooLarge
-
- OffsetMetadataTooLarge(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.OffsetMetadataTooLarge
-
- OffsetOutOfRangeException - Exception in org.apache.kafka.clients.consumer
-
No reset policy has been defined, and the offsets for these partitions are either larger or smaller
than the range of offsets the server has for the given partition.
- OffsetOutOfRangeException(Map<TopicPartition, Long>) - Constructor for exception org.apache.kafka.clients.consumer.OffsetOutOfRangeException
-
- OffsetOutOfRangeException - Exception in org.apache.kafka.common.errors
-
No reset policy has been defined, and the offsets for these partitions are either larger or smaller
than the range of offsets the server has for the given partition.
- OffsetOutOfRangeException(String) - Constructor for exception org.apache.kafka.common.errors.OffsetOutOfRangeException
-
- OffsetOutOfRangeException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.OffsetOutOfRangeException
-
- offsetOutOfRangePartitions() - Method in exception org.apache.kafka.clients.consumer.OffsetOutOfRangeException
-
- OffsetResetStrategy - Enum in org.apache.kafka.clients.consumer
-
- offsets(Collection<Map<String, T>>) - Method in interface org.apache.kafka.connect.storage.OffsetStorageReader
-
Get a set of offsets for the specified partition identifiers.
- offsetsForTimes(Map<TopicPartition, Long>) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- offsetsForTimes(Map<TopicPartition, Long>) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Look up the offsets for the given partitions by timestamp.
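A sketch of rewinding an already-assigned consumer to the offsets in effect one hour ago (the time window is illustrative):

    long oneHourAgo = System.currentTimeMillis() - TimeUnit.HOURS.toMillis(1);
    Map<TopicPartition, Long> query = new HashMap<>();
    for (TopicPartition tp : consumer.assignment()) {
        query.put(tp, oneHourAgo);
    }
    Map<TopicPartition, OffsetAndTimestamp> result = consumer.offsetsForTimes(query);
    for (Map.Entry<TopicPartition, OffsetAndTimestamp> entry : result.entrySet()) {
        if (entry.getValue() != null) {                       // null when no offset exists for that timestamp
            consumer.seek(entry.getKey(), entry.getValue().offset());
        }
    }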
- offsetsForTimes(Map<TopicPartition, Long>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- offsetStorageReader() - Method in interface org.apache.kafka.connect.source.SourceTaskContext
-
Get the OffsetStorageReader for this SourceTask.
- OffsetStorageReader - Interface in org.apache.kafka.connect.storage
-
OffsetStorageReader provides access to the offset storage used by sources.
- onAcknowledgement(RecordMetadata, Exception) - Method in interface org.apache.kafka.clients.producer.ProducerInterceptor
-
This method is called when the record sent to the server has been acknowledged, or when sending the record fails before
it gets sent to the server.
- onAssignment(PartitionAssignor.Assignment) - Method in class org.apache.kafka.clients.consumer.StickyAssignor
-
- onBatchRestored(TopicPartition, String, long, long) - Method in class org.apache.kafka.streams.processor.AbstractNotifyingBatchingRestoreCallback
-
- onBatchRestored(TopicPartition, String, long, long) - Method in class org.apache.kafka.streams.processor.AbstractNotifyingRestoreCallback
-
- onBatchRestored(TopicPartition, String, long, long) - Method in interface org.apache.kafka.streams.processor.StateRestoreListener
-
Method called after restoring a batch of records.
- onChange(KafkaStreams.State, KafkaStreams.State) - Method in interface org.apache.kafka.streams.KafkaStreams.StateListener
-
Called when state changes.
- onCommit(Map<TopicPartition, OffsetAndMetadata>) - Method in interface org.apache.kafka.clients.consumer.ConsumerInterceptor
-
This is called when offsets get committed.
- onComplete(Map<TopicPartition, OffsetAndMetadata>, Exception) - Method in interface org.apache.kafka.clients.consumer.OffsetCommitCallback
-
A callback method the user can implement to provide asynchronous handling of commit request completion.
- onCompletion(RecordMetadata, Exception) - Method in interface org.apache.kafka.clients.producer.Callback
-
A callback method the user can implement to provide asynchronous handling of request completion.
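For example (topic and record are illustrative; a real application would log or retry instead of printing):

    producer.send(new ProducerRecord<>("my-topic", "key", "value"), new Callback() {
        @Override
        public void onCompletion(RecordMetadata metadata, Exception exception) {
            if (exception != null) {
                exception.printStackTrace();   // the send failed
            } else {
                System.out.println("stored at offset " + metadata.offset());
            }
        }
    });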
- onConsume(ConsumerRecords<K, V>) - Method in interface org.apache.kafka.clients.consumer.ConsumerInterceptor
-
- onInvalidTimestamp(ConsumerRecord<Object, Object>, long, long) - Method in class org.apache.kafka.streams.processor.FailOnInvalidTimestamp
-
Raises an exception on every call.
- onInvalidTimestamp(ConsumerRecord<Object, Object>, long, long) - Method in class org.apache.kafka.streams.processor.LogAndSkipOnInvalidTimestamp
-
Writes a log WARN message when the extracted timestamp is invalid (negative) but returns the invalid timestamp as-is,
which ultimately causes the record to be skipped and not to be processed.
- onInvalidTimestamp(ConsumerRecord<Object, Object>, long, long) - Method in class org.apache.kafka.streams.processor.UsePreviousTimeOnInvalidTimestamp
-
Returns the current stream-time as new timestamp for the record.
- onPartitionsAssigned(Collection<TopicPartition>) - Method in interface org.apache.kafka.clients.consumer.ConsumerRebalanceListener
-
A callback method the user can implement to provide handling of customized offsets on completion of a successful
partition re-assignment.
- onPartitionsAssigned(Collection<TopicPartition>) - Method in class org.apache.kafka.connect.sink.SinkTask
-
- onPartitionsRevoked(Collection<TopicPartition>) - Method in interface org.apache.kafka.clients.consumer.ConsumerRebalanceListener
-
A callback method the user can implement to provide handling of offset commits to a customized store on the start
of a rebalance operation.
- onPartitionsRevoked(Collection<TopicPartition>) - Method in class org.apache.kafka.connect.sink.SinkTask
-
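A hedged sketch of the ConsumerRebalanceListener entries above: a listener that saves each partition's position when it is revoked and seeks back to it when the partition is reassigned. The in-memory map stands in for a real external offset store and is an assumption of the example.

```java
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
import org.apache.kafka.common.TopicPartition;

// Stores offsets in an in-memory map; a production listener would use a durable store.
public class CustomOffsetRebalanceListener implements ConsumerRebalanceListener {

    private final Consumer<?, ?> consumer;
    private final Map<TopicPartition, Long> offsetStore = new HashMap<>();

    public CustomOffsetRebalanceListener(Consumer<?, ?> consumer) {
        this.consumer = consumer;
    }

    @Override
    public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
        // Save the position of each partition before it is taken away.
        for (TopicPartition tp : partitions) {
            offsetStore.put(tp, consumer.position(tp));
        }
    }

    @Override
    public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
        // Resume from the previously saved position, if one exists.
        for (TopicPartition tp : partitions) {
            Long offset = offsetStore.get(tp);
            if (offset != null) {
                consumer.seek(tp, offset);
            }
        }
    }
}
```

Such a listener is typically passed to the consumer's subscribe call alongside the topic list.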
- onRestoreEnd(TopicPartition, String, long) - Method in class org.apache.kafka.streams.processor.AbstractNotifyingBatchingRestoreCallback
-
- onRestoreEnd(TopicPartition, String, long) - Method in class org.apache.kafka.streams.processor.AbstractNotifyingRestoreCallback
-
- onRestoreEnd(TopicPartition, String, long) - Method in interface org.apache.kafka.streams.processor.StateRestoreListener
-
Method called when restoring the StateStore is complete.
- onRestoreStart(TopicPartition, String, long, long) - Method in class org.apache.kafka.streams.processor.AbstractNotifyingBatchingRestoreCallback
-
- onRestoreStart(TopicPartition, String, long, long) - Method in class org.apache.kafka.streams.processor.AbstractNotifyingRestoreCallback
-
- onRestoreStart(TopicPartition, String, long, long) - Method in interface org.apache.kafka.streams.processor.StateRestoreListener
-
Method called at the very beginning of StateStore restoration.
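Tying the onRestoreStart/onBatchRestored/onRestoreEnd entries together, a minimal StateRestoreListener sketch that simply logs restoration progress; the log format is illustrative only.

```java
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.streams.processor.StateRestoreListener;

// Logs state store restoration progress for a Kafka Streams application.
public class LoggingRestoreListener implements StateRestoreListener {

    @Override
    public void onRestoreStart(TopicPartition topicPartition, String storeName,
                               long startingOffset, long endingOffset) {
        System.out.printf("Restoring %s from %s: offsets %d..%d%n",
                storeName, topicPartition, startingOffset, endingOffset);
    }

    @Override
    public void onBatchRestored(TopicPartition topicPartition, String storeName,
                                long batchEndOffset, long numRestored) {
        System.out.printf("Restored a batch of %d records into %s%n", numRestored, storeName);
    }

    @Override
    public void onRestoreEnd(TopicPartition topicPartition, String storeName,
                             long totalRestored) {
        System.out.printf("Finished restoring %s (%d records)%n", storeName, totalRestored);
    }
}
```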
- onSend(ProducerRecord<K, V>) - Method in interface org.apache.kafka.clients.producer.ProducerInterceptor
-
- onUpdate(ClusterResource) - Method in interface org.apache.kafka.common.ClusterResourceListener
-
A callback method that a user can implement to get updates for ClusterResource.
- open(Collection<TopicPartition>) - Method in class org.apache.kafka.connect.sink.SinkTask
-
The SinkTask uses this method to create writers for newly assigned partitions in case of a partition rebalance.
- operation() - Method in class org.apache.kafka.common.acl.AccessControlEntry
-
Return the AclOperation.
- operation() - Method in class org.apache.kafka.common.acl.AccessControlEntryFilter
-
Return the AclOperation.
- OperationNotAttemptedException - Exception in org.apache.kafka.common.errors
-
Indicates that the broker did not attempt to execute this operation.
- OperationNotAttemptedException(String) - Constructor for exception org.apache.kafka.common.errors.OperationNotAttemptedException
-
- optional() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Set this schema as optional.
- OPTIONAL_BOOLEAN_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- OPTIONAL_BYTES_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- OPTIONAL_FLOAT32_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- OPTIONAL_FLOAT64_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- OPTIONAL_INT16_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- OPTIONAL_INT32_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- OPTIONAL_INT64_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- OPTIONAL_INT8_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- OPTIONAL_STRING_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- orderInGroup - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- org.apache.kafka.clients.admin - package org.apache.kafka.clients.admin
-
- org.apache.kafka.clients.consumer - package org.apache.kafka.clients.consumer
-
- org.apache.kafka.clients.producer - package org.apache.kafka.clients.producer
-
- org.apache.kafka.common - package org.apache.kafka.common
-
- org.apache.kafka.common.acl - package org.apache.kafka.common.acl
-
- org.apache.kafka.common.annotation - package org.apache.kafka.common.annotation
-
- org.apache.kafka.common.config - package org.apache.kafka.common.config
-
- org.apache.kafka.common.errors - package org.apache.kafka.common.errors
-
- org.apache.kafka.common.header - package org.apache.kafka.common.header
-
- org.apache.kafka.common.resource - package org.apache.kafka.common.resource
-
- org.apache.kafka.common.security.auth - package org.apache.kafka.common.security.auth
-
- org.apache.kafka.common.serialization - package org.apache.kafka.common.serialization
-
- org.apache.kafka.connect.connector - package org.apache.kafka.connect.connector
-
- org.apache.kafka.connect.data - package org.apache.kafka.connect.data
-
- org.apache.kafka.connect.errors - package org.apache.kafka.connect.errors
-
- org.apache.kafka.connect.sink - package org.apache.kafka.connect.sink
-
- org.apache.kafka.connect.source - package org.apache.kafka.connect.source
-
- org.apache.kafka.connect.storage - package org.apache.kafka.connect.storage
-
- org.apache.kafka.connect.transforms - package org.apache.kafka.connect.transforms
-
- org.apache.kafka.connect.util - package org.apache.kafka.connect.util
-
- org.apache.kafka.server.policy - package org.apache.kafka.server.policy
-
- org.apache.kafka.streams - package org.apache.kafka.streams
-
- org.apache.kafka.streams.errors - package org.apache.kafka.streams.errors
-
- org.apache.kafka.streams.kstream - package org.apache.kafka.streams.kstream
-
- org.apache.kafka.streams.processor - package org.apache.kafka.streams.processor
-
- org.apache.kafka.streams.state - package org.apache.kafka.streams.state
-
- originals() - Method in class org.apache.kafka.common.config.AbstractConfig
-
- originalsStrings() - Method in class org.apache.kafka.common.config.AbstractConfig
-
Get all the original settings, ensuring that all values are of type String.
- originalsWithPrefix(String) - Method in class org.apache.kafka.common.config.AbstractConfig
-
Gets all original settings with the given prefix, stripping the prefix before adding it to the output.
- originalsWithPrefix(String, boolean) - Method in class org.apache.kafka.common.config.AbstractConfig
-
Gets all original settings with the given prefix.
- otherValueSerde(Serde<VO>) - Static method in class org.apache.kafka.streams.kstream.Joined
-
Create an instance of Joined with another value Serde.
- otherValueSerde() - Method in class org.apache.kafka.streams.kstream.Joined
-
- outerJoin(KStream<K, VO>, ValueJoiner<? super V, ? super VO, ? extends VR>, JoinWindows) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Join records of this stream with another KStream's records using windowed outer equi join with default serializers and deserializers.
- outerJoin(KStream<K, VO>, ValueJoiner<? super V, ? super VO, ? extends VR>, JoinWindows, Joined<K, V, VO>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Join records of this stream with another KStream's records using windowed outer equi join with default serializers and deserializers.
- outerJoin(KStream<K, VO>, ValueJoiner<? super V, ? super VO, ? extends VR>, JoinWindows, Serde<K>, Serde<V>, Serde<VO>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
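A hedged sketch of the KStream outer-join entries above: two hypothetical streams keyed by order id joined within a five-minute window. Topic names and the joined-value format are assumptions; a runnable application would still need a StreamsConfig and a KafkaStreams instance around this topology.

```java
import java.util.concurrent.TimeUnit;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;

public class OuterJoinExample {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Hypothetical input topics, both keyed by order id.
        KStream<String, String> orders = builder.stream("orders");
        KStream<String, String> payments = builder.stream("payments");

        // Emit a joined value whenever an order and a payment with the same key arrive
        // within five minutes of each other; in an outer join either side may be null.
        KStream<String, String> joined = orders.outerJoin(
                payments,
                (order, payment) -> order + "/" + payment,
                JoinWindows.of(TimeUnit.MINUTES.toMillis(5)));

        joined.to("orders-with-payments"); // hypothetical output topic
    }
}
```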
- outerJoin(KTable<K, VO>, ValueJoiner<? super V, ? super VO, ? extends VR>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Join records of this KTable (left input) with another KTable's (right input) records using non-windowed outer equi join.
- outerJoin(KTable<K, VO>, ValueJoiner<? super V, ? super VO, ? extends VR>, Materialized<K, VR, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Join records of this KTable (left input) with another KTable's (right input) records using non-windowed outer equi join.
- outerJoin(KTable<K, VO>, ValueJoiner<? super V, ? super VO, ? extends VR>, Serde<VR>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- outerJoin(KTable<K, VO>, ValueJoiner<? super V, ? super VO, ? extends VR>, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- OutOfOrderSequenceException - Exception in org.apache.kafka.common.errors
-
This exception indicates that the broker received an unexpected sequence number from the producer,
which means that data may have been lost.
- OutOfOrderSequenceException(String) - Constructor for exception org.apache.kafka.common.errors.OutOfOrderSequenceException
-
- overlap(Window) - Method in class org.apache.kafka.streams.kstream.Window
-
Check if the given window overlaps with this window.
- pair(K, V) - Static method in class org.apache.kafka.streams.KeyValue
-
Create a new key-value pair.
- parameter(String, String) - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Set a schema parameter.
- parameters() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- parameters() - Method in interface org.apache.kafka.connect.data.Schema
-
Get a map of schema parameters.
- parameters() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- parameters(Map<String, String>) - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Set schema parameters.
- parse(Map<?, ?>) - Method in class org.apache.kafka.common.config.ConfigDef
-
Parse and validate configs against this configuration definition.
- parse(String) - Static method in class org.apache.kafka.streams.processor.TaskId
-
- parseType(String, Object, ConfigDef.Type) - Static method in class org.apache.kafka.common.config.ConfigDef
-
Parse a value according to its expected type.
- partition() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
The partition from which this record is received
- partition() - Method in exception org.apache.kafka.clients.consumer.NoOffsetForPartitionException
-
- partition(String, Object, byte[], Object, byte[], Cluster) - Method in interface org.apache.kafka.clients.producer.Partitioner
-
Compute the partition for the given record.
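As an illustration of the Partitioner entry (not the library's default implementation): a sketch that hashes the serialized key and routes keyless records to partition 0. The class name and the fallback choice are assumptions of the example.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.apache.kafka.clients.producer.Partitioner;
import org.apache.kafka.common.Cluster;
import org.apache.kafka.common.PartitionInfo;

// Routes records to a partition based on a hash of the serialized key.
public class KeyHashPartitioner implements Partitioner {

    @Override
    public void configure(Map<String, ?> configs) {
        // No configuration needed for this sketch.
    }

    @Override
    public int partition(String topic, Object key, byte[] keyBytes,
                         Object value, byte[] valueBytes, Cluster cluster) {
        List<PartitionInfo> partitions = cluster.partitionsForTopic(topic);
        if (keyBytes == null) {
            return 0; // keyless records all go to partition 0 in this sketch
        }
        // Mask off the sign bit so the modulo result is never negative.
        return (Arrays.hashCode(keyBytes) & 0x7fffffff) % partitions.size();
    }

    @Override
    public void close() {
        // Nothing to clean up.
    }
}
```

A custom partitioner like this would be registered through the producer's partitioner.class setting (PARTITIONER_CLASS_CONFIG).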
- partition() - Method in class org.apache.kafka.clients.producer.ProducerRecord
-
- partition() - Method in class org.apache.kafka.clients.producer.RecordMetadata
-
The partition the record was sent to
- partition(TopicPartition) - Method in class org.apache.kafka.common.Cluster
-
Get the metadata for the specified partition
- partition() - Method in class org.apache.kafka.common.PartitionInfo
-
The partition id
- partition() - Method in class org.apache.kafka.common.TopicPartition
-
- partition() - Method in class org.apache.kafka.common.TopicPartitionInfo
-
Return the partition id.
- partition() - Method in class org.apache.kafka.common.TopicPartitionReplica
-
- partition() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the partition id of the current input record; could be -1 if it is not
available (for example, if this method is invoked from the punctuate call)
- partition(K, V, int) - Method in interface org.apache.kafka.streams.processor.StreamPartitioner
-
Determine the partition number for a record with the given key and value and the current number of partitions.
- partition - Variable in class org.apache.kafka.streams.processor.TaskId
-
The ID of the partition.
- PARTITION_ASSIGNMENT_STRATEGY_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
partition.assignment.strategy
- PARTITION_GROUPER_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
partition.grouper
- partitionCountForTopic(String) - Method in class org.apache.kafka.common.Cluster
-
Get the number of partitions for the given topic
- Partitioner - Interface in org.apache.kafka.clients.producer
-
Partitioner Interface
- partitioner - Variable in class org.apache.kafka.streams.kstream.Produced
-
- PARTITIONER_CLASS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
partitioner.class
- PartitionGrouper - Interface in org.apache.kafka.streams.processor
-
A partition grouper that generates partition groups given the list of topic-partitions.
- partitionGroups(Map<Integer, Set<String>>, Cluster) - Method in class org.apache.kafka.streams.processor.DefaultPartitionGrouper
-
Generate tasks with the assigned topic partitions.
- partitionGroups(Map<Integer, Set<String>>, Cluster) - Method in interface org.apache.kafka.streams.processor.PartitionGrouper
-
Returns a map of task ids to groups of partitions.
- PartitionInfo - Class in org.apache.kafka.common
-
This is used to describe per-partition state in the MetadataResponse.
- PartitionInfo(String, int, Node, Node[], Node[]) - Constructor for class org.apache.kafka.common.PartitionInfo
-
- PartitionInfo(String, int, Node, Node[], Node[], Node[]) - Constructor for class org.apache.kafka.common.PartitionInfo
-
- partitions() - Method in class org.apache.kafka.clients.admin.TopicDescription
-
A list of partitions where the index represents the partition id and the element contains leadership and replica
information for that partition.
- partitions() - Method in class org.apache.kafka.clients.consumer.ConsumerRecords
-
Get the partitions which have records contained in this record set.
- partitions() - Method in exception org.apache.kafka.clients.consumer.InvalidOffsetException
-
- partitions() - Method in exception org.apache.kafka.clients.consumer.NoOffsetForPartitionException
-
Returns all partitions for which no offsets are defined.
- partitions() - Method in exception org.apache.kafka.clients.consumer.OffsetOutOfRangeException
-
- partitionsFor(String) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- partitionsFor(String) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Get metadata about the partitions for a given topic.
- partitionsFor(String) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- partitionsFor(String) - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Get the partition metadata for the given topic.
- partitionsFor(String) - Method in class org.apache.kafka.clients.producer.MockProducer
-
- partitionsFor(String) - Method in interface org.apache.kafka.clients.producer.Producer
-
- partitionsForNode(int) - Method in class org.apache.kafka.common.Cluster
-
Get the list of partitions whose leader is this node
- partitionsForTopic(String) - Method in class org.apache.kafka.common.Cluster
-
Get the list of partitions for this topic
- pause(Collection<TopicPartition>) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- pause(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Suspend fetching from the requested partitions.
- pause(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- pause(TopicPartition...) - Method in interface org.apache.kafka.connect.sink.SinkTaskContext
-
Pause consumption of messages from the specified TopicPartitions.
- paused() - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- paused() - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
- paused() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- peek(ForeachAction<? super K, ? super V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Perform an action on each record of KStream
.
- peekNextKey() - Method in interface org.apache.kafka.streams.state.KeyValueIterator
-
Peek at the next key without advancing the iterator
- permissionType() - Method in class org.apache.kafka.common.acl.AccessControlEntry
-
Return the AclPermissionType.
- permissionType() - Method in class org.apache.kafka.common.acl.AccessControlEntryFilter
-
Return the AclPermissionType.
- persistent() - Method in interface org.apache.kafka.streams.processor.StateStore
-
Return if the storage is persistent or not.
- persistent() - Method in interface org.apache.kafka.streams.state.Stores.KeyValueFactory
-
Keep all key-value entries off-heap in a local database, although for durability all entries are recorded in a Kafka
topic that can be read to restore the entries if they are lost.
- persistentKeyValueStore(String) - Static method in class org.apache.kafka.streams.state.Stores
-
- persistentSessionStore(String, long) - Static method in class org.apache.kafka.streams.state.Stores
-
- persistentWindowStore(String, long, int, long, boolean) - Static method in class org.apache.kafka.streams.state.Stores
-
- PlaintextAuthenticationContext - Class in org.apache.kafka.common.security.auth
-
- PlaintextAuthenticationContext(InetAddress) - Constructor for class org.apache.kafka.common.security.auth.PlaintextAuthenticationContext
-
- PolicyViolationException - Exception in org.apache.kafka.common.errors
-
Exception thrown if a create topics request does not satisfy the configured policy for a topic.
- PolicyViolationException(String) - Constructor for exception org.apache.kafka.common.errors.PolicyViolationException
-
- PolicyViolationException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.PolicyViolationException
-
- poll(long) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- poll(long) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Fetch data for the topics or partitions specified using one of the subscribe/assign APIs.
- poll(long) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
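To illustrate the poll entries, a minimal subscribe-and-poll loop; the broker address, group id, and topic name are placeholders for the example.

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class PollLoopExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "poll-loop-example");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("example-topic")); // hypothetical topic
            while (true) {
                // Block for up to 100 ms waiting for records from the assigned partitions.
                ConsumerRecords<String, String> records = consumer.poll(100);
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s-%d@%d: %s%n",
                            record.topic(), record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}
```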
- poll() - Method in class org.apache.kafka.connect.source.SourceTask
-
Poll this SourceTask for new records.
- POLL_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
poll.ms
- port() - Method in class org.apache.kafka.common.Node
-
The port for this node
- port() - Method in class org.apache.kafka.streams.state.HostInfo
-
- port() - Method in class org.apache.kafka.streams.state.StreamsMetadata
-
- position(TopicPartition) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- position(TopicPartition) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Get the offset of the next record that will be fetched (if a record with that offset exists).
- position(TopicPartition) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- postProcessParsedConfig(Map<String, Object>) - Method in class org.apache.kafka.clients.admin.AdminClientConfig
-
- postProcessParsedConfig(Map<String, Object>) - Method in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- postProcessParsedConfig(Map<String, Object>) - Method in class org.apache.kafka.clients.producer.ProducerConfig
-
- postProcessParsedConfig(Map<String, Object>) - Method in class org.apache.kafka.common.config.AbstractConfig
-
Called directly after user configs got parsed (and thus default values got set).
- postProcessParsedConfig(Map<String, Object>) - Method in class org.apache.kafka.streams.StreamsConfig
-
- PREALLOCATE_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- PREALLOCATE_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- preCommit(Map<TopicPartition, OffsetAndMetadata>) - Method in class org.apache.kafka.connect.sink.SinkTask
-
Pre-commit hook invoked prior to an offset commit.
- predecessors() - Method in interface org.apache.kafka.streams.TopologyDescription.Node
-
The predecessors of this node within a sub-topology.
- Predicate<K,V> - Interface in org.apache.kafka.streams.kstream
-
The Predicate interface represents a predicate (boolean-valued function) of a KeyValue pair.
- principal() - Method in class org.apache.kafka.common.acl.AccessControlEntry
-
Return the principal for this entry.
- principal() - Method in class org.apache.kafka.common.acl.AccessControlEntryFilter
-
Return the principal or null.
- PRINCIPAL_BUILDER_CLASS_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- PRINCIPAL_BUILDER_CLASS_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- PrincipalBuilder - Interface in org.apache.kafka.common.security.auth
-
- print() - Method in interface org.apache.kafka.streams.kstream.KStream
-
- print(String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- print(Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- print(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- print(KeyValueMapper<? super K, ? super V, String>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- print(KeyValueMapper<? super K, ? super V, String>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- print(KeyValueMapper<? super K, ? super V, String>, Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- print(KeyValueMapper<? super K, ? super V, String>, Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- print(Printed<K, V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Print the records of this KStream using the options provided by Printed.
- print() - Method in interface org.apache.kafka.streams.kstream.KTable
-
- print(String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- print(Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- print(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- Printed<K,V> - Class in org.apache.kafka.streams.kstream
-
An object to define the options used when printing a KStream.
- Printed(Printed<K, V>) - Constructor for class org.apache.kafka.streams.kstream.Printed
-
Copy constructor.
- printWriter - Variable in class org.apache.kafka.streams.kstream.Printed
-
- process(ProcessorSupplier<? super K, ? super V>, String...) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- process(K, V) - Method in interface org.apache.kafka.streams.processor.Processor
-
Process the record with the given key and value.
- PROCESSING_GUARANTEE_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
processing.guarantee
- Processor<K,V> - Interface in org.apache.kafka.streams.processor
-
A processor of key-value pair records.
- processor() - Method in interface org.apache.kafka.streams.TopologyDescription.GlobalStore
-
The processor node maintaining the global store.
- ProcessorContext - Interface in org.apache.kafka.streams.processor
-
Processor context interface.
- ProcessorStateException - Exception in org.apache.kafka.streams.errors
-
Indicates a processor state operation (e.g. put, get) has failed.
- ProcessorStateException(String) - Constructor for exception org.apache.kafka.streams.errors.ProcessorStateException
-
- ProcessorStateException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.ProcessorStateException
-
- ProcessorStateException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.ProcessorStateException
-
- ProcessorSupplier<K,V> - Interface in org.apache.kafka.streams.processor
-
A processor supplier that can create one or more Processor instances.
- Produced<K,V> - Class in org.apache.kafka.streams.kstream
-
- Produced(Produced<K, V>) - Constructor for class org.apache.kafka.streams.kstream.Produced
-
- Producer<K,V> - Interface in org.apache.kafka.clients.producer
-
- PRODUCER_PREFIX - Static variable in class org.apache.kafka.streams.StreamsConfig
-
- ProducerConfig - Class in org.apache.kafka.clients.producer
-
Configuration for the Kafka Producer.
- ProducerFencedException - Exception in org.apache.kafka.common.errors
-
This fatal exception indicates that another producer with the same transactional.id has been started.
- ProducerFencedException(String) - Constructor for exception org.apache.kafka.common.errors.ProducerFencedException
-
- ProducerInterceptor<K,V> - Interface in org.apache.kafka.clients.producer
-
A plugin interface that allows you to intercept (and possibly mutate) the records received by the producer before
they are published to the Kafka cluster.
- producerPrefix(String) - Static method in class org.apache.kafka.streams.StreamsConfig
-
- ProducerRecord<K,V> - Class in org.apache.kafka.clients.producer
-
A key/value pair to be sent to Kafka.
- ProducerRecord(String, Integer, Long, K, V, Iterable<Header>) - Constructor for class org.apache.kafka.clients.producer.ProducerRecord
-
Creates a record with a specified timestamp to be sent to a specified topic and partition
- ProducerRecord(String, Integer, Long, K, V) - Constructor for class org.apache.kafka.clients.producer.ProducerRecord
-
Creates a record with a specified timestamp to be sent to a specified topic and partition
- ProducerRecord(String, Integer, K, V, Iterable<Header>) - Constructor for class org.apache.kafka.clients.producer.ProducerRecord
-
Creates a record to be sent to a specified topic and partition
- ProducerRecord(String, Integer, K, V) - Constructor for class org.apache.kafka.clients.producer.ProducerRecord
-
Creates a record to be sent to a specified topic and partition
- ProducerRecord(String, K, V) - Constructor for class org.apache.kafka.clients.producer.ProducerRecord
-
Create a record to be sent to Kafka
- ProducerRecord(String, V) - Constructor for class org.apache.kafka.clients.producer.ProducerRecord
-
Create a record with no key
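A brief sketch of the ProducerRecord constructor variants listed above; topic, key, and values are illustrative placeholders.

```java
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerRecordExamples {
    public static void main(String[] args) {
        // Value only: no key, partition chosen by the partitioner, timestamp assigned later.
        ProducerRecord<String, String> noKey =
                new ProducerRecord<>("example-topic", "payload");

        // Key and value: records with the same key hash to the same partition by default.
        ProducerRecord<String, String> keyed =
                new ProducerRecord<>("example-topic", "user-42", "payload");

        // Explicit partition and timestamp (milliseconds since epoch).
        ProducerRecord<String, String> pinned =
                new ProducerRecord<>("example-topic", 0, System.currentTimeMillis(),
                        "user-42", "payload");

        System.out.println(noKey);
        System.out.println(keyed);
        System.out.println(pinned);
    }
}
```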
- project(Schema, Object, Schema) - Static method in class org.apache.kafka.connect.data.SchemaProjector
-
This method projects a value between compatible schemas and throws exceptions when incompatible schemas are provided.
- punctuate(long) - Method in interface org.apache.kafka.streams.kstream.Transformer
-
- punctuate(long) - Method in interface org.apache.kafka.streams.kstream.ValueTransformer
-
- punctuate(long) - Method in class org.apache.kafka.streams.processor.AbstractProcessor
-
- punctuate(long) - Method in interface org.apache.kafka.streams.processor.Processor
-
- punctuate(long) - Method in interface org.apache.kafka.streams.processor.Punctuator
-
Perform the scheduled periodic operation.
- PunctuationType - Enum in org.apache.kafka.streams.processor
-
- Punctuator - Interface in org.apache.kafka.streams.processor
-
- put(String, Object) - Method in class org.apache.kafka.connect.data.Struct
-
Set the value of a field.
- put(Field, Object) - Method in class org.apache.kafka.connect.data.Struct
-
Set the value of a field.
- put(Collection<SinkRecord>) - Method in class org.apache.kafka.connect.sink.SinkTask
-
Put the records in the sink.
- put(K, V) - Method in interface org.apache.kafka.streams.state.KeyValueStore
-
Update the value associated with this key
- put(Windowed<K>, AGG) - Method in interface org.apache.kafka.streams.state.SessionStore
-
Write the aggregated value for the provided key to the store
- put(K, V) - Method in interface org.apache.kafka.streams.state.WindowStore
-
Put a key-value pair with the current wall-clock time as the timestamp
into the corresponding window
- put(K, V, long) - Method in interface org.apache.kafka.streams.state.WindowStore
-
Put a key-value pair with the given timestamp into the corresponding window
- putAll(List<KeyValue<K, V>>) - Method in interface org.apache.kafka.streams.state.KeyValueStore
-
Update all the given key/value pairs
- putIfAbsent(K, V) - Method in interface org.apache.kafka.streams.state.KeyValueStore
-
Update the value associated with this key, unless a value
is already associated with the key
- rack() - Method in class org.apache.kafka.common.Node
-
The rack for this node
- raiseError(Exception) - Method in interface org.apache.kafka.connect.connector.ConnectorContext
-
Raise an unrecoverable exception to the Connect framework.
- range(K, K) - Method in interface org.apache.kafka.streams.state.ReadOnlyKeyValueStore
-
Get an iterator over a given range of keys.
- RangeAssignor - Class in org.apache.kafka.clients.consumer
-
The range assignor works on a per-topic basis.
- RangeAssignor() - Constructor for class org.apache.kafka.clients.consumer.RangeAssignor
-
- rawKey(K) - Method in class org.apache.kafka.streams.state.StateSerdes
-
Serialize the given key.
- rawValue(V) - Method in class org.apache.kafka.streams.state.StateSerdes
-
Serialize the given value.
- readFrom(DataInputStream) - Static method in class org.apache.kafka.streams.processor.TaskId
-
- readFrom(ByteBuffer) - Static method in class org.apache.kafka.streams.processor.TaskId
-
- ReadOnlyKeyValueStore<K,V> - Interface in org.apache.kafka.streams.state
-
A key value store that only supports read operations.
- ReadOnlySessionStore<K,AGG> - Interface in org.apache.kafka.streams.state
-
A session store that only supports read operations.
- ReadOnlyWindowStore<K,V> - Interface in org.apache.kafka.streams.state
-
A window store that only supports read operations. Implementations should be thread-safe as concurrent reads and writes are expected.
- ReassignmentInProgressException - Exception in org.apache.kafka.common.errors
-
Thrown if a request cannot be completed because a partition reassignment is in progress.
- ReassignmentInProgressException(String) - Constructor for exception org.apache.kafka.common.errors.ReassignmentInProgressException
-
- ReassignmentInProgressException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.ReassignmentInProgressException
-
- rebalance(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
Simulate a rebalance event.
- RebalanceInProgressException - Exception in org.apache.kafka.common.errors
-
- RebalanceInProgressException() - Constructor for exception org.apache.kafka.common.errors.RebalanceInProgressException
-
- RebalanceInProgressException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.RebalanceInProgressException
-
- RebalanceInProgressException(String) - Constructor for exception org.apache.kafka.common.errors.RebalanceInProgressException
-
- RebalanceInProgressException(Throwable) - Constructor for exception org.apache.kafka.common.errors.RebalanceInProgressException
-
- RECEIVE_BUFFER_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- RECEIVE_BUFFER_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
receive.buffer.bytes
- RECEIVE_BUFFER_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
receive.buffer.bytes
- RECEIVE_BUFFER_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
receive.buffer.bytes
- recommendedValues() - Method in class org.apache.kafka.common.config.ConfigValue
-
- recommendedValues(List<Object>) - Method in class org.apache.kafka.common.config.ConfigValue
-
- recommender - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- reconfigure(Map<String, String>) - Method in class org.apache.kafka.connect.connector.Connector
-
Reconfigure this Connector.
- RECONNECT_BACKOFF_MAX_MS_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
reconnect.backoff.max.ms
- RECONNECT_BACKOFF_MAX_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
reconnect.backoff.max.ms
- RECONNECT_BACKOFF_MAX_MS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
reconnect.backoff.max.ms
- RECONNECT_BACKOFF_MAX_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
reconnect.backoff.max.ms
- RECONNECT_BACKOFF_MS_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
reconnect.backoff.ms
- RECONNECT_BACKOFF_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
reconnect.backoff.ms
- RECONNECT_BACKOFF_MS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
reconnect.backoff.ms
- RECONNECT_BACKOFF_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
reconnect.backoff.ms
- RecordBatchTooLargeException - Exception in org.apache.kafka.common.errors
-
This record batch is larger than the maximum allowable size
- RecordBatchTooLargeException() - Constructor for exception org.apache.kafka.common.errors.RecordBatchTooLargeException
-
- RecordBatchTooLargeException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.RecordBatchTooLargeException
-
- RecordBatchTooLargeException(String) - Constructor for exception org.apache.kafka.common.errors.RecordBatchTooLargeException
-
- RecordBatchTooLargeException(Throwable) - Constructor for exception org.apache.kafka.common.errors.RecordBatchTooLargeException
-
- recordLatency(Sensor, long, long) - Method in interface org.apache.kafka.streams.StreamsMetrics
-
Record the given latency value of the sensor.
- RecordMetadata - Class in org.apache.kafka.clients.producer
-
The metadata for a record that has been acknowledged by the server
- RecordMetadata(TopicPartition, long, long, long, Long, int, int) - Constructor for class org.apache.kafka.clients.producer.RecordMetadata
-
- RecordMetadata(TopicPartition, long, long, long, long, int, int) - Constructor for class org.apache.kafka.clients.producer.RecordMetadata
-
- records(TopicPartition) - Method in class org.apache.kafka.clients.consumer.ConsumerRecords
-
Get just the records for the given partition
- records(String) - Method in class org.apache.kafka.clients.consumer.ConsumerRecords
-
Get just the records for the given topic
- recordThroughput(Sensor, long) - Method in interface org.apache.kafka.streams.StreamsMetrics
-
Record the throughput value of a sensor.
- RecordTooLargeException - Exception in org.apache.kafka.common.errors
-
This record is larger than the maximum allowable size
- RecordTooLargeException() - Constructor for exception org.apache.kafka.common.errors.RecordTooLargeException
-
- RecordTooLargeException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.RecordTooLargeException
-
- RecordTooLargeException(String) - Constructor for exception org.apache.kafka.common.errors.RecordTooLargeException
-
- RecordTooLargeException(Throwable) - Constructor for exception org.apache.kafka.common.errors.RecordTooLargeException
-
- RecordTooLargeException(String, Map<TopicPartition, Long>) - Constructor for exception org.apache.kafka.common.errors.RecordTooLargeException
-
- recordTooLargePartitions() - Method in exception org.apache.kafka.common.errors.RecordTooLargeException
-
- reduce(Reducer<V>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
Combine the values of records in this stream by the grouped key.
- reduce(Reducer<V>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- reduce(Reducer<V>, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- reduce(Reducer<V>, Materialized<K, V, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
Combine the value of records in this stream by the grouped key.
- reduce(Reducer<V>, Windows<W>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- reduce(Reducer<V>, Windows<W>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- reduce(Reducer<V>, Windows<W>, StateStoreSupplier<WindowStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- reduce(Reducer<V>, SessionWindows, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- reduce(Reducer<V>, SessionWindows) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- reduce(Reducer<V>, SessionWindows, StateStoreSupplier<SessionStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- reduce(Reducer<V>, Reducer<V>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
- reduce(Reducer<V>, Reducer<V>, Materialized<K, V, KeyValueStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
Combine the value of records of the original KTable that got mapped to the same key into a new instance of KTable.
- reduce(Reducer<V>, Reducer<V>) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
Combine the value of records of the original KTable that got mapped to the same key into a new instance of KTable.
- reduce(Reducer<V>, Reducer<V>, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
- reduce(Reducer<V>) - Method in interface org.apache.kafka.streams.kstream.SessionWindowedKStream
-
- reduce(Reducer<V>, Materialized<K, V, SessionStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.SessionWindowedKStream
-
- reduce(Reducer<V>) - Method in interface org.apache.kafka.streams.kstream.TimeWindowedKStream
-
Combine the values of records in this stream by the grouped key.
- reduce(Reducer<V>, Materialized<K, V, WindowStore<Bytes, byte[]>>) - Method in interface org.apache.kafka.streams.kstream.TimeWindowedKStream
-
Combine the values of records in this stream by the grouped key.
- Reducer<V> - Interface in org.apache.kafka.streams.kstream
-
The Reducer interface for combining two values of the same type into a new value.
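A hedged sketch tying the reduce entries and the Reducer interface together: a Streams application that sums Long values per key. The topic names, application id, and broker address are assumptions of the example.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class ReduceExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "reduce-example");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.Long().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, Long> amounts = builder.stream("amounts"); // hypothetical input topic

        // Combine all values seen for a key into a running sum; the lambda is the Reducer.
        KTable<String, Long> totals = amounts
                .groupByKey()
                .reduce((aggValue, newValue) -> aggValue + newValue);

        totals.toStream().to("totals"); // hypothetical output topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }
}
```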
- register(StateStore, boolean, StateRestoreCallback) - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Registers and possibly restores the specified storage engine.
- remove(String) - Method in interface org.apache.kafka.common.header.Headers
-
Removes all headers for the given key returning if the operation succeeded.
- remove(Windowed<K>) - Method in interface org.apache.kafka.streams.state.SessionStore
-
Remove the session aggregated with the provided Windowed key from the store.
- removeSensor(Sensor) - Method in interface org.apache.kafka.streams.StreamsMetrics
-
Remove a sensor.
- repartitionSourceTopics - Variable in class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
Deprecated.
- ReplicaNotAvailableException - Exception in org.apache.kafka.common.errors
-
- ReplicaNotAvailableException(String) - Constructor for exception org.apache.kafka.common.errors.ReplicaNotAvailableException
-
- ReplicaNotAvailableException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.ReplicaNotAvailableException
-
- ReplicaNotAvailableException(Throwable) - Constructor for exception org.apache.kafka.common.errors.ReplicaNotAvailableException
-
- replicas() - Method in class org.apache.kafka.common.PartitionInfo
-
The complete set of replicas for this partition regardless of whether they are alive or up-to-date
- replicas() - Method in class org.apache.kafka.common.TopicPartitionInfo
-
Return the replicas of the partition in the same order as the replica assignment.
- replicasAssignments() - Method in class org.apache.kafka.clients.admin.NewTopic
-
A map from partition id to replica ids (i.e. broker ids) or null if the number of partitions and replication factor have been specified instead.
- replicasAssignments() - Method in class org.apache.kafka.server.policy.CreateTopicPolicy.RequestMetadata
-
Return a map from partition id to replica (broker) ids or null if numPartitions and replicationFactor are
set instead.
- REPLICATION_FACTOR_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
replication.factor
- replicationFactor() - Method in class org.apache.kafka.clients.admin.NewTopic
-
The replication factor for the new topic or -1 if a replica assignment has been specified.
- replicationFactor() - Method in class org.apache.kafka.server.policy.CreateTopicPolicy.RequestMetadata
-
Return the number of replicas to create or null if replicaAssignments is not null.
- REQUEST_TIMEOUT_MS_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
request.timeout.ms
- REQUEST_TIMEOUT_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
request.timeout.ms
- REQUEST_TIMEOUT_MS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
request.timeout.ms
- REQUEST_TIMEOUT_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
request.timeout.ms
- requestCommit() - Method in interface org.apache.kafka.connect.sink.SinkTaskContext
-
Request an offset commit.
- RequestMetadata(ConfigResource, Map<String, String>) - Constructor for class org.apache.kafka.server.policy.AlterConfigPolicy.RequestMetadata
-
Create an instance of this class with the provided parameters.
- RequestMetadata(String, Integer, Short, Map<Integer, List<Integer>>, Map<String, String>) - Constructor for class org.apache.kafka.server.policy.CreateTopicPolicy.RequestMetadata
-
Create an instance of this class with the provided parameters.
- requestTaskReconfiguration() - Method in interface org.apache.kafka.connect.connector.ConnectorContext
-
Requests that the runtime reconfigure the Tasks for this source.
- required() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Set this schema as required.
- resetPolicy - Variable in class org.apache.kafka.streams.Consumed
-
- resource() - Method in class org.apache.kafka.common.acl.AclBinding
-
Return the resource for this binding.
- Resource - Class in org.apache.kafka.common.resource
-
Represents a cluster resource with a tuple of (type, name).
- Resource(ResourceType, String) - Constructor for class org.apache.kafka.common.resource.Resource
-
Create an instance of this class with the provided parameters.
- resource() - Method in class org.apache.kafka.server.policy.AlterConfigPolicy.RequestMetadata
-
- resourceFilter() - Method in class org.apache.kafka.common.acl.AclBindingFilter
-
Return the resource filter.
- ResourceFilter - Class in org.apache.kafka.common.resource
-
A filter which matches Resource objects.
- ResourceFilter(ResourceType, String) - Constructor for class org.apache.kafka.common.resource.ResourceFilter
-
Create an instance of this class with the provided parameters.
- resourceType() - Method in class org.apache.kafka.common.resource.Resource
-
Return the resource type.
- resourceType() - Method in class org.apache.kafka.common.resource.ResourceFilter
-
Return the resource type.
- ResourceType - Enum in org.apache.kafka.common.resource
-
Represents a type of resource which an ACL can be applied to.
- restore(byte[], byte[]) - Method in class org.apache.kafka.streams.processor.AbstractNotifyingBatchingRestoreCallback
-
- restore(byte[], byte[]) - Method in interface org.apache.kafka.streams.processor.StateRestoreCallback
-
- restoreAll(Collection<KeyValue<byte[], byte[]>>) - Method in interface org.apache.kafka.streams.processor.BatchingStateRestoreCallback
-
Called to restore a number of records.
- resume(Collection<TopicPartition>) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- resume(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
- resume(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- resume(TopicPartition...) - Method in interface org.apache.kafka.connect.sink.SinkTaskContext
-
Resume consumption of messages from previously paused TopicPartitions.
- retainDuplicates() - Method in interface org.apache.kafka.streams.state.WindowBytesStoreSupplier
-
Whether or not this store is retaining duplicate keys.
- RETENTION_BYTES_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- RETENTION_BYTES_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- RETENTION_MS_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- RETENTION_MS_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- retentionPeriod() - Method in interface org.apache.kafka.streams.state.WindowBytesStoreSupplier
-
The time period for which the WindowStore will retain historic data.
- RetriableCommitFailedException - Exception in org.apache.kafka.clients.consumer
-
- RetriableCommitFailedException(Throwable) - Constructor for exception org.apache.kafka.clients.consumer.RetriableCommitFailedException
-
- RetriableCommitFailedException(String) - Constructor for exception org.apache.kafka.clients.consumer.RetriableCommitFailedException
-
- RetriableCommitFailedException(String, Throwable) - Constructor for exception org.apache.kafka.clients.consumer.RetriableCommitFailedException
-
- RetriableException - Exception in org.apache.kafka.common.errors
-
A retriable exception is a transient exception that, if retried, may succeed.
- RetriableException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.RetriableException
-
- RetriableException(String) - Constructor for exception org.apache.kafka.common.errors.RetriableException
-
- RetriableException(Throwable) - Constructor for exception org.apache.kafka.common.errors.RetriableException
-
- RetriableException() - Constructor for exception org.apache.kafka.common.errors.RetriableException
-
- RetriableException - Exception in org.apache.kafka.connect.errors
-
An exception that indicates the operation can be reattempted.
- RetriableException(String) - Constructor for exception org.apache.kafka.connect.errors.RetriableException
-
- RetriableException(String, Throwable) - Constructor for exception org.apache.kafka.connect.errors.RetriableException
-
- RetriableException(Throwable) - Constructor for exception org.apache.kafka.connect.errors.RetriableException
-
- RETRIES_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- RETRIES_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
retries
- RETRY_BACKOFF_MS_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
retry.backoff.ms
- RETRY_BACKOFF_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
retry.backoff.ms
- RETRY_BACKOFF_MS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
retry.backoff.ms
- RETRY_BACKOFF_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
retry.backoff.ms
- ROCKSDB_CONFIG_SETTER_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
rocksdb.config.setter
- RocksDBConfigSetter - Interface in org.apache.kafka.streams.state
-
An interface that allows developers to customize the RocksDB settings for a given Store.
- RoundRobinAssignor - Class in org.apache.kafka.clients.consumer
-
The round robin assignor lays out all the available partitions and all the available consumers.
- RoundRobinAssignor() - Constructor for class org.apache.kafka.clients.consumer.RoundRobinAssignor
-
- SASL_ENABLED_MECHANISMS - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_ENABLED_MECHANISMS_DOC - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_JAAS_CONFIG - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_JAAS_CONFIG_DOC - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_KINIT_CMD - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_KINIT_CMD_DOC - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_MIN_TIME_BEFORE_RELOGIN - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_MIN_TIME_BEFORE_RELOGIN_DOC - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_PRINCIPAL_TO_LOCAL_RULES - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_PRINCIPAL_TO_LOCAL_RULES_DOC - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_SERVICE_NAME - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_SERVICE_NAME_DOC - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_TICKET_RENEW_JITTER - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_TICKET_RENEW_JITTER_DOC - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_TICKET_RENEW_WINDOW_FACTOR - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_KERBEROS_TICKET_RENEW_WINDOW_FACTOR_DOC - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SASL_MECHANISM - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
SASL mechanism configuration - standard mechanism names are listed here.
- SASL_MECHANISM_DOC - Static variable in class org.apache.kafka.common.config.SaslConfigs
-
- SaslAuthenticationContext - Class in org.apache.kafka.common.security.auth
-
- SaslAuthenticationContext(SaslServer, SecurityProtocol, InetAddress) - Constructor for class org.apache.kafka.common.security.auth.SaslAuthenticationContext
-
- SaslAuthenticationException - Exception in org.apache.kafka.common.errors
-
This exception indicates that SASL authentication has failed.
- SaslAuthenticationException(String) - Constructor for exception org.apache.kafka.common.errors.SaslAuthenticationException
-
- SaslAuthenticationException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.SaslAuthenticationException
-
- SaslConfigs - Class in org.apache.kafka.common.config
-
- SaslConfigs() - Constructor for class org.apache.kafka.common.config.SaslConfigs
-
- SCALE_FIELD - Static variable in class org.apache.kafka.connect.data.Decimal
-
- schedule(long, PunctuationType, Punctuator) - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Schedules a periodic operation for processors.
- schedule(long) - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
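A hedged sketch of the three-argument schedule entry: a processor that registers a wall-clock Punctuator in init() and forwards a running count once per minute. The key, output format, and interval are assumptions of the example.

```java
import org.apache.kafka.streams.processor.AbstractProcessor;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.processor.Punctuator;

// Counts incoming records and emits the running count once per minute of wall-clock time.
public class CountingProcessor extends AbstractProcessor<String, String> {

    private long count = 0;

    @Override
    public void init(ProcessorContext context) {
        super.init(context);
        // Register a periodic callback; the timestamp passed to punctuate() is wall-clock time.
        context.schedule(60_000L, PunctuationType.WALL_CLOCK_TIME, new Punctuator() {
            @Override
            public void punctuate(long timestamp) {
                context.forward("count", Long.toString(count));
            }
        });
    }

    @Override
    public void process(String key, String value) {
        count++;
    }
}
```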
- scheduleNopPollTask() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- schedulePollTask(Runnable) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
Schedule a task to be executed during a poll().
- schema() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- SCHEMA - Static variable in class org.apache.kafka.connect.data.Date
-
- schema(int) - Static method in class org.apache.kafka.connect.data.Decimal
-
- schema() - Method in class org.apache.kafka.connect.data.Field
-
Get the schema of this field
- Schema - Interface in org.apache.kafka.connect.data
-
Definition of an abstract data type.
- schema() - Method in interface org.apache.kafka.connect.data.Schema
-
Return a concrete instance of the Schema.
- schema() - Method in class org.apache.kafka.connect.data.SchemaAndValue
-
- schema() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Return a concrete instance of the Schema specified by this builder.
- schema() - Method in class org.apache.kafka.connect.data.Struct
-
Get the schema for this Struct.
- SCHEMA - Static variable in class org.apache.kafka.connect.data.Time
-
- SCHEMA - Static variable in class org.apache.kafka.connect.data.Timestamp
-
- Schema.Type - Enum in org.apache.kafka.connect.data
-
The type of a schema.
- SchemaAndValue - Class in org.apache.kafka.connect.data
-
- SchemaAndValue(Schema, Object) - Constructor for class org.apache.kafka.connect.data.SchemaAndValue
-
- SchemaBuilder - Class in org.apache.kafka.connect.data
-
SchemaBuilder provides a fluent API for constructing Schema objects.
- SchemaBuilder(Schema.Type) - Constructor for class org.apache.kafka.connect.data.SchemaBuilder
-
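A small sketch of the fluent SchemaBuilder API together with Struct: building a hypothetical "user" schema with one optional field and populating a conforming value. The schema name and field set are made up for the example.

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

public class SchemaBuilderExample {
    public static void main(String[] args) {
        // Build a schema for a simple "user" record with one optional field.
        Schema userSchema = SchemaBuilder.struct()
                .name("com.example.User") // hypothetical schema name
                .field("id", Schema.INT64_SCHEMA)
                .field("name", Schema.STRING_SCHEMA)
                .field("email", Schema.OPTIONAL_STRING_SCHEMA)
                .build();

        // Populate a Struct that conforms to the schema; the optional field may be omitted.
        Struct user = new Struct(userSchema)
                .put("id", 42L)
                .put("name", "Ada");

        user.validate(); // throws DataException if a required field is missing
        System.out.println(user);
    }
}
```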
- SchemaBuilderException - Exception in org.apache.kafka.connect.errors
-
- SchemaBuilderException(String) - Constructor for exception org.apache.kafka.connect.errors.SchemaBuilderException
-
- SchemaBuilderException(String, Throwable) - Constructor for exception org.apache.kafka.connect.errors.SchemaBuilderException
-
- SchemaBuilderException(Throwable) - Constructor for exception org.apache.kafka.connect.errors.SchemaBuilderException
-
- SchemaProjector - Class in org.apache.kafka.connect.data
-
SchemaProjector is a utility to project a value between compatible schemas; it throws exceptions when incompatible schemas are provided.
- SchemaProjector() - Constructor for class org.apache.kafka.connect.data.SchemaProjector
-
- SchemaProjectorException - Exception in org.apache.kafka.connect.errors
-
- SchemaProjectorException(String) - Constructor for exception org.apache.kafka.connect.errors.SchemaProjectorException
-
- SchemaProjectorException(String, Throwable) - Constructor for exception org.apache.kafka.connect.errors.SchemaProjectorException
-
- SchemaProjectorException(Throwable) - Constructor for exception org.apache.kafka.connect.errors.SchemaProjectorException
-
- schemaType(Class<?>) - Static method in class org.apache.kafka.connect.data.ConnectSchema
-
- SECURITY_PROTOCOL_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- SECURITY_PROTOCOL_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
security.protocol
- SecurityDisabledException - Exception in org.apache.kafka.common.errors
-
An error indicating that security is disabled on the broker.
- SecurityDisabledException(String) - Constructor for exception org.apache.kafka.common.errors.SecurityDisabledException
-
- SecurityDisabledException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.SecurityDisabledException
-
- securityProtocol() - Method in interface org.apache.kafka.common.security.auth.AuthenticationContext
-
Underlying security protocol of the authentication session.
- securityProtocol() - Method in class org.apache.kafka.common.security.auth.PlaintextAuthenticationContext
-
- securityProtocol() - Method in class org.apache.kafka.common.security.auth.SaslAuthenticationContext
-
- SecurityProtocol - Enum in org.apache.kafka.common.security.auth
-
- securityProtocol() - Method in class org.apache.kafka.common.security.auth.SslAuthenticationContext
-
- seek(TopicPartition, long) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- seek(TopicPartition, long) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Overrides the fetch offsets that the consumer will use on the next poll(timeout).
- seek(TopicPartition, long) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- seekToBeginning(Collection<TopicPartition>) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- seekToBeginning(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Seek to the first offset for each of the given partitions.
- seekToBeginning(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- seekToEnd(Collection<TopicPartition>) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- seekToEnd(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Seek to the last offset for each of the given partitions.
- seekToEnd(Collection<TopicPartition>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- SEGMENT_BYTES_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- SEGMENT_BYTES_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- SEGMENT_INDEX_BYTES_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- SEGMENT_INDEX_BYTES_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- SEGMENT_JITTER_MS_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- SEGMENT_JITTER_MS_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- SEGMENT_MS_CONFIG - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- SEGMENT_MS_DOC - Static variable in class org.apache.kafka.common.config.TopicConfig
-
- segmentIntervalMs() - Method in interface org.apache.kafka.streams.state.SessionBytesStoreSupplier
-
The size of a segment, in milliseconds.
- segments - Variable in class org.apache.kafka.streams.kstream.Windows
-
- segments(int) - Method in class org.apache.kafka.streams.kstream.Windows
-
Set the number of segments to be used for rolling the window store.
- segments() - Method in interface org.apache.kafka.streams.state.WindowBytesStoreSupplier
-
The number of segments the store has.
- selectKey(KeyValueMapper<? super K, ? super V, ? extends KR>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Set a new key (with possibly new type) for each input record.
- send(ProducerRecord<K, V>) - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Asynchronously send a record to a topic.
- send(ProducerRecord<K, V>, Callback) - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Asynchronously send a record to a topic and invoke the provided callback when the send has been acknowledged.
- send(ProducerRecord<K, V>) - Method in class org.apache.kafka.clients.producer.MockProducer
-
Adds the record to the list of sent records.
- send(ProducerRecord<K, V>, Callback) - Method in class org.apache.kafka.clients.producer.MockProducer
-
Adds the record to the list of sent records.
- send(ProducerRecord<K, V>) - Method in interface org.apache.kafka.clients.producer.Producer
-
- send(ProducerRecord<K, V>, Callback) - Method in interface org.apache.kafka.clients.producer.Producer
-
- SEND_BUFFER_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
-
- SEND_BUFFER_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
send.buffer.bytes
- SEND_BUFFER_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
send.buffer.bytes
- SEND_BUFFER_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
send.buffer.bytes
- sendOffsetsToTransaction(Map<TopicPartition, OffsetAndMetadata>, String) - Method in class org.apache.kafka.clients.producer.KafkaProducer
-
Sends a list of specified offsets to the consumer group coordinator, and also marks
those offsets as part of the current transaction.
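To make the transactional flow concrete, here is a hedged sketch of a read-process-write loop that commits consumed offsets together with the produced records; the topic names, consumer group id, and transactional setup of the producer and consumer are assumed and not shown.

    producer.initTransactions();                       // producer configured with transactional.id
    producer.beginTransaction();
    try {
        Map<TopicPartition, OffsetAndMetadata> offsets = new HashMap<>();
        for (ConsumerRecord<String, String> record : consumer.poll(100)) {
            producer.send(new ProducerRecord<>("output-topic", record.key(), record.value()));
            offsets.put(new TopicPartition(record.topic(), record.partition()),
                        new OffsetAndMetadata(record.offset() + 1));
        }
        producer.sendOffsetsToTransaction(offsets, "example-group");   // the consumer's group id
        producer.commitTransaction();                  // records and offsets commit atomically
    } catch (KafkaException e) {
        producer.abortTransaction();
    }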
- sendOffsetsToTransaction(Map<TopicPartition, OffsetAndMetadata>, String) - Method in class org.apache.kafka.clients.producer.MockProducer
-
- sendOffsetsToTransaction(Map<TopicPartition, OffsetAndMetadata>, String) - Method in interface org.apache.kafka.clients.producer.Producer
-
- sentOffsets() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- Serde<T> - Interface in org.apache.kafka.common.serialization
-
The interface for wrapping a serializer and deserializer for the given data type.
- serdeFrom(Class<T>) - Static method in class org.apache.kafka.common.serialization.Serdes
-
- serdeFrom(Serializer<T>, Deserializer<T>) - Static method in class org.apache.kafka.common.serialization.Serdes
-
Construct a serde object from separate serializer and deserializer
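For example, an existing serializer/deserializer pair can be wrapped into a single Serde; the topic name below is only passed through for context and is made up.

    Serde<Long> longSerde = Serdes.serdeFrom(new LongSerializer(), new LongDeserializer());
    byte[] bytes = longSerde.serializer().serialize("example-topic", 42L);
    Long roundTripped = longSerde.deserializer().deserialize("example-topic", bytes);
    longSerde.close();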
- Serdes - Class in org.apache.kafka.common.serialization
-
Factory for creating serializers / deserializers.
- Serdes() - Constructor for class org.apache.kafka.common.serialization.Serdes
-
- Serdes.ByteArraySerde - Class in org.apache.kafka.common.serialization
-
- Serdes.ByteBufferSerde - Class in org.apache.kafka.common.serialization
-
- Serdes.BytesSerde - Class in org.apache.kafka.common.serialization
-
- Serdes.DoubleSerde - Class in org.apache.kafka.common.serialization
-
- Serdes.FloatSerde - Class in org.apache.kafka.common.serialization
-
- Serdes.IntegerSerde - Class in org.apache.kafka.common.serialization
-
- Serdes.LongSerde - Class in org.apache.kafka.common.serialization
-
- Serdes.ShortSerde - Class in org.apache.kafka.common.serialization
-
- Serdes.StringSerde - Class in org.apache.kafka.common.serialization
-
- Serdes.WrapperSerde<T> - Class in org.apache.kafka.common.serialization
-
- SerializationException - Exception in org.apache.kafka.common.errors
-
Any exception during serialization in the producer
- SerializationException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.SerializationException
-
- SerializationException(String) - Constructor for exception org.apache.kafka.common.errors.SerializationException
-
- SerializationException(Throwable) - Constructor for exception org.apache.kafka.common.errors.SerializationException
-
- SerializationException() - Constructor for exception org.apache.kafka.common.errors.SerializationException
-
- serialize(String, byte[]) - Method in class org.apache.kafka.common.serialization.ByteArraySerializer
-
- serialize(String, ByteBuffer) - Method in class org.apache.kafka.common.serialization.ByteBufferSerializer
-
- serialize(String, Bytes) - Method in class org.apache.kafka.common.serialization.BytesSerializer
-
- serialize(String, Double) - Method in class org.apache.kafka.common.serialization.DoubleSerializer
-
- serialize(String, Headers, T) - Method in interface org.apache.kafka.common.serialization.ExtendedSerializer
-
Convert data into a byte array.
- serialize(String, Headers, T) - Method in class org.apache.kafka.common.serialization.ExtendedSerializer.Wrapper
-
- serialize(String, T) - Method in class org.apache.kafka.common.serialization.ExtendedSerializer.Wrapper
-
- serialize(String, Float) - Method in class org.apache.kafka.common.serialization.FloatSerializer
-
- serialize(String, Integer) - Method in class org.apache.kafka.common.serialization.IntegerSerializer
-
- serialize(String, Long) - Method in class org.apache.kafka.common.serialization.LongSerializer
-
- serialize(String, T) - Method in interface org.apache.kafka.common.serialization.Serializer
-
Convert data into a byte array.
- serialize(String, Short) - Method in class org.apache.kafka.common.serialization.ShortSerializer
-
- serialize(String, String) - Method in class org.apache.kafka.common.serialization.StringSerializer
-
- Serialized<K,V> - Class in org.apache.kafka.streams.kstream
-
- Serialized(Serialized<K, V>) - Constructor for class org.apache.kafka.streams.kstream.Serialized
-
- serializedKeySize() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
The size of the serialized, uncompressed key in bytes.
- serializedKeySize() - Method in class org.apache.kafka.clients.producer.RecordMetadata
-
The size of the serialized, uncompressed key in bytes.
- serializedValueSize() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
The size of the serialized, uncompressed value in bytes.
- serializedValueSize() - Method in class org.apache.kafka.clients.producer.RecordMetadata
-
The size of the serialized, uncompressed value in bytes.
- serializer() - Method in interface org.apache.kafka.common.serialization.Serde
-
- serializer() - Method in class org.apache.kafka.common.serialization.Serdes.WrapperSerde
-
- Serializer<T> - Interface in org.apache.kafka.common.serialization
-
An interface for converting objects to bytes.
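A minimal sketch of a custom Serializer implementation; the class name and the UUID-to-UTF-8 encoding are arbitrary choices for illustration.

    import java.nio.charset.StandardCharsets;
    import java.util.Map;
    import java.util.UUID;
    import org.apache.kafka.common.serialization.Serializer;

    public class UuidSerializer implements Serializer<UUID> {
        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            // nothing to configure for this serializer
        }

        @Override
        public byte[] serialize(String topic, UUID data) {
            return data == null ? null : data.toString().getBytes(StandardCharsets.UTF_8);
        }

        @Override
        public void close() {
            // no resources to release
        }
    }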
- server() - Method in class org.apache.kafka.common.security.auth.SaslAuthenticationContext
-
- session() - Method in class org.apache.kafka.common.security.auth.SslAuthenticationContext
-
- SESSION_TIMEOUT_MS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
session.timeout.ms
- SessionBytesStoreSupplier - Interface in org.apache.kafka.streams.state
-
A store supplier that can be used to create one or more SessionStore<Bytes, byte[]> instances.
- sessionStore() - Static method in class org.apache.kafka.streams.state.QueryableStoreTypes
-
- SessionStore<K,AGG> - Interface in org.apache.kafka.streams.state
-
Interface for storing the aggregated values of sessions
- sessionStoreBuilder(SessionBytesStoreSupplier, Serde<K>, Serde<V>) - Static method in class org.apache.kafka.streams.state.Stores
-
- sessionWindowed(long) - Method in interface org.apache.kafka.streams.state.Stores.PersistentKeyValueFactory
-
Deprecated.
- SessionWindowedKStream<K,V> - Interface in org.apache.kafka.streams.kstream
-
SessionWindowedKStream is an abstraction of a windowed record stream of KeyValue pairs.
- SessionWindows - Class in org.apache.kafka.streams.kstream
-
A session based window specification used for aggregating events into sessions.
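As a sketch of how a session window specification is typically used, assuming a StreamsBuilder named builder, default serdes, and the usual imports (the topic name and five-minute inactivity gap are illustrative):

    KStream<String, String> clicks = builder.stream("user-clicks");
    KTable<Windowed<String>, Long> sessionCounts = clicks
            .groupByKey()
            .windowedBy(SessionWindows.with(TimeUnit.MINUTES.toMillis(5)))  // close after 5 min idle
            .count();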
- setApplicationId(String) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
This class is not part of the public API and should never be used by a developer.
- setConfig(String, Options, Map<String, Object>) - Method in interface org.apache.kafka.streams.state.RocksDBConfigSetter
-
Set the RocksDB options for the provided storeName.
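A hedged sketch of a RocksDBConfigSetter implementation; the class name, the particular org.rocksdb options, and the sizes are arbitrary examples rather than recommendations.

    import java.util.Map;
    import org.apache.kafka.streams.state.RocksDBConfigSetter;
    import org.rocksdb.BlockBasedTableConfig;
    import org.rocksdb.Options;

    public class CustomRocksDBConfig implements RocksDBConfigSetter {
        @Override
        public void setConfig(String storeName, Options options, Map<String, Object> configs) {
            BlockBasedTableConfig tableConfig = new BlockBasedTableConfig();
            tableConfig.setBlockCacheSize(16 * 1024 * 1024L);   // shrink the block cache
            options.setTableFormatConfig(tableConfig);
            options.setMaxWriteBufferNumber(2);                 // fewer in-memory write buffers
        }
    }

    // registered via: props.put(StreamsConfig.ROCKSDB_CONFIG_SETTER_CLASS_CONFIG, CustomRocksDBConfig.class);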
- setException(KafkaException) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- setGlobalStateRestoreListener(StateRestoreListener) - Method in class org.apache.kafka.streams.KafkaStreams
-
Set the listener which is triggered whenever a StateStore is being restored in order to resume processing.
- setStateListener(KafkaStreams.StateListener) - Method in class org.apache.kafka.streams.KafkaStreams
-
- setUncaughtExceptionHandler(Thread.UncaughtExceptionHandler) - Method in class org.apache.kafka.streams.KafkaStreams
-
Set the handler invoked when an internal thread abruptly terminates due to an uncaught exception.
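The KafkaStreams listener and handler setters above are typically registered before start(); a small sketch, assuming an existing StreamsBuilder named builder and a Properties object named props (the log destinations are placeholders):

    KafkaStreams streams = new KafkaStreams(builder.build(), props);
    streams.setUncaughtExceptionHandler((thread, throwable) ->
            System.err.println("Stream thread " + thread.getName() + " died: " + throwable));
    streams.setStateListener((newState, oldState) ->
            System.out.println("KafkaStreams state: " + oldState + " -> " + newState));
    streams.start();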
- Short() - Static method in class org.apache.kafka.common.serialization.Serdes
-
- ShortDeserializer - Class in org.apache.kafka.common.serialization
-
- ShortDeserializer() - Constructor for class org.apache.kafka.common.serialization.ShortDeserializer
-
- ShortSerde() - Constructor for class org.apache.kafka.common.serialization.Serdes.ShortSerde
-
- ShortSerializer - Class in org.apache.kafka.common.serialization
-
- ShortSerializer() - Constructor for class org.apache.kafka.common.serialization.ShortSerializer
-
- shouldListInternal() - Method in class org.apache.kafka.clients.admin.ListTopicsOptions
-
Return true if we should list internal topics.
- shouldValidateOnly() - Method in class org.apache.kafka.clients.admin.AlterConfigsOptions
-
Return true if the request should be validated without altering the configs.
- shouldValidateOnly() - Method in class org.apache.kafka.clients.admin.CreateTopicsOptions
-
Return true if the request should be validated without creating the topic.
- SinkConnector - Class in org.apache.kafka.connect.sink
-
SinkConnectors implement the Connector interface to send Kafka data to another system.
- SinkConnector() - Constructor for class org.apache.kafka.connect.sink.SinkConnector
-
- SinkRecord - Class in org.apache.kafka.connect.sink
-
SinkRecord is a ConnectRecord that has been read from Kafka and includes the kafkaOffset of the record in the Kafka topic-partition in addition to the standard fields.
- SinkRecord(String, int, Schema, Object, Schema, Object, long) - Constructor for class org.apache.kafka.connect.sink.SinkRecord
-
- SinkRecord(String, int, Schema, Object, Schema, Object, long, Long, TimestampType) - Constructor for class org.apache.kafka.connect.sink.SinkRecord
-
- SinkTask - Class in org.apache.kafka.connect.sink
-
SinkTask is a Task that takes records loaded from Kafka and sends them to another system.
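A bare-bones SinkTask sketch that only prints records; LoggingSinkTask is a made-up name, and a real task would write to the downstream system and usually override flush() as well.

    import java.util.Collection;
    import java.util.Map;
    import org.apache.kafka.connect.sink.SinkRecord;
    import org.apache.kafka.connect.sink.SinkTask;

    public class LoggingSinkTask extends SinkTask {
        @Override
        public String version() {
            return "0.1.0";
        }

        @Override
        public void start(Map<String, String> props) {
            // open connections to the target system here
        }

        @Override
        public void put(Collection<SinkRecord> records) {
            for (SinkRecord record : records) {
                System.out.println(record.topic() + "-" + record.kafkaPartition()
                        + "@" + record.kafkaOffset() + ": " + record.value());
            }
        }

        @Override
        public void stop() {
            // release any resources acquired in start()
        }
    }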
- SinkTask() - Constructor for class org.apache.kafka.connect.sink.SinkTask
-
- SinkTaskContext - Interface in org.apache.kafka.connect.sink
-
Context passed to SinkTasks, allowing them to access utilities in the Kafka Connect runtime.
- sinkTopics - Variable in class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
Deprecated.
- size() - Method in class org.apache.kafka.streams.kstream.JoinWindows
-
- size() - Method in class org.apache.kafka.streams.kstream.TimeWindows
-
- size() - Method in class org.apache.kafka.streams.kstream.UnlimitedWindows
-
Return the size of the specified windows in milliseconds.
- size() - Method in class org.apache.kafka.streams.kstream.Windows
-
Return the size of the specified windows in milliseconds.
- sizeMs - Variable in class org.apache.kafka.streams.kstream.TimeWindows
-
The size of the windows in milliseconds.
- source() - Method in interface org.apache.kafka.streams.TopologyDescription.GlobalStore
-
The source node reading from a "global" topic.
- SourceConnector - Class in org.apache.kafka.connect.source
-
SourceConnectors implement the connector interface to pull data from another system and send
it to Kafka.
- SourceConnector() - Constructor for class org.apache.kafka.connect.source.SourceConnector
-
- sourceOffset() - Method in class org.apache.kafka.connect.source.SourceRecord
-
- sourcePartition() - Method in class org.apache.kafka.connect.source.SourceRecord
-
- SourceRecord - Class in org.apache.kafka.connect.source
-
SourceRecords are generated by SourceTasks and passed to Kafka Connect for storage in
Kafka.
- SourceRecord(Map<String, ?>, Map<String, ?>, String, Integer, Schema, Object) - Constructor for class org.apache.kafka.connect.source.SourceRecord
-
- SourceRecord(Map<String, ?>, Map<String, ?>, String, Schema, Object) - Constructor for class org.apache.kafka.connect.source.SourceRecord
-
- SourceRecord(Map<String, ?>, Map<String, ?>, String, Schema, Object, Schema, Object) - Constructor for class org.apache.kafka.connect.source.SourceRecord
-
- SourceRecord(Map<String, ?>, Map<String, ?>, String, Integer, Schema, Object, Schema, Object) - Constructor for class org.apache.kafka.connect.source.SourceRecord
-
- SourceRecord(Map<String, ?>, Map<String, ?>, String, Integer, Schema, Object, Schema, Object, Long) - Constructor for class org.apache.kafka.connect.source.SourceRecord
-
- SourceTask - Class in org.apache.kafka.connect.source
-
SourceTask is a Task that pulls records from another system for storage in Kafka.
- SourceTask() - Constructor for class org.apache.kafka.connect.source.SourceTask
-
- SourceTaskContext - Interface in org.apache.kafka.connect.source
-
SourceTaskContext is provided to SourceTasks to allow them to interact with the underlying
runtime.
- sourceTopicPattern() - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
NOTE this function is not needed by developers working with the processor APIs; it is only used for the high-level DSL parsing functionality.
- sourceTopics - Variable in class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
Deprecated.
- SSL_CIPHER_SUITES_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_CIPHER_SUITES_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_CLIENT_AUTH_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_CLIENT_AUTH_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_ENABLED_PROTOCOLS_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_ENABLED_PROTOCOLS_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_ENDPOINT_IDENTIFICATION_ALGORITHM_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_ENDPOINT_IDENTIFICATION_ALGORITHM_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEY_PASSWORD_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEY_PASSWORD_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEYMANAGER_ALGORITHM_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEYMANAGER_ALGORITHM_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEYSTORE_LOCATION_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEYSTORE_LOCATION_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEYSTORE_PASSWORD_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEYSTORE_PASSWORD_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEYSTORE_TYPE_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_KEYSTORE_TYPE_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_PROTOCOL_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_PROTOCOL_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_PROVIDER_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_PROVIDER_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_SECURE_RANDOM_IMPLEMENTATION_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_SECURE_RANDOM_IMPLEMENTATION_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_TRUSTMANAGER_ALGORITHM_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_TRUSTMANAGER_ALGORITHM_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_TRUSTSTORE_LOCATION_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_TRUSTSTORE_LOCATION_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_TRUSTSTORE_PASSWORD_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_TRUSTSTORE_PASSWORD_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_TRUSTSTORE_TYPE_CONFIG - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SSL_TRUSTSTORE_TYPE_DOC - Static variable in class org.apache.kafka.common.config.SslConfigs
-
- SslAuthenticationContext - Class in org.apache.kafka.common.security.auth
-
- SslAuthenticationContext(SSLSession, InetAddress) - Constructor for class org.apache.kafka.common.security.auth.SslAuthenticationContext
-
- SslAuthenticationException - Exception in org.apache.kafka.common.errors
-
This exception indicates that the SSL handshake has failed.
- SslAuthenticationException(String) - Constructor for exception org.apache.kafka.common.errors.SslAuthenticationException
-
- SslAuthenticationException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.SslAuthenticationException
-
- SslConfigs - Class in org.apache.kafka.common.config
-
- SslConfigs() - Constructor for class org.apache.kafka.common.config.SslConfigs
-
- standbyTasks() - Method in class org.apache.kafka.streams.processor.ThreadMetadata
-
- start(Map<String, String>) - Method in class org.apache.kafka.connect.connector.Connector
-
Start this Connector.
- start(Map<String, String>) - Method in interface org.apache.kafka.connect.connector.Task
-
Start the Task
- start(Map<String, String>) - Method in class org.apache.kafka.connect.sink.SinkTask
-
Start the Task.
- start(Map<String, String>) - Method in class org.apache.kafka.connect.source.SourceTask
-
Start the Task.
- start() - Method in class org.apache.kafka.streams.KafkaStreams
-
Start the KafkaStreams instance by starting all its threads.
- start() - Method in class org.apache.kafka.streams.kstream.Window
-
Return the start timestamp of this window.
- startMs - Variable in class org.apache.kafka.streams.kstream.UnlimitedWindows
-
The start timestamp of the window.
- startMs - Variable in class org.apache.kafka.streams.kstream.Window
-
- startOn(long) - Method in class org.apache.kafka.streams.kstream.UnlimitedWindows
-
Return a new unlimited window for the specified start timestamp.
- state() - Method in class org.apache.kafka.streams.KafkaStreams
-
- STATE_CLEANUP_DELAY_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
state.cleanup.delay.ms
- STATE_DIR_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
state.dir
- stateChangelogTopics - Variable in class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
Deprecated.
- stateDir() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the state directory for the partition.
- StateRestoreCallback - Interface in org.apache.kafka.streams.processor
-
Restoration logic for log-backed state stores upon restart; it takes one record at a time from the logs to apply to the restoring state.
- StateRestoreListener - Interface in org.apache.kafka.streams.processor
-
Class for listening to various states of the restoration process of a StateStore.
- StateSerdes<K,V> - Class in org.apache.kafka.streams.state
-
Factory for creating serializers / deserializers for state stores in Kafka Streams.
- StateSerdes(String, Serde<K>, Serde<V>) - Constructor for class org.apache.kafka.streams.state.StateSerdes
-
Create a context for serialization using the specified serializers and deserializers which
must match the key and value types used as parameters for this object; the state changelog topic
is provided to bind this serde factory to, so that future calls for serialize / deserialize do not
need to provide the topic name any more.
- StateStore - Interface in org.apache.kafka.streams.processor
-
A storage engine for managing state maintained by a stream processor.
- stateStoreNames() - Method in class org.apache.kafka.streams.state.StreamsMetadata
-
- stateStoreNameToSourceTopics() - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
NOTE this function is not needed by developers working with the processor APIs; it is only used for the high-level DSL parsing functionality.
- StateStoreSupplier<T extends StateStore> - Interface in org.apache.kafka.streams.processor
-
- StickyAssignor - Class in org.apache.kafka.clients.consumer
-
The sticky assignor serves two purposes.
- StickyAssignor() - Constructor for class org.apache.kafka.clients.consumer.StickyAssignor
-
- stop() - Method in class org.apache.kafka.connect.connector.Connector
-
Stop this connector.
- stop() - Method in interface org.apache.kafka.connect.connector.Task
-
Stop this task.
- stop() - Method in class org.apache.kafka.connect.sink.SinkTask
-
Perform any cleanup to stop this task.
- stop() - Method in class org.apache.kafka.connect.source.SourceTask
-
Signal this SourceTask to stop.
- store(String, QueryableStoreType<T>) - Method in class org.apache.kafka.streams.KafkaStreams
-
Get a facade wrapping the local StateStore instances with the provided storeName if the Store's type is accepted by the provided queryableStoreType.
- StoreBuilder<T extends StateStore> - Interface in org.apache.kafka.streams.state
-
Build a StateStore wrapped with optional caching and logging.
- StoreFactory() - Constructor for class org.apache.kafka.streams.state.Stores.StoreFactory
-
- storeName - Variable in class org.apache.kafka.streams.kstream.Materialized
-
- Stores - Class in org.apache.kafka.streams.state
-
Factory for creating state stores in Kafka Streams.
- Stores() - Constructor for class org.apache.kafka.streams.state.Stores
-
- stores() - Method in interface org.apache.kafka.streams.TopologyDescription.Processor
-
The names of all connected stores.
- Stores.InMemoryKeyValueFactory<K,V> - Interface in org.apache.kafka.streams.state
-
Deprecated.
- Stores.KeyValueFactory<K,V> - Interface in org.apache.kafka.streams.state
-
- Stores.PersistentKeyValueFactory<K,V> - Interface in org.apache.kafka.streams.state
-
Deprecated.
- Stores.StoreFactory - Class in org.apache.kafka.streams.state
-
- Stores.ValueFactory<K> - Class in org.apache.kafka.streams.state
-
The factory for creating off-heap key-value stores.
- storeSupplier - Variable in class org.apache.kafka.streams.kstream.Materialized
-
- StoreSupplier<T extends StateStore> - Interface in org.apache.kafka.streams.state
-
A state store supplier which can create one or more StateStore instances.
- stream(String...) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topics.
- stream(TopologyBuilder.AutoOffsetReset, String...) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topics.
- stream(Pattern) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topic pattern.
- stream(TopologyBuilder.AutoOffsetReset, Pattern) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topic pattern.
- stream(Serde<K>, Serde<V>, String...) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topics.
- stream(TopologyBuilder.AutoOffsetReset, Serde<K>, Serde<V>, String...) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topics.
- stream(TimestampExtractor, Serde<K>, Serde<V>, String...) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topics.
- stream(TopologyBuilder.AutoOffsetReset, TimestampExtractor, Serde<K>, Serde<V>, String...) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topics.
- stream(Serde<K>, Serde<V>, Pattern) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topic pattern.
- stream(TopologyBuilder.AutoOffsetReset, Serde<K>, Serde<V>, Pattern) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topic pattern.
- stream(TimestampExtractor, Serde<K>, Serde<V>, Pattern) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topic pattern.
- stream(TopologyBuilder.AutoOffsetReset, TimestampExtractor, Serde<K>, Serde<V>, Pattern) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KStream
from the specified topic pattern.
- stream(String) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KStream
from the specified topics.
- stream(String, Consumed<K, V>) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KStream
from the specified topics.
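For example, explicit serdes can be supplied through a Consumed instance instead of relying on the configured defaults; the topic name and value type below are hypothetical.

    StreamsBuilder builder = new StreamsBuilder();
    KStream<String, Long> counts =
            builder.stream("example-counts", Consumed.with(Serdes.String(), Serdes.Long()));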
- stream(Collection<String>) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KStream
from the specified topics.
- stream(Collection<String>, Consumed<K, V>) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KStream
from the specified topics.
- stream(Pattern) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KStream
from the specified topic pattern.
- stream(Pattern, Consumed<K, V>) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KStream
from the specified topic pattern.
- STREAM_THREAD_INSTANCE - Static variable in class org.apache.kafka.streams.StreamsConfig.InternalConfig
-
- streamPartitioner(StreamPartitioner<? super K, ? super V>) - Static method in class org.apache.kafka.streams.kstream.Produced
-
Create a Produced instance with provided partitioner.
- StreamPartitioner<K,V> - Interface in org.apache.kafka.streams.processor
-
Determine how records are distributed among the partitions in a Kafka topic.
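A sketch of a custom StreamPartitioner that routes records by a field of the value rather than by the key; the CSV value layout, class name, and topic name are assumptions.

    import org.apache.kafka.streams.processor.StreamPartitioner;

    public class RegionPartitioner implements StreamPartitioner<String, String> {
        @Override
        public Integer partition(String key, String value, int numPartitions) {
            String region = value.split(",")[0];                     // first CSV column is a region code
            return (region.hashCode() & 0x7fffffff) % numPartitions; // non-negative partition index
        }
    }

    // usage: stream.to("orders-by-region", Produced.streamPartitioner(new RegionPartitioner()));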
- StreamsBuilder - Class in org.apache.kafka.streams
-
StreamsBuilder provides the high-level Kafka Streams DSL to specify a Kafka Streams topology.
- StreamsBuilder() - Constructor for class org.apache.kafka.streams.StreamsBuilder
-
- StreamsConfig - Class in org.apache.kafka.streams
-
- StreamsConfig(Map<?, ?>) - Constructor for class org.apache.kafka.streams.StreamsConfig
-
Create a new StreamsConfig
using the given properties.
- StreamsConfig.InternalConfig - Class in org.apache.kafka.streams
-
- StreamsException - Exception in org.apache.kafka.streams.errors
-
- StreamsException(String) - Constructor for exception org.apache.kafka.streams.errors.StreamsException
-
- StreamsException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.StreamsException
-
- StreamsException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.StreamsException
-
- StreamsMetadata - Class in org.apache.kafka.streams.state
-
Represents the state of an instance (process) in a KafkaStreams application.
- StreamsMetadata(HostInfo, Set<String>, Set<TopicPartition>) - Constructor for class org.apache.kafka.streams.state.StreamsMetadata
-
- StreamsMetrics - Interface in org.apache.kafka.streams
-
The Kafka Streams metrics interface for adding metric sensors and collecting metric values.
- String() - Static method in class org.apache.kafka.common.serialization.Serdes
-
- string() - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- STRING_SCHEMA - Static variable in interface org.apache.kafka.connect.data.Schema
-
- StringConverter - Class in org.apache.kafka.connect.storage
-
Converter implementation that only supports serializing to strings.
- StringConverter() - Constructor for class org.apache.kafka.connect.storage.StringConverter
-
- StringDeserializer - Class in org.apache.kafka.common.serialization
-
String encoding defaults to UTF8 and can be customized by setting the property key.deserializer.encoding,
value.deserializer.encoding or deserializer.encoding.
- StringDeserializer() - Constructor for class org.apache.kafka.common.serialization.StringDeserializer
-
- StringSerde() - Constructor for class org.apache.kafka.common.serialization.Serdes.StringSerde
-
- StringSerializer - Class in org.apache.kafka.common.serialization
-
String encoding defaults to UTF8 and can be customized by setting the property key.serializer.encoding,
value.serializer.encoding or serializer.encoding.
- StringSerializer() - Constructor for class org.apache.kafka.common.serialization.StringSerializer
-
- struct() - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
- Struct - Class in org.apache.kafka.connect.data
-
A structured record containing a set of named fields with values, each field using an independent Schema.
- Struct(Schema) - Constructor for class org.apache.kafka.connect.data.Struct
-
Create a new Struct for this Schema.
- subscribe(Collection<String>) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- subscribe(Collection<String>, ConsumerRebalanceListener) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- subscribe(Pattern, ConsumerRebalanceListener) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- subscribe(Pattern) - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- subscribe(Collection<String>, ConsumerRebalanceListener) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Subscribe to the given list of topics to get dynamically
assigned partitions.
- subscribe(Collection<String>) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Subscribe to the given list of topics to get dynamically assigned partitions.
- subscribe(Pattern, ConsumerRebalanceListener) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Subscribe to all topics matching specified pattern to get dynamically assigned partitions.
- subscribe(Pattern) - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Subscribe to all topics matching specified pattern to get dynamically assigned partitions.
- subscribe(Collection<String>) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- subscribe(Pattern, ConsumerRebalanceListener) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- subscribe(Pattern) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- subscribe(Collection<String>, ConsumerRebalanceListener) - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- subscription() - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- subscription() - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Get the current subscription.
- subscription() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- subscription(Set<String>) - Method in class org.apache.kafka.clients.consumer.StickyAssignor
-
- subscriptionUpdates() - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
NOTE this function is not needed by developers working with the processor APIs; it is only used for the high-level DSL parsing functionality.
- subtopologies() - Method in interface org.apache.kafka.streams.TopologyDescription
-
All sub-topologies of the represented topology.
- successors() - Method in interface org.apache.kafka.streams.TopologyDescription.Node
-
The successor of this node within a sub-topology.
- table(String, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(String, StateStoreSupplier<KeyValueStore>) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TopologyBuilder.AutoOffsetReset, String, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TopologyBuilder.AutoOffsetReset, String, StateStoreSupplier<KeyValueStore>) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TopologyBuilder.AutoOffsetReset, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TimestampExtractor, String, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TopologyBuilder.AutoOffsetReset, TimestampExtractor, String, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(Serde<K>, Serde<V>, String, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(Serde<K>, Serde<V>, String, StateStoreSupplier<KeyValueStore>) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(Serde<K>, Serde<V>, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TopologyBuilder.AutoOffsetReset, Serde<K>, Serde<V>, String, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TimestampExtractor, Serde<K>, Serde<V>, String, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TopologyBuilder.AutoOffsetReset, TimestampExtractor, Serde<K>, Serde<V>, String, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TopologyBuilder.AutoOffsetReset, Serde<K>, Serde<V>, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(TopologyBuilder.AutoOffsetReset, TimestampExtractor, Serde<K>, Serde<V>, String, StateStoreSupplier<KeyValueStore>) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Deprecated.
Create a
KTable
for the specified topic.
- table(String, Consumed<K, V>, Materialized<K, V, KeyValueStore<Bytes, byte[]>>) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KTable
for the specified topic.
- table(String) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KTable
for the specified topic.
- table(String, Consumed<K, V>) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KTable
for the specified topic.
- table(String, Materialized<K, V, KeyValueStore<Bytes, byte[]>>) - Method in class org.apache.kafka.streams.StreamsBuilder
-
Create a
KTable
for the specified topic.
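A sketch of materializing a table into a named, queryable store; the topic and store names are made up, and the imports for Bytes and KeyValueStore are omitted.

    KTable<String, String> users = builder.table(
            "user-profiles",
            Consumed.with(Serdes.String(), Serdes.String()),
            Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("user-profiles-store"));

    // after streams.start(), the store can be queried interactively, e.g.:
    // ReadOnlyKeyValueStore<String, String> store =
    //         streams.store("user-profiles-store", QueryableStoreTypes.keyValueStore());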
- tags() - Method in class org.apache.kafka.common.MetricName
-
- tags() - Method in class org.apache.kafka.common.MetricNameTemplate
-
Get the set of tag names for the metric.
- Task - Interface in org.apache.kafka.connect.connector
-
Tasks contain the code that actually copies data to/from another system.
- TaskAssignmentException - Exception in org.apache.kafka.streams.errors
-
Indicates a run time error incurred while trying to assign stream tasks to threads.
- TaskAssignmentException(String) - Constructor for exception org.apache.kafka.streams.errors.TaskAssignmentException
-
- TaskAssignmentException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.TaskAssignmentException
-
- TaskAssignmentException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.TaskAssignmentException
-
- taskClass() - Method in class org.apache.kafka.connect.connector.Connector
-
Returns the Task implementation for this Connector.
- taskConfigs(int) - Method in class org.apache.kafka.connect.connector.Connector
-
Returns a set of configurations for Tasks based on the current configuration,
producing at most count configurations.
- taskId() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the task id
- TaskId - Class in org.apache.kafka.streams.processor
-
The task ID representation composed as topic group ID plus the assigned partition ID.
- TaskId(int, int) - Constructor for class org.apache.kafka.streams.processor.TaskId
-
- taskId() - Method in class org.apache.kafka.streams.processor.TaskMetadata
-
- TaskIdFormatException - Exception in org.apache.kafka.streams.errors
-
Indicates a run time error incurred while trying to parse the task id from the read string.
- TaskIdFormatException(String) - Constructor for exception org.apache.kafka.streams.errors.TaskIdFormatException
-
- TaskIdFormatException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.TaskIdFormatException
-
- TaskIdFormatException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.TaskIdFormatException
-
- TaskMetadata - Class in org.apache.kafka.streams.processor
-
Represents the state of a single task running within a KafkaStreams application.
- TaskMetadata(String, Set<TopicPartition>) - Constructor for class org.apache.kafka.streams.processor.TaskMetadata
-
- TaskMigratedException - Exception in org.apache.kafka.streams.errors
-
Indicates that a task got migrated to another thread.
- TaskMigratedException(Task) - Constructor for exception org.apache.kafka.streams.errors.TaskMigratedException
-
- TaskMigratedException(Task, TopicPartition, long, long) - Constructor for exception org.apache.kafka.streams.errors.TaskMigratedException
-
- TaskMigratedException(Task, Throwable) - Constructor for exception org.apache.kafka.streams.errors.TaskMigratedException
-
- test(K, V) - Method in interface org.apache.kafka.streams.kstream.Predicate
-
Test if the record with the given key and value satisfies the predicate.
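Predicate is a single-method interface, so it is usually supplied as a lambda; for example, given a KStream<String, Long> named counts (the threshold is illustrative):

    KStream<String, Long> largeCounts =
            counts.filter((word, count) -> count != null && count >= 1000L);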
- thenApply(KafkaFuture.Function<T, R>) - Method in class org.apache.kafka.common.KafkaFuture
-
Returns a new KafkaFuture that, when this future completes normally, is executed with this future's result as the argument to the supplied function.
- ThreadMetadata - Class in org.apache.kafka.streams.processor
-
Represents the state of a single thread running within a KafkaStreams application.
- ThreadMetadata(String, String, Set<TaskMetadata>, Set<TaskMetadata>) - Constructor for class org.apache.kafka.streams.processor.ThreadMetadata
-
- threadName() - Method in class org.apache.kafka.streams.processor.ThreadMetadata
-
- threadState() - Method in class org.apache.kafka.streams.processor.ThreadMetadata
-
- through(String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic and create a new KStream from the topic using default serializers and deserializers and the producer's DefaultPartitioner.
- through(StreamPartitioner<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- through(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- through(Serde<K>, Serde<V>, StreamPartitioner<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- through(String, Produced<K, V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- through(String, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(String, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(StreamPartitioner<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(StreamPartitioner<? super K, ? super V>, String, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(StreamPartitioner<? super K, ? super V>, String, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(Serde<K>, Serde<V>, String, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(Serde<K>, Serde<V>, String, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(Serde<K>, Serde<V>, StreamPartitioner<? super K, ? super V>, String, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(Serde<K>, Serde<V>, StreamPartitioner<? super K, ? super V>, String, StateStoreSupplier<KeyValueStore>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- through(Serde<K>, Serde<V>, StreamPartitioner<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- Time - Class in org.apache.kafka.connect.data
-
A time representing a specific point in a day, not tied to any specific date.
- Time() - Constructor for class org.apache.kafka.connect.data.Time
-
- timeout(long) - Method in interface org.apache.kafka.connect.sink.SinkTaskContext
-
Set the timeout in milliseconds.
- TimeoutException - Exception in org.apache.kafka.common.errors
-
Indicates that a request timed out.
- TimeoutException() - Constructor for exception org.apache.kafka.common.errors.TimeoutException
-
- TimeoutException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.TimeoutException
-
- TimeoutException(String) - Constructor for exception org.apache.kafka.common.errors.TimeoutException
-
- TimeoutException(Throwable) - Constructor for exception org.apache.kafka.common.errors.TimeoutException
-
- timeoutMs - Variable in class org.apache.kafka.clients.admin.AbstractOptions
-
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.AbstractOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs() - Method in class org.apache.kafka.clients.admin.AbstractOptions
-
The request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
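For instance, a per-request timeout can be set on any of the option classes below; the 30-second value and the pre-existing AdminClient instance are assumptions, and get() blocks and may throw checked exceptions, so a surrounding method declaring them is assumed.

    DescribeClusterResult cluster =
            adminClient.describeCluster(new DescribeClusterOptions().timeoutMs(30000));
    System.out.println("brokers: " + cluster.nodes().get());   // blocks until the result or the timeout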
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.AlterConfigsOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.CreateAclsOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.CreateTopicsOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.DeleteAclsOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.DeleteTopicsOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.DescribeAclsOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.DescribeClusterOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.DescribeConfigsOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.DescribeTopicsOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timeoutMs(Integer) - Method in class org.apache.kafka.clients.admin.ListTopicsOptions
-
Set the request timeout in milliseconds for this operation or null
if the default request timeout for the
AdminClient should be used.
- timestamp() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
The timestamp of this record
- timestamp() - Method in class org.apache.kafka.clients.consumer.OffsetAndTimestamp
-
- timestamp() - Method in class org.apache.kafka.clients.producer.ProducerRecord
-
- timestamp() - Method in class org.apache.kafka.clients.producer.RecordMetadata
-
The timestamp of the record in the topic/partition.
- timestamp() - Method in class org.apache.kafka.connect.connector.ConnectRecord
-
- Timestamp - Class in org.apache.kafka.connect.data
-
A timestamp representing an absolute time, without timezone information.
- Timestamp() - Constructor for class org.apache.kafka.connect.data.Timestamp
-
- timestamp() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the current timestamp.
- TIMESTAMP_EXTRACTOR_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
- timestampExtractor - Variable in class org.apache.kafka.streams.Consumed
-
- TimestampExtractor - Interface in org.apache.kafka.streams.processor
-
An interface that allows the Kafka Streams framework to extract a timestamp from an instance of
ConsumerRecord
.
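A hedged sketch of a custom TimestampExtractor that prefers an event time embedded in the value and falls back to the record's own timestamp; the Long-valued payload and the class name are assumptions.

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.streams.processor.TimestampExtractor;

    public class PayloadTimestampExtractor implements TimestampExtractor {
        @Override
        public long extract(ConsumerRecord<Object, Object> record, long previousTimestamp) {
            if (record.value() instanceof Long) {
                return (Long) record.value();          // event time carried in the payload
            }
            return record.timestamp();                 // fall back to the record timestamp
        }
    }

    // registered via the extractor config listed below, e.g.:
    // props.put(StreamsConfig.TIMESTAMP_EXTRACTOR_CLASS_CONFIG, PayloadTimestampExtractor.class);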
- timestampType() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
The timestamp type of this record
- timestampType() - Method in class org.apache.kafka.connect.sink.SinkRecord
-
- TimeWindowedKStream<K,V> - Interface in org.apache.kafka.streams.kstream
-
TimeWindowedKStream is an abstraction of a windowed record stream of KeyValue pairs.
- TimeWindows - Class in org.apache.kafka.streams.kstream
-
The fixed-size time-based window specifications used for aggregations.
- to(String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic using default serializers specified in the config and the producer's DefaultPartitioner.
- to(StreamPartitioner<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- to(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- to(Serde<K>, Serde<V>, StreamPartitioner<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- to(String, Produced<K, V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic using the provided Produced instance.
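For example, given a KStream<String, Long> named wordCounts (topic name and types are illustrative), explicit serdes can be attached via Produced:

    wordCounts.to("word-counts-output", Produced.with(Serdes.String(), Serdes.Long()));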
- to(String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- to(StreamPartitioner<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- to(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- to(Serde<K>, Serde<V>, StreamPartitioner<? super K, ? super V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- toArray() - Method in interface org.apache.kafka.common.header.Headers
-
Returns all headers as an array, in the order they were added in.
- toConnectData(String, byte[]) - Method in interface org.apache.kafka.connect.storage.Converter
-
Convert a native object to a Kafka Connect data object.
- toConnectData(String, byte[]) - Method in class org.apache.kafka.connect.storage.StringConverter
-
- toEnrichedRst() - Method in class org.apache.kafka.common.config.ConfigDef
-
Configs with new metadata (group, orderInGroup, dependents) formatted with reStructuredText, suitable for embedding in Sphinx
documentation.
- toFile(String) - Static method in class org.apache.kafka.streams.kstream.Printed
-
Print the records of a KStream to a file.
- toFilter() - Method in class org.apache.kafka.common.acl.AccessControlEntry
-
Create a filter which matches only this AccessControlEntry.
- toFilter() - Method in class org.apache.kafka.common.acl.AclBinding
-
Create a filter which matches only this AclBinding.
- toFilter() - Method in class org.apache.kafka.common.resource.Resource
-
Create a filter which matches only this Resource.
- toHtmlTable() - Method in class org.apache.kafka.common.config.ConfigDef
-
- toLogical(Schema, int) - Static method in class org.apache.kafka.connect.data.Date
-
- toLogical(Schema, byte[]) - Static method in class org.apache.kafka.connect.data.Decimal
-
- toLogical(Schema, int) - Static method in class org.apache.kafka.connect.data.Time
-
- toLogical(Schema, long) - Static method in class org.apache.kafka.connect.data.Timestamp
-
- topic() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
The topic this record is received from
- topic() - Method in class org.apache.kafka.clients.producer.ProducerRecord
-
- topic() - Method in class org.apache.kafka.clients.producer.RecordMetadata
-
The topic the record was appended to
- topic() - Method in class org.apache.kafka.common.PartitionInfo
-
The topic name
- topic() - Method in class org.apache.kafka.common.TopicPartition
-
- topic() - Method in class org.apache.kafka.common.TopicPartitionReplica
-
- topic() - Method in class org.apache.kafka.connect.connector.ConnectRecord
-
- topic() - Method in class org.apache.kafka.server.policy.CreateTopicPolicy.RequestMetadata
-
Return the name of the topic to create.
- topic() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the topic name of the current input record; could be null if it is not
available (for example, if this method is invoked from the punctuate call)
- topic() - Method in class org.apache.kafka.streams.state.StateSerdes
-
Return the topic.
- topic() - Method in interface org.apache.kafka.streams.TopologyDescription.Sink
-
The topic name this sink node is writing to.
- TOPIC_PREFIX - Static variable in class org.apache.kafka.streams.StreamsConfig
-
Prefix used to provide default topic configs to be applied when creating internal topics.
- TopicAuthorizationException - Exception in org.apache.kafka.common.errors
-
- TopicAuthorizationException(Set<String>) - Constructor for exception org.apache.kafka.common.errors.TopicAuthorizationException
-
- TopicAuthorizationException(String) - Constructor for exception org.apache.kafka.common.errors.TopicAuthorizationException
-
- TopicConfig - Class in org.apache.kafka.common.config
-
Keys that can be used to configure a topic.
- TopicConfig() - Constructor for class org.apache.kafka.common.config.TopicConfig
-
- topicConfig - Variable in class org.apache.kafka.streams.kstream.Materialized
-
- TopicDescription - Class in org.apache.kafka.clients.admin
-
A detailed description of a single topic in the cluster.
- TopicDescription(String, boolean, List<TopicPartitionInfo>) - Constructor for class org.apache.kafka.clients.admin.TopicDescription
-
Create an instance with the specified parameters.
- TopicExistsException - Exception in org.apache.kafka.common.errors
-
- TopicExistsException(String) - Constructor for exception org.apache.kafka.common.errors.TopicExistsException
-
- TopicExistsException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.TopicExistsException
-
- topicGroupId - Variable in class org.apache.kafka.streams.processor.TaskId
-
The ID of the topic group.
- topicGroups() - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Returns the map of topic groups keyed by the group id.
- TopicListing - Class in org.apache.kafka.clients.admin
-
A listing of a topic in the cluster.
- TopicListing(String, boolean) - Constructor for class org.apache.kafka.clients.admin.TopicListing
-
Create an instance with the specified parameters.
- TopicPartition - Class in org.apache.kafka.common
-
A topic name and partition number
- TopicPartition(String, int) - Constructor for class org.apache.kafka.common.TopicPartition
-
- TopicPartitionInfo - Class in org.apache.kafka.common
-
A class containing leadership, replicas and ISR information for a topic partition.
- TopicPartitionInfo(int, Node, List<Node>, List<Node>) - Constructor for class org.apache.kafka.common.TopicPartitionInfo
-
Create an instance of this class with the provided parameters.
- TopicPartitionReplica - Class in org.apache.kafka.common
-
The topic name, partition number and the brokerId of the replica
- TopicPartitionReplica(String, int, int) - Constructor for class org.apache.kafka.common.TopicPartitionReplica
-
- topicPartitions() - Method in class org.apache.kafka.streams.processor.TaskMetadata
-
- topicPartitions() - Method in class org.apache.kafka.streams.state.StreamsMetadata
-
- topicPrefix(String) - Static method in class org.apache.kafka.streams.StreamsConfig
-
Prefix a property with StreamsConfig.TOPIC_PREFIX used to provide default topic configs to be applied when creating internal topics.
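A short sketch of applying a default to every internal topic the application creates; the application id, broker address, and 64 MB segment size are arbitrary examples.

    Properties props = new Properties();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "example-app");
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    // every repartition/changelog topic created by this app gets segment.bytes = 64 MB
    props.put(StreamsConfig.topicPrefix(TopicConfig.SEGMENT_BYTES_CONFIG), 64 * 1024 * 1024);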
- topics() - Method in class org.apache.kafka.common.Cluster
-
Get all topics.
- topics() - Method in interface org.apache.kafka.streams.TopologyDescription.Source
-
The topic names this source node is reading from.
- TOPICS_CONFIG - Static variable in class org.apache.kafka.connect.sink.SinkConnector
-
Configuration key for the list of input topics for this connector.
- TOPICS_CONFIG - Static variable in class org.apache.kafka.connect.sink.SinkTask
-
The configuration key that provides the list of topics that are inputs for this
SinkTask.
- TopicsInfo(Set<String>, Set<String>, Map<String, InternalTopicConfig>, Map<String, InternalTopicConfig>) - Constructor for class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
Deprecated.
- Topology - Class in org.apache.kafka.streams
-
A logical representation of a ProcessorTopology.
- Topology() - Constructor for class org.apache.kafka.streams.Topology
-
- Topology.AutoOffsetReset - Enum in org.apache.kafka.streams
-
- TopologyBuilder - Class in org.apache.kafka.streams.processor
-
- TopologyBuilder() - Constructor for class org.apache.kafka.streams.processor.TopologyBuilder
-
Deprecated.
Create a new builder.
- TopologyBuilder.AutoOffsetReset - Enum in org.apache.kafka.streams.processor
-
Deprecated.
Enum used to define auto offset reset policy when creating KStream or KTable.
- TopologyBuilder.TopicsInfo - Class in org.apache.kafka.streams.processor
-
Deprecated.
NOTE this class is not needed by developers working with the processor APIs; it is only used for internal functionality.
- TopologyBuilderException - Exception in org.apache.kafka.streams.errors
-
- TopologyBuilderException(String) - Constructor for exception org.apache.kafka.streams.errors.TopologyBuilderException
-
Deprecated.
- TopologyBuilderException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.TopologyBuilderException
-
Deprecated.
- TopologyBuilderException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.TopologyBuilderException
-
Deprecated.
- TopologyDescription - Interface in org.apache.kafka.streams
-
- TopologyDescription.GlobalStore - Interface in org.apache.kafka.streams
-
- TopologyDescription.Node - Interface in org.apache.kafka.streams
-
A node of a topology.
- TopologyDescription.Processor - Interface in org.apache.kafka.streams
-
A processor node of a topology.
- TopologyDescription.Sink - Interface in org.apache.kafka.streams
-
A sink node of a topology.
- TopologyDescription.Source - Interface in org.apache.kafka.streams
-
A source node of a topology.
- TopologyDescription.Subtopology - Interface in org.apache.kafka.streams
-
- TopologyException - Exception in org.apache.kafka.streams.errors
-
Indicates a pre-runtime error occurred while parsing the logical topology to construct the physical processor topology.
- TopologyException(String) - Constructor for exception org.apache.kafka.streams.errors.TopologyException
-
- TopologyException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.TopologyException
-
- TopologyException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.TopologyException
-
- toRst() - Method in class org.apache.kafka.common.config.ConfigDef
-
Get the configs formatted with reStructuredText, suitable for embedding in Sphinx
documentation.
- toStream() - Method in interface org.apache.kafka.streams.kstream.KTable
-
Convert this changelog stream to a KStream.
- toStream(KeyValueMapper<? super K, ? super V, ? extends KR>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- toString() - Method in class org.apache.kafka.clients.admin.Config
-
- toString() - Method in class org.apache.kafka.clients.admin.ConfigEntry
-
- toString() - Method in class org.apache.kafka.clients.admin.DescribeReplicaLogDirsResult.ReplicaLogDirInfo
-
- toString() - Method in class org.apache.kafka.clients.admin.NewPartitions
-
- toString() - Method in class org.apache.kafka.clients.admin.NewTopic
-
- toString() - Method in class org.apache.kafka.clients.admin.TopicDescription
-
- toString() - Method in class org.apache.kafka.clients.admin.TopicListing
-
- toString() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
- toString() - Method in class org.apache.kafka.clients.consumer.OffsetAndMetadata
-
- toString() - Method in class org.apache.kafka.clients.consumer.OffsetAndTimestamp
-
- toString() - Method in class org.apache.kafka.clients.producer.ProducerRecord
-
- toString() - Method in class org.apache.kafka.clients.producer.RecordMetadata
-
- toString() - Method in class org.apache.kafka.common.acl.AccessControlEntry
-
- toString() - Method in class org.apache.kafka.common.acl.AccessControlEntryFilter
-
- toString() - Method in class org.apache.kafka.common.acl.AclBinding
-
- toString() - Method in class org.apache.kafka.common.acl.AclBindingFilter
-
- toString() - Method in class org.apache.kafka.common.Cluster
-
- toString() - Method in class org.apache.kafka.common.ClusterResource
-
- toString() - Method in class org.apache.kafka.common.config.ConfigDef.NonEmptyString
-
- toString() - Method in class org.apache.kafka.common.config.ConfigDef.Range
-
- toString() - Method in class org.apache.kafka.common.config.ConfigDef.ValidList
-
- toString() - Method in class org.apache.kafka.common.config.ConfigDef.ValidString
-
- toString() - Method in class org.apache.kafka.common.config.ConfigResource
-
- toString() - Method in class org.apache.kafka.common.config.ConfigValue
-
- toString() - Method in class org.apache.kafka.common.MetricName
-
- toString() - Method in class org.apache.kafka.common.MetricNameTemplate
-
- toString() - Method in class org.apache.kafka.common.Node
-
- toString() - Method in class org.apache.kafka.common.PartitionInfo
-
- toString() - Method in class org.apache.kafka.common.resource.Resource
-
- toString() - Method in class org.apache.kafka.common.resource.ResourceFilter
-
- toString() - Method in class org.apache.kafka.common.security.auth.KafkaPrincipal
-
- toString() - Method in class org.apache.kafka.common.TopicPartition
-
- toString() - Method in class org.apache.kafka.common.TopicPartitionInfo
-
- toString() - Method in class org.apache.kafka.common.TopicPartitionReplica
-
- toString() - Method in class org.apache.kafka.connect.connector.ConnectRecord
-
- toString() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- toString() - Method in class org.apache.kafka.connect.data.SchemaAndValue
-
- toString() - Method in class org.apache.kafka.connect.data.Struct
-
- toString() - Method in class org.apache.kafka.connect.sink.SinkRecord
-
- toString() - Method in class org.apache.kafka.connect.source.SourceRecord
-
- toString() - Method in class org.apache.kafka.server.policy.AlterConfigPolicy.RequestMetadata
-
- toString() - Method in class org.apache.kafka.server.policy.CreateTopicPolicy.RequestMetadata
-
- toString() - Method in class org.apache.kafka.streams.KafkaStreams
-
- toString(String) - Method in class org.apache.kafka.streams.KafkaStreams
-
- toString() - Method in class org.apache.kafka.streams.KeyValue
-
- toString() - Method in class org.apache.kafka.streams.kstream.Window
-
- toString() - Method in class org.apache.kafka.streams.kstream.Windowed
-
- toString() - Method in class org.apache.kafka.streams.processor.TaskId
-
- toString() - Method in class org.apache.kafka.streams.processor.TaskMetadata
-
- toString() - Method in class org.apache.kafka.streams.processor.ThreadMetadata
-
- toString() - Method in class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
Deprecated.
- toString() - Method in class org.apache.kafka.streams.state.HostInfo
-
- toString() - Method in class org.apache.kafka.streams.state.StreamsMetadata
-
- toSysOut() - Static method in class org.apache.kafka.streams.kstream.Printed
-
Print the records of a KStream to system out.
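For illustration only (the stream and its types are assumed), a Printed instance is usually passed to KStream.print():

    // Assuming a KStream<String, Long> named counts; prints each record to stdout with a label.
    counts.print(Printed.<String, Long>toSysOut().withLabel("counts"));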
- totalCount() - Method in class org.apache.kafka.clients.admin.NewPartitions
-
The total number of partitions after the operation succeeds.
- TRANSACTION_TIMEOUT_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
transaction.timeout.ms
- TRANSACTION_TIMEOUT_DOC - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
- transactionAborted() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- TRANSACTIONAL_ID_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
transactional.id
- TRANSACTIONAL_ID_DOC - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
- TransactionalIdAuthorizationException - Exception in org.apache.kafka.common.errors
-
- TransactionalIdAuthorizationException(String) - Constructor for exception org.apache.kafka.common.errors.TransactionalIdAuthorizationException
-
- transactionCommitted() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- TransactionCoordinatorFencedException - Exception in org.apache.kafka.common.errors
-
- TransactionCoordinatorFencedException(String) - Constructor for exception org.apache.kafka.common.errors.TransactionCoordinatorFencedException
-
- TransactionCoordinatorFencedException(String, Throwable) - Constructor for exception org.apache.kafka.common.errors.TransactionCoordinatorFencedException
-
- transactionInFlight() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- transactionInitialized() - Method in class org.apache.kafka.clients.producer.MockProducer
-
- transform(TransformerSupplier<? super K, ? super V, KeyValue<K1, V1>>, String...) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Transform each record of the input stream into zero or more records in the output stream (both key and value type
can be altered arbitrarily).
- transform(K, V) - Method in interface org.apache.kafka.streams.kstream.Transformer
-
Transform the record with the given key and value.
- transform(V) - Method in interface org.apache.kafka.streams.kstream.ValueTransformer
-
Transform the given value to a new value.
- Transformation<R extends ConnectRecord<R>> - Interface in org.apache.kafka.connect.transforms
-
Single message transformation for Kafka Connect record types.
- Transformer<K,V,R> - Interface in org.apache.kafka.streams.kstream
-
The Transformer interface for stateful mapping of an input record to zero, one, or multiple new output records (both key and value type can be altered arbitrarily).
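A minimal sketch of a Transformer used via KStream.transform(); the store name "counts-store", the record types, and the fact that the store was added to the topology elsewhere are all assumptions:

    class CountTransformer implements Transformer<String, String, KeyValue<String, Long>> {
        private KeyValueStore<String, Long> store;

        @Override
        @SuppressWarnings("unchecked")
        public void init(ProcessorContext context) {
            // Look up the state store registered under the assumed name.
            store = (KeyValueStore<String, Long>) context.getStateStore("counts-store");
        }

        @Override
        public KeyValue<String, Long> transform(String key, String value) {
            Long previous = store.get(key);
            long updated = previous == null ? 1L : previous + 1;
            store.put(key, updated);
            return KeyValue.pair(key, updated);   // one output record per input record
        }

        @Override
        public KeyValue<String, Long> punctuate(long timestamp) {
            return null;                          // no periodic output (deprecated hook)
        }

        @Override
        public void close() { }
    }

    // Usage sketch: KStream<String, Long> counted = input.transform(CountTransformer::new, "counts-store");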
- TransformerSupplier<K,V,R> - Interface in org.apache.kafka.streams.kstream
-
A TransformerSupplier interface which can create one or more Transformer instances.
- transformValues(ValueTransformerSupplier<? super V, ? extends VR>, String...) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Transform the value of each input record into a new value (with possible new type) of the output record.
- type - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- type() - Method in class org.apache.kafka.common.config.ConfigResource
-
Return the resource type.
- type() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- type() - Method in interface org.apache.kafka.connect.data.Schema
-
- type() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- type(Schema.Type) - Static method in class org.apache.kafka.connect.data.SchemaBuilder
-
Create a SchemaBuilder for the specified type.
- typeOf(String) - Method in class org.apache.kafka.common.config.AbstractConfig
-
- validate(Map<String, String>) - Method in class org.apache.kafka.common.config.ConfigDef
-
Validate the current configuration values with the configuration definition.
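A small, hypothetical example of defining and validating a config (the key name and default are illustrative; java.util and org.apache.kafka.common.config imports omitted):

    ConfigDef def = new ConfigDef()
            .define("max.retries", ConfigDef.Type.INT, 3, ConfigDef.Range.atLeast(0),
                    ConfigDef.Importance.MEDIUM, "Maximum number of retries.");
    // Each returned ConfigValue carries the parsed value plus any validation error messages.
    List<ConfigValue> results = def.validate(Collections.singletonMap("max.retries", "5"));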
- validate(Map<String, String>) - Method in class org.apache.kafka.connect.connector.Connector
-
Validate the connector configuration values against configuration definitions.
- validate() - Method in class org.apache.kafka.connect.data.Struct
-
Validates that this struct has filled in all the necessary data with valid values.
- validate(AlterConfigPolicy.RequestMetadata) - Method in interface org.apache.kafka.server.policy.AlterConfigPolicy
-
Validate the request parameters and throw a PolicyViolationException with a suitable error message if the alter configs request parameters for the provided resource do not satisfy this policy.
- validate(CreateTopicPolicy.RequestMetadata) - Method in interface org.apache.kafka.server.policy.CreateTopicPolicy
-
Validate the request parameters and throw a PolicyViolationException with a suitable error message if the create topics request parameters for the provided topic do not satisfy this policy.
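As an illustrative sketch (the 100-partition limit is arbitrary and the class name is hypothetical), a policy implementation rejects requests by throwing PolicyViolationException:

    public class MaxPartitionsPolicy implements CreateTopicPolicy {
        @Override
        public void configure(Map<String, ?> configs) { }

        @Override
        public void validate(RequestMetadata requestMetadata) throws PolicyViolationException {
            Integer partitions = requestMetadata.numPartitions();
            if (partitions != null && partitions > 100) {
                throw new PolicyViolationException(
                        "Topic " + requestMetadata.topic() + " requests too many partitions");
            }
        }

        @Override
        public void close() { }
    }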
- validateAll(Map<String, String>) - Method in class org.apache.kafka.common.config.ConfigDef
-
- validateOnly(boolean) - Method in class org.apache.kafka.clients.admin.AlterConfigsOptions
-
Set to true if the request should be validated without altering the configs.
- validateOnly() - Method in class org.apache.kafka.clients.admin.CreatePartitionsOptions
-
Return true if the request should be validated without creating new partitions.
- validateOnly(boolean) - Method in class org.apache.kafka.clients.admin.CreatePartitionsOptions
-
Set to true if the request should be validated without creating new partitions.
- validateOnly(boolean) - Method in class org.apache.kafka.clients.admin.CreateTopicsOptions
-
Set to true if the request should be validated without creating the topic.
- validateValue(Schema, Object) - Static method in class org.apache.kafka.connect.data.ConnectSchema
-
Validate that the value can be used with the schema, i.e.
- validateValue(String, Schema, Object) - Static method in class org.apache.kafka.connect.data.ConnectSchema
-
- validateValue(Object) - Method in class org.apache.kafka.connect.data.ConnectSchema
-
Validate that the value can be used for this schema, i.e.
- validator - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- validValues(String, Map<String, Object>) - Method in interface org.apache.kafka.common.config.ConfigDef.Recommender
-
The valid values for the configuration given the current configuration values.
- value() - Method in class org.apache.kafka.clients.admin.ConfigEntry
-
Return the value or null.
- value() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
-
The value
- value() - Method in class org.apache.kafka.clients.producer.ProducerRecord
-
- value() - Method in class org.apache.kafka.common.config.ConfigValue
-
- value(Object) - Method in class org.apache.kafka.common.config.ConfigValue
-
- value() - Method in interface org.apache.kafka.common.header.Header
-
- value() - Method in interface org.apache.kafka.common.Metric
-
- value() - Method in class org.apache.kafka.connect.connector.ConnectRecord
-
- value() - Method in class org.apache.kafka.connect.data.SchemaAndValue
-
- value - Variable in class org.apache.kafka.streams.KeyValue
-
The value of the key-value pair.
- VALUE_DESERIALIZER_CLASS_CONFIG - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
value.deserializer
- VALUE_DESERIALIZER_CLASS_DOC - Static variable in class org.apache.kafka.clients.consumer.ConsumerConfig
-
- VALUE_SERDE_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
- VALUE_SERIALIZER_CLASS_CONFIG - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
value.serializer
- VALUE_SERIALIZER_CLASS_DOC - Static variable in class org.apache.kafka.clients.producer.ProducerConfig
-
- valueDeserializer() - Method in class org.apache.kafka.streams.state.StateSerdes
-
Return the value deserializer.
- ValueFactory() - Constructor for class org.apache.kafka.streams.state.Stores.ValueFactory
-
- valueFrom(byte[]) - Method in class org.apache.kafka.streams.state.StateSerdes
-
Deserialize the value from raw bytes.
- ValueJoiner<V1,V2,VR> - Interface in org.apache.kafka.streams.kstream
-
The ValueJoiner interface for joining two values into a new value of arbitrary type.
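ValueJoiner is a single-method interface, so it is usually supplied as a lambda; the types below are assumptions for illustration:

    // Combine a count (left value) and a price (right value) into a readable string.
    ValueJoiner<Long, Double, String> joiner = (count, price) -> count + " @ " + price;
    // Typically passed to a join, e.g. countsStream.join(pricesTable, joiner).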
- ValueMapper<V,VR> - Interface in org.apache.kafka.streams.kstream
-
The ValueMapper interface for mapping a value to a new value of arbitrary type.
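ValueMapper is likewise a single-method interface, commonly passed to KStream.mapValues(); the stream name and types are assumptions:

    // Assuming a KStream<String, String> named sentences exists in the topology.
    ValueMapper<String, Integer> toLength = value -> value.length();
    KStream<String, Integer> lengths = sentences.mapValues(toLength);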
- valueOf(String) - Static method in enum org.apache.kafka.clients.consumer.OffsetResetStrategy
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.common.acl.AclOperation
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.common.acl.AclPermissionType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.common.config.ConfigDef.Importance
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.common.config.ConfigDef.Type
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.common.config.ConfigDef.Width
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.common.config.ConfigResource.Type
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.common.resource.ResourceType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.common.security.auth.SecurityProtocol
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.connect.data.Schema.Type
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.streams.errors.DeserializationExceptionHandler.DeserializationHandlerResponse
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.streams.KafkaStreams.State
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.streams.processor.PunctuationType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.streams.processor.TopologyBuilder.AutoOffsetReset
-
Deprecated.
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.kafka.streams.Topology.AutoOffsetReset
-
Returns the enum constant of this type with the specified name.
- values() - Method in class org.apache.kafka.clients.admin.AlterConfigsResult
-
Return a map from resources to futures which can be used to check the status of the operation on each resource.
- values() - Method in class org.apache.kafka.clients.admin.AlterReplicaLogDirsResult
-
Return a map from replica to future which can be used to check the status of individual replica movement.
- values() - Method in class org.apache.kafka.clients.admin.CreateAclsResult
-
Return a map from ACL bindings to futures which can be used to check the status of the creation of each ACL
binding.
- values() - Method in class org.apache.kafka.clients.admin.CreatePartitionsResult
-
Return a map from topic names to futures, which can be used to check the status of individual
partition creations.
- values() - Method in class org.apache.kafka.clients.admin.CreateTopicsResult
-
Return a map from topic names to futures, which can be used to check the status of individual
topic creations.
- values() - Method in class org.apache.kafka.clients.admin.DeleteAclsResult.FilterResults
-
Return a list of delete ACLs results for a given filter.
- values() - Method in class org.apache.kafka.clients.admin.DeleteAclsResult
-
Return a map from acl filters to futures which can be used to check the status of the deletions by each
filter.
- values() - Method in class org.apache.kafka.clients.admin.DeleteTopicsResult
-
Return a map from topic names to futures which can be used to check the status of
individual deletions.
- values() - Method in class org.apache.kafka.clients.admin.DescribeAclsResult
-
Return a future containing the ACLs requested.
- values() - Method in class org.apache.kafka.clients.admin.DescribeConfigsResult
-
Return a map from resources to futures which can be used to check the status of the configuration for each
resource.
- values() - Method in class org.apache.kafka.clients.admin.DescribeLogDirsResult
-
Return a map from brokerId to future which can be used to check the information of partitions on each individual broker
- values() - Method in class org.apache.kafka.clients.admin.DescribeReplicaLogDirsResult
-
Return a map from replica to future which can be used to check the log directory information of individual replicas
- values() - Method in class org.apache.kafka.clients.admin.DescribeTopicsResult
-
Return a map from topic names to futures which can be used to check the status of
individual topics.
- values() - Static method in enum org.apache.kafka.clients.consumer.OffsetResetStrategy
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.common.acl.AclOperation
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.common.acl.AclPermissionType
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Method in class org.apache.kafka.common.config.AbstractConfig
-
- values() - Static method in enum org.apache.kafka.common.config.ConfigDef.Importance
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.common.config.ConfigDef.Type
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.common.config.ConfigDef.Width
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.common.config.ConfigResource.Type
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.common.resource.ResourceType
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.common.security.auth.SecurityProtocol
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.connect.data.Schema.Type
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.streams.errors.DeserializationExceptionHandler.DeserializationHandlerResponse
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.streams.KafkaStreams.State
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.streams.processor.PunctuationType
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.streams.processor.TopologyBuilder.AutoOffsetReset
-
Deprecated.
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.kafka.streams.Topology.AutoOffsetReset
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- valueSchema() - Method in class org.apache.kafka.connect.connector.ConnectRecord
-
- valueSchema() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- valueSchema() - Method in interface org.apache.kafka.connect.data.Schema
-
Get the value schema for this map or array schema.
- valueSchema() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- valueSerde - Variable in class org.apache.kafka.streams.Consumed
-
- valueSerde(Serde<V>) - Static method in class org.apache.kafka.streams.kstream.Joined
-
Create an instance of Joined with a value Serde.
- valueSerde() - Method in class org.apache.kafka.streams.kstream.Joined
-
- valueSerde - Variable in class org.apache.kafka.streams.kstream.Materialized
-
- valueSerde - Variable in class org.apache.kafka.streams.kstream.Produced
-
- valueSerde(Serde<V>) - Static method in class org.apache.kafka.streams.kstream.Produced
-
Create a Produced instance with provided valueSerde.
- valueSerde - Variable in class org.apache.kafka.streams.kstream.Serialized
-
- valueSerde() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the default value serde
- valueSerde() - Method in class org.apache.kafka.streams.state.StateSerdes
-
Return the value serde.
- valueSerde() - Method in class org.apache.kafka.streams.StreamsConfig
-
Deprecated.
- valueSerializer() - Method in class org.apache.kafka.streams.state.StateSerdes
-
Return the value serializer.
- valuesWithPrefixOverride(String) - Method in class org.apache.kafka.common.config.AbstractConfig
-
Put all keys that do not start with prefix
and their parsed values in the result map and then
put all the remaining keys with the prefix stripped and their parsed values in the result map.
- ValueTransformer<V,VR> - Interface in org.apache.kafka.streams.kstream
-
The ValueTransformer interface for stateful mapping of a value to a new value (with possible new type).
- ValueTransformerSupplier<V,VR> - Interface in org.apache.kafka.streams.kstream
-
A ValueTransformerSupplier interface which can create one or more ValueTransformer instances.
- version() - Method in class org.apache.kafka.connect.connector.Connector
-
Get the version of this connector.
- version() - Method in interface org.apache.kafka.connect.connector.Task
-
Get the version of this task.
- version() - Method in class org.apache.kafka.connect.data.ConnectSchema
-
- version() - Method in interface org.apache.kafka.connect.data.Schema
-
Get the optional version of the schema.
- version() - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
- version(Integer) - Method in class org.apache.kafka.connect.data.SchemaBuilder
-
Set the version of this schema.
- visible(String, Map<String, Object>) - Method in interface org.apache.kafka.common.config.ConfigDef.Recommender
-
Set the visibility of the configuration given the current configuration values.
- visible() - Method in class org.apache.kafka.common.config.ConfigValue
-
- visible(boolean) - Method in class org.apache.kafka.common.config.ConfigValue
-
- wakeup() - Method in interface org.apache.kafka.clients.consumer.Consumer
-
- wakeup() - Method in class org.apache.kafka.clients.consumer.KafkaConsumer
-
Wake up the consumer.
- wakeup() - Method in class org.apache.kafka.clients.consumer.MockConsumer
-
- WakeupException - Exception in org.apache.kafka.common.errors
-
Exception used to indicate preemption of a blocking operation by an external thread.
- WakeupException() - Constructor for exception org.apache.kafka.common.errors.WakeupException
-
- WallclockTimestampExtractor - Class in org.apache.kafka.streams.processor
-
Retrieves current wall clock timestamps as System.currentTimeMillis().
- WallclockTimestampExtractor() - Constructor for class org.apache.kafka.streams.processor.WallclockTimestampExtractor
-
- width - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
-
- Window - Class in org.apache.kafka.streams.kstream
-
A single window instance, defined by its start and end timestamp.
- Window(long, long) - Constructor for class org.apache.kafka.streams.kstream.Window
-
Create a new window for the given start and end time.
- window() - Method in class org.apache.kafka.streams.kstream.Windowed
-
Return the window containing the values associated with this key.
- WINDOW_STORE_CHANGE_LOG_ADDITIONAL_RETENTION_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
windowstore.changelog.additional.retention.ms
- WindowBytesStoreSupplier - Interface in org.apache.kafka.streams.state
-
A store supplier that can be used to create one or more WindowStore<Bytes, byte[]> instances.
- Windowed<K> - Class in org.apache.kafka.streams.kstream
-
The result key type of a windowed stream aggregation.
- Windowed(K, Window) - Constructor for class org.apache.kafka.streams.kstream.Windowed
-
- windowed(long, long, int, boolean) - Method in interface org.apache.kafka.streams.state.Stores.PersistentKeyValueFactory
-
Deprecated.
Set the persistent store as a windowed key-value store
- windowedBy(Windows<W>) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- windowedBy(SessionWindows) - Method in interface org.apache.kafka.streams.kstream.KGroupedStream
-
- Windows<W extends Window> - Class in org.apache.kafka.streams.kstream
-
The window specification interface for fixed-size windows that is used to define window boundaries and the window maintain (retention) duration.
- Windows() - Constructor for class org.apache.kafka.streams.kstream.Windows
-
- windowsFor(long) - Method in class org.apache.kafka.streams.kstream.JoinWindows
-
Not supported by JoinWindows.
- windowsFor(long) - Method in class org.apache.kafka.streams.kstream.TimeWindows
-
- windowsFor(long) - Method in class org.apache.kafka.streams.kstream.UnlimitedWindows
-
- windowsFor(long) - Method in class org.apache.kafka.streams.kstream.Windows
-
Create all windows that contain the provided timestamp, indexed by non-negative window start timestamps.
- windowSize() - Method in interface org.apache.kafka.streams.state.WindowBytesStoreSupplier
-
The size of the windows that any store created from this supplier will use.
- windowStore() - Static method in class org.apache.kafka.streams.state.QueryableStoreTypes
-
- WindowStore<K,V> - Interface in org.apache.kafka.streams.state
-
- windowStoreBuilder(WindowBytesStoreSupplier, Serde<K>, Serde<V>) - Static method in class org.apache.kafka.streams.state.Stores
-
- WindowStoreIterator<V> - Interface in org.apache.kafka.streams.state
-
- with(Serde<K>, Serde<V>, TimestampExtractor, Topology.AutoOffsetReset) - Static method in class org.apache.kafka.streams.Consumed
-
Create an instance of Consumed with the supplied arguments.
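A hedged sketch of wiring all four arguments when creating a source stream (the topic name and record types are assumptions):

    StreamsBuilder builder = new StreamsBuilder();
    KStream<String, Long> clicks = builder.stream(
            "click-events",
            Consumed.with(Serdes.String(), Serdes.Long(),
                          new WallclockTimestampExtractor(),
                          Topology.AutoOffsetReset.EARLIEST));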
- with(Serde<K>, Serde<V>) - Static method in class org.apache.kafka.streams.Consumed
-
- with(TimestampExtractor) - Static method in class org.apache.kafka.streams.Consumed
-
- with(Topology.AutoOffsetReset) - Static method in class org.apache.kafka.streams.Consumed
-
- with(Serde<K>, Serde<V>, Serde<VO>) - Static method in class org.apache.kafka.streams.kstream.Joined
-
Create an instance of Joined with key, value, and otherValue Serde instances.
- with(Serde<K>, Serde<V>) - Static method in class org.apache.kafka.streams.kstream.Materialized
-
- with(Serde<K>, Serde<V>) - Static method in class org.apache.kafka.streams.kstream.Produced
-
Create a Produced instance with provided keySerde and valueSerde.
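Produced is typically handed to KStream.to(); the stream, topic name, and serdes here are illustrative:

    // Assuming a KStream<String, Long> named wordCounts; the serdes override the configured defaults.
    wordCounts.to("word-count-output", Produced.with(Serdes.String(), Serdes.Long()));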
- with(Serde<K>, Serde<V>, StreamPartitioner<? super K, ? super V>) - Static method in class org.apache.kafka.streams.kstream.Produced
-
Create a Produced instance with provided keySerde, valueSerde, and partitioner.
- with(Serde<K>, Serde<V>) - Static method in class org.apache.kafka.streams.kstream.Serialized
-
Construct a Serialized instance with the provided key and value Serdes.
- with(long) - Static method in class org.apache.kafka.streams.kstream.SessionWindows
-
Create a new window specification with the specified inactivity gap in milliseconds.
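A brief sketch of a session-windowed aggregation using the inactivity gap in milliseconds (the stream, key/value types, and the 5-minute gap are assumptions; java.util.concurrent.TimeUnit is used for readability):

    // Assuming a KStream<String, String> named pageViews keyed by user id.
    KTable<Windowed<String>, Long> sessionCounts = pageViews
            .groupByKey(Serialized.with(Serdes.String(), Serdes.String()))
            .windowedBy(SessionWindows.with(TimeUnit.MINUTES.toMillis(5)))
            .count();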
- withBuiltinTypes(String, Class<K>, Class<V>) - Static method in class org.apache.kafka.streams.state.StateSerdes
-
Create a new instance of StateSerdes for the given state name and key-/value-type classes.
- withByteArrayKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be byte arrays.
- withByteArrayValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use byte arrays for values.
- withByteBufferKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be ByteBuffer.
- withByteBufferValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use ByteBuffer for values.
- withCachingDisabled() - Method in class org.apache.kafka.streams.kstream.Materialized
-
- withCachingEnabled() - Method in class org.apache.kafka.streams.kstream.Materialized
-
- withCachingEnabled() - Method in interface org.apache.kafka.streams.state.StoreBuilder
-
Enable caching on the store.
- withClientSaslSupport() - Method in class org.apache.kafka.common.config.ConfigDef
-
Add standard SASL client configuration options.
- withClientSslSupport() - Method in class org.apache.kafka.common.config.ConfigDef
-
Add standard SSL client configuration options.
- withDoubleKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be Doubles.
- withDoubleValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use Double values.
- withIntegerKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be Integers.
- withIntegerValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use Integer values.
- withKeys(Class<K>) - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
- withKeys(Serde<K>) - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the serializer and deserializer for the keys.
- withKeySerde(Serde<K>) - Method in class org.apache.kafka.streams.Consumed
-
- withKeySerde(Serde<K>) - Method in class org.apache.kafka.streams.kstream.Joined
-
Set the key Serde to be used.
- withKeySerde(Serde<K>) - Method in class org.apache.kafka.streams.kstream.Materialized
-
Set the keySerde the materialized StateStore will use.
- withKeySerde(Serde<K>) - Method in class org.apache.kafka.streams.kstream.Produced
-
Produce records using the provided keySerde.
- withKeySerde(Serde<K>) - Method in class org.apache.kafka.streams.kstream.Serialized
-
Construct a Serialized instance with the provided key Serde.
- withKeyValueMapper(KeyValueMapper<? super K, ? super V, String>) - Method in class org.apache.kafka.streams.kstream.Printed
-
Print the records of a KStream with the provided KeyValueMapper. The provided KeyValueMapper's mapped value type must be String.
- withLabel(String) - Method in class org.apache.kafka.streams.kstream.Printed
-
Print the records of a KStream with the provided label.
- withLoggingDisabled() - Method in class org.apache.kafka.streams.kstream.Materialized
-
Disable change logging for the materialized StateStore.
- withLoggingDisabled() - Method in interface org.apache.kafka.streams.state.StoreBuilder
-
- withLoggingEnabled(Map<String, String>) - Method in class org.apache.kafka.streams.kstream.Materialized
-
Indicates that a changelog should be created for the store.
- withLoggingEnabled(Map<String, String>) - Method in interface org.apache.kafka.streams.state.StoreBuilder
-
Maintain a changelog for any changes made to the store.
- withLongKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a
KeyValueStore
by specifying the keys will be
Long
s.
- withLongValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use Long values.
- withOffsetResetPolicy(Topology.AutoOffsetReset) - Method in class org.apache.kafka.streams.Consumed
-
- withOtherValueSerde(Serde<VO>) - Method in class org.apache.kafka.streams.kstream.Joined
-
Set the otherValue Serde to be used.
- withPartitions(Map<TopicPartition, PartitionInfo>) - Method in class org.apache.kafka.common.Cluster
-
Return a copy of this cluster combined with `partitions`.
- withStreamPartitioner(StreamPartitioner<? super K, ? super V>) - Method in class org.apache.kafka.streams.kstream.Produced
-
Produce records using the provided partitioner.
- withStringKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a
KeyValueStore
by specifying the keys will be
String
s.
- withStringValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use String values.
- withTimestampExtractor(TimestampExtractor) - Method in class org.apache.kafka.streams.Consumed
-
- withUnderlyingMessage(String) - Static method in exception org.apache.kafka.clients.consumer.RetriableCommitFailedException
-
- withValues(Class<V>) - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use values of the specified type.
- withValues(Serde<V>) - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use the specified serializer and deserializer for the values.
- withValueSerde(Serde<V>) - Method in class org.apache.kafka.streams.Consumed
-
- withValueSerde(Serde<V>) - Method in class org.apache.kafka.streams.kstream.Joined
-
Set the value Serde to be used.
- withValueSerde(Serde<V>) - Method in class org.apache.kafka.streams.kstream.Materialized
-
Set the valueSerde the materialized StateStore will use.
- withValueSerde(Serde<V>) - Method in class org.apache.kafka.streams.kstream.Produced
-
Produce records using the provided valueSerde.
- withValueSerde(Serde<V>) - Method in class org.apache.kafka.streams.kstream.Serialized
-
Construct a
Serialized
instance with the provided value
Serde
.
- Wrapper(Deserializer<T>) - Constructor for class org.apache.kafka.common.serialization.ExtendedDeserializer.Wrapper
-
- Wrapper(Serializer<T>) - Constructor for class org.apache.kafka.common.serialization.ExtendedSerializer.Wrapper
-
- WrapperSerde(Serializer<T>, Deserializer<T>) - Constructor for class org.apache.kafka.common.serialization.Serdes.WrapperSerde
-
- writeAsText(String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- writeAsText(String, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- writeAsText(String, Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- writeAsText(String, String, Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- writeAsText(String, KeyValueMapper<? super K, ? super V, String>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- writeAsText(String, String, KeyValueMapper<? super K, ? super V, String>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- writeAsText(String, Serde<K>, Serde<V>, KeyValueMapper<? super K, ? super V, String>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- writeAsText(String, String, Serde<K>, Serde<V>, KeyValueMapper<? super K, ? super V, String>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- writeAsText(String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- writeAsText(String, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- writeAsText(String, Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- writeAsText(String, String, Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- writeTo(DataOutputStream) - Method in class org.apache.kafka.streams.processor.TaskId
-
- writeTo(ByteBuffer) - Method in class org.apache.kafka.streams.processor.TaskId
-