Struct async_nats::jetstream::consumer::pull::FetchBuilder
pub struct FetchBuilder<'a> { /* private fields */ }
Used for building the configuration of a Batch with fetch()
semantics. Created by the fetch() method on a pull Consumer.
§Examples
use async_nats::jetstream::consumer::PullConsumer;
use futures::StreamExt;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events")
.await?
.get_consumer("pull")
.await?;
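// Request a single batch of up to 100 messages, limited to 1024 bytes.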
let mut messages = consumer
.fetch()
.max_messages(100)
.max_bytes(1024)
.messages()
.await?;
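// Process the fetched messages, acknowledging each one.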
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
Implementations§
impl<'a> FetchBuilder<'a>
pub fn new(consumer: &'a Consumer<Config>) -> Self
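In practice the builder is usually obtained through fetch() on the Consumer rather than by calling new() directly; the two are equivalent. A minimal sketch, assuming the same "events" stream and "pull" consumer as in the examples below:
use async_nats::jetstream::consumer::pull::FetchBuilder;
use async_nats::jetstream::consumer::PullConsumer;
use futures::StreamExt;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events")
.await?
.get_consumer("pull")
.await?;
// Explicit construction; consumer.fetch() yields the same builder.
let mut messages = FetchBuilder::new(&consumer).max_messages(10).messages().await?;
while let Some(message) = messages.next().await {
message?.ack().await?;
}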
pub fn max_bytes(self, max_bytes: usize) -> Self
Sets the maximum number of bytes that can be buffered on the Client while processing already received messages. Higher values will yield better performance, but may also increase memory usage if the application acknowledges messages much more slowly than they arrive.
The default value should provide a reasonable balance between performance and memory usage.
§Examples
use async_nats::jetstream::consumer::PullConsumer;
use futures::StreamExt;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events")
.await?
.get_consumer("pull")
.await?;
let mut messages = consumer.fetch().max_bytes(1024).messages().await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
pub fn max_messages(self, batch: usize) -> Self
Sets the maximum number of messages that can be buffered on the Client while processing already received messages. Higher values will yield better performance, but may also increase memory usage if the application acknowledges messages much more slowly than they arrive.
The default value should provide a reasonable balance between performance and memory usage.
§Examples
use async_nats::jetstream::consumer::PullConsumer;
use futures::StreamExt;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events")
.await?
.get_consumer("pull")
.await?;
let mut messages = consumer.fetch().max_messages(100).messages().await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
pub fn heartbeat(self, heartbeat: Duration) -> Self
Sets the heartbeat which will be sent by the server if there are no messages pending for the given Consumer.
§Examples
use async_nats::jetstream::consumer::PullConsumer;
use futures::StreamExt;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events")
.await?
.get_consumer("pull")
.await?;
let mut messages = consumer
.fetch()
.heartbeat(std::time::Duration::from_secs(10))
.messages()
.await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
pub fn expires(self, expires: Duration) -> Self
A low-level API that does not need tweaking for most use cases. Sets how long each batch request waits for the whole batch of messages before timing out.
§Examples
use async_nats::jetstream::consumer::PullConsumer;
use futures::StreamExt;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events")
.await?
.get_consumer("pull")
.await?;
let mut messages = consumer
.fetch()
.expires(std::time::Duration::from_secs(30))
.messages()
.await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
pub async fn messages(self) -> Result<Batch, BatchError>
Creates the actual Stream with the provided configuration.
§Examples
use async_nats::jetstream::consumer::PullConsumer;
use futures::StreamExt;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events")
.await?
.get_consumer("pull")
.await?;
let mut messages = consumer.fetch().max_messages(100).messages().await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
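A fetch() batch is a finite stream: it should end once the requested messages have been delivered or the request expires. Where that is convenient, the batch can therefore be drained into a collection instead of iterated message by message. A minimal sketch, assuming the same consumer setup as the example above:
use futures::StreamExt;
// Collect the whole batch into a Vec before processing it.
let batch: Vec<_> = consumer
.fetch()
.max_messages(100)
.messages()
.await?
.collect()
.await;
for message in batch {
let message = message?;
message.ack().await?;
}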