Demonstrates how to use Kafka to process files (data streams).
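A minimal sketch of this pattern, assuming the kafkajs client, a local broker and a topic named file-lines (all of these are assumptions, not part of the original example): the file is read as a stream and each line becomes one Kafka message.

```ts
// Hypothetical sketch using the kafkajs client; broker address, topic name
// ("file-lines") and file path are assumptions.
import { Kafka } from "kafkajs";
import { createReadStream } from "fs";
import { createInterface } from "readline";

const kafka = new Kafka({ clientId: "file-producer", brokers: ["localhost:9092"] });

async function produceFile(path: string): Promise<void> {
  const producer = kafka.producer();
  await producer.connect();

  // Read the file as a stream and publish each line as one Kafka message.
  const lines = createInterface({ input: createReadStream(path) });
  for await (const line of lines) {
    await producer.send({ topic: "file-lines", messages: [{ value: line }] });
  }

  await producer.disconnect();
}

produceFile("./data.csv").catch(console.error);
```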
Demonstrates how to use RabbitMQ to have multiple workers (consumers) consuming from a single queue.
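A minimal work-queue sketch with amqplib; the queue name tasks and the two-worker setup are assumptions. Because both workers consume from the same queue, RabbitMQ distributes the messages between them, and prefetch(1) keeps the dispatch fair.

```ts
// Minimal work-queue sketch with amqplib; the queue name "tasks" is an assumption.
import amqp from "amqplib";

async function startWorker(workerId: number): Promise<void> {
  const connection = await amqp.connect("amqp://localhost");
  const channel = await connection.createChannel();
  await channel.assertQueue("tasks", { durable: true });

  // Fair dispatch: give each worker at most one unacknowledged message at a time.
  await channel.prefetch(1);

  await channel.consume("tasks", (msg) => {
    if (msg === null) return;
    console.log(`worker ${workerId} got: ${msg.content.toString()}`);
    channel.ack(msg);
  });
}

// Two competing consumers on the same queue.
startWorker(1).catch(console.error);
startWorker(2).catch(console.error);
```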
Demonstrates how to use RabbitMQ to have a single producer emitting messages to multiple consumers at the same time, e.g. a consumer and a logger.
For non-critical messages, we can declare the queue with exclusive: true:
the queue is removed when its consumer's connection closes, so any pending messages are lost.
For critical messages, we can declare the queue with autoDelete: false:
when the consumer disconnects, the queue is not deleted and messages are buffered until the consumer reconnects.
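A minimal fanout sketch with amqplib illustrating both queue options; the exchange name events and the consumer names are assumptions. Every queue bound to the fanout exchange receives its own copy of each published message.

```ts
// Minimal fanout sketch with amqplib; the exchange name "events" and the
// consumer names are assumptions.
import amqp from "amqplib";

async function startConsumer(name: string, critical: boolean): Promise<void> {
  const connection = await amqp.connect("amqp://localhost");
  const channel = await connection.createChannel();
  await channel.assertExchange("events", "fanout", { durable: false });

  // Non-critical: an exclusive, server-named queue is deleted when the connection
  // closes, so pending messages are lost.
  // Critical: a named queue with autoDelete: false survives the consumer going away
  // and keeps buffering messages until it reconnects.
  const { queue } = critical
    ? await channel.assertQueue(`events.${name}`, { autoDelete: false })
    : await channel.assertQueue("", { exclusive: true });

  await channel.bindQueue(queue, "events", "");
  await channel.consume(queue, (msg) => {
    if (msg === null) return;
    console.log(`${name} received: ${msg.content.toString()}`);
    channel.ack(msg);
  });
}

startConsumer("consumer", false).catch(console.error);
startConsumer("logger", true).catch(console.error);
```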
Demonstrates how to use RabbitMQ to process files (data streams) with a retry-and-delay mechanism for failure scenarios. This example uses the rabbitmq_delayed_message_exchange plugin.
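A minimal sketch of the delayed-retry idea, assuming the rabbitmq_delayed_message_exchange plugin is enabled; the exchange and queue names and the 10-second delay are assumptions. The plugin provides the x-delayed-message exchange type and holds each message for the number of milliseconds given in its x-delay header before routing it.

```ts
// Minimal sketch assuming the rabbitmq_delayed_message_exchange plugin is enabled;
// the exchange/queue names and the 10-second delay are assumptions.
import amqp from "amqplib";

async function publishWithDelay(payload: Buffer, attempt: number): Promise<void> {
  const connection = await amqp.connect("amqp://localhost");
  const channel = await connection.createConfirmChannel();

  // The plugin adds the "x-delayed-message" exchange type; the real routing
  // behaviour is set through the "x-delayed-type" argument.
  await channel.assertExchange("retry-exchange", "x-delayed-message", {
    arguments: { "x-delayed-type": "direct" },
  });
  await channel.assertQueue("file-processing", { durable: true });
  await channel.bindQueue("file-processing", "retry-exchange", "file-processing");

  // The "x-delay" header (milliseconds) tells the plugin how long to hold the message
  // before routing it; a failing consumer republishes with an incremented attempt count.
  channel.publish("retry-exchange", "file-processing", payload, {
    headers: { "x-delay": 10_000, "x-attempt": attempt },
  });

  await channel.waitForConfirms();
  await connection.close();
}

publishWithDelay(Buffer.from("file chunk"), 1).catch(console.error);
```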
Demonstrates how to use RabbitMQ to reprocess a message after a failure without any plugins, using only an exchange and a dead-letter queue with a TTL.
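A minimal sketch of the plugin-free variant; all names and the 10-second TTL are assumptions. A failed message is rejected without requeueing, dead-lettered into a retry queue, sits there until its TTL expires, and is then dead-lettered back into the main exchange for another attempt.

```ts
// Minimal sketch of the plugin-free retry topology; all names and the 10-second TTL
// are assumptions.
import amqp from "amqplib";

async function setupRetryTopology(): Promise<void> {
  const connection = await amqp.connect("amqp://localhost");
  const channel = await connection.createChannel();

  await channel.assertExchange("work", "direct", { durable: true });
  await channel.assertExchange("work.retry", "direct", { durable: true });

  // Main queue: messages rejected with nack(msg, false, false) are routed to work.retry.
  await channel.assertQueue("work.queue", {
    durable: true,
    deadLetterExchange: "work.retry",
  });
  await channel.bindQueue("work.queue", "work", "task");

  // Retry queue: no consumers; after 10 s the TTL dead-letters the message back to
  // the "work" exchange, which routes it into work.queue for another attempt.
  await channel.assertQueue("work.queue.retry", {
    durable: true,
    messageTtl: 10_000,
    deadLetterExchange: "work",
    deadLetterRoutingKey: "task",
  });
  await channel.bindQueue("work.queue.retry", "work.retry", "task");

  await connection.close();
}

setupRetryTopology().catch(console.error);
```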