
Kafka + .Net + Docker

Setup Kafka

Setup is simple: there is a docker compose file that starts both the Zookeeper and the Kafka containers.

We will start them using 'docker compose up -d' (the -d flag runs the containers in detached mode).

Setting up the Topic

Kafka runs as one or more broker nodes; messages are stored in topics, which the brokers split into partitions.

Open a shell inside the Kafka container:

docker exec -it kafka /bin/bash

Create the topic

kafka-topics.sh \
  --create \
  --topic test \
  --partitions 1 \
  --replication-factor 1 \
  --if-not-exists \
  --zookeeper zookeeper:2181

List the topics

kafka-topics.sh \
  --list \
  --zookeeper zookeeper:2181

We can also describe a listed topic to see its details:

kafka-topics.sh \
  --describe \
  --topic test \
  --zookeeper zookeeper:2181

Publish and Subscribe using a Terminal

kafka-console-producer.sh \
  --request-required-acks 1 \
  --broker-list 0.0.0.0:9092 \
  --topic test

This runs the console producer script, which uses the Producer API to publish to the given broker address and topic; every line typed in the terminal is sent as a message.

Read / consume the topic

kafka-console-consumer.sh \
  --bootstrap-server 0.0.0.0:9092 \
  --topic foo \
  --zookeeper zookeeper:2181

.Net Application

The application is designed with both a producer and a consumer so that Kafka is used end to end.

The main scope of the application is to send and receive messages: a Web API makes it easy to post a message, and a console application acts as the consumer.

Every unit is designed as a separate component so that reusability of the code stays high.

API Application

A controller exposes a POST endpoint that accepts the topic and the data, then hands them to the business layer, where the data is mapped and other logic checks are applied.
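
A minimal sketch of what such a controller could look like (the class, route, and interface names here are illustrative assumptions, not necessarily the actual code in this repository):

using Microsoft.AspNetCore.Mvc;

// Hypothetical request DTO carrying the topic and the payload to publish.
public record PublishMessageDto(string Topic, string Payload);

// Hypothetical business-layer contract the controller depends on.
public interface IMessagePublisher
{
    Task PublishAsync(PublishMessageDto message, CancellationToken ct);
}

[ApiController]
[Route("api/[controller]")]
public class MessagesController : ControllerBase
{
    private readonly IMessagePublisher _publisher;

    public MessagesController(IMessagePublisher publisher) => _publisher = publisher;

    // POST api/messages: validates the request and forwards it to the business layer.
    [HttpPost]
    public async Task<IActionResult> Post(PublishMessageDto dto, CancellationToken ct)
    {
        if (string.IsNullOrWhiteSpace(dto.Topic) || string.IsNullOrWhiteSpace(dto.Payload))
            return BadRequest("Topic and payload are required.");

        await _publisher.PublishAsync(dto, ct);
        return Accepted();
    }
}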

Business Layer

The business layer converts the DTO into the domain model and then uses the producer service in the service layer to post it.
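
Continuing the same sketch, the business layer could implement the IMessagePublisher interface used by the controller above, converting the DTO into a domain model before calling an assumed producer-service interface:

// Hypothetical domain model and producer-service contract (names are assumptions).
public record KafkaMessage(string Topic, string Body, DateTime CreatedUtc);

public interface IKafkaProducerService
{
    Task SendAsync(string topic, string body, CancellationToken ct);
}

public class MessagePublisher : IMessagePublisher
{
    private readonly IKafkaProducerService _producerService;

    public MessagePublisher(IKafkaProducerService producerService) => _producerService = producerService;

    public async Task PublishAsync(PublishMessageDto dto, CancellationToken ct)
    {
        // Map the incoming DTO to the domain model, then hand it to the service layer.
        var message = new KafkaMessage(dto.Topic, dto.Payload, DateTime.UtcNow);
        await _producerService.SendAsync(message.Topic, message.Body, ct);
    }
}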

Service Layer

Here a class uses the Kafka producer to encode the data and send it to the required topic; Kafka stores it there so it can be picked up by the consumer.
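
A possible sketch of such a producer service, assuming the Confluent.Kafka client package (the repository may use a different client or different class names), implementing the IKafkaProducerService contract from the previous sketch:

using Confluent.Kafka;

public class KafkaProducerService : IKafkaProducerService, IDisposable
{
    private readonly IProducer<Null, string> _producer;

    public KafkaProducerService(string bootstrapServers)
    {
        var config = new ProducerConfig
        {
            BootstrapServers = bootstrapServers,
            // Acks.Leader corresponds to the --request-required-acks 1 used by the console producer above.
            Acks = Acks.Leader
        };
        _producer = new ProducerBuilder<Null, string>(config).Build();
    }

    // Publishes the payload to the required topic and logs where it landed.
    public async Task SendAsync(string topic, string body, CancellationToken ct)
    {
        var result = await _producer.ProduceAsync(topic, new Message<Null, string> { Value = body }, ct);
        Console.WriteLine($"Delivered to {result.TopicPartitionOffset}");
    }

    public void Dispose() => _producer.Dispose();
}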

Kafka

Kafka stores and manages the messages (with Zookeeper coordinating the brokers) and makes them available to the subscribers, of which there can be many.

Consumer

The consumer is a console application that uses the service layer to open a connection, receive the messages, and close the connection again according to its logic, and then shows the result on the console.

Service Layer

This service opens and closes the connection, subscribes to the respective topic as part of its consumer group, and collects the data so the consumer application can make use of it.
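
A possible shape for this consumer service, again assuming the Confluent.Kafka package; the group id is a made-up example, and localhost:9093 is the outside listener port mentioned later in this README:

using Confluent.Kafka;

public class KafkaConsumerService
{
    private readonly ConsumerConfig _config = new()
    {
        BootstrapServers = "localhost:9093",        // assumed outside listener port
        GroupId = "email-subscribers",              // example subscriber group id
        AutoOffsetReset = AutoOffsetReset.Earliest
    };

    // Opens the connection, reads messages from the topic for this group, and closes cleanly.
    public void Consume(string topic, Action<string> handle, CancellationToken ct)
    {
        using var consumer = new ConsumerBuilder<Ignore, string>(_config).Build();
        consumer.Subscribe(topic);
        try
        {
            while (!ct.IsCancellationRequested)
            {
                var result = consumer.Consume(ct);
                handle(result.Message.Value);   // hand the payload to the business layer
            }
        }
        catch (OperationCanceledException)
        {
            // Shutdown was requested; fall through to close the connection.
        }
        finally
        {
            consumer.Close();   // leave the group and commit final offsets
        }
    }
}

The console application can then call Consume with the topic name and a callback that hands each message to the messaging service described below.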

Kafka memory

Kafka stores the messages locally on the broker's disk.

Kafka Application

Check the published messages

The whole process is set up so that messages are published through the API; before starting the consumer application we can check what has been pushed.

To see all of the pushed content from the beginning of the topic:

docker exec -it kafka /opt/kafka/bin/kafka-console-consumer.sh --topic test-topic --from-beginning --bootstrap-server localhost:9092

After this we can see everything in the topic and confirm that the publishing side works.

Once the sending code and the topic are set up, we can also run the consumer code; it connects through localhost:9093 (the outside listener port) and consumes the messages after joining with its subscriber group id.

Messaging service

After the subscriber has received a message, it filters out the content in the business layer and uses the Email Service, which uses Graph / SMTP, to send a mail from the mailing address with the respective subject, body, and To address.
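
As a rough illustration of the SMTP path only (the Graph variant is not shown), an email service could look like the following; the host, port, sender address, and credentials are placeholders:

using System.Net;
using System.Net.Mail;

public class EmailService
{
    // Placeholder SMTP settings; in the real application these would come from configuration.
    private readonly SmtpClient _smtp = new("smtp.example.com", 587)
    {
        EnableSsl = true,
        Credentials = new NetworkCredential("sender@example.com", "app-password")
    };

    // Sends a mail from the configured mailing address to the given recipient.
    public void Send(string to, string subject, string body)
    {
        using var mail = new MailMessage("sender@example.com", to, subject, body);
        _smtp.Send(mail);
    }
}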

RUN THE APP

You can run the dockerised Kafka from kafak.yml; this starts both the Zookeeper and the Kafka server.

To start the application you can either run it locally on a system with .NET 8 installed, or use the app.yml file to run it in Docker. [Note: you need to start Kafka first.]
