

Apache Kafka Projects

This repository contains a message producer and consumer project and a payment project.

Techs used

  • Apache Kafka (Kafdrop)
  • Docker (Docker-Compose)
  • Zookeeper
  • Postman
  • SpringBoot
  • Java
  • WSL (Linux on Windows)

Also, if you don't have an IntelliJ license, you can create and run Spring Boot projects using the 'Spring Initializr Java Support' (by Microsoft) and 'Spring Boot Tools' extensions in VS Code.


Producer and Consumer Project

The objective of this project is to show how we can produce and consume messages using Apache Kafka, with Kafdrop as the web interface.

We will follow the flow shown below:

FlowDiagram

Running the Docker-Compose

The first step is to run the docker-compose file, which starts Zookeeper, Kafka and Kafdrop.

If you have WSL with the docker and docker-compose dependencies installed, just run docker-compose up -d in the command line. The -d flag tells Docker to run the containers in the background (detached) and keep the command line free.
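The repository's docker-compose.yml itself is not reproduced in this README, but a minimal sketch of the three services could look like the one below. The image names, tags, hostnames and the internal broker port 29092 are illustrative assumptions; only the Kafdrop port 19000 comes from this document.

version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  kafdrop:
    image: obsidiandynamics/kafdrop:latest
    depends_on:
      - kafka
    ports:
      - "19000:9000"   # Kafdrop UI exposed at localhost:19000
    environment:
      KAFKA_BROKERCONNECT: kafka:29092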

You will see in the Docker Desktop something like this:

docker-compose

Kafdrop

To see the Kafdrop interface, you can access it in your browser at localhost:19000

kafdrop

Running the Spring projects

Next, to run the projects, open the producer directory and click the play button, then open the consumer directory in another VS Code window and do the same. This will create a topic named 'str-topic' in Kafdrop, as we can see in the image above.

The Request using Postman

With everything running fine, we can finally send a request to the producer; the message will be serialized and saved in an Apache Kafka topic.

The endpoint is written in StringProducerResource.java. As we can see in the code below, the request is a POST and the endpoint is /producer.

import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping(value = "/producer")
public class StringProducerResource {

    private final StringProducerService producerService;

    // Constructor injection (the original class may use Lombok's @RequiredArgsConstructor instead).
    public StringProducerResource(StringProducerService producerService) {
        this.producerService = producerService;
    }

    // POST /producer: forwards the raw request body to the Kafka producer service.
    @PostMapping
    public ResponseEntity<?> sendMessage(@RequestBody String message) {
        producerService.sendMessage(message);
        return ResponseEntity.status(HttpStatus.CREATED).build();
    }
}
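The controller delegates to StringProducerService, whose implementation is not reproduced in this README. A minimal sketch of such a service, assuming it uses Spring Kafka's KafkaTemplate to publish to the str-topic topic (created in KafkaAdminConfig.java, shown later), could look like this:

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class StringProducerService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public StringProducerService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(String message) {
        // Publishes the raw request body to the "str-topic" topic;
        // the partition is chosen by the producer since no key is given.
        kafkaTemplate.send("str-topic", message);
    }
}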

So, the complete URL will be: localhost:8000/producer, because the producer is listening on port 8000, and it accepts a raw JSON body.

The producer expects a key named message with a string value. The raw JSON must look like this:

postman
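For example, a body like the following would be accepted (the message text itself is arbitrary):

{
    "message":"my first message"
}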

If the response comes back as 201 'CREATED', the request was successful.

Message in Kafdrop

To see the message stored in the topic (which the consumer will read), open Kafdrop, click on str-topic and view its messages:

kafdrop_with_message

Consuming the Message

Since the message was saved in Kafka (as shown in Kafdrop), the consumer can fetch it as long as it knows which topic, partition and offset the message is in. As we created just 2 partitions in KafkaAdminConfig.java (.partitions(2)), as we can see in the code below, the message can be saved in partition 0 or 1; the partition is chosen by the producer's partitioner (based on the message key, or spread across partitions when no key is set) unless you specify it explicitly.

    // KafkaAdminConfig.java: creates the "str-topic" topic with 2 partitions
    // and 1 replica when the application starts.
    @Bean
    public KafkaAdmin.NewTopics topics() {
        return new KafkaAdmin.NewTopics(
                TopicBuilder.name("str-topic").partitions(2).replicas(1).build()
        );
    }

To see the message being consumed by the Consumer, we can send another message:

{
    "message":"Apache Kafka repository"
}

Taking a look at the terminal where the consumer is running, we can see the message being consumed.
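The consumer code itself is not reproduced in this README; a minimal sketch of a listener, assuming Spring Kafka's @KafkaListener (the class name and group id below are illustrative), would be:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class StringConsumerListener {

    private static final Logger log = LoggerFactory.getLogger(StringConsumerListener.class);

    // Logs every record published to "str-topic"; this is what appears in the terminal.
    @KafkaListener(topics = "str-topic", groupId = "str-group")
    public void listen(String message) {
        log.info("Message consumed: {}", message);
    }
}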


The Payment-Service Project

This project is very similar to the Producer and Consumer project. The difference is in the JSON message: here, I prepared the application to receive a complete JSON object.

The main objective is to handle more than one piece of information per request and to create a Collection in Postman.

JSON Object

The request body needs to contain the following fields:

  • id
  • idUser
  • idProduct
  • cardNumber

Now, the request must be like this:

{
    "id":111,
    "idUser":22222,
    "idProduct":3,
    "cardNumber":"4444 5555 6666 7777 8888"
}

The request must use another endpoint: /payments

So, the complete URL will be: localhost:19000/payments
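On the Java side, this payload maps naturally onto a simple DTO. A minimal sketch, assuming a Java record (the class name is illustrative and may differ from the repository's actual model):

// Mirrors the JSON body above; recent Jackson versions (2.12+) can deserialize records directly.
public record Payment(Long id, Long idUser, Long idProduct, String cardNumber) {
}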

Receiving the Payment

Once the payment request is sent, the consumer in the payment-service application should check that the data (JSON body) is correct and validate the payment.
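A sketch of how that validation step could look, assuming a Spring Kafka listener on the payment-topic and the Payment record sketched above (the class name, group id and checks are illustrative, not the repository's actual code):

import com.fasterxml.jackson.databind.ObjectMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class PaymentListener {

    private static final Logger log = LoggerFactory.getLogger(PaymentListener.class);
    private final ObjectMapper mapper = new ObjectMapper();

    @KafkaListener(topics = "payment-topic", groupId = "payment-group")
    public void consume(String json) throws Exception {
        Payment payment = mapper.readValue(json, Payment.class);

        // Basic sanity check: all fields of the expected body must be present.
        if (payment.id() == null || payment.idUser() == null
                || payment.idProduct() == null || payment.cardNumber() == null) {
            log.warn("Invalid payment payload: {}", json);
            return;
        }
        log.info("Payment {} validated for user {}", payment.id(), payment.idUser());
    }
}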

Kafdrop saving payments request

Now we can see the requests made to the payment-topic in Kafdrop at localhost:19000. The last one was saved in partition 0 at offset 2.
