gitpod support for producer tutorial #1570

Draft: wants to merge 1 commit into master
41 changes: 41 additions & 0 deletions .gitpod.yml
@@ -0,0 +1,41 @@
github:
  prebuilds:
    master: true
    branches: true

tasks:
  - name: Open tutorial in browser
    command: gp preview https://developer.confluent.io/tutorials/creating-first-apache-kafka-producer-application/kafka.html
  - name: Docker
    init: gp sync-await code-copy
    command: docker compose -f tutorial-workspace/docker-compose.yml up -d ; exit ; clear
  - name: Terminal
    before: gp preview https://developer.confluent.io/tutorials/creating-first-apache-kafka-producer-application/kafka.html
    init: |
      curl -L --http1.1 https://cnfl.io/cli | sudo sh -s -- -b /usr/local/bin
      mkdir tutorial-workspace
      mv _includes/tutorials/kafka-producer-application/kafka/code/* tutorial-workspace
      rm -rf -- !(tutorial-workspace)
      rm -rf tutorial-workspace/tutorial-steps .git* .semaphore/
      sdk default java 17.0.7.fx-zulu
    command: gp sync-done code-copy ; cd tutorial-workspace ; clear

vscode:
  extensions:
    - github.github-vscode-theme
    - vscjava.vscode-java-pack
    - vscjava.vscode-java-debug

ports:
  # zookeeper
  - port: 2181
    onOpen: ignore
    visibility: private
  # broker
  - port: 29092
    onOpen: ignore
    visibility: private
  # SR (Schema Registry)
  - port: 8081
    onOpen: ignore
    visibility: private
48 changes: 48 additions & 0 deletions _data/harnesses/kafka-producer-application/gitpod.yml
@@ -0,0 +1,48 @@
dev:
  steps:
    - title: Background
      content:
        - action: skip
          render:
            file: tutorials/kafka-producer-application/kafka/markup/gitpod/background.adoc

    - title: Application tour
      content:
        - action: skip
          render:
            file: tutorials/kafka-producer-application/kafka/markup/gitpod/app-tour.adoc

    - title: Compile the KafkaProducer application
      content:
        - action: execute
          file: tutorial-steps/gitpod/build-uberjar.sh
          render:
            file: tutorials/kafka-producer-application/kafka/markup/gitpod/build-uberjar.adoc

    - title: Create a topic
      content:
        - action: execute
          file: tutorial-steps/gitpod/harness-create-topic.sh
          render:
            file: tutorials/kafka-producer-application/kafka/markup/gitpod/create-topic.adoc

    - title: Run the KafkaProducer application
      content:
        - action: execute
          file: tutorial-steps/gitpod/run-dev-app.sh
          render:
            file: tutorials/kafka-producer-application/kafka/markup/gitpod/run-dev-app.adoc

    - title: Confirm records sent by consuming from topic
      content:
        - action: execute_async
          file: tutorial-steps/gitpod/harness-console-consumer.sh
          stdout: tutorial-steps/gitpod/outputs/actual-output.txt
          render:
            file: tutorials/kafka-producer-application/kafka/markup/gitpod/run-consumer.adoc

    - title: Next steps
      content:
        - action: skip
          render:
            file: tutorials/kafka-producer-application/kafka/markup/gitpod/next-steps.adoc
4 changes: 2 additions & 2 deletions _data/tutorials.yaml
@@ -604,12 +604,12 @@ streams-to-table:
    kafka: disabled
    confluent: enabled
kafka-producer-application:
-  title: How to build your first Apache KafkaProducer application
+  title: How to build your first Apache Kafka® producer application
  meta-description: build your first Kafka producer application
  canonical: confluent
  slug: /creating-first-apache-kafka-producer-application
  question: How do you get started building your first Kafka producer application?
-  introduction: You'd like to integrate a KafkaProducer into your event-driven application,
+  introduction: You'd like to integrate a Kafka producer into your event-driven application,
    but you're not sure where to start. In this tutorial, you'll build a small application
    that uses a KafkaProducer to write records to Kafka.
  status:
48 changes: 48 additions & 0 deletions _includes/gitpod-content.html
@@ -0,0 +1,48 @@
<div>
<section class="section" data-tracking-location="Run it">
<div class="container">
<div class="columns">
<div class="column is-full">
{% for step in site.data.harnesses[page.static_data][page.stack].dev.steps %}
<div class="tutorial-try-it-step content" data-tracking-location="Step {{ forloop.index }}">
<h2
class="title tutorial-section-item-title"
id="{{ step.title | slugify }}"
>
{{ step.title }}
</h2>
<div class="content-item">
<div class="item-display">

{% for section in step.content %}

{% if section.action == "docker_ksql_cli_session" or section.action == "docker_flinksql_cli_session" %}

{% if section.render.skip != true %}
{% capture prose %}{% include {{ section.render.file }} %}{% endcapture %}
{{ prose | asciidocify }}
{% endif %}

{% for subsection in section.stdin %}
{% capture prose %}{% include {{ subsection.render.file }} %}{% endcapture %}
{{ prose | asciidocify }}
{% endfor %}

{% elsif section.render.skip != true %}

{% capture prose %}{% include {{ section.render.file }} %}{% endcapture %}
{{ prose | asciidocify }}

{% endif %}

{% endfor %}

</div>
</div>
</div>
{% endfor %}
</div>
</div>
</div>
</section>
</div>
@@ -0,0 +1 @@
./gradlew shadowJar
@@ -0,0 +1,5 @@
kafka-console-consumer --topic output-topic \
--bootstrap-server broker:9092 \
--from-beginning \
--property print.key=true \
--property key.separator=" : "
@@ -0,0 +1 @@
kafka-topics --create --topic output-topic --bootstrap-server broker:9092 --replication-factor 1 --partitions 1
@@ -0,0 +1,11 @@
1 : value
2 : words
3 : All Streams
4 : Lead to
5 : Kafka
6 : Go to
7 : Kafka Summit
8 : How can
9 : a 10 ounce
10 : bird carry a
11 : 5lb coconut
@@ -0,0 +1 @@
docker exec broker kafka-topics --list --bootstrap-server broker:29092
@@ -0,0 +1 @@
docker exec -it broker bash
@@ -0,0 +1 @@
java -jar build/libs/kafka-producer-application-standalone-0.0.1.jar configuration/dev.properties input.txt
@@ -0,0 +1,59 @@
In the file explorer on the right, navigate to `tutorial-workspace/src/main/java/io/confluent/developer/`. Right-click
on `KafkaProducerApplication.java`, and then click `Open to the Side`.

Let's look at some of the key points of this program. About 20 lines down, you'll find the constructor:

[source, java]
.KafkaProducerApplication constructor
----
public class KafkaProducerApplication {

private final Producer<String, String> producer;
final String outTopic;

public KafkaProducerApplication(final Producer<String, String> producer, <1>
final String topic) { <2>
this.producer = producer;
outTopic = topic;
}

----

<1> The `Producer` instance passed in as a constructor parameter
<2> The topic to write records to


In this tutorial you'll inject the dependencies in the `KafkaProducerApplication.main()` method.
Having this thin wrapper class around a `Producer` is not required, but it does make the code easier to test.

(In practice you may want to use a dependency injection framework, such as the https://spring.io/projects/spring-framework[Spring Framework].)
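
For example, here is a minimal sketch (not part of this PR) of how the wrapper could be unit tested by injecting Kafka's `MockProducer`, which records sends in memory instead of contacting a broker:

[source, java]
.Hypothetical unit test using MockProducer
----
import java.util.List;

import org.apache.kafka.clients.producer.MockProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.assertEquals;

public class KafkaProducerApplicationTest {

    @Test
    public void producesKeyAndValue() {
        // Auto-complete each send so no broker is required
        final MockProducer<String, String> mockProducer =
                new MockProducer<>(true, new StringSerializer(), new StringSerializer());
        final KafkaProducerApplication app =
                new KafkaProducerApplication(mockProducer, "output-topic");

        app.produce("1-value");

        // The mock keeps an in-memory history of everything "sent"
        final List<ProducerRecord<String, String>> history = mockProducer.history();
        assertEquals(1, history.size());
        assertEquals("1", history.get(0).key());
        assertEquals("value", history.get(0).value());
    }
}
----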


Next let's take a look at the `KafkaProducerApplication.produce` method right below the constructor:
[source, java]
.KafkaProducerApplication.produce
----
public Future<RecordMetadata> produce(final String message) {
final String[] parts = message.split("-"); <1>
final String key, value;
if (parts.length > 1) {
key = parts[0];
value = parts[1];
} else {
key = null;
value = parts[0];
}
final ProducerRecord<String, String> producerRecord = new ProducerRecord<>(outTopic, key, value); <2>
return producer.send(producerRecord); <3>
}
----

<1> Split the incoming `String` into a key and value
<2> Create the `ProducerRecord`
<3> Send the record to the broker

The `KafkaProducerApplication.produce` method does some processing on a `String`, and then sends the https://kafka.apache.org/25/javadoc/org/apache/kafka/clients/producer/ProducerRecord.html[`ProducerRecord`]. While this code is a trivial example, it's enough to demonstrate how to use a `KafkaProducer`.
Notice that https://kafka.apache.org/34/javadoc/org/apache/kafka/clients/producer/KafkaProducer.html#send-org.apache.kafka.clients.producer.ProducerRecord-[`KafkaProducer.send`] returns a https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/util/concurrent/Future.html[Future] with a type of https://kafka.apache.org/34/javadoc/org/apache/kafka/clients/producer/RecordMetadata.html[RecordMetadata].

The `KafkaProducer.send` method is asynchronous and returns as soon as the provided record is placed in the buffer of records to be sent to the broker. Once the broker acknowledges that the record has been appended to its log, the broker completes the produce request, which the application receives as `RecordMetadata`—information about the committed message. This tutorial prints the `timestamp` and `offset` for each record sent using the `RecordMetadata` object. Note that calling `Future.get()` for any record will block until the produce request completes.
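
To make this concrete, here is a small sketch (assuming a `producerApp` instance of the class above; it is not part of the tutorial code) that blocks on the returned `Future` and prints those fields:

[source, java]
----
import java.util.concurrent.Future;

import org.apache.kafka.clients.producer.RecordMetadata;

// Assumes producerApp is a KafkaProducerApplication instance as defined above
final Future<RecordMetadata> sendResult = producerApp.produce("1-value");

// get() blocks until the broker acknowledges the record; it throws
// InterruptedException/ExecutionException, which real code must handle
final RecordMetadata metadata = sendResult.get();
System.out.printf("Record written to offset %d timestamp %d%n",
        metadata.offset(), metadata.timestamp());
----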
@@ -0,0 +1,4 @@
This tutorial walks you through how to write a basic Kafka producer application in Java.

The complete source code for this tutorial application resides in the `tutorial-workspace` directory of the Gitpod
workspace, and Kafka starts up automatically in Docker.
@@ -0,0 +1,5 @@
In your terminal window on the bottom of the Gitpod workspace, run the following command to build the application:

+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-producer-application/kafka/code/tutorial-steps/gitpod/build-uberjar.sh %}</code></pre>
+++++
@@ -0,0 +1,20 @@

In this step we're going to create a topic for use during this tutorial. Before proceeding, ensure that Kafka is up and running by listing topics:
+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-producer-application/kafka/code/tutorial-steps/gitpod/list-topics.sh %}</code></pre>
+++++

You should see the `__consumer_offsets` and `_schemas` topics listed. If you see an error like `No such container: broker`, the images may still be downloading in the terminal named `Docker`.

When Kafka is up and running, run this command to open a shell on the broker Docker container:
+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-producer-application/kafka/code/tutorial-steps/gitpod/open-docker-shell.sh %}</code></pre>
+++++

Next, create the topic that the producer will write to:

+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-producer-application/kafka/code/tutorial-steps/gitpod/create-topic.sh %}</code></pre>
+++++

Enter `CTRL+D` to exit the broker shell.
@@ -0,0 +1,5 @@
Now that you have a working producer application and a way to test with `kafka-console-consumer`, you may want to modify the application to do something more interesting. For example, the program ends after printing each line from the file. How might you modify it to print a record every second? How would you change it to emit results of an API call every second and never end? Can you figure out how to emit weather data from https://open-meteo.com/[Open-Meteo] every second for your current location and store the results in a `temperature` topic?
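
As a starting point for the first question, here is a rough sketch (assuming the `producerApp` wrapper from this tutorial; the scheduler setup is illustrative, not part of the PR) that sends a record every second instead of exiting:

[source, java]
----
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Assumes producerApp is the KafkaProducerApplication from this tutorial
final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

// Send one record per second; the text before the "-" becomes the key
scheduler.scheduleAtFixedRate(
        () -> producerApp.produce("tick-" + System.currentTimeMillis()),
        0, 1, TimeUnit.SECONDS);
----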

When you are done tinkering, ensure that you shut down your Gitpod workspace so that you don't waste credits or incur costs unnecessarily. You can do this by navigating to https://gitpod.io/workspaces[https://gitpod.io/workspaces], clicking the three dots on the right of the running workspace, and clicking `Delete` followed by `Delete Workspace`.

Once everything is cleaned up, head on back to Confluent Developer to try link:{{ site.url }}[another tutorial]!
@@ -0,0 +1,21 @@
Now we'll run a console consumer to read records from the output topic and confirm that your application published the expected records.

First, open a shell on the broker Docker container:

+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-producer-application/kafka/code/tutorial-steps/gitpod/open-docker-shell.sh %}</code></pre>
+++++

Now run the console consumer:

+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-producer-application/kafka/code/tutorial-steps/gitpod/console-consumer.sh %}</code></pre>
+++++

The output from the consumer can vary if you added any of your own records, but it should look something like this:

+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-producer-application/kafka/code/tutorial-steps/gitpod/expected-output.txt %}</code></pre>
+++++

Now close the consumer with `CTRL+C`, then the broker shell with `CTRL+D`.
@@ -0,0 +1,27 @@
Now that you have an uberjar for the `KafkaProducerApplication`, you can launch it in the workspace.
+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-producer-application/kafka/code/tutorial-steps/gitpod/run-dev-app.sh %}</code></pre>
+++++

After you run the previous command, the application will process the file and you should see something like this on the console:

[source, text]
----
Offsets and timestamps committed in batch from input.txt
Record written to offset 0 timestamp 1597352120029
Record written to offset 1 timestamp 1597352120037
Record written to offset 2 timestamp 1597352120037
Record written to offset 3 timestamp 1597352120037
Record written to offset 4 timestamp 1597352120037
Record written to offset 5 timestamp 1597352120037
Record written to offset 6 timestamp 1597352120037
Record written to offset 7 timestamp 1597352120037
Record written to offset 8 timestamp 1597352120037
Record written to offset 9 timestamp 1597352120037
Record written to offset 10 timestamp 1597352120038
----

Now you can experiment by creating your own file in the base directory and re-running the above command, substituting your file name for `input.txt`.

Remember, any data before the `-` is the key, and data after it is the value. For example, the line `fruit-apple` is sent with key `fruit` and value `apple`.
