From the course: Apache Flink: Exploratory Data Analytics with SQL

Writing tables to Kafka

- [Instructor] In this video, I will show you how to stream the output of a dynamic Flink table to a Kafka topic. We will output the sliding window table we created in the previous video to this Kafka topic. First, we need to set up a Kafka producer. We start by setting up the properties object with the Kafka broker list information. Then, we need to set up a serializer that can convert each row in the Flink table to its equivalent Kafka producer message. We implement this serialization schema class and its serialize method. Here, we extract the required information from the Row object and convert it to a string representation. We return the byte representation of that string. We now create a Kafka table sink. For creating the sink, we provide the input schema, which is the schema of the sliding window table. We provide the destination Kafka topic to produce messages to. Then, we provide the properties object…
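The course's exact code is not shown on this page, so the following is a minimal sketch of the steps described above, assuming Flink 1.x's universal Kafka connector (FlinkKafkaProducer) at the DataStream level rather than the exact table-sink class the video uses, since that API varies across Flink versions. The class name RowToBytesSchema, the broker address localhost:9092, and the topic name sliding-window-output are hypothetical placeholders.

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;

import org.apache.flink.api.common.serialization.SerializationSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.types.Row;

public class KafkaSinkSketch {

    // Serializer that converts each table Row into the byte payload of a
    // Kafka producer message, via a comma-separated string representation.
    public static class RowToBytesSchema implements SerializationSchema<Row> {
        @Override
        public byte[] serialize(Row row) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < row.getArity(); i++) {
                if (i > 0) {
                    sb.append(",");
                }
                sb.append(row.getField(i)); // extract each field from the Row
            }
            // Return the byte representation of that string.
            return sb.toString().getBytes(StandardCharsets.UTF_8);
        }
    }

    public static FlinkKafkaProducer<Row> buildKafkaSink() {
        // Properties object holding the Kafka broker list.
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // assumed broker

        // Sink: destination topic, row serializer, and broker properties.
        return new FlinkKafkaProducer<>(
                "sliding-window-output",   // hypothetical destination topic
                new RowToBytesSchema(),
                props);
    }
}
```

To wire the sliding window table into this sink, the table would first be bridged to a stream, for example with tableEnv.toAppendStream(windowedTable, Row.class).addSink(KafkaSinkSketch.buildKafkaSink()), where windowedTable is an assumed name for the table from the previous video; a windowed aggregation produces append-only results, so toAppendStream suffices here.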
