Alerts and thresholds: Pipeline implementation

From the course: Stream Processing Design Patterns with Kafka Streams

- [Instructor] Let's now review the alerts and thresholds pattern use case, topology, and code in this video. We start off with the Kafka alert data generator, which generates exception messages to the Kafka input topic, streaming.alerts.input. This is run in a separate thread. Then we start building the topology for the alerts and thresholds pipeline. We set up the required Serdes as we did in the previous chapter. We then set up the Properties object with the properties for our Kafka broker, which we will use for both input and output. We build a KStream to consume the exception messages from the streaming.alerts.input topic. Then we use a mapValues function to read the CSV value, extract the individual attributes, and use them to create the alert object. The results are pushed to another KStream. We then proceed to execute our first requirement: to filter critical alerts and publish them to another topic. We…
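To make those steps concrete, here is a minimal Java sketch of the topology described above. The input topic name streaming.alerts.input comes from the transcript; everything else is an assumption for illustration, not the course's actual code: the Alert class and its fields, the CSV layout (level,code,message), the broker address, and the output topic name streaming.alerts.critical.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class AlertsThresholdsPipeline {

    // Hypothetical value object holding the parsed alert attributes.
    public static class Alert {
        public final String level;   // e.g. CRITICAL, HIGH, LOW
        public final String code;
        public final String message;

        public Alert(String level, String code, String message) {
            this.level = level;
            this.code = code;
            this.message = message;
        }

        @Override
        public String toString() {
            return level + "," + code + "," + message;
        }
    }

    public static void main(String[] args) {
        // Properties for the Kafka broker, used for both input and output.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "alerts-thresholds-pipeline");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Consume the raw CSV exception messages from the input topic.
        KStream<String, String> rawAlerts = builder.stream("streaming.alerts.input");

        // Parse each CSV value into an Alert object (assumed layout: level,code,message).
        KStream<String, Alert> alerts = rawAlerts.mapValues(value -> {
            String[] fields = value.split(",");
            return new Alert(fields[0].trim(), fields[1].trim(), fields[2].trim());
        });

        // First requirement: filter critical alerts and publish them to another topic
        // (output topic name assumed here).
        alerts
            .filter((key, alert) -> "CRITICAL".equalsIgnoreCase(alert.level))
            .mapValues(Alert::toString)   // back to CSV so the default String serde applies
            .to("streaming.alerts.critical");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the topology cleanly on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The sketch keeps everything on the default String serde and converts the Alert back to CSV before writing out; the course itself may use custom Serdes for the Alert object instead.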
