
== What is this app?

This is an example of a Spring Cloud Stream processor using Kafka Streams support.

The application has two inputs: a KStream for user clicks and a GlobalKTable for user regions.
It joins the stream against the table and aggregates the total clicks per region. You can compare this with the KTable join sample.
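
Below is a minimal sketch of such a processor using the Kafka Streams binder's functional style. The bean name `process`, the record types (`Long` click counts and `String` region names keyed by user id), and the exact topology are assumptions for illustration, not necessarily the code shipped with this sample.

[source,java]
----
import java.util.function.BiFunction;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.kstream.GlobalKTable;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class KafkaStreamsGlobalKTableJoinApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaStreamsGlobalKTableJoinApplication.class, args);
    }

    @Bean
    public BiFunction<KStream<String, Long>, GlobalKTable<String, String>, KStream<String, Long>> process() {
        return (userClicks, userRegions) -> userClicks
                // Join each click record with the user's region from the GlobalKTable,
                // using the stream key (user id) to look up the table.
                .join(userRegions,
                        (userId, clicks) -> userId,
                        (clicks, region) -> new KeyValue<>(region, clicks))
                // Re-key by region so the aggregation below is per region.
                .map((userId, regionAndClicks) -> regionAndClicks)
                // Sum the clicks for each region.
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Long()))
                .reduce(Long::sum)
                .toStream();
    }
}
----

With the functional style, the binder maps the two inputs and the output to topics through bindings such as `process-in-0`, `process-in-1`, and `process-out-0` (assuming the bean name above) in the application configuration.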

=== Running the app:

Go to the root of the repository.

`docker-compose up -d`

`./mvnw clean package`

`java -jar target/kafka-streams-global-table-join-0.0.1-SNAPSHOT.jar`

`docker exec -it kafka-join-1 /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic user-regions --key-deserializer org.apache.kafka.common.serialization.StringDeserializer --value-deserializer org.apache.kafka.common.serialization.StringDeserializer --property print.key=true --property key.separator="-"`

`docker exec -it kafka-join-1 /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic output-topic --key-deserializer org.apache.kafka.common.serialization.StringDeserializer --value-deserializer org.apache.kafka.common.serialization.LongDeserializer --property print.key=true --property key.separator="-"`


Run the stand-alone `Producers` application to generate some data and watch the output on the console consumers started above.
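
If you want to generate data without the sample's `Producers` class, here is a minimal stand-alone producer sketch using `KafkaTemplate`. The topic names, key/value types, and sample records are assumptions based on the console consumer commands above, not the actual `Producers` code from this sample.

[source,java]
----
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.LongSerializer;
import org.apache.kafka.common.serialization.StringSerializer;

import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;

public class SampleProducers {

    public static void main(String[] args) {
        Map<String, Object> common = Map.of(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        // user-regions: String key (user id) -> String value (region name)
        KafkaTemplate<String, String> regions = new KafkaTemplate<>(
                new DefaultKafkaProducerFactory<>(common, new StringSerializer(), new StringSerializer()));
        regions.send("user-regions", "alice", "europe");
        regions.send("user-regions", "bob", "americas");

        // user-clicks: String key (user id) -> Long value (click count)
        KafkaTemplate<String, Long> clicks = new KafkaTemplate<>(
                new DefaultKafkaProducerFactory<>(common, new StringSerializer(), new LongSerializer()));
        clicks.send("user-clicks", "alice", 13L);
        clicks.send("user-clicks", "bob", 4L);

        regions.flush();
        clicks.flush();
    }
}
----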