diff --git a/README.adoc b/README.adoc
index 7e1e0d5..16ba758 100644
--- a/README.adoc
+++ b/README.adoc
@@ -1,44 +1,69 @@
-== Samples
+== Spring Cloud Stream Sample Applications
-There are several samples, most running on the RabbitMQ transport (so you need RabbitMQ running locally to test them).
+This repository contains a collection of applications written with Spring Cloud Stream. All of the applications are self-contained.
+They can be run against either Kafka or RabbitMQ middleware.
+You have the option of running the samples against local or Docker-containerized versions of Kafka and RabbitMQ.
+For convenience, a `docker-compose.yml` file is provided as part of each application wherever applicable.
+For this reason, Docker Compose is required, and it is recommended to use the https://docs.docker.com/compose/install/[latest version].
+These compose files bring up the middleware (Kafka or RabbitMQ) and any other components necessary for running each app.
+If you bring up Kafka or RabbitMQ in Docker containers, please make sure that you bring them down (`docker-compose down`) from the same sample directory.
+Each sample has its own README; follow the instructions there to run it.
-To build the samples do:
+You can build all of the samples by going to the root of the repository and running: `./mvnw clean package`
+However, the recommended approach is to pick the sample you are interested in, go to that application's directory, and follow the instructions in its README.
-```
- ./mvnw clean build
-```
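As a concrete sketch of the per-sample workflow described above (the sample directory name is illustrative — substitute whichever sample you picked):

```shell
cd kafka-streams-samples/kafka-streams-word-count   # pick any sample directory
docker-compose up -d        # bring up the middleware for this sample
./mvnw clean package        # build just this sample
java -jar target/*.jar      # run the built application
docker-compose down         # tear down from the same sample directory
```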
+=== The following sample applications are provided
+==== Source samples
-* `double` is an example of an aggregate application, the Source and Sink are combined into one single application.
+* Sample JDBC source using MySQL - MariaDB variant - (Kafka and Rabbit)
-* `dynamic-source` publishes messages to dynamically created destinations.
+* Source with dynamic destinations (Kafka and Rabbit)
-* `kinesis-produce-consume` An example application using spring-cloud-stream-binder-aws-kinesis. Presents a web endpoint to send Orders, these are placed on a Kinesis stream and then consumed by the application from that stream.
+==== Sink samples
-* `multi-io` shows how to use configure multiple input/output channels inside a single application.
+* Simple JDBC sink using MariaDB (Kafka and Rabbit)
-* `multibinder-differentsystems` shows how an application could use same binder implementation but different configurations for its channels. In this case, a processor's input/output channels connect to same binder implementation but with two separate broker configurations.
+==== Processor samples
-* `multibinder` shows how an application could use multiple binders. In this case, the processor's input/output channels connect to different brokers using their own binder configurations.
+* Basic StreamListener sample (Kafka and Rabbit)
+* Transformer sample (Kafka and Rabbit)
+* Reactive processor sample (Kafka and Rabbit)
-* `non-self-contained-aggregate-app` shows how to write a non self-contained aggregate application.
+==== Multi IO sample
-* `reactive-processor-kafka` shows how to create a reactive Apache Kafka processor application.
+* Sample with multiple input/output bindings (Kafka and Rabbit)
-* `rxjava-processor` shows how to create an RxJava processor application.
+==== Multi Binder samples
-* `sink` A simple sink that logs the incoming payload. It has no options (but some could easily be added), and just logs incoming messages at INFO level.
+* Multi binder - Input with Kafka and output with Rabbit
+* Multi binder - Same binder type but different clusters (Kafka only, but can be extended for Rabbit as well)
-* `source` A simple time source example. It has a "fixedDelay" option (in milliseconds) for the period between emitting messages.
+==== Kinesis
-* `stream-listener` shows how to use StreamListener support to enable message mapping and automatic type conversion.
+* Kinesis produce consume sample
-* `test-embedded-kafka` is a sample that shows how to test with an embedded Apache Kafka broker.
-We generally recommend testing with the http://docs.spring.io/spring-cloud-stream/docs/current/reference/htmlsingle/#_testing[TestSupportBinder] but if you have a need for testing with an embedded broker, you can use the techniques in this sample.
+==== Kafka Streams samples
-* `transform` is a simple pass through logging transformer (just logs the incoming message and passes it on).
+A collection of stream processing applications using the Spring Cloud Stream support for Kafka Streams.
-* `kstream` is a collection of applications that demonstrate the capabilities of the Spring Cloud Stream support for Kafka Streams
+* Kafka Streams word count
+* Kafka Streams branching
+* Kafka Streams DLQ
+* Kafka Streams aggregation
+* Kafka Streams Interactive query basic
+* Kafka Streams Interactive query advanced
+* Kafka Streams product tracker
+* Kafka Streams KTable join
+* Kafka Streams and normal Kafka binder together
-* `testing` is a bunch of applications and tests for them to demonstrate the capabilities of testing for the the Spring Cloud Stream applications.
+==== Testing samples
+* Sample with embedded Kafka
+* General testing patterns in Spring Cloud Stream
+
+==== Samples Acceptance Tests
+
+This module is strictly used as an end-to-end acceptance test framework for the samples in this repo.
+By default, the tests are not run as part of a normal build.
+Please see the README in the acceptance test module for more details.
\ No newline at end of file
diff --git a/dynamic-destination-source/pom.xml b/dynamic-destination-source/pom.xml
deleted file mode 100644
index 06302de..0000000
--- a/dynamic-destination-source/pom.xml
+++ /dev/null
@@ -1,94 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-	<modelVersion>4.0.0</modelVersion>
-
-	<groupId>spring.cloud.stream.samples</groupId>
-	<artifactId>dynamic-destination-source</artifactId>
-	<version>0.0.1-SNAPSHOT</version>
-	<packaging>jar</packaging>
-
-	<name>dynamic-destination-source</name>
-	<description>Demo project for Spring Boot</description>
-
-	<parent>
-		<groupId>org.springframework.boot</groupId>
-		<artifactId>spring-boot-starter-parent</artifactId>
-		<version>2.0.0.BUILD-SNAPSHOT</version>
-		<relativePath/>
-	</parent>
-
-	<properties>
-		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-		<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
-		<java.version>1.8</java.version>
-		<spring-cloud.version>Finchley.BUILD-SNAPSHOT</spring-cloud.version>
-	</properties>
-
-	<dependencies>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-actuator</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.cloud</groupId>
-			<artifactId>spring-cloud-stream-binder-kafka</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-web</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-test</artifactId>
-			<scope>test</scope>
-		</dependency>
-	</dependencies>
-
-	<dependencyManagement>
-		<dependencies>
-			<dependency>
-				<groupId>org.springframework.cloud</groupId>
-				<artifactId>spring-cloud-dependencies</artifactId>
-				<version>${spring-cloud.version}</version>
-				<type>pom</type>
-				<scope>import</scope>
-			</dependency>
-		</dependencies>
-	</dependencyManagement>
-
-	<build>
-		<plugins>
-			<plugin>
-				<groupId>org.springframework.boot</groupId>
-				<artifactId>spring-boot-maven-plugin</artifactId>
-			</plugin>
-		</plugins>
-	</build>
-
-	<repositories>
-		<repository>
-			<id>spring-snapshots</id>
-			<name>Spring Snapshots</name>
-			<url>http://repo.spring.io/libs-snapshot-local</url>
-			<snapshots>
-				<enabled>true</enabled>
-			</snapshots>
-			<releases>
-				<enabled>false</enabled>
-			</releases>
-		</repository>
-		<repository>
-			<id>spring-milestones</id>
-			<name>Spring Milestones</name>
-			<url>http://repo.spring.io/libs-milestone-local</url>
-			<snapshots>
-				<enabled>false</enabled>
-			</snapshots>
-		</repository>
-	</repositories>
-
-</project>
diff --git a/dynamic-destination-source/start-kafka-shell.sh b/dynamic-destination-source/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/dynamic-destination-source/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/jdbc-sink/pom.xml b/jdbc-sink/pom.xml
deleted file mode 100644
index 369779a..0000000
--- a/jdbc-sink/pom.xml
+++ /dev/null
@@ -1,115 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-	<modelVersion>4.0.0</modelVersion>
-
-	<groupId>spring.cloud.stream.samples</groupId>
-	<artifactId>jdbc-sink</artifactId>
-	<version>0.0.1-SNAPSHOT</version>
-	<packaging>jar</packaging>
-
-	<name>sample-jdbc-sink</name>
-	<description>Demo project for Spring Boot</description>
-
-	<parent>
-		<groupId>org.springframework.boot</groupId>
-		<artifactId>spring-boot-starter-parent</artifactId>
-		<version>2.0.0.RELEASE</version>
-		<relativePath/>
-	</parent>
-
-	<properties>
-		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-		<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
-		<java.version>1.8</java.version>
-		<spring-cloud.version>Finchley.BUILD-SNAPSHOT</spring-cloud.version>
-	</properties>
-
-	<dependencies>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-actuator</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.mariadb.jdbc</groupId>
-			<artifactId>mariadb-java-client</artifactId>
-			<version>1.1.9</version>
-			<scope>runtime</scope>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.cloud</groupId>
-			<artifactId>spring-cloud-stream-binder-kafka</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.integration</groupId>
-			<artifactId>spring-integration-jdbc</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-jdbc</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-test</artifactId>
-			<scope>test</scope>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.cloud</groupId>
-			<artifactId>spring-cloud-stream-test-support</artifactId>
-			<scope>test</scope>
-		</dependency>
-		<dependency>
-			<groupId>com.h2database</groupId>
-			<artifactId>h2</artifactId>
-			<scope>test</scope>
-		</dependency>
-	</dependencies>
-
-	<dependencyManagement>
-		<dependencies>
-			<dependency>
-				<groupId>org.springframework.cloud</groupId>
-				<artifactId>spring-cloud-dependencies</artifactId>
-				<version>${spring-cloud.version}</version>
-				<type>pom</type>
-				<scope>import</scope>
-			</dependency>
-		</dependencies>
-	</dependencyManagement>
-
-	<build>
-		<plugins>
-			<plugin>
-				<groupId>org.springframework.boot</groupId>
-				<artifactId>spring-boot-maven-plugin</artifactId>
-			</plugin>
-		</plugins>
-	</build>
-
-	<repositories>
-		<repository>
-			<id>spring-snapshots</id>
-			<name>Spring Snapshots</name>
-			<url>http://repo.spring.io/libs-snapshot-local</url>
-			<snapshots>
-				<enabled>true</enabled>
-			</snapshots>
-			<releases>
-				<enabled>false</enabled>
-			</releases>
-		</repository>
-		<repository>
-			<id>spring-milestones</id>
-			<name>Spring Milestones</name>
-			<url>http://repo.spring.io/libs-milestone-local</url>
-			<snapshots>
-				<enabled>false</enabled>
-			</snapshots>
-		</repository>
-	</repositories>
-
-</project>
diff --git a/jdbc-sink/start-kafka-shell.sh b/jdbc-sink/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/jdbc-sink/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/jdbc-source/pom.xml b/jdbc-source/pom.xml
deleted file mode 100644
index f9440df..0000000
--- a/jdbc-source/pom.xml
+++ /dev/null
@@ -1,115 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-	<modelVersion>4.0.0</modelVersion>
-
-	<groupId>spring.cloud.stream.samples</groupId>
-	<artifactId>jdbc-source</artifactId>
-	<version>0.0.1-SNAPSHOT</version>
-	<packaging>jar</packaging>
-
-	<name>sample-jdbc-source</name>
-	<description>Demo project for Spring Boot</description>
-
-	<parent>
-		<groupId>org.springframework.boot</groupId>
-		<artifactId>spring-boot-starter-parent</artifactId>
-		<version>2.0.0.RELEASE</version>
-		<relativePath/>
-	</parent>
-
-	<properties>
-		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-		<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
-		<java.version>1.8</java.version>
-		<spring-cloud.version>Finchley.BUILD-SNAPSHOT</spring-cloud.version>
-	</properties>
-
-	<dependencies>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-actuator</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.mariadb.jdbc</groupId>
-			<artifactId>mariadb-java-client</artifactId>
-			<version>1.1.9</version>
-			<scope>runtime</scope>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.cloud</groupId>
-			<artifactId>spring-cloud-stream-binder-kafka</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.integration</groupId>
-			<artifactId>spring-integration-jdbc</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-jdbc</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.cloud</groupId>
-			<artifactId>spring-cloud-stream-test-support</artifactId>
-			<scope>test</scope>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-test</artifactId>
-			<scope>test</scope>
-		</dependency>
-		<dependency>
-			<groupId>com.h2database</groupId>
-			<artifactId>h2</artifactId>
-			<scope>test</scope>
-		</dependency>
-	</dependencies>
-
-	<dependencyManagement>
-		<dependencies>
-			<dependency>
-				<groupId>org.springframework.cloud</groupId>
-				<artifactId>spring-cloud-dependencies</artifactId>
-				<version>${spring-cloud.version}</version>
-				<type>pom</type>
-				<scope>import</scope>
-			</dependency>
-		</dependencies>
-	</dependencyManagement>
-
-	<build>
-		<plugins>
-			<plugin>
-				<groupId>org.springframework.boot</groupId>
-				<artifactId>spring-boot-maven-plugin</artifactId>
-			</plugin>
-		</plugins>
-	</build>
-
-	<repositories>
-		<repository>
-			<id>spring-snapshots</id>
-			<name>Spring Snapshots</name>
-			<url>http://repo.spring.io/libs-snapshot-local</url>
-			<snapshots>
-				<enabled>true</enabled>
-			</snapshots>
-			<releases>
-				<enabled>false</enabled>
-			</releases>
-		</repository>
-		<repository>
-			<id>spring-milestones</id>
-			<name>Spring Milestones</name>
-			<url>http://repo.spring.io/libs-milestone-local</url>
-			<snapshots>
-				<enabled>false</enabled>
-			</snapshots>
-		</repository>
-	</repositories>
-
-</project>
diff --git a/jdbc-source/start-kafka-shell.sh b/jdbc-source/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/jdbc-source/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/kafka-streams/kafka-streams-aggregate/.gitignore b/kafka-streams-samples/kafka-streams-aggregate/.gitignore
similarity index 100%
rename from kafka-streams/kafka-streams-aggregate/.gitignore
rename to kafka-streams-samples/kafka-streams-aggregate/.gitignore
diff --git a/kafka-streams/kafka-streams-aggregate/.mvn/wrapper/maven-wrapper.jar b/kafka-streams-samples/kafka-streams-aggregate/.mvn/wrapper/maven-wrapper.jar
similarity index 100%
rename from kafka-streams/kafka-streams-aggregate/.mvn/wrapper/maven-wrapper.jar
rename to kafka-streams-samples/kafka-streams-aggregate/.mvn/wrapper/maven-wrapper.jar
diff --git a/kafka-streams/kafka-streams-aggregate/.mvn/wrapper/maven-wrapper.properties b/kafka-streams-samples/kafka-streams-aggregate/.mvn/wrapper/maven-wrapper.properties
similarity index 100%
rename from kafka-streams/kafka-streams-aggregate/.mvn/wrapper/maven-wrapper.properties
rename to kafka-streams-samples/kafka-streams-aggregate/.mvn/wrapper/maven-wrapper.properties
diff --git a/kafka-streams/kafka-streams-aggregate/README.adoc b/kafka-streams-samples/kafka-streams-aggregate/README.adoc
similarity index 60%
rename from kafka-streams/kafka-streams-aggregate/README.adoc
rename to kafka-streams-samples/kafka-streams-aggregate/README.adoc
index ec6ad83..1a7fc02 100644
--- a/kafka-streams/kafka-streams-aggregate/README.adoc
+++ b/kafka-streams-samples/kafka-streams-aggregate/README.adoc
@@ -6,12 +6,11 @@ The application simply aggregates a string for a particular key.
=== Running the app:
-*Make the appropriate changes in application.yml if need be.
+Go to the root of the repository and do:
-`spring.cloud.stream.kstream.binder.brokers=` +
-`spring.cloud.stream.kstream.binder.zkNodes=`
+`docker-compose up -d`
-Go to the root of the repository and do: `./mvnw clean package`
+`./mvnw clean package`
`java -jar target/kafka-streams-aggregate-0.0.1-SNAPSHOT.jar`
diff --git a/kafka-streams/kafka-streams-word-count/docker/docker-compose.yml b/kafka-streams-samples/kafka-streams-aggregate/docker-compose.yml
similarity index 97%
rename from kafka-streams/kafka-streams-word-count/docker/docker-compose.yml
rename to kafka-streams-samples/kafka-streams-aggregate/docker-compose.yml
index 82e56c7..b6d7d38 100644
--- a/kafka-streams/kafka-streams-word-count/docker/docker-compose.yml
+++ b/kafka-streams-samples/kafka-streams-aggregate/docker-compose.yml
@@ -1,4 +1,4 @@
-version: '2'
+version: '3'
services:
kafka:
image: wurstmeister/kafka
diff --git a/kafka-streams/kafka-streams-aggregate/mvnw b/kafka-streams-samples/kafka-streams-aggregate/mvnw
similarity index 100%
rename from kafka-streams/kafka-streams-aggregate/mvnw
rename to kafka-streams-samples/kafka-streams-aggregate/mvnw
diff --git a/kafka-streams/kafka-streams-aggregate/mvnw.cmd b/kafka-streams-samples/kafka-streams-aggregate/mvnw.cmd
similarity index 100%
rename from kafka-streams/kafka-streams-aggregate/mvnw.cmd
rename to kafka-streams-samples/kafka-streams-aggregate/mvnw.cmd
diff --git a/kafka-streams-samples/kafka-streams-aggregate/pom.xml b/kafka-streams-samples/kafka-streams-aggregate/pom.xml
new file mode 100644
index 0000000..e58f0de
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-aggregate/pom.xml
@@ -0,0 +1,42 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>kafka-streams-aggregate</artifactId>
+	<version>0.0.1-SNAPSHOT</version>
+	<packaging>jar</packaging>
+
+	<name>kafka-streams-aggregate</name>
+	<description>Demo project for Spring Boot</description>
+
+	<parent>
+		<groupId>spring.cloud.stream.samples</groupId>
+		<artifactId>spring-cloud-stream-samples-parent</artifactId>
+		<version>0.0.1-SNAPSHOT</version>
+		<relativePath>../..</relativePath>
+	</parent>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.springframework.cloud</groupId>
+			<artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
+		</dependency>
+
+		<dependency>
+			<groupId>org.springframework.boot</groupId>
+			<artifactId>spring-boot-starter-test</artifactId>
+			<scope>test</scope>
+		</dependency>
+	</dependencies>
+
+	<build>
+		<plugins>
+			<plugin>
+				<groupId>org.springframework.boot</groupId>
+				<artifactId>spring-boot-maven-plugin</artifactId>
+			</plugin>
+		</plugins>
+	</build>
+
+</project>
diff --git a/non-self-contained-aggregate-app/src/main/java/config/processor/ProcessorApplication.java b/kafka-streams-samples/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/DomainEvent.java
similarity index 56%
rename from non-self-contained-aggregate-app/src/main/java/config/processor/ProcessorApplication.java
rename to kafka-streams-samples/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/DomainEvent.java
index 576a1a7..5badcc1 100644
--- a/non-self-contained-aggregate-app/src/main/java/config/processor/ProcessorApplication.java
+++ b/kafka-streams-samples/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/DomainEvent.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2017 the original author or authors.
+ * Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -14,14 +14,30 @@
* limitations under the License.
*/
-package config.processor;
-
-import org.springframework.boot.autoconfigure.SpringBootApplication;
+package kafka.streams.table.join;
/**
- * @author Marius Bogoevici
+ * @author Soby Chacko
*/
-@SpringBootApplication
-public class ProcessorApplication {
+public class DomainEvent {
+ String eventType;
+
+ String boardUuid;
+
+ public String getEventType() {
+ return eventType;
+ }
+
+ public void setEventType(String eventType) {
+ this.eventType = eventType;
+ }
+
+ public String getBoardUuid() {
+ return boardUuid;
+ }
+
+ public void setBoardUuid(String boardUuid) {
+ this.boardUuid = boardUuid;
+ }
}
diff --git a/kafka-streams/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/KafkaStreamsAggregateSample.java b/kafka-streams-samples/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/KafkaStreamsAggregateSample.java
similarity index 81%
rename from kafka-streams/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/KafkaStreamsAggregateSample.java
rename to kafka-streams-samples/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/KafkaStreamsAggregateSample.java
index 14d7d5c..5af6f0e 100644
--- a/kafka-streams/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/KafkaStreamsAggregateSample.java
+++ b/kafka-streams-samples/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/KafkaStreamsAggregateSample.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.table.join;
import com.fasterxml.jackson.databind.ObjectMapper;
diff --git a/kafka-streams/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/Producers.java b/kafka-streams-samples/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/Producers.java
similarity index 72%
rename from kafka-streams/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/Producers.java
rename to kafka-streams-samples/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/Producers.java
index 4821926..a675a9a 100644
--- a/kafka-streams/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/Producers.java
+++ b/kafka-streams-samples/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/Producers.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.table.join;
import com.fasterxml.jackson.databind.ObjectMapper;
diff --git a/kafka-streams/kafka-streams-aggregate/src/main/resources/application.yml b/kafka-streams-samples/kafka-streams-aggregate/src/main/resources/application.yml
similarity index 100%
rename from kafka-streams/kafka-streams-aggregate/src/main/resources/application.yml
rename to kafka-streams-samples/kafka-streams-aggregate/src/main/resources/application.yml
diff --git a/kafka-streams/kafka-streams-aggregate/src/main/resources/logback.xml b/kafka-streams-samples/kafka-streams-aggregate/src/main/resources/logback.xml
similarity index 100%
rename from kafka-streams/kafka-streams-aggregate/src/main/resources/logback.xml
rename to kafka-streams-samples/kafka-streams-aggregate/src/main/resources/logback.xml
diff --git a/kafka-streams/kafka-streams-aggregate/src/test/java/kafka/streams/table/join/KafkaStreamsAggregateSampleTests.java b/kafka-streams-samples/kafka-streams-aggregate/src/test/java/kafka/streams/table/join/KafkaStreamsAggregateSampleTests.java
similarity index 100%
rename from kafka-streams/kafka-streams-aggregate/src/test/java/kafka/streams/table/join/KafkaStreamsAggregateSampleTests.java
rename to kafka-streams-samples/kafka-streams-aggregate/src/test/java/kafka/streams/table/join/KafkaStreamsAggregateSampleTests.java
diff --git a/kafka-streams/kafka-streams-branching/.gitignore b/kafka-streams-samples/kafka-streams-branching/.gitignore
similarity index 100%
rename from kafka-streams/kafka-streams-branching/.gitignore
rename to kafka-streams-samples/kafka-streams-branching/.gitignore
diff --git a/kafka-streams/kafka-streams-branching/.mvn/wrapper/maven-wrapper.jar b/kafka-streams-samples/kafka-streams-branching/.mvn/wrapper/maven-wrapper.jar
similarity index 100%
rename from kafka-streams/kafka-streams-branching/.mvn/wrapper/maven-wrapper.jar
rename to kafka-streams-samples/kafka-streams-branching/.mvn/wrapper/maven-wrapper.jar
diff --git a/kafka-streams/kafka-streams-branching/.mvn/wrapper/maven-wrapper.properties b/kafka-streams-samples/kafka-streams-branching/.mvn/wrapper/maven-wrapper.properties
similarity index 100%
rename from kafka-streams/kafka-streams-branching/.mvn/wrapper/maven-wrapper.properties
rename to kafka-streams-samples/kafka-streams-branching/.mvn/wrapper/maven-wrapper.properties
diff --git a/kafka-streams-samples/kafka-streams-branching/README.adoc b/kafka-streams-samples/kafka-streams-branching/README.adoc
new file mode 100644
index 0000000..1bb9692
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-branching/README.adoc
@@ -0,0 +1,39 @@
+== What is this app?
+
+This is an example of a Spring Cloud Stream processor using Kafka Streams branching support.
+
+The example is based on the word count application from the https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/main/java/io/confluent/examples/streams/WordCountLambdaExample.java[reference documentation].
+It uses a single input and three output destinations.
+In essence, the application receives text messages from an input topic, filters them by language (English, French, or Spanish, ignoring the rest), computes word occurrence counts in a configurable time window, and reports the counts to the corresponding output topics.
+This sample uses lambda expressions and thus requires Java 8+.
+
+By default, native decoding and encoding are disabled, which means that any deserialization on the inbound side and serialization on the outbound side is performed by the binder using the configured content types.
+
+=== Running the app:
+
+Go to the root of the repository and do:
+
+`docker-compose up -d`
+
+`./mvnw clean package`
+
+`java -jar target/kafka-streams-branching-0.0.1-SNAPSHOT.jar --spring.cloud.stream.kafka.streams.timeWindow.length=60000`
+
+Issue the following commands:
+
+`docker exec -it kafka-branch /opt/kafka/bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic words`
+
+On another terminal:
+
+`docker exec -it kafka-branch /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic english-counts`
+
+On another terminal:
+
+`docker exec -it kafka-branch /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic french-counts`
+
+On another terminal:
+
+`docker exec -it kafka-branch /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic spanish-counts`
+
+Enter text ("English", "French", "Spanish" - case doesn't matter) in the console producer and watch the output in the respective console consumer.
+The word "english" goes to topic english-counts, "french" goes to topic french-counts and "spanish" goes to spanish-counts.
\ No newline at end of file
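The language routing in the branching README above amounts to one predicate per output topic. As a minimal sketch of that selection logic in plain Java (no Kafka dependencies; the class and method names are illustrative, not the sample's actual code — the sample itself uses `KStream` branching):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

public class BranchRouting {

    // Supported languages, in the order the branch predicates would be evaluated.
    static final List<String> LANGUAGES = Arrays.asList("english", "french", "spanish");

    // A word is routed to the "<language>-counts" topic if it matches a supported
    // language (case-insensitive); anything else matches no branch and is dropped,
    // mirroring how KStream branching handles unmatched records.
    static Optional<String> route(String word) {
        String normalized = word == null ? "" : word.trim().toLowerCase();
        if (LANGUAGES.contains(normalized)) {
            return Optional.of(normalized + "-counts");
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        System.out.println(route("English"));  // Optional[english-counts]
        System.out.println(route("german"));   // Optional.empty
    }
}
```

Typing "English" into the console producer therefore shows up (counted) only on the english-counts console consumer, and unsupported words appear nowhere.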
diff --git a/kafka-streams-samples/kafka-streams-branching/docker-compose.yml b/kafka-streams-samples/kafka-streams-branching/docker-compose.yml
new file mode 100644
index 0000000..76d00fc
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-branching/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-branch
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
diff --git a/kafka-streams/kafka-streams-branching/mvnw b/kafka-streams-samples/kafka-streams-branching/mvnw
similarity index 100%
rename from kafka-streams/kafka-streams-branching/mvnw
rename to kafka-streams-samples/kafka-streams-branching/mvnw
diff --git a/kafka-streams/kafka-streams-branching/mvnw.cmd b/kafka-streams-samples/kafka-streams-branching/mvnw.cmd
similarity index 100%
rename from kafka-streams/kafka-streams-branching/mvnw.cmd
rename to kafka-streams-samples/kafka-streams-branching/mvnw.cmd
diff --git a/kafka-streams-samples/kafka-streams-branching/pom.xml b/kafka-streams-samples/kafka-streams-branching/pom.xml
new file mode 100644
index 0000000..dd48b73
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-branching/pom.xml
@@ -0,0 +1,40 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>kafka-streams-branching</artifactId>
+	<version>0.0.1-SNAPSHOT</version>
+	<packaging>jar</packaging>
+
+	<name>kafka-streams-branching</name>
+	<description>Demo project for Spring Boot</description>
+
+	<parent>
+		<groupId>spring.cloud.stream.samples</groupId>
+		<artifactId>spring-cloud-stream-samples-parent</artifactId>
+		<version>0.0.1-SNAPSHOT</version>
+		<relativePath>../..</relativePath>
+	</parent>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.springframework.cloud</groupId>
+			<artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.boot</groupId>
+			<artifactId>spring-boot-starter-test</artifactId>
+			<scope>test</scope>
+		</dependency>
+	</dependencies>
+
+	<build>
+		<plugins>
+			<plugin>
+				<groupId>org.springframework.boot</groupId>
+				<artifactId>spring-boot-maven-plugin</artifactId>
+			</plugin>
+		</plugins>
+	</build>
+</project>
diff --git a/kafka-streams/kafka-streams-branching/src/main/java/kafka/streams/branching/KafkaStreamsBranchingSample.java b/kafka-streams-samples/kafka-streams-branching/src/main/java/kafka/streams/branching/KafkaStreamsBranchingSample.java
similarity index 83%
rename from kafka-streams/kafka-streams-branching/src/main/java/kafka/streams/branching/KafkaStreamsBranchingSample.java
rename to kafka-streams-samples/kafka-streams-branching/src/main/java/kafka/streams/branching/KafkaStreamsBranchingSample.java
index 6ad2e25..acba5c2 100644
--- a/kafka-streams/kafka-streams-branching/src/main/java/kafka/streams/branching/KafkaStreamsBranchingSample.java
+++ b/kafka-streams-samples/kafka-streams-branching/src/main/java/kafka/streams/branching/KafkaStreamsBranchingSample.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.branching;
import org.apache.kafka.streams.KeyValue;
diff --git a/kafka-streams/kafka-streams-branching/src/main/resources/application.yml b/kafka-streams-samples/kafka-streams-branching/src/main/resources/application.yml
similarity index 100%
rename from kafka-streams/kafka-streams-branching/src/main/resources/application.yml
rename to kafka-streams-samples/kafka-streams-branching/src/main/resources/application.yml
diff --git a/kafka-streams/kafka-streams-branching/src/main/resources/logback.xml b/kafka-streams-samples/kafka-streams-branching/src/main/resources/logback.xml
similarity index 100%
rename from kafka-streams/kafka-streams-branching/src/main/resources/logback.xml
rename to kafka-streams-samples/kafka-streams-branching/src/main/resources/logback.xml
diff --git a/self-contained-aggregate-app/src/test/java/demo/ModuleApplicationTests.java b/kafka-streams-samples/kafka-streams-branching/src/test/java/kafka/streams/branching/KafkaStreamsBranchingSampleTests.java
similarity index 62%
rename from self-contained-aggregate-app/src/test/java/demo/ModuleApplicationTests.java
rename to kafka-streams-samples/kafka-streams-branching/src/test/java/kafka/streams/branching/KafkaStreamsBranchingSampleTests.java
index 6f78d4e..c3b4009 100644
--- a/self-contained-aggregate-app/src/test/java/demo/ModuleApplicationTests.java
+++ b/kafka-streams-samples/kafka-streams-branching/src/test/java/kafka/streams/branching/KafkaStreamsBranchingSampleTests.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2015 the original author or authors.
+ * Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -14,23 +14,20 @@
* limitations under the License.
*/
-package demo;
+package kafka.streams.branching;
+import org.junit.Ignore;
import org.junit.Test;
import org.junit.runner.RunWith;
-
import org.springframework.boot.test.context.SpringBootTest;
-import org.springframework.test.annotation.DirtiesContext;
-import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
-import org.springframework.test.context.web.WebAppConfiguration;
+import org.springframework.test.context.junit4.SpringRunner;
-@RunWith(SpringJUnit4ClassRunner.class)
-@SpringBootTest(classes = AggregateApplication.class)
-@WebAppConfiguration
-@DirtiesContext
-public class ModuleApplicationTests {
+@RunWith(SpringRunner.class)
+@SpringBootTest
+public class KafkaStreamsBranchingSampleTests {
@Test
+ @Ignore
public void contextLoads() {
}
diff --git a/kafka-streams/kafka-streams-dlq-sample/.gitignore b/kafka-streams-samples/kafka-streams-dlq-sample/.gitignore
similarity index 100%
rename from kafka-streams/kafka-streams-dlq-sample/.gitignore
rename to kafka-streams-samples/kafka-streams-dlq-sample/.gitignore
diff --git a/kafka-streams/kafka-streams-dlq-sample/.mvn/wrapper/maven-wrapper.jar b/kafka-streams-samples/kafka-streams-dlq-sample/.mvn/wrapper/maven-wrapper.jar
similarity index 100%
rename from kafka-streams/kafka-streams-dlq-sample/.mvn/wrapper/maven-wrapper.jar
rename to kafka-streams-samples/kafka-streams-dlq-sample/.mvn/wrapper/maven-wrapper.jar
diff --git a/kafka-streams/kafka-streams-dlq-sample/.mvn/wrapper/maven-wrapper.properties b/kafka-streams-samples/kafka-streams-dlq-sample/.mvn/wrapper/maven-wrapper.properties
similarity index 100%
rename from kafka-streams/kafka-streams-dlq-sample/.mvn/wrapper/maven-wrapper.properties
rename to kafka-streams-samples/kafka-streams-dlq-sample/.mvn/wrapper/maven-wrapper.properties
diff --git a/kafka-streams-samples/kafka-streams-dlq-sample/README.adoc b/kafka-streams-samples/kafka-streams-dlq-sample/README.adoc
new file mode 100644
index 0000000..b3c570c
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-dlq-sample/README.adoc
@@ -0,0 +1,41 @@
+== What is this app?
+
+This is an example of a Spring Cloud Stream processor using Kafka Streams support.
+
+This is a demonstration of deserialization errors and DLQ in Kafka Streams binder.
+
+The example is based on the word count application from the https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/main/java/io/confluent/examples/streams/WordCountLambdaExample.java[reference documentation].
+It uses a single input and a single output.
+In essence, the application receives text messages from an input topic, computes word occurrence counts in a configurable time window, and reports the counts to an output topic.
+This sample uses lambda expressions and thus requires Java 8+.
+
+=== Running the app:
+
+`docker-compose up -d`
+
+Go to the root of the repository and do: `./mvnw clean package`
+
+`java -jar target/kafka-streams-dlq-sample-0.0.1-SNAPSHOT.jar`
+
+The default application.yml file demonstrates native decoding by Kafka.
+The default value Serde is set to IntegerSerde to force deserialization errors.
+
+Issue the following commands:
+
+`docker exec -it kafka-dlq /opt/kafka/bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic words`
+
+On another terminal:
+
+`docker exec -it kafka-dlq /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic counts`
+
+On another terminal:
+
+`docker exec -it kafka-dlq /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic words-count-dlq`
+
+On the console producer, enter some text data.
+You will see that the messages cause deserialization errors and end up in the DLQ topic, `words-count-dlq`.
+No messages will arrive at the regular destination, `counts`.
+
+There is another yaml file provided (by-framework-decoding.yml).
+Use that as application.yml to see how it works when the deserialization is done by the framework.
+In this case too, the messages in error appear in the DLQ topic.
\ No newline at end of file
diff --git a/kafka-streams-samples/kafka-streams-dlq-sample/docker-compose.yml b/kafka-streams-samples/kafka-streams-dlq-sample/docker-compose.yml
new file mode 100644
index 0000000..3078c1d
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-dlq-sample/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-dlq
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
diff --git a/kafka-streams/kafka-streams-dlq-sample/mvnw b/kafka-streams-samples/kafka-streams-dlq-sample/mvnw
similarity index 100%
rename from kafka-streams/kafka-streams-dlq-sample/mvnw
rename to kafka-streams-samples/kafka-streams-dlq-sample/mvnw
diff --git a/kafka-streams/kafka-streams-dlq-sample/mvnw.cmd b/kafka-streams-samples/kafka-streams-dlq-sample/mvnw.cmd
similarity index 100%
rename from kafka-streams/kafka-streams-dlq-sample/mvnw.cmd
rename to kafka-streams-samples/kafka-streams-dlq-sample/mvnw.cmd
diff --git a/kafka-streams-samples/kafka-streams-dlq-sample/pom.xml b/kafka-streams-samples/kafka-streams-dlq-sample/pom.xml
new file mode 100644
index 0000000..5b1c801
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-dlq-sample/pom.xml
@@ -0,0 +1,42 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>kafka-streams-dlq-sample</artifactId>
+	<version>0.0.1-SNAPSHOT</version>
+	<packaging>jar</packaging>
+
+	<name>kafka-streams-dlq-sample</name>
+	<description>Demo project for Spring Boot</description>
+
+	<parent>
+		<groupId>spring.cloud.stream.samples</groupId>
+		<artifactId>spring-cloud-stream-samples-parent</artifactId>
+		<version>0.0.1-SNAPSHOT</version>
+		<relativePath>../..</relativePath>
+	</parent>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.springframework.cloud</groupId>
+			<artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.boot</groupId>
+			<artifactId>spring-boot-starter-test</artifactId>
+			<scope>test</scope>
+		</dependency>
+	</dependencies>
+
+	<build>
+		<plugins>
+			<plugin>
+				<groupId>org.springframework.boot</groupId>
+				<artifactId>spring-boot-maven-plugin</artifactId>
+			</plugin>
+		</plugins>
+	</build>
+
+</project>
diff --git a/kafka-streams/kafka-streams-dlq-sample/src/main/java/kafka/streams/dlq/sample/KafkaStreamsDlqSample.java b/kafka-streams-samples/kafka-streams-dlq-sample/src/main/java/kafka/streams/dlq/sample/KafkaStreamsDlqSample.java
similarity index 79%
rename from kafka-streams/kafka-streams-dlq-sample/src/main/java/kafka/streams/dlq/sample/KafkaStreamsDlqSample.java
rename to kafka-streams-samples/kafka-streams-dlq-sample/src/main/java/kafka/streams/dlq/sample/KafkaStreamsDlqSample.java
index bf21f8f..e907071 100644
--- a/kafka-streams/kafka-streams-dlq-sample/src/main/java/kafka/streams/dlq/sample/KafkaStreamsDlqSample.java
+++ b/kafka-streams-samples/kafka-streams-dlq-sample/src/main/java/kafka/streams/dlq/sample/KafkaStreamsDlqSample.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.dlq.sample;
import org.apache.kafka.common.serialization.Serdes;
diff --git a/kafka-streams/kafka-streams-dlq-sample/src/main/resources/application.yml b/kafka-streams-samples/kafka-streams-dlq-sample/src/main/resources/application.yml
similarity index 100%
rename from kafka-streams/kafka-streams-dlq-sample/src/main/resources/application.yml
rename to kafka-streams-samples/kafka-streams-dlq-sample/src/main/resources/application.yml
diff --git a/kafka-streams/kafka-streams-dlq-sample/src/main/resources/by-framework-decoding.yml b/kafka-streams-samples/kafka-streams-dlq-sample/src/main/resources/by-framework-decoding.yml
similarity index 100%
rename from kafka-streams/kafka-streams-dlq-sample/src/main/resources/by-framework-decoding.yml
rename to kafka-streams-samples/kafka-streams-dlq-sample/src/main/resources/by-framework-decoding.yml
diff --git a/kafka-streams/kafka-streams-dlq-sample/src/main/resources/logback.xml b/kafka-streams-samples/kafka-streams-dlq-sample/src/main/resources/logback.xml
similarity index 100%
rename from kafka-streams/kafka-streams-dlq-sample/src/main/resources/logback.xml
rename to kafka-streams-samples/kafka-streams-dlq-sample/src/main/resources/logback.xml
diff --git a/kafka-streams/kafka-streams-dlq-sample/src/test/java/kafka/streams/dlq/sample/KafkaStreamsDlqExampleTests.java b/kafka-streams-samples/kafka-streams-dlq-sample/src/test/java/kafka/streams/dlq/sample/KafkaStreamsDlqExampleTests.java
similarity index 100%
rename from kafka-streams/kafka-streams-dlq-sample/src/test/java/kafka/streams/dlq/sample/KafkaStreamsDlqExampleTests.java
rename to kafka-streams-samples/kafka-streams-dlq-sample/src/test/java/kafka/streams/dlq/sample/KafkaStreamsDlqExampleTests.java
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/.gitignore b/kafka-streams-samples/kafka-streams-interactive-query-advanced/.gitignore
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/.gitignore
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/.gitignore
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/.mvn/wrapper/maven-wrapper.jar b/kafka-streams-samples/kafka-streams-interactive-query-advanced/.mvn/wrapper/maven-wrapper.jar
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/.mvn/wrapper/maven-wrapper.jar
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/.mvn/wrapper/maven-wrapper.jar
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/.mvn/wrapper/maven-wrapper.properties b/kafka-streams-samples/kafka-streams-interactive-query-advanced/.mvn/wrapper/maven-wrapper.properties
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/.mvn/wrapper/maven-wrapper.properties
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/.mvn/wrapper/maven-wrapper.properties
diff --git a/kafka-streams-samples/kafka-streams-interactive-query-advanced/README.adoc b/kafka-streams-samples/kafka-streams-interactive-query-advanced/README.adoc
new file mode 100644
index 0000000..3562a08
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-interactive-query-advanced/README.adoc
@@ -0,0 +1,26 @@
+== What is this app?
+
+This is an example of a Spring Cloud Stream processor using Kafka Streams support.
+
+This example is a Spring Cloud Stream adaptation of this Kafka Streams sample: https://github.com/confluentinc/kafka-streams-examples/tree/4.0.0-post/src/main/java/io/confluent/examples/streams/interactivequeries/kafkamusic
+
+This sample demonstrates the concept of interactive queries in Kafka Streams.
+There is a REST service provided as part of the application that can be used to query the store interactively.
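The ranking behind the top-five chart can be sketched in plain Java; the `topN` helper below is hypothetical, standing in for the aggregation that the actual sample keeps in a Kafka Streams state store and exposes over REST:

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class TopFiveSketch {

    // Rank songs by play count, highest first, and keep the top N --
    // the shape of the answer the /charts/top-five endpoint returns.
    static List<String> topN(Map<String, Long> playCounts, int n) {
        return playCounts.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue(Comparator.reverseOrder()))
                .limit(n)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(topN(Map.of("songA", 5L, "songB", 9L, "songC", 2L), 2)); // [songB, songA]
    }
}
```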
+
+=== Running the app:
+
+1. `docker-compose up -d`
+
+2. Start the Confluent Schema Registry. The following command assumes a Confluent Platform installation:
+
+`./bin/schema-registry-start ./etc/schema-registry/schema-registry.properties`
+
+3. Go to the root of the repository and do: `./mvnw clean package`
+
+4. `java -jar target/kafka-streams-interactive-query-advanced-0.0.1-SNAPSHOT.jar`
+
+5. Run the stand-alone `Producers` application to generate data and start the processing.
+Keep it running for a while.
+
+6. Go to the URL: http://localhost:8080/charts/top-five?genre=Punk
+Keep refreshing the URL and you will see the song play count information change.
diff --git a/kafka-streams-samples/kafka-streams-interactive-query-advanced/docker-compose.yml b/kafka-streams-samples/kafka-streams-interactive-query-advanced/docker-compose.yml
new file mode 100644
index 0000000..f5aaab4
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-interactive-query-advanced/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-iq-advanced
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/mvnw b/kafka-streams-samples/kafka-streams-interactive-query-advanced/mvnw
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/mvnw
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/mvnw
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/mvnw.cmd b/kafka-streams-samples/kafka-streams-interactive-query-advanced/mvnw.cmd
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/mvnw.cmd
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/mvnw.cmd
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/pom.xml b/kafka-streams-samples/kafka-streams-interactive-query-advanced/pom.xml
similarity index 59%
rename from kafka-streams/kafka-streams-interactive-query-advanced/pom.xml
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/pom.xml
index b3fef22..9927ad8 100644
--- a/kafka-streams/kafka-streams-interactive-query-advanced/pom.xml
+++ b/kafka-streams-samples/kafka-streams-interactive-query-advanced/pom.xml
@@ -3,26 +3,21 @@
 	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
 	<modelVersion>4.0.0</modelVersion>
 
-	<groupId>kafka.streams.interactive.query</groupId>
 	<artifactId>kafka-streams-interactive-query-advanced</artifactId>
 	<version>0.0.1-SNAPSHOT</version>
 	<packaging>jar</packaging>
 
 	<name>kafka-streams-interactive-query-advanced</name>
-	<description>Demo project for Spring Boot</description>
+	<description>Spring Cloud Stream sample for KStream interactive queries</description>
 
 	<parent>
-		<groupId>org.springframework.boot</groupId>
-		<artifactId>spring-boot-starter-parent</artifactId>
-		<version>2.0.0.BUILD-SNAPSHOT</version>
-		<relativePath/>
+		<groupId>spring.cloud.stream.samples</groupId>
+		<artifactId>spring-cloud-stream-samples-parent</artifactId>
+		<version>0.0.1-SNAPSHOT</version>
+		<relativePath>../..</relativePath>
 	</parent>
 
 	<properties>
-		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-		<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
-		<java.version>1.8</java.version>
-		<spring-cloud.version>Finchley.BUILD-SNAPSHOT</spring-cloud.version>
 		<confluent.version>4.0.0</confluent.version>
 		<avro.version>1.8.2</avro.version>
	</properties>
@@ -49,26 +44,10 @@
 			<artifactId>kafka-schema-registry-client</artifactId>
 			<version>${confluent.version}</version>
 		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.kafka</groupId>
-			<artifactId>spring-kafka</artifactId>
-			<version>2.1.3.BUILD-SNAPSHOT</version>
-		</dependency>
 		<dependency>
 			<groupId>org.springframework.cloud</groupId>
 			<artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
-			<version>2.0.0.BUILD-SNAPSHOT</version>
 		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-web</artifactId>
-		</dependency>
 		<dependency>
 			<groupId>org.springframework.boot</groupId>
 			<artifactId>spring-boot-starter-test</artifactId>
@@ -76,24 +55,8 @@
-	<dependencyManagement>
-		<dependencies>
-			<dependency>
-				<groupId>org.springframework.cloud</groupId>
-				<artifactId>spring-cloud-dependencies</artifactId>
-				<version>${spring-cloud.version}</version>
-				<type>pom</type>
-				<scope>import</scope>
-			</dependency>
-		</dependencies>
-	</dependencyManagement>
-
 	<build>
 		<plugins>
-			<plugin>
-				<groupId>org.springframework.boot</groupId>
-				<artifactId>spring-boot-maven-plugin</artifactId>
-			</plugin>
 			<plugin>
 				<groupId>org.apache.avro</groupId>
 				<artifactId>avro-maven-plugin</artifactId>
@@ -112,27 +75,15 @@
+			<plugin>
+				<groupId>org.springframework.boot</groupId>
+				<artifactId>spring-boot-maven-plugin</artifactId>
+			</plugin>
 		</plugins>
 	</build>
 
 	<repositories>
-		<repository>
-			<id>spring-snapshots</id>
-			<name>Spring Snapshots</name>
-			<url>http://repo.spring.io/libs-snapshot-local</url>
-			<snapshots>
-				<enabled>true</enabled>
-			</snapshots>
-		</repository>
-		<repository>
-			<id>spring-milestones</id>
-			<name>Spring Milestones</name>
-			<url>http://repo.spring.io/libs-milestone-local</url>
-			<snapshots>
-				<enabled>false</enabled>
-			</snapshots>
-		</repository>
 		<repository>
 			<id>confluent</id>
 			<url>http://packages.confluent.io/maven/</url>
 		</repository>
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQuerySample.java b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQuerySample.java
similarity index 94%
rename from kafka-streams/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQuerySample.java
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQuerySample.java
index 9bf407a..249d705 100644
--- a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQuerySample.java
+++ b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQuerySample.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.interactive.query;
import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/Producers.java b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/Producers.java
similarity index 87%
rename from kafka-streams/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/Producers.java
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/Producers.java
index e6efd72..6792c9d 100644
--- a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/Producers.java
+++ b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/Producers.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.interactive.query;
import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/SongPlayCountBean.java b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/SongPlayCountBean.java
similarity index 72%
rename from kafka-streams/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/SongPlayCountBean.java
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/SongPlayCountBean.java
index 8d71057..906a9ea 100644
--- a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/SongPlayCountBean.java
+++ b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/java/kafka/streams/interactive/query/SongPlayCountBean.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.interactive.query;
import java.util.Objects;
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/application.yml b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/application.yml
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/application.yml
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/application.yml
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/playevent.avsc b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/playevent.avsc
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/playevent.avsc
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/playevent.avsc
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/song.avsc b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/song.avsc
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/song.avsc
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/song.avsc
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/songplaycount.avsc b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/songplaycount.avsc
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/songplaycount.avsc
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/avro/kafka/streams/interactive/query/songplaycount.avsc
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/logback.xml b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/logback.xml
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/src/main/resources/logback.xml
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/src/main/resources/logback.xml
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/src/test/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQueryTests.java b/kafka-streams-samples/kafka-streams-interactive-query-advanced/src/test/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQueryTests.java
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-advanced/src/test/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQueryTests.java
rename to kafka-streams-samples/kafka-streams-interactive-query-advanced/src/test/java/kafka/streams/interactive/query/KafkaStreamsInteractiveQueryTests.java
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/.jdk8 b/kafka-streams-samples/kafka-streams-interactive-query-basic/.jdk8
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-basic/.jdk8
rename to kafka-streams-samples/kafka-streams-interactive-query-basic/.jdk8
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/.mvn/wrapper/maven-wrapper.jar b/kafka-streams-samples/kafka-streams-interactive-query-basic/.mvn/wrapper/maven-wrapper.jar
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-basic/.mvn/wrapper/maven-wrapper.jar
rename to kafka-streams-samples/kafka-streams-interactive-query-basic/.mvn/wrapper/maven-wrapper.jar
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/.mvn/wrapper/maven-wrapper.properties b/kafka-streams-samples/kafka-streams-interactive-query-basic/.mvn/wrapper/maven-wrapper.properties
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-basic/.mvn/wrapper/maven-wrapper.properties
rename to kafka-streams-samples/kafka-streams-interactive-query-basic/.mvn/wrapper/maven-wrapper.properties
diff --git a/kafka-streams-samples/kafka-streams-interactive-query-basic/README.adoc b/kafka-streams-samples/kafka-streams-interactive-query-basic/README.adoc
new file mode 100644
index 0000000..59d946d
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-interactive-query-basic/README.adoc
@@ -0,0 +1,35 @@
+== What is this app?
+
+This is an example of a Spring Cloud Stream processor using Kafka Streams support.
+
+The example is based on a contrived use case of tracking products by interactively querying their status.
+The program accepts product IDs and tracks their running counts by interactively querying the underlying store.
+This sample uses lambda expressions and thus requires Java 8+.
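The tracking logic can be sketched in plain Java; the `trackCounts` helper below is hypothetical, a stand-in for the windowed count the sample computes and then queries from its state store:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class ProductTrackerSketch {

    // Count only the product IDs we were asked to track -- a plain-Java
    // stand-in for the windowed count queried from the state store.
    static Map<String, Long> trackCounts(List<String> events, Set<String> trackedIds) {
        Map<String, Long> counts = new LinkedHashMap<>();
        for (String id : events) {
            if (trackedIds.contains(id)) {
                counts.merge(id, 1L, Long::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(trackCounts(
                List.of("123", "124", "999", "123"),
                Set.of("123", "124", "125"))); // {123=2, 124=1}
    }
}
```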
+
+=== Running the app:
+
+Go to the root of the repository and do:
+
+`docker-compose up -d`
+
+`./mvnw clean package`
+
+`java -jar target/kafka-streams-interactive-query-basic-0.0.1-SNAPSHOT.jar --app.product.tracker.productIds=123,124,125`
+
+The above command tracks products with IDs 123, 124, and 125 and prints their counts seen so far every 30 seconds.
+
+Issue the following commands:
+
+`docker exec -it kafka-iq-basic /opt/kafka/bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic products`
+
+Enter the following in the console producer (one line at a time) and watch the output on the console (or IDE) where the application is running.
+
+```
+{"id":"123"}
+{"id":"124"}
+{"id":"125"}
+{"id":"123"}
+{"id":"123"}
+{"id":"123"}
+```
+
diff --git a/kafka-streams-samples/kafka-streams-interactive-query-basic/docker-compose.yml b/kafka-streams-samples/kafka-streams-interactive-query-basic/docker-compose.yml
new file mode 100644
index 0000000..9c33031
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-interactive-query-basic/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-iq-basic
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/mvnw b/kafka-streams-samples/kafka-streams-interactive-query-basic/mvnw
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-basic/mvnw
rename to kafka-streams-samples/kafka-streams-interactive-query-basic/mvnw
diff --git a/kafka-streams-samples/kafka-streams-interactive-query-basic/pom.xml b/kafka-streams-samples/kafka-streams-interactive-query-basic/pom.xml
new file mode 100644
index 0000000..5a762ed
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-interactive-query-basic/pom.xml
@@ -0,0 +1,41 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>kafka-streams-interactive-query-basic</artifactId>
+	<version>0.0.1-SNAPSHOT</version>
+	<packaging>jar</packaging>
+
+	<name>kafka-streams-interactive-query-basic</name>
+	<description>Spring Cloud Stream sample for KStream interactive queries</description>
+
+	<parent>
+		<groupId>spring.cloud.stream.samples</groupId>
+		<artifactId>spring-cloud-stream-samples-parent</artifactId>
+		<version>0.0.1-SNAPSHOT</version>
+		<relativePath>../..</relativePath>
+	</parent>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.springframework.cloud</groupId>
+			<artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.boot</groupId>
+			<artifactId>spring-boot-starter-test</artifactId>
+			<scope>test</scope>
+		</dependency>
+	</dependencies>
+
+	<build>
+		<plugins>
+			<plugin>
+				<groupId>org.springframework.boot</groupId>
+				<artifactId>spring-boot-maven-plugin</artifactId>
+			</plugin>
+		</plugins>
+	</build>
+
+</project>
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/src/main/java/kafka/streams/product/tracker/KafkaStreamsInteractiveQueryApplication.java b/kafka-streams-samples/kafka-streams-interactive-query-basic/src/main/java/kafka/streams/product/tracker/KafkaStreamsInteractiveQueryApplication.java
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-basic/src/main/java/kafka/streams/product/tracker/KafkaStreamsInteractiveQueryApplication.java
rename to kafka-streams-samples/kafka-streams-interactive-query-basic/src/main/java/kafka/streams/product/tracker/KafkaStreamsInteractiveQueryApplication.java
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/src/main/resources/application.yml b/kafka-streams-samples/kafka-streams-interactive-query-basic/src/main/resources/application.yml
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-basic/src/main/resources/application.yml
rename to kafka-streams-samples/kafka-streams-interactive-query-basic/src/main/resources/application.yml
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/src/main/resources/logback.xml b/kafka-streams-samples/kafka-streams-interactive-query-basic/src/main/resources/logback.xml
similarity index 100%
rename from kafka-streams/kafka-streams-interactive-query-basic/src/main/resources/logback.xml
rename to kafka-streams-samples/kafka-streams-interactive-query-basic/src/main/resources/logback.xml
diff --git a/kafka-streams/kafka-streams-message-channel/.gitignore b/kafka-streams-samples/kafka-streams-message-channel/.gitignore
similarity index 100%
rename from kafka-streams/kafka-streams-message-channel/.gitignore
rename to kafka-streams-samples/kafka-streams-message-channel/.gitignore
diff --git a/kafka-streams/kafka-streams-message-channel/.mvn/wrapper/maven-wrapper.jar b/kafka-streams-samples/kafka-streams-message-channel/.mvn/wrapper/maven-wrapper.jar
similarity index 100%
rename from kafka-streams/kafka-streams-message-channel/.mvn/wrapper/maven-wrapper.jar
rename to kafka-streams-samples/kafka-streams-message-channel/.mvn/wrapper/maven-wrapper.jar
diff --git a/kafka-streams/kafka-streams-message-channel/.mvn/wrapper/maven-wrapper.properties b/kafka-streams-samples/kafka-streams-message-channel/.mvn/wrapper/maven-wrapper.properties
similarity index 100%
rename from kafka-streams/kafka-streams-message-channel/.mvn/wrapper/maven-wrapper.properties
rename to kafka-streams-samples/kafka-streams-message-channel/.mvn/wrapper/maven-wrapper.properties
diff --git a/kafka-streams-samples/kafka-streams-message-channel/README.adoc b/kafka-streams-samples/kafka-streams-message-channel/README.adoc
new file mode 100644
index 0000000..704bb2e
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-message-channel/README.adoc
@@ -0,0 +1,39 @@
+== What is this app?
+
+This is an example of a Spring Cloud Stream processor using Kafka Streams support.
+
+The example is based on the word count application from the https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/main/java/io/confluent/examples/streams/WordCountLambdaExample.java[reference documentation].
+The application has two inputs: one bound as a KStream and another bound as a regular message channel.
+
+=== Running the app:
+
+Go to the root of the repository and do:
+
+`docker-compose up -d`
+
+`./mvnw clean package`
+
+`java -jar target/kafka-streams-message-channel-0.0.1-SNAPSHOT.jar --spring.cloud.stream.kafka.streams.timeWindow.length=60000`
+
+Issue the following commands:
+
+`docker exec -it kafka-mc /opt/kafka/bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic words`
+
+On another terminal:
+
+`docker exec -it kafka-mc /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic counts`
+
+Enter some text in the console producer and watch the output in the console consumer.
+
+Also watch the console for logging statements from the regular sink StreamListener method.
+
+The default time window is configured for 5 seconds and you can change that using the following property.
+
+`spring.cloud.stream.kafka.streams.timeWindow.length` (value is expressed in milliseconds)
+
+To switch to a hopping window, use the `spring.cloud.stream.kafka.streams.timeWindow.advanceBy` property (value in milliseconds).
+This creates overlapping hopping windows whose overlap depends on the value you provide.
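For instance, with a window length of 10 seconds and an advance of 5 seconds, every record belongs to two windows. The derivation of hopping-window start times for a record timestamp can be sketched in plain Java (mirroring Kafka Streams' windowing semantics; `windowStartsFor` is a hypothetical helper, not part of the sample):

```java
import java.util.ArrayList;
import java.util.List;

public class HoppingWindowSketch {

    // Start times (ms) of all hopping windows that contain the given timestamp,
    // for a window of `lengthMs` advancing by `advanceMs`.
    static List<Long> windowStartsFor(long timestampMs, long lengthMs, long advanceMs) {
        List<Long> starts = new ArrayList<>();
        long firstStart = Math.max(0, timestampMs - lengthMs + advanceMs);
        // align down to a multiple of advanceMs
        firstStart = firstStart - (firstStart % advanceMs);
        for (long start = firstStart; start <= timestampMs; start += advanceMs) {
            starts.add(start);
        }
        return starts;
    }

    public static void main(String[] args) {
        // length 10s, advance 5s: a record at t=12s falls into the windows
        // starting at 5s and 10s
        System.out.println(windowStartsFor(12_000L, 10_000L, 5_000L)); // [5000, 10000]
    }
}
```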
+
+Here is an example with two overlapping windows (window length of 10 seconds and a hop, or advance, of 5 seconds):
+
+`java -jar target/kafka-streams-message-channel-0.0.1-SNAPSHOT.jar --spring.cloud.stream.kafka.streams.timeWindow.length=10000 --spring.cloud.stream.kafka.streams.timeWindow.advanceBy=5000`
\ No newline at end of file
diff --git a/kafka-streams-samples/kafka-streams-message-channel/docker-compose.yml b/kafka-streams-samples/kafka-streams-message-channel/docker-compose.yml
new file mode 100644
index 0000000..f06f273
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-message-channel/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-mc
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
diff --git a/kafka-streams/kafka-streams-message-channel/mvnw b/kafka-streams-samples/kafka-streams-message-channel/mvnw
similarity index 100%
rename from kafka-streams/kafka-streams-message-channel/mvnw
rename to kafka-streams-samples/kafka-streams-message-channel/mvnw
diff --git a/kafka-streams/kafka-streams-message-channel/mvnw.cmd b/kafka-streams-samples/kafka-streams-message-channel/mvnw.cmd
similarity index 100%
rename from kafka-streams/kafka-streams-message-channel/mvnw.cmd
rename to kafka-streams-samples/kafka-streams-message-channel/mvnw.cmd
diff --git a/kafka-streams-samples/kafka-streams-message-channel/pom.xml b/kafka-streams-samples/kafka-streams-message-channel/pom.xml
new file mode 100644
index 0000000..a09387d
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-message-channel/pom.xml
@@ -0,0 +1,46 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>kafka-streams-message-channel</artifactId>
+	<version>0.0.1-SNAPSHOT</version>
+	<packaging>jar</packaging>
+
+	<name>kafka-streams-message-channel</name>
+	<description>Demo project for Spring Boot</description>
+
+	<parent>
+		<groupId>spring.cloud.stream.samples</groupId>
+		<artifactId>spring-cloud-stream-samples-parent</artifactId>
+		<version>0.0.1-SNAPSHOT</version>
+		<relativePath>../..</relativePath>
+	</parent>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.springframework.cloud</groupId>
+			<artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.cloud</groupId>
+			<artifactId>spring-cloud-stream-binder-kafka</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.boot</groupId>
+			<artifactId>spring-boot-starter-test</artifactId>
+			<scope>test</scope>
+		</dependency>
+	</dependencies>
+
+	<build>
+		<plugins>
+			<plugin>
+				<groupId>org.springframework.boot</groupId>
+				<artifactId>spring-boot-maven-plugin</artifactId>
+			</plugin>
+		</plugins>
+	</build>
+
+</project>
diff --git a/kafka-streams/kafka-streams-message-channel/src/main/java/kafka/streams/message/channel/KafkaStreamsWordCountApplication.java b/kafka-streams-samples/kafka-streams-message-channel/src/main/java/kafka/streams/message/channel/KafkaStreamsWordCountApplication.java
similarity index 83%
rename from kafka-streams/kafka-streams-message-channel/src/main/java/kafka/streams/message/channel/KafkaStreamsWordCountApplication.java
rename to kafka-streams-samples/kafka-streams-message-channel/src/main/java/kafka/streams/message/channel/KafkaStreamsWordCountApplication.java
index ed5c34c..e2a5014 100644
--- a/kafka-streams/kafka-streams-message-channel/src/main/java/kafka/streams/message/channel/KafkaStreamsWordCountApplication.java
+++ b/kafka-streams-samples/kafka-streams-message-channel/src/main/java/kafka/streams/message/channel/KafkaStreamsWordCountApplication.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.message.channel;
import org.apache.kafka.common.serialization.Serdes;
diff --git a/kafka-streams/kafka-streams-message-channel/src/main/resources/application.yml b/kafka-streams-samples/kafka-streams-message-channel/src/main/resources/application.yml
similarity index 100%
rename from kafka-streams/kafka-streams-message-channel/src/main/resources/application.yml
rename to kafka-streams-samples/kafka-streams-message-channel/src/main/resources/application.yml
diff --git a/kafka-streams/kafka-streams-message-channel/src/main/resources/logback.xml b/kafka-streams-samples/kafka-streams-message-channel/src/main/resources/logback.xml
similarity index 100%
rename from kafka-streams/kafka-streams-message-channel/src/main/resources/logback.xml
rename to kafka-streams-samples/kafka-streams-message-channel/src/main/resources/logback.xml
diff --git a/kafka-streams/kafka-streams-message-channel/src/test/java/kafka/streams/message/channel/KafkaStreamsWordCountApplicationTests.java b/kafka-streams-samples/kafka-streams-message-channel/src/test/java/kafka/streams/message/channel/KafkaStreamsWordCountApplicationTests.java
similarity index 100%
rename from kafka-streams/kafka-streams-message-channel/src/test/java/kafka/streams/message/channel/KafkaStreamsWordCountApplicationTests.java
rename to kafka-streams-samples/kafka-streams-message-channel/src/test/java/kafka/streams/message/channel/KafkaStreamsWordCountApplicationTests.java
diff --git a/kafka-streams/kafka-streams-product-tracker/.jdk8 b/kafka-streams-samples/kafka-streams-product-tracker/.jdk8
similarity index 100%
rename from kafka-streams/kafka-streams-product-tracker/.jdk8
rename to kafka-streams-samples/kafka-streams-product-tracker/.jdk8
diff --git a/kafka-streams/kafka-streams-product-tracker/.mvn/wrapper/maven-wrapper.jar b/kafka-streams-samples/kafka-streams-product-tracker/.mvn/wrapper/maven-wrapper.jar
similarity index 100%
rename from kafka-streams/kafka-streams-product-tracker/.mvn/wrapper/maven-wrapper.jar
rename to kafka-streams-samples/kafka-streams-product-tracker/.mvn/wrapper/maven-wrapper.jar
diff --git a/kafka-streams/kafka-streams-product-tracker/.mvn/wrapper/maven-wrapper.properties b/kafka-streams-samples/kafka-streams-product-tracker/.mvn/wrapper/maven-wrapper.properties
similarity index 100%
rename from kafka-streams/kafka-streams-product-tracker/.mvn/wrapper/maven-wrapper.properties
rename to kafka-streams-samples/kafka-streams-product-tracker/.mvn/wrapper/maven-wrapper.properties
diff --git a/kafka-streams/kafka-streams-product-tracker/README.adoc b/kafka-streams-samples/kafka-streams-product-tracker/README.adoc
similarity index 57%
rename from kafka-streams/kafka-streams-product-tracker/README.adoc
rename to kafka-streams-samples/kafka-streams-product-tracker/README.adoc
index 1583348..38445b9 100644
--- a/kafka-streams/kafka-streams-product-tracker/README.adoc
+++ b/kafka-streams-samples/kafka-streams-product-tracker/README.adoc
@@ -7,24 +7,12 @@ Although contrived, this type of use cases are pretty common in the industry whe
In essence, the application receives product information from input topic and count the interested products in a configurable time window and report that in an output topic.
This sample uses lambda expressions and thus requires Java 8+.
-==== Starting Kafka in a docker container
-
-* Skip steps 1-3 if you already have a non-Docker Kafka environment.
-
-1. Go to the docker directory in this repo and invoke the command `docker-compose up -d`.
-2. Ensure that in the docker directory and then invoke the script `start-kafka-shell.sh`
-3. cd $KAFKA_HOME
-4. Start the console producer: +
-Assuming that you are running kafka on a docker container on mac osx. Change the zookeeper IP address accordingly otherwise. +
-`bin/kafka-console-producer.sh --broker-list 192.168.99.100:9092 --topic products`
-5. Start the console consumer: +
-Assuming that you are running kafka on a docker container on mac osx. Change the zookeeper IP address accordingly otherwise. +
-`bin/kafka-console-consumer.sh --bootstrap-server 192.168.99.100:9092 --key-deserializer org.apache.kafka.common.serialization.IntegerDeserializer --property print.key=true --topic product-counts`
-
=== Running the app:
Go to the root of the repository and do:
+`docker-compose up -d`
+
`./mvnw clean package`
`java -jar target/kafka-streams-product-tracker-0.0.1-SNAPSHOT.jar --app.product.tracker.productIds=123,124,125 --spring.cloud.stream.kafka.streams.timeWindow.length=60000 --spring.cloud.stream.kafka.streams.timeWindow.advanceBy=30000`
@@ -32,12 +20,13 @@ Go to the root of the repository and do:
The above command will track products with ID's 123,124 and 125 every 30 seconds with the counts from the last minute.
In other words, every 30 seconds a new 1 minute window is started.
+Issue the following commands:
-* By default we use the docker container IP (mac osx specific) in the `application.yml` for Kafka broker and zookeeper.
-Change it in `application.yml` (which requires a rebuild) or pass them as runtime arguments as below.
+`docker exec -it kafka-prod-tracker /opt/kafka/bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic products`
-`spring.cloud.stream.kstream.binder.brokers=` +
-`spring.cloud.stream.kstream.binder.zkNodes=`
+In another terminal:
+
+`docker exec -it kafka-prod-tracker /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --key-deserializer org.apache.kafka.common.serialization.IntegerDeserializer --property print.key=true --topic product-counts`
Enter the following in the console producer (one line at a time) and watch the output on the console consumer:
diff --git a/kafka-streams-samples/kafka-streams-product-tracker/docker-compose.yml b/kafka-streams-samples/kafka-streams-product-tracker/docker-compose.yml
new file mode 100644
index 0000000..50ac1f7
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-product-tracker/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-prod-tracker
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
diff --git a/kafka-streams/kafka-streams-product-tracker/mvnw b/kafka-streams-samples/kafka-streams-product-tracker/mvnw
similarity index 100%
rename from kafka-streams/kafka-streams-product-tracker/mvnw
rename to kafka-streams-samples/kafka-streams-product-tracker/mvnw
diff --git a/kafka-streams-samples/kafka-streams-product-tracker/pom.xml b/kafka-streams-samples/kafka-streams-product-tracker/pom.xml
new file mode 100644
index 0000000..42cf0b5
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-product-tracker/pom.xml
@@ -0,0 +1,42 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+
+    <artifactId>kafka-streams-product-tracker</artifactId>
+    <version>0.0.1-SNAPSHOT</version>
+    <packaging>jar</packaging>
+
+    <name>kafka-streams-product-tracker</name>
+    <description>Demo project for Spring Boot</description>
+
+    <parent>
+        <groupId>spring.cloud.stream.samples</groupId>
+        <artifactId>spring-cloud-stream-samples-parent</artifactId>
+        <version>0.0.1-SNAPSHOT</version>
+        <relativePath>../..</relativePath>
+    </parent>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-test</artifactId>
+            <scope>test</scope>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.springframework.boot</groupId>
+                <artifactId>spring-boot-maven-plugin</artifactId>
+            </plugin>
+        </plugins>
+    </build>
+
+</project>
diff --git a/kafka-streams/kafka-streams-product-tracker/src/main/java/kafka/streams/product/tracker/KafkaStreamsProductTrackerApplication.java b/kafka-streams-samples/kafka-streams-product-tracker/src/main/java/kafka/streams/product/tracker/KafkaStreamsProductTrackerApplication.java
similarity index 100%
rename from kafka-streams/kafka-streams-product-tracker/src/main/java/kafka/streams/product/tracker/KafkaStreamsProductTrackerApplication.java
rename to kafka-streams-samples/kafka-streams-product-tracker/src/main/java/kafka/streams/product/tracker/KafkaStreamsProductTrackerApplication.java
diff --git a/kafka-streams/kafka-streams-product-tracker/src/main/resources/application.yml b/kafka-streams-samples/kafka-streams-product-tracker/src/main/resources/application.yml
similarity index 100%
rename from kafka-streams/kafka-streams-product-tracker/src/main/resources/application.yml
rename to kafka-streams-samples/kafka-streams-product-tracker/src/main/resources/application.yml
diff --git a/kafka-streams/kafka-streams-product-tracker/src/main/resources/logback.xml b/kafka-streams-samples/kafka-streams-product-tracker/src/main/resources/logback.xml
similarity index 100%
rename from kafka-streams/kafka-streams-product-tracker/src/main/resources/logback.xml
rename to kafka-streams-samples/kafka-streams-product-tracker/src/main/resources/logback.xml
diff --git a/kafka-streams/kafka-streams-table-join/.gitignore b/kafka-streams-samples/kafka-streams-table-join/.gitignore
similarity index 100%
rename from kafka-streams/kafka-streams-table-join/.gitignore
rename to kafka-streams-samples/kafka-streams-table-join/.gitignore
diff --git a/kafka-streams/kafka-streams-table-join/.mvn/wrapper/maven-wrapper.jar b/kafka-streams-samples/kafka-streams-table-join/.mvn/wrapper/maven-wrapper.jar
similarity index 100%
rename from kafka-streams/kafka-streams-table-join/.mvn/wrapper/maven-wrapper.jar
rename to kafka-streams-samples/kafka-streams-table-join/.mvn/wrapper/maven-wrapper.jar
diff --git a/kafka-streams/kafka-streams-table-join/.mvn/wrapper/maven-wrapper.properties b/kafka-streams-samples/kafka-streams-table-join/.mvn/wrapper/maven-wrapper.properties
similarity index 100%
rename from kafka-streams/kafka-streams-table-join/.mvn/wrapper/maven-wrapper.properties
rename to kafka-streams-samples/kafka-streams-table-join/.mvn/wrapper/maven-wrapper.properties
diff --git a/kafka-streams-samples/kafka-streams-table-join/README.adoc b/kafka-streams-samples/kafka-streams-table-join/README.adoc
new file mode 100644
index 0000000..b884f44
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-table-join/README.adoc
@@ -0,0 +1,22 @@
+== What is this app?
+
+This is an example of a Spring Cloud Stream processor using Kafka Streams support.
+
+This example is a Spring Cloud Stream adaptation of this Kafka Streams sample: https://github.com/confluentinc/kafka-streams-examples/blob/4.0.0-post/src/test/java/io/confluent/examples/streams/StreamToTableJoinIntegrationTest.java
+
+The application uses two inputs: a KStream for user clicks and a KTable for user regions.
+It then joins the stream against the table to compute the total number of clicks per region.
+
+=== Running the app:
+
+Go to the root of this sample (`kafka-streams-table-join`).
+
+`docker-compose up -d`
+
+`./mvnw clean package`
+
+`java -jar target/kafka-streams-table-join-0.0.1-SNAPSHOT.jar`
+
+`docker exec -it kafka-join /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic output-topic --key-deserializer org.apache.kafka.common.serialization.StringDeserializer --value-deserializer org.apache.kafka.common.serialization.LongDeserializer --property print.key=true --property key.separator="-"`
+
+Run the stand-alone `Producers` application to generate some data and watch the output on the console consumer above.
\ No newline at end of file
diff --git a/kafka-streams-samples/kafka-streams-table-join/docker-compose.yml b/kafka-streams-samples/kafka-streams-table-join/docker-compose.yml
new file mode 100644
index 0000000..f241d8f
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-table-join/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-join
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
diff --git a/kafka-streams/kafka-streams-table-join/mvnw b/kafka-streams-samples/kafka-streams-table-join/mvnw
similarity index 100%
rename from kafka-streams/kafka-streams-table-join/mvnw
rename to kafka-streams-samples/kafka-streams-table-join/mvnw
diff --git a/kafka-streams/kafka-streams-table-join/mvnw.cmd b/kafka-streams-samples/kafka-streams-table-join/mvnw.cmd
similarity index 100%
rename from kafka-streams/kafka-streams-table-join/mvnw.cmd
rename to kafka-streams-samples/kafka-streams-table-join/mvnw.cmd
diff --git a/kafka-streams-samples/kafka-streams-table-join/pom.xml b/kafka-streams-samples/kafka-streams-table-join/pom.xml
new file mode 100644
index 0000000..35d7027
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-table-join/pom.xml
@@ -0,0 +1,42 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+
+    <artifactId>kafka-streams-table-join</artifactId>
+    <version>0.0.1-SNAPSHOT</version>
+    <packaging>jar</packaging>
+
+    <name>kafka-streams-table-join</name>
+    <description>Demo project for Spring Boot</description>
+
+    <parent>
+        <groupId>spring.cloud.stream.samples</groupId>
+        <artifactId>spring-cloud-stream-samples-parent</artifactId>
+        <version>0.0.1-SNAPSHOT</version>
+        <relativePath>../..</relativePath>
+    </parent>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-test</artifactId>
+            <scope>test</scope>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.springframework.boot</groupId>
+                <artifactId>spring-boot-maven-plugin</artifactId>
+            </plugin>
+        </plugins>
+    </build>
+
+</project>
diff --git a/kafka-streams/kafka-streams-table-join/src/main/java/kafka/streams/table/join/KafkaStreamsTableJoin.java b/kafka-streams-samples/kafka-streams-table-join/src/main/java/kafka/streams/table/join/KafkaStreamsTableJoin.java
similarity index 80%
rename from kafka-streams/kafka-streams-table-join/src/main/java/kafka/streams/table/join/KafkaStreamsTableJoin.java
rename to kafka-streams-samples/kafka-streams-table-join/src/main/java/kafka/streams/table/join/KafkaStreamsTableJoin.java
index 85231bd..91975a5 100644
--- a/kafka-streams/kafka-streams-table-join/src/main/java/kafka/streams/table/join/KafkaStreamsTableJoin.java
+++ b/kafka-streams-samples/kafka-streams-table-join/src/main/java/kafka/streams/table/join/KafkaStreamsTableJoin.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.table.join;
import org.apache.kafka.common.serialization.Serdes;
diff --git a/kafka-streams/kafka-streams-table-join/src/main/java/kafka/streams/table/join/Producers.java b/kafka-streams-samples/kafka-streams-table-join/src/main/java/kafka/streams/table/join/Producers.java
similarity index 81%
rename from kafka-streams/kafka-streams-table-join/src/main/java/kafka/streams/table/join/Producers.java
rename to kafka-streams-samples/kafka-streams-table-join/src/main/java/kafka/streams/table/join/Producers.java
index 0f988f7..b77ff81 100644
--- a/kafka-streams/kafka-streams-table-join/src/main/java/kafka/streams/table/join/Producers.java
+++ b/kafka-streams-samples/kafka-streams-table-join/src/main/java/kafka/streams/table/join/Producers.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.table.join;
import org.apache.kafka.clients.producer.ProducerConfig;
diff --git a/kafka-streams/kafka-streams-table-join/src/main/resources/application.yml b/kafka-streams-samples/kafka-streams-table-join/src/main/resources/application.yml
similarity index 100%
rename from kafka-streams/kafka-streams-table-join/src/main/resources/application.yml
rename to kafka-streams-samples/kafka-streams-table-join/src/main/resources/application.yml
diff --git a/kafka-streams/kafka-streams-table-join/src/main/resources/logback.xml b/kafka-streams-samples/kafka-streams-table-join/src/main/resources/logback.xml
similarity index 100%
rename from kafka-streams/kafka-streams-table-join/src/main/resources/logback.xml
rename to kafka-streams-samples/kafka-streams-table-join/src/main/resources/logback.xml
diff --git a/kafka-streams/kafka-streams-table-join/src/test/java/kafka/streams/table/join/KafkaStreamsTableJoinTests.java b/kafka-streams-samples/kafka-streams-table-join/src/test/java/kafka/streams/table/join/KafkaStreamsTableJoinTests.java
similarity index 100%
rename from kafka-streams/kafka-streams-table-join/src/test/java/kafka/streams/table/join/KafkaStreamsTableJoinTests.java
rename to kafka-streams-samples/kafka-streams-table-join/src/test/java/kafka/streams/table/join/KafkaStreamsTableJoinTests.java
diff --git a/kafka-streams/kafka-streams-word-count/.gitignore b/kafka-streams-samples/kafka-streams-word-count/.gitignore
similarity index 100%
rename from kafka-streams/kafka-streams-word-count/.gitignore
rename to kafka-streams-samples/kafka-streams-word-count/.gitignore
diff --git a/kafka-streams/kafka-streams-word-count/.mvn/wrapper/maven-wrapper.jar b/kafka-streams-samples/kafka-streams-word-count/.mvn/wrapper/maven-wrapper.jar
similarity index 100%
rename from kafka-streams/kafka-streams-word-count/.mvn/wrapper/maven-wrapper.jar
rename to kafka-streams-samples/kafka-streams-word-count/.mvn/wrapper/maven-wrapper.jar
diff --git a/kafka-streams/kafka-streams-word-count/.mvn/wrapper/maven-wrapper.properties b/kafka-streams-samples/kafka-streams-word-count/.mvn/wrapper/maven-wrapper.properties
similarity index 100%
rename from kafka-streams/kafka-streams-word-count/.mvn/wrapper/maven-wrapper.properties
rename to kafka-streams-samples/kafka-streams-word-count/.mvn/wrapper/maven-wrapper.properties
diff --git a/kafka-streams/kafka-streams-word-count/README.adoc b/kafka-streams-samples/kafka-streams-word-count/README.adoc
similarity index 55%
rename from kafka-streams/kafka-streams-word-count/README.adoc
rename to kafka-streams-samples/kafka-streams-word-count/README.adoc
index a0e0e96..82cbc94 100644
--- a/kafka-streams/kafka-streams-word-count/README.adoc
+++ b/kafka-streams-samples/kafka-streams-word-count/README.adoc
@@ -7,31 +7,25 @@ It uses a single input and a single output.
In essence, the application receives text messages from an input topic and computes word occurrence counts in a configurable time window and report that in an output topic.
This sample uses lambda expressions and thus requires Java 8+.
-==== Starting Kafka in a docker container
-
-* Skip steps 1-3 if you already have a non-Docker Kafka environment.
-
-1. Go to the docker directory in this repo and invoke the command `docker-compose up -d`.
-2. Ensure that in the docker directory and then invoke the script `start-kafka-shell.sh`
-3. cd $KAFKA_HOME
-4. Start the console producer: +
-Assuming that you are running kafka on a docker container on mac osx. Change the zookeeper IP address accordingly otherwise. +
-`bin/kafka-console-producer.sh --broker-list 192.168.99.100:9092 --topic words`
-5. Start the console consumer: +
-Assuming that you are running kafka on a docker container on mac osx. Change the zookeeper IP address accordingly otherwise. +
-`bin/kafka-console-consumer.sh --bootstrap-server 192.168.99.100:9092 --topic counts`
-
=== Running the app:
-Go to the root of the repository and do: `./mvnw clean package`
+Go to the root of this sample (`kafka-streams-word-count`).
+
+`docker-compose up -d`
+
+`./mvnw clean package`
`java -jar target/kafka-streams-word-count-0.0.1-SNAPSHOT.jar --spring.cloud.stream.kafka.streams.timeWindow.length=60000`
-* By default we use the docker container IP (mac osx specific) in the `application.yml` for Kafka broker and zookeeper.
-Change it in `application.yml` (which requires a rebuild) or pass them as runtime arguments as below.
+This assumes that the dockerized Kafka cluster started above is running.
-`spring.cloud.stream.kafka.streams.binder.brokers=` +
-`spring.cloud.stream.kafka.streams.binder.zkNodes=`
+Issue the following commands:
+
+`docker exec -it kafka-wordcount /opt/kafka/bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic words`
+
+In another terminal:
+
+`docker exec -it kafka-wordcount /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic counts`
Enter some text in the console producer and watch the output in the console consumer.
diff --git a/kafka-streams-samples/kafka-streams-word-count/docker-compose.yml b/kafka-streams-samples/kafka-streams-word-count/docker-compose.yml
new file mode 100644
index 0000000..38003e5
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-word-count/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-wordcount
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
diff --git a/kafka-streams/kafka-streams-word-count/mvnw b/kafka-streams-samples/kafka-streams-word-count/mvnw
similarity index 100%
rename from kafka-streams/kafka-streams-word-count/mvnw
rename to kafka-streams-samples/kafka-streams-word-count/mvnw
diff --git a/kafka-streams/kafka-streams-word-count/mvnw.cmd b/kafka-streams-samples/kafka-streams-word-count/mvnw.cmd
similarity index 100%
rename from kafka-streams/kafka-streams-word-count/mvnw.cmd
rename to kafka-streams-samples/kafka-streams-word-count/mvnw.cmd
diff --git a/kafka-streams-samples/kafka-streams-word-count/pom.xml b/kafka-streams-samples/kafka-streams-word-count/pom.xml
new file mode 100644
index 0000000..20dc5c3
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-word-count/pom.xml
@@ -0,0 +1,45 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+
+    <artifactId>kafka-streams-word-count</artifactId>
+    <version>0.0.1-SNAPSHOT</version>
+    <packaging>jar</packaging>
+
+    <name>kafka-streams-word-count</name>
+    <description>Demo project for Spring Boot</description>
+
+    <parent>
+        <groupId>spring.cloud.stream.samples</groupId>
+        <artifactId>spring-cloud-stream-samples-parent</artifactId>
+        <version>0.0.1-SNAPSHOT</version>
+        <relativePath>../..</relativePath>
+    </parent>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-stream-binder-kafka</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-test</artifactId>
+            <scope>test</scope>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.springframework.boot</groupId>
+                <artifactId>spring-boot-maven-plugin</artifactId>
+            </plugin>
+        </plugins>
+    </build>
+
+</project>
diff --git a/kafka-streams/kafka-streams-word-count/src/main/java/kafka/streams/word/count/KafkaStreamsWordCountApplication.java b/kafka-streams-samples/kafka-streams-word-count/src/main/java/kafka/streams/word/count/KafkaStreamsWordCountApplication.java
similarity index 80%
rename from kafka-streams/kafka-streams-word-count/src/main/java/kafka/streams/word/count/KafkaStreamsWordCountApplication.java
rename to kafka-streams-samples/kafka-streams-word-count/src/main/java/kafka/streams/word/count/KafkaStreamsWordCountApplication.java
index 5ef19f9..b18f201 100644
--- a/kafka-streams/kafka-streams-word-count/src/main/java/kafka/streams/word/count/KafkaStreamsWordCountApplication.java
+++ b/kafka-streams-samples/kafka-streams-word-count/src/main/java/kafka/streams/word/count/KafkaStreamsWordCountApplication.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package kafka.streams.word.count;
import org.apache.kafka.common.serialization.Serdes;
@@ -8,7 +24,6 @@ import org.apache.kafka.streams.kstream.Serialized;
import org.apache.kafka.streams.kstream.TimeWindows;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
-import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
@@ -26,7 +41,6 @@ public class KafkaStreamsWordCountApplication {
}
@EnableBinding(KafkaStreamsProcessor.class)
- @EnableAutoConfiguration
public static class WordCountProcessorApplication {
@Autowired
diff --git a/kafka-streams-samples/kafka-streams-word-count/src/main/java/kafka/streams/word/count/SampleRunner.java b/kafka-streams-samples/kafka-streams-word-count/src/main/java/kafka/streams/word/count/SampleRunner.java
new file mode 100644
index 0000000..69d201e
--- /dev/null
+++ b/kafka-streams-samples/kafka-streams-word-count/src/main/java/kafka/streams/word/count/SampleRunner.java
@@ -0,0 +1,94 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package kafka.streams.word.count;
+
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+import org.springframework.cloud.stream.annotation.EnableBinding;
+import org.springframework.cloud.stream.annotation.Input;
+import org.springframework.cloud.stream.annotation.Output;
+import org.springframework.cloud.stream.annotation.StreamListener;
+import org.springframework.context.annotation.Bean;
+import org.springframework.integration.annotation.InboundChannelAdapter;
+import org.springframework.integration.annotation.Poller;
+import org.springframework.integration.core.MessageSource;
+import org.springframework.messaging.MessageChannel;
+import org.springframework.messaging.SubscribableChannel;
+import org.springframework.messaging.support.GenericMessage;
+
+import java.util.Random;
+import java.util.concurrent.atomic.AtomicBoolean;
+
+/**
+ * Provides a test source and sink to trigger the kafka streams processor
+ * and test the output respectively.
+ *
+ * @author Soby Chacko
+ */
+public class SampleRunner {
+
+ //Following code is only used as a test harness.
+
+ //Following source is used as test producer.
+ @EnableBinding(TestSource.class)
+ static class TestProducer {
+
+ private AtomicBoolean semaphore = new AtomicBoolean(true);
+
+ private String[] randomWords = new String[]{"foo", "bar", "foobar", "baz", "fox"};
+ private Random random = new Random();
+
+ @Bean
+ @InboundChannelAdapter(channel = TestSource.OUTPUT, poller = @Poller(fixedDelay = "1000"))
+ public MessageSource<String> sendTestData() {
+ return () -> {
+ int idx = random.nextInt(5);
+ return new GenericMessage<>(randomWords[idx]);
+ };
+ }
+ }
+
+ //Following sink is used as test consumer for the above processor. It logs the data received through the processor.
+ @EnableBinding(TestSink.class)
+ static class TestConsumer {
+
+ private final Log logger = LogFactory.getLog(getClass());
+
+ @StreamListener(TestSink.INPUT)
+ public void receive(String data) {
+ logger.info("Data received..." + data);
+ }
+ }
+
+ interface TestSink {
+
+ String INPUT = "input1";
+
+ @Input(INPUT)
+ SubscribableChannel input1();
+
+ }
+
+ interface TestSource {
+
+ String OUTPUT = "output1";
+
+ @Output(TestSource.OUTPUT)
+ MessageChannel output();
+
+ }
+}
diff --git a/kafka-streams/kafka-streams-word-count/src/main/resources/application.yml b/kafka-streams-samples/kafka-streams-word-count/src/main/resources/application.yml
similarity index 71%
rename from kafka-streams/kafka-streams-word-count/src/main/resources/application.yml
rename to kafka-streams-samples/kafka-streams-word-count/src/main/resources/application.yml
index 6ffba33..2d89258 100644
--- a/kafka-streams/kafka-streams-word-count/src/main/resources/application.yml
+++ b/kafka-streams-samples/kafka-streams-word-count/src/main/resources/application.yml
@@ -12,9 +12,11 @@ spring.cloud.stream.bindings.input:
destination: words
consumer:
headerMode: raw
-spring.cloud.stream.kafka.streams.binder:
- brokers: localhost #192.168.99.100
- zkNodes: localhost #192.168.99.100
+#For testing
+spring.cloud.stream.bindings.input1.destination: counts
+spring.cloud.stream.bindings.output1.destination: words
+spring.cloud.stream.bindings.input1.binder: kafka
+spring.cloud.stream.bindings.output1.binder: kafka
diff --git a/kafka-streams/kafka-streams-word-count/src/test/java/kafka/streams/word/count/KafkaStreamsWordCountApplicationTests.java b/kafka-streams-samples/kafka-streams-word-count/src/test/java/kafka/streams/word/count/KafkaStreamsWordCountApplicationTests.java
similarity index 100%
rename from kafka-streams/kafka-streams-word-count/src/test/java/kafka/streams/word/count/KafkaStreamsWordCountApplicationTests.java
rename to kafka-streams-samples/kafka-streams-word-count/src/test/java/kafka/streams/word/count/KafkaStreamsWordCountApplicationTests.java
diff --git a/kafka-streams/pom.xml b/kafka-streams-samples/pom.xml
similarity index 68%
rename from kafka-streams/pom.xml
rename to kafka-streams-samples/pom.xml
index eeb7395..d716dfc 100644
--- a/kafka-streams/pom.xml
+++ b/kafka-streams-samples/pom.xml
@@ -1,18 +1,12 @@
 <?xml version="1.0" encoding="UTF-8"?>
 <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
     <modelVersion>4.0.0</modelVersion>
-
-    <artifactId>spring-cloud-stream-kafka-stream-samples</artifactId>
+    <groupId>spring.cloud.stream.samples</groupId>
+    <artifactId>kafka-streams-samples</artifactId>
+    <version>0.0.1-SNAPSHOT</version>
     <packaging>pom</packaging>
-
-    <parent>
-        <groupId>org.springframework.cloud</groupId>
-        <artifactId>spring-cloud-stream-samples</artifactId>
-        <version>1.2.0.BUILD-SNAPSHOT</version>
-    </parent>
-
-    <name>spring-cloud-stream-kafka-stream-samples</name>
-    <description>Parent Project for Kafka Streams Samples</description>
+    <name>kafka-streams-samples</name>
+    <description>Collection of Spring Cloud Stream Kafka Streams samples</description>
     <modules>
         <module>kafka-streams-word-count</module>
diff --git a/kafka-streams/kafka-streams-aggregate/docker/docker-compose.yml b/kafka-streams/kafka-streams-aggregate/docker/docker-compose.yml
deleted file mode 100644
index ecf41fb..0000000
--- a/kafka-streams/kafka-streams-aggregate/docker/docker-compose.yml
+++ /dev/null
@@ -1,14 +0,0 @@
-version: '2'
-services:
- zookeeper:
- image: wurstmeister/zookeeper
- ports:
- - "2181:2181"
- kafka:
- image: wurstmeister/kafka
- ports:
- - "9092:9092"
- environment:
- KAFKA_ADVERTISED_HOST_NAME: 192.168.99.100
- KAFKA_ADVERTISED_PORT: 9092
- KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
diff --git a/kafka-streams/kafka-streams-aggregate/docker/start-kafka-shell.sh b/kafka-streams/kafka-streams-aggregate/docker/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/kafka-streams/kafka-streams-aggregate/docker/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/kafka-streams/kafka-streams-aggregate/pom.xml b/kafka-streams/kafka-streams-aggregate/pom.xml
deleted file mode 100644
index 44bf7dd..0000000
--- a/kafka-streams/kafka-streams-aggregate/pom.xml
+++ /dev/null
@@ -1,89 +0,0 @@
-
-
- 4.0.0
-
- kafka.streams.aggregate
- kafka-streams-aggregate
- 0.0.1-SNAPSHOT
- jar
-
- kafka-streams-aggregate
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.kafka
- spring-kafka
- 2.1.3.RELEASE
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka-streams
- 2.0.0.BUILD-SNAPSHOT
-
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
- org.springframework.boot
- spring-boot-starter-web
-
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/kafka-streams/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/DomainEvent.java b/kafka-streams/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/DomainEvent.java
deleted file mode 100644
index 4fedea4..0000000
--- a/kafka-streams/kafka-streams-aggregate/src/main/java/kafka/streams/table/join/DomainEvent.java
+++ /dev/null
@@ -1,27 +0,0 @@
-package kafka.streams.table.join;
-
-/**
- * @author Soby Chacko
- */
-public class DomainEvent {
-
- String eventType;
-
- String boardUuid;
-
- public String getEventType() {
- return eventType;
- }
-
- public void setEventType(String eventType) {
- this.eventType = eventType;
- }
-
- public String getBoardUuid() {
- return boardUuid;
- }
-
- public void setBoardUuid(String boardUuid) {
- this.boardUuid = boardUuid;
- }
-}
diff --git a/kafka-streams/kafka-streams-branching/README.adoc b/kafka-streams/kafka-streams-branching/README.adoc
deleted file mode 100644
index bd54df2..0000000
--- a/kafka-streams/kafka-streams-branching/README.adoc
+++ /dev/null
@@ -1,47 +0,0 @@
-== What is this app?
-
-This is an example of a Spring Cloud Stream processor using Kafka Streams branching support.
-
-The example is based on the word count application from the https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/main/java/io/confluent/examples/streams/WordCountLambdaExample.java[reference documentation].
-It uses a single input and 3 output destinations.
-In essence, the application receives text messages from an input topic, filters them by language (English, French, and Spanish, ignoring the rest), computes word occurrence counts in a configurable time window, and reports the results in the output topics.
-This sample uses lambda expressions and thus requires Java 8+.
-
-By default, native decoding and encoding are disabled, which means that any deserialization on the inbound and serialization on the outbound is performed by the binder using the configured content types.
-
-==== Starting Kafka in a docker container
-
-* Skip steps 1-3 if you already have a non-Docker Kafka environment.
-
-1. Go to the docker directory in this repo and invoke the command `docker-compose up -d`.
-2. Ensure that you are in the docker directory and then invoke the script `start-kafka-shell.sh`
-3. cd $KAFKA_HOME
-4. Start the console producer: +
-This assumes that you are running Kafka in a Docker container on macOS; otherwise, change the broker IP address accordingly. +
-`bin/kafka-console-producer.sh --broker-list 192.168.99.100:9092 --topic words`
-5. Start the console consumer: +
-This assumes that you are running Kafka in a Docker container on macOS; otherwise, change the broker IP address accordingly. +
-a. `bin/kafka-console-consumer.sh --bootstrap-server 192.168.99.100:9092 --topic english-counts`
-b. On another console: `bin/kafka-console-consumer.sh --bootstrap-server 192.168.99.100:9092 --topic french-counts`
-c. On another console: `bin/kafka-console-consumer.sh --bootstrap-server 192.168.99.100:9092 --topic spanish-counts`
-
-=== Running the app:
-
-Go to the root of the repository and do: `./mvnw clean package`
-
-`java -jar target/kafka-streams-branching-0.0.1-SNAPSHOT.jar --spring.cloud.stream.kafka.streams.timeWindow.length=60000`
-
-* By default, the Docker container IP (macOS specific) is used in `application.yml` for the Kafka broker and ZooKeeper.
-Change it in `application.yml` (which requires a rebuild) or pass them as runtime arguments as below.
-
-`spring.cloud.stream.kafka.streams.binder.brokers=` +
-`spring.cloud.stream.kafka.streams.binder.zkNodes=`
-
-Enter text ("English", "French", "Spanish" - case doesn't matter) in the console producer and watch the output in the respective console consumer - The word "english" goes to topic english-counts, "french" goes to topic french-counts and "spanish" goes to spanish-counts.
-
-In order to switch to a hopping window, you can use the `spring.cloud.stream.kafka.streams.timeWindow.advanceBy` (value in milliseconds).
-This will create overlapping hopping windows depending on the value you provide.
-
-Here is an example with 2 overlapping windows (window length of 10 seconds and a hop (advance) of 5 seconds):
-
-`java -jar target/kafka-streams-branching-0.0.1-SNAPSHOT.jar --spring.cloud.stream.kafka.streams.timeWindow.length=10000 --spring.cloud.stream.kafka.streams.timeWindow.advanceBy=5000`
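The routing described in the branching README above can be sketched outside of Spring Cloud Stream as plain Java. This is a hedged illustration, not the sample's actual code: the class `BranchingSketch` and its `route` method are hypothetical names, and the predicates approximate the first-match-wins semantics of Kafka Streams branching.

```java
import java.util.function.Predicate;

// Hypothetical sketch of the language predicates a branching processor might
// apply: each incoming text line is routed to the first predicate that
// matches, mirroring KStream branching semantics (first match wins).
public class BranchingSketch {

    static final Predicate<String> ENGLISH = v -> v.toLowerCase().contains("english");
    static final Predicate<String> FRENCH  = v -> v.toLowerCase().contains("french");
    static final Predicate<String> SPANISH = v -> v.toLowerCase().contains("spanish");

    /** Returns the output topic a line would be routed to, or null if it is dropped. */
    public static String route(String line) {
        if (ENGLISH.test(line)) return "english-counts";
        if (FRENCH.test(line))  return "french-counts";
        if (SPANISH.test(line)) return "spanish-counts";
        return null; // no branch matches: the record is ignored
    }
}
```

This mirrors the behavior described above: the word "english" goes to `english-counts`, "french" to `french-counts`, "spanish" to `spanish-counts`, and anything else is dropped.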
diff --git a/kafka-streams/kafka-streams-branching/docker/docker-compose.yml b/kafka-streams/kafka-streams-branching/docker/docker-compose.yml
deleted file mode 100644
index ecf41fb..0000000
--- a/kafka-streams/kafka-streams-branching/docker/docker-compose.yml
+++ /dev/null
@@ -1,14 +0,0 @@
-version: '2'
-services:
- zookeeper:
- image: wurstmeister/zookeeper
- ports:
- - "2181:2181"
- kafka:
- image: wurstmeister/kafka
- ports:
- - "9092:9092"
- environment:
- KAFKA_ADVERTISED_HOST_NAME: 192.168.99.100
- KAFKA_ADVERTISED_PORT: 9092
- KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
diff --git a/kafka-streams/kafka-streams-branching/docker/start-kafka-shell.sh b/kafka-streams/kafka-streams-branching/docker/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/kafka-streams/kafka-streams-branching/docker/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/kafka-streams/kafka-streams-branching/pom.xml b/kafka-streams/kafka-streams-branching/pom.xml
deleted file mode 100644
index b1e17b5..0000000
--- a/kafka-streams/kafka-streams-branching/pom.xml
+++ /dev/null
@@ -1,83 +0,0 @@
-
-
- 4.0.0
-
- kafka.streams.branching
- kafka-streams-branching
- 0.0.1-SNAPSHOT
- jar
-
- kafka-streams-branching
- KStream Branching Example
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.kafka
- spring-kafka
- 2.1.3.BUILD-SNAPSHOT
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka-streams
- 2.0.0.BUILD-SNAPSHOT
-
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/kafka-streams/kafka-streams-branching/src/test/java/kafka/streams/branching/KafkaStreamsBranchingSampleTests.java b/kafka-streams/kafka-streams-branching/src/test/java/kafka/streams/branching/KafkaStreamsBranchingSampleTests.java
deleted file mode 100644
index 1f77158..0000000
--- a/kafka-streams/kafka-streams-branching/src/test/java/kafka/streams/branching/KafkaStreamsBranchingSampleTests.java
+++ /dev/null
@@ -1,18 +0,0 @@
-package kafka.streams.branching;
-
-import org.junit.Ignore;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.springframework.boot.test.context.SpringBootTest;
-import org.springframework.test.context.junit4.SpringRunner;
-
-@RunWith(SpringRunner.class)
-@SpringBootTest
-public class KafkaStreamsBranchingSampleTests {
-
- @Test
- @Ignore
- public void contextLoads() {
- }
-
-}
diff --git a/kafka-streams/kafka-streams-dlq-sample/README.adoc b/kafka-streams/kafka-streams-dlq-sample/README.adoc
deleted file mode 100644
index 38b1680..0000000
--- a/kafka-streams/kafka-streams-dlq-sample/README.adoc
+++ /dev/null
@@ -1,52 +0,0 @@
-== What is this app?
-
-This is an example of a Spring Cloud Stream processor using Kafka Streams support.
-
-This is a demonstration of deserialization errors and DLQ in Kafka Streams binder.
-
-The example is based on the word count application from the https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/main/java/io/confluent/examples/streams/WordCountLambdaExample.java[reference documentation].
-It uses a single input and a single output.
-In essence, the application receives text messages from an input topic, computes word occurrence counts in a configurable time window, and reports the results in an output topic.
-This sample uses lambda expressions and thus requires Java 8+.
-
-==== Starting Kafka in a docker container
-
-* Skip steps 1-3 if you already have a non-Docker Kafka environment.
-
-1. Go to the docker directory in this repo and invoke the command `docker-compose up -d`.
-2. Ensure that you are in the docker directory and then invoke the script `start-kafka-shell.sh`
-3. cd $KAFKA_HOME
-4. Start the console producer: +
-This assumes that you are running Kafka in a Docker container on macOS; otherwise, change the broker IP address accordingly. +
-`bin/kafka-console-producer.sh --broker-list 192.168.99.100:9092 --topic words`
-5. Start the console consumer: +
-This assumes that you are running Kafka in a Docker container on macOS; otherwise, change the broker IP address accordingly. +
-`bin/kafka-console-consumer.sh --bootstrap-server 192.168.99.100:9092 --topic counts`
-
-=== Running the app:
-
-Go to the root of the repository and do: `./mvnw clean package`
-
-`java -jar target/kstream-word-count-0.0.1-SNAPSHOT.jar`
-
-* By default, the Docker container IP (macOS specific) is used in `application.yml` for the Kafka broker and ZooKeeper.
-Change it in `application.yml` (which requires a rebuild) or pass them as runtime arguments as below.
-
-`spring.cloud.stream.kstream.binder.brokers=` +
-`spring.cloud.stream.kstream.binder.zkNodes=`
-
-The default `application.yml` file demonstrates native decoding by Kafka.
-The default value Serde is set to `IntegerSerde`; from the console, send some ASCII text data.
-You will see that messages that fail deserialization end up in the DLQ topic - words-count-dlq.
-
-There is another YAML file provided (by-framework-decoding.yml).
-Use that as `application.yml` to see how it works when deserialization is done by the framework.
-In this case also, the messages on error appear in the DLQ topic.
-
-Look in the console to see what is doing the serialization/deserialization.
-You will see messages as below on the console:
-
-"Native decoding is disabled for input. Inbound message conversion done by Spring Cloud Stream."
-
-"Native encoding is disabled for counts. Outbound message conversion done by Spring Cloud Stream."
-
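The DLQ behavior described in the README above — values configured as integers, text records failing deserialization and being diverted rather than crashing the stream — can be sketched in plain Java. This is a hypothetical illustration (the class `DlqSketch` is not part of the sample); the in-memory `deadLetters` list stands in for the binder's DLQ topic.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of DLQ routing: each raw record is deserialized as an
// Integer (mirroring an IntegerSerde configuration); records that fail
// deserialization are diverted to a dead-letter list instead of failing the
// stream, analogous to the binder sending them to words-count-dlq.
public class DlqSketch {

    final List<Integer> processed = new ArrayList<>();
    final List<String> deadLetters = new ArrayList<>();

    public void consume(String rawValue) {
        try {
            processed.add(Integer.parseInt(rawValue)); // deserialization step
        } catch (NumberFormatException e) {
            deadLetters.add(rawValue); // route the bad record to the "DLQ"
        }
    }
}
```

Sending `"42"` lands in the processed list, while ASCII text such as `"hello"` ends up in the dead-letter list, which is the behavior the console experiment above demonstrates.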
diff --git a/kafka-streams/kafka-streams-dlq-sample/docker/docker-compose.yml b/kafka-streams/kafka-streams-dlq-sample/docker/docker-compose.yml
deleted file mode 100644
index ecf41fb..0000000
--- a/kafka-streams/kafka-streams-dlq-sample/docker/docker-compose.yml
+++ /dev/null
@@ -1,14 +0,0 @@
-version: '2'
-services:
- zookeeper:
- image: wurstmeister/zookeeper
- ports:
- - "2181:2181"
- kafka:
- image: wurstmeister/kafka
- ports:
- - "9092:9092"
- environment:
- KAFKA_ADVERTISED_HOST_NAME: 192.168.99.100
- KAFKA_ADVERTISED_PORT: 9092
- KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
diff --git a/kafka-streams/kafka-streams-dlq-sample/docker/start-kafka-shell.sh b/kafka-streams/kafka-streams-dlq-sample/docker/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/kafka-streams/kafka-streams-dlq-sample/docker/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/kafka-streams/kafka-streams-dlq-sample/pom.xml b/kafka-streams/kafka-streams-dlq-sample/pom.xml
deleted file mode 100644
index 9a0d09a..0000000
--- a/kafka-streams/kafka-streams-dlq-sample/pom.xml
+++ /dev/null
@@ -1,83 +0,0 @@
-
-
- 4.0.0
-
- kafka.streams.dlq.sample
- kafka-streams-dlq-sample
- 0.0.1-SNAPSHOT
- jar
-
- kafka-streams-dlq-sample
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.kafka
- spring-kafka
- 2.1.3.BUILD-SNAPSHOT
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka-streams
- 2.0.0.BUILD-SNAPSHOT
-
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/kafka-streams/kafka-streams-dlq-sample/src/main/java/kafka/streams/dlq/sample/WordCountExample.java b/kafka-streams/kafka-streams-dlq-sample/src/main/java/kafka/streams/dlq/sample/WordCountExample.java
deleted file mode 100644
index 5421c5e..0000000
--- a/kafka-streams/kafka-streams-dlq-sample/src/main/java/kafka/streams/dlq/sample/WordCountExample.java
+++ /dev/null
@@ -1,101 +0,0 @@
-package kafka.streams.dlq.sample;
-
-import org.apache.kafka.common.serialization.Serde;
-import org.apache.kafka.common.serialization.Serdes;
-import org.apache.kafka.streams.KafkaStreams;
-import org.apache.kafka.streams.StreamsBuilder;
-import org.apache.kafka.streams.StreamsConfig;
-import org.apache.kafka.streams.errors.LogAndContinueExceptionHandler;
-import org.apache.kafka.streams.kstream.KStream;
-import org.apache.kafka.streams.kstream.KTable;
-import org.apache.kafka.streams.kstream.Produced;
-
-import java.util.Arrays;
-import java.util.Properties;
-import java.util.regex.Pattern;
-
-public class WordCountExample {
- public static void main(String[] args) throws Exception{
-
- final String bootstrapServers = args.length > 0 ? args[0] : "localhost:9092";
-
-
- // Set up serializers and deserializers, which we will use for overriding the default serdes
- // specified above.
- final Serde<String> stringSerde = Serdes.String();
- final Serde<Long> longSerde = Serdes.Long();
-
- // In the subsequent lines we define the processing topology of the Streams application.
- final StreamsBuilder builder = new StreamsBuilder();
-
- // Construct a `KStream` from the input topic "TextLinesTopic", where message values
- // represent lines of text (for the sake of this example, we ignore whatever may be stored
- // in the message keys).
- //
- // Note: We could also just call `builder.stream("TextLinesTopic")` if we wanted to leverage
- // the default serdes specified in the Streams configuration above, because these defaults
- // match what's in the actual topic. However we explicitly set the deserializers in the
- // call to `stream()` below in order to show how that's done, too.
- final KStream<String, String> textLines = builder.stream("words");
-
- final Pattern pattern = Pattern.compile("\\W+", Pattern.UNICODE_CHARACTER_CLASS);
-
- final KTable<String, Long> wordCounts = textLines
- // Split each text line, by whitespace, into words. The text lines are the record
- // values, i.e. we can ignore whatever data is in the record keys and thus invoke
- // `flatMapValues()` instead of the more generic `flatMap()`.
- .flatMapValues(value -> Arrays.asList(pattern.split(value.toLowerCase())))
- // Count the occurrences of each word (record key).
- //
- // This will change the stream type from `KStream<String, String>` to `KTable<String, Long>`
- // (word -> count). In the `count` operation we must provide a name for the resulting KTable,
- // which will be used to name e.g. its associated state store and changelog topic.
- //
- // Note: no need to specify explicit serdes because the resulting key and value types match our default serde settings
- .groupBy((key, word) -> word)
- .count();
-
- // Write the `KTable` to the output topic.
- wordCounts.toStream().to("counts", Produced.with(stringSerde, longSerde));
-
- final Properties streamsConfiguration = new Properties();
- // Give the Streams application a unique name. The name must be unique in the Kafka cluster
- // against which the application is run.
- streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-lambda-example");
- streamsConfiguration.put(StreamsConfig.CLIENT_ID_CONFIG, "wordcount-lambda-example-client");
- // Where to find Kafka broker(s).
- streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
- // Specify default (de)serializers for record keys and for record values.
- streamsConfiguration.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
- streamsConfiguration.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
- // Records should be flushed every 10 seconds. This is less than the default
- // in order to keep this example interactive.
- streamsConfiguration.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 1000);
- // For illustrative purposes we disable record caches
- streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
-
- streamsConfiguration.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG,
- LogAndContinueExceptionHandler.class);
-
- // Now that we have finished the definition of the processing topology we can actually run
- // it via `start()`. The Streams application as a whole can be launched just like any
- // normal Java application that has a `main()` method.
- final KafkaStreams streams = new KafkaStreams(builder.build(), streamsConfiguration);
- // Always (and unconditionally) clean local state prior to starting the processing topology.
- // We opt for this unconditional call here because this will make it easier for you to play around with the example
- // when resetting the application for doing a re-run (via the Application Reset Tool,
- // http://docs.confluent.io/current/streams/developer-guide.html#application-reset-tool).
- //
- // The drawback of cleaning up local state prior is that your app must rebuild its local state from scratch, which
- // will take time and will require reading all the state-relevant data from the Kafka cluster over the network.
- // Thus in a production scenario you typically do not want to clean up always as we do here but rather only when it
- // is truly needed, i.e., only under certain conditions (e.g., the presence of a command line flag for your app).
- // See `ApplicationResetExample.java` for a production-like example.
- streams.cleanUp();
- streams.start();
-
- // Add shutdown hook to respond to SIGTERM and gracefully close Kafka Streams
- Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
- }
-
-}
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/README.adoc b/kafka-streams/kafka-streams-interactive-query-advanced/README.adoc
deleted file mode 100644
index 6ecfb4c..0000000
--- a/kafka-streams/kafka-streams-interactive-query-advanced/README.adoc
+++ /dev/null
@@ -1,38 +0,0 @@
-== What is this app?
-
-This is an example of a Spring Cloud Stream processor using Kafka Streams support.
-
-This example is a Spring Cloud Stream adaptation of this Kafka Streams sample: https://github.com/confluentinc/kafka-streams-examples/tree/4.0.0-post/src/main/java/io/confluent/examples/streams/interactivequeries/kafkamusic
-
-This sample demonstrates the concept of interactive queries in Kafka Streams.
-There is a REST service provided as part of the application that can be used to query the store interactively.
-
-==== Starting Kafka in a docker container
-
-* Skip these steps if you already have a non-Docker Kafka environment.
-
-1. Go to the docker directory in this repo and invoke the command `docker-compose up -d`.
-2. Ensure that you are in the docker directory and then invoke the script `start-kafka-shell.sh`
-
-=== Running the app:
-
-1. Start the Confluent Schema Registry: The following command is based on the Confluent Platform.
-
-`./bin/schema-registry-start ./etc/schema-registry/schema-registry.properties`
-
-2. Go to the root of the repository and do: `./mvnw clean package`
-
-`java -jar target/kafka-streams-interactive-query-0.0.1-SNAPSHOT.jar`
-
-3. Run the stand-alone `Producers` application to generate data and start the processing.
-Keep it running for a while.
-
-4. Go to the URL: http://localhost:8080/charts/top-five?genre=Punk
-Keep refreshing the URL and you will see the song play count information change.
-
-* By default, the Docker container IP (macOS specific) is used in `application.yml` for the Kafka broker and ZooKeeper.
-Change it in `application.yml` (which requires a rebuild) or pass them as runtime arguments as below.
-
-`spring.cloud.stream.kstream.binder.brokers=` +
-`spring.cloud.stream.kstream.binder.zkNodes=`
-
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/docker/docker-compose.yml b/kafka-streams/kafka-streams-interactive-query-advanced/docker/docker-compose.yml
deleted file mode 100644
index ecf41fb..0000000
--- a/kafka-streams/kafka-streams-interactive-query-advanced/docker/docker-compose.yml
+++ /dev/null
@@ -1,14 +0,0 @@
-version: '2'
-services:
- zookeeper:
- image: wurstmeister/zookeeper
- ports:
- - "2181:2181"
- kafka:
- image: wurstmeister/kafka
- ports:
- - "9092:9092"
- environment:
- KAFKA_ADVERTISED_HOST_NAME: 192.168.99.100
- KAFKA_ADVERTISED_PORT: 9092
- KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
diff --git a/kafka-streams/kafka-streams-interactive-query-advanced/docker/start-kafka-shell.sh b/kafka-streams/kafka-streams-interactive-query-advanced/docker/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/kafka-streams/kafka-streams-interactive-query-advanced/docker/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/README.adoc b/kafka-streams/kafka-streams-interactive-query-basic/README.adoc
deleted file mode 100644
index 99259c0..0000000
--- a/kafka-streams/kafka-streams-interactive-query-basic/README.adoc
+++ /dev/null
@@ -1,48 +0,0 @@
-== What is this app?
-
-This is an example of a Spring Cloud Stream processor using Kafka Streams support.
-
-The example is based on a contrived use case of tracking products by interactively querying their status.
-The program accepts product IDs and tracks their counts so far by interactively querying the underlying store.
-This sample uses lambda expressions and thus requires Java 8+.
-
-==== Starting Kafka in a docker container
-
-* Skip steps 1-3 if you already have a non-Docker Kafka environment.
-
-1. Go to the docker directory in this repo and invoke the command `docker-compose up -d`.
-2. Ensure that you are in the docker directory and then invoke the script `start-kafka-shell.sh`
-3. cd $KAFKA_HOME
-4. Start the console producer: +
-This assumes that you are running Kafka in a Docker container on macOS; otherwise, change the broker IP address accordingly. +
-`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic products`
-
-`bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --key-deserializer org.apache.kafka.common.serialization.IntegerDeserializer --property print.key=true --property print.value=true --topic product-counts --value-deserializer org.apache.kafka.common.serialization.LongDeserializer`
-
-=== Running the app:
-
-Go to the root of the repository and do:
-
-`./mvnw clean package`
-
-`java -jar target/kafka-streams-interactive-query-basic-0.0.1-SNAPSHOT.jar --app.product.tracker.productIds=123,124,125`
-
-The above command will track products with IDs 123, 124, and 125 and print the counts seen so far every 30 seconds.
-
-* By default, the Docker container IP (macOS specific) is used in `application.yml` for the Kafka broker and ZooKeeper.
-Change it in `application.yml` (which requires a rebuild) or pass them as runtime arguments as below.
-
-`spring.cloud.stream.kafka.streams.binder.brokers=` +
-`spring.cloud.stream.kafka.streams.binder.zkNodes=`
-
-Enter the following in the console producer (one line at a time) and watch the output on the console (or IDE) where the application is running.
-
-```
-{"id":"123"}
-{"id":"124"}
-{"id":"125"}
-{"id":"123"}
-{"id":"123"}
-{"id":"123"}
-```
-
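The tracking logic described in the README above — only configured product IDs are counted, with running totals queried from a store — can be sketched in plain Java. This is a hypothetical illustration (the class `ProductTrackerSketch` is not the sample's code): the `trackedIds` set stands in for the `app.product.tracker.productIds` property, and an in-memory map approximates the Kafka Streams state store.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch of the product tracker: only products whose IDs are in
// the configured set are counted; the queryable state store is approximated
// by an in-memory map of running counts.
public class ProductTrackerSketch {

    private final Set<String> trackedIds;
    private final Map<String, Long> counts = new HashMap<>();

    public ProductTrackerSketch(Set<String> trackedIds) {
        this.trackedIds = trackedIds;
    }

    public void onProduct(String id) {
        if (trackedIds.contains(id)) {
            counts.merge(id, 1L, Long::sum); // increment the running count
        }
    }

    /** The "interactive query": read the count accumulated so far for an ID. */
    public long countFor(String id) {
        return counts.getOrDefault(id, 0L);
    }
}
```

Feeding it the JSON-like records from the console example above (three `123`s, one `124`, one `125`) would leave `countFor("123")` at 3, while untracked IDs stay at zero.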
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/docker/docker-compose.yml b/kafka-streams/kafka-streams-interactive-query-basic/docker/docker-compose.yml
deleted file mode 100644
index ecf41fb..0000000
--- a/kafka-streams/kafka-streams-interactive-query-basic/docker/docker-compose.yml
+++ /dev/null
@@ -1,14 +0,0 @@
-version: '2'
-services:
- zookeeper:
- image: wurstmeister/zookeeper
- ports:
- - "2181:2181"
- kafka:
- image: wurstmeister/kafka
- ports:
- - "9092:9092"
- environment:
- KAFKA_ADVERTISED_HOST_NAME: 192.168.99.100
- KAFKA_ADVERTISED_PORT: 9092
- KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/docker/start-kafka-shell.sh b/kafka-streams/kafka-streams-interactive-query-basic/docker/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/kafka-streams/kafka-streams-interactive-query-basic/docker/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/kafka-streams/kafka-streams-interactive-query-basic/pom.xml b/kafka-streams/kafka-streams-interactive-query-basic/pom.xml
deleted file mode 100644
index 042a81a..0000000
--- a/kafka-streams/kafka-streams-interactive-query-basic/pom.xml
+++ /dev/null
@@ -1,89 +0,0 @@
-
-
- 4.0.0
-
- kafka.streams.interactive.query
- kafka-streams-interactive-query-basic
- 0.0.1-SNAPSHOT
- jar
-
- kafka-streams-interactive-query-basic
- Spring Cloud Stream sample for KStream interactive queries
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.kafka
- spring-kafka
- 2.1.3.RELEASE
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka-streams
- 2.0.0.BUILD-SNAPSHOT
-
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
- org.springframework.boot
- spring-boot-starter-web
-
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/kafka-streams/kafka-streams-message-channel/README.adoc b/kafka-streams/kafka-streams-message-channel/README.adoc
deleted file mode 100644
index 494b588..0000000
--- a/kafka-streams/kafka-streams-message-channel/README.adoc
+++ /dev/null
@@ -1,47 +0,0 @@
-== What is this app?
-
-This is an example of a Spring Cloud Stream processor using Kafka Streams support.
-
-The example is based on the word count application from the https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/main/java/io/confluent/examples/streams/WordCountLambdaExample.java[reference documentation].
-The application has two inputs: one as a KStream and another bound to a message channel.
-
-==== Starting Kafka in a docker container
-
-* Skip steps 1-3 if you already have a non-Docker Kafka environment.
-
-1. Go to the docker directory in this repo and invoke the command `docker-compose up -d`.
-2. Ensure that you are in the docker directory and then invoke the script `start-kafka-shell.sh`
-3. cd $KAFKA_HOME
-4. Start the console producer: +
-This assumes that you are running Kafka in a Docker container on macOS; otherwise, change the broker IP address accordingly. +
-`bin/kafka-console-producer.sh --broker-list 192.168.99.100:9092 --topic words`
-5. Start the console consumer: +
-This assumes that you are running Kafka in a Docker container on macOS; otherwise, change the broker IP address accordingly. +
-`bin/kafka-console-consumer.sh --bootstrap-server 192.168.99.100:9092 --topic counts`
-
-=== Running the app:
-
-Go to the root of the repository and do: `./mvnw clean package`
-
-`java -jar target/kstream-word-count-0.0.1-SNAPSHOT.jar`
-
-* By default, the Docker container IP (macOS specific) is used in `application.yml` for the Kafka broker and ZooKeeper.
-Change it in `application.yml` (which requires a rebuild) or pass them as runtime arguments as below.
-
-`spring.cloud.stream.kstream.binder.brokers=` +
-`spring.cloud.stream.kstream.binder.zkNodes=`
-
-Enter some text in the console producer and watch the output in the console consumer.
-
-Also watch the console for logging statements from the regular sink StreamListener method.
-
-The default time window is configured for 5 seconds and you can change that using the following property.
-
-`spring.cloud.stream.kstream.timeWindow.length` (value is expressed in milliseconds)
-
-In order to switch to a hopping window, you can use the `spring.cloud.stream.kstream.timeWindow.advanceBy` (value in milliseconds).
-This will create overlapping hopping windows depending on the value you provide.
-
-Here is an example with 2 overlapping windows (window length of 10 seconds and a hop (advance) of 5 seconds):
-
-`java -jar target/kstream-word-count-0.0.1-SNAPSHOT.jar --spring.cloud.stream.kstream.timeWindow.length=10000 --spring.cloud.stream.kstream.timeWindow.advanceBy=5000`
\ No newline at end of file
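The hopping-window configuration described above (length 10000 ms, advance 5000 ms, producing overlapping windows) can be illustrated with a small plain-Java sketch. This is a hedged approximation of the usual Kafka Streams time-window semantics, not code from the sample; the class and method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of hopping-window assignment: returns the start
// timestamps of every window that contains a record timestamp, following the
// usual Kafka Streams semantics (aligned to multiples of the advance, window
// interval [start, start + lengthMs)). With length 10000 and advance 5000,
// each record falls into two overlapping windows.
public class HoppingWindowSketch {

    public static List<Long> windowStartsFor(long ts, long lengthMs, long advanceMs) {
        List<Long> starts = new ArrayList<>();
        // earliest window whose interval can still contain ts
        long first = Math.max(0, ts - lengthMs + advanceMs) / advanceMs * advanceMs;
        for (long start = first; start <= ts; start += advanceMs) {
            starts.add(start); // window [start, start + lengthMs) contains ts
        }
        return starts;
    }
}
```

For example, a record at timestamp 12000 with the settings above lands in the windows starting at 5000 and 10000, which is why counts from overlapping windows are both updated.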
diff --git a/kafka-streams/kafka-streams-message-channel/docker/docker-compose.yml b/kafka-streams/kafka-streams-message-channel/docker/docker-compose.yml
deleted file mode 100644
index ecf41fb..0000000
--- a/kafka-streams/kafka-streams-message-channel/docker/docker-compose.yml
+++ /dev/null
@@ -1,14 +0,0 @@
-version: '2'
-services:
- zookeeper:
- image: wurstmeister/zookeeper
- ports:
- - "2181:2181"
- kafka:
- image: wurstmeister/kafka
- ports:
- - "9092:9092"
- environment:
- KAFKA_ADVERTISED_HOST_NAME: 192.168.99.100
- KAFKA_ADVERTISED_PORT: 9092
- KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
diff --git a/kafka-streams/kafka-streams-message-channel/docker/start-kafka-shell.sh b/kafka-streams/kafka-streams-message-channel/docker/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/kafka-streams/kafka-streams-message-channel/docker/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/kafka-streams/kafka-streams-message-channel/pom.xml b/kafka-streams/kafka-streams-message-channel/pom.xml
deleted file mode 100644
index a469828..0000000
--- a/kafka-streams/kafka-streams-message-channel/pom.xml
+++ /dev/null
@@ -1,88 +0,0 @@
-
-
- 4.0.0
-
- kafka.streams.message.channel
- kafka-streams-message-channel
- 0.0.1-SNAPSHOT
- jar
-
- kafka-streams-message-channel
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.kafka
- spring-kafka
- 2.1.3.BUILD-SNAPSHOT
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka-streams
- 2.0.0.BUILD-SNAPSHOT
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka
- 2.0.0.BUILD-SNAPSHOT
-
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/kafka-streams/kafka-streams-product-tracker/docker/docker-compose.yml b/kafka-streams/kafka-streams-product-tracker/docker/docker-compose.yml
deleted file mode 100644
index ecf41fb..0000000
--- a/kafka-streams/kafka-streams-product-tracker/docker/docker-compose.yml
+++ /dev/null
@@ -1,14 +0,0 @@
-version: '2'
-services:
- zookeeper:
- image: wurstmeister/zookeeper
- ports:
- - "2181:2181"
- kafka:
- image: wurstmeister/kafka
- ports:
- - "9092:9092"
- environment:
- KAFKA_ADVERTISED_HOST_NAME: 192.168.99.100
- KAFKA_ADVERTISED_PORT: 9092
- KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
diff --git a/kafka-streams/kafka-streams-product-tracker/docker/start-kafka-shell.sh b/kafka-streams/kafka-streams-product-tracker/docker/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/kafka-streams/kafka-streams-product-tracker/docker/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/kafka-streams/kafka-streams-product-tracker/pom.xml b/kafka-streams/kafka-streams-product-tracker/pom.xml
deleted file mode 100644
index c5315a5..0000000
--- a/kafka-streams/kafka-streams-product-tracker/pom.xml
+++ /dev/null
@@ -1,89 +0,0 @@
-
-
- 4.0.0
-
- kafka.streams.product.tracker
- kafka-streams-product-tracker
- 0.0.1-SNAPSHOT
- jar
-
- kafka-streams-product-tracker
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.kafka
- spring-kafka
- 2.1.3.RELEASE
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka-streams
- 2.0.0.BUILD-SNAPSHOT
-
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
- org.springframework.boot
- spring-boot-starter-web
-
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/kafka-streams/kafka-streams-table-join/README.adoc b/kafka-streams/kafka-streams-table-join/README.adoc
deleted file mode 100644
index da59dda..0000000
--- a/kafka-streams/kafka-streams-table-join/README.adoc
+++ /dev/null
@@ -1,33 +0,0 @@
-== What is this app?
-
-This is an example of a Spring Cloud Stream processor using Kafka Streams support.
-
-This example is a Spring Cloud Stream adaptation of this Kafka Streams sample: https://github.com/confluentinc/kafka-streams-examples/blob/4.0.0-post/src/test/java/io/confluent/examples/streams/StreamToTableJoinIntegrationTest.java
-
-The application uses two inputs - one KStream for user-clicks and a KTable for user-regions.
-Then it joins the information from stream to table to find out total clicks per region.
-
-==== Starting Kafka in a docker container
-
-* Skip steps 1-3 if you already have a non-Docker Kafka environment.
-
-1. Go to the docker directory in this repo and invoke the command `docker-compose up -d`.
-2. Ensure that you are in the docker directory and then invoke the script `start-kafka-shell.sh`
-3. cd $KAFKA_HOME
-4. Start the console consumer: +
-This assumes that you are running Kafka in a Docker container on macOS; otherwise, change the ZooKeeper IP address accordingly. +
-`bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic output-topic --key-deserializer org.apache.kafka.common.serialization.StringDeserializer --value-deserializer org.apache.kafka.common.serialization.LongDeserializer --property print.key=true --property key.separator="-"`
-
-=== Running the app:
-
-Go to the root of the repository and do: `./mvnw clean package`
-
-`java -jar target/kafka-streams-table-join-0.0.1-SNAPSHOT.jar`
-
-* By default we use the docker container IP (mac osx specific) in the `application.yml` for Kafka broker and zookeeper.
-Change it in `application.yml` (which requires a rebuild) or pass them as runtime arguments as below.
-
-`spring.cloud.stream.kstream.binder.brokers=` +
-`spring.cloud.stream.kstream.binder.zkNodes=`
-
-Run the stand-alone `Producers` application to generate some data and watch the output on the console producer.
\ No newline at end of file
diff --git a/kafka-streams/kafka-streams-table-join/docker/docker-compose.yml b/kafka-streams/kafka-streams-table-join/docker/docker-compose.yml
deleted file mode 100644
index ecf41fb..0000000
--- a/kafka-streams/kafka-streams-table-join/docker/docker-compose.yml
+++ /dev/null
@@ -1,14 +0,0 @@
-version: '2'
-services:
- zookeeper:
- image: wurstmeister/zookeeper
- ports:
- - "2181:2181"
- kafka:
- image: wurstmeister/kafka
- ports:
- - "9092:9092"
- environment:
- KAFKA_ADVERTISED_HOST_NAME: 192.168.99.100
- KAFKA_ADVERTISED_PORT: 9092
- KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
diff --git a/kafka-streams/kafka-streams-table-join/docker/start-kafka-shell.sh b/kafka-streams/kafka-streams-table-join/docker/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/kafka-streams/kafka-streams-table-join/docker/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/kafka-streams/kafka-streams-table-join/pom.xml b/kafka-streams/kafka-streams-table-join/pom.xml
deleted file mode 100644
index b933ae0..0000000
--- a/kafka-streams/kafka-streams-table-join/pom.xml
+++ /dev/null
@@ -1,83 +0,0 @@
-
-
- 4.0.0
-
- kafka.streams.table.join
- kafka-streams-table-join
- 0.0.1-SNAPSHOT
- jar
-
- kafka-streams-table-join
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.kafka
- spring-kafka
- 2.1.3.BUILD-SNAPSHOT
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka-streams
- 2.0.0.BUILD-SNAPSHOT
-
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/kafka-streams/kafka-streams-word-count/pom.xml b/kafka-streams/kafka-streams-word-count/pom.xml
deleted file mode 100644
index 5cee85b..0000000
--- a/kafka-streams/kafka-streams-word-count/pom.xml
+++ /dev/null
@@ -1,83 +0,0 @@
-
-
- 4.0.0
-
- kafka.streams.word.count
- kafka-streams-word-count
- 0.0.1-SNAPSHOT
- jar
-
- kafka-streams-word-count
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.kafka
- spring-kafka
- 2.1.3.BUILD-SNAPSHOT
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka-streams
- 2.0.0.BUILD-SNAPSHOT
-
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/kafka-streams/kafka-streams-word-count/src/main/resources/logback.xml b/kafka-streams/kafka-streams-word-count/src/main/resources/logback.xml
deleted file mode 100644
index 870ac9e..0000000
--- a/kafka-streams/kafka-streams-word-count/src/main/resources/logback.xml
+++ /dev/null
@@ -1,12 +0,0 @@
-
-
-
-
- %d{ISO8601} %5p %t %c{2}:%L - %m%n
-
-
-
-
-
-
-
\ No newline at end of file
diff --git a/kinesis-produce-consume/.mvn b/kinesis-produce-consume/.mvn
deleted file mode 120000
index 19172e1..0000000
--- a/kinesis-produce-consume/.mvn
+++ /dev/null
@@ -1 +0,0 @@
-../.mvn
\ No newline at end of file
diff --git a/kinesis-produce-consume/pom.xml b/kinesis-produce-consume/pom.xml
deleted file mode 100644
index 94eff6a..0000000
--- a/kinesis-produce-consume/pom.xml
+++ /dev/null
@@ -1,96 +0,0 @@
-
-
- 4.0.0
-
- spring.cloud.stream.samples
- kinesis-produce-consume
- 0.0.1-SNAPSHOT
- jar
- Demo project for spring-cloud-stream-kinesis-binder
-
- kinesis-produce-consume
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.RELEASE
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-starter-stream-kinesis
- 1.0.0.BUILD-SNAPSHOT
-
-
-
- org.springframework.boot
- spring-boot-starter-security
-
-
-
- org.springframework.boot
- spring-boot-starter-data-rest
-
-
-
- org.springframework.boot
- spring-boot-starter-data-jpa
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-snapshots
- Spring Snapshots
- http://repo.spring.io/libs-snapshot-local
-
- true
-
-
- false
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
-
diff --git a/kinesis-samples/kinesis-produce-consume/.mvn b/kinesis-samples/kinesis-produce-consume/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/kinesis-samples/kinesis-produce-consume/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/kinesis-produce-consume/README.adoc b/kinesis-samples/kinesis-produce-consume/README.adoc
similarity index 100%
rename from kinesis-produce-consume/README.adoc
rename to kinesis-samples/kinesis-produce-consume/README.adoc
diff --git a/dynamic-destination-source/mvnw b/kinesis-samples/kinesis-produce-consume/mvnw
similarity index 100%
rename from dynamic-destination-source/mvnw
rename to kinesis-samples/kinesis-produce-consume/mvnw
diff --git a/dynamic-destination-source/mvnw.cmd b/kinesis-samples/kinesis-produce-consume/mvnw.cmd
old mode 100644
new mode 100755
similarity index 100%
rename from dynamic-destination-source/mvnw.cmd
rename to kinesis-samples/kinesis-produce-consume/mvnw.cmd
diff --git a/kinesis-samples/kinesis-produce-consume/pom.xml b/kinesis-samples/kinesis-produce-consume/pom.xml
new file mode 100644
index 0000000..a72e1fe
--- /dev/null
+++ b/kinesis-samples/kinesis-produce-consume/pom.xml
@@ -0,0 +1,51 @@
+
+
+ 4.0.0
+
+ kinesis-produce-consume
+ 0.0.1-SNAPSHOT
+ jar
+ kinesis-produce-consume
+ Spring Cloud Stream Kinesis Sample
+
+
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOT
+ ../..
+
+
+
+
+ org.springframework.cloud
+ spring-cloud-starter-stream-kinesis
+ 1.0.0.BUILD-SNAPSHOT
+
+
+
+ org.springframework.boot
+ spring-boot-starter-security
+
+
+
+ org.springframework.boot
+ spring-boot-starter-data-rest
+
+
+
+ org.springframework.boot
+ spring-boot-starter-data-jpa
+
+
+
+
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+
+
+
+
+
diff --git a/kinesis-produce-consume/src/main/java/demo/KinesisApplication.java b/kinesis-samples/kinesis-produce-consume/src/main/java/demo/KinesisApplication.java
similarity index 100%
rename from kinesis-produce-consume/src/main/java/demo/KinesisApplication.java
rename to kinesis-samples/kinesis-produce-consume/src/main/java/demo/KinesisApplication.java
diff --git a/kinesis-produce-consume/src/main/java/demo/controller/OrderController.java b/kinesis-samples/kinesis-produce-consume/src/main/java/demo/controller/OrderController.java
similarity index 100%
rename from kinesis-produce-consume/src/main/java/demo/controller/OrderController.java
rename to kinesis-samples/kinesis-produce-consume/src/main/java/demo/controller/OrderController.java
diff --git a/kinesis-produce-consume/src/main/java/demo/data/Order.java b/kinesis-samples/kinesis-produce-consume/src/main/java/demo/data/Order.java
similarity index 100%
rename from kinesis-produce-consume/src/main/java/demo/data/Order.java
rename to kinesis-samples/kinesis-produce-consume/src/main/java/demo/data/Order.java
diff --git a/kinesis-produce-consume/src/main/java/demo/repository/OrderRepository.java b/kinesis-samples/kinesis-produce-consume/src/main/java/demo/repository/OrderRepository.java
similarity index 100%
rename from kinesis-produce-consume/src/main/java/demo/repository/OrderRepository.java
rename to kinesis-samples/kinesis-produce-consume/src/main/java/demo/repository/OrderRepository.java
diff --git a/kinesis-produce-consume/src/main/java/demo/stream/Event.java b/kinesis-samples/kinesis-produce-consume/src/main/java/demo/stream/Event.java
similarity index 100%
rename from kinesis-produce-consume/src/main/java/demo/stream/Event.java
rename to kinesis-samples/kinesis-produce-consume/src/main/java/demo/stream/Event.java
diff --git a/kinesis-produce-consume/src/main/java/demo/stream/OrderProcessor.java b/kinesis-samples/kinesis-produce-consume/src/main/java/demo/stream/OrderProcessor.java
similarity index 100%
rename from kinesis-produce-consume/src/main/java/demo/stream/OrderProcessor.java
rename to kinesis-samples/kinesis-produce-consume/src/main/java/demo/stream/OrderProcessor.java
diff --git a/kinesis-produce-consume/src/main/java/demo/stream/OrderStreamConfiguration.java b/kinesis-samples/kinesis-produce-consume/src/main/java/demo/stream/OrderStreamConfiguration.java
similarity index 100%
rename from kinesis-produce-consume/src/main/java/demo/stream/OrderStreamConfiguration.java
rename to kinesis-samples/kinesis-produce-consume/src/main/java/demo/stream/OrderStreamConfiguration.java
diff --git a/kinesis-produce-consume/src/main/java/demo/stream/OrdersSource.java b/kinesis-samples/kinesis-produce-consume/src/main/java/demo/stream/OrdersSource.java
similarity index 100%
rename from kinesis-produce-consume/src/main/java/demo/stream/OrdersSource.java
rename to kinesis-samples/kinesis-produce-consume/src/main/java/demo/stream/OrdersSource.java
diff --git a/kinesis-produce-consume/src/main/resources/application.yml b/kinesis-samples/kinesis-produce-consume/src/main/resources/application.yml
similarity index 100%
rename from kinesis-produce-consume/src/main/resources/application.yml
rename to kinesis-samples/kinesis-produce-consume/src/main/resources/application.yml
diff --git a/kinesis-produce-consume/src/main/resources/bootstrap.yml b/kinesis-samples/kinesis-produce-consume/src/main/resources/bootstrap.yml
similarity index 100%
rename from kinesis-produce-consume/src/main/resources/bootstrap.yml
rename to kinesis-samples/kinesis-produce-consume/src/main/resources/bootstrap.yml
diff --git a/kinesis-samples/pom.xml b/kinesis-samples/pom.xml
new file mode 100644
index 0000000..bb59b81
--- /dev/null
+++ b/kinesis-samples/pom.xml
@@ -0,0 +1,14 @@
+
+
+ 4.0.0
+ spring.cloud.stream.samples
+ kinesis-samples
+ 0.0.1-SNAPSHOT
+ pom
+ kinesis-samples
+ Collection of Spring Cloud Stream Kinesis Samples
+
+
+ kinesis-produce-consume
+
+
diff --git a/multi-io-samples/multi-io/.mvn b/multi-io-samples/multi-io/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/multi-io-samples/multi-io/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/multi-io/README.adoc b/multi-io-samples/multi-io/README.adoc
similarity index 80%
rename from multi-io/README.adoc
rename to multi-io-samples/multi-io/README.adoc
index 5aa1f2b..eddf1e6 100644
--- a/multi-io/README.adoc
+++ b/multi-io-samples/multi-io/README.adoc
@@ -56,4 +56,20 @@ Received message FromSource2
At Sink1
******************
Received message FromSource1
-```
\ No newline at end of file
+```
+
+* `docker-compose down`
+
+=== Running the application using the Rabbit binder
+
+All the instructions above apply here as well, but instead of running the default `docker-compose.yml`, use the command below to start a RabbitMQ cluster.
+
+* `docker-compose -f docker-compose-rabbit.yml up -d`
+
+* `./mvnw clean package -P rabbit-binder`
+
+* `java -jar target/multi-io-0.0.1-SNAPSHOT.jar`
+
+Once you are done testing: `docker-compose -f docker-compose-rabbit.yml down`
+
+
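The `-P rabbit-binder` flag above works because binder selection is done with Maven profiles — a hedged sketch of the relevant `pom.xml` fragment, based on the profile ids shown in this repository (the Kafka profile is active by default):

```xml
<profiles>
  <!-- active by default: builds the app against the Kafka binder -->
  <profile>
    <id>kafka-binder</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <dependencies>
      <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-stream-binder-kafka</artifactId>
      </dependency>
    </dependencies>
  </profile>
  <!-- selected with: ./mvnw clean package -P rabbit-binder -->
  <profile>
    <id>rabbit-binder</id>
    <dependencies>
      <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-stream-binder-rabbit</artifactId>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```

Only one binder ends up on the classpath per build, so the same application code runs unchanged against either middleware.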
diff --git a/multi-io-samples/multi-io/docker-compose-rabbit.yml b/multi-io-samples/multi-io/docker-compose-rabbit.yml
new file mode 100644
index 0000000..7c3da92
--- /dev/null
+++ b/multi-io-samples/multi-io/docker-compose-rabbit.yml
@@ -0,0 +1,7 @@
+version: '3'
+services:
+ rabbitmq:
+ image: rabbitmq:management
+ ports:
+ - 5672:5672
+ - 15672:15672
\ No newline at end of file
diff --git a/reactive-processor/docker-compose.yml b/multi-io-samples/multi-io/docker-compose.yml
similarity index 91%
rename from reactive-processor/docker-compose.yml
rename to multi-io-samples/multi-io/docker-compose.yml
index d47d783..8b8692a 100644
--- a/reactive-processor/docker-compose.yml
+++ b/multi-io-samples/multi-io/docker-compose.yml
@@ -1,7 +1,8 @@
-version: '2'
+version: '3'
services:
kafka:
image: wurstmeister/kafka
+ container_name: kafka
ports:
- "9092:9092"
environment:
diff --git a/jdbc-sink/mvnw b/multi-io-samples/multi-io/mvnw
similarity index 100%
rename from jdbc-sink/mvnw
rename to multi-io-samples/multi-io/mvnw
diff --git a/jdbc-sink/mvnw.cmd b/multi-io-samples/multi-io/mvnw.cmd
similarity index 100%
rename from jdbc-sink/mvnw.cmd
rename to multi-io-samples/multi-io/mvnw.cmd
diff --git a/multi-io-samples/multi-io/pom.xml b/multi-io-samples/multi-io/pom.xml
new file mode 100644
index 0000000..678bf43
--- /dev/null
+++ b/multi-io-samples/multi-io/pom.xml
@@ -0,0 +1,64 @@
+
+
+ 4.0.0
+
+ multi-io
+ 0.0.1-SNAPSHOT
+ jar
+ multi-io
+ Spring Cloud Stream Multiple Input/Output Sample
+
+
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOT
+ ../..
+
+
+
+
+ org.springframework.boot
+ spring-boot-starter-test
+ test
+
+
+ org.springframework.cloud
+ spring-cloud-stream-test-support
+ test
+
+
+
+
+
+ kafka-binder
+
+ true
+
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-kafka
+
+
+
+
+ rabbit-binder
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-rabbit
+
+
+
+
+
+
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+
+
+
+
+
diff --git a/multi-io/src/main/java/demo/MultipleIOChannelsApplication.java b/multi-io-samples/multi-io/src/main/java/demo/MultipleIOChannelsApplication.java
similarity index 100%
rename from multi-io/src/main/java/demo/MultipleIOChannelsApplication.java
rename to multi-io-samples/multi-io/src/main/java/demo/MultipleIOChannelsApplication.java
diff --git a/multi-io/src/main/java/demo/SampleSink.java b/multi-io-samples/multi-io/src/main/java/demo/SampleSink.java
similarity index 100%
rename from multi-io/src/main/java/demo/SampleSink.java
rename to multi-io-samples/multi-io/src/main/java/demo/SampleSink.java
diff --git a/multi-io/src/main/java/demo/SampleSource.java b/multi-io-samples/multi-io/src/main/java/demo/SampleSource.java
similarity index 100%
rename from multi-io/src/main/java/demo/SampleSource.java
rename to multi-io-samples/multi-io/src/main/java/demo/SampleSource.java
diff --git a/multi-io/src/main/resources/application.yml b/multi-io-samples/multi-io/src/main/resources/application.yml
similarity index 100%
rename from multi-io/src/main/resources/application.yml
rename to multi-io-samples/multi-io/src/main/resources/application.yml
diff --git a/multi-io/src/test/java/demo/ModuleApplicationTests.java b/multi-io-samples/multi-io/src/test/java/demo/ModuleApplicationTests.java
similarity index 100%
rename from multi-io/src/test/java/demo/ModuleApplicationTests.java
rename to multi-io-samples/multi-io/src/test/java/demo/ModuleApplicationTests.java
diff --git a/multi-io-samples/pom.xml b/multi-io-samples/pom.xml
new file mode 100644
index 0000000..f2c00d7
--- /dev/null
+++ b/multi-io-samples/pom.xml
@@ -0,0 +1,14 @@
+
+
+ 4.0.0
+ spring.cloud.stream.samples
+ multi-io-samples
+ 0.0.1-SNAPSHOT
+ pom
+ multi-io-samples
+ Collection of Spring Cloud Stream Multiple Input/Output Samples
+
+
+ multi-io
+
+
diff --git a/multi-io/.mvn b/multi-io/.mvn
deleted file mode 120000
index 19172e1..0000000
--- a/multi-io/.mvn
+++ /dev/null
@@ -1 +0,0 @@
-../.mvn
\ No newline at end of file
diff --git a/multi-io/pom.xml b/multi-io/pom.xml
deleted file mode 100644
index 1358257..0000000
--- a/multi-io/pom.xml
+++ /dev/null
@@ -1,89 +0,0 @@
-
-
- 4.0.0
-
- spring.cloud.stream.samples
- multi-io
- 0.0.1-SNAPSHOT
- jar
-
- multi-io
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter-actuator
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-snapshots
- Spring Snapshots
- http://repo.spring.io/libs-snapshot-local
-
- true
-
-
- false
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
diff --git a/multi-io/start-kafka-shell.sh b/multi-io/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/multi-io/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/multibinder-differentsystems/.mvn b/multibinder-differentsystems/.mvn
deleted file mode 120000
index 19172e1..0000000
--- a/multibinder-differentsystems/.mvn
+++ /dev/null
@@ -1 +0,0 @@
-../.mvn
\ No newline at end of file
diff --git a/multibinder-differentsystems/README.adoc b/multibinder-differentsystems/README.adoc
deleted file mode 100644
index 5c7c840..0000000
--- a/multibinder-differentsystems/README.adoc
+++ /dev/null
@@ -1,12 +0,0 @@
-== Spring Cloud Stream Multibinder Application with Different Systems
-
-This example shows how to run a Spring Cloud Stream application with the same binder type configured for two separate Kafka clusters.
-
-To run the example, command line parameters for the Zookeeper ensembles and Kafka clusters must be provided, as in the following example:
-
-````
-java -jar spring-cloud-stream-samples/multibinder-differentsystems/target/spring-cloud-stream-sample-multibinder-differentsystems-1.0.0.BUILD-SNAPSHOT-exec.jar --kafkaBroker1=localhost:9092 --zk1=localhost:2181 --kafkaBroker2=localhost:9093 --zk2=localhost:2182
-````
-
-This assumes that two Kafka clusters and Zookeeper ensembles are running locally. Alternatively, the default values of `localhost:9092` and `localhost:2181` can be provided for both clusters.
-
diff --git a/multibinder-differentsystems/pom.xml b/multibinder-differentsystems/pom.xml
deleted file mode 100644
index 16f668d..0000000
--- a/multibinder-differentsystems/pom.xml
+++ /dev/null
@@ -1,96 +0,0 @@
-
-
- 4.0.0
-
- spring.cloud.stream.samples
- multibinder-differentsystems
- 0.0.1-SNAPSHOT
- jar
-
- multibinder-differentsystems
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.RELEASE
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
-
- org.springframework.boot
- spring-boot-starter-actuator
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
- org.springframework.kafka
- spring-kafka-test
- test
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-snapshots
- Spring Snapshots
- http://repo.spring.io/libs-snapshot-local
-
- true
-
-
- false
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/multibinder-differentsystems/src/main/java/multibinder/BridgeTransformer.java b/multibinder-differentsystems/src/main/java/multibinder/BridgeTransformer.java
deleted file mode 100644
index da680be..0000000
--- a/multibinder-differentsystems/src/main/java/multibinder/BridgeTransformer.java
+++ /dev/null
@@ -1,35 +0,0 @@
-/*
- * Copyright 2015 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package multibinder;
-
-import org.springframework.cloud.stream.annotation.EnableBinding;
-import org.springframework.cloud.stream.annotation.StreamListener;
-import org.springframework.cloud.stream.messaging.Processor;
-import org.springframework.messaging.handler.annotation.SendTo;
-
-/**
- * @author Marius Bogoevici
- */
-@EnableBinding(Processor.class)
-public class BridgeTransformer {
-
- @StreamListener(Processor.INPUT)
- @SendTo(Processor.OUTPUT)
- public Object transform(Object payload) {
- return payload;
- }
-}
diff --git a/multibinder-differentsystems/src/main/resources/application.yml b/multibinder-differentsystems/src/main/resources/application.yml
deleted file mode 100644
index bbddc06..0000000
--- a/multibinder-differentsystems/src/main/resources/application.yml
+++ /dev/null
@@ -1,34 +0,0 @@
-server:
- port: 8082
-spring:
- cloud:
- stream:
- bindings:
- input:
- destination: dataIn
- binder: kafka1
- group: testGroup
-# output:
-# destination: dataOut
-# binder: kafka2
- binders:
- kafka1:
- type: kafka
- environment:
- spring:
- cloud:
- stream:
- kafka:
- binder:
- brokers: ${kafkaBroker1}
- zkNodes: ${zk1}
-# kafka2:
-# type: kafka
-# environment:
-# spring:
-# cloud:
-# stream:
-# kafka:
-# binder:
-# brokers: ${kafkaBroker2}
-# zkNodes: ${zk2}
diff --git a/multibinder-samples/multibinder-kafka-rabbit/.mvn b/multibinder-samples/multibinder-kafka-rabbit/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/multibinder-samples/multibinder-kafka-rabbit/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/multibinder-samples/multibinder-kafka-rabbit/README.adoc b/multibinder-samples/multibinder-kafka-rabbit/README.adoc
new file mode 100644
index 0000000..d439a2f
--- /dev/null
+++ b/multibinder-samples/multibinder-kafka-rabbit/README.adoc
@@ -0,0 +1,33 @@
+== Spring Cloud Stream Multibinder Application with Different Systems
+
+This example shows how to run a Spring Cloud Stream application with two different binder types (Kafka and RabbitMQ).
+
+=== Running the application
+
+The following instructions assume that you are running Kafka and RabbitMQ as Docker images.
+
+Go to the application root:
+
+* `docker-compose up -d`
+
+This brings up both Kafka and RabbitMQ clusters in Docker containers.
+The local ports 9092 and 5672 are mapped for Kafka and RabbitMQ, respectively.
+
+* `./mvnw clean package`
+
+The sample comes with a convenient test producer and consumer so you can see the processor in action.
+After running the program, watch your console: every second some data is sent to Kafka and received through RabbitMQ.
+
+```
+java -jar target/multibinder-0.0.1-SNAPSHOT.jar
+```
+
+You can also send data directly to Kafka topic:
+
+`docker exec -it kafka-multibinder /opt/kafka/bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic dataIn`
+
+Enter some text and verify that it comes through the console output (test consumer).
+
+You can also visit the RabbitMQ management UI at `http://localhost:15672`.
+
+Once you are done testing: `docker-compose down`
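Routing the input through Kafka and the output through RabbitMQ is plain Spring Cloud Stream configuration — a hedged sketch of what the sample's `application.yml` might look like, modeled on the multibinder configuration shown elsewhere in this repository (the binder names `kafka1`/`rabbit1` and the destinations are illustrative):

```yaml
spring:
  cloud:
    stream:
      bindings:
        input:
          destination: dataIn
          binder: kafka1     # consume from Kafka
        output:
          destination: dataOut
          binder: rabbit1    # publish to RabbitMQ
      binders:
        kafka1:
          type: kafka
        rabbit1:
          type: rabbit
```

Each named binder gets its own environment, so the processor code itself stays middleware-agnostic.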
diff --git a/multibinder-samples/multibinder-kafka-rabbit/docker-compose.yml b/multibinder-samples/multibinder-kafka-rabbit/docker-compose.yml
new file mode 100644
index 0000000..0697236
--- /dev/null
+++ b/multibinder-samples/multibinder-kafka-rabbit/docker-compose.yml
@@ -0,0 +1,25 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-multibinder
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
+ rabbitmq:
+ image: rabbitmq:management
+ container_name: rabbit
+ ports:
+ - 5672:5672
+ - 15672:15672
\ No newline at end of file
diff --git a/jdbc-source/mvnw b/multibinder-samples/multibinder-kafka-rabbit/mvnw
similarity index 100%
rename from jdbc-source/mvnw
rename to multibinder-samples/multibinder-kafka-rabbit/mvnw
diff --git a/jdbc-source/mvnw.cmd b/multibinder-samples/multibinder-kafka-rabbit/mvnw.cmd
similarity index 100%
rename from jdbc-source/mvnw.cmd
rename to multibinder-samples/multibinder-kafka-rabbit/mvnw.cmd
diff --git a/multibinder/pom.xml b/multibinder-samples/multibinder-kafka-rabbit/pom.xml
similarity index 61%
rename from multibinder/pom.xml
rename to multibinder-samples/multibinder-kafka-rabbit/pom.xml
index dd9a644..5a38533 100644
--- a/multibinder/pom.xml
+++ b/multibinder-samples/multibinder-kafka-rabbit/pom.xml
@@ -2,26 +2,20 @@
4.0.0
- spring-cloud-stream-sample-multibinder
+ multibinder-kafka-rabbit
+ 0.0.1-SNAPSHOT
+ jar
- spring-cloud-stream-sample-multibinder
- Demo project for multiple binders of different types (Redis and Rabbit)
+ multibinder-kafka-rabbit
+ Spring Cloud Stream Multibinder Sample
- org.springframework.cloud
- spring-cloud-stream-samples
- 1.2.0.BUILD-SNAPSHOT
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOT
+ ../..
-
- multibinder.MultibinderApplication
-
-
-
- org.springframework.cloud
- spring-cloud-stream
- org.springframework.cloudspring-cloud-stream-binder-rabbit
@@ -35,12 +29,6 @@
spring-cloud-stream-binder-rabbit-test-supporttest
-
- org.springframework.boot
- spring-boot-configuration-processor
- true
-
-
org.springframework.bootspring-boot-starter-test
@@ -57,11 +45,7 @@
org.springframework.bootspring-boot-maven-plugin
-
- exec
-
-
diff --git a/multibinder-samples/multibinder-kafka-rabbit/src/main/java/multibinder/BridgeTransformer.java b/multibinder-samples/multibinder-kafka-rabbit/src/main/java/multibinder/BridgeTransformer.java
new file mode 100644
index 0000000..a777f4f
--- /dev/null
+++ b/multibinder-samples/multibinder-kafka-rabbit/src/main/java/multibinder/BridgeTransformer.java
@@ -0,0 +1,94 @@
+/*
+ * Copyright 2015 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package multibinder;
+
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+import org.springframework.cloud.stream.annotation.EnableBinding;
+import org.springframework.cloud.stream.annotation.Input;
+import org.springframework.cloud.stream.annotation.Output;
+import org.springframework.cloud.stream.annotation.StreamListener;
+import org.springframework.cloud.stream.messaging.Processor;
+import org.springframework.context.annotation.Bean;
+import org.springframework.integration.annotation.InboundChannelAdapter;
+import org.springframework.integration.annotation.Poller;
+import org.springframework.integration.core.MessageSource;
+import org.springframework.messaging.MessageChannel;
+import org.springframework.messaging.SubscribableChannel;
+import org.springframework.messaging.handler.annotation.SendTo;
+import org.springframework.messaging.support.GenericMessage;
+
+import java.util.concurrent.atomic.AtomicBoolean;
+
+/**
+ * @author Marius Bogoevici
+ * @author Soby Chacko
+ */
+@EnableBinding(Processor.class)
+public class BridgeTransformer {
+
+ @StreamListener(Processor.INPUT)
+ @SendTo(Processor.OUTPUT)
+ public Object transform(Object payload) {
+ return payload;
+ }
+
+ //Following source is used as test producer.
+ @EnableBinding(TestSource.class)
+ static class TestProducer {
+
+ private AtomicBoolean semaphore = new AtomicBoolean(true);
+
+ @Bean
+ @InboundChannelAdapter(channel = TestSource.OUTPUT, poller = @Poller(fixedDelay = "1000"))
+	public MessageSource<String> sendTestData() {
+ return () ->
+ new GenericMessage<>(this.semaphore.getAndSet(!this.semaphore.get()) ? "foo" : "bar");
+
+ }
+ }
+
+ //Following sink is used as test consumer for the above processor. It logs the data received through the processor.
+ @EnableBinding(TestSink.class)
+ static class TestConsumer {
+
+ private final Log logger = LogFactory.getLog(getClass());
+
+ @StreamListener(TestSink.INPUT)
+ public void receive(String data) {
+ logger.info("Data received..." + data);
+ }
+ }
+
+ interface TestSink {
+
+ String INPUT = "input1";
+
+ @Input(INPUT)
+ SubscribableChannel input1();
+
+ }
+
+ interface TestSource {
+
+ String OUTPUT = "output1";
+
+ @Output(TestSource.OUTPUT)
+ MessageChannel output();
+
+ }
+}
diff --git a/multibinder-differentsystems/src/main/java/multibinder/MultibinderApplication.java b/multibinder-samples/multibinder-kafka-rabbit/src/main/java/multibinder/MultibinderApplication.java
similarity index 100%
rename from multibinder-differentsystems/src/main/java/multibinder/MultibinderApplication.java
rename to multibinder-samples/multibinder-kafka-rabbit/src/main/java/multibinder/MultibinderApplication.java
diff --git a/multibinder-samples/multibinder-kafka-rabbit/src/main/resources/application.yml b/multibinder-samples/multibinder-kafka-rabbit/src/main/resources/application.yml
new file mode 100644
index 0000000..39f3a21
--- /dev/null
+++ b/multibinder-samples/multibinder-kafka-rabbit/src/main/resources/application.yml
@@ -0,0 +1,19 @@
+spring:
+ cloud:
+ stream:
+ bindings:
+ input:
+ destination: dataIn
+ binder: kafka
+ output:
+ destination: dataOut
+ binder: rabbit
+ #Test sink binding (used for testing)
+ output1:
+ destination: dataIn
+ binder: kafka
+ #Test sink binding (used for testing)
+ input1:
+ destination: dataOut
+ binder: rabbit
+
diff --git a/multibinder/src/test/java/multibinder/RabbitAndKafkaBinderApplicationTests.java b/multibinder-samples/multibinder-kafka-rabbit/src/test/java/multibinder/RabbitAndKafkaBinderApplicationTests.java
similarity index 100%
rename from multibinder/src/test/java/multibinder/RabbitAndKafkaBinderApplicationTests.java
rename to multibinder-samples/multibinder-kafka-rabbit/src/test/java/multibinder/RabbitAndKafkaBinderApplicationTests.java
diff --git a/dynamic-destination-source/.mvn b/multibinder-samples/multibinder-two-kafka-clusters/.mvn
similarity index 100%
rename from dynamic-destination-source/.mvn
rename to multibinder-samples/multibinder-two-kafka-clusters/.mvn
diff --git a/multibinder-samples/multibinder-two-kafka-clusters/README.adoc b/multibinder-samples/multibinder-two-kafka-clusters/README.adoc
new file mode 100644
index 0000000..ef4b649
--- /dev/null
+++ b/multibinder-samples/multibinder-two-kafka-clusters/README.adoc
@@ -0,0 +1,38 @@
+== Spring Cloud Stream Multibinder Sample with Two Kafka Clusters
+
+This example shows how to run a Spring Cloud Stream application with the same binder type configured for two separate Kafka clusters.
+
+
+=== Running the application
+
+The following instructions assume that you are running Kafka as a Docker image.
+
+* Go to the application root
+* `docker-compose up -d`
+
+This brings up two Kafka clusters in Docker containers.
+The local ports mapped for Kafka are 9092 and 9093 (the local ports mapped for Zookeeper are 2181 and 2182).
+
+* `./mvnw clean package`
+
+The sample comes with a convenient test producer and consumer so that you can see the processor in action.
+After running the application, watch your console: every second, data is sent to Kafka cluster 1 and received back through Kafka cluster 2.
+
+To run the example, command-line parameters for the Zookeeper ensembles and Kafka clusters must be provided, as in the following example:
+```
+java -jar target/multibinder-two-kafka-clusters-0.0.1-SNAPSHOT.jar --kafkaBroker1=localhost:9092 --zk1=localhost:2181 --kafkaBroker2=localhost:9093 --zk2=localhost:2182
+```
+
+If these parameters are omitted, the default values of `localhost:9092` and `localhost:2181` are used for both clusters.
+
+Assuming you are running the two dockerized Kafka clusters as above, issue the following commands:
+
+`docker exec -it kafka-multibinder-1 /opt/kafka/bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic dataIn`
+
+On another terminal:
+
+`docker exec -it kafka-multibinder-2 /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9093 --topic dataOut`
+
+Enter some text on the first one and the same text appears on the second one.
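+
+The bridging across the two clusters is driven entirely by configuration.
+The following is a minimal sketch of the binder setup in this sample's `application.yml` (`kafka1` and `kafka2` are the binder names referenced by the bindings):
+
+```yaml
+spring:
+  cloud:
+    stream:
+      bindings:
+        input:
+          destination: dataIn
+          binder: kafka1
+        output:
+          destination: dataOut
+          binder: kafka2
+      binders:
+        kafka1:
+          type: kafka
+          environment:
+            spring.cloud.stream.kafka.binder.brokers: ${kafkaBroker1}
+        kafka2:
+          type: kafka
+          environment:
+            spring.cloud.stream.kafka.binder.brokers: ${kafkaBroker2}
+```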
diff --git a/multibinder-samples/multibinder-two-kafka-clusters/docker-compose.yml b/multibinder-samples/multibinder-two-kafka-clusters/docker-compose.yml
new file mode 100644
index 0000000..e7082eb
--- /dev/null
+++ b/multibinder-samples/multibinder-two-kafka-clusters/docker-compose.yml
@@ -0,0 +1,36 @@
+version: '3'
+services:
+ kafka1:
+ image: wurstmeister/kafka
+ container_name: kafka-multibinder-1
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper1:2181
+ depends_on:
+ - zookeeper1
+ zookeeper1:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper1
+ kafka2:
+ image: wurstmeister/kafka
+ container_name: kafka-multibinder-2
+ ports:
+ - "9093:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper2:2181
+ depends_on:
+ - zookeeper2
+ zookeeper2:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2182:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper2
\ No newline at end of file
diff --git a/kinesis-produce-consume/mvnw b/multibinder-samples/multibinder-two-kafka-clusters/mvnw
similarity index 100%
rename from kinesis-produce-consume/mvnw
rename to multibinder-samples/multibinder-two-kafka-clusters/mvnw
diff --git a/kinesis-produce-consume/mvnw.cmd b/multibinder-samples/multibinder-two-kafka-clusters/mvnw.cmd
old mode 100755
new mode 100644
similarity index 100%
rename from kinesis-produce-consume/mvnw.cmd
rename to multibinder-samples/multibinder-two-kafka-clusters/mvnw.cmd
diff --git a/multibinder-samples/multibinder-two-kafka-clusters/pom.xml b/multibinder-samples/multibinder-two-kafka-clusters/pom.xml
new file mode 100644
index 0000000..fd134ee
--- /dev/null
+++ b/multibinder-samples/multibinder-two-kafka-clusters/pom.xml
@@ -0,0 +1,50 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>multibinder-two-kafka-clusters</artifactId>
+	<version>0.0.1-SNAPSHOT</version>
+	<packaging>jar</packaging>
+	<name>multibinder-two-kafka-clusters</name>
+	<description>Spring Cloud Stream Multibinder Two Kafka Clusters Sample</description>
+
+	<parent>
+		<groupId>spring.cloud.stream.samples</groupId>
+		<artifactId>spring-cloud-stream-samples-parent</artifactId>
+		<version>0.0.1-SNAPSHOT</version>
+		<relativePath>../..</relativePath>
+	</parent>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.springframework.cloud</groupId>
+			<artifactId>spring-cloud-stream-binder-kafka</artifactId>
+			<version>2.0.0.BUILD-SNAPSHOT</version>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.cloud</groupId>
+			<artifactId>spring-cloud-stream-binder-kafka-core</artifactId>
+			<version>2.0.0.BUILD-SNAPSHOT</version>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.boot</groupId>
+			<artifactId>spring-boot-starter-test</artifactId>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.kafka</groupId>
+			<artifactId>spring-kafka-test</artifactId>
+			<scope>test</scope>
+		</dependency>
+	</dependencies>
+
+	<build>
+		<plugins>
+			<plugin>
+				<groupId>org.springframework.boot</groupId>
+				<artifactId>spring-boot-maven-plugin</artifactId>
+			</plugin>
+		</plugins>
+	</build>
+</project>
diff --git a/multibinder-samples/multibinder-two-kafka-clusters/src/main/java/multibinder/BridgeTransformer.java b/multibinder-samples/multibinder-two-kafka-clusters/src/main/java/multibinder/BridgeTransformer.java
new file mode 100644
index 0000000..a777f4f
--- /dev/null
+++ b/multibinder-samples/multibinder-two-kafka-clusters/src/main/java/multibinder/BridgeTransformer.java
@@ -0,0 +1,94 @@
+/*
+ * Copyright 2015 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package multibinder;
+
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+import org.springframework.cloud.stream.annotation.EnableBinding;
+import org.springframework.cloud.stream.annotation.Input;
+import org.springframework.cloud.stream.annotation.Output;
+import org.springframework.cloud.stream.annotation.StreamListener;
+import org.springframework.cloud.stream.messaging.Processor;
+import org.springframework.context.annotation.Bean;
+import org.springframework.integration.annotation.InboundChannelAdapter;
+import org.springframework.integration.annotation.Poller;
+import org.springframework.integration.core.MessageSource;
+import org.springframework.messaging.MessageChannel;
+import org.springframework.messaging.SubscribableChannel;
+import org.springframework.messaging.handler.annotation.SendTo;
+import org.springframework.messaging.support.GenericMessage;
+
+import java.util.concurrent.atomic.AtomicBoolean;
+
+/**
+ * @author Marius Bogoevici
+ * @author Soby Chacko
+ */
+@EnableBinding(Processor.class)
+public class BridgeTransformer {
+
+ @StreamListener(Processor.INPUT)
+ @SendTo(Processor.OUTPUT)
+ public Object transform(Object payload) {
+ return payload;
+ }
+
+ //Following source is used as test producer.
+ @EnableBinding(TestSource.class)
+ static class TestProducer {
+
+ private AtomicBoolean semaphore = new AtomicBoolean(true);
+
+ @Bean
+ @InboundChannelAdapter(channel = TestSource.OUTPUT, poller = @Poller(fixedDelay = "1000"))
+	public MessageSource<String> sendTestData() {
+ return () ->
+ new GenericMessage<>(this.semaphore.getAndSet(!this.semaphore.get()) ? "foo" : "bar");
+
+ }
+ }
+
+ //Following sink is used as test consumer for the above processor. It logs the data received through the processor.
+ @EnableBinding(TestSink.class)
+ static class TestConsumer {
+
+ private final Log logger = LogFactory.getLog(getClass());
+
+ @StreamListener(TestSink.INPUT)
+ public void receive(String data) {
+ logger.info("Data received..." + data);
+ }
+ }
+
+ interface TestSink {
+
+ String INPUT = "input1";
+
+ @Input(INPUT)
+ SubscribableChannel input1();
+
+ }
+
+ interface TestSource {
+
+ String OUTPUT = "output1";
+
+ @Output(TestSource.OUTPUT)
+ MessageChannel output();
+
+ }
+}
diff --git a/multibinder/src/main/java/multibinder/MultibinderApplication.java b/multibinder-samples/multibinder-two-kafka-clusters/src/main/java/multibinder/MultibinderApplication.java
similarity index 100%
rename from multibinder/src/main/java/multibinder/MultibinderApplication.java
rename to multibinder-samples/multibinder-two-kafka-clusters/src/main/java/multibinder/MultibinderApplication.java
diff --git a/multibinder-samples/multibinder-two-kafka-clusters/src/main/resources/application.yml b/multibinder-samples/multibinder-two-kafka-clusters/src/main/resources/application.yml
new file mode 100644
index 0000000..f2c5376
--- /dev/null
+++ b/multibinder-samples/multibinder-two-kafka-clusters/src/main/resources/application.yml
@@ -0,0 +1,40 @@
+spring:
+ cloud:
+ stream:
+ bindings:
+ input:
+ destination: dataIn
+ binder: kafka1
+ group: testGroup
+ output:
+ destination: dataOut
+ binder: kafka2
+ #Test sink binding (used for testing)
+ output1:
+ destination: dataIn
+ binder: kafka1
+ #Test sink binding (used for testing)
+ input1:
+ destination: dataOut
+ binder: kafka2
+ binders:
+ kafka1:
+ type: kafka
+ environment:
+ spring:
+ cloud:
+ stream:
+ kafka:
+ binder:
+ brokers: ${kafkaBroker1}
+ zkNodes: ${zk1}
+ kafka2:
+ type: kafka
+ environment:
+ spring:
+ cloud:
+ stream:
+ kafka:
+ binder:
+ brokers: ${kafkaBroker2}
+ zkNodes: ${zk2}
diff --git a/multibinder-differentsystems/src/test/java/multibinder/TwoKafkaBindersApplicationTest.java b/multibinder-samples/multibinder-two-kafka-clusters/src/test/java/multibinder/TwoKafkaBindersApplicationTest.java
similarity index 82%
rename from multibinder-differentsystems/src/test/java/multibinder/TwoKafkaBindersApplicationTest.java
rename to multibinder-samples/multibinder-two-kafka-clusters/src/test/java/multibinder/TwoKafkaBindersApplicationTest.java
index a8e0e4d..d906fff 100644
--- a/multibinder-differentsystems/src/test/java/multibinder/TwoKafkaBindersApplicationTest.java
+++ b/multibinder-samples/multibinder-two-kafka-clusters/src/test/java/multibinder/TwoKafkaBindersApplicationTest.java
@@ -18,7 +18,10 @@ package multibinder;
import org.hamcrest.CoreMatchers;
import org.hamcrest.Matchers;
-import org.junit.*;
+import org.junit.Assert;
+import org.junit.BeforeClass;
+import org.junit.ClassRule;
+import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.beans.factory.annotation.Autowired;
@@ -26,17 +29,13 @@ import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.cloud.stream.binder.Binder;
import org.springframework.cloud.stream.binder.BinderFactory;
import org.springframework.cloud.stream.binder.ExtendedConsumerProperties;
-import org.springframework.cloud.stream.binder.ExtendedProducerProperties;
import org.springframework.cloud.stream.binder.kafka.KafkaMessageChannelBinder;
import org.springframework.cloud.stream.binder.kafka.properties.KafkaBinderConfigurationProperties;
import org.springframework.cloud.stream.binder.kafka.properties.KafkaConsumerProperties;
-import org.springframework.cloud.stream.binder.kafka.properties.KafkaProducerProperties;
-import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.channel.QueueChannel;
import org.springframework.kafka.test.rule.KafkaEmbedded;
import org.springframework.messaging.Message;
import org.springframework.messaging.MessageChannel;
-import org.springframework.messaging.support.MessageBuilder;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.junit4.SpringRunner;
@@ -49,7 +48,6 @@ import static org.hamcrest.Matchers.equalTo;
@SpringBootTest(
webEnvironment = SpringBootTest.WebEnvironment.NONE)
@DirtiesContext
-@Ignore
public class TwoKafkaBindersApplicationTest {
@ClassRule
@@ -91,19 +89,14 @@ public class TwoKafkaBindersApplicationTest {
@Test
public void messagingWorks() {
- DirectChannel dataProducer = new DirectChannel();
- ((KafkaMessageChannelBinder) binderFactory.getBinder("kafka1", MessageChannel.class))
- .bindProducer("dataIn", dataProducer, new ExtendedProducerProperties<>(new KafkaProducerProperties()));
-
QueueChannel dataConsumer = new QueueChannel();
((KafkaMessageChannelBinder) binderFactory.getBinder("kafka2", MessageChannel.class)).bindConsumer("dataOut", UUID.randomUUID().toString(),
dataConsumer, new ExtendedConsumerProperties<>(new KafkaConsumerProperties()));
- String testPayload = "testFoo" + UUID.randomUUID().toString();
- dataProducer.send(MessageBuilder.withPayload(testPayload).build());
+ //receiving test message sent by the test producer in the application
 		Message<?> receive = dataConsumer.receive(60_000);
Assert.assertThat(receive, Matchers.notNullValue());
- Assert.assertThat(receive.getPayload(), CoreMatchers.equalTo(testPayload));
+ Assert.assertThat(receive.getPayload(), CoreMatchers.equalTo("foo".getBytes()));
}
}
diff --git a/multibinder-samples/pom.xml b/multibinder-samples/pom.xml
new file mode 100644
index 0000000..4f64a45
--- /dev/null
+++ b/multibinder-samples/pom.xml
@@ -0,0 +1,15 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<modelVersion>4.0.0</modelVersion>
+	<groupId>spring.cloud.stream.samples</groupId>
+	<artifactId>multibinder-samples</artifactId>
+	<version>0.0.1-SNAPSHOT</version>
+	<packaging>pom</packaging>
+	<name>multibinder-samples</name>
+	<description>Collection of Spring Cloud Stream Multibinder Samples</description>
+
+	<modules>
+		<module>multibinder-kafka-rabbit</module>
+		<module>multibinder-two-kafka-clusters</module>
+	</modules>
+</project>
diff --git a/multibinder/src/main/java/multibinder/BridgeTransformer.java b/multibinder/src/main/java/multibinder/BridgeTransformer.java
deleted file mode 100644
index 0e25a55..0000000
--- a/multibinder/src/main/java/multibinder/BridgeTransformer.java
+++ /dev/null
@@ -1,36 +0,0 @@
-/*
- * Copyright 2015 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package multibinder;
-
-import org.springframework.cloud.stream.annotation.EnableBinding;
-import org.springframework.cloud.stream.annotation.StreamListener;
-import org.springframework.cloud.stream.messaging.Processor;
-import org.springframework.integration.annotation.ServiceActivator;
-import org.springframework.messaging.handler.annotation.SendTo;
-
-/**
- * @author Marius Bogoevici
- */
-@EnableBinding(Processor.class)
-public class BridgeTransformer {
-
- @StreamListener(Processor.INPUT)
- @SendTo(Processor.OUTPUT)
- public Object transform(Object payload) {
- return payload;
- }
-}
diff --git a/multibinder/src/main/resources/application.yml b/multibinder/src/main/resources/application.yml
deleted file mode 100644
index 9dcd1b1..0000000
--- a/multibinder/src/main/resources/application.yml
+++ /dev/null
@@ -1,12 +0,0 @@
-server:
- port: 8082
-spring:
- cloud:
- stream:
- bindings:
- input:
- destination: dataIn
- binder: kafka
- output:
- destination: dataOut
- binder: rabbit
diff --git a/non-self-contained-aggregate-app/.mvn b/non-self-contained-aggregate-app/.mvn
deleted file mode 120000
index 19172e1..0000000
--- a/non-self-contained-aggregate-app/.mvn
+++ /dev/null
@@ -1 +0,0 @@
-../.mvn
\ No newline at end of file
diff --git a/non-self-contained-aggregate-app/README.adoc b/non-self-contained-aggregate-app/README.adoc
deleted file mode 100644
index e04f97d..0000000
--- a/non-self-contained-aggregate-app/README.adoc
+++ /dev/null
@@ -1,49 +0,0 @@
-Spring Cloud Stream - Non self-contained Aggregate application sample
-=====================================================================
-
-In this *Spring Cloud Stream* sample, the application shows how to write a non self-contained aggregate application.
-A non self-contained application is the one that has its applications directly bound but either or both the input and output of the application is bound to the external broker.
-
-## Requirements
-
-To run this sample, you will need to have installed:
-
-* Java 8 or Above
-
-## Code Tour
-
-* NonSelfContainedAggregateApplication - the Spring Boot Main Aggregate Application that directly binds `Source` and `Processor` application while the processor application's output is bound to Kafka.
-* SourceAppConfiguration - Configuration for the source
-* ProcessorAppConfiguraion - Configuration for the processor
-* SourceApplication - Spring Boot app for the source
-* ProcessorApplication - Spring Boot app for the processor
-
-## Running the application
-
-* Go to the application root
-
-* Ensure that you have Kafka running
-
-* Run the Kafka console consume as below:
-
-`bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic processor-output`
-
-`processor-output` is the destination that is bound from the processor.
-
-* `./mvnw clean package`
-
-* `java -jar target/non-self-contained-aggregate-app-0.0.1-SNAPSHOT.jar`
-
-Source application sends a message every second which will initiate the processor which will produce the output into a destination on the broker (Kafka topic in this case).
-You will see output similar to the following printed on the kafka console consumer.
-
-```
-2018-03-02 19:46:07
-2018-03-02 19:46:08
-2018-03-02 19:46:09
-2018-03-02 19:46:10
-2018-03-02 19:46:11
-2018-03-02 19:46:12
-2018-03-02 19:46:13
-2018-03-02 19:46:14
-```
\ No newline at end of file
diff --git a/non-self-contained-aggregate-app/pom.xml b/non-self-contained-aggregate-app/pom.xml
deleted file mode 100644
index e2423dc..0000000
--- a/non-self-contained-aggregate-app/pom.xml
+++ /dev/null
@@ -1,94 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-	<modelVersion>4.0.0</modelVersion>
-
-	<groupId>spring.cloud.stream.samples</groupId>
-	<artifactId>non-self-contained-aggregate-app</artifactId>
-	<version>0.0.1-SNAPSHOT</version>
-	<packaging>jar</packaging>
-
-	<name>non-self-contained-aggregate-app</name>
-	<description>Demo project for Spring Boot</description>
-
-	<parent>
-		<groupId>org.springframework.boot</groupId>
-		<artifactId>spring-boot-starter-parent</artifactId>
-		<version>2.0.0.BUILD-SNAPSHOT</version>
-		<relativePath/>
-	</parent>
-
-	<properties>
-		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-		<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
-		<java.version>1.8</java.version>
-		<spring-cloud.version>Finchley.BUILD-SNAPSHOT</spring-cloud.version>
-	</properties>
-
-	<dependencies>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-actuator</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-web</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.cloud</groupId>
-			<artifactId>spring-cloud-stream-binder-kafka</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-starter-test</artifactId>
-			<scope>test</scope>
-		</dependency>
-	</dependencies>
-
-	<dependencyManagement>
-		<dependencies>
-			<dependency>
-				<groupId>org.springframework.cloud</groupId>
-				<artifactId>spring-cloud-dependencies</artifactId>
-				<version>${spring-cloud.version}</version>
-				<type>pom</type>
-				<scope>import</scope>
-			</dependency>
-		</dependencies>
-	</dependencyManagement>
-
-	<build>
-		<plugins>
-			<plugin>
-				<groupId>org.springframework.boot</groupId>
-				<artifactId>spring-boot-maven-plugin</artifactId>
-			</plugin>
-		</plugins>
-	</build>
-
-	<repositories>
-		<repository>
-			<id>spring-snapshots</id>
-			<name>Spring Snapshots</name>
-			<url>http://repo.spring.io/libs-snapshot-local</url>
-			<snapshots>
-				<enabled>true</enabled>
-			</snapshots>
-			<releases>
-				<enabled>false</enabled>
-			</releases>
-		</repository>
-		<repository>
-			<id>spring-milestones</id>
-			<name>Spring Milestones</name>
-			<url>http://repo.spring.io/libs-milestone-local</url>
-			<snapshots>
-				<enabled>false</enabled>
-			</snapshots>
-		</repository>
-	</repositories>
-
-</project>
diff --git a/non-self-contained-aggregate-app/src/main/java/config/processor/ProcessorAppConfiguration.java b/non-self-contained-aggregate-app/src/main/java/config/processor/ProcessorAppConfiguration.java
deleted file mode 100644
index fe32143..0000000
--- a/non-self-contained-aggregate-app/src/main/java/config/processor/ProcessorAppConfiguration.java
+++ /dev/null
@@ -1,35 +0,0 @@
-/*
- * Copyright 2017 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package config.processor;
-
-import org.springframework.cloud.stream.annotation.EnableBinding;
-import org.springframework.cloud.stream.messaging.Processor;
-import org.springframework.integration.annotation.Transformer;
-import org.springframework.messaging.Message;
-
-/**
- * @author Marius Bogoevici
- */
-@EnableBinding(Processor.class)
-public class ProcessorAppConfiguration {
-
- @Transformer(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
-	public Message<?> transform(Message<?> inbound) {
- return inbound;
- }
-
-}
diff --git a/non-self-contained-aggregate-app/src/main/java/config/source/SourceAppConfiguration.java b/non-self-contained-aggregate-app/src/main/java/config/source/SourceAppConfiguration.java
deleted file mode 100644
index 3de2e84..0000000
--- a/non-self-contained-aggregate-app/src/main/java/config/source/SourceAppConfiguration.java
+++ /dev/null
@@ -1,45 +0,0 @@
-/*
- * Copyright 2017 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package config.source;
-
-import org.springframework.cloud.stream.annotation.EnableBinding;
-import org.springframework.cloud.stream.messaging.Source;
-import org.springframework.context.annotation.Bean;
-import org.springframework.integration.annotation.InboundChannelAdapter;
-import org.springframework.integration.annotation.Poller;
-import org.springframework.integration.core.MessageSource;
-import org.springframework.messaging.support.GenericMessage;
-
-import java.text.SimpleDateFormat;
-import java.util.Date;
-
-/**
- * @author Dave Syer
- * @author Marius Bogoevici
- */
-@EnableBinding(Source.class)
-public class SourceAppConfiguration {
-
- private String format = "yyyy-MM-dd HH:mm:ss";
-
- @Bean
- @InboundChannelAdapter(value = Source.OUTPUT, poller = @Poller(fixedDelay = "${fixedDelay}", maxMessagesPerPoll = "1"))
- public MessageSource timerMessageSource() {
- return () -> new GenericMessage<>(new SimpleDateFormat(this.format).format(new Date()));
- }
-
-}
diff --git a/non-self-contained-aggregate-app/src/main/java/config/source/SourceApplication.java b/non-self-contained-aggregate-app/src/main/java/config/source/SourceApplication.java
deleted file mode 100644
index c1d45a6..0000000
--- a/non-self-contained-aggregate-app/src/main/java/config/source/SourceApplication.java
+++ /dev/null
@@ -1,26 +0,0 @@
-/*
- * Copyright 2017 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package config.source;
-
-import org.springframework.boot.autoconfigure.SpringBootApplication;
-
-/**
- * @author Marius Bogoevici
- */
-@SpringBootApplication
-public class SourceApplication {
-}
diff --git a/non-self-contained-aggregate-app/src/main/java/demo/NonSelfContainedAggregateApplication.java b/non-self-contained-aggregate-app/src/main/java/demo/NonSelfContainedAggregateApplication.java
deleted file mode 100644
index 0aae5ab..0000000
--- a/non-self-contained-aggregate-app/src/main/java/demo/NonSelfContainedAggregateApplication.java
+++ /dev/null
@@ -1,35 +0,0 @@
-/*
- * Copyright 2017 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package demo;
-
-import config.processor.ProcessorApplication;
-import config.source.SourceApplication;
-import org.springframework.boot.autoconfigure.SpringBootApplication;
-import org.springframework.cloud.stream.aggregate.AggregateApplicationBuilder;
-
-/**
- * @author Ilayaperumal Gopinathan
- */
-@SpringBootApplication
-public class NonSelfContainedAggregateApplication {
-
- public static void main(String[] args) {
- new AggregateApplicationBuilder(NonSelfContainedAggregateApplication.class)
- .from(SourceApplication.class).args("--fixedDelay=1000")
- .via(ProcessorApplication.class).namespace("a").run("--spring.cloud.stream.bindings.output.destination=processor-output");
- }
-}
diff --git a/non-self-contained-aggregate-app/src/main/resources/application.yml b/non-self-contained-aggregate-app/src/main/resources/application.yml
deleted file mode 100644
index d84e8d1..0000000
--- a/non-self-contained-aggregate-app/src/main/resources/application.yml
+++ /dev/null
@@ -1 +0,0 @@
-fixedDelay: 1000
diff --git a/non-self-contained-aggregate-app/src/test/java/demo/ModuleApplicationTests.java b/non-self-contained-aggregate-app/src/test/java/demo/ModuleApplicationTests.java
deleted file mode 100644
index d3943e3..0000000
--- a/non-self-contained-aggregate-app/src/test/java/demo/ModuleApplicationTests.java
+++ /dev/null
@@ -1,37 +0,0 @@
-/*
- * Copyright 2015 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package demo;
-
-import org.junit.Test;
-import org.junit.runner.RunWith;
-
-import org.springframework.boot.test.context.SpringBootTest;
-import org.springframework.test.annotation.DirtiesContext;
-import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
-import org.springframework.test.context.web.WebAppConfiguration;
-
-@RunWith(SpringJUnit4ClassRunner.class)
-@SpringBootTest(classes = NonSelfContainedAggregateApplication.class)
-@WebAppConfiguration
-@DirtiesContext
-public class ModuleApplicationTests {
-
- @Test
- public void contextLoads() {
- }
-
-}
diff --git a/non-self-contained-aggregate-app/start-kafka-shell.sh b/non-self-contained-aggregate-app/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/non-self-contained-aggregate-app/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/pom.xml b/pom.xml
index 82a7e9e..513bf0a 100644
--- a/pom.xml
+++ b/pom.xml
@@ -1,42 +1,42 @@
4.0.0
- spring-cloud-stream-samples
- 1.2.0.BUILD-SNAPSHOT
+
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOTpomhttps://github.com/spring-cloud/spring-cloud-stream-samples
-
- Pivotal Software, Inc.
- http://www.spring.io
-
+
+ spring-cloud-stream-samples-parent
+ Demo project for Spring Boot
+
- org.springframework.cloud
- spring-cloud-build
- 1.3.5.RELEASE
+ org.springframework.boot
+ spring-boot-starter-parent
+ 2.0.0.RELEASE
+
+
+
+ sink-samples
+ source-samples
+ processor-samples
+ kafka-streams-samples
+ multi-io-samples
+ kinesis-samples
+ multibinder-samples
+ testing-samples
+ samples-acceptance-tests
+
+
UTF-8
- Ditmars.BUILD-SNAPSHOT
- 2.1.0.RELEASE
- 1.8
+ UTF-8
+ 1.8
+ Elmhurst.BUILD-SNAPSHOT
-
- jdbc-sink
- jdbc-source
- streamlistener-basic
- self-contained-aggregate-app
- non-self-contained-aggregate-app
- uppercase-transformer
- reactive-processor
- kafka-streams
- multi-io
- dynamic-destination-source
- test-embedded-kafka
- kinesis-produce-consume
-
- multibinder-differentsystems
- testing
-
+
@@ -46,103 +46,80 @@
pomimport
-
- org.springframework.cloud
- spring-cloud-stream-sample-source
- 1.2.0.BUILD-SNAPSHOT
-
-
- org.springframework.cloud
- spring-cloud-stream-sample-sink
- 1.2.0.BUILD-SNAPSHOT
-
-
- org.springframework.cloud
- spring-cloud-stream-sample-transform
- 1.2.0.BUILD-SNAPSHOT
-
-
- org.springframework.kafka
- spring-kafka-test
- test
- ${spring-kafka.version}
-
-
-
-
-
-
- maven-deploy-plugin
-
- true
-
-
-
-
-
-
-
- spring
-
-
- spring-snapshots
- Spring Snapshots
- http://repo.spring.io/libs-snapshot-local
-
- true
-
-
- false
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
- spring-releases
- Spring Releases
- http://repo.spring.io/release
-
- false
-
-
-
-
-
- spring-snapshots
- Spring Snapshots
- http://repo.spring.io/libs-snapshot-local
-
- true
-
-
- false
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
- spring-releases
- Spring Releases
- http://repo.spring.io/libs-release-local
-
- false
-
-
-
-
-
+
+
+
+ org.springframework.boot
+ spring-boot-starter-actuator
+
+
+ org.springframework.boot
+ spring-boot-starter
+
+
+ org.springframework.boot
+ spring-boot-starter-web
+
+
+
+
+
+ spring-snapshots
+ Spring Snapshots
+ http://repo.spring.io/libs-snapshot-local
+
+ true
+
+
+ false
+
+
+
+ spring-milestones
+ Spring Milestones
+ http://repo.spring.io/libs-milestone-local
+
+ false
+
+
+
+ spring-releases
+ Spring Releases
+ http://repo.spring.io/release
+
+ false
+
+
+
+
+
+ spring-snapshots
+ Spring Snapshots
+ http://repo.spring.io/libs-snapshot-local
+
+ true
+
+
+ false
+
+
+
+ spring-milestones
+ Spring Milestones
+ http://repo.spring.io/libs-milestone-local
+
+ false
+
+
+
+ spring-releases
+ Spring Releases
+ http://repo.spring.io/libs-release-local
+
+ false
+
+
+
diff --git a/processor-samples/pom.xml b/processor-samples/pom.xml
new file mode 100644
index 0000000..4ed9198
--- /dev/null
+++ b/processor-samples/pom.xml
@@ -0,0 +1,17 @@
+
+
+ 4.0.0
+ spring.cloud.stream.samples
+ processor-samples
+ 0.0.1-SNAPSHOT
+ pom
+ processor-samples
+ Collection of Spring Cloud Stream Processor Samples
+
+
+ streamlistener-basic
+ uppercase-transformer
+ reactive-processor
+ sensor-average-reactive
+
+
diff --git a/reactive-processor/.jdk8 b/processor-samples/reactive-processor/.jdk8
similarity index 100%
rename from reactive-processor/.jdk8
rename to processor-samples/reactive-processor/.jdk8
diff --git a/processor-samples/reactive-processor/.mvn b/processor-samples/reactive-processor/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/processor-samples/reactive-processor/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/reactive-processor/README.adoc b/processor-samples/reactive-processor/README.adoc
similarity index 70%
rename from reactive-processor/README.adoc
rename to processor-samples/reactive-processor/README.adoc
index d439b02..e62e774 100644
--- a/reactive-processor/README.adoc
+++ b/processor-samples/reactive-processor/README.adoc
@@ -24,7 +24,7 @@ The main application contains the reactive processor that receives textual data
It then sends the aggregated data through the outbound destination of the processor.
The application also provides a source and sink for testing.
-Test source will generate some text every second and the test sink will verify that the processor has converted the text into its uppercase.
+The test source generates some text every second, and the test sink verifies that the processor has accumulated the text over a duration.
Test source is bound to the same broker destination where the processor is listening from.
Similarly test sink is bound to the same broker destination where the processor is producing to.
@@ -35,4 +35,18 @@ Data received: foobarfoobarfoo
Data received: barfoobarfoobar
Data received: foobarfoobarfoo
Data received: foobarfoobarfoo
-```
\ No newline at end of file
+```
+
+* `docker-compose down`
+
+## Running the application using Rabbit binder
+
+All the instructions above apply here as well, but instead of running the default `docker-compose.yml`, use the command below to start a RabbitMQ cluster.
+
+* `docker-compose -f docker-compose-rabbit.yml up -d`
+
+* `./mvnw clean package -P rabbit-binder`
+
+* `java -jar target/reactive-processor-0.0.1-SNAPSHOT.jar`
+
+Once you are done testing: `docker-compose -f docker-compose-rabbit.yml down`
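The accumulation this processor performs can be illustrated, independent of Reactor, with a plain Java sketch that concatenates one window's worth of incoming payloads (the fixed window size here is illustrative; the sample windows by time):

```java
import java.util.List;
import java.util.stream.Collectors;

public class AccumulateSketch {

    // Concatenate one window's worth of payloads, as the processor does
    // for each time window before emitting the result downstream.
    static String accumulate(List<String> window) {
        return window.stream().collect(Collectors.joining());
    }

    public static void main(String[] args) {
        // The test source alternates "foo" and "bar" every second,
        // so a five-element window yields output like the console sample above.
        System.out.println(accumulate(List.of("foo", "bar", "foo", "bar", "foo")));
    }
}
```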
diff --git a/processor-samples/reactive-processor/docker-compose-rabbit.yml b/processor-samples/reactive-processor/docker-compose-rabbit.yml
new file mode 100644
index 0000000..7c3da92
--- /dev/null
+++ b/processor-samples/reactive-processor/docker-compose-rabbit.yml
@@ -0,0 +1,7 @@
+version: '3'
+services:
+ rabbitmq:
+ image: rabbitmq:management
+ ports:
+ - 5672:5672
+ - 15672:15672
\ No newline at end of file
diff --git a/processor-samples/reactive-processor/docker-compose.yml b/processor-samples/reactive-processor/docker-compose.yml
new file mode 100644
index 0000000..976e368
--- /dev/null
+++ b/processor-samples/reactive-processor/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-reactive-processor
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
\ No newline at end of file
diff --git a/multi-io/mvnw b/processor-samples/reactive-processor/mvnw
similarity index 100%
rename from multi-io/mvnw
rename to processor-samples/reactive-processor/mvnw
diff --git a/multi-io/mvnw.cmd b/processor-samples/reactive-processor/mvnw.cmd
similarity index 100%
rename from multi-io/mvnw.cmd
rename to processor-samples/reactive-processor/mvnw.cmd
diff --git a/processor-samples/reactive-processor/pom.xml b/processor-samples/reactive-processor/pom.xml
new file mode 100644
index 0000000..0c39c63
--- /dev/null
+++ b/processor-samples/reactive-processor/pom.xml
@@ -0,0 +1,64 @@
+
+
+ 4.0.0
+
+ reactive-processor
+ 0.0.1-SNAPSHOT
+ jar
+ reactive-processor
+ Spring Cloud Stream Reactive Processor
+
+
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOT
+ ../..
+
+
+
+
+ org.springframework.cloud
+ spring-cloud-stream-reactive
+
+
+ org.springframework.boot
+ spring-boot-starter-test
+ test
+
+
+
+
+
+ kafka-binder
+
+ true
+
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-kafka
+
+
+
+
+ rabbit-binder
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-rabbit
+
+
+
+
+
+
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+
+
+
+
+
diff --git a/reactive-processor/src/main/java/reactive/kafka/ReactiveKafkaProcessorApplication.java b/processor-samples/reactive-processor/src/main/java/reactive/kafka/ReactiveProcessorApplication.java
similarity index 88%
rename from reactive-processor/src/main/java/reactive/kafka/ReactiveKafkaProcessorApplication.java
rename to processor-samples/reactive-processor/src/main/java/reactive/kafka/ReactiveProcessorApplication.java
index ee19074..3ef7a91 100644
--- a/reactive-processor/src/main/java/reactive/kafka/ReactiveKafkaProcessorApplication.java
+++ b/processor-samples/reactive-processor/src/main/java/reactive/kafka/ReactiveProcessorApplication.java
@@ -1,5 +1,7 @@
package reactive.kafka;
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
@@ -21,10 +23,10 @@ import java.util.concurrent.atomic.AtomicBoolean;
@SpringBootApplication
@EnableBinding(Processor.class)
-public class ReactiveKafkaProcessorApplication {
+public class ReactiveProcessorApplication {
public static void main(String[] args) {
- SpringApplication.run(ReactiveKafkaProcessorApplication.class, args);
+ SpringApplication.run(ReactiveProcessorApplication.class, args);
}
@StreamListener
@@ -51,16 +53,17 @@ public class ReactiveKafkaProcessorApplication {
public MessageSource sendTestData() {
return () ->
new GenericMessage<>(this.semaphore.getAndSet(!this.semaphore.get()) ? "foo" : "bar");
-
}
}
@EnableBinding(Sink.class)
static class TestSink {
+ private final Log logger = LogFactory.getLog(getClass());
+
@StreamListener("test-sink")
public void receive(String payload) {
- System.out.println("Data received: " + payload);
+ logger.info("Data received: " + payload);
}
}
@@ -73,5 +76,4 @@ public class ReactiveKafkaProcessorApplication {
@Output("test-source")
MessageChannel sampleSource();
}
-
}
diff --git a/reactive-processor/src/main/resources/application.yml b/processor-samples/reactive-processor/src/main/resources/application.yml
similarity index 92%
rename from reactive-processor/src/main/resources/application.yml
rename to processor-samples/reactive-processor/src/main/resources/application.yml
index 6faba4b..e9a9b04 100644
--- a/reactive-processor/src/main/resources/application.yml
+++ b/processor-samples/reactive-processor/src/main/resources/application.yml
@@ -1,5 +1,3 @@
-server:
- port: 8082
spring:
cloud:
stream:
diff --git a/processor-samples/sensor-average-reactive/.jdk8 b/processor-samples/sensor-average-reactive/.jdk8
new file mode 100644
index 0000000..e69de29
diff --git a/processor-samples/sensor-average-reactive/.mvn b/processor-samples/sensor-average-reactive/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/processor-samples/sensor-average-reactive/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/processor-samples/sensor-average-reactive/README.adoc b/processor-samples/sensor-average-reactive/README.adoc
new file mode 100644
index 0000000..1a72489
--- /dev/null
+++ b/processor-samples/sensor-average-reactive/README.adoc
@@ -0,0 +1,56 @@
+Spring Cloud Stream Reactive Processor Sensor Average Sample
+=============================================================
+
+This is a Spring Cloud Stream reactive processor sample that showcases a sensor temperature average calculator.
+
+## Requirements
+
+To run this sample, you will need to have installed:
+
+* Java 8 or above
+
+## Running the application
+
+The following instructions assume that you are running Kafka as a Docker image.
+
+* Go to the application root (not the repository root, but the root of this application)
+* `docker-compose up -d`
+
+* `./mvnw clean package`
+
+* `java -jar target/sensor-average-reactive-0.0.1-SNAPSHOT.jar`
+
+The main application contains the reactive processor that receives sensor data over a duration (3 seconds) and averages the readings.
+It then sends the average data (per sensor id) through the outbound destination of the processor.
+
+The application also provides a source and sink for testing.
+The test source generates some sensor data every 100 milliseconds, and the test sink verifies that the processor has calculated the averages.
+The test source is bound to the same broker destination that the processor listens on.
+Similarly, the test sink is bound to the same broker destination that the processor produces to.
+
+You will see output similar to the following on the console every 3 seconds.
+
+```
+Data received: {"id":100100,"average":89.0}
+Data received: {"id":100200,"average":84.0}
+Data received: {"id":100300,"average":88.0}
+Data received: {"id":100100,"average":85.0}
+Data received: {"id":100200,"average":85.0}
+Data received: {"id":100300,"average":83.0}
+Data received: {"id":100100,"average":80.0}
+Data received: {"id":100200,"average":90.0}
+```
+
+* `docker-compose down`
+
+## Running the application using Rabbit binder
+
+All the instructions above apply here as well, but instead of running the default `docker-compose.yml`, use the command below to start a RabbitMQ cluster.
+
+* `docker-compose -f docker-compose-rabbit.yml up -d`
+
+* `./mvnw clean package -P rabbit-binder`
+
+* `java -jar target/sensor-average-reactive-0.0.1-SNAPSHOT.jar`
+
+Once you are done testing: `docker-compose -f docker-compose-rabbit.yml down`
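The per-sensor averaging this processor performs each 3-second window can be sketched without Reactor as a plain Java reduction over one window's worth of readings (the class and method names here are illustrative, not from the sample):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class SensorAverageSketch {

    // A minimal stand-in for the sample's Sensor payload: an id and a temperature.
    record Reading(int id, int temperature) {}

    // Average the temperatures of one window's readings, grouped by sensor id,
    // mirroring what the reactive processor does per time window.
    static Map<Integer, Double> averagesById(List<Reading> window) {
        return window.stream().collect(
                Collectors.groupingBy(Reading::id,
                        Collectors.averagingInt(Reading::temperature)));
    }

    public static void main(String[] args) {
        List<Reading> window = List.of(
                new Reading(100100, 88), new Reading(100100, 90),
                new Reading(100200, 84));
        // Two readings for sensor 100100 average to 89.0; one for 100200 stays 84.0.
        System.out.println(averagesById(window));
    }
}
```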
diff --git a/processor-samples/sensor-average-reactive/docker-compose-rabbit.yml b/processor-samples/sensor-average-reactive/docker-compose-rabbit.yml
new file mode 100644
index 0000000..7c3da92
--- /dev/null
+++ b/processor-samples/sensor-average-reactive/docker-compose-rabbit.yml
@@ -0,0 +1,7 @@
+version: '3'
+services:
+ rabbitmq:
+ image: rabbitmq:management
+ ports:
+ - 5672:5672
+ - 15672:15672
\ No newline at end of file
diff --git a/multi-io/docker-compose.yml b/processor-samples/sensor-average-reactive/docker-compose.yml
similarity index 88%
rename from multi-io/docker-compose.yml
rename to processor-samples/sensor-average-reactive/docker-compose.yml
index d47d783..8f6e15e 100644
--- a/multi-io/docker-compose.yml
+++ b/processor-samples/sensor-average-reactive/docker-compose.yml
@@ -1,7 +1,8 @@
-version: '2'
+version: '3'
services:
kafka:
image: wurstmeister/kafka
+ container_name: kafka-sensor-avg
ports:
- "9092:9092"
environment:
diff --git a/multibinder-differentsystems/mvnw b/processor-samples/sensor-average-reactive/mvnw
similarity index 100%
rename from multibinder-differentsystems/mvnw
rename to processor-samples/sensor-average-reactive/mvnw
diff --git a/multibinder-differentsystems/mvnw.cmd b/processor-samples/sensor-average-reactive/mvnw.cmd
similarity index 100%
rename from multibinder-differentsystems/mvnw.cmd
rename to processor-samples/sensor-average-reactive/mvnw.cmd
diff --git a/processor-samples/sensor-average-reactive/pom.xml b/processor-samples/sensor-average-reactive/pom.xml
new file mode 100644
index 0000000..a11b45a
--- /dev/null
+++ b/processor-samples/sensor-average-reactive/pom.xml
@@ -0,0 +1,64 @@
+
+
+ 4.0.0
+
+ sensor-average-reactive
+ 0.0.1-SNAPSHOT
+ jar
+ sensor-average-reactive
+ Spring Cloud Stream Reactive Processor for Sensor Average
+
+
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOT
+ ../..
+
+
+
+
+ org.springframework.cloud
+ spring-cloud-stream-reactive
+
+
+ org.springframework.boot
+ spring-boot-starter-test
+ test
+
+
+
+
+
+ kafka-binder
+
+ true
+
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-kafka
+
+
+
+
+ rabbit-binder
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-rabbit
+
+
+
+
+
+
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+
+
+
+
+
diff --git a/processor-samples/sensor-average-reactive/src/main/java/sample/sensor/average/SensorAverageProcessorApplication.java b/processor-samples/sensor-average-reactive/src/main/java/sample/sensor/average/SensorAverageProcessorApplication.java
new file mode 100644
index 0000000..8fa0e33
--- /dev/null
+++ b/processor-samples/sensor-average-reactive/src/main/java/sample/sensor/average/SensorAverageProcessorApplication.java
@@ -0,0 +1,210 @@
+package sample.sensor.average;
+
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+import org.springframework.boot.SpringApplication;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.springframework.cloud.stream.annotation.EnableBinding;
+import org.springframework.cloud.stream.annotation.Input;
+import org.springframework.cloud.stream.annotation.Output;
+import org.springframework.cloud.stream.annotation.StreamListener;
+import org.springframework.cloud.stream.messaging.Processor;
+import org.springframework.context.annotation.Bean;
+import org.springframework.integration.annotation.InboundChannelAdapter;
+import org.springframework.integration.annotation.Poller;
+import org.springframework.integration.core.MessageSource;
+import org.springframework.messaging.MessageChannel;
+import org.springframework.messaging.SubscribableChannel;
+import org.springframework.messaging.support.GenericMessage;
+import reactor.core.publisher.Flux;
+import reactor.core.publisher.GroupedFlux;
+import reactor.core.publisher.Mono;
+
+import java.time.Duration;
+import java.util.Random;
+import java.util.concurrent.atomic.AtomicBoolean;
+
+@SpringBootApplication
+@EnableBinding(Processor.class)
+public class SensorAverageProcessorApplication {
+
+ public static void main(String[] args) {
+ SpringApplication.run(SensorAverageProcessorApplication.class, args);
+ }
+
+ @StreamListener
+ @Output(Processor.OUTPUT)
+ public Flux calculateAverage(@Input(Processor.INPUT) Flux data) {
+ return data.window(Duration.ofSeconds(3)).flatMap(
+ window -> window.groupBy(Sensor::getId).flatMap(this::calculateAverage));
+ }
+
+ private Mono calculateAverage(GroupedFlux group) {
+ return group
+ .reduce(new Accumulator(0, 0),
+ (a, d) -> new Accumulator(a.getCount() + 1, a.getTotalValue() + d.getTemperature()))
+ .map(accumulator -> new Average(group.key(), (accumulator.getTotalValue()) / accumulator.getCount()));
+ }
+
+ static class Accumulator {
+
+ private int count;
+
+ private int totalValue;
+
+ public Accumulator(int count, int totalValue) {
+ this.count = count;
+ this.totalValue = totalValue;
+ }
+
+ /**
+ * @return the count
+ */
+ public int getCount() {
+ return count;
+ }
+
+ /**
+ * @param count the count to set
+ */
+ public void setCount(int count) {
+ this.count = count;
+ }
+
+ /**
+ * @return the totalValue
+ */
+ public int getTotalValue() {
+ return totalValue;
+ }
+
+ /**
+ * @param totalValue the totalValue to set
+ */
+ public void setTotalValue(int totalValue) {
+ this.totalValue = totalValue;
+ }
+ }
+
+ static class Average {
+
+ private int id;
+
+ private double average;
+
+ public Average(int id, double average) {
+ this.id = id;
+ this.average = average;
+ }
+
+ /**
+ * @return the id
+ */
+ public int getId() {
+ return id;
+ }
+
+ /**
+ * @param id the id to set
+ */
+ public void setId(int id) {
+ this.id = id;
+ }
+
+ /**
+ * @return the average
+ */
+ public double getAverage() {
+ return average;
+ }
+
+ /**
+ * @param average the average to set
+ */
+ public void setAverage(double average) {
+ this.average = average;
+ }
+ }
+
+ static class Sensor {
+
+ private int id;
+
+ private int temperature;
+
+ /**
+ * @return the id
+ */
+ public int getId() {
+ return id;
+ }
+
+ /**
+ * @param id the id to set
+ */
+ public void setId(int id) {
+ this.id = id;
+ }
+
+ /**
+ * @return the temperature
+ */
+ public int getTemperature() {
+ return temperature;
+ }
+
+ /**
+ * @param temperature the temperature to set
+ */
+ public void setTemperature(int temperature) {
+ this.temperature = temperature;
+ }
+ }
+
+ //The following source and sink are used for testing only.
+ //Test source will send data to the same destination where the processor receives data
+ //Test sink will consume data from the same destination where the processor produces data
+
+ @EnableBinding(Source.class)
+ static class TestSource {
+
+ private AtomicBoolean semaphore = new AtomicBoolean(true);
+ private Random random = new Random();
+ private int[] ids = new int[]{100100, 100200, 100300};
+
+ @Bean
+ @InboundChannelAdapter(channel = "test-source", poller = @Poller(fixedDelay = "100"))
+ public MessageSource sendTestData() {
+
+ return () -> {
+ int id = ids[random.nextInt(3)];
+ int temperature = random.nextInt((102 - 65) + 1) + 65; // uniform in [65, 102]
+ Sensor sensor = new Sensor();
+ sensor.setId(id);
+ sensor.setTemperature(temperature);
+ return new GenericMessage<>(sensor);
+ };
+ }
+ }
+
+ @EnableBinding(Sink.class)
+ static class TestSink {
+
+ private final Log logger = LogFactory.getLog(getClass());
+
+ @StreamListener("test-sink")
+ public void receive(String payload) {
+ logger.info("Data received: " + payload);
+ }
+ }
+
+ public interface Sink {
+ @Input("test-sink")
+ SubscribableChannel sampleSink();
+ }
+
+ public interface Source {
+ @Output("test-source")
+ MessageChannel sampleSource();
+ }
+}
diff --git a/processor-samples/sensor-average-reactive/src/main/resources/application.yml b/processor-samples/sensor-average-reactive/src/main/resources/application.yml
new file mode 100644
index 0000000..bce2b3a
--- /dev/null
+++ b/processor-samples/sensor-average-reactive/src/main/resources/application.yml
@@ -0,0 +1,12 @@
+spring:
+ cloud:
+ stream:
+ bindings:
+ output:
+ destination: average
+ test-sink:
+ destination: average
+ input:
+ destination: sensor
+ test-source:
+ destination: sensor
diff --git a/processor-samples/streamlistener-basic/.mvn b/processor-samples/streamlistener-basic/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/processor-samples/streamlistener-basic/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/streamlistener-basic/README.adoc b/processor-samples/streamlistener-basic/README.adoc
similarity index 84%
rename from streamlistener-basic/README.adoc
rename to processor-samples/streamlistener-basic/README.adoc
index 75c22f9..dd6f3e7 100644
--- a/streamlistener-basic/README.adoc
+++ b/processor-samples/streamlistener-basic/README.adoc
@@ -55,4 +55,17 @@ Transforming the value to HI and with the type class demo.Bar
At the Sink
******************
Received transformed message HI of type class demo.Foo
-```
\ No newline at end of file
+```
+* `docker-compose down`
+
+## Running the application using Rabbit binder
+
+All the instructions above apply here as well, but instead of running the default `docker-compose.yml`, use the command below to start a RabbitMQ cluster.
+
+* `docker-compose -f docker-compose-rabbit.yml up -d`
+
+* `./mvnw clean package -P rabbit-binder`
+
+* `java -jar target/streamlistener-basic-0.0.1-SNAPSHOT.jar`
+
+Once you are done testing: `docker-compose -f docker-compose-rabbit.yml down`
\ No newline at end of file
diff --git a/processor-samples/streamlistener-basic/docker-compose-rabbit.yml b/processor-samples/streamlistener-basic/docker-compose-rabbit.yml
new file mode 100644
index 0000000..7c3da92
--- /dev/null
+++ b/processor-samples/streamlistener-basic/docker-compose-rabbit.yml
@@ -0,0 +1,7 @@
+version: '3'
+services:
+ rabbitmq:
+ image: rabbitmq:management
+ ports:
+ - 5672:5672
+ - 15672:15672
\ No newline at end of file
diff --git a/processor-samples/streamlistener-basic/docker-compose.yml b/processor-samples/streamlistener-basic/docker-compose.yml
new file mode 100644
index 0000000..1f221a2
--- /dev/null
+++ b/processor-samples/streamlistener-basic/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '3'
+services:
+ kafka:
+ image: wurstmeister/kafka
+ container_name: kafka-streamlistener
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
\ No newline at end of file
diff --git a/non-self-contained-aggregate-app/mvnw b/processor-samples/streamlistener-basic/mvnw
similarity index 100%
rename from non-self-contained-aggregate-app/mvnw
rename to processor-samples/streamlistener-basic/mvnw
diff --git a/non-self-contained-aggregate-app/mvnw.cmd b/processor-samples/streamlistener-basic/mvnw.cmd
similarity index 100%
rename from non-self-contained-aggregate-app/mvnw.cmd
rename to processor-samples/streamlistener-basic/mvnw.cmd
diff --git a/processor-samples/streamlistener-basic/pom.xml b/processor-samples/streamlistener-basic/pom.xml
new file mode 100644
index 0000000..02d12cb
--- /dev/null
+++ b/processor-samples/streamlistener-basic/pom.xml
@@ -0,0 +1,65 @@
+
+
+ 4.0.0
+
+ streamlistener-basic
+ 0.0.1-SNAPSHOT
+ jar
+ streamlistener-basic
+ Spring Cloud Stream StreamListener Basic Sample
+
+
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOT
+ ../..
+
+
+
+
+ org.springframework.boot
+ spring-boot-starter-test
+ test
+
+
+ org.springframework.cloud
+ spring-cloud-stream-test-support
+ test
+
+
+
+
+
+ kafka-binder
+
+ true
+
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-kafka
+
+
+
+
+ rabbit-binder
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-rabbit
+
+
+
+
+
+
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+
+
+
+
+
diff --git a/streamlistener-basic/src/main/java/demo/Bar.java b/processor-samples/streamlistener-basic/src/main/java/demo/Bar.java
similarity index 100%
rename from streamlistener-basic/src/main/java/demo/Bar.java
rename to processor-samples/streamlistener-basic/src/main/java/demo/Bar.java
diff --git a/streamlistener-basic/src/main/java/demo/Foo.java b/processor-samples/streamlistener-basic/src/main/java/demo/Foo.java
similarity index 100%
rename from streamlistener-basic/src/main/java/demo/Foo.java
rename to processor-samples/streamlistener-basic/src/main/java/demo/Foo.java
diff --git a/streamlistener-basic/src/main/java/demo/SampleSink.java b/processor-samples/streamlistener-basic/src/main/java/demo/SampleSink.java
similarity index 79%
rename from streamlistener-basic/src/main/java/demo/SampleSink.java
rename to processor-samples/streamlistener-basic/src/main/java/demo/SampleSink.java
index 1bff86e..0e92a6c 100644
--- a/streamlistener-basic/src/main/java/demo/SampleSink.java
+++ b/processor-samples/streamlistener-basic/src/main/java/demo/SampleSink.java
@@ -16,6 +16,8 @@
package demo;
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;
@@ -27,13 +29,13 @@ import org.springframework.messaging.SubscribableChannel;
@EnableBinding(SampleSink.Sink.class)
public class SampleSink {
+ private final Log logger = LogFactory.getLog(getClass());
+
// Sink application definition
@StreamListener(Sink.SAMPLE)
public void receive(Foo foo) {
- System.out.println("******************");
- System.out.println("At the Sink");
- System.out.println("******************");
- System.out.println("Received transformed message " + foo.getValue() + " of type " + foo.getClass());
+ logger.info("******************\nAt the Sink\n******************");
+ logger.info("Received transformed message " + foo.getValue() + " of type " + foo.getClass());
}
public interface Sink {
diff --git a/streamlistener-basic/src/main/java/demo/SampleSource.java b/processor-samples/streamlistener-basic/src/main/java/demo/SampleSource.java
similarity index 86%
rename from streamlistener-basic/src/main/java/demo/SampleSource.java
rename to processor-samples/streamlistener-basic/src/main/java/demo/SampleSource.java
index ab7da49..208d943 100644
--- a/streamlistener-basic/src/main/java/demo/SampleSource.java
+++ b/processor-samples/streamlistener-basic/src/main/java/demo/SampleSource.java
@@ -16,6 +16,8 @@
package demo;
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Output;
import org.springframework.context.annotation.Bean;
@@ -32,16 +34,16 @@ import org.springframework.messaging.support.MessageBuilder;
@EnableBinding(SampleSource.Source.class)
public class SampleSource {
+ private final Log logger = LogFactory.getLog(getClass());
+
@Bean
@InboundChannelAdapter(value = Source.SAMPLE, poller = @Poller(fixedDelay = "1000", maxMessagesPerPoll = "1"))
public MessageSource timerMessageSource() {
return new MessageSource() {
public Message receive() {
- System.out.println("******************");
- System.out.println("At the Source");
- System.out.println("******************");
+ logger.info("******************\nAt the Source\n******************");
String value = "{\"value\":\"hi\"}";
- System.out.println("Sending value: " + value);
+ logger.info("Sending value: " + value);
return MessageBuilder.withPayload(value).build();
}
};
diff --git a/streamlistener-basic/src/main/java/demo/SampleTransformer.java b/processor-samples/streamlistener-basic/src/main/java/demo/SampleTransformer.java
similarity index 74%
rename from streamlistener-basic/src/main/java/demo/SampleTransformer.java
rename to processor-samples/streamlistener-basic/src/main/java/demo/SampleTransformer.java
index c75c3ee..7216bde 100644
--- a/streamlistener-basic/src/main/java/demo/SampleTransformer.java
+++ b/processor-samples/streamlistener-basic/src/main/java/demo/SampleTransformer.java
@@ -16,6 +16,8 @@
package demo;
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
@@ -29,14 +31,14 @@ public class SampleTransformer {
private static final String TRANSFORMATION_VALUE = "HI";
+ private final Log logger = LogFactory.getLog(getClass());
+
@StreamListener(Processor.INPUT)
@SendTo(Processor.OUTPUT)
public Bar receive(Bar bar) {
- System.out.println("******************");
- System.out.println("At the transformer");
- System.out.println("******************");
- System.out.println("Received value "+ bar.getValue() + " of type " + bar.getClass());
- System.out.println("Transforming the value to " + TRANSFORMATION_VALUE + " and with the type " + bar.getClass());
+ logger.info("******************\nAt the transformer\n******************");
+ logger.info("Received value "+ bar.getValue() + " of type " + bar.getClass());
+ logger.info("Transforming the value to " + TRANSFORMATION_VALUE + " and with the type " + bar.getClass());
bar.setValue(TRANSFORMATION_VALUE);
return bar;
}
diff --git a/streamlistener-basic/src/main/java/demo/TypeConversionApplication.java b/processor-samples/streamlistener-basic/src/main/java/demo/TypeConversionApplication.java
similarity index 100%
rename from streamlistener-basic/src/main/java/demo/TypeConversionApplication.java
rename to processor-samples/streamlistener-basic/src/main/java/demo/TypeConversionApplication.java
diff --git a/streamlistener-basic/src/main/resources/application.yml b/processor-samples/streamlistener-basic/src/main/resources/application.yml
similarity index 100%
rename from streamlistener-basic/src/main/resources/application.yml
rename to processor-samples/streamlistener-basic/src/main/resources/application.yml
diff --git a/streamlistener-basic/src/test/java/demo/ModuleApplicationTests.java b/processor-samples/streamlistener-basic/src/test/java/demo/ModuleApplicationTests.java
similarity index 100%
rename from streamlistener-basic/src/test/java/demo/ModuleApplicationTests.java
rename to processor-samples/streamlistener-basic/src/test/java/demo/ModuleApplicationTests.java
diff --git a/processor-samples/uppercase-transformer/.mvn b/processor-samples/uppercase-transformer/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/processor-samples/uppercase-transformer/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/uppercase-transformer/README.adoc b/processor-samples/uppercase-transformer/README.adoc
similarity index 64%
rename from uppercase-transformer/README.adoc
rename to processor-samples/uppercase-transformer/README.adoc
index e4b1520..96a8fc8 100644
--- a/uppercase-transformer/README.adoc
+++ b/processor-samples/uppercase-transformer/README.adoc
@@ -20,7 +20,7 @@ The following instructions assume that you are running Kafka as a Docker image.
* `java -jar target/uppercase-transformer-0.0.1-SNAPSHOT.jar`
-The main application is the uppercase transfomer which is a processor.
+The main application is the uppercase transformer which is a processor.
The application also provides a source and sink for testing.
Test source will generate some text every second and the test sink will verify that the processor has converted the text into its uppercase.
@@ -36,4 +36,17 @@ Data received: FOO
Data received: BAR
Data received: FOO
Data received: BAR
-```
\ No newline at end of file
+```
+* `docker-compose down`
+
+## Running the application using Rabbit binder
+
+All the instructions above apply here as well, but instead of running the default `docker-compose.yml`, use the command below to start a RabbitMQ cluster.
+
+* `docker-compose -f docker-compose-rabbit.yml up -d`
+
+* `./mvnw clean package -P rabbit-binder`
+
+* `java -jar target/uppercase-transformer-0.0.1-SNAPSHOT.jar`
+
+Once you are done testing: `docker-compose -f docker-compose-rabbit.yml down`
\ No newline at end of file
diff --git a/processor-samples/uppercase-transformer/docker-compose-rabbit.yml b/processor-samples/uppercase-transformer/docker-compose-rabbit.yml
new file mode 100644
index 0000000..7c3da92
--- /dev/null
+++ b/processor-samples/uppercase-transformer/docker-compose-rabbit.yml
@@ -0,0 +1,7 @@
+version: '3'
+services:
+ rabbitmq:
+ image: rabbitmq:management
+ ports:
+ - 5672:5672
+ - 15672:15672
\ No newline at end of file
diff --git a/non-self-contained-aggregate-app/docker-compose.yml b/processor-samples/uppercase-transformer/docker-compose.yml
similarity index 88%
rename from non-self-contained-aggregate-app/docker-compose.yml
rename to processor-samples/uppercase-transformer/docker-compose.yml
index d47d783..a7b9943 100644
--- a/non-self-contained-aggregate-app/docker-compose.yml
+++ b/processor-samples/uppercase-transformer/docker-compose.yml
@@ -1,7 +1,8 @@
-version: '2'
+version: '3'
services:
kafka:
image: wurstmeister/kafka
+ container_name: kafka-uppercase-tx
ports:
- "9092:9092"
environment:
diff --git a/reactive-processor/mvnw b/processor-samples/uppercase-transformer/mvnw
similarity index 100%
rename from reactive-processor/mvnw
rename to processor-samples/uppercase-transformer/mvnw
diff --git a/reactive-processor/mvnw.cmd b/processor-samples/uppercase-transformer/mvnw.cmd
similarity index 100%
rename from reactive-processor/mvnw.cmd
rename to processor-samples/uppercase-transformer/mvnw.cmd
diff --git a/processor-samples/uppercase-transformer/pom.xml b/processor-samples/uppercase-transformer/pom.xml
new file mode 100644
index 0000000..5d61171
--- /dev/null
+++ b/processor-samples/uppercase-transformer/pom.xml
@@ -0,0 +1,63 @@
+
+
+ 4.0.0
+
+ uppercase-transformer
+ 0.0.1-SNAPSHOT
+ jar
+ uppercase-transformer
+ Spring Cloud Stream Uppercase Transformer
+
+
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOT
+ ../..
+
+
+
+
+ org.springframework.boot
+ spring-boot-starter-test
+ test
+
+
+ org.springframework.cloud
+ spring-cloud-stream-test-support
+ test
+
+
+
+
+
+ kafka-binder
+
+ true
+
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-kafka
+
+
+
+
+ rabbit-binder
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-rabbit
+
+
+
+
+
+
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+
+
+
+
diff --git a/uppercase-transformer/src/main/java/demo/UppercaseTransformer.java b/processor-samples/uppercase-transformer/src/main/java/demo/UppercaseTransformer.java
similarity index 100%
rename from uppercase-transformer/src/main/java/demo/UppercaseTransformer.java
rename to processor-samples/uppercase-transformer/src/main/java/demo/UppercaseTransformer.java
diff --git a/uppercase-transformer/src/main/java/demo/UppercaseTransformerApplication.java b/processor-samples/uppercase-transformer/src/main/java/demo/UppercaseTransformerApplication.java
similarity index 100%
rename from uppercase-transformer/src/main/java/demo/UppercaseTransformerApplication.java
rename to processor-samples/uppercase-transformer/src/main/java/demo/UppercaseTransformerApplication.java
diff --git a/uppercase-transformer/src/main/resources/application.yml b/processor-samples/uppercase-transformer/src/main/resources/application.yml
similarity index 100%
rename from uppercase-transformer/src/main/resources/application.yml
rename to processor-samples/uppercase-transformer/src/main/resources/application.yml
diff --git a/uppercase-transformer/src/test/java/demo/ModuleApplicationTests.java b/processor-samples/uppercase-transformer/src/test/java/demo/ModuleApplicationTests.java
similarity index 100%
rename from uppercase-transformer/src/test/java/demo/ModuleApplicationTests.java
rename to processor-samples/uppercase-transformer/src/test/java/demo/ModuleApplicationTests.java
diff --git a/reactive-processor/.mvn b/reactive-processor/.mvn
deleted file mode 120000
index 19172e1..0000000
--- a/reactive-processor/.mvn
+++ /dev/null
@@ -1 +0,0 @@
-../.mvn
\ No newline at end of file
diff --git a/reactive-processor/pom.xml b/reactive-processor/pom.xml
deleted file mode 100644
index 83b6d7d..0000000
--- a/reactive-processor/pom.xml
+++ /dev/null
@@ -1,100 +0,0 @@
-
-
- 4.0.0
-
- spring.cloud.stream.samples
- reactive-processor
- 0.0.1-SNAPSHOT
- jar
-
- reactive-processor
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter-actuator
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
- org.springframework.cloud
- spring-cloud-stream-reactive
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-snapshots
- Spring Snapshots
- http://repo.spring.io/libs-snapshot-local
-
- true
-
-
- false
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
-
diff --git a/reactive-processor/start-kafka-shell.sh b/reactive-processor/start-kafka-shell.sh
deleted file mode 100755
index 62663e4..0000000
--- a/reactive-processor/start-kafka-shell.sh
+++ /dev/null
@@ -1,2 +0,0 @@
-#!/bin/bash
-docker run --rm -v /var/run/docker.sock:/var/run/docker.sock -e HOST_IP=$1 -e ZK=$2 -i -t wurstmeister/kafka /bin/bash
diff --git a/jdbc-sink/.mvn b/samples-acceptance-tests/.mvn
similarity index 100%
rename from jdbc-sink/.mvn
rename to samples-acceptance-tests/.mvn
diff --git a/samples-acceptance-tests/README.adoc b/samples-acceptance-tests/README.adoc
new file mode 100644
index 0000000..cb0f8e8
--- /dev/null
+++ b/samples-acceptance-tests/README.adoc
@@ -0,0 +1,10 @@
+=== Samples Acceptance Tests
+
+This is an acceptance test module for the samples in this repo.
+The tests launch the Spring Cloud Stream samples as standalone Spring Boot applications and then verify their correctness.
+
+By default, these tests are not run as part of the normal build, as they are mainly intended for continuous integration testing with ongoing changes in the framework.
+
+In order to run the tests, we recommend running the script `./runAcceptanceTests.sh` in this directory.
+The script first launches all the middleware and other components in Docker containers.
+Then it builds the applications and runs them.
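+
+At a high level, the script is roughly equivalent to the following manual steps:
+
+[source,bash]
+----
+docker-compose up -d                    # start Kafka, Zookeeper, RabbitMQ and MariaDB containers
+# build each sample (Kafka and Rabbit binder variants) and copy the jars to /tmp
+./mvnw clean package -Dmaven.test.skip=false   # run the acceptance tests against the jars
+docker-compose down                     # tear the containers back down
+----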
\ No newline at end of file
diff --git a/samples-acceptance-tests/docker-compose.yml b/samples-acceptance-tests/docker-compose.yml
new file mode 100644
index 0000000..8bece76
--- /dev/null
+++ b/samples-acceptance-tests/docker-compose.yml
@@ -0,0 +1,54 @@
+version: '3'
+volumes:
+ data-volume: {}
+services:
+ mysql:
+ image: mariadb
+ ports:
+ - "3306:3306"
+ environment:
+ MYSQL_ROOT_PASSWORD: pwd
+ MYSQL_DATABASE: sample_mysql_db
+ volumes:
+ - data-volume:/var/lib/mysql
+ kafka:
+ image: wurstmeister/kafka
+ ports:
+ - "9092:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
+ depends_on:
+ - zookeeper
+ zookeeper:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2181:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper
+ rabbitmq:
+ image: rabbitmq:management
+ ports:
+ - 5672:5672
+ - 15672:15672
+
+ # used for multi Kafka cluster testing
+
+ kafka2:
+ image: wurstmeister/kafka
+ container_name: kafka-2
+ ports:
+ - "9093:9092"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=127.0.0.1
+ - KAFKA_ADVERTISED_PORT=9092
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper2:2181
+ depends_on:
+ - zookeeper2
+ zookeeper2:
+ image: wurstmeister/zookeeper
+ ports:
+ - "2182:2181"
+ environment:
+ - KAFKA_ADVERTISED_HOST_NAME=zookeeper2
\ No newline at end of file
diff --git a/self-contained-aggregate-app/mvnw b/samples-acceptance-tests/mvnw
similarity index 100%
rename from self-contained-aggregate-app/mvnw
rename to samples-acceptance-tests/mvnw
diff --git a/self-contained-aggregate-app/mvnw.cmd b/samples-acceptance-tests/mvnw.cmd
similarity index 100%
rename from self-contained-aggregate-app/mvnw.cmd
rename to samples-acceptance-tests/mvnw.cmd
diff --git a/samples-acceptance-tests/pom.xml b/samples-acceptance-tests/pom.xml
new file mode 100644
index 0000000..0e6541d
--- /dev/null
+++ b/samples-acceptance-tests/pom.xml
@@ -0,0 +1,45 @@
+
+
+ 4.0.0
+
+ samples-acceptance-tests
+ 0.0.1-SNAPSHOT
+ jar
+ samples-acceptance-tests
+ Spring Cloud Stream Samples Acceptance Tests
+
+
+ org.springframework.cloud
+ spring-cloud-build
+ 2.0.0.BUILD-SNAPSHOT
+
+
+
+ true
+
+
+
+
+ org.springframework.boot
+ spring-boot-starter-web
+ test
+
+
+ org.springframework.boot
+ spring-boot-starter-test
+ test
+
+
+ org.springframework.boot
+ spring-boot-starter-jdbc
+ test
+
+
+ org.mariadb.jdbc
+ mariadb-java-client
+ 1.1.9
+ test
+
+
+
+
diff --git a/samples-acceptance-tests/runAcceptanceTests.sh b/samples-acceptance-tests/runAcceptanceTests.sh
new file mode 100755
index 0000000..03e739f
--- /dev/null
+++ b/samples-acceptance-tests/runAcceptanceTests.sh
@@ -0,0 +1,166 @@
+#!/bin/bash
+
+pushd () {
+ command pushd "$@" > /dev/null
+}
+
+popd () {
+ command popd "$@" > /dev/null
+}
+
+function prepare_jdbc_source_with_kafka_and_rabbit_binders() {
+ pushd ../source-samples/jdbc-source
+./mvnw clean package -DskipTests
+
+cp target/jdbc-source-*-SNAPSHOT.jar /tmp/jdbc-source-kafka-sample.jar
+
+./mvnw clean package -P rabbit-binder -DskipTests
+
+cp target/jdbc-source-*-SNAPSHOT.jar /tmp/jdbc-source-rabbit-sample.jar
+
+popd
+
+}
+
+function prepare_jdbc_sink_with_kafka_and_rabbit_binders() {
+ pushd ../sink-samples/jdbc-sink
+./mvnw clean package -DskipTests
+
+cp target/jdbc-sink-*-SNAPSHOT.jar /tmp/jdbc-sink-kafka-sample.jar
+
+./mvnw clean package -P rabbit-binder -DskipTests
+
+cp target/jdbc-sink-*-SNAPSHOT.jar /tmp/jdbc-sink-rabbit-sample.jar
+
+popd
+
+}
+
+function prepare_dynamic_source_with_kafka_and_rabbit_binders() {
+ pushd ../source-samples/dynamic-destination-source
+./mvnw clean package -DskipTests
+
+cp target/dynamic-destination-source-*-SNAPSHOT.jar /tmp/dynamic-destination-source-kafka-sample.jar
+
+./mvnw clean package -P rabbit-binder -DskipTests
+
+cp target/dynamic-destination-source-*-SNAPSHOT.jar /tmp/dynamic-destination-source-rabbit-sample.jar
+
+popd
+
+}
+
+function prepare_multi_binder_with_kafka_rabbit() {
+ pushd ../multibinder-samples/multibinder-kafka-rabbit
+./mvnw clean package -DskipTests
+
+cp target/multibinder-kafka-rabbit-*-SNAPSHOT.jar /tmp/multibinder-kafka-rabbit-sample.jar
+
+popd
+
+}
+
+function prepare_multi_binder_with_two_kafka_clusters() {
+ pushd ../multibinder-samples/multibinder-two-kafka-clusters
+./mvnw clean package -DskipTests
+
+cp target/multibinder-two-kafka-clusters-*-SNAPSHOT.jar /tmp/multibinder-two-kafka-clusters-sample.jar
+
+popd
+
+}
+
+function prepare_kafka_streams_word_count() {
+ pushd ../kafka-streams-samples/kafka-streams-word-count
+./mvnw clean package -DskipTests
+
+cp target/kafka-streams-word-count-*-SNAPSHOT.jar /tmp/kafka-streams-word-count-sample.jar
+
+popd
+
+}
+
+function prepare_streamlistener_basic_with_kafka_rabbit_binders() {
+pushd ../processor-samples/streamlistener-basic
+./mvnw clean package -DskipTests
+
+cp target/streamlistener-basic-*-SNAPSHOT.jar /tmp/streamlistener-basic-kafka-sample.jar
+
+./mvnw clean package -P rabbit-binder -DskipTests
+
+cp target/streamlistener-basic-*-SNAPSHOT.jar /tmp/streamlistener-basic-rabbit-sample.jar
+
+popd
+
+}
+
+function prepare_reactive_processor_with_kafka_rabbit_binders() {
+pushd ../processor-samples/reactive-processor
+./mvnw clean package -DskipTests
+
+cp target/reactive-processor-*-SNAPSHOT.jar /tmp/reactive-processor-kafka-sample.jar
+
+./mvnw clean package -P rabbit-binder -DskipTests
+
+cp target/reactive-processor-*-SNAPSHOT.jar /tmp/reactive-processor-rabbit-sample.jar
+
+popd
+
+}
+
+function prepare_sensor_average_reactive_with_kafka_rabbit_binders() {
+pushd ../processor-samples/sensor-average-reactive
+./mvnw clean package -DskipTests
+
+cp target/sensor-average-reactive-*-SNAPSHOT.jar /tmp/sensor-average-reactive-kafka-sample.jar
+
+./mvnw clean package -P rabbit-binder -DskipTests
+
+cp target/sensor-average-reactive-*-SNAPSHOT.jar /tmp/sensor-average-reactive-rabbit-sample.jar
+
+popd
+
+}
+
+#Main script starting
+
+echo "Starting Kafka broker as a Docker container..."
+
+docker-compose up -d
+
+prepare_jdbc_source_with_kafka_and_rabbit_binders
+prepare_jdbc_sink_with_kafka_and_rabbit_binders
+prepare_dynamic_source_with_kafka_and_rabbit_binders
+prepare_multi_binder_with_kafka_rabbit
+prepare_multi_binder_with_two_kafka_clusters
+prepare_kafka_streams_word_count
+prepare_streamlistener_basic_with_kafka_rabbit_binders
+prepare_reactive_processor_with_kafka_rabbit_binders
+prepare_sensor_average_reactive_with_kafka_rabbit_binders
+
+echo "Running tests"
+
+./mvnw clean package -Dmaven.test.skip=false
+
+docker-compose down
+
+# Post cleanup
+
+rm /tmp/jdbc-source-kafka-sample.jar
+rm /tmp/jdbc-source-rabbit-sample.jar
+rm /tmp/jdbc-sink-kafka-sample.jar
+rm /tmp/jdbc-sink-rabbit-sample.jar
+rm /tmp/dynamic-destination-source-kafka-sample.jar
+rm /tmp/dynamic-destination-source-rabbit-sample.jar
+rm /tmp/multibinder-kafka-rabbit-sample.jar
+rm /tmp/multibinder-two-kafka-clusters-sample.jar
+rm /tmp/kafka-streams-word-count-sample.jar
+rm /tmp/streamlistener-basic-kafka-sample.jar
+rm /tmp/streamlistener-basic-rabbit-sample.jar
+rm /tmp/reactive-processor-kafka-sample.jar
+rm /tmp/reactive-processor-rabbit-sample.jar
+rm /tmp/sensor-average-reactive-kafka-sample.jar
+rm /tmp/sensor-average-reactive-rabbit-sample.jar
+
+rm /tmp/foobar.log
\ No newline at end of file
diff --git a/samples-acceptance-tests/src/test/java/sample/acceptance/tests/SampleAcceptanceTests.java b/samples-acceptance-tests/src/test/java/sample/acceptance/tests/SampleAcceptanceTests.java
new file mode 100644
index 0000000..43e40c1
--- /dev/null
+++ b/samples-acceptance-tests/src/test/java/sample/acceptance/tests/SampleAcceptanceTests.java
@@ -0,0 +1,352 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package sample.acceptance.tests;
+
+import org.junit.After;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.jdbc.core.JdbcTemplate;
+import org.springframework.jdbc.datasource.SingleConnectionDataSource;
+import org.springframework.util.StringUtils;
+import org.springframework.web.client.HttpClientErrorException;
+import org.springframework.web.client.RestTemplate;
+
+import javax.sql.DataSource;
+import java.util.stream.Stream;
+
+import static org.junit.Assert.fail;
+
+/**
+ * Do not run these tests as part of an IDE build or individually.
+ * These are acceptance tests for the spring cloud stream samples.
+ * The recommended way to run these tests is by using the runAcceptanceTests.sh script in this module.
+ * More about running that script can be found in the README.
+ *
+ * @author Soby Chacko
+ */
+public class SampleAcceptanceTests {
+
+ private static final Logger logger = LoggerFactory.getLogger(SampleAcceptanceTests.class);
+
+ private Process process;
+
+ private Process startTheApp(String[] cmds) throws Exception {
+ ProcessBuilder pb = new ProcessBuilder(cmds);
+ process = pb.start();
+ return process;
+ }
+
+ @After
+ public void stopTheApp() {
+ if (process != null) {
+ process.destroyForcibly();
+ }
+ }
+
+ private void waitForExpectedMessagesToAppearInTheLogs(String app, String... textToSearch) {
+ boolean foundAssertionStrings = waitForLogEntry(app, textToSearch);
+ if (!foundAssertionStrings) {
+ fail("Did not find the expected text after waiting for 30 seconds");
+ }
+ }
+
+ private void waitForAppToStartFully(String app, String message) {
+ boolean started = waitForLogEntry(app, message);
+ if (!started) {
+ fail("process didn't start in 30 seconds");
+ }
+ }
+
+ @Test
+ public void testJdbcSourceSampleKafka() throws Exception {
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/jdbc-source-kafka-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("JDBC Source", "Started SampleJdbcSource in");
+ waitForExpectedMessagesToAppearInTheLogs("JDBC Source",
+ "Data received...[{id=1, name=Bob, tag=null}, {id=2, name=Jane, tag=null}, {id=3, name=John, tag=null}]");
+ }
+
+ @Test
+ public void testJdbcSourceSampleRabbit() throws Exception {
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/jdbc-source-rabbit-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("JDBC Source", "Started SampleJdbcSource in");
+ waitForExpectedMessagesToAppearInTheLogs("JDBC Source",
+ "Data received...[{id=1, name=Bob, tag=null}, {id=2, name=Jane, tag=null}, {id=3, name=John, tag=null}]");
+ }
+
+ @Test
+ public void testJdbcSinkSampleKafka() throws Exception {
+
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/jdbc-sink-kafka-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("JDBC Sink", "Started SampleJdbcSink in");
+
+ verifyJdbcSink();
+ }
+
+ @Test
+ public void testJdbcSinkSampleRabbit() throws Exception {
+
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/jdbc-sink-rabbit-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("JDBC Sink", "Started SampleJdbcSink in");
+
+ verifyJdbcSink();
+ }
+
+ private void verifyJdbcSink() {
+ JdbcTemplate db;
+ DataSource dataSource = new SingleConnectionDataSource("jdbc:mariadb://localhost:3306/sample_mysql_db",
+ "root", "pwd", false);
+
+ db = new JdbcTemplate(dataSource);
+
+ long timeout = System.currentTimeMillis() + (30 * 1000);
+ boolean exists = false;
+ while (!exists && System.currentTimeMillis() < timeout) {
+ try {
+ Thread.sleep(5 * 1000);
+ } catch (InterruptedException e) {
+ Thread.currentThread().interrupt();
+ throw new IllegalStateException(e.getMessage(), e);
+ }
+
+ Integer count = db.queryForObject("select count(*) from test", Integer.class);
+
+ if (count > 0) {
+ exists = true;
+ }
+ }
+ if (!exists) {
+ fail("No records found in database!");
+ }
+ }
+
+ @Test
+ public void testDynamicSourceSampleKafka() throws Exception {
+
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/dynamic-destination-source-kafka-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ verifyDynamicSourceApp();
+ }
+
+ @Test
+ public void testDynamicSourceSampleRabbit() throws Exception {
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/dynamic-destination-source-rabbit-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ verifyDynamicSourceApp();
+ }
+
+ private void verifyDynamicSourceApp() {
+ waitForAppToStartFully("Dynamic Source", "Started SourceApplication in");
+ RestTemplate restTemplate = new RestTemplate();
+ restTemplate.postForObject(
+ "http://localhost:8080",
+ "{\"id\":\"customerId-1\",\"bill-pay\":\"100\"}", String.class);
+
+ waitForExpectedMessagesToAppearInTheLogs("Dynamic Source",
+ "Data received from customer-1...{\"id\":\"customerId-1\",\"bill-pay\":\"100\"}");
+
+ restTemplate.postForObject(
+ "http://localhost:8080",
+ "{\"id\":\"customerId-2\",\"bill-pay2\":\"200\"}", String.class);
+
+ waitForExpectedMessagesToAppearInTheLogs("Dynamic Source",
+ "Data received from customer-2...{\"id\":\"customerId-2\",\"bill-pay2\":\"200\"}");
+ }
+
+ @Test
+ public void testMultiBinderKafkaInputRabbitOutput() throws Exception {
+ startTheApp(new String[]{"java", "-jar", "/tmp/multibinder-kafka-rabbit-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"});
+
+ waitForAppToStartFully("Multibinder", "Started MultibinderApplication in");
+
+ waitForExpectedMessagesToAppearInTheLogs("Multibinder", "Data received...bar", "Data received...foo");
+ }
+
+ @Test
+ public void testMultiBinderTwoKafkaClusters() throws Exception {
+
+ startTheApp(new String[]{"java", "-jar", "/tmp/multibinder-two-kafka-clusters-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*",
+ "--kafkaBroker1=localhost:9092", "--zk1=localhost:2181",
+ "--kafkaBroker2=localhost:9093", "--zk2=localhost:2182"});
+
+ waitForAppToStartFully("Multibinder 2 Kafka Clusters", "Started MultibinderApplication in");
+
+ waitForExpectedMessagesToAppearInTheLogs("Multibinder 2 Kafka Clusters", "Data received...bar", "Data received...foo");
+ }
+
+ @Test
+ public void testStreamListenerBasicSampleKafka() throws Exception {
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/streamlistener-basic-kafka-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("Streamlistener basic", "Started TypeConversionApplication in");
+ waitForExpectedMessagesToAppearInTheLogs("Streamlistener basic",
+ "At the Source", "Sending value: {\"value\":\"hi\"}", "At the transformer",
+ "Received value hi of type class demo.Bar",
+ "Transforming the value to HI and with the type class demo.Bar",
+ "At the Sink",
+ "Received transformed message HI of type class demo.Foo");
+ }
+
+ @Test
+ public void testStreamListenerBasicSampleRabbit() throws Exception {
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/streamlistener-basic-rabbit-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("Streamlistener basic", "Started TypeConversionApplication in");
+ waitForExpectedMessagesToAppearInTheLogs("Streamlistener basic",
+ "At the Source", "Sending value: {\"value\":\"hi\"}", "At the transformer",
+ "Received value hi of type class demo.Bar",
+ "Transforming the value to HI and with the type class demo.Bar",
+ "At the Sink",
+ "Received transformed message HI of type class demo.Foo");
+ }
+
+ @Test
+ public void testReactiveProcessorSampleKafka() throws Exception {
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/reactive-processor-kafka-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("Reactive processor", "Started ReactiveProcessorApplication in");
+ waitForExpectedMessagesToAppearInTheLogs("Reactive processor",
+ "Data received: foobarfoobarfoo",
+ "Data received: barfoobarfoobar");
+ }
+
+ @Test
+ public void testReactiveProcessorSampleRabbit() throws Exception {
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/reactive-processor-rabbit-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("Reactive processor", "Started ReactiveProcessorApplication in");
+ waitForExpectedMessagesToAppearInTheLogs("Reactive processor",
+ "Data received: foobarfoobarfoo",
+ "Data received: barfoobarfoobar");
+ }
+
+ @Test
+ public void testSensorAverageReactiveSampleKafka() throws Exception {
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/sensor-average-reactive-kafka-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("Sensor average", "Started SensorAverageProcessorApplication in");
+ waitForExpectedMessagesToAppearInTheLogs("Sensor average",
+ "Data received: {\"id\":100100,\"average\":",
+ "Data received: {\"id\":100200,\"average\":", "Data received: {\"id\":100300,\"average\":");
+ }
+
+ @Test
+ public void testSensorAverageReactiveSampleRabbit() throws Exception {
+ process = startTheApp(new String[]{
+ "java", "-jar", "/tmp/sensor-average-reactive-rabbit-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*"
+ });
+ waitForAppToStartFully("Sensor average", "Started SensorAverageProcessorApplication in");
+ waitForExpectedMessagesToAppearInTheLogs("Sensor average",
+ "Data received: {\"id\":100100,\"average\":",
+ "Data received: {\"id\":100200,\"average\":", "Data received: {\"id\":100300,\"average\":");
+ }
+
+ @Test
+ public void testKafkaStreamsWordCount() throws Exception {
+ startTheApp(new String[]{"java", "-jar", "/tmp/kafka-streams-word-count-sample.jar", "--logging.file=/tmp/foobar.log",
+ "--management.endpoints.web.exposure.include=*",
+ "--spring.cloud.stream.kafka.streams.timeWindow.length=60000"});
+
+ waitForAppToStartFully("Kafka Streams WordCount", "Started KafkaStreamsWordCountApplication in");
+
+ waitForExpectedMessagesToAppearInTheLogs("Kafka Streams WordCount",
+ "Data received...{\"word\":\"foo\",\"count\":1,",
+ "Data received...{\"word\":\"bar\",\"count\":1,",
+ "Data received...{\"word\":\"foobar\",\"count\":1,",
+ "Data received...{\"word\":\"baz\",\"count\":1,",
+ "Data received...{\"word\":\"fox\",\"count\":1,");
+ }
+
+ boolean waitForLogEntry(String app, String... entries) {
+ logger.info("Looking for '" + StringUtils.arrayToCommaDelimitedString(entries) + "' in logfile for " + app);
+ long timeout = System.currentTimeMillis() + (30 * 1000);
+ boolean exists = false;
+ while (!exists && System.currentTimeMillis() < timeout) {
+ try {
+ Thread.sleep(7 * 1000);
+ } catch (InterruptedException e) {
+ Thread.currentThread().interrupt();
+ throw new IllegalStateException(e.getMessage(), e);
+ }
+ if (!exists) {
+ logger.info("Polling to get log file. Remaining poll time = "
+ + (timeout - System.currentTimeMillis() + " ms."));
+ String log = getLog("http://localhost:8080/actuator");
+ if (log != null) {
+ if (Stream.of(entries).allMatch(s -> log.contains(s))) {
+ exists = true;
+ }
+ }
+ }
+ }
+ if (exists) {
+ logger.info("Matched all '" + StringUtils.arrayToCommaDelimitedString(entries) + "' in logfile for app " + app);
+ } else {
+ logger.error("ERROR: Couldn't find all '" + StringUtils.arrayToCommaDelimitedString(entries) + "' in logfile for " + app);
+ }
+ return exists;
+ }
+
+ String getLog(String url) {
+ RestTemplate restTemplate = new RestTemplate();
+ String logFileUrl = String.format("%s/logfile", url);
+ String log = null;
+ try {
+ log = restTemplate.getForObject(logFileUrl, String.class);
+ if (log == null) {
+ logger.info("Unable to retrieve logfile from '" + logFileUrl + "'");
+ } else {
+ logger.info("Retrieved logfile from '" + logFileUrl + "'");
+ }
+ } catch (HttpClientErrorException e) {
+ logger.info("Failed to access logfile from '" + logFileUrl + "' due to : " + e.getMessage());
+ } catch (Exception e) {
+ logger.warn("Error while trying to access logfile from '" + logFileUrl + "' due to : " + e);
+ }
+ return log;
+ }
+
+}
diff --git a/self-contained-aggregate-app/.mvn b/self-contained-aggregate-app/.mvn
deleted file mode 120000
index 19172e1..0000000
--- a/self-contained-aggregate-app/.mvn
+++ /dev/null
@@ -1 +0,0 @@
-../.mvn
\ No newline at end of file
diff --git a/self-contained-aggregate-app/README.adoc b/self-contained-aggregate-app/README.adoc
deleted file mode 100644
index 4486f6d..0000000
--- a/self-contained-aggregate-app/README.adoc
+++ /dev/null
@@ -1,42 +0,0 @@
-Spring Cloud Stream Aggregate Application Sample
-================================================
-
-This is a basic example of a Spring Cloud Stream aggregate application that is self contained.
-This sample follows the chain of source -> processor -> sink.
-Since the final component is a sink, we can run this aggregate sample without using a middleware, i.e. the entire chain is done in memory using the channels and thus self contained.
-
-## Requirements
-
-To run this sample, you will need to have installed:
-
-* Java 8 or Above
-
-## Code Tour
-
-* SourceAppConfiguration - Configuration for the source
-* ProcessorAppConfiguraion - Configuration for the processor
-* SinkAppConfiguration - Configuration for the sink
-* SourceApplication - Spring Boot app for the source
-* ProcessorApplication - Spring Boot app for the processor
-* SinkApplication - Spring Boot app for the sink
-* AggregateApplication - The main aggregate application
-
-## Running the application
-
-* Go to the application root
-
-* `./mvnw clean package`
-
-* `java -jar target/self-contained-aggregate-app-0.0.1-SNAPSHOT.jar`
-
-Source application sends a message every second which will initiate the processor and then the sink (all in memory through the aggregate app).
-You will see output similar to the following printed on the console every second.
-
-```
-2018-03-02 18:29:06.546 INFO 29080 --- [ask-scheduler-1] config.sink.SinkModuleDefinition : Received: 2018-03-02 18:29:06
-2018-03-02 18:29:07.552 INFO 29080 --- [ask-scheduler-2] config.sink.SinkModuleDefinition : Received: 2018-03-02 18:29:07
-2018-03-02 18:29:08.558 INFO 29080 --- [ask-scheduler-1] config.sink.SinkModuleDefinition : Received: 2018-03-02 18:29:08
-2018-03-02 18:29:09.564 INFO 29080 --- [ask-scheduler-3] config.sink.SinkModuleDefinition : Received: 2018-03-02 18:29:09
-2018-03-02 18:29:10.569 INFO 29080 --- [ask-scheduler-2] config.sink.SinkModuleDefinition : Received: 2018-03-02 18:29:10
-2018-03-02 18:29:11.570 INFO 29080 --- [ask-scheduler-4] config.sink.SinkModuleDefinition : Received: 2018-03-02 18:29:11
-```
\ No newline at end of file
diff --git a/self-contained-aggregate-app/pom.xml b/self-contained-aggregate-app/pom.xml
deleted file mode 100644
index 1932e81..0000000
--- a/self-contained-aggregate-app/pom.xml
+++ /dev/null
@@ -1,93 +0,0 @@
-
-
- 4.0.0
-
- spring.cloud.stream.samples
- self-contained-aggregate-app
- 0.0.1-SNAPSHOT
- jar
-
- self-contained-aggregate-app
- Demo project for Spring Boot
-
-
- org.springframework.boot
- spring-boot-starter-parent
- 2.0.0.BUILD-SNAPSHOT
-
-
-
-
- UTF-8
- UTF-8
- 1.8
- Finchley.BUILD-SNAPSHOT
-
-
-
-
- org.springframework.boot
- spring-boot-starter-actuator
-
-
- org.springframework.boot
- spring-boot-starter-web
-
-
- org.springframework.boot
- spring-boot-starter
-
-
- org.springframework.cloud
- spring-cloud-stream-binder-kafka
-
-
- org.springframework.boot
- spring-boot-starter-test
- test
-
-
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- ${spring-cloud.version}
- pom
- import
-
-
-
-
-
-
-
- org.springframework.boot
- spring-boot-maven-plugin
-
-
-
-
-
-
- spring-snapshots
- Spring Snapshots
- http://repo.spring.io/libs-snapshot-local
-
- true
-
-
- false
-
-
-
- spring-milestones
- Spring Milestones
- http://repo.spring.io/libs-milestone-local
-
- false
-
-
-
-
diff --git a/self-contained-aggregate-app/src/main/java/config/processor/ProcessorAppConfiguration.java b/self-contained-aggregate-app/src/main/java/config/processor/ProcessorAppConfiguration.java
deleted file mode 100644
index 575ae6b..0000000
--- a/self-contained-aggregate-app/src/main/java/config/processor/ProcessorAppConfiguration.java
+++ /dev/null
@@ -1,34 +0,0 @@
-/*
- * Copyright 2016 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package config.processor;
-
-import org.springframework.cloud.stream.annotation.EnableBinding;
-import org.springframework.cloud.stream.messaging.Processor;
-import org.springframework.integration.annotation.Transformer;
-import org.springframework.messaging.Message;
-
-/**
- * @author Marius Bogoevici
- */
-@EnableBinding(Processor.class)
-public class ProcessorAppConfiguration {
-
- @Transformer(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
- public Message<?> transform(Message<?> inbound) {
- return inbound;
- }
-}
diff --git a/self-contained-aggregate-app/src/main/java/config/processor/ProcessorApplication.java b/self-contained-aggregate-app/src/main/java/config/processor/ProcessorApplication.java
deleted file mode 100644
index 5bd2c51..0000000
--- a/self-contained-aggregate-app/src/main/java/config/processor/ProcessorApplication.java
+++ /dev/null
@@ -1,27 +0,0 @@
-/*
- * Copyright 2016 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package config.processor;
-
-import org.springframework.boot.autoconfigure.SpringBootApplication;
-
-/**
- * @author Marius Bogoevici
- */
-@SpringBootApplication
-public class ProcessorApplication {
-
-}
diff --git a/self-contained-aggregate-app/src/main/java/config/sink/SinkAppConfiguration.java b/self-contained-aggregate-app/src/main/java/config/sink/SinkAppConfiguration.java
deleted file mode 100644
index b540725..0000000
--- a/self-contained-aggregate-app/src/main/java/config/sink/SinkAppConfiguration.java
+++ /dev/null
@@ -1,39 +0,0 @@
-/*
- * Copyright 2015 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package config.sink;
-
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-import org.springframework.cloud.stream.annotation.EnableBinding;
-import org.springframework.cloud.stream.messaging.Sink;
-import org.springframework.integration.annotation.ServiceActivator;
-
-/**
- * @author Dave Syer
- * @author Marius Bogoevici
- */
-@EnableBinding(Sink.class)
-public class SinkAppConfiguration {
-
- private static Logger logger = LoggerFactory.getLogger(SinkAppConfiguration.class);
-
- @ServiceActivator(inputChannel=Sink.INPUT)
- public void loggerSink(Object payload) {
- logger.info("Received: " + payload);
- }
-
-}
diff --git a/self-contained-aggregate-app/src/main/java/config/sink/SinkApplication.java b/self-contained-aggregate-app/src/main/java/config/sink/SinkApplication.java
deleted file mode 100644
index e093d1f..0000000
--- a/self-contained-aggregate-app/src/main/java/config/sink/SinkApplication.java
+++ /dev/null
@@ -1,26 +0,0 @@
-/*
- * Copyright 2015 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package config.sink;
-
-import org.springframework.boot.autoconfigure.SpringBootApplication;
-
-/**
- * @author Marius Bogoevici
- */
-@SpringBootApplication
-public class SinkApplication {
-}
diff --git a/self-contained-aggregate-app/src/main/java/config/source/SourceAppConfiguration.java b/self-contained-aggregate-app/src/main/java/config/source/SourceAppConfiguration.java
deleted file mode 100644
index 684d2bc..0000000
--- a/self-contained-aggregate-app/src/main/java/config/source/SourceAppConfiguration.java
+++ /dev/null
@@ -1,45 +0,0 @@
-/*
- * Copyright 2015 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package config.source;
-
-import org.springframework.cloud.stream.annotation.EnableBinding;
-import org.springframework.cloud.stream.messaging.Source;
-import org.springframework.context.annotation.Bean;
-import org.springframework.integration.annotation.InboundChannelAdapter;
-import org.springframework.integration.annotation.Poller;
-import org.springframework.integration.core.MessageSource;
-import org.springframework.messaging.support.GenericMessage;
-
-import java.text.SimpleDateFormat;
-import java.util.Date;
-
-/**
- * @author Dave Syer
- * @author Marius Bogoevici
- */
-@EnableBinding(Source.class)
-public class SourceAppConfiguration {
-
- private String format = "yyyy-MM-dd HH:mm:ss";
-
- @Bean
- @InboundChannelAdapter(value = Source.OUTPUT, poller = @Poller(fixedDelay = "${fixedDelay}", maxMessagesPerPoll = "1"))
- public MessageSource<String> timerMessageSource() {
- return () -> new GenericMessage<>(new SimpleDateFormat(this.format).format(new Date()));
- }
-
-}
diff --git a/self-contained-aggregate-app/src/main/java/config/source/SourceApplication.java b/self-contained-aggregate-app/src/main/java/config/source/SourceApplication.java
deleted file mode 100644
index f1d9077..0000000
--- a/self-contained-aggregate-app/src/main/java/config/source/SourceApplication.java
+++ /dev/null
@@ -1,26 +0,0 @@
-/*
- * Copyright 2015 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package config.source;
-
-import org.springframework.boot.autoconfigure.SpringBootApplication;
-
-/**
- * @author Marius Bogoevici
- */
-@SpringBootApplication
-public class SourceApplication {
-}
diff --git a/self-contained-aggregate-app/src/main/java/demo/AggregateApplication.java b/self-contained-aggregate-app/src/main/java/demo/AggregateApplication.java
deleted file mode 100644
index 796333a..0000000
--- a/self-contained-aggregate-app/src/main/java/demo/AggregateApplication.java
+++ /dev/null
@@ -1,35 +0,0 @@
-/*
- * Copyright 2015 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package demo;
-
-import config.processor.ProcessorApplication;
-import config.sink.SinkApplication;
-import config.source.SourceApplication;
-import org.springframework.boot.autoconfigure.SpringBootApplication;
-import org.springframework.cloud.stream.aggregate.AggregateApplicationBuilder;
-
-@SpringBootApplication
-public class AggregateApplication {
-
- public static void main(String[] args) {
- new AggregateApplicationBuilder(AggregateApplication.class, args)
- .from(SourceApplication.class).args("--fixedDelay=1000")
- .via(ProcessorApplication.class)
- .to(SinkApplication.class).args("--debug=true").run();
- }
-
-}
diff --git a/sink-samples/jdbc-sink/.mvn b/sink-samples/jdbc-sink/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/sink-samples/jdbc-sink/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/jdbc-sink/README.adoc b/sink-samples/jdbc-sink/README.adoc
similarity index 81%
rename from jdbc-sink/README.adoc
rename to sink-samples/jdbc-sink/README.adoc
index fd1f97c..69736cc 100644
--- a/jdbc-sink/README.adoc
+++ b/sink-samples/jdbc-sink/README.adoc
@@ -19,7 +19,7 @@ To run this sample, you will need to have installed:
* Java 8 or Above
-## Running the application
+## Running the application with Kafka binder
The following instructions assume that you are running Kafka and MySql as Docker images.
@@ -50,3 +50,15 @@ If you go with this kafka console producer approach, make sure you comment out t
`select * from test;`
Repeat the query a few times and you will see additional records each time.
+
+## Running the application using Rabbit binder
+
+All the instructions above also apply here, but instead of running the default `docker-compose.yml`, use the command below to start a RabbitMQ instance.
+
+* `docker-compose -f docker-compose-rabbit.yml up -d`
+
+* `./mvnw clean package -P rabbit-binder`
+
+* `java -jar target/sample-jdbc-sink-0.0.1-SNAPSHOT.jar`
+
+Once you are done testing: `docker-compose -f docker-compose-rabbit.yml down`
\ No newline at end of file
diff --git a/sink-samples/jdbc-sink/docker-compose-rabbit.yml b/sink-samples/jdbc-sink/docker-compose-rabbit.yml
new file mode 100644
index 0000000..512395e
--- /dev/null
+++ b/sink-samples/jdbc-sink/docker-compose-rabbit.yml
@@ -0,0 +1,18 @@
+version: '3'
+volumes:
+ data-volume: {}
+services:
+ mysql:
+ image: mariadb
+ ports:
+ - "3306:3306"
+ environment:
+ MYSQL_ROOT_PASSWORD: pwd
+ MYSQL_DATABASE: sample_mysql_db
+ volumes:
+ - data-volume:/var/lib/mysql
+ rabbitmq:
+ image: rabbitmq:management
+ ports:
+ - 5672:5672
+ - 15672:15672
\ No newline at end of file
diff --git a/jdbc-sink/docker-compose.yml b/sink-samples/jdbc-sink/docker-compose.yml
similarity index 100%
rename from jdbc-sink/docker-compose.yml
rename to sink-samples/jdbc-sink/docker-compose.yml
diff --git a/streamlistener-basic/mvnw b/sink-samples/jdbc-sink/mvnw
similarity index 100%
rename from streamlistener-basic/mvnw
rename to sink-samples/jdbc-sink/mvnw
diff --git a/streamlistener-basic/mvnw.cmd b/sink-samples/jdbc-sink/mvnw.cmd
similarity index 100%
rename from streamlistener-basic/mvnw.cmd
rename to sink-samples/jdbc-sink/mvnw.cmd
diff --git a/sink-samples/jdbc-sink/pom.xml b/sink-samples/jdbc-sink/pom.xml
new file mode 100644
index 0000000..92039af
--- /dev/null
+++ b/sink-samples/jdbc-sink/pom.xml
@@ -0,0 +1,83 @@
+
+
+ 4.0.0
+
+ jdbc-sink
+ 0.0.1-SNAPSHOT
+ jar
+ sample-jdbc-sink
+ Spring Cloud Stream Sample JDBC Sink App
+
+
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOT
+ ../..
+
+
+
+
+ org.mariadb.jdbc
+ mariadb-java-client
+ 1.1.9
+ runtime
+
+
+ org.springframework.integration
+ spring-integration-jdbc
+
+
+ org.springframework.boot
+ spring-boot-starter-jdbc
+
+
+ org.springframework.boot
+ spring-boot-starter-test
+ test
+
+
+ org.springframework.cloud
+ spring-cloud-stream-test-support
+ test
+
+
+ com.h2database
+ h2
+ test
+
+
+
+
+
+ kafka-binder
+
+ true
+
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-kafka
+
+
+
+
+ rabbit-binder
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-rabbit
+
+
+
+
+
+
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+
+
+
+
diff --git a/jdbc-sink/src/main/java/demo/SampleJdbcSink.java b/sink-samples/jdbc-sink/src/main/java/demo/SampleJdbcSink.java
similarity index 81%
rename from jdbc-sink/src/main/java/demo/SampleJdbcSink.java
rename to sink-samples/jdbc-sink/src/main/java/demo/SampleJdbcSink.java
index b00d9f7..dfdc55f 100644
--- a/jdbc-sink/src/main/java/demo/SampleJdbcSink.java
+++ b/sink-samples/jdbc-sink/src/main/java/demo/SampleJdbcSink.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
package demo;
import org.springframework.beans.factory.annotation.Autowired;
@@ -85,7 +101,7 @@ public class SampleJdbcSink {
}
}
- //Following source is used as test producer.
+ //Following source is used as a test producer.
@EnableBinding(Source.class)
static class TestSource {
@@ -106,7 +122,6 @@ public class SampleJdbcSink {
return () ->
new GenericMessage<>(this.semaphore.getAndSet(!this.semaphore.get()) ? foo1 : foo2);
-
}
}
diff --git a/jdbc-sink/src/main/resources/application.yml b/sink-samples/jdbc-sink/src/main/resources/application.yml
similarity index 100%
rename from jdbc-sink/src/main/resources/application.yml
rename to sink-samples/jdbc-sink/src/main/resources/application.yml
diff --git a/jdbc-sink/src/main/resources/sample-schema.sql b/sink-samples/jdbc-sink/src/main/resources/sample-schema.sql
similarity index 100%
rename from jdbc-sink/src/main/resources/sample-schema.sql
rename to sink-samples/jdbc-sink/src/main/resources/sample-schema.sql
diff --git a/jdbc-sink/src/test/java/demo/ModuleApplicationTests.java b/sink-samples/jdbc-sink/src/test/java/demo/ModuleApplicationTests.java
similarity index 96%
rename from jdbc-sink/src/test/java/demo/ModuleApplicationTests.java
rename to sink-samples/jdbc-sink/src/test/java/demo/ModuleApplicationTests.java
index f3c41bb..3d4e386 100644
--- a/jdbc-sink/src/test/java/demo/ModuleApplicationTests.java
+++ b/sink-samples/jdbc-sink/src/test/java/demo/ModuleApplicationTests.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2015 the original author or authors.
+ * Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
diff --git a/jdbc-sink/src/test/resources/application.properties b/sink-samples/jdbc-sink/src/test/resources/application.properties
similarity index 100%
rename from jdbc-sink/src/test/resources/application.properties
rename to sink-samples/jdbc-sink/src/test/resources/application.properties
diff --git a/sink-samples/pom.xml b/sink-samples/pom.xml
new file mode 100644
index 0000000..02670e6
--- /dev/null
+++ b/sink-samples/pom.xml
@@ -0,0 +1,14 @@
+
+
+ 4.0.0
+ spring.cloud.stream.samples
+ sink-samples
+ 0.0.1-SNAPSHOT
+ pom
+ sink-samples
+ Collection of Spring Cloud Stream Sink Samples
+
+
+ jdbc-sink
+
+
diff --git a/source-samples/dynamic-destination-source/.mvn b/source-samples/dynamic-destination-source/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/source-samples/dynamic-destination-source/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/dynamic-destination-source/README.adoc b/source-samples/dynamic-destination-source/README.adoc
similarity index 50%
rename from dynamic-destination-source/README.adoc
rename to source-samples/dynamic-destination-source/README.adoc
index 5ad62cf..d0b84b2 100644
--- a/dynamic-destination-source/README.adoc
+++ b/source-samples/dynamic-destination-source/README.adoc
@@ -15,7 +15,7 @@ The class `SourceWithDynamicDestination` is a REST controller that registers the
When a payload is sent to 'http://localhost:8080/' by a POST request (port 8080 is the default), this application uses a SpEL-based router backed by a `BinderAwareChannelResolver` to resolve the destination dynamically at runtime.
Currently, this router uses `payload.id` as the SpEL expression for its resolver to resolve the destination name.
-## Running the application
+## Running the application with Kafka binder
The following instructions assume that you are running Kafka as a Docker image.
@@ -32,6 +32,37 @@ curl -H "Content-Type: application/json" -X POST -d '{"id":"customerId-1","bill-
curl -H "Content-Type: application/json" -X POST -d '{"id":"customerId-2","bill-pay":"150"}' http://localhost:8080
-The destinations 'customerId-1' and 'customerId-2' are created at the broker (for example: topic in case of Kafka (exchange in case of Rabbit) with the names 'customerId-1' and 'customerId-2') and the data are published to the appropriate destinations dynamically.
+The destinations 'customerId-1' and 'customerId-2' are created as topics on Kafka and the data is published to the appropriate destinations dynamically.
Do the above HTTP post a few times and watch the output appear on the corresponding dynamically created topics.
+
+`docker exec -it kafka-dynamic-source /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic customerId-1`
+`docker exec -it kafka-dynamic-source /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic customerId-2`
+
+Two convenient test sinks are provided as part of the application for easy testing; they log the data received from the dynamically created destinations above.
+
+* `docker-compose down`
+
+## Running the application with Rabbit binder
+
+* Go to the application root
+* `docker-compose -f docker-compose-rabbit.yml up -d`
+
+* `./mvnw clean package -P rabbit-binder`
+
+* `java -jar target/dynamic-destination-source-0.0.1-SNAPSHOT.jar`
+
+Once the application starts on the default port 8080, send the following data:
+
+curl -H "Content-Type: application/json" -X POST -d '{"id":"customerId-1","bill-pay":"100"}' http://localhost:8080
+
+curl -H "Content-Type: application/json" -X POST -d '{"id":"customerId-2","bill-pay":"150"}' http://localhost:8080
+
+The destinations 'customerId-1' and 'customerId-2' are created as exchanges in Rabbit and the data is published to the appropriate destinations dynamically.
+
+Do the above HTTP post a few times and watch the output appear on the corresponding exchanges.
+
+You can watch the exchanges and messages in the RabbitMQ management console at `http://localhost:15672`.
+
+Once you are done testing: `docker-compose -f docker-compose-rabbit.yml down`
+
diff --git a/source-samples/dynamic-destination-source/docker-compose-rabbit.yml b/source-samples/dynamic-destination-source/docker-compose-rabbit.yml
new file mode 100644
index 0000000..7c3da92
--- /dev/null
+++ b/source-samples/dynamic-destination-source/docker-compose-rabbit.yml
@@ -0,0 +1,7 @@
+version: '3'
+services:
+ rabbitmq:
+ image: rabbitmq:management
+ ports:
+ - 5672:5672
+ - 15672:15672
\ No newline at end of file
diff --git a/dynamic-destination-source/docker-compose.yml b/source-samples/dynamic-destination-source/docker-compose.yml
similarity index 90%
rename from dynamic-destination-source/docker-compose.yml
rename to source-samples/dynamic-destination-source/docker-compose.yml
index d47d783..5bed2ed 100644
--- a/dynamic-destination-source/docker-compose.yml
+++ b/source-samples/dynamic-destination-source/docker-compose.yml
@@ -2,6 +2,7 @@ version: '2'
services:
kafka:
image: wurstmeister/kafka
+ container_name: kafka-dynamic-source
ports:
- "9092:9092"
environment:
diff --git a/test-embedded-kafka/mvnw b/source-samples/dynamic-destination-source/mvnw
similarity index 100%
rename from test-embedded-kafka/mvnw
rename to source-samples/dynamic-destination-source/mvnw
diff --git a/test-embedded-kafka/mvnw.cmd b/source-samples/dynamic-destination-source/mvnw.cmd
similarity index 100%
rename from test-embedded-kafka/mvnw.cmd
rename to source-samples/dynamic-destination-source/mvnw.cmd
diff --git a/source-samples/dynamic-destination-source/pom.xml b/source-samples/dynamic-destination-source/pom.xml
new file mode 100644
index 0000000..9161aa1
--- /dev/null
+++ b/source-samples/dynamic-destination-source/pom.xml
@@ -0,0 +1,64 @@
+
+
+ 4.0.0
+
+ dynamic-destination-source
+ 0.0.1-SNAPSHOT
+ jar
+ dynamic-destination-source
+ Spring Cloud Stream Sample Dynamic Destination Source App
+
+
+ spring.cloud.stream.samples
+ spring-cloud-stream-samples-parent
+ 0.0.1-SNAPSHOT
+ ../..
+
+
+
+
+ org.springframework.boot
+ spring-boot-starter-test
+ test
+
+
+ org.springframework.cloud
+ spring-cloud-stream-test-support
+ test
+
+
+
+
+
+ kafka-binder
+
+ true
+
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-kafka
+
+
+
+
+ rabbit-binder
+
+
+ org.springframework.cloud
+ spring-cloud-stream-binder-rabbit
+
+
+
+
+
+
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+
+
+
+
+
diff --git a/dynamic-destination-source/src/main/java/demo/SourceApplication.java b/source-samples/dynamic-destination-source/src/main/java/demo/SourceApplication.java
similarity index 100%
rename from dynamic-destination-source/src/main/java/demo/SourceApplication.java
rename to source-samples/dynamic-destination-source/src/main/java/demo/SourceApplication.java
diff --git a/dynamic-destination-source/src/main/java/demo/SourceWithDynamicDestination.java b/source-samples/dynamic-destination-source/src/main/java/demo/SourceWithDynamicDestination.java
similarity index 76%
rename from dynamic-destination-source/src/main/java/demo/SourceWithDynamicDestination.java
rename to source-samples/dynamic-destination-source/src/main/java/demo/SourceWithDynamicDestination.java
index ed74b72..5c584b4 100644
--- a/dynamic-destination-source/src/main/java/demo/SourceWithDynamicDestination.java
+++ b/source-samples/dynamic-destination-source/src/main/java/demo/SourceWithDynamicDestination.java
@@ -16,9 +16,13 @@
package demo;
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.cloud.stream.annotation.EnableBinding;
+import org.springframework.cloud.stream.annotation.Input;
+import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.binding.BinderAwareChannelResolver;
import org.springframework.context.annotation.Bean;
import org.springframework.expression.spel.standard.SpelExpressionParser;
@@ -29,6 +33,7 @@ import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.router.ExpressionEvaluatingRouter;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.MessageHeaders;
+import org.springframework.messaging.SubscribableChannel;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestBody;
@@ -42,6 +47,7 @@ import static org.springframework.web.bind.annotation.RequestMethod.POST;
/**
* @author Ilayaperumal Gopinathan
+ * @author Soby Chacko
*/
@EnableBinding
@Controller
@@ -78,4 +84,36 @@ public class SourceWithDynamicDestination {
router.setChannelResolver(resolver);
return router;
}
+
+ //Following sink is used as test consumer. It logs the data received through the consumer.
+ @EnableBinding(Sink.class)
+ static class TestSink {
+
+ private final Log logger = LogFactory.getLog(getClass());
+
+ @StreamListener(Sink.INPUT1)
+ public void receive(String data) {
+ logger.info("Data received from customer-1..." + data);
+ }
+
+ @StreamListener(Sink.INPUT2)
+ public void receiveX(String data) {
+ logger.info("Data received from customer-2..." + data);
+ }
+ }
+
+ interface Sink {
+
+ String INPUT1 = "input1";
+ String INPUT2 = "input2";
+
+
+ @Input(INPUT1)
+ SubscribableChannel input1();
+
+
+ @Input(INPUT2)
+ SubscribableChannel input2();
+
+ }
}
diff --git a/source-samples/dynamic-destination-source/src/main/resources/application.yml b/source-samples/dynamic-destination-source/src/main/resources/application.yml
new file mode 100644
index 0000000..c4a461a
--- /dev/null
+++ b/source-samples/dynamic-destination-source/src/main/resources/application.yml
@@ -0,0 +1,3 @@
+# Input bindings used for testing
+spring.cloud.stream.bindings.input1.destination: customerId-1
+spring.cloud.stream.bindings.input2.destination: customerId-2
diff --git a/dynamic-destination-source/src/test/java/demo/ModuleApplicationTests.java b/source-samples/dynamic-destination-source/src/test/java/demo/ModuleApplicationTests.java
similarity index 100%
rename from dynamic-destination-source/src/test/java/demo/ModuleApplicationTests.java
rename to source-samples/dynamic-destination-source/src/test/java/demo/ModuleApplicationTests.java
diff --git a/source-samples/jdbc-source/.mvn b/source-samples/jdbc-source/.mvn
new file mode 120000
index 0000000..d21aa17
--- /dev/null
+++ b/source-samples/jdbc-source/.mvn
@@ -0,0 +1 @@
+../../.mvn
\ No newline at end of file
diff --git a/jdbc-source/README.adoc b/source-samples/jdbc-source/README.adoc
similarity index 82%
rename from jdbc-source/README.adoc
rename to source-samples/jdbc-source/README.adoc
index a893aed..ee27048 100644
--- a/jdbc-source/README.adoc
+++ b/source-samples/jdbc-source/README.adoc
@@ -1,7 +1,6 @@
Spring Cloud Stream Source Sample
==================================
-
## What is this app?
This is a Spring Boot application that is a Spring Cloud Stream sample source app that polls a JDBC Source.
@@ -31,7 +30,7 @@ Refer to the `application.yml` for the defaults used for the following propertie
`jdbc.query` - SQL used to query the database.
`jdbc.update` - SQL used to skip records already seen (details below)
-## Running the application
+## Running the application using Kafka binder
The following instructions assume that you are running Kafka and MySql as Docker images.
@@ -68,4 +67,18 @@ If you want to query the app every second now, and ignore any records that you a
`insert into test values (20, 'Bob', NULL);`
`insert into test values (22, 'Bob', NULL);`
-Watch the data appear on the application console.
\ No newline at end of file
+Watch the data appear on the application console.
+
+Once you are done testing, stop Kafka running in the docker container: `docker-compose down`
+
+## Running the application using Rabbit binder
+
+All the instructions above also apply here, but instead of running the default `docker-compose.yml`, use the command below to start a RabbitMQ instance.
+
+* `docker-compose -f docker-compose-rabbit.yml up -d`
+
+* `./mvnw clean package -P rabbit-binder`
+
+* `java -jar target/sample-jdbc-source-0.0.1-SNAPSHOT.jar`
+
+Once you are done testing: `docker-compose -f docker-compose-rabbit.yml down`
\ No newline at end of file
diff --git a/source-samples/jdbc-source/docker-compose-rabbit.yml b/source-samples/jdbc-source/docker-compose-rabbit.yml
new file mode 100644
index 0000000..512395e
--- /dev/null
+++ b/source-samples/jdbc-source/docker-compose-rabbit.yml
@@ -0,0 +1,18 @@
+version: '3'
+volumes:
+ data-volume: {}
+services:
+ mysql:
+ image: mariadb
+ ports:
+ - "3306:3306"
+ environment:
+ MYSQL_ROOT_PASSWORD: pwd
+ MYSQL_DATABASE: sample_mysql_db
+ volumes:
+ - data-volume:/var/lib/mysql
+ rabbitmq:
+ image: rabbitmq:management
+ ports:
+ - 5672:5672
+ - 15672:15672
\ No newline at end of file
diff --git a/jdbc-source/docker-compose.yml b/source-samples/jdbc-source/docker-compose.yml
similarity index 92%
rename from jdbc-source/docker-compose.yml
rename to source-samples/jdbc-source/docker-compose.yml
index 604504d..2405901 100644
--- a/jdbc-source/docker-compose.yml
+++ b/source-samples/jdbc-source/docker-compose.yml
@@ -1,4 +1,4 @@
-version: '2'
+version: '3'
volumes:
data-volume: {}
services:
@@ -13,6 +13,7 @@ services:
- data-volume:/var/lib/mysql
kafka:
image: wurstmeister/kafka
+ container_name: kafka-jdbc-source
ports:
- "9092:9092"
environment:
diff --git a/uppercase-transformer/mvnw b/source-samples/jdbc-source/mvnw
similarity index 100%
rename from uppercase-transformer/mvnw
rename to source-samples/jdbc-source/mvnw
diff --git a/uppercase-transformer/mvnw.cmd b/source-samples/jdbc-source/mvnw.cmd
similarity index 100%
rename from uppercase-transformer/mvnw.cmd
rename to source-samples/jdbc-source/mvnw.cmd
diff --git a/source-samples/jdbc-source/pom.xml b/source-samples/jdbc-source/pom.xml
new file mode 100644
index 0000000..e48b9a6
--- /dev/null
+++ b/source-samples/jdbc-source/pom.xml
@@ -0,0 +1,84 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>jdbc-source</artifactId>
+	<version>0.0.1-SNAPSHOT</version>
+	<packaging>jar</packaging>
+	<name>sample-jdbc-source</name>
+	<description>Spring Cloud Stream Sample JDBC Source App</description>
+
+	<parent>
+		<groupId>spring.cloud.stream.samples</groupId>
+		<artifactId>spring-cloud-stream-samples-parent</artifactId>
+		<version>0.0.1-SNAPSHOT</version>
+		<relativePath>../..</relativePath>
+	</parent>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.mariadb.jdbc</groupId>
+			<artifactId>mariadb-java-client</artifactId>
+			<version>1.1.9</version>
+			<scope>runtime</scope>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.integration</groupId>
+			<artifactId>spring-integration-jdbc</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.boot</groupId>
+			<artifactId>spring-boot-starter-jdbc</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.cloud</groupId>
+			<artifactId>spring-cloud-stream-test-support</artifactId>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>org.springframework.boot</groupId>
+			<artifactId>spring-boot-starter-test</artifactId>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>com.h2database</groupId>
+			<artifactId>h2</artifactId>
+			<scope>test</scope>
+		</dependency>
+	</dependencies>
+
+	<profiles>
+		<profile>
+			<id>kafka-binder</id>
+			<activation>
+				<activeByDefault>true</activeByDefault>
+			</activation>
+			<dependencies>
+				<dependency>
+					<groupId>org.springframework.cloud</groupId>
+					<artifactId>spring-cloud-stream-binder-kafka</artifactId>
+				</dependency>
+			</dependencies>
+		</profile>
+		<profile>
+			<id>rabbit-binder</id>
+			<dependencies>
+				<dependency>
+					<groupId>org.springframework.cloud</groupId>
+					<artifactId>spring-cloud-stream-binder-rabbit</artifactId>
+				</dependency>
+			</dependencies>
+		</profile>
+	</profiles>
+
+	<build>
+		<plugins>
+			<plugin>
+				<groupId>org.springframework.boot</groupId>
+				<artifactId>spring-boot-maven-plugin</artifactId>
+			</plugin>
+		</plugins>
+	</build>
+
+</project>
diff --git a/jdbc-source/src/main/java/demo/JdbcSourceProperties.java b/source-samples/jdbc-source/src/main/java/demo/JdbcSourceProperties.java
similarity index 57%
rename from jdbc-source/src/main/java/demo/JdbcSourceProperties.java
rename to source-samples/jdbc-source/src/main/java/demo/JdbcSourceProperties.java
index e46f795..db2826b 100644
--- a/jdbc-source/src/main/java/demo/JdbcSourceProperties.java
+++ b/source-samples/jdbc-source/src/main/java/demo/JdbcSourceProperties.java
@@ -1,3 +1,19 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
 package demo;
 
 import org.springframework.boot.context.properties.ConfigurationProperties;
diff --git a/jdbc-source/src/main/java/demo/SampleJdbcSource.java b/source-samples/jdbc-source/src/main/java/demo/SampleJdbcSource.java
similarity index 77%
rename from jdbc-source/src/main/java/demo/SampleJdbcSource.java
rename to source-samples/jdbc-source/src/main/java/demo/SampleJdbcSource.java
index 703f5a7..f35c7f2 100644
--- a/jdbc-source/src/main/java/demo/SampleJdbcSource.java
+++ b/source-samples/jdbc-source/src/main/java/demo/SampleJdbcSource.java
@@ -1,5 +1,23 @@
+/*
+ * Copyright 2018 the original author or authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
 package demo;
 
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@@ -84,13 +102,16 @@ public class SampleJdbcSource {
 		DatabasePopulatorUtils.execute(populator, dataSource);
 	}
 
-	//Following sink is used as test consumer. It logs the data received through the consumer.
+	//Following sink is used as a test consumer. It logs the data received through the consumer.
 	@EnableBinding(Sink.class)
 	static class TestSink {
 
+		private final Log logger = LogFactory.getLog(getClass());
+
 		@StreamListener(Sink.INPUT)
 		public void receive(List