A JHipster module that generates Apache Kafka consumers and producers and more!
- Basic Consumer/Producer API use
- Several prompt options (`polling.timeout`, `auto.offset.reset.policy`, `bootstrap.servers`)
- AKHQ (KafkaHQ) support
- Topic management
You can find more details about work in progress in the issues:
- Schema Registry and Avro support
- Producer API (ordered messages, high throughput...)
- Deserialization alternatives (JacksonSerde) as a prompt option
- Security (SSL protocol as a prompt option, safe mode...)
- JHipster entity sub-generator hook
- JHipster microservices applications support
- Kafka Connect support
- Kafka Streams support
This is a JHipster module that is meant to be used in a JHipster application. You can use it to generate Apache Kafka consumers and producers in a JHipster backend (Spring Boot / Java only supported at the moment). It uses the Apache Kafka client as a base.
As this is a JHipster module, we expect you to have JHipster and its related tools already installed, or just run:

npm i -g generator-jhipster

Note: depending on your JHipster version, you may need the `v6.x_maintenance` branch of this module.

`yo` (Yeoman) is needed to make it work correctly!

To install this module and Yeoman (`yo`) with npm:
npm install -g generator-jhipster-kafka yo
To update this module:
npm update -g generator-jhipster-kafka
To install this module and Yeoman (`yo`) with Yarn:
yarn global add generator-jhipster-kafka yo
To update this module:
yarn global upgrade generator-jhipster-kafka
This describes the basic usage of this module with a JHipster-generated project.
If you want to use local versions of JHipster and the Kafka module:
- Go to your `generator-jhipster` project folder and run `npm link`
- Go to your `generator-jhipster-kafka` project folder and run `npm link`
- In your project generated with JHipster, run `npm link generator-jhipster generator-jhipster-kafka`
Note: if you reinstall or update your local `generator-jhipster` or `generator-jhipster-kafka` folder, you will have to repeat the previous steps.
Important: the following steps and use cases are to be done on a single generated JHipster monolithic application.

📅 In the near future, this will also be achievable between two or more monolithic applications or in microservice applications.
- Ensure you have a JHipster version > 6.0.0 with: `jhipster --version`
- Create a JHipster project in a new folder: `mkdir myproject && cd myproject && jhipster` (you can also create a backend-only project with `jhipster --skip-client`)
- Choose `Apache Kafka as asynchronous messages broker` in the server-side options when answering the question "Which other technologies would you like to use?"
- In the same folder, run `yo jhipster-kafka` and follow the use case you need
- Once the generation is done, run Kafka with: `docker-compose -f src/main/docker/kafka.yml up -d` (or without `-d`; ensure you have a docker-compose version >= 1.27.4)
- Run your application with: `./mvnw`

The different use cases are listed on another page.
Do your own configuration step-by-step!
The END represents the end of the prompts below, when files are written after confirmation (you can use the `--force` option with `yo jhipster-kafka` to overwrite all files).
- Do you want to clean up your current Kafka configuration? (default = N)
  - If "y" was typed: all configurations and classes will be deleted and fully regenerated
  - If "n" was typed: the new configuration will be merged with the previous one
- What is your bootstrap servers connection string (you can add several bootstrap servers by using a "," delimiter)? (default = localhost:9092)
- For which entity (class name)?
  - No entity (will be typed String) (default)
  - Foo
  - Bar
  - ...
- How would you prefix your objects (no entity, for instance: [SomeEventType]Consumer|Producer...)?
- Which components would you like to generate?
  - Consumer
  - Producer
- Which topic for (entity/prefix)?
  - Default topic name following this convention: message_type.application_type.entity_name (default)
  - Custom topic name
- If "Custom topic name" was selected: What is the topic name for (entity/prefix)?
  - queuing.application_name.existing_topic_name
  - ...
- If "Consumer" was selected: What is the consumer polling timeout (in ms)? (default = 10000)
- If "Consumer" was selected: Define the auto offset reset policy (what to do when there is no initial offset in Kafka or if the current offset does not exist any more on the server)?
  - earliest (automatically reset the offset to the earliest offset) (default)
  - latest (automatically reset the offset to the latest offset)
  - none (throw an exception to the consumer if no previous offset is found for the consumer group)
- If "Producer" was selected: Do you want to send ordered messages for (entity/prefix) production? (default = Y)
- Do you want to continue adding consumers or producers? (default = N)
  - If "N" was selected: END
You can use `yo jhipster-kafka --skip-prompts` to use the default prompt values and generate:

- a minimal `kafka` configuration in `application.yml` files with only a `bootstrap.servers` entry
- an `akhq.yml` docker-compose file to run AKHQ (see below)
- a `GenericConsumer.java` that you can extend to create your own consumers (see the sketch below)
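As a rough, hypothetical illustration only (the actual constructor and abstract methods of the generated `GenericConsumer.java` may differ, so check the generated sources), extending it could look something like this:

```java
// Hypothetical sketch: the method name handleMessage and the topic
// constructor argument are assumptions, not the exact generated API.
public class MyTopicConsumer extends GenericConsumer<String> {

    public MyTopicConsumer() {
        // Assumed constructor argument: the name of the topic to listen to.
        super("queuing.myproject.my_topic");
    }

    @Override
    protected void handleMessage(String message) {
        // Business logic for each incoming record goes here.
        System.out.println("Received: " + message);
    }
}
```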
You can use your producer (`*Producer.java`) in other layers like resources or services by instantiating it and using its `send` method (which is asynchronous).
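For example, a service could delegate to the generated producer. This is only a sketch under assumptions: `Foo`, `FooProducer` and the exact `send` signature are placeholders for the generated classes, and it assumes the producer is available as a Spring bean (adapt it if you instantiate the producer yourself):

```java
import org.springframework.stereotype.Service;

// Sketch only: FooProducer and its send() signature are assumed names;
// check the generated *Producer.java for the real ones.
@Service
public class FooService {

    private final FooProducer fooProducer;

    public FooService(FooProducer fooProducer) {
        this.fooProducer = fooProducer;
    }

    public void createFoo(Foo foo) {
        // ... persist or process the entity as usual ...

        // send() is asynchronous: it returns without waiting for the
        // record to be acknowledged by the Kafka broker.
        fooProducer.send(foo);
    }
}
```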
By default, a `*KafkaResource.java` is also generated with the producer. It has an endpoint to call the generated producer. Supposing you have generated consumers and producers for an existing `Foo` JHipster entity (with a String field `foo`) and for no entity (`Bar` prefix), you can test it this way with `curl` and `jq`:
token=`curl -X POST localhost:8080/api/authenticate -d '{ "username": "admin", "password": "admin" }' -H "Content-Type: application/json"|jq -r '.id_token'`
# For a producer linked to an entity (Foo as JSON):
curl -H "Authorization: Bearer $token" -H 'Content-Type: application/json' -d '{ "foo": "foo" }' -X POST localhost:8080/api/foos/kafka
# For a producer not linked to an entity (String):
curl -H "Authorization: Bearer $token" -H 'Content-Type: application/json' -d 'bar' -X POST localhost:8080/api/bars/kafka
Generated consumers should not be explicitly used in other classes, as each of them runs in a thread listening for incoming messages. However, messages sent through the generated producer will be read if an associated consumer has been generated as well: you just have to read the logs of `./mvnw` locally.
🚀 AKHQ (previously known as KafkaHQ) can be used by following these steps in the root directory:

- Run `docker-compose -f src/main/docker/kafka.yml -f src/main/docker/akhq.yml up -d` to launch the ZooKeeper and Kafka services with AKHQ
- Go to http://localhost:11817
- Start your application with `./mvnw` to manage your topics and more!
If you want to contribute to the module, please read the CONTRIBUTING.md.
In addition, here are some things to know before diving in:
- JavaScript ES6 is used for the generator and Java for the generated files
- The module is a Yeoman generator (a sub-generator of JHipster, which is itself a Yeoman-based generator)
- Module tests are done with mocha, yeoman-assert and yeoman-test. You can run them with `npm test`
- Format rules are defined in `.eslintrc` and `.prettierrc`
- `generators/app/index.js` is the main entrypoint; it uses `prompts.js` and `files.js`
- `generators/app/prompts.js` is used to manage prompt options
- `generators/app/files.js` is used to write files to the generated project
- `generators/app/templates` contains EJS templates (`.ejs`) used to generate files
Apache-2.0 © François Delbrayelle (main contributor and stream leader) and all contributors, thank you!