This project is an example Apache Flink application that demonstrates stream processing in a casino domain. Requirements:
- Maven
- Docker compose
- Flink 1.6+
This repository allows you to run an example casino stream-processing job on a Flink cluster. Architecture:
- Casino sites - areas that group several machines, e.g. The Calypso, The Chariot
- Machines - physical machines that play casino games; every machine sends its events to the Kafka queue, e.g. M0001, M0002, etc.
- Kafka - message queue
- Flink - stream processing engine
- Elasticsearch - storage for processed events
- Kibana - data visualization
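Conceptually, the Flink job consumes machine events from Kafka, keys them, and aggregates them before writing to Elasticsearch. The stdlib-only sketch below illustrates that keyed aggregation on a plain Java collection; the event fields (`machineId`, `site`, `amount`) are assumptions for illustration, not the repository's actual schema, and the real job runs this continuously on the stream rather than on a finite list.

```java
import java.util.*;
import java.util.stream.*;

public class CasinoAggregationSketch {
    // Hypothetical machine event: which machine, which casino site, amount wagered.
    record MachineEvent(String machineId, String site, double amount) {}

    // Keyed aggregation: total amount wagered per casino site, mimicking
    // what the Flink job computes on the Kafka stream of machine events.
    static Map<String, Double> totalPerSite(List<MachineEvent> events) {
        return events.stream().collect(Collectors.groupingBy(
                MachineEvent::site,
                Collectors.summingDouble(MachineEvent::amount)));
    }

    public static void main(String[] args) {
        List<MachineEvent> events = List.of(
                new MachineEvent("M0001", "The Calypso", 10.0),
                new MachineEvent("M0002", "The Calypso", 5.5),
                new MachineEvent("M0003", "The Chariot", 7.0));
        // Prints the per-site totals, e.g. a map with The Calypso -> 15.5.
        System.out.println(totalPerSite(events));
    }
}
```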
The prepared docker-compose.yaml file allows you to run the following services:
- Elasticsearch - storage for processed events (sink)
- Kibana - data visualization
- Kafka - message queue
- Zookeeper - configuration service for Kafka
To run all services, execute the following command:
docker-compose up -d
To create all necessary Java artifacts (the Flink job and the casino simulation), execute the following command:
mvn package
To start a local Flink cluster, execute the following command in Flink's bin directory:
./start-cluster.sh
To deploy the job, run the following command:
flink run -d core/target/casino-streaming-core-1.0-SNAPSHOT.jar
To run casino simulation, execute the following command:
java -jar simulation/target/casino-streaming-simulation-1.0-SNAPSHOT.jar
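The simulation plays the role of the machines, emitting a stream of casino events. As a rough, stdlib-only sketch of what such a generator might look like (the JSON field names and value ranges are assumptions; the real simulation publishes its events to a Kafka topic rather than printing them):

```java
import java.util.*;

public class SimulationSketch {
    static final String[] SITES = {"The Calypso", "The Chariot"};
    static final Random RND = new Random();

    // Build one hypothetical machine event as a JSON string.
    static String randomEvent(int machineNo) {
        String site = SITES[RND.nextInt(SITES.length)];
        double amount = Math.round(RND.nextDouble() * 10000) / 100.0;
        return String.format(Locale.US,
                "{\"machine\":\"M%04d\",\"site\":\"%s\",\"amount\":%.2f}",
                machineNo, site, amount);
    }

    public static void main(String[] args) {
        // Emit a few sample events for machines M0001..M0003.
        for (int i = 1; i <= 3; i++) {
            System.out.println(randomEvent(i));
        }
    }
}
```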
Finally, go to the Kibana page, open the Dashboard menu, click Add a new dashboard, click the Add panel button, and select all visualizations and saved searches.