ℹ️ This docker-compose file is configured to run multiple nodes.
This is a Hadoop cluster that bundles the tools commonly used in the Big Data domain. It is a collection of Docker containers that you can use directly to get a broad set of tools, including:
- Hive
- Hue
- MySQL
- Zookeeper
- Kafka
- HBase
- Mongo
- Metabase
- Streamsets
- Sqoop
- Storm
- NiFi
The services defined in the docker-compose file and the Docker images they use:
- namenode : fjardim/namenode_sqoop
- datanode : fjardim/datanode
- hive-server : fjardim/hive
- hive-metastore : fjardim/hive
- hive-metastore-postgresql : fjardim/hive-metastore
- hue : fjardim/hue
- mysql : fjardim/mysql
- zookeeper : fjardim/zookeeper
- kafka : fjardim/kafka
- presto-coordinator : fjardim/prestodb
- hbase-master : fjardim/hbase-master
- mongo : fjardim/mongo
- mongo-express : fjardim/mongo-express
- kafkamanager : fjardim/kafkamanager
- metabase : fjardim/metabase
- streamsets : streamsets/datacollector:3.13.0-latest
- storm : fmantuano/apache-storm:develop
- jupyter-spark : fjardim/jupyter-spark
- Apache-NiFi : apache/nifi:latest
To get started, clone the repository and bring up the stack:

```bash
git clone https://github.com/ven2day/Bigdata-docker-sandbox.git
cd Bigdata-docker-sandbox
sudo docker-compose up -d
```
⚠️ It takes some time to launch and configure all of the images.
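While the stack is starting, you can check the state of the containers and follow a service's logs. A minimal sketch, assuming the compose service names match the list above (e.g. `namenode`):

```bash
# Show the state of every container defined in the docker-compose file
sudo docker-compose ps

# Follow the startup logs of one service, e.g. the namenode
sudo docker-compose logs -f namenode
```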
- URL : http://localhost:50070/ (NameNode web UI)
👁️ Here you should see 3 live nodes; you can also verify this from the command line, as shown after the URL list below.
- URL : http://localhost:50075/ (DataNode web UI)
- URL : http://localhost:50080/ (DataNode web UI)
- URL : http://localhost:50085/ (DataNode web UI)
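To confirm the live DataNodes from the command line, you can run an HDFS admin report inside the NameNode container. A minimal sketch, assuming the container is named `namenode`:

```bash
# Print the HDFS cluster report; the "Live datanodes" count should be 3
sudo docker exec -it namenode hdfs dfsadmin -report
```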
- URL : http://localhost:8888/ (Hue)
Username : admin Password : admin
After clicking Sign In, you can start using Hive.
- A simple query to test:
```sql
CREATE TABLE IF NOT EXISTS users(id INT, name VARCHAR(45), website VARCHAR(45));
INSERT INTO users VALUES(1, "mahmoud zakaria", "www.mahmoud.ma");
```
- After inserting data, you can execute a SELECT query:
```sql
SELECT * FROM users;
```
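The same HiveQL can also be run from the command line through Beeline inside the Hive server container. A minimal sketch, assuming the container is named `hive-server` and HiveServer2 listens on its default port 10000:

```bash
# Open an interactive Beeline session against HiveServer2
sudo docker exec -it hive-server beeline -u jdbc:hive2://localhost:10000

# Or run a single test query non-interactively
sudo docker exec -it hive-server beeline -u jdbc:hive2://localhost:10000 -e "SELECT * FROM users;"
```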
- Hue Dashboard
The other tools expose web UIs as well (labels are based on each tool's default port):
- URL : http://localhost:9000/ (Kafka Manager)
- URL : http://localhost:8080/ (Presto coordinator)
- URL : http://localhost:16010/ (HBase Master UI)
- URL : http://localhost:8090/
- URL : http://localhost:8889/ (Jupyter)
- URL : http://localhost:8081/ (Mongo Express)
- URL : http://localhost:18630/ (StreamSets Data Collector)
Username : admin Password : admin
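Kafka can also be exercised from the command line. A minimal sketch, assuming the Kafka container is named `kafka`, ZooKeeper is reachable from it at `zookeeper:2181`, and the Kafka CLI scripts are on the container's PATH (all assumptions about this particular compose file):

```bash
# List the topics known to the cluster
sudo docker exec -it kafka kafka-topics.sh --list --zookeeper zookeeper:2181

# Create a hypothetical test topic named "test"
sudo docker exec -it kafka kafka-topics.sh --create --zookeeper zookeeper:2181 \
  --replication-factor 1 --partitions 1 --topic test
```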
- Credits to : Fábio Jardim, Mahmoud Zakaria