Develop all services of the DataWorkbench locally, based on docker-compose.

Three kinds of images are involved:

- the base image, used to build the other images and compile the services;
- the main image, used to run the services with docker-compose;
- the image used to migrate the database. If you update a table's structure or data, run `make compose-migrate-db` to apply the change to the local development environment.
To run the DataWorkbench services locally, you need to:

- pull the code of all DataWorkbench services into the same directory;
- install docker-compose locally.

All commands in this section are executed in the project's `deploy` directory.
- pull the builder image: `make update-builder`
- build all images: `make build-all`
- launch the dataomnis services locally: `make compose-up`
- check the logs of a service: `make compose-logs-f [service=apiserver,spacemanager]`
After all services are running, you can write code and then:

- run `make compose-migrate-db` to update the database, if needed;
- run `make update [service=apiserver]` to rebuild and restart the service.
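Put together, a typical edit-and-refresh cycle run from the `deploy` directory might look like the following sketch (`apiserver` is just the example service name from above):

```
# hypothetical session in the deploy directory
make compose-migrate-db                 # apply schema/data changes, if any
make update service=apiserver           # rebuild and restart one service
make compose-logs-f service=apiserver   # follow its logs to verify
```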
To add a new service:

- add the service to the `service` list in the `Makefile`;
- if needed, add a `COPY` statement for the service's DB schema SQL in `build/db/Dockerfile`;
- add the service to `docker-compose.yaml`, using `spacemanager` as a reference;
- add a test script to the `tests` directory;
- run `make test`.
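As a concrete illustration, the `docker-compose.yaml` entry for a new service might look like the sketch below, modeled on the existing `spacemanager` entry. The service name `jobmanager`, image name, dependency, and port here are hypothetical placeholders, not values from the repository; copy the real settings from the `spacemanager` block and adjust them.

```yaml
# Hypothetical fragment of docker-compose.yaml for a new service.
services:
  jobmanager:
    image: dataomnis/jobmanager:latest   # placeholder image name
    depends_on:
      - db                               # placeholder dependency
    ports:
      - "9105:9105"                      # placeholder port
```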
To deploy with Helm on Kubernetes:

- create the directories `datanode`, `namenode`, `journalnode` and `zookeeper` under `{{ .Values.hdfs-cluster.hdfsHome }}/hdfs-cluster/{{ .Release.Name }}` on all k8s workers for HDFS
- install: `cd code/deploy && helm -n dataomnis install dataomnis ./helm/dataomnis`
- uninstall: `cd code/deploy && helm -n dataomnis delete dataomnis`
- upgrade: `helm -n dataomnis upgrade dataomnis ./helm/dataomnis/`
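The worker-node preparation step above can be sketched as a small script to run on each k8s worker. Here `HDFS_HOME` and `RELEASE_NAME` stand in for the chart values `{{ .Values.hdfs-cluster.hdfsHome }}` and `{{ .Release.Name }}`; their default values below are placeholders, not values from the chart.

```shell
#!/bin/sh
# Create the HDFS data directories expected by the chart on this worker.
# HDFS_HOME and RELEASE_NAME are assumptions -- substitute your chart values.
HDFS_HOME="${HDFS_HOME:-/tmp/hdfs-home}"
RELEASE_NAME="${RELEASE_NAME:-dataomnis}"
for d in datanode namenode journalnode zookeeper; do
  mkdir -p "${HDFS_HOME}/hdfs-cluster/${RELEASE_NAME}/${d}"
done
```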